A spec is a long, tedious, human-readable document that explains the behavior of a system in unambiguous terms. Specs are important because they allow us to reason about a language like Markdown without reference to any particular implementation, and they allow people to write implementations (Markdown processors) independently that behave identically. The Markdown Syntax Documentation is not a spec (it's highly ambiguous), nor is any implementation (not human-readable; some behaviors are probably accidental or incidental and difficult to port perfectly). The hard part of writing a spec is codifying the details in English, and secondarily making decisions about what should happen in otherwise ambiguous or undefined cases.
I would love pointers to Markdown processors that are implemented in a more principled way than the original code, for example using standard-looking lexing and parsing passes, but that still handle nested blockquotes and bullet lists together with hard-wrapped paragraphs.
Nobody should be using the original script, and unfortunately many of the other implementations out there are direct transliterations that replicate all of its absurd errors, such as the one where, if you mention the MD5 hash of another token in the document, the hash will be replaced with the token, because the script uses MD5 hashes as an inline escaping mechanism! Reddit got hit with an XSS worm that got through their filters because of it: http://blog.reddit.com/2009/09/we-had-some-bugs-and-it-hurt-...
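The escaping mechanism in question is easy to sketch: the original script swaps protected spans out for their MD5 hex digests, then swaps them all back at the end, and nothing stops user-supplied text from containing one of those digests. A minimal Python sketch of the scheme (the function names are mine, not Markdown.pl's):

```python
import hashlib

def protect(text, spans):
    """Replace each protected span with its MD5 hex digest,
    mimicking the placeholder scheme the original script uses."""
    table = {}
    for span in spans:
        key = hashlib.md5(span.encode()).hexdigest()
        table[key] = span
        text = text.replace(span, key)
    return text, table

def restore(text, table):
    # Substitutes every placeholder back -- including ones the *user*
    # typed. That's the bug: mentioning a span's MD5 hash in the
    # document gets the hash replaced by the raw span itself.
    for key, span in table.items():
        text = text.replace(key, span)
    return text

protected, table = protect("run `<b>` here", ["<b>"])
# A user who computes the hash can smuggle the raw span in:
attack = hashlib.md5(b"<b>").hexdigest()
print(restore("innocent text " + attack, table))  # prints "innocent text <b>"
```

Once raw HTML can be smuggled past a sanitizer this way, script injection follows directly.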
See the changelog for what started as a PHP transliteration and turned into a rewrite that squashed 125 (!) unacknowledged bugs: http://michelf.com/projects/php-markdown/
The worst part is that he outright refuses to either disclaim or fix his implementation, and so far he's repudiated everyone else's attempts to do so. He's a terrible programmer and a worse maintainer; he really still thinks the documentation on his site is comprehensive and canonical. As much as Jeff Atwood leaps at every chance to play the fool, there's no way his directorship can be anything but an improvement.
In other words, let the user type:
It will save a lot of trouble -- and especially when linking to a Wikipedia page whose URL contains parentheses.
Could I soft-wrap in my editor? Sure, but that would mean that the text files sitting on my hard drive now have very long strings in them making it harder to grep, making it harder to add to git (change a single character, entire line is now a diff :-().
I hope that doesn't become the default.
Why get all angry at John Gruber? As many have already noted, he created Markdown for himself and released it so that others could use it. AFAIK he didn't put any license/restrictions on it outside of calling himself BDFL. Whatever his skills as a programmer, writer, or his role as Mouthpiece of Apple, the vitriol is unnecessary (but absolutely fascinating to watch). My panties bunch up naturally, no need to allow my feelings regarding Gruber to bunch them further.
Why get his approval? In the same spirit that Gruber created something for himself, you should just create something for yourself. I find it hard to believe that Gruber was the first person that conceived the idea of user-friendly text-markup. The new standard could just be inspired by Markdown and that would be a win-win: a respectful nod towards Gruber as well as the ability to move towards something 'better'.
If you have not taken pandoc for a spin, I highly recommend you do so soon. In addition to being a great Markdown dialect, the pandoc tool set is the Swiss Army knife of text formatting. It is amazing how many formats pandoc can read and/or write.
EDIT: I spoke too soon, Fiddlosopher continues to impress. I just checked the open issues, and a little less than a month ago he added "limited org-table support." Based on the rest of pandoc, "limited" probably means something like 85% to 95% :)
I ended up writing my own in Objective-C. It's not very pretty, and it doesn't use a formal grammar (just a lexer + custom grammar code), but it does the trick. I took a few liberties with the spec: throwing in GitHub-flavored code blocks.
And then, for the LaTeX that you can't shim in, just have some escape hatch that sends fragments out to a renderer. If I could only have:
* Math mode
* Citations and Bib files
* Labels and References
EDIT: Having just investigated Pandoc, which many here are talking about, I realize this might be exactly what I've been looking for :)
"I'm reminded of the guy who decides that there should be one standard because there are n divergent implementations. So he goes and writes his own. Now there are n+1 divergent implementations."
The idea of Markdown is great, but I found the implementation of links less than obvious. (I haven't tried it in 4 years, so there were probably other issues I had that I've forgotten.)
The problem I inherently always end up having with "parses to HTML" syntax conventions is there are always warts where the syntax is harder to remember than the HTML it is supposed to parse to.
I love it because the world needs an easy-for-humans way to format in pure ASCII without any tool. It is much simpler than using even the most well designed GUI. You can even write books with it, and you can focus on content.
But I hate Markdown. I hate it because it is superficially good: a lot of Markdown seems to make sense at first glance, but if you look at it more closely you see that a lot is broken in its design (IMHO the fact that the reference implementation is broken is the least of the issues).
It is surely possible to fix it. However, it's better to have a broken Markdown now than no Markdown at all. The fact that GitHub and Stack Overflow and Reddit are using it makes it absolutely obvious how useful and great the concept is. The actual design, implementation, and specification can be fixed now. So kudos to the original inventor, but it needs a second pass from people who can give it a more coherent shape, with clear behavior, minimal surprise, and parsing in mind.
The current behavior of Markdown solves this problem very well. I don't want the newlines I enter for non-wrapping editors to remain in the generated HTML.
Edit: I've wondered whether the original Markdown didn't have underline support because <u> was deprecated/removed from HTML. FWIW, <u> is now back in HTML5.
If this gains some traction I'm sure I'll be adding support for it at some point.
Pandoc: a wonderful almost-everything-to-everything text converter http://johnmacfarlane.net/pandoc/
IMHO, pandoc's Markdown support is the mother of all implementations, featuring lots of goodies (tables and footnotes, to name two).
I don't think such a thing is feasible. I also don't think it's feasible for any proposed standard to simply look at the largest users and say "okay, we'll accept the idiosyncratic extensions of all of these differing flavors in an unambiguous way."
So assuming this pushes forward, there are (to my mind) two possible outcomes:
1) A backwards-incompatible standard emerges. No existing project adopts it, but new projects do. It gains legitimacy only once Github, Reddit, et al fade into obscurity.
2) A backwards-compatible standard emerges. Every large existing project adopts it, but the standard is so full of cruft and TIMTOWTDI that in ten years it gets usurped entirely by a challenger that emphasizes simplicity.
Mou + the (built in) Github theme = best Markdown editing experience.
If only a couple sites band together, then I see it more like this:
But I have learned to love Markdown too, I hope in the future, distant future: Someone will create a language that integrates HTML and CSS into a nice Markdown-like language.
> The problem with writing my own Markdown parser in Clojure is that Markdown is not a well-specified language. There is no "official" grammar, just an informal "Here's how it works" description and a really ugly reference implementation in Perl. http://briancarper.net/blog/415/
I absolutely love the simplicity of Markdown, especially with GitHub's addition of code fences/blocks. It's so trivial now to add code and have it automatically highlighted. It's not nearly that simple in other formats (to get autohighlighting, I mean).
Excited to see what will come of this.
There are many questions -- "What is Markdown?", for starters -- that feel unaddressed by the mark. Instead, we get the brute force approach: splitting up the word into smaller word parts, which is what you do with a word if you don't know what it means, or you have to gesture it in Charades.
Rather uninspiring for an idea so beautiful that Jeff and others can get so excited just thinking about it, but what else can you expect from such a mark whose approach is so stubbornly literal? I take that back -- only one word part actually gets to be represented literally... the other only managed to become a letter, in a moment I can only imagine involved the creator muttering "good enough". He must have found this mark uninspiring as well, given that he sought to put a box around it.
At least consider that the down arrow on its own is an overloaded concept, particularly on the web. Without context -- and a mark should not need context -- M↓ could read like a hotkey or command of some kind. This kind of ambiguity is utterly unnecessary -- you're making a mark; it can be whatever you want it to be. Push!
I also see no reason for text and _text_ to produce the same output. It just seems like a fault in the original spec to me.
Dodgy HTML, content pasted in from Word (with crazy styling intact), and a general encouragement for users to see text content in terms of styling rather than structure are all things that it will be delightful to see the end of.
rst just looks more powerful and yet still as readable as markdown.
Aside from that (and implementation bugs) I've been very happy with markdown.
The one change for good I can think of would be removing the ability to embed HTML.
I'm very happy that GitHub has an Org Mode renderer, even if rudimentary - I don't have to rewrite my notes and READMEs to Markdown.
1. hello something
2. foobar
1. hello something
1. foobar
This is an [example link]<http://www.example.com/>
You can play with it here: http://www.markdowncms.com
If there was a standardized Markdown, we would implement that for sure.
http://www.aaronsw.com/weblog/001189
http://en.wikipedia.org/wiki/Markdown
I'd certainly be interested in switching over to their version, provided some of the noted kinks get worked out.
The Surface one up there is just some guy's blog review. It's not poorly written, but why are we reading randomly selected Surface reviews? There's an entire post right now that is basically a Samsung press release via CNET, describing some (totally unquantified, of course) minor uptick in sales for the latest Android phone. There is literally nothing to talk about there except to proffer essentially baseless flames, praise, or speculation.
I would have no qualms asking the moderators to fix this. I can't understand any metric by which these are useful posts to have on the front page. There is lots of much better stuff sitting on the New page which is being crowded out by noise that I could go read in two hundred other places. "Intellectual curiosity" is not referring to what you have every time a phone comes out which is 20% lighter and 10% longer.
Notionally, this is a forum for creators, but it seems increasingly preoccupied with utterly unproductive posturing over whose tastes are 'better'. It's a troubling trend.
Besides, you're supposed to up-vote comments you don't necessarily agree with so long as they are well argued. That is what a good debate is about.
I think that the reason is the same: when you spend money on one, you buy into a community and an ecosystem. You become a part of a tribe and naturally begin to see the world in an us vs. them paradigm.
It's worth noting that this is an irrational behaviour set, and best avoided if you want to learn anything objective. In typically-emotive arguments like these, you have to make the decision yourself and realise that, whatever you choose, you'll likely justify it to yourself afterwards however you can. Once you start to realise that, you begin to realise how inconsequential "what type of tablet or console you own" is, and the less likely you'll be to fall into that destructive us-vs-them mindset.
* yeah, I know you really like [company] and really don't like [competitor], but please don't say mean things about those who disagree with you, and especially don't say mean things about the staff at those companies without very good reason
* it's election season in the US, which means more than the usual number of offhand derogatory comments about the other side's politicians and voters. Please refrain from this.
* I've seen a few shots taken at other peoples' religions. Principled disagreement is OK, but try to resist name-calling.
* There have even been a couple of recent arguments about nationality that have involved some unnecessary name-calling.
* As a final heads up, remember that even deleted posts may be cached by various external services that grabbed them via the API. It's good to think better of something after the fact and take it down as soon as you can, but it's even better to avoid posting nastiness in the first place.
As a community, let's do a better job of controlling our own posts first and foremost, and let's do a better job of downvoting and flagging when others cross the line.
Hacker News is usually a pretty nice place to hang out, but that comment thread reminded me of the Two Minutes Hate from 1984.
These threads remind me of reading newspaper articles that discuss how uncivil our current political discourse is compared to the far more civil past. And you can read essentially the same article from a 1880's/1950's/2012 newspaper archive.
The conflict doesn't arise when switching cost is low or the differences are too minor (e.g. Sony TV vs. Panasonic TV, Verizon vs. AT&T, Unitarianism vs. Baha'i fail to generate rancor on both counts).
The conflict would appear to arise from people struggling with cognitive dissonance. In other words, if an iOS or Android user were supremely confident of the superiority and perfection of their chosen platform, there would be no dissonance and no outward invective.
Just as Freud (correctly, for once) observed that the most passionately homophobic individuals were often in denial of their own urges, the most fervent boosters of a platform are probably plagued with doubts about it.
I regularly see long and technically strong articles sink with less than ten votes and zero discussion, while those lambasting Apple yet again get dozens of votes and comments. Add hair-splitting with strong passive-aggressive undertones, and what's left is vacuous and mildly toxic.
Those are pretty strong words. All I've seen is a few geeks trading opinions about--ultimately petty--consumer electronics issues.
It's all just opinion. Nobody's said "Person X is ignorant waste of consciousness and they should kill themselves" (which would be uncivil, inappropriate, bilious, and divisive.) They just have opinions about products. Products that in the grand scheme of human achievement really aren't that important.
You're just causing even more drama with this self-righteous post. It's all, like, your opinion, man; take it easy, let the geeks bicker (relatively politely) about fruit versus miniature eiderdown, and save the outrage for things that are truly worth it.
Edit: I'm not going to upvote this parent meta-post, and neither should you, dear reader, for it itself is the one causing drama, not the majority of posts on HN in the past few days.
Maybe I can help with the understanding part. Here are some things that I've observed, as a hacker, about humans:
1. (most) people like to form groups and then compete with other groups
2. (most) people enjoy feeling superior to other people
These are things that seem to have been true in any part of the world, throughout all of human history.
So what's our plan here? Are we going to turn hackernews into the only collection of humans to ever live that defies these rules? Is there some technical solution that will change fundamental aspects of human nature? Maybe getting rid of the voting arrows will remove all of the meanness and tribal thinking on the planet.
I say all this because I don't understand the impetus for your post. Of course it would be nice if everything everyone said made an insightful contribution. But you know that people aren't like that. No amount of blog posting or commenting is going to change how people interact with each other. It seems like your problem isn't with the hacker news community, but with the nature of human socialization.
This looks like a clear case of selection bias. It's hard to do good as a hacker if you isolate yourself in an ivory tower of ycombinator hackers and geniuses. Making things does take some understanding of the average person and how they behave. If you truly think that hackernews is negative when compared with just about anywhere else, then you might be out of touch.
The multiplicity of products that we have to choose between and the lockin we experience once we've made the purchase (we have a contract for the phone and have made significant monetary commitment to the devices in general) mean we have to make a hard decision and then try to feel good about it.
Once we've picked, if we admit that another device is better, then we're saying that we made the wrong choice and that we have to live with a subpar device for another few years. Most of us tend to get defensive about our purchases instead, even when we are trying to be objective.
The truth is that there are trade-offs between all of the devices, related to our priorities, our personalities, and our social circumstances, all of which make us feel personally invested in a gadget decision. This makes it hard for us to come from an objective place to talk about some of our favorite topics. Many of us are looking for validation more than information (I've definitely been guilty of this).
The trick, then, might not be to try and be more objective, but to take criticism of the products less personally. Headlines are meant to get clicks, not express thoughtful opinions. The intricacies of the tradeoffs are worth considering, but you won't find them in most tech coverage. Save your hate and try to understand why the competition is valuable to others and what your product could learn from.
Re lack of civility, this is a normal feature of anonymous interaction which stems from lack of accountability - the only way to deal with it is to impose social sanctions on the users responsible. Everyone can do this by refusing to be baited, and calling out others for antisocial, insulting, or extreme comments.
Actions will have more impact than meta discussions.
Don't get me wrong, I was honored - but it's off topic.
As hackers, I believe we all subscribe to the old mantra that one should use what is best for the job at hand, and arguments about whether Microsoft Surface or iPad Mini or so on and so forth are the "right way" detract from the quest for knowledge in which all of us participate.
So can someone create a 3rd party site that displays HN, but removes/hides these off-topic posts? Then everyone would be happy. There are already some similar implementations (like http://ihackernews.com for a mobile version), so it can't be that technically difficult. It would also be great for users to be able to specifically block certain domains (e.g., I could get rid of all Gruber and Marco blog posts from the list of links I personally see).
Edit: this could also be done with a browser extension, but that wouldn't work on mobile devices (I think)
As for the issue you're talking about, this guy here is obviously a flaming Microsoft-fanboy: http://news.ycombinator.com/item?id=4706624 .. I wanted to call him one, but refrained from doing so, mostly because I thought it would be met with a negative reaction.
But you know, talking to fanboys is really frustrating. Their posts are full of such obvious, annoying bullshit/misdirection that it's just really difficult to ignore, but on the other hand, going through the effort of shutting them up is pointless too.
That's why it's tempting to just call a fanboy a fanboy, instead of wasting a lot of time and effort in a civil discussion with them.
As is evident from the comment threads on the tablet releases, people have strong opinions; suppressing these with calls for 'civility' is nothing more than asking for people to only post comments that you approve of, which seems extremely bourgeois to me. I enjoy seeing the comments where people express strong opinions because I am able to learn, for example, what kind of person is going to like the Surface and who won't. There's signal in the noise, and in a public forum it's not about what you want to read. If I could downvote your thread I would, because I find it extremely distasteful to see someone wanting to read just what they enjoy. It's really no different from me posting an ASK requesting that we focus more on Python or jQuery plugins. Please, less of the high horse rhetoric.
related: best God joke ever:
In many other discussions it seems like one controversial sub-topic ends up dominating as well.
Perhaps downvoting controversial comments isn't always a bad thing? There seems to be a big fear of the downvote button, but in some cases, even if a comment is useful on its own, in the end it sparks massive amounts of arguing back and forth which could be avoided if it were just downvoted instead.
Liking or disliking Apple or Android or Windows... well sure. people can have preferences. But self identifying or rejecting people based on their computing software? Being rude to people because of their technology preferences?
Ask yourself, why do we do that? Does it make sense logically? Not really. But at an emotional level, it feels good to have a group of people who one can feel part of, and a group of people that are outside it that one can disparage as not being one of us. Making moral judgements based on what tech company a person likes? Human tribal groups.
The truth is, we can do better than that.
I'd like to see a /startup or similar, moderated by entrepreneurs to set the tone of what posts or comments aren't welcome.
Performance, cost, usability, etc. are all factored into the system I use; the phone, tablet, etc. are all purchased based on these factors. If you do not like a particular product, just do not buy it, and if for some reason someone asks for your opinion on a product, you can give it without being fanatical about it. It is just a product.
Agree - this should be the default in any comment. It would be interesting to see a "karma" score for those who hold their tongue when they have nothing constructive to say, but obviously, that's pretty much impossible in an online format. In a way, the karma on a forum encourages opinions whether vacuous or not.
One thing I would add to the topic of 'determination': Are we speaking about determination to make a startup successful or determination to try out as many ideas in our life as possible, learn as much as possible and try to make at least one startup successful in our life?
I mean first we have to analyze what we optimize for:
If we optimize for the success of a given startup then it is obvious that the optimal strategy is to never give up on the startup.
If we optimize for the success of a person in his lifetime then it is different. In this case we have to examine all kinds of opportunity costs. Could it be a better strategy to very quickly abandon a startup when it seems that people do not want the product, so that we can start many more startups in our life, to increase the chance of at least one becoming successful?
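That trade-off can be made concrete with a back-of-the-envelope model. Treating each attempt as independent with the same success probability p is a simplifying assumption, and p = 0.1 below is purely illustrative:

```python
def p_at_least_one_success(p, n):
    """Chance that at least one of n independent attempts succeeds,
    each with probability p: 1 - (1 - p)^n."""
    return 1 - (1 - p) ** n

# One long grind at a single startup vs. quickly abandoning
# non-starters to make several attempts over a lifetime:
print(round(p_at_least_one_success(0.1, 1), 3))  # 0.1
print(round(p_at_least_one_success(0.1, 5), 3))  # 0.41
```

Under these (crude) assumptions, five quick attempts roughly quadruple the lifetime odds, which is the whole opportunity-cost argument in one line of arithmetic.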
It really is great! I encourage everyone on HN to read or watch it if they have not already. As a not-yet founder, it has a lot of interesting advice that I don't think is documented anywhere as concisely and practically as it is here.
Founders at Work was written by Jessica Livingston, who is a cofounder of ycombinator. She's married to Paul Graham. But do not think that she's in there just because of the personal connection. Her book is truly excellent. And in previous articles I've seen Paul say that the #1 thing that they want in a founder is determination, and the person that they rely on to spot it during the interview is Jessica.
The pizza place was very confused by this, but they sent the pizza guy without a pizza, Kyle answered the door, and the pizza guy said, "The site is down."
PG: "You know, that is her deepest wish. If she is watching this, she'll be laughing so much at this point because that's what she would like the most too to be able to spend more time on the new version of Founders at Work. There's a new, she's working on a new edition, with a bunch of new interviews."
Any updates on this?
The effects of those two are very large, the effects of everything else comparatively small per decades of startup and longitudinal entrepreneurial studies.
Nonsense about hustle is exactly that: nonsense. The weight of evidence suggests that, if anything, hustling and creativity have a net negative effect on long term health of a startup.
But there's money to be made keeping up the lie.
Lastly, beware of pseudo-pop-science that opens with only a few people's stories. People manage to succeed as founders all over the world; these stories are not remarkable and tell us nothing.
In general the whole "determination" thing has little to no value in any serious consideration of startup success: it's about on the same level of credibility as diet fads.
Is she ever going to pursue writing a sequel to Founders at Work?
He has been collecting data on start-ups and then looking at survival lengths and outcomes. He wrote a book on the topic
Also, since I just survived a dual-founder breakup (company intact), it was encouraging to know that this was probably a bigger bullet to have dodged. (For those curious, post-breakup I reached out to an old friend with whom I've shared some tenuous situations and we have applied to YC for the next batch)
Edit: I forgot about the pizza comment! When she asked how to contact someone in Lake Tahoe, I audibly said pizza (in my empty apartment). When the solution was pizza, I had a celebratory moment.
Jessica mentions the Codecademy team launched 2 days before Demo Day and managed to sign up 200k users. If I remember correctly, they launched on HN through a Show HN thread... and so on.
What I really want to know is, how many of those initial 200k users stuck around? I was one of them and I have only signed in maybe twice since their launch.
So what does that mean? They leveraged the curious users to get VC interest? Did they really engage me, us, the 200k? Is that a false positive?
I guess if the net result is a positive one today, none of this really matters.
3rd sentence: "There's a talk I've always want to give at the beginning of each batch...". I think this should be either "I've always wanted to..." or "I always want to..." right?
-- This is a great point. Even outside of startups.
CMD+F for "luck" = 0 results.
Luck is a huge factor and sometimes you just need to move on to either something new, or working for a company to fill in the gaps, and trying again soon.
I guarantee you cannot name three books that have done a better job capturing this topic, because they don't exist.
Claiming that Livingston's relationship with Y Combinator/Graham is the reason the book is so wonderful is like claiming David Pogue's relationship with the NYT is why he's such a popular tech reviewer, or why Manohla Dargis is such an amazing movie reviewer.
It misses the point of both their contribution and their talent, and is, frankly, quite rude.
1) A private frame relay network that one day stopped passing packets over a certain size. Worked around by lowering the MTU at both ends till I was able to convince the frame relay provider that yes, the problem was in their network. This was relatively straightforward to diagnose, but it was still odd being able to ssh into a box, then have the connection hang once I did something that sent a full-size packet (cat a large file, ls -l in a big directory, etc).
2) A paging gateway program I wrote (email to SMS) that worked fine when testing on my Mac, but couldn't establish connections to a particular Verizon web site when I ran it from a Linux box. Turned out that the Linux TCP stack had ECN enabled and at the time the Verizon website was behind a buggy firewall that blocked any packets with ECN bits set.
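For anyone chasing a middlebox like that today: on Linux the relevant knob is the net.ipv4.tcp_ecn sysctl (a real sysctl; the value meanings are from the kernel's ip-sysctl documentation). Whether you should leave ECN off is another matter, since broken firewalls like that one are now rare:

```shell
# Show current ECN mode (0 = off, 1 = request ECN on outgoing
# connections, 2 = only use ECN when the peer requests it)
sysctl net.ipv4.tcp_ecn

# Temporarily disable ECN negotiation to test against a suspect firewall
sudo sysctl -w net.ipv4.tcp_ecn=0
```

If connections start working with ECN off, you've found your buggy middlebox.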
3) A Solaris box that could randomly be connected to, but not always. Turned out someone had deleted its own MAC address from its ARP table (yes, you can do this with Solaris) so it wasn't replying to ARP packets for itself. As I recall, it could make outbound connections, and then you could connect to it from that same peer until the peer timed out the ARP entry. Then the peer couldn't reach the Solaris box again.
None of these are nearly as complex as the scenario in this story.
I was troubleshooting with a user of an audio streaming application running over a LAN. The user could stream classical music but not rock music. Seriously. Classical was fine, but when streaming rock, the connection would drop after a few minutes.
The application took chunks of audio, compressed them with a lossless codec, and then sent each chunk in a separate UDP packet to the other end. It tried to use IPv6 whenever possible because it was generally more reliable in the LAN environment, although it would happily use IPv4 if need be.
After a huge amount of boring troubleshooting going back and forth with this guy, I finally figured it out. Somehow, he had set his network interface's MTU to 1200 bytes. IPv6 won't perform automatic IP-level fragmentation for MTUs below 1280 bytes, so larger packets simply could not be sent at all. The streaming application would try to send an audio packet larger than 1200 bytes, get an error, and bail out of the connection.
Why did it only happen with rock music? Turns out to be pretty simple. Lossless codecs are necessarily variable bitrate, and classical music compresses better than rock music. When streaming classical, each chunk of audio consistently compressed to less than 1200 bytes, but rock music produced occasional packets over the threshold.
The user didn't know why his MTU was turned down and didn't need it, so we turned it back up and everything worked just fine.
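The variable-bitrate effect is easy to reproduce with any general-purpose lossless compressor standing in for the audio codec. This sketch uses zlib and the 1200-byte figure from the story; the buffers are crude stand-ins for "classical-like" (regular) and "rock-like" (noisy) audio:

```python
import os
import zlib

MTU_LIMIT = 1200  # the misconfigured interface's MTU from the story

def chunk_fits(chunk: bytes) -> bool:
    """Would this chunk, losslessly compressed, fit in one packet?"""
    return len(zlib.compress(chunk)) <= MTU_LIMIT

# Highly regular data compresses well; noisy data barely compresses
# at all, so some chunks exceed the limit and the send fails.
classical_like = bytes(range(256)) * 8  # 2048 bytes, very repetitive
rock_like = os.urandom(2048)            # 2048 bytes, incompressible

print(chunk_fits(classical_like))  # True
print(chunk_fits(rock_like))       # False
```

Same chunk size in, wildly different packet sizes out, which is exactly why only one genre tripped the MTU limit.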
Shortly after bringing up a second T1 into a remote location we discovered that some web pages would show broken JPG images at the remote site.
Some troubleshooting revealed that this only happened when traffic was routed over the new T1. The old T1 worked just fine. Pings, and other IP traffic seemed to work over either line but we kept seeing the broken image icon for some reason when traffic came over the new T1.
We tried several times to confirm with the telco that the T1 was provisioned correctly and that our equipment matched those telco parameters. Still had some mangled bits going over that new T1.
Finally had the telco check the parameters over every span in the new (long-distance) T1 circuit, and they eventually found one segment that was configured for AMI instead of B8ZS (if I remember correctly; certainly it was a misconfigured segment).
The net result is that certain user-data patterns that didn't include sufficient 0/1 transitions would lead to loss of clock synchronization over that segment and corrupted packets. Those patterns were most likely to occur in JPGs.
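For context: AMI recovers its clock from the pulses that 1 bits produce, and B8ZS exists precisely to substitute a detectable code for any run of eight zero bits so the line never goes quiet. A toy check for payloads that would starve a raw AMI span of transitions (real T1 framing and ones-density rules are more involved than this):

```python
def longest_zero_run(payload: bytes) -> int:
    """Length of the longest run of consecutive 0 bits in the payload."""
    bits = ''.join(f'{b:08b}' for b in payload)
    return max((len(run) for run in bits.split('1')), default=0)

def safe_for_ami(payload: bytes, max_zeros: int = 7) -> bool:
    # B8ZS substitutes a code for any 8 consecutive zeros; raw AMI has
    # no such escape, so long zero runs mean no pulses on the wire and
    # lost clock synchronization at the far end.
    return longest_zero_run(payload) <= max_zeros

print(safe_for_ami(b'\xff\x81\xff'))  # True: plenty of one bits
print(safe_for_ami(b'\xff\x00\xff'))  # False: 8 zero bits in a row
```

Compressed image data like JPG is close to uniformly random bytes, so long zero runs show up often enough to corrupt exactly that kind of traffic.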
Once they corrected the parameters on that segment, everything worked as expected.
Quite a bit of head scratching with that one and lots of frustration as the layer-1 telco culture just couldn't comprehend that layer-2/3 Internet folks could accurately diagnose problems with their layer-1 network.
We had a similar issue at Blekko where a 10G switch we were using would not pass a certain bit pattern in a UDP packet fragment. Just vanished. Annoying as heck, the fix was to add random data to the packet on retries so that at least one datagram made it through intact.
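The retry trick reads roughly like this (a sketch; the function is hypothetical, and the receiver would have to know to strip the padding):

```python
import os

def payload_for_attempt(base: bytes, attempt: int, pad_len: int = 16) -> bytes:
    """First attempt sends the payload as-is; each retry appends fresh
    random bytes so the datagram's bit pattern differs every time and
    at least one copy should dodge the pattern-sensitive switch."""
    if attempt == 0:
        return base
    return base + os.urandom(pad_len)

assert payload_for_attempt(b"audio", 0) == b"audio"
assert len(payload_for_attempt(b"audio", 1)) == 5 + 16
```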
About 3 visits in a row I went to look at problems (core dumps or errors) that the customer could reproduce at will, only for them to be unable to replicate the problem with me present on site.
I sat at one customer (in sunny Minneapolis) for 2 hours in the morning with the customer getting increasingly baffled as to why he couldn't get it to fail; it had been happily failing for him the previous evening when I was talking to him on the 'phone. We gave up and went for lunch (mmm, Khan's Mongolian Barbeque). A colleague of his called him midway through lunch to tell him that the software was failing again. Excellent, I thought, we'll finally get to the bottom of it. Back to their office and ... no replication; it was working fine.
As a joke I said I should leave a clump of my hair taped to the side of the E450 it was running on. The customer took me up on that offer and, as far as I know (definitely for a few years at least), the software ran flawlessly at that customer.
It's the closest I've got to a "'more magic' switch" story of my own.
A little background... I was brought up in the network ranks. I worked as a network / sys admin in high school, then ended up working for an ISP as a junior network engineer in college (which I attended at one of the first Cisco NetAcad baccalaureate programs - a combo of network study plus the Cisco curriculum and certifications), and I've gone on to work in every major vertical over the past 10+ years: government, finance, healthcare, retail, telecom, etc. I always tell clients and potential employers that having a network background gives me somewhat of an edge in the industry I primarily focus on, security, and I still regularly study for Juniper & Cisco tests and work on labs just to stay current. Most software devs and security folks I've run into (keeping in mind there are a lot of really good folks who have a better grasp of networking than many seasoned engineers do) are generally overconfident in the belief that they truly understand IP from a debugging and troubleshooting standpoint.
Case in point: I interviewed for a "Network Architect" position with a very well known online backup company (think top 4). The interview was the most bizarre I've ever had, not because it spanned more than 5 interviews, but because every complex network problem they posed was generally solvable within 5 to 10 minutes of pointed questions. The software dev who was interviewing me was baffled by how quickly I came to reasonable solutions that had, in some cases, taken them over a week - and it was pretty simple: 1) I've seen something similar before, and 2) that's what I've studied and had a passion for over the course of 20+ years (since I found the Internet in 1991).
Most of the time when I run across a "magical" problem, it's because someone hasn't looked at it from L1 up. As this article showcases, you generally have two stack angles to approach it from: application back down to physical, or the inverse. Having been in network support, by the time you get a problem like this it's often so distorted with crazy outliers that have nothing to do with the problem that your best bet is to start from L1 and work back up through the stack. Reading into the problem the author describes, I think some key data were missed and/or misinterpreted. There most surely would have been key indicators in TCP checksum errors, and that was glossed over pretty lightly in the explanation - it's interesting how often those items of interest are cast aside when digging into something like this. Nobody in this thread has indicated where a bit error test, or even something as simple as iperf, would have been able to more accurately showcase/reproduce the problematic network condition.
But back to the labels remark: I don't believe, as some people have said, that this is largely a DevOps role. I don't mean to cut down on DevOps folks, because I think, at some level, if you're a jack-of-all-trades in any org then that's your role; it is what it is. However, this would be a problem best suited to a professional network engineer - and you don't see much of that need in the startup space until people get into dealing with actual colo / DC type environments; otherwise the network is often very simple and not architected with significant depth or for specific use cases.
Long story short: network professionals are worth the money when it comes to designing, building, and fixing issues that may seem complex to others but can be solved or found in minutes when you know what you're looking at. That being said, I'm impressed that the OP dug into it far enough to get to a point where he could ask a specific person (who was probably a network engineer / tech of some level) to validate and fix his claim.
I suspected the VM code at the time, but it is very likely that my packets had to go through the same router (geography would support this).
I'm so glad somebody debugged this problem. Also, I'm quite glad that at least this time I'm not the only person with a weird issue (I have a knack for breaking things).
I've been wondering about something not entirely unrelated that we see sporadically from a small but widespread number of users. We serve deep zoom images, and the client appears to run normally but sends malformed image tile requests - e.g. in the URLs, "service" is consistently garbled as "s/rvice", "dzi" as "d/i". I've seen this from IPs on every continent and user agents for most common browsers, as well as both iOS and Android. My current theory is that it's some sort of tampering net filter, as a fair number of the IPs have reverse DNS / Whois info suggesting educational institutions, but I have thus far failed to confirm this, particularly since none of the users have contacted us.
Today, people are relying on SSH for binary transfer more than ever. SFTP and SCP are the new de facto standards for machine-to-machine file transfer over a secured connection. Source control systems like Git (or even SVN) make heavy use of binary transfers over SSH. The performance benefit to the entire world is immeasurable. Yet unless you explicitly go out of your way to manually compile and install SSH-HPN, you don't get it.
That said, given how slow SSH is on Windows (Git pushes and pulls are dramatically slower than on *nix or OS X), does anyone have a good link to a PuTTY HPN build?
We tracked it down to a switch that was corrupting packets enough that the TCP checksum wasn't sufficient protection, and the packets would simply pass their checksum despite having been altered.
The outcome was that we now always use compression, or encryption, as an added layer of protection.
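The TCP checksum is only a 16-bit one's-complement sum, so it is weak in exactly this way: for instance, transposing any two 16-bit words leaves it unchanged. A small demonstration of that blind spot (RFC 1071-style sum; not tied to whatever the actual switch was doing):

```python
def inet_checksum(data: bytes) -> int:
    """RFC 1071 Internet checksum: one's-complement sum of 16-bit words."""
    if len(data) % 2:
        data += b"\x00"
    total = 0
    for i in range(0, len(data), 2):
        total += (data[i] << 8) | data[i + 1]
        total = (total & 0xFFFF) + (total >> 16)   # fold carry back in
    return ~total & 0xFFFF

original = b"\x12\x34\x56\x78"
corrupted = b"\x56\x78\x12\x34"     # two 16-bit words transposed in flight
assert original != corrupted
assert inet_checksum(original) == inet_checksum(corrupted)   # passes anyway
```

A CRC, or the integrity checks you get for free from a compressed or encrypted stream, catches this whole class of corruption, which is why the added layer helps.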
I had a similar problem, less hairy, involving a bad bit in a disk drive's cache RAM. Took a day or so to figure out a solid repro.
Stuff like this does happen. Handling bit errors in consumer electronics storage systems is an interesting problem, and one that I'd love to see more attention paid to.
The more ambiguous situation is that early Juniper routers would fairly frequently re-order packets. That's nominally allowed, but a lot of protocols didn't like it.
There are way weirder things on satellite or other networks (spoofing acks, etc.).
Curl has an option, CURLOPT_SSL_VERIFYHOST. When VERIFYHOST=0, Curl does what you'd expect: it effectively doesn't validate SSL certificates.
When VERIFYHOST=2, Curl does what you'd expect: it verifies SSL certificates, ensuring that one of the hosts attested by the certificate matches the host presenting it.
When VERIFYHOST=1, or, in some popular languages, when VERIFYHOST=TRUE, Curl does something very strange. It checks to see if the certificate attests to any hostnames, and then accepts the certificate no matter who presents it.
Developers reasonably assume parameters like "VERIFYHOST" are boolean; either we're verifying or we're not. So they routinely set VERIFYHOST to 1 or "true" (which can promote to 1). Because Curl has this weird in-between setting, which does not express any security policy I can figure out, they're effectively not verifying certificates.
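The footgun is plain boolean-to-integer promotion. A minimal illustration (the constant names are mine for readability; libcurl just documents the raw values 0, 1, and 2):

```python
# The three documented values of the option:
VERIFYHOST_OFF = 0      # no hostname check at all
VERIFYHOST_EXISTS = 1   # only checks the cert names *some* host
VERIFYHOST_MATCH = 2    # checks the cert matches the peer (what you want)

setting = True                  # what a developer reasonably writes
assert int(setting) == VERIFYHOST_EXISTS   # True promotes to 1, not 2
assert int(setting) != VERIFYHOST_MATCH    # so the hostname is never matched
```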
They cast a really wide net, looking for as many examples as possible where non-browser applications fail to do SSL validation correctly, but then conclude that this will result in a security compromise without fully examining the implications.
For instance, they point out that many SDKs for Amazon FPS don't validate certificates correctly. But I didn't see them mention that the FPS protocol does its own signature-based authentication and that credentials are never transmitted in the clear: it was essentially designed to operate over an insecure transport to begin with.
Likewise, they point out an "unsafe" construction that an Android application I wrote (TextSecure) uses. But they don't mention that this is for communication with an MMSC, that this is how it has to be (many don't present CA-signed certificates), and that the point of TextSecure is that an OTR-like secure protocol is layered on top of the base transport layer (be it SMS or MMS).
So I think the paper would be a lot stronger if they weren't overstating their position so much.
Many security flaws have been found in commonly used SSL libraries.
Other than that, it is a great find.
This causes the page to throw an HTTPS warning: "this page loads insecure content" due to the css loaded over HTTP.
"Not the most interesting technically, but perhaps the most devastating (because of the ease of exploitation) bug is the broken certificate validation in the Chase mobile banking app on Android. Even a primitive network attackerâ€"for example, someone in control of a malicious Wi-Fi access pointâ€"can exploit this vulnerability to harvest the login credentials of Chase mobile banking customers."
It validates SSL certificates correctly by default. How about other languages?
I suppose my web browser has an extended list of CAs that my OS X Lion does not know about.
I'm not saying that this would solve all the problems, or that you should develop critical financial software by having people who don't understand much writing tests. But tests are pretty much common culture now; you'd think people would have considered this. And the argument the paper makes is not that the programmers are clueless, but that they are confused by the API - so they should be able to think up some useful tests...
Of course, integration testing with sockets is a bit more complicated than unit tests (perhaps something toolkit APIs should support is a way to allow testing without sockets?), but it's not super-hard. [edit: hmm, although testing for unreliable DNS is going to be trickier.]
You can see it here: https://github.com/rails/rails/blob/3-2-stable/activeresourc...
I'm pointing it out as it was not mentioned in the paper.
Edit: It looks like it has been that way since SSL was first implemented in Connection.
At any rate, here is a pull request for PHP which attempts to address the issue:
require 'always_verify_ssl_certificates'
AlwaysVerifySSLCertificates.ca_file = "/path/path/path/cacert.pem"
http = Net::HTTP.new('some.ssl.site', 443)
http.use_ssl = true
req = Net::HTTP::Get.new('/')
response = http.request(req)
It is destined to be flawed as long as insecurity is allowed. Only when every exploit is exploited continuously will people be vigilant.
? This code is intended for deployment in potentially dangerous regions for getting around government censors.
<falls off chair>
"The built-in front-facing camera for Skype is angled so that it'll work great when the kickstand is open, but again, only for Danny DeVito, or maybe for people who want to show off their chests in Skype."
"The Touch Cover is one of the Surface's biggest innovations. I thought I would hate it, but I didn't. It's not like typing on a completely flat surface: each â€śkeyâ€ť is raised slightly, so while there isn't any mechanical feedback, it does feel a bit like a keyboard."
"The Type Cover (the one with real keys) just works. I've got big hands that often struggle on undersized keyboards, but I can type very quickly on the Type Cover."
"He showed me Office, which was almost unusable: it was extremely sluggish, and touch targets were tiny and difficult to hit."
"So quickly, in fact, that I can outrun Microsoft Word on the Surface. I get the feeling that the Surface RT's CPU or Word code just can't keep up with my typing. Here's an example video:"
"The standard gestures don't help, requiring many in-from-the-edge swipes that not only aren't discoverable"
"After waiting over a minute for the machine to boot and launch the mail app, I got a blank gradient screen. User interface 101: if the app needs to be set up on the first launch, offer to do that, please. Folks from Twitter suggested that I swipe out from the right side and click Accounts"
So, can we conclude that these observations might be real (V. 1) problems without resorting to ad-homs regarding the author?
- Love the build. Very solid overall.
- 16:9 means it's one long tablet. Oddly, it's actually fairly usable in portrait; can't say the same for my old 16:10 Transformer (maybe just better balanced?)
- The touch cover is, like most say, surprisingly usable. Desperately needs a way to no-op Caps Lock though.
- Screen res lower than iPad, but still usable. Difference not near as noticeable as between iPad 2/3, but too many factors in play to make an objective call there.
- Metro takes getting used to, but I like it (even with KB/trackpad).
- It's the first time I've seen proper desktop Gmail and Google Docs usable in a tablet browser.
- Performance is generally decent. Not blazing, but decent.
- Windows RT appears to still contain far more of Windows than we've been led to believe. Even `csc` is installed, but missing a few dlls.
- No SSH client for Metro yet. That's one of the risks you take on a new platform (esp. a non-Unix one), but still aggravates me.
- Snapping is very, very handy; nice solution to bring proper multitasking to a tablet UI.
- When touch-scrolling over on desktop apps (what few remain), the entire window "bounces" at the head/tail of the content. Odd decision.
- No central notification bin (like Android's shade or iOS's Notification Center). Have to rely on scanning Live Tiles if you miss anything.
- The back camera seems to exist only to make the iPad 2's back camera feel better about itself. Has to be the blockiest camera I've ever seen.
- Handwriting recognition is pretty solid. Wacom junkies will be very pleased when so-equipped tablets ship. (Capacitive styli still suck)
- None of the Twitter apps have really thrilled me. Given the circumstances, I'm not that surprised.
- OS-level share support is a smart move; similar to Android's impl but more thorough (sharing pops up a share pane from your selected app in the sidebar, instead of bouncing you out of your current app entirely).
- Printing is mildly unintuitive; you have to open the "Devices" charm and pick your printer. No one is going to guess that's how to print.
- On the bright side, our network printer/scanner was detected and installed immediately, with zero user intervention. Very, very far cry from the WinXP days.
- There's no way to see your precise battery life outside of the desktop (in the classic sys-tray).
- Presumably due to the use of pressure sensors vs. capacitive, the Touch Cover isn't quite as accurate without a solid surface underneath.
- If you're not using the keyboard (watching movies, etc.), flip the cover backwards with the kickstand out and it's nearly as stable as a laptop.
- The intro tells you about the basic edge swipes (right for charms, left for app switcher, top/bottom for menu); not mentioned is that swiping straight from top-center to bottom kills the current app.
- Screenshot is Win+VolDown.
- Wordament can be played while snapped. This is dangerous.
- IE lets you swipe on the outer edge of the page for back/forward, which would be smart if this didn't occasionally clash with the app switcher.
(PS: I typed this entire post on the Touch Cover.)
Yesterday was the big retail launch. I was on a mission to check out what my local stores had and, if they had anything that could do the job for me, buy it. I've always wanted a tablet, but only if it could be as useful as a laptop when paired with a keyboard. The new Windows 8 tablets are supposed to be just that.
Best Buy had one (1) Windows 8 tablet. It was an Asus Vivo Tab running Windows RT... supposedly. I don't want an RT tab, and this store didn't even have a working floor model of the one tablet they were selling. The one they had was stuck on a "failed to automatically repair Windows" screen. It was also glued to the display stand, so I couldn't pick it up and get a feel for the hardware.
OfficeMax had zero (0) Windows 8 tablets. Heck, they had no Windows 8 touch screen laptops either. Or price tags. Or product specs. Or anything I could play with, really. There was one employee there setting up a display model of some laptop while complaining to another about how they were supposed to have tags for the computers but had none. Their electronics section was a joke.
Staples had one (1) Windows 8 tablet. It was a Samsung ATIV running Windows 8. Success! I actually spent some time playing with this one. Again, I couldn't really get a feel for the hardware, or specifically the weight, given it's got a pound of security alarms and tethers bolted onto the back chaining it to the display area. Beyond that, the specs just weren't up to snuff -- with 2GB RAM and 64GB storage, I'd just barely be able to run enough software to occasionally use it as a portable development machine. With nothing installed on it, there was only 14GB of free space -- the OS and preinstalled apps were using 50GB of the 64GB out of the box.
So all those trips were a waste of time. There's no Microsoft Store anywhere within 4 hours of me, so those 3 were the full range of retail options here.
I'm basically looking for a Surface Pro (Intel Core processor, 4GB RAM, 128GB storage). It's amazing that despite knowing Microsoft would be building this, nobody else built something comparable, and stores aren't carrying even the few tablets/hybrids they did build.
Right now there are two exceptions to this: Office (preview version - buggy) and a Desktop-mode version of IE. Everything else is 100% Metro. And I don't think you can even install anything yourself on it except via its App Store. Hence its Desktop mode is not really there for the benefit of the consumer. And the Office offering will need to be further ported and refined for RT before everything is worked out. I'm not even sure why they put Office on it.
It's a device made mostly for browsing the internet and running some apps while holding it in your hands. Which is what the bigger market is for.
While this was a good and honest review, I think his use case is off on this one, and he would be better off waiting for the Surface with Windows 8 Pro.
I would also be curious to know his height, so I'd know what "for short people" means... From the pics I've been able to find of the author, he's at least 6'2", maybe even 6'5".
If you are as tall as the author, you could probably move the device away a bit, zoom out the image, or perhaps put something underneath its stand to angle it properly.
After using Windows 8 I just see no good reason for anyone to use it on an old PC instead of Windows 7. I only see drawbacks, such as the forced Metro interface, and the inconsistencies in the desktop mode UI, which seem like a patched-up job done 6 months before the release or something, to make it more "Metro".
- A mail app that opens to a completely blank screen with no cues on how to continue.
- An infinite login dialog that doesn't allow you to cancel and back out.
I've read my share of man pages and hand written my Xorg.confs many times in previous lives, I'm no stranger to complex and arcane software setup procedures.
But in 2012, in the world of smartphones and tablets, this is stuff that should just work. The answer to "the mail app is completely blank on launch" shouldn't be "sorry, you failed to read the manual". Ever.
And while I greatly respect Microsoft's attempt at entering this market, someone on their team, at some point, had to look at these issues and say, "okay, this software is ready to ship anyway". That does not bode well.
 The alternative, I suppose, is that no one noticed. Which is even worse.
and I still think it is true... IMO Microsoft made a mistake by leading out with the RT. Leading with the Pro and then offering the RT as a feature-reduced, lower-cost version would have cut down on the confusion as to what RT really is, and lessened the initial impression that the Windows 8 experience is kind of underwhelming.
As a Win32 developer for the last ten years, I would have to agree that a good bit of Windows software today lacks performance. Why? IMO a lot has to do with the mindset of not only developers, but also those who produce the programming languages developers use. I would venture to say that most programmers would admit that the computer they develop on is likely more advanced than most mass-market PCs. They like i5 or i7 CPUs, 8, 16, or more gigs of memory, SSDs, etc. The mass-market PC, though, to be affordable, comes far less equipped. This is why my development PC is closer to a mass-market PC: I need to feel the performance problems the moment I compile and run. If you write apps which run fast on a slow PC, imagine how they will run on higher-end devices.
Huh, in the Anandtech review they thought the kickstand worked well everywhere except airplanes.
Has anyone else tried the SRT? This post alone is enough to scare me away.
> I admit, I fully expected a tablet version of my laptop. I wanted it to do everything my laptop could do, but with the added bonus of the touch screen, so I can play my games that make my phone freeze up while I'm sitting at my kids' dance or karate classes.
If you're technically savvy enough to understand and follow focus of GUI elements, and don't mind a stylus, then there are a number of existing tablets that will fit this bill. In fact, they've been around since ~2000.
But the Surface will not be true competition to Apple. This product fails in too many ways, and I predict that the iPad will remain dominant for at least a few years to come.
Also vs. the iPad 3 here - http://goodereader.com/blog/good-e-reader-videos/microsoft-s...
The Pro might be a better investment, but most of the apps are crashy/buggy - not really worth being an early adopter with this product.
It wasn't the first time I'd bought a Microsoft mouse. I bought one about a decade or so ago when another Logitech mouse died. It suffered the same fate as the Microsoft Touch Mouse: it was returned to the store and exchanged for another Logitech - for exactly the same reason.
Neither was acceptable for my workflow. Unsurprisingly, I spend a meaningful amount of time using CAD/BIM software. The touch mouse zoomed in when I adjusted my grip ("drawing" with a mouse largely involves holding it). There was no way to program the gestures. Likewise, the earlier Microsoft mouse had lots of buttons, but no way to program the middle button as a middle button - as an early "many button" mouse, the middle button had some dedicated function and I had about a decade of muscle memory and projects to push out the door.
The author is experiencing the same thing. The new device isn't tailored to his workflow. It probably isn't reasonable to expect it to be. Its competitors aren't; most people don't have a similar workflow; and it's still version one of the software (Word for RT).
This doesn't excuse the device's performance. But it does put the author's experience in perspective. Right now, he's somewhat of an edge use case.
Even RT could be tolerable with the right apps, used as a remote terminal with that keyboard, similar to what people do with Android+Transformer. I could program on it and work on remote machines.
That having been said, assuming it's somewhat usable on a lap, I'll wait for the Pro too; I have several things that need x86.
It is a weird combo of laptop and tablet.
When they go bad, they're catastrophically bad.
Overall the design looked cool, as I am interested in a tablet with an attached keyboard and trackpad. But I want a tablet/PC type of device that lets me use it as a tablet or as a PC laptop. I guess the Surface is not what I imagined.
The camera's viewpoint doesn't cover your face when the tablet is in kickstand mode? Move it a little farther away - what's the problem? Other leading tablets on the market don't even have a kickstand. It's been called a design wonder, along with the keyboard cover. People should appreciate that instead.
All the issues noted in this article are exaggerations, except the Live sign-in bug while saving an Office doc.
What he wants is a Surface Pro, which has not been released yet. The slowness is totally understandable: using the full Win8 desktop environment on Surface RT hardware is like installing OS X on an iPad.
More info at the author's blogpost: http://musicmachinery.com/2012/10/28/infinite-gangnam-style/
The Echonest stuff, done over the selected works of an artist could make for some interesting mashups of their work.
Someone should analyze why this song is so catchy.
Infinite Gangnam Style - Frequently Asked Questions
What is this?
- Infinite Gangnam Style is a web app that dynamically generates an ever changing and never ending version of the song 'Gangnam Style' by Psy.
It never stops?
- That's right. It will play forever.
How does it work?
- We use the Echo Nest analyzer to break the song into beats. We play the song beat by beat, but at every beat there's a chance that we will jump to a different part of song that happens to sound very similar to the current beat. For beat similarity we look at pitch, timbre, loudness, duration and the position of the beat within a bar.
How come this doesn't work in my browser?
- The app requires the Web Audio API, which is currently best supported in Chrome and Safari.
What does Psy think about this?
- I don't know. I hope he doesn't mind that we are using his music and images. We hope you check out his official video and his web site too (but really you probably already have).
Who made this?
- Paul Lamere, at Music Hack Day Reykjavik on October 28, 2012
Sorry, this app needs advanced web audio. Your browser doesn't support it. Try the latest version of Chrome
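The beat-jumping playback described in the FAQ can be sketched in a few lines (my own reconstruction, not the Echo Nest code; the feature vectors, threshold, and jump probability are all illustrative):

```python
import random

# Each beat is a feature vector: (pitch, timbre, loudness, duration, bar_pos)
def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def next_beat(i, beats, threshold=1.0, jump_prob=0.2, rng=random):
    """Usually advance to beat i+1; occasionally jump to a similar beat.
    Wrapping at the end means playback literally never stops."""
    if rng.random() < jump_prob:
        similar = [j for j, b in enumerate(beats)
                   if j != i and distance(beats[i], b) < threshold]
        if similar:
            return rng.choice(similar)
    return (i + 1) % len(beats)

beats = [(0, 0, 0, 0, 0), (5, 5, 5, 5, 5), (0.1, 0, 0, 0, 0)]
assert next_beat(0, beats, jump_prob=0.0) == 1   # no jump: just advance
assert next_beat(2, beats, jump_prob=0.0) == 0   # wrap around at the end
assert next_beat(0, beats, jump_prob=1.0) == 2   # only beat 2 is similar
```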
I also like the helpful visualization below that shows which part of the song it is currently using.
Anyone else experiencing this?
You would have to improve the program a little bit, but this concept being realized with a vast music library?
Sounds quite interesting...
Btw, quick bug report: it doesn't work for me if opened in a non-active tab in Chrome 22.0.1229.94 on Mac OS X 10.8.
Good fun. Now do an automated version where people can paste their YouTube links. Greetings, lx
Warning: if you watch it, the lyrics will get stuck in your head. http://www.facebook.com/photo.php?v=10101449851143489
> I still had a job, which made everything near impossible, that I couldn't afford to quit. I worked during the day as a report writer, snuck in emails and business calls for Altsie over my lunch, and worked late into the night to take care of hundreds of necessary details to keep the project going.
> Despite my downward physical spiral, I managed to marry the love of my life
I appreciate that people have lives too, but you just can't do two jobs and have a personal life. Sorry. Something has to give. I've read many tales where having just the startup put a strain on personal relationships.
I wonder what the situation was with the cofounders. How many were there? Were they full-time? If so, that could be a problem (in that they might end up feeling that they've gone "all in" when you haven't).
> Two years building and eight months running Altsie took its toll.
Two years to launch? I wonder how much quicker the launch would've been with full-time resources. For something that isn't hugely technically sophisticated (correct me if I'm wrong, but this doesn't sound like that kind of startup), that is (IMHO) too long. People talk about MVPs for a reason. You need to prove your idea and get feedback ASAP.
Whatever the case, eight months doesn't seem long enough to prove anything one way or the other.
I don't mean to be harsh so I apologize if it comes across that way. Lucas, good luck to you. I would suggest that when you wish to try your next venture (assuming you do), you do so when you can dedicate it to yourself full-time.
Lucas says "I put three years of my life into building and running Altsie,..." ... "As we approached launch last May" and "Two years building and eight months running "
What are the expectations for a business where you are asking people to integrate a new thing (going to a bar to catch an indie movie) into their lifestyle? A week? A month? A year? Five years? If you look at the restaurant business, most seem to require a 3-year 'boot' cycle: the first year nobody knows about them, but perhaps the local food critic tries them. The second year they have some foot traffic and perhaps get written up in a more widely distributed guide. Then the third year they have people coming who read about them in the guide or found them on their phone's 'maps' product, and they get to see how successful they are going to be. I can't imagine that any idea which requires people to change their behaviors in the real world could really be tested in less than a year.
The other thing that was sad to read was this bit, "I'd signed up to fight on the front lines. I still had a job, which made everything near impossible, that I couldn't afford to quit. I worked during the day as a report writer, snuck in emails and business calls for Altsie over my lunch, and worked late into the night to take care of hundreds of necessary details to keep the project going."
There is a reason YC and others ask you to quit your job if you're doing a startup. There isn't a lot of excess time. If you have a spouse or partner who can bring in enough income to pay the bills and maybe health care that is one thing, but being both the 'stable income source' and the primary mover of the new venture? Not a good idea as Lucas discovered.
Now the most important thing to do is to capture everything you learned into something you can use in the future. What worked? What didn't? How did you spend your time - could you have outsourced any of that? What were your costs, and how did you evaluate the business? What variables did you guess at? Did you guess high or low? People who have been through the wringer are twice as valuable as people who haven't done it yet, because they have a better idea of what they need to know to make forward progress.
I hope that Lucas' next venture is a lot less stressful on his health/psyche and much more satisfying overall.
It's a bit sad to see, especially because I believe that many people here know (or should know) how complex these topics are.
In my opinion, a great article. Thanks for sharing it so honestly.
Perhaps one of them has an idea to cut costs, or would like to open source the code, or can line up a buyer for the assets, or ... something.
Telling your stakeholders/investors/cofounders after you've pulled the trigger seems like the exact backwards way to do it.
I have no idea how good a business idea that is (I guess not such a great one), but it sounds like a great idea and I wish something like it could be successful. In my moderately sized UK city it's impossible or very difficult to see a large proportion of new releases on a big screen.
Definitely identify with gaining weight. It's brutal how quickly you can fall out of shape.
After playing basketball 6 times a week since college I barely get out once every three months. I'm 30 now and feel 40.
Aside from the up and down roller coaster ride, the hardest part for me has been balancing a relationship that began at roughly the same time that my co-founder and I went into business together. I have no idea how you could possibly balance anything else (like a real job) outside of a startup and a new relationship for an extended period of time.
There are times my relationship has been a distraction to our business. But well worth the juggling act :)
1. What pain does my idea solve?
2. Does it solve it for a large number of people?
3. Just how painful is it when it goes unsolved?
Do you know plenty of people who are in pain because they can't find a venue to watch an indie flick? Does not being able to find an indie flick at an appropriate venue eat at their thoughts 24/7? Are they going to go nuts finding a solution if you don't provide one? How much money would solving this problem be worth to them?
Admittedly I know diddly about Altsie, and I'm not one for indie flicks, but let's compare Altsie to Airbnb. Airbnb solves a basic human need: that of housing. How painful is it when you don't have a house? Immensely. How much money are you willing to pay for a roof over your head? Thousands per year. How many people are searching for your solution? A shitload. Now replace housing with "Indie Flick", and objectively recalculate.
After doing so, you might think three years is a long, loooong time investment, hugely out of proportion to the level of pain Altsie solves, not to mention the price of solving that pain.
I honestly don't think anyone understands what it really feels like to build a company until you do it. Before I started running my first startup, I thought that the hardship and mental anguish other people describe was somewhat like what I already experienced during hard times at other companies. It wasn't. You pour your heart and soul into a startup and push to the side your physical health, hobbies, family and basically everything else. Then after a year or more of doing everything possible to try and succeed, you potentially end up with nothing. Like Lucas says, you don't really end up with nothing, but it sure as hell feels like it at the time.
First, Altsie is a pretty awesome idea! I really like the idea of going to a bar to watch an indie movie, I'm sure producers would love to get their film shown, and bars want extra customers coming in. This is something that definitely could have worked.
Second, the technology behind this product is trivial, a 2 year build is a huge warning sign. I cannot find on the site or in this description anything that should be hard to put together, and the fact that Lucas spent a few years building this in his spare time instead of hiring someone to do it in a (few) week(s) shows a dangerous prioritization of money over time.
Third, it takes a strong presence of mind (or maybe just good communication with your partner) to realize that what you're doing isn't making you happy. Kudos on letting it go.
UPDATE: Must have been a bug. It has now been fixed.
And the quality of the actual video isn't even HD?
PS: Already a down vote. Be man or woman enough to state your case.
My favorite speaker probably was Joel Spolsky (and his slow, organic growth vs land-grab talk).
I love how Joel used Fog Creek to fund StackExchange's development and now Trello, which both seem to be land-grab businesses. It's almost like Fog Creek is its own startup incubator now. Maybe a new model of funding/startups?
This recording will never be available? I would like to watch his talk...
I felt a recurring theme was "don't give up"... so I'll really try to remember that lesson when I hit future roadblocks.
I enjoyed attending and meeting some of you in person. Definitely looking forward to next year's edition!
NOOO! This talk was fabulous!
I had to strain to hear what the speakers were saying.
Was the volume OK for those on the main level?
I find it hard watching talks where only slides got recorded or others where only the speaker gets recorded.
For the latter, I'll normally download slides and use them to move along with talk.
I wish I had something more substantive to say here, but the problem is that we give Facebook an extraordinarily huge power in our personal lives. It's not just some random web service.
I was impressed with the account recovery process ("you entered an old password -- do you want to recover your account?"), but I felt like they were completely optimized for recovery versus preventing the intrusion in the first place (ala Google's two-factor auth).
Anyway, in this case they obviously took the wrong approach with the blogger and I hope it blows up in their faces. (Microsoft and everyone else used to not be nice to security researchers, Facebook will no doubt learn that cooperation is a better strategy too).
By using an app you are giving them access to a whole bunch of your personal information. I always assumed that many were scraping data from my profile. This is why I have never used Facebook for authentication.
When I read the original post I figured Facebook would want the data so they could narrow down who the probable culprit is. I would have thought finding a common app among a million users probably wouldn't be too difficult.
That said the nature of this conversation is ridiculous.
On the other hand, they are trying to solve this issue secretly, with no disclosure. And we don't yet know if they are taking any privacy measures to prevent this kind of data leak.
By reacting like that, I think Facebook can be considered as guilty as charged.
I would suggest just posting once a day, and using the Promoted Posts for the occasional big news that you want to make sure everyone reads.
Facebook pages isn't a panacea for brands or publishers -- not by a long shot. That panacea is one of those Frighteningly Ambitious Startup Ideas.
Uh huh. "Mom & Pop business" seems to be the new "won't somebody please think of the children" line designed to extinguish all rational thought. I'm getting a little tired of it.
(I'll save my rant on why I think most Mom & Pop businesses should be out of business for another day. I have to say I'm amused when I see a restaurant in my neighborhood apply a bunch of signs that say "absolutely no laptop use" and then go out of business a month later. Idealism is a bitch.)
You can STILL see posts of your favorite bands by going to their pages, which is how you used to have to find updates: by checking for them. The Newsfeed is new, and it's not a right.
Actually? Quite a few. I despise Comcast. I despise the big-4 cell phone companies (Verizon, AT&T, T-Mobile, Sprint). I despise the oil companies (BP, Chevron, Texaco, et al.). Notice a pattern? Despite my (and presumably many others') despising these companies, they are all enormously profitable. I think Facebook has got to the state where they at least think they have a monopoly on their users' social graphs and are willing to raise access prices sky-high. I'm not surprised it happened. I'm surprised it took this long.
I'd be angry if I'd given Facebook money under the old system only for them to change the value of what I got from them. The basic takeaway is that the old rules, under which I might be willing to pay $2 for a like (a person who likes your page sees your posts), had to be changed because there wasn't that much user attention in existence. Now a "Like" has been inflated to be worth about a tenth as many views, and views are what you were actually buying; Facebook just called it a "Like", and it somehow means something completely different now.
I guess the moral of the story is don't invest in anything whose value can be arbitrarily changed by someone else.
You built a business inside someone's shopping mall, they started charging rent, so you complain. And at $4 CPK for promoted posts, you'll find FB advertising to be slightly cheaper.
CPK aka CPM, i.e. cost per 1,000 views. Calculated from: to reach 100% of our 50k+ Facebook fans they'd charge us $200 per post. Edit: $200 / 50 = $4, thanks Ryan.
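That footnote's arithmetic can be sketched as a one-liner (the $200-per-post price and 50k fan count are the figures from this thread; the helper name is just for illustration):

```python
def cpm(cost_per_post: float, audience_size: int) -> float:
    """Cost per 1,000 views, assuming every fan actually sees the post."""
    return cost_per_post / (audience_size / 1000)

# $200 per post to reach all 50,000 fans:
print(cpm(200, 50_000))  # → 4.0
```

In other words, $200 divided by 50 thousands-of-fans gives the $4 CPM quoted above.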
What really frustrates me is that I'm missing entirely non-commercial messages from my actual friends. I've missed posts from my girlfriend, for God's sake; it's ridiculous.
I understand that they need to make money, but the entire reason I and others are on facebook is to connect with our friends. Facebook needs to allow us to do that and then augment our experience with monied options, not imply that most of your friends will never see your posts unless you open up.
Don't make me go back to email. It's still there, waiting, full of delicious SMTP guaranteed delivery.
Another in a long, long list of customers whose plans fall apart when a free or one-price-for-life service realizes it cannot continue with business as usual. Today's pro tip: Do not build your livelihood around a third-party's free service. Eventually that service will either 1.) shut down, 2.) kick you out of their ecosystem, or 3.) start charging you.
I'm not sure what is more surprising: that people continue to build businesses with these Achilles heels or that they seem shocked when the third-party changes the game.
Facebook: Oh, definitely. Just have a look at your NewsFeed and see what they're doing.
User: Wait, I've got 2000 friends. Why am I only getting a NewsFeed post twice an hour?
Facebook: Because we decided that's the information that you're most likely to want.
User: But what if I want to know what everyone's doing at any specific moment?
Twitter: Can I be of assistance?
User: Oh, hello Twitter.
He wants control of his fans, his like-ees. Not his (Facebook) "friends". Most of us know that is not a bug but a feature.
Now, the fact that Facebook makes it hard to share one's email address with one's own real Facebook friends is annoying and something to complain about. But trying to leverage that to complain about not being able to push your feed is problematic. This is exactly what we use Facebook for: an experience where you aren't bombarded with everyone's BS.
This is why I don't like many pages, and it's why FB needs clear and easy to use controls for what does or doesn't show up on my wall.
I only have 300,000 likes too. ;-) Basically, the trick is engagement. Give the audience what they want, when they want. Timing matters, pictures matter. Do it right, and you don't need to pay anything.
P.S. Making money from advertisements, pfft how ancient and boring! shamelessplug use Teespring instead.
That perspective actually gives me increased hope for Tent (https://tent.io), the decentralized social networking protocol that could one day be a Facebook alternative. When Tent was announced here on HN, a common criticism was that if you're popular, and you host your Tent server yourself, you end up paying a lot for the bandwidth cost of sending each post to thousands or millions of followers. Whereas the perception is that on a centralized social network you can send a post to millions of followers for free.
For now, that's still the case on Twitter, but on Facebook, apparently not. If you really want significant reach, you pay to publish even to people who already (by liking) signed up to follow you. So the situations aren't actually that different. I guess there really is no free lunch.
"See what we can do for you? See the traffic we can drive and link to you? Want more? Choose your level of traffic, choose your price."
The article makes the assumption that 3rd-party businesses that have been suckling at the teat of the social graph are the value to the facebook users. They're not. The users, the actual people are - businesses are just there to help pay for the whole thing, and follow the personal users. I say this as a business owner who uses facebook heavily, and occasionally pays them for the right to get a little bit back out of them.
I've yet to see a single person in my timeline say "I'd stop coming to facebook if all of these businesses didn't have pages here."
Facebook has a level of PR software-as-a-service which is free. They have another which is premium. If a company wants to spam their "fans," they have to pay.
If a business wants to have a high level of control over communications with its fans, customers, likers, or whatever they are called, there's no free lunch. Either pay a third party (e.g. Facebook) or invest the hard work.
Using your blog or whatever to make specious (I assume) arguments about what someone else should/should not be doing with their business is your prerogative. Just don't expect people to actually listen to what you're saying while you beat them over the head with ads for trucks and cooking shows.
Again, I didn't read the whole thing, or even half before I bailed. But am I wrong in assuming this site uses the popular activity of Facebashing(tm) as a ploy to shove ads at unsuspecting visitors?
His conclusion? Not Facebook
I run a nonprofit alumni association here in Boston and I use FB as a way to update alumni of changes in events so that we can limit the numbers of emails we send. We were using Facebook as sort of an information platform and don't profit or make any money in any way.
I am very careful to not post too much, even entering in to specific agreements with the national alumni association so that they do not to post ads on our page for their merchandise etc.
What am I supposed to do now? Should I pay out of my pocket to reach users who definitely want to be reached already?
Facebook provides a great service, and they should be compensated, but I will now have to look at other options to potentially reach our group.
---And the flip side of this is that I would like to see posts from everyone I am friends with that I haven't explicitly blocked from my feed, going through all those names to re-add them seems like an amazing amount of trouble for me.
---The OP is hard to sympathize with, but he/she has a good point.
As a user, if my friends post something I want to see it. If my daughter's karate school or my favorite band posts something, I want to see it. If they're spammy, I'll unsubscribe. I would like to make this decision for myself, not have it made for me. If it has to be made for me, I would prefer it be made based on some approximation of relevance and quality, not because someone paid $5 to spam me with it.
As an advertiser, Facebook has consistently promoted ads as a way to build a following via the 'like' button. So I pay Facebook to gain exposure to build a following of 10,000 fans and now I have to pay again if I want to reach them all?? Classic bait and switch. I wonder how many past advertisers would have paid to build up their 'likes' if they had been told very clearly up front "Just because someone likes your page does not mean they will see your posts in their news feed".
The ad-supported model is terrible for social networks and needs to go. If you can afford a computer, smartphone, etc., then you can pay $5-20/year for an account.
Free limited accounts for people under 18 years old, which have limited access to adult content? (Just an idea, but it may work both to hook future customers and to protect kids.)
What follows is speculation, but it's easy to imagine that out of a total fanbase, only a certain percentage "catch" your post while it's fresh, before it's buried behind newer stuff coming in from the ever-increasing number of pages people like. While it may have been the case that back in the day the response one got from posting something on a facebook page was much better than it is now, it's also true that facebook was never as popular as it is today and that users' newsfeeds were never as busy as they are now. And as people subscribe to multiple publishers and their attention gets diluted, you can't expect their engagement with all of these pages to remain at pre-growth levels (or grow).
There's another twist to this. Too many posts from pages trumping activity from friends may alienate users. How do you balance these two types of information? Someone's going to get less airtime, and since (I assume) the bulk of posts comes from pages, they get silenced based on whether or not you interacted with them recently and whatever other criteria facebook can come up with. Same for friends you don't care much for.
Whether or not facebook can be more transparent with regards to how it determines which posts to show and which to hide is another issue. Does the average Joe care? Will he mess things up if given controls that are too advanced? Note that Facebook doesn't censor information, it merely filters what you see by default. You can still go to individual pages or profiles and see their full activity.
There also seems to be a backlash against any commercial endeavour facebook may have. "Facebook is selling your information!" - is it? where can I buy this information? is it really selling in the sense that most people would understand? No. But that's the term that is being used. "Facebook is making people pay for airtime!" well, kinda. Personally I think that should be "Facebook is making people pay for ADDITIONAL airtime" for all the reasons stated above. Maybe they got into this mess due to poor communication but I don't buy the "broken on purpose" argument. That's against facebook's interest in the long term.
I don't mean to defend facebook, just bring into discussion the potential complexities behind developments which people tend to imply are malicious.
x) Disallow users from merely being a fan of the page, instead replacing that with "like"
x) Now make it so businesses can post to their page and the post shows up in the newsfeed for those who like the page. Previously only friend updates were shown. So liking a page has the side effect of getting spammed by the company.
x) Facebook has now successfully facilitated spam, which is necessary for
x) Their new spam-prevention algorithm, leading to the end goal:
x) Now that Facebook has facilitated spam and we accept limited posts, the antispam filter can be circumvented by paying Facebook.
Voila, Facebook is now the post office, and spammers pay the post office to bulk spam you. Imagine if you went to local businesses and said, "Hi, I like you guys", that resulted in spam to your snailmail mailbox. You said, "Cut that out, that's wrong." So they fixed the problem they created, but now that the businesses are hooked, they can charge them for the ability to send out spam.
Facebook could easily make it so users are in charge of their filter, but this is counter to how Facebook wants to make money, so the UI for this is horrid and no one does it in practice. Imagine a UI where users rank friends in order of importance, with an easy interface, and my most important friends are the ones I am most likely to see. Oh wait, I have just described g+. Facebook will never have such an intuitive interface ("close friend" is horrid), where the burden of filtering is put on the user. Facebook wants to control that filter.
Eventually it will get to the point where you don't even need to like a page, you will get spam from the highest bidder, decided by auction. One of the main purposes of 'like' was to get users accepting communication from companies, once that was done, then they went in to monetize the link, before that it was just friend to friend chit chat, which doesn't pay the bills.
But who knows what special sauce is in FB algos. If I were them I would certainly distinguish between companies, news/blogging, musician/art and image macro posters. Those all have very different usages and annoyance levels.
Probably the interaction rate is factored in, but that also gets spread thinner and thinner. Obviously God and George Takei are winning the game, so the game isn't unwinnable.
FB's job is to keep the average user (who won't put much effort into sanitizing their wall whatever they clicked on in the past) happy while getting enough money out of their userbase as a whole to stay in business and keep the stockholders happy. It's not their job to keep the promotors who use FB as a tool happy.
That's about my consumption. On the other end, I have a friend, an artist with 5,000+ friends. He told me that the engagement on his posts dropped drastically, from like 200-300 'likes' per photo to something like 20 earlier this year, and as such he's considering not bothering to use the site any longer. Apparently Facebook thinks those people aren't interested in his content? Or they want him to start paying. That isn't going to happen.
I will say, if your posts show up so frequently in my stream, I will unlike your page. Facebook is definitely saving you from a lot of unlikes. Facebook is not Twitter - it's baby pictures from your friends.
I trust Facebook to control what to display to me MORE than I trust advertisers to post only things I would be interested in. That they can pay money ($200?) to get it there, that filters it too. They'll only pay for interesting stuff presumably. So thank the Lord Facebook pages don't get to control my stream directly.
The story here is now that Facebook is willing to be paid by brands to degrade the news feed experience for their users :)
IIRC there was previously a "see only important messages from this person" choice and it was better.
Facebook is a company, it's not a democracy asking their users what they should do. They can destroy their business if the want to, and your responsibility as a customer is go and look somewhere else to signify that their new rules do not work for you anymore.
1. Advertisers
2. Real users
3. Social media marketing scum
If they think most users would prefer not to see 10 posts per day after accidentally clicking a like button, then they're probably going to do that.
Do we as business and individuals really want to pay to promote our content AND be sold to advertisers AND build their network at the same time?
A few years ago Facebook had a feature where you could weight your friends from 1-10 and that would affect your feed. Now you can just limit by "only important updates" and such. It's not really clear what that even means.
This isn't Facebook scamming you - it's simply that 100% of your fans don't check 100% of your posts 100% of the time.
For example, if my Crossfit box posts a new WOD everyday, I would greatly prefer to have that in my news feed rather than having to go search out the fan page again. I could have just gone to their actual web site.
It would be very nice if you could use the search box to search on your news feed posts. If I could quickly do a search for the Crossfit box and get to the daily post.... awesomesauce!
$75 for a 17-30K user reach is $0.0044 per user or less.
I actually think that's a good deal if you're announcing a new product or important product update.
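The per-user figure above works out as a quick sanity check (the $75 price and the 17-30K reach are the numbers from this comment; the function name is just for illustration):

```python
def cost_per_user(total_cost: float, reach: int) -> float:
    """Cost per reached user for a flat-priced promoted post."""
    return total_cost / reach

# $75 promoted post reaching between 17,000 and 30,000 users:
print(round(cost_per_user(75, 17_000), 4))  # → 0.0044
print(round(cost_per_user(75, 30_000), 4))  # → 0.0025
```

Even at the low end of the reach estimate, the cost stays under half a cent per user, which is where the "$0.0044 per user or less" figure comes from.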
Can't really do anything here other than sigh and shake head.
Reasonable? No way.
Best part is, the only way to change this is to shut it off for each individual friend - not exactly convenient.
I dropped out of a CS program after first year. I was the classic case of a student who had always been told he was brilliant, so I never worked very hard. In high school, I coasted along simply on a fantastic memory, often 'studying' for the final exams that determine graduation the night before. I never learned how to learn.
Going to college was like being thrown into a bath of cold water. I had never been particularly conscientious, so being in an environment where I was now responsible for my learning was new to me. I skipped lectures, forgot homework that was due, turned in coursework late; the usual suspects. On raw talent though, I qualified for 2nd year, only failing Pre-Calculus. (I skipped the classes and tried to learn math from 1st principles. Ugh...)
I got a summer job at a small telecom startup. By the time 2nd year rolled around, my student loan was denied, so I dropped out. I'd always hated school, so I didn't care. I never applied for a leave of absence, nothing. I just didn't show up in September. That was 2006.
I was 20 then. I'm 26 now. I've had a lot of time (6 years!) to reflect on why I did so poorly despite being talented (not being conceited; my lecturers in 1st year said as much). There are quite a few reasons; but the major one is that I didn't know how to learn. So if something didn't immediately click, I'd give up in frustration, and decry the teacher as an idiot who couldn't teach (oftentimes true; but irrelevant). I didn't know there was another way.
Being around HN and places like LessWrong, which expose you to so many thought-leaders, brought about some interesting side-effects, which culminated earlier this year. Upon reading an article on LW entitled "Humans are not automatically strategic", which was a reply to a Sebastian Marshall article, "A failure to evaluate return on time fallacy", I had an epiphany that being systematic about things was the route to accomplishing great things. "Rationalists should win", the LW meme goes, and it's correct. I came to realize that for every goal, there exists an efficient path to achieve it. My task was to find that path, and execute ruthlessly upon it.
Since then I've made leaps and bounds in my personal development. I still slack off sometimes, but I won't fall into my old perfectionist way of thinking that I'm a failure. It's better to be 80% there than 0%.
I made the decision a few weeks ago to get my CS degree, albeit at a different, larger university. Since then, I've been devouring articles like this one. I recently bought two of Cal's books and sometimes want to slap myself when I realize that if I had had this knowledge, and the discipline to implement it, 6 years ago, my life would be so much better. But c'est la vie. These articles on meta-learning are priceless.
So if you're in school now, or are going soon, pay attention to articles like these. Here are a few gems I've dug up recently: http://news.ycombinator.com/item?id=3427762
Thanks to knowledge like this from Cal Newport and others, I'm going back to college full-time as someone with an above-average cognitive toolset and a myriad of experiences that will serve me well. I'm much more sociable, have a great eye for design (having moonlighted as a freelancer some years back), and now know how to engage my lecturers on an adult level rather than as the kid I was 6 years ago. I'm going for a 4.3 GPA. I'm tempted to say wish me luck, but with tools like these, I'll make my own luck.
This rationalist will win.
PS If y'all have more articles like this, let me know. If you wanna chat privately, email's in profile.
EDIT: formatting; clarity
I have ever seen. The shtick is getting old. Gee-whiz posts about a dilettante ramping up to a beginner's knowledge of a subject with little time and effort have nothing to do with the really challenging learning tasks in this world.
I'll be impressed when I see a headline like "Middle East diplomatic issues resolved by undergraduate who completed one course in international relations" or something like that. Show me someone who has solved a genuinely hard problem before proclaiming a new breakthrough in learning. For a refreshing change of pace from the usual blog post on quick-and-dirty learning, see Peter Norvig's "Teach Yourself Programming in Ten Years"
or Terence Tao's "Does one have to be a genius to do maths?"
for descriptions of the process of real learning of genuinely challenging subjects.
Based on that test, I think the title is link-bait as it isn't "mastering linear algebra" but "passing an introductory algebra course."
As an aside, I've never heard it called the "Feynman Techniques." However, one of my favorite things in the world is the so-called "Feynman's Algorithm": (1) Write down the problem. (2) Think very hard. (3) Write down the answer. I just find it hilarious, but I digress.
There are two points of his with which I agree 100%.
Firstly, the process of writing a short summary paragraph of what you just read after reading a chapter or big section of a technical book. There is actually a fantastic book -- maybe one of my favorites of all time -- called, somewhat strangely, How to Read a Book. It's all about very active reading over passive, almost to the point of having a "conversation" with the text you're reading.
Ever since reading that book, I've gotten into the habit of writing a summary of each thing that I read. It really forces you to confront whether or not you "got" the point of what the book is saying. I usually find that there are quite a few bits that I either missed, or didn't quite understand, at which point I go through and search for the pieces I'm missing.
Secondly, looking at all of the low-level pieces to understand the whole. This is something Salman Khan of the Khan Academy talks about in (I believe it was) his TED presentation. Quite often, I find that there is some early concept that I glossed over which is slowing my understanding of the current material significantly. Doing this forces me to be 'honest' with myself about the state of my current understanding, which was kind of hard at first when I took this new approach to learning. So much of my 'ego' seems to be unfortunately wrapped up in 'what I know,' and thus I convince myself incorrectly that I do understand something, even when I don't, just because it's something that I "should" already know. Admitting to myself that I didn't understand, for instance, some basic math concept that I should have learned in high school was somewhat difficult, as odd as that may sound. I suppose I have a fragile ego! But sometimes, getting a good grasp on my modern coursework meant stopping what I was doing and going back a couple of levels to start at the beginning.
The question of "What do I need to know in order to understand this" is, I find, an extraordinarily powerful one.
I absolutely believe what he writes, because he's quite precise about his experiment and how he did it and this really works for a couple of reasons:
* This guy isn't 20 anymore. He has actually explored, learned, and trained "productivity and focus", which he blogs and writes books about - so he doesn't start like an 18-year-old fresh out of school, perhaps inexperienced at this level of focus and discipline.
* He was pragmatic in his goals - very much so. He didn't write "becoming the world's foremost expert in linear algebra" but "passing an exam". And so he did. He also didn't write "passing everything with a top grade" but "just pass, if better - wonderful".
* He actually did his math on "hours to put in" - a semester doesn't take a full 6 months; you usually don't attend lectures/lab 3 hours every day, but 1-2 times a week, 2 (university) hours plus preparation. If you carefully add this up, you actually get a surprisingly low count of actual course/lesson hours.
* Taking a course in a focused manner is actually quite efficient and helps you (at least it does for me) follow the material without interruptions. You can also repeat as often as you like (he mentions a fast-forward and replay button in his TEDx talk) - which, by the way, accounts for part of the success of e.g. Khan Academy material.
* He also put some effort and training into the right way of learning and _that_ pays off massively in terms of speed.
Also, one of the points he is actually making is part of what most of you criticize: going through the list of MIT requirements is something different from "becoming an expert in X" - don't mix them up.
Would be more compelling if he was not selling books. Nothing wrong with making a profit but I'm just saying...
For maths-heavy subjects, I'm not really inclined to believe that traditional exams are the best way to assess a student's knowledge and understanding of the material (especially with regard to rote memorisation). Exams in such subjects haven't changed fundamentally in many decades, even though we now have lots of new things we could do with them.
For instance: do more with computers - like getting the students to solve real-world, many-tentacled, hairy problems by numerical methods, rather than giving them some carefully pruned equation that just happens to have nice analytical solutions. Or introduce more computer-assisted mathematical modelling (e.g. use classical mechanics, to start with). Or on the pure front, teach students to write or at least understand some interesting automated theorem prover.
Stuff like that.
I suspect that traditional exams have survived simply because they serve their purpose: a percentage of exam-takers fail the exam (which allows the exam-setters to claim that their standards of assessment are rigorous), and a fair percentage will pass the exam, some with flying colours. Whether or not the actual learning goal was achieved has not been determined, since the exam is deemed to be the only instrument that can measure that.
When I got into university I found every course very easy, didn't attend any lectures, got all my workshops to run on the same day to reduce my face time and maxed out my free time to do whatever I wanted (work/friends/extra/etc). I'm a STEM major at a top 30 world ranked engineering school with good grades.
I've often asked if I could max out my classes and finish a degree within a year and a half - but I've never been allowed to skip more than a few subjects (tests/bugging the heads of departments).
University shouldn't be time capped or subject load restricted - people should be allowed to do as many as they wish - or you'll find more and more moving towards MOOCs instead.
Not something I would ever want to repeat, and these were first-year-level courses. Basically I was doing a correspondence 3-year degree while working full time. I got heavily involved in my work and decided that I wouldn't continue studying. Then, with about 4 weeks to go before the 2-week final exams period, I thought, what the heck, let's give it a shot...
Amazing what focus and hard work can achieve!
It's true that I didn't attend a lot of classes (since they all overlapped anyway), and had 2-3 exams virtually every week. The only issue I see is that there is only so much you can do online. I also did the same thing with Chemistry and Biology, which had lots of laboratory classes, and I don't see how one could gain the practical experience of putting knowledge to work in those fields without a wet-lab class. EECS, however, is amenable to this (for the most part - an optics laboratory would likely be hard), and most of my EECS labs were really done in Athena clusters rather than a distinct laboratory.
He is, however, a master of self-marketing:
"To find out more about this, join Scott's newsletter and you'll get a free copy of his rapid learning ebook (and a set of detailed case studies of how other learners have used these techniques)."
This is a useful technique, giving motivation and focus. Though imperfect: it can't detect incorrect understandings that seem consistent. But to be fair, that's a tricky case.
> That works out to around 1 course every 1.5 weeks
WTF? What kind of university imposes that you take only one course at any given time? It's not just linkbait, it starts from a wrong assumption. When you take many related courses simultaneously, you see the pieces meshing together and that helps learning. That's different from taking them in a serial manner.
Let's ignore the discussion about dynamic range and bit depth etc., and assume that the volume control on your operating system controls the DAC rather than doing the stupid thing of digital volume reduction. The fundamental issue is signal to noise ratio on the analog line. If you turn the volume too far down on the computer and turn the volume up on your speakers, the sound on the analog line is too low with regard to the electrical noise and will be hissy. If you turn the volume up too much on the computer and turn the volume down on your speakers, then the signal will be so loud as to produce distortion either in the DAC or on the line itself. You're looking for a middle ground: as loud an output from the computer that you can produce without causing distortion in your loudest music parts. Once you've got that set, change the volume on the speakers to compensate.
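The bit-depth side of this can be illustrated with a quick simulation. This is entirely my own sketch (it assumes a 16-bit integer sample path, which the parent comment deliberately set aside): quantize a sine wave at full scale versus after heavy digital attenuation, then compare the resulting signal-to-noise ratios.

```python
import math

def quantize(x, bits=16):
    # Round a sample in [-1, 1] to the nearest representable integer level.
    levels = 2 ** (bits - 1)
    return round(x * levels) / levels

def snr_db(signal, approximation):
    # Ratio of signal power to quantization-error power, in decibels.
    noise = [s - a for s, a in zip(signal, approximation)]
    p_signal = sum(s * s for s in signal)
    p_noise = sum(n * n for n in noise)
    return 10 * math.log10(p_signal / p_noise)

# One second of a 440 Hz sine at 48 kHz.
sine = [math.sin(2 * math.pi * 440 * t / 48000) for t in range(48000)]

full = [quantize(s) for s in sine]               # software volume at 100%
quiet = [quantize(s / 256) * 256 for s in sine]  # ~48 dB of digital attenuation

print(snr_db(sine, full))   # roughly 98 dB: close to the 16-bit limit
print(snr_db(sine, quiet))  # roughly 50 dB: 8 bits of resolution discarded
```

Turning an analog knob down instead leaves all 16 bits in use, which is exactly the middle ground the comment describes.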
(A) Pretend that everything except the DAC was noiseless: The noise would be due to the nonlinearities and quantization in the DAC.
(B) Pretend that the DAC was perfect: The noise would be dominated by the noise-equivalent input-power introduced by the resistance present in the components (including the transistors used for amps).
In short: (A) is a function of how wide the range of bitcodes that you use. The smaller the range, the larger the noise component relative to the signal.
OTOH: (B) is a function of temperature: All of the noise power before the final dial to your amp is passed through as is the signal, so the ratio stays constant. There is also a constant noise power introduced after that final amp, but I would guess it is negligible compared to the amplified noise power.
So tl;dr = For a decent sound card, maximize the software volume and then use the analog dial.
Assuming this is true, the correct option would be to maximize any application volumes (e.g. YouTube), to maximize master volume to a level just below the sound clips (distorts) at the amplifier input, and to reduce the amplifier's pre-gain (if it has any) so the master volume control has a reasonable range.
This method will minimize the three (not just one) culprits of poor computer audio quality: quantization at the application layer, electronic interference over the physical connection, and clipping at the pre-amp.
On the PC, though, I rarely set my system volume to anything other than 100%.
Max your software (usually this is 80% to prevent clipping and distortion), then attenuate speakers to 50% (analog boost is much worse than digital as it raises the noise floor).
Source: Mixing at studios for last 10 years
This is really only true when the audio system represents samples as integers rather than floats, as CoreAudio does.
You can see the objective differences between 16-bit and 24-bit output in NwAvGuy's measurements of the 2011 MacBook Air's DAC: http://nwavguy.blogspot.com/2011/12/apple-macbook-air-5g.htm...
One of my 'weird unverified theories of life' is that turning the volume on portable device down (laptop/phone/mp3 player) and the volume on the speakers up saves the battery of the device itself. (For example when you're in a car.)
For example if you built the YouTube player, what makes you think you need a volume control?
As far as I can tell, I rarely if ever have this problem with the same hardware in Linux with PulseAudio (though I can intentionally cause it using alsamixer by pushing "Master" to 100%) and didn't have this problem in the past on Windows with Creative Labs soundblaster cards.
For the case where an analog potentiometer immediately follows the DAC, of course, there's no practical difference.
Not all machines work this way, though. One way to check is to hook up an external amp and headphones, turn the computer's volume way down, and turn the amp up to listening levels. If the quality is crap, then it's probably just decreasing the bit depth. Or you can do a teardown of the sound pathway.
(Oh, if it isn't clear by this point, keep all your apps turned all the way up for best quality. Only turn them down on an individual, as-needed basis. All-software stuff has to decrease bit depth to decrease volume on a per-app basis.)
Volume should always be controlled as close to the source as possible. Anything else is simply inefficient and a waste of processing power.
There is no reduction of bit depth. Total hoo-eee.
I've been in one of the schools when they have the after school club and it's amazing how much the kids get really quickly. They're making their own games without needing any help from the assistants, the drag and drop element of Scratch makes it a lot easier than getting syntax wrong and getting frustrated. Some of the kids love it so much that they're disappointed when it's half term and they can't do it that week. One kid now wants to be "a programmer or stuntman" when he grows up.
They're in around 300 schools in the UK now and have roughly 15 children per club, so that's an extra 4000+ children in the UK learning to code each week.
Disclaimer: I help out Code Club and develop their site
She showed the children a Python program with a while loop, and says they "got it". I've tried explaining iteration to a (bright) seven year-old by using indented text and they found it hard to comprehend, but the equivalent in a graphical lego programming environment was obvious to them.
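For reference, the kind of Python while loop she might have shown is tiny. This is my own guess at the example, not hers:

```python
# Count down from 10 - the classic first-loop demo.
count = 10
while count > 0:
    print(count)
    count = count - 1
print("Blast off!")
```

The interesting contrast is that the same five lines in a blocks-based environment like Scratch carry no indentation or syntax to trip over, which may be why the graphical version clicked first.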
Kids today (both male and female) grow up with so much technology around them. My bet is that this will drastically influence the number of women entering technology focused career paths in the coming years.
What struck me the most was the sheer number of questions I got. 4 or 5 hands in the air the whole time when I was answering questions. A lot of smart questions and comments. Very intense and high energy. Contrast that to giving a talk to adults - usually there a lot fewer questions.
Overall it was a great experience, and I recommend it if you have the opportunity.
One of the things I tried to impress upon the kids is to look at where the jobs are, and what they pay. I don't think that's emphasized nearly as much as it ought to be. For instance, prior to making the switch to full-time development last year, I was working as an editor at a newspaper. I loved it, and I was good at it ... but the newspaper industry was (and still is) in the tank, and there was very little job security. And, of course, there's an oversupply of people with journalism degrees, so the wages aren't much to write home about.
I told them I wasn't trying to talk them out of pursuing a highly competitive, not-so-highly-paying career. But I think students should know, going in, what they're getting themselves into.
I have no real affection for scratch, but I feel that he was making the argument that children should learn to program in an environment that models (at least to a point) the one in which a developer develops in, at least with regard to language preference.
I feel at this point, the language choice (barring ease of use etc) is pointless. Whether you use scratch, python or Haskell, if it piques the interest of a child, then nothing will stand in the way if that student wants to go on and learn every programming language available.
If you think of the first language you ever learned, and what you are now programming in. For me, my first language (a type of kiddy basic) gave me what I needed. A concept of execution flow. How to make things come up on the screen, basic 2d programming and it made it very easy to make some GUI based stuff.
My point is that don't hate any language (even if it is a fake language like Scratch) if it builds the initial building blocks in a child (or adults) head.
Ha! My inner child feels somewhat vindicated.
The school where I visit is really average, some rich kids some poor kids, all kinds of backgrounds. The format of this career day is that each class period somebody will come and talk to the class that is somewhat related to the subject - so I usually end up speaking to a math or computer class. In a class of 25, there are probably one or two kids who already know some limited programming (or have made a website). Almost everybody that age is online (all Facebook, a handful of Twitter) and plays console video games. Probably about half have cell phones.
When they ask me questions, it's usually about how to steal their friends' Facebook passwords, conceal their browsing history, or build their own video game. I do spend some time talking about privacy, reminding them that their behavior online can stay around forever and that they should be careful who they are talking to online.
The terms don't vary much by district; they vary by age. Kids younger than these use the term "number" to mean positive, decimal, integral numerals. That's all they know.
Kids at this age are introduced to some new distinctions: fraction vs. whole, negative vs. positive, and decimal fraction vs. common fraction. At that point, they will use the term "whole number" to mean not some type of fraction and "decimal" to mean a number that uses this nifty, new fractional notation that has digits on the right side of the decimal point.
A few more years pass, and they no longer see "fractions and decimals" but just "numbers." At that point, they switch over to referring to integers and real numbers (with no emphasis on exactly how a fraction is represented), and if they begin working with binary numbers, they'll use the same term, "decimal", to make the distinction of base, not type of fraction notation.
The term "float" is not a mathematical term. Many older math professors don't know it. It is a tech term for a form of storage and display of approximations of real numbers.
These terms are not regionalisms; they represent the distinctions being made by the students at their stage of development.
Wait another 2-3 years, and you will be their new hire. :)
Seriously, I wouldn't think of them as 'juniors' or 'new hires', that will be only a very short temporary state. Think of them as your future colleagues, competitors, hacker friends, fellow tax payers.
Great article though!
I'd recommend it for any technical parent - what you do is definitely cooler than being a lawyer. =P
Just like trying to give the computer the same input over and over again. I find this hilarious; kids are the best.
How is this person a "step above" a nutritionist?
Now, there is even nothing strictly wrong about ignoring research like that. It's just annoying how they revel in ignoring all recent progress in the field.
Time for honesty: What bullshit.
- Go codebases by non-experts are peppered with magical incantations (sleeps, etc.) to avoid the dreaded "all goroutines are asleep" deadlock. Of course "they are doing it wrong", but that is the germinal point.
- A concurrent Go program will likely behave differently given 2 bits (just 2 lousy bits) of difference in the object binary (runtime.GOMAXPROCS(1) vs runtime.GOMAXPROCS(2)). Imagine someone touching those 2 bits in a "large codebase". It is practically impossible to do the same thing in a large Java codebase and fundamentally change the program's runtime behavior. (Happens all the time in Go.)
- It is very difficult to reason about a Go routine's behavior in a "large codebase" without global view and a mental model of the dynamic system e.g. which go routine is doing what and who is blocking and who is not. Pretty much defeats the entire point of "simple" concurrency, to say nothing of "scaling". Programming in Go's variant of cooperative multithreading is actually more demanding than preemptive multithreading. Cute little concurrency pet tricks aside, Go concurrent programming actually requires expert level experience. "You are doing it wrong". Of course. Point.
- There is nothing, absolutely nothing, that you can do in Go that you can not do via libraries in Java. Sure, the cute syntactic go func() needs to be replaced with method calls to the excellent java.util.concurrent constructs, but the benefits -- high performance, explicit-no-magic-code -- outweigh the cute factor in this "programmer's" book.
- On the other hand, there are plenty of things you can do in Java that are simply impossible to do in Go.
- Once we factor in the possibility of bytecode engineering, Java is simply in another, higher league as far as language capabilities are concerned. (Most people who rag on Java are clearly dilettante Java programmers.)
If Go actually manages to be as effective as Java for concurrent programming at some point in the future (when they fix the somewhat broken runtime) then the Go authors are permitted to crow about it. Until that day, go fix_the_runtime() and defer bs().
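The "you can do it via libraries" claim above can be sketched in any language with threads and blocking queues; here is my own minimal Python analogue (Java's java.util.concurrent BlockingQueue plays the same role there). A "goroutine plus channel" reduces to a thread plus a blocking queue:

```python
import threading
import queue

# A channel is just a blocking queue; a goroutine is just a thread.
ch = queue.Queue(maxsize=1)  # capacity 1, akin to a nearly unbuffered channel

def worker():
    # In Go this would be "go worker()"; here it is an explicit thread.
    ch.put(sum(range(1000)))  # like "ch <- value"

t = threading.Thread(target=worker)
t.start()
result = ch.get()  # blocks until the worker sends, like "<-ch"
t.join()
print(result)      # 499500
```

The point of the sketch is not that threads are as cheap as goroutines (they aren't), only that the channel construct itself is an ordinary library primitive rather than something requiring language syntax.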
One thing that programming in Go has made me realize is just how awesomely Sun/Gosling, et al. hit that "practical programming" sweet spot. No wonder the modern enterprise runs on Java and JVM.
It just works. (But it is "boring" because it's not bling anymore. Oh well, kids will be kids.)
We get some nice concurrency primitives, garbage collection, cleaner syntax, something between structs and objects that fits the right feeling, automatic bounds checking, cute array syntax, and a big-ass, well defined standard library. Oh, and this concept of interfaces that is so well executed it's not even funny.
Except. I feel like they are forcing the fanboy mindset. At one point in this slide deck, there is the following bullet: "The designs are nothing like hierarchical, subtype-inherited methods. Much looser, organic, decoupled, independent."
I didn't see the talk. But that is the most vapid, meaningless description I've ever seen of a feature of a programming language. Rob might as well have said, "it's hipster better," which would have conveyed exactly as much meaning.
So here's my question - and I hope there are real answers - can someone point me to >3 real, big systems that are built using Go? I'll accept Google internal systems on faith.
I just don't get this. If you statically link in small functions from a big library, you only get the little bit you need anyway. Are they saying you avoid compiling the "big library" over and over? But if it is already compiled, that should not be necessary. And chances are you are going to be importing lots of "little code" from the "big library" anyway. Unless they are saying the implementation of net's itoa is somehow simplified and not just a straight code copy... otherwise I don't understand this approach.
"Dependency hygiene trumps code reuse. Example: The (low-level) net package has its own itoa to avoid dependency on the big formatted I/O package."
...now, if this kind of attitude stays in core-dev land, I don't really care about it. But when I consider Go as an alternative for a large project, I'll start worrying if people adopt the "it's OK to reinvent some wheels" philosophy when they start building higher-level frameworks in Go. I mean, how hard can it be to split the "big formatted I/O package" into a "basic formatted I/O" package and an "advanced formatted I/O" package that requires the first one, and have the "net" package only require "basic formatted I/O" (or maybe even make "basic formatted I/O" part of the language built-ins or something - I don't know Go, so I don't know the Go terms for this)?
I've seen a lot of Go talks from various Googlers, and I have to say that this was the best-motivated, most humble, and most honest of them that I have seen. Rob knew he was speaking to an extremely PL-oriented audience, and structured his talk accordingly, and the result was fantastic. Go comes from a very different standpoint than almost all academic PL work, and in that respect, for those of us in academia, it's an interesting breath of fresh air and a reminder of the uniquely fine line between industry and academia in computer science.
I clicked on the link in the second slide: http://golang.org
Even Dart looks great: http://dartlang.org/
I feel a little guilty being negative about this, but presentation does matter, and Google ought to be able to afford it.
> What makes large-scale development hard with C++ or Java (at least):
None of the points apply to Java.
(This is, of course, horribly broken in C++ which likes to inline everything.)
Can someone tell me why GC was the obvious choice as opposed to say automatic reference counting?
Once I discovered exceptions back in the 90s, life got a lot easier.
Of course Java ruined exceptions with the invention of the CheckedException, maybe this tainted the Go designers' thinking?
Update: I was trying to use a mouse to switch between slides; I later figured out that it only works with the keyboard :)
Compare that to cereal crops like wheat or maize or vegetable crops, which require long uninterrupted growing seasons and irrigation.
Why is this important? When a troop of rampaging soldiers cuts through your village and pillages everything in sight, you grab your cows and family and boogey out of there. Essentially, you have a mobile food supply.
In the event of a drought, you have options as well. With wheat or vegetables, no rain == no food. With a dairy animal, you go kill the guy who controls the next pasture and let Old Bessie the cow feast on the grass. (The other key development was the introduction of potatoes, which remain buried under the ground safe from the rampaging army above -- my Irish ancestors subsisted on potatoes hidden from the English taxman and a cow that lived in the house.)
In Europe and the Near East, these things were really important, because there was always pillaging armies marching across the continent. Today, it's unlikely that some Mongol horde is going to loot my supermarket, so I drink milk and eat cheese because they are really tasty.
edit: Growing up, we always had 2% in the house. From college on I drink skim, occasionally (once every few months) I get 1 or 2%, just to up the fat content (I'm a runner, not terribly concerned with weight gain, more or less trying to maintain body mass...)
The success of the lactose tolerance mutation may be partly due to sexual selection. It's been proposed that neoteny is a key feature of human evolution. The ability to drink milk as an adult is a neotenous trait, and it may have been "accidentally" selected for when other beautiful features were sexually selected.
David Rothenberg's book, Survival of the Beautiful, argues that biologists are sometimes "blinded" by natural selection and ignore sexual selection.
 - http://en.wikipedia.org/wiki/Sexual_selection
 - http://en.wikipedia.org/wiki/Neoteny
 - http://blogs.scientificamerican.com/thoughtomics/2012/10/25/...
Here's a tip for others - you can buy lactase pills at a pharmacy and take them just before you eat any meal that contains milk. This gives you the enzymes you need without your body producing them.
And it's really awesome. I only started doing this a year ago, but now I can eat many more cheeses, drink milkshakes, etc., without feeling bad. And it comes up surprisingly often - every time you want to eat pizza, pasta, etc.
Seriously, if you're lactose intolerant, give it a try - it improved my life considerably.
I believe the benefit of drinking milk is obvious. A herd can take calories from grass and drink mud, while the human enjoys a clean, caloric, nutrient-rich drink that can go anywhere. Farmers, on the other hand, can simply be run over, pillaged, or besieged by enemies.
It moderates strong flavors, smooths out acidic drinks, fluffs up eggs among many other thousands of beneficial food uses.
Other dairy products like butter and cheese are key to an immense palette of flavors and cooking techniques.
Dairy is so delicious that I've even seen people with violent milk allergies put up with the consequences just to scarf down a few bites of custard or ice cream.
Odd that they don't mention physical displacement: invasion, dispossession, death. The gene would likely have coincided with other developments of civilization, such as weapon technology, greater numbers, greater cooperation, specialised soldiers, etc. Maybe there's evidence against it, but it's odd it's not addressed, given the puzzlingly high "selection differential". Another factor might have been sexual selection, if the new folk were healthier-looking, etc.
Note they are talking specifically about the West - agriculture and civilization spread throughout the East without this gene.
Really? A plant-based whole foods diet is probably the best cure out there for heart disease and type 2 diabetes. (google Dean Ornish, Neil Barnard, John McDougall)
The author tells a good story but his bashing of agriculture is unsupported.
Does anyone know why some East Asians (such as myself) are lactose tolerant? Is that evidence of interbreeding in the past?
I'm just glad I'm not lactose intolerant, so thanks to whoever in my billions of ancestors decided to keep at it.
Likewise, people in Sweden for example have a 100x higher lactose tolerance, because there's less sunlight throughout the year.
Plus, animals can graze on land you can't farm, and they're very portable.
"We became, in the coinage of one paleoanthropologist, "mampires" who feed on the fluids of other animals."
Because the dairy industry in the US alone gets $4 billion per year in subsidies from taxpayers?
In comparison tests, longer copy almost always wins. You keep offering more and more reasons to buy, and you keep converting more and more readers.
This relates to one of the basic observations about selling: people don't like to change their minds. They won't spontaneously go from "no" to "yes". But if you offer a new piece of information, they can change their mind without admitting they were "wrong" before. Every new piece of information, or new story, is another opportunity for them to get to "yes".
Obviously, the copy also needs to be good.
But in re-checking the site I didn't see any claim that these are somehow trends of 2012; in fact, they say, "Let's take a moment to look around some trends we witnessed in last couple years."
The prototype should never be better than the final version. What we try to deliver is a great design that works well as a static version, but when it comes to the dynamic view, developers keep changing things, and that can change everything; you then need to adjust accordingly.
Sometimes people think that single-page apps are better. That's true in some cases, but not all.
Trello is the best example: it could be a single-page app, but they didn't go that route. Pjax is what you can really use for dynamic design; for a micro-blog or a blog, plain Ajax will do just fine, but you should really try the pjax technique for large apps.
I work with Django, so that's what I suggest for others; using pjax with it is awesome.
Edit: Just correcting to help, not mocking.
I might consider buying a copy of Windows 8 Pro at that price and then waiting until it hits SP1 to install it.
I might even spin up a VM to try it out.
I like that the $39 upgrade applies to anyone with Windows XP, Windows Vista or Windows 7. I think they're realizing that a lot of people don't upgrade OS because they don't want to upgrade their hardware.
(like my old Win XP laptop that I use as a VNC terminal to other machines).
The only reason why I wouldn't want to jump in with two feet is that I have a general dislike for the Xbox dashboard and I suspect that Metro would be very similar to it.
 You can use this tool to check that you have a genuine version of Windows http://go.microsoft.com/fwlink/?linkid=52012
 Windows OEM licenses are transferable if the hardware is included
 Windows retail licenses are transferable
Here's a direct link to a PDF for Windows 7 Home Basic in English
 Windows Anytime Upgrades are pretty much considered to be OEM
17. TRANSFER TO ANOTHER COMPUTER. (retail)
 a. Software Other than Windows Anytime Upgrade. You may transfer the software and install it on another computer for your use. That computer becomes the licensed computer. You may not do so to share this license between computers.
 b. Windows Anytime Upgrade Software. You may transfer the software and install it on another computer, but only if the license terms of the software you upgraded from allows you to do so. That computer becomes the licensed computer. You may not do so to share this license between computers.
18. TRANSFER TO A THIRD PARTY. (retail)
 a. Software Other Than Windows Anytime Upgrade. The first user of the software may make a one time transfer of the software and this agreement, by transferring the original media, the certificate of authenticity, the product key and the proof of purchase directly to a third party. The first user must remove the software before transferring it separately from the computer. The first user may not retain any copies of the software.
 b. Windows Anytime Upgrade Software. You may transfer the software directly to a third party only with the licensed computer. You may not keep any copies of the software or any earlier edition.
 c. Other Requirements. Before any permitted transfer, the other
Short version - outside of Metro it's basically Win7SP3 and it works great. Metro is every bit the usability disaster that people have claimed when not running on a touch screen.
The good news is that you really don't have to interface much with Metro at all. It replaces the start menu, but it does so in a manner that works with how I'm used to dealing with the start menu already. That is, I already just hit the Win key and then start typing until the thing I want pops up, and that behavior has carried over.
So, yeah Metro is awful for all the reasons everybody has already laid out. Despite that, Win 8 has been a solid performer and I won't be loading Win7 back on this system.
My primary home system will continue to run Win7 until I am comfortable that my production applications will all run successfully (and by that, I mean "games").
The Surface, not so much (7.0): http://www.theverge.com/2012/10/23/3540550/microsoft-surface...
For instance, going to the traditional desktop is as easy as clicking the "Desktop" tile. And opening a new tab in the Metro IE was a bit confusing, but after figuring out that a two-finger press on the touchpad brings up the tab list and URL bar, it has become easier.
I also like the new native mail client and calendar apps.
For the record, I am running Windows 8 on a 2011 MacBook Air via Boot Camp and it runs perfectly. Guild Wars 2 also gets about 10 fps more than it does in the Mac client, for what it's worth, which makes it actually playable on an Air :)
Following this tangent a bit more: if the drivers were updated enough to support three-finger left and right gestures to swipe between the different screens, I wouldn't revisit OS X for a while.
Windows 8 is a fun operating system.
I'm surprised more people haven't picked up on this rather bold move.
The elephant in the room for me is the horizontal scrolling. I'm sitting there spinning the mouse wheel vertically, and what's on the screen is moving horizontally. That's a total disconnect.
Why this emphasis on horizontal scrolling? I don't see how the horizontally scrolling items are in any way easier to use than a vertically scrolling set of items. Seems like different, for difference's sake.
The price is much lower than for previous versions of Windows; this makes me suspect that we should start expecting new releases of Windows much more frequently, similar to how Apple does it.
With the radical changes going on in Windows 8 it wouldn't surprise me to see a tweaked and improved Windows 9 in less than 2 years.
The computer booted up to a home screen with icons for all of your programs, and you had to click exit to desktop to get into windows.
However I can't say I am any more productive than I was with Windows 2000.
Looks like they're setting the font explicitly to 'Segoe UI' and nothing else in many spots. Telerik, a .NET CMS provider does a similar thing.
At work, the IT dept will hopefully skip this version altogether, or take a few years before "approving" it.
> Back when Firefox 2 was released (six years ago this week!), > the Internet Explorer team started a friendly tradition of sending Mozilla a cake > as congratulations. This continued for Firefox 3 and Firefox 4.
They should have doubled the padding to be safe.
> Just 30 minutes later, Michael Bolan tweeted that the cake was gone.
I don't know how many people there are in that office, but I hope it's sufficiently few that no-one got Miltoned :).
This way they will know which games people are really interested in, and I also think that people will gladly support those Kickstarters. Overall, much less risk on the investment.
Yes, there are exceptions, but those who feel different are in the minority.
As a matter of fact, through the Right To Information Act, there's an activist who is currently raking up dirt on a whole bunch of politicians serially.
Makes me thankful for the freedoms we enjoy and take for granted!
EDIT: microcosm was a poor word choice
If that's true, it's disappointing the Times didn't do a simultaneous release in anticipation of the block.
"HONG KONG – The Chinese government swiftly blocked access Friday morning to the English-language and Chinese-language Web sites of The New York Times"
"By 7 a.m. Friday in China, access to both the English- and Chinese-language Web sites of The Times was blocked (...). The Times had posted the article in English at 4:34 p.m. on Thursday in New York (4:34 a.m. Friday in Beijing), and finished posting the article in Chinese three hours later after the translation of final edits to the English-language version."
In fact it was two of my Chinese friends who told me about the article this morning....
A different variation would be promoted weekly, but you'd be able to play any of them whenever, via email or in real time.
Would that interest anyone here?
There is a great library of them here, except it's single-player only, which IMO takes a lot of the fun out of it:
There is just a single space for the pawns on the end to move to, after all, and it isn't well defined what moves the penultimate pawns could make should they move one square forward (can they attack end-pawns that have moved just one square?).
What might be a better variant would be to assign directionality to all pawns (starting as forward), and allow them to take left or right turns, perhaps diagonally. This greatly increases the number of game possibilities while introducing no new confusing scenarios.
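The directional-pawn variant described above could be modeled as a small state machine. A minimal Python sketch, where all names (`Pawn`, `forward`, `turn`) and the exact turning rule are hypothetical choices made for illustration, not part of the proposal:

```python
# Sketch of the directional-pawn variant: each pawn carries a facing
# direction (initially "forward"), and a move either advances it one
# square or turns it 90 degrees in place.

DIRECTIONS = {"N": (0, 1), "E": (1, 0), "S": (0, -1), "W": (-1, 0)}
LEFT_OF = {"N": "W", "W": "S", "S": "E", "E": "N"}
RIGHT_OF = {"N": "E", "E": "S", "S": "W", "W": "N"}

class Pawn:
    def __init__(self, x, y, facing="N"):
        self.x, self.y, self.facing = x, y, facing

    def forward(self):
        """Advance one square in the current facing direction."""
        dx, dy = DIRECTIONS[self.facing]
        self.x += dx
        self.y += dy

    def turn(self, side):
        """Turn 90 degrees left or right without moving (one possible rule)."""
        self.facing = (LEFT_OF if side == "left" else RIGHT_OF)[self.facing]

p = Pawn(4, 1)      # a white pawn on its starting rank, facing "forward"
p.forward()         # advances to (4, 2)
p.turn("right")     # now facing east
p.forward()         # advances to (5, 2)
```

Capture squares would then be derivable from the facing (the two diagonals ahead of it), which keeps the "no new confusing scenarios" property: every pawn always has a well-defined forward.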
Each player gets a deck of cards that change the rules of chess. For example, you can play a card that makes the board cylindrical, so the edges wrap and you can move pieces from one side to the other, or another card that allows the king to move two spaces at a time, etc. I always thought it was a neat game.
I think I agree with the others saying pawn movement is hacky. I'd be inclined to say that a pawn promotes when it reaches either side of the board, but this isn't great because you need to keep track of where a pawn started to know which direction it's going.
Don't get me wrong, it sounds interesting, but with such games you usually only know for sure once you've tried it.
But "Singularity Chess" is an awkward name. The center of the board is not a singularity in any reasonable sense of the word. The space of allowable movement trajectories appears to be nonsingular.
Maybe a more correct name, albeit drier, would be quadratic chess (because the transformation looks like a quadratic form).
It was at this point that I stopped taking this chess variant seriously.
That said, there are no doubt folks on the other side of those downed links with calls in to three or four NOCs, a couple of trouble tickets being escalated, and people driving out to nondescript buildings near railroad tracks and in industrial areas carrying weird-looking devices that can measure the intensity of laser light and do time-domain reflectometry (TDR) measurements. We can only wait and see what they discover. If we were playing the Ops edition of the game Clue, I'd guess "Colonel Mustard with a Backhoe in New Jersey" :-)
Fortunately, to date the affected services are all non-essential, mainly entertainment/trivial stuff like blogs, Instagram, Dropbox, etc., but when we start to see things like water supply and electrical power management systems, hospital records, and aviation systems affected, the consequences could be severe.
If the very best IT minds at AWS and GAE can't keep their systems running, what hope have government departments got? Anyone who's ever been to a DMV or USPS knows just how good the US Government's IT departments are.
Another place to check for good information is http://www.outages.org/
There have been a few incidents over the past few days. Last night, there was a nationwide outage from Frontier that has since been resolved.
The day prior, there was a triple failure in the Midwest, as reported at http://vielmetti.typepad.com/vacuum/2012/10/windstream-outag... , that affected lots of services in a large area.
Of course, they couldn't possibly be so dumb as to launch a massive DDoS in retaliation. *snicker*
The Ontario router seems to be dropping packets:
$ ping gw02.wlfdle.phub.net.cable.rogers.com
PING gw02.wlfdle.phub.net.cable.rogers.com (184.108.40.206) 56(84) bytes of data.
From <snip> icmp_seq=1 Packet filtered
From <snip> icmp_seq=2 Packet filtered
From <snip> icmp_seq=3 Packet filtered
--- gw02.wlfdle.phub.net.cable.rogers.com ping statistics ---
3 packets transmitted, 0 received, +3 errors, 100% packet loss, time 10206ms
$ traceroute <snip>
traceroute to <snip> (<snip>), 30 hops max, 60 byte packets
 1  <snip> (192.168.1.1)  1.489 ms  2.038 ms  2.669 ms
 2  * * *
 3  220.127.116.11 (18.104.22.168)  17.599 ms  17.584 ms  17.339 ms
 4  so-4-0-0.gw02.wlfdle.phub.net.cable.rogers.com (22.214.171.124)  31.992 ms  31.972 ms  31.819 ms
 5  126.96.36.199 (188.8.131.52)  33.198 ms  34.687 ms  34.596 ms
 6  * * *
 7  pos-3-15-0-0-cr01.ashburn.va.ibone.comcast.net (184.108.40.206)  35.557 ms  28.952 ms  28.818 ms
 8  220.127.116.11 (18.104.22.168)  33.029 ms  42.176 ms  41.924 ms
 9  he-0-4-0-0-cr01.350ecermak.il.ibone.comcast.net (22.214.171.124)  49.244 ms  45.218 ms  44.940 ms
10  pos-1-2-0-0-pe01.350ecermak.il.ibone.comcast.net (126.96.36.199)  37.146 ms  40.169 ms  40.372 ms
Most of those zeroes have been zero for a long time. The ITR isn't well-maintained and I wouldn't use the data as a primary source.
Perhaps I'm just lucky? Or there's an issue with how this is reported, or there's more than one router that everyone else on my ISP uses.
I wonder how reliable this is.
Pinging 188.8.131.52 with 32 bytes of data:
Reply from 184.108.40.206: bytes=32 time=14ms TTL=56
Reply from 220.127.116.11: bytes=32 time=15ms TTL=56
Reply from 18.104.22.168: bytes=32 time=14ms TTL=56
A few things of note:
1) The hover effect over the colored blocks feels odd, almost like the default blue border around image links.
2) ListViews seem under-styled, with poor spacing. Perhaps this is intended.
3) The ApplicationBar is pretty sharp.
4) The text color on all the button styles looks odd as black, with the exception of btn-primary.
5) The Carousel in Chrome is way too thin – thinner than the arrow buttons, making the text unreadable.
1. Take the hover border off and instead add a little CSS3 background transition animation and set the hover background color to something a tad lighter. I think this would be a cool effect.
2. Use Segoe UI or something close instead of the font you're using now, especially on the blocks.
3. One of the cool things about the "metro" style is how they effectively use a ton of padding and space around things. Yours is very cramped in parts. Definitely be liberal with the padding and space stuff out.
I think those three things would make a big difference.
I'm not trying to be a hater or a downer, I just can't believe that this is the state of web design right now. As Gob would say, "COME ON!"
.metro .metro-sections .metro-section.tile-span-1 #charms .charms-header a.win-command span.win-commandimage.win-commandring
I've really warmed to the whole metro look when it's done right and I look forward to where this goes. White text/icons on block color squares works beautifully and looks "cool". This could be bigger than Twitter Bootstrap if done right.
The idea is good, but the implementation feels... unfinished, to say the least.
Borrowing some of the css transitions, color scheme and typography from there would go a long way toward really delivering the "Metro experience" with your framework.
Some things I noticed: On the demo, horizontal scrolling is shaky and broken on Mac OS X Mountain Lion. This may be my fault due to my browser size, but the set of tiles that needed to be scrolled into view was completely hidden, so I didn't realize you had to scroll at first. In addition, trying to scroll back to the original position doesn't work, because my Mac thinks I want to go back a page in browser history. (Mac has a two-finger swipe to go back in a browser, if you're not familiar.)
Anywho, I love where this is going and hope to see more. It'd be awesome to be able to create actual Metro-styled web apps. It appears right now that the framework is only set up to re-create the home tile screen in Win8.
Keep up the good work, I'm definitely going to use this.
Am I missing something?
Btw, slightly misleading title. It kind of implies that 2.0.0 is out.
It's interesting to think about the cultural impact of this if it really catches on, at least in the hacker/geek world. This isn't just a tech demo: It's a self-conscious reconstruction of a cultural artifact, and drags along with it other cultural references and context. Nobody these days is going to 'grow up' with an Apple ][ because of things like this, but, previously, the only way to experience that specific system was to either have been born in the narrow window of time where you had one when they were still at least vaguely mainstream, or to decide to run an emulator and likely get into emulation as a hobby. It's the difference between knowing every Beach Boys song because you grew up in 1960s California, knowing them because you deliberately chose to collect that era's music, and knowing them because, like me, your parents played them practically from your birth and so they became the first band you really liked.
This just makes the past that much more mainstream, the software equivalent of the deliberately dated aesthetic of a Quentin Tarantino film.
Good times. I think I will now make a low-res snowman.
10 X=1
20 FOR Y=1 TO X
30 PRINT " ";
40 NEXT Y
50 PRINT "HELLO"
60 IF X=34 THEN LEFT=1
70 IF X=0 THEN LEFT=0
80 IF LEFT=1 THEN X=X-1
90 IF LEFT=0 THEN X=X+1
100 GOTO 20
RUN
20 GOTO 10
7th grade memories come flooding back ...
As an old geezer who earned his programming wings on an Apple ][ coding 6502 machine language and later UCSD Pascal, I am absolutely delighted by this project. I had been toying with the same idea for some time but skipped it due to lack of time and fear of Apple's legal stormtroopers.
Now if only I find disk images of Bandits and Dogfight...
IIRC (and as the emulation apparently does), the games started themselves at boot, so I don't think I ever typed any command on it!
it disassembles a portion of memory.
and then everything comes to mind :)
peek and poke ftw!