Maybe it had less to do with magnitude than direction.
I look forward to reading about yc start-ups, but increasingly find myself shaking my head, wondering how some of them will ever amount to sustainable businesses. I've always attributed this to the fact that yc must know a whole lot more than me.
But pg's recent disclosure that so many yc start-ups have co-founder issues really got me wondering. I find it unimaginable that any team can work itself into such a good position and then blow it away over seemingly petty issues. Is it possible that some trend other than scale is at work here?
Actually, I think you have always published many of the predictors of failure. http://www.paulgraham.com/startupmistakes.html seems like a good start, but there are probably more specific indicators during an interview or during 3 months before Demo Day.
The ones which seem most relevant are all variants of "not making something people want" -- either not making anything effectively at all, making something that is a bad idea/market, or making something that is a good idea with a crap implementation. Obviously there are several potential causes of each.
If you look at each class and say, "Well, last year we had 5 breakouts out of 50. Why not increase that denominator to change the numerator?"
Unfortunately when you do this too quickly, the numerator doesn't change. It stays the same or sometimes even decreases (in my example, holds at 5).
Staying focused while growing is a very difficult thing to do.
Maybe at 84 startups the total number of founders went above 150?
In person, I rarely meet like-minded souls with whom I can have intimate technical discussions, but on-line it's easy (thank you Hacker News friends).
But with customers, I've never found a good substitute for being there with them. I want to see everything they're doing, listen to them bitch, and feel their pain. I want to suffer with them during the day and celebrate with them over beers at night. You just can't do that the same way on-line.
For me, regular in-person contact with friends is essential to feeling motivated and connected with the world. They don't have to be close friends, and they don't need to share my career or understand the work I do as a web developer. Most of the programmer friends I would sling code with on a weekend project are global, scattered across the world. Whereas many of the friends I hang out with the most locally use feature phones, have old or no computers, and think Twitter is pointless.
But that doesn't matter because what I need them for is to sit down and have a good conversation. Or play a game of table tennis. Or ride our bikes out to the suburbs and back. If I don't get these kinds of outlets, a week spent sitting down and coding leaves me feeling strangely unphysical, unhinged from the world around me.
I've grappled with the local/global issue because my life does feel like a duality, where local (mostly offline) and global (online) are separate. I'm not sure how to resolve that (maybe living somewhere with more early adopters of tech would help). But Derek's solution is an extreme one. You don't have to be exclusively local or global.
Coincidental timing for me. I'm in Singapore for a few weeks, and even though I've spent months here in the past and have a long-term visa, I have almost no local friends here. I feel like I should feel guilty about it. But I don't feel guilty about it, except in an abstract way.
I spend most of my time coding and working on music. And eating Thai food.
I mean, I especially loved the answer based on the idea of "not favoring anyone", when the author was asked about what he did for the local market.
But this requires some "enlightenment" - as the author said, he lived in many places, so he can now relate to human beings as equals - wherever they come from or live.
It seems to me it is just wishful thinking to believe most people - or just a significant enough mass - will be able to think like this.
Most people are lost in their own thoughts and community. They just can't think global, and certainly don't want to.
They just care about the ones they know - and cronyism is just an artifact. It's the same story again and again - dozens of people dying is the news, but having a deep cut in your little finger is a tragedy.
* Use dict and set comprehensions.
* Use collections.Counter.
* Use json.dumps or the pprint module for pretty printing.
* Create simple web services with Python's XML-RPC library.
* Use open source libraries...
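For anyone who wants to try a few of these tips, here's a quick sketch of the first three (the words list is just made-up sample data):

```python
from collections import Counter
import json

words = ["spam", "eggs", "spam", "ham", "spam"]

# Dict comprehension: map each word to its length.
lengths = {w: len(w) for w in words}

# collections.Counter: tally occurrences without manual bookkeeping.
counts = Counter(words)

# json.dumps doubles as a quick pretty-printer for plain dicts.
print(json.dumps(lengths, indent=2, sort_keys=True))

print(counts.most_common(1))  # [('spam', 3)]
```

The pprint module does the same job as json.dumps for objects that aren't JSON-serializable.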
Here are some other items about the Fourier transform:
The point is, if you're really determined, working hard can go a long way. The key is to try not to worry about the other people who make it look easy. Later I heard this quote:
"If people knew how hard I had to work to gain my mastery, it would not seem so wonderful at all." - Michelangelo
Let me share an anecdote: in undergrad, I took an intro computer networking class. There was no assigned reading material, and the lectures didn't follow any text. Everyone else seemed to understand what was going on quickly; I was always the lowest student in the class, no matter how hard I tried. I worked hard on that class but it didn't matter. Not everyone who works hard will show positive results, I guess. I more or less flunked my way through school. Sometimes I'd work hard, sometimes not. It didn't seem to matter.
Or, if I'm facing someone who has worked hard and is also very intelligent? I don't have a chance.
The rhetoric of these articles and the rest of these posts here have a coded meaning behind them: there are no disadvantaged people or students -- only stupid or lazy ones. It's a corollary of Survivor Bias. If you work hard, everything will be peachy-keen OK and nothing will go wrong. If you work hard and fail, you deserve only scorn.
For me, it was: never mind that I worked so hard I ignored everything else, worked so hard not to get beaten by my parents, worked so hard to master the material -- I must still be too lazy or stupid, since I failed.
I don't know what it is that I don't have that everyone else does. I'm likely to not ever know.
This is a topic I'm very familiar with, which is why I wrote a lot about it and perhaps sound a little crazy.
One mantra tossed around of late is "_perfect_ practice makes perfect," meaning not that you never make mistakes, but that you are conscious and analytical when you do.
It doesn't matter if you have an IQ of 200 when you're born several thousand years too early. You never make use of your genius, except to have a lot of babies and become the chief of your tribe, if you're lucky.
Granted, there are lots of people who are innately smarter than you, but chances are they're also stuck in positions in life where they can't become a genius: a job at Wal-Mart, having children too early, or a hut in a third-world country somewhere.
If you're reading this, chances are you have the money and the time to rearrange your environment and your behaviors to achieve mastery. The hard part is figuring out how to do that and how to sustain it.
Ah, no. It implies that some gas station attendants might think that it's not. It's idiot-proofing the instructions.
> According to Dr James Grime of the Millennium Maths Project at Cambridge University, reaction time experiments in the 1990s revealed people are 10% slower at deciding whether zero is odd or even than other numbers.
7 books of a highly academic programmer.
A highly effective programmer should probably be reading about other topics as well; to be honest, most programming is laborious non-comp-sci stuff.
If so, shouldn't the book be broader by including more of literature on LISP or even AI programming in general?
2. Some of this stuff is absurdly broad and weirdly pandering.
3. "I have no idea what the copyright implications of this are, so I will be printing out only my own private copy and not making them available publically;" is a totally inadmissible position for a professional programmer to have. The answer is, no, of course you can't make a printed compilation of other people's work and put it up for sale.
A few minor problems, though - it kind of slows down the browser. Like a lot. I'm using a new MacBook Air and this thing is incredibly fast and has actually never slowed down on me once (and I really abuse it). Also, it seems like the image is kind of stretched. Maybe I'm getting fat and don't realize it. But all in all, excellent work. Would you consider putting it on GitHub and allowing people to contribute? I'd be first in line!
Would it be helpful if forcing the "video only mode" were exposed as a public method? (Currently it can only be switched off by setting the forceHSB flag to true.)
Please find the github repo for the project at https://github.com/WolframHempel/photobooth-js
Come back to me when you are using json to encode an entire document ... you might look at XML a bit differently.
tl;dr use the right tool for the right job.
Could somebody familiar with the changes to 2.10 explain where the performance boost came from and what code it will affect? Does this make Scala a more compelling choice for the discerning functional programmer?
What I'd really like to see is faster compile times. I know it's a tall order given the complexity of implicits, but compared to other languages Scala compilation slowness can be a tough sell.
Anyone on here go both years?
On the other hand, it now looks a little less magical...
You only have to look at the comments to see why most normal people aren't going to touch it with a forty-foot pole.
A Mac Mini with one of these $20 sticks is a decent little system for working on this stuff. It has enough horsepower, is totally quiet, doesn't take up much room, and only uses about 20 watts during ordinary receiving tasks.
I took notes on how I made this work on my Mac here: http://rachelbythebay.com/w/2012/09/19/brute/
Some key things that were not mentioned. There is the GQRX frontend and the SDR# frontend. Both are fairly straightforward to set up and use.
Also not mentioned were the hardware differences (r820t vs e4000 vs fc0013). To get decent signals you need to add a bit of filtering to the USB power supply. And of course antennas, but that topic needs an entire book.
I've been working on a very simple sdr stack (currently merged in rtlsdr as rtl_fm) that is meant to provide a sox-like experience. It is also stupidly fast, enough so that even a little RaspberryPi can easily do pager decoding or police scanning.
I guess most of you should know what SICP is. I took a class where it was used as a textbook (only the first 3 chapters, unfortunately) and loved it.
Most of the syntax-highlighted code fragments can be clicked on and edited. Either click somewhere else after that or press ctrl-enter to re-evaluate the Scheme code. There are also some auto-graded exercises which involve writing code, as well as some multiple-choice questions (inspired by Coursera).
Brian Harvey has done a great job of simplifying it and making it more freshman-friendly. I think his CS61A is the best intro course available (though I don't think the sentence ADT is such a great idea; it makes things a little messier, not clearer).
The HtDP2 approach is also remarkable, but the problem is that it is partly Racket promotion (the word DrRacket is used a hundred times in the first chapter). The idea of making changes in a program visible by using graphical primitives is brilliant but controversial - it comes too soon (though it does make progress visible).
I think that functions with multiple arguments, first-class "citizens" (values have types, everything is a pointer), pairs, lists, then generic functions and environments should be taught first, and visualized interactively, similar to that python tool. After that, you can teach parts of HtDP and then SICP.
People who really enjoyed this initiation will go through the whole books themselves.
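To illustrate what I mean by teaching first-class functions and pairs first, here's a rough Python sketch (encoding cons/car/cdr as plain tuples is just one of several possible choices):

```python
# Functions are first-class values: they can be stored and passed around.
def compose(f, g):
    return lambda x: f(g(x))

inc = lambda n: n + 1
double = lambda n: n * 2

inc_then_double = compose(double, inc)

# Pairs, and lists built from pairs, Scheme-style: cons/car/cdr as tuples.
def cons(a, b): return (a, b)
def car(p): return p[0]
def cdr(p): return p[1]

lst = cons(1, cons(2, cons(3, None)))

def length(p):
    return 0 if p is None else 1 + length(cdr(p))

print(inc_then_double(4))  # 10
print(length(lst))         # 3
```

Once students are comfortable with these pieces, generic functions and environments follow naturally.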
(/ (+ 5 (+ 4 (- 2 (- 3 (+ 6 (/ 4 5)))))) (* 3 (* (- 6 2) (- 2 7))))
and it tells me the result is wrong, but Wolfram and my own REPL confirm the result.
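For what it's worth, here's the same expression transcribed term by term into Python with exact rational arithmetic (fractions.Fraction avoids any floating-point surprises):

```python
from fractions import Fraction as F

# The Scheme expression above, transcribed into infix form with
# exact rationals so nothing is lost to floating point.
numerator = 5 + (4 + (2 - (3 - (6 + F(4, 5)))))
denominator = 3 * ((6 - 2) * (2 - 7))
result = numerator / denominator

print(result)  # -37/150
```

That matches what my REPL gives, so it may be the grader that is off.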
To OP. Don't worry. It is intuitive enough (for me at least (-; )
Is an MIT computer science textbook.
FTA: "The concept - known as 'sender pays' - would radically alter today's Internet economics. Some countries say their networks are groaning under video and other content provided in large part by U.S. companies such as Facebook, Netflix, and Google. These countries suggest that fees on content providers would help defray local infrastructure costs."
This is exactly why the ITU and other agencies should stay out of the way. They have no clue how the internet works. Content providers already pay bandwidth fees at their edges. If the networks are having a hard time with this content, they should not have signed deals at the edges of their networks that put them in these positions. What they really want is to dig their hands into the deeper pockets of content providers and create a sort of "tax" to improve their "groaning" networks. What would really happen is they would get the money and continue running their inefficient networks, and magically the extra money would just disappear in a flash of "operating expenses".
Of course, we'd need a cheap-transceiver-project too. Hm.
Sadly, Apple stopped developing that web solution years ago and kept it for itself... I heard from various folks at Apple that there are still a few engineers improving it, and it is being used a lot for iTunes. WebObjects got a bad reputation from many people who have no clue how it works, citing that WebObjects can't even support database updates and that this is why the Apple Online Store is often offline when Apple releases a new product. This is total BS, of course: take the iTunes App Store, which serves the content for all the stores (music, apps, books, movies, etc.), and you barely ever see the iTunes Store offline.
WebObjects was a great technology but Apple decided to keep it for itself.
: https://github.com/Midar/objfw (Mirror)
I think we need a totally new Objective-C framework for web development based on newer design patterns, GCD, etc, but it needs a new port of Foundation and Cocoa to Linux and other systems.
As much as I'd love an Objective-C web framework, I wouldn't use one that only ran on Mac OS, pretty useless in terms of production hosting I think.
Leave much of the law today as is (copyrights can apply to more than the original copyright law allowed and are automatically in force even without registering) but change the duration mechanism.
For the first 14 years of the work's existence you get a copyright for free.
For the next 14 years, you can re-register but with a higher but still nominal fee.
For each year after that, you must re-register and the fee goes up at an accelerating rate eventually reaching millions per year per copyrighted item. I'm assuming there would have to be a cap somewhere.
The government gets new revenue, most copyrights are shortened and enter the public domain more quickly and for the copyrights that are extremely valuable and still producing value in excess of the ever increasing registration fee, it will be worthwhile for the holder to pay.
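To make the mechanism concrete, here's a rough sketch of the proposed schedule. The 14-year windows come from the proposal above; the $50 base fee, 50% annual growth rate, and $10M cap are made-up placeholder numbers:

```python
# Sketch of the proposed copyright renewal-fee schedule.
# 14-year windows are from the proposal; base, growth, and cap
# are illustrative placeholders only.
def renewal_fee(age_in_years, base=50.0, growth=1.5, cap=10_000_000.0):
    if age_in_years < 14:
        return 0.0        # first 14 years: copyright is free
    if age_in_years < 28:
        return base       # next 14 years: nominal flat fee
    # After year 28 the fee accelerates every year, up to a cap.
    return min(base * growth ** (age_in_years - 27), cap)

for age in (10, 20, 30, 60, 95):
    print(age, renewal_fee(age))
```

With any exponential growth rate, only works still earning serious money stay registered past a few decades; everything else lapses into the public domain.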
How's the video acceleration on this? I need to start a blog. I read like 2-3 hours worth of ARM news once or twice a week trying to track down the best HW/SW/price combination for a simple dumb XBMC-upnp frontend.
The Rockchip devices have pretty crap support, though AMLogic released some sources that might help. Allwinner is finally making progress via its CedarX sources, though they're apparently still buggy enough that users are back at the mercy of Allwinner.
I've yet to see much about the Samsung setup though. I love the Exynos and would very happily spend another $100 in an attempt to find the right thing.
That having been said, there are ~$50 versions of the HDMI stick that have powerful Cortex A9s in them that make these look overpriced.
I am the founder of https://appstorerankings.net , a startup focused on ASO (App Store Optimization), so I looked up your app's keywords -- you're using "personal,live,movie,cloud,stream,air,share,remote". That's only 49 characters out of the 100 you could be using (or you've got duplicates with the title).
You should add some new keywords to your app; here are some suggestions: "server,vlc,player,audio,airplay,wmv,mkv,avi,nmp4" (these were automatically generated based on your current keywords). A lot of your competitors use file formats as keywords, so it's probably a good idea!
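If you want to sanity-check the character budget yourself, here's a quick script (the 100-character limit is the App Store keyword-field limit; the merge logic is just a simple dedupe I made up for illustration):

```python
# The App Store keyword field allows up to 100 characters. This checks
# how much of the budget a comma-separated keyword list uses and merges
# in new suggestions without duplicates.
def keyword_budget(current, suggestions, limit=100):
    seen, merged = set(), []
    for kw in current.split(",") + suggestions.split(","):
        kw = kw.strip().lower()
        if kw and kw not in seen:
            seen.add(kw)
            merged.append(kw)
    combined = ",".join(merged)
    return combined, len(combined), limit - len(combined)

current = "personal,live,movie,cloud,stream,air,share,remote"
print(len(current))  # 49 characters, as noted above

combined, used, left = keyword_budget(
    current, "server,vlc,player,audio,airplay,wmv,mkv,avi,nmp4")
print(used, left)
```

Merging both lists lands just under the limit, which is the kind of thing you want to verify before pasting into iTunes Connect.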
Tonido for AppliedMicro
Tonido for Freecom
Wouldn't it be cool if you could load up a URL and get transcoded audio and video of your media library for whatever browser you happen to be using?
purly -> tinyurl -> baconized purly
See http://en.wikipedia.org/wiki/Persistent_Uniform_Resource_Loc... also, this is a "protocol"
It might be better packaged as something blogs and forums can automagically implement for a fee instead of trying to make money off ads.
I cannot add a URL that does not have "http". I cannot type a URL that has "https".
Would be nice to be able to add https addresses, and to add them in any form.
Another idea: make it a web browser plugin so I can change the URL in the browser and add it to my link library. Then it's even better. I don't like detours.
Question: what happens if I purl.ly a URL once, then the content changes and I want to save the new content as well (different content, same URL)?
Anyway, I really like the concept, keep going!
I'll get around to caching the full content of the destinations at time of purl.ly creation next, and serve that if google is missing it.
This will cut my development time drastically because I use autocomplete on many different pages of an internal web application and I decide what browsers the sales people (85% of the company) use.
I'll for sure install Chrome for them and save the company time and money by using this HTML5 feature. Nice find!
I recently had the non-existent pleasure - on the same project - of implementing a fall-back for the `datetime-local` input type, which is only available in very few browsers, even though it is absolutely critical to a good mobile experience.
This fails very gracefully compared to the `datetime-local` input type turning into a `text` input.
This is one of the perks of using Opera as my default desktop browser, by the way. Shame so many developers choose to ban it.
So far I've learned a few seemingly effective techniques:
1) Occasionally construct some weird sentences: throw grammar out the window for a sentence (any longer doesn't work) and people will recall that whole section of the document months, even years later. I have no idea why this is so effective, but I'd love to know.
2) State an assumption but have it miss an obvious consideration. Don't make it some intricate detail; it has to be in plain sight even for someone not familiar with the topic. Again, people recall these sections and are happy to suggest how you could improve the document.
3) Never tell anyone you're playing tricks like this, it completely loses its effectiveness when people know what you're doing with the slightly strange documents.
4) Test the recall of others frequently on your text. Probe them and figure out what's easy to remember and what's not. You can use this to tweak your documents for future.
5) Don't overuse it. The goal doesn't have to be for them to remember everything, but it must let them remember where they found the information previously.
YMMV, but I'm blown away so far, and I haven't seen anyone talking about this along similar lines yet. It all started after reading one too many psychology books.
Maybe the answer is somewhere in the middle and above: great tool makers are great docs writers.
We should propose how to implement that feature (a spec), get some feedback, go do it, review the implementation against the spec, and even then I would want to have a sort of Stack Overflow Q&A site so people can ask "why did you do that?"
Anyone else feel we are missing a whole boatload of tricks?
More documentation is not necessarily better. Just like code, the best documentation is the documentation you never had to write because you found a simpler way, and eliminated the issue completely.
It's my opinion that it should just be part of the developer's discipline to update documentation as they update code, but this seems not to be practical when not everyone shares my idealism for documentation. What ways are there to get buy in from those that need to be convinced about the value of documentation?
Note, this is not to say that documentation doesn't matter. It definitely has a prominent role. So does "does the job." Preferably well. The blogger mentions a prime lens works because it doesn't allow zoom. Sure it does; you just have to move your feet. This can actually be rather annoying when you are trying to get a candid shot in the house. Framing doesn't automatically happen with a prime lens, that is. However, the image quality from a prime is huge. It more than makes up for the shortcomings for many of us.
Also it looks like you hardcoded your personal HN saved page url into the script.