hacker news with inline top comments    5 Nov 2016 News
Show HN: Make an app by adding JSON to this app github.com
180 points by gliechtenstein  4 hours ago   50 comments top 14
mpweiher 2 hours ago 4 replies      
Instead of JSON, you could also consider sending XML markup. Oh wait, or maybe reuse an existing SGML/XML standard called...HTML.

And then you could add scripting...

gliechtenstein 4 hours ago 6 replies      
Guys, the author here. This is my first open source project and I worked really hard on it for 5 months.

I think it's really cool, and I hope you guys like it. I would appreciate any kind of feedback!

mentos 2 hours ago 1 reply      
Really great stuff!

What is the most advanced application you have created with this? What functionality do you think is still lacking?

If you were to try to create a full featured app I imagine you'd find that working in Swift is the better option.

What this reveals to me is that the App Store submission and update process is so time-consuming that you would rather write your logic in JSON than in native Swift/ObjC.

If Xcode let you instantaneously push app binary updates would this be as useful?

felipeccastro 1 hour ago 1 reply      
This is very interesting, although JSON is not a very comfortable format to build manually - I think writing in YAML or CSON would be much cleaner to read/write, and easily converted to JSON. Also, any plans on building a UI for generating this JSON? If you have predefined components, you could make some sort of designer that builds the JSON for the user, that could turn into a commercial project I think.
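The "designer that builds the JSON for the user" idea above can be sketched as a thin layer of helper functions that emit the JSON on the user's behalf. The component names and keys below are made up for illustration; they are not Jasonette's actual schema.

```python
import json

# Hypothetical component helpers: a tiny "designer" layer that emits
# declarative UI JSON instead of asking the user to hand-write it.
# All key names here are illustrative, not Jasonette's real schema.

def label(text):
    return {"type": "label", "text": text}

def button(text, action):
    return {"type": "button", "text": text, "action": action}

def screen(*components):
    return {"sections": [{"items": list(components)}]}

ui = screen(
    label("Hello, world"),
    button("Tap me", {"type": "alert", "options": {"title": "Hi"}}),
)

print(json.dumps(ui, indent=2))
```

A visual designer would do the same thing: let the user arrange predefined components, then serialize the resulting tree with `json.dumps` for the app to consume.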
oliv__ 1 hour ago 0 replies      
Wow this is really impressive.

If you had enough JSON-to-native "modules", basically anyone could write a native app in a few hours (since functionality in most apps is pretty much standard stuff)!

Hell, if you pushed this further you could create an app store for code, letting you buy additional modules to plug into your app like lego.

libeclipse 3 hours ago 1 reply      
Wow that is actually a brilliant idea. Would you ever consider making an android version?
amelius 37 minutes ago 1 reply      
Is this HTML reinvented? :)
walke 3 hours ago 2 replies      
Very neat and well put together! Thank you for taking the time to prepare nice documentation to go along with it! Congrats on your release!
ldjb 1 hour ago 0 replies      
This is a wonderful idea! Making app development faster and easier can only be a good thing.

Now if only it were possible to place a generic version of this app on the App Store and allow users to load in the JSON for whatever app they want. Sadly I very much doubt Apple would allow it.

nhorob67 2 hours ago 0 replies      
Native app newbie here. Can this solution be used to present data tables?
hex13 1 hour ago 1 reply      
Been there, done that. When I coded games, I made a simple JSON-based file format and almost the whole game was in the form of JSON. I even implemented conditionals and loops in JSON.

But it can easily lead to the inner-platform effect. Unneeded complexity. And in JSON you can't write JS code (needed for e.g. event handlers).

So I switched to JS. Still a data-driven approach (simple hierarchical objects with data), but with the power of JS (and if you use Babel, you could use JSX for your DSL).

So: I don't think JSON is the best format for what you're trying to achieve. It's just so limited.

Besides, what is the added value of Jasonette? When I look at Jasonette I have the feeling that it just reinvents ReactJS, but in JSON instead of JSX. Not sure there is much profit in the JSON format alone.
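The kind of JSON DSL described above (conditionals and loops encoded as JSON nodes) can be sketched in a few lines. The node names here are invented for illustration; the sketch also shows why this drifts toward the inner-platform effect: you end up re-implementing a programming language inside JSON.

```python
import json

# A toy interpreter for a JSON-based DSL with conditionals and loops.
# Node names ("seq", "if", "repeat", "print") are made up for this sketch.

def evaluate(node, env, out):
    op = node.get("op")
    if op == "seq":
        for child in node["body"]:
            evaluate(child, env, out)
    elif op == "if":
        branch = "then" if env.get(node["var"]) else "else"
        if branch in node:
            evaluate(node[branch], env, out)
    elif op == "repeat":
        for _ in range(node["times"]):
            evaluate(node["body"], env, out)
    elif op == "print":
        out.append(node["text"])
    else:
        raise ValueError(f"unknown op: {op}")

program = json.loads("""
{"op": "seq", "body": [
  {"op": "if", "var": "greet",
   "then": {"op": "repeat", "times": 2,
            "body": {"op": "print", "text": "hello"}}}
]}
""")

out = []
evaluate(program, {"greet": True}, out)
print(out)  # -> ['hello', 'hello']
```

Every new feature (variables, expressions, event handlers) forces another ad-hoc node type into the interpreter, which is exactly the limitation the comment describes.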

ruudk 3 hours ago 3 replies      
Cool idea. But I'm wondering if this is allowed in the App Store?
ashish348 25 minutes ago 0 replies      
how to make an app in 15 min | How to develop a mobile app free https://www.youtube.com/watch?v=tnsYachOafk
ashish348 24 minutes ago 0 replies      
Learn how to make an app in 15 min | How to develop a mobile app free. Watch this video: https://www.youtube.com/watch?v=tnsYachOafk
H265: Technical Overview sonnati.wordpress.com
34 points by tambourine_man  1 hour ago   6 comments top 3
IshKebab 27 minutes ago 0 replies      
Funny how interesting comments always end up on the front page a day or two later. Happens on Reddit all the time too.

This one is because of this comment: https://news.ycombinator.com/item?id=12872108

userbinator 18 minutes ago 0 replies      
The massive increase in the number of choices the encoder has to make (and correspondingly, the decoder to follow) is the main reason why H.265 requires so much more processing power than H.264, and H.264 compared to previous standards as well. The intra prediction modes (33 directions!) are a good example of this.

That said, if you actually get the standard and look at the section on the intra prediction modes, how to compute them is spelled out in very detailed pseudocode.

takdi 52 minutes ago 2 replies      
I've no knowledge of this kind of stuff, but what's the big difference between H.265 and VP9?
Physicists demonstrate existence of new subatomic structure iastate.edu
20 points by upen  1 hour ago   4 comments top 3
coldcode 3 minutes ago 0 replies      
Physics is one science where the seemingly impossible winds up being possible all the time. Strange to think people spend a good portion of their lives studying things thought to not exist.
philipov 39 minutes ago 1 reply      
"On their own, neutrons are very unstable and will convert into protons (positively charged subatomic particles) after ten minutes."... "For the tetraneutron, this lifetime is only 5×10^(-22) seconds (a tiny fraction of a billionth of a nanosecond)."

10 minutes is an eternity, and 5×10^(-22) seconds is closer to what I'd consider 'very unstable'
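For scale, the ratio between the two lifetimes the comment contrasts works out to about 10^24, using the article's figures:

```python
# Back-of-the-envelope check of the scale difference in the quoted article:
# ten minutes versus 5e-22 seconds.
free_neutron_lifetime_s = 10 * 60   # ten minutes, in seconds
tetraneutron_lifetime_s = 5e-22     # figure from the article

ratio = free_neutron_lifetime_s / tetraneutron_lifetime_s
print(f"{ratio:.1e}")  # -> 1.2e+24
```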

sctb 42 minutes ago 0 replies      
We've updated the link from http://sciencebulletin.org/archives/7339.html to this, which looks like the original source.
Ring, officially a GNU package savoirfairelinux.com
87 points by terraforming  5 hours ago   22 comments top 9
irl_ 2 hours ago 1 reply      
The DHT system for identities is cool, but the thing that gets me is that they don't have support for SRTP with ZRTP, only SRTP with SDES. There's no perfect forward secrecy, and a bunch of other features that ZRTP has.


Ruud-v-A 3 hours ago 1 reply      
This appears to be something phone-related, not the Ring cryptography library written in Rust based on BoringSSL. (https://github.com/briansmith/ring)
themihai 27 minutes ago 0 replies      
Is there a well-known discovery document or any other way to create shortcuts for the ringID? (i.e. mapping it somehow to a web or email address) I doubt many people fancy spelling out ringIDs.
qwertyuiop924 16 minutes ago 1 reply      
Why this over Tox, Psyc, or Matrix? There doesn't seem to be much benefit...
reitanqild 5 hours ago 1 reply      
Official web page seems to be https://ring.cx/
metilda 5 hours ago 1 reply      
Has Ring improved much? I remember a few months back it used 28 GB of background data over LTE on my phone (which was fine since I'm uncapped), and was less stable than sflphone, which would occasionally just stop registering.

I can recommend pjsip though, very reliable so long as you read its docs before writing a script to leverage it.

fulafel 2 hours ago 2 replies      
There seems to be no browsable source code around. What are the implementation language(s)? I want to know whether the protocol implementations are written in a memory-safe language.
frumiousirc 3 hours ago 1 reply      
What stops attackers from poisoning the DHT? Could one publish false name <--> IP address associations?
geofft 1 hour ago 3 replies      
What's the advantage of being a GNU project these days? It seems like it ties you very strongly to the FSF's political opinions and in particular Richard Stallman's political opinions (e.g., eugenics) and restricts your technical decision-making options (e.g., limited plugin architecture, limited support for non-free OSes, mandatory support for things like GNUTLS), while not giving you very much in return - with the existence of GitHub and a wide variety of competitors, it's pretty easy to attract a healthy development community independent of GNU. What am I missing?
Cartoon Laws of Physics toronto.edu
32 points by davesailer  3 hours ago   4 comments top 4
seanmcdirmid 11 minutes ago 0 replies      
There is also always this gem of a paper on applying cartoon physics to user interfaces:


tempodox 1 minute ago 0 replies      
This article should have a warning sign. I almost fell out of my chair with laughter. Insta-add to my Instapaper.
qwertyuiop924 24 minutes ago 0 replies      
Yet another classic example of Internet humor at work. In this case, the classic formula of describing something decidedly non-intellectual in an intellectual manner. Never gets old.
acqq 58 minutes ago 0 replies      
Duh. So unfunny and uninsightful, at least for anybody who did watch the old cartoons.

I'm sorry I attempted to read it. Now I need a dose of Looney Tunes to wash the feeling away.

Be careful about what you dislike pocoo.org
206 points by JonoBB  4 hours ago   77 comments top 22
hellofunk 1 hour ago 2 replies      
I have realized over the years that it is wise to be naturally skeptical of any opinion that is strong, either positive or negative. People who have an appreciation for gray areas, even if they ultimately do have a preference, tend to be a lot more emotionally balanced than those who maintain a very strong stance on something. I have noticed this so consistently over the last 15 years that I now consider it a fundamental benchmark by which I can gauge my ability to work or socialize with someone in general, on any topic, over the long term.
andybak 3 hours ago 8 replies      
I'm fascinated by the topic of English Prime: https://en.wikipedia.org/wiki/E-Prime

It introduced me to the idea that 'is' should be treated very carefully. Any assertion that uses it, outside of strict formal languages, is a half-truth at best. It also heightens the emotional tone of a discussion. If you say "John is foo" you tend to create the impression that John always will be and always has been foo. Foo-ness is a taint on his soul. Contrast that with reformulations that make it explicit that John's foo-ness is a fleeting association related to his present situation, your current perception of it, and the current socially accepted meaning of foo along with all its implied baggage.

I realise I might be rather off-topic :-)

TazeTSchnitzel 2 hours ago 1 reply      
> Then the entire thing spiraled out of control: people not only railed against TTIP but took their opposition and looked for similar contracts and found CETA. Since both are trade agreements there is naturally a lot of common ground between them. The subtleties were quickly lost. Where the initial arguments against TTIP were food standards, public services and intransparent ISDS courts many of the critics failed to realize that CETA fundamentally was a different beast.

CETA has ISDS as well, and if only on that point alone, CETA is objectionable. This argument comes off as disingenuous, the similarities between the deals are not imagined. ISDS isn't even the only similarity; CETA also contained objectionable new copyright provisions (though apparently those are mostly gone now), for example.

cyberpanther 2 hours ago 1 reply      
A very common cognitive bias or logic pattern our brain follows is to whitelist or blacklist things. When we trust something, we follow it without question or we begin rationalizing it no matter what. And in the day of the internet and Google we can confirm basically any bias we have on either side of an issue.

You should scrutinize your own thoughts and opinions, and those of others, to see if you're just believing something because it was true in the past.

In terms of JavaScript, there is definitely a lot of hate out there for the language and ecosystem, which at one point was entirely warranted. But I would argue JS has the best trajectory right now of any language out there. So you'd better learn it if you want to stay relevant in development.

Lastly, I've found it best to not be so opinionated about everything. Sure, having some opinions is great, but you develop too many biases otherwise. So what if something sucks; use it anyway. You might learn something new, or maybe you can help improve it if it has potential.

crawfordcomeaux 2 hours ago 2 replies      
Our realities are each a collection of stories we tell ourselves. Sometimes parts of the stories two people believe will overlap, and we'll call those opinions or facts depending on the situation.

I'm finding it helpful to view every signal my body encounters as a chance to choose how to process it, including what I do, taste, or hear.

Since adopting this view, I've effortlessly enjoyed eating foods I've hated my entire life (tomatoes, olives, CILANTRO?!), listening to country music, and doing things like chores that used to bore me to tears.

If anyone sees danger in learning to view the world that way by default, I'd love to hear about it.

mooreds 16 minutes ago 0 replies      
Try not to move the goalposts. If someone compromises, acknowledge that and thank them for it, rather than saying "I am glad you finally saw the light, but now we need to take it a step further".

Brinkmanship rarely serves to get anything done, and burns bridges when it does actually accomplish something.

carsongross 1 hour ago 0 replies      
I naturally see both sides of almost any argument, and my personality is such that I would rather synthesize the arguments of both sides into a final position via dialectic.

I have lost almost every major argument I've had in a corporate environment.

kstenerud 41 minutes ago 0 replies      
It's unfortunate, but we have a tendency to take some beliefs so deeply that they become a part of our core identity. Once this happens, validation of the idea becomes validation of ourselves. Attacks upon the idea become attacks upon ourselves.

Once someone has reached this point, logic simply cannot reach them. Successfully defeating their arguments will only strengthen their resolve (the backfire effect), because they're being driven by the amygdala, which only understands threat response. They will grab onto any argument, no matter how flimsy, and be completely unaware of how little sense it makes. Any further argument with them will at best do nothing, at worst make you look as much a fool as he.

The wise man learns to recognize this state and back off.

dorianm 3 hours ago 3 replies      
For comparison, Ruby 3 is going to introduce a pretty big breaking change (frozen string literals), but they already shipped a way to optionally enable it per-file (a magic comment) and globally for the Ruby interpreter (just a flag), so that all the libraries and projects can slowly fix it in a compatible manner (often just calling .dup is enough).

So when it's time for Ruby 3, the transition will be pretty painless.

More info: https://wyeworks.com/blog/2015/12/1/immutable-strings-in-rub...

(Frozen string literals allow a string literal to live in memory only once instead of being reallocated each time, so it's a pretty big memory and CPU optimization.)

(Also, for instance, RuboCop already recommends adding the magic comment to all Ruby files.)

Unman 56 minutes ago 0 replies      
Hmmm... while agreeing with the sentiment I am unimpressed by the lack of evidence for one of his supporting examples. What stood out for me was this bald assertion with no reference to falsifiable specifics:

"_Not only was it already a much improved agreement from the start_, but it kept being modified from the initial public version of it to the one that was finally sent to national parliaments."

Either the writer of this is an expert on the topic, well-known in the field and the weight of this judgement on its own is a valuable primary source; or, the writer is referring to such an analysis conducted by other experts but has not bothered to include a citation/link; or, the writer has their own critique but instead of presenting _that_ has just stated an opinion which they know to be controversial.

All of the above possibilities contribute substantially to the noise around any discussion.

lazyjones 3 hours ago 1 reply      
It's not the responsibility of the author to anticipate future changes that might weaken his current arguments. The reader is responsible for taking into account the time and context of the text they are reading.

It's why we like to have "(2013)" added to anchor texts on HN, for example.

pimlottc 1 hour ago 0 replies      
This brings to mind a fantastically lucid comic about the utility of questions vs answers:


After all, an opinion is just an answer to the question, "What do I think of this?"

xtiansimon 1 hour ago 0 replies      
Headline: Engineer cries Political Arguments are not 'valid'; Forks off own nation.
dorfsmay 2 hours ago 0 replies      
Part of the issue is efficiency: we have to make choices and cannot reevaluate everything constantly. Also, we can't be specialists in everything.

So with programming languages, we have to pick a few and become good at them. It's one thing to take another hard look when applying for a new job, for example, but we cannot keep track of all programming languages and their evolutions.

rdslw 1 hour ago 0 replies      
Paul Graham, in one of his best essays, explained similar concepts, writing: "I finally realized today why politics and religion yield such uniquely useless discussions"

Highly worth reading: http://paulgraham.com/identity.html

simonhamp 3 hours ago 0 replies      
I think a sideline point here is to not appropriate other people's opinions from a specific point in time just because they happen to align with yours (opinion/bias) at the current time.

And of course, try to have as wide and deep an understanding of the subject as possible before forming a strong publicised opinion in the first place.

datashovel 2 hours ago 0 replies      
It may be less the responsibility of the "consumer" of the information and more the responsibility of the "producer" of the information.

If the argument is presented as if something is and will always be a certain way (or even if the argument is presented without admitting that something may change) it can probably lead a lot faster to groups of people assuming the argument will be valid forever.

EDIT: Or can be misinterpreted that someone presenting an argument believes the argument will remain valid forever.

btw. never saw the talks the author cites, and have not followed the trade agreements very closely so I'm only speaking generally here.

9mit3t2m9h9a 3 hours ago 0 replies      
I think the effect described in the text has another side: imagine that at some point using XYZ was obviously a bad idea for multiple reasons for a specific person in specific circumstances. Obviously, keeping track of the changes in XYZ will have a lower priority for a person who is not going to use XYZ anyway, even if one of the multiple show-stoppers gets fixed/changed/redesigned. This means that the person's opinion about XYZ slowly gets stale.
minusf 1 hour ago 1 reply      
for me personally it is news that the_mitsuhiko is "not vocally against python3 anymore". i cannot find any other recent blog posts besides this one where python3 is praised or fully encouraged. so why be surprised if people still think he is a big python3 critic?

as i see it, the issue is less about parroting others' outdated technical opinions, it's about not being vocal enough about the change of heart.

michaelsbradley 2 hours ago 0 replies      
Hear! hear! I also recommend, more generally, reviewing logical fallacies, cognitive biases, and misconceptions as part of a regular self-review. It's important to keep a flexible mind, though achieving greater degrees of interior freedom is hard work.




ak39 1 hour ago 1 reply      
Good article.

A list of some of the things I don't like, for which I occasionally have to take another peek to see if I'm finally wrong:

1. (In languages) Garbage collection and the idea of "safe code". I didn't like it then and still don't.

2. ORMs

3. (Relational) Data models with compound keys flying around as FKs everywhere.

4. The idea of self service BI (like PowerBI etc in the hands of a business user)

5. Regexp

msinclair 2 hours ago 0 replies      
Except Internet Explorer... that will always be the same. :)
Show HN: Cloud Commander orthodox web file manager with console and editor cloudcmd.io
32 points by coderaiser  3 hours ago   6 comments top 3
60654 1 hour ago 0 replies      
Hey, that looks like Norton Commander in the browser! Even the icon is an homage to the original. Neat.

Also TIL that "orthodox" file manager is an actual term, and there's even a huge online book about them [1]. That's funny, we used to call them just "file managers"... ;)

1. http://www.softpanorama.org/OFM/Paradigm/index.shtml

PuffinBlue 2 hours ago 1 reply      
Anyone want to give a run down of what this is/does?

The live example isn't working either.

Gruselbauer 2 hours ago 0 replies      
This looks hella useful. Checking it out today. Thanks!
DeepMind and Blizzard to release StarCraft II as an AI research environment deepmind.com
784 points by madspindel  21 hours ago   304 comments top 41
theptip 18 hours ago 8 replies      
This is pretty interesting.

DeepMind's last triumph (beating the best human Go players with AlphaGo) is impressive, but Go is a great fit for neural networks as they stand today; it's stateless, so you can fully evaluate your position based on the state of the board at a given turn.

That's not a good fit for most real-world problems, where you have to remember something that happened in the past. E.g. the fog of war in a strategy game.

This is a big open problem in AI research right now. I saw a post around the time of AlphaGo challenging DeepMind to tackle StarCraft next, so it is very cool that they have gone in this direction.

When Google's AI can beat a human at StarCraft, it's time to be very afraid.

formula1 20 hours ago 12 replies      
I suspect this will eventually lead to AI as a service for games. Rather than build a terrible AI that delays a game by months, approaching a company that can build a decent AI up front, one which gets better over time, would probably be ideal and create better experiences.

I'm curious if a startup can be built from this.

rezashirazian 15 hours ago 7 replies      
If DeepMind is planning on building an AI that can beat the best human SCII player, they have their work cut out for them.

I'm not sure how familiar people are with StarCraft II, but at the highest levels of the game, where players have mastered the mechanics, it's a mind game fueled by deceit, sophisticated and malleable planning, detecting subtle patterns (having a good gut feeling about what's going on), and, on the pro scene, knowledge of your opponent's style.

chrishare 17 hours ago 2 replies      
Very interesting stuff.

Allowing researchers to build AIs that operate on either game state or visual data is a great step, IMO. Being able to limit actions-per-minute is also very wise. The excellent BWAPI library for Starcraft Broodwar that is referenced (https://github.com/bwapi/bwapi) provides game state - and was presumably used by Facebook to do their research earlier this year (http://arxiv.org/abs/1609.02993). For mine, the significant new challenges here not present in Go are the partial observability of the game and the limited time window in which decisions need to be made. Even at 24 frames per second, the AI would only have 40 milliseconds to decide what to do in response to that frame. This is more relevant to online learning architectures.

The open questions here are how freely this will be available - and in what form. Will I need to buy a special version of the game? Clearly there will be some protection or AI detection - to ensure that competitive human play is not ruined either by human-level bots, if they can truly be developed, or by sub-par bots. Starcraft 2 (presumably the latest expansion, Legacy of the Void, will be used here) does not run on Linux, whereas most GPU-based deep learning toolkits are, so having a bridge between the game and AI server may be necessary for some researchers.

Besides being great for AI researchers, this is probably good for Blizzard too, since it will bring more interest to the Starcraft series.

2017 can't come soon enough.

tdaltonc 19 hours ago 1 reply      
I'd love to see a e-sport league where the teams are AI human hybrids (Centaur teams). We know that AI human hybrid teams are great at chess [1], and I'd love to see rts games played by 'Centaur' teams. In the same way the innovations made in F1 often trickle down to consumer cars, can you imagine the advances that could be made in human-machine interactions in the crucible of a real-time Centaur gaming league?

[1] http://bloomreach.com/2014/12/centaur-chess-brings-best-huma...

rkuykendall-com 20 hours ago 0 replies      
This would be so much more fun with a turn-based game where speed isn't a variable, like Civilization. I'd love to play against several AIs that were better than me because of code, not because they get a bunch of extra in-game bonuses. With a nice AI API, you could have online competitions where AIs battled every month.
wodenokoto 14 hours ago 0 replies      
This paper is a few years old, but it gives a good overview of the problems faced when building an AI for starcraft and the methods used.

Ontañón, Santiago, et al. "A survey of real-time strategy game AI research and competition in StarCraft." IEEE Transactions on Computational Intelligence and AI in Games 5.4 (2013): 293-311.


Miner49er 21 hours ago 10 replies      
Looks like they are going to limit the APM of the AI. I wonder how they are going to decide the limit? I've never played StarCraft, but from what I understand very high APM is needed to play the game at the highest levels.
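One plausible way to enforce such a limit (an assumption for illustration; DeepMind hadn't specified a mechanism at the time) is a sliding-window cap on actions per rolling minute:

```python
import collections

# Hypothetical APM limiter: allow at most `max_apm` actions in any
# rolling 60-second window, rejecting attempts beyond the cap.

class APMLimiter:
    def __init__(self, max_apm):
        self.max_apm = max_apm
        self.timestamps = collections.deque()  # times of allowed actions

    def try_act(self, now):
        """Return True if an action at time `now` (seconds) is allowed."""
        # Drop actions that have aged out of the 60-second window.
        while self.timestamps and now - self.timestamps[0] >= 60.0:
            self.timestamps.popleft()
        if len(self.timestamps) < self.max_apm:
            self.timestamps.append(now)
            return True
        return False

# An agent spamming 100 attempts per second for 10 minutes still
# averages exactly the cap: 300 actions per minute.
limiter = APMLimiter(max_apm=300)
allowed = sum(limiter.try_act(t / 100) for t in range(60000))
print(allowed)  # -> 3000 actions over 10 minutes = 300 APM
```

A refinement would also smooth out bursts (e.g. a second per-window cap over a few seconds), since pro players' APM is bursty but bounded.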
cryptrash 16 hours ago 0 replies      
I'm pretty excited about this. I think some kids out there will really enjoy an environment like this to mess with, and maybe learn a thing or two about machine learning along the way.

Starcraft is a really fun game, and I think it's enough to engage kids a little more than something like Minecraft where there's plenty of room for some cool ML hacking, but not enough stimulation from it. Instead of just seeing blocks here or there or whatever, starcraft has hard goals that will force them to use critical thinking skills, their knowledge of the game, their own personal strategic insights, and the ML skills they accrue.

So exciting! Love the feature layer idea also, well done!

deepnotderp 11 hours ago 1 reply      
By the way, if anyone's interested, there was a deluge of deep learning papers today, and one of them basically used deep learning to design deep learning models, and it did better than humans.
Savageman 20 hours ago 2 replies      
That's so cool! I wish they could start doing AI for team-based competitive games like League of Legends, where meta-play and team decision making are important. Is that too complicated to tackle yet?
fitzwatermellow 19 hours ago 0 replies      
Training data? After all, AlphaGo trained on a database of over 30M expert human moves. I suspect one championship round from Team EnVy is worth billions of iterations of random exploration ;)

Kudos to both Blizzard and DeepMind. Anticipating a lot of fun with this. StarCraft 2 could indeed become the AI pedagogy standard.

xg15 17 hours ago 2 replies      
I applaud the idea, but I'm worried about how open the results of the research will be in the end.

I think the worst possible outcome for society would be if we ended up with capable AI but with the algorithms only accessible to a handful of "AI-as-a-service" companies.

The second concern is understandability of the algorithms: from what I've read, it's already hard to deduce "why" an ANN in a traditional problem behaved like it did. An AI that is capable of coping with dynamic, unpredictable situations like an SC game (with only pixels as input) is impressive but it seems less useful to me if it's not understood how that is done.

jasikpark 15 hours ago 0 replies      
Comparing AI and humans in games is not useful unless all limitations are controlled for both parties. The artificial intelligence should only get the video output of the game and output to simple controls, with human-reflex-like lag on how long it takes for the controls to take effect. It just comes down to the scientific method.
simopaa 19 hours ago 2 replies      
I would love to keep a 24/7 stream open where the AI is continuously playing against the game's own bots in order to improve its playstyle.
brylie 20 hours ago 0 replies      
It would be cool to see a project like this for an open source game, such as OpenTTD (Open Transport Tycoon Deluxe, http://www.openttd.org/). The AI developed by interacting with the OpenTTD economy might even prove useful for urban planning of real geographic regions.
bluebeard 20 hours ago 0 replies      
This will be good for games moving forward due to the meta changing for players as the AI adapts to their tactics and vice versa. Lessons learned from this can then be applied to other areas. And as an added bonus it creates more interest in AI research.
simonebrunozzi 12 hours ago 0 replies      
This is a wasted opportunity for other strategy games to become the most played game on the planet.

I sometimes play strategy games and I always find the AI disappointing. Any game with a great AI would be my favorite for years. Heck, I would even pay a few dozen cents/hour to be able to compete against a proper AI.

seanwilson 14 hours ago 1 reply      
Can a simulation of a complete StarCraft game be done quickly, and assuming it can't, doesn't that present problems for training an AI? For example, I'm guessing complete games of Go are orders of magnitude faster to simulate, which makes it more practical to do things like having AlphaGo play against itself to train.
oblio 8 hours ago 0 replies      
I wonder when we'll be at a point that a small, portable AI (such as the one included with games) is actually competitive with decent humans.
ambar123 11 hours ago 0 replies      
I will consider AI to be at human level when it can fully play GTA: San Andreas.
prawn 9 hours ago 0 replies      
Makes me wonder if any game companies have seeded empty servers with bots, acting as humans, to give their games a sense of popularity.
plg 16 hours ago 4 replies      
I understand the challenge, and the importance of the proof of principle. But again? Having done Atari games, then Go, at what point exactly does Google DeepMind start attacking some real-world problems of consequence?

Or maybe the answer is never, other companies are supposed to do the hard work? We only play games?

KZeillmann 20 hours ago 4 replies      
This is so exciting. I've always wanted to program bots to play online games -- mainly for learning purposes. (Can I make a bot that plays better than me?)

But I've never done it because of the risk of bans. I'm glad that Blizzard has opened it up for people to experiment with this. I wonder how it will interact with any sort of anti-cheat systems in place, etc.

luka-birsa 20 hours ago 10 replies      
Anybody else see a problem with training the AI to move troops around a battlefield, with the purpose of exterminating the opponent?
randyrand 20 hours ago 0 replies      
I hope the AI will have a handicap on the speed of mouse movement. IRL you can't just teleport the mouse around the screen.
ChrisAntaki 20 hours ago 0 replies      
> We're particularly pleased that the environment we've worked with Blizzard to construct will be open and available to all researchers next year.

This is awesome. I've only ever reached the Platinum league in Starcraft II (1v1), but I'd almost feel more driven to create bots to (hopefully) surpass that skill level, than actually playing the game.

andrew3726 20 hours ago 0 replies      
This is really good news! Let's hope DeepMind can improve even further on their Differentiable Neural Computers (DNCs), which seem like a requirement for this kind of AI to work (exploiting long-term dependencies). I also hope that other research/industry teams will join the competition to create competing AIs. Very exciting!
kleigenfreude 12 hours ago 0 replies      
First we teach it Atari games, and now strategy and war?

Why not give it lots of data to solve real problems? Training it on useless games will have no benefit.

tylerpachal 19 hours ago 0 replies      
For anyone looking for more information about StarCraft 2, the world championship is on this weekend and the broadcast for today has just started (16:30 EST)


lanius 20 hours ago 1 reply      
I can't wait to see how far DeepMind can go in this area. I was initially skeptical that AlphaGo could defeat top human players, but then it happened. Who knows, perhaps one day AI can compete against progamers in GSL!
pizza 18 hours ago 0 replies      
Nice! I cracked a joke about how SC2 was nothing more than an AI testbed just last week, lol. Very glad to see it's becoming a real thing!
Havoc 17 hours ago 0 replies      
This is awesome. I know there is a happy AI community on the SC1 front, so I'm glad to see Blizzard doing anything on the SC2 front.
andr 18 hours ago 0 replies      
I wonder if AI will take over eSports (Twitch, competitions, etc.), as well. It could be a variant of the Turing test.
simpfai 20 hours ago 2 replies      
Does anyone know of any resources for someone looking to learn more about Game AI for real time strategy games?
noiv 20 hours ago 1 reply      
Hopefully there's a JS/Spidermonkey interface. I'd be happy to port Hannibal from 0AD.
komaromy 20 hours ago 0 replies      
My mostly-uneducated o/u on the time to beat the best human players is 8 years.
felix_thursday 20 hours ago 0 replies      
OpenAI vs DeepMind?
partycoder 20 hours ago 0 replies      
That's great. AIs on SC1 relied on many hacks. Initially I thought that DeepMind was going to create a bot for the original SC.

I hope some of the advances in SC2 AI can be integrated into the in-game AI. e.g: a trained neural network that plays better than the "hard" AI, but can run on a consumer box and not on a massive cluster.

flamedoge 20 hours ago 2 replies      
This is dangerous. Overmind will come to life.
FlyWeb - An API for web pages to host local web servers flyweb.github.io
252 points by collinmanderson  14 hours ago   57 comments top 25
SchizoDuckie 45 minutes ago 0 replies      
Oh great, yet another way for ad services and botnets to talk to each other...

Because yeah, that's where this is going to be used first and foremost.

Most people are not intelligent enough to understand how to secure their internet banking, and now we're going to bake in hosting TCP-connectable servers?

These security prompts better have some real clear language and require giving permissions every time.

Now, I can see some good things for this too: start a FlyWeb server from your desktop and easily transfer some stuff from your phone, for instance (something that still sucks in 2016)

I just think that most of its use will be malicious.

macawfish 7 hours ago 2 replies      
I'm very excited about this! I've been waiting for something like this for a long time!

To me, the power here is in using the technology to foster local, human-scale interaction.

Intranets are totally underutilized. How many people do you know who can reliably transfer personal files over a local area network? Not nearly as many as those who know how to use Google or send an email... that's absurd to me, given how ancient an application file sharing is.

It's my opinion that the survival of the internet may very well rest on p2p webs like this.

nstart 10 hours ago 1 reply      
This is potentially huge. If other browsers can also jump into this, we could potentially see a rise in a new generation of apps that are local-first and enable much richer real-time collab features. I might be reading this wrong, but this could also usher in a better and open implementation for IoT devices to provide interfaces for the user. Excited to adopt this and try some experiments out.

One interesting thing to figure out is the combination of local and global. When I have an iot device and I'm away from home, or someone collaborating with me from a different location, the same app needs to fall back to using standard internet based interfaces. Not sure if that disqualifies it from being a potential use case of this.

realworldview 3 hours ago 1 reply      
I'm obviously reminded of http://www.operasoftware.com/press/releases/general/opera-un.... Is Mozilla reinventing the web again, too? Will this follow in the footsteps of Persona and all those other Mozilla experiments? I would prefer less focus on such a public lab space where money is thrown, and greater focus on reality. Sorry for the negative view, but Mozilla appears to have the same lack-of-focus problems as many other companies.

Edits: speling correctoins

Matthias247 1 hour ago 0 replies      
I certainly like the discovery feature through mDNS. Could be helpful for a lot of scenarios. Windows already allows something similar out of the box by showing UPnP devices in the network browser; double-clicking on them navigates to their advertised webpage. That makes it Windows and UPnP (instead of mDNS) only, but it works with all browsers. Having it directly in the browser would allow having it on all OSes, which is certainly also good.

I understand why they hide the real IP addresses behind UUIDs, but I think there should be an option to also convert it to the real IP/host address. Because often you want to share the address of the embedded device with your coworker, use the address in another tool, and so on.

However, I'm not sold on the idea and state of the webserver-in-the-browser API. It just leaves a lot of questions open. E.g. pages are often reloaded; how will this impact the experience? Or: HTTP request and response bodies are possibly unlimited streams, but the simplified API does not expose this. What attack vectors are enabled through that, and how will it limit use-cases?

shakna 11 hours ago 0 replies      
This is actually kind of cool. The discoverable features for pre-existing non-FlyWeb servers stands out to me.

Secondly, the FlyWeb server gives you access to a really flexible API for serving just about any content.

It feels like federated content, we just need to question whether it should be locked to the local network.

coldnebo 3 hours ago 0 replies      
"Curiously enough, the only thing that went through the mind of the bowl of petunias as it fell was: 'Oh no, not again.'"

[edit: ok, so it is cool, but I'm not sure it's secure, and I'm not crazy about web pages from other domains being able to set up local discovery on my network. Seems like a massive security problem. UUIDs sound like obfuscation, not security. ]

[edit: ok, well at least they've started thinking about it: https://wiki.mozilla.org/FlyWeb#Security

Would like to see this fleshed out some more. ]

formula1 9 hours ago 1 reply      
I'm not reading anything about opening ports on your router's firewall. Does this somehow circumvent that? Reading further, it seems to explicitly say "local", which probably implies you.ip.address.[0-255] is targeted.

I think this technology is intriguing, with some real use cases (more peer to peer), but the API seems disorganized. I can't tell if it wants to be another web standard or something different.

A part of me wants to dislike this and consider it a distasteful competitor to pre-existing technologies that have learned to survive without "the web". Another part realizes that sandboxing these technologies protects and enables the average user in regards to awesome tech. This certainly won't replace torrents, WebRTC or other existing p2p technology. But I certainly think it's a cute way of opening up the field.

Animats 7 hours ago 1 reply      
There seem to be two parts to this. One is a way to inventory your LAN using multicast DNS and find all the web servers on it. (There may be one in every lightbulb.) The other is to run a web server in the browser. These are independent functions.

The first seems useful. The second seems to need a more compelling use case. Also, opening the browser to incoming connections creates a new attack surface in a very complex program.
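That "inventory your LAN" part can be sketched with nothing but the standard library: multicast a DNS PTR query to the mDNS group and note which hosts answer at all. This is a rough sketch, not FlyWeb's actual implementation; the service name `_flyweb._tcp.local` is an assumption, and a real client would parse the PTR/SRV/A records in each response rather than just collecting source addresses.

```python
import socket
import struct

MDNS_GROUP, MDNS_PORT = "224.0.0.251", 5353

def build_ptr_query(service="_flyweb._tcp.local"):
    # 12-byte DNS header: id=0, flags=0, 1 question, no other sections
    header = struct.pack("!HHHHHH", 0, 0, 1, 0, 0, 0)
    # QNAME: length-prefixed labels, terminated by a zero byte
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in service.split(".")
    ) + b"\x00"
    # QTYPE=PTR (12), QCLASS=IN (1)
    return header + qname + struct.pack("!HH", 12, 1)

def discover(service="_flyweb._tcp.local", timeout=2.0):
    # Send the query to the mDNS multicast group and collect the
    # addresses of anything that replies within the timeout window.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.sendto(build_ptr_query(service), (MDNS_GROUP, MDNS_PORT))
    hosts = set()
    try:
        while True:
            _data, (addr, _port) = sock.recvfrom(4096)
            hosts.add(addr)
    except socket.timeout:
        pass
    finally:
        sock.close()
    return hosts
```

Since the query is sent from an ephemeral port rather than 5353, responders following RFC 6762 should reply via unicast, which is why a plain socket (no multicast group join) can work for a one-shot probe.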

deno 3 hours ago 0 replies      
It seems the feature here is server-less discovery (mdns). Because otherwise intranet communication between apps is already possible via WebRTC.

Along that track, it would be nice to see native DHT support in the browser, for global server-less discovery.

Unfortunately, just using WebRTC is not a great fit for a DHT, because of connection costs. Also it makes more sense to have DHT persist between app sessions.

BHSPitMonkey 3 hours ago 1 reply      
Here's a little Chrome addon I just whipped up which lets you browse and launch local FlyWeb services:


gpsx 9 hours ago 1 reply      
I think it is a great idea for devices to be web servers to allow remote devices to serve as their user interface.

I lean towards using bluetooth as a discovery mechanism rather than wifi. Google's "Physical Web" I think does something along these lines, though I am not sure whether or not they are thinking about web servers on these local devices. I think that is a key part of the idea.

Senji 10 hours ago 1 reply      
It would be interesting to see how this plays out with websockets based torrent clients.

Meshnetwork torrent trackers with DHT anyone?

pmontra 6 hours ago 0 replies      
Some tips for whoever wants to try it out.

In the desktop FF Nightly the Flyweb menu must be picked from the customization menu (Menu, Preferences, drag the Flyweb icon to the toolbar). I think Mozilla forgot about this in their page.

Another important bit of information is how to install Nightly alongside with the current FF http://superuser.com/questions/679797/how-to-run-firefox-nig...

My take on this: interesting, especially the server-side part. The server inside the browser, on the other hand, could be at best a way to drain batteries and at worst a security risk because of the increased attack surface. I wonder how locality applies to phones on the network of a mobile operator vs on a smaller WiFi network.

Anyway, if we have to rely on browsers to implement the discovery mechanism, I'm afraid it won't fly (pun intended). I'd be very surprised if Apple, Google or even MS included this in their browsers. I get the feeling they might want to push their own solutions to sell their own hardware. I hope to be surprised.

Maybe there will be apps autodiscovering those services or other servers acting as bridges to a "normal" DNS based discovery service.

Btw: Mozilla should test their pages a little harder. I had to remove the Roboto font from the CSS to be able to read it. The font was way too thin in all my desktop browsers and FF mobile. Opera mobile was OK, it probably defaulted to Arial.

rjmunro 5 hours ago 1 reply      
Is this compatible with Safari's Bonjour functionality?

Apple have hidden it behind flags in Preferences -> Advanced in recent versions, but when enabled, you get a "Bonjour" item in the favourites menu, which will show the internal settings websites of compatible printers etc. that are on the LAN.

adrianN 7 hours ago 1 reply      
I don't really understand how this is better than just running a small webserver. The discoverability feature is nice, but I think you could do the same with a port scan in the local network. Can someone explain?
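The port-scan alternative mentioned here is easy to sketch; the assumed /24 prefix and port 80 are placeholders, and compared to mDNS this is slower, noisier, and finds anything speaking TCP on that port rather than services that advertise themselves:

```python
import socket
from concurrent.futures import ThreadPoolExecutor

def port_open(host, port, timeout=0.5):
    # connect_ex returns 0 when the TCP handshake succeeds
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0

def scan_subnet(prefix="192.168.1", port=80):
    # Probe every host in a /24 concurrently, keep the responders
    hosts = [f"{prefix}.{i}" for i in range(1, 255)]
    with ThreadPoolExecutor(max_workers=64) as pool:
        results = pool.map(lambda h: (h, port_open(h, port)), hosts)
    return [h for h, is_open in results if is_open]
```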
cm3 5 hours ago 0 replies      
Sounds like extensions that delegate text fields to native editors should be easier to write, with a better ability to expose a localhost HTTP endpoint.
cm3 5 hours ago 3 replies      
How similar is this to Opera 12's built-in httpd?
forgottenacc57 7 hours ago 0 replies      
Examples of useful reasons to use?
esafwan 9 hours ago 0 replies      
Really cool. If embraced by other browsers, this could have a good impact in the IoT space.
logronoide 6 hours ago 2 replies      
Flyweb botnets DDoSing services in 3,2,1...
chris_wot 7 hours ago 0 replies      
Ok, this is cool and I think finally Mozilla have hit on something innovative. I'm going to check this out soon.
ilaksh 7 hours ago 0 replies      
This is awesome but if Chrome and Firefox both supported UDP then we could build things like this in userland. Is that happening?
alphapapa 9 hours ago 0 replies      
> Enabling web pages to host local servers and providing the ability for the web browser to discover nearby servers opens up a whole new range of use cases for web apps.

That's not all it opens up. "Enabling web pages to host servers"--who thought this was a good idea?

To top it off, later in the page, they tell users how to upgrade Node by running `curl ... | sudo bash -`. Good grief, the anti-patterns!

This FlyWeb site has me seeing red.

anysz 10 hours ago 0 replies      
"Hello Flyweb"? smh blasphemous hubris
R Course Finder r-exercises.com
17 points by rexercises  2 hours ago   1 comment top
kylebenzle 1 hour ago 0 replies      
R has done more to bring non-coders into professional programming than anything else, in my opinion, and I love that they are sticking to their roots with self-paced courses.
I don't like computers happyassassin.net
304 points by cheiVia0  7 hours ago   173 comments top 60
m_fayer 2 hours ago 3 replies      
This struck a chord with me.

Unlike the author, I think I still like computers, but only in their essence. I like programming, the detective game of debugging, learning new paradigms, getting lost in abstraction, the thrill of watching powerful automation doing its thing.

But I don't like what computers and the internet have become. Without constant mindful adjustment, all my devices inevitably become attention grabbing pushers of just-so packaged bits of media. I don't let that happen, but that's clearly their essential inclination. Keeping this at bay feels like swatting away the tentacles of some persistent deep sea creature.

I feel everyone's attention span eroding. I feel people packaging themselves for social media, opening their self-image and self-worth to the masses. I see a flood of undifferentiated information, the spread of hysteria and belligerence, the retreat of quietude, humility, and grace.

This is all downside, but lately I'm losing the upside. While I still love the technology underneath it all, more and more I feel like I'm working in the service of something that's driving humanity collectively insane.

mouzogu 4 hours ago 2 replies      
I agree with the sentiments. I don't agree with the notion of the good old days, however. It only takes 5 minutes on a PC running Windows 3.1 to remind me how much of a pain those days were at times - at least in comparison to now.

You see the difference is that I was much more patient and tolerant then. Now, thanks to the Internet - I have become very impatient, anxious and my attention span has dropped almost to zero.

I hate these things. The way technology has changed me. This is why I have grown more and more disinterested in technology and all its promises. Even though, if I were being honest, we have never had it better in terms of the range, options and diversity of the field.

I think technology has made me a worse person. More informed but less interested. It's given me more opportunities at a time when I feel most exhausted and apathetic. Perhaps this is normal considering we are going through the "internet" revolution. A lot of changes. Many of which I don't like within myself and society in general.

noir_lord 56 minutes ago 0 replies      
My GF laughs at me and says that I'm a terrible techie. Outside of computers and programming, I'm just not that interested in technology anymore. A lot of the fluff around IoT seems just that, fluff. I'm excited by the possibilities in terms of things like city management; I couldn't care less if I can turn my lights on when I walk into my house or if my toaster is connected to the internet.

I don't buy gadgets. I own tablets to watch the odd movie and for device testing, otherwise I would only have one. My phone is a 4-year-old Nexus 4 which I broke the back on and covered in black electrical tape (I could replace it but I don't care enough to do so). I use a 17" Vostro for working the odd time I can't be at my office or at home, and it's dented and has stickers stuck all over the scratches; I'm not even sure I remember what the stickers were for.

I'm just not excited by new hardware like I used to be. I only care when it'll have a demonstrable impact on my enjoyment of programming. Where once I'd have lusted after the latest and greatest, now I couldn't even name the best model of i7 or whatever; I only care about that stuff when I'm building a new desktop.

What does excite me is how technology is having a meaningful impact on peoples quality of life.

I think in a way that's just part of getting older (I'm 36).

That, and every time I interact with technology that isn't one of my Linux machines, I come away feeling like I should hunt down whoever wrote the software with a bat and some bad intentions. One of the downsides to being a programmer is that the deficiencies of everything are so much more obvious.

Prime example: I bought a LiFX 1000 bulb (WiFi/IoT bulb) to put into a ship's lamp as a Christmas present for the GF. It took me 45 minutes to set the thing up; I followed all the instructions to the letter, nada. Then I thought "I wonder if changing the wireless channel might work", and lo and behold, changing from channel 13 to channel 9 made it work.

Nowhere was that documented in the instructions (which I read), and had I not been a techie I'd never have thought to try it. My point being: where once I'd have thought "This is cool", now I just resent the 45 minutes I won't get back.

clarry 6 hours ago 4 replies      
I kinda share the feeling. Well I still like tinkering with some things that nobody else seems to care about. But most of the time it feels like doing stuff with computers is just fighting the new technology (which I don't care for) and then there's politics, copyright & contracts, things that further try to ruin it for me.

For most part I can't get excited about any of the news about software, programming languages, new services or big tech corps. I look at the front page of HN, yawn and move on. I don't care what Apple is doing, I don't care what Google is doing, I don't care about your new javascript framework or microservice, not about your new OS, I don't care about a new smartphone or laptop...

The few things I find interesting are things I keep to myself because every time I've tried to make a discussion about them, nobody else seems to be interested. Or it may even be met with hostility.

bitL 5 hours ago 3 replies      
I really think there was a hidden shift in the past 10 years, from dreamers trying to implement things helping humanity to asocial computing whose only purpose is to extract more money than humanly possible. Before, these dreamers had an edge, as the power-hungry people didn't get it; now they get it, use it to extend their power, and even use the dreamers as disposable means to reach their goals. It's difficult to get excited about it.
ensiferum 4 hours ago 2 replies      
Heh, and I don't have a problem saying that I'm looking for a career switch. In fact I've been doing software engineering professionally for 15+ years now, and quite honestly I'm sick of it. Have been for years now.

I still enjoy programming, but only when I get to program my own hobby projects and focus on the parts and problems that I find worth solving. I don't enjoy the SW dev work at work: doing stuff that I don't care about (or the world doesn't care about), solving problems like fixing build files, having a shitty tool that crashes, or all these stupid useless (meta) problems and the general nonsense prevalent in the IT/tech industry. Just as an example of what I mean, the other day I was having a problem with automake (WARNING: 'aclocal-1.14' is missing on your system.) when building protobuf (not going to get into details, it's very obscure). My motivation for this kind of (nearly daily) crap is about absolute 0. I'm sick and tired of it all.

The only reason why I'm still doing this is because I haven't realized what would be a feasible alternate job for me to do and which direction to go to.

Overall I feel like this job has changed me as a person as well. I'm extremely cynical these days about anything related to tech/IT. But hey at least I have a great taste for cynical and sarcastic humor now (for example Dilbert)!

dasmoth 6 hours ago 1 reply      
I'm guessing a similar-ish age to myself.

Besides the internet, one thing that's changed is that computing has become a much less solitary activity: in the 90s and 2000s we were still seeing the tail end of the microcomputer era which was very much built by individuals hacking away on stuff at home or in tiny businesses -- and when larger businesses hired "microcomputer" (and to some extent PC, and web) people, they still worked in very much the same way.

Today, the IT workplace is all about "teams and practices", and even if you're working on something intensely personal as a side project, there's still a degree of expectation that if you want it to amount to anything you need to get it out there as a collaborative, open source project. Or a company with other people involved.

At least for introverts, computing used to seem like something of a refuge. That's definitely less true today unless you deliberately do something that's totally personal.

MrQuincle 5 hours ago 2 replies      
A different angle, but I don't like computers that are in your face, costing time, rather than giving me time.

I don't like games. I don't like VR. I don't like AR. I don't like television. Also reading HN too much makes me feel empty.

However, I do like smart things that do stuff for me and get out of my way. I really like waking up in a warm bedroom while the rest of the house is allowed to be cold. I like the convenience of telling Alexa, "play something relaxing" when I come home from work. I like having to clean a little less thanks to a Roomba. I like not having to switch off stuff because it's done automatically. I like an AI to schedule my appointments.

Every computer that minimizes my interactions with computers or gives me time, the most precious resource, I like!

kleiba 5 hours ago 1 reply      
I guess I can relate to the general feeling, but I would say that it's not computers I don't like, it's the internet. And if you look at the bulleted list in the original blog post, you see that most of the things listed there are internet related - probably, as one might argue, because computers == internet these days.

But that's why I say I share the feeling: what drew me to computers when I was little was the tinkering with this fascinating machine that did as you told it (so you'd better tell it the right things, or else it would end up in a mess without mercy). It was a time when you felt you could still reach a point where you're actually in control; computers were still simple enough that one person could pretty much understand all of it.

This is no longer the case today. The complexity of the modern IT landscape is just intimidating. You couldn't possibly feel like you could one day be in control or on top of things anymore. Everything's changing, everything's growing at too fast a pace to keep up.

Therefore, if what drew you to computing in the first place was a personal connection and interaction between yourself and the machine, it's no wonder that that magic has gone now.

eludwig 5 hours ago 1 reply      
This is a normal part of growing up. It sounds like the author is perhaps in his mid-30s? 40ish? Am I close? Actually that doesn't matter at all, because this happens throughout your entire life. It's happened to me several times.

The secret to human interests is that they have an arc. A beginning, a middle and an end. Are you still doing the same things you were doing when you were ten? Maybe, but maybe not. I'm certainly not. There were no computers when I grew up. Well maybe a few ;)

It's natural to be bummed out when your interests (work interests, love, play, etc) change. It feels weird and uncomfortable, like we are losing something. It feels bad. You wonder if you are in a deeper funk...like real depression. Will it return? Is it a phase? You don't know.

The best way I have found to deal with this is just to watch. Observe. Hmm. I'm really not feeling this today and haven't for a while. That sucks. Don't get too caught up in it. Let the feelings rise and fall. Keep noticing. What is it that I do get turned on by? Well, I'd really like to be reading right now. So make time and do it. Let your urges take you where they will. Trust them. Let them lead you towards something that does it for you. The author seems to have that covered. He (she) is aware of things that are interesting. Keep doing these things. Let the things that interest you reveal themselves. Have faith in this cycle. It does eventually resolve itself.

I realize that this whole deal is tough due to responsibilities. Family, etc. People are counting on you. You have bills to pay. Appointments to keep. Keep them. Stick to the routine while you explore. This is important, because learning about yourself is easier when the external drama levels are low.

You will know if this course works, because you will feel better. If you still have angst and it is getting worse, then you may need to talk to a real person (a whole other kettle of fish).

My advice: listen and watch. Do what you need to while exploring what makes you happy.

marcusr 6 hours ago 1 reply      
I'm guessing I'm a similar age to the author, from the reference to parents shouting about the phone bill from my 1200/75 modem running all night. And just recently I've felt exactly the same way. My job involves both running systems and writing software, and the joy has disappeared from both. I used to work all day, then come home and hack all evening; now I don't know if I'm burnt out, but I can find no interest in making computers do cool things any more.

There's been one small bright spot - I tried learning Haskell and loved the way functional programming stretched my brain but there's an awful lot to learn to do anything useful. But Elm, wow, do I love Elm. I feel the excitement I felt when I saw Ruby on Rails for the first time ten years ago. It's finding something interesting and useful to build with Elm that I'm struggling with now.

I wonder if the message that if you're not building a product that will become a unicorn company then it's not interesting is part of the general malaise.

pmyjavec 7 hours ago 3 replies      
"Somewhere along the way, in the last OH GOD TWENTY YEARS, we along with a bunch of vulture capitalists and wacky Valley libertarians and government spooks and whoever else built this whole big crazy thing out of that 1990s Internet and I don't like it any more."

It was great fun before it all got so serious. Very funny and true ;)

jasonkostempski 46 minutes ago 0 replies      
"I use computers for...well, I use them for reading stuff. That is, actually reading it. Text. Pictures if I have to."

The other day I started thinking of a way to filter the internet down to text/plain content only. I couldn't find a way to make Google filter on Content-Type; you can filter on filetype:txt, but not all URLs to text/plain content have that extension. I also looked for an aggregator site that only allowed users to submit links to text/plain content but didn't find one, so I'm thinking about making one. Multimedia, markup, JavaScript and hypertext are all really useful, but they are abused so much that I think it would be better to start with the assumption that it's useless until proven otherwise. I'd rather have to copy and paste a URL to get a picture of a diagram relevant to an article than open the flood gates for inline media, styling and scripting just because it looks a little nicer and saves a few keystrokes.
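The Content-Type check described here is straightforward to sketch with the standard library; this is a rough illustration, not a full crawler, and it assumes the server answers HEAD requests (some don't, so a fallback GET would be needed in practice):

```python
from urllib.request import Request, urlopen

def is_text_plain(content_type_header):
    # "text/plain; charset=utf-8" -> True; parameters after ";" are ignored
    return content_type_header.split(";")[0].strip().lower() == "text/plain"

def url_is_plain_text(url, timeout=5):
    # HEAD request so we never download a body just to reject it
    req = Request(url, method="HEAD")
    with urlopen(req, timeout=timeout) as resp:
        return is_text_plain(resp.headers.get("Content-Type", ""))
```

An aggregator could run `url_is_plain_text` at submission time and reject anything that isn't served as text/plain.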

jordigh 5 hours ago 3 replies      
I don't have a phone (mobile or not). I especially do not want to carry a pocket computer around with me, but that's just because it would make me feel so powerless to have a device that can track me, that I can't hack, and I don't have certainty of what it can or can't do.

Am I just old? I'm in my mid 30s. According to Douglas Adams, that's kind of the age at which new things are just perceived as being against the natural order of things. Kids these days are being raised thinking that talking to Alexa and having it bring back accurate results is completely normal and natural.

Are there young people out there who think modern pocket computing is just plain wrong? Do they have any second thoughts about putting their entire life online under the control of 3rd parties?

teekert 6 hours ago 1 reply      
OP is getting older. And, like me, you move from nights of Gentoo tinkering to Arch, to 30-minute Ubuntu LTS installs and hoping the default config files are what you need. And in the future I see myself buying a Synology.

A child is intrinsically motivated to play; you lose this as an adult. No biggie, but your shit just needs to get the job done, and the job is not learning as much about the shit as you can. Such is life, you have other things to do now, like raising a kid, and getting enough sleep while doing it.

As with life I learned a lot when young, taking the time to learn the stuff that I still use now. Perhaps computers extend the playing age because they are intellectually satisfying for much longer than other forms of play, but eventually you're done playing.

neals 6 hours ago 0 replies      
I just wanted to add, that whenever I've worked too hard for a few nights straight and find myself on the edge of that dreaded burnout... I start to "not like" things.

I know you say you've been watching your hours, but burnout doesn't maybe just come from hours.

For me, it's the first signal I need to do something when I start to feel there's no food that is really tasteful anymore, there's no games I like playing and there's no job or person in the world that could possibly make me happy.

I get that's not the issue of the post, but maybe it's something to think about for all of us?

LeoPanthera 6 hours ago 0 replies      
I find solace by retreating back into my childhood. My home office contains a collection of obsolete yet comfortable pieces of hardware. A BBC Micro, an Amiga, a Twentieth Anniversary Macintosh, a MAME cabinet, and a small collection of pinball tables.

I can happily spend hours immersed in the past, and when I'm done, returning to modern digital life is somehow refreshing.

See also, the Computer Chronicles YouTube channel: https://www.youtube.com/user/ComputerChroniclesYT

quickben 3 hours ago 0 replies      
I'm 35 and in good balance and desire for life. I want to share that you too should:

- read your Marcus Aurelius

- listen to some Alan Watts

- you are not alone, or the first person to get existential, many before you did and many after you will. Detach from anything technology from time to time, and spend some serious time reading about who people think they are, and what all this is about.

coldnebo 4 hours ago 0 replies      
I can sympathize, and there are areas of computing that are tiresome and never seem to get better, but there are just way too many things on my bucket list: unbiased rendering, physics, AI, mathematics education, visualization, theory of computation...

Heck, there are so many research projects out there completely changing what it means to compute (i.e. Bret Victor), let alone rediscovery of what the founding scientists (i.e. Turing) had for their original vision (did you know Turing generated music from his computer? Decades before the first synthesizer?!). Or Bell Labs, or PARC.

There is so much to know and so very little time to even scratch the surface. Maybe I'll get bored later, but right now there are things to do!

qwertyuiop924 4 hours ago 1 reply      
I love computers. I hate the BS: I don't care about that new Silicon Valley project. But I do care about that new project that's going to change how we think about computing. I care about the programming language that will show me a radically different way to program. I care about the tool that's so elegantly designed that it takes 5 minutes to explain how it works, and does its job amazingly well.

I care about things that remind me why I got into computing in the first place: For the sheer joy of it.

glaberficken 5 hours ago 1 reply      
IMHO the feeling the OP is describing is nothing specific or exclusive to computers or the internet.

If you listen/read closely to people that work in all sorts of fields, this feeling is quite common.

You made a career out of a hobby you really enjoyed. After a few years it became your work, and you no longer enjoy it. You now find joy in some other activity. That new thing? That's your hobby now.

I got this impression after years of thinking of throwing myself into video-game journalism or bicycle mechanics as a profession (2 of my favorite hobbies). When I started speaking to actual video-game journalists and bicycle mechanics, I immediately noticed that I couldn't find a single one who still enjoyed his respective activity anymore.

I'm not going to try to play "psychology expert" here, but for me the reason seems pretty simple: those people could no longer spend their time playing the video games they liked or riding and fixing their own bikes. They now had to play all the games they were "told" to play and, on top of it, take notes and write meticulously about them; the bicycle guy now had to work on a bunch of strangers' bikes he didn't care about, keep up with a bunch of new bike tech he actually thought was needless bullshit, and sell bullshit Lycra shorts and stuff like that.

To this day (37yo) it's one of the decisions I think I got "the most right" in my life: not turning one of my hobbies into my job. (Curiously, this runs right against the common advice: "Take what you are passionate about and make that your life's work.")

arximboldi 5 hours ago 3 replies      
This article was very touching. I'm 28 and I feel exactly the same way. I have spent quite some time thinking about the topic. I have even used the same words in conversations with friends.

There is a generation of people that got into computers because they were a tool for empowerment and creativity. When I was a child, my younger sister would create movies by editing frame by frame in MS Paint while I would learn Pascal to make a sequencer to play "melodies" using the PC speaker bell commands. Her friend would learn HTML to create a manually updated blog where she would post fantasy short stories. On the Internet, we all hung around with nicknames in chat rooms, learned to make flashy websites, and got through the chain emails from relatives. We needed no Netflix or Facebook to share stuff; we had P2P and email and IRC. Then we learnt about GNU/Linux: the ultimate tool to get control of our machines. It was all organized chaos, instant communication that no one could control, limitless creativity, the ultimate dream of a post-capitalist anarchist society...

At some point, some got to believe that if only these tools would become mainstream, the mainstream would adopt these values. A techno-revolution!

This overestimated the transformative power of technology. What happened was otherwise: technology is now mainstream and has become a tool for social control and the ultimate frontier of consumerism. Tech didn't change society; society changed tech...

I still want to believe in these utopian values. But I understand that it is a long way, traveled in little steps whose significance is hard to see while at it. In the meantime it's often tiring and lonely to live in the computing underground. One has to explain to people why you don't have a smartphone (and it gets harder to reach people without having WhatsApp and so on), one has to explain to relatives why you don't want to work for BigTechCorp, and while trying to stay "up to date" one has to go through the angry rants of Apple users on HN [1], or the celebration of the new Micro$oft facelift, and the collective systemic submission in the startup world in this new gold rush...

The hardest part for me is to find work that I can do well and that I find valuable to the world... and still get paid for it. And I am a Software Engineer, the profession of the future! How can I be so obnoxious as to have plenty of well-paid jobs around me and not be interested in them? This makes me very sad and makes me feel deeply alienated...


[1] You are not angry because of the design of a computer; you are angry at the realization that you are so personally invested in a technology that you have no control of, but that has control over you!

erikb 6 hours ago 4 replies      
I think all technological revolutions were big game changers, and people who didn't follow them lost a lot of advantages. It is the same with this one. Now if you are 55 years old (and this guy seems to be) and you have saved enough money to stop caring, then go ahead.

But don't misunderstand that this is a luxury that you need to be able to afford. If you don't have rich parents, or saved enough money to live without the internet, you must must must find a place in the internet world that you can stay at (e.g. some FOSS 1990 style mailing list) and at least find some way to use social media (gnu social or G+ anyone?) in some reasonable way and have some kind of internet presence (e.g. a github page and some foss projects you supply commits to).

Really, make the effort if you can't afford the luxury of ignoring it. Politics always talks about the gap between rich and poor getting bigger and bigger. But the same is true for the gap between people who take part in the internet and use it to their advantage, and those who ignore it. Both these gaps already overlap to some degree, and that overlap will continue to grow!

gravypod 3 hours ago 0 replies      
Leave your job, become a welder, a package delivery guy, or pick up fixing cars.

It seems like you have only one real hobby, but no one should have just one real hobby.

Welding is really rewarding. You're working with a really dangerous machine to turn 2 pieces of metal into 1.

Working in the world of the first-class courier is great too. A lot of time driving or traveling around where you live.

I didn't realize how fun working with motors is until I tried fixing something in my car. I'm trying to get my hands on a motorcycle that's broken so I can rebuild the engine and further learn how they work.

Also, if you're fed up with the internet and still want to communicate with people, become a ham and learn all about RF propagation and other important things. Really fun, one of my favorite hobbies.

pesenti 44 minutes ago 1 reply      
I am the opposite. I used not to care much about computers (I liked math). But I have been blown away by what they enable us to do. And it keeps getting better. Yes, computers allow us to Tweet or Facebook, which may not seem like a great advance. But they allow us to send rockets into space and make them come back. They allow a majority of humans to access almost all knowledge instantly. They allow my company to develop new medicine much more efficiently. How amazing!

My advice to the OP: go work for a company that uses your computer skills to do something good, something meaningful to you. It will change your perspective.

snarfy 4 hours ago 2 replies      
I felt that way years ago.

I liked video games. I wanted to make video games. To do that requires programming a computer. OK, how do you do that? Let's go down that rabbit hole. 30 years later and I'm still going down the rabbit hole. I haven't made a video game yet, only bits and pieces and some mods, but at this point I don't really like video games much anymore. So now what?

I've been doing a lot of hardware, electronics, Arduino and general maker stuff. I still like making stuff, but it doesn't all have to be on the computer, and it doesn't have to be a game. I'm more interested in how a ham radio transmitter works than the latest JS framework these days.

hellofunk 5 hours ago 1 reply      
My main problem with computers is how they have infected global society such that human social behavior now revolves around how computers work.

Other advancements, like the automobile, also changed society, but at least once you leave your car and are having a real conversation with someone, your car won't suddenly take your attention away.

jay_kyburz 6 hours ago 0 replies      
I suspect I am of a similar generation, and I still love to tinker and make things happen on my computer.

But Twitter, Facebook, Netflix, Spotify, Snapchat, and Uber have nothing to do with tinkering or creating something.

I also don't want to surround myself with the internet of things because I know how insecure and broken everything is. I'd rather be buying appliances that I can leave for my children when I am gone, rather than buy new ones every two years.

I'm still perfectly happy with my 5-year-old MBP. I hope it will last another 5 years, even more with luck.

nul_byte 4 hours ago 0 replies      
I appreciate where this guy is at.

After work you should do what you want to do. If that includes sports, going out, eating nice food, etc., good for him. That is a balanced lifestyle, and worth the effort to make.

I am kind of the opposite, though, when it comes to computers & IT and wanting to retreat a little. I started my coding career at 43 years old. I have worked in tech all my life, so the industry is nothing new to me, but I was never in engineering / software development. I was more of a Linux / network admin / systems integration engineer or run-of-the-mill network architect (lots of time in PowerPoint and Visio (yuck!)). I always had a kind of healthy envy of developers, as I knew they were working within the real guts of computers and creating things. I was always the one trying to mop up the mess of a less bright developer who managed to get something dire into production. All this made me even more curious to get into that area myself.

With the advent of cloud, namely OpenStack and all the other devops-y applications in the ecosystem such as KVM, containers, Vagrant, Ansible, Puppet, etc., I found my *nix skills could be reinvented. I started learning Python, brushing up my shell scripting, learning about serialising data, RESTful APIs, messaging, models, views, controllers, yada yada, and in turn learning lots of new tools including Git, Gerrit, Travis, etc.

I am now loving what I do and I am super keen to learn more and more, so I do spend lots of spare time now absorbed in writing code and getting up to speed on different tooling available to developers.

Right now my spare time is spent learning Rust, as I would really like to get into systems programming and work in kernel space on networking-based apps.

It's weird, in that now is the time when I should be just specialising and not being so absorbed (a lot of senior guys at my firm do this; they are happy just to sit looking at some spreadsheet or project plan until 5pm and then go home), but instead I really want to develop a new career as a programmer over the next 10-20 years, and I love the idea of that.

I now have a laptop covered in stickers, have grown a big beard, and I go all gooey at the sight of some snazzy new framework. My wife jokes about it being a mid-life crisis.

I don't seem to be slowing down either, but in fact going quicker than ever before.

I am with him on Instagram though, and I have no idea what Alexa is.

kleigenfreude 1 hour ago 0 replies      
I was here about 10 years ago. Sick of technology. No longer wanted to learn about it. I just went into survival mode out of self-loathing: I was contributing to something that I no longer felt was a good thing.

Here's what I've learned since:

Part of what you are experiencing is real. This will never leave you and will transform you. It is part of maturation. It is natural to start seeing that what matters in the world are its life, its people, its wonder, and its love, and that you have human failings which over and over again will leave you feeling guilt for not reaching a potential. Or perhaps you will transcend this and just be OK with everything, or devote your life to doing everything the best way you can, accepting that you will fail along the way, in a way that limits self-pity.

Part of what you are experiencing is due to your health and circumstance. This is something you can affect. If you are tired, maybe you need more sleep and exercise. Maybe CPAP or an oral appliance from your dentist could help with sleep apnea. Maybe you shouldn't drink before you go to bed as often. Maybe you could see a recommended psychiatrist and get some medication. Maybe yoga, a martial art, tai-chi, or guided meditation would help. Maybe you should read more.

Computers in their many forms, but particularly mobile computers ("phones"), are way too distracting. So is streaming entertainment. Too much of our lives is wasted on them. Go buy a bicycle, or some running/walking clothes and shoes, and get out into nature. Buy a tent, camp stove, ramen, sleeping bag, inflatable mat, and backpack and go camping.

Feel like what you are doing is B.S.? If you're smart, join Geekcorps and travel to another country doing something cool: http://www.iesc.org/geekcorps . Even the Peace Corps has jobs in dev/IT:
https://www.peacecorps.gov/returned-volunteers/careers/caree...
https://www.glassdoor.com/Jobs/Peace-Corps-software-engineer...
Or if you're an engineer: http://www.ewb-usa.org/

SubiculumCode 6 hours ago 1 reply      
I also don't like computers anymore. It's from being on them all the time. I want life apart from the screen. Then I get bored.
theparanoid 6 hours ago 1 reply      
It was a shock talking to a colleague and realizing I used to have his enthusiasm; now I want to own a nightclub like jwz.
eswat 3 hours ago 0 replies      
I've been feeling this. Still not quite sure why. But I think a good reason is I'm starting to see newer technologies come out as veiled hypotheses on how to extract the most time or money out of the user, not so much as things that actually provide real, long-term value to people.

I'm not actively trying to be a luddite, and I don't think I need to stick it to the man. But I can't shake the feeling that many technologies coming out simply don't care enough about humans to warrant actually being used. That's not to disregard side projects and such. Most of the time the creation in those projects comes from pure intentions. A lot of those same intentions get thrown out the window when money and company survival are thrown in.

sz4kerto 5 hours ago 0 replies      
Neither do I. Well, I do like my main computer; I enjoy having lots of RAM, CPU power, and a large monitor, because I spend a lot of time with it.


I've had an iPad for 2 years and have almost never used it (got it as a present). I don't want a smart fridge. I run without monitoring myself all the time. I don't play on my (otherwise high-end) smartphone; I only use 6-7 apps.

The reason: I realized that this stuff is not _that_ smart yet. When I use their 'smartness', they consume more time than the non-smart things. We use a simple post-it for grocery lists with my wife, because opening Trello is much more complicated than just picking up the pen when I realize that we don't have any more garlic in the fridge.

I still enjoy hacking things for the sake of hacking, but that activity is not 'sold' as something smart that will save me time. It doesn't save any time; it just makes me feel good.

hackerfromthefu 1 hour ago 0 replies      
Yes, yes.

The signal-to-noise ratio of the modern internet has changed for the worse, western/global culture has lost its manners, and what signal there is left shows leaders have either lost their culture or their clothes...

None of these global trends have anything to do with you personally... those trends are external!

Thus even if you look after yourself, if you avoid burnout from today's frantic pace, if your hardware is ready and able to be inspired...

Then, to feel that inspiration again, you must really appreciate and nurture the inspirations you find amongst the noise.

Personally, I believe the next frontier is hacking and implementing political/social/power cultures and social mores inspired by Libre values.

lazyjones 5 hours ago 0 replies      
I second this, but I don't think it's just a matter of getting older and less curious, or being fed up with computers after years of professional work on them. For me, a large part of the frustration comes from having moved previously simple tasks and habits to complicated, complex and unstable computer-based solutions. Not only have the tasks themselves become more difficult and in some ways less efficient (reading tiny text on small displays - no thanks!), but they come with the huge burden of having to maintain an OS and network infrastructure, apply software updates, manage security risks and privacy considerations, and prevent data loss (make backups). Sure, there are many advantages to our new approaches, but the burden of complexity far outweighs them if you stop ignoring it.
andretti1977 5 hours ago 1 reply      
I was born in 1977 and have been programming since I was 7. I am a software developer. Seven years ago I had to start freelancing to enjoy my work again. Now I know that work can take no more than 8 hours per day, 5 days per week. This way, computers are still beautiful and interesting.
dictum 4 hours ago 0 replies      
I'm younger than the OP (going by the modem speed etc) but I've been experiencing the same apathy for a while (coupled with similar feelings about visual/interactive design), but I'm slightly more comfortable with it now.

Maybe I learned to deal with my own cynicism, but the turning point was probably when I started looking at my work (and computing) less as a goal/ultimate meaning and more as just another piece in peoples' lives; a way for them to accomplish non-tech goals.

lugus35 6 hours ago 0 replies      
I was in the same mood. Now I've stopped coding for a living (you can try management, presales, ...) and at home I do what I like (exploring data structures and algorithms) with the tools (Emacs) and languages I like (Common Lisp).
dendory 4 hours ago 0 replies      
I think there are multiple things here, and I don't think taking any particular stance is wrong. I love technology, in the sense that if I get a problem to solve which makes me dig deeper into an area to figure out how things work under the hood, I really dig that. But I don't use Facebook, Netflix, Siri, Alexa or any of those things. I want nothing to do with the Internet of Things. I suspect this is common among those of us who grew up with technology, as opposed to those who had technology by the time they grew up. They see technology as a service they should always have available in every facet of their lives, while we see it as something that used to be cool and mysterious but has now been wrapped up by so many commercial interests.
Yenrabbit 4 hours ago 0 replies      
Reading this made me realize I have similar feelings. But I don't think it's technology's fault - I only really got into computers ~6 years ago, and to me then they were fascinating! I think it's just that we start to use them for work, and sooner or later stop caring about how everything works and start wishing it would all get out of the way and let us browse the web or write our documents. I think it's curable though. A few days ago I dug out my early code, and felt the old excitement welling up again - I'm going to spend some time trying to find that again.
pjc50 3 hours ago 0 replies      
I wonder how much of what we're discussing here is "future shock"; we've lived in a time of extremely rapid change and a high-speed cycle of hype and disillusionment.
runesoerensen 5 hours ago 0 replies      
Just going to leave this here (from 49s): https://www.youtube.com/watch?v=ZP6lIM3OAFY&feature=youtu.be...

"You know, I see the look on your faces. You're thinking, 'Hey Kenny, you're from America; you probably have a printer. You could have just gone on the internet and printed that bitch.' Yeah, you know what? I could have, 'cept for one fact: I don't own a printer. And, I fucking hate computers. All kinds. I come here today, not just to bash on fucking technology, but to offer you all a proposition. Let's face it, y'all fucking suck."


JKCalhoun 4 hours ago 0 replies      
It may be an age thing. I've been coding professionally for maybe 30 years and have been doing more or less the same thing, interacting with machines. I find that, while I too can get momentarily caught up in the chase for a programmatic solution or the hunt for a bug, if I am truly honest with myself, there is very little intellectual curiosity left that might drive me to learn a new language or framework. And yet, I recall having this enthusiasm years ago...

I find too though that when I am left to pursue my own projects at home, on weekends, some of the magic comes back a bit. Perhaps it is just Corporate America that has sucked the life out of my soul when I am at the workplace.

chridal 5 hours ago 0 replies      
I loved this post, and so I wrote a small "reply post". http://valleybay.me/2016/11/05/death-of-the-internet/
thght 6 hours ago 1 reply      
I think you shouldn't dislike computers just because you don't enjoy what other people do with them. But part of the magic of computers and the internet in the 90's is definitely gone, forever, true. But hey, would you prefer to go back to connecting to a BBS with a 14k4 modem? I prefer my wireless 340Mbps broadband modem, really.

Fortunately I do enjoy every new day and can still become excited about new technology, which is emerging all the time. And I truly believe that computers and the internet have become much better and ever more interesting. You just have to be very selective in the vastness of things out there.

arekkas 6 hours ago 2 replies      
"The world hates change, yet it is the only thing that has brought progress."
tibu 6 hours ago 0 replies      
What I still like is creating valuable solutions through programming. That's what made me a computer maniac when I got my ZX Spectrum, and it's the part I still mostly enjoy: writing some code and watching how others use it.
tech2 5 hours ago 0 replies      
I ended up feeling similarly, no longer hacking at home, no more linux installs on my home machines, etc.

Instead I started other hobbies: I repair physical things (mechanical, electrical, electronic), I enjoy photography, and I work on my car.

I used to make the joke that if ever computers were no longer a thing for me that maybe I'd move to New Zealand and make violins for a living... that time isn't here yet, but I can feel it.

creyer 4 hours ago 0 replies      
One should imagine the days without computers... waiting days for a letter to arrive... Even if you don't like computers, you might still like many of the benefits that come with them... so think of them as a necessary evil.
erikbye 6 hours ago 2 replies      
Not using/liking Netflix, Spotify, Snapchat, or Uber has nothing to do with "I don't like computers".
apeacox 5 hours ago 1 reply      
This is a modern manifesto. I'm almost there, just a bit less, for now.
z3t4 6 hours ago 0 replies      
when something feels like work it usually is!
fagnerbrack 7 hours ago 0 replies      
That's really interesting.
debt 1 hour ago 0 replies      
Congrats, you've reached the end of your programming career.

Sitting at a computer all day actually is quite uninteresting and boring. It's the light at the end of the tunnel or big ah-ha moment many programmers have.

It's simply more fun to socialize all day.

Many big projects, say self-driving cars or FB or whatever, don't actually require that many antisocial, introverted engineers (only tens of thousands), so things like that will always get built anyway.

It pays well, but it's not good for your health to sit at a computer all day, nor is it fun to socialize only through a chat screen all day.

Time to take a break.

andrewvijay 4 hours ago 0 replies      
Reading the bullet points with an actual gunshot sound in our minds makes them so amazing! Try it.
andrewclunn 5 hours ago 1 reply      
We have Wikipedia, a user-generated encyclopedia. We have countless musicians, artists, and amateur filmmakers just a few clicks away. We have archives of history, from the era of wax audio recordings to papers from the 1800s. Want your old internet back? It's as simple as not creating an account for anything, not bookmarking portal sites, and running an ad blocker.
cairo_x 6 hours ago 0 replies      
Everything in moderation I guess. A good comedy podcast every now and then is quite therapeutic, but a lot of it can become like being possessed by an insatiable trivia-demon demanding to be fed 24/7.
owenversteeg 3 hours ago 0 replies      
I guess I'm not as far along as the OP is, but I can definitely feel myself getting there.

The meaningless Internet bullshit used to be meaningless, but mean a lot to me; the big news would be a 2% drop in Firefox users or something and everyone would lose their minds. Now, the meaningless Internet bullshit is some site with no vowels and no revenue selling for twenty billion dollars, and it actually means something because twenty billion dollars is a lot of money in the real world; for a sense of perspective, read this [0] and realize that twenty billion dollars could supply all of those things yearly for _a decade_. And yes, I know that 15 years ago there was a bubble too, but it was a lot smaller. In 1999 there was barely north of ten billion total invested in software by VCs.

I don't know exactly when it was, but at some point I went from excitedly tracking the latest versions of distros, googling "shareware" and installing whatever I could find, getting wrapped up in flamewars, and formatting my hard drive every week (and it was a hard drive, not an SSD; in 2010, the price of a 120GB SSD dropped from $420 to $230) to none of that.

I think that point was systemd. Six years ago, the initial version was released. That was 2010, and things seemed different. No way I would accept a complex, huge init system on MY carefully tuned $distro_of_the_week.

Today? I'm 100% in support of systemd. It makes my life easier. I have zero desire to tweak a complex mess of init scripts. And sure, I run Arch Linux, but that's mostly because it Just Works (tm) and I'm used to Linux. If someone gives me a Windows box, I won't lecture them on how they're contributing to the downfall of humanity, I'll take the damn machine and write the code they're paying me to write. I shudder at the thought of googling "shareware" and just randomly installing programs, and it looks like I'm not the only one; it seems that trend died... yep, around 2010. [1]

I no longer give friends USB drives of "cool software", and if they gave me one I'd think it's a strange joke. I no longer read stuff like WinSuperSite; I'm sure Paul Thurrott is still churning out the same quality content as always but I have no interest in reading about the latest features of whatever.

[0] https://www.cardonationwizard.com/blog/2011/07/01/unicef-usa...

[1] https://www.google.com/trends/explore?date=all&q=shareware

[edit] Turns out WinSuperSite is gone. Or, technically it's there but it's not Paul. "SuperSite Windows is the top online destination for business technology professionals who buy, manage and use technology to drive business"... wow, that's depressing. The old URLs even 404. :(

fit2rule 4 hours ago 0 replies      
Computers are broken.

Wait, no. Operating Systems are broken.

Wait, no. It should all just be the Web.

But wait, no .. the Web is broken.

Ah well, I guess it's time for something new. Something not-broken ..

Wherefore art thou Macintosh? asymco.com
22 points by tambourine_man  1 hour ago   16 comments top 8
bshimmin 12 minutes ago 0 replies      
Not that it really matters, but "wherefore" means "why", not "where". Juliet wasn't wondering where Romeo was, she was lamenting that he was born a Montague.
marricks 54 minutes ago 1 reply      
This piece seemed kind of interesting and then lost me when it led to a multi-paragraph, heavy-handed metaphor:

> But Apple's immune system was suppressed. It allowed a disruptor to emerge from within. Apple gave birth to its future by suppressing the reaction to that new seemingly parasitic organism. It took an immense willpower to allow this to happen.

It's strange that they're criticizing Apple for successfully making a product, the iPhone, that can take over a lot of your PC work: browsing, email, basic things.

Clearly Apple is trying to forge a distinct MacBook Pro path for themselves without diving into phone/tablet territory. Until we have more reviews and impressions of the Touch Bar out in the wild, it's a bit early to write off that effort.

kryptiskt 32 minutes ago 2 replies      
Apple should make the Mac business a separate business unit (they could call it Apple Computer...). Then that unit can concentrate on making the best computers they can. As it is now, the Mac is secondary in importance to Apple's management, and it also takes their attention away from their core phone business at times. Why not just spin it off? Apple can still own 100% of it; the important part is to get a management team that is completely focused on that business.
imron 29 minutes ago 0 replies      
Without the mac there is no iPhone.

Before disagreeing, ask yourself what every single piece of iPhone software was written on.

Someone has to make the software and you ignore developers at your peril.

auggierose 9 minutes ago 0 replies      
> capturing over 60% of the available PC hardware profits

These profits are not available, they are MADE. Mostly by Apple.

mturmon 57 minutes ago 0 replies      
An article on the MBP hardware evolution with some vision and intellectual substance instead of ankle biting. Well done.
jerf 11 minutes ago 0 replies      
The future is not touch. The future is mobile. Mobile is constrained to using touch interfaces, or even more precisely, mobile is constrained to using a touch interface on the active screen (as opposed to the touch interface of a trackpad or something). Mobile did not choose this; it was forced on it because there is no other choice. There's no room to put any other input devices on the phone itself, and nobody is going to carry around extra mice or something. (We could barely get some people to use styluses that fit in the device itself.)

It's weird to hear people waxing poetic about touch nowadays, because now that we've all been using touch interfaces for several years, we all should be realizing that they aren't necessarily that awesome. We've all by now used a map app that, as nice as the touch interface was, just didn't quite gel in terms of whether we were zooming, panning, twisting, or trying to select a single location. We should all be able to observe that the act of using a touch interface also means that we must block the line-of-sight to that same interface, meaning both that the touch thing we are interacting with can't feed back in the most natural place to do so, and that touch interfaces must be very large to accommodate the lack of precision. We must all have dealt with the accidental dialing brought on by the touch interface that can't distinguish between fingers and buttocks, or the stray emails we deleted in the act of simply picking up our phone, or the accidental order placed when we tried to clean a drop of water or a speck of dust off the screen. And I don't care how good you are with that touch keyboard, you'd be better off on something that could have custom-built haptic feedback, and ideally, haptic feedback you can "feel" without triggering inputs, too. Touch screens are still the best solution for mobile, but they are clearly not "all that".

From this point of view, where mobile is the future but touch just happens to be along for that ride, bodging touch on to a laptop isn't that impressive and there's no reason to believe it's the wave of the future. I've actually got "touch" on my laptop, in the form of my trackpad, and for a 2D slab, I get rather a lot of distinct inputs out of it. Remember that everything I can use that for comfortably is one more opportunity that I don't need a "touchbar" for. (That's been my real criticism of the touchbar; it's not necessarily intrinsically a bad idea or fundamentally useless, it's that the list of things for which it is the best solution is rather short. And I'm not saying it's empty... just short. In particular, it's much shorter than the list of things that it could be used for, but for which it's not the best solution.)

Further, despite not particularly wanting it, the laptop I'm typing this on has a full touchscreen. Mostly I remember this when I go to clean the screen of some bit of dust and my mouse cursor starts jumping around. I don't need it for much. Even when acquiring a button on the screen, the touchpad is faster than removing my hand from the keyboard and tapping the screen. My touchpad already has several useful gestures, draining away the marginal utility of other touch interfaces. It's not that useful, even though it's sitting right in front of me. Now, it's not a 2-in-1, where I can at least see the use case, but neither is the new Mac, right? It's just not that useful. Not useless, but not very useful.

It's not touch that's the future, it's mobile. The still-not-yet-mature-but-still-inevitable "mobile phone that docks to a desktop and provides its guts" will be a hybrid device, and the "touchiness" is irrelevant.

(Actually, now that I think of it this way, the Nintendo Switch may be the closest thing to a successful implementation of that I've seen in a while. Perhaps if that succeeds, it'll open the doors to computer versions of that idea.)

tempodox 27 minutes ago 0 replies      
I think the article has some good points, but I disagree with calling the keyboard an indirect input method with respect to text. Until touchscreen keyboards feel like a physical keyboard, I don't see them holding a candle to one when it comes to entering more than a few characters. There's a good reason you can connect a physical Bluetooth keyboard to iDevices. And while a keyboard / touchscreen combination is certainly possible, it's also not as ergonomic as moving your hand just a little bit to reach a trackpad or mouse from the keyboard. Excitement about new tech is not a good long-term replacement for ergonomics.
Ask HN: Do you still use UML?
81 points by dmitripopov  5 hours ago   108 comments top 48
bane 1 hour ago 0 replies      
In its entirety? No. Never go "full UML"; it will annihilate productivity and produce abominable software. It also tends to be taught wrong: it's presented as a design tool, but there's such a tremendous impedance mismatch between the diagramming tools and the way the code actually gets written that it ends up doing more harm than good (not to mention that the people who end up creating the UML might be many organizational layers away from the developers).

But there's some goodness in there. The principle of using diagramming as a descriptive documentation and communication tool is highly worthwhile, but again it should be limited to the pieces of the system that need such things. In addition, the level of detail should be just enough to communicate what's necessary -- don't "prematurely optimize" by trying to document every bit of the system in excruciating detail.

There are also often better, simpler ways to document many aspects of a system; a few boxes and arrows work well for many things. Lightweight versions of the Archimate style work well for describing complete systems. Protocols are well described by a lightweight treatment of sequence diagrams, etc.

They'll often go out of date as quickly as you make them, so keeping them up to date and well versioned turns into a challenge.

Because it's free and provides cross platform compatibility (and the diagrams are supposed to be communication devices), we tend to use yEd for most things.

febeling 3 hours ago 3 replies      
I use boxes-and-arrows sketches a lot. The UML that was so popular around 2000 was a detailed quasi-standard graphical language. It was very centred around being correct, around diagrams being of one of a number of permissible types, and so on. That whole part I never found to be too helpful.

It is useful to draw ideas as graphics for people whose brains are wired visually. And it can make nice figures for books and articles explaining structures and concepts. But in neither case does the value predominantly depend on the depictions being adherent to a standard, as much as on other qualities, like focusing on the right part of a larger system, or leaving out unimportant detail, etc.

So nonstandard diagrams offer the author or user more creative flexibility, which is often very important.

I do see value in loosely following UML notation, for the obvious reason that one can immediately see if someone tries to show classes, states, requests, systems parts, and so on. That was probably the original goal behind UML all along, even if people lost sight of it during the fad phase.

smoyer 3 hours ago 5 replies      
UML can be a great communication tool in specific situations, but when it becomes a religion your organization will suffer. I'm older than most here; I drank the Kool-Aid that predicted code generation and round-tripping, but spit it back up before the poison had a chance to set in.

A bonus comment for the young'uns ... When you hear that some new system will allow "the common man" to write his own software without developers, smile and agree, because they'll come back when it doesn't go as planned and you can charge a higher rate for the resulting expedited project.


I should also admit that I liked (like?) the idea of writing code using diagrams. In the '80s I wrote a program I called "Flo-Pro" in Turbo C that never quite became self-compiling. It wasn't at all OOPish or FP. In the '90s I wrote several tools in Prograph [0] (now known as Marten) but was stymied by the fact that I was the only one in the company using the tool. In the early aughts, I tried UML tools that promised to write my code from the diagrams - it worked for very simple code but I never saw round-tripping work.

I love drawings in general - my coworkers joke that it's not a meeting unless I have a dry-erase marker in my hand. But those diagrams are invariably system-level, architectural drawings. As others have noted, I also appreciate ERD as a way to visualize relationships in RDBMS. So as much as I like the idea, development stays in the world of text - I'm not holding my breath for some magic bullet.

[0] http://www.andescotia.com/

jschwartzi 5 minutes ago 0 replies      
I use sequence and communication diagrams, but I find the other diagramming techniques less applicable to the type of work I do. In the systems I work with most problems arise from transitions between states and from communication issues between isolated units. I find that static models just end up having to change rapidly throughout the implementation, so I focus on creating APIs and tests instead.
sidlls 2 hours ago 2 replies      
Hacker News may not be the best place to sample for UML usage. It consists mainly of two communities that have a bias against formal engineering methodologies in the development of software.

I won't comment on the pros and cons of UML. Instead I'll invite you to ask yourself a couple of questions.

1. What other clients do you support who have similar characteristics as this client (and may therefore also benefit from UML support)? If the number is significant in terms of impact to your bottom line versus the time you'd have to spend implementing it, then you should consider it worth your time, and view it as an opportunity to up-sell (if you can) or keep existing customers.

2. Do you intend to attempt to move into supporting large enterprise, and especially government contractors? If so, you might consider UML support just because it is ubiquitous there.

eksemplar 4 hours ago 1 reply      
I do, quite a lot actually.

Originally it was mostly because it was the default setting in my Enterprise Architect tool, but it's proven more useful than Archimate (and other notations) because people without architecture knowledge understand it much better.

On the business side it's mainly the system integrations, dependencies and information flows that are of value and you could honestly do them in Word if you wanted. Because it's very easy to build upon it, it's also easy to turn the business specs into something I can hand to an actual developer or use as a common language when talking features and requirements.

I wouldn't use UML if I were doing the systems engineering from requirement specs handed to me, and it is very rare that we use more than 10-25% of its full functionality, but it has enormous value when you are trying to get future system owners to understand their own business processes and what they actually need from their IT.

arethuza 4 hours ago 2 replies      
I absolutely hate UML and regard the associated tools as time swallowing abominations and the main advocates that I encountered as the worst kind of snake oil salespeople.

Of course, other people's experience may differ - but I largely thought it was a big con.

wyldfire 59 minutes ago 0 replies      
"What's your favorite editor?" pales in comparison to the divisiveness inspired by "how do you feel about UML?"

I used Rational Rhapsody for a few years. We used it for use case diagrams, sequence diagrams, class diagrams, object model diagrams, statecharts+code generation.

Many folks scoff at and draw the line at code generation. By default, tools like Rhapsody seek to box you into a certain way of doing things. It's not difficult at all to customize, but it requires effort to opt out of some defaults. I felt like I experienced significant pros and cons. On one hand it was awkward to use their IDE to edit the code. OTOH it helped encourage a level of organization in the code. Statecharts are very expressive and very clear; I really liked them. There's no limit to the expressiveness of the code you can write. But the vocabulary used to describe the widgets I was working with was new to me, so it took a good deal of time to look up and understand the customizations required.

In the absence of UML's code generation features, the diagramming features are really great. Developers are too quick to treat it like a religion and (on both sides) become inspired to pray at the altar or preach about the evil that lies within. But really, it's just a glossary of visual representations mapped to software design concepts. That's all it needs to be -- conventions like the ones used in other engineering disciplines' diagrams. Diagrams with "boxes and arrows" are just fine, but there's always the implicit question: "does that rectangle represent the process executing the 'flabtisticator executable' or the 'flabtisticator class'?"

segmondy 4 hours ago 2 replies      
Yes I do. When a team of developers has different models of how the system should work or does work, it's usually a recipe for disaster. By modeling, we get to unify our thoughts and our idea of the system.

When starting out a project, I tend to lean more towards well-labeled conceptual diagrams. I will also use activity, sequence and state diagrams.

While I have often read about people designing class diagrams beforehand and then writing or generating code, I never use class diagrams before code is written. I use class diagrams to document an existing system.

It's a tool, if you are willing to be flexible and realize it for what it is, then it's useful.

elsurudo 4 hours ago 1 reply      
UML is too heavy and rigid.

I use my own subset/version of UML, which uses a simplified "grammar", and allows you to express basically only the following:

- Class with attributes
- Parent/child relationship
- One-to-one relationship
- One-to-many relationship
- Many-to-many relationship

If I have some other need (rare), then I improvise. Usually I'm the only one who looks at these, but if a client or someone else needs to, the language is simple enough to understand that I can explain it in a few examples/sentences.

cageface 4 hours ago 0 replies      
Not exactly UML but I find rails-erd can be helpful to understand a data model, particularly if I'm trying to come up to speed on an existing codebase. It has the very large benefit of automatically staying in sync with the code.


bunderbunder 1 hour ago 0 replies      
For general purpose data modeling, I tend to favor some loose variant of Bachman or Chen notation. UML feels complicated to the point of being hard to read.

For architectural diagrams, I just use basic boxes, arrows, cans, etc. UML also tends to feel complicated to the point of being hard to read.

In both of the above cases, I think my not using UML is because its goals differ from mine. UML seeks to capture how a system comes together as completely and accurately as possible. I tend to think that the code should suffice for that (and if it doesn't, it's time to have a long hard talk about technical debt). I prefer diagrams to just be a gloss that helps explain how things come together at a high level.

For understanding protocols and suchlike, though, UML sequence diagrams are my go-to. That's a rare spot where I really do want the diagram to capture a whole lot of fine detail, and the UML standard provides a pretty clear, intuitive and uncluttered visual language for the job.

larve 1 hour ago 0 replies      
I do quite a lot. I use it for sketching out and documenting software. I use state machines for, well, state machines, which make up 90% of the software I write (embedded). They can almost mechanically be transformed into source code, but I do the coding by hand. I use sequence diagrams to sketch out and document sequences of events, and I also generate sequence diagrams from trace logs. And finally I use class diagrams to document the software architecture. For all of these I use PlantUML because the text format is simple, human readable, and easily versioned. For the kind of software I write (embedded software with a lot of state machines and a strong OO architecture), it is absolutely great. Definitely one of the big tools in my toolbox.
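A minimal state machine in PlantUML's text format looks something like this (a sketch with made-up state and event names, not from a real project):

```plantuml
@startuml
[*] --> Idle
Idle --> Running : start
Running --> Idle : stop
Running --> Error : fault
Error --> Idle : reset
@enduml
```

Run it through the plantuml jar (or an editor plugin) and it renders the diagram; the text itself diffs cleanly in version control, which is the main draw.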
kaio 4 hours ago 1 reply      
Yes, but not in the formal sense. And I don't get why agile and proper documentation shouldn't go hand in hand. There are component and class diagrams that are invaluable for me when I'm joining an existing code base, even if they are not up to date all the time.

State and sequence diagrams are really cool for discussing dynamic flows and identifying potential logic holes. UML-like diagrams are way better than coming up with your own representation of this every time.

jakub_g 2 hours ago 0 replies      
Been working in a large corp for last 5 years. Never created a single UML chart, maybe I've seen one or two.

Maybe it's because I work in front-end, which "traditionally" is a bit less strict (JS is dynamically typed etc.), but also, I think that typically the codebase changes too rapidly and the fancy graphs can't catch up with that, they get outdated in a few months, and no one bothers to update them or even look at them anymore (they might be useful in the beginning of the project though).

the_mitsuhiko 5 hours ago 0 replies      
I don't think I ever did. It feels like an enormous waste of time.
xtiansimon 1 hour ago 0 replies      
UML was useful for visually communicating document architectures in the Web 1.0 world. But what's a UML diagram for a dynamic web application? Server<-->Database, done? If the tool doesn't fit the problem, don't use it.

And then there are the domain-specific UMLs, such as Operations Management and BPMN, where the diagram can be programmatically "powered up" to analyze operational efficiency. If you work in a hierarchical organization where you need deliverables that filter to other departments, and there is a perceived value, then someone is going to be tasked to make it. But in a flat organization in startup mode, it's a waste of money.

If you're working across organizations, in public/private partnerships, or if your government organization needs to be accountable at diverse levels, then UML is a visual language that communicates a lot of information at once--in one artifact. If tax dollars are going toward new transportation infrastructure in New York City, maybe there's a need to get diverse groups on board. But you're going to pave potholes in Levittown, NY--who cares? Get it done; stop wasting money.

And finally, there is the language-cultural dimension. Europe is multilingual, so it's no surprise the Open Education Resources offering UML-like education materials are from European universities [1][2], and not American universities. That's not our language problem (yet).

If you have a customer asking for UML, you need to understand their problems. Once you do that, then you can decide if the problem vector they present is profitable sector for your company.

To put all this in other words, UML is a tool and a visual language. Use it or not, it's not going away--ever.

[1]: https://open.hpi.de/courses/bpm2016

[2]: https://www.edx.org/course/creative-problem-solving-decision...

gaius 1 hour ago 0 replies      
I'd rather use XML than UML and that's saying something.
sfaruque 50 minutes ago 0 replies      
If I must use UML, I prefer using yUML (http://yuml.me) to generate the diagrams.

Easy to use, it lets you "code" the UML structure in a simple template language, and the output looks rather nice.
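If I remember the syntax right, a tiny class diagram (with made-up class names) is just:

```
[Customer|name;email]->[Order]
[Order]->[Product]
```

Each bracketed item is a class (attributes after the pipe), and the arrows are associations; paste it into yuml.me and it renders the picture.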

alemhnan 3 hours ago 3 replies      
It is useful if you are able to generate code from it. I know one big software company (more than 3,000 employees) that generates 90% of its entire code base (~20M lines of code) this way.
euske 1 hour ago 0 replies      
UML (as ERD) is used by the official IT certification in Japan. So yes, some of us still have to learn it.

cf. https://www.jitec.ipa.go.jp/1_04hanni_sukiru/mondai_kaitou_2...

BjoernKW 3 hours ago 0 replies      
Yes. Keep in mind though that the - perhaps historically, perhaps still - most widely used kind of UML diagram - the class diagram - is just one component of UML.

Other than for explaining particular design patterns I don't find class diagrams all that useful, certainly not for giving you a complete picture of a system that consists of more than a handful of classes.

Sequence diagrams, state diagrams, use case diagrams, basically anything that involves or describes activities: I think those are tremendously useful.

mikekchar 2 hours ago 0 replies      
I draw UML diagrams frequently. I almost never persist them, though. As they say, the only thing worse than no documentation is out of date documentation.

I can definitely see an argument for certain types of projects (libraries and frameworks). If you have diagramming capability, and you are in the enterprise Windows market, I think this is a no-brainer. I'd be curious what diagramming support you had if it were not UML....

Having said that, I wouldn't try to implement a full object modelling solution. It's not the kind of thing that help files need. Actor diagrams and sequence diagrams would make more sense to me.

altharaz 4 hours ago 1 reply      
Good UML diagrams are hard to write, easy to understand.

I used to write a lot of UML diagrams in mechatronics engineering for a big company, where specs were not supposed to change often. For projects with a long lifetime and a slow change velocity, UML is totally justified.

UML diagrams are not worth it for software engineering: code evolves too fast to keep the diagrams up to date. In that case, I replace UML diagrams with simple sketches / mockups and simple tables.

brunosaboia 3 hours ago 0 replies      
As with almost anything in computer science, the answer is "it depends".

UML diagrams can be very useful for representing a system to someone who is not technical enough to understand code but can understand the basics of the diagrams.

Personally, it's usually more a waste of my time.

steedsofwar 1 hour ago 0 replies      
In the last 16 years (I've worked at over 10 companies) I've only had one company use UML, and it was only for an overview of the proposed architecture.
llndr 5 hours ago 1 reply      
Sequence and state diagrams are fine; why not use them when you want to communicate certain things?
qznc 5 hours ago 1 reply      
I'm teaching it. It feels increasingly out of date. Tool support has stagnated for years on Linux. Modern features like lambdas are not really supported.
myf01d 4 hours ago 0 replies      
It's just big corporations bullshit to fill meetings time.
mugsie 2 hours ago 0 replies      
Yes - mainly for communicating with other teams. It's not the full spec, but a subsection, so that others (mainly the security architecture review) can understand the basic parts of the system.

I tried to keep it up to date for the public docs, but that can be an uphill battle.

Svan 4 hours ago 1 reply      
I do. UML class diagrams can help you turn real-world business objects into a model and think about the dependencies and relationships of entities. I would say it is the best tool for modeling software. Sequence and activity diagrams can help you design and document a process.

A picture is sometimes worth a hundred words, and this applies to UML as well.

slim 4 hours ago 1 reply      
Yes. Everyday. With a pencil on a notebook.

When I take notes, some concepts are better/faster materialized as relationships between objects or actors or activities

Also reasoning about schematics topology is useful and enlightening when a problem is large

coldcode 2 hours ago 0 replies      
I don't use UML, nor its predecessors going back to the early 1980's. I draw little pictures to ensure I understand what my code is/will do but that's it. Formal diagrams are as useful as formal documentation written wearing a formal suit.
sheraz 5 hours ago 0 replies      
Nope. Whiteboards and maybe a photo of it for posterity :-)
prav 2 hours ago 1 reply      
Yes, I've used UML class diagrams, sequence diagrams and statechart diagrams for embedded device software.

It's just a tool for modelling. You can pick and choose the diagrams you need, and at the abstraction level you desire.

_nalply 4 hours ago 1 reply      
I still think in use cases, but without the goofy graphics. I just make a Markdown bulleted list, then give each use case a name and a short description.
mhd 4 hours ago 1 reply      
In recent times, about on the same level as ERDs for databases. So mostly as a quick top-level sketch when designing something (mostly on paper), not really kept up to date when the code/model changes. Goes along with getting away from rigid class models in general, and for UI classes, the relationships are a bit more self-evident.

I do miss Booch's fluffy clouds a bit, though.

Torn 4 hours ago 1 reply      
Sometimes, in google drawings to talk about code or systems architecture in the abstract
mattmanser 2 hours ago 0 replies      
No, nor have I seen any diagrams or had it mentioned to me by any other programmer for at least 5 years.
Raed667 4 hours ago 1 reply      
I only ever used it for school projects, and only because it was required.
burnt1ce 2 hours ago 1 reply      
I like to draw UML every so often.

Question to HN: What tools do you guys use to draw UML diagrams?

staticelf 2 hours ago 0 replies      
No, I have always hated it. I'd rather be flipping burgers at McDonald's than use UML.
wayn3 5 hours ago 3 replies      
never did. whats the point?
je42 4 hours ago 2 replies      
plantuml is nice. ;)
qwertyuiop924 4 hours ago 0 replies      
never have, never will.
maplechori 5 hours ago 1 reply      
requirements in any real life scenario change every day, useless
cmrdporcupine 2 hours ago 0 replies      
UML use case, activity, and sequence diagrams are perhaps quite useful. I'm too lazy to make them :-) but I find them useful to consult at times.

The class diagrams that everyone is really thinking about when they say "UML" are imho kind of useless. It reflects a kind of obsessive OO purism, and taxonomical obsession, that was quite trendy in the late 90s, early 2000s.

But it turns out in most cases looking at a class diagram doesn't really tell you much about what software does or how it works. And in any case I personally find it easier to look at header files or source files to get a picture of how things fit together. Class diagrams don't really help.

douche 4 hours ago 0 replies      
Probably it is not. I think I'm late enough that I missed the UML wave, and I also did well enough on the AP CS test that I tested out of the CS 101 Java course in college, and got thrown into the Haskell one instead, so I was never forced to do UML.

Once in a while I'll fire up Visio and sketch out a state machine or sequence diagram, but all I'm really doing is throwing down some bubbles or rectangles, drawing some arrows between them, and tacking on some labels. It's nowhere near as formalized as UML, but it works well enough.

25 Years After Junk Science Conviction, Texas Admits Sonia Cacy's Innocence theintercept.com
292 points by finid  21 hours ago   109 comments top 15
seibelj 21 hours ago 11 replies      
This is something I find extremely scary, because an unusual series of events (accidental fire, accidental drowning, suicide) could be framed to look like something entirely different (arson, murder), which could drag any one of us into a nightmare. If you don't have the money to hire your own experts and a strong legal team, potentially bankrupting yourself, you are at the mercy of a motivated prosecutor with nearly unlimited resources.

In 2009 a scathing report was released by the National Academy of Sciences that essentially says that blood spatter, handwriting, hair, fingerprint, and bite mark analysis are all junk science[0]. If two "experts" can look at the same evidence and come to entirely different conclusions, how is this science? It's opinion wrapped up as scientific fact. Who knows how many people are wrongfully convicted. It's terrifying.

An excerpt from Wikipedia about hair analysis:

The outcry from defense attorneys has forced the FBI to open up on disputed hair analysis matches since 2012. The Justice department began an "unprecedented" review of old cases involving hair analysis in July 2013, examining more than 21,000 cases referred to the FBI Lab's hair unit from 1982 through 1999, and including as many as 27 death penalty convictions in which FBI experts may have exaggerated the reliability of hair analysis in their testimony. The review is still in progress, but in 2015, it released findings on 268 trials examined so far in which hair analysis was used. The review concluded that in 257 of these 268 trials, the analysts gave flawed testimony that overstated the accuracy of the findings in favor of the prosecution. About 1200 cases remain to be examined.[1]

[0] http://www.nytimes.com/2009/02/05/us/05forensics.html?pagewa...

[1] https://en.wikipedia.org/wiki/Hair_analysis#Microscopic_hair...

mabbo 21 hours ago 5 replies      
> In an exceptional move by the notoriously conservative panel, the BPP agreed that Cacy should be paroled, just six years after she was convicted.

She served 6 years before parole, not 25 years behind bars

I'm far more concerned with Cameron Todd Willingham. Governor Perry had this evidence, that much of the state evidence being used was junk science, and did nothing while an innocent man was put to death. Shameful.

rdtsc 21 hours ago 3 replies      
"Expert" witnesses for US courtrooms is a special kind of a parallel voodoo-science world. Especially when it comes to arson.

Prosecutors like to pick the same people to testify as "experts" and their top qualification is that they have testified before as "experts". I imagine many have optimized putting up an act and throwing around fancy terms to make it seems really precise and scientific. Their future employment depends on that.

garyclarke27 21 hours ago 1 reply      
Similar junk science in the UK has put many innocent parents away for "shaken baby syndrome", based on a theory not proven by scientific evidence. Expert witnesses who don't agree with the establishment consensus have even been banned from practicing medicine, so most now refuse to testify.
Tloewald 1 hour ago 0 replies      
This reminds me of a New Yorker article on the same topic (covering an even greater injustice, also in Texas -- in fact referred to in this article)


geff82 19 hours ago 3 replies      
The nightmare is also that in some countries, when the police knock on the door to arrest you, you might get killed in a cruel, archaic ritual called "execution" at the end, even if you did nothing wrong and the odds were simply against you. Here in Germany I do not have to fear the police. Even if the judges wrongly sent me to jail "for life", at least I'd have some hope that one day I could convince them they were wrong and get my freedom back.
rmchugh 17 hours ago 0 replies      
The other case mentioned, the Willingham case is even more horrifying. A man was convicted of murdering his children and sentenced to death on bogus evidence. When presented with evidence to the contrary, the state of Texas under Rick Perry ignored it and allowed the man to be executed. This is state sanctioned murder of an innocent man. Why is the Governor not on trial for this?
metafunctor 20 hours ago 2 replies      
Is junk science in courtrooms a root cause or just a symptom?
johnhattan 14 hours ago 1 reply      
Got a friend currently doing time in Texas for basically the same thing. Here's hoping this gets the case some notoriety. http://thearsonproject.org/case-studies/curtis-severns/
finid 20 hours ago 0 replies      
Sometimes the main qualification of these so-called experts is a certification from a 6-hour or 6-week class. From then on, they are eligible to testify as experts in serious criminal cases.
edblarney 10 hours ago 0 replies      
The question is: was it considered 'junk science' at the time it was used in court?

Because I'm sure we are using some 'junk science' we just don't understand at the present time.

lanius 16 hours ago 0 replies      
Gerald Hurst and Chris Connealy are true heroes.
draw_down 21 hours ago 1 reply      
We need a pretext for what we want to do, which in America is to lock people up. If we can fool ourselves with something that sorta looks and smells like science, that fits the bill perfectly.
gourou 21 hours ago 1 reply      
Making a Murderer season 2
yuhong 21 hours ago 1 reply      
Anti-discrimination laws are even worse in that discrimination can happen with no evidence at all. One of the methods used to enforce them (particularly in things like hiring) is statistics, most of which assumes employees are interchangeable commodities. They were designed back in the 1960s for things like manual labor jobs. I am willing to suggest a compromise to limit them to these kinds of jobs.
Paris agreement on climate change has come into force bbc.com
16 points by tbarbugli  5 hours ago   4 comments top 3
zzalpha 1 hour ago 1 reply      
And two, five, ten years from now we'll all be talking about how the Paris agreement failed and how the <insert name of major city> agreement is a major breakthrough and we'll limit warming to 3-4C this time, we swear!

But good news: while the Great Barrier Reef is now 90% bleached, the Seychelles are forming a new underwater ecosystem!

vowelless 15 minutes ago 0 replies      
Slightly off topic:

Is it possible to create technology that can suck out the CO2 from the atmosphere? If so, what would the back of the envelope costs for this be like?

If this is a viable option, why continuously try to have these large scale agreements instead of building and selling technology to counter the CO2 emissions?
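A back-of-the-envelope sketch of the question above. The figures are my own rough assumptions, not from the thread: roughly 40 Gt of CO2 emitted globally per year, and commonly cited direct-air-capture cost estimates in the neighborhood of $100-600 per ton.

```python
# Back-of-envelope: cost to capture one year of global CO2 emissions.
# Both inputs below are rough, illustrative assumptions.
ANNUAL_EMISSIONS_TONS = 40e9        # ~40 gigatons of CO2 per year (approximate)
CAPTURE_COST_PER_TON = (100, 600)   # $/ton, a commonly cited range for direct air capture

low = ANNUAL_EMISSIONS_TONS * CAPTURE_COST_PER_TON[0]
high = ANNUAL_EMISSIONS_TONS * CAPTURE_COST_PER_TON[1]
print(f"~${low/1e12:.0f}-{high/1e12:.0f} trillion per year")  # → ~$4-24 trillion per year
```

Even at the optimistic end, that is a meaningful fraction of world GDP per year, which may be part of why the agreements focus on reducing emissions instead.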

Shivetya 1 hour ago 0 replies      
What force? It is still fully voluntary and you're pretty much on your own in deciding what to do. If anything it really is only a payout scheme to appease some smaller nations. The emission goals are NON BINDING.

It's worse than no agreement, because countries can point to it and declare they did something when they really aren't obligating themselves to more than a small tax. China has, what, said their emissions may slow by 2030?

Watching Larry Ellison Become Larry Ellison steveblank.com
23 points by sajid  3 hours ago   8 comments top 2
dave_sullivan 1 hour ago 1 reply      
For anyone interested in learning more about Larry Ellison and Oracle, I'd highly suggest "The difference between god and Larry Ellison (god doesn't think he's Larry Ellison)."

Oracle is a 150 billion dollar company, similar in market cap to Intel. Once you've read histories of these people, it suddenly makes sense that Warren Buffett and Bill Gates hang out, while Steve Jobs and Larry Ellison were BFFs. Like Jobs and Apple, Ellison and Oracle are aggressively polarizing.

I see a future of declines for Oracle; Salesforce.com is redefining and taking their market (corporate IT budgets). There is simply no way other than lock-in and being dicks that they'll be able to see the profitability they once had, let alone sales growth (unless by acquisition). You can see it already with their patent suits; they're out of ideas. But if you work in the business of software, Oracle's history is worth knowing.

datashovel 1 hour ago 1 reply      
I have a policy when assessing opportunities. Use of Oracle products is a deal breaker.
Solid metallic hydrogen has been produced in the laboratory arxiv.org
207 points by ars  19 hours ago   55 comments top 10
twic 6 hours ago 2 replies      
> We used type IIac conic synthetic diamonds (supplied by Almax Easy-Lab) with ~30 micron diameter culet flats. About 5 microns were etched off of the diamond culets using the technique of reactive ion etching, to remove defects from the surface. The diamonds were then vacuum annealed at high temperature to remove residual stress. Alumina is known to act as a diffusion barrier against hydrogen. The diamonds, with the mounted rhenium gasket, were coated with a 50 nm thick layer of amorphous alumina by the process of atomic layer deposition.

Incredible technology!

> The pressure was initially determined to ~88 GPa by ruby fluorescence using the scale of Chijioke et al (20); the exciting laser power was limited to a few mW. At higher pressures we measured the IR vibron absorption peaks of hydrogen with a Fourier transform infrared spectrometer with a thermal IR source, using the known pressure dependence of the IR vibron peaks for pressure determination (see SM).

Just incredible!

> Photos were taken with a smartphone camera at the ocular of a modified stereo microscope


toufka 15 hours ago 3 replies      
Super cool stuff! There are some greatly quotable sentences in the paper:

> Moreover, SMH (solid metal hydrogen) is predicted to be metastable so that it may exist at room temperature when the pressure is released. If so, and superconducting, it could have an important impact on mankind's energy problems and would revolutionize rocketry as a powerful rocket propellant.

> The principal limitation for achieving the required pressures to observe SMH in a DAC (diamond anvil cell) has been failure of the diamonds.

> The sample was cryogenically loaded at 15 K and included a grain of ruby for pressure determination.

> As of the writing of this article we are maintaining the first sample of the first element in the form of solid metallic hydrogen at liquid nitrogen temperature in a cryostat.

npunt 15 hours ago 0 replies      
Very exciting! For context, here's a nice long article (Aug 2016, 2 months before this arxiv paper came out) on the recent history of attempts to make it, with quotes from author of current paper and others in the field:


> A few months later, Silvera's group squeezed hydrogen hard enough to make it nearly opaque, though not reflective - not quite a metal. "We think we're just below the pressure that you need to make metallic hydrogen," Silvera says. His findings are consistent with Eremets' new phase, but Silvera disputes Eremets' speculations of metallicity. "Every time they see something change they call it metallic," Silvera says. "But they don't really have evidence of metallic hydrogen."

> All this back and forth may seem chaotic, but it's also a sign of a swiftly progressing field, the researchers say. "I think it's very healthy competition," Gregoryanz says. "When you realize that somebody is getting ahead of you, you work hard."

InDemoVeritas 14 hours ago 4 replies      
I saw it! A tiny, shiny speck under the microscope, and the thrill of seeing a form of matter which, it can be argued, has never existed anywhere in the universe. Ever.
resist_futility 11 hours ago 2 replies      
Just for reference the pressure at the center of the Earth is estimated to be around 360 GPa, while they are using 495 GPa and DACs able to reach over 700 GPa.


mrfusion 15 hours ago 3 replies      
Guys, this is literally the holy grail of high pressure physics!
fratlas 11 hours ago 1 reply      
Someone with a chem/physics background, could you please explain why metallic hydrogen is only theorized to be a superconductor? Is such a property not predictable?
simonhamp 3 hours ago 1 reply      
495GPa... doesn't seem so bad...

Standard atmospheric pressure ~= 100 kPa

So they increased the pressure in that chamber by roughly 4.95 million times.

And then they hypothesise that the hydrogen metal could be stable at room temperature.

Oh. My. Days.

mrfusion 16 hours ago 2 replies      
Did they test if it's super conducting?
Serverless Map/Reduce tothestars.io
228 points by emilong  22 hours ago   135 comments top 17
ralusek 22 hours ago 6 replies      
This is a screenshot of my google search from 2 days ago:


I've been using Lambda quite a bit and I think it's SO amazingly useful. Tasks that are highly parallelized and CPU intensive can effectively be scaled out without limit. I find it weird that their poster-child use case is still always a reactive event like watching S3 and formatting images. There are so many use cases for invoking a Lambda directly from your code.

Imagine a case where you had to parse a million documents with a relatively expensive computation, let's say 250 ms per document. Maybe you have a solid machine with a few cores running your server, but even then you can't keep the server CPU locked for that long, so naturally you'd need some sort of worker setup. With a good machine and multiple cores, maybe you get 8 running at once. With Lambda, you can forgo the worker servers altogether. Just invoke a million Lambdas directly from your application server, completely parallelized.

Theoretically, you've taken something that would have taken about 70 hours and had it run in 250 ms, without having to set up any additional infrastructure.
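The arithmetic in that last paragraph checks out; a quick sanity check, using only the toy numbers from the comment and ignoring invocation overhead and account concurrency limits:

```python
# Sequential vs. fully fanned-out runtime for N identical tasks.
N_DOCS = 1_000_000
TASK_SECONDS = 0.250  # 250 ms per document, as in the comment

sequential_hours = N_DOCS * TASK_SECONDS / 3600
parallel_seconds = TASK_SECONDS  # one fully parallel wave, idealized

print(f"sequential: ~{sequential_hours:.0f} h, parallel: {parallel_seconds * 1000:.0f} ms")
```

In practice Lambda caps concurrent executions per account, so the real wall-clock time would be a few waves rather than one, but the shape of the win is the same.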

stochastician 21 hours ago 0 replies      
Author of the figures used in the blog post here. We wrote https://github.com/ericmjonas/pywren somewhat on a lark, because it seemed to fit well with our research goals and it's fun to push systems to their limit. I'm now a total serverless convert! I'd love more collaborators and feedback, the goal is to make these sorts of computations as easy as possible for python developers, especially on the scientific computing side of things.
danso 20 hours ago 4 replies      
OT: I teach computational methods, and even though I dislike teaching/conflating it with web dev, I have included "let's build a web app" because students like building and deploying a thing, and because Heroku has a free tier.

I've considered the possibility of having students do things on AWS (beyond web dev), including Lambda, and just expensing the costs. It seems feasible to quickly set up every student with controlled access via IAM... but is there a way to set up rate-limiting, ideally through a policy? That is, shut an IAM user down if a student accidentally invokes a million processes? Or, for that matter, limit the storage capacity of an S3 bucket?

stcredzero 22 hours ago 2 replies      
I wonder if something like AWS Lambda could be applied to multiplayer games? It seems like game-loop based games would be a good domain for such a programming model. The entire game could be expressed as a function that turns tick N into tick N+1. Such a function would be composed of many other functions, of course. So for example, there would also be a function that took as an argument the player at time N and gave the player at time N+1.

Such a model would allow infrastructure developers to abstract away most of the concerns around networking, collisions, security, etc., and let game developers concentrate their efforts on simply making the game.

I currently have a game server cluster written in Golang, where the locations are instantiated with an idempotent request operation. It doesn't matter if a particular location-instance exists at a particular moment. It's sufficient for the "master control" server to only approximately know the loads of the different cluster server processes. My experience leads me to believe that something like AWS Lambda, but optimized for implementing game loops would work well, so long as game developers could get their heads around pure functional programming and implement with soft real-time requirements in mind. (John Carmack already advocates the use of pure functions, and game devs in general already do the latter.)
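A toy sketch of the tick-as-pure-function idea described above (all names here are hypothetical and have nothing to do with the commenter's actual Go server): the world at tick N+1 is a pure function of the world at tick N, with no mutation.

```python
# Minimal pure game loop: state in, new state out, nothing mutated.
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Player:
    x: float   # position
    vx: float  # velocity

def step_player(p: Player, dt: float) -> Player:
    # Pure function of the previous tick's player state.
    return replace(p, x=p.x + p.vx * dt)

def step_world(players: tuple, dt: float) -> tuple:
    # The whole world at tick N+1 is derived from tick N.
    return tuple(step_player(p, dt) for p in players)

world0 = (Player(x=0.0, vx=2.0), Player(x=5.0, vx=-1.0))
world1 = step_world(world0, dt=0.5)
```

Because `world0` is untouched, the infrastructure is free to retry, replay, or distribute `step_world` calls, which is exactly the property a Lambda-style execution model wants.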


lucd 22 hours ago 1 reply      
How does it compare to Joyent's three-year-old Manta? AFAIK it was designed especially for this kind of purpose. The processing is performed directly on the servers storing the data.
thinkloop 17 hours ago 1 reply      
The article counts characters in documents stored on S3 - which makes sense since S3 is great for storing documents and can handle unlimited concurrency, priced per usage.

But what's the solution for structured data? DynamoDB is the obvious main candidate, but it's billed by the hour, and high concurrency is very expensive, requiring complicated temporary increases and decreases of capacity that are hard to predict.

Is there a good solution for running massively parallel Lambdas on structured data?

plandis 19 hours ago 2 replies      
I've always had one big question about Lambda: is the convenience really worth the cost?

Is anyone using it in production that can comment?

boulos 11 hours ago 0 replies      
Note: the underlying comparison to other systems is from a 2014 blog post [1], which suggests they used the m2.4xlarge series of EC2 VMs (Nehalem-class parts from 2010). Nehalem vs. Haswell or Broadwell (the likely parts underlying Lambda) is a pretty big jump.

Disclosure: I work on Google Cloud, but I'm just pointing out a fact ;).

[1] https://amplab.cs.berkeley.edu/benchmark/

eistrati 22 hours ago 0 replies      
Saw the presentation last week at ServerlessConf in London and it really looks very promising. The cost behind this solution is what will really make me check this out :)

P.S. Quoting the author: "As you can see for these queries, the reference implementation performs reasonably well; it's nowhere near Redshift performance for the same queries, but for the price it really can't be beat today"

mallya16 10 hours ago 0 replies      
Implementation guide for Serverless MapReduce: https://aws.amazon.com/blogs/compute/ad-hoc-big-data-process...
dnackoul 20 hours ago 1 reply      
Does anyone have experience building mobile backends on Lambda? I was looking at an API Gateway / Lambda / Amazon RDS stack for building a central data store and was wondering what people's experiences with that setup have been.
frenchhacker 7 hours ago 0 replies      
I guess the example assumes the data is already somehow in AWS. How is the total cost affected if I wanted to run this setup on a 10TB dataset?
c-smile 21 hours ago 1 reply      
About the site: quite hard to read - almost white text on white background.
willcodeforfoo 18 hours ago 2 replies      
I wonder if Amazon will ever open Lambda up to any Docker image? (I know it's possible to run binaries, but it's a bit of a pain to compile with the Amazon AMI, etc.) Being able to have a bunch of `docker run`s with any image would be pretty powerful.
partycoder 20 hours ago 5 replies      
I do not agree with the term serverless. Amazon Lambda is a service, therefore there is a server involved.

It's like saying deathless meat, because someone else killed the animal you are consuming.

elcct 21 hours ago 3 replies      
Is there any AWS Lambda equivalent that could be deployed on bare metal?
amelius 22 hours ago 4 replies      
If it doesn't run on a server, then what does this plumbing-work run on? Clickbait name?
Shrimp Trap primitivetechnology.wordpress.com
127 points by nikolay  17 hours ago   50 comments top 8
alva 3 hours ago 1 reply      
The PrimitiveTechnology videos are such an incredible example of the wonders the internet provides. Here you have a guy making such interesting, engrossing videos, uploaded for free and accessible to all. The series is worthy of being on the BBC.

I wonder if there are any companies who spot this sort of content with the aim of getting it onto TV screens.

sdrothrock 10 hours ago 2 replies      
For anyone curious about this who didn't want to watch the video, the trap consists of:

1. A large woven cone

2. A smaller woven cone without a tip

The smaller cone is placed in the larger one; shrimp swim into the small cone to explore, but then get caught in the space between the two cones when they try to get out (presumably because it's difficult to find the single entrance).

He mentions that the only skill necessary is basketweaving -- I wonder if it would be possible to carve something similar (two interlocking geometric shapes) or if the trap being woven is essential to its function, for example, for allowing flowing water in to entice shrimp.

One of the parts that stood out for me was

> In practice, a long stretch of creek might have several traps collecting food each day without any effort on the part of the fisherman.

If he were to go whole hog long-term, shrimp traps would free up his time for doing other crafts in ways that spear fishing or actively hunting wouldn't, though I suppose the local yield of shrimp would factor into that (whether he could collect enough calories consistently to fund his other efforts).

keeganjw 56 minutes ago 0 replies      
Yes, yes, yes! I didn't know he had a blog. I've only seen his Youtube channel. Time to take a deep dive into his written material.
robtaylor 56 minutes ago 1 reply      
On Reddit overnight there were tens of posts on this from separate users, and now I see it here.

Is this common / natural or is there some PR at work?

Lxr 1 hour ago 1 reply      
Can anyone actually start a fire that easily without chemicals?
mrkgnao 8 hours ago 2 replies      
> I humanely killed the shrimp using the splitting method which destroys the central nervous system (boiling alive is more painful).

How do we know this?

mynameishere 11 hours ago 2 replies      
He killed some shrimp here. That was the big revelation. I suppose many of the people watching him fabricate spears and bows thought, "That was cute, but good luck taking a stag with it". Well, he finally took a shrimp.

Maybe he could get a budget and go to Alabama or somewhere with a wild hog problem and get a legal sign-off on hunting something properly with stone age tools. I mean, that's actually what cavemen were doing most of the time. Fire alone doesn't make a meal.

bradknowles 11 hours ago 6 replies      
It's a couple of paragraphs on what a shrimp trap is and how it can be used to catch shrimp.

Remind me why this is on HN?

Turbo.js GPGPU made simple turbo.github.io
100 points by Lolapo  17 hours ago   25 comments top 10
ris 29 minutes ago 0 replies      
I thought something exciting had been done here but it's been a long time since "GPGPU" meant "kernels written in GLSL".
macawfish 8 hours ago 1 reply      
Seems cool... but it doesn't work for me!

> Ah snap! There was no error compiling the test kernel or running it, but the numbers don't check out. Please report your browser (+ version) and hardware configuration to us, so we can try to fix this. Deviation was 0.33731377391086426.

I'm on linux 4.7 + chromium 53 with Intel 4000 graphics

problems 9 hours ago 0 replies      
> In fact you're using it right now

Thanks, I could tell when my fan spun up when I loaded the page and my browser lagged.

andreapaiola 7 hours ago 1 reply      
Ah snap! There was no error compiling the test kernel or running it, but the numbers don't check out. Please report your browser (+ version) and hardware configuration to us, so we can try to fix this. Deviation was 0.3373333334457129.


Values are: 0.00900000031106174 and 1.0210000006482005


Chrome on Ubuntu on Intel NUC

msimpson 1 hour ago 0 replies      
Great work minxomat, and under 200 LOC as well.
sdrothrock 10 hours ago 1 reply      
I'm confused -- there's a section mentioning a benchmark, but I can't find the results anywhere on the page. Chrome 54 on macOS 10.12.1.

There are sections for "PURE JAVASCRIPT" and "JAVASCRIPT & TURBO.JS", but they only display a triangle/circle illustration.

ooqr 4 hours ago 0 replies      
As with many others, this does not work for me, though I'm eager to try it.
evan_ 9 hours ago 1 reply      

> Ah snap! There was no error compiling the test kernel or running it, but the numbers don't check out. Please report your browser (+ version) and hardware configuration to us, so we can try to fix this. Deviation was 0.673000000262012.

Latest Chrome on a macbook air. Looks like at least one other person has had this issue, I'll report my experience as well. https://github.com/turbo/js/issues/1

erpellan 4 hours ago 0 replies      
frameBufferStatus was false. frameBufferStatus.message is undefined.

if (!frameBufferStatus) throw new Error('ERROR: (fatal): ' + frameBufferStatus.message);

Chrome on Windows 10 + AMD R9/280.

bradknowles 11 hours ago 3 replies      
Does it really do anything on mobile?

Anyone got some example results they can share with us, including hardware/OS/software you're running and what kind of speed up was found?

The Mega Rich Have Found an Unlikely New Refuge bloomberg.com
213 points by dmmalam  1 day ago   218 comments top 24
jorblumesea 22 hours ago 12 replies      
The modern world has such a ridiculous disconnect. The people fleeing to remote places because of instability are the very same people causing these issues in the first place. Hedge fund managers selling toxic assets bemoaning the uncertainty of the financial world? And what will your island paradise look like with no one to trade with or if the world economy collapses? NZ doesn't have much of an economy in itself, what are you going to do, go back to farming?

It's even more hilarious because the reason NZ is a desirable place to live is that there is little money in politics and there are few class distinctions. How will that change when all the billionaires move there and want to get their way (and are used to the world conforming to their bank accounts)? Just another place for the wealthy elite of the world to slowly ruin; maybe the next step is the moon?

The system is rapidly growing out of control and no one seems to be able to stop or halt the insanity. It's like we're all marching towards oblivion and everyone knows it but no one has the will to turn the ship around because the system incentivizes short term greed.

te_chris 23 hours ago 8 replies      
Fantastic. As a NZ expat currently resident in London it's depressing the way global market conditions have sent house prices skyrocketing in the places I'd like to live. My partner and I are doing well here, and we plan to go back to NZ one day, but in just 6 years since my partner's brother was here what we'd go back to has changed immeasurably.

He and his partner managed to come over here when the pound was worth $3 NZD, rent was cheaper, save well and move back to Auckland when houses were depressed (2009-2010). Since then they rode the wave and now have a beautiful home with a view straight down the Waitemata harbour.

My partner and I? We came here when the pound was at $2.10ish, not so bad I thought, not as good, but y'know, we make good money, we can deal with it. Then BREXIT. BOOM. Now the pound is $1.70, our salaries are, of course, still the same. Also a shithole house in Auckland in a bad suburb now costs at least $700k NZD - the first house that my partner's brother bought cost $650k NZD, in a nice suburb on transport links.

The world is mad.

clydethefrog 23 hours ago 2 replies      
The ultimate NIMBYs. The same people who are responsible for the many uncertainties multinational corporations bring, and who therefore stimulate the dangerous populism of this global age, now go to Queenstown, have a pint, and wait for this all to blow over.

Money does not buy style if you see all those new houses by the way...

okreallywtf 22 hours ago 1 reply      
I've long expected these kinds of stories. The same people who have benefited from climate change and bred a lot of the chaos we're now experiencing will be able to live quite comfortably while the rest of us are left to deal with the consequences. This is one of the most distressing things about climate change: so much wealth was created at the expense of the planet, and so little of that will ever go into fixing it. We will pay for it in higher energy prices and a million other ways instead.

As the arctic continues to melt I expect there to be some nice openings in northern canada that are even more remote.

fatdog 23 hours ago 4 replies      
Funniest thing I ever heard about it was a girl from Cornwall saying "Ah, New Zealand, the thinking man's Australia."
neaden 23 hours ago 8 replies      
"House prices in New Zealand increased 12.7 percent in the year through October, and the average price in largest city Auckland has almost doubled since 2007 to more than NZ$1 million." That's about $730,000 USD which would make me worry if I was a New Zealander.
mahyarm 21 hours ago 3 replies      
When Vancouver adopted a 15% non-resident house purchase tax, prices and demand dropped a lot. Maybe NZ wants to adopt something similar, combined with Singapore's public-housing purchase program for citizens.
JustSomeNobody 23 hours ago 3 replies      
So, now NZers will not be able to afford to live in their own country.[0]

The mega rich should buy sailing vessels and live on those and leave the housing prices affordable to the rest of us.

[0] Or at least not own a house.

RandyRanderson 21 hours ago 0 replies      
In BC, Canada we laugh at your measly 12.7% and raise you ~18% [0], for a total of +31.4% from Aug 2015 to Aug 2016. Yes, you read that right.

It's funny how this article's impression is "prices are returning to normal" when in fact the government's new token rules haven't had a chance to take effect and certainly haven't had time to be analyzed.

Sadly, these new rules are just for show (an election is nearing here) and until effective reforms are put in place the bubble will continue to rise.

[0] http://www.cbc.ca/news/canada/british-columbia/26-slump-in-v...

FrancoDiaz 23 hours ago 5 replies      
...have helped make the South Pacific nation -- a day by air away from New York or London

Isn't that essentially anywhere on the planet?

imron 24 minutes ago 0 replies      
Rubbish. New Zealand's not that great. It's rubbish in fact.

Please everyone stay away.

FuNe 23 hours ago 3 replies      
"The world is heading into a major crisis," said Internet entrepreneur Kim Dotcom in an Oct. 31 Twitter post. "I saw it coming and that's why we moved to New Zealand. Far away & not on any nuclear target list."

It will probably ruin his day, but the good thing about nuclear fallout is its globalized nature. https://en.wikipedia.org/wiki/On_the_Beach_%281959_film%29

thompo 23 hours ago 1 reply      
expensive, isolated, absurdly beautiful...

i guess "unlikely" probably isn't the word i would use in this situation.

EdwardDiego 22 hours ago 1 reply      
I really like it when Russian oligarchs buy entire beaches here. :/
1_2__3 18 hours ago 1 reply      
I know nobody wants to admit this but this is only going to get worse. India and China are going to eat the world, because this is a numbers game.
XJOKOLAT 22 hours ago 1 reply      
Really sad to read this in a "wish I could preserve it just as it is" protective kind of way. It's a beautiful place.

If any of you mega-rich are reading this. Please don't fuck it up.

Consider your impact - and I'm not just talking about the environment.

dharma1 21 hours ago 0 replies      
I've heard this story in the past. People buying farmland and building airstrips in NZ. Makes sense; it would be fairly isolated from turmoil in the rest of the world.



wmeredith 23 hours ago 1 reply      
Isn't the irony here that the world isn't becoming more uncertain?
santaclaus 21 hours ago 0 replies      
I always thought someone could make a fortune introducing insulation to homes in NZ. Loved living in Wellington, but my god, the homes there are the draftiest, coldest places ever!
mountaineer22 23 hours ago 2 replies      
ISP pricing/caps?
mishkinf 18 hours ago 0 replies      
^^^^^^^^^ This is more evidence that the condition of the world is degrading. Making huge sums of money is immoral. The 62 wealthiest people in the world have as much wealth as the bottom 3.5 billion. This is completely immoral and should be illegal. The mere fact that a person has billions of dollars is evidence that there is tyranny in the world. The paradox of capitalism is that the more a capitalist succeeds in reducing his costs (labor), the less money the mass of people will have to buy the very products the capitalist produces, and thus he cannot sell the products and services he is producing. It is bound to fail in time... and after it does, we will have to find a system that puts morality and love for one another at the center of its goals.
fbreduc 20 hours ago 1 reply      
What I think is funny is that the mega rich think that in a collapsed society they will be able to hold on to these areas. People... will just come and take it.
branchless 23 hours ago 0 replies      
How odd. A former British colony with a land price problem washed through banks that are making life worse for working people.

Most unusual. /s

kodt 23 hours ago 1 reply      
Just don't try to plant your own garden!
H.264 is Magic sidbala.com
1078 points by LASR  1 day ago   206 comments top 58
lostgame 1 day ago 7 replies      
Absolutely love this:

'Suppose you have some strange coin - you've tossed it 10 times, and every time it lands on heads. How would you describe this information to someone? You wouldn't say HHHHHHHHHH. You would just say "10 tosses, all heads" - bam! You've just compressed some data! Easy. I saved you hours of mindfuck lectures.'

This is a really great, simple way to explain what is otherwise a fairly complex concept to the average bear. Great work.
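The coin-toss trick quoted above is run-length encoding in miniature: store each symbol once, plus a count. A minimal sketch:

```python
# Run-length encoding: "HHHHHHHHHH" becomes [("H", 10)].
from itertools import groupby

def rle_encode(s: str):
    # groupby yields consecutive runs of identical characters.
    return [(ch, len(list(run))) for ch, run in groupby(s)]

def rle_decode(pairs):
    # Expand each (symbol, count) pair back to the original run.
    return "".join(ch * n for ch, n in pairs)
```

Ten heads compress to a single pair; a fair coin's HTHT... sequence doesn't compress at all, which is the same reason noisy video is so much harder to encode than a static scene.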

userbinator 1 day ago 2 replies      
The lossy transform is important, but I think what's actually most important in video compression is getting rid of redundancy --- H.264 actually has a lossless mode in which that transform is not used, and it still compresses rather well (especially for noiseless scenes like a screencast.) You can see the difference if you compare with something like MJPEG which is essentially every frame independently encoded as a JPEG.

The key idea is to encode differences; even in an I-frame, macroblocks can be encoded as differences from previous macroblocks, with various filters applied: https://www.vcodex.com/h264avc-intra-precition/ This reduces the spatial redundancies within a frame, and motion compensation reduces the temporal redundancies between frames.

You can sometimes see this when seeking through video that doesn't contain many I-frames, as all the decoder can do is try to decode and apply differences to the last full frame; if that isn't the actual preceding frame, you will see the blocks move around and change in odd ways to create sometimes rather amusing effects, until it reaches the next I-frame. The first example I found on the Internet shows this clearly, likely resulting from jumping immediately into the middle of a file: http://i.imgur.com/G4tbmTo.png That frame contains only the differences from the previous one.

As someone who has written a JPEG decoder just for fun and learning purposes, I'm probably going to try a video decoder next; although I think starting from something simpler like H.261 and working upwards from there would be much easier than starting immediately with H.264. The principles are not all that different, but the number of modes/configurations the newer standards have --- essentially for the purpose of eliminating more redundancies from the output --- can be overwhelming. H.261 only supports two frame sizes, no B-frames, and no intra-prediction. It's certainly a fascinating area to explore if you're interested in video and compression in general.

szemet 1 day ago 4 replies      
I thought I'll learn something special about H.264, but all information here is high level and generic.

For example if you replace H.264 with a much older technology like mpeg-1 (from 1993) every sentence stays correct, except this:

"It is the result of 30+ years of work" :)

amluto 1 day ago 1 reply      
Nice article! The motion compensation bit could be improved, though:

> The only thing moving really is the ball. What if you could just have one static image of everything on the background, and then one moving image of just the ball. Wouldn't that save a lot of space? You see where I am going with this? Get it? See where I am going? Motion estimation?

Reusing the background isn't motion compensation -- you get that by encoding the differences between frames so unchanging parts are encoded very efficiently.

Motion compensation is when you have the camera follow the ball and the background moves. Rather than encoding the difference between frames directly, you figure out that most of the frame moved, and you encode the difference from a shifted version of the blocks of a previous frame.

Motion compensation won't work particularly well for a tennis ball because it's spinning rapidly (so the ball looks distinctly different in consecutive frames) but more importantly because the ball occupies a tiny fraction of the total space so it doesn't help that much.

Motion compensation should work much better for things like moving cars and moving people.
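A minimal sketch of the motion search being described: find the displacement of a previous-frame block that best predicts the current block, scored by sum of absolute differences (1-D here for clarity; real encoders search 2-D blocks over both axes).

```python
# Toy motion estimation: try small displacements of a block from the
# previous frame and keep the one with the lowest SAD (sum of absolute
# differences). The winner is the "motion vector"; only it plus the
# residual needs to be encoded.

def best_shift(prev_row, cur_block, start, search=3):
    best = None
    for d in range(-search, search + 1):
        lo = start + d
        if lo < 0 or lo + len(cur_block) > len(prev_row):
            continue  # candidate window falls outside the frame
        sad = sum(abs(a - b)
                  for a, b in zip(prev_row[lo:lo + len(cur_block)], cur_block))
        if best is None or sad < best[1]:
            best = (d, sad)
    return best  # (motion vector, residual cost)
```

For the bright object below, which moved one pixel to the right between frames, the search finds a shift of -1 with zero residual, i.e. the block is predicted perfectly from the previous frame.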

adilparvez 1 day ago 1 reply      
Related, how h265 works:http://forum.doom9.org/showthread.php?t=167081

This is a great overview and the techniques are similar to those of h264.

I found it invaluable to get up to speed when I had to do some work on the screen content coding extensions of hevc in Argon Streams. They are a set of bit streams to verify hevc and vp9, take a look, it is a very innovative technique:


mherrmann 21 hours ago 1 reply      
I recently experienced this as follows: https://www.sublimetext.com has an animation which is drawn via JavaScript. In essence, it loads a huge .png [1] that contains all the image parts that change during the animation, then uses <canvas> to draw them.

I wanted to recreate this for the home page of my file manager [2]. The best I could come up with was [3]. This PNG is 900KB in size. The H.264 .mp4 I now have on the home page is only 200 KB in size (though admittedly in worse quality).

It's tough to beat a technology that has seen so much optimization!

1: http://www.sublimetext.com/anim/rename2_packed.png

2: https://fman.io

3: https://www.dropbox.com/s/89inzvt161uo1m8/out.png?dl=0

woliveirajr 1 day ago 2 replies      
I love how you can edit photos of people to correct some skin imperfections, without losing the sense that the image is real (and not that blurred, plastic look), when you decompose it into wavelets and just edit some frequencies.

I don't know about Photoshop, but in Gimp there's a plugin called "Wavelet Decompose" that does that.

the8472 23 hours ago 2 replies      
> Chroma Subsampling.

Sadly, this is what makes video encoders designed for photographic content unsuitable for transferring text or computer graphics. Fine edges, especially red-black contrasts, start to color-bleed due to subsampling.

While a 4:4:4 profile exists, a lot of codecs either don't implement it, or the software using them does not expose that option. This is especially bad when used for screencasting.

Another issue is banding, since H.264's Main and High profiles only use 8-bit precision, including for internal processing, and the rounding errors accumulate, resulting in banding artifacts in shallow gradients. The High10 profile solves this but, again, support is lacking.
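
[The bleed described above is easy to reproduce. A minimal numpy sketch of 4:2:0-style subsampling, averaging each 2x2 block of a chroma plane and then upsampling by pixel repetition (real codecs use better filters; all values here are invented), shows a thin red detail smearing across its neighbours:]

```python
import numpy as np

def subsample_420(chroma):
    """4:2:0-style subsampling sketch: average each 2x2 block of a chroma
    plane, then upsample by pixel repetition for display."""
    h, w = chroma.shape
    small = chroma.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return small.repeat(2, axis=0).repeat(2, axis=1)

# A 2px-wide red detail in the Cr (red-difference) plane, misaligned
# with the 2x2 sampling grid -- e.g. thin red text on black.
cr = np.zeros((4, 4))
cr[:, 1:3] = 240.0

print(subsample_420(cr))  # every sample becomes 120.0: the red has bled
```

[Because the detail does not align with the 2x2 grid, every averaged block lands at a washed-out 120: exactly the red-black bleeding on fine edges described above.]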

dluan 23 hours ago 2 replies      
By the way, this is an incredible example of scientific writing done well. The very tangible, jelly-like feeling that the author clearly has for the topic is conveyed well to the readers. This whole thread is people excited about a video codec!
algesten 1 day ago 6 replies      
"See how the compressed one does not show the holes in the speaker grills in the MacBook Pro? If you don't zoom in, you would even notice the difference. "

Ehm, what?! The image on the right looks really bad, and the missing holes were the first thing I noticed. No zooming needed.

And that's exactly my problem with the majority of online video (iTunes store, Netflix, HBO etc). Even when it's called "HD", there are compression artefacts and gradient banding everywhere.

I understand there must be compromises due to bandwidth, but I don't agree on how much that compromise currently is.

eutectic 1 day ago 1 reply      
Anyone who likes this would probably also enjoy the Daala technology demos at https://xiph.org/daala/ for a little taste of some newer, and more experimental, techniques in video compression.
bjn 17 minutes ago 0 replies      
Well written article.
el0j 2 hours ago 0 replies      
If the author truly wants 'magic', how about we take a 64KiB demo that runs for 4 minutes. That's 64KiB containing 240 seconds of video, while your H.264 had to use 175 for only five seconds of video.

We can conclude that 64KiB demos are at least 48 times as magical as H.264.

alexandrerond 1 day ago 2 replies      
Very well explained. But I could have understood it all without the bro-approach to the reader. You see where I am going with this? Get it? See where I am going? Ok!
spacehacker 1 day ago 0 replies      
The part about entropy encoding only seems to explain run-length encoding (RLE). Isn't the interesting aspect of making use of entropy in compression rather to represent rarer events with longer code strings?

The fair coin flip is also an example of a process that cannot be compressed well at all, because (1) the probability of the same event happening in a row is not as high as for unfair coins (RLE is minimally effective) and (2) the uniform distribution has maximal entropy, so there is no advantage in using different code lengths to represent the events. (Since the process has a binary outcome, there is also nothing to gain in terms of code lengths for unfair coins.)
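
[The coin-flip point can be made concrete with the standard Shannon entropy formula H(p) = -p log2 p - (1-p) log2 (1-p), which gives the minimum achievable bits per symbol; a small sketch:]

```python
from math import log2

def entropy(p):
    """Shannon entropy, in bits per flip, of a coin with P(heads) = p."""
    if p in (0.0, 1.0):
        return 0.0  # a deterministic source carries no information
    return -(p * log2(p) + (1 - p) * log2(1 - p))

print(entropy(0.5))            # 1.0: a fair coin is incompressible
print(round(entropy(0.9), 3))  # 0.469: a biased coin compresses to under half a bit per flip
```

[This matches the comment: only when the distribution is skewed is there anything for an entropy coder (or RLE) to exploit.]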

john111 1 day ago 6 replies      
Can someone explain how the frequency domain stuff works? I've never really understood that, and the article just waves it away by saying it's like converting from binary to hex.
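
[For the curious: the frequency-domain step in codecs like H.264 is a variant of the discrete cosine transform. A block of pixels is re-expressed as a weighted sum of cosine patterns of increasing horizontal and vertical frequency, and smooth blocks concentrate their energy in a few low-frequency weights. A naive, illustrative DCT-II sketch (real encoders use fast integer approximations):]

```python
import numpy as np

def dct2(block):
    """Naive orthonormal 2-D DCT-II of a square block: re-expresses the
    pixels as weights on cosine patterns (the article's freqX / freqY)."""
    n = block.shape[0]
    k = np.arange(n)
    # basis[k, j] = cos(pi * (2j + 1) * k / (2n)), one row per frequency k
    basis = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    scale = np.full(n, np.sqrt(2 / n))
    scale[0] = np.sqrt(1 / n)        # DC row gets a different normalisation
    m = basis * scale[:, None]
    return m @ block @ m.T

# A perfectly flat block puts all its energy in the single (0, 0) "DC" weight:
flat = np.full((8, 8), 100.0)
coeffs = dct2(flat)
print(round(coeffs[0, 0]))                            # 800
print(np.abs(coeffs).sum() - abs(coeffs[0, 0]) < 1e-6)  # True: everything else ~0
```

[That energy compaction is the whole trick: 64 pixels become one meaningful number plus 63 near-zeros, and near-zeros are cheap to store.]
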
Cuuugi 1 day ago 2 replies      
This was A LOT more interesting than I suspected when I started reading.
amelius 1 day ago 3 replies      
> discard information which will contain the information with high frequency components. Now if you convert back to your regular x-y coordinates, you'll find that the resulting image looks similar to the original but has lost some of the fine details.

I would expect also the edges in the image to become more blurred, as edges correspond to high-frequency content. However, this only seems to be slightly the case in the example images.
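
[The expectation above is right, and easy to demonstrate in one dimension: zeroing the high-frequency DCT coefficients of a step edge smears the step and adds ringing. A small sketch using a hand-rolled orthonormal DCT matrix (so the inverse is just the transpose); sizes and the cutoff are arbitrary choices for illustration:]

```python
import numpy as np

# Orthonormal 1-D DCT-II matrix; because it is orthogonal, m.T inverts it.
n = 16
k = np.arange(n)
m = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
scale = np.full(n, np.sqrt(2 / n))
scale[0] = np.sqrt(1 / n)
m = m * scale[:, None]

edge = np.r_[np.zeros(8), np.ones(8)]  # a sharp step edge
coeffs = m @ edge
coeffs[4:] = 0                         # discard the high-frequency components
blurred = m.T @ coeffs                 # reconstruct from what is left

print(np.round(blurred, 2))
# The step no longer jumps from 0 to 1 between neighbouring samples;
# it ramps gradually through the transition, with mild ringing either side.
```

[In the article's images the effect is subtle because the mask only trims the highest frequencies; a more aggressive cutoff would make the edge blur obvious.]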

kakarot 1 day ago 0 replies      
Ya'll wanna get the most out of your H.264 animu rips? Check out Kawaii Codec Pack, it's based on MPC and completely changed my mind about frame interpolation. http://haruhichan.com/forum/showthread.php?7545-KCP-Kawaii-C...
amelius 1 day ago 3 replies      
What are directions for the future? Could neural networks become practically useful for video compression? [1]

[1] http://cs.stanford.edu/people/eroberts/courses/soco/projects...

Savageman 1 day ago 1 reply      
I wonder if, across a lot of videos, the frequency domain representations look similar, and if instead of masking in a circle we could mask with other (pre-determined) shapes to keep more information (this would require decoders to know them, of course). Or maybe this article is too high-level and it's not possible to "shape" the frequencies.
nojvek 22 hours ago 0 replies      
This is a really well written article. Exactly why I love HN. Sometimes you get this nice technical intros into fields you thought were black magic.
el0j 18 hours ago 1 reply      
The PNG size seems to be misrepresented. The actual PNG is 637273 bytes when I download it, and 597850 if I recompress it to make sure we're not getting fooled by a bad PNG writer.

So instead of the reported 916KiB we're looking at 584KiB.

This doesn't change the overall point, but details matter.

  $ wget https://sidbala.com/content/images/2016/11/FramePNG.png
  --2016-11-04 22:08:08--  https://sidbala.com/content/images/2016/11/FramePNG.png
  Resolving sidbala.com (sidbala.com)... ..., 2400:cb00:2048:1::6819:1112, ...
  Connecting to sidbala.com (sidbala.com)|...|:443... connected.
  HTTP request sent, awaiting response... 200 OK
  Length: unspecified [image/png]
  Saving to: FramePNG.png

  FramePNG.png        [ <=> ]  622.34K  --.-KB/s   in 0.05s

  2016-11-04 22:08:08 (12.1 MB/s) - FramePNG.png saved [637273]

  $ pngout FramePNG.png
  In:  637273 bytes  FramePNG.png /c2 /f5
  Out: 597850 bytes  FramePNG.png /c2 /f5
  Chg: -39423 bytes ( 93% of original)

iplaw 1 day ago 3 replies      
H.265 gets you twice the resolution for the same bandwidth, or the same resolution for half the bandwidth.
umbs 4 hours ago 2 replies      
"1080p @ 60 Hz = 1920x1080x60x3 => ~370 MB/sec of raw data."

I apologize if this is trivial. What does 1920 in above equation represent?
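
[1920 is the horizontal resolution of a 1080p frame: 1920 pixels across by 1080 down. The rest of the equation is frames per second times 3 bytes per pixel of uncompressed RGB. The arithmetic:]

```python
# What each factor in the article's equation stands for:
width = 1920         # horizontal resolution of a 1080p frame, in pixels
height = 1080        # vertical resolution (the "1080" in 1080p)
fps = 60             # frames per second
bytes_per_pixel = 3  # uncompressed 8-bit R, G and B samples

rate = width * height * fps * bytes_per_pixel
print(rate)          # 373248000 bytes/sec, i.e. the article's ~370 MB/sec
```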

rimbombante 1 day ago 0 replies      
Articles like this are what makes HN great, and not all those repeated links to the visual studio changelog.
mtw 1 day ago 1 reply      
Even better: H.265, with 40-50% bit rate reduction compared with H.264 at the same visual quality!
i336_ 16 hours ago 0 replies      
I found this explanation of Xiph.org's Daala (2013) very interesting and enlightening in terms of understanding video encoding: https://xiph.org/daala/


BPG is an open source image format (with lossy and lossless modes) that uses HEVC under the hood, and is generally better than PNG across the board: http://bellard.org/bpg/

For a runner-up lossless image format unencumbered by H265 patents (completely libre), try http://flif.info/.

aaron695 11 hours ago 0 replies      
H.265/HEVC vs H.264/AVC: 50% bit rate savings verified


notlisted 1 day ago 1 reply      
Well done. The only thing that could make this better is an interactive model/app for me to play around with. The frequency spectrum can probably be used while retouching images as well.

A video on YouTube led me to the Joofa Mac Photoshop FFT/Inverse FFT plugins [1], which were worth a try. I was unable to register them, as have others. Then I came across ImageJ [2], which is a really great tool (with FFT/IFFT).

Edit: if anyone checks out ImageJ, there's a distribution called Fiji [3] that makes installation easier and has all the plugins.

If anyone has other apps/plugins to consider, please comment.

[1] http://www.djjoofa.com/download

[2] https://imagej.nih.gov/ij/download.html

[3] http://fiji.sc/

optimuspaul 20 hours ago 0 replies      
I enjoyed this for the most part and even learned a little. But it started out in very simple terms, really appealing to the common folk, and then about halfway through the tone changed completely, which was a real turn-off for me. It's silly, but "If you paid attention in your information theory class" was the spark for me. I didn't take any information theory classes; why would I have paid attention? I don't necessarily think it was condescending, but the consistency of the writing changed dramatically.

Anyway super interesting subject.

monochromatic 12 hours ago 0 replies      
This is great as a high-level overview... except that it's way too high-level. These are all extremely well-known techniques. Is there any modern video compression scheme that doesn't employ them?

In other words, why is H.264 in particular magical?

xyproto 3 hours ago 0 replies      
Copyrighted and patented magic.
afghanPower 1 day ago 1 reply      
A real fun read. Had an assignment a couple of weeks ago where we used the k most significant singular values of matrices (from a picture of Marilyn M.) to compress the image. H.264 is on a whole other level, though ;)
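
[For readers who haven't seen that exercise, truncated SVD is a few lines of numpy. A sketch, with the Marilyn photo replaced by a toy matrix so it stays self-contained:]

```python
import numpy as np

def rank_k_approx(a, k):
    """Keep only the k largest singular values: the classic low-rank
    image-compression exercise."""
    u, s, vt = np.linalg.svd(a, full_matrices=False)
    return (u[:, :k] * s[:k]) @ vt[:k, :]

# A rank-1 "image" (an outer product) is recovered exactly with k = 1,
# while a real photo would only be approximated, better as k grows.
img = np.outer(np.arange(1.0, 5.0), np.arange(1.0, 7.0))
print(np.allclose(rank_k_approx(img, 1), img))  # True
```

[Storing u[:, :k], s[:k] and vt[:k, :] instead of the full matrix is where the compression comes from: k(m + n + 1) numbers instead of m*n.]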
ludwigvan 20 hours ago 0 replies      
What is the latest in video compression technology after H.264 and H.265?

The article discusses lossy compression in broad terms, but have we reaped all the low-hanging fruit? Can we expect some sort of saturation, just as with Moore's law, where it gets harder and harder to optimize video?

dirtbox 23 hours ago 0 replies      
I like this video explaining the difference between H.264 and H.265 https://www.youtube.com/watch?v=hRIesyNuxkg

Simplistic as it is, it touches on all the main differences. The only problem with H.265 is the higher requirements and time needed for encoding and decoding.

problems 1 day ago 0 replies      
Really cool stuff, one thing though seems a little odd:

> Even at 2%, you don't notice the difference at this zoom level. 2%!

I'm not supposed to see that major streakiness? The 2% difference is extremely visible, even 11% leaves a noticeably bad pattern on the keys (though I'd probably be okay with it in a moving video); only the 30% difference looks acceptable in a still image.

syastrov 20 hours ago 0 replies      
An enjoyable, short and to the point article with many examples and analogies. But my favorite part was this:

"Okay, but what the freq are freqX and freqY?"

neo2006 22 hours ago 1 reply      
The comparison does not make any sense, and no, H.264 is not magic!!

- The guy is comparing a lossless format (PNG) to H.264, which is a lossy video format; that is not fair.

- He is generating a 5-frame video and comparing it to a 1-frame image. Only the I-frame at the beginning of the video matters in that case; all the others are derived from it as P-frames.

- What is the point of that comparison? We already have image formats comparable to the size of an H.264 I-frame, using the same science (entropy coding, frequency domain, intra-frame MB derivation...).
some1else 22 hours ago 0 replies      
Try scrubbing backwards. H264 seeking only works nice if you're fast-forwarding the video. Actually, that is kind of magical.
vcool07 1 day ago 1 reply      
This was a good and interesting read. Is H.264 an open standard?
andrewvijay 23 hours ago 0 replies      
Well explained. I was thinking of reading about h264 and this is an amazing starter. Thanks Sid!
markatkinson 1 day ago 1 reply      
Damn, lost me during the frequency part.
11thEarlOfMar 1 day ago 1 reply      
Do H.264 and WebRTC have different use cases? Or do they compete directly?
imperialdrive 20 hours ago 0 replies      
Great Write-up, thank you for your time and effort!
molind 1 day ago 0 replies      
Wow, now tell me how H.265 works!
mozumder 1 day ago 1 reply      
So what's the final car weight? It looks like you stopped at the Chroma subsampling section..
andrey_utkin 19 hours ago 0 replies      
Too trivial, too general, too pompous. I'd downvote.
necessity 16 hours ago 0 replies      
wizkkidd 19 hours ago 0 replies      
time to move on: h.265
mentioned_edu 1 day ago 0 replies      
imaginenore 1 day ago 0 replies      
> "If you don't zoom in, you would even notice the difference."

First of all, I think he meant "you would NOT even notice".

Second of all, that's the first thing I noticed. That PNG looks crystal clear. The video looks like overcompressed garbage.

VikingCoder 1 day ago 2 replies      
Ugh. Comparing the file size difference between a lossless PNG and a LOSSY H.264 video of a STATIC PAGE is absurd. Calling it "300 times the amount of data," when it's a STATIC IMAGE is insulting in the extreme. It really doesn't matter if the rest of the article has insights, because you lost me already.
cogwheel 21 hours ago 1 reply      
MB is 1024 * 1024 * bytes not 1000 * 1000 * bytes. Unless you're a HDD/SSD manufacturer.
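
[For reference, the two conventions differ by about 5% at the megabyte scale. A quick sketch using the PNG size el0j measured above:]

```python
size = 637273  # bytes: the PNG size reported by wget in el0j's comment

print(size / 1000**2)             # 0.637273  SI megabytes (MB, 1000*1000)
print(round(size / 1024**2, 3))   # 0.608     binary mebibytes (MiB, 1024*1024)
print(round(size / 1024, 2))      # 622.34    KiB, matching wget's "622.34K"
```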
kutkloon7 1 day ago 1 reply      
"This concept of throwing away bits you don't need to save space is called lossy compression."

What a terrible introduction to lossy compression. That would mean that if I empty the trash bin on my desktop, it's lossy compression.

The concept of going through all compression ideas that are used is pretty neat though.

Is democracy a failure? (1861) nytimes.com
137 points by siavosh  10 hours ago   197 comments top 30
hairy_man674 7 hours ago 7 replies      
"Under the relentless thrust of accelerating over-population and increasing over-organization, and by means of ever more effective methods of mind-manipulation, the democracies will change their nature; the quaint old forms - elections, parliaments, Supreme Courts and all the rest - will remain. The underlying substance will be a new kind of non-violent totalitarianism. All the traditional names, all the hallowed slogans will remain exactly what they were in the good old days. Democracy and freedom will be the theme of every broadcast and editorial - but Democracy and freedom in a strictly Pickwickian sense. Meanwhile the ruling oligarchy and its highly trained elite of soldiers, policemen, thought-manufacturers and mind-manipulators will quietly run the show as they see fit."

Aldous Huxley, Brave New World Revisited (1958)

nostrademons 9 hours ago 6 replies      
Democracy is interesting because it requires you to step outside your own head, understand that there are people who hold different values and different experiences from you, and then mentally engage with them, holding your own doubt and revulsion at bay, until you can come to a consensus that's acceptable to everyone. It's a thoroughly unnatural and uncomfortable experience that can be both fatiguing and time-consuming. No wonder everybody predicts that it will fail - by definition, a democracy requires occasional subjugation to points of view that are alien to your way of life, and which points of view those are is often unpredictable and changeable.

But I'd much rather have it than any system of forced social roles, where there is one person or small cabal of people who make the decisions and everyone else knows their job is simply to obey.

capkutay 9 hours ago 2 replies      
I really think the movie Idiocracy nailed the scary potential future of democracy. If elections continue and build on this trend of featuring reality tv show style candidates and media that lives off hyperbole, eventually we'll just start electing the most popular celebrities.
Aldo_MX 9 hours ago 4 replies      
I believe that democracy in theory is excellent, but in practice it is a failure; not because of democracy per se, but because a gut-feeling vote and a properly researched vote count the same.

I'm in no way suggesting that one vote should be valued more than another one, but that people should be doing their homework and researching properly the benefits and consequences of their decisions...

Having a majority of the population voting with the gut can lead us to disastrous results...

ozy 1 hour ago 0 replies      
There are two perspectives on democracy. You can compare it to some ideal world that has never been, and easily see it falls short, by a lot.

Or you can look back on history, and look at countries who have done things differently, and realize it is by far the best system we have had.

If you want to live a full lifetime in peace and security and health. Be born in a democratic and capitalist country somewhere in the past 70 years, and you maximize that chance. Any other time and place and the chance of a good life drops quickly.

To other commenters in this thread describing democracy or capitalism with words like "disastrous", "complete failure" or "elite rule": please get some perspective.

roenxi 8 hours ago 2 replies      
We haven't yet found a decision making process that works better than democracy, and a lot of people have tried. At some point, someone has to evaluate how good or bad the decisions being made by the political apparatus are. Either that person is the average voter, or it is a minority group. If it is a minority group, they are in a position of extreme moral hazard to funnel society's resources to their own benefit and lock out change.

It is worth reflecting that, although a lot of people complain that people are 'voting for a slogan' or similar, there is evidence (mainly the outrageous success of democracies vs. non-democracies) that the average voter does actually have some idea what is going on.

It is also a subtle and interesting fact that if a large group of people are voting essentially randomly, then they will cancel out. In this way, a person voting with no thought for policy will probably cancel out another person voting with little thought. A 52-48 type margin can mean that 96% of the population had no idea, and the 4% that knew what was going on voted unanimously in favour. The point being that a vote can be sliced up theoretically so that ignorant voters have less influence than might be expected - and again, the practice of democracies suggests this tends to happen more often than intuition suggests.

Democracies throw out some cruel decisions, but that usually means the interests of the voters are being served rather than democracy failing.

unclenoriega 9 hours ago 1 reply      
The part that struck me the most:

> We have been so accustomed to hear from infancy eulogies of the wisdom which shaped our Constitution, praises of its perfection, hymns to its symmetry and strength, that to doubt its fullness of all excellence has come to sound like sacrilege.

Some things never change.

matt_wulfeck 7 hours ago 1 reply      
The author speaks about the "provincial" partisans that exist between states, cities, and areas of the Union as if it's the byproduct of a failing Democracy. He or she points specifically to the different ways laws are written and enforced, as one example.

This is, to me, perhaps one of the greatest strengths of our Republic. Different ideas and values can thrive in different areas at the same time, and we can test and experiment with what's true and right.

Houshalter 8 hours ago 1 reply      
Yes. Consider that attractive candidates get two and a half times as many votes as unattractive ones. Consider that most voters have very little or no knowledge of most policy and issues. Most voters can't name their representatives even. Most just blindly vote for one party or issue.

It's still better than nondemocratic systems I guess. But that's a terribly low bar to pass. That's not something to be proud of.

Everyone always says that it's the best system of government that has been tried. Well maybe we aren't trying hard enough! There are other systems, and here are a few that are my personal favorites.

My ideal system of government has no politicians. It forms a parliament or congress just like normal, but the representatives are sampled randomly from the population. Ideally they would be filtered for IQ or education, but this is optional. And then they would debate and vote on issues, without having party loyalty, and without having to pander to the general population. It's sort of like direct democracy, but the random sampling lets it scale to much larger populations.

I also really like the model of the supreme court. I have to say that every supreme court decision I've looked into, they seem remarkably rational and competent. They aren't perfect of course, but it seems so much better than congress. Statistics show that even biased judges tend to become much less biased by the time they retire.

I'm not sure how they accomplish this. My guess is the lifetime appointments, and the structure of the court being to debate issues extensively, and for the judges to at least try to weigh them objectively. I would love to try a system of government modelled after something like the Supreme court.

There is futarchy, proposed by Robin Hanson. The idea is to use prediction markets to make predictions about the future, like whether policies will actually work. Then voters can vote on values ('I approve of Brexit, conditional on it being predicted to increase median wages.') But they bet on beliefs.

Another idea I like is the "Ideological Turing Test". In this case representatives can vote on policy just like normal. But they have to pass a test that proves they fully understand the other side's point of view. By writing arguments for the other side of the argument, and blinded reviewers not being able to tell if it's authentic or not. This would be complicated to implement without people gaming the system, but I think it's worth a try.

There is also alternative voting systems. These are just small modifications of regular democracy. They modify the voting system so you can vote for third parties without being punished for splitting the vote.

nick0garvey 9 hours ago 1 reply      
This was published less than a month before the Civil War started.
paulsutter 7 hours ago 0 replies      
Maybe we should ask,

- is representative democracy a failure (in contrast with direct democracy[1])

- is a two-party system a failure (in contrast with a multiparty system[2])

[1] https://en.m.wikipedia.org/wiki/Direct_democracy

[2] https://en.m.wikipedia.org/wiki/Multi-party_system

rdtsc 8 hours ago 1 reply      
Yes it is a failure, but everything else is even more of a failure. So we settled for a lesser failure.

Also, "democracy as a failure" is a common trope used by those who perceive that the election isn't going the way they planned. "They are not voting the way I like, therefore democracy has failed"; or, if the election or polls go the expected way, then "democracy and clear minds prevailed again!"

One interesting thing I found about the current election is the role the media plays. To control people in a dictatorship is easier, you just make criticism and dissent punishable, nationalize all the media and it is all simple and easy. In a Democracy controlling is a bit harder, but is still done over the media using sophisticated and not-so-sophisticated methods. Related to that my favorite quote so far comes from CNN's Chris Cuomo talking about the emails: "its illegal to possess these stolen documents. Its different for the media. So everything you learn about this, youre learning from us." It is as if, there was a tiny crack in the matrix and the underlying code was exposed for a moment.

jayajay 8 hours ago 0 replies      
"51% of people want spin-up, 49% of people want spin-down, therefore we are going to be a spin-up only society". That right there is the problem with Democracy. Democracy finds controversy, but it does absolutely nothing about it. Since the majority side is always favored, there's no incentive to actually get past the disagreements; there's no incentive to grow. It's just about mindlessly acquiring votes.

Democracy provides useful data: which topics society agrees on, which topics they disagree on. For the ~51/49 (controversial) cases, instead of enlightening ourselves, we just blindly take the majority choice. This is not the way a scientific society should be approaching government.

WheelsAtLarge 8 hours ago 0 replies      
It's not a failure, but democracy is about to be tested like never before. With the ability for everyone to have a voice via social media, single-minded groups are easier than ever to create. As we have seen, many of these groups are unwilling to compromise their ideas, which makes it likely for chaos to erupt over any hot issue. We saw examples with the Arab Spring and Occupy Wall Street. They erupted but have had no real results, primarily because there was no real leadership behind them to move things forward. In many ways you can say those revolutions made matters worse.

When everyone is upset and there are many points of view, there is no common way to move forward, but there are many hot heads willing to shoot first and ask questions later. Imagine a million hot heads without a common goal but with the willingness to fight, and we can see chaos without results.

People get upset at the "do nothing congress" because it can't get X done, but people aren't willing to admit that the reason is that voters have sent individuals with very diverse ideas to try to get things done. Voters are the ones pushing them never to compromise an idea or be punished by being voted out. I can see a future where one person who can use social media very well pushes people to vote in ways we would consider distasteful now. What will happen then? Groups will erupt with opposing views and many will be ready to fight.

We've heard allegations that the voting system is rigged, but that's very unlikely. We have laws and watchdogs that prevent that in any significant way. We don't have the same for social media, but we know that it's possible to manipulate it, even by foreign powers, and that's not illegal; worse yet, it's hard or impossible to prevent. It's hard to even contemplate how that affects a democratic system.

The founding fathers created a representative government because they knew that rule by majority can be as distasteful as government by a monarch or emperor. They thought a functioning government needs representatives that can sort out what's needed. With everyone having a voice, that's going to get extremely difficult. Social media is about to let the US test out its governmental system; let's hope it can pass the trouble ahead. Can the US stay together as a nation?

muad 2 hours ago 0 replies      
The problem isn't democracy, it is the republic.

We have the technology to make congress obsolete.

ezequiel-garzon 6 hours ago 2 replies      
It's interesting to see that back then (I now checked that even before the Civil War period) the term United States of North America was used. To the history aficionados, was USNA more commonplace than USA? If so, when did USA become prevalent?
louithethrid 9 hours ago 0 replies      
The success of democracy and its creations has stabilized dictatorships: a constant, even small, surplus in an economy where the human individual matters next to nothing keeps societies with a frozen-over public relatively stable for a long time.

Dictatorship is what came most naturally to our animal ancestors, so we tend to rationalize it while condemning the "Zumutung der Komplexität" (the imposition of complexity).

But are they stable? Dictatorships tend to embrace the conservative point of view, which usually produces overpopulation, raging nationalists (the bottled-up anger redirected against the guys next door) and religious fanatics. So if this economic surplus, aka innovation, is not imported (or, even more dangerous, constantly produced), they unravel rather fast, usually at the hands of the forces they called upon to stabilize them.

There is no built-in rejuvenation without weapons. So every new app, any new product or production method disturbing the equilibrium, can blow up such a social powder keg.

The separate problem usually associated with democracies is that the fulfillment of all wishes in the West's way of life is rather self-destructive for society. <Anecdata begin> I know several couples who really looked forward to having grandkids after raising (quite large) families. And in the West this just doesn't happen any more. Nothing is more depressing than seeing those baby boomers and their bottles, all in tears about "What went wrong with their kids?" <Anecdata end> It doesn't make the situation better that democracies have a tendency to import large swaths of people from dictatorships, mostly for economic and sociological reasons.

hal9000xp 7 hours ago 2 replies      
Centralized democracy is indeed a complete failure.

I really hate the idea that minorities must live as the majority wants. The assumption that the majority is always smart and able to make wise decisions is completely wrong.

Unlike others, I actually think that some form of democracy exists even in authoritarian countries. I lived 22 years in Uzbekistan, then 9 years in Russia. I can say for sure that almost every dictator appeals to the masses. Mediocre people (the masses) are always their primary audience. For example, in Russia, tsar Putin perfectly represents the mentality of the majority of people in Russia. People actually love the style in which he speaks and acts. A dictator won't last long if he loses support from the majority. I wrote about this here:


(I was surprised this answer got a lot of upvotes)

Also, I noted that even when masses don't like their current government's ideology, they jump to another mediocre idea.

For example, the mob in Uzbekistan is attracted to radical Islam as opposition to the current secular dictatorship. So if the current secular regime in Uzbekistan falls, the masses will choose to go back to the 15th century as an alternative. The mob in Uzbekistan certainly won't choose a liberal market economy with a highly developed technology sector attracting international capital. The backward, silly ideas of Islamic clerics are much, much, much closer to the mob.

Another example: next after Putinism in the priority queue of ideologies in Russia is communism, and right after communism is national socialism. So there are a lot of people who oppose Putin because he is not a true communist or does not fully support national socialism. Again, there is no "liberal market economy with a highly developed technology sector attracting international capital" in their queue of ideas.

I can't even imagine the masses go to the streets demanding relaxing regulations for businesses, reducing government spending, attracting international capital.

I guess in the US the Republican party is relatively popular because of religion. Remove the strong support of religion in the GOP and their popularity would probably drop tenfold.

In Europe, the masses are a bit smarter than in Uzbekistan and Russia, but they are still demanding a nanny state, taking money from high earners.

I spent a lot of time and effort to escape poor government policies supported by the masses. I was born in Uzbekistan, then moved to Russia, then to Sweden, then to the Netherlands. So I'm not afraid to say to an entire society "fk off, you are all wrong, I'm leaving!". I already did it 3 times!

For example, I left Sweden because of ridiculously high taxes and really big nanny state.

I see decentralized democracy as a solution. For example, I would support the idea of a small federal government and pretty independent states, so that voters can vote on laws only in their own states (with rare but inevitable exceptions). There would be competition between states, and eventually people with certain ideas would concentrate in particular states. Some states would be more socialist, some more capitalist. The head of the federal government should not be a single person but rather a group of persons, one from each party.

I think Switzerland is the closest example to this.

In such a country, you can easily move between states with different laws, taxes, and ideologies. It's far easier than moving between countries when you disagree with the prevailing political sentiment (which is what I'm doing right now).

afsina 9 hours ago 0 replies      
According "Democracy, god that failed" from Hans Herman Hoppe, it is failed miserably. He sees it even inferior to aristocratic monarchies with good reasons.
thght 5 hours ago 0 replies      
At least not in creating the illusion of choice for the individual.
riffic 8 hours ago 0 replies      
It never fails: log into a Hacker News thread near election season and you'll find commenters yearning for literacy tests.
vasili111 7 hours ago 0 replies      
Democracy and capitalism are not failures, because there are no better alternatives to them.
sjclemmy 7 hours ago 0 replies      
"Many forms of Government have been tried, and will be tried in this world of sin and woe. No one pretends that democracy is perfect or all-wise. Indeed it has been said that democracy is the worst form of Government except for all those other forms that have been tried from time to time."

Winston Churchill

happy-go-lucky 8 hours ago 1 reply      
Is there a system of government that enjoys the most success? Just curious.
chrismealy 9 hours ago 0 replies      
The bond holders seem to like it, so it'll probably stick around.
saynsedit 9 hours ago 1 reply      
Who wrote this?
BorisMelnik 9 hours ago 0 replies      
does anyone know who the author of this is? I couldn't seem to find it in the article.
known 8 hours ago 0 replies      
In democracy it's your vote that counts; in feudalism it's your count that votes. http://www.quotationspage.com/subjects/democracy/
y80 8 hours ago 10 replies      
Democracy isn't a failure. Capitalism is. Now before you dismiss me: I'm not defending socialist states; they are far, far worse than liberal democracies. However, liberal democracies are run by money. Politicians are bought and sold, and we see an undemocratic group of very powerful individuals influencing legislation, pushing around politicians, and manipulating public opinion. How can we claim to live in a democracy if the people we trust with developing legislation must filter everything they develop through the approval of an undemocratic and unfairly powerful minority?

Fortunately we have options that aren't the failed states of the 20th century: we need a democratic economy and a democratic workforce. Those are the only solutions to this problem, and if you spend enough time looking at the problems and their causes, it becomes readily apparent why this is so.

Democracy is not a failure, if you believe democracy is a failure, you're saying that your ability to control your own life is a failure. The problem is the structure on which modern democracy has been built.

andrewclunn 5 hours ago 0 replies      
So the article asks whether the fundamental ideas of democracy and republics are flawed, and concludes that we must impose a singular republic, without addressing those initial concerns. Good to know that the mentality of the Northeastern liberal hasn't changed since the Civil War.
Advanced Data Structures mit.edu
1115 points by ingve  1 day ago   128 comments top 23
stuxnet79 1 day ago 5 replies      
I never hear anybody mentioning him, but Jeff Erickson's 'Algorithms' textbook [1] has some of the most lucid explanations I've come across. CLRS is oftentimes impenetrable, and for the times I didn't like its explanation of something I turned to Jeff Erickson's book, and it hasn't failed me yet. I'd urge anybody trying to solidify algorithms and data structures to take a look at it.

[1] http://jeffe.cs.illinois.edu/teaching/algorithms/

0xmohit 1 day ago 4 replies      
amelius 1 day ago 2 replies      
This is very nice.

But right now, as a programmer, I use data structures more on an as-needed basis. Sometimes it is difficult to find the right data structure for the job. So it would be nice to have a resource that provides the functionality of data structures as a black box. Learning these would make more sense than also learning all of the gory technical details upfront.

mrleinad 1 day ago 1 reply      
List of videos/notes 2014: http://courses.csail.mit.edu/6.851/spring14/lectures/

(spent 2 min trying to find them)

Koshkin 1 day ago 1 reply      
Often shied away from as too complicated, this book nevertheless deserves the obligatory mention: Knuth's The Art of Computer Programming. Although not using Python, and perhaps too analytical and detailed for an average programmer's taste, it is the single biggest classic treatise on algorithms and data structures.

At the other end of the spectrum (accessible and brief), I find Dasgupta et al.'s Algorithms a refreshingly engaging read.

40acres 22 hours ago 0 replies      
I've been watching the lectures & recitations from 6.006 Introduction to Algorithms (Fall 2011) to brush up prior to an interview. Erik Demaine, Srini Devadas & Victor Costan (recitations) have been an amazing resource.

I've learned so much and am really impressed with their depth of knowledge and how they are able to convey complex ideas in a very easy-to-understand way. I can't wait to start the next courses.

fmardini 15 hours ago 1 reply      
A very underrated book, and one of my favorites, is Udi Manber's Introduction to Algorithms; highly recommended.
adamnemecek 23 hours ago 1 reply      
Does anyone use any of these ideas day to day? That's not to knock it, I'm genuinely curious.
0x54MUR41 1 day ago 9 replies      
Thank you for sharing this.

Would anyone recommend resources for learning the fundamentals of data structures?

Books, videos, or courses are welcome. I don't care which programming languages are used for the implementations. I am OK with C.

nahumfarchi 1 day ago 0 replies      
Does anyone know which year has the best scribe notes?
nhatbui 16 hours ago 0 replies      
> If you haven't taken 6.854, you must have a strong understanding of algorithms at the undergraduate level, such as receiving an A in 6.046, having had a relevant UROP, involvement in computer competitions, etc.

Quite the pre-reqs...

ohyoutravel 1 day ago 0 replies      
These are great, Erik is a really smart guy. His intro to algorithms class is also fantastic, which should be on MIT OpenCourseWare.
lawless123 1 day ago 2 replies      
why are these hand-drawn diagrams easier for me to understand and remember?
zvrba 1 day ago 0 replies      
My small contribution to the field: http://zvrba.net/downloads/fusion.pdf
burnbabyburn 1 day ago 0 replies      
Also very interesting is Erik Demaine's work on geometric folding; at least it's fun to watch the various structures he prints and plays with.
mathnode 1 day ago 3 replies      
I take it solutions by students are mostly done in Python now?
zem 20 hours ago 0 replies      
from the two-birds-with-one-stone-dept i've been looking for a good excuse to dive into pyret, and using it to do the exercises from this course might just be it. would anyone like to join me in a slow-paced workthrough?
abbiya 1 day ago 3 replies      
erik is the prof.
MciprianM 1 day ago 1 reply      
Will there be a 2016 version?
interdrift 1 day ago 0 replies      
Thank you for this, I'm so excited to take it.
ausjke 1 day ago 0 replies      
is the site down?
albertTJames 1 day ago 0 replies      
That's a great teach
jasonjei 22 hours ago 2 replies      
This is a great resource for anybody who isn't formally trained in computer science. A lot of programmers use an abstract data type like a dictionary or hash table, but many of the self-taught, and even some formally trained, treat it like a magical black box that stores key-value entries very efficiently. What gives a hash table/dictionary its near-O(1) properties is a good hashing function for the key, and a good distribution of buckets for all the keys when collisions occur.

I think a lot of programmers have a good understanding of many data structures. But I think hashes and dictionaries are still taken for granted. What they really need is to think of a hash as many magical buckets, with the hashing function directing each key to its magical bucket. :)
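[Editor's note: a toy sketch of the idea in the comment above — the hash function picks a bucket, and collisions degrade gracefully into a short list scan. The class name and bucket count are illustrative, not from any real library.]

```python
class ToyHashTable:
    """Chained hash table: hash(key) selects a bucket; each bucket
    holds a small list of (key, value) pairs for colliding keys."""

    def __init__(self, n_buckets=8):
        self.buckets = [[] for _ in range(n_buckets)]

    def _bucket(self, key):
        # A good hash function spreads keys evenly over the buckets,
        # keeping each bucket's list short.
        return self.buckets[hash(key) % len(self.buckets)]

    def put(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)  # overwrite existing key
                return
        bucket.append((key, value))       # collision or new key: chain it

    def get(self, key):
        # Near O(1) on average: one hash plus a scan of one short bucket.
        for k, v in self._bucket(key):
            if k == key:
                return v
        raise KeyError(key)
```

With too few buckets or a bad hash function, every lookup degenerates into a linear scan of one long chain, which is exactly the "taken for granted" part.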

Mysterious 'ping' sound from sea floor baffles Igloolik cbc.ca
127 points by thom  19 hours ago   65 comments top 13
nom 16 hours ago 7 replies      
Mysterious, unlocatable sounds have always fascinated me. Right now I was looking for that impossible-to-triangulate low-frequency vibration that plagued that one town I can't remember, but then I typed "mysterious sound" into YT and found out about this worldwide event [1] [2] that occurred Jan 9-14 this year, and nobody was able to explain it.

1: https://youtu.be/mTOQvDzihTU?t=223

2: http://strangesounds.org/2016/01/unexplained-booms-increase-...

hapless 18 hours ago 3 replies      
Well, there go the methane clathrate deposits.

It was nice knowing y'all.

nom 16 hours ago 0 replies      
This paragraph somewhere in the middle caught my eye:

He also says some of his constituents suspect the sound is being generated on purpose by Greenpeace to scare wildlife away from the rich hunting ground. The organization has a tense past with Inuit stemming from its opposition to the seal hunt in the 1970s and 1980s.

gitaarik 11 hours ago 4 replies      
This could as well be some organization's military machine that they're experimenting with. Just like how mysterious objects in the sky could be explained. It's actually the most logical explanation in most of these mysterious cases. This is explained well in the awesome recent documentary "Hypernormalisation". Check it out!
acqq 18 hours ago 2 replies      
More recent news:


"Military plane investigates mystery 'ping' near IgloolikSearch turns up nothing"

anotheryou 17 hours ago 2 replies      
Why can't they triangulate it? Too deep to go?
simooooo 7 hours ago 0 replies      
It's ice creaking
yoda_sl 12 hours ago 0 replies      
Reminds me of a book from Barjavel (French author): The Ice People https://en.m.wikipedia.org/wiki/The_Ice_People_(Barjavel_nov...
cyberferret 14 hours ago 0 replies      
Locator beacon ping from another US 'broken arrow' ??

(Sorry, got carried away with the other 'discovery of a broken arrow' thread here yesterday!)

GarrisonPrime 10 hours ago 0 replies      
A 300 year-old human spaceship? :)
EmmEff 16 hours ago 2 replies      
Russian sub...
jhwhite 17 hours ago 4 replies      
Godzilla. It's the only logical explanation.
social_quotient 15 hours ago 0 replies      
I'd think it was Megatron but "Revenge of the fallen" happened.
An Introduction to Deep Learning algorithmia.com
82 points by felix_thursday  17 hours ago   4 comments top 2
amelius 2 hours ago 1 reply      
> Ford predicts self-driving cars will see widespread use within the next five years. The U.S. government has gotten on board by issuing safety guidelines.

I would love to see what those guidelines entail. For example, how thoroughly do systems need to be retested after major and minor updates? And how are we going to enforce those safety guidelines, given the scandals with emission guidelines (which is a much less complicated territory) we've seen lately (e.g. VW).

singham 3 hours ago 0 replies      
I found this panel video to be quite a good discussion of DL. https://www.youtube.com/watch?v=furfdqtdAvc
Switching from macOS: Developer Environment elementary.io
238 points by bdcravens  16 hours ago   148 comments top 26
nebulous1 15 hours ago 6 replies      
Elementary doesn't strike me as a particularly good distro for dev. It's not that I have anything against it, but other than your personal preference in the DE (and Pantheon isn't without its charms), it doesn't seem to have much that's going to lift it over any other Linux distro. Perhaps I'm missing something.
jkrems 12 hours ago 5 replies      
Wonder why they didn't go with Cmd+C/Cmd+V for copy&paste. As a developer, that's one of the reasons I really enjoy working on macOS. There's no chance to confuse Ctrl+C and Cmd+C - both of which are shortcuts I use frequently.

P.S.: Not to mention that I appreciate using my thumb for the primary meta key instead of my little finger.

nmalaguti 12 hours ago 2 replies      
One of the major benefits of macOS has been that everyone who uses it has a consistent experience. Some people will use more specialized applications or tools, but the base has been very consistent.

Homebrew has made things even easier and has been adopted as the one right way to install things in a lot of projects and companies. And the fact that it is a rolling release package manager means you can always get the latest and greatest or use homebrew/versions to stick with an LTS version.

I have always found installs of the same Linux distro by different people to be almost incompatible, let alone installs of different distros. Different hardware, different desktop environments, different applications and configurations. On the one hand everyone can have a tailor made experience, but it makes it hard to debug or come up with common configurations and instructions.

Elementary is making some simple and familiar choices that make it easier for everyone to start at the same place. It looks and feels good, but is different enough that I can't just switch without feeling all the rough edges.

If developers are serious about migrating to a linux distro and PC hardware, I think a hybrid rolling release for devtools and versioned releases of the base system might be needed to capture a lot of the success of macOS. I'm not even sure if that's really possible.

unhammer 6 hours ago 1 reply      
> Similarly, you can just Ctrl+V to paste in the terminal instead of having to work around with extra modifier keys.

that's a bit dangerous; Ctrl-V is normally used to "escape"/make literal the following keypress, or do block select in vim.

The notification-on-long-running-process looks very handy, though (I've been using https://gist.github.com/unhammer/01c65597b5e6509b9eea , but of course clicking it doesn't put me back in the right tmux window). And the "energy-sucking apps" indication mentioned in http://blog.elementary.io/post/152626170946/switching-from-m... looks very handy. (I've been considering creating a wrapper for Firefox that Ctrl-Z's it when it's minimized.)

Is anyone running the Elementary DE (or parts of it) on Ubuntu? Does it work OK, or do you have to run the whole OS for it to be worth it?
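[Editor's note: a minimal sketch of the notify-on-long-running-command idea mentioned above — my own, not the linked gist. The `notify-send` call and the 10-second threshold are assumptions; the notification is skipped when no notifier is installed.]

```python
import shutil
import subprocess
import time

def notify_when_done(cmd, threshold=10):
    """Run cmd (an argv list); pop a desktop notification via notify-send
    if the command took at least `threshold` seconds. Returns its exit code."""
    start = time.monotonic()
    status = subprocess.call(cmd)
    elapsed = time.monotonic() - start
    # Only notify for genuinely long commands, and only if a notifier exists.
    if elapsed >= threshold and shutil.which("notify-send"):
        subprocess.call([
            "notify-send",
            "Finished: " + " ".join(cmd),
            "took %.0fs, exit %d" % (elapsed, status),
        ])
    return status
```

Usage would be something like `notify_when_done(["make", "-j4"])` from a small wrapper script.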

lukaszkups 4 hours ago 0 replies      
I don't get all the hate here on HN for the elementaryOS distro as a dev machine. I've worked before on OS X, Ubuntu, Xubuntu and Fedora. Compared to other Linux distributions, it is just another Linux-like system and works as a dev machine similarly to any other distribution, but IMHO looks nicer. Please tell me what makes elementaryOS worse than e.g. Ubuntu as a dev machine. (I'm a webdev working with Cordova/PhoneGap, RoR, Django and Node.js every day, and eOS works like a charm for me.)
meesterdude 15 hours ago 3 replies      
Why are they even making a code editor? Seems like effort that could go towards more fruitful endeavors.
nickbauman 12 hours ago 1 reply      
I bought a System 76 laptop a couple of years ago. It completely smokes my 2x more expensive MBP (which has faster processors) in important tasks like running test harnesses and compiling projects. The body, keyboard and trackpad all have this cheap, "dollar-store" quality that initially drove me nuts but I got used to it after a couple of days.
JustSomeNobody 13 hours ago 1 reply      
Wow, they are really pushing hard in the wake of all the "controversy" with the new MBPs.
jordic 1 hour ago 0 replies      
I'm quite happy with my desktop-less i3 + tmux for shells (Ubuntu). I switched from Mac three years ago, tired of iTunes and the rest of the bloatware.
rerx 6 hours ago 1 reply      
I've been running only Linux for years. Here's what I miss and why I still regularly contemplate just getting a Mac:

- a modern full featured client for email, with an efficient and pretty UI, with good shortcut support (at least as good as the Fastmail and Gmail web interfaces)

- a fast and full featured PDF viewer that supports annotations properly -- anything based on Poppler unfortunately does not cut it

- friendly software to create pretty presentations -- Keynote still seems to be king

Development tools are the least of my worries.

rco8786 1 hour ago 0 replies      
This whole thing reads like an Apple product release. Not sure if that's good or bad considering the intent.
cyberferret 15 hours ago 0 replies      
I installed Elementary in a VirtualBox on my old Windows 7 Thinkpad, and am loving it. Seriously considering installing my Ruby (Padrino) development environment within it to fully test, with a view to completely scrapping Win7 from the laptop and running pure Elementary in the future.
vijucat 10 hours ago 3 replies      
I guess Elementary had to copy the Cmd+spacebar shortcut to mimic the macOS experience (Spotlight), but on that count, Windows' just-press-Win-and-start-typing experience is much better. It's just one key less, but opening a program is done ALL the time, and eliminating that key press makes a huge difference, IMHO. Not sure when they introduced it in Windows, but it was a good one.
yulaow 3 hours ago 0 replies      
As someone suggested also in the previous post about elementary, take a look at Apricity Os (arch-based)

[ https://apricityos.com/download ]

tananaev 9 hours ago 2 replies      
The only reason I have to use macOS for development is Xcode which I need to make iOS mobile apps. I used to use macOS in a VM with Linux as a host system, but it's just too slow and laggy even on good hardware.
EugeneOZ 11 hours ago 1 reply      
Does font rendering look smooth on HiDPI screens?
joeevans1000 11 hours ago 0 replies      
This may be a good transitional and familiar OS for people now having to migrate away from Apple, which isn't taking developers and professionals seriously. Some may find this meets all their needs.
jasoncchild 15 hours ago 2 replies      
i was reading this and wondering why there would be so much emphasis on stuff like apt... then realized that there are indeed developers who've only ever used OS X (and perhaps Windows). i guess i assumed everyone ended up with Linux as a daily driver at some point, even if for a short time
pmlnr 15 hours ago 1 reply      
elementary again within 2 days? Come on.

Anyway, Geany beats Scratch.

achikin 16 hours ago 8 replies      
As a mac user I wonder why I need to use sudo to install packages.
shorodei 15 hours ago 1 reply      
A year ago I started dual-booting Elementary as my daily *nix OS. All was well, until one day, with no hardware change or OS update, the touchpad stopped working. I'm back to VMs now.

If I wasn't dual-booting I might have spent more than a day figuring out what happened, but I was too lazy and scrapped dual-booting.

mirekrusin 6 hours ago 0 replies      
elementaryOS is ok'ish, but before you dive into it you should know that there's no way of upgrading the system in place; you're going to have to do a fresh install once a new version is available.
erokar 15 hours ago 1 reply      
My gripe with elementary OS is that it's too much like macOS. It feels derived, boring, and stale in the same way that macOS does. If you're switching, do it with a bang, not a whimper.
gchokov 10 hours ago 0 replies      
Not switching anytime soon. No reason.
oblio 8 hours ago 0 replies      
Man, I didn't expect this surge in ElementaryOS articles :)
cocktailpeanuts 10 hours ago 0 replies      
I'm an iOS developer. Everything is irrelevant.
New Scandals Show How Pervasive Mass Surveillance Is in the West theintercept.com
108 points by lisper  9 hours ago   17 comments top 3
1024core 40 minutes ago 0 replies      
Wake me up when someone goes to jail.

When you start sending the perps to jail for violating others' constitutional rights, they'll fall in line pretty quickly.

Until then, these "scandals" are no different than Kim Kardashian's nip-slip or that Paris Hilton video.

trendia 3 hours ago 3 replies      
Nearly everyone has a smartphone, even the most privacy-conscious people. Until we find an alternative to the current devices, which have a mic/camera/GPS that could be turned on remotely, I don't think we will be able to combat the invasion of privacy.
pmyjavec 7 hours ago 2 replies      
It's practically out of control. How do you educate a whole population on how it all works to help them avoid the mass surveillance trap?
Y Combinator's Hardware Guy Leaves After 14 Months bloomberg.com
225 points by kgwgk  17 hours ago   108 comments top 18
sama 15 hours ago 4 replies      
I counted at least 5 factual errors in the photo caption and first 3 paragraphs.

We also heard that this reporter was 'encouraged' to write this by someone from a competing accelerator with an obvious agenda. Seems like manufactured controversy otherwise.

All that said, we like hardware and we like Luke, and we're excited he's in the new batch! And we're thankful for the work he did to get our hardware program set up.

sushid 17 hours ago 7 replies      
"Sam Altman, president and co-founder of Y Combinator."

Uhh, they clearly didn't do their homework if they thought Altman co-founded YC.

rezashirazian 17 hours ago 4 replies      
Still, hardware is, well, hard. Silicon Valley has produced a raft of world-changing software startups, from Airbnb (a YC veteran) to Facebook. But it's a whole lot easier to beta-test an app than to prototype and then manufacture a gadget with a bunch of moving parts. Then you have to market the thing to the masses, who are already enamored of their Apple and Samsung products. Exhibits A and B: Fitbit, the fitness tracking company, and GoPro, maker of rugged little cameras. This week, both cut their sales forecasts for the crucial holiday quarter and watched their shares plunge. (Neither started life at YC.)

Fitbit and GoPro's downturn is fueled more by mismanagement and poor decisions than by hardware being intrinsically hard.

The hard part of hardware is building something that works well enough and is in demand. When you have your device on people's wrist or mounted on their helmet, you have overcome the "hardware is hard" part. (for the most part).

After that it's building on your original success, expanding your market and creating value for your customers; all of which are true for any other business, not strictly hardware. For fitbit and GoPro, the latter has proven more difficult.

kenferry 17 hours ago 3 replies      
"Iseman says his departure was purely about his desire to get back into the founder's role and build something new. He's already been accepted into the winter 2017 YC cohort with his new housing startup."

Doesn't sound very acrimonious.

dmix 17 hours ago 1 reply      
Everything must be a controversy these days in the news world. Including all of the times they're really not. Like this time.
nom 11 hours ago 0 replies      
Hardware development requires a lot of spare resources, it doesn't pair well with startups.

The development cycles are much less predictable compared to software, you never know how many prototypes you have to burn through until you reach the production stage. Small changes become hugely expensive once you've finalized the designs, and even a minuscule error like a tolerance mismatch can cause hundreds of thousands in damage and ruin a startup almost instantly. I'm always wary of a HW startup, unless the lead engineer is well known and has experience.

the_watcher 17 hours ago 0 replies      
I met Luke while he was working on Edyn (pre-YC). He was really interesting, and clearly enjoys hacking and tinkering with hardware and physical things more than anything. It's not surprising to me that he'd get bored after a working process for supporting hardware startups was established. Interested to see what he's doing around housing, and somewhat hopeful it won't just be tiny homes (since while I think they're cool, living in one isn't for everyone).
Animats 17 hours ago 0 replies      
Hardware may be a bad fit for YC, with its low initial funding and short deadlines. If you're doing some me-too product, others can do it too, probably better. If you're doing something hard that takes R&D, it may take more time and resources than YC provides.
patcon 11 hours ago 0 replies      
As someone who saw Luke & Heather's Boxouse talk at HOPE conf with some friends, and were inspired to go at it stealth, I've gotta say I'm so so glad they're focussing on the housing project. I just got my rental shipping container dropped off last week, and am working to quickly and non-destructively insulate it for Canadian winter :)
louprado 13 hours ago 0 replies      
I don't know Luke Iseman's vision for Boxouse. But if the goal is to produce structures like the ones shown in the link below then I am really glad he made the decision to focus full time on his company.


cdibona 11 hours ago 0 replies      
I don't want to sound like a jerk, but Trevor Blackwell has always filled the Y Combinator hardware 'slot' from my perspective.
samstave 17 hours ago 1 reply      
WRT goPro...

(I do not know if they already do this);

Should it not be a sound idea to work with companies that make the THINGS that your users would be on to bundle a device with the things...

Rosignol branded gopro when you buy a pair of skis

[SkateBoard] branded gopro bundle, or a tony hawk edition

Water-proof versions bundled with every [insert water product]

Specialized/Brooks branded versions...

Try to get the companies already selling the transport mechanisms the users of go-pro would be using to boost sales. Lower margins, higher(???) volume?

dnprock 15 hours ago 0 replies      
This shows that being a YC founder is more fun than being a YC partner.
Jugurtha 15 hours ago 0 replies      
There's "silicon" in Silicon Valley. The reason it exists is hardware, yet they seem oblivious to that fact and make it seem that, somehow, SV's pursuit of hardware is a new path.
Steeeve 15 hours ago 0 replies      
That's a brilliant photo.
debt 11 hours ago 0 replies      
the position is probably pointless, which is why he left.

it's cool that yc experiments with shit like this, but it didn't work out and now he's doing something else.

hardware is hard.

sethbannon 17 hours ago 4 replies      
TL;DR -- Luke Iseman, YC's hardware focused Partner founded a hardware startup, which he's now taking through YC as a founder, after having set up a bunch of deals and processes to help hardware startups as a YC Partner.

Seems like pretty good evidence YC is now great at helping hardware startups that he's choosing to go through the program himself.

jabbanobodder 17 hours ago 1 reply      
Is it just me or does that picture make him look like a zombie on TWD?
       cached 5 November 2016 16:02:02 GMT