hacker news with inline top comments (10 Jul 2016)
Mental Models I Find Repeatedly Useful medium.com
910 points by orph  4 days ago   187 comments top 53
mikekchar 3 days ago 16 replies      
Interestingly, I find my favourite nitpick: Ockam's razor. The article quotes it as "The simplest solution is usually the correct one". This is a common misinterpretation of it and it's interesting that the quote links to the wikipedia page that has a better statement: "Among competing hypotheses, the one with the fewest assumptions should be selected."

The key problem is equating simplicity with correctness. This is usually disastrous. Once you feel that something is "correct" you stop looking for ways to falsify it. That's the exact opposite of what Occam's razor is used for.

Instead, if you have 2 competing hypotheses (two hypotheses for which the evidence supports both), you use the one with fewer assumptions. Partly because the one with fewer assumptions will be easier to work with and lead to models that are easier to understand. But mostly because fewer assumptions make it easier to falsify.

Abusing this principle outside of the scientific method leads to all sorts of incredibly bad logic.

ucaetano 3 days ago 4 replies      
Interestingly, that's about 75% of my 2-year MBA.

Sure, it's far different doing daily training to get those concepts ingrained in your mind so you don't have to actively think about them, but it's nice to see them listed like this.

Here are a couple more:

- Overconfidence bias: we usually think we're better than average at something we know how to do (driving) and worse than average at something we don't (juggling), even if almost nobody knows how to juggle and everyone knows how to drive

- No alpha (aka can't beat the market): you can only consistently beat the market if you're far better at financial analysis than a lot of people who do it every day all day. So don't bother trying.

- Value chain vs. profits: you'll find that most of the excess profits in the value chain of a product will be concentrated in the link that has the least competition

- Non-linearity of utility functions: the utility of item n of something is smaller than that of item n-1. Also, the disutility of losing $1 is smaller than 1/1000 of the disutility of losing $1000. This explains insurance and lotteries: with a linear utility function both have a negative expected payout, but they make sense when the utility function isn't linear

- Bullwhip effect in supply chain: a small variation in one link of the supply chain can cause massive impacts further up or down as those responsible for each link overreact to the variation (also explains a lot of traffic jams)

- Little's law: in supply chain (and a lot of other fields): number of units in a system = arrival rate * time in the system

I'll add more as I think about them.
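Little's law from the list above is easy to sanity-check in code. A minimal sketch; the coffee-shop figures are made up for illustration, not taken from the comment:

```python
def little_l(arrival_rate, time_in_system):
    """Little's law: avg units in system = arrival rate * avg time in system."""
    return arrival_rate * time_in_system

# Illustrative numbers: 4 customers arrive per minute and each spends
# 2.5 minutes in the shop, so on average 10 customers are inside.
assert little_l(4, 2.5) == 10
```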

btilly 3 days ago 1 reply      
Two adjustments that I would make.

Remove Metcalfe's law. It is a massive overestimate. See http://www.dtc.umn.edu/~odlyzko/doc/metcalfe.pdf for the better n log(n) rule for valuing a network.
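To see how far apart the two valuation rules land, here's a quick sketch (the million-user figure is purely illustrative):

```python
import math

def metcalfe_value(n):
    """Metcalfe's law: network value grows as n^2 (pairwise connections)."""
    return n * n

def odlyzko_value(n):
    """Odlyzko's correction: value grows as n * log(n)."""
    return n * math.log(n)

# For a million-user network the n^2 estimate is roughly 70,000x the
# n log n one, which is why Metcalfe's law so badly overvalues large networks.
n = 1_000_000
ratio = metcalfe_value(n) / odlyzko_value(n)
```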

And I find Le Châtelier's principle generally applicable, and not just to Chemistry. It says that if you observe a system at equilibrium, and try to induce a change, forces will arise that push it back towards the original equilibrium. It is one thing to recognize this at work in a chemical reaction. It is quite another to be blindsided by it in an organization.

See http://bentilly.blogspot.com/2010/05/le-chateliers-principle... for my explanation of why this holds in general outside of chemistry.
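A toy way to see Le Châtelier-style behaviour outside chemistry is any negative-feedback loop. A minimal sketch; the equilibrium point and restoring rate are made-up numbers:

```python
def step(x, equilibrium=10.0, restoring_rate=0.5):
    """One time step: a restoring force pushes x back toward equilibrium."""
    return x + restoring_rate * (equilibrium - x)

x = 20.0                 # perturb the system away from its equilibrium
for _ in range(20):
    x = step(x)
# after 20 steps x is back within a rounding error of 10.0
```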

uola 3 days ago 1 reply      
Ugh, maybe I'm the only one, but I don't find this list useful. Not because it isn't interesting, but because of the implication that it will actually make you smarter. The problem today isn't information, it's knowledge. Even if you could correctly and fully understand all these models, something that could take years, you still most likely wouldn't be able to implement them, especially when they are in conflict with each other.

I think it's a much better idea to study things like critical thinking, practical reasoning and operational leadership. Back in the day hacker values stated that you could ask for directions, but not for the answer. Because the process itself was as important as the answer. Not just for amusement, but because there might not be a right answer and the next time you're confronted with a similar problem you now have some experience of making those decisions.

A great deal of "stupidity" in technology these days seems to stem from schools that promote check-box answers to complex problems, and from the popularity of these "laws" that make people so sure of themselves that it prevents them from proper reasoning.

sitkack 4 days ago 7 replies      
This is super useful, I have a similar list but it also includes techniques and ideas

 * Dimensionality Reducing Transforms
 * Hysteresis, Feedback
 * Transform, Op, Transform
 * Orthogonalization for things that are actually dependent
 * Ratios, remove units, make things dimensionless
A big one, that helps me immensely: when I need to do a big/risky/complex task, I imagine myself doing it with sped-up time. It instantly creates an outline and a list of the tools one will need.

rwallace 3 days ago 0 replies      
Good list! A few suggested tweaks:

Veblen goods clearly exist, but the evidence for the existence of Giffen goods is much more suspect. (Did the poor really eat more bread because the price of bread rose, or because there was an across-the-board increase in the price of all kinds of food?)

The Precautionary Principle is not just dangerous or harmful, but guaranteed suicide; as things stand right now, we are all under a death sentence. It needs to be replaced by the Proactionary Principle, which recognizes that we need to keep making progress and putting on the brakes is something that needs to be justified by evidence.

Any list that has sections for both business and programming needs some entry for the very common fallacy that you can get more done by working more hours; in reality, you get less done in a sixty-hour week than a forty-hour one. (Maybe more in the first such week, but the balance goes negative after that.)

The distinction between fixed and growth mindset is well and good as far as it goes, but when we encourage the latter, we need to beware of the fallacious version that assumes we can conjure a market into existence by our own efforts. You can't become a movie star or an astronaut no matter how hard you try, not because you lack innate talent, but because the market for those jobs is much smaller than the number of people who want to do them.

erikb 3 days ago 1 reply      
I think pg also wrote an essay about a mental model that I find interesting: When in doubt, it's probably not about you.

There are many events that we usually think are related to us, but actually aren't, like your boss or customer being angry is in most cases not about you but something else.

I have looked through a lot of pg's essays but didn't find it. He probably removed it just so that I can't find it (/example).

If someone else finds it, please link.

source99 3 days ago 5 replies      
A technique I often use to test a theory is to change the inputs to be the maximum and minimum possible values and see if the model still holds true. I've found it to be incredibly useful in a few specific situations.
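As a sketch of that technique, here's what probing a hypothetical pricing model at the extremes of its input range looks like (the function and all numbers are invented for illustration):

```python
def discounted_price(price, discount_pct):
    """Hypothetical model: final price after a percentage discount."""
    return price * (1 - discount_pct / 100)

# The extremes of the intended range behave sensibly...
assert discounted_price(100, 0) == 100    # minimum discount: unchanged
assert discounted_price(100, 100) == 0    # maximum discount: free
# ...but pushing just past the maximum exposes a missing constraint:
# the model happily returns a negative price.
assert discounted_price(100, 150) == -50
```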
csallen 3 days ago 3 replies      
I'm surprised he rates cost-benefit analyses as a 2 ("occasionally" used) rather than a 1 ("frequently" used). Making good decisions almost always requires taking a hard look at both the costs and the benefits. It cannot be overstated how often bad decisions are made because the parties involved simply neglected to factor in the costs (including opportunity costs).

I personally use cost-benefit analyses for every non-trivial decision in my life.

delish 4 days ago 3 replies      
Some commenters here are saying, "I already know this stuff." Indeed. I'd be curious if people could put out a list of "advanced" mental models. For example, Bayes' theorem is more advanced than Occam's razor.

What's clearly more advanced than Bayes' theorem, and as useful? ET Jaynes' flavor of probability theory? I'd posit the more advanced version of active listening as, "being able to perform a bunch of kinds of therapy--freudian, rogerian, family and systems etc." Of course I don't mean you go get a license for these things. I'm positing them as difficult, generally-applicable life skills. I'm not claiming these are good examples; I think HN can come up with better ones.
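Part of why Bayes' theorem earns its "more advanced" billing is that the numbers it produces are often counterintuitive. A minimal sketch with made-up test statistics:

```python
def posterior(prior, sensitivity, false_positive_rate):
    """P(condition | positive test) via Bayes' theorem."""
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

# Illustrative numbers: 1% base rate, 90% sensitivity, 9% false positives.
# A positive test still leaves only a ~9% chance of having the condition.
p = posterior(prior=0.01, sensitivity=0.9, false_positive_rate=0.09)
```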

agorabinary 3 days ago 0 replies      
A nice metacognitive cheat sheet.

Missing a couple interrelated mental models I find very important:

- emergence: a process whereby larger entities, patterns, and regularities arise through interactions among smaller or simpler entities that themselves do not exhibit such properties

- decentralized system: a system in which lower level components operate on local information to accomplish global goals

- spontaneous order: the spontaneous emergence of order out of seeming chaos. The evolution of life on Earth, language, crystal structure, the Internet and a free market economy have all been proposed as examples of systems which evolved through spontaneous order.

LeicesterCity 4 days ago 4 replies      
Could someone give me a real example of somebody using mental models in a real world application? I just find the idea of learning and studying mental models to be distracting and confusing. Pardon my ignorance.
karmacondon 3 days ago 1 reply      
I have a similar list of useful concepts. My goal this year has been to expose myself to those concepts as often as possible. I made an app for my phone that displays the concept of the day on my home screen (right now it's the rhetorical concept of periodic sentences). I also made images for each of the concepts that I use as my chromecast backdrop. I've seen each of them dozens of times by now, mostly unconsciously.

So far, mixed results. I would like to say that I think of "Bayes Theorem" at the perfect time because I wrote it on a list, but that never happens. I guess I've benefitted from thinking about these concepts more, but that's almost impossible to measure. A list of 100 useful mental models has limited value if you can't hold all of them in memory at once and retrieve them at the right time. I'm still trying to come up with a solution for this. Unfortunately I think this might be a fundamental limitation of human learning.

adamnemecek 3 days ago 0 replies      
To the development section, I would add the concept of computational context/state, caching, and queue/event loop.

This HN comment summarizes it pretty nicely "everything in an OS is either a cache or a queue" https://news.ycombinator.com/item?id=11655472

Also the Overton window.

Double_Cast 3 days ago 0 replies      
> What am I missing?

In planning a strategy, I've found it helpful to consider Win Conditions. It forces me to think backwards from the goal, construct a dependency tree, and consider resource allocation. I first heard about it from videogames, but I've also seen it in math, engineering, logistics, recipes, etc. I also pattern-match it to the insight that solved the Problem of Points [0], which motivated probability theory. If it were on the curated list, I'd expect to find it under "models" next to cost-benefit analysis.

[0] https://en.wikipedia.org/wiki/Problem_of_points#Pascal_and_F...

frankus 3 days ago 1 reply      
Great list, although I prefer the term "thought technology" (as coined by John Roderick) to "mental model".
shunyaloop 4 days ago 0 replies      
On-going series on mental models at http://www.safalniveshak.com/category/mental-models/
agentgt 3 days ago 1 reply      
I still think Social Psychology was one of the most useful classes I ever took in college. Sure, some of it is probably dated by now, but the cognitive bias theories really helped me further in life.

I remember telling some classmates to take the class and they assumed it was for an easy A, not for how useful the class would be (I went to GaTech a long time ago, and the social sciences were just not respected like the engineering disciplines at the time).

ternaryoperator 3 days ago 1 reply      
His definition of a "strawman" is incomplete. It's not simply misrepresenting someone's argument, it's misrepresenting it specifically by analogizing it falsely to something similar that is easier to attack. The example he links to is a rather exaggerated strawman. I think most people would favor the strawman explanation in Wikipedia [1].

[1] https://en.wikipedia.org/wiki/Straw_man

k__ 3 days ago 0 replies      
The wrong assumption about Occam's razor is probably the cause of so many people re-inventing the wheel.

"I don't need this big framework, I can do with much less!"

vetras 1 day ago 0 replies      
Perfect, but how do you use these models?

Are you supposed to know all 100-odd of them by heart and then, in the middle of a conversation, go: "Ah, but X principle says Y, therefore we will go with Z option"? Is that it? Am I missing something?

I mean, I'd love to use this but I don't have enough brain cells for all of those :)

RickHull 3 days ago 0 replies      
I'm just getting exposed to this line of thinking and find it fascinating. Another resource I found recently was https://www.farnamstreetblog.com/mental-models/

Disclaimer: I'm not sure if it's derivative blogspam or legitimately insightful / original

votr 4 days ago 3 replies      
How would one actually use this stuff?
Lordarminius 3 days ago 0 replies      
I would add to the list 'revealed preference'

'... an economic theory of consumption behavior which asserts that the best way to measure consumer preferences is to observe their purchasing behavior. Revealed preference theory works on the assumption that consumers have considered a set of alternatives before making a purchasing decision. Thus, given that a consumer chooses one option out of the set, this option must be the preferred option' http://www.investopedia.com/terms/r/revealed-preference.asp

In other words "observe their actions, not their words"

ElijahLynn 3 days ago 0 replies      
This is the core of the book Peak: Secrets from the New Science of Expertise. It is a book about how to create a mental representation of what successful mental representations look like.

The most successful people, the peak performers, are those who have the best mental representations.

jtlien1 2 days ago 0 replies      
The mental model from economics that is widely misinterpreted is comparative advantage. Most think it means you (or a country, etc.) should specialize in whatever you are best at, and then free trade will work to your advantage. But it actually means that even if you are worse at producing products A and B than another country, if your ratio of A/B is better than the other country's, it is good for you to produce A and trade it to the other country for B.
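The comparative-advantage point is easy to check with toy numbers (invented for illustration, not taken from the comment):

```python
# Output per hour of labour for two products:
us   = {"A": 1, "B": 1}    # absolutely worse at both products
them = {"A": 6, "B": 3}    # absolutely better at both products

# Opportunity cost of one unit of B, measured in units of A forgone:
our_cost_of_B   = us["A"] / us["B"]       # 1.0 A per B
their_cost_of_B = them["A"] / them["B"]   # 2.0 A per B

# We give up less A per unit of B, so despite being worse at everything,
# we should specialize in B and trade it to them for A.
assert our_cost_of_B < their_cost_of_B
```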
ForHackernews 3 days ago 0 replies      
Are these "mental models" or just a bunch of clichés / pithy aphorisms? To me, a mental model would be something more like "visualizing possible state transitions as a directed graph" or something like that.
preordained 3 days ago 0 replies      
Nice. They got Hick's law... that's one of my favorites, not so much in development, but in sports. I train Brazilian jiu-jitsu, and I find substantial improvement in my reaction time by having only 2-3 well-worn options at my disposal (even 3 starts to feel crowded) in any given position, rather than a multitude of counters/attacks. When someone is trying to strangle you, going left or right is often a better choice than let's-check-the-mental-database-for-the-ultimate-move.
barrystaes 3 days ago 0 replies      
Quite a few of these "mental models" are just a definition of terminology like "botnet". Come to think of it, the complete list is just definitions..
rdlecler1 4 days ago 0 replies      
A couple more:


> Frequency-dependent selection: fitness of a phenotype depends on its frequency relative to other phenotypes

> Evolutionarily stable strategy (ESS) is a strategy which, if adopted by a population in a given environment, cannot be invaded by any alternative strategy that is initially rare. It is relevant in game theory, behavioural ecology, and evolutionary psychology. Related to the Nash equilibrium and the Prisoner's dilemma.


> Debasement (gold coins): lowering the intrinsic value by diluting it with an inferior metal.

agentgt 3 days ago 0 replies      
I would say "Divide and Conquer" should be a 0... it is that useful and it can be applied to many many different categories.

So many things seem intractable and formidable in complexity yet once these things are broken down into pieces things become clear. The Asana CEO once talked about this. Breaking things out provides clarity and once you have clarity productivity is massively increased.

projectileboy 3 days ago 0 replies      
If you enjoy these sort of summaries, I encourage you to check out the book "Seeking Wisdom" by Peter Bevelin https://www.amazon.com/Seeking-Wisdom-Darwin-Munger-3rd/dp/1...
mizzao 3 days ago 0 replies      
This book is a very handy pocket reference that overlaps with many of the ideas mentioned here:


galfarragem 3 days ago 0 replies      
I recurrently use: Everything is a... [1]

Even when this model doesn't explain 100% of occurrences, it is great as a starting point for understanding the main pattern of a complex system.

[1] - http://c2.com/cgi/wiki?EverythingIsa

misterdata 3 days ago 0 replies      
I'd add Amdahl's Law [1], which is about the relationship between adding resources for executing a task, and the speed-up that delivers.

[1] https://en.wikipedia.org/wiki/Amdahl%27s_law
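That relationship is simple enough to sketch directly (the 95%-parallel figure is illustrative):

```python
def amdahl_speedup(parallel_fraction, n_processors):
    """Amdahl's law: overall speedup when only part of a task parallelizes."""
    serial_fraction = 1 - parallel_fraction
    return 1 / (serial_fraction + parallel_fraction / n_processors)

# Even with 1000 processors, a task that is 95% parallel tops out
# just under the 1 / (1 - 0.95) = 20x ceiling.
speedup = amdahl_speedup(0.95, 1000)
```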

philip1209 4 days ago 0 replies      
I thoroughly enjoyed the book Inside The Box, which presents four mental models for creative problem solving. The core idea, that creating rules can help creativity, is a pattern most technical people (including myself) feel averse to, but it can actually be beneficial when studied with an open mind.
steveeq1 4 days ago 0 replies      
There is also an app from the apple app store that has most of these mental models in book form: https://itunes.apple.com/us/book/think-mental-models/id61236...
RivieraKid 3 days ago 0 replies      
Very underwhelming, I'm actually quite surprised that most people seem to find this useful and interesting. I mean, normal distribution, Moore's law, minimum viable product, paradox of choice... that's pretty basic stuff.
cvwright 4 days ago 0 replies      
It's an interesting list. Though I'm a bit baffled at why he has Power-law as a "1" (comes up frequently) and Heavy-tailed distribution as a "3" (rarely comes up). A power law is a heavy-tailed distribution!
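The distinction being pointed at here is easy to see numerically: a power-law tail shrinks polynomially while a light (e.g. exponential) tail shrinks exponentially. A sketch, where the exponent alpha and the comparison point are arbitrary choices:

```python
import math

def power_law_tail(x, alpha=2.0):
    """P(X > x) for a power law decays like x^(-alpha)."""
    return x ** -alpha

def exponential_tail(x):
    """P(X > x) for an exponential decays like e^(-x)."""
    return math.exp(-x)

# Far out in the tail, the power law assigns vastly more probability,
# which is exactly what makes it heavy-tailed.
ratio = power_law_tail(50) / exponential_tail(50)   # astronomically large
```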
jacquesm 3 days ago 0 replies      
Nice list! I really miss this one:


bootload 3 days ago 1 reply      
under competing, I'd add OODA loops ~ https://en.wikipedia.org/wiki/OODA_loop
quadrangle 3 days ago 0 replies      
Along with the reference to Arrow's Impossibility Theorem, I'd want a reference to the fact that voting can be done in ways other than ranking, e.g. approval or score voting.

Overall, a superb list.

adamnemecek 4 days ago 0 replies      
Is Gabriel Weinberg related to Gerald Weinberg? No right? I've been wondering this for some time now.
xchip 4 days ago 0 replies      
An article that goes straight to the point. I like it!
fatdog 3 days ago 0 replies      
TL;DR: what you learn in an Economics degree.
hackaflocka 3 days ago 1 reply      
"Spamming" is a mental model? Mmmmmkay.
criddell 4 days ago 1 reply      
Inflation is a mental model? Peak oil? Botnet?
bcherny 4 days ago 2 replies      
Nothing groundbreaking here - I imagine most readers here already use most of the author's models - but this is a nice, comprehensive list, which I have not seen before.
fizixer 4 days ago 1 reply      
I guess what you described goes by a well-known term called 'critical thinking'?
crimsonalucard 3 days ago 0 replies      
aka buzz words to make you sound more intelligent.
peterkshultz 4 days ago 1 reply      
If the 29 minute read time is intimidating, consider this link: https://www.farnamstreetblog.com/mental-models/

All the information, easier to read quickly.

throwaway_1004 4 days ago 1 reply      
Such cynical words, besides depriving the world of a much needed listicle, will also get us downvoted. Please don't offer such awkward comments which might cause people to pause and think. Now back to my facebook feed..
Facebook Messenger begins testing end-to-end encryption using Signal Protocol whispersystems.org
603 points by mayneack  1 day ago   284 comments top 47
alva 1 day ago 15 replies      
From what is written, I understand this to mean that users can select this feature for specific conversations, and that not all messages are subject to this encryption.

I am not usually one for paranoia, but is anyone else becoming more suspicious about Facebook's motivations and involvement with government? This feature is a massive boost for intelligence services dealing with unsophisticated actors. It reduces the haystack significantly, by having users self-flag messages that may be incriminating. Many millions of FB messages must be sent every day; brute-forcing encryption on all of these is probably not possible. A small % marked as 'secret conversation'? Much easier.

Why doesn't FB just apply encryption to all messages? Surely they have the resources available. Is it because this feature makes somebody else's job a lot easier? If my suspicions are correct, what sort of threats would this pick up? Are serious threats likely to use FB messages flagged as 'secret conversation' to coordinate actions?

Techbrunch 1 day ago 3 replies      
Reasons for not enabling it by default, from @alexstamos (CSO @ Facebook):

- FBM is multi-device, and we'd like to see E2E usability improve to support this. For now, pick one device and keys never leave it

- Secret conversations don't currently support popular features like searching message history, switching devices, voice/video, etc

- Hundreds of millions use Messenger from a web browser. No secure way to verify code or store keys without routing through mobile.

"We don't want to disrupt people's current experience."

Source: https://twitter.com/alexstamos

etiam 1 day ago 3 replies      
"End-To-End Encrypted Secret Conversations" in software that is ordinarily used to harvest electronic phone books and rummage through user photos, from a company that made its whole fortune trying to obliterate privacy as a part of human culture?

It's going to take pretty high standards of proof to give this anything that resembles credibility.

sigmar 1 day ago 2 replies      
This article doesn't mention it, but Facebook Messenger will be using the Signal protocol: https://whispersystems.org/blog/facebook-messenger/

also, here is the white paper (from the above post): https://fbnewsroomus.files.wordpress.com/2016/07/secret_conv...

r2dnb 1 day ago 3 replies      
I've read the whole thread and I'm surprised that nobody mentioned how easy it would be for Facebook to store the secret keys.

Page 10 of the white paper mentions that there is a remote key stored on Facebook servers which can be used to decrypt the local key. If Facebook still has to be trusted, I don't see what the deal is here.

I think that as soon as you put the words "end-to-end" encryption on a marketing material, you have to be ready to open-source your client. This is the cost that companies aiming to be credible can't escape.

End-to-end encryption without open-source has no value. It is a waste of energy for the company doing that too - or perhaps a marketing cost.

agd 1 day ago 3 replies      
It's worth remembering that this does not protect metadata. It's believed (though not known for sure) that WhatsApp logs metadata for their encrypted messages, and it looks like Facebook do the same here.

If you want to resist mass surveillance this is not a good solution.

eyeareque 1 day ago 2 replies      
Moxie and team, bravo. You've made the snoopers jobs a whole lot harder.

Your goal of making encryption easy for the masses to use is coming true. It looks as if PGP's days are numbered.

cdown 1 day ago 0 replies      
eganist 1 day ago 1 reply      
I'm not immediately seeing any insight into whether this covers conversations initiated in-browser. If this does exist, it'd be interesting to see how they've tackled the security of crypto logic in-browser and compare it to what Cyph has in place for in-browser code signing.

Reading the technical docs now (https://fbnewsroomus.files.wordpress.com/2016/07/secret_conv...).

Edit: Yep, this seems device-to-device; there doesn't seem to be a web component here. Still useful given how many people use messenger primarily via phone, and I suspect implementation wasn't hard given WhatsApp did it first. It would be neat to see if Messenger and WhatsApp are ever bridged through this.

AdmiralAsshat 1 day ago 0 replies      
Genuinely thrilled to see the Signal protocol adopted for Facebook and Google stuff (albeit optionally). Now if we can just get Microsoft and Amazon to hop on-board, we might actually have a shot at getting this standard to be pervasive.
jalami 1 day ago 0 replies      
If the messenger is not open sourced, it's trivial for Facebook to add something to the client binary (now, with a flag or at some later date) before the Signal libraries are hit. I'm not saying they are doing so, but without a clear way to verify continually, this is just short of security theater. Then there's Facebook facilitating the key exchange which of course is another blind trust as well as all the juicy meta data. Maybe this will quiet some of the nerves of privacy conscious individuals already on the network, but it seems to me more like a marketing label.

I still find it hard to believe so many people trust what they believe to be private communication with close-lipped advertisement companies.

marak830 1 day ago 2 replies      
Last I heard, didn't their messenger app pull a ton of unnecessary permissions on Android?
grandalf 1 day ago 0 replies      
While this is a great step, let's not forget that all FB has to do is track who chooses to use encryption and it can easily use that metadata to aid law enforcement.
znpy 1 day ago 0 replies      
When it was still possible to use the Facebook chat via XMPP I used to use Pidgin as client, and chat securely using the Pidgin OTR plugin.

Message appeared in the Facebook page as "encrypted message".

I guess you hardly can get better than this.

jswny 1 day ago 2 replies      
As much as this is a step in the right direction, you have to specifically enable encryption for individual conversations in Messenger. This implementation seems a little sketchy to me. They really should just encrypt every conversation automatically. Otherwise, opting into encryption only invites scrutiny.
shmapf 1 day ago 3 replies      
I've been wondering about this for a while.

Now both WhatsApp and Facebook have this, but surely they have the encryption keys too; how else would they seamlessly fetch your messages and decrypt them when you get a new phone?

If they do, then what's the point?

ge0rg 1 day ago 1 reply      
Now that the Signal Protocol is deployed in so many different places, is there a proper specification of the (current) protocol? (The old axolotl spec and the GPL implementation don't qualify)

What are the licensing conditions / restrictions for using the protocol?

anotheryou 16 hours ago 0 replies      
I'd be very glad. I keep Facebook for the people who don't have any proper IM (and for what is, in my view, the much more legitimate micro-blogging on FB).

I'm scared of what will be possible to extract from my chat logs in a few years, but the benefit of being able to IM people that only have FB feels greater right now.

Biggest problem I see so far is the multiple devices issue, but for most it will be just Desktop and Mobile, so why can't you send each message twice, encrypted separately for each device (automatically, not manually)? Does OTR3 have this feature?

em3rgent0rdr 1 day ago 1 reply      
I'd use it if I could communicate with non-Facebook programs implementing the Signal protocol.
em3rgent0rdr 1 day ago 1 reply      
How can I verify my device is indeed running the signal protocol? Messenger is a proprietary app.
myf 1 day ago 0 replies      
dumb question: how do we (know|prove) a message is encrypted with signal when we use facebook, google allo etc. etc.
dang 1 day ago 0 replies      
A user complained that the title was misleading compared to https://newsroom.fb.com/news/2016/07/messenger-starts-testin..., so we replaced "deploys" with "begins testing" above. If someone suggests a better (i.e. more accurate and neutral) title, we can change it again.
amq 1 day ago 0 replies      
I would still use other means of communication for something really private until this becomes the default, because by opting in I would essentially mark myself as suspicious.
jsn117 1 day ago 0 replies      
Facebook still keeps the messages, and uses the App to track you. Unless the Signal protocol is against FB itself, I don't see how this is news.
secfirstmd 1 day ago 1 reply      
I wonder, by "end-to-end" do they mean they will be implementing the Signal Protocol? That would be a pretty awesome increase in security (for most people's threat models)... WhisperSystems are genuinely amazing at what they do; they have fewer than 5 staff and very little budget, but they are literally saving hundreds if not thousands of lives. Any chance some rich HN member will recognise this and open up the chequebook to OWS?
Sir_Substance 1 day ago 0 replies      
That's great! Since they're thinking of rolling out a (relatively) standard protocol, maybe we could have the ability to message our friends on facebook from other services again?

Ya know, now that it won't be such a pain to support another protocol and all, since they're doing it anyway.

uola 1 day ago 1 reply      
One thing I don't understand is how Signal, implemented by platform providers, is supposed to work with lawful interception. Either it doesn't work, in which case we expect law enforcement to just give up the right to wiretap things with a warrant (which seems unlikely), or it does work and is less private than one would expect.
birdmanjeremy 1 day ago 0 replies      
If this is why they are pushing me away from using messenger in the mobile browser, I'm suddenly way less upset.
curiousgal 1 day ago 0 replies      
I downloaded Signal only to find out it doesn't support phones with dual SIM cards when sending unsecured SMS.
free2rhyme214 1 day ago 0 replies      
Facebook took the same route as Google by providing secret conversations. However you look at this, this is a good step in the right direction. I still prefer Whatsapp and Signal because they both use E2E encryption by default.
akerro 1 day ago 0 replies      
So what. It's metadata that counts.
sandstrom 1 day ago 0 replies      
Awesome news!

They should call it 'private conversations' instead of 'secret conversations' though.

Tharkun 1 day ago 0 replies      
This is worse than useless as it doesn't encrypt browser initiated messages and doesn't work cross device. It's yet another attempt to force FB users to switch to the very shady FB Messenger app. I'm still not touching it with a ten foot pole.
xnull2guest 9 hours ago 0 replies      
Yeah no thanks. I don't trust Facebook with anything. I consider any software touched by Facebook backdoored.

They deserve that reputation.

em3rgent0rdr 1 day ago 1 reply      
I'll try Facebook Messenger on emulated Android without a Google account. Not a chance that I share all my phone contacts and everything else in permissions, simply so I can talk privately with my friends that are stuck in Facebook.
kalsk 1 day ago 1 reply      
How susceptible is this Signal Protocol to a man-in-the-middle attack? Because if Facebook is going to be the man in the middle, then this feature is pointless.
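The protocol's key exchange can't by itself stop the server from handing each party a substitute key; the standard defense is comparing key fingerprints out of band (Signal calls these "safety numbers"). Here is a rough sketch of the idea in Python. Note this is a simplified illustration, not Signal's actual safety-number derivation (which iterates a hash over identity keys and user IDs):

```python
import hashlib

def fingerprint(my_key: bytes, their_key: bytes) -> str:
    """Derive a short fingerprint both parties can read aloud and compare.
    Simplified illustration only, not Signal's real derivation."""
    # Sort so Alice and Bob compute the identical value.
    material = b"".join(sorted([my_key, their_key]))
    digest = hashlib.sha256(material).hexdigest()
    return " ".join(digest[i:i + 4] for i in range(0, 24, 4))

# If the server (the "man in the middle") swapped a key, the views differ:
alice_sees = fingerprint(b"alice-pub", b"bob-pub")
bob_sees = fingerprint(b"bob-pub", b"alice-pub")
assert alice_sees == bob_sees  # honest server: fingerprints match
```

The catch is that this defense only works if users actually compare the numbers over a channel the server doesn't control.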
hatsunearu 1 day ago 0 replies      
End to End as in from Facebook user to Facebook user?

This is insane--I thought the whole model of Facebook chat was that they are grabbing all sorts of info from the messages for ads. What the fuck?

DavidWanjiru 1 day ago 0 replies      
Me, if I was working at Facebook, I'd have called it "Private Conversations." But hey, what's in a name?
justcommenting 1 day ago 0 replies      
Kudos to moxie for building free alternatives that people (Signal) and companies (Signal Protocol) can freely choose.
awqrre 1 day ago 0 replies      
But the Facebook app and Android still have access to unencrypted messages ...
dylanops 1 day ago 1 reply      
Telegram needs you to opt in too, I thought?
SixSigma 1 day ago 4 replies      
I'm still sticking with the website version, thanks


Messenger, Facebook

This app has access to:

Identity
 find accounts on the device
 read your own contact card
 add or remove accounts
Contacts
 find accounts on the device
 read your contacts
 modify your contacts
Location
 precise location (GPS and network-based)
 approximate location (network-based)
SMS
 edit your text messages (SMS or MMS)
 receive text messages (SMS)
 send SMS messages
 read your text messages (SMS or MMS)
 receive text messages (MMS)
Phone
 read phone status and identity
 read call log
 directly call phone numbers
 reroute outgoing calls
Photos / Media / Files
 modify or delete the contents of your USB storage
 read the contents of your USB storage
Storage
 modify or delete the contents of your USB storage
 read the contents of your USB storage
Camera
 take pictures and videos
Microphone
 record audio
Wi-Fi connection information
 view Wi-Fi connections
Device ID & call information
 read phone status and identity
Other
 receive data from Internet
 download files without notification
 control vibration
 run at startup
 draw over other apps
 pair with Bluetooth devices
 send sticky broadcast
 create accounts and set passwords
 change network connectivity
 prevent device from sleeping
 install shortcuts
 read battery statistics
 read sync settings
 toggle sync on and off
 read Google service configuration
 view network connections
 change your audio settings
 full network access

thefastlane 1 day ago 2 replies      
what's the difference between

- Messenger

- plain vanilla messages i get in Facebook web site

- 'chat' messages, were I to turn on 'chat' in Facebook web site

i'm not asking rhetorically. i honestly can't keep up with all the messaging avenues available today...

calinet6 1 day ago 0 replies      
Can't stop the Signal.


jswny 1 day ago 1 reply      
I'm really skeptical about this. First of all, Facebook collects more user data than just about any company out there. They make most of their money off of advertising and harvesting user data and metadata. Facebook is just about the last company I'd trust to encrypt data of mine. It's like them saying, "hey, I know we make most of our revenue off of collecting user data but I think we should throw away a huge portion of that."

Additionally, from what I've gathered, they are going to roll this out so that you have to specifically tell Messenger you want to encrypt a chat. Why would they not just make encryption universal? If anything, this makes it even easier for the government or other entities to target "suspicious activity." I'm far too skeptical of Facebook and how they are going about this whole process to be happy about it.

eps 1 day ago 1 reply      
Is this some sort of elaborate trolling?

Can an exact match for a FB-provided binary be recreated from the open source code? If it's a no, then it's back to trusting FB to do the right thing, and it doesn't make the slightest difference what exact protocol it's running or whether the source was peer-reviewed behind closed doors.
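The check eps describes is the reproducible-builds problem: rebuild the app from the published source with a pinned toolchain, then compare digests. A minimal sketch of the comparison step (the file paths are hypothetical):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Hash a file in chunks so large binaries needn't fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical: the APK Facebook ships vs. one built from the public source.
# Any nondeterminism (timestamps, signing, compiler version) breaks the
# comparison, which is why this needs a fully pinned build environment.
# assert sha256_of("messenger-shipped.apk") == sha256_of("messenger-built.apk")
```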

danesparza 1 day ago 3 replies      
So ... no comment on the choice of picture in the blog post? I hadn't heard of Jules Bonnot, but found the wikipedia article illuminating: https://en.wikipedia.org/wiki/Jules_Bonnot

This seems like it's a subtle endorsement for using Open Whisper Systems for criminal activities. Is it just me, or does that seem like the wrong image to gravitate towards?

Apollo 11 Guidance Computer source code github.com
617 points by uptown  2 days ago   137 comments top 37
blueintegral 2 days ago 3 replies      
Someone opened an issue: "Check continuity on O2 cryogenic tanks before allowing stir"


js2 2 days ago 5 replies      
elcapitan 2 days ago 1 reply      
This is amazing and contains so many gems.

I think this one is my favorite module: https://github.com/chrislgarry/Apollo-11/blob/master/THE_LUN...


jdimov10 2 days ago 0 replies      
From the code comments:

"This source code has been transcribed or otherwise adapted from digitized images of a hardcopy from the MIT Museum. The digitization was performed by Paul Fjeld, and arranged for by Deborah Douglas of the Museum. Many thanks to both. The images (with suitable reduction in storage size and consequent reduction in image quality as well) are available online at www.ibiblio.org/apollo. "

I mean, I realise that this is the least of the amazing achievements we're talking about here, but yea.. respect :)

sgt 2 days ago 0 replies      
Take note of the KALMAN_FILTER.s source code file. See https://en.wikipedia.org/wiki/Kalman_filter for details. The filter is named after Rudolf Kalman who recently passed away. (https://en.wikipedia.org/wiki/Rudolf_E._K%C3%A1lm%C3%A1n)
sjtgraham 2 days ago 5 replies      
BURN_BABY_BURN--MASTER_IGNITION_ROUTINE.s - https://github.com/chrislgarry/Apollo-11/blob/dc4ea6735c4646...
Animats 2 days ago 1 reply      
There's a simulator, if you want to run it.[1] But it's just a simulator for the computer; there's no spacecraft attached.

There's a mod for Kerbal Space Program which gives it real solar system planets and dimensions. (KSP's world is way undersized so things happen faster.)[2]

There's another mod for Kerbal Space Program to give it real dynamics.[3] (KSP doesn't really do dynamics right; the spacecraft is under the gravitational influence of only one body at a time. This is why there's that sudden trajectory change upon Mun capture.)

Someone should hook all that together and do a moon landing in simulation.

[1] http://svtsim.com/moonjs/agc.html
[2] https://www.reddit.com/r/KerbalSpaceProgram/comments/1piaqi/...
[3] http://forum.kerbalspaceprogram.com/index.php?/topic/62205-w...

uptown 2 days ago 3 replies      
Love this line:


ianbertolacci 2 days ago 2 replies      
Have they considered rewriting it in rust?
libria 2 days ago 0 replies      

The 1969 version of "This should never happen".


EvanAnderson 20 hours ago 0 replies      
It's interesting to me that the AGC contains an implementation of a virtual machine that is used to perform the higher-level mathematical functions (called 'The Interpreter'). Some details are available in this PDF starting on page 74: http://www.ibiblio.org/apollo/NARA-SW/E-2052.pdf

It would be fun to do some research into the embedding of higher-level virtual machines in earlier computers. I'm thinking of 'The Interpreter' in the AGC as being an ancestor to 'SWEET16' in the Apple II (https://en.wikipedia.org/wiki/SWEET16), or the 'Graphic Programming Language' (http://www.unige.ch/medecine/nouspikel/ti99/gpl.htm) in the TI-99/4A.
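The Interpreter's trick (compact pseudo-instructions that dispatch to heavyweight math subroutines, trading speed for scarce ROM) can be sketched as a toy stack machine. This is purely illustrative; the real Interpreter packed a very different instruction format into 15-bit words:

```python
def run(program):
    """Toy stack-machine interpreter: each opcode is a subroutine,
    so one 'instruction' can perform a whole vector operation."""
    stack = []
    ops = {
        "PUSH": lambda arg: stack.append(arg),
        "ADD":  lambda _: stack.append(stack.pop() + stack.pop()),
        "DOT":  lambda _: stack.append(
            sum(a * b for a, b in zip(stack.pop(), stack.pop()))),
    }
    for opcode, arg in program:
        ops[opcode](arg)
    return stack

# One DOT "instruction" replaces what would be many native multiply/adds:
result = run([("PUSH", (3, 4, 0)), ("PUSH", (1, 2, 0)), ("DOT", None)])
# result == [11], since 3*1 + 4*2 + 0*0 = 11
```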

Practicality 2 days ago 0 replies      

I want to just leave comments like this and have users be responsible for avoiding said chaos.

brendangregg 2 days ago 1 reply      
It's amazing that this is all online now, and easy to browse. Lots of source here too http://www.ibiblio.org/apollo/links.html

FWIW, I did performance analysis of the guidance computer and the 1202 and 1201 alarms at the start of my ACM Applicative 2016 keynote: https://youtu.be/eO94l0aGLCA?t=4m38s

WhitneyLand 2 days ago 0 replies      
In case you're wondering what hardware this source code is for:https://en.wikipedia.org/wiki/Apollo_Guidance_Computer

The Apollo Guidance Computer (AGC) was a digital computer produced for the Apollo program that was installed on board each Apollo Command Module (CM) and Lunar Module (LM). The AGC provided computation and electronic interfaces for guidance, navigation, and control of the spacecraft. The AGC had a 16-bit word length, with 15 data bits and one parity bit. Most of the software on the AGC was stored in a special read only memory known as core rope memory, fashioned by weaving wires through magnetic cores, though a small amount of read-write core memory was provided.
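That 15-data-bits-plus-parity layout is an odd-parity scheme for catching single-bit memory errors. A sketch of how generation and checking work (where the parity bit sits within the word is an assumption here):

```python
def odd_parity_bit(data15: int) -> int:
    """Choose the extra bit so the 16-bit word has an odd number of 1s."""
    ones = bin(data15 & 0x7FFF).count("1")
    return 0 if ones % 2 == 1 else 1

def word_ok(word16: int) -> bool:
    """A stored word is valid when its total 1-bit count is odd."""
    return bin(word16 & 0xFFFF).count("1") % 2 == 1

data = 0b000000000000101                     # two 1-bits, so parity = 1
stored = (data << 1) | odd_parity_bit(data)  # parity in low bit (assumed)
assert word_ok(stored)
assert not word_ok(stored ^ 0b1000)          # any single bit flip is caught
```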

stuxnet79 2 days ago 4 replies      
Other than gaining satisfaction from the historical importance of this code, is there any conceivable way we can get some use out of it - like trying it out?

Even setting that aside, what is it I'm looking at? Assembly?

sixothree 2 days ago 3 replies      
What was the development environment like for this code?
hcrisp 2 days ago 0 replies      
"It is correct to say that we landed on the moon with 152 Kbytes of onboard computer memory." - Don Eyles

Ref: http://www.doneyles.com/LM/Tales.html


jonathankoren 2 days ago 0 replies      
I recommend reading Digital Apollo[0] about the development of the computer, and actually the entire man-machine interface of early spaceflight. The machines were made in the milieu where computer mediated control was highly controversial. (e.g. "A machine might work when everything is fine, but will never work in an emergency.") Essentially there was huge argument between pilots and engineers, about how much automation should be done. It was so bad, that pilots even tried to insist on flying the rocket into orbit. (If I recall correctly, in simulations in a centrifuge, only Armstrong was able to successfully not crash the Saturn V in a manually controlled ascent.)

The other recurring theme in the book is the disturbingly short MTTF for flight computers during the mid 1960s. Statistically, NASA had to plan for a computer failure in route to the moon, and so repair-vs-replace became a serious issue. (Yeah, they seriously considered soldering in zero-g.)

[0] http://web.mit.edu/digitalapollo/

fergyfresh 2 days ago 1 reply      
Alright, who wants to make a video game using this source code with me?
emcrazyone 2 days ago 1 reply      
I often wonder about the electronics of 1969 and what was done to mitigate radiation problems.

For instance, the type of memory was called core rope memory https://en.wikipedia.org/wiki/Apollo_Guidance_Computer

For anyone interested, XPrize winner Karsten Becker talks to popular youtube blogger David Jones about radiation, extreme heat & cold in space and specifically talks about bit flip and how electronic parts are sourced for such endeavors.


Interesting to me was the "paper work" cited in the interview for space harden components. In other words, people are concerned with stuff falling back to Earth (wouldn't it burn up?) or used for not so friendly purposes (war).

wepple 2 days ago 0 replies      
2.048MHz clock

16-bit wordlength

2048 words of RAM (4k 'bytes'/octets) using magnets?!

36,864 words of ROM

OK, this is actually a really interesting read: https://en.wikipedia.org/wiki/Apollo_Guidance_Computer
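Those figures are self-consistent; converting words to octets is just words times word length over eight:

```python
WORD_BITS = 16  # 15 data bits + 1 parity bit

ram_words, rom_words = 2048, 36864

ram_bytes = ram_words * WORD_BITS // 8  # the "4k bytes" above
rom_bytes = rom_words * WORD_BITS // 8  # 72 KiB of core rope ROM

assert ram_bytes == 4096
assert rom_bytes == 72 * 1024
```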

lukateake 1 day ago 1 reply      
I will now be adding this comment to all of my code: HONI SOIT QUI MAL Y PENSE


crocal 2 days ago 1 reply      
Is this what I think it is?


akshatpradhan 2 days ago 2 replies      
If this were to be rewritten in a high level language, I wonder what it would look like?
alehander42 2 days ago 5 replies      
people could've really used a higher-level language compiling to optimized AGC (Apollo computer) assembly. Is there any reason why they didn't develop one? It seems it would've helped tremendously with productivity and verification (and a lot of the explanations and equations would be readable as code, not as non-executed comments)
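For a sense of what even a tiny compiler targeting an accumulator machine involves: the sketch below emits real AGC mnemonics (CA = clear and add, AD = add, TS = transfer to storage) for a sum expression, but it is a toy that ignores everything that made real AGC code hard (15-bit ones'-complement words, memory banks, overflow handling):

```python
def compile_sum(dest, terms):
    """Compile dest = terms[0] + terms[1] + ... for an accumulator machine."""
    code = [("CA", terms[0])]               # clear accumulator, load term 0
    code += [("AD", t) for t in terms[1:]]  # add the remaining terms
    code.append(("TS", dest))               # store accumulator to dest
    return code

listing = compile_sum("DELTAV", ["VEL1", "VEL2", "VEL3"])
# [('CA', 'VEL1'), ('AD', 'VEL2'), ('AD', 'VEL3'), ('TS', 'DELTAV')]
```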
burnbabyburn 2 days ago 3 replies      
"#NOLI SE TANGERE" - this should be "noli me tangere", shouldn't it?
yeukhon 2 days ago 0 replies      
So how did this guy get the source code and why is he the one publishing it?
nerdy 2 days ago 0 replies      
How about the code apparently marked for deletion? https://github.com/chrislgarry/Apollo-11/blob/dc4ea6735c4646...

I've often wondered many things about the cleanliness, maintainability and style of such code (this particular system, in fact). It's fun to be able to actually poke through it.

strgrd 2 days ago 1 reply      
intrasight 2 days ago 1 reply      
No credit to Margaret Hamilton?
_pmf_ 2 days ago 0 replies      
If you want to be entertained really, really well, watch this: https://www.youtube.com/watch?v=4Sso4HtvJsw

It's an incredibly well done and at times hilarious narration of the moon mission. (Spoiler: contains a part where Armstrong overrides the automatic control and lands manually)

This is probably my favorite presentation ever.

userbinator 2 days ago 0 replies      
Schematics are also available for the hardware it runs on:


The CPU is built entirely from 3-input NOR gates.
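NOR is functionally complete, which is why a single gate type could build the whole CPU; a quick truth-table check:

```python
def nor(a: int, b: int, c: int = 0) -> int:
    """3-input NOR, the AGC's sole gate: 1 only when every input is 0.
    Unused inputs are tied low."""
    return 0 if (a or b or c) else 1

# Every other gate falls out of NOR, so one IC type was enough:
def not_(a):    return nor(a, a)
def or_(a, b):  return not_(nor(a, b))
def and_(a, b): return nor(not_(a), not_(b))

assert [not_(x) for x in (0, 1)] == [1, 0]
assert [or_(a, b) for a in (0, 1) for b in (0, 1)] == [0, 1, 1, 1]
assert [and_(a, b) for a in (0, 1) for b in (0, 1)] == [0, 0, 0, 1]
```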

j1vms 2 days ago 1 reply      
A less important aside: what license is this available under? Or what's the history behind this source release?

Edit: .s files indicate:

# Copyright:Public domain.

artellectual 2 days ago 0 replies      
Wow this is surreal. I can't believe I'm seeing this. It's historical, thank you for sharing.
bitfox 1 day ago 0 replies      
What if you receive a pull request?
therobot24 2 days ago 0 replies      
this is super cool, awesome post
vun87 2 days ago 0 replies      
it's amazing they ever got anything to fly
See the things youve searched for, visited, and watched on Google services myactivity.google.com
556 points by hokkos  3 days ago   275 comments top 62
tinco 3 days ago 7 replies      
Excellent, I like how it weaves my girlfriend looking for the perfect bra through my research on collision search strategies in physics engines.

Would Google in any way discern between those two sessions? Me sitting at the desk doing some hobby coding performing a query like "DBVTBroadPhase", her on the couch on my laptop looking for inspiration performing queries like "michelle branch instagram".

To a human it doesn't take much to jump to the conclusion that two totally different persons are using the same account at the same time.

But maybe that isn't even of interest to google. Me and my GF are basically one entity to them. We frequent HN and skinnycurvy.com, we like Michelle Branch and Steve Klabnik (;)), we do pilates at the gym while at the same time play underwaterhockey at the sports centre.

Any ads Google shows us might have a 50% chance of being shown to the right person, and that's probably good enough. Maybe they get lucky and I'll mention some women's brand I saw in an advert to my gf sometime.

1gn1t10n 3 days ago 4 replies      
Ahh... It's so satisfying to see the stream of everything I was doing on Android suddenly stop when I switched to CyanogenMod and didn't link my e-mail account.

The trick was to use K-9 Mail. Otherwise, when configuring the e-mail (Gmail), the default mail application adds the entire Google account and the link to the mothership is reestablished. Although I have installed GApps, I transitioned to a dummy account per device plus Xprivacy, plus NetGuard.

Long before Android, the stream had dropped to a trickle when I started sandboxing the Google account to a special session for Gmail. Everything else, searches, youtube went on an incognito window or to a separate Firefox profile.

I knew it to be effective from the constant e-mails I was getting that "Google does not recognize your sign-on". Guess what, Google, I want it that way! Now myactivity.google.com confirms it.

blfr 3 days ago 6 replies      
I'm always surprised how little Google manage to do with all that info. I have a personal Google Apps account and their absolute highest achievement from knowing nearly my entire location history over the past three years, searches, current position, having all the contacts, calendar, list of my apps, even purchase history (through email receipts if they wanted it) is Google Now occasionally correctly suggesting the next destination.

Other location suggestions? Crap unless explicitly entered in the Calendar. Article suggestions in Google Now? Crap. Despite having my entire Feedly info (300+ feeds) from which they pull most useless cards. Youtube suggestions in Google Now are somehow even worse than video suggestions in the Youtube app. Ads? Complete garbage and borderline fraud against all these companies paying to advertise mobile games which I almost never play and they know it.

Bizarrely, I have a Google Apps account at work with much, much less info and it's actually a little better.

oliwarner 3 days ago 5 replies      
"Only you can see this data"

Side-stepping for a minute that Google and governments can also see this data, this sort of wholesale data aggregation and presentation seriously ups the ante for account security.

Getting somebody's Google account from third-party breach-du-jour used to mean you got their email history, or could pretend to be them... But with this you have their browsing, app, search and location history. That is to say, you can discover: What they're doing. Where they're doing it. What they're thinking about (I search everything). Who they interact with.

Worse, I wasn't prompted for any sort of password. Physical access to my computer (and I assume phone) now gives easy access historical surveillance data.

Welcome to the new generation of identity theft.

falcolas 3 days ago 5 replies      
Funny - I remember opting out of all of Google's data collection years ago. I went there fully expecting Google to have no activity recorded, as I have seen when this page (or its ilk) have popped up in the past.

Much to my surprise, it was well populated again, with my web browsing/searching, YouTube, and location histories all turned back on again. I don't use the Chrome browser (aside from compatibility testing), and I don't use Android. I'd be very curious to know how all of this was re-enabled without my involvement.

Very unfortunate, and another straw for the camel's back.

bluegate010 3 days ago 2 replies      
This site exemplifies something that really annoys me with Google's material design framework. On my 13" screen, I can see a grand total of four items at a time: https://i.imgur.com/6YJxj0b.png

I really wish they'd consider information density as a plus when designing pages like this.

cdnsteve 3 days ago 2 replies      
I'm glad to see this data provided back to me, however, I'm very concerned about Google knowing absolutely everything I do while on the web and on my Android device.

- Used Messenger app, sent X messages

- Used maps, with location data, search data

- Use phone, with number of calls

- Used (Any installed app) included how often and what I did.

I've since turned everything off.

While browsing on my phone last night I saw a new Google.com feature where they were using my email address to try to sign me up for email lists at the top of search results.... Not cool.

makecheck 3 days ago 1 reply      
The frustrating thing to me about current sharing/observing of data is that everything is a trapdoor: if your data leaks out, there is probably no way to regain control (privacy panels are nothing but a feel-good measure that still depends a lot on bug-free software and the goodwill of others). You only need to fail to protect your privacy once, and after that it almost doesn't matter what you do.

We must rethink infrastructure to the point where the only data that we transmit is data that is inherently useless after a time. If I do something like revoke my key, it should be impossible for anyone to further use that data. Expiration dates should be baked directly into protocols so that revocation of keys and expiration are the same thing: either I revoke my key myself, or my software revokes it for me but no one else (and their buggy or insidious software) gets to decide how to respect the expiration time.
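One way to bake expiration into a protocol as makecheck suggests: authenticate the payload together with its expiry time, so stale (or backdated) data simply fails verification. A sketch, with the caveat the parent already implies: this makes old data unverifiable, not unreadable; actually destroying access needs encryption plus key deletion:

```python
import hashlib
import hmac
import time

def issue(payload: bytes, key: bytes, ttl_seconds: int):
    """Tag data with an expiry and a MAC over (payload, expiry).
    Sketch only: stale data becomes unverifiable, not unreadable."""
    expires = int(time.time()) + ttl_seconds
    msg = payload + b"|" + str(expires).encode()
    tag = hmac.new(key, msg, hashlib.sha256).hexdigest()
    return payload, expires, tag

def verify(payload: bytes, expires: int, tag: str, key: bytes) -> bool:
    if time.time() > expires:                  # expiry is part of the protocol
        return False
    msg = payload + b"|" + str(expires).encode()
    expected = hmac.new(key, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected)  # and is covered by the MAC

p, e, t = issue(b"location ping", b"my-key", ttl_seconds=60)
assert verify(p, e, t, b"my-key")
assert not verify(p, e - 3600, t, b"my-key")   # backdating the expiry fails
```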

jdimov10 3 days ago 3 replies      
"You can easily delete specific items or entire topics. You can also change your settings and decide what data gets associated with your account."

I guess that's nice. I wonder if they'd still keep the original data points.

laser 3 days ago 2 replies      
Funnily, I've gotten pretty used to seeing my logged browser history the past week or so, intermixed with a few friends', as I've built this webapp (http://www.websee.io/) that anonymously aggregates browsing activity from all its users and displays the content with the highest recent crossover, basically sharp peaks in the browsing distribution across all URLs.

It's odd to me that Google collects all this data, but doesn't really seem to offer any specific applications that depend on it, individually or in aggregate. Even though they have trends, they don't have any lists like Alexa of top traffic websites, despite the fact they have a better sample than anyone, and could make such information available.

I guess I can just hope that between the more "interesting" ads they show me and whatever other magic they use the data for to improve their services, that it's worth letting them have it.
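One simple way to surface "sharp peaks in browsing distribution" is to score each URL's recent hits against its usual share. This is purely a guess at the idea, with no claim that websee.io works this way:

```python
from collections import Counter

def trending(recent_hits, baseline_hits, min_hits=3):
    """Rank URLs whose recent traffic most exceeds their usual share.
    Illustrative only; not websee.io's actual crossover scoring."""
    recent = Counter(recent_hits)
    baseline = Counter(baseline_hits)
    scores = {
        url: n / (baseline[url] + 1)  # +1 smooths never-before-seen URLs
        for url, n in recent.items()
        if n >= min_hits              # ignore one-off visits
    }
    return sorted(scores, key=scores.get, reverse=True)

recent = ["hn"] * 5 + ["new-blog-post"] * 4 + ["mail"] * 3
usual = ["hn"] * 50 + ["mail"] * 40
top = trending(recent, usual)
# top[0] == "new-blog-post": four recent hits against no history at all
```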

codq 3 days ago 1 reply      
Extremely excited to learn that Google is tracking my DuckDuckGo searches from the ChromeBar.
whym 3 days ago 2 replies      
It seems that this page lists activities I have done intentionally, such as queries I entered.

I'd be much more interested to see what data I gave to Google unintentionally - e.g. what Google-provided ads I saw on non-Google sites, what sites with Google Analytics I visited, etc.

bikamonki 3 days ago 0 replies      
I immediately deleted all activity from all time; however, I do not trust that this was truly deleted, nor that it was never shared "anonymously" with third parties.
jasonkostempski 3 days ago 0 replies      
It seems every few weeks some new account tracking feature turns up that I have to clear and disable. Considering this had stuff I thought I cleared last time, I'm pretty sure "Delete" doesn't mean jack to Google. The only thing keeping me on Google is a pretty simple calendar and spreadsheet script I use to track my finances. If I can find an alternative to that, I'm Audi 5000.
mikegerwitz 3 days ago 3 replies      
Can we start a thread (this one) listing the data that people see available, for someone who doesn't have a Google account and is curious what sort of data are collected?

There's some stuff scattered throughout the comments, but it'd be nice to have a single spot.

oDot 3 days ago 1 reply      
A long time ago I opted out of Google Map's search history.

A few months ago I searched for a business on maps, and had to type the full name. The second time I searched for it, it came up the top suggestion.

I do not trust these "privacy features" since.

whamlastxmas 3 days ago 2 replies      
It's safer to assume everything you input into a device is being recorded regardless of settings. If I want privacy, I'll use Tails OS with Tor. Since that's a pain to use I just don't do or write anything on the internet I don't want tracked back to me.
rajathagasthya 3 days ago 3 replies      
Stupid question, but can Google (or Bing) associate what you searched for or which link you clicked in incognito searches? I've always wondered how truly incognito they are.
buremba 3 days ago 1 reply      
If I understand correctly, you can prevent Google from tracking your activity on this page: https://myaccount.google.com/activitycontrols
graeme 3 days ago 1 reply      
What are best practices for blocking tracking on iOS?

What I'm doing now is:

- VPN
 
- Logged out of Google Maps, YouTube, etc.
 
- Safari privacy settings to most private. Only exception was allowing cookies on current site only, rather than never
 
- Focus by Firefox with all privacy options checked
 
- DuckDuckGo for search.
Anything else?

Also, how do you do a non-indented list in Hacker News? I have never figured this out.


I added some stuff. I began using 1blocker. Disabled all cookies by default, but am whitelisting them on site only for a few sites I have to log in to.

Using this site to test. Now with Vpn on, only hardware, software and gyroscope are leaking. Not sure there is any way to block this on iOS.


58028641 3 days ago 3 replies      
I might switch to DDG or StartPage now.
benologist 3 days ago 1 reply      
This needs a much more efficient way to purge everything instead of the three-clicks-per-day.

Edit: there is: https://myactivity.google.com/delete-activity

drusepth 3 days ago 0 replies      
This is awesome. Crazy to be able to analyze activity so in depth.

Does anyone know if there are any opt-in data sources that feed into this? I'd like to keep it as comprehensive as possible.

richdougherty 3 days ago 1 reply      
I'm looking forward to using Firefox's new 'Contextual Identities' feature so I don't have to run separate browser profiles:


i2shar 3 days ago 0 replies      
I always browse and search incognito, lately on Opera with free VPN. That takes care of my desktop/work search history not being associated with my account/IP.

But what is particularly insidious is the mobile click tracking. I am wary of clicking news items or links in Google Now that might reflect on my intentions in an undesirable way.

dredmorbius 3 days ago 0 replies      
My history is refreshingly empty.

Or should I say "histories" -- there are multiple Google accounts involved, few having any variant of the names I'm known by in meatspace.

Still, for some reason, there's a bunch of Google Shit I Don't Use which I cannot simply get rid of or turn off. Instead I've got to laboriously go through each one (for each Google profile), and ensure that all tracking and history are disabled.

It'd be rather nice to have that fixed. After all, linking G+ and YouTube accounts just went so well, right?



mderazon 3 days ago 0 replies      
I'm really happy to see this transparency. I wanna see Facebook doing something like this
cloudjacker 3 days ago 0 replies      
From a UX perspective, I thought I was only searching for certain things on certain accounts, but saw the majority of those searches indexed under my employer's non-default Google Apps account.

Thats not what I was going for at all.

spaceisballer 3 days ago 1 reply      
The last data point they have from me is April 20, 2015. Maybe I locked down all my privacy settings then. Or I guess it helps to be using an iPhone.
ohitsdom 3 days ago 3 replies      
It's only showing YouTube activity for me. I must have some privacy setting on search that disables this, but I can't remember changing anything...
MOARDONGZPLZ 3 days ago 1 reply      
For those of you interested, and I didn't see it in the above link, this is a link to all your Google Location History, it must only be Android because I'm not seeing a lot of travel I did while I had an iPhone:


imron 3 days ago 0 replies      
Excellent. Clicking on the drop down allowed me to delete everything.

I thought I'd already turned everything off, but there were still a few youtube searches in there from several years back, I guess from before I'd turned things off fully.

jaseemabid 3 days ago 0 replies      
Got there, and deleted all of it. ~10 years of search, browsing history and all other activity.

I feel good.

stirner 3 days ago 0 replies      
They don't show you the dossier if you don't log in, even though they certainly have one.
exabrial 3 days ago 1 reply      
Given all this, why does Google News ABSOLUTELY INSIST I live in St. Louis when in fact I do not?
tlrobinson 3 days ago 1 reply      
Is there any way to dump the raw data?

My wife and I maintain a list of ridiculous oddball Google searches we've made in the past (no, you can't see it). It would be interesting to try to train an AI to find the ones we've missed...

fibbery 3 days ago 1 reply      
Wow the filtering controls are pretty terrible. No way to type in dates, for one.
geomark 3 days ago 0 replies      
I'm only seeing the very few times I've signed into Google to upload a video to YouTube. I infrequently use Gmail on my Android but no signs of it in my activity - aren't Gmail logins along with location logged?
bojo 3 days ago 0 replies      
Huh, I turned off web search tracking years and years ago. Glad to see they (apparently) honored that and the only thing that was being tracked was the random youtube visit.
sheeshkebab 3 days ago 2 replies      
Thanks - deleted all of my stuff and turned off all further collection.

gosh google...

dom96 3 days ago 1 reply      
Wow, my activity goes back all the way to 2006. While it is very interesting to see what I've been searching in 2006, it is also very scary knowing Google has this data.
AngeloAnolin 3 days ago 0 replies      
Just another way of saying that whether in the browser or on a mobile device, if you're signed in to Google, anything you do that touches Google is being tracked.
y04nn 3 days ago 0 replies      
When I want to download my searches, it only goes back 1 year. Does Google not keep more than 1 year of data? Are they required to remove data after 1 year?
tempodox 2 days ago 0 replies      
Hmm. I'd have to sign in, giving up even more data. I'd rather delete all of my cookies every day.
rampage101 3 days ago 0 replies      
That is so crazy: no login is needed if you are already signed into Google. I am also surprised they are sharing this feature with users.
tclover 3 days ago 0 replies      
"No activity. Some activity may not appear yet." :)
mirimir 2 days ago 0 replies      
I wonder how many Google staff can view these data for arbitrary user foo. Is LOVEINT an issue, as with NSA?
beyondcompute 3 days ago 0 replies      
I'd switch to Apple the month it assures me it doesn't store all that data about myself.
phil248 3 days ago 0 replies      
Great, let me waste some time analyzing an in-depth report on how I waste my time!
Aoyagi 3 days ago 0 replies      
Yeah, all I have there are three Youtube videos I watched in 2015 (even though I do use my Youtube account almost daily). I'm content with this.

What I'm not content with is that they're showing it in the wrong language, but I guess I can't ask for good web design from someone like Google.

sickbeard 3 days ago 1 reply      
More like "anyone can see this data if you don't log out"
lucaspottersky 3 days ago 0 replies      
Their date widget is ridiculous.

It doesn't allow you to type dates. :sad:

ausjke 3 days ago 2 replies      
Is there a panic button that I can press now to 100% opt out of this ?

Google is effectively transforming itself from an ad company to a big/huge brother, the real big brother that now knows everything I do. Facebook is no better either.

uberneo 3 days ago 0 replies      
An Analytics solution on top of this data would be amazing ..
uberneo 3 days ago 0 replies      
I wonder if we can download/scrape all the data..
supersan 3 days ago 1 reply      
I was not expecting to see my Reddit activity on that list.
jomamaxx 3 days ago 1 reply      
Does anyone not feel that this is deeply problematic?

What's worse is the faux 'Silicon Valley' ethos / koolaid they put on us.

A regular company doing this does it for obvious reasons.

But Google sticks to their 'Do No Evil' mantra, which I find entirely hypocritical: 'We say we are Doing No Evil, ergo, we are not doing any evil'.

I find it doubly disturbing.

They're even worse than classical companies, and yet somehow through their own branding koolaid would have us believe that they are 'more moral'.

vmateixeira 3 days ago 0 replies      
This is just what they want you to think they know...
mpitt 3 days ago 3 replies      
Wow, I know Google knows a lot about me, but apparently it can predict that I'll use Telegram on March 15, 2016 at 4:53 AM!

(Either that, or my phone briefly had a very incorrect time setting.)

reddit_clone 3 days ago 0 replies      
I use three different browsers.

Chrome : Work related applications only. Minimal extensions.

Firefox: Locked down. (NoScript, Privacy Badger, uBlock origin) For general web browsing and searching.

IE: For gmail and google apps and nothing else. I like to see google frantically suggesting I use chrome every time I access GMail!

lucb1e 3 days ago 1 reply      
I'm really happy to see that the last thing I did on any Google service while logged in, was June 27th. This means Self-Destructing Cookies is really working and all embedded Youtube videos in pages don't keep the cookie active.
bogomipz 3 days ago 1 reply      
This is horrifying. Are they recording everything even when you aren't signed in to your Google account?

Why do I need this at all? Why do I care about what I searched for 5 years ago? Time to move off of Chrome. Does all my data with Chromium get hoovered up as well?

Rudolf Kálmán Has Died hungarytoday.hu
545 points by szemet  3 days ago   68 comments top 19
hcrisp 3 days ago 2 replies      
The Kalman Filter was used in the Apollo 11 Guidance Computer [0] (discussed in the past on HN [1]).

As someone linked previously, here is a historical perspective [2], and a link to the actual state vector update computations [3].

The AGC maintained the state vectors for the KF. Ground control would run batch-mode least-squares solutions and pass them on to the LM, where the updates to the state vector would be applied by hand. The state vector held position and velocity in X, Y, and Z, with a 6x6 covariance matrix, or 9x9 when including radar/landmark bias.

I have great admiration for Mr. Kalman. Controls engineering has greatly benefited from his work.

[0] http://en.wikipedia.org/wiki/Apollo_Guidance_Computer

[1] https://news.ycombinator.com/item?id=8063192

[2] http://www.ieeecss.org/CSM/library/2010/june10/11-Historical...

[3] http://www.ibiblio.org/apollo/listings/Comanche055/MEASUREME...

RogerL 3 days ago 3 replies      
If you are interested in learning about them in depth, I'll toot my own horn and point you to my interactive book on them:


It uses Jupyter Notebooks to run code in the browser. Check out the book, or run online using Binder.

lacker 3 days ago 2 replies      
Kalman filters are really neat. I wrote one when learning C a while ago and it is just cool what some matrix math can do with practical data. https://github.com/lacker/ikalman

Although I guess I should have been calling it a "Klmn filter" this whole time, going by how HN stripped the accents from the title.

mnw21cam 3 days ago 2 replies      
I'm always surprised when people talk about Kalman filters without mentioning the killer app, which is weather prediction. Quite a few weather prediction organisations are at least experimenting with Kalman filters, and some are running whole ensemble forecasts using the method. It may sound strange, but the Kalman filter can be a less CPU-intensive (or at least more parallelisable) way of calculating atmospheric state than the current variational data assimilation methods.
tnecniv 3 days ago 0 replies      
I was waiting for this headline. Someone edited his wikipedia article last week, but provided no source and no articles were published yet.

The Kalman filter is a fantastic tool. I use it and the particle filter regularly. Both are ubiquitous in robotics and cyber-physical systems and incredibly powerful tools.

Interestingly, Kalman wasn't able to publish it in a prestigious journal at the time and had to settle for a lesser known one.

EDIT: Forgot to finish my sentence...

diydsp 3 days ago 0 replies      
For those interested in Kalman filters, this previous thread has a LOT of great pointers:


I also like this interactive tutorial: http://home.wlu.edu/~levys/kalman_tutorial/

pj_mukh 3 days ago 0 replies      
Flashback to tuning covariance matrices for hours until you get it juuuust right and the filter looks like it's working magic on your data.

What a giant of his field! Here he is receiving the National Medal of Science from Barack Obama[1]


dxbydt 3 days ago 0 replies      
With the massive resurgence in Neural Networks, it may interest you to know that DEKFs (decoupled extended Kalman filters) are a very fast sequential weight-update procedure for training large neural nets. For instance: http://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=614185
bmmayer1 3 days ago 2 replies      
> "emigrated to the United States in 1943"

It's always sad hearing about great minds who fled Europe shortly before or during World War II. It's a reminder of all the great minds whom we lost in the last war, and of how destructive war is to technological and human progress.

Qworg 3 days ago 0 replies      
I wrote my thesis on mathematical filters for robot localization. A true great has passed. =/

If folks are interested, here's some ancient (and likely awful) code I wrote for it. It has a MH set, a Fuzzy set, a PF and a SPKF as well: https://github.com/Qworg/Robot-Sensor-Fusion

univalent 3 days ago 0 replies      
RIP. For many many engineering students his work was the first introduction to something amazing that they could implement and understand.
platz 3 days ago 0 replies      
There's an unobserved state that changes over time according to relatively simple rules, and you get indirect information about that state every so often. In Kalman filters, you assume the unobserved state is Gaussian-ish and that it moves continuously according to linear-ish dynamics (depending on which flavor of Kalman filter is being used).
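To make that concrete, here is a scalar-state sketch of the predict/update cycle (my own illustration, not from the thread; `q` and `r` are assumed process and measurement noise variances you would tune for your problem):

```python
# Minimal one-dimensional Kalman filter: hidden state x (a random walk),
# estimate variance p, noisy scalar measurement z.

def kalman_step(x, p, z, q=1e-3, r=0.1):
    """One predict/update cycle for a scalar random-walk model."""
    # Predict: the random-walk model leaves x unchanged; uncertainty grows.
    p = p + q
    # Update: the Kalman gain blends prediction and measurement.
    k = p / (p + r)          # gain in (0, 1): trust sensor vs. trust model
    x = x + k * (z - x)      # pull the estimate toward the measurement
    p = (1 - k) * p          # observing the sensor shrinks the uncertainty
    return x, p
```

Tracking a noisy constant with this converges quickly; the extended and sigma-point variants mentioned elsewhere in the thread generalize the same predict/update shape to nonlinear dynamics.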
joezydeco 3 days ago 8 replies      
Just last week I needed to smooth out a display reading on an oven controller. The RTD was being read way too fast so I'd get a lot of flicker between values due to ADC resolution.

In the back of my head I remembered one word: Kalman. This line of code fixed it right up:

  static float display_temp = 0;
  display_temp += 0.04 * (adc_temp - display_temp);
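For what it's worth, that one-liner is a fixed-gain exponential moving average rather than a full Kalman filter (a steady-state Kalman filter with constant gain collapses to exactly this update). A Python sketch of the same filter, added here for illustration:

```python
# First-order exponential smoothing: each output moves a fraction alpha
# of the way from the previous output toward the new input sample.
# alpha=0.04 matches the 0.04 gain in the C snippet above.

def ema(samples, alpha=0.04, start=0.0):
    out, y = [], start
    for x in samples:
        y += alpha * (x - y)   # same update as the C one-liner
        out.append(y)
    return out
```

Smaller alpha means less display flicker but slower response to real temperature changes.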

texan 3 days ago 0 replies      
It's interesting that there is close to no personal background or information available on the interwebs.
TickleSteve 3 days ago 0 replies      
The accuracy of your GPS device has a lot to do with this man. His work has had an extremely wide impact.
ramisama 3 days ago 1 reply      
I remember using the Kalman Filter in my Advanced Macroeconomics class to calculate economic cycles out of the GDP data of my country!
ignorantguy 3 days ago 0 replies      
my undergrad project was on kalman filter. He is awesome.
bendykstra 3 days ago 3 replies      
Edit: This is wrong. This is a particle filter, another type of Bayesian filter. I can't delete now, so please downvote to hide.

I made a Kalman Filter visualization[1] last year to learn more about them. It's amazing to see how good a result you can get from very poor sensor data.

In the visualization, a lawnmower (green dot) is tracked (blue circle) using triangulation. The distance sensors have very low accuracy (grey regions). When the mower reaches the edge of the yard, its position and velocity are randomized, but the filter is not told, so it has to reacquire.

1. https://jsfiddle.net/bendykst/tfcub3tj/
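For readers who land here first: a bootstrap particle filter keeps a cloud of state hypotheses, weights them by how well they explain each measurement, and resamples. A deliberately tiny 1D sketch of that loop (my own toy, not the linked 2D triangulation demo):

```python
import math
import random

def particle_filter_step(particles, z, motion_noise=0.1, sensor_noise=0.5):
    """One cycle: jitter hypotheses, weight by likelihood of z, resample."""
    # 1. Motion update: spread each hypothesis by the process noise.
    particles = [p + random.gauss(0, motion_noise) for p in particles]
    # 2. Weight by a Gaussian measurement likelihood around the reading.
    weights = [math.exp(-0.5 * ((p - z) / sensor_noise) ** 2) for p in particles]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # 3. Resample: keep hypotheses in proportion to their weights.
    return random.choices(particles, weights=weights, k=len(particles))
```

Unlike a Kalman filter, the cloud can represent several candidate positions at once, which is what makes this family of filters attractive for localization with poor sensors.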

Bulgaria Passes a Law Requiring Open Source medium.com
535 points by bozho  5 days ago   86 comments top 12
bluetomcat 5 days ago 7 replies      
You wouldn't know the background motivating this decision unless you have been a frustrated user of the nearly non-functional software of Bulgarian state institutions.

Ludicrous amounts of money are paid by the government to a selected niche of companies for developing all kinds of useless websites which barely work under load and have abysmal implementations with blatant security holes. This law can act as a safeguard against such "epic failures", so that the taxpayers can be aware of what they are actually paying for. 300k euros for a static website? Let's hope it's over.

onestone 5 days ago 6 replies      
I remember the CEO of Information Services JSC (the de-facto Bulgarian monopolist in government software procurement), prof. Mihail Konstantinov, making the ridiculous claim on TV that "We can't release the source code of the elections counting software. Anyone who has the source can hack into the system, even children know that. If you don't understand that, you should tear your diploma". Glad to see that morons such as him will no longer have the final say.
emilecantin 5 days ago 0 replies      
> It means that whatever custom software the government procures will be visible and accessible to everyone. After all, it's paid for by taxpayers' money and they should both be able to see it and benefit from it.

I've been thinking that way for a long time, nice to see I'm not alone. Let's hope other jurisdictions follow suit.

chme 5 days ago 2 replies      
Would be nice if bigger nations like the USA, UK, or Germany would adopt this policy and have to open source the exploits and rootkits that were developed with taxpayers' money.

Open source XKeyscore, yay!

stanislavb 5 days ago 0 replies      
"The fact that something is in the law doesnt mean its a fact, though."..."companies will surely try to circumvent it."

Yeah, this is very well said. Most laws in Bulgaria are either not enforced or "avoidable" :)

breakingcups 5 days ago 1 reply      
This is very interesting, I wish more countries followed suit.

In my ideal fantasy world, at some point other countries might have a look at one of the open source projects of Bulgaria and collaborate when the goals align closely.

lamarkia 5 days ago 3 replies      
It mentions "OpenOffice", which is now defunct.

In any case it is good. Future procurements will show how well the law is applied.

donkeyd 5 days ago 2 replies      
I've personally seen the Dutch government spend millions implementing open source software. This was something that could've been fixed for a fraction of the cost using a closed source solution. After a couple of years, the project was canceled and the closed source solution was implemented anyway.

I'm not saying that using OSS is a bad thing. I don't, however, think that 'OSS only' is the solution to the problem at hand.

anaolykarpov 5 days ago 1 reply      
If Facebook, Google, Twitter and others are able to run their world-scale software on OSS solutions without being hacked, I am sure that OSS can power some national-scale software as well.
leandot 5 days ago 0 replies      
That is great news, hope it works out well.
blahi 5 days ago 2 replies      
Every law passed by Bulgarian parliament serves only one purpose - to put pressure on somebody, so people in the shadows can get a slice.

edit: A new government agency is tasked with enforcing the law

Ah, I see now.

jdimov10 5 days ago 5 replies      
I think this is a matter of perspective. I don't live in Bulgaria but my company has a pretty big lab in Sofia and I work with the engineers there constantly. They are excellent. I have had nothing but good experiences working with them. There are talented people around, and Sofia is pleasant and safe. The country itself is quite beautiful.

Bulgaria is not free of problems (see the corruption perceptions index [1]) but it's also far from doomed.

[1] https://en.wikipedia.org/wiki/Corruption_Perceptions_Index#2...

Writing a video chat application from the ground up, part 1 bengarney.com
572 points by kimburgess  2 days ago   53 comments top 18
perplexes 2 days ago 2 replies      
This is wonderful. I read the whole series in one sitting. It actually made video codecs feel way more approachable, rather than some patented black box magic I'll never understand.

It also reminded me of a recent article talking about how you can break audio codecs by guessing which quantizer was used by the packet, then using it in reverse to produce speech! Which I suppose is obvious in retrospect, that lossy codecs are trying to compress data by making it perceptually similar, whatever the domain.

I also appreciated the ties to video game networking. Gaffer on Games has had a long-running series on designing multiplayer networking protocols with UDP and you two approach bit-shaving very similarly (unsurprisingly I suppose - it's a very specific process with its own tools).

Anyway, thank you! I learned a lot.

munificent 2 days ago 1 reply      
My absolute favorite kind of writing is the kind that leaves me itching to try coding something myself since it now seems so much more approachable than it did before. Though I knew a bit about encoding, I never would have thought to build something like this, and yet now I find myself wishing I could carve out some time to try.

Great job!

Sanddancer 2 days ago 3 replies      
If I may, I'd like to inject a plea for sanity here. Please, please, pretty please with sugar on top, don't reinvent the wheel when it comes to chat protocols. Right now, on my desktop, I have 5 different windows open dedicated to various chat networks and chat protocols. I have steam chat, a window with my irssi session, a pidgin session with connections to a slack session and an aim session, a telegram session, and a skype session. Yes, implementing someone else's protocols is complicated and sometimes painful, but when you end up needing a half dozen different /types/ of connections, there is something wrong and broken with how we're approaching the whole talk to other people thing.
Someone1234 2 days ago 1 reply      
So all four parts are available, just click on the link in the last paragraph to go to the next.

I love projects/blogs like this, since it is "back to the basics" and we all learn something by better understanding how things from codecs, to compression, and so on work. This one is wonderful and one of the best reads this week.

sim- 2 days ago 0 replies      
While TCP is not ideal for this application by nature of it trying to be a fully reliable stream protocol, one often overlooked advantage of its congestion control is that it allows the stream to play nicely with others. For example, if you develop a datagram-based transport whose data rate results in congestion at some point in the network path, any TCP going through the same point would back off to nearly nothing in an attempt to save the link.

You can be greedy and take the bandwidth anyway at the expense of everything else, but in some conditions this may cause a worse outcome even for your own traffic. It's likely better to change your data-rate target and drop rarely than to send too much and drop randomly at a higher rate.
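The back-off behaviour being described is TCP's additive-increase/multiplicative-decrease. A toy model of just that dynamic (my own sketch; a real stack reacts to loss and RTT signals, not a fixed capacity number):

```python
def aimd(steps, capacity=100.0, increase=1.0, decrease=0.5, cwnd=1.0):
    """Toy AIMD: probe the link linearly, halve on a congestion signal."""
    history = []
    for _ in range(steps):
        if cwnd > capacity:      # congestion signal: we overshot the link
            cwnd *= decrease     # multiplicative decrease backs off hard
        else:
            cwnd += increase     # additive increase probes gently
        history.append(cwnd)
    return history
```

A greedy fixed-rate sender never executes the decrease branch, which is why it starves competing TCP flows that do.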

fovc 2 days ago 0 replies      
The series was amazing. Could not stop clicking through. I was sorely let down at the end when there was no more to read. Also when it didn't turn into an open-source high-performing P2P video conferencing app :)
ape4 2 days ago 1 reply      
Like the idea of the codec: trying to minimize the error
CorvusCrypto 2 days ago 0 replies      
This intro (and series as a whole) was AWESOME! My background didn't really touch on compression at all and these parts were what I really loved learning from the post. More please! (Any other resources for compression are welcome :D)
lisper 2 days ago 0 replies      
Same story submitted eight hours earlier: https://news.ycombinator.com/item?id=12047710#12050483
dopeboy 2 days ago 0 replies      
Really neat. Reminded me of a live stream video conference system I built for an embedded systems course a long time ago: http://www.cs.columbia.edu/~sedwards/classes/2009/4840/repor...

As far as I remember, bandwidth was more or less unlimited. It was the nasty syncing issues that I remember giving me nightmares.

cm3 1 day ago 0 replies      
Regarding overhead of H.264 and VP9, using Intel's QuickSync or AMD's VCE would make sense in a production version, in order to have a fast implementation. That's VAAPI (and maybe VDPAU) on Linux and BSD. The encoders' output will look good enough for video streaming.
ausjke 2 days ago 1 reply      
Are there any code available somewhere and is this Windows-only?
fmilne 2 days ago 1 reply      
Thanks for writing this, I learned a lot! Last weekend I started a peer-to-peer group video calling project. Seeing your whole approach made the entire system easier to understand.
Keyframe 2 days ago 1 reply      
imgui is great, but I have a bit of an allergy to C++. There's Nuklear https://github.com/vurtun/nuklear which is a re-take on it, but in ANSI C. It's interesting to see GUI rendering take so much of your processing time slot, or is it that everything else is so little?
ww520 2 days ago 0 replies      
This is excellent. It's not often you see such an in-depth technical post.

The Dear ImGui library looks excellent in its simplicity.

felix_thursday 2 days ago 0 replies      
Wow, this is incredible. Nice, succinct post, too.
zk00006 1 day ago 0 replies      
Inspired by Silicon Valley?
joncrane 2 days ago 1 reply      
When I read the headline all I could think about was Dinesh from Silicon Valley writing a blog and bragging about his video chat project.
U.S. Bans Theranos CEO Elizabeth Holmes from Operating Labs for Two Years wsj.com
413 points by kevinnk  1 day ago   268 comments top 23
bane 1 day ago 3 replies      
I hope that Holmes' bizarre behavior as the issues went public keep her from ever running anything even remotely close to things that matter for people's lives and well being.

Even as the questions were building and the noose was tightening, she would announce a public talk to reveal and openly share information on the company's tech, then mysteriously cancel a short while later -- she did this over and over.

I'm still not convinced that Theranos was meant to be a scam, or at least not a scam in the way most people are thinking about it. But it has definitely produced a similar output, and that makes it functionally equivalent.

I hope, really hope, that someday the true story about all the WTF-ness around Holmes and Theranos comes out.

I've been on the inside of a company like this once and I ran away as soon as I realized the place was up to no good. What still bothers me about my personal experience is that I, even as a person on the inside, still don't know the truth about that company due to all the same weird kind of cult of secrecy things we've seen at Theranos. The truth in these kinds of things, I suspect, lives only inside the heads of the people who run these kinds of organizations and it may not ever be possible to get at what the truth actually is depending on how far down the delusion hole they've fallen.

Real kudos to the press who broke and made very public the stories. This was the media at its finest. Those journalists may have helped save many lives.

kumarski 1 day ago 5 replies      
I've talked to over 100 hematologists.

This business didn't pass a basic litmus test of objective criticism from people who work in the space.

There seems to be this bravado among founders who believe they're sticking it to all the people who say something's amiss. I think if there's an elephant in the room though, it probably should be assassinated with a huge body of transparent evidence.

deftnerd 1 day ago 4 replies      
Sometimes company leaders have severe problems with ethics. As mentioned in other comments, even YC has had companies that have bundled adware/malware with software, practiced dark patterns, hidden news of security breaches, etc.

I tend to see YC as being more "evolved" than other VC's, but I also think that these problems are more ingrained into the human condition.

YC could consider making funding contingent on all high-level company officers attending or participating in some kind of ethics course. It could even be remote.

There could also be penalties built into VC agreements. If a company violates contracted ethical rules, then penalties could include replacement of staff, more shares being given to the VC, low share buyback prices for the VC, etc.

It could also be a two-way street. If YC violates some kind of ethical rule, then they could be punished by having to do something for the companies they represent.

Ethics are important but so far they have just been "best practices" in our industry and not something contracted and enforced.

arcticbull 1 day ago 5 replies      
Hear that? That's the sound of $9B in valuation disappearing :| There is really a thin line between delusion and brilliance.
fnbr 1 day ago 4 replies      
I have no sympathy for her. It's incredibly irresponsible to do what she did: play marketing games with medical technology. Theranos' unethical behaviour has made it much more difficult for future innovation in the medical field, and has potentially cost lives. This sanction is appropriate.

My fiancee works as a laboratory scientist in a hospital conducting patient sample testing, and she and her co-workers take their work incredibly seriously: checking and re-checking their work, with complex protocols to guarantee the accuracy of their testing. It's disappointing to see that same attitude lacking in Theranos.

joeyrideout 1 day ago 1 reply      
Ouch. Sanctions mean "shutting down and subsequently rebuilding the Newark lab from the ground up, rebuilding quality systems, adding highly experienced leadership, personnel and experts, and implementing enhanced quality and training procedures". There is a chance to appeal, but "such appeals have rarely succeeded in the past". Couple that with an "unspecified monetary penalty" and this looks like a very big nail in the Theranos coffin.

Side note: Anyone else have trouble viewing the WSJ article? I had to read the full text through a private news outlet, even though I tried signing in to WSJ with Facebook :/

Edit: Added a sentence of detail to the last paragraph, for those who still can't see the article.

Animats 1 day ago 4 replies      
The WSJ article and the Theranos press release [1] don't agree. There's no mention of the 2-year ban in the press release. The press release indicates it's business as usual for Theranos at their Arizona lab. Supposedly only their Newark (CA) lab is being shut down.

[1] http://www.businesswire.com/news/home/20160707006570/en/Ther...

redmaverick 1 day ago 0 replies      
The difference between a startup like Theranos/uBeam and a product-based "soft"ware company is that in one case you can make outrageous claims and then use sheer determination, will power, and lots of money to make them happen retroactively, but you cannot change the fundamental laws of nature.
mkagenius 1 day ago 0 replies      
little more text than wsj (unsubscribed): http://medcitynews.com/2016/07/cms-fines-theranos/
dcgudeman 1 day ago 2 replies      
I wonder how long she will be able to hold the CEO position?
aabajian 1 day ago 1 reply      
The trajectory of this company is just terrible. They surmounted the most difficult obstacle in healthcare - breaking into existing strongholds (e.g. Walgreens). At that point if there was any doubt that their technology worked they should've used existing tools to run their lab tests. Yes they would've lost money, but they could have used the time to build out their rapid technology or pivoted to a different business model. Getting a contract with Walgreens or any major vendor in the healthcare space is an incredible accomplishment, but such unethical behaviour will make it even harder for future startups to secure such partnerships.
josh_carterPDX 1 day ago 0 replies      
This is a great example of needing to do more due diligence. Walgreens and all of the investors were swept up by the whole "disruption" rhetoric. They should have spent more time understanding how Holmes and her team were going to handle compliance. It's clear none of that was done. This is what happens when disruption pushes back.
sndean 1 day ago 4 replies      
If they're going to ban her from operating labs for two years, does that mean they think she'll be, in some way, rehabilitated and capable of soundly running labs after that?

Maybe there is little precedent, but that length of time seems a bit arbitrary (and too short?).

jasonlaramburu 1 day ago 1 reply      
The article mentions that the company's current governance structure may prevent the board from terminating Holmes. Does that mean she is the majority shareholder?
rgbrenner 1 day ago 1 reply      
The article says it isn't clear what the monetary penalty would be, but the letter CMS sent to Theranos in March (re the Newark lab) proposes a $10,000/day penalty for noncompliance that would continue until the lab is brought into compliance.


Fede_V 1 day ago 4 replies      
I'm usually a huge fan of pg, but in his essay about founders, there was one section which I wasn't very comfortable with: http://www.paulgraham.com/founders.html

Specifically, when he talks about naughtiness, he says:

Morally, they care about getting the big questions right, but not about observing proprieties. That's why I'd use the word naughty rather than evil. They delight in breaking rules, but not rules that matter.

I have absolutely no interest in trying to play language games - and maybe I'm misunderstanding what pg is saying, but it has always seemed to me that the judgement of whether the rule that was broken was consequential or not is post-hoc. For example: had airbnb gotten into serious trouble and floundered after they started scraping craigslist for listings, they wouldn't be clever and naughty, but reckless and foolish. Had zenefits managed to grow even more or hire some key lobbyists and get the law changed in time, their CEO would be hailed as a visionary genius who cut through pointless red tape.

Anyway: the reason I brought that up is that a lot of the ethically dubious things that Elizabeth Holmes did are very similar to things which a lot of tech companies did at one time or another. Trying to push ambitious young men and women to look at rules and regulations as something they should take pride in hacking and bypassing is a dangerous game - even more so in fields that are highly regulated.

Addendum: part of the original hacker ethic was to ignore stupid rules. For example, we take delight in Feynman cracking safes at Los Alamos, or finding some clever hack to bypass a pointless procedure. However: I think it's one thing to hack a system to make a point about how stupid it is, but it's completely different if you add a monetary incentive, and suddenly the rules that get broken are those that stand in the way of you making money. The two sets have a fairly small overlap.

Addendum two: in a complex society like the one we live in, we have a lot of dumb rules. I'm not trying to defend them - we should obviously get rid of them, even in healthcare. A lot of economists have written intelligently about how to make the approval process of the FDA more agile.

yeukhon 1 day ago 0 replies      
And she went to Stanford? What a joke. Seriously, I am tired and sick of government and regulators not sending people to trial at all. I was watching Elizabeth Warren's hearing in the Senate on finance issues, and she brings up a good point. Wall Street banks have not been punished hard enough. In cases of evident and deliberate fraud and cover-up, no one was sent to trial. This is stupid.

FYI, I am not the kind of guy who goes around preaching about reform. But this shit is stupid as hell. I think major offenses like these should be sent to trial or even require a congressional hearing and congressional punishment.

tardo99 1 day ago 0 replies      
Still waiting for the perp walk...
acosmism 1 day ago 0 replies      
hopefully forever!
abpavel 1 day ago 0 replies      
I've personally heard Tim Draper blaming regulation for her downfall. So in the eyes of SV giants, it's us and the government who are delusional, and we should let her do her thing.
dannylandau 1 day ago 1 reply      
Schadenfreude is probably merited in this case, but I for one think one needs to consider that this could quickly spiral down and end up in Holmes taking drastic action that we will all regret. Hope my meaning comes through here.
gravypod 1 day ago 3 replies      
I cant read this article. Can someone provide some backstory? What is Theranos? Why was it shut down?
No grades, no timetable: Berlin school turns teaching upside down theguardian.com
470 points by passenger  5 days ago   242 comments top 29
ada1981 5 days ago 7 replies      
Sudbury Valley Schools are the gold standard as far as I can tell from my research. (I spent a number of years as an education/students rights activist; built and ran the country's most famous education blog back in 2009; and have otherwise been doing advocacy work for a couple decades)

They are fully democratic and focus on producing empowered self directed citizens who understand how to share limited resources and cooperate.

The kids have as much power as any adult or teacher, which I really love -- they say you can't make someone partially equal -- either you share power with them or you don't. No grades or grade levels, students decided what to focus on. Great mini documentary on the website to watch as well.


bbayles 5 days ago 4 replies      
My partner became involved in Montessori a few years ago, and I've been really impressed with how it works.

If you don't know much about Montessori, here is my "in a nutshell" version: It's a method of schooling usually associated with preschool and elementary education. There are mixed age groups (e.g. 3 to 6) in classrooms. There is at least one adult who gives one-on-one lessons on how to work with "materials." The materials do the teaching, not the adult.

There are hundreds of different types of materials. They're designed to teach or exercise a particular skill, but they look like games and are all designed to be "beautiful" as to entice children to them (Montessori is really serious about this; materials are not allowed to be broken or chipped or worn). They follow a particular progression. Children may work with any material for which they've had a lesson. They can work with it as many times as they like.

My favorite example of how Montessori works is how it develops reading skills. First a child is introduced to materials that involve very short crayons - when they use them they strengthen the muscles in their hands. Then they're introduced to letter-tracing materials. Then they're introduced to letter/sound matching materials. Children who follow these lessons wind up writing first, and then reading follows very naturally. The emphasis on developing physical capabilities first really demonstrates the attention to detail that's typical of Montessori.

The math education progression is impressive also. There are materials that have children doing proto-multiplication, exponentiation, algebraic manipulation, and more. My partner developed a material that teaches counting, addition, and subtraction in base 8. (Her 4 and 5-year-olds understood it much more quickly than her colleagues!)

There are parts of the method I don't like (e.g. dogmatic resistance to rewarding performance, for example), but overall it feels like a huge improvement over typical early childhood education.

gwern 5 days ago 0 replies      
" Year after year, Rasfeld's institution ends up with the best grades among Berlin's gesamtschulen, or comprehensive schools, which combine all three school forms of Germany's tertiary system. Last year's school leavers achieved an average grade of 2.0, the equivalent of a straight B, even though 40% of the year had been advised not to continue to abitur, the German equivalent of A-levels, before they joined the school. Having opened in 2007 with just 16 students, the school now operates at full capacity, with 500 pupils and long waiting lists for new applicants.

Given its word-of-mouth success, it is little wonder that there have been calls for Rasfeld's approach to go nationwide. Yet some educational experts question whether the school's methods can easily be exported: in Berlin, they say, the school can draw the most promising applicants from well-off and progressive families. Rasfeld rejects such criticisms, insisting that the school aims for a heterogeneous mix of students from different backgrounds. While a cross adorns the assembly hall and each school day starts with worship, only one-third of current pupils are baptised. Thirty per cent of students have a migrant background and 7% are from households where no German is spoken."

OK, Rasfeld. If you believe you are working this magic with normal students, switch to lottery admissions, and provide pre-admission grades, IQ scores, parental incomes or language or country of origin, and we'll see how much of that outperformance remains after controlling for baseline characteristics and comparing the lottery winners with losers...

_petronius 5 days ago 1 reply      
Minor nitpick: "evangelical" is a bad translation of the German "evangelisch" (at least in the sense that many Americans think of "evangelical"). In the German context, this is better glossed as simply "Protestant".

(And the German Protestant church, at least here in Berlin, is a vastly more liberal and progressive organization than someone who grew up in the American South, as I did, would imagine.)

gurkendoktor 5 days ago 5 replies      
Every time I come home from a mind-numbing eight-hour Scrum meeting, I feel frustrated for having accomplished absolutely nothing. Then I realise that this is what school was like, every single day.

I'm sure some people prefer to be force-fed instead of learning on their own, but self-organised schools as in the OP should at least be an option for people who know what they want.

vinceguidry 5 days ago 5 replies      
Education is extremely political. It's not that we can't come up with better ways of occupying school-age children. Montessori has been around since the early 1900s!

The problem is convincing everyone they need to adopt a new method, which is politically impossible compared to making small tweaks to the system already in place.

Nobody is going to turn education upside down. Ever.

rweba 5 days ago 1 reply      
The bottom line is: how well do these "progressive ideas" work in practice? I couldn't find much evidence online that they're more effective in general than the "traditional" approach.

I have been a college professor for 7 years, here's what I have been able to PERSONALLY observe in that time:

(1) Frequent, regular feedback (quizzes, tests, homeworks, projects) helps a lot for most students. If you just have a final at the end the majority of students will finish the course having learned a lot less.

(2) Obviously students learn more when they're personally engaged and interested in a topic rather than just doing it to get a grade. But getting them excited is not straightforward, and engagement varies in predictable ways based on previous background knowledge and aptitude for the subject. A lot of the extra effort a teacher puts in is ultimately to try to get students more excited about a subject.

(3) One-on-one time with a teacher can be very helpful for students. It will almost certainly produce a noticeable improvement in subject understanding, particularly for those students who are motivated but struggling a little bit. Unfortunately this requires a lot of time from the professor, so it's not really practical except with small classes and a small teaching load.

HillaryBriss 5 days ago 3 replies      
> The pupils decide which subjects they want to study for each lesson and when they want to take an exam.

So the student learns what they want to learn when they want to learn it. The student focuses on a subject when the brain is ready and interested. Sounds efficient.

OTOH, one of the really valuable things about a curriculum is that it serves as a guide to a complex and bewildering subject.

A curriculum, at its best, is like a highly knowledgeable person telling a novice: "Study these nineteen subtopics and you'll grasp this field pretty quickly. If you study these other nineteen subtopics, you'll just waste a year of your time and never really get a clue." Which also sounds efficient.

How to reconcile?

sandworm101 5 days ago 1 reply      
> Year after year, Rasfelds institution ends up with the best grades among Berlins gesamtschulen, or comprehensive schools, ...

So there are still grades, or at least someone is still testing the kids. This isn't utopia yet.

>> each school day starts with worship.

Sorry. Game over imho. I am not so put off by the specifics of the religion, but by its presence muddying the academic waters. A school religion means a distinct value system, and associated enforcement mechanisms, over and above what is available at standard public schools. That means the academic achievements of this school may not be replicable elsewhere, as the school's faith may be playing a large role in student motivation.

I once read a paper on whether Hogwarts was a faith school. The theory went that faith schools are characterized by the fact that teachers double as faith leaders, as teachers of morality and dogma, something that does occur at Hogwarts. This motivates students in a manner not available where there isn't religious commonality between students and teachers. The problem is that the scheme breaks down once the kids realize their teachers are but human, flawed and sinful as anyone else. Let's call that the Krabappel effect. Then the religion becomes a reason to distrust teachers, to break ranks, even to work against them ... like at Hogwarts.


As I am not allowed to reply (thanks for that btw) I'll edit in my reply here.

My comment above is not to whether religion is a good/bad thing, but to the fact that the presence of religion at this school makes it less likely that its results can be replicated at other schools with either different or no religion present.

stared 5 days ago 1 reply      
An obligatory reference:

A. S. Neill, "Summerhill: A Radical Approach to Child Rearing" (1960), especially chapter 1 (about the Summerhill school itself). The school started in the 1920s, and sadly, we've made no progress in that direction in the last century.

Link: https://trisquel.info/files/summerhill-english_1.pdf

ThomPete 5 days ago 1 reply      
It really comes down to the children. If they have a lot of self-discipline it's great; if not, they are better off at more strict schools.

At the end of the day though, parents and their indirect involvement is the key to a proper education no matter what educational philosophy one follows.

mvdwoord 5 days ago 1 reply      
I went to a Montessori school for a while. Much like what is described here. For me it was fantastic and I sometimes lightly regret not continuing in high school due to (minor) practical obstacles. That said, I believe these types of systems are definitely not for everyone. Lots of kids are probably better off with a bit of structure and imposed discipline.
ayylmao907 5 days ago 1 reply      
I have to say I'm of the opinion that modern education is terribly inefficient, and to back my claim I offer that most people who are any good get there not by following a curriculum but by having an interest which they explore on their own. But that's just on the supposed goals of education. I feel the important achievement of modern school systems is the obedience of the populace, which from a historical perspective is quite a feat really, if far from their professed aim.
wapapaloobop 5 days ago 5 replies      
We school-educated people associate learning with authority and it's hard for us to truly grasp that education takes place most efficiently under conditions of freedom. Repealing Hitler's 1938 law against home education would be an important step for Germany.
nottoday88 5 days ago 2 replies      
Is this a new idea or a rebranded old one? Summerhill school?
dschiptsov 5 days ago 0 replies      
The essence of teaching, it seems, is to assist students with theoretical knowledge while they are learning by doing the right things. We evolved to learn by doing (beautiful studies of how children in remote rural areas of the Himalayas and Mongolia learn their native language without being taught, by mere exposure, are nice evidence).

The second factor is being taught the right things - the first principles, fundamental ideas with no-nonsense examples. This is why the classic MIT and Berkeley Scheme courses based on SICP were vastly superior to the modern "pragmatic" Python or Java crap. Brian Harvey's CS61A is a gold standard.

Schedules and grades are of secondary importance. The proof is all the self-educated people who picked up knowledge without attending any high school. Moreover, in many cases a municipal primary school, being much like an overcrowded prison-like facility for kids from impoverished families, did more damage than good by imposing wrong habits and impressions of what education is about.

Learning is a continuous and natural process for us and other higher animals. Removal of obstacles, distractions, and idiots, and exposure to the right principles and ideas, will do much more than all the micro-optimizations combined. "Good schools" like MIT or Yale have proved it many times.

There is also Pirsig's book.

alejoriveralara 5 days ago 0 replies      
Incredible! Sounds very similar to what schools like Acton Academy and Talent Unbound are doing in the US. I have no doubt that the future of education will come from private schools made by entrepreneurs re-imagining schools from the bottom-up.

I opened the second campus for Talent Unbound in Houston about a year ago, and am now working on building software to help power more of these schools. It's very exciting to be a part of what seems like a world-wide learning revolution!

nippples 4 days ago 0 replies      
I don't think you need to be a helicopter parent to be somewhat wary of the description of this. There's plenty of value in standardized education and a scoring system.

It doesn't need to be so rigid as traditional schools, but some metric, even if very abstract and vague, that helps us identify that there are possibly important gaps in the kids' knowledge is pretty damn useful.

JoeAltmaier 4 days ago 2 replies      
I don't understand the 'no timetable' part. In my experience kids will do nothing if no deadline is presented. How do schools avoid some kids diddling away the semester? Or is it just OK for some kids to fail?
up_and_up 4 days ago 0 replies      
AMA: We are secular homeschoolers and primarily use self-directed learning as our teaching method. Unschooling/self-directed gives our children extreme freedom to pursue their interests, encourages deep play-based learning, and instills a deep level of self-confidence in them.

Most of our homeschooling friends also use this method of "teaching". We have really become more like facilitators for the next big project they feel like doing.

ommunist 5 days ago 0 replies      
It depends on perspective. Getting an education, acquiring skills, or building a better network is the same old combination of carrot and stick anyway. Those instruments have just happily passed from teachers to the students themselves. Time will tell who uses these precious things more wisely.

Full disclosure - I am a big fan of didactics in delivering knowledge, since only applying the proper level of didactic pressure creates the right chemical background for the application of nootropics.

frozenport 5 days ago 2 replies      
I went to one of these as a child and learned nothing. It became a real struggle to learn in a time efficient manner. I would strongly discourage parents from choosing this kind of education for early learners as it doesn't include various social skills like how to sit in a classroom, address a prompt, etc.
anotherhacker 5 days ago 0 replies      
A great primer to this type of philosophy (no grades for kids) is Deming--from about 40 years ago.

"A Theory of a System for Educators and Managers": https://www.youtube.com/watch?v=2MJ3lGJ4OFo

2suave 5 days ago 3 replies      
The title should read "No grades, no timetable, and no degree". As much as I like the idea behind it, it's simply not a feasible concept in today's society and its competitive environment. In fact, I'd even wager that the pupils of that school will be ridiculed by their peers for attending such a school. Not to mention that admission boards of prestigious universities would never give them the time of day. With such an "education" it's literally impossible to compete in today's society.
programminggeek 4 days ago 0 replies      
Sounds a lot like many homeschooling approaches done well in a non-home environment. Interesting.
lllorddino 5 days ago 1 reply      
> such as coding a computer game instead of sitting a maths exam.

Okay I suck at math but am decent at programming and the last time I checked there is a bunch of math in game development.

I agree that some subjects in school are useless, but if you give a kid the option of doing whatever he wants instead of learning math (an option most would take), then I'd say your school system sucks.

id122015 4 days ago 0 replies      
If I were still in school, I'd have wished to be able to choose which subjects I wanted to be graded on. But I'd wish for no grades in humanities like history, literature, and geography.
pinaceae 5 days ago 1 reply      
so, this method is great for the outliers, kids like hawking or tony hawk - kids that are driven by themselves, don't need no guidance, help, discipline.

there is a reason asian kids outperform others, and it's not montessori.

and yes, school is about performance as kids will grow up and will need to be able to earn for a living. work, compete.

school is optimized for the average kid with average parents. all the HN outliers don't apply.

i would opt for full day schools in shitty areas, get the kids away from their parents as much as possible. off the street. teach them.

themartorana 5 days ago 4 replies      
This is a child being disrespectful on a monumental level, not outsmarting anyone. It also sounds like her parents are paying a great deal of attention.

She may also be a classic psychopath.


Felony An open-source PGP keychain github.com
469 points by henryboldi  5 days ago   228 comments top 36
henryboldi 5 days ago 12 replies      
Hi I'm Henry, the creator of Felony

I've had a passion for politics, history, and programming since the age of 12 growing up in a suburb of Chicago. During my freshman year, I developed an interest in software. A couple of apps and hackathons (programming competitions) later, I was working on my own startups when I made the leap to drop out of high school to become a software engineer at a venture-backed tech startup.

While working there I learned that PGP encryption was the tool used by Edward Snowden to securely send messages to journalists. The immense value of encryption as a core component of our free society became clear to me. Amongst fellow coders, I had no trouble using command-line encryption to communicate. But my friends who didn't code couldn't easily do the same since they don't know how to use the command line. Given how important encryption is, I decided to build a first-rate encryption tool that could be used by anyone on any website, regardless of background.
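
For readers who haven't seen it, the command-line workflow being described looks roughly like this - a minimal sketch assuming GnuPG 2.1+ is installed; the identity and filenames are made up for the demo:

```shell
# Use a throwaway keyring so nothing touches ~/.gnupg
export GNUPGHOME="$(mktemp -d)"
chmod 700 "$GNUPGHOME"

# Generate a keypair non-interactively (no passphrase - demo only)
gpg --batch --pinentry-mode loopback --passphrase '' \
    --quick-generate-key alice@example.com default default never

# Encrypt a message to Alice's public key...
echo 'meet at noon' > msg.txt
gpg --batch --trust-model always --encrypt \
    --recipient alice@example.com --output msg.gpg msg.txt

# ...and decrypt it with her private key
gpg --batch --pinentry-mode loopback --passphrase '' \
    --quiet --decrypt --output decrypted.txt msg.gpg
cat decrypted.txt
```

Every flag above is a real GnuPG option, but it takes several of them just to keep gpg from prompting interactively - which is exactly the friction a GUI keychain tries to remove.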

michaelmior 5 days ago 2 replies      
Mods, could we have something descriptive added to the title? This single word doesn't really give me any idea what this is about. Suggestions (taken from the link):

Felony: Next Level PGP
Felony: An open-source PGP keychain built on the modern web

knorker 5 days ago 1 reply      
This name is awful. I would never want to contribute to it, nor use it. Nor suggest it to anyone as a solution to anything.

It's the worst name since that framework called "cocaine" with tools and subprojects named after illicit drug market terms.

Yeah, "felony" and "cocaine" are not things I will put on my CV or would like to show up when someone Googles my name.

What's the joke here? That some people are incorrectly labelled felons for what they say and write?

Do you know what most "felons" did to be called that? It's not for saying or writing things that should be constitutionally protected.[1]

[1] I don't have numbers to back this up. Maybe most people are actually felons for drug possession, but you know what? I don't want to be associated publicly with those actions either. Also do you want to be on this table? https://www.fbi.gov/about-us/cjis/ucr/crime-in-the-u.s/2015/...

Violent crime, Murder, Rape, Robbery, Property crime, Burglary, Larceny-theft, Motor vehicle theft, Arson

callumlocke 5 days ago 2 replies      
Although the name is ironic, it will reinforce the common vague notion that encryption is something politicized/controversial/illegal, and that's not a good thing for infosec.

Looks great otherwise.

danso 5 days ago 0 replies      
I've heard about optimizing for developer happiness, but this is kind of silly.

- the app has an unintuitive and harmful name that casts aspersions on the core values it purportedly touts because the developer saw that it was an available .io domain [0]

- This app has a shitton of leftover boilerplate and dev dependencies from a bootstrap scaffold, even though AFAIK there is no testing suite. (Because we all know how safe npm dependencies are...)

- A good number of unnecessary non-dev dependencies too. It includes font-awesome, which seems unnecessary to include in its entirety already...but are there any uses of font-awesome? I did a search for "font-awesome" and "fa-" but couldn't find any.

I understand using boilerplate generators to learn the ropes of creating within a framework...I've done it to learn React and Angular. But to use a scaffold-generator for a niche and highly specialized/sensitive app like this? It can't mean that it's anything more than a toy app. And yet one in which the decision to give it the name "felony" just looks immature on the author's part, meaning that it's not even useful as a resume padder.

[0] https://news.ycombinator.com/item?id=12030422

nickpsecurity 5 days ago 2 replies      
Most are focused on the name, which is terrible, while only one other (so far) noticed the big problem: Electron, React, and Redux. A secure messenger needs to have strong endpoint security. The easiest way to do that is using safe, system languages with a simple implementation, as few dependencies as possible, and isolation of the app from the rest of the system. That means one of the safe C dialects, restricted C++, SafeD, Ada/SPARK, Component Pascal, or Rust... any of those with portable code for the main library plus modules for OS-specific stuff (esp. GUI & filesystem). That would have a chance of surviving hackers, esp. good ones.

I know almost nothing of the above frameworks. However, Google gave me front pages for each that look more complex in implementation and dependencies than a C, Ada, or Rust app. Unnecessarily so. Secure applications should follow Lean and KISS principles at every opportunity.

Note to author: All that said, if you're just doing it for fun or learning, then that's cool. Also a good area to learn about. :) The above applies to implementations meant to be used in field.

rxlim 5 days ago 0 replies      
After reading the README I think that "Felony" is a very appropriate name:

... built on the modern web with Electron, React, and Redux.

Building desktop applications with web frameworks should definitely count as a felony.

imjustsaying 5 days ago 0 replies      
Guess what the prosecutor will say to the jury in every case involving a defendant who uses this?

"The accused was even using an app named Felony!"

fdomig 5 days ago 1 reply      
> built on the modern web with Electron, React, and Redux.

Security? Encryption? Privacy?

gort 5 days ago 1 reply      
I'm not sure this is the name to use if you want people of only average political commitment to use your app. Although at least it's a striking name.
runj__ 5 days ago 1 reply      
Awesome! I'll finally be able to stop using the horrible GPG Keychain app I used to use which didn't even allow pasted public keys.
K0nserv 5 days ago 1 reply      
Really cool, it'd be nice to have a few more screenshots or maybe a video of the usage. It's not fully clear if Felony actually sends the message or only encrypts it and allows you to send the encrypted message in another medium.
SNvD7vEJ 5 days ago 0 replies      
The members page of the GitHub repo looks like some sort of criminal record.


deftnerd 5 days ago 2 replies      
App doesn't appear to work for me. I downloaded the precompiled windows app, and it loads a window that says "Hello React" and gives error popups too.
tejasmanohar 5 days ago 1 reply      
Ah, neat- OpenPGP.js! Stumbled upon this the other day and was impressed that it's already been audited (Cure53).
j1vms 5 days ago 0 replies      
I understand other posters' concerns about the name, but I have to admit it evokes almost the same level of wry wit as Linus, when he christened 'git'.

In fact, the reception this name is getting is quite ironic. Just think about it, and you might just burst out laughing.

spriggan3 5 days ago 0 replies      
Poor name for an app. And yes it matters.
brian_cloutier 5 days ago 1 reply      
It's not obvious from the readme, how does key exchange work?
deanCommie 5 days ago 16 replies      
Okay, I'll be the contrarian one: I HATE the name.

There have already been trends in the mainstream and right wing media that "If you have nothing to hide, you have nothing to fear", that the NSA only monitors the communication of criminals, and that things like iPhone encryption help terrorists first.

With that in mind, can you imagine the reaction that the average lay-person will have when they see a clickbait headline or morning news report that says "A new app called Felony allows ISIS and online pedophiles to communicate in secret with ease."

It looks like a great app, and I will honestly use it.

But I don't think the name helps the cause of promoting easy and default end-to-end encryption for all to remove the implication that the only people that use it have something to hide.

e12e 4 days ago 0 replies      
On a related note, has anyone had a look at "Pretty Curved Privacy" ?


(Just submitted it to hn - I thought there was an old submission, but apparently I was mistaken):https://news.ycombinator.com/item?id=12035081

If felony is PGP protocols wrapped in modern web technology, I suppose pcp is NaCl wrapped in old PGP command line and protocols...

macawfish 4 days ago 0 replies      
Call it "privatebits" or something more suggestive that personal informational boundaries and privacy can be healthy for everyone, rather than the highest criminal offense. I understand that there's some irony or sarcasm there, but trust me, those are not timeless, even for people who "get it". Bitter humor is not sustainable in the long run, so relying on that kind of energy probably won't help the cause.
tokenizerrr 5 days ago 0 replies      
The Gnu Privacy Assistant (GPA, https://www.gnupg.org/(en)/related_software/gpa/screenshots...., bundled with https://www.gpg4win.org/) is also pretty good. Though it does require you to already know the right words and a basic knowledge of GPG.
itschekkers 5 days ago 0 replies      
I like the idea of this, and would love to give it a try. I would say, however, that the documentation/instructions are a little bit barebones. I know it's just early days, but as a newcomer to Node it is pretty difficult to know how to use this. You may also want to include a PGP 101 (or a link to a good getting-started guide) because it isn't really common knowledge either.
givinguflac 5 days ago 0 replies      
Could have used a hyperbolic name in the other direction. "FreedomKeyper" comes to mind.
dkarapetyan 5 days ago 0 replies      
This is fantastic. Now all you have to do is add a share button and an extension to the site being shared to. Imagine if all status updates were PGP encrypted - what a wonderful world that would be.
thinkMOAR 4 days ago 0 replies      
Looks like a neat little app. I read the website, but couldn't find the answer to a question which popped up instantly when I launched the app.

Is it possible to use existing PGP keys with Felony?

mugsie 5 days ago 0 replies      
From what I can see, the underlying openpgp js lib does not support GPG Cards (smartcards / newer Yubikeys + others).

Interesting app, and it looks cool, but it rules out usage for me. Why the JS + Electron stack?

esafwan 4 days ago 0 replies      
Really great app. I was always in search of an open-source tool like this. I haven't had time to check it out yet, but I will soon.
peterkshultz 5 days ago 1 reply      
What are the benefits of this over something like Keybase?
luke-stanley 4 days ago 0 replies      
Would be better to use a small GUI library, avoiding Webkit / Chromium, for memory usage and security.
unimpressive 5 days ago 0 replies      
Change the name.
cm3 5 days ago 0 replies      
What's the state of WebCrypto APIs, and is it already possible to avoid ciphers written and deployed in JS?
bbcbasic 4 days ago 0 replies      
Attackable via dependencies
bbcbasic 4 days ago 0 replies      
Call it Enigma
reviseddamage 5 days ago 0 replies      
Committing a felony. It's a meta GitHub pun, isn't it?
lllorddino 5 days ago 1 reply      
You shouldn't have to work in a terminal to be able to use HTTPS's security assurance.

This sort of mentality is a hindrance to getting encryption out to the common person and making it commonplace in communication.

BSD vs. Linux (2005) over-yonder.net
456 points by joseluisq  5 days ago   346 comments top 41
ChuckMcM 4 days ago 4 replies      
I like FreeBSD, and I use it as a server OS and on a NAS box but you only need look at https://wiki.freebsd.org/Graphics to understand that if the "Linux Desktop" is a joke compared to MacOS and Windows then the "FreeBSD Desktop" is even more so.

From my experience of "old" computers, workstations, and then "PCs as Workstations", Windows won the desktop because the UNIX camp could never check their egos at the door and get their act together on a windowing system and graphics architecture. And while it was brilliant that you could do the "heavy lifting" on a server and do the display part on a less powerful machine, that architecture should have been taken out behind the barn and shot the moment 2D and 3D acceleration hit the framebuffer cards. Instead we have the crap that I have to put up with, which is a wonderfully compact and powerful system (A NUC5i7YRH) with a graphics architecture that is not supported at all on FreeBSD and has pretty egregious bugs on Linux, and runs beautifully on Windows 10.

For the conversation at hand though, the Linux graphics pipeline teams are much better at getting things working than the FreeBSD team, even though FreeBSD is much more "friendly" to non-open drivers.

I would love to have a nice, integrated, workstation-type experience with a UNIX operating system, but that music died when SGI and Sun threw in the towel and tried to retreat to the machine room.

vivin 4 days ago 3 replies      
I was a long-time FreeBSD user. Started using it in college and continued for a long time. I started using Linux because I had bought myself a new laptop and BSD didn't recognize the wifi card. I continued using FreeBSD at home for a few more years on my webserver before ultimately moving to dreamhost (I just didn't have the time to keep maintaining my own server).

I like using Linux, but I still miss the predictability of a BSD system - you know where things are, and where they are supposed to be. When I first started using Linux, I was absolutely flummoxed by the lack of distinction between the base system and add-on utilities.

Linux definitely feels more "organic" and "grown" whereas FreeBSD seems like it was architected and planned out. Not that this is a bad thing for Linux. My FreeBSD heritage still shines through when I use Linux; anything I install from source sits in /usr/local :).

nixos 4 days ago 9 replies      
I know Linux (Debian) quite well and would like an excuse to try learning FreeBSD, but I just can't find any serious use-cases where FreeBSD would be of any advantage.

I run a small site on a VPS, so:

1. I don't have GBs of free memory for ZFS

2. I don't have the GBs of RAM, CPU, and HD space to build everything from ports, and most importantly, no need.

Except for an Application Firewall for nginx, what does ports have over deb packages? All in all, how many MB of free RAM or HD space will I gain by compiling everything myself (taking time to do so, and pushing off security updates because I don't have the time to sit and watch the compilation, hoping nothing breaks - which _did_ happen once)?

3. License - I don't care. GPL is free enough for me.

4. Base vs. Ports - Why should I care? Debian (testing!) is stable enough for me. Except for dist-upgrades, I never ran into issues, and then it may be faster to nuke the server from orbit. Now had BSD "appified" the /usr/local directory (rather than keeping the nginx binary in /usr/local/bin and its conf in /usr/local/conf, it would keep everything related to nginx in /usr/local/nginx), it would have been interesting, but now?

If anything, I like how Debian maintains both base and ports, so I can get almost any software I need from apt-get, and don't have to worry about conflicts.

5. systemd? The reason Debian went with systemd was (IIRC) that they didn't have the manpower to undo all of Red Hat's work in forcing applications such as GNOME to integrate with systemd. I don't know how FreeBSD is doing in that regard.

I don't mind learning new systems. (see my username :) ). I actually understand what nixos or coreos, for example, bring to the table. But FreeBSD?

hacknat 4 days ago 1 reply      
I think the best, and most traditional, comparison is the Cathedral and the Bazaar[1]. Pretty much every observation the author makes can be framed in this comparison. BSD has built-in libraries, and has a cohesive architecture. Linux is popular precisely because of its chaos. It is incredibly easy to hack some feature together in Linux. It will probably start out as ugly, but if enough people glom onto it, it will become great and maybe even secure. It is pretty clear that Linux became popular not because it was the best, but because it was the fastest development path.

Obviously BSD has its uses, but if you're looking to develop a new feature and get it out the door (in OS development) Linux is the easiest choice.

[1] ESR, https://en.wikipedia.org/wiki/The_Cathedral_and_the_Bazaar


I remember reading somewhere that Netflix uses BSD for all of its net-intensive servers as network performance tuning on BSD is, at least by reputation, better, but they use Linux as their workhorse. They employ some FreeBSD committers though (obviously not everyone can do that).

kriro 4 days ago 0 replies      
My perspective as someone who is mostly a desktop user/dev and rarely dives into server related issues other than setting up a home network and maybe sshing into a server somewhere in the magical cloud:

* The BSDs feel more elegant, the Linux distributions feel more complete/up to date (and are probably more performance optimized due to enterprise embrace but that's only speculation on my part).

* I sympathize with the BSD license philosophy and agree that the BSD-license is in theory freer but don't care too much either way

* OpenBSD is awesome (yes even for desktops, with the right hardware) and I install it every now and then but ultimately I'm too familiar with Linux to switch. I do like that they keep fighting the good security fight, don't care about supposed personality quirks of some devs. Keep up the good work

* At the end of the day I use Linux because that's what I grew up with and it tends to "just work" (for the record all BSDs I installed in recent memory pretty much worked out of the box). I am kind of forced to use OS X at work but other than that it's Linux on all machines. The parents also use it by now.

dkarapetyan 4 days ago 3 replies      
I tried FreeBSD and went back to Ubuntu. In this day and age, not having proper dependency resolution for packages is not acceptable. I've more than once had a ports-based tool go into an infinite cycle when asked to upgrade some packages.

The philosophy laid out in this article seems more like a rationalization of historical accidents than anything else. The Linux file system layout is just as predictable as anything else. Configuration goes in /etc, PID files and other such things go in /var, things local to the users go in /usr/local, cron-related things go in /etc/cron.d, and so on and so forth. The FreeBSD file system layout on the other hand makes no sense to me, and the rc system is even more bizarre. Do I really need to specify "XYZ_enable=YES" when upstart, systemd, and others have this stuff all sorted out already? Well, not systemd so much, but close enough.
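
A hedged sketch of the two styles being compared, for concreteness - on FreeBSD the convention is actually a `name_enable` knob in rc.conf; nginx is just an illustrative service name:

```shell
## FreeBSD: turn a service on by declaring it in /etc/rc.conf
# (sysrc is the stock helper; editing /etc/rc.conf by hand works too)
sysrc nginx_enable="YES"
service nginx start

## Linux with systemd: the equivalent two steps, imperative style
systemctl enable nginx
systemctl start nginx
```

Whether a declarative config file or an imperative `enable` command is "more bizarre" is largely a matter of which convention you learned first.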

Overall the BSDs are too low level for development and play purposes. For deploying proxies, caches, NATs, and other such single purpose things I'm sure they're great but anything else not so much.

jdefr89 4 days ago 1 reply      
This sort of explains why I have chosen OS X. It's a UNIX operating system that makes for a great modern desktop machine. I don't need to worry about poor and/or glitchy hardware support. It's 2016; I don't have the time to play around with finding WiFi drivers or graphics drivers that can hardly support 3D acceleration. Everything has always worked out of the box. Having had two MacBook Pros, I haven't had one hardware or software issue to date. They (OS X/Apple) have the advantage of being unified. People aren't constantly arguing over stupid shit as they seem to with Linux. FreeBSD's development, on the other hand, is just too slow for my desktop use. Despite what the article says, I have found that there is usually a support gap somewhere after an install, with drivers or whatever else. This isn't to say I dislike Linux; I do love Linux. I just wish they would get their shit together on a few issues. As for FreeBSD, it still has a strong spot for secure, stable servers. I suppose it's all a matter of preference. At the end of the day, however, I will always choose a UNIX system for my day-to-day use.
smegel 4 days ago 2 replies      
BSD has a wonderful, unified events system (kqueue) that incorporates blocking disk IO. Linux has epoll.


pavanky 4 days ago 1 reply      
Guys, this article was written somewhere between 2002 and 2005 (based on the author mentioning BitKeeper being used for the kernel). Someone edit the title to reflect this, please.
matt_wulfeck 5 days ago 3 replies      
> It's been my impression that the BSD communit{y,ies}, in general, understand Linux far better than the Linux communit{y,ies} understand BSD.

Ok fair enough, but the same can be said about manual v automatic transmissions, static versus interpreted languages, etc.

When something is harder to use, you're forced to think about it more and understand it better.

paradite 4 days ago 0 replies      
Related discussion on GNU vs. Linux:

https://www.gnu.org/gnu/linux-and-gnu.html (BSD is discussed here as well)

https://www.gnu.org/gnu/gnu-linux-faq.html (According to this page, the title of the article should be BSD vs. GNU/Linux, though the latter was mentioned once in the article)

tempodox 4 days ago 2 replies      
Disclaimer: I know this is a side issue. However, I'm really wondering what it is that makes people think using such a creative color scheme for publishing their texts would do any good. If it weren't for Safari's Reader view, I would be physically unable to decipher this text. Which would really be a shame, because this text deserves to be read.

I guess I'm still hoping that one of these days I will reach one of those authors and make them understand that the content is of paramount importance and creative coloring can do nothing but detract (except when you're an expert). If you use your own color scheme, please make sure you know what you're doing.

SFJulie 4 days ago 0 replies      
I would have entitled it Linux AND BSD

This article is really about the objective differences between Linux and BSD. It is not a rant in the form B > L or B < L; it is much more about the differences along a lot of dimensions.

I personally find it very much a help to guide people to come, and just as much to stay away.

I am a former user of Linux (which I used to like) and I am now using FreeBSD. I switched without love or hate. I was on Linux to have video card and latest-hardware support, and I am on FreeBSD to have the ports, stability and easy upgrading.

I did it after years of experience that enabled me to build up information about the use cases each OS fits.

Having this summed up in 10 pages will spare people a lot of time in deciding.

And I think it is as important to have people come to BSD as to not delude them into coming for the wrong reasons: disgruntled users are as much a pain as rigorous contributors are a gift.

So information to inform a choice, especially when it is not structured as an aggressive comparison, is very welcome.

The author should be praised.

skywhopper 4 days ago 0 replies      
"A tour of BSD for Linux people" sounds like a genuinely interesting read, but the aggressively defensive tone starting in the second paragraph and continuing throughout the introduction turned me off of reading past the first page. I'm definitely interested in how BSD differs from Linux, but I'm not interested in being lectured at. (So, please tell me if it tones down after the first page.)
corv 5 days ago 1 reply      
This is probably the best comparison of [Free]BSD and Linux, if a bit dated.

In the meantime FreeBSD has changed the release schedule and there are efforts under way to package the base system separately.

simula67 4 days ago 1 reply      
Very informative article, thank you. This article is missing discussion on ArchLinux ( probably because it is very old ? ). Arch has a minimal base system ( no X etc ) and you can build the system you most like.

Also Linux now uses git for development.

> And normally, you do the whole rebuild process in four steps. You start with a make buildworld which compiles all of userland, then a make buildkernel which compiles the kernel. Then you take a deep breath and make installkernel to install the new kernel, and make installworld to install the new userland. Each step is automated by a target in the Makefile.

> Of course, I'm leaving out tons of detail here. Things like describing the kernel config, merging system config files, cleaning up includes... all those gritty details.
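Spelled out, the four steps described above read roughly as follows (a sketch; the kernel config name is illustrative):

```sh
cd /usr/src
make buildworld                      # compile all of userland
make buildkernel KERNCONF=GENERIC    # compile the kernel
make installkernel KERNCONF=GENERIC  # install the new kernel
# reboot, then:
make installworld                    # install the new userland
mergemaster                          # merge updated config files in /etc
```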

Wow, so painful. For me it's as simple as:

$ pacman -Syu

and watch some movie. I bet the BSD way offers more opportunities to learn but I personally don't like learning for the sake of learning. Learning different ways to do the same thing does not make me a better person. So this is not interesting to me.

> Does Linux support hardware that BSD doesn't? Probably. Does it matter? Only if you have that hardware.

I could obtain that hardware at some point in the future.

> "But Linux has more programs than BSD!"> How do you figure? Most of these "programs" you're so hot about are things that are open source or source-available anyway.

Given an application, whether the provider supports it on your OS is an important consideration people weigh before choosing an OS, as they should.

> Linux, by not having that sort of separation, makes it very difficult to have parallel versions, and instead almost requires a single "blessed" one.

Isn't this what NixOS ( https://nixos.org/nixos/about.html ) is supposed to be solving ( among other things ) ?

I might try a BSD for the novelty aspect of it, but so far I have seen no reason why it should be better.

th0ma5 4 days ago 1 reply      
I feel this captures a lot of the feel I had for FreeBSD even back in 2001 when I used it quite a lot. It had a sense that you were chiseling out this solid system that would remain for eternity, along with an assurance of security and stability.

I run Ubuntu now on multiple devices. It often has a feel of flimsy binary patches. I don't know that I care too much though as it works well. I have family members too on it so it is good to remain practiced with it to help them.

I miss playing with FreeBSD though! Perhaps I should run it on one of my RPis? Anyway, this is a great site.

duncan_bayne 4 days ago 0 replies      
The only thing that's preventing me from switching my desktop (laptop) OS over to BSD is chipset support. I tried 11.0 ALPHA, but couldn't get anything other than VESA video going.

My setup scripts are all in Git:


I've used a few different OSs on a daily basis for work and recreation: Windows (since 3.0), Linux (various distros since 1995), OSX (for a few years between 2009 and 2013), and FreeBSD.

I have found BSD to be the most comprehensible, simplest and the best 'cultural fit' for the way I think and work. I appreciate that the latter part is a bit vague, but that's because my understanding of it is vague :) BSD just feels ... more like home.

Those wanting to give it a go as a desktop OS should check out PC-BSD, which is built to be usable 'out of the box' for that purpose:


aceperry 4 days ago 3 replies      
I'm a total Linux user and have tried *BSD from time to time. For some reason, the BSDs have always taken a very long time to boot up and even to start X. The last time I tried FreeBSD (about 3-4 years ago), a friend asked me to help him debug why X windows didn't run. It was taking 15+ minutes to startx on his brand new server-class PC that he built up himself. We didn't realize that it took so long to start up until we went out for dinner and came back, surprised to see X had started. Same with OpenBSD: it took 10+ minutes just to boot up. I thought I didn't install it correctly and re-installed it a couple of times. I'm guessing the BSDs are meant to be run continuously and not shut down? I've asked a couple of other BSD people, and they couldn't provide an answer, so I'm left up in the air about that.
eeZi 4 days ago 1 reply      
Note that most BSDs (with the notable exception of OpenBSD, of course) lack modern anti-exploitation features like ASLR.

FreeBSD is working on it, but it took surprisingly long.

slyall 4 days ago 0 replies      
Pity this document is not dated. But judging from the version numbers of things he mentions, it probably dates from around 2003 or perhaps 2004.
joonoro 4 days ago 0 replies      
Ahh, this old article again.

>"But Linux is more popular.">So? Windows is even more popular. Go use that.

This slope is so slippery I broke my neck when I fell.

AceJohnny2 4 days ago 3 replies      

 When I say "Linux", I mean Red Hat. I mean Slackware. I mean Mandrake. I mean Debian. I mean SuSe. I mean Gentoo. I mean every one of the 2 kadzillion distributions out there, based around a Linux kernel with substantially similar userlands, mostly based on GNU tools, that are floating around the ether.
So... do you include Android? It's worth asking, because though it uses a Linux kernel, it's a heavily modified one (little chance of Binder ever being accepted into Mainline, or of Android abandoning it), and its userland is very different from anything you're used to from GNU.

Edit: I can't find a date, but from other stuff in it I'm guessing the article is from quite a few years ago, before Android was mainstream or even existed.

lajospajtek 4 days ago 2 replies      
Another aspect of the "Linux vs. FreeBSD" story is Java. I was almost won over by FreeBSD a few years ago; however, what keeps me back is, so to speak, the "second-class citizenship" of the JVM on FreeBSD. Or at least it was so pre-OpenJDK 8, when I last looked.

Is anyone successfully using Java on FreeBSD in production?

geff82 4 days ago 2 replies      
I have a lot of sympathy for the BSDs. I actually learned Unix with NetBSD in 2004. Problem with BSD nowadays, at least for me, is that all supported desktop hardware is really old. The newest, still available, Laptop that works is the Lenovo X240 (and other x40s....). And when everything works, like on my Samsung NP530, there is strange stuff like a super slow and unreliable Wi-Fi connection. So if you want a great OOTB development experience with the same, working, OS on Desktop and server, Linux is the way to go...

But I cannot help and sometimes try the BSD stuff out, as it feels like "my parents home".

madmax96 4 days ago 0 replies      
I use FreeBSD on the Desktop. It took me a bit to set up my system, so I keep lots of backups. I've been exploring the idea of building an image that contains all the stuff I use.

I find that FreeBSD is generally a lot simpler and more discoverable than Gnu + Linux. If you want a bare-bones UNIX experience free from SystemD and Pulse Audio, I'd recommend giving FreeBSD a try. The FreeBSD handbook is very nicely written, and at the very least using FreeBSD is an educational experience.

joeyrobert 4 days ago 0 replies      
I remember checking out http://over-yonder.net/ after being linked from Slashdot circa 2006. I always have a soft spot for the "Homepage" link on Slashdot. I was impressed with his resume generator at the time: https://www.over-yonder.net/~fullermd/resume
andrewclunn 4 days ago 1 reply      
Must... resist... clickbait title... I'm not strong enough!
torgoguys 4 days ago 0 replies      
A check of the wayback machine indicates this was first posted back in October 2010. Mods might want to tag it as such since some of the information is dated.
caf 4 days ago 0 replies      
I'm not sure I see the distinction between the "FreeBSD's version of tcpdump" and, say, "Debian's version of tcpdump" ( http://sources.debian.net/patches/tcpdump/4.6.2-5%2Bdeb8u1/ )?
known 4 days ago 0 replies      
Linus Torvalds is the benevolent dictator of the Linux kernel; he is very rigid about what piece of code goes into kernel space or user space.
Scarbutt 4 days ago 1 reply      
Linux is Python, FreeBSD is Scheme ;)
waldir 4 days ago 0 replies      
> Darwin is closer to a standard BSD feel, but most of its userbase is people who came from BSD, so its a bit outside the scope of this essay as well.

I'm confused by this sentence. Did you perhaps mean "people who came from OS X"?

joseluisq 4 days ago 0 replies      
Actually I use FreeBSD as a web server on a single-core VPS (for development purposes) and I'm happy with the performance and stuff. But my default workstation is Linux, in this case Fedora 24 (RH-based), with GNOME 3.x.
geff82 4 days ago 0 replies      
I love BSDs for their pureness. You feel connected to the machine.
mac01021 4 days ago 0 replies      
This seems like it was written a decade or so ago. I don't know how much of this has changed in that time.

It's well written and informative, though.

JepZ 4 days ago 0 replies      
Some call it chaos, others call it 'Freedom of Choice'.
behnamoh 4 days ago 5 replies      
Something I don't understand about FreeBSD is why they use the evil daemon for the logo. Whatever the story behind it, I just think there are many other alternatives they could use as the logo.
idobai 4 days ago 6 replies      
> I like FreeBSD, and I use it as a server OS and on a NAS box but you only need look at https://wiki.freebsd.org/Graphics to understand that if the "Linux Desktop" is a joke compared to MacOS and Windows then the "FreeBSD Desktop" is even more so.

I've never heard anything good about the "MacOS desktop" and I'm pretty sure that the "Windows desktop" is far behind the "Linux desktop" (plugins, performance, menus etc.). Have you tried something else than Xfce or Unity?

digi_owl 4 days ago 0 replies      
And the hipsters are out in force...
Xplain Explaining X11 for the rest of us magcius.github.io
452 points by b3h3moth  3 days ago   43 comments top 16
microtherion 3 days ago 2 replies      
Personally, my favorite explanation of X11 is found on page 139 of the Unix Hater's Handbook http://web.mit.edu/~simsong/www/ugh.pdf

"A truly portable X application is required to act like the persistent customer in Monty Pythons Cheese Shop sketch, or a grail seeker in Monty Python and the Holy Grail. Even the simplest applications must answer many difficult questions"

Read the original for the hilariously accurate dialog that follows

Jasper_ 3 days ago 1 reply      
Author here, let me know if you have any questions or comments! Given how much attention this is getting, I should probably finally finish one of the three articles I have sitting around in the backlog.
lil1729 2 days ago 1 reply      

"Unlike X11, where the graphics primitives were rather low-level and all input event handling involved round-trips to the client, NeWS was advanced enough that simple widgets, such as scroll bars and sliders, could be implemented entirely server-side, only sending high-level state changes to the client, more along the lines of slider value is now set to 15 than mouse button 2 released. Similarly, the client could ask the server to render a widget in a given state, rather than repeatedly transmitting sequences of graphics primitives."

eschaton 3 days ago 0 replies      
The author appears to credit the X Window System or Microsoft Windows with repaint-on-expose. To my knowledge that was first implemented (along with regions) by Bill Atkinson for the Apple Lisa, because he misremembered what he saw on the visit to PARC; overlapped windows in the PARC systems at the time didn't automatically and efficiently repaint only the affected area.
sdegutis 3 days ago 0 replies      
The interactivity of this series is just amazingly helpful in visualizing what the heck is going on. Honestly, for me that makes this one of the few links on HN that I might actually read all the way through. And may even use it as a homeschool lesson for our kids, if it seems written simply enough (which at first glance it is).
SXX 3 days ago 1 reply      
For those who might want to understand why Wayland was created. Also tons of fun facts about X11. Really nice talk of Daniel Stone about X server from 2013:


therein 3 days ago 0 replies      
Even my manager who worked on the original X/Motif codebase in the 80s when DEC and HP was working on it doesn't enjoy talking about these details. Pretty good write up, though.
shmerl 3 days ago 1 reply      
> Several new technologies have appeared

You should probably also add Vulkan + WSI to the list.

UPDATE: I just realized, this article is from 2013.

hathym 2 days ago 0 replies      
The X11 server written in JavaScript: https://magcius.github.io/xplain/src/server/server.js
digi_owl 2 days ago 0 replies      
> While I do personally believe Wayland is going to become The Linux Display Server Of The Future

Best I can understand, Wayland is a protocol spec rather than a set of code like X. Each compositor/WM is individually responsible for implementing said protocol and thus takes over the job of X (input handling et al).

Meaning that there will no longer be an X server to act as an independent arbiter of behavior. The devs of whatever DE you are logged into will have the final word.

The more I learn of Wayland, the more I expect it to turn into a hairball to match X. Only now without a network option ("too insecure"), and with GPU acceleration.

BTW, why are we so bent out of shape about this seats concept? Why oh why are we continually trying to turn a single user piece of hardware into a desktop mainframe?!

rikkus 2 days ago 0 replies      
Anyone who's interested in X11 (the protocol) and the Ruby language should have a look at Mathieu Bouchard's incredible implementation [1]

There's some quite beautiful Ruby code in there, e.g.:

 module X11
   k = Keysymdef = {}
   c = Cursorfont = {}
   [k,c].each {|x| x.extend X11::MAssignForHash }
   k[%w| space exclam quotedbl numbersign dollar percent ampersand |]=32..38
   k[%w| quoteright parenleft parenright asterisk plus comma minus |]=39..45
   [...]
[1] http://artengine.ca/matju/RubyX11/

jd3 3 days ago 1 reply      
Excellent description of the (somewhat needlessly) complex X Window System (the Blit and Rio are so beautiful). Still makes me sad that COMPOSITE doesn't build in XQuartz for OS X. I posted on the mailing list, but got a pretty non-helpful response.. [0]

[0]: http://comments.gmane.org/gmane.os.apple.xquartz.devel/912

swiley 3 days ago 1 reply      
This looks like a great article, but on a phone the margins are as wide as the text! And they used that wrapping mode that stretches lines out instead of truncating them, which looks great if you don't want to read the text but is super annoying when you do.
foo101 2 days ago 1 reply      
How are the interactive windows that are displayed in the page created? Is that a complete simulation of X server done with JavaScript or is it something unrelated to X server?
Mozilla could walk away from Yahoo deal and get more than $1B recode.net
416 points by dblohm7  2 days ago   197 comments top 24
chollida1 2 days ago 4 replies      
Wow, I'm really surprised that I've never heard of this before, not because I'm some sort of all-knowing cyborg, but because it's not like Yahoo acquisition talk just started last week.

This is a company that's essentially had multiple parts of it for sale for the better part of 4 years and this hasn't once come out.

I'm running through all the Yahoo corporate filings that Bloomberg has indexed and I can't find any link to this clause.

I mean this is a very material issue!

If you are a shareholder in a company that's trying to broker a $3-4 billion sale of some of its assets, then knowing that the buyer may be on the hook for an additional $1 billion bill probably means that the asset you thought you owned is worth 20-25% less than you originally thought.

Someone isn't going to be very happy with the yahoo leadership today:)

jonknee 2 days ago 5 replies      
Marissa Mayer sure makes some interesting contracts. Remember that she was the one who recruited Henrique De Castro from Google and personally got his contract approved by the board (though curiously withholding his name and exact compensation details). He worked at Yahoo for 15 months and earned a $58m severance. That is in addition to the ~$50m he made while working there. Again, in 15 months.


greenspot 2 days ago 1 reply      
Wow, Marissa Mayer.

That she agreed on Mozilla's change-of-control clause (so Mozilla can just walk away in case of an M&A deal and still get $1B) is simply disconcerting.

I do not have any insight into why she gave in on this point, but I know that one of her main skills and responsibilities in her position is to negotiate well and do proper deals. She had to negotiate this change-of-control clause away, or let Mozilla sacrifice some of the payout if they walk away. Moreover, considering that Mozilla doesn't have that many potential financial search-partner options (Google has been, with Chrome, rather a competitor for many years now), this should have been possible, I'd assume with my limited knowledge.

I do not like it when random forum guys like me bash CEOs; I know that this is the toughest job and I don't want to pass judgement on decisions I don't have insight into. But this is really, really weird, and Marissa should have known that this bummer would pop up at the next due diligence and create distrust ('are there more time bombs at Yahoo? let's dig deeper'), or just reduce the deal value, or just increase deal complexity later.

Maybe she didn't think about M&A at that time and was rather in a fire-and-forget mode, but a CEO is always supposed to think about what happens if new shareholders join, about the next due diligence, heck, just about the future of the company, and eventually to keep the company in a proper and clean state, not leaving time bombs for potential successors.

JoshTriplett 2 days ago 1 reply      
A clause like this could actually make sense for negotiation. Consider the following two hypothetical deals from Yahoo's perspective:

- $375M/year to Mozilla, with the clause to keep paying for 3 more years if Mozilla doesn't want to do business with Yahoo's new owner.

- $450M/year to Mozilla, with no such clause.

The former seems like the smarter deal for Yahoo to make if they want to focus on being successful rather than on being bought. It only sounds problematic if you start focusing primarily on getting acquired.
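The arithmetic behind this comparison can be sketched quickly. The dollar figures are the comment's hypotheticals; the 5-year horizon and acquisition probability are my own illustrative assumptions:

```python
# Expected cost to Yahoo of each hypothetical deal, in $M.
# The 5-year horizon and p_acquired are illustrative assumptions.

def expected_cost(annual, years=5, tail_years=0, p_acquired=0.0):
    """Annual fee over the horizon, plus a tail that is only owed
    if an acquisition triggers the walk-away clause."""
    return annual * years + p_acquired * annual * tail_years

with_clause = expected_cost(375, tail_years=3, p_acquired=0.2)  # = 2100.0
without_clause = expected_cost(450)                             # = 2250.0

# Acquisition probability at which the two deals cost the same:
p_breakeven = (450 - 375) * 5 / (375 * 3)  # = 1/3
```

Under these made-up numbers, the clause-laden deal stays cheaper in expectation unless the odds of an acquisition exceed one in three, which matches the comment's framing.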

rdtsc 2 days ago 0 replies      
> Mayer has handed out excessively generous deals to many top execs, such as its chief revenue officer Lisa Utzschneider.

Right. Wonder if she honestly thought this would have fixed anything, or she knew the train was headed to the final station and just wanted to be surrounded by a group she picked, and the only way to get them to do that was to buy her friends.

It is always fascinating to watch a company like that, and wonder if executives still privately believe it is a salvageable situation or they just put up a face and ride the gravy train with some nice golden parachute contract clauses. They probably have to use euphemisms and hints with the board and other top level people to convey their suspicion of viability, as they don't want to be negative and just too pessimistic as it makes them look like liars in press releases, but they can't also be completely oblivious either, that looks bad as well.

throwaway6497 2 days ago 1 reply      
Looks like everyone gets to milk the Yahoo cash cow - board, investors, business partners, execs, friends of execs and management except engineers. Yahoo still pays engineers terribly hoping to compete with the likes of Google and Facebook. Why don't they just give up at this point and expedite selling. Why bother with all the posturing.
ChuckMcM 2 days ago 0 replies      
I think part of this is how much search traffic firefox can swing on the Internet. Remember that Google and Yahoo (and maybe Microsoft's Bing) were probably courting Mozilla for that contract.

In the context of that negotiation I could certainly see it coming up that Yahoo! might be acquired and Mozilla wanted some assurances if they went with Yahoo!. Neither Bing nor Google has that concern, so Yahoo! is the only one exposed.

strictnein 2 days ago 10 replies      
Honest question: what in the world does Mozilla do currently with $375 million a year?
sesutton 2 days ago 6 replies      
With a deal like that, what incentive does Mozilla have not to walk away if someone buys Yahoo? They'll get $1.1 billion for free.
seizethecheese 2 days ago 1 reply      
With $1B from a breakup, it seems the dominant strategy for Mozilla could be to develop its own search engine.
xiaoma 2 days ago 2 replies      
>"There is a lot of hair hidden at the company..."

Is this a normal phrase in the enterprise world? I don't think I've ever heard it before.

AdmiralAsshat 2 days ago 5 replies      
Assuming they walk away from Yahoo, though, whom would they select as their next US search partner? Return to Google?
SubiculumCode 2 days ago 0 replies      
Firefox has a great feature where, after typing a query, a menu of search engines appears immediately below. Very convenient, and it allows me to mix up my searches across the search engines very nicely and easily. Yahoo got bilked by Mozilla... and it was probably the best thing Yahoo ever accomplished.
venomsnake 2 days ago 0 replies      
Could that be her attempt at a poison pill? How the *7"( did the board approve that deal?
jMyles 2 days ago 1 reply      
Everyone here is decrying the decision-making process of the CEO - which obviously is already in question - but I'm left pondering a different perspective.

Yahoo has cash. Sure, they're not Apple in terms of liquidity, but they're not a startup either. Mozilla is - and I may need to solicit your agreement here - a Good Thing for the world.

I'm not convinced that being a Bil in the hole to Mozilla is so bad.

By contrast, there are several companies that are, according to some legal theories that may yet prove persuasive in court, in debt this much or more to governments by dint of their offshore accounting practices.

At which company do you prefer to be a shareholder? One which owes Mozilla a billion, or one which very well might owe an armed, hotheaded, unpredictable entity several billion?

Ultimately, if I'm a shareholder (and I'm not), I can forgive a billion dollars to mozilla more easily than the other Yahoo mis-steps.

bobsil1 2 days ago 0 replies      
Microsoft offered $44.6B for Yahoo, was turned down and dodged a bullet.
peterjlee 2 days ago 0 replies      
Maybe with all that money they can buy DuckDuckGo and make it the default search engine? It's a risky move though. If DuckDuckGo doesn't work out, they lose the option of switching to another search partner.
paulbjensen 1 day ago 0 replies      
I could understand a break clause based on change of ownership, but the fact that Yahoo would still be obligated to pay is nuts.

Would Mozilla ever want to exercise this clause? I don't know - I wonder how they would react if Verizon/AOL ends up buying them, given that Netscape was bought by them early on.

shmerl 2 days ago 1 reply      
I hope Yahoo won't abandon search. We need more competition.
schnevets 2 days ago 1 reply      
Wow! If they received that chunk of change all at once, it could completely change the company. Imagine an endowment of that size, with the investment income being used for Mozilla's maintenance and initiatives.
ekianjo 2 days ago 0 replies      
I thought Yahoo Japan was a completely separate entity from Yahoo global, so why does the article mention SoftBank at all?
homero 2 days ago 0 replies      
Where's this money go? I thought many developers were doing it for free. Yet they can't support thunderbird or persona
kriro 1 day ago 0 replies      
Is there a reason why Mozilla shouldn't decline and take the money 100% of the time?
microDude 2 days ago 2 replies      
It is very easy for the user to change the browser's default search engine (to, say, Google or DuckDuckGo). I would think most Firefox users do this... So $375M/year seems like a very high price for Yahoo to be paying. Is this just a price war between the few search giants trying to keep competitors priced out of the market?
CSS is powerful, you can do a lot of things without JS github.com
489 points by NamPNQ  4 days ago   299 comments top 53
ctvo 4 days ago 11 replies      
I've worked with people who did things like this. It's not fun for anyone else on the team to try and figure out why you're using specific pseudo selectors on tags in a child inside a sibling in some label to make an accordion that would take 3 lines in Javascript.
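For reference, the CSS-only accordion pattern being criticized here is usually the "checkbox hack", roughly like this (a sketch; class and id names are my own):

```html
<style>
  .panel { display: none; }
  #acc-toggle:checked ~ .panel { display: block; }
</style>

<input type="checkbox" id="acc-toggle">
<label for="acc-toggle">Section title</label>
<div class="panel">Section content goes here.</div>
```

The JavaScript equivalent really is a few lines (toggling a class in a click handler) and states its intent far more directly.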
achairapart 4 days ago 4 replies      
Most of these are just clever CSS hacks. There is a website collecting these kind of experiments since the early '00: http://www.cssplay.co.uk/

There was a time when separation of concerns and the "Rule of Least Power"[0] were the foundation principles of web development.

Today you have preprocessors, postprocessors, transpilers, content and style in JavaScript, behavior in CSS. Very powerful tools that can easily lead to unnecessary complexity.

On the other hand, somehow W3C failed to turn these principles into a truly interoperable and semantic web.

Ah! Also, nobody cares about unobtrusive JavaScript anymore.

[0]: https://www.w3.org/2001/tag/doc/leastPower.html

danielnixon 4 days ago 2 replies      
Numerous accessibility issues here. Many of these examples can't be operated by a keyboard, for example. "CSS only" is not a virtue if it comes at the cost (as it always does) of needlessly excluding people from the web.
mcv 4 days ago 1 reply      
I prefer using CSS for my styling and javascript for my functionality, as God intended. These examples could be educational for getting a bit more out of your CSS while keeping only the pure functionality in javascript. Too much styling-related stuff seems to be sneaking into code these days. I like my concerns separated.
code_research 4 days ago 2 replies      
I am missing one important argument in this discussion:

CSS-only design is an important piece of a future web with reduced security and privacy threats.

The (interesting) model of allowing remote code execution by default was a beautiful but naive vision. We have to make big advances in technology, politics and society to make this model work in a way that does not make internet users victims by default. We are not there yet. The reality is: the crooks destroyed that vision and are advantaged by the current situation, while all internet users are being trapped from their first moment of browser usage without their consent or knowledge.

For many use cases (e.g. government websites, banking, anything where you type in your name), CSS-only design should become a requirement by law to protect the user until we figure out how to write secure software that respects user privacy, and how to form governments that will respect their citizens (which will possibly take longer). Until then, browser vendors should implement more and better possibilities for CSS that help avoid JavaScript whenever possible.

I very much like JS animations and stuff happening in the browser window, and there are some edge cases where JS brings important advancements to a UI, but we have to face that privacy and security are much more important issues than having a nice UI, and we have to change the current situation, as we, as programmers, are responsible for it.

The "remote-execution-by-default" experiment has failed. We need to change that, and CSS is a great way toward a web that is less problematic for everyone but still offers very nice usage experiences.

adamjc 4 days ago 4 replies      
The problem with relying on anchors is that they stay in the browser history, so if you press 'back', you get the modal dialog again!

I don't think this is very useful, but it is a fun exercise in CSS.

nnq 4 days ago 0 replies      
May the gods have mercy on the soul of who'll maintain code using these... "techniques".

(I know because back in the day I wrote things like this myself :) ...probably the curses whispered by those who've had to endure their progenitors are finally getting to me)

zaidf 4 days ago 2 replies      
History is going to look back at CSS in disbelief when it calculates the amount of engineering time that was spent trying to position elements correctly.

This isn't hating on CSS so much as on how broken layout creation is, with no end in sight--but plenty of hacks.

zuxfer 4 days ago 5 replies      
And here I am, trying for the past 10 years to centre-align a div, both horizontally and vertically.
jordanwallwork 4 days ago 2 replies      
The problem with using css this way is that it's not as obvious what's supposed to happen as it is when using js. I'd love for there to be some explanation within the examples of _why_ they work - I've just been reading the first modal example but I can't really understand it
awjr 4 days ago 0 replies      
Although I think this is an interesting exercise, I need these behaviours to work the same across all browsers. This gets even more complicated within JavaScript apps where the state of the components is something I need to have fine grained control over.

Really cool, and good to see how powerful CSS3 is, but I'm not sure how useful it is, particularly when I need this behaviour working on older mobile browsers.

drinchev 4 days ago 1 reply      
I'm still a huge fan of "If you can do something with CSS, avoid JS", but these days, working with React, everything is JS.

That said, IMHO it makes logical sense to put the logic for the modal windows in the JS, not the CSS. And it's definitely easier to maintain.

kyriakos 4 days ago 0 replies      
Using checkboxes and radio buttons for keeping state is hacky to say the least. At the end of the day you can do the same with a few lines of easier to understand javascript.
progval 4 days ago 0 replies      
Ironically, all of these links point to jsfiddle/codepen, which require Javascript to see the demo.
mxuribe 4 days ago 1 reply      
Traditionally, I remember the rule that presentation should be managed by CSS, while behavior should be managed by javascript. But does the fact that some things can be reasonably done either way change - for example - a site's or app's maintainability? How about its performance? Is a CSS implementation rendered faster than javascript? I struggle with this at times, especially in some cases where - admittedly older - frameworks allowed for some overlap. Ah, well, much like lots of the web, I'll just work to get stuff done.
acbabis 4 days ago 1 reply      
The biggest problem with these hacks isn't a preference for CSS over JS (which IMO is fine in moderation); it's the fact that they aren't keyboard or screenreader accessible. A person using a screenreader can't even navigate to the accordion (http://jsfiddle.net/yn4ls/), and if they could, it would tell them it's a form element.
the_mitsuhiko 4 days ago 6 replies      
Whenever someone makes a menu with CSS instead of javascript I go crazy as a user. The lack of delay is such a frustration.
everydaypanos 4 days ago 2 replies      
Still not good enough. For example, the popover/tooltip that pops up on hover is fixed right below the target. If the viewport/window is smaller than the space required, it will "bleed" out of view. You still need javascript to place it properly, and javascript that manipulates :after and :before pseudo-elements is not straightforward... Just saying :)
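The positioning fix this comment alludes to is just a bounds computation. Here is a minimal sketch in plain JavaScript; the function and parameter names are hypothetical, not taken from any of the linked demos:

```javascript
// Clamp a tooltip's left edge so it stays inside the viewport.
// All values are pixels. In a real page you'd measure the tooltip with
// getBoundingClientRect() and assign the result to style.left -- which is
// exactly what you can't easily do to a :before/:after pseudo-element.
function clampTooltipLeft(targetCenterX, tooltipWidth, viewportWidth, margin = 8) {
  const ideal = targetCenterX - tooltipWidth / 2; // centred under the target
  const min = margin;
  const max = viewportWidth - tooltipWidth - margin;
  return Math.min(Math.max(ideal, min), max);
}

// clampTooltipLeft(500, 200, 1024) -> 400 (fits, stays centred)
// clampTooltipLeft(50, 200, 1024)  -> 8   (would otherwise bleed off the left edge)
```

In CSS-only tooltips the pseudo-element's position is fixed at authoring time, so this kind of runtime clamping simply isn't available.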
nachtigall 4 days ago 0 replies      
It would be nice if this listed the minimum browser versions required for each example.
jliptzin 4 days ago 0 replies      
Just because you can, doesn't mean you should.
dotdi 4 days ago 0 replies      
Look ma! I replaced a few lines of JS with a crap-ton of CSS.
ivanhoe 4 days ago 0 replies      
IMHO this is a big step back. What do you really gain by sacrificing semantics just to avoid a little js code? CSS was supposed to help us separate content and presentation, and adding a bunch of unneeded tags just to make these hacks work is the very opposite of that.
Fletch137 4 days ago 0 replies      
A fun exercise - while you might not _need_ JS, it's still the best option in a lot of cases.

I'd never even consider using most of these techniques in production - while they're interesting and showcase what CSS can do, they're practically unmaintainable.

MrPatan 4 days ago 0 replies      
The accordion example cheats a bit. The height of the items is fixed to 300px.

I don't know if it'd be possible with flexbox nowadays.

The usual max-height trick is not ideal, as the animation then doesn't have the right duration.

BinaryIdiot 4 days ago 0 replies      
These are cool "CSS tricks" but I would seriously never allow someone to use these in a production web application.

- It's not an accessible solution in most cases (no keyboard navigation)

- The CSS classes are overly complex, to the point where it's not intuitive at all what they're doing (honestly, the code being intuitive is more important to me than any minor performance benefits that may be seen here)

IanCal 4 days ago 1 reply      
So in the first modal example, the document contains both dialog windows at the same time.

Is that really what you want? Does that properly describe the content of your document?

brador 4 days ago 0 replies      
Not everyone is using the latest CSS standard browsers. Keep it simple for production code. It's easier to work with, support, and maintain.
supernintendo 4 days ago 0 replies      
There are almost 300 comments here and no one has pointed out the obvious: these can be used within email templates where JavaScript is unavailable. For that reason alone, I think this is pretty cool.

But sure. Always use the right tool for the job, as they say, and for application logic that tool is not CSS.

wnevets 4 days ago 2 replies      
Is it wrong of me to not like using the css content property? I feel like actual content shouldn't be in css.
vzip 4 days ago 0 replies      
I've seen people create some beautiful things in pure CSS but often the code looks like a horrendous hack.
vladootz 4 days ago 1 reply      
Even if you can do a lot of stuff in CSS, sometimes you shouldn't. Razvan Caliman from Adobe has some good points on this topic: https://youtu.be/WupAsZGHDcY
amelius 4 days ago 0 replies      
> Css is more powerful today and you can do a lot of thing dont need js

Yes, but the problem is that adding or changing functionality implemented in CSS can easily lead you into a brick wall.

At that point you will either have to rethink your approach completely, or move back to JS.

imafish 4 days ago 0 replies      
In my opinion, using Cascading Style Sheets for this stuff is just broken. Animations are not styles. Behavior is not a style.

Readability and usability of animations could be improved by adding animation tags to the HTML Canvas (like WPF Storyboards: http://www.codeproject.com/Articles/364529/Animation-using-S...).

The view behavior-part could be done like WPF triggers. In fact let's just get rid of HTML/CSS and implement WPF for the browser.

vkjv 4 days ago 0 replies      
The accordion example isn't really what I think of as an accordion because it doesn't auto-close other elements.

For that behavior, you can make some minor tweaks swapping out the checkbox elements for radio buttons.

majewsky 4 days ago 0 replies      
I would like a superset of HTML that wraps these hacks up into nice controls, then compiles everything down to plain HTML + CSS.
erlehmann_ 4 days ago 0 replies      
If I do not need JS, why do at least the first four demos not work without it? I stopped trying the linked demos after that.
curiousgal 4 days ago 0 replies      
I've always been told that the best way to learn something is to delve into it. I tried that with CSS, but it was such a hassle that I gave up. Now, years later, all those frustrations of my younger self remain, to the point that CSS is something of a bête noire of mine. Seeing all this coolness makes me feel kinda bad for missing out. /rant
lucaspiller 4 days ago 1 reply      
Is it possible to display a "There are no records" message if a container is empty (i.e. tbody) using CSS only?
drydenwilliams 4 days ago 0 replies      
It's a really nice solution for experiments, but I've found it quite difficult to get people to adopt this CSS approach in some companies (regardless of any cross-browser implications). Everyone needs to be on the same page and of course be up to date with CSS3 animations, which can be overlooked.
josephjrobison 4 days ago 0 replies      
Related to this - is it possible to do something with pure CSS like shown in the black filter buttons area on this - http://www.siegemedia.com/creation/best-infographics ?
20years 4 days ago 0 replies      
While I don't think I would use a lot of this in production for the maintainability alone, it does showcase what you can do with CSS. I see a lot of convoluted Javascript being used on things that can easily and more cleanly be accomplished with a little CSS.
anacleto 4 days ago 0 replies      
You definitely missed the Pure CSS dancing tree: https://codepen.io/yukulele/pen/KCvbi
wehadfun 4 days ago 0 replies      
I'm glad that the general consensus here seems to be that doing this in CSS is not a great idea. I feel the same way about XAML, where programmers try to do things in XML instead of C#.
d33 4 days ago 1 reply      
On a side note, HTML + CSS3 is Turing complete:


nzjrs 4 days ago 1 reply      
Related, does anyone else have a preferred pure css treeview?
reitoei 4 days ago 0 replies      
ITT: people taking this way too seriously
iLoch 4 days ago 1 reply      
I think it would actually be wise to stick to JS style rendering (as is the case with React) as we're only going to see more and more styling being delegated to scripts with the rise of wasm. Only a matter of time before CSS will only really be useful for completely static websites IMO.

Edit: Downvoting me because you disagree isn't really in the spirit of HN.

Please provide a counter argument if you disagree, I'd be interested to hear it.

boubiyeah 4 days ago 0 replies      
Please don't. CSS is unmaintainable. JS can be made very maintainable; end of story.
lasfter 4 days ago 0 replies      
It is pretty easy to cheat at the rocketship game by dragging your mouse to avoid enemies.
hartator 4 days ago 0 replies      
I think that's great, but everything is based on hacks using checkboxes.
lsh 4 days ago 0 replies      
hm. ironically (?) none of the linked examples work without javascript.
nikolay 4 days ago 1 reply      
What's the point of this post?
ClassyJacket 4 days ago 0 replies      
Is that title supposed to be in English?
Apple Open-Sources its Compression Algorithm LZFSE infoq.com
338 points by laktak  2 days ago   202 comments top 24
svckr 2 days ago 6 replies      
With energy efficiency as a primary goal I was expecting way more use of explicit SIMD instructions.

The InfoQ post mentions xcodebuild, but there is also a Makefile. I really appreciate the presence of a no-nonsense Makefile. No autoconf, no pkgconfig, just plain and simple make. Also, because nobody mentioned it: yes, it compiles on Linux out of the box.

espadrine 2 days ago 6 replies      
I feel like LZFSE is too little, too late. It would be great to have a proper comparison, but Zstd is stable, and offers a superior compression ratio with compression and decompression speeds that seem on par with LZFSE.

And Zstd is not proprietary. (This issue is relevant in this regard: https://github.com/lzfse/lzfse/issues/21)


Edit: here is a quick comparison I did on Linux with Project Gutenberg's webster (http://sun.aei.polsl.pl/~sdeor/corpus/webster.bz2).

  $ time ./lzfse-master/build/bin/lzfse -encode -i webster -o webster.lzfse

  real    0m1.885s
  user    0m1.860s
  sys     0m0.024s

  $ time ./zstd-master/programs/zstd webster -8 -f -o webster.zstd
  webster : 25.98% (41458703 => 10772836 bytes, webster.zstd)

  real    0m1.700s
  user    0m1.660s
  sys     0m0.036s

  $ ls -l
  -rw-r--r-- 1 tyl tyl 12209496 Jul 7 16:26 webster.lzfse
  -rw-rw-r-- 1 tyl tyl 10772836 Jul 7 16:31 webster.zstd

  $ time ./lzfse-master/build/bin/lzfse -decode -i webster.lzfse -o /dev/null

  real    0m0.127s
  user    0m0.112s
  sys     0m0.012s

  $ time ./zstd-master/programs/zstd -d webster.zstd -o /dev/null
  webster.zstd : 41458703 bytes

  real    0m0.116s
  user    0m0.112s
  sys     0m0.000s
LZFSE's -h option doesn't show a flag to tweak compression. Zstd's default -1 compression is super-fast, but obviously not optimal. Its -8 is the closest I got to LZFSE's compression speed; its -4 was the closest to LZFSE's compression ratio, with a speed of 0m0.527s real compression, 0m0.101s real decompression.

nodesocket 2 days ago 6 replies      
If you want to see some crazy C code, check out this file from the GitHub repo: https://github.com/lzfse/lzfse/blob/master/src/lzvn_encode_b...
DeepYogurt 2 days ago 0 replies      
Here's the GitHub link: https://github.com/lzfse/lzfse
microcolonel 2 days ago 0 replies      
It's nice to see that we will be able to at least decode these archives. Though I think for new software people are better off using zstd if they're looking for this set of performance characteristics.
nodesocket 2 days ago 0 replies      
Interesting to see a benchmark against .zip and .gz in terms of file reduction and uncompress time.
shmerl 2 days ago 1 reply      
> LZFSE is only present in iOS and OS X, so it can't be used when the compressed payload has to be shared to other platforms (Linux, Windows).

So now it will be cross platform?

0x54MUR41 1 day ago 0 replies      
Sorry, I think this comment is off topic. I don't know why Apple put this separate from their open source projects at https://github.com/apple.
Negative1 2 days ago 1 reply      
I'm not fully up on the latest and greatest in compression technologies but my go to format these days is usually 7zip which I believe is just a container that uses LZMA. For whatever reason *nix people seem to hate it even though I get much better compression with it than tarballs, zlib/zips. Is there a similar container format that will or does use LZFSE? And how much better is it than 7zip/LZMA?
tracker1 2 days ago 1 reply      
Just a side-comment, it would be really nice if we could get the browser vendors to support a newer compression algorithm beyond gzip and deflate. I know there have been a couple others implemented, but nothing that has been implemented by multiple browsers that has stuck. Really need to get MS, Google, Apple and Mozilla to come together on this. Should be patent free.
ausjke 2 days ago 1 reply      
A quick test result (zipping a 1.5 GB file):

  lzfse:
  real    1m44.481s
  user    1m17.956s
  sys     0m2.852s

  lz4:
  real    0m28.136s
  user    0m1.200s
  sys     0m2.240s
lz4 is much faster somehow. The final sizes are very close.

skreuzer 2 days ago 0 replies      
If anyone on FreeBSD wants to try this I just added a port under archivers/lzfse


technion 1 day ago 0 replies      
For anyone wondering, I've tried throwing afl-fuzz at this. It's only been a few hours, but as yet, nothing has turned up.
finchisko 2 days ago 2 replies      
Waiting for the first person to take this implementation and run it through emscripten so we can actually use it in client -> server communication, e.g. sending compressed json payloads to the server.
kazinator 2 days ago 0 replies      
Open source; but is it patent-free?
kirkdouglas 2 days ago 0 replies      
Huh, I hope macOS code looks better.
panic 2 days ago 1 reply      
The article is essentially a link to https://www.infoq.com/news/2016/07/apple-lzfse-lossless-open... with a bunch of ads on top. Maybe someone could update it to point there instead?
pilif 2 days ago 9 replies      
It's 2016. How can you launch a reasonably high-profile open source project with code that looks like this? It checks every box for unreadable code: one-character variable names, one-character parameter names, full of magic numbers...

Yes. This is very performance critical code and I completely see the need to write very optimized code. That's fine. But optimizing code for speed shouldn't imply also optimizing it to use as few characters as possible.

Compression code is code that often runs at boundaries to the external world and thus is a very significant attack surface. To release compression code in a non-safe language is risky enough but then using what amounts to practically write-only code is, IMHO, irresponsible.

tlrobinson 2 days ago 1 reply      
Well, what's the Weissman score?
cooper12 2 days ago 3 replies      
Please don't do this here.

We detached this subthread from https://news.ycombinator.com/item?id=12048265 and marked it off-topic.

DiabloD3 2 days ago 1 reply      
So... basically a clone of LZ4?
benmarten 2 days ago 2 replies      
I just quickly tested it; in terms of highest compression ratio it still does not beat xz, e.g. `tar -cf - FILE | xz -c9e > FILE.tar.xz`. https://blog.benmarten.me/2016/04/01/Compress-Files-With-Hig...
grewil2 2 days ago 4 replies      
Kind of a weak licence. What are you actually allowed to do with this code? Change it? Distribute your changes?


Building a BitTorrent client from scratch in C# cheatdeath.github.io
505 points by nxzero  4 days ago   85 comments top 19
jabstack 4 days ago 1 reply      
Former admin of DC# here (forgive the sourceforge hosting-- it was a long time ago! https://sourceforge.net/projects/dc-sharp/). Great write-up! This was a fascinating read- thank you for putting it together.

One issue to be mindful of- the HttpWebRequest.BeginGetResponse method does not honor timeouts, and you are on your own to timeout the attempt. Consider using HttpClient, if available in Mono / .NET Core. Otherwise, see MSDN for how to do this:

"In the case of asynchronous requests, it is the responsibility of the client application to implement its own time-out mechanism. The following code example shows how to do it." See: https://msdn.microsoft.com/en-us/library/system.net.httpwebr...

I'm not sure if you have access to the ThreadPool class. In a bug that Microsoft's library had, I used the TPL Task construct to resolve this. See the pull request here: https://github.com/Microsoft/ProjectOxford-ClientSDK/pull/83...

mattcopp 4 days ago 2 replies      
Great work! Quite annoying actually. I finished my own implementation in Python at about 10pm last night, this would have been most useful. I'm no C# coder, but it's nicely readable, and this is a much better write up than I'm sure I could do.

For anyone who hasn't tried doing this before, the "official" BitTorrent spec docs, namely BEP-3 (http://bittorrent.org/beps/bep_0003.html), seem little more than a vague blog post turned into a "spec". Somewhat conversely, however, this has led to a wealth of articles describing how to do it.

The three guides I used were:

- A 2 part blog post which has a bit of a Python bent http://www.kristenwidman.com/blog/33/how-to-write-a-bittorre...

- The unofficial specs https://wiki.theory.org/BitTorrentSpecification, and

- An incomplete Python client https://github.com/JosephSalisbury/python-bittorrent

I didn't know of the RFC mentioned in the post, that would have also been really useful.

A lot of BitTorrent stuff for Python is remarkably hard to find in all the noise of Deluge, the original client, and libtorrent wrappers, but none that existed were sophisticated (or at least well documented) enough for my experiments, they have different focuses.

I never went as far as implementing my own BEncoder library; a billion seem to exist in multiple languages, and any BitTorrent Python library you install seems to come with its own copy. (I suspect due to the way BEncoder was bundled in the original client, see: https://pypi.python.org/pypi/bencode)

I also found a Rust implementation which seems not to compile, but is useful as I'm trying to teach myself Rust https://github.com/kenpratt/rusty_torrent I think the work to get it to compile might be minimal.

Const-me 3 days ago 1 reply      
Very good! A few minor performance comments.

1. In your EncodeDictionary, you sort byte arrays by converting them to strings. Correct but suboptimal. See e.g. this: http://stackoverflow.com/q/19695629/126995 but add null checks; the authors of that code forgot about them.
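The idea is language-neutral (the C# version would be a custom `IComparer<byte[]>` doing the same loop), so here is the shape of it as a small JavaScript sketch -- a byte-wise lexicographic comparison with no string round-trip:

```javascript
// Compare two byte arrays lexicographically, byte by byte.
// No string conversion, no allocation -- just the ordering that
// bencoded dictionary keys (raw byte strings) actually require.
// Per the comment above, real code should also handle null inputs.
function compareBytes(a, b) {
  const n = Math.min(a.length, b.length);
  for (let i = 0; i < n; i++) {
    if (a[i] !== b[i]) return a[i] - b[i];
  }
  return a.length - b.length; // on a shared prefix, the shorter array sorts first
}
```

Pass it to a sort (`keys.sort(compareBytes)`) and the keys come out in the order the bencode spec expects, without building intermediate strings.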

2. You don't need a dedicated thread to wake up every 1-10 seconds and do something small. Threads are expensive system resources: each owns a stack, cache misses are guaranteed when they wake up, etc. If your compiler supports async-await, use that instead, with an endless loop and Task.Delay inside it. If not, the System.Timers.Timer class will do.

voltagex_ 4 days ago 2 replies      
BEncoding and variants like REncoding are possibly one of my least favourite things ever. If you deal with the Deluge torrent client API you'll see it everywhere.

That aside, fantastic work on this, I think previously the only Bittorrent library for C# was an abandoned Mono project.

jdudek 4 days ago 0 replies      
Here's a similar write-up on building a BitTorrent client in Haskell: https://blog.chaps.io/2015/10/05/torrent-client-in-haskell-1...
allenkim6 4 days ago 1 reply      
If anyone is interested in a similar writeup for node.js check out my tutorial here: http://allenkim67.github.io/bittorrent/2016/05/04/how-to-mak...
nbarbettini 4 days ago 1 reply      
Awesome write-up! I love C# and this was really well-written. Great work.

Pro tip: Use DateTimeOffset instead of DateTime. It's less frustratingly ambiguous than DateTime, and already has a Unix timestamp helper function if you're on the latest framework: https://msdn.microsoft.com/en-us/library/system.datetimeoffs...

whoisthemachine 4 days ago 1 reply      
Not sure if it's a part of the standard library, but the .Net variant of C# contains a sorted dictionary: https://msdn.microsoft.com/en-us/library/f7fta44c(v=vs.110)....
blt 4 days ago 0 replies      
It's really nice to see a walkthrough of a non-trivial program all on one page like this. The clarity of the code and writing makes me want to port it to a different language because it seems like it would be easy with all the needed info in one place.
th0ma5 4 days ago 0 replies      
Heh, more than a decade ago I created a torrent file format library in .NET ... actually with VB.Net ... anyway, this is GPLv2 licensed: http://writtorrent.cvs.sourceforge.net/viewvc/writtorrent/wr...
Uptrenda 4 days ago 0 replies      
It's really great work OP. I know this would have taken you a long time to do but part of me can't help but wonder if programming is becoming even more like paint by numbers than it already is.
vishnuks 4 days ago 0 replies      
For interested people there is a great write up on Tox protocol here https://toktok.github.io/spec
aashu_dwivedi 4 days ago 5 replies      
I wish there were a similar article in Python.
ambicapter 4 days ago 3 replies      
I don't understand this encoding method. If, say, a dictionary starts with d and ends with e, how do you know with "d3:key5:valuee" whether the value is "value" or "valu"?
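The length prefix is what removes the ambiguity: `5:` promises exactly five bytes, so after consuming them the trailing `e` can only be the dictionary terminator. A tiny decoder sketch (a toy for illustration, not a hardened parser) makes this concrete:

```javascript
// Minimal bencode decoder for strings, integers, lists and dicts.
// Returns [value, nextIndex]. The string length prefix means exactly
// that many bytes are consumed, so a following "e" is unambiguous.
function bdecode(s, i = 0) {
  const c = s[i];
  if (c >= "0" && c <= "9") {                 // string: <len>:<bytes>
    const colon = s.indexOf(":", i);
    const len = parseInt(s.slice(i, colon), 10);
    const start = colon + 1;
    return [s.slice(start, start + len), start + len];
  }
  if (c === "i") {                            // integer: i<digits>e
    const end = s.indexOf("e", i);
    return [parseInt(s.slice(i + 1, end), 10), end + 1];
  }
  if (c === "l" || c === "d") {               // list l...e or dict d...e
    const items = [];
    i += 1;
    while (s[i] !== "e") {
      const [v, next] = bdecode(s, i);
      items.push(v);
      i = next;
    }
    if (c === "l") return [items, i + 1];
    const obj = {};
    for (let k = 0; k < items.length; k += 2) obj[items[k]] = items[k + 1];
    return [obj, i + 1];
  }
  throw new Error("bad bencode at " + i);
}

// bdecode("d3:key5:valuee")[0] -> { key: "value" }, so the value is "value".
```

Because "5:" consumes exactly "value", the final "e" is never mistaken for part of the string.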
hackeradam17 4 days ago 0 replies      
I've been considering trying my hand at creating a bittorrent client. This should prove to be most helpful!
ZanyProgrammer 4 days ago 2 replies      
Somewhat disappointing that it's just a console app. I'd love to be able to do cross-platform C# desktop development. There should be something equivalent to WinForms/WPF on OSes other than Windows.
NKCSS 3 days ago 1 reply      
Fun read, but using automatic properties might lead you down a path that isn't optimal;

Take this for example:

  public byte[] Infohash { get; private set; } = new byte[20];

  public string HexStringInfohash
  {
      get { return String.Join("", this.Infohash.Select(x => x.ToString("x2"))); }
  }

  public string UrlSafeStringInfohash
  {
      get { return Encoding.UTF8.GetString(WebUtility.UrlEncodeToBytes(this.Infohash, 0, 20)); }
  }
You have an automatic property and two 'properties' that actually perform work every time you call the getter (might be smarter to make functions of those, so you know it's not just retrieval of data, but work is done).

If you were to rewrite this a bit, you could make sure the 'work' is done only when needed, and the properties become actual simple data retrieval properties like:

  public class Hashes
  {
      byte[] _infohash;
      string _hexStringInfohash, _urlSafeStringInfohash;

      public byte[] Infohash
      {
          get { return _infohash; }
          private set
          {
              _infohash = value;
              _hexStringInfohash = String.Join("", this.Infohash.Select(x => x.ToString("x2")));
              _urlSafeStringInfohash = Encoding.UTF8.GetString(WebUtility.UrlEncodeToBytes(this.Infohash, 0, 20));
          }
      }

      public string HexStringInfohash { get { return _hexStringInfohash; } }

      public string UrlSafeStringInfohash { get { return _urlSafeStringInfohash; } }

      public Hashes()
      {
          Infohash = new byte[20];
      }
  }
Going further through the article, I spot many more items to improve; but let's not forget your did great work and the code is quite readable.

One thing that might help is building some indexes of how files are fragmented; you have the following code multiple times:

  if ((start < Files[i].Offset && end < Files[i].Offset) ||
      (start > Files[i].Offset + Files[i].Size && end > Files[i].Offset + Files[i].Size))
      continue;
If you built an index of which pieces hit which files, you wouldn't have to enumerate this every time.
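As a sketch of that index idea (shown as JavaScript for brevity; the shape carries straight over to C#, and all names here are illustrative, not taken from the article's code), precompute once which files each piece overlaps so the per-piece hot path only touches the files it needs:

```javascript
// Build pieceIndex[p] = list of file indices that piece p overlaps.
// files: array of { offset, size } in a contiguous torrent layout.
function buildPieceFileIndex(files, pieceLength, totalSize) {
  const pieceCount = Math.ceil(totalSize / pieceLength);
  const index = Array.from({ length: pieceCount }, () => []);
  files.forEach((file, fi) => {
    if (file.size === 0) return;            // empty files touch no piece
    const first = Math.floor(file.offset / pieceLength);
    const last = Math.floor((file.offset + file.size - 1) / pieceLength);
    for (let p = first; p <= last; p++) index[p].push(fi);
  });
  return index;
}

// Two files, 256-byte pieces: piece 0 only touches file 0,
// piece 1 spans the tail of file 0 and all of file 1.
// buildPieceFileIndex([{offset: 0, size: 300}, {offset: 300, size: 100}], 256, 400)
//   -> [[0], [0, 1]]
```

The index is built once per torrent, after which each read or write for a piece walks only its own file list instead of rescanning every file.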

Another general remark is to always retrieve an indexed item from the array once and use that, instead of repeatedly indexing into the array.

So, do:

  var file = Files[i];
  if ((start < file.Offset && end < file.Offset) ||
      (start > file.Offset + file.Size && end > file.Offset + file.Size))
      continue;
The code becomes more readable, and it allows you to change the structure more easily later on, since you don't have 100 references to the same array and only use an intermediate.

mafuy 4 days ago 3 replies      
Is the site down?
cheatdeath 4 days ago 8 replies      
If anyone's having issues, I've mirrored it at https://cheatdeath.github.io/research-bittorrent-doc/

edit: I'm the author, let me know if you have any questions.

Go's march to low-latency GC twitch.tv
399 points by etrevino  3 days ago   286 comments top 13
_ph_ 3 days ago 4 replies      
Another very nice feature of Go is that, since 1.5, the whole runtime, including the GC, is written in Go itself. So every Go developer can look at the implementation. The GC itself is a surprisingly small amount of code.
r1ch 3 days ago 8 replies      
I have to wonder - when you're digging down into this level of complexity in order to discover issues with the language you're using, wouldn't something like C be better? IRC isn't a very hard protocol and you know the language won't be getting in your way if you're using C.
ben_jones 3 days ago 4 replies      
This may be anecdotal, but Twitch is an example of a service that just bloody works. I've been a user for a while and I've yet to notice any service disruptions or issues. They were also one of the largest early adopters of EmberJS (pretty sure it was well before the 1.0 release, when many bugs were still being worked out and the API suffered frequent changes), so hats off to the engineering team for continued awesome work.
anonymousDan 3 days ago 3 replies      
So how does the GC performance of Go compare to something like Java/the Hotspot JVM?
thegeekpirate 3 days ago 0 replies      
Posted earlier without the random hash in the URL https://news.ycombinator.com/item?id=12040349
fauigerzigerk 3 days ago 0 replies      
That's interesting, but it would be even more interesting if the article contained some info about heap sizes, memory utilization and the number of CPU cores.
pepesza 3 days ago 9 replies      
I think that not using Erlang in this particular case was a mistake. Erlang is running some of the largest chats out there, including League of Legends and WhatsApp. They would have avoided all the hassle of GC pauses, since Erlang has per-process GC. And scaling a single OS process to their number of connections was done on Erlang machines years ago.
iamleppert 3 days ago 1 reply      
I'm curious why you didn't just use something like Redis for managing concurrent state and pair that to any of the various web apps that are good at concurrent connections? You could still use Go to serve the web requests/sockets/etc.
lllorddino 3 days ago 0 replies      
Before Go I was web developing in Node.js but wanted to get "closer to the metal." Thought about using C for the back end but then heard about Go and have been in love ever since. My favorite programming language by far.
jeffdavis 3 days ago 1 reply      
There has been a ton of research for GC on the JVM. What are the differences between Go's approach and Java's? Are those differences due to linguistic differences or different goals?
amelius 3 days ago 1 reply      
How do they prove correctness of their GC?
mkevac 3 days ago 0 replies      
> Next up is runtime.scanobject. That function does several things, but the reason it's running during the chat server's mark termination phase in Go 1.5 is to implement finalizers.

How did you know that?

> We emailed back and forth and the Go team was very helpful with suggestions on how to diagnose the performance problems and how to distill them into minimal test cases.

Can you give us the link?

_pmf_ 3 days ago 0 replies      
Purposefully strolling to where the puck was in 2001.
fMRI software bugs could upend years of research theregister.co.uk
400 points by taylorbuley  5 days ago   180 comments top 23
maweki 4 days ago 3 replies      
"and along the way they swipe the fMRI community for their lamentable archiving and data-sharing practices that prevent most of the discipline's body of work being re-analysed."

That's quite funny. My girlfriend recently finished her master's thesis on data sharing for neuroscience data and created a model for universal access to research data across institutions, but came to the conclusion that making researchers share their data is a bigger hurdle than actually implementing the system.

The main reason for the lack of sharing, she postulated, is that studies (which generate funding for the researcher who publishes them) can be done using just the raw data, so researchers who create data want to publish all the studies/papers themselves (because "they" paid for the data acquisition). They are also afraid to publish the underlying data, since withholding it makes it harder for others to falsify their results, which would, in their opinion, lead to funding going away.

Edit: of course there are privacy issues for the test subjects as well.

randcraw 5 days ago 13 replies      
As someone who works in the biomedical imaging business and is also a fan of philosophy, I think this news will matter more to folks in the latter camp. For a couple years now philosophers have insisted that fMRI images prove there is no such thing as free will. Today's revelation should put an end to that whole line of reasoning (and the absurd amount of fatalism that it engendered).

(The back story: Apparently fMRI showed motor signals arising before the cognitive / conscious signals that should have created them, assuming we humans have free will. This has led to the widely adopted belief among philosophers that we humans act before we think, thus we don't and can't act willfully and freely. To wit, science has proven there is no such thing as free will; we're all just automatons.)

Just this week there was an article in The Atlantic on how we all must accept that we're mere robots and we don't really choose our actions (nor can we choose to believe in a god).

Ah well. It seems philosophers STILL haven't learned the importance of applying the scientific method before leaping to a conclusion -- sometimes just to check that someone else didn't just abuse the scientific method.

jballanc 5 days ago 3 replies      
The real takeaway lesson from this research should be the vital importance of Open Data to the modern scientific enterprise:

> "lamentable archiving and data-sharing practices" that prevent most of the discipline's body of work being re-analysed.

Keeping data private before publication is (at this point in time) understandable. Once results are published, however, there is no excuse for not depositing the raw data in an open repository for later re-evaluation.

nonbel 5 days ago 2 replies      
I would send it back and ask for a detailed description of the null hypothesis they are testing, because they are not clear on this point at all:

>"All of the analyses to this point have been based on resting-state fMRI data, where the null hypothesis should be true."

They are not careful to explicitly define this null hypothesis anywhere, but earlier in the paper they describe some issues with the model used:

>"Resting-state data should not contain systematic changes in brain activity, but our previous work (14) showed that the assumed activity paradigm can have a large impact on the degree of false positives. Several different activity paradigms were therefore used, two block based (B1 and B2) and two event related (E1 and E2); see Table 1 for details."

This means that they actually know the null model to be false and have even written papers about some of the major contributors to this:

>"The main reason for the high familywise error rates seems to be that the global AR(1) auto correlation correction in SPM fails to model the spectra of the residuals" http://www.sciencedirect.com/science/article/pii/S1053811912...

If the null hypothesis is false, it is no wonder they detect this. In fact, if the sample size were larger (they used only n=20/40 here) they would get near 100% false positive rates. The test seems to be telling them the truth; it is a trivial truth, but according to their description it is correct nonetheless.

Edit: I was quoting from the actual paper.
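The mechanism nonbel describes, a mis-specified noise model inflating false positives, can be illustrated with a toy Monte Carlo sketch. This is only an illustrative analogy (a plain AR(1) time series and a naive one-sample t-test that wrongly assumes independent noise), not the paper's actual cluster-extent analysis:

```python
import numpy as np

rng = np.random.default_rng(0)

def ar1_series(n, rho, rng):
    # Zero-mean stationary AR(1) series: x[t] = rho*x[t-1] + e[t]
    e = rng.standard_normal(n)
    x = np.empty(n)
    x[0] = e[0] / np.sqrt(1 - rho**2)  # draw the start from the stationary distribution
    for t in range(1, n):
        x[t] = rho * x[t - 1] + e[t]
    return x

def naive_false_positive_rate(rho, n=100, trials=4000):
    # One-sample t-test on the series mean, assuming independent noise.
    crit = 1.984  # two-sided 5% critical value for ~100 degrees of freedom
    rejections = 0
    for _ in range(trials):
        x = ar1_series(n, rho, rng)
        t = x.mean() / (x.std(ddof=1) / np.sqrt(n))
        if abs(t) > crit:
            rejections += 1
    return rejections / trials

print(naive_false_positive_rate(rho=0.0))  # close to the nominal 5%
print(naive_false_positive_rate(rho=0.5))  # well above 5%: the null model is wrong
```

With positive autocorrelation the variance of the sample mean is larger than the independent-noise formula assumes, so the "5%" test rejects a true null far more than 5% of the time, which is the same flavor of problem as the mis-modeled residual spectra quoted above.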


UVDMAS 4 days ago 1 reply      
The paper has been rebutted by other researchers who argue that the original results hold:

"This technical report revisits the analysis of family-wise error rates in statistical parametric mapping - using random field theory - reported in (Eklund et al., 2015). Contrary to the understandable spin that these sorts of analyses attract, a review of their results suggests that they endorse the use of parametric assumptions - and random field theory - in the analysis of functional neuroimaging data. We briefly rehearse the advantages parametric analyses offer over nonparametric alternatives and then unpack the implications of (Eklund et al., 2015) for parametric procedures."


honkhonkpants 5 days ago 2 replies      
Doesn't sound like a straight up bug, but rather unsound statistical methods which can happen with or without software. You get the same problem with finite element analysis software: the operator has to be aware of all the assumptions baked in, and has to ensure that the input conforms to them.
nerdponx 5 days ago 1 reply      
"Further: Our results suggest that the principal cause of the invalid cluster inferences is spatial autocorrelation functions that do not follow the assumed Gaussian shape."

This has nothing to do with bugs and everything to do with bad statistical analysis. It's Google Flu all over again.

williamscales 5 days ago 6 replies      
"Our results suggest that the principal cause of the invalid cluster inferences is spatial autocorrelation functions that do not follow the assumed Gaussian shape."

In other words, researchers cut corners. You should never assume that something is a certain way without rigorously proving it. How did these papers make it past peer review?

pfooti 5 days ago 1 reply      
The dead salmon study seems relevant here in discussion of how fMRI is used, especially the theory-ladenness of observations.


Toenex 5 days ago 0 replies      
And from a couple of days before.


iamleppert 5 days ago 0 replies      
Just goes to show when you're doing science you need to test and validate your experimental methodology, including the tools you use. In computer vision, it's common to need to do some kind of calibration for many algorithms, which can usually reveal some kind of statistical error or problem. I wonder why none of the researchers thought to do some very simple validation of the data?

And I wonder if the software was at one point correct and then this bug was introduced at a later point? Many times it feels like after a company does a formal scientific validation they never do it again despite the fact they have engineers constantly working on the code...

Trombone12 5 days ago 2 replies      
Well, I think the problems with interpreting fMRI scans have been at least vaguely known since that time a dead salmon activated its neurons when asked to judge the emotion of a person from a photo; this was in 2009.


chrramirez 5 days ago 1 reply      
If this turns out to be true, it could be one of the most expensive bugs in computer history.
bjourne 5 days ago 0 replies      
Does it mean studies like these are likely bunk?

 http://kangleelab.com/articles/Paper0002_0009.pdf

 https://med.stanford.edu/news/all-news/2016/05/moms-voice-activates-different-regions-in-children-brains.html

 https://www.theguardian.com/science/2015/apr/21/babies-feel-pain-like-adults-mri-scan-study-suggests

 https://news.brown.edu/articles/2013/06/breastfeeding

And by "bunk" I mean they don't show what they claim.

iLoch 5 days ago 1 reply      
"How X looks like" -- what's with this grammatical mistake? I see it everywhere. Is it a regional thing?
Alex3917 5 days ago 5 replies      
It's already been known for several years that almost all MRI brain scan research is wrong, what exactly is new here?
jamesrom 5 days ago 2 replies      
I thought that fMRI hasn't been taken seriously since they scanned a dead fish and found it was thinking.
seesomesense 5 days ago 0 replies      
fMRI is great for generating headlines and pretty pictures for the popular media.

However most neurologists view the vast majority of fMRI research as junk science.

jey 5 days ago 3 replies      
Title should really read "fMRI" instead of "MRI". The referenced journal article is titled "Cluster failure: Why fMRI inferences for spatial extent have inflated false-positive rates".
SubiculumCode 5 days ago 0 replies      
What a crap article.
jcoffland 5 days ago 0 replies      
This is why I always eat my science with a large helping of humble pie and extra skepticism.
multinglets 5 days ago 0 replies      
So people don't perform complex, goal-focused motor tasks without having a goal ahead of time after all.

Wow, philosophy people.

EDIT: Cry about it all you want. It won't change the fact that in 100 years people will look back and wonder if an entire academic discipline was afflicted with some form of literal mental retardation.

How Trees Calm Us Down newyorker.com
328 points by Vigier  2 days ago   89 comments top 26
eggy 1 day ago 3 replies      
I was born and raised in Brooklyn, NY, but moved out to rural NJ in my thirties. I bought a house on a lake with no motorboats, plenty of black bears and raccoons and lots of trees. I now live in the rice fields of East Java, Indonesia, so I guess you can say I love the outdoors.

I do question the science and the numbers in the study, even though I believe the basic premise to be true; correlation does not automatically imply causation. People suffering more after trees are removed can also mean that urbanization or development brought factories, unhealthier air, rodents, or any number of other negative factors with it.

I do intuitively relax more, and take great solace in my surroundings, and I do believe it is better for people. I would like to see more research on this; there have been a lot of debacles in the past two years in the social sciences and psychology with statistics and peer review. Some of the studies were taken for granted and are now under the microscope for being inconclusive or just wrong.

Yea for trees! And plants, animals and all that entails!

fratlas 2 days ago 2 replies      
I opted for a lower quality apartment this year because all its windows face vast green fields or trees. The effects are undeniable - some of my favourite times this year have been spent just sitting on my balcony admiring the greenery.
elcapitan 1 day ago 2 replies      
Having trees in the city is nice, and Berlin has an ok level (at least where I live). But I recently started to do long weekend day hikes in the area around the city, and the effect is even better. The constant change of natural forms while moving really frees up my mind and floods it with new impressions that I don't have on my work days. I used to have a meditative effect from running, but it has become a bit too much routine in that regard.

There must be something about "natural forms" (as in varying, not changing, non-rectangular) that creates that feeling.

jmarbach 1 day ago 0 replies      
Read more about The Biophilia Effect: https://en.wikipedia.org/wiki/Biophilia_hypothesis
andrewfromx 1 day ago 1 reply      
"Are trees alive?" is the question to ask yourself. They can seem very un-alive to us humans. But when the wind blows and their leaves move you can see it. They are literally WAVING at you. Think back to when you were 8 on the playground and a friendly kid waved at you. Trees just wanna play. But wait you say, that's just the wind. The tree isn't deciding to move like the 8 year old kid decided to wave his/her arms. OR DID the tree purposely make its leaves in a shape to catch the wind and that movement is 100% intentional. When you see it that way you can stare at trees for hours. Also, every single one of those trees is naked. When you are bored/depressed/lonely just stare at trees and giggle.
mkolodny 1 day ago 2 replies      
From what I've seen living in Toronto, NYC and Montreal, streets with nicer houses/apartments tend to have more trees. Those neighborhoods also tend to be quieter.

Take the example of NYC. The Upper East Side and Clinton Hill are two neighborhoods with a relatively large number of trees. Both are among the most expensive and quietest neighborhoods in Manhattan/Brooklyn.

So it could just be that quiet streets and nice houses calm us down. But then again, maybe having more trees is what causes neighborhoods to be nice and quiet. As far as I can tell, it could go either way.

anilgulecha 1 day ago 3 replies      
One hypothesis is that we've also evolved to associate greenery with healthy land and lifestyle. I can see why these signals from millions of years can have the 1% quoted effect.
cantcopy 1 day ago 2 replies      
All of those saying correlation is not causation did not read the article. The study detected an immediate and measurable effect from just walking among trees.
aaron695 1 day ago 1 reply      
An office enriched with plants makes staff happier and boosts productivity by 15 per cent.......


kkylin 1 day ago 0 replies      
Interesting finding, but the article at least leaves one with more questions than answers (I haven't looked up the original research).

My own personal experience tends to confirm the main point put forth. Indeed, when we moved to the US Southwest several years ago, I thought I would miss oceans the most (having always lived on a coast). But no, I really miss seeing green -- my first time back east after moving here, the impact of seeing all those trees was really tremendous (& positive).

Having said that, the effect mentioned in the study can also be due to the amount of attention that a city street demands, and a lot of other factors. (Walking down Broadway in NYC in the middle of day just isn't the same as strolling through West Village on a Sunday morning!) Not to mention what other commenters have pointed out, e.g., correlation != causality. Quite likely the researchers have thought about this; I would be interested in what they found.

et-al 1 day ago 0 replies      
National Geographic had a similar article earlier this year if you'd like to read more:

"This Is Your Brain on Nature"


kevindeasis 2 days ago 0 replies      
Anyone notice that the sound and atmosphere contributes to their well-being?
patrickk 1 day ago 1 reply      
I recently visited an abbey in Killarney, Kerry in Ireland. The monks built an enclosed walkway around a very old yew tree; it was fascinating: http://www.killarney.co/muckross-abbey-killarney.html

Perhaps inspired by a similar line of thinking.

ilaksh 1 hour ago 0 replies      
Plant fruit and nut trees. This is vastly more useful than trees for the sake of calm.

See permaculture, food security, urban farming, distributed production, decentralization.

Trees for some zen or aesthetic cause is an elitist and ignorant perspective. Land use in suburban environments is extremely poor. Food sustainability is very poor.

Trees are a good starting point to start researching. But there are much more serious reasons than a warm fuzzy feeling.

DennisP 1 day ago 0 replies      
"Berman and his colleagues have zeroed in on the low-level visual characteristics that distinguish natural from built environments. To do this, they broke down images into their visual components: the proportion of straight to curved edges, the hue and saturation of the colors, the entropy (a statistical measure of randomness in pixel intensity), and so on."

I wonder whether these principles could be incorporated into architecture and interior design, so we feel like we're in a natural setting even when indoors.

(Even better with trees visible through the windows, of course.)
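As a toy illustration of one of the components quoted above, here is a sketch of the pixel-intensity entropy measure in Python. This is an assumption about what "entropy of pixel intensity" means operationally (Shannon entropy of the intensity histogram); the researchers' actual pipeline, and the edge and color statistics, are surely more sophisticated:

```python
import numpy as np

def intensity_entropy(img, bins=256):
    # Shannon entropy (in bits) of the 8-bit pixel-intensity histogram:
    # a statistical measure of randomness in pixel intensity.
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins; 0*log(0) is defined as 0
    return float(-(p * np.log2(p)).sum())

# A constant image carries no information; uniform noise is near the 8-bit maximum.
flat = np.full((64, 64), 128, dtype=np.uint8)
noise = np.random.default_rng(0).integers(0, 256, size=(64, 64), dtype=np.uint8)
print(intensity_entropy(flat))   # 0 bits
print(intensity_entropy(noise))  # close to 8 bits
```

Natural scenes would presumably fall somewhere between these extremes, which is what would let a statistic like this help separate natural from built environments.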

jcl 1 day ago 0 replies      
"Some of the walks took place in June, whereas others took place in January; most people didn't particularly enjoy trudging through the harsh Michigan winter, but their scores jumped just as much as in the summer trials."

I found this the most interesting point in the article. I would have assumed that any psychological effect of viewing trees would be largely due to their greenness, since that is their dominant visual aspect. But, assuming a largely deciduous environment, naked trees in winter would seem to have the same effect. So the effect must be stimulated by something deeper than just raw color.

nichochar 2 days ago 1 reply      
I can never show this to my mother, she was right all along!
ianai 1 day ago 0 replies      
You mean I don't have some mystical connection to the trees and that it's simply burned into my synapses from eons of evolution?!
JustSomeNobody 1 day ago 0 replies      
Interesting. I love walking among the southern live oaks around where I live. They're just amazing trees. There was one that got hit by lightening a couple weeks back and split down the middle. I actually felt empathy for it. These trees are usually hundreds of years old.
wodenokoto 1 day ago 3 replies      
A tree on a street is incredibly expensive to maintain. The tree itself needs maintenance from a gardener, and the surrounding road and sidewalk need extra maintenance too.
dgudkov 1 day ago 0 replies      
Claude Monet clearly knew something about it.
hyperpallium 1 day ago 1 reply      
> twenty per cent better ... on tests of memory and attention

> five times bigger in people who have been diagnosed with clinical depression

amelius 1 day ago 2 replies      
I can see a market for a VR movie/game which allows the user to walk/drive through (or just sit in) a forest :)
jrcii 1 day ago 0 replies      
I remember an explanation for the calming effect of nature by David Allen of Getting Things Done fame. He claims that the environment is too complex, so your mind "lets go"; he repeats some of that here: https://www.linkedin.com/pulse/20121027044918-402563-david-a... I'm not sure what the basis of that analysis is, but he could be right. I contrast that with the jail from THX 1138, which doesn't seem like it would be relaxing: http://nightflight.com/wp-content/uploads/THX-1138-6.jpg
known 1 day ago 0 replies      
tedks 1 day ago 1 reply      
Correlation is not causation.

Houses on streets with trees are more expensive. People that can afford to live there are healthier for obvious reasons.

Likewise, people that are put into better hospital rooms are probably just patients the hospital is willing to expend more energy on, because they have deeper pockets/are the right ethnicity/are "respectable people" etc..

Is there any non-depressing source of science journalism left in the world?

HTC Vive Headset Nearing 100,000 Sales roadtovr.com
383 points by prostoalex  3 days ago   249 comments top 26
carlosdp 3 days ago 8 replies      
I said this in another thread, but if you haven't tried the Vive (or maybe Oculus with the Touch controllers, which are not generally available yet), then you haven't really tried VR.

Wearing a stereoscopic headset is one thing, but the presence that room-scale brings is just another thing altogether. I had tried Oculus demos at conferences and such, and I kinda started to think maybe VR was over-hyped.

Then I got my Vive on pre-order, put it on and went into the tutorial. There was a part where you can inflate balloons coming out of your perfectly-tracked controllers. I instinctively bonked one with a controller, and I FELT it (thanks to haptic feedback from the controller and diegetic sound). I basically forgot I was wearing a headset for the rest of the tutorial, I was present.

That's when it really clicked for me that this is happening this time. This is something new. I'm 100% convinced VR is the new medium going forward.

cm2187 3 days ago 8 replies      
I must say that having purchased both the Vive and the Oculus, my vote is for the Oculus:

- easier to set up; the Vive requires these little sensors (base stations) placed at odd angles, which need to be powered and make noise when powered

- the oculus is lighter. I find the weight of the headset to be a major factor for not playing long periods, more than motion sickness, particularly given that some games require some weird viewing angles

- the headset has less cables. I have the feeling of wearing an octopus with the vive

- I regularly lose tracking on the vive for short moments (grey screens) even though the sensors are in direct line of sight and less than 3m away

- I have the feeling the vive is a lot blurrier when looking at the sides of the screen. Not something one does when a zombie is trying to kill you, but certainly something one does when navigating menus

The only downsides of the oculus are:

- vertical green line between the two eyes, very visible in a dark environment

- the oculus pauses the game when I take off the headset. Except that I do that when a game is holding me hostage with some interminable, unskippable storytelling instead of letting me play VR (too frequent!)

taspeotis 3 days ago 2 replies      
If you look at the Steam Hardware Survey [1] you will see that HTC Vive is "winning" two-to-one [2].

But it's a sobering reminder of how far consumer-grade VR has to go: users with VR headsets are 0.15% of Steam's installed base.

[1] http://store.steampowered.com/hwsurvey Caveat: Participation is optional.

[2] Caveat: this assumes that all HTC Vive and Oculus Rift users are Steam users. Oculus has its own store.

ngokevin 3 days ago 3 replies      
Given the audience here, I think it's worth a plug that web developers looking to develop VR experiences such as for the Vive should look into A-Frame (https://aframe.io), a WebVR framework.
greenspot 2 days ago 2 replies      
Somewhat related: Nintendo sold 770,000 units of their Virtual Boy in less than one year.

The Virtual Boy was nothing compared to the current VR products, and for a Nintendo product it was a pretty mediocre one at the time; it was maybe their biggest product failure, but they still did 770k units.

Source: https://en.m.wikipedia.org/wiki/Virtual_Boy

fredliu 3 days ago 1 reply      
It's great that Vive has gained such traction (I personally contributed to that number), but I don't think VR has "hit it" yet. I think what Vive offers today is a good indicator of what will become the norm in VR in the next few years (assuming big companies keep pouring money in), once low-cost equipment that provides an experience similar to what the Vive can do today is mass-produced. Low cost device + pretty good (but maybe not top notch) experience is what's needed for VR to really take off. Cardboard is trying to do that right now, but honestly it lacks a lot on the "pretty good experience" part.
bitmapbrother 3 days ago 3 replies      
I don't see why anyone would buy an Oculus over a Vive. The software support for the Vive is just so much better on Steam. Additionally, the majority of PC gamers prefer Steam and it only makes sense they would want all of their VR games and traditional games to be under one service.

I really don't see how Oculus will even be relevant in VR games in about 2-3 years especially once the VR headset is commoditized.

sandworm101 3 days ago 2 replies      
I've held off on the VR sets because, imho, the games are just not there yet. My decision turns on one thing: I want a proper flightsim. Not Mario Kart in the sky. I'm talking Jane's F-15 with a fully interactive cockpit. I want to be able to look over my own shoulder and see a wing. I want to look left and see the runway I'm about to turn onto. The day that happens, then the VR headset will be the least of my purchases. Pedals, a throttle suite ... perhaps a special chair. Until then, inflating balloons and riding roller coasters just won't win me over.

I don't see why this hasn't happened yet. Flightsims, which lock the player into a chair anyway, would seem the perfect vehicle for VR.

alexmingoia 3 days ago 5 replies      
VR is great but making any decent 3D model or environment requires enormous amount of labor. Without better tools I don't see VR being used for anything but games since you'll need to throw huge teams and many hours at making anything decent. Most (all?) VR apps and games I've used look like shit.
kriro 2 days ago 0 replies      
I recently watched the Arizona Cardinals documentary that is on Amazon Prime video (iirc it's called "All or Nothing", but searching for "Cardinals" finds it). It's excellent. The most interesting thing for me was that 5 NFL teams seem to be using VR to learn the playbook/go through scenarios. The Cardinals are one of them, and it was a Rift. Unfortunately that sequence lasted about 5 minutes (don't remember which episode it was, but probably past the midpoint).

Technically the Vive would have been a better choice as the room feeling is exactly what you want but I guess FB are pretty good at customer acquisition/signing big name contracts.

Fej 3 days ago 0 replies      
I've tried the Rift DK2 and the Vive Developer Edition. What I've found is that VR just isn't there yet; it's too much of a hassle for there to be a critical mass of consumers willing to purchase a headset. The HMDs are very much alpha tests - if you're into VR, wait for the Vive 2nd generation to come out that hopefully works out the kinks (unless you have a ton of spending money).

Haven't seen the Rift CV1 but I know that the Vive, for me, still has the "screen door" effect. Until that goes away I'll always be reminded that I'm in a digital world - after all, I can see the pixels.

swiley 3 days ago 2 replies      
I'll buy one when I can have a room full of xterm windows.
joeevans1000 1 day ago 0 replies      
Vive is so much better an experience than Oculus. I've tried the Vive version they were demoing to the public last year, and the current consumer Oculus. Presumably the new Vive is better, but even the version I tried was better than the actual consumer Oculus. That comes as a tremendous relief to me, as I'd never, ever buy into the Facebook VR walled garden. It's too bad Carmack had to join Oculus, but whatever, just glad there's a better alternative.
mxfh 2 days ago 0 replies      
Ever since I accidentally fell asleep in an Alcatraz prison cell[1] while in the Vive's room-scale VR, I started to appreciate my carpet on a whole new level.

[1]realities.io http://store.steampowered.com/app/452710/

Tepix 2 days ago 0 replies      
I had both the Rift (got it because I was an original Kickstarter backer) and the Vive. The roomscale of the Vive together with the tracked controllers adds a lot to the immersion and that was the reason why the Vive is the one I kept.

In general the differences between the systems are overblown, you'll have a good experience with both.

kevindong 3 days ago 0 replies      
Before VR makes any real headway in the regular Joe Consumer market, it must come down in price dramatically.

The actual VR hardware itself isn't entirely unreasonably priced. But, when combined with some beefy backend hardware, it's approaching "unreasonable" territory.

yread 2 days ago 2 replies      
Is 100,000 a big number or a small one? Kinect sold 10 million units in the first three months (that's 133,000 a DAY). But it was half the price and didn't need such a strong PC - then again, there are more PCs than Xboxes.
saluk 3 days ago 2 replies      
Steamspy is highly inaccurate, but I do think Vive sales are generally healthy, and it seems like they are making them about as fast as they can be sold now. It has enough of an install base for quality content to make it onto the top 10 list when they launch, as Pool Nation VR did a few weeks ago. The general public still has not tried VR (barring maybe cardboard) and a very vocal group is against the very idea of it. Still a pretty steep mountain to teleport up.
PureSin 2 days ago 1 reply      
Searched for places to try Vive without spending $800 + PC cost: http://www.digitaltrends.com/virtual-reality/where-to-try-ht...

select MS Stores and GameStop

Joof 2 days ago 0 replies      
I demoed the vive in-store and attempted to pop the balloon I was given (which resulted in a loud clap of the two controllers -- thankfully they designed for bumps like this). Now I keep thinking of ways to play with the tiny area of space (apart from just teleporting). I want to dev for one so bad.
zouhair 2 days ago 3 replies      
The more I look at these numbers in the tech world the more I see the huge divide in humanity. Even if they have sold 10 million units it's still a fraction of a water drop of the sea of humanity, more than frigging 7 billion.

100k is just 0.0014 % of humanity who got a Vive.

Bubbles, bubbles everywhere.
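The percentage above checks out; taking a round 7 billion for the world population, the arithmetic is:

```python
units_sold = 100_000
world_population = 7_000_000_000  # "more than frigging 7 billion", per the comment

share_pct = units_sold / world_population * 100
print(f"{share_pct:.4f}%")  # prints 0.0014%
```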

intrasight 3 days ago 0 replies      
Vive and Oculus are both first-generation, early-adopter versions of VR. As such, arguing about which is "better" is unwarranted or at least premature. They are both capable learning and demoing tools. Neither one, as v1 offerings, will reach the mainstream. In fact, it is possible that neither will become the dominant consumer VR platform.
werid 2 days ago 1 reply      
I don't see a near future where the Vive is something I'll invest in, purely because I don't have the physical space to dedicate to it (not even temporarily).
ycosynot 3 days ago 0 replies      
I'll just leave this here


nabaraz 3 days ago 1 reply      
I own a Firefly VR headset which I bought for $60. It works great for 360 videos on youtube and some games. The best part is it uses my phone and has a small wireless remote for navigating and making selections.

I don't understand the whole hype with Vive/Oculus. I know they are a beast compared to cheap VRs, but the requirement of a gaming PC, tangling wires, etc. puts me off.

serg_chernata 3 days ago 1 reply      
I know that this is completely superficial, but I really wish I could have a Vive in Rift's industrial design. Part of me can't even get past the look of Vive. I know, when I'm using it I'm not seeing the outside. However, knowing what's strapped to my face just seems a little gaudy.
Coursera courses preserved by Archive Team archive.org
376 points by mihaitodor  11 hours ago   57 comments top 19
white-flame 10 hours ago 10 replies      
It's great that there's a central place for this, at least once it's organized sensibly.

The stupidest, most counter-productive aspect of all these MOOC systems is the artificial schedule imposed on the course. While I've been able to take a couple to completion, I've had to let some fall by the wayside due to scheduling. Once that happens, you're disincentivized to catch up because you're behind on the deadlines that affect scoring. When I've gone back to finish courses that I had to leave by the wayside for the moment, sometimes I've lost access to the materials because the course has shifted to its next "semester". There's absolutely no reason for that. While there are a few courses like music or writing where you are collaborating or cross-reviewing other people's work, most of them are standalone lectures, homework, and tests.

Rondom 6 hours ago 0 replies      
If you want to help Archive Team in its efforts to preserve disappearing content and have some bandwidth to spare: run an "Archive Team Warrior" appliance! This way you can help download all this content that is about to disappear!

Or do you have some Digital Ocean promotional credits left, that are about to expire? Spin up a (few) VMs with docker-containers running the warrior on DigitalOcean!


dhawalhs 9 hours ago 0 replies      
For those unaware, Coursera shut down their old platform on Jun 30th [1].

Many of the courses on the old platform are slowly coming back on the new platform. When I built the list [2] of courses on the old platform the course count was 472; now it's around 390. Some of the notables that I was excited to see come back are:

Neural Networks for Machine Learning with Geoffrey Hinton [3]

Computer Architecture from Princeton [4]

Programming Languages from UW by Dan Grossman [5]

Introduction to Natural Language Processing by Dragomir Radev [6]

Many of these courses were last offered a couple of years ago. Hopefully more courses from the list [2] start coming back.

[1] https://www.class-central.com/report/coursera-old-platform-s...

[2] https://www.class-central.com/collection/coursera-old-stack-...

[3] https://www.coursera.org/learn/neural-networks

[4] https://www.coursera.org/learn/comparch

[5] https://www.coursera.org/learn/programming-languages

[6] https://www.coursera.org/learn/natural-language-processing

peatfreak 10 hours ago 2 replies      
You know what would be AMAZING? If there was a Coursera course (or some other MOOC course, books, etc) that explains how archive.org works from the foundational technologies upwards. So, like, you could build your own mini version of archive.org as an exercise. It'd be a fascinating project and could be a great case study in web archiving techniques and information retrieval. Does anything like this exist already?
marai2 9 hours ago 1 reply      
This is incredible, I was just on the Coursera website trying to go to my old courses to continue from where I've left off, but I couldn't get to them. The link to my previously enrolled courses was taking me to the newer version of those courses which haven't started yet. So I thought I'll search HN because I remember reading that someone was archiving these ... and boom! it's the top link on the front page! Yay HackerNews!!! There is some Voodoo-AI going on at HN!
sharmi 9 hours ago 1 reply      
All the names are 'Coursera Curses' instead of 'Coursera Courses'. Someone there must have been really upset with Coursera's approach :) .
nym 9 hours ago 0 replies      
Archive.org accepts all kinds of donations.


Credit Card, PayPal, Bitcoin. Brewster is an amazing Steward of the Internet Archive.

Video Tour: https://vimeo.com/59207751

ipsum2 10 hours ago 0 replies      
Archive.org does amazing work, I would highly recommend donating to them if you can.
RCortex 3 hours ago 1 reply      
Does anyone know if they archived the webpages, assignments, and quizzes too? Or did they just manage to download the lecture videos? I'll try downloading it myself and checking, but I don't have the fastest internet connection.
tgarma1234 7 hours ago 1 reply      
I have taken a couple of Coursera courses on R and Stats. They basically give you a brief outline of some topics you might want to pursue more in depth, plus access to a discussion forum. I haven't found this method of learning/teaching very useful. There seems to me to be a huge opportunity waiting to be developed if someone can make a site like that but with more interactive elements AND where the teaching is based on sound educational principles that demonstrably result in skills mastery. As it is now, Coursera is basically skimming cash off of the internet's insatiable googling for information. For example, someone might google "Learn R" and then fall into the trap of paying $49 for a class that consists of nothing but videos, without having a clue whether the videos really communicate knowledge or even touch on anything meaningful. If it hadn't been for the "Johns Hopkins Data Science Course" branding on the class I signed up for, I wouldn't have fallen for it, I'm sure.
philippnagel 9 hours ago 1 reply      
Is there a way to torrent/mirror the data from archive.org? Storing all of it in a central repository seems counter-intuitive to me.
unreal37 8 hours ago 0 replies      
This is a collection of torrent links of copyrighted material? Is that right?

I guess I'm asking, how is this legal?

avodonosov 6 hours ago 0 replies      
I would pay several dollars to keep some courses I took in their original form (even archived form, no new edits or posts). I guess many other course participants would too.

That might be a source of income for Coursera, probably enough to cover the operational expenses of running the old platform with the old content.

Joof 6 hours ago 0 replies      
Thank god. Geoffrey Hinton's RMSProp for deep neural networks is still cited in papers via a slide from his Coursera course (the only place it was published, AFAIK). It would be a shame to lose that forever.
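For anyone who hasn't seen the slide, the update it describes is easy to sketch. Here is my own paraphrase of the commonly cited rule in plain Python, not Hinton's code; the 0.9 decay and the learning rate are illustrative defaults, not anything from the slide:

```python
import math

def rmsprop_step(param, grad, cache, lr=0.01, decay=0.9, eps=1e-8):
    """One RMSProp update for a single scalar parameter.

    `cache` keeps a running average of squared gradients; the step is the
    raw gradient divided by the root of that average (hence the name),
    so steps stay roughly `lr`-sized regardless of gradient scale.
    """
    cache = decay * cache + (1 - decay) * grad ** 2
    param = param - lr * grad / (math.sqrt(cache) + eps)
    return param, cache

# Illustrative use: minimize f(x) = x^2 starting from x = 5.
x, cache = 5.0, 0.0
for _ in range(500):
    grad = 2 * x  # derivative of x^2
    x, cache = rmsprop_step(x, grad, cache, lr=0.05)
```

After a few hundred steps x settles into a small oscillation around the minimum, which is the expected behavior for a fixed learning rate.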
govindpatel 8 hours ago 0 replies      
How can I use this? Those files are so big. Is there any way I can download only the courses I want?
satyajeet23 10 hours ago 0 replies      
That's really amazing.
chris_wot 3 hours ago 0 replies      
Whilst I hate some aspects of MOOCs, the fact is that I spent about $120 to learn the basics of SAP, whereas if I'd gotten "proper" training on the same subject matter it would have cost me thousands.

I'd love to see a site that specialises in user contributed content along the lines of Wikipedia. It's funny though - take SAP as an example: I'd be just as happy reading a book that explains it all better than what is out there right now! A book that assumes you are into technology but have little skill or knowledge of the business processes that SAP is entwined in, and which gives you an overview of those before a detailed rundown of how SAP implements these processes.

Sadly, no such thing exists, but happily for me I stumbled upon http://www.accountingverse.com/ and http://www.accountingcoach.com/ (no, I'm not affiliated with them in any way) and it turns out they didn't cost anything and I finally "get" double-entry bookkeeping, financial transaction concepts like the general ledger, journal, accrual method and the fundamental accounting equation. Wish I'd known this earlier to be honest - as I say, I lament that there are no books on SAP core modules that go from concepts to the nuts and bolts of how SAP does things :-(

reachtarunhere 9 hours ago 1 reply      
It is great that someone decided to act on it.
enraged_camel 11 hours ago 2 replies      
Maybe I'm missing something here... but where are the actual course titles?
Unpleasant Design 99percentinvisible.org
371 points by sndean  3 days ago   206 comments top 27
scrollaway 3 days ago 6 replies      
One of the most important paragraphs:

> Most of these goals seem noble, but the overall effect is somewhat demoralizing, and follows a potentially dangerous logic with respect to designing for public spaces. When design solutions address the symptoms of a problem (like sleeping outside in public) rather than the cause of the problem (like the myriad societal shortcomings that lead to homelessness), that problem is simply pushed down the street.

Can't agree more. It's so common in France and really depressing. On the linked video (which looks to be shot in Paris, actually), we see dozens of these contraptions right in front of metro ads which cost more than it does to feed and shelter homeless people for several weeks.

wingerlang 3 days ago 4 replies      
> The Camden Bench is virtually impossible to sleep on. It is anti-dealer and anti-litter because it features no slots or crevices in which to stash drugs or into which trash could slip. It is anti-theft because the recesses near the ground allow people to store bags behind their legs and away from would-be criminals. It is anti-skateboard because the edges on the bench fluctuate in height to make grinding difficult. It is anti-graffiti because it has a special coating to repel paint.

It's a bench for sitting on, I don't understand how anyone gave this a second thought.

And are they treating the anti-skate/anti-graffiti features as negatives? Why would the city not want to protect its property and keep wear and tear to a minimum?

fatdog 3 days ago 8 replies      
Why do people sleep on benches instead of shelters? When you look at a typical homeless person nest, it is arranged to protect them from theft and violence. It's either in a well populated place, usually with middle class people who will report violence, or where they can hear people coming, has good visibility, a corner for their back, etc.

When you look at a typical homeless shelter, it has bunks or cots, no security, people with no reason to report violence, and it's full of desperate homeless people with addictions and erratic mental illness. They don't want to be around each other because they know it's dangerous. Why would you go to a shelter unless you were looking for trouble? It's full of predators.

Charitable organizations don't give homeless people privacy for fear of the human things they will do, like drink, use drugs, have sex, etc. The street or a park provides relative privacy and freedom, within walking distance of services.

Given most homeless people are men, the solutions are more likely to involve providing basic physical privacy and security than comfort or sympathy.

panic 3 days ago 3 replies      
This isn't necessarily bad, and there are plenty of innocuous unpleasant designs: doors with locks, fences, car alarms, even those faucets in airport bathrooms you have to keep pushing to keep them from turning off. Sometimes you need to make something less friendly in order to prevent certain unwanted uses.

The important thing to remember is that you aren't fooling anyone. It's quite clear what your unpleasant design is designed to prevent, and the design itself serves as a constant reminder of the problem it was designed to solve. Locks remind you of burglary, airport bathroom faucets remind you that people often leave the water running, and spiky windowsills remind you of homelessness almost as much as actual homeless people do. This may not be the kind of feeling you want to build into your city!

geon 3 days ago 1 reply      
Something I didn't see mentioned in the article is aluminum benches. They act as a giant heatsink and make your butt ice cold if you sit too long.

They have them at the Copenhagen airport.

gabemart 3 days ago 5 replies      
I see two main claims in the article:

* "Unpleasant design" is different because it aims to exert social control in public spaces

* "Unpleasant design" is different because it is intended to create a hostile atmosphere - it increases unpleasantness rather than pleasantness

I don't really agree with either claim.

All design of public spaces is intended to exert some form of social control. Creating a manicured park with flower beds, comfortable benches and beautiful splashing fountains is intended to exert social control by encouraging people to congregate and spend time in the park. In fact, any good designer attempts to exert control by encouraging certain behaviour patterns over other behaviour patterns.

"Unpleasant design" is not intended to increase unpleasantness. Some groups perceive people sleeping on benches or congregating ("loitering") as unpleasant, just as other groups find preventing people from sleeping on benches or congregating to be unpleasant.

I'm not defending the particular examples of design referenced in the article. I'm not saying it's a great idea to make benches you can't lie down on. But I don't agree it makes sense as a distinct category of design. It's just design that tends to favour groups higher in the socio-economic hierarchy at the expense of groups lower in the socio-economic hierarchy. This may be a very bad thing, but I don't think it's a different type of design.

vog 3 days ago 1 reply      
This works also towards animals.

For example, on top of the information displays at outdoor stations, there are spikes, as seen on this picture:


These prevent birds from sitting on these displays, which keeps the displays clean and prevents passengers waiting below that display from unpleasant surprises.

habosa 3 days ago 0 replies      
I really hate all of the effort taken to prevent people from sleeping (or generally relaxing) on the street in San Francisco. I have on many occasions picked up a sandwich and then walked a mile without finding a good place to sit and eat it. All because we're worried some homeless person might be too comfortable.

The problem is not that homeless people want to sleep on benches, the problem is that we've failed so badly at taking care of each other that people have nowhere else to sleep.

Anthony-G 3 days ago 0 replies      
The leaning benches remind me of the mercy seats found in late medieval churches. Old or infirm monks and other clergy could use them for support during long prayers for which standing was obligatory.


timlyo 3 days ago 3 replies      
I hate those high pitched buzzers for teenagers. I'm in my 20s and can still hear them.
swah 3 days ago 3 replies      
A friend had a shoe store downtown, and one of the most unpleasant things about it was that every night someone would piss on the front gate/door. How do you deter that?

He closed the business (not only for that reason, of course) and has a store in the mall now.

pmontra 3 days ago 0 replies      
I always wondered if this is hostile architecture or just plain stupid


See how the lady walks on the right? Bicycles stay on the two rails on the left and often pedestrians too. All the space in the middle is nobody's land. It's rounded pebbles in a concrete pavement.

feintruled 3 days ago 0 replies      
You see this a lot at airports. One particularly egregious example is EasyJet's 'boarding lounge' (which you are funneled into long before a plane even arrives), which has a few leaning benches but is otherwise standing room only. This isn't for any public good - it is meant to push you into paying for 'speedy boarding', which gets you access to a roped off area with proper seats.
skybrian 3 days ago 0 replies      
It seems like this is the real world equivalent of the design that has to go into social software.

It's a universal rule of the public Internet that spam and abuse make everything suck. Therefore it's not enough that your design encourages good usage. It also has to discourage bad usage.

Overtonwindow 3 days ago 0 replies      
Generally I avoid the homeless. I don't give them money and I frown on their activities. However, they are humans, and they deserve to be treated with respect. It warmed my heart when the anarchists in London used cement to cover up the spikes at a Tesco. I'm not advocating destruction of property, or the breaking of any laws, but I do think we all need to change our views on the homeless. They're people. Treat them with respect, even if you disagree with their way of life.
ipsin 3 days ago 1 reply      
I'm convinced that someone(s) in Santa Monica, CA's urban planning division has been practicing unpleasant design ruthlessly.

The first pass replaced normal bus bench seating with low blue toadstools. Once they figured out that those are terrible for people with any kind of knee problem or more serious disability, they added a pair of poles to the sides. Useless, uncomfortable seats, but by God, no homeless people (or regular people) can enjoy their use.

More recently, at a rail station they added a wavy pattern to the sidewalks[1]. I heard second-hand that the texture causes disorientation and nausea, and then I experienced it myself. Now I'm convinced that it's to keep bicyclists off the sidewalk, because it makes anyone sick to look at it, but bicyclists are slightly faster and usually have to look where they're going more carefully than pedestrians.

[1] http://ronslog.typepad.com/ronslog/2016/06/santa-monica-phot... , particularly the sixth photo. Not mine, but the best shots I've seen of it.

Animats 3 days ago 0 replies      
Then there's the Columbia, SC final solution to the homeless problem.[1][2] Their concentration camp plan didn't work out, though.

[1] http://www.columbiasc.net/depts/city-council/docs/old_downlo...[2] http://www.foxnews.com/politics/2013/08/28/south-carolina-ca...

smonff 3 days ago 0 replies      
frik 3 days ago 3 replies      
There is an old classic book on this topic about NYC by a famous researcher. Everyone should read that book. (I don't recall the details, but I have it in my bookshelf at home.)
ctack 3 days ago 1 reply      
I would love to see fewer street lights. Especially in village / suburban settings. But the word is that they reduce crime. Are there any studies to this effect?
ekingr 3 days ago 0 replies      
Reminds me of a scandal with Tesco in London when they installed "anti-homeless" spikes in front of their stores.https://www.theguardian.com/society/2014/jun/12/tesco-spikes...
mentatseb 3 days ago 0 replies      
It's basically applied actor-network theory from sociology and the delegation of control to objects https://en.wikipedia.org/wiki/Actor%E2%80%93network_theory
mobiuscog 2 days ago 0 replies      
The irony: the second edition of the book is digital only, so it can't be used for stoking fires, covering windows, making paper airplanes... etc.

Unpleasant design ?

paul_milovanov 2 days ago 0 replies      
aka, design.
PepeGomez 2 days ago 1 reply      
This comment breaks the HN guidelines badly. We ban accounts that do this, so please don't post anything like it again.

Instead, please (re)-read the site guidelines and follow them. That means posting civilly and substantively, or not at all.



We detached this comment from https://news.ycombinator.com/item?id=12043967 and marked it off-topic.

fiftyacorn 3 days ago 5 replies      
The blue toilet isn't bad design - it's used to stop people injecting drugs in public toilets.
Lets make peer review scientific nature.com
278 points by return0  3 days ago   140 comments top 13
joelg 3 days ago 14 replies      
Shameless plug: I'm working at the MIT Media Lab on the PubPub project (http://www.pubpub.org), a free platform for totally open publishing designed to solve a lot of these problems:

One is peer review, which, as some have already mentioned, needs to be done in an open, ongoing, and interactive forum. Making peer review transparent to both parties (and the public) makes everyone more honest.

Another is the incentive of publication itself as the ultimate goal. Instead, we need to think of documents as evolving, growing bodies of knowledge and compilations of ongoing research. Every step of the scientific process is important, yet most of it is flattened and compressed and lost, like most negative results, which are ditched in search of sexy click-bait headliner results.

Another is the role of publishers as gatekeepers and arbiters of truth. We need a medium in which anyone can curate a journal, and in which submission, review, and acceptance procedures are consistent and transparent.

Another is the nature of the medium itself. It's 2016, and these dead, flat, static PDFs are functionally identical to the paper they replaced! Insert your favorite Bret Victor/Ted Nelson rant here: we need modern, digitally-native documents that are as rich as the information they contain.

Another is reproducibility. We should be able to see the code that transformed the raw dataset, tweak it, and publish our own fork, while automatically keeping the thread of attribution.

The list goes on and on...

danieltillett 3 days ago 3 replies      
This issue of scientists being OK with totally non-scientific processes is all too common. When I was an academic my Department used to make an enormous fuss about the year to year jitter in student teaching evaluations. My colleagues (all scientists) would sit around discussing why they were heroes because their evaluation went up 10% from last year, or what they had to change because it went down 10%. I used to just sit there thinking if this sort of analysis was in a paper they were reviewing they would have ripped the authors to shreds.

Peer review fails on all levels. It does not function as a source of quality control (everything gets published eventually) and, even worse, it rarely improves the quality of the paper being reviewed. I have published dozens of papers over the years and on only one occasion has the review process improved the paper - in most cases the reviewers' demands made the papers worse by forcing me to remove important information or include irrelevant details (citing the reviewers' publications mostly).

apathy 3 days ago 0 replies      
Buried in the middle of this (wonderful) article is the heart of the problem -- academia has foolishly placed its metrics into the hands of editors and publishers, who have corrupted the living hell out of it. Ctrl-F "Cochrane" and witness the exchange between an editor, who benefited from the status quo, and a scholar, who did not and does not.

Academia != scholarship and has not for some time. There is no longer a good reason for traditional for-profit journals to exist. (Before someone says it: SpringerNature likes to pretend that editorial independence is possible, but they'll have no choice save to fire their "news" guys if their board asks for this).

Please recall that the entire point of the World Wide Web was to share physics papers. The arXiv exists because of physicists (who quickly noticed that HTML wasn't a good substitute for LaTeX when writing math-heavy papers). The problem is not technological. It's social. And until the incentives are fixed (the Cochrane Collaboration in Britain has gone a long, long way to address this, and now the Wellcome Trust is going even further), nothing of any real import will change. In the USA, NIH could make a lot of positive changes (and in exchanges with mid- to senior executive level directors, I honestly believe they're trying to do so). But it will take time. Academia moves with glacial speed, when it moves at all.

lordnacho 3 days ago 1 reply      
Perhaps it's a question of incentives. What exactly do you get out of reading a quite complex piece of work and offering your opinion on it?

The closest I can come, as a non-academic, is perhaps reading other people's code.

It's bloody hard. You need to concentrate, and it's not like reading a newspaper article at all. Even small errors are not easy to spot, and it's even hard to know whether the code is structured in the way the comments say.

It somewhat makes sense to do the exercise if I have to be using the codebase. If I'm just commenting and giving a thumbs up/down, it's quite easy to reduce effort, come up with some generic comments, and see how the other reviewers do. Which is a recipe for letting errors through.

chmike 3 days ago 4 replies      
I'm inclined to think that an open review process, using a web system with a space for open discussion/commenting, could be a good system. This wasn't possible before the web.

The actual system is publish or not, but it is now possible to publish everything and use a rating system. This removes the risk of plagiarism and anteriority disputes.

mariusz79 3 days ago 2 replies      
I've got a simpler solution - a scientific paper should not be considered valid until another team replicates the findings. That would quickly get rid of all of the fake results, plus would weed out all of the research that nobody really cares about. Also require that all data is shared, and that until your results are verified or replicated by a team from another university/country, you don't get any more funds.
zyxzevn 3 days ago 0 replies      
The peer-review is mixed with a reductionistic structure. Each scientific area is specialized. The articles are reviewed by people in the specialized fields. Often they find something that is not entirely in their field of knowledge. That is because they are trying to "science" something new.

So they can produce theories that seem like a solution from their field of knowledge, but are not valid in another field. The peers will not see this problem. Another problem is that engineers who try to apply this science find practical problems. But because they are seen as "of lesser knowledge", their practical criticism is often rejected.

The best example I know is the "theory" that magnetic fields can bump into each other, producing energy. In this model the fields are made of flux lines, which can bump into other flux lines. The flux lines will then break and reorganize, producing energy.

Yet, as I write this, you may already think that this is bullshit, because flux-lines are imaginary lines to describe a magnetic field.

Now, with this in mind, look at magnetic reconnection. This is a theory made by scientists specialized in astronomy, not in electromagnetism.

I believe that this problem is in every (specialized) area of science.

jkot 3 days ago 2 replies      
I personally prefer reproducibility over peer review.
Toenex 3 days ago 0 replies      
We could allow everything to be published and then use a continual monitoring approach. After all, the limiting resource these days is not paper but reviewers' time, so let's distribute that. What is wrong with a model like the one HN uses for open review and comment on work? Even a voting system to help identify the most 'important' works.
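For what it's worth, such a voting system is trivial to prototype. Here's a sketch using the widely quoted (unofficial) approximation of HN's gravity formula - the 1.8 exponent and the formula itself are assumptions based on public write-ups, not HN's actual code:

```python
def rank_score(points, age_hours, gravity=1.8):
    """HN-style ranking sketch: higher-voted items float up,
    but older items sink at a rate controlled by `gravity`."""
    return (points - 1) / (age_hours + 2) ** gravity

# A fresh paper with modest votes can outrank an old one with many,
# keeping attention on new work without burying it under classics:
papers = [
    ("old classic", rank_score(points=200, age_hours=240)),
    ("new preprint", rank_score(points=15, age_hours=3)),
]
papers.sort(key=lambda p: p[1], reverse=True)
```

The interesting design question for papers would be what replaces "age" - maybe time since last replication attempt rather than time since publication.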

Edit: Haha great minds chmike

altoz 3 days ago 1 reply      
Hypothesis: peer-reviewed articles aren't rigorous

Prediction: a flawed article will pass peer-review

Testing: https://svpow.com/2013/10/03/john-bohannons-peer-review-stin...

Analysis: Peer review is not rigorous

arca_vorago 3 days ago 0 replies      
I'll preface by saying I'm only a sysadmin with a love of science, but I learned the most about peer review and scientific publishing while working in biotech at a genetics company. At some point I realized I actually had to have some understanding of the science to properly admin it, and began reading lots of papers, and I slowly started to realize just how bad many papers are.

There are a few key issues:

1. Scientists who "collaborate" with other scientists but do a small fraction of actual work get their names on papers as number fodder. Anytime I hear "I've been published in over 1000 journal articles" now I generally become more skeptical.

2. Lack of reproducibility. Not only in the methods and the documentation of the methods, but also the fact that most things just simply aren't even tested by a third party.

3. Publishing. They have far too long locked up information the public deserves to know, which is bad enough, but then they do a bad job of it and allow bad science in. The largest part of the problem imho because they created the situation that semi forces the scientists into questionable paper writing tactics.

Pinatubo 3 days ago 0 replies      
There's a peer review scandal currently underway involving a top economics journal.


aaxe 3 days ago 0 replies      
www.Examine.com does a great job analyzing nutrition/supplements studies.
Release of IPython 5.0 jupyter.org
324 points by trymas  1 day ago   99 comments top 13
quantumtremor 1 day ago 4 replies      
Glad to hear improvements to the shell ipython interface, especially up/down arrows on pasted code.

The most interesting part of this for me is that IPython 6 will not support Python 2.

>Projects such as Matplotlib and SymPy plan to drop support in the next few years, while a few projects like Scikit-Bio are already ahead of us, and should be Python 3 only soon.

This was also very surprising for the standard reasons, especially for a library like matplotlib. Glad to find Python moving forward. But what will companies stuck on Python2 do? Will libraries like numpy, matplotlib, and scipy all maintain a Python2 LTS?

brbsix 1 day ago 0 replies      
This is going to take some getting used to. The visuals (specifically the syntax highlighting and code completion) are very reminiscent of bpython.

On the other hand, I'm really happy to say farewell to readline. I've been stuck with readline v6.2.4.1 for ages just so I can have proper linewrap [0]. Of course this breaks virtualenv creation so you end up having to override the system readline [1]. Needless to say, this is well overdue.

[0]: https://github.com/ipython/ipython/issues/3329/

[1]: https://github.com/pypa/virtualenv/issues/4#issuecomment-966...

wodenokoto 1 day ago 1 reply      
> It is important to note that users will always be able to use a Python 2 kernel with the Jupyter Notebook, even when all of our projects have transitioned to Python 3

The way I understand this is that you will need python 3 to run ipython 6, but once running, you can interact with python 2 and run and inspect python 2 code. I think that is fine. I can't imagine a modern scientific setup that can't readily create python 2 and 3 virtual environments.

ppod 1 day ago 2 replies      
There is a particular behaviour in RStudio that I would really love to be able to replicate in python, but I haven't found the combination of IDE/notebook that will do it yet:

I want to execute through lines with cmd-return, or by highlighting and pressing cmd-return, and then see the change in the variables in a separate pane, like RStudio's environment pane. Bonus points if I can click on table variables in the environment pane and examine them in a separate tab with sorting and searching. Spyder comes closest but the execution part doesn't work as fluidly.

thomasahle 1 day ago 8 replies      
As someone who's never used ipython before, but used the standard python interactive terminal a lot, I'm very impressed!

The best feature I've discovered so far is that when I want to change a function, I can simply 'up arrow' and I get the whole thing! Not a single line of the function, as in the normal python terminal. And if I write a syntax error while typing the function, it tells me immediately!

Does anybody have other examples of great features in ipython over the standard python terminal?

xvilka 1 day ago 1 reply      
Thanks to ipython, radare2 (reverse engineering framework) now has a very useful python shell with autocompletion: https://asciinema.org/a/16ko4jd1e6kdrqkqjxeu248hm
tbarbugli 1 day ago 0 replies      
"The awkward dependencies on pyreadline for Windows and gnureadline for Mac prompted Thomas Kluyver to replace the old machinery with a brand new pure-python readline replacement: prompt_toolkit."

I was waiting for something like this for years!

ferdinandvwyk 1 day ago 1 reply      
Multiline support - my prayers have been answered! Copying/pasting code and modifying multiline commands from your history was easily the most annoying thing about ipython.
ericjang 1 day ago 0 replies      
Kudos to the iPython / Jupyter team for spearheading such an important dev-productivity tool. I use your software all the time. I also appreciate the effort to migrate everybody to Python3.
aliencat 1 day ago 1 reply      
Is there a way to enable vi mode in iPython 5? Since it no longer depends on readline library, putting `set editing-mode vi` in ~/.inputrc file no longer seems to work.
farmerj 14 hours ago 1 reply      
Twitching on the floor... blinky blinky cursor, is there no way to make it go away..? looks in the feature list. Went back to 4.
bobwaycott 1 day ago 0 replies      
Syntax highlighting and line navigation look awesome. The big news is this is the last release to support Python 2.x. Oh, and no more readline. Thank the gods.
thuruv 1 day ago 0 replies      
Damn, the storm already hit python 2.x. Have to move on.
Announcing Rust 1.10 rust-lang.org
341 points by steveklabnik  2 days ago   74 comments top 9
parley 2 days ago 3 replies      
I'm really looking forward to when rustup.rs is stable (at least for Linux)! I'm trying to push Rust at work, and it's one of those polishy things that would help.

Unfortunately, last time I checked development (on issues blocking the initial stable release) seemed to have slowed as of late, but I should be helping out instead of whining - the Rust community is doing great work!

dikaiosune 2 days ago 2 replies      

Tongue-in-cheek, it's very exciting that distros will now have an easier time patching Rust and producing bugs like this one:


rustc 2 days ago 2 replies      
How's the MIR stuff coming along? Is there an ETA on when MIR will land in beta/stable? Will the incremental compilation work start after that?
moosingin3space 2 days ago 1 reply      
Congratulations on the release! `cdylib` targets should prove very helpful for embedding, and compilation from a recent stable version will make it much easier to package.
yarper 1 day ago 1 reply      
What amazes me is that despite all this cool stuff, a Rust process still can't return an exit code [0] without using a workaround.

I actually use Rust in production, and have generally found it very good - compared to the well-discussed difficulties with CPP. I will continue to do so, but I think the rough edges really need covering off properly before it'll be treated as a serious competitor to CPP (and, equally often, Go).

[0]: https://github.com/rust-lang/rfcs/issues/1176
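For anyone hitting this, the usual workaround is calling `std::process::exit` yourself. A minimal sketch of the pattern (my own illustration, not anything proposed in the RFC):

```rust
use std::process;

// Map the program's result to an exit code, since stable Rust
// (as of 1.10) does not let `main` return one directly.
fn exit_code(result: Result<(), String>) -> i32 {
    match result {
        Ok(()) => 0,
        Err(_) => 1,
    }
}

fn main() {
    let result: Result<(), String> = Ok(()); // imagine real work here
    let code = exit_code(result);
    println!("exit code: {}", code);
    if code != 0 {
        // The workaround: bail out explicitly. Note that process::exit
        // skips destructors for anything still alive on the stack, which
        // is part of why it feels like a workaround rather than a fix.
        process::exit(code);
    }
}
```

Keeping the error-to-code mapping in one function at least confines the `process::exit` call to the very end of `main`, where skipped destructors matter least.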

progman 1 day ago 2 replies      
Congratulations! Are there any plans to support bootstrap from source? Currently any install requires a binary Rust compiler.

I like the way Nim handles the bootstrap. It's always easy, and it also eases ports to other platforms significantly since everything is coded in C.

outworlder 2 days ago 4 replies      
> Rust is implemented in Rust, which means that to build a copy of Rust, you need a copy of Rust. This is commonly referred to as bootstrapping. Historically, we would do this by snapshotting a specific version of the compiler, and always bootstrapping from that

So, does that mean that Rust (has/will have) issues regarding binary blobs in pure free software Linux distributions?

dfdfghhhfgh 2 days ago 0 replies      
The marketing of the Rust programming language has been excellent. Hats off for having pulled off something like this.
pjmlp 1 day ago 0 replies      
Congratulations on the work.

Very nice to see the push for relying only on stable releases while building Rust.

ThinkPwn: System Management Mode arbitrary code execution github.com
337 points by edmorley  4 days ago   144 comments top 18
fencepost 4 days ago 3 replies      
Don't just plaster Lenovo with this - they're getting the splatter because Cr4sh has been researching their firmware, but this is a multi-vendor issue.

A few important notes from the article and the releaser's blog post:

* This is not a Lenovo problem so much as a problem for multiple vendors who used BIOS based on Intel's reference information. The original problem was with source code provided by Intel. The same problem is confirmed to exist in at least one HP system.

* This was apparently fixed back in 2014, but there doesn't seem to be an indication that it was recognized as a security flaw then or at least it wasn't noted as a security fix. 2014 isn't that long ago in terms of propagating BIOS updates.

* Cr4sh apparently decided to just release, "I decided to do the full disclosure because the main goal of my UEFI series articles is to share the knowledge, not to make vendors and their users happy."

* His assessment is "Its very unlikely that this vulnerability will be exploited in the wild, for regular customers there are much more chances to be killed with the lightning strike than meet any System Management Mode exploit or malware."

jsjohns2 4 days ago 2 replies      
Quite the hilarious "security advisory" [0] that Lenovo put out. They manage to take zero responsibility, shift blame to the researcher/IBV/Intel, and admit that they ship SMM code of both unknown author and purpose.

[0] https://support.lenovo.com/us/en/solutions/LEN-8324

cdubzzz 4 days ago 3 replies      
Interesting bit from Lenovo's security advisory on the matter[0]:

> Shortly after the researcher stated over social media that he would disclose a BIOS-level vulnerability in Lenovo products, Lenovo PSIRT made several unsuccessful attempts to collaborate with the researcher in advance of his publication of this information.

[0] https://support.lenovo.com/us/en/solutions/LEN-8324

bsilvereagle 4 days ago 3 replies      
> Vulnerable code of SystemSmmRuntimeRt UEFI driver was copy-pasted by Lenovo from Intel reference code for 8-series chipsets.

> Alex James found vulnerable code on motherboards from GIGABYTE (Z68-UD3H, Z77X-UD5H, Z87MX-D3H, Z97-D3H and many others):

This is beyond the scope of just Lenovo machines.

excelangue 4 days ago 4 replies      
Starting with the X230 series of ThinkPads, Lenovo has used flash write protection to prevent "unauthorized" BIOS modifications. Owners of X220 laptops and below are able to reflash the BIOS to remove Lenovo's whitelist of WLAN/WWAN cards; the X230 models are currently stuck with Wi-Fi N and Gobi 3000 3G-only cards due to Lenovo's whitelist. Would this exploit allow ThinkPad owners to reflash their BIOS chip without desoldering and flashing with external hardware?
quotemstr 4 days ago 3 replies      
I wish we could even purchase machines without SMM, Intel Management Engine, and similar features. On any machine I own, I want the CPU and the software running in ring zero to be the last word on what happens. I don't think it's unreasonable to ask for a system that meets this requirement.
acd 4 days ago 1 reply      
Joanna Rutkowska has written about Intel-based products that are possibly vulnerable via the Intel Management Engine code. Even if you run an open source operating system such as Linux or FreeBSD, there is still proprietary code in the management engine that you cannot inspect or verify is secure.

Here is the paper http://blog.invisiblethings.org/papers/2015/x86_harmful.pdf

UEFI is another gigantic hiding place for malware.

I think one could possibly run simpler platforms such as the Raspberry Pi or ODROID, which may not have embedded management engines. That should be more secure than x86 platforms.

flurpitude 4 days ago 0 replies      
Also found in an HP DV7 4087CL from 2010: https://twitter.com/al3xtjames/status/749063556486791168/pho...
gvb 4 days ago 2 replies      
This should be good news for people looking to get rid of Computrace, effectively a rootkit, from surplus ThinkPads.


Yes, I bought a surplus Thinkpad (T61) and found it had Computrace activated on it. Grrrr.

Yes, I could call the Absolute(R) Software number and they should disable it for me. I have not been willing to sit on hold and jump through their hoops to date. Since I run Linux on the laptop, it is fairly low risk for me, but Absolute(R) Software could inadvertently or intentionally "brick" my laptop. Grrrr.

jkldotio 4 days ago 3 replies      
What are we up to now? Three preloaded spyware scandals, possible remote execution via the Intel stack and now this vulnerability. That's just what we know about, who knows what else exists. I don't think I can buy another one, which is sad as I think it was a timeless and great design.
jonwachob91 4 days ago 2 replies      
T450S user here. What exactly does this mean for me? I get it's a security issue, but that's about all I understood...
mdip 4 days ago 2 replies      
I've always liked the build quality of the ThinkPad series though it's been a few years since I've put my hands on one. That said, it looks like they need to spend similar attention on the software. On the flip side, though, I wonder if this solves the issue with the Yoga laptops I read about recently where the user could not disable SecureBoot in order to install the operating system of their choice.

It's a little sad to be excited about a vulnerability because it might provide an opportunity for consumers to open up the products they rightfully purchased.

shmel 4 days ago 0 replies      
Cool. Now Lenovo admits they have no idea whose code this is and what it is supposed to do. It sounds like the shitty excuse of a junior programmer: "Hey man, not my fault! I just copy-pasted that snippet from StackOverflow!"
kriro 3 days ago 0 replies      
So this is traceable to the Intel reference implementation. I am going to assume incompetence, but suppose I assumed malice instead (some government agent wrote the reference spec knowing full well it's exploitable, with or without Intel's knowledge) - how would one go about soft-proving that?

Compare machines that are vulnerable in the wild and same spec machines from important people at Intel and/or suspected government agency (assuming they'd simply use a non-vulnerable version instead of some completely different hardware)?

OneTwoFree 4 days ago 1 reply      
I thought I had intermediate-level C knowledge, but I have no idea what is happening in the vulnerable line:

 *(v3 + 0x8)(*(VOID **)v3, &dword_AD002290, CommunicationBuffer + 0x18);
As I understand it, this is (was) example code from Intel. Example code should be easy to understand and well documented.
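For what it's worth, the shape of that line is an indirect call: the value at offset 0x8 from `v3` is treated as a function pointer and called with three arguments, the first loaded through `v3` itself. The danger in the ThinkPwn write-up is that the structure holding that pointer is reachable from the attacker-controllable communication buffer. Here is a rough analogue of the call pattern (my own sketch, not the actual SMM code; all names are invented):

```rust
// A structure whose second field is a function pointer, mirroring the
// decompiled C where the callee lives at byte offset 0x8 from v3.
struct Dispatch {
    _context: usize,             // roughly *(VOID **)v3
    handler: fn(usize) -> usize, // roughly the pointer at v3 + 0x8
}

// The call site: fetch the pointer out of the struct and jump through it.
// If an attacker controls where the Dispatch lives, they control `handler`.
fn invoke(d: &Dispatch, arg: usize) -> usize {
    (d.handler)(arg)
}

fn main() {
    let d = Dispatch { _context: 0, handler: |x| x + 1 };
    println!("{}", invoke(&d, 41)); // prints 42
}
```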

hoodoof 4 days ago 0 replies      
OK so what can be done to prevent this being an issue on my machine?
rikkus 4 days ago 3 replies      
Really hope Lenovo respond soon. Feel like I should leave my ThinkPads hibernated for now.
ChoHag 4 days ago 2 replies      
Please don't post unsubstantive comments.

We detached this subthread from https://news.ycombinator.com/item?id=12037726 and marked it off-topic.

NASA Data Shows Toxic Air Threat Choking Indian Subcontinent bloomberg.com
297 points by rhayabusa  2 days ago   205 comments top 20
chdir 2 days ago 3 replies      
A related story : There's a mega mall in Chandigarh, India that wasn't given approval for a power connection for quite some time (politics / corruption).

Real estate is super expensive & the owners couldn't afford to sit idle and play games with the government so they decided to run the mall from morning till night on diesel generators. That's ~ $4500 of diesel each day, probably 6-8k litres. This is an example of a pollution source that's completely avoidable. I'm not sure for how many years this continued on.

What's worse is that there was no widespread public outrage. Why didn't people boycott the mall that's polluting their city and at the same time put pressure on the government to set things right?

For a perspective, the city I'm talking about is a modern affluent city, close to Delhi (~ 160 miles). Home to a lot of politicians & celebrities, one of the most well planned cities in the world [1] & one of the cleanest in India [2]

[1] http://www.indiatimes.com/culture/travel/9-most-well-planned...

[2] https://en.wikipedia.org/wiki/Cleanest_cities_in_India

[3] http://www.tribuneindia.com/2013/20130407/cth1.htm#5

wrong_variable 2 days ago 5 replies      
The irony!

When I went to India, I thought to myself that the living standard in the villages was HIGHER than in the cities.

The air in the villages was clean - the weather was much cooler, not being trapped in the congested cities.

The village people thought I was crazy to think that their village was a paradise, and everyone wanted to move to the city.

I think it's a serious lack of education that India/Bangladesh will not overcome easily. A large number of people need to die of cancer for it to be taken seriously, unfortunately.

We are talking about a country where chain-smoking is really common!

And no, I am a person from the subcontinent, so I am not being racist - just pointing out some of the terrible facts about why I am terrified of going back.

kamaal 2 days ago 2 replies      
As an Indian, when I come back from overseas this is the very first problem I notice, especially when I return from a long trip. The problem is far deeper than one can imagine.

A city like Bangalore has an almost impossible number of scooters on its roads. Almost every house has at least one, and most homes easily have one for every two people staying in the home. And it's not like the US, where there are laws on building homes. People take a 1200 sqft plot and build 4 floors, with 4 families, so at least 6 two-wheelers for a small plot of land. This is partly because auto rickshaws and buses have gotten expensive. Public transport of any kind is expensive, unpredictable and not worth it when you look at the overall comfort and economics of having your own scooter.

On the other hand, trees are being cut at an alarming rate. The area where I stay, around 15 years back, had a drive where students from an agricultural campus planted a tree per home. In the time since, only a countable few trees remain. People cut trees for various reasons, some of which are downright stupid. Reasons range from a big tree attracting birds who will in turn crap on their cars/scooters, or chirping birds disturbing their morning sleep, to the impossibility of cutting the tree once it grows too high.

The garbage landfills are full, so the government often doesn't collect garbage on time. Sometimes even a whole month passes before the garbage is collected. So you have massive piles of trash (medical and other toxic waste included) piling up right at the corner of the lane. This causes mosquitoes to breed, and then diseases like dengue spread. The most obvious solution people in the area resort to is burning the trash, causing all these toxic fumes to mix into the air and reach almost everyone's lungs in the area.

On top of this comes industrial pollution. Rivers and lakes are being polluted, encroached and destroyed almost everywhere. Bangalore's lakes have almost disappeared. Many remaining are now cesspools. There was a lake which caught fire recently.

India urgently needs something along the lines of a Clean Air Act and a Clean Water Act. Feasibility of implementation remains a problem, though.

goombastic 2 days ago 0 replies      
It's good that the government isn't allowing Google Street View access to the country's streets, otherwise the world would see and understand the unmitigated hell that the country has become.
HenryTheHorse 2 days ago 4 replies      
That satellite image of the haze is scary. Scarier still is how the deterioration of air quality in the cities is affecting health. (The WHO estimates ~15 million bronchial asthma patients in India.)

You haven't experienced air pollution if you haven't breathed in the evening air in Mumbai or Bangalore during the rush hour.

j0e1 2 days ago 3 replies      
As someone who grew up in New Delhi, I think the government has taken quite a few steps, like passing laws that require all public transport to run only on CNG, mandating that industries be positioned outside the city, and more recently having policies to curb the number of cars on the road. But the problem of pollution never seemed to improve. Optimistically, I've told myself that if those measures weren't taken we would have been in a worse position much earlier, though that does little good in the long run. Unless action is taken to effect negative growth in pollution, I think anything you do won't quite cut it.

To be fair though, as in any Indian metropolis, there is immense pressure on the infrastructure that takes solving such problems to the next level. The sheer daily growth in population (primarily due to migrants from other parts of India) would put any government in a quandary.

_navaneethan 2 days ago 3 replies      
I am from India, specifically Bangalore. I commute 30 km daily, round trip, and I am having issues from breathing the polluted air. In fact, I was looking for good pollution filter masks, but I have been unable to find a good one, since the reviews are unsatisfactory.

I am trying to find a way out of this issue. At the same time, I don't want to leave my lovable tech job.

Has anybody experienced the same issues? Any advice?

astannard 2 days ago 2 replies      
I remember traveling through Delhi and washing my face afterwards at the hotel. The amount of black sooty dirt that came off was startling!
comatose_kid 2 days ago 1 reply      
Traveled to India ~2+ years ago. Landed in Delhi - couldn't make out the city at all from the air due to pollution. We drove to Agra (3 hr drive) and there was smog pretty much the entire way there.

This seems like a promising opportunity for startups to tackle...I wonder if there are any out there, would love to hear more.

dlandis 2 days ago 1 reply      
How much worse is the air pollution in New Delhi compared to a relatively polluted US city such as Los Angeles?
abc_lisper 2 days ago 0 replies      
Not surprised. The last time I was in Delhi, 4 years ago, I couldn't breathe outside the home.
bhewes 2 days ago 0 replies      
This reminds me of London's Great Smog of 1952 (Dec). It finally pushed the country and city to deal with the air pollution. Hopefully this does the same for the millions living each day in this current mess.
Nano2rad 2 days ago 2 replies      
The NASA data shows particulate matter. The real pollutants are CO2 and other invisible gases. Why concentrate on the visible particulate matter? The real concern is 400 ppm CO2 in air.
ismyrnow 2 days ago 1 reply      
I'm surprised the article doesn't mention density of cattle farming. I know it's a complicated issue, but my understanding is that industrial animal production affects climate more than transportation does.



99_00 2 days ago 1 reply      
Do auto-rickshaws still have dirty two-stroke engines?
microcolonel 2 days ago 0 replies      
They should spend a few billion dollars greening the Thar Desert. They need it.
mediumdeviation 2 days ago 1 reply      
So I'm not really sure what happened, but I couldn't read the article because the page ate up more than 1.5GB of RAM on Firefox and caused it to peg an entire core at 100% utilization: http://i.imgur.com/j3q686E.png

And if you scroll down a bit more you'll find a web worker spawned by the same page consuming another 120MB of RAM. This is a lot of stuff happening on what is supposed to be just a news article.

lipun4u 2 days ago 0 replies      
NASA doesn't have anything else to do? Why do they waste money telling people something no one listens to?
hammock 2 days ago 5 replies      
Dare we suggest that air pollution (the old-fashioned kind) is more important than global warming?
exabrial 2 days ago 6 replies      
We can pass all the air regulations we want in America, but it won't matter one bit until China and India get their act together. China has 100% 24/7/365 smog cover over its population centers. So sick of people pushing overbearing regulation in the USA without even holding these countries accountable!
Corrode: C to Rust translator written in Haskell github.com
367 points by adamnemecek  1 day ago   119 comments top 18
tinco 1 day ago 3 replies      
Absolutely blown away by the detail of the documentation. The main logic of this project is in a literate haskell file you can easily read on GitHub.


I wonder how readable it is to someone who isn't experienced in Haskell. To me it reads like a breeze, but I have a project using the exact same parsing library, so maybe that puts me at an advantage.

The language-c library he uses is an excellent one, it's a fully spec compliant C parser that's well maintained. I've based my C compiler on it and I haven't encountered any C code it couldn't parse yet. One time I upgraded to a new OSX and Apple added some stupid thing to their headers that broke the parser and a fix was merged within days. This means it takes away the entire headache of parsing C leaving just the actual compiling.

pierrec 1 day ago 2 replies      
I was curious about how this worked so I looked into the source a little (even though Haskell isn't exactly my cup of tea), and WOW... this is just amazing. The most important part of the source is highly educational literate Haskell:


kerkeslager 1 day ago 2 replies      
This is the coolest thing I've seen on HN in a long time, and useful to boot. Hopefully this will be a very big help to people moving over to Rust from C for its safety and type-checking. In general I don't support rewrites because, as many experienced programmers have pointed out, rewrites often make many of the same mistakes as the program they're rewriting. But transpilation allows us to keep the code with all the fixes to those mistakes.

In theory I'm a big supporter of Rust. I strongly feel that we should be using stronger-typed languages than C for developing security-critical applications, and from the outside, it looks like Rust solves a lot of my criticisms of C without giving up any benefits of C. A transition over to Rust could be a big win for security and reliability of many systems.

However, I'm reluctant to devote time to learning Rust primarily because it's not supported by GCC (or any other GPL compiler that I know of). I hope the next cool thing that that the Rust community does is to continue the work done by Philip Herron[1] on a Rust front-end for GCC. I know the classic response to this is, "Do it yourself!" but there are too many other areas of Open Source that are higher priorities for me, so sadly this will have to be done by someone else if it happens at all.

[1] https://github.com/redbrain/gccrs

nategri 1 day ago 1 reply      
This is probably the most "Hacker News" thing I've ever seen.
wink 1 day ago 3 replies      
I do get that Haskell is a useful tool for these kinds of code transformations (at least I have seen quite a few of them), but I am always a bit surprised that people would start such a project in a language that has, per se, nothing to do with either the source or the target language. I know, I know, it doesn't always have to be this way, but I am very much of the opinion that every time good tools in an ecosystem are written in that ecosystem's language, you get a lot more (and more meaningful) contributions.

Best examples: rake (and basically everything in the Ruby ecosystem - the number of people touching Ruby's C code is very small compared to all the 'standard tools'), or cargo.

DaGardner 1 day ago 2 replies      
as many "transpilers" / compilers, whatever you might name them, it lacks example input output.

I want to see how my new rust code base looks light, does it compile with some heuritics, or just 1:1 C to rust primitives?
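The README does note that the output is "only as safe as the input was," which suggests the answer is closer to 1:1 than to idiomatic rewriting. As a hand-made illustration only (not actual Corrode output, and a real translator would typically use C-compatible types like `libc::c_int` where this sketch uses `i32`):

```rust
// C input:
//
//     int clamp(int x, int lo, int hi) {
//         if (x < lo) return lo;
//         if (x > hi) return hi;
//         return x;
//     }

// A literal translation preserves the C control flow and early returns
// rather than restructuring into idiomatic Rust (e.g. chained min/max).
fn clamp(x: i32, lo: i32, hi: i32) -> i32 {
    if x < lo {
        return lo;
    }
    if x > hi {
        return hi;
    }
    x
}

fn main() {
    println!("{}", clamp(15, 0, 10)); // prints 10
}
```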

loeg 1 day ago 2 replies      
Has anyone tried it on some real-world codebases? How about kernel code? It would be very exciting to improve real-world crash safety and security by e.g. converting popular drivers quickly and automatically, followed by a manual pass applying safer Rust semantics.
jswny 1 day ago 1 reply      
Missed the chance to name it "Crust."
DanWaterworth 1 day ago 1 reply      
Time to start sending PRs [1], :P

[1] https://github.com/search?utf8=%E2%9C%93&q=language%3Ac

steveklabnik 1 day ago 0 replies      
The author wanted me to drop a link to his blog post on contributing: http://jamey.thesharps.us/2016/07/translating-c-to-rust-and-...
danidiaz 1 day ago 0 replies      
That's an impressive use of literate programming. By coincidence, I had just read this post by John D. Cook "Literate programming: presenting code in human order" http://www.johndcook.com/blog/2016/07/06/literate-programmin...
eutectic 1 day ago 0 replies      
I wonder if there would be any point in using this to fuzz the Rust compiler.

On the one hand, you could use CSmith with a C compiler as a convenient oracle, but on the other you would only be covering a very limited subset of e.g. the type system.

haimez 1 day ago 0 replies      
Here we have it gentlemen: HN bingo.
felixangell1024 1 day ago 6 replies      
The name "Corrode" doesn't seem very positive given the purpose of this program...
serge2k 1 day ago 1 reply      
> Partial automation for migrating legacy code that was implemented in C. (This tool does not fully automate the job because its output is only as safe as the input was; you should clean up the output afterward to use Rust features and idioms where appropriate.)

This was my immediate concern. Is there any chance this tool can produce anything close to clean, safe, idiomatic Rust code?

Mathnerd314 1 day ago 0 replies      
The code has a lot of special cases. Could these be eliminated using machine translation techniques?
cannonpr 1 day ago 4 replies      
I understand the word play, but perhaps it's a misunderstanding of Rust's name origin? https://www.reddit.com/r/rust/comments/27jvdt/internet_archa...

It's named after a fungus: https://en.wikipedia.org/wiki/Rust_(fungus)
wspeirs 1 day ago 4 replies      
It's too bad this is written in Haskell. I don't have anything against Haskell, it is just not as popular a language as others.[1] Any ANTLR target language would have been a solid choice.[2] This way more of the community could contribute. This is an invaluable tool if we're truly going to see a shift from C (or C++) to Rust.

[1] http://pypl.github.io/PYPL.html

[2] http://www.antlr.org/download.html

Tech job listings are down 40% on several job boards medium.com
283 points by uptown  1 day ago   244 comments top 37
jpeg_hero 1 day ago 4 replies      
I am feeling the chill in the circle of companies I know.

Other thing that should be mentioned: a lot of initiatives at a lot of companies over the last two years just didn't work.

Feels a bit like we are at the tail end of a pretty big macro cycle of tech companies green-lighting big initiatives with an optimistic mindset, and now, a few years later, the bill is coming due and the bets just didn't pay off.

Maybe a big example would be Twitter: a few years ago they were straight up hoarding engineers; fast forward to now, what did all this amazing engineering talent get them? Maybe nice code but the engineering they've done hasn't been able to increase users.

The reason engineers get paid well is their extreme leverage: just a few engineers can pull off amazing results. But you eventually need the results, and if the results are not there, broadly, then there will be a retrenchment.

bigtunacan 1 day ago 3 replies      
Disclaimer - This is all gut feeling and anecdotal evidence so feel free to ignore.

I suspect that jobs being posted to smaller niche job boards (such as Authentic Jobs, which I hadn't heard of until this post) is down overall, but this doesn't reflect the state of job availability.

I believe larger job sites are having enough of a network effect that these boards are becoming less relevant. Most tech job seekers I talk to are going one of four routes.

1) LinkedIn (Facebook for job seekers)

2) Indeed (since it aggregates)

3) Direct application (when you know a specific company you want to work for)

4) Recruiters (Let them do the work for you...)

On the anecdotal front (within the past year), I have been contacted twice by recruiters who took me out and bought me lunch to try to woo me to another job, and once contacted directly by the CEO of a company who did the same thing. I want to make this point clear: I'm no one special, just a regular software dev with 15+ years of working experience.

During one of these lunches the recruiter said to me, "It is so difficult to hire qualified developers that I would say there is a NEGATIVE unemployment rate currently."

I have spoken with multiple company owners, CEOs, and others in hiring positions, and the general consensus is they can't find qualified help.

This is just my experience.

rev_bird 1 day ago 5 replies      
I don't think I buy the premise. "Tech job listings on one website are down 40%" seems more accurate, and is much less scary. It definitely helps that the author mentions that they track their listings compared to their competitors, but I'm baffled as to why there isn't a graph that covers more than a six-month span when the thesis spans years. If this company has job listing data for multiple listing websites that goes back to at least 2014, that'd be interesting data to look at, but it's not here.

Even then, I'm not sure "tech hiring is down 40%" would be a reasonable conclusion to draw -- it's like saying, "the newspaper only had 15 job listings in it, the American economy must be in the toilet."

whamlastxmas 1 day ago 17 replies      
I'm a newer web developer with a few years of full-time experience (and over a decade of hobby experience), but definitely not a "senior" developer. Most job openings I see are for senior developers. I don't really know if this is historically normal, because I wasn't doing much developer job hunting a few years ago.

For the jobs that I have applied to, I would say I am pretty well qualified. I have put a lot of time into my resume to make it clean and legible. I usually spend over an hour writing a cover letter, and I am pretty confident they are interesting, well written, and have a friendly/personal tone without coming across as awkward or over eager.

Despite all of this effort and being pretty well qualified, I never hear back. Not even a rejection letter. Just silence. I have applied to pretty entry-level positions too for which I am overqualified, and also never hear back. Sample size is only about 10 applications, but they were very focused, well matched, high effort applications.

If there is any shortage of web developers out there, companies sure aren't acting like it. It's not even like I would ask very much, I only want maybe $75k since I am currently way underpaid (about ~$50k in a big city).

My guess is that this is just a side effect of unemployment being pretty high (if you look at realistic measures, not the biased official ones). Senior developers are willing to work for less than before because getting a job is harder these days, and less-than-senior developers can't get a job at all because there's only enough spots for the senior level ones.

myth_drannon 1 day ago 6 replies      
I consider Stack Overflow's Careers section a good, high-quality sample of the job market (mostly US). I have been scraping it for about 3 months now and I see a slight increase in job posts.

Here is the data dump: https://github.com/aparij/soCareers-Data/tree/master/new

minimaxir 1 day ago 1 reply      
I call data shenanigans.

There are a very large number of reasons that the number of jobs on a job site could drop, including but not limited to the fact that people have stopped using the OP's very small job site. (It is also suspicious that the OP compared it to 4 other sites which are unlabeled.)

That's partially why analyses use large job sites, so that at least the change is somewhat statistically significant (though it could still be caused by non-economic issues; from the little bit displayed in the charts, seasonality is in play, which makes looking at a Jan-Jun horizon flawed).

hkarthik 1 day ago 1 reply      
Yup, the signs of a slowdown are there. The smaller startups and companies that use these job boards have cut back on hiring as their revenue growth has slowed down. The next step for them will be layoffs.

The big tech companies are still hiring but being much more selective and limiting how many they hire. By the time some of them start having layoffs, the tech economy will be in free fall for a number of months and it will be too late if you aren't prepared for it.

Many larger companies are starting to put in place policies and procedures that will limit the number of promotions, raises, and bonuses that they give out. These same actions are also intended to expose lower performers more quickly so they can be identified and managed out.

My advice is this: find ways to continue to learn and grow, and keep building skills and making things. Save for potentially long periods of unemployment, and be willing to work very hard to ensure that you're not perceived as a low performer.

Lastly, don't sweat it. These economic corrections are good thing. They can unlock a lot of latent talent, capital, and resources that are being wasted. Just be financially prepared for it and you'll be fine.

lukasm 1 day ago 0 replies      
Disclosure: I work on a startup that provides a tool for referrals (rolepoint.com) as well as SaaS that is used by job boards (rolepoint.io). I also gathered job boards in a quite popular repository https://github.com/lukasz-madon/awesome-remote-job

Couple of things that I noticed

- More channels - there are at least a dozen startups like TripleByte and interviewing.io

- Referrals are becoming the preferred way of hiring, from small companies where the majority of hires are made through personal networks to Fortune 500 companies (where using an external recruiter is a last resort).

- AngelList, Stackoverflow Careers have matching features and you can learn more about the company.

- This is an employee's market. Companies have to be proactive to get employees (hence the 'poaching'). Posting something to a job board is not efficient.

- More acquihires by US companies happen in Europe, since they have a hard time competing with Apple, Google, Facebook, etc.

First two points are backed by data.

xando 1 day ago 0 replies      
It happens that I collect similar data as well. My data range doesn't go back years, but I've just run a few queries for the past months. It doesn't look like Stack Overflow would have a reason to complain, but yes, AuthenticJobs doesn't look good.

The chart suggests that he compares AuthenticJobs to similar job boards with ~200 job posts per month. Stack Overflow Careers is in a different league: numbers there average ~1800 posts per month.

Also, I wrote a counting script for Hacker News' "Who is Hiring" threads, and the results suggest the opposite as well. Please check this chart: https://blog.whoishiring.io/hacker-news-who-is-hiring-thread...

Disclaimer: I run https://whoishiring.io I scrape them all.

Private opinion: statements like his are trying to produce a shitstorm in an obvious way. There is no bubble from where I stand.

nfriedly 1 day ago 1 reply      
I get "recruiter spam" pretty frequently. I generally respond with something like "No thanks, but I'll pass along your info if I come across anyone who might be interested." And they generally say "Yes please!"

I now have a list of ~300 recruiters' emails that I can give out to anyone who is looking for work :)

exelius 1 day ago 0 replies      
No, it's not just you.

Hiring is down across the board, and it is indeed due to poor economic headwinds. Many finance heads are starting to think that all our bailouts in 2007/2008 did was create an asset bubble down the road in equities by providing access to cheap capital, encouraging borrowing for risky investments that are just now starting to sour.

Add to this the slowdown in China, Brexit, Donald Trump, and a volatile oil market and there's just too much uncertainty for companies to staff up right now. Some firms are even quietly issuing preemptive layoffs under the assumption that 2017 is going to be a very bad year a la 2007.

That said, there's still plenty of work for tech workers (someone has to implement those cost savings projects, amirite?). But look for middle managers, marketing folks and sales people to have a bad time as companies look to trim payrolls.

ILIKEPONIES 1 day ago 0 replies      
On the one hand, we've seen some similar patterns as many seed stage companies have struggled to raise an A.* On the other, it's hard to extrapolate anything from such a small dataset.

Anecdotally, I would say it's also likely that tech companies that ARE hiring are shifting their hiring budget to sources that bring in more qualified candidates. AuthenticJobs is one of the better niche tech job boards, but it's still a job board. We see lots of growing companies that are spending less money on older channels like LinkedIn and job boards and instead, spending money on an applicant tracking system, hiring in-house recruiters, and using services like Entelo, TheMuse, AngelList, Hired, etc.

*We run Underdog.io.

lukeHeuer 1 day ago 1 reply      
Authentic Jobs is one of the more old school places that heavily catered to web design type work. A more accurate assessment of this data may be that web design work is drying up, but we've known that for a while. It's hard to say since they don't mention the competitors they are tracking, but they may serve the same shrinking market as well.
juandazapata 1 day ago 0 replies      
"My startup is experiencing 40% churn" would be a better title for the post.
matchagaucho 1 day ago 0 replies      
The on-demand gig economy is taking over... aka the "Uber for Web Professionals".

Many companies, SMBs in particular, are discovering they don't need to "own the cow to drink the milk".

It's simpler to post tasks to sites like Fiverr than it is to post a "job description" and hire an employee/contractor.

toephu2 1 day ago 1 reply      
FTA: "June's jobs report, well above forecasts, suggests the majority of job gains may be happening in areas outside of tech."

Correct, they are occurring outside of tech. In fact, you don't need to guess; it says it right there in the report if you read it [1]:

"In June, job growth occurred in leisure and hospitality, health care and social assistance, and financial activities."

The largest increase last month was in "Leisure and hospitality," which added 59,000 jobs. This is basically the hotel industry gearing up for summer, which is when most Americans travel for vacation. These jobs don't produce any real economic growth. The economy is not recovering. What the media failed to mention today is that the unemployment rate actually went up 0.2% to 4.9% (although the more accurate number to look at is the U-6, which went down from 9.7% to 9.6%). Also, Average Hourly Earnings m/m went down from 0.2% to 0.1%.


distracteddev90 1 day ago 0 replies      
I believe this is symptomatic of an industry moving away from job board based recruiting. There has been a significant change in the recruiting sphere and many talent teams are starting to focus more on sourcing and pursuing passive candidates.
markbnj 1 day ago 0 replies      
It could also reflect that job boards are becoming less and less useful for employers or job seekers. I recently went through a job hunt and enabling my resume on boards like Dice just resulted in a flood of irrelevant recruiter spam. I'm not being overly harsh here either.

When I landed a job (a good one) it was through a posting on the Who's Hiring thread here on HN, and subsequent email/phone conversations and co-working time.

percept 1 day ago 0 replies      
While there is still room for other considerations, I was a little more skeptical about the headline until I saw the article's source.

If this had been the usual analysis of one of the large job listing aggregators, then I think more of the arguments made here would come into play, but I give this one slightly more credibility since it's a for-pay and "curated" job site.

So I'll put it in the somewhat-more-interesting category.

Back to the author's points, of course perceptions help frame reality, too.

lawless123 1 day ago 0 replies      
Most job sites I have used recently are inundated with recruiters all offering/advertising the same few jobs, rather than with the companies that are actually hiring.
kami8845 1 day ago 0 replies      
From my experience the jobs I have found on "Authentic Jobs" have never been from the tech startups that I actually want to work at, since those primarily use AngelList, StackOverflow, GitHub, weworkremotely, Hacker News etc. to advertise their jobs.
leroy_masochist 1 day ago 1 reply      
Occam's Razor might suggest that the reason for this is pretty straightforward -- the downturn in VC funding means fewer dollars of other people's money available for startup salaries.
me551ah 1 day ago 0 replies      
The post should have been titled 'Traffic to authenticjobs.com is down by 40%'.
mathattack 1 day ago 1 reply      
2 observations:

1) There has been some battening down of the hatches.

2) There is migration among boards. (Example: Indeed has become much more relevant than during my last job search. As a hiring manager I need to pay more attention to it.)

RomanPushkin 1 day ago 0 replies      
Can I request a feature for authenticjobs? Glassdoor rating for every company. It will save a lot of time, and I wonder why nobody has it. Thanks!
swingbridge 1 day ago 0 replies      
"Lean" is the new black.

With investors focusing more on hard fundamentals like revenue and profit, and a lot less on "hype" metrics like user growth, employee growth, number of baristas on staff, and such, a lot of tech companies have taken the hint and gotten down to business. That includes slamming the brakes on hiring and, in some cases, cutting staff. That's not the only force at play, but it's a big one.

DiNovi 1 day ago 0 replies      
I still get an absurd amount of recruiter spam...
jswny 1 day ago 1 reply      
Is it possible that this trend is due to companies taking in more interns and training them as opposed to normal hiring practices?
DailyHN 1 day ago 0 replies      
Posts on http://angularjobs.com are higher than ever. Disclosure/Source: I'm the owner.
alexchamberlain 1 day ago 0 replies      
I wonder how many companies have decided to hire more junior devs, rather than seniors. In an economic downturn, you still need to keep people in your (human) development pipeline.
geori 1 day ago 0 replies      
This might be happening because Angelist is free and has great candidates.
abritinthebay 1 day ago 1 reply      
I can tell that a number of people here weren't around for the last non-incumbent election.

Every major election year the job market does this. It's worse when it's an 8 year cycle.

Add to it the uncertainty around Trump and Brexit causing economic worries... basically, there is less appetite for risk, so less VC capital, so fewer startups.

It's a normal cycle.

cmdrfred 1 day ago 0 replies      
Who is that job site with the orange line in the graph? They seem to be doing pretty well.
indeedwhynot 1 day ago 0 replies      
If you advertise for a sheep with five green legs, suddenly mother nature will start sending in résumés of people who claim to be exactly that. Therefore, job adverts don't work.

What you need is a résumé database in which you can search.

In fact, I would personally never, ever react to a job advert. Thousands of people who do not have the skill set whatsoever will react too. That gives a completely wrong impression of levels of competition that do not exist in reality, and drives down compensation for real candidates, who therefore bail out pretty much immediately, or never react at all.

Seriously, there is often a good reason why these other people are looking for jobs. It is always the same people looking for jobs. The people whom you really want to hire, are usually not looking for jobs.

You should reasonably assume that people who can really do the work are already working. The discussion then revolves around why it would be more interesting to work for your company, and how much more you are willing to pay for that.

Wisen up and stop wasting your time and money on job adverts.

phaemon 1 day ago 3 replies      
13 points and no comments...
Living, in Limbo hintjens.com
361 points by alcio  5 days ago   71 comments top 18
ci5er 4 days ago 1 reply      
Memento Mori.

I want to simply declare that the iMatix state machine generator (Libero) and cross-platform library(ies) (SMT/SFL) saved my project/company from certain destruction in 1996. And I never had the opportunity to thank you/him/them for that.

I'm a C guy, so admittedly old-fashioned, but I still use that FSM generator language more than twice/year. FSMs rock (I came from circuits) and this is the best business-language-to-compiled-software-description thing I have ever met.

Thank you.

sdfin 5 days ago 4 replies      
The proximity of death turns many people philosophical. I like reading what people think when looking at life from that perspective. I also enjoyed an article posted on HN some years ago about the most common regrets of people on their deathbeds: https://www.theguardian.com/lifeandstyle/2012/feb/01/top-fiv...
xgbi 4 days ago 2 replies      
I remember the shortness of breath during my chemo. It wasn't lung cancer so it might be linked to the effects of the chemicals you get every other week.

The red blood cells being wiped over and over when the Chemo destroys everything probably leads to a worse oxygenation.

Wish you a good holiday, enjoy some good wine over there while you're at it!

jay_m 5 days ago 0 replies      
Recommend reading this piece for some context, and because it's excellent: http://hintjens.com/blog:115

Glad to see another update from Pieter, I hope he finds his way out of limbo, preferably healthy.

yadongwen 5 days ago 1 reply      
Not sure what kind of cancer you have. There is so much progress in this field, and I believe many types of cancer can be controlled now. My mom has NSCLC. It's been two years, and it's very likely she'll be just fine for the next few years. Please do not give up.
FuNe 4 days ago 1 reply      
All the best and the most pleasant holiday possible :)

First time I read something of the guy and his blog made an entrance in my quick dial.

I take it he lives in Germany, which has one of the best free healthcare systems in Europe (if not the world). Timely and hassle-free access to medication and care is hugely important when you race against time. Kudos to the system there.

ngrilly 4 days ago 0 replies      
I'm always impressed by the depth and quality of Pieter Hintjens' writings, even more in such a difficult moment. He must be a remarkable mind. Kudos for fighting this, and keeping thinking.
darshwithsmile 4 days ago 0 replies      
Wonderful article, and full of life. After reading it, I strongly believe people live on willpower more than anything else; surviving such harsh treatment is very difficult, but the support of our loved ones keeps us going.

One of my uncles had throat cancer, and I felt that rather than the disease, it was the chemo that was doing him harm, as I could see him transform from a healthy person into a weak and thin one.

I really appreciate the courage @PeterH is showing, and the way he is motivating people around the world.

richforrester 5 days ago 0 replies      
Hey Pieter; thanks. Wish you the best for however long you're around. I hope you can make limbo work for as long as it lasts.
timClicks 5 days ago 0 replies      
You're an inspiration Pieter. Thank you for all of your contributions to building a better world.
zsellera 4 days ago 0 replies      
He's a true renaissance man. Here's a great interview with him about his life: https://changelog.com/205/
fasteo 4 days ago 0 replies      
I can only hope to face death like you are doing: without the drama, being realistic but not losing hope, being pragmatic, and having the courage to keep enjoying life with your family.

Thank you.

isuckatcoding 4 days ago 0 replies      
Seems like a very smart, humble guy. Would love to see him speak sometime.


k__ 4 days ago 0 replies      
Chemo is really crazy.

A friend of mine had brain-cancer and got chemo every month or so.

While the chemo was running he looked like he would die any moment.

When it was done he looked like he wasn't even sick.

This went on for years till he died two years ago.

sneak 5 days ago 0 replies      
What a class act this Mr. Hintjens is!
nickpsecurity 4 days ago 0 replies      
@ PieterH

"Lots of time, mostly at home, no long term plans. And yet it has been hard. It's taken me a month to start on this article. In limbo, it is so much easier to just switch off, become passive. It doesn't matter anyhow, does it."

I know the feeling from a brain injury that cost most of my memory, learning, and analytical capabilities. I've been wanting to do some deep analysis and coding for high-assurance systems but chunks of knowledge, motivation, or mental capacity for some analysis just aren't there. Easier to take a mental break, too, than tackle the stuff. I've gotten quite a bit done over the years with specific strategies, though, that might help you. Not sure if it will or won't as your situation is clearly a bit different but does have similarities. Worth a try.

To start with, keep it simple and incremental. Even my thoughts and non-coding projects I do almost Niklaus Wirth style, where each piece, from how I describe it to how I do it, is individually pretty simple. It also helps to reuse ideas and work as much as possible in new thoughts, to reduce brain drain. Doing it incrementally is most important, though. This applies to code, writing, and planning. (Sound like programming/engineering yet?) You can tackle almost nothing or a lot, but you at least see pieces of it forming that both motivate you and act as leverage for the next pieces. Also, in the opposite of decomposition in software, gradually refactor the simple ideas or jobs into more complex ones, maintaining simple interfaces or ways of putting them together. The idea being that complex stuff is hard to make with a brain trying to turn off, whereas small changes merging simple pieces are easier (not easy).

I'll do an example with the book you want to work on. Writing a book is a lot of work that your brain won't let you do, it seems. So, screw writing a book. Instead, keep two files or sets of them: one an outline-in-progress to structure your ideas; the other a list of them in terms of techniques, explanations, code examples, and so on. Just keep adding stuff... little by little, or a lot if you get a burst of mental energy... to the category on the right, gradually refactoring the left (directory) with labels or pointers to stuff on the right (content). For specific stuff, start simple while refactoring into complex. You might describe a problematic situation or solution with an English description and a little text... just enough to remember what you were thinking. Another day, add some specifics and/or code to it. Another day, a little more, or a footnote. Eventually, you end up with something that can be turned straightforwardly into a book (SUCCESS) or a collection of useful information others can build on (LESS SUCCESS). This approach has no failure mode, since some wisdom is better than none. Peer review helps, though, to catch the little inaccuracies bound to sneak in. That's normal.

So: outline, simplify, maximal reuse in ideas/code/explanations/structuring, refactor to add complexity later if it's too much now, and just accept that the work might be more wiki than book, at least for now. This is how a semi-brain-dead individual like myself musters on despite a brain that doesn't want to cooperate something like 50-80% of the time. Hopefully, yours has more steam than that, and similarly good results. :)

james-watson 5 days ago 4 replies      
Surprisingly strong opinions on world politics, without much evidence.

I have the utmost sympathy for the plight of the chronically ill, but to decree a referendum result as "madness" is slightly egotistical.

Perhaps those who are outraged by world events merely don't see the full picture? Perhaps their protective shells work too well, and they are insulated from the reality with which the masses must contend?

Worth pondering.

simplify 5 days ago 5 replies      
> Lesson is: take your medicine. It may hurt, yet the alternative will hurt more.

There is something I've recently realized. Hospitals are optimized to receive patients, control the situation, prevent pain & death, and then hope for the best. In other words, traditional medicine doesn't actually heal you. It only controls the situation, and trusts that your body will heal itself.

This was a strange realization to make. My whole life I've always associated medicine with healing. But now I see that, although very useful, medicine only serves as a band-aid. It doesn't solve the underlying problem.

I think we as a society need to realize this. Not because traditional medicine is bad (it has helped save countless lives, for sure), but because we need to start looking into alternative medicine that could potentially provide actual healing. There's currently a stigma of anything that isn't a traditional drug / pill... we need to progress past this mindset.

Edit: Sorry for being overly broad. I have more in mind physical and mental health conditions. Regardless, instead of downvotes, how about we have an enlightening discussion?

Edit 2: I misspoke "traditional medicine"; but what I meant was "western medicine", as I was kindly corrected by maxerickson. I apologize for the confusion.

       cached 10 July 2016 04:11:01 GMT