hacker news with inline top comments    16 Aug 2016 Best
Zero-cost futures in Rust aturon.github.io
918 points by steveklabnik  4 days ago   337 comments top 38
AndyKelley 4 days ago 2 replies      
This is huge.

This allows one to express concurrency in a natural way that is not prone to the typical errors of this class of problem, with no runtime overhead, while competing with C in terms of runtime constraints.

Big kudos to Aaron Turon and Alex Crichton. You guys knocked it out of the park.

rdtsc 4 days ago 4 replies      
As mentioned in the post, given Rust wants to operate in the same space as C, this approach makes sense. However from a higher level, building more complex concurrent systems, dealing with futures/deferred-s/promises and/or a central select/epoll/kqueue reactor loop gets daunting and doesn't mix with complex business rules.

The deferred-based approach has been around for many years. I experienced it by using Twisted (a Python framework) for 5 or so years, and early on it was great. However, when we switched to using green threads, the logic and amount of code were greatly simplified.

So I'm wondering if Rust provides any ability to add that kind of N:M threading approach. Perhaps via an extension, macro, or some other mechanism.

Note that in C, it being C, such things can be done with some low-level trickery. Here is a library that attempts that:


And there were a few others before, but none have taken off enough to become mainstream.

jaytaylor 4 days ago 3 replies      

I've claimed a few times that our futures library provides a zero-cost abstraction, in that it compiles to something very close to the state machine code you'd write by hand. To make that a bit more concrete:

- None of the future combinators impose any allocation. When we do things like chain uses of and_then, not only are we not allocating, we are in fact building up a big enum that represents the state machine. (There is one allocation needed per task, which usually works out to one per connection.)

- When an event arrives, only one dynamic dispatch is required.

- There are essentially no imposed synchronization costs; if you want to associate data that lives on your event loop and access it in a single-threaded way from futures, we give you the tools to do so.

This sounds quite badass and awesome. I'm not sure what other language implementations take this approach, but this is clearly an extremely beautiful, powerful, and novel (to me at least!) concept. Before reading this, I thought rust was great. This takes it to the next level, though.
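The "big enum" claim in the quote above can be sketched with toy types (a drastically simplified stand-in, not the real futures-rs API): each combinator stores the previous future and its closure inline as plain struct fields, so a whole chain is one nested stack value with no per-combinator allocation.

```rust
// A drastically simplified stand-in for the Future trait -- toy code,
// not the real futures-rs API.
trait ToyFuture {
    type Item;
    fn poll(&mut self) -> Option<Self::Item>; // Some(v) once the value is ready
}

// A future that is immediately ready with a value.
struct Ready<T>(Option<T>);

impl<T> ToyFuture for Ready<T> {
    type Item = T;
    fn poll(&mut self) -> Option<T> {
        self.0.take()
    }
}

// The `map` combinator: the inner future and the closure are stored
// by value, inline, as plain struct fields -- no heap allocation.
struct Map<F, C> {
    inner: F,
    f: Option<C>,
}

impl<F, C, U> ToyFuture for Map<F, C>
where
    F: ToyFuture,
    C: FnOnce(F::Item) -> U,
{
    type Item = U;
    fn poll(&mut self) -> Option<U> {
        match self.inner.poll() {
            Some(v) => Some((self.f.take().expect("polled after completion"))(v)),
            None => None,
        }
    }
}

fn map_fut<F: ToyFuture, C>(inner: F, f: C) -> Map<F, C> {
    Map { inner, f: Some(f) }
}
```

Chaining `map_fut(map_fut(Ready(Some(2)), |x: i32| x + 1), |x| x * 10)` produces the single concrete type `Map<Map<Ready<i32>, _>, _>` — the state-machine-built-from-the-type-system flavor of the claim.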

losvedir 4 days ago 6 replies      
I dabbled with rust in the past and was really fascinated with it, but haven't played around lately. One thing caught my eye in the post:

 fn get_row(id: i32) -> impl Future<Item = Row>;
That return type looks odd to me. What does it mean to return an "impl", and is that a new feature in rust, or just something advanced that I missed in my exploration before?
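For context, return-position `impl Trait` was an unstable feature at the time of the post (it has since stabilized). A minimal sketch of what it means: the function returns *some* concrete type implementing the trait, fixed at compile time, so no boxing or dynamic dispatch is needed — which is why it matters for zero-cost futures.

```rust
// Return-position `impl Trait`: the caller only knows "this implements
// Fn(i32) -> i32"; the concrete type (an unnameable closure type here)
// is fixed at compile time, with no boxing or dynamic dispatch.
fn make_adder(n: i32) -> impl Fn(i32) -> i32 {
    move |x| x + n
}
```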

leovonl 4 days ago 0 replies      
In my opinion - as someone with some background in CS - the name "future" is a little too overloaded here. It is not only used for the deferred computation of a value; it also means the composition of computations. This is not wrong per se, but calling the result a "future" alone oversimplifies what's happening below and hides some properties of the combinations.

The first observation one can make - which is not mentioned anywhere in the article - is that the composition of futures here can be understood as monadic composition. This by itself gives a big hint as to why this interface is so powerful. Second, this library could be understood as an implementation of processes and process combination from the pi-calculus [1] - sequential composition, joining, selection, etc. - so it could be formalized using its process algebra.
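The monadic-composition observation can be sketched with `Option` as a stand-in monad: `and_then` has exactly the shape of bind (`m a -> (a -> m b) -> m b`), and futures' `and_then` shares that shape, with "not ready yet / failed" playing the role `None` plays here.

```rust
// `Option` as a stand-in monad: `and_then` is monadic bind.
fn parse(s: &str) -> Option<i32> {
    s.parse().ok() // None if the string is not a number
}

fn halve(n: i32) -> Option<i32> {
    if n % 2 == 0 { Some(n / 2) } else { None }
}
```

`parse("42").and_then(halve)` yields `Some(21)`, while a failure at either step (`parse("x")` or `halve(7)`) short-circuits the whole chain to `None` — the same short-circuiting that error-carrying futures do.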

From the practical side, one example of a mature library that implements similar concepts is the LWT [2] library for OCaml, which has the same idea of deferred computation, joining, and sequencing, but calls the computations "lightweight threads". One could also argue about naming in this case, but it seems to better reflect the idea of independent "processes" that are combined in the same address space.

Finally, as much as these concepts of futures and processes look similar on the surface, they each have their own properties - so it's always good to consider what better fits the model. By looking at the research and at other similar solutions, one can make more informed choices and have a better idea of what to expect from the implementation.

[1] http://www.cs.cmu.edu/~wing/publications/Wing02a.pdf

[2] http://ocsigen.org/lwt/manual/

nv-vn 4 days ago 1 reply      
Anyone else find the f.select(g)/f.join(g) syntax unintuitive/awkward? I'm confused as to why they wouldn't go with the (IMO) more logical select(f, g) and join(f, g) in this case (since neither Future is really the "subject" in these cases). Not that this is a major concern (it would take only a few lines of code to change within your own program using an alias for the functions), just interested in knowing the rationale behind the choice.
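The aliasing idea mentioned above really is only a few lines. A toy sketch (stand-in types, not the real futures-rs signatures) of turning the method-style `f.join(g)` into the free-function spelling `join(f, g)`:

```rust
// Toy stand-ins -- not the real futures-rs types or signatures.
struct Fut(i32);

impl Fut {
    // method-style, like `f.join(g)` in the post
    fn join(self, other: Fut) -> (i32, i32) {
        (self.0, other.0)
    }
}

// The free-function spelling is a one-line wrapper over the method;
// method-call syntax inside always resolves to the inherent method.
fn join(f: Fut, g: Fut) -> (i32, i32) {
    f.join(g)
}
```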
soulbadguy 4 days ago 5 replies      
Finally, a nice async I/O interface for Rust - I always felt that it was a big missing piece. A couple of questions for people familiar with async in other languages:

1 - Isn't the state machine approach the same as the one C#/.NET async/await uses, but with the added convenience of the syntactic sugar?

2 - No allocation: doesn't the lambda closure need to be allocated somewhere?

3 - I would have loved some comparison (both performance-wise and in theory) with the upcoming C++ coroutine work; from my understanding the C++ approach is even more efficient in terms of context switching and has the advantage of even fewer allocations.
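On question 2: a Rust closure is an anonymous struct holding its captures, so it lives wherever you place it — here, inside the combinator value on the stack; only boxing (e.g. `Box<dyn Fn>`) forces a heap allocation. A small sketch that can be checked with `size_of_val`:

```rust
use std::mem;

// A closure is an anonymous struct holding its captures; it lives
// wherever you place it. Only `Box<dyn Fn(...)>` would heap-allocate.
fn closure_sizes() -> (usize, usize) {
    let a = 1u64;
    let b = 2u64;
    let captures_two = move |x: u64| x + a + b; // behaves like struct { a: u64, b: u64 }
    let captures_none = |x: u64| x + 1; // captures nothing: zero-sized
    let _ = captures_two(0) + captures_none(0);
    (mem::size_of_val(&captures_two), mem::size_of_val(&captures_none))
}
```

The two-capture closure occupies exactly its captures (16 bytes for two `u64`s), and a non-capturing closure is zero-sized — no hidden allocation in either case.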

cyber1 4 days ago 2 replies      
Little benchmark: rs-futures vs lwan (https://lwan.ws) on my machine (Core i5).


 $ wrk -c 100 -t 2 -d 20
 Running 20s test @ 2 threads and 100 connections
   Thread Stats   Avg      Stdev     Max   +/- Stdev
     Latency   823.09us  449.37us  20.98ms   98.69%
     Req/Sec    62.15k    10.51k   105.24k    48.63%
   2479035 requests in 20.10s, 340.44MB read
 Requests/sec: 123335.77
 Transfer/sec:     16.94MB

 $ wrk -c 100 -t 2 -d 20
 Running 20s test @ 2 threads and 100 connections
   Thread Stats   Avg      Stdev     Max   +/- Stdev
     Latency   596.45us  573.31us  24.46ms   99.33%
     Req/Sec    86.17k    13.15k   119.71k    76.00%
   3429720 requests in 20.01s, 624.73MB read
 Requests/sec: 171404.15
 Transfer/sec:     31.22MB
For lwan I used the http server example from the lwan.ws main page.

As you can see, in this example the C http server is much faster than the simple Rust http server.

* futures-minihttp release build

* lwan -O3

Animats 4 days ago 4 replies      
This is cute. This is clever. Whether or not it's too clever, time will tell. A year ago, I noted that Rust was starting out at roughly the cruft level C++ took 20 years to reach. Rust is now well beyond that.

All this "futures" stuff strongly favors the main path over any other paths. You can't loop, retry, or easily branch on an error, other than bailing out. It's really a weird syntax for describing a limited type of state machine.

I'm not saying it's good or bad, but it seems a bit tortured.

thomasahle 4 days ago 4 replies      
I'm confused by

 .map(|row| { json::encode(row) })
 .map(|val| some_new_value(val))

versus

 .map(json::encode)
 .map(some_new_value)

Is the explicit extra layer of lambda generally preferred in Rust over just passing the functions?
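Both spellings are generally fine; the closure is only *required* when argument types need adapting (e.g. `json::encode` in the post likely takes a reference) or when state must be captured. Illustrated with plain iterators, where the same rules apply as for future combinators:

```rust
fn double(x: i32) -> i32 {
    x * 2
}

// Both spellings work; the closure is only required when the argument
// type needs adapting or when extra state must be captured.
fn doubled_both_ways(v: &[i32]) -> (Vec<i32>, Vec<i32>) {
    let point_free: Vec<i32> = v.iter().cloned().map(double).collect();
    let with_closure: Vec<i32> = v.iter().map(|&x| double(x)).collect();
    (point_free, with_closure)
}
```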

bfrog 4 days ago 1 reply      
I love the direction this is going, and the performance it achieves.

Debugging promises/deferreds in other languages has given me nightmares; compare with Erlang/Go debugging, where you get a simple stack trace.

Does this provide some nice way of debugging complex future chains? Are there plans towards making it super easy to debug?


Manishearth 4 days ago 6 replies      
I'm rather surprised by the benchmark; I would expect the Go benchmark to be faster than Java (and the fact that it isn't may indicate some improvements that can be done to fasthttp by learning from rapidoid or minihttp). Then again, the difference isn't that much, so it just could be implementation details that would require a total refactor to fix.
skybrian 4 days ago 2 replies      
Is there any special handling for Futures that complete with an error?

Also, how do you debug code that's hung or taking too long? It might be useful to get a list of all the jobs (incomplete Futures) that are currently running, much like running 'ps'.

bascule 4 days ago 1 reply      
While the benchmarks are looking a lot better than many other similar Rust libraries in this space, I'm not sure the code is in a state where they're actually meaningful yet: https://github.com/alexcrichton/futures-rs/blob/master/futur...
plesner 4 days ago 0 replies      
This looks really impressive. I'm curious what the story is around propagating errors through chains of futures. Traditionally, future libraries don't pay much attention to that, which can make debugging excruciating - though it doesn't have to be. But then Rust does errors differently, so maybe it's less of an issue there?

About the naming though, I was a little disappointed. Out of future, deferred, and promise, "promise" is the better term. The two others imply that something will happen later which is misleading because it's fine to have promises stick around long after they're fulfilled.

lossolo 4 days ago 1 reply      
Why didn't you compare it to C++ or C? If you want to compete with C/C++ it would be natural to compare against those in benchmarks. Java and Go have GC. It's like comparing a supercar with street cars when you should compare it to other supercars.
tomdale 4 days ago 1 reply      
The recent flurry of activity around async IO in Rust has been really exciting; to me, it indicates that the core team's decision to stabilize the language was a smart bet that is paying off in rapid ecosystem growth.

One quibble I have with this post is that it talks about futures as a zero-cost abstraction. That might be true (or close to true) from a performance perspective, but in my (admittedly inexperienced) opinion, it seems to have a significant ergonomic cost that is not accounted for.

While futures help us deal with multi-threaded coordination of data from multiple sources, that overhead isn't necessary for situations where you're running in a single thread dedicated to doing IO operations.

Dealing with futures in your code is non-trivial. Browsing through the futures version of the HTTP server, I had a hard time following along:


And it requires a bunch of helper code to go with it:


The blog post mentions Tokio, another high-level abstraction on top of mio (by the same author). Because it doesn't require the futures abstraction from top-to-bottom, it offers similar (maybe even a little better) performance with what, to my eyes, is far simpler code:


I'm still learning Rust and spend most of my time in JavaScript. The analogy I'd use is: imagine if in the Node programming model, every API required you to use JS Promises, even at the very lowest level. Even if you could reduce the cost of creating new Promise objects, interacting with them over simple values could make the code you write more verbose. In Rust, that problem is exacerbated by the much stricter type system and the fact that you have to do cross-thread coordination.

I'm a total beginner to systems programming, and a lot of this stuff is above my pay grade. However this shakes out in the community, I'm very happy to see Rust on the way to becoming the fastest, most productive way to write high-performance web services.

crudbug 4 days ago 0 replies      
One thing I have not seen discussed is the Work vs. Worker abstraction.

Your application work - computation logic/business rules, should be decoupled from the type of worker.

The worker can be - blocking or non-blocking - Futures/Continuations/Co-routines.

vvanders 4 days ago 2 replies      
Not sure if I missed this in the post: does this depend on any unstabilized features, or can we use this today on 1.10.0 stable?

Awesome stuff btw, love the iterator inspiration.

saynsedit 4 days ago 2 replies      
The big downside is that now you will have a dichotomy of functions that block using futures and functions that block at the OS level, with no sane way to intermix them. Rust essentially becomes two languages. Async/await sugar doesn't fix this.

Would be great if functions could be written in a general way for both IO models and users could select the implementation at their convenience.

cm3 4 days ago 0 replies      
This is cool and validates Rust, but I just want to add that even the 2kb stacks mentioned in sibling comments are bigger than Erlang's process stacks. In Erlang 19.0.3, even with dirty schedulers enabled, a process's default stack size is 338 words.
the_mitsuhiko 4 days ago 1 reply      
Wohoo. I was waiting for this. I hope that at a later point this will also mean we get some sort of syntax support for it, once it's stable and has entered std.
hinkley 4 days ago 0 replies      
Back when futures and promises were a new concept to most people, if someone asked me to explain why you would want to do such a thing, my favorite example was loading images in a web browser. You wouldn't want to load the same image four times just because it appears in four places on a page, would you? Yada yada promises etc etc.

Seeing articles like this makes me feel like a circle has finally been closed.

soulbadguy 3 days ago 0 replies      
For those who are curious about how this fares against a coroutine-based approach: https://www.youtube.com/watch?v=_fu0gx-xseY
kbenson 4 days ago 0 replies      
> a simple TCP echo server;

How convenient. I've been exploring/learning Rust, and a simple echo server, compared against a reference version I've written in Perl, is the first semi-trivial program I wanted to write.

michaelmior 4 days ago 1 reply      
Curious if someone has tried this and Eventual[0] with any thoughts on how they compare.


meneses 4 days ago 0 replies      
Awesome. So to cancel a future, I just drop it? Awesome.
eggnet 4 days ago 1 reply      
How are futures handled for open() and disk i/o?
matthewaveryusa 4 days ago 1 reply      
I'm genuinely interested in knowing what the problem is with an event loop using epoll plus a threadpool for IO that blocks but epoll can't poll. I've used proprietary event loops at 2 giant companies, libuv with C, asio with C++, and Node's async, and the async IO was never the problem in terms of performance or complexity. What is the problem that's trying to be solved?
ufo 4 days ago 0 replies      
Unfortunately, it seems that you still need to use callbacks and lots of and_thens to write this async code.

Wouldn't it be possible to add coroutines to Rust instead?

ridiculous_fish 4 days ago 4 replies      
How does the zero-cost abstraction work?

Say we make a Future<Int> and then chain `.map(|x| x+1)` on it a dynamic number of times (N). Presumably this requires storing at least N function pointers.

How can we store these N function pointers with zero cost? If it only takes one allocation, where does the N-1 future store its function pointers?
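A plausible answer to the question above: the zero-cost claim holds when the chain length is known at compile time, because each `.map` nests another layer into one concrete type with no function pointers at all. A *dynamic* N forces type erasure — one allocation per link — as this sketch (plain closures standing in for futures) shows; the boxed version is precisely the case that is no longer zero-cost.

```rust
// Plain closures standing in for futures. With the chain length known
// at compile time, each `.map` nests into one concrete type and needs
// no pointers. A *dynamic* N forces type erasure: one Box (allocation
// plus dispatch through a vtable) per link.
fn chain_maps(n: usize) -> Box<dyn Fn(i32) -> i32> {
    let mut f: Box<dyn Fn(i32) -> i32> = Box::new(|x| x);
    for _ in 0..n {
        let prev = f;
        f = Box::new(move |x| prev(x) + 1); // one allocation per link
    }
    f
}
```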

natrius 4 days ago 1 reply      
What would a rough sketch of async/await syntax sugar look like implemented with Rust macros?
shmerl 4 days ago 1 reply      
So will this become the official part of the language / standard library?
hackaflocka 4 days ago 2 replies      
What's the meaning of "zero cost future" in this context? I googled the phrase and got a bunch of irrelevant material.
b34r 4 days ago 1 reply      
select is an odd term choice for what is essentially a race condition. What's the thought behind the naming of that method?
pbarnes_1 4 days ago 4 replies      
This is awesome, but I have an off-topic rust question:

Why can't we have some syntactic sugar to get rid of .unwrap()?
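As it happens, Rust did grow sugar adjacent to this wish: the `?` operator (an RFC around the time of this thread, since stabilized) propagates errors to the caller instead of panicking the way `.unwrap()` does. A minimal sketch:

```rust
use std::num::ParseIntError;

// `?` is sugar: on Err it returns the error to the caller early,
// instead of panicking the way `.unwrap()` would.
fn parse_pair(a: &str, b: &str) -> Result<i32, ParseIntError> {
    let x: i32 = a.parse()?;
    let y: i32 = b.parse()?;
    Ok(x + y)
}
```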

ben0x539 4 days ago 0 replies      
I guess it's cool that Rust is getting zero-cost futures, but they have a long way to go to catch up to C++'s negative-overhead coroutines!
mike_hock 4 days ago 1 reply      
I suppose you could say, this way of programming is the future.
Indie Hackers: Learn how developers are making money indiehackers.com
946 points by csallen  4 days ago   182 comments top 42
radarsat1 4 days ago 3 replies      
Very cool, but also a bit misleading. I was wondering how the hell wub machine actually makes $900/month, so I read it.

> Record high was $850/mo, record low was $40, with the average month bringing in around $300. Certainly not quit-my-job money, but it helps.

Posting the all-time high instead of the average is a bit off-putting imho, considering it's supposed to represent "paychecks." This is more like getting lucky once. Anyways, haven't read the rest of the stories, and I think it's pretty inspirational. I'd just appreciate more honest revenue reporting on the front page. (It actually says $900/mo, not just $900, so it's not like I'm reading this ambiguously..)

Edit: On second thought, it's not clear to me from the description whether he means that it made $900/mo for some kind of long streak or just one month.

rafapaez 4 days ago 10 replies      
This is very cool, but can I ask you a question?

Some days ago I posted a very similar website here (http://www.transparentstartups.com/) and it barely got any votes, whereas this one is going totally viral (385 votes and counting).

Please let me know what I'm doing wrong. The only thing I can think of is that I didn't mention some key words like "developers" and "money", but rather "startups" and "transparency". Or is there anything else I'm missing here?

Thank you in advance.

UPDATE: I'm learning a lot today, thank you guys for the feedback.

ryandrake 4 days ago 2 replies      
I'd be interested in learning from founders of full-time businesses that started as side projects, how they made that transition from side project to full-time.

The amount of time you need to spend to grow a side project probably increases faster than revenue. So inevitably there will be some point in time where your level of effort is much more than what you'd call a side project, yet your revenue is much less than what you'd call a business. I bet that's a frustrating point for many, as you can't just quit your job and stop paying your bills. I'd love to hear creative ways people have gotten past that hump that don't involve mortgaging the house, selling all your possessions, and depleting your life savings.

inputcoffee 4 days ago 4 replies      
This is a very useful contribution.

The only thing I would add is how many man hours the project took over what period of time.

I have to click on them and sort of guess if that site took 5hr/week for 10 weeks, or if it took 10 people working 10 hour days for 2 years.

Since you already ask for the tech stack, this would also help launch a dozen "which tech stack is more productive?" studies.

On a final note, I appreciate that you also ask about marketing. Maybe the marketing efforts and man hours can be summarized too?

malcolmocean 4 days ago 2 replies      
Founder of Complice (https://complice.co/) here, one of the sites featured (https://indiehackers.com/businesses/complice).

I'm game to answer questions that people have that weren't answered in the interview!

whamlastxmas 4 days ago 2 replies      
If I was personally making thousands a month from a web app I made, I don't think I'd want to advertise that. Partially because it motivates competition, and partially because I feel like public information about my income could later be used against me (no concrete examples other than alimony/child support). I wonder why these people aren't bothered by this.
epalmer 4 days ago 5 replies      
So I went to the site in Chrome and tried to use the back arrow to return to HN, and the site had cleared out the history queue in Chrome. I stopped there. This is bad form, in my opinion.
cyberferret 4 days ago 1 reply      
Hi all, founder of HR Partner (http://hrpartner.io) as featured on Indie Hackers here. Happy to answer any questions that anyone has. As we are 'pre revenue', I would also appreciate tips and hints with respect to marketing and B2B sales from anyone who has been there, done that. :)
nodesocket 4 days ago 3 replies      
Founder of Commando.io here (https://indiehackers.com/businesses/commando-io). Let me know if you have any questions.
herbst 4 days ago 1 reply      
That you include actual numbers is awesome. Kudos
negrit 4 days ago 4 replies      
I'm very confused by this sentence:

 sideProject.generate(8500, 'dollars').per('month');
Those are not side projects

timbowhite 4 days ago 4 replies      
Great site, considering sharing some of my projects.

> Learn how developers are writing their own paychecks.

Would love to see a forum dedicated solely to "indie hacking" for developers. ie. threads related to all the ins-and-outs of independent product development, idea validation, market research, dealing with customers, marketing strategies, founder Q&A, etc.

mettamage 4 days ago 1 reply      
This is awesome. Soon, I'm going to devour all the stories. How do the companies get to know your site and share their story before you were on HN?
johnward 4 days ago 3 replies      
I'd like to be able to subscribe via RSS
Silhouette 4 days ago 0 replies      
On very rare occasions, there's a post on HN that I wish I could upvote so much it would pin at the top of the home page until everyone had a chance to see it. This is one of those rare posts: fascinating, well presented, and I can see it being practically useful for a lot of people who aren't there yet as well.
stockkid 4 days ago 0 replies      
This actually motivates me a lot. Thanks for making this.

UI nitpick: when I navigate to a project's page, if the project name is long, it goes off screen on Nexus 5X + Chrome.

danr4 4 days ago 0 replies      
It's not every day I'm happy to give out my email. very nice.
redstripe 3 days ago 1 reply      
I'd like to see an additional question asked: How did you go about designing the interface of your app?

I'm blown away by how good these and so many "show HN" entries usually look. Everything I've put online is functional but looks so obviously "designed by a programmer" that it's embarrassing to mention.

Do people first work on a prototype and then run it by a graphic artist/UI designer to make it look decent? Where do you find these people?

gricardo99 4 days ago 0 replies      
Cool project! One forward-looking idea for your site: You could start to incorporate a community platform where people can post side-project ideas, skills they're willing to contribute, skills they're looking for, etc..
jackmaney 4 days ago 0 replies      
Nice site, but it hijacked my back button. There's no excuse for that.
pascalxus 3 days ago 0 replies      
I think this site is awesome! Indie devs love to find out what worked and what didn't. Can I make a suggestion? Perhaps you could add a section or another site that lists how many successful businesses got their start, especially with elaborate insights on distribution, as this is the number 1 barrier to entry for most startups. Thanks!
imaginology 4 days ago 0 replies      
Nice site, I enjoyed reading it.

I like the effect when transitioning from grid view to list view. Is that just CSS or is there some Javascript magic doing that?

Sindrome 4 days ago 0 replies      
Was literally lying in bed sleepless last night thinking about which side project on my list to start. Was digging through "Ask HN: How do you make recurring revenue" posts. There's always someone asking in HN. Didn't even think about researching side projects as a side project.
palerdot 4 days ago 0 replies      
Great work. Please provide an easy way to clear all the filters in one click. Subscribed and will be eagerly following.
swah 4 days ago 0 replies      
Very motivational that some apps can make those numbers - thank you for making this! (subscribed!)
was_boring 4 days ago 0 replies      
I really like this idea, and even signed up for the mailing list. I've been scratching my head for years trying to crack income diversification without a lot of money to begin with. It's good to see some success stories.
avipars 2 days ago 0 replies      
Great. I wish there were charts of the monthly cash rate; it would also be nice to mention how much maintenance time projects take, the cost of servers, and such...
vatotemking 4 days ago 0 replies      
A very important question that isn't mentioned much: how do you let other people know about your business after launch?
andretti1977 4 days ago 1 reply      
Great job, subscribed! But here it is a simple question: what is the business model of indiehackers.com?
tummybug 4 days ago 0 replies      
Great site for inspiration. Gave me the same feels reading revenuenumbers.com (posted here a while ago) did.
NinjaTrappeur 4 days ago 0 replies      
Feature request: it would be nice to be able to communicate with the indie developer within your website. Maybe through a comment section at the bottom of the page or a community FAQ.

Anyways, nice website, I will come back! Looking forward to the RSS feed ;)

shellerik 4 days ago 1 reply      
Sorted by revenue, only seven make more than my Amazon affiliate site. I'm not sure if that type of site would fit in there, but I did develop quite a bit of code for it. It's not a review site but rather a searchable product catalog.
ktu100 4 days ago 1 reply      
It would be great if there is a comment section, or private Q&A, to ask founders questions.
augb 4 days ago 0 replies      
Having the month and year of the "interview" would be helpful. Down the road, it will help give context to the information. A neat idea would be to allow for follow-up "interviews" later.
otto_ortega 4 days ago 0 replies      
Cool idea, I hope to build something one day that I can publish on it.
augb 4 days ago 1 reply      
I like that you can filter on solo vs. multiple founders. Very cool.
tener 4 days ago 0 replies      
Really good read, the structured Q&A format is great.
dudeget 4 days ago 0 replies      
wow, very interesting reads. Many of them make me inspired and frustrated at the same time. Inspired because of how cool the ideas are, frustrated because "why didn't I think of that?!"
robotnoises 4 days ago 0 replies      
This is very cool and one of the most handsome designs I've seen in a while.
one_thawt 4 days ago 3 replies      
Cool. Although the site breaks my back button on Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.116 Safari/537.36
okket 4 days ago 2 replies      
Evil site, blocks back button.
azernik 4 days ago 1 reply      
Could we get a less link-baity title? Something like "Viable Single-Developer Businesses" or the like?
What Makes a McMansion Bad Architecture? mcmansionhell.tumblr.com
738 points by jseliger  1 day ago   503 comments top 69
carsongross 1 day ago 11 replies      
The lack of balance, proportion, controlling lines[1] and symmetry are all bad aspects of nearly all modern (in the chronological sense) architecture, not simply McMansions.

My primary issue with McMansions specifically (beyond the fact that they lost the thread on western architecture, which should be blamed on the academy[2]) is that the materials and workmanship are terrible: ugly gaps, quick to stain stuccos and metals, slapdash construction and very little craftsmanship. The flip culture that the mortgage-debt bubble of the last 15 years created has exacerbated this issue to almost comical levels.

[1] - https://www.amazon.com/Old-Way-Seeing-Architecture-Magic/dp/...

[2] - https://www.amazon.com/Bauhaus-Our-House-Tom-Wolfe/dp/031242...

EDIT: After reading another post[a] of his, it is worth mentioning another chronic problem with modern (again chronological, not stylistic) building: the buildings often look like they are about to fall over. A particular pet peeve of mine is the flashing gap found at the base of many houses and buildings, which introduces a disconcerting negative gap right where a soothing, wide foundation should be. Visual insanity.

[a] - http://mcmansionhell.tumblr.com/post/148935246684/mcmansions...

laretluval 1 day ago 12 replies      
I conjecture that the entirety of architectural design theory is just an attempt to create a formal system such that the styles preferred by people that architects don't like are "bad", and the styles preferred by people that architects do like are "good".

The class of people who own McMansions are not very popular among the class of people who write about architectural design on tumblr.

If the conjecture is true then it should be possible to find cases of houses that clearly defy these principles of "mass", "balance" etc., but which are deemed "good" through a series of ad-hoc exceptions and explanations. Those houses will probably not be suburban.

De gustibus non est disputandum. This is equally true when you can create a low-dimensional approximation to your taste in terms of abstract principles such as "mass" and "balance".

toast0 1 day ago 7 replies      
In my mind, Bad Architecture means the form doesn't function. This critique is entirely aesthetic. I agree, they don't look nice, but I've seen plenty of buildings that look nice, and are terrible to be in (hello commercial bathroom with tons of afternoon sun, poor ventilation, and the toilet paper dispenser glued to the window; it was pretty though). Moreover, if it's my house, I don't care too much what it looks like on the outside, cause I'll be inside for much more time than i spend looking at the outside.
ascendantlogic 1 day ago 3 replies      
The issue isn't so much the design, but the cultural attitudes towards the upper middle class. Looked down upon by the truly wealthy, and viewed with suspicion and derision by the middle and lower classes. There's a lot of emotional hostility towards the people that buy these houses that spills over into the critiques of the structures themselves.

That said, this person isn't necessarily wrong. Having a lot of visual interest in your home isn't a bad thing, but you have to have visual interest surrounding the home as well. The issue I have with McMansions personally is having a 4000 square foot house on a postage stamp lot looks ridiculous.

I was once told many years ago the following quote: "Rich people have big houses, but wealthy people have land".

4k sq/ft home on 1 acre? Probably not the best looking neighborhood. 4k sq/ft home on 5 acres? Probably a very wealthy, upscale area.

jasode 1 day ago 4 replies      
I also dislike McMansions but I disagree with his pseudo-scientific criteria of "voids" and "secondary mass" that he applied to the various houses.

The bungalow home and the colonial homes that he praised, I didn't find the aesthetics to be pleasing.

To me, the photo from wikipedia is more like the McMansions I think of as ugly:


The wiki example has the siding that covers the entire side combined with the Greek columns in the front entry. It's a weird mishmash of gaudiness. The garage also overwhelms the rest of the house.

Like another poster mentioned, some of the awkward boxiness is due to homeowners wanting 3500+ sq ft homes on small lots of 1/3 to 1/2 acre. An architect, no matter how talented, is too constrained by the lot dimensions to avoid designing an oversized, out-of-proportion box.

jasonkester 1 day ago 1 reply      
I know, right? Look at [Other Group] with their [Thing They Like]. Can't they see how it goes against [Thing Our Group Wants To Signal]?

If only they had cultivated [Subjective Taste We've Cultivated] they would be ashamed.

It's almost like they watch Television. Or Sports on Television. Or eat Fast Food and enjoy it. Or prefer sandwich bread to crusty artisanal bread.

Or drive Big Flashy Cars. Or drive pedestrian affordable cars. Or support Obama. Or support Trump. Or drink instant coffee or take public transportation.

There's no end to the list of things that the Other People do that Our People can look down on and laugh. If only they knew better.

VLM 1 day ago 1 reply      
You could summarize the design choices as malignancy, fakeness, and excess. Those are best avoided in programming BTW; it's not just a house architecture problem. Also, bad always crowds out good.

The design pattern of McMansions is malignant. Oh, you still have money? Let's slap on an asymmetric ugly random whatever, any old place, till you run out of money. The design scheme is tumor-like: ah, the underlying tissue must have had a good blood supply over there, as that's the direction the tumor expanded. They're almost organically gross as opposed to simply random.

The fakeness isn't properly explored in the blog. There is an uncanny valley effect where anyone who knows anything about Georgians, for example, can trivially identify a Georgian-inspired fake Georgian. I mean, you blew a million bucks and got all the hundred-buck parts wrong... how silly. Also there's the fast-food cookie-cutter nature of McMansions, where even if a single Big Mac isn't repulsive, an entire subdivision of identical ones is repellent: endless roads of ticky-tacky. A semi-competent architect could make a nice looking Georgian pretty easily, but it doesn't flow to have dozens of CC&R-enforced, utterly identical ones packed into a small area. Shotgun-shack townhouses from Boston, yeah, those look right when packed together like books on a shelf. If you must have lot spacing of 3 feet between houses, at least select matching appropriate styles: shotgun shack instead of southern plantation on LSD.

The excess is very much fast-food like. Well, yeah, it sucks, and it sucks just like everyone else's, and you have no choice in some areas of the country, but the excess is very 128 oz Big Gulp or supersize fries. Oh, you still got money? Well, let's put on more columns and weird dormers and rooflines until you run out of money. Sure it'll be ugly, but the point is to show off how big of a subprime balloon-payment adjustable-rate mortgage you can take out, and ugly does get noticed...

The offensiveness of a McMansion is that bad always crowds out good. If you don't have land, why "must" you have an ugly McMansion instead of a nice looking row of Boston-style townhouses? If you do have land, why "must" you have an ugly McMansion instead of a nice plantation-era home with luxurious porches and stuff?

dsr_ 1 day ago 1 reply      
My objection -- visible in nearly every one of the listed McMansions, but also a few of the 'classic' buildings -- is that a roof is a functional object, and every time you create a valley between two neighboring rooflines, you increase the likelihood of failure.

Housing should be built for the climate of the area where it is placed. A SoCal-style flat roof built around a central courtyard and pool is perfectly reasonable there... and a really bad idea in Minnesota or Maine. There's no reason to put a two-story barn pseudo-conversion in a place where they never had that style of barn and will never need to shed snow quickly.

First function, then play around with form.

cleandreams 1 day ago 1 reply      
Useful! I live in a neighborhood of mixed-age houses and it is interesting how many of the newer houses make me wince. This article helps me understand why: they lack harmony with themselves and their surroundings because of these failings.

But here's another thing. I think these criteria (which are based on a couple thousand years of western design principles) all fundamentally rely on the restraint of self-expression in favor of impersonal principles. Lots of the asymmetry and weird bulkiness I see around me seems to come with an attitude of 'Hey ma, I'm expressing myself!' Asymmetry is like loud, brash rock music. As I've gotten older I'm bored with a lot of the crudeness that is passed off as self-expression and I wish for more restraint. Fat chance, I know.
jpalomaki 1 day ago 0 replies      
Need to throw in the mandatory pointer to Christopher Alexander's "The Timeless Way of Building" [1]. The text is also available at Archive.org, see [2]. I originally bought his books because they were mentioned as being behind the software design patterns "movement" that was quite popular some years ago.

Reading his books changed the way I look at houses and apartments. Not from the outside, but the layout. I guess (some of) the patterns are quite obvious, but as usual, it helps when somebody points them out. Two example things that stuck with me:

1) Intimacy gradient: Public areas, where you host guests and meet people, are closer to the front door. The deeper you go, the more private it gets.

2) Parents' and children's realms: Separate areas for parents and kids, with a common area that connects them.

When shopping for an apartment and browsing through many different layouts, simple things like these help rule out many of them.

[1] https://www.amazon.com/Timeless-Way-Building-Christopher-Ale...

[2] https://archive.org/stream/APatternLanguage/A_Pattern_Langua...

Animats 1 day ago 9 replies      
After reading this author's criteria, look at Fallingwater, Frank Lloyd Wright's most famous house. Secondary masses all over the place, dominating the structure. No center of symmetry. So much window area that it's more voids than structure.

The problem with McMansions is usually too much house on too little land, with no visual relationship with neighboring houses. On a bigger lot with more trees and space around the house, most of them wouldn't look bad.

lnanek2 1 day ago 2 replies      
I don't really agree that most of what he says is a bad thing. E.g.:

"Another issue with McMansions and mass is the use of too many voids. Some McMansions are so guilty of this they resemble swiss cheese in appearance. In the below example, the masses are so pockmarked with voids, they give the façade an overall appearance of emptiness."

So what? Personally, I'd prefer the 26-voids example he gives because it looks like a nice place to sit in a chair reading a book by sunlight. He's recommending a set of rooms with much worse light just to change the outside appearance, which I couldn't care less about.

Similarly with his complaints about secondary masses. His example photo shows a big offshoot of a house sticking forward, but I'd rather have the extra room than cut it off and have less space, or make it a separate garage I'd have to put my shoes on and trek out to, instead of just being able to walk over there inside.

The whole thing sounds like what programmers and designers get up to if we don't have usability studies showing us how stupid we are: users thinking the huge buttons we make are titles and never pressing them, or not finding actions we think are obvious in our apps. Sounds like he's just going on without caring about the people who live in the house.

jonahx 1 day ago 2 replies      
McMansions look cheap -- this is their defining quality imo, more so than any of the more abstract qualities the article discusses.

Even people without much formal knowledge of architecture (like myself) have an intuitive sense of what cheap wood or ersatz brick look like. You can just immediately sense the economy of the construction.

bitwize 1 day ago 1 reply      
To me a McMansion is built like a suburban tract house, but has façade features that make it look vaguely mansion-like. The architectural equivalent of Imari Stevenson's "Lamborghini", which is a fiberglass shell on a Pontiac Fiero chassis and powertrain.
Spooky23 1 day ago 0 replies      
McMansions are striver homes, and they throw too many prestige elements at the house. It's like when you look at a home with an over the top 80s kitchen -- they don't age well.

The other thing is that they really aren't architected. Builders drive the design rather than clients, and the bling is there to hide the lousy construction materials and craftsmanship. The insides of these things are an even bigger shitshow -- all cheap Home Depot fixtures and millwork.

alricb 1 day ago 0 replies      
From a purely functional perspective, complex roof lines and the multiplication of volumes suck for durability and energy use. The more complex the roof, the more vulnerable areas it contains. The more edges and transitions you have, the harder it is to properly flash and air seal the enclosure.

And huge voids suck too: windows have poor thermal resistance and large glazed areas mean overheating due to the sun's radiation.

Add cheap material and poor construction practices (the famed "ductopus" is often found in million dollar homes) and you get crap housing, even without taking aesthetics into account.

mrcsparker 1 day ago 2 replies      
This whole article reeks of magical thinking (as do many of the comments here).

I have been looking for information on what constitutes a McMansion vs a non-McMansion and the arguments parallel the same sort of stuff that made Feng Shui so popular: there are magical rules that make things right or wrong.

Would love to see a real test done on these assumptions. If there are real rules being violated with these homes, then an unbiased test should be easy to put together.

acbabis 1 day ago 10 replies      
I'm not an architect, and I probably wouldn't have even noticed how ugly these houses are if I spotted them in the wild, but having read this article, it seems mind-boggling that someone would spend the money it takes to make a house and not adhere to basic architectural principles. Can someone explain how this happens?
skybrian 1 day ago 6 replies      
This is basically an article of the form "I believe the status of X should be lowered," giving some reasons why.

I'm sure they believe it. Everyone always thinks their own culture has better taste. That's pretty much what culture is, a loose agreement on what the best X is for a wide variety of X.

The best way out is not to play. Why have any opinion at all about "mcmansions?" Also, why a special term? Why not just call them mansions?

Lazare 1 day ago 0 replies      
My issue with the article is that it is:

1) Making purely aesthetic arguments

2) In several cases I feel that the "bad" architecture is aesthetically more appealing than the "good" architecture.

For example, when the author discusses how "secondary masses should never compete with the primary mass"; what that seems to mean in practice is "ugly, squat, lumpen buildings are good; interesting, attractive silhouettes are bad". No doubt the author would disagree with my characterisation, but then, that's sort of the point. It feels very subjective.

Of all the examples given, the one I unambiguously agreed with was the one where none of the window styles matched.

galfarragem 1 day ago 1 reply      
Bad architecture and good architecture are like good software and bad software. For non-IT folks it doesn't matter as long as the software works; for IT folks it matters, because they can 'see' things that others can't. It's the same with architecture.

Disclaimer: I'm an architect.

edit: this is the best blog I (didn't) know explaining what makes architecture good. There is a lack of short articles on this subject. Thanks for posting this.

ryanmarsh 1 day ago 1 reply      
People don't buy these for curb appeal. They buy them because of interior features. Also, the exterior appeal only matters in relation to the architecture around it. My wife knows the difference between good architecture and bad; we just haven't seen a new neighborhood that resembles the old rich parts of town. So as long as we keep buying vaulted ceilings, open lofts, big kitchens, and game rooms ¯\_(ツ)_/¯

Source: I own one and I did mortgages for a while. Was product manager for a company that sold some tech to home builders.

Houshalter 1 day ago 1 reply      
These are entirely subjective preferences about appearance and aesthetics. Personally I liked the "McMansion" examples better and found the arguments for why they were bad unconvincing. The rules seem arbitrary and made up.
karma_vaccum123 1 day ago 2 replies      
Most of these homes have awful interior features, like oversized entryways that feature giant voids. The heat flows to the ceiling, which is a terrible waste. Similarly, the "great rooms" of these homes tend to have too-high ceilings, which once again are wasteful, as furnaces must work even harder.

These homes tend to also have ridiculously oversized garages that are fashioned to make the home look much larger than it is.

bootload 1 day ago 0 replies      
Shallow article; it's like all the engineering in the design/construction has been sucked out. I would have expected something on the lack of:

* energy efficiency/re-cycling of building materials used

* integration of passive/active energy saving power

* design of buildings capable of whole-life existence

* improved use of natural light and air flow

* use of factory built, prefabricated frames (ie: Holland)

If you watch Kevin McCloud's Grand Designs (BBC) you would appreciate this. [0] Any others you can think of?

[0] Though even best intentions have problems: cf "Grand Designs presenter Kevin McCloud's eco-development 'riddled with building errors"http://www.telegraph.co.uk/news/earth/environment/conservati...

ourmandave 1 day ago 2 replies      
Definition of "McMansion": a large modern house that is considered ostentatious and lacking in architectural integrity.

The architectural integrity I guess is the problem. The house can be sprawling (no well-defined center) or have too many windows (that may be of different sizes).

It sounds like designers hating on Comic Sans or programmers arguing over tabs vs spaces.

But worry not...

This rounds up post #1 of McMansions 101 - but don't worry, there are many more factors that make an otherwise normal suburban house a McMansion, and each will be covered in their own special posts.

codeonfire 1 day ago 0 replies      
Every article about apartment buildings complains that new buildings are simple, ugly boxes and should be more artistic flowing examples of the character of the neighborhood. In private homes everything is too complex, unbalanced and disproportionate. Homes should be simple proportionate boxes where all the windows are the same size and line up. Architects think individuals should live in their designated worker eat-sleep unit pods while large classes of people as a whole deserve unique and inspiring works of art. To break it down even further, I will just guess that commercial architecture pays better than residential and this is manifesting as a hatred of "McMansions". There are always going to be both ugly and beautiful homes.
zeko 1 day ago 2 replies      
While McMansions do seem kitschy and aesthetically unappealing to me, that is my subjective feeling.

I find it inappropriate and offensive that this author decides to be the sole authority on the aesthetics of PRIVATE properties, appealing to mainstream architecture as some sort of unassailable, celestial authority and promoting boring, utilitarian designs. Just how far down this slippery slope do you want to go? Why not box people in grey Soviet apartment buildings?

All of this sounds to me like the arrogant ramblings of people who like authoritarianism and can't grasp the concept of personal freedom, or that people like homes that are wacky and an extension of their own egos and sense of pride in their financial success.

snicker7 1 day ago 1 reply      
The author explains what good design principles are, but not why they are important. So what if a house had too many 'secondary masses'? Why exactly is that such a bad thing? The author fails to answer such basic questions.
brownbat 1 day ago 0 replies      
There are better ways to critique McMansions.

The article's approach is to list a series of hard aesthetic rules, then show how some McMansions break them.

That would be a valid critique, but the article never fully establishes that those aesthetic rules are universally desirable. In aesthetics, rules are made to be broken.

If standards are broken haphazardly or unintentionally, that may be a sign of poor aesthetics. But I'm convinced McMansions intentionally break several of those rules.

Breaking those rules is the only way to convey something that the author somehow completely fails to notice.

A better aesthetic test of McMansions would only involve two questions:

1) Are McMansions effectively conveying some idea, and,

2) Is that idea worth conveying?

Setting (2) aside for the moment, I think McMansions meet (1) very well, primarily by breaking the author's "rules."

Think about who buys these. The architects and the purchasers are not ultra-rich, but are still in a class that wants to display wealth.

When purchasers in this group compare several homes, they are most likely to buy the one that makes them feel like they're displaying the most wealth and power for their price point.

Wait, why would a tangled mess of asymmetrical nonsense achieve that?

Because the aesthetic McMansions are shooting for is "this is more than one house."

Bits crop up all over the place and the masses are obscured so that the building gives the appearance of multiple dwellings.

What does it look most like? Villages that have cropped up around a central castle, organically growing in different heights and orientations without much central planning. (Maybe more from fairy tales than real world villages.)

The image the McMansion tries to convey is one of dominion over a territory or a population. I think McMansions succeed, primarily through the subversion of the author's favored aesthetic tropes.

But even if McMansions successfully evoke small villages, part (2) is the more important question.

Is mimicry of an order of magnitude of wealth and power beyond one's reach aesthetically appealing, or desperate and ostentatious?

I'd find a critique along those lines far more compelling. The fact that some buildings fail to satisfy some arbitrary checklist is not the important thing; what the building says about the owner's character and insecurity is far more interesting.

elgoog1212 1 day ago 1 reply      
Go to any blog about "good" modern architecture. As it turns out, "good" architecture these days is roofless Bauhaus with windows that take up the entire front wall, including the bedroom. Looks good on Tumblr, impossible to live in, expensive to build, and an insane heating bill. No, thanks.

Case in point: http://designhomes.info/wp-content/uploads/2016/04/mid-centu...

applecore 1 day ago 3 replies      
This critique is composed entirely of exterior shots. Maybe the McMansion layout leads to a better interior configuration for a given square footage?
digi_owl 1 day ago 0 replies      
As a foreigner, what I find myself thinking is that McMansions are trying to capture the feel of a generational dwelling while being brand new.

Meaning that if you look at old houses in Europe or similar, you very often find extensions and whatnot that have been added without any overall design in mind. Instead they were built with the time and resources at hand at the time of the addition.

tripzilch 14 hours ago 0 replies      
You've got to know the rules before you break the rules.

I don't know about McMansions. Certainly (~20th century) Dutch architecture follows somewhat different design guidelines. We don't have as much space, sometimes sacrificing balance or even proportion. We definitely have much bigger windows (voids), it's a cultural thing. Similarly, fig.2 "Secondary Masses: 6" looks to me like it could be a Mediterranean communal farm, grown organically (truly don't see what's wrong with that one tbh).

But I don't want to go into details. The most important rule of design is:

You've got to know the rules before you break the rules.

Some of these rules are grown by circumstance, and you can't always break them (or build a shitty house--but sometimes you can be clever).

Some of the rules are grown by taste or preference. These you can play with.

Some of the rules are grown by circumstance, turned into preference, because technology advanced and preference didn't. These you can also play with.

Relating this to HN: The above holds just as much for designing webpages or app user interfaces.

mangeletti 1 day ago 0 replies      
Is it just me, or is Tumblr's UX absolutely terrible?

You reach the bottom and it says "keep reading". Keep reading what, the last post in the list or pagination to more posts? So I click it, takes me to the last post (fine), so I scroll down to look at some of the pictures, etc., and then at the bottom it seems to continue back at the top of the prior post list. So I go back, scroll back down to the bottom of the page and notice a "See mcmansionhell's whole Tumblr" link. Perfect. I click it... App Store.

hartror 1 day ago 0 replies      
There is a huge cheap build going on next door. It is two stories with 7 bedrooms with no back yard because pool. The contractors I talked to joked it will be knocked down in 20 years because of the low build quality.
stcredzero 1 day ago 0 replies      
The overarching theme of American architecture gone wrong is the lack of supporting real community. It used to be that supporting commerce did that as well, but the car changed this.
vanderZwan 1 day ago 1 reply      
Related: Ugly Belgian Houses


Doesn't try explaining anything though.

skilesare 1 day ago 0 replies      
People have already mentioned Christopher Alexander's Pattern Language and Timeless Way, but I'll note that he has further distilled those ideas in his The Nature of Order Series: http://amzn.to/2b4XYf5

1. Levels of Scale

2. Strong Centers

3. Thick Boundaries

4. Alternating Repetition

5. Positive Space

6. Local Symmetry

7. Good Shape

8. Deep Interlock and Ambiguity

9. Gradients

10. Contrast

11. Echoes

12. Roughness

13. The Void

14. Simplicity and Inner Calm

15. Each in the Other

tired_man 1 day ago 0 replies      
I thought having the poor taste to plunk one down in a neighborhood where it sticks out like a sore thumb was enough.

I'm happy to see there are other aesthetic reasons, too.

velox_io 1 day ago 0 replies      
I'm not a big fan of the McMansion houses, but I'm sure they mean the world to their owners, who have made large sacrifices to get the house of 'their' dreams.

Personally I like the Georgian style: big windows, and they look decent when kept simple. Or something a little more modern, depending on the location. Yes, money doesn't buy happiness, but I'm sure we all have a picture in our mind of a dream house. Nothing wrong with aspirations. Excuse me while I do more than just dream!

99_00 1 day ago 0 replies      
That's a great blog. I don't like the term McMansion though. Everyone seems to have their own definition, and sometimes it's applied to anything new and big that someone is envious of.
jimmaswell 1 day ago 2 replies      
That all sounds like opinion to me. When I look at the examples of what I'm supposed to think looks awful, I just see a house. The one with so much overwhelming "secondary mass" looks nice to me.

If you need to study architecture to "realize" how bad those look, maybe they actually don't look bad. All the people buying them seem to agree.

There are legitimate problems with the construction quality, as others have pointed out, but trying to claim they objectively look bad is clearly wrong.

hx87 22 hours ago 0 replies      
Most of these houses, and most McMansions, would look and function much better if they just got rid of two features:

1) Gables that don't terminate a wing of the house

2) Massive hip roofs

#1 makes no sense from either an aesthetic or functional point of view and ends up doing nothing besides adding cost and making solar panel installation more difficult.

#2 also makes solar panel installation more difficult, is harder to ventilate (or air seal and insulate, in the case of cathedral ceilings), and unless labor is extremely cheap, more expensive to build compared to trussed gable roofs.

re_todd 1 day ago 0 replies      
It always amazes me that the people I know who are financially well-off almost always choose to live in an ugly McMansion. Voluntarily. I've determined that if I ever become well-off I'll buy an old Craftsman or Spanish style home. They may be smaller, but I can always buy sheds or storage space.
mbfg 1 day ago 0 replies      
I'd take any of the McMansions shown here. Sorry.
bjterry 1 day ago 1 reply      
This conversation has gone in a very interesting direction. It's true that on one level the perspective of this blog author corresponds to a sort of class-based tutting of the nouveau riche, but on another level I can't shake the feeling that he is pointing to something that's real.

I have much more experience doing web design (some) than I do evaluating architecture (none). The principles he is describing apply in a similar manner to web design, and they are similarly hard to articulate. When you see a block of text without enough space between paragraphs, you know it doesn't have enough space, even though you would be hard-pressed to provide an objective reason why a paragraph needs 20 px vs. 5 px vs. 100px vs. 500px. Only one of those distances feels right, and it has almost nothing to do with how much vertical space is available on your screen or the relative ease of reading paragraphs with that precise distance between them.

Similarly, you can feel a wrongness when a similar visual element is repeated, but without consistency, like a second text box that has a grey that is just slightly bluer than the last text box, even though they contain text of the same type at the same level of semantic importance. That is the equivalent of including windows with wooden shutters, and a set of windows with blinds, and a set of bay windows on the same facade. (For a visual-design example of his masses and voids principle, see the BeOS UI window, but I think this is the weakest of his principles)

It is easy to be a contrarian and say "this guy is wrong, all of this is subjective," but I think that contrariness is more wrong than right. The best functional (not purely artistic) designs, whether they are web designs or architectural designs, hold to a set of quasi-universal aesthetic standards, including balance (symmetry or balanced asymmetry), repetition (consistency in web development terms, repeated elements in musical terms), and appropriate blank space. I suspect that these core principles reflect aspects of human cognition that are not subject to pure subjectivity.

I think if you were designing from first aesthetic principles, you would probably retain balance and thematic cohesiveness (repetition). What you would lose is the trappings of particular eras, like roman columns or those shutter windows. Even if your house was based on a geodesic dome you'd want it to have a certain proportionality and space between visual elements. The problem with McMansions is that they take ONLY those trappings, and discard the principles, and that's what this article should really be about. The reason for this is obvious: the uneducated consumer sees only the trappings, because they are concrete and obvious, thus easy to evaluate. They are associated in the minds of buyers with high class, and so they seem high class in this particular instance.

sireat 1 day ago 3 replies      
My SO is an architect, my relatives and friends are also architects.

Common complaint of theirs: clients like to meddle

An analogy would be asking a web developer for <blink> tags, or asking your dentist for gold incisors instead of ceramic.

zeroer 1 day ago 1 reply      
I get what the author is saying. But as a home owner, I'll be spending 98% of my time in the house, not staring at the exterior. I'd like to see criticisms of the experience of living in a McMansion.
rbanffy 1 day ago 0 replies      
To be fair, I found the second one perfectly reasonable - it's kind of nice in its determination to avoid the recipe the article outlines.

Having said that, it's often better to fail by actively trying to avoid following the norm than to try to follow it and not quite succeed.

jernfrost 1 day ago 0 replies      
As someone who has done a lot of GUI design, I recognize many of the design principles. The point of balance and symmetry e.g. is very relevant for GUI design as well.

It is very useful to know these sorts of principles because without them you notice stuff looks wrong but you don't know why and you don't quite know how to fix it.

manigandham 1 day ago 1 reply      
There are plenty of great looking "mcmansions" - bad design for some houses (of any type) shouldn't be generalized to an entire construction and development approach.
codeulike 1 day ago 1 reply      
What would the software equivalent of a McMansion be, and what sort of arguments would be used to disparage it and also defend it?
gsmethells 1 day ago 0 replies      
This made the rounds on Facebook. It does a good job of covering why bigger is not necessarily prettier.
swehner 1 day ago 0 replies      
The funny thing to me is, it's more important what your neighbour's house looks like.
emodendroket 1 day ago 0 replies      
Well, if you need education to understand why your house is not aesthetically pleasing, maybe you should just forego the education and be happy with it.
gopi 1 day ago 0 replies      
Maybe I am unsophisticated, but I liked the supposedly bad examples better than the good examples!
galfarragem 22 hours ago 0 replies      
Architecture is materialized values. McMansions materialize the values of the majority of Americans.
sp527 1 day ago 0 replies      
All his examples of presumably 'good' architecture strike me as hideous and in many instances depressing. I'd take the McMansion over those hands down.
bane 1 day ago 0 replies      
This post wades into an area normally met with knee-jerk comments and defensive house owners, but I think there's some really subtle points being made here -- I don't agree with some of them, but I think the author is really trying to figure out what exactly is going on with some of the really terrible McMansions shown. However, these problems extend up through real mansions as well.

Here's a ~$9 million house being built that suffers from almost all of the claims put forth in the post, even though they're supposed to be hallmarks of McMansions and not real mansions: https://www.redfin.com/MD/Potomac/10501-Chapel-Rd-20854/home...

After spending some time thinking about the problem, I've come to the following hypothesis: architecture schools aren't producing architects trained to produce quality designs that match the desires of the vast majority of the marketplace.

Consider the following: we've been through decades of massive architectural development in practical housing, the result of which appears to be variations on boxes-with-mixed-materials https://encrypted.google.com/search?q=modern+house+design&hl...

Most consumers don't want these kinds of buildings, and the ones who do buy them because they're custom-architected, which acts as a status symbol. Consumers want classic "house" shapes with various degrees of "fanciness".

Most consumers also don't know good from bad architecture. But it's okay, because it appears most architects don't know good from bad either or else we wouldn't be getting endless variants of boxes-with-mixed-materials. The last time the kind of design this blog is looking for was actively used was during the American colonial period. And that wasn't always a successful meme -- America is littered with old homes with huge pillared entrances on otherwise modest homes. http://www.house-design-coffee.com/porch-columns-disaster.ht...

It could be worse: America could be littered with craptastic hyper-modernism and weird vestigial bits of uninteresting history http://uglybelgianhouses.tumblr.com/ But Europe isn't immune to this nonsense either; McMansions have started infecting everywhere from Ireland to Germany.

Asia has its own entirely unique series of architectural sins.

I think there are a number of forces at play here that have all led up to where we are.

- A growing desire in industry to find ways to minimize materials usage in new construction (older homes are sometimes hilariously overbuilt)

- A rising and increasingly affluent middle class

- A lack of good examples to educate people on -- most buildings are crap...even the ones that are well considered often fall apart quickly http://failures.wikispaces.com/Fallingwater

- A lack of architects who can produce something other than boxes-with-mixed-materials

- Continued cost cutting as plans go to market

- Consumers who are fine settling with "nice enough" if it saves them $100k-$200k on a similarly specc'd home

- Most people don't give a crap about gardening or yard maintenance. Who cares what your house looks like or how small the property is if you're spending all day inside your climate controlled 5000 sq ft personal facility with movie theater, gym, gourmet kitchen and jacuzzi?

It also turns out most McMansions are actually pretty nice places to spend time: well lit, nice materials on the surfaces you interact with, tons of space so you don't have to make too many compromises, open floor plans and so on.

They are worlds better than the 60's, 70's and 80's era homes I grew up in and later owned. The 70s and 80s were particularly abysmal for material quality and energy use. Holy crap the heating/cooling bills on those homes.

AReallyGoodName 1 day ago 0 replies      
Americans are really snooty imho. I suspect the snootiness is both the reason these houses exist (I have a bigger house than you) and the reason this criticism exists (I have a nicer looking house than you).

I feel like it's a rather toxic cultural thing to care at all about other people's houses. Likewise with cars. It's also something that's not nearly as prevalent in mainstream culture outside the US (snobbery is the domain of the upper classes exclusively).

PhasmaFelis 1 day ago 0 replies      
I should have known what was coming when the article kicked off with, essentially, "If your tastes differ from mine, don't worry! You're not stupid, just uneducated."

But even then I was expecting him to list things like shoddy construction, or dull featurelessness, or cookie-cutter tract construction. No, his #1 issue is houses that are (gasp) not perfectly rectangular. #2 is houses that are asymmetrical, but not in a cool way. He actually calls out one house specifically for having too many windows. I thought windows were for letting sunlight in, but apparently they're actually intended for class signalling.

And class signalling is all this is. He gives the game away when he specifies that these rules are very important unless you're using a cool, [m/M]odern style where rule-breaking is socially approved. This is no different than the byzantine fashion codes manufactured by society matrons around the turn of the last century to exclude the vulgar nouveau riche. Never wear white after Labor Day; never have too many windows on your house.

jheriko 1 day ago 0 replies      
you would think 101 would tell you what the hell a mcmansion is, instead of me needing to google it halfway through what is otherwise an interesting article.
NietTim 1 day ago 0 replies      
But why would you care about the outside of your house when the inside is nice and cozy?
ilaksh 1 day ago 1 reply      
The main problem with McMansions is that they waste a lot of space and energy.

The purely aesthetic perspective should embarrass contemporary architects by its shallowness.

squozzer 23 hours ago 0 replies      
The author made some good points but focused solely on the outside appearance of the home. Another critical aspect of architecture is the layout and use of interior space - a point where many beloved designs - such as colonial - have weaknesses. For example, long hallways. McMansions seem to be designed from the inside out, which is almost a given for an overtly modern (Bauhaus) style, with a traditional skin then applied.
brokenglass 1 day ago 0 replies      
might want to fix the typo in the title
cosarara97 1 day ago 0 replies      
s/cutre/cture/(On the title)
Machine Learning and Ketosis github.com
676 points by ddv  3 days ago   290 comments top 58
ghshephard 3 days ago 6 replies      
What's interesting about this post isn't the actual diet advice, but the guidance to track and see what works for you. Different people react to fasting differently - some continue to burn calories and drop weight, some just go into starvation mode, have their BMR crash through the floor, and end up exhausted all the time because their body is fighting like crazy to conserve energy.

Likewise - some people, when they start to eat carbs, see their BMR ramp up, are full of energy, and end up running 5-10 miles a day to burn that energy off, and are pumped the rest of the day.

Also, focusing on weight to the exclusion of everything else is really horrible. It's pretty easy to lose 40-50 pounds and end up much less healthy than you were before if you aren't careful. Having some sense of your VO2 max, your strength, endurance, flexibility, etc. is sometimes as important, if not more important, than what your weight is to +/- 20 pounds.

Everybody is somewhat different in how they'll react to various diet and exercise regimes. Understanding that, and taking a bit of time to watch how your body responds, is the important insight here.

onetwotree 3 days ago 5 replies      
I'd really like to see this kind of analysis applied to a larger sample.

If someone were to put together a little app to make this sort of data collection trivial and upload it to a central (preferably cheap) repository, would people be interested in contributing their data in exchange for analysis?

Also, since this is HN, is anyone interested in building something like this? I've got an hour or two a week of spare time to chip in.

Unrelated question, mostly for OP: do you know of any publicly available databases of GI/GL info? It's really important for people with type 1 diabetes (a subject rather dear to my heart), and could also be useful for OpenAPS stuff (https://github.com/openaps).

gtrubetskoy 3 days ago 6 replies      
I don't have a weight problem, I learned about ketosis researching natural ways for improving focus and concentration, and somehow came across "bulletproof coffee". (I'm only using BP for lack of a better name and I don't endorse the stuff they sell online).

To my utter surprise, something as stupidly simple as coffee blended with butter and coconut oil (or MCT oil) first thing in the morning did unbelievable things for mental productivity. After much googling I learned that this is not a fluke, and that there is real science behind it - beta-oxidation, etc., all that good stuff.

What's most incredible to me is that (1) I didn't know about it, I always thought glucose is the only fuel (and I consider myself fairly knowledgeable as far as basic physiology is concerned) and (2) that sugar, especially the industrially produced kind, is about as close to being the root of all evil as it gets and there is nothing wrong with fat at all.

benkuhn 3 days ago 2 replies      
It's astonishing (and really awesome) that they were able to extract such a strong signal from daily weight swings! This makes me a lot more optimistic about the possibilities of quantified-self-type stuff.

Three things I'm really curious about:

- How significant are the estimates of lifestyle factors? Do you have p-values? If you bootstrap resample, how much do the rankings change at the extremes?

- How much cognitive overhead did it impose to collect the data for this? Did you put a lot of effort into designing the tags beforehand or making sure you weighed yourself at a consistent time?

- It looks like the predicted delta from going from a "nosleep" day to a "sleep" day is about 1.4 pounds (sleep coef minus nosleep coef). That seems fishy, or at least like it will stop working fairly soon, because you can't actually lose 1.4 pounds/day sustainably. Is it possible there's something weird going on with the data, or that those variables don't have the obvious meanings?

tunnuz 3 days ago 8 replies      
This is all good if your goal is weight loss. However, weight loss doesn't necessarily mean higher fitness. Glycogen is fundamental if you do sports, and exercise is a major ingredient in getting fit. If I read correctly, exercise was not a big part of your experiment; how would you suggest modifying the experiment to accommodate one's exercising needs?

On a side note, I reached a similar conclusion on the role of "carbs at night", sleeping, and fats, and I read this interesting article https://aeon.co/essays/hunger-is-psychological-and-dieting-o... on the importance, for effective weight loss, of feeling satisfied (I believe there is also a reference to the relationship between eating fats and feeling satisfied).

fernly 3 days ago 3 replies      
Just for what it's worth, there are several Soylent-like meal replacement products that are explicitly formulated for ketogenesis. This could be an aid for a person who wants to try a keto diet without having to research a lot of new recipes.

Biolent has a keto variant: http://biolent.ca/

Keto Chow: https://www.thebairs.net/product-category/ketochow/

Keto Fuel: http://superbodyfuel.com/shop/keto-fuel/

KetoSoy: https://www.ketosoy.com/

PrimalKind is "paleo": http://primalkind.com/

Edit: added Keto Chow, which I forgot on first pass!

cel1ne 3 days ago 3 replies      
I lost 15 kg (33 lb) over a period of 7 months.

* Stopped drinking juice, coke etc. completely

* Ate sweets strictly only once a week (like one cake every Sunday)

* Ate carbs mainly at lunch when doing sport afterwards. In the evening I ate carbs too, but much less.

There's one thing that all these fasting-guides and tips fail to mention:

The fastest way to lose weight is to raise your resting energy consumption, and the fastest way to do that is to gain muscle mass by training.

frankus 3 days ago 3 replies      
Totally anecdotal, but I want to second the recommendation of a longer fasting period.

I switched to a strict-but-not-religious "no food between 7 pm and 11 am" system (with exceptions for weekends and social occasions).

Within a few months I was down 15 pounds (~182 to around ~167) and had shed 4 inches off my waist (~34 to ~30). I'm about 5'10" and 41 years old.

It's definitely helped with physical activity (mostly parkour/free running) and I look better. It's also more convenient than what I was doing before, since I don't have to cook breakfast.

The only negative side effect (possibly unrelated) has been that I need a much cooler sleeping environment to be comfortable.

The only thing I would add is that I'm starting to (upside-down) plateau at around 165 and (what my scale says is) 20% body fat. I would love to lose another 5-10 pounds but it'll probably be a slow process.

philip1209 3 days ago 2 replies      
I've been doing ketosis for about 6 months, after having done it twice in the past. From my heaviest to today over the last 2 years, I'm down 50 pounds.

The data in this post was cool. I found Dr. Peter Attia's analysis to be one of my favorite resources. He's a medical doctor who was an overweight endurance athlete, then began doing ketosis. I appreciated an honest scientific analysis of the benefits and drawbacks of ketosis.

His blog:


As an aside, I think that our attitude toward insulin from a public health perspective is going to change a lot in the next few years.

ryeguy 3 days ago 4 replies      
This is delightfully nerdy and overly analytical, but I can't believe someone would go to the extent of creating this yet never think to look for counterarguments against their dieting regimen. Doing this would have given him the answer much sooner.

Trying to figure out how the consumption of certain foods correlates with weight gain/loss is a waste of time because it's simply the caloric content that matters, not the macronutritional content (eg protein vs carbs vs fats). There are dozens of studies showing that only calories matter for weight loss, including comparisons against low-carb, low-fat, and more[1].

The author is acting like each food has arbitrary properties that make it good or bad for weight loss. It's a good reason to play with ML, but it's easier to just count calories. I hate seeing people go down the "good food/bad food" path of thinking because they end up overanalyzing the shit out of everything they eat for no good reason.

The glycemic index doesn't realistically matter either, for a ton of reasons. It hasn't been taken seriously as a useful marker when choosing foods for quite some time now. It's not a reliable indicator of anything. I posted a summary a few years back that debunks all the insulin/GI spike voodoo[2].

Low carb diets work, but they only work because they're a trick to get you to reduce calories. Look at what you're eating and remove all the carb foods. Notice how much your calorie intake dropped. It's a simple way of losing weight without counting calories, but that's it. The weight loss on keto, atkins, etc have nothing to do with carbs, and everything to do with calorie restriction.

It's worth noting that low carb diets have great health benefits, however [3].

[1]: http://examine.com/nutrition/what-should-i-eat-for-weight-lo...

[2]: https://reddit.com/r/Fitness/comments/j853z/insulin_an_undes...

[3]: http://examine.com/nutrition/are-there-health-benefits-of-a-...

sangd 3 days ago 0 replies      
I really liked your writeup because I went through a period when everything I read was misleading, even words from doctors. That's when I was trying very hard to improve my health (I had gained about 30 lbs since I came to this country 15 years ago). To do that, I ate less fat, avoided carbs, got a lot of exercise (1-3 hrs a day, at least 5 days a week), ate oatmeal, even took statins. And they all failed, one after another. I went through a lot of the things mentioned here. Finally I decided to forget everything doctors and researchers said. I looked at my parents' diet and how I was raised and started slowly from there. My health then improved a great deal. I'm glad that you wrote something here for everybody to read. But there's one point I would like to add to the recipe: improve your mental state by listening to the body (meditation, yoga, and brisk walking are good methods), eat only when the body feels hungry and stop eating when it feels full, and eat the food that feels good after eating (you will develop your own list). I lost about 10 lbs and haven't gained or lost any weight in the last 3 years.
hoodwink 3 days ago 5 replies      
As a long-term ketogenic eater (10 years), here are my top simple tips. Unfortunately they were not gleaned using machine learning.

1. Watch your protein. Most people when first going keto will eat too much protein and not enough fat. Protein has an insulinogenic effect when eaten in quantity. Keep protein below 8 oz per meal. Don't be afraid to eat more fat.

2. Avoid cheese. Yes, it's technically low carb, but it repeatedly throws me and my girlfriend off (also a low carber).

3. Avoid nuts. Yes, like cheese, nuts are delicious. But they're a slippery slope. Life will be easier if you avoid them.

bmarkovic 3 days ago 1 reply      
The author is wrong that sleeping more just gives less time for eating. Truth is that sleeping more reduces insulin resistance, increases testosterone production in males and puts leptin under control. All three reduce carb craving, push us towards meat and fat and increase natural fat burning.
davidf18 3 days ago 0 replies      
In males, testosterone decreases over time, which causes muscle mass to decrease. Building up the large muscles through resistance exercise or any exercise (eg, gluteus maximus -- thigh) increases muscle mass. Increased muscle mass raises the basal metabolic rate (BMR), which is the number of calories consumed even while at rest, even while sleeping.
raffy 3 days ago 3 replies      
I guess if we're sharing fitness plots, here's 1300 measurements and 7 DXA scans:


elo_ 3 days ago 1 reply      
I collect some of my own data in 1/0 form in regard to whether I was productive the day before as well as a few other things (I have a google form for myself that I fill out each morning). I only have a month of data but here is what it spat out:

  FeatureName  HashVal  MinVal  MaxVal   Weight  RelScore
  ^dreams        24546    0.00    1.00  +0.0705    47.11%
  ^shower       215555    0.00    1.00  -0.0239   -15.96%
  ^exercise     190069    0.00    1.00  -0.0350   -23.41%
  ^vitamins     252959    0.00    1.00  -0.0442   -29.56%
  ^write        129676    0.00    1.00  -0.0687   -45.90%
  ^publish       12600    0.00    1.00  -0.1496  -100.00%
Which is to say that when I write and publish, I also have the willpower (and other combined factors) to lose weight.

Interestingly I was tracking dreams because I made various changes all at once:

  -phillips hue bulbs
  -new mattress
  -new exercise routine (up from nothing)
  -started taking vitamins
  -stricter on my diet
And was wondering what the cause might be.

When I say "dreams" I mean - "did I wake up remembering vivid dreams?". I wonder now if it's related to caloric surplus.

I also have minutised step data, nightly minutised sleep data and hourly self-reported mood data that I might try to throw into the system and see what it says.
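Per-tag weights like the table above can come from any linear model over 0/1 daily tags. A minimal sketch of the idea, using ordinary least squares on made-up data (the tag names, design matrix, and weight deltas here are all hypothetical, and the original post used vowpal wabbit rather than numpy):

```python
import numpy as np

# Hypothetical daily logs: each row is one day's 0/1 tags,
# y is that day's weight change in pounds (invented numbers).
tags = ["dreams", "exercise", "vitamins", "publish"]
X = np.array([
    [1, 0, 1, 0],
    [0, 1, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 1],
    [1, 1, 1, 0],
    [0, 1, 0, 1],
], dtype=float)
y = np.array([+0.4, -0.2, -0.6, -0.8, +0.1, -0.9])

# Least-squares fit: each coefficient estimates the weight delta
# associated with that tag being present on a given day.
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
for name, w in sorted(zip(tags, coef), key=lambda t: t[1]):
    print(f"{name:10s} {w:+.3f}")
```

With only a month of data the coefficients will be noisy, which is one reason a bootstrap or p-values (as another commenter asks about above) would be worth computing.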

samuelbrin 3 days ago 3 replies      
I might have missed it, but it doesn't look like ketone level was one of the factors being measured. You can actually buy ketone pee strips at any drugstore for cheap. It would have been an interesting thing to track, since conventional wisdom says that ketosis is binary - you're in it or you're not - and the actual ketone level doesn't affect the weight loss. I'm sure this has been put to the test in some experiments already, but maybe applying machine learning could teach us more and validate/invalidate this hypothesis.
zzleeper 3 days ago 1 reply      
> The 'stayhome' lifestyle, which fell mostly on weekends, is a red herring; I simply slept longer when I didn't have to commute to work.

Is it?

First, whatever method you use should already take into account that sleep happens together with stayhome. Even basic regressions account for that.

Second, staying at home means your eating binges are constrained by what's around you. If it is healthy stuff it might mean weight loss, if it is bread and chips the opposite

iopq 3 days ago 1 reply      
Of course you're going to gain weight when you eat carbs. Glycogen contains mostly water and carbs, so if you store glycogen you gain weight, up to 20 lbs from completely depleted.

So let's say you eat low carb and train very hard for a week. Your muscles will be depleted, your liver will be depleted. You see that you lost 10 lbs! Great! But is it? Maybe you lost one pound of fat and 9 pounds of water. Then you eat NOTHING BUT carbs for three days and you gain 20 lbs. Oh no! But actually it's your muscles expanding their capacity to store glycogen (since you trained hard and depleted yourself they will regain even more glycogen than before). You may have not gained ANY FAT AT ALL!

I have had great success dieting with medium fat medium carb diets. I try to keep between 180lbs and 220lbs (I'm 6'4"). I'm saying that this kind of tracking is oversimplifying the issue of fat loss - you need to see that you're losing fat tissue which involves at least caliper measurements.

Borkdude 3 days ago 3 replies      
Warning: unpopular opinion here. Long term low carb dieting and ketosis is not without risk. Yes, you will see weight loss, but what about other health markers? Check out some of the information here: http://nutritionfacts.org/topics/low-carb-diets/

What has worked for me was a plant based diet without added oil and processed foods: https://medium.com/@borkdude/tl-dr-of-my-long-term-weight-lo...

Been doing that for five years now. Never hungry, still happy with it.
lucidguppy 3 days ago 4 replies      
Carbohydrates do not make you fat. I am eating a high carb vegan diet and have lost 63 pounds since last October.

My weight loss: http://imgur.com/a/cGb4X

I do not restrict the amount of food I eat. I snack and have big meals.

Carbs do not make you fat. Eating high caloric density foods makes you fat.

More resources: http://nutritionfacts.org/



laretluval 3 days ago 1 reply      
Interestingly "gioza" and "gyoza" as factors come out with opposite signs. I guess that provides a heuristic confidence interval for the weights.
clamprecht 3 days ago 0 replies      
This is awesome. I'd love to see something similar from someone who (intentionally) gained muscle mass. Is it really about high-protein + high-carbs + workout + sleep, or is there a more optimal diet for this?
escoz 3 days ago 1 reply      
If anybody is reading this and curious about Ketosis, I'd recommend Taubes book (https://www.amazon.com/Why-We-Get-Fat-About/dp/0307474259). It's a great review of scientific studies done over the years.

I read that 4 years ago, spent another 4 months reading the listed studies, convinced myself it was a good plan, and lost 40 pounds with no exercises. I still do LCHF after all these years, and likely will never go back to a traditional diet, it feels great.

timeu 3 days ago 0 replies      
Sorry for the crosspost/copy-and-paste:

For me personally this worked quite well for reducing bodyfat:

Intermittent fasting[1], lifting weights 3 times a week[2], and being on a cutting regime[3] (cutting on the rest days with low carb and loading on the workout days with more carbs).

In the beginning IF was quite difficult, but after a while the body gets used to it, and I also tend to have fewer cravings during the day. On the loading/workout days I often have a hard time getting enough calories because I feel full. I am using MyFitnessPal to track what I eat, but it's more about the macros and not so much about the exact calories (though it's also good to get a feeling for how many calories different kinds of food have).

[1] http://www.leangains.com/2011/03/intermittent-fasting-for-we... [2] http://stronglifts.com/ [3] http://www.lgmacros.com/standard-leangains-macro-calculator/

thisisananth 3 days ago 3 replies      
This matches my experience of weight loss using a high fat diet. I started it after hearing Sarah Hallberg's TEDx talk: https://www.youtube.com/watch?v=da1vvigy5tQ&feature=youtu.be

For a vegetarian, starting and continuing the high fat low carb diet is difficult. Are there any resources for more vegetarian recipes for high fat low carb food?

The article was very impressive. I liked the graphs and presentation.

agentgt 3 days ago 0 replies      
It is funny but I had pretty much the exact weight loss as the OP had doing intermittent fasting. Sadly I wasn't as rigorous on the recording of weight loss but I went from 195 to 175 over roughly the same time period as the OP.

One thing I found is writing down a plan seems to really help you stick with it.

I finally wrote down my workout routine after decade of on and off training. I always logged my workouts but I never wrote the overall plan down. If you are interested in my routine it is here (it has very little to do with diet as I was going to write a follow up some day): https://gist.github.com/agentgt/f93b78dbe13870a6d0a1

I have never publicly posted the routine, so if you feel the need to trash it I suppose you can do so in the comment area of the gist, but routines are pretty personal anyway (to the OP's point).

fb42 1 day ago 0 replies      
I think this shows how various factors correlate with the amount of water/food still in the digestive system.

I'm not surprised that, for example, sleeping longer would leave you with less water in your body.

Most foods that gave "weight loss" were high energy foods that have lower mass per calorie.

cpplinuxdude 3 days ago 1 reply      
A word of advice to those attempting a Ketogenic diet: get your blood work done on a regular basis.

If possible, get heart scans done once a year.

It's easy to slip into a sloppy version of the ketogenic diet, at which point you're consuming lots of unhealthy fats and carbs; your triglycerides and cholesterol could shoot up and put you in shit street.

This is a strict diet.

zachrose 3 days ago 1 reply      
Among words correlated to weight-gain, the least correlated of these is cheeseburgers! Be still my heart (not literally)!
vinnyp 3 days ago 0 replies      
The challenge for me is always knowing what to eat to help me accelerate my weight loss. I need structure. If I don't have structure, it's easy for me to eat things I shouldn't. I started a weight loss program 5 weeks ago called "Ideal Protein," which uses the Ketosis method. They supply breakfast, lunch, and a snack. I only need to make dinner (plus 2 cups of veggies for lunch). I love how easy it is.

I'm wrapping up week 5 tomorrow and I'm down 25lbs (~14% of my body weight)! They promise 2-5lbs of fat loss a week. I generally drop around a pound a day, but I'll get stuck a few days here and there. This method really works for me.

The best part is how fast you lose it. When I did South Beach years ago, it took me months to hit my goal. I have 14lbs left now and I should be able to hit my goal by Labor Day.

Give it a try.

dschiptsov 3 days ago 0 replies      
Yeah. I have been downvoted into oblivion for saying almost the same things without any machine learning, just by observing the eating habits and traditions of "village folks" in Nepal, India, Tibet and Sri Lanka, where I have spent the last few years.

The message was that the traditional (evolved according to local food sources and seasons) unprocessed foods of Asian tribes are the most natural and healthy. Economic and habitat selection pressures implicitly work the same way as machine learning, over hundreds of years.

Visit any rural area and try simple folks' foods. They will be way healthier than any processed crap. The similar patterns can be noticed really quickly.

brandon 3 days ago 0 replies      
I appreciate that the author shared his data because it's interesting to compare notes. I embarked on a strict ketogenic diet in 2012 from a significantly higher initial weight and observed a very different pattern of weight loss: https://i.imgur.com/mOt6P.png

I hit a loss plateau around 180 lbs that I couldn't break through until I began eating on an 8-16 fasting schedule like Ariel mentions in his "Further progress" section.

I gave up on the diet during 2015 and have since regained a significant amount of weight, but I suppose that's just an opportunity to apply some of the tracking techniques in this article to my next foray.

d23 3 days ago 1 reply      
I don't get it, and I've really tried. I'm referring to the keto diet (though this person's approach is obviously extreme overkill). I absolutely love meat, but when I go that low on carbs, I feel like crap. And sure, some people say if I just wait 4 weeks to 6 months my body will adjust, but... why bother?

Get a calorie tracker like my fitness pal, set a goal weight loss, stick to it, and 1 lb per week will be a breeze. You don't need a fad diet. For me, meticulously paying attention to what I was putting into my body and putting a number next to it made it a no brainer. It was almost gamified at that point.

raverbashing 3 days ago 2 replies      
It's a great experiment

However, I think the biggest issue with this is that it considers only a 1-day window. Weight can vary a lot from day to day (bladder and bowel contents, muscular glycogen, etc.).

kensai 3 days ago 0 replies      
Remarkable work. I am really pleased about the contribution of "sleep" in the whole weight loss experience. I have reached some similar anecdotal conclusions; I bet this is similarly not surprising for many persons.

I am not entirely sure sleep contributes only as "a fasting period". Sleep also means we are relaxed. All the other times of the day (when we are awake) we are relatively stressed, meaning more stress hormones (ie cortisol) which are known to be contributing to weight gain.

stickydink 3 days ago 1 reply      
Semi-related. For those who want to track their weight daily, efficiently, try Fitbit Aria (or one of the cheaper no-brand WiFi scales). Hook it up to Fitbit app, connect that to TrendWeight (https://trendweight.com/), then throw away the Fitbit app.

Whatever your goals are, however serious you are, I find it a great way to keep track. I just stand on this thing once a morning, and forget about it. And now I have a near-daily record, smoothed out with a weekly moving average.
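The "smoothed out" trend line that services like TrendWeight show is essentially an exponentially weighted moving average over the daily readings. A rough sketch of that smoothing (the 10% smoothing factor and the sample weigh-ins are assumptions for illustration, not TrendWeight's actual internals):

```python
def trend(readings, alpha=0.1):
    """Exponentially smoothed trend over daily weigh-ins.

    Each day the trend moves a fraction `alpha` toward the new
    reading, which damps day-to-day water-weight noise.
    """
    t = readings[0]
    out = [t]
    for w in readings[1:]:
        t += alpha * (w - t)  # move 10% of the way toward today's reading
        out.append(t)
    return out

# Hypothetical week of weigh-ins (pounds):
daily = [182.0, 181.2, 183.1, 180.9, 181.5]
print([round(x, 2) for x in trend(daily)])
# -> [182.0, 181.92, 182.04, 181.92, 181.88]
```

The point of the smoothing is exactly what the comment above describes: you step on the scale once a morning, ignore the raw number, and watch the damped trend instead.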

notyourloops 2 days ago 0 replies      
When I ditch the carbs and eliminate grain from my diet, there's no denying that I look better and feel better. I've cycled on and off several times now and there's no way to get around it: The carbs and grain have to go and they have to go forever.

If you can eat them, no problem. However, there's many of us that can't and we're going to talk about how to operate with that as baseline.

emptyroads 3 days ago 0 replies      
I'm pretty sure this is just an example of a "placebo diet"[1] at work.

[1] http://www.placebodiet.org/

criddell 3 days ago 0 replies      
One thing I've never been able to get a clear answer on: when you "lose" fat, how long does it take for the number of fat cells to go down? As I understand it, at first fat cells just deflate (not the right word) and will easily plump up again, and that's one reason why it's hard to keep weight off. What's involved in actually ridding the body of the extra cells (outside of mechanical means)?
smokedoutraider 2 days ago 0 replies      
A couple of years ago I did keto for 8 months. During that time I lost 31 kg. While this diet was amazing for losing weight, it really affected my concentration negatively, to the point where I was forced to drop the diet because it was simply interfering with my performance at work. It's a real shame though, as I've never been able to find a diet as easy and effective since.
bluetwo 3 days ago 0 replies      
The impact of sleep/no sleep. Wow.
mbrundle 3 days ago 2 replies      
Very interesting little study. The results seem to make sense, and I'm impressed that his model can learn what causes his weight swings given the low-resolution delta-weight data he collects.

The author uses vowpal-wabbit to train his regression model. Anybody know what learning algorithm it uses (eg random forest?) Here's the link: https://github.com/JohnLangford/vowpal_wabbit/wiki
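For what it's worth, vowpal wabbit is not a random forest: it trains linear models with online stochastic gradient descent, hashing feature names into a fixed-size weight vector (the "hashing trick", which is why the table elsewhere in this thread shows a HashVal column). A toy sketch of that core update for squared loss (heavily simplified; real VW adds adaptive, normalized, and importance-invariant updates, and Python's built-in `hash` stands in for VW's murmurhash):

```python
# Online SGD on a linear model with hashed feature names.
DIM = 2 ** 18          # size of the hashed weight space
w = [0.0] * DIM
lr = 0.5               # learning rate (assumed value)

def predict(features):
    """Dot product of hashed feature values with the weight vector."""
    return sum(w[hash(name) % DIM] * value for name, value in features)

def learn(features, label):
    """One squared-loss SGD step on a single example."""
    err = predict(features) - label
    for name, value in features:
        w[hash(name) % DIM] -= lr * err * value

# One day's example: binary tags as features, weight delta as label.
learn([("sleep", 1.0), ("bacon", 1.0)], -0.7)
```

Because the model is linear, the per-feature weights it learns are directly interpretable, which is what makes the "which tag costs me how many pounds" tables in the article possible.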

ifemide06 3 days ago 0 replies      
If you're looking to extend this further, I'll be privileged to work on this with you. Another great opportunity to be part of something! yay!
ermterm 3 days ago 2 replies      
I'd love to see something like this applied to the goal of gaining muscle. I've been a twig my whole life. In college, I gained 20 lbs of "noob gains" and then tapered off severely. I've maintained that weight, but can't gain more with any reasonable effort.

Dear ddv,

You've got a potential million+ dollar business in the works. In fact, I'd quit my job to be a programmer on your team.

sytelus 2 days ago 1 reply      
The observation that "no breakfast" is #1 way to lose weight by extending fasting period that includes sleep is pretty astonishing. Everything I have read says no breakfast is one of the contributors to weight gain.
apineda 3 days ago 1 reply      
I just started my keto diet about 2 weeks ago and I feel great. I did have one rough night of "keto flu," but then I started supplementing with minerals and more water. I'm super low carb right now. Funny thing is, to get started (about 4 weeks ago) I would get a double Big Mac or a double Quarter Pounder, but no drink and no fries, and that was it till breakfast. That was fun.
zelcon 3 days ago 0 replies      
Maybe your weight-loss rate accelerated because you started watching what you eat more carefully, knowing that you wanted to provide the cleanest possible input to your program. Anyway, this is impressive for an n=1 study. Glad it worked for you.

Also, OP, what materials would you recommend for a machine learning newbie?

WhitneyLand 3 days ago 2 replies      
He advocates a book "The Truth About Statins".

Is it helpful info on evidence based science or mostly zealotry and soapboxing?

etangent 3 days ago 1 reply      
One should be careful extrapolating this type of data to other individuals. For example, sleep on its own may not actually be the primary factor in weight-loss --- the weight gain during periods of lack of sleep may occur simply because you have a habit of snacking while staying up late.

That said, awesome work.
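The confounding described above is easy to demonstrate with simulated data. In this hypothetical setup (variable names and effect sizes invented), only snacking causes weight gain, yet a naive regression on sleep alone picks up nearly the full effect:

```python
import numpy as np

# Hypothetical: only snacking causes weight gain, but snacking
# almost always co-occurs with staying up late.
rng = np.random.default_rng(1)
n = 10_000
late_night = rng.integers(0, 2, n)                    # 1 = stayed up late
snacking = late_night * (rng.random(n) < 0.8)         # snacks only when up late
weight_gain = 0.3 * snacking + rng.normal(0, 0.1, n)  # snacking is the real cause

# A naive univariate regression attributes the effect to lack of sleep:
slope_sleep = np.cov(late_night, weight_gain)[0, 1] / np.var(late_night)
print(round(slope_sleep, 2))  # ~0.24, even though sleep has zero direct effect
```

A model fit only on observed correlations can't distinguish the two without either the snacking covariate or an intervention (going to bed late without snacks).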

quickpost 3 days ago 1 reply      
I've tried keto a couple times in the past, but struggled with serious acid reflux / heartburn from having to digest so much fat all the time. Any one else struggle with this and find a way to incorporate keto without negative digestive consequences?
arisAlexis 2 days ago 0 replies      
I understand the principles and find the low-carb diets scientifically sound, but I can't understand how eating a lot of fruit and potatoes would be bad for you from an evolutionary perspective.
tim333 3 days ago 0 replies      
I've had weight graphs like that, and the trouble is always that if you keep plotting, instead of a flat line at a constant weight it bounces back up again. It's maintaining that's the hard part.
jalopy 3 days ago 0 replies      
This is freaking genius. Kudos. I wish I could upvote this 1000x.
bresc 3 days ago 0 replies      
I don't understand what exactly he did with the data and how the calculations indicate weight loss/gain.

Can someone explain, please?

catalystframe 2 days ago 0 replies      
Lol bacon is associated with weight loss by 32%
swang 3 days ago 0 replies      
So I cannot give up rice. How much rice/carbs can I consume per day and not mess up this balance?
valde 3 days ago 1 reply      
What about sex?
What I learned as a hired consultant to autodidact physicists aeon.co
555 points by tbrownaw  5 days ago   310 comments top 38
Xcelerate 4 days ago 12 replies      
Whenever I talk about physics (to non-scientists), I notice that people have a tendency to start veering away from the math and onto irrelevant metaphysical tangents. For instance, I'll be trying to explain the history of renormalization in quantum field theory, and someone will suggest, "Well maybe we don't really understand infinity". No, we understand "infinity" just fine. It's a concept that's clearly defined using a set of axioms that have been around for thousands of years. "Well maybe the mathematicians are wrong." I start losing my patience pretty quickly at this point. The other big one that annoys me is "Well it's just a theory". Sure, and gravity is just a theory too. If you doubt it, you're free to go skydiving without a parachute, but personally, I'm not taking any chances.
j2kun 4 days ago 7 replies      
> A typical problem is that, in the absence of equations, they project literal meanings onto words such as grains of space-time or particles popping in and out of existence. Science writers should be more careful to point out when we are using metaphors. My clients read way too much into pictures, measuring every angle, scrutinising every colour, counting every dash. Illustrators should be more careful to point out what is relevant information and what is artistic freedom. But the most important lesson Ive learned is that journalists are so successful at making physics seem not so complicated that many readers come away with the impression that they can easily do it themselves. How can we blame them for not knowing what it takes if we never tell them?

Just the other day I went to a talk by a prestigious physicist who, on top of telling only half-truths _at best_, made all of these mistakes and more. And the audience ate it up! As a mathematician and guy-who-writes-about-math-online, it makes me feel very frustrated. I also realized the difference between a physicist and a mathematician: a physicist is openly willing to compromise their principles and stretch the truth for the sake of press, while a mathematician sticks to the truth and as a result nobody cares.

mindcrime 4 days ago 4 replies      
Some of these folks could probably benefit from reading a book that I just bought: The Theoretical Minimum: What You Need To Know To Start Doing Physics.


It's a cool book... written to be relatively accessible, but is actually grounded in the real principles and math used in physics. As somebody who considers himself an autodidact of sorts (in that I'm as much self-taught as formally educated), but who has some awareness of "what I don't know" (and therefore doesn't sit around coming up with crackpot theories about quantum mechanics and what-not), I love this kind of stuff.

One of the authors is Leonard Susskind who is pretty credible. This is a book that is serious, but succinct (as you might guess from the title). Note that there is also a companion volume that is specifically about Quantum Mechanics. https://www.amazon.com/Quantum-Mechanics-Theoretical-Leonard...

All of that said, I do think it's important to note (as others already have) that "autodidact != crank". Plenty of autodidacts are just people who study physics (or whatever) because they find it interesting, but they are aware of their limitations and don't pretend to have amazing new insights that have escaped physics for decades, etc. Likewise I'm pretty sure you can find cranks who have a formal education as well.

pklausler 4 days ago 1 reply      
A very long time ago, I worked for Seymour Cray. He received a surprising amount of crank mail (and back then, it was real postal mail, not e-mail). His secretary filtered out the crank mail, spared him from it, and was good enough to pass the best stuff on to some of the engineers that would appreciate it.

I still have some of it, including a long treatise from an inmate at the county jail who had a theory of interplanetary transportation involving kangaroos whose energy output would be measured in "gigahops".

EDIT: two minor typos

forgotpwtomain 4 days ago 1 reply      
> Sociologists have long tried and failed to draw a line between science and pseudoscience. In physics, though, that demarcation problem is a non-problem, solved by the pragmatic observation that we can reliably tell an outsider when we see one.

So generally for the sciences (and for compsci cranks as well) we have a direct answer, because either your theories can be experimentally verified or they cannot. This is normally a solid position, but it puts, for example, the decades of work on string theory in a bind, since string theorists haven't produced a single verifiable result either.

So the author offers a tangential and more broadly encompassing but subjectively experiential position:

> During a decade of education, we physicists learn more than the tools of the trade; we also learn the walk and talk of the community, shared through countless seminars and conferences, meetings, lectures and papers. After exchanging a few sentences, we can tell if youre one of us. You cant fake our community slang any more than you can fake a local accent in a foreign country.

Sure, that's great you've verified membership in a social group - but that's really insufficient when you are trying to identify crank science. This sentence can also be applied to all kinds of cults and secular belief systems, hell, I think most of academic humanities fall under this as well.

Anecdotal - I know someone who is a well-accomplished researcher in their experimental physics field (with numerous citations); as a hobby they also have an interest in theoretical physics, where they have published several papers entirely to no response (which, to my understanding, would be pretty awesome if they were not incorrect). So it's not just in and out of 'professional physics': the number of people specializing in a particular area can be very small, and closed off in an even more domain-particular kind of way.

epistasis 4 days ago 1 reply      
>My clients almost exclusively get their information from the popular science media. Often, they get something utterly wrong in the process. Once I hear their reading of an article about, say, space-time foam or black hole firewalls, I can see where their misunderstanding stems from. But they come up with interpretations that never would have crossed my mind when writing an article.

This isn't just physics articles, and isn't just cranks. Most of the popular reporting on science gets things very wrong. I can't say whether physics is more correct or less correct, but I think I notice less eye-rolling and complaints from physicists about popular news articles than I do from other fields.

Something to keep in mind for people that are getting their science news from the media.

schoen 4 days ago 2 replies      
I get the e-mail for the EFF Cooperative Computing Awards


so, despite having put lots of effort into not having people make spurious claims, I hear from a whole lot of math cranks.

Two things that I find striking are many people's level of confidence that they can personally "solve the problem" (in this case, by inventing some kind of "formula for primes" that has eluded the organized mathematics world for decades), and many people's lack of understanding of what a solution would consist of (in terms of knowing that mathematical proofs exist, and being able to understand whether they have a theorem or just a conjecture).

Our situation is especially tricky because we chose a problem that experts said would require lots of computational resources and couldn't be solved by new mathematical insight, but then we didn't outright forbid people from trying to solve it by insight. So a lot of people see an exciting challenge, like "they think it will take a lot of computer time, but if I can just see the pattern, I can skip all of that!". Also, we have a large monetary reward for solutions and so people are excited by the idea that they have them and are about to receive a bunch of money.

I think it's true that many of the people who contact me about this are excited about mathematics in the way that people who contacted Dr. Hossenfelder were excited about physics (and, as pdkl95 said, that Carl Sagan's cab driver was excited about science). But it's still frustrating that, after we've gone to some lengths to say that you need a proof and not just a guess, and that decades of research indicate that you can't find primes of this size without significant computer time, people are still so confident that their guesses are right and so resistant to accepting that they haven't met the awards criteria.

It would be interesting to see an equivalent service for talking to mathematicians and to see what some of the people who contact us might get out of it, and whether it might inspire them to pursue more constructive things. (I always wish that our awards would motivate someone to start doing Project Euler problems or something...) If someone set up that "talk to a mathematician" service, I would probably try to send lots of people their way.
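To make the proof-versus-guess distinction concrete: record-size Mersenne primes are certified with the Lucas-Lehmer test, which is deterministic (a genuine proof, unlike a probable-prime test) but requires exactly the kind of heavy computation the award criteria refer to. A sketch for small exponents only; real candidates have p in the tens of millions:

```python
def lucas_lehmer(p):
    """Deterministic primality test for the Mersenne number 2**p - 1
    (p an odd prime). This is a proof of primality, not a probable-prime
    guess -- but it costs p - 2 squarings of p-bit numbers, which at
    record-size exponents is the 'significant computer time' in question."""
    m = (1 << p) - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0

print(lucas_lehmer(13), lucas_lehmer(11))  # True False: 8191 is prime, 2047 = 23 * 89
```

No amount of "seeing the pattern" skips those p - 2 squarings, which is why a conjectured formula without a proof doesn't meet the bar.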

api 4 days ago 1 reply      
I've been fascinated for a long time by just how much effective autodidactism there is in software vs. other fields. There are tons of people who have made major contributions here that are completely self-taught.

Software is uniquely suited to autodidactism for three reasons:

(1) The tools are easy to obtain and easy to start using. Capital cost is low to non-existent.

(2) The learning feedback loop is nearly instantaneous and the results are almost always perfectly objective. Things either work or they don't. There is not much room for delusional or wishful thinking.

(3) Resources for learning are readily available and are mostly written in a style that is utilitarian and straightforward rather than cliquish and arcane.

Theoretical physics passes on point #1 until you hit the need to do serious experimentation, but it fails on points #2 and #3. There is no command prompt that will tell you in 10ms if a theory is at least rational and internally consistent, and advanced mathematics has an arcane symbology and jargon that seems almost intentionally designed to resist penetration by those outside the academic circles where it is used and taught.

martincmartin 4 days ago 2 replies      
There are many people who decide not to go into research but enter industry instead. Some of them don't have the intellectual chops, but others are turned off by the politics, long hours that professors work, spending more time writing grant proposals and managing students than doing research, etc.

Some of them have spare time, or maybe will have spare time after their kids are grown, or will be able to retire early. Then they could become citizen scientists [1], independent scientists [2], etc.

I wonder how to organize and encourage them? How to redirect or weed out the cranks, and encourage those who are motivated and can look at things from a new perspective?

[1] https://en.wikipedia.org/wiki/Citizen_science
[2] https://en.wikipedia.org/wiki/Independent_scientist

bigger_cheese 4 days ago 0 replies      
"Many of them are retired or near retirement, typically with a background in engineering or a related industry...After exchanging a few sentences, we can tell if youre one of us. You cant fake our community slang any more than you can fake a local accent in a foreign country."

This matches my experience. During the final year of my engineering degree I decided to take a third-year particle physics elective because it sounded interesting. The course had no prerequisites, but it probably should have. I remember showing up to the first lecture and being one of the only non-science students in the theatre. The lecturer started talking about Hamiltonians, Fermi-Dirac statistics and wave functions, and it all just went completely over my head. There was a whole bunch of "foreign" concepts that were assumed. I ended up needing to check out a bunch of physics texts from the library, and over the next few weeks I had to teach myself the 2+ years of physics knowledge the rest of the class was familiar with. I passed the course, but it was a lot of work for what was supposed to be an elective.

martincmartin 4 days ago 1 reply      
Einstein famously couldn't find a teaching position after graduation, spent 2 years unemployed, then worked as a patent clerk in a patent office. According to wikipedia: "Much of his work at the patent office related to questions about transmission of electric signals and electrical-mechanical synchronization of time, two technical problems that show up conspicuously in the thought experiments that eventually led Einstein to his radical conclusions about the nature of light and the fundamental connection between space and time."

So there are two lessons here, I think: 1. people outside academia can still make important contributions, and 2. spending a lot of time thinking about other people's proposals and separating the good from the bad can inspire a new, fruitful way of looking at things, or at least help overcome standard mental traps.

panglott 4 days ago 2 replies      
This strikes me more as a success of science journalism (people are inspired to improve their understanding of physics) and a failure of science education (intelligent, motivated amateurs receive no support outside of formal education).

Along these lines, is there a good recommended contemporary popular work on quantum physics for non-physicists?

sevenless 4 days ago 1 reply      
Seems to bear comparison to phone sex lines, in that you're satisfying a basic human need - in this case, to be listened to.

John Baez keeps a 'Crackpot Index' score at http://math.ucr.edu/home/baez/crackpot.html

netcan 4 days ago 1 reply      
This is a fun idea for a service.

I happened to read three articles this week on labour productivity. My economics is undergraduate level with 15 years of rust, but I had an idea. I thought I was brilliant for the rest of the day. But I'd like to know if my idea is an existing theorem, wrong for some reason I don't understand, or (most likely) a novel, brilliant idea that economists just overlooked.

Dunno if I'd pay $50 to find out. $34.99 tops, maybe. :)

angrow 4 days ago 1 reply      
If we need a more neutral term than "crank", rather than "autodidact," why not "outsider"?

An outsider artist is anyone who creates art not easily dismissed, despite not participating in the social and academic communities of their medium, so why can't there be outsider scientists and engineers as well?

drauh 5 days ago 1 reply      
I've had my share of trying to convince someone that a perpetual-motion system they described did not conserve energy or momentum. They refused to believe my assertion that momentum and energy are conserved quantities.
mrcactu5 4 days ago 2 replies      

> Many base their theories on images, downloaded or drawn by hand, embedded in long pamphlets. A few use basic equations. Some add videos or applets. Some work with 3D models of Styrofoam, cardboard or wires.
Actually these are perfectly good ways of communicating ideas and solving problems.

keithpeter 4 days ago 0 replies      
"They are driven by the same desire to understand nature and make a contribution to science as we are. They just werent lucky enough to get the required education early in life, and now they have a hard time figuring out where to even begin."

Any chance of on-boarding via experimental work/data analysis in some way like in Astronomy?


erroneousfunk 4 days ago 5 replies      
I'm hardly a physicist, but I have a degree in "general engineering" (long story about how I managed to escape specialization there) and a master's in software engineering, and I've taken a few advanced math courses, including partial differential equations, computational theory, scientific computing, and a math-heavy course on relativity. I also spent a year and a half working for a Harvard physics professor, alongside his team of grad students. So, while I can't "do physics" I think I know enough to understand a little about what it takes to _be_ a physicist, and appreciate the work that they do.

My husband and I were listening to a radio program (http://www.thisamericanlife.org/radio-archives/episode/293/a...) about a man convinced that he had found a mistake in Einstein's theory of relativity, and trying to communicate this idea to a physicist (futilely, obviously). My husband, who was a music major for a year before dropping out of college, and I started getting in an argument about this episode that was so heated, I felt like we were listening to two completely different stories!

I kept insisting that the advanced math and education wasn't just some funsies shibboleth the physicists had to keep the hoi polloi out of physics -- the devil's in the details and the man in the story didn't even understand the big picture correctly. My husband was angry and insulted that the physicist dismissed the man's theory out of hand, and felt that anyone could make a contribution to physics with perhaps a little help from a calculus book -- just look at history! I thought the hero of the story (if there was one) was clearly the physicist, while my husband was solidly on the side of the electrician with a little learning, trying dangerous things.

I was really shocked. We don't usually fight like that, and especially over something so seemingly trivial, but, in retrospect, I thought that it displayed huge tension in society as a whole, between academics and non-academics. We hear so much about the "one percent" and income-based class distinctions, but relatively little about academic barriers in society, whether real or artificially imposed. Should physics open up in a real way (not just "pop science" articles and occasional books for laymen)? Should we put a stronger "academic" focus in early public and high school education? Should we provide more resources for "physicists who just need a little help with the math" whether they're right or wrong?

Although I was staunchly in favor of the hallowed halls of academe, and it still holds a special place in my heart, I suspect that the correct solution lies somewhere in the middle. Anyway, fantastic article, and it really brings up an important point that is too seldom addressed, by physicists, or society as a whole!

dandare 4 days ago 0 replies      
>My clients almost exclusively get their information from the popular science media. Often, they get something utterly wrong in the process. Once I hear their reading of an article about, say, space-time foam or black hole firewalls, I can see where their misunderstanding stems from. But they come up with interpretations that never would have crossed my mind when writing an article.

Can you see the parallel with democracy? Autodidacts can not really harm the field of physics no matter how naively wrong they are. But we have voting rights and we believe we understand the complex issues in sociology, justice, economics ... I am depressed.

paulcole 4 days ago 0 replies      
The comments here are full of people who should be paying $50 for 20 minutes of a physicist's time. Maybe we should start taking up a collection and seeing if we can get a bulk discount.
dekhn 4 days ago 0 replies      
While I agree with plenty the author wrote, I have seen plenty of people who are great with math, can speak the language, know how to promote their results in pubs and conferences, and yet are still completely and totally wrong.

My best example is a smart physics graduate that I went to grad school (in biophysics) with. An open problem at the time was how motor proteins couple the energy in ATP hydrolysis to directed motion. I said one day, "hmm, maybe it works like this..." and she said, "oh no, my advisor and I proved that mechanism was impossible."

A few years go by, we're getting ready to graduate. I ask her, "so, since you spent the last 7 years studying motor proteins, how do they work?" And she told me it was the mechanism I had proposed. I said, "but you disproved that!". And she said, "well, then I collected data, and it turns out our assumptions are wrong."

I constantly end up arguing with quantitative people in my own old field. For example, I used to argue with people who did GWAS; they insisted all their stats were great and perfect, then Ioannidis and others showed their stats were abysmal and that they were massively overconfident in their results.

This is not to say all the cranks are right- they are almost certainly wrong. Anybody who attempts to get around the second law of thermodynamics is going to lose, unless there is something truly and fundamentally wrong with statistical mechanics.

lcvella 4 days ago 2 replies      
That is why I believe society is not getting what it is paying for in physics and mathematics. Too much of the money invested in physics knowledge comes from taxpayers, and all the knowledge produced is useless if we can't get it back and, with due dedication, understand it. I have always felt that most modern physics knowledge is completely inaccessible to me.

The latest physics book I could read and understand was one by Einstein himself, "Relativity: The Special and General Theory", which is 100 years old.

When I set out to learn quantum mechanics, I couldn't find good, accessible (cost-wise) material, and the supposedly good book recommended by a physicist friend of mine cost more than $100 on Amazon (about 1/3 of the minimum wage in the country where I live). I ended up buying the much cheaper Indian print. But there was no chance I could read it at the time because of my lack of a calculus basis, which made me watch the entire Udacity course on differential equations.

Thanks to that, I had the bare minimum to be accepted into a PhD program in mechanical engineering (I have an MSc in Computer Science) to work on computational fluid dynamics. Now, halfway through an engineering PhD, I believe I am (more) able to tackle the QM book (look at all it took me!).

That is why I deeply value the effort of Udacity, Coursera, Khan Academy and such, because without real efforts to bring actual knowledge to public, in an accessible way (both cost and didactic-wise), modern physics and mathematics are a waste of money on private clubs.

atemerev 4 days ago 3 replies      
This guy has secured himself so much good karma. Wow. Thank you, from all of us autodidacts.

(I am trying to learn some astrophysics as a hobby. Amateur science is currently mostly frowned upon).

evanwolf 3 days ago 0 replies      
Heh. What I learned as a product manager by watching 100 hours of Hallmark movies. https://medium.com/product-hospice/100-hours-of-hallmark-mov...
jxy 4 days ago 2 replies      
It is a good form of communicating sciences. Seriously, more professors should do it as their contributions to community. Perhaps starting from more reddit AMA?
fritzo 4 days ago 0 replies      
I wish there were services for this in other areas. I'd pay to ask embarrassingly naive questions in fields where my knowledge-to-interest level is near zero, and where reading the internet has proven ineffective.
jsprogrammer 4 days ago 1 reply      
>Sociologists have long tried and failed to draw a line between science and pseudoscience. In physics, though, that demarcation problem is a non-problem, solved by the pragmatic observation that we can reliably tell an outsider when we see one.

Sorry, but this is an admission of pseudoscience. All apparently in the name of committing an ad hominem.

It is commendable that the author is helping others to get answers to their questions, but this article indicates that there are substantial issues to be dealt with.

TylerH 2 days ago 0 replies      
"None of us makes good money"

$150 an hour is very good money, especially if you have interested parties sending e-mails that are "piling up" in your inbox.

Get outta here with that nonsense.

Yenrabbit 4 days ago 0 replies      
Semi-relevant xkcd: http://xkcd.com/1486/
emmelaich 4 days ago 0 replies      
I think there is a place for speculation in science as long it is clearly understood as such.

There used to be a journal for it; it was fun reading.

Speculations in Science and Technology: http://link.springer.com/journal/volumesAndIssues/11216

It only ran for two years, 1997-1998.

hardlianotion 4 days ago 0 replies      
The bit that I take from this is a scientist who is taking on a mission to explain himself fully, to people who make an effort and give a damn. Very easy to call people like this cranks, and rather hard to make something positive come out of the engagement in many cases.

I suspect the fact that money is involved goes some way to making this initiative a success.

erdevs 4 days ago 1 reply      
I wonder if this is changing with things like online education and the wealth of real information and in-depth knowledge on the internet. It seems self-teaching is much, much more viable today and I imagine that younger generations of autodidacts might not be so ill-informed on the whole.
dredmorbius 4 days ago 0 replies      
This raises any number of points and questions.

How effectively can we communicate (and pass on to new generations) complex ideas at all? There's an essay on mathematics PhDs, noting that in any given seminar of a half-dozen or so people, you may well have the only six people in the world who can even understand the topic in the room. What does it mean to "know" something if only one billionth of the global population can even grasp it?

There's the question of how to assess the quality of knowledge within a field. How can laypeople, including the politicians, voters, and taxpayers who ultimately pay for research, education, and otherwise support many of these operations, even assess what it is they're paying for and receiving?

There's the matter of media quality and access. I make absolutely no bones about being a book thief, and I praise Alexandra Elbakyan daily (along with Brewster Kahle of the Internet Archive, Reddit's /r/scholar and BookZZ.org) for providing access to the raw materials of my own research. (Also various libraries, though they're far less convenient or accessible.) Information is a public good. It has massive positive externalities, it's nonrivalrous, and it needs to be disseminated and accessible in order to be useful. And yet we lock it away. Which is among the reasons why autodidacts rely on such poor alternatives as the popular press.

Blaming journalists for poor descriptions of scientific concepts doesn't fly when it's scientists themselves who are kicking out these concepts. OK, to paraphrase other unfortunate social slogans, not all scientists. But many. And yes, reporters and editors should absolutely be called to task for sloppy writing and whitewashing crud.

(Ah. I'm remembering a panel, Christopher Hitchens was among the presenters, where a female writer mentioned her experience writing for a fashion magazine on social topics -- the original work was hers, but after it had been washed through many more hands, it was anything but. I think in here: https://m.youtube.com/watch?v=fkLX58ZWbWw Sorry, that's a 2 hour video, I'll see if I can't narrow down the timeframe. Probably Katha Pollitt. The relevant comments concern writing for Glamour, and occur at 14m30s.)

And finally, as I've raised the issue with @schoen below, there's the question of how best to filter out the sensible cranks from the nuts. Finding good new ideas amongst the many bad ones, and sorting out how to keep from having to relitigate bullshit, is a Very Hard Problem.

rupellohn 4 days ago 0 replies      
I recommend 'Physics on the Fringe' for anyone interested in this topic


lifeisstillgood 4 days ago 0 replies      
This is an awesome example of real down in the dirt science communication.

I'm impressed

Analemma_ 4 days ago 7 replies      
This is a fun article, and also a useful reminder that although Hacker News tends to mythologize autodidacts, the boring reality is that in disciplines other than programming, the overwhelming majority of them are useless cranks.
ARothfusz 4 days ago 0 replies      
This is exactly developer support for "open source" physics.
Machine Learning Exercises in Python, Part 1 johnwittenauer.net
506 points by jdwittenauer  3 days ago   62 comments top 11
jupiter90000 3 days ago 6 replies      
Often this sort of material is a collection of methods and explanations of them, which is obviously important to being able to use them. However, the example problems usually feel much cleaner and simpler than those I've encountered in business. There seems to be a missing link between learning the methods and doing something that actually adds significant value for a business using machine learning. Perhaps it's just me or my field, though.

I found that usually a lot of the work involved just transforming or examining data in relatively simple ways, or using human experts' decisions about the important thresholds for outliers. For example, I could run an outlier algorithm on data and either the returned outliers were very obvious and could have been found with a manual query by knowing the business context, or it returned a lot of false-positive outliers that were useless for the business. Other times, we'd have a predictive model that was good for 95% of cases but would make our company look ridiculous on predictions for the other 5%, so we couldn't use it in production -- and the nature of the data was such that we couldn't use the model for only certain value ranges.

Perhaps it was just the nature of our realm of business (telecom), and these approaches are more useful for others (advertising, stock trading, etc). Any experience with business fields where this stuff made a sizable impact for something they productionized in business they can share?
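As one concrete illustration of the point above, even the textbook Tukey/IQR outlier rule hinges on a hand-tuned multiplier. A sketch with hypothetical telecom-style usage numbers:

```python
import numpy as np

def iqr_outliers(x, k=1.5):
    """Tukey fence: flag points outside [Q1 - k*IQR, Q3 + k*IQR].
    The multiplier k is exactly the kind of threshold that, in practice,
    a domain expert ends up choosing by hand."""
    q1, q3 = np.percentile(x, [25, 75])
    iqr = q3 - q1
    return (x < q1 - k * iqr) | (x > q3 + k * iqr)

usage = np.array([10, 12, 11, 13, 12, 11, 500])  # hypothetical daily call minutes
print(iqr_outliers(usage))  # only the already-obvious 500 is flagged
```

On tight data like this the flagged point is one a simple manual query would have caught anyway; loosen k on messier data and the false positives start to dominate, which is the dilemma described above.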

Animats 3 days ago 4 replies      
I took that course from the pre-Coursera Stanford videos, when someone from Black Rock Capital taught the course at Hacker Dojo. Did the homework in Octave, although it was intended to be done in Matlab.

It was painful. Those videos are just Ng at a physical chalkboard, with marginally legible writing. All math, little motivation, and, in particular, few graphics, although most of the concepts have a graphical representation.

fitzwatermellow 3 days ago 0 replies      
During the time of the original class, I don't think scikit-learn and Spark were quite as mature. But perhaps Octave still enjoys a certain prominence in academic machine learning research. Matlab was also used for the recent EdX SynthBio class. And it just feels a bit archaic now, doing science in a GUI on the desktop, instead of on a cloud server via CLI ;)
ivan_ah 3 days ago 0 replies      
Related, the demos from Kevin P. Murphy's excellent ML book implemented in Octave [1] and (partially) in Python [2].

[1] https://github.com/probml/pmtk3/tree/master/demos
[2] https://github.com/probml/pmtk3/tree/master/python/demos

jjallen 3 days ago 1 reply      
Seems like to compensate for day to day weight/water fluctuations one would need to track the trailing activity and food data for a period of days prior to the data analyzed. I'm thinking 3-5.

0.2 lbs/kg lost is mostly a rounding error. Our weight could fluctuate that much on a daily basis from the amount of salt consumed.
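The trailing-window idea from the comment above can be sketched as a simple rolling average; the daily weights here are made-up numbers:

```python
def trailing_average(series, window=5):
    """Average each day's value with the previous `window - 1` days,
    shrinking the window at the start of the series."""
    out = []
    for i in range(len(series)):
        chunk = series[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

# Hypothetical daily weigh-ins in kg, with day-to-day water noise.
weights = [80.0, 80.4, 79.8, 80.2, 79.6, 80.0, 79.4]
smoothed = trailing_average(weights, window=5)
```

A 3-to-5-day window like this damps the salt/water noise the commenter mentions, at the cost of lagging real changes by a few days.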

mark_l_watson 3 days ago 0 replies      
Very nice. I took the class twice and think it is easiest to use Octave, but after taking the class these Python examples might help some people.
earthpalm 3 days ago 1 reply      
Let's talk about how much Michael I. Jordan taught Andrew Ng of what he knows about machine learning and AI.
NelsonMinar 2 days ago 0 replies      
Ng's machine learning class is excellent, but the main thing holding it back is its use of Matlab/Octave for the exercises. A Python version (with auto-grading of exercises) would be a huge improvement.
motyar 3 days ago 1 reply      
Can I find same in R?
denfromufa 3 days ago 1 reply      
What is the best learning resource for gaussian process (kriging) using Python?
danjoc 3 days ago 3 replies      

I will not make solutions to homework, quizzes, exams, projects, and other assignments available to anyone else (except to the extent an assignment explicitly permits sharing solutions). This includes both solutions written by me, as well as any solutions provided by the course staff or others.

Go 1.7 is released golang.org
499 points by techietim  15 hours ago   105 comments top 18
travjones 15 hours ago 1 reply      
Great work, Go team!

Standout points in my opinion:

Overall performance improvements:

>> "We observed a 5–35% speedup across our benchmarks."

Decreased compile times and binary size:

>> "While these changes across the compiler toolchain are mostly invisible, users have observed a significant speedup in compile time and a reduction in binary size by as much as 20–30%."

Vendoring dependencies by default:

>> "...and in Go 1.7... the "vendor" behavior is always enabled"

Context package added to std lib:

>> "To make use of contexts within the standard library and to encourage more extensive use, the package has been moved from the x/net repository to the standard library as the context package."

chrisper 14 hours ago 2 replies      
I just learned Go 2 days ago and today I am already running my own web app!

I love Go because it is kind of like C, but better (more modern). Generally, I always program my stuff in Java, but whenever I have an idea of creating something I could only choose between: C, Java, bash. Obviously I am not going to use C and bash for most of my ideas. Thinking about solving my issues in Java is meh, so often I decided it is not worth it to invest time. I feel like using Java for my ideas is like using a semi truck for road trips. It can be done, but it's just not very efficient having to launch the JVM every time I want to do something small.

Yes, I was open to new languages, but I did not really care about: C++, Python, Perl, Ruby and so on because I never cared about web dev. Now I was bored and finally decided to learn Go.

geodel 15 hours ago 2 replies      
Seems like a great release. Faster compilation and faster at runtime. Normally these 2 are considered opposite of each other.
cyphar 12 hours ago 0 replies      
Yay, s390x support is finally in mainline! Finally we (SUSE) no longer need to use gcc-go to build Go binaries on some platforms (we've had nothing but issues from gcc-go; half of the patches we apply to Docker are to make it behave when built with SLE's gcc version).
ksec 13 hours ago 10 replies      
Ok, off-topic question. Not an expert in grammar, but something about "Go 1.7 is released" seems wrong to me. Could anyone tell me if this is a correct usage of grammar?

Edit: Don't know why so many downvotes, but it is an honest question.

Tehnix 13 hours ago 3 replies      
>A new compiler back end, based on static single-assignment form (SSA), has been under development for the past year

Huh, I was under the impression that either SSA or CPS was pretty standard for any serious compiler. Does anyone know why they didn't design it for this from the beginning? It's like one of the earlier things you learn when making actual compilers.
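For readers unfamiliar with SSA: here is a toy sketch (not Go's implementation) of the renaming step for straight-line code, where each variable gets a fresh version at every assignment and reads refer to the latest version. Real compilers additionally insert φ-functions at control-flow joins:

```python
def to_ssa(stmts):
    """Rename straight-line assignments into SSA form.
    `stmts` is a list of (target, expr) pairs, where expr is a list of
    operand tokens; returns the same program with versioned names."""
    version = {}  # var -> current version number
    out = []
    for target, expr in stmts:
        # Reads use the current version of each variable (if any).
        renamed = [f"{t}{version[t]}" if t in version else t for t in expr]
        # The write creates a fresh version of the target.
        version[target] = version.get(target, 0) + 1
        out.append((f"{target}{version[target]}", renamed))
    return out

# x = 1; x = x + 2; y = x * x
prog = [("x", ["1"]), ("x", ["x", "+", "2"]), ("y", ["x", "*", "x"])]
print(to_ssa(prog))
# [('x1', ['1']), ('x2', ['x1', '+', '2']), ('y1', ['x2', '*', 'x2'])]
```

Because every name is defined exactly once, optimizations like constant propagation and dead-code elimination become simple dataflow walks, which is why SSA back ends are the norm.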

blinkingled 11 hours ago 3 replies      

> SystemCertPool returns an error on Windows. Maybe it's fixable later

This doesn't inspire confidence in Go as a cross platform language which is at version 1.7. If this is implemented in say 1.8, am I supposed to check for Go versions in order to know if the SystemCertPool func works or not? I mean why not just release it when it works on all tier-1 supported platforms?

stock_toaster 14 hours ago 1 reply      
Be sure to check out the release notes, especially if you run FreeBSD (see known issues section).


jcadam 2 hours ago 1 reply      
I wish the Clojure compiler was half as fast as Go's :(

I'm not much of a fan of using Go for anything 'big', but I have taken to using it in places where I would have previously used Python (tiny/simple services, housekeeping/utility scripts fired by cron, etc.)

I'd love to use Go at work (amongst many other things), but my employer already gives me a hard time for writing small utilities in Python rather than Java (I refuse to wait for a JVM to spin up just to convert a single file from csv to xml. I simply will not do it, and apparently my employer doesn't consider it worth firing me over).

tmaly 2 hours ago 0 replies      
Great work,

I am looking forward to recompiling my code to see how things are. I have had my side project running on a set of Go micro services compiled against 1.4 since last year.

What can we expect from Go 1.8?

aprdm 11 hours ago 1 reply      
I have some years of experience with python and picking up golang was extremely straightforward.

Sometimes it annoys me a little bit because its standard lib is much smaller than Python's; however, having it compiled is more than worth it.

Thanks go team !

cyphar 12 hours ago 2 replies      
The one thing that really pisses me off about the way Go handled vendoring is that they did it in a way that makes it incompatible with GOPATH. Previously in runC and Docker, we had build hacks that would symlink (or full copy) the current directory into vendor/src/<package> and then set the GOPATH to vendor/. This was compatible with every go version. In addition, many other projects did the exact same thing.

But the way that vendoring works in Go 1.5 and up is that you make vendor not a valid GOPATH and you have to now either create a fake GOPATH and move your current directory into it, or you have to do some symlink stuff within vendor/ that doesn't really work. Why was such a small cosmetic change seen as a good idea? It's needlessly incompatible with previous ways of making vendoring work seamlessly with Go.

I'm hoping that the packaging discussions that are going to be happening over the next few months don't result in a similar decision that "we know best".

sfrailsdev 15 hours ago 3 replies      
vendor directories are no longer optional, which is nice.

Go get now updates submodules... I'm honestly not sure what the current hack is for package management, but I assume people are still doing wrappers around go get to pin to commits/versions (or else building your own repos for funsies), and I'm wondering if that breaks anything.

sdegutis 15 hours ago 2 replies      
Cool. What's the plan for Go 1.8? What exciting features are up ahead?
maxpert 14 hours ago 0 replies      
Awesome! Time to recompile RaspChat :D
WhitneyLand 10 hours ago 0 replies      
Elevator description:

 - Open source, 6 years old, backing from Google
 - Nice for concurrency and service implementations
 - Not nice for generic types
 - C-like, modern, minimalist, garbage collected
 - Becoming more popular

yuyuyy 10 hours ago 4 replies      
Is Go based on LLVM? It doesn't seem like it, but I would be curious to know why not.

Isn't this the point of LLVM, to separate the "language component" from the "CPU component"?

litaohackernews 5 hours ago 1 reply      
The semantics of Go is too ugly. It has no taste.
Lake Nyos suffocated over 1,700 people in one night atlasobscura.com
461 points by vmateixeira  4 days ago   101 comments top 17
bnjmn 3 days ago 7 replies      
A few years ago I wrote a poem loosely based on this event.

I know unsolicited poetry from strangers on the internet is almost always awful, but this poem still holds up for me, which is pretty unusual for anything I've ever written.

So I hope you enjoy it, too:

 NYOS

 You took me in on dusky breath, tasted me, tasted nothing, gathered by my easy take that I was oxygen enough for idle inspiration. How swiftly my lack became your lack; my misgiving, your mistake. Your eyes flashed a baffled petition as you fell limp in a thousand different doorways, cribs, embraces, fits and fields, yet I pressed after whatever it was I thought to find in the lowest parts of Cameroon, as foolish in love as a gas trapped in a lake.

Diederich 3 days ago 2 replies      
Horseshoe Lake, in Mammoth Lakes, California, had a substantial CO2 vent in the early 2000s: http://pubs.usgs.gov/fs/fs172-96/

300 tons per day at the time.

I visited in 2002. There were danger signs everywhere, advising people to stay away from the lake itself, and away from low-lying depressions.

In such forested areas, there is always a background of natural sounds, but that day, there was almost complete silence. Large swathes of trees, especially those nearer the lake itself, were standing, dead and dying. I found a couple of dead birds on the ground, completely undisturbed.

The whole situation was so eerie that I bugged out even without taking any pictures.

MollyR 3 days ago 4 replies      
Wow, the lake turning red and so many people dying sounds like a biblical style event. It's great to have a scientific explanation of what happened.
donretag 3 days ago 1 reply      
"Also worrisome is Lake Kivu, a lake over 1,000 times larger than Nyos and in a much more populous area."

I stayed on the Rwandan side of Lake Kivu and had a great relaxing time. Met many Peace Corps volunteers.

Lake Kivu is a major source of methane gas, which powers much of Rwanda. I am assuming they are very aware of the dangers. If something like that would happen at the scale of Lake Kivu, it would be a major catastrophe. That said, I encountered nothing regarding the potential of disaster. Nothing like tsunami/flood zone warnings. Rwanda tends to be a bit more western (aka not chaotic African) in its safety precautions. The DRC on the other side would definitely be more lax.

lanius 3 days ago 1 reply      
>As the CO2 settled, every flame and fire was immediately extinguished, a sign of the doom descending all around Lake Nyos

Despite being relatively enlightened due to living in a modern society, I would be freaked the fuck out if I witnessed this personally.

_audakel 3 days ago 3 replies      
bad choice of ads - at the end of the article there was a little date widget to "find a hotel near lake Nyos" and choose your departure date and return date.
ScottBurson 3 days ago 2 replies      
Given that CO2 levels in the lake have now built up again to be even higher than they were in 1986, according to this article, I wonder if they shouldn't evacuate the area and see if they can set the thing off intentionally. Seems like explosives might do it -- like shaking a soda can.
Animats 3 days ago 0 replies      
Maybe geologic sequestration of excess CO2 isn't a good idea.[1]

[1] https://www.undeerc.org/pcor/sequestration/whatissequestrati...

tantalor 3 days ago 3 replies      
"1.2 cubic kilometers" is a ridiculous unit, because it obscures the cubic power on the kilometer. "1.2 billion cubic meters" would be much better.
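The conversion the commenter is making, plus a rough mass estimate, works out as follows (the CO2 density is an approximate sea-level value, not from the article):

```python
# 1 km = 1000 m, so 1 km^3 = 1000^3 m^3 = 10^9 m^3.
KM3_TO_M3 = 1_000 ** 3
volume_m3 = 1.2 * KM3_TO_M3          # "1.2 billion cubic meters"

# Rough mass of that much CO2 gas, assuming ~1.98 kg/m^3 (near 0 C, 1 atm).
CO2_DENSITY_KG_M3 = 1.98
mass_tonnes = volume_m3 * CO2_DENSITY_KG_M3 / 1000

print(f"{volume_m3:.2e} m^3, roughly {mass_tonnes / 1e6:.1f} million tonnes of CO2")
```

So the cubic power the commenter worries about hides three orders of magnitude per dimension: 1.2 km³ is on the order of a couple of million tonnes of gas.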
yincrash 3 days ago 0 replies      
This was an extremely carbonated lake. Seltzer water is about 4 volumes (4:1 CO2 to water measured in volume). The article places the lake at 5 volumes.
waqf 3 days ago 2 replies      
Isn't CO2 poisoning in humans supposed to cause a panic reaction which would serve as a defence? Why didn't that happen here?
teslaberry 3 days ago 0 replies      

Hydrogen sulfide caused the Permian extinction.

My favorite theory is that it was caused by massive increase in volcanic activity which was itself caused by a giant meteor hitting earth.

aandrewc 3 days ago 0 replies      
One of my favorites podcasts did a great episode on this! http://www.stuffyoushouldknow.com/podcasts/how-can-a-lake-ex...
Pinatubo 3 days ago 1 reply      
Did anyone else find the juxtaposition of "over" and a very precise number in the title a bit strange?
scarygliders 4 days ago 2 replies      
Scroll down to the bottom of that article and... "Find a hotel near lake Nyos".

Er, no, thank you.

wcummings 3 days ago 1 reply      
I like the ad for hotels "Near Lake Nyos" below the article about how dangerous it is.
nilved 3 days ago 5 replies      
This is hardly HN material, but this article included the best application of the Cloud To Butt browser extension I've seen yet.


Ask HN: Is it possible to run your own mail server for personal use?
609 points by jdmoreira  2 days ago   293 comments top 99
Nux 2 days ago 13 replies      
It's absolutely possible to run an email server in 2016 and I encourage anyone capable to do so!

Email is one of the bastions of the decentralised Internet and we should hang on to it.

Every day more and more people are moving to Gmail/Hotmail/Outlook and while I do understand the reasons, it also puts more and more power into the hands of these providers and the little guy (us) gets more screwed (like marked as junk by default by them :< )

Having said that, here's my check list for successfully delivering email:

- make sure your IP (IPv4 and IPv6) is clean and not listed in any RBL, use e.g. http://multirbl.valli.org/ to check

- make sure you have a correct reverse dns (ptr) entry for said IP and that ptr/hostname's A record is also valid

- make sure your MTA does not append your client's IP to the message headers (i.e. x-originating-ip); messages can be blocked based only on a "dodgy" x-originating-ip (see e.g. https://major.io/2013/04/14/remove-sensitive-information-fro... )

- set up SSL properly in your MTA, there are so many providers giving away free certs nowadays

- SPF, DKIM, DMARC - set them up, properly, this site can come in handy for checking yourself https://www.mail-tester.com/

- do not share the IP of your email server with a web server running any sort of scripting engine - if it gets exploited in any way usually sending spam is what the abusers will do

- last but not least - and while I loved qmail and vpopmail - use Postfix or Exim; they are both more fit for 2016, more configurable, and with much, much larger user bases and as such a bigger community and documentation.
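The reverse-DNS item in the checklist above (PTR entry whose hostname's A record points back at the IP) is exactly forward-confirmed rDNS, and it is easy to automate. A sketch using Python's socket module; the resolver functions are injectable so the logic can be exercised without live DNS, and the IP/hostname pairs used for testing are illustrative documentation addresses:

```python
import socket

def forward_confirmed_rdns(ip, lookup_ptr=None, lookup_a=None):
    """FCrDNS check: ip -> PTR -> A should resolve back to the same ip.
    By default uses real DNS via the socket module; pass your own lookup
    functions to test the logic offline."""
    lookup_ptr = lookup_ptr or (lambda addr: socket.gethostbyaddr(addr)[0])
    lookup_a = lookup_a or socket.gethostbyname
    try:
        return lookup_a(lookup_ptr(ip)) == ip
    except (OSError, KeyError):  # DNS failure or missing record
        return False
```

Called with no extra arguments, `forward_confirmed_rdns("203.0.113.5")` performs the live check most receiving MTAs will make against your sending IP.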


mrb 2 days ago 6 replies      
One little trick that I rarely see mentioned for working around the negative or neutral reputation your MTA's IP might have is that you can route your outgoing emails through another MTA that has a higher reputation. For example route them through smtp.gmail.com (or for other options see https://support.google.com/a/answer/176600?hl=en). It does not mean you have to use Gmail. It does not mean you have to change your MX records. It does not mean you have to use a @gmail.com address. None of that. Your recipients will not even notice you are routing through smtp.gmail.com (unless they inspect the detailed headers). All you need is a Google account and password to authenticate against smtp.gmail.com, and Google will happily route your email to wherever, to any external domains, etc.

Doing this makes you retain all the advantages of running your own MTA: none of your emails are hosted at a third party provider, no scanning of your emails to personalize ads, no government agency can knock at the door of an email provider and ask them for the content of your inbox, etc.

The only downside is that in theory Google can scan and block your outgoing emails (not incoming emails since these hit your MTA directly). But if you don't send spam, this should never happen.

Another option is to route your mail through your ISP's MTA. Yes ISPs usually offer SMTP relay service accessible only from their customer's IP addresses (eg. for Comcast it is "smtp.comcast.net" IIRC.) However the reputation factor of an ISP's MTA might be worse than Google's MTA.
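The relay setup described above can be sketched with Python's standard library. The hostnames and credentials below are placeholders, and Gmail in particular requires an app password or OAuth rather than the account password:

```python
import smtplib
from email.message import EmailMessage

def build_message(sender, recipient, subject, body):
    """Assemble a plain-text message with standard headers."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = subject
    msg.set_content(body)
    return msg

def relay_via(host, user, password, msg, port=587):
    """Submit the message through a higher-reputation relay on the
    mail submission port, e.g. smtp.gmail.com:587."""
    with smtplib.SMTP(host, port) as smtp:
        smtp.starttls()          # upgrade to TLS before authenticating
        smtp.login(user, password)
        smtp.send_message(msg)

if __name__ == "__main__":
    msg = build_message("me@example.org", "you@example.net", "test", "hello")
    # relay_via("smtp.gmail.com", "me@gmail.com", "app-password", msg)
```

Your MTA would normally do this for you (Postfix calls it a `relayhost` with SASL authentication); the snippet just makes the mechanics visible.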

walrus01 2 days ago 1 reply      
Having a perfect smtpd that speaks TLS 1.2, has properly set up DKIM, SPF and DMARC records, working reverse DNS, etc. is sadly not enough these days if you use a commodity VPS/VM host. IP block reputation matters as well. Sadly, some other customers in your same /24 have been less clueful than you within the recent memory of major SMTP operators (Gmail, Office365/Microsoft, etc.) and your IP space probably has a bad reputation.

Reputation perception by opaque large SMTP operators will not show up in RBLs and other ways to check for blacklists. You cannot query your IP block's status unless you happen to personally know a senior sysadmin on their mail operations teams. They don't share this information because it would help spammers choose new "clean" places to spam from.

One solution is to colo your own 1U system with an ISP that is known to have very stringent zero-tolerance abuse policies. Typically not one that is a commodity hoster.

ChuckMcM 2 days ago 1 reply      
Absolutely possible, but it's a battlefield on the Internet so you have to understand the players. Two things I haven't seen mentioned in all the excellent advice:

1) Does your ISP let you send email? Some ISP's will not allow any outbound traffic to port 25 from a non "business" port. They force their users to send their email to their server, and then they forward it on to the Internet.

They do this with nominally good intentions (it is easier to control spam generated from their networks), but they also are financially motivated to do so.

2) Don't try to send mail from a dynamic IP address, you should have (and would probably pay extra for) a fixed static IP address (V4 and V6).

Dynamic IPs have two problems, one they change and mail receivers don't like that. Two, they carry with them the abuses people who had the IP address before committed. So your email may get delivered one day, and then poof you renew the lease on your IP and get one that is on a black list somewhere.

hannob 2 days ago 0 replies      
I'm running small scale mail servers. It's not nearly as difficult as the common folklore makes you believe.

Make sure you don't send spam and your server is properly configured. If you are sending mails to people that don't want it then it is spam. "They silently agreed to get our newsletter because it was listed in our ToS on page 357" is not acceptable. No other excuse for sending spam is acceptable. Whenever you send any automated mail there must be an easy way to unsubscribe.

A few more tips:

* Check your mail server on http://multirbl.valli.org/ - if it's in any blacklist try to find out why (there are a few rogue blacklists, ignore them).

* Hotmail allows you to receive a report for every mail that a hotmail user thinks is spam. Use that. Act on it.

* Check your logs for messages that indicate that others think you're spamming.

* E-Mail forwarding is a tricky business these days. Avoid it.

I occasionally get dubious spam rejections, but they don't come from the large hosts. They usually come from some small ISP using a proprietary antispam solution that gives you no insight what's going on.

My suspicion would be that qmail is your problem. There are a great many details that a mail software has to get right, qmail often doesn't do what the email ecosystem expects.

TheMog 2 days ago 3 replies      
I've been running my own MTA for about 15 years now, so it's definitely possible without spending the majority of one's waking hours to do so.

Even when I switched mail server IPs twice over the last few years I didn't run into the issues you ran into. A large part of it depends on where you run it - if you, say, run it on your home Internet connection that's usually an immediate strike against you because of the insane number of spammers using backdoor'd PCs to do exactly that.

The only time I ran an MTA out of my home was when I was on a commercial ISP with a fixed IP address, that seemed to be good enough for most services including gmail and hotmail.

These days I run my MTA on a VPS with a reputable hosting provider and don't seem to have that many issues with outgoing mails marked as spam.

SPF and DKIM are pretty much a must these days, so that's a good starting point, as are the rest of the precautions you already took. I assume you're using your own domain; how "old" is that domain? That might also have an impact given how many phishers and spammers register odd domains and use them for a short amount of time. I've used the same domain since about 1999 so that could make a difference.

I use postfix instead of qmail, but I've used qmail in the past. Both work well and are easier to configure than sendmail or exim IMHO. On top of that I do run amavisd/spamassassin/clamav for the incoming emails as well.

One more thing I've got set up that I didn't see in your list is that I've got TLS set up with a non-self signed certificate for both incoming and outgoing email. I suspect that this also makes a difference even if the other email server won't request a client certificate (most, if not all, won't). Certainly shows up when I send an email over to gmail.

My biggest issues these days are more with incoming email:

- You'll never get to the level of spam filtering that, say, gmail offers. To me, that's OK

- I use greylisting to weed out a lot of the spam that would normally make it through spamassassin, but unfortunately that's when you find out how many people have misconfigured servers that bounce emails when they encounter temporary failures
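Greylisting as described here (temp-fail the first attempt from an unknown sender, accept a well-behaved retry) can be sketched in a few lines. This is roughly the classic triplet scheme used by tools like postgrey, reduced to its core; real implementations also expire old entries and whitelist known-good senders:

```python
import time

class Greylist:
    """Minimal greylist: temporarily reject the first delivery attempt
    from an unseen (client-ip, envelope-from, envelope-to) triplet, and
    accept retries that arrive after `delay` seconds."""

    def __init__(self, delay=300):
        self.delay = delay
        self.first_seen = {}  # triplet -> timestamp of first attempt

    def check(self, ip, mail_from, rcpt_to, now=None):
        now = time.time() if now is None else now
        key = (ip, mail_from, rcpt_to)
        if key not in self.first_seen:
            self.first_seen[key] = now
            return "450 try again later"   # 4xx = temporary failure
        if now - self.first_seen[key] >= self.delay:
            return "250 ok"                # legitimate MTAs retry and pass
        return "450 try again later"
```

Spam cannons typically never retry, which is why this works; the misconfigured servers the commenter mentions are the ones that treat the 450 as a permanent bounce.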

pflanze 2 days ago 1 reply      
I'm doing it, also using Qmail. I've felt the same pains as you (even started to suspect that providers might detect mail was being sent by Qmail and scoring that lower (perhaps (only) spammers are using Qmail today?), but more probably my network block (Hetzner.de) is the biggest reason for my difficulties).

Here's what I've done on top of your list:

- backscatter prevention (using my own https://github.com/pflanze/better-qmail-remote)

- do the Google domain verification dance (postmaster tools, configuring their entry in the DNS); still didn't prevent mails ending up in spam, but who knows whether it might still have helped.

- started running mailing lists on it anyway, in spite of me knowing that mails end up in people's spam folders, and simply tell all new subscribers that mails first end up in their spam folders and that once they mark them as non-spam the problem goes away. This seems to be working (people haven't complained), and will over time hopefully give my server the reputation I need.

(PS. I'm also still using DJBDNS, with a config generator written in Scheme, look out for tinydns-scm on my Github)

rahkiin 2 days ago 0 replies      
I will not repeat what everyone else has already said, but I can add one thing.You need to 'warm up' your IP address. You need to send a lot of non-spam email. MTAs will A/B test it: mark some as spam, mark others as not spam. Then they see if they get user spam reports or non-junk reports. They don't know you. The more you send (successfully, if everything is marked as spam by receivers it won't work), the more they start trusting you.

SparkPost allows the use of dedicated IPs and it has a warm up time. They tell everything about it in [0].

[0] https://support.sparkpost.com/customer/portal/articles/19722...
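A warm-up plan is usually just a ramp of daily send caps. The numbers below are a hypothetical geometric ramp, not SparkPost's recommended figures; consult your provider's guidance for real values:

```python
def warmup_schedule(start=50, factor=1.5, target=10_000):
    """Daily send caps ramping geometrically from `start` up to `target`.
    Each day you send at most that day's cap of known-good mail."""
    caps = []
    cap = float(start)
    while cap < target:
        caps.append(int(cap))
        cap *= factor
    caps.append(target)
    return caps

schedule = warmup_schedule()
print(len(schedule), "days:", schedule[:5], "...", schedule[-2:])
```

The point of the ramp is exactly what the commenter describes: give receiving MTAs a steadily growing sample of mail that users do not report as spam, so the IP's reputation builds before you reach full volume.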

jwr 2 days ago 0 replies      
Of course it is. I've been doing it since 2001 or so. It isn't as easy as it should be, but it isn't that hard, either.

I had problems with mail acceptance only once, when one of my ISPs got me an IP address that was either used by a spammer in the past, or was in the same subnet that the spammers used. Other than that, no problems over the past 15 years, and I switched providers and systems at least three times over that time.

I'd encourage everyone to go ahead and do it. It isn't very hard, cost on the order of several dollars/euros a month, and you finally own your E-mail. I find it appalling that most people either use company E-mail (it isn't yours, anyone can read it, and if you part ways with the company you have a problem) or Google Gmail (Google does read it, trains its algorithms on it, and targets advertising based on that).

Don't worry too much about DKIM. It is no longer a good signal anyway, most spam gets it right.

So, if you're capable of it, go ahead and run your own mail server. I wish more people did it, so that we could avoid the "big guys" restricting E-mail. If more individuals ran their own servers, we could democratize E-mail again: it wouldn't be that easy to just reject E-mail for no good reason.

For the reference, the software I use right now is: Ubuntu LTS, postfix, postgrey, amavis, dovecot. I rent a virtual server at Hetzner.de.

tezza 2 days ago 1 reply      
Linode + Postfix successfully for years.

Reverse DNS very important and the SPF

Linode have excellent setup documentation ( https://www.linode.com/docs/email/postfix/email-with-postfix... )

Kadin 2 days ago 0 replies      
You can do this, and I do this (although not for my personal email, currently, although I have in the past -- I do it for a club though and it works fine).

If you are not being blacklisted (check the common ones plus AOL, they run their own), and are using SPF and DKIM, you shouldn't be having problems with messages getting blocked. That's pretty unusual.

What could be happening is that you might be in an IP range that's residential; there are some operators who blackhole all messages originating from "residential" IPs, even if they are not specifically being blacklisted for bad conduct, and even if they have valid SPF/DKIM records. I think this is a pretty bullshitty thing to do, and completely out of the spirit of Postel's Law, to the point where I think anyone who configures a server this way ought to be forced into a lifetime of Windows XP helpdesk duty. But it's a thing that happens.

One solution that's worked for me is to get a cheap VPS and run my mailserver there. It's in an IP block that traces back to a big datacenter, and it seems to be much more acceptable to various overzealous spam filters than my home IP.

tristor 2 days ago 0 replies      
Yes, it's absolutely possible. I wrote an extensive set of step-by-step instructions on how to deploy secure email services on top of Debian 7[0]. They still work but are no longer maintained because my current position is that services like Proton Mail [1] make running your own email services unnecessary. You're welcome to review and use them, and if anybody wants to update them to work under Debian 8, PRs are welcome on the Github repo [2]

[0] http://securemail.tristor.ro/#!index.md

[1] https://protonmail.com/

[2] https://github.com/Tristor/securemail.tristor.ro

bensbox 2 days ago 2 replies      
I am running my own mail server for several years now and it is quite possible. The problem is that your IP is new to the other MTAs and you need a couple of months to build up the reputation. Services like the one from Microsoft have forms where you can delist yourself from their blacklists. Even though you are not blacklisted, it helps your reputation if you fill out the forms with the MTAs you have problems with. Nevertheless it is constant work (not much, about 1 hour per week) to keep up the reputation. Just make sure you are not losing the IP when it starts to work out ;)
soneil 2 days ago 2 replies      
It's worth checking whether you're running IPv6. It does become very relevant. e.g., SPF records need to include it, it must also have a good rDNS, etc.

In particular, I know gmail hold IPv6 to higher standards. Some things (e.g. rDNS) that we traditionally treat as 'should', gmail will treat as 'must' over IPv6 - it's being treated as a chance to drop a lot of legacy leeway.

I do run my own MTA. It's not high-maintenance, at all. Understand the pitfalls, iron them out, and then stick with it to build your reputation. There is no magic bullet - the big providers won't tell us how they measure us - the best we can do is be well-behaved, stay well-behaved, and adopt modern standards (TLS, SPF, DKIM, etc) as they're thrown at us.

My best advice is to choose a reputable host. There's a lot of race-to-the-bottom in the web hosting market, and VPS are turning out no different. Keeping a clean house is good for your reputation - but so is living in a nice neighbourhood. It's well worth a couple of bucks extra to find such a neighbourhood.

sigil 2 days ago 1 reply      
Send a mail from your address to mailtest@unlocktheinbox.com. They send back an extensive "lint for smtp" report within minutes. I found it indispensable for debugging a DMARC issue recently.
acd 2 days ago 0 replies      
Was a mail admin for quite a number of years, here are some tips. Check that the IP address you are running your own mail server from is not blacklisted. Nor can the IP hosting the mail server be in a home IP range. This is because there are spam blacklists that explicitly mark home user IP ranges as possibly spammy.

Check the reputation in the various anti-spam blacklists out there:

* Check the IP in multiple blacklists: http://multirbl.valli.org/

* Check the server IP on mxtoolbox: http://mxtoolbox.com/whatismyip/

* If you are using a home server IP, please consider using a VPS with a good IP reputation.

* Consider looking up the IP at Cisco SenderBase reputation and check its score; make sure it's considered good: https://www.senderbase.org

* Look up the IP in Barracuda reputation: http://www.barracudacentral.org/lookups

(Cisco and Barracuda because these are somewhat common antispam services at edges.)

ebbv 2 days ago 2 replies      
It's gonna be really hard. I work at a hosting company and our staff has to work constantly to make sure people's servers get taken off of spam blacklists, or IP blocks of ours need to get removed, etc. There's a lot of stuff to navigate out there, basically kludges that have been put in place because email is just such a terrible, insecure system.

I'm sure if you're willing to put in the effort you can do it. But from my point of view, I'd probably just get a managed VPS with a hosting company who will take care of all the headaches of dealing with spam filters for me. They can be had pretty cheaply and the money is well worth it if you get good support.

jjnoakes 2 days ago 3 replies      
PSA: If you run your own mail server and use that email address for password resets, please use a reputable hosting provider and dns provider, and turn on 2FA.

Don't let that often overlooked weak point be the way every one of your accounts gets compromised. Once they have your email, they have everything that resets via that email domain.

mehdym 2 days ago 0 replies      
1. Creating a good IP reputation takes time, get a static IP from your ISP and gradually increase number of sent out emails.

2. Having multiple IP addresses and throttling helps if you send bulk emails.

3. Check email headers of spammed emails, they usually contain valuable info about the reason of being detected (SPF/DKIM ,...)

4. Check the contents of your emails and find out spam score of the content.

5. You can look into commercial solutions like: www.own-mailbox.com

6. Hillary, if it's you, don't do it again!

FollowSteph3 2 days ago 1 reply      
It's possible but it's not worth it unless you have a lot of knowledge about it or a lot of time and enjoy it. That's why most small to medium companies outsource this to services like sendgrid, mailchimp, etc.

You can't do everything so you have to pick your battles ;)

tacon 2 days ago 0 replies      
If you are having trouble with Google putting your emails into spam folders, send an email to yourself at Gmail. Then examine the original headers ("show original") and check the authentication header Google adds. It begins:

Authentication-Results: mx.google.com;

and details what it did and did not like about your message. For example, it let me know my mail server had suddenly started sending over IPv6 (actually, Google started accepting there and IPv6 had priority) and I only had SPF records for the IPv4 address. Google's authentication results are the friend of everyone with a personal mail server.
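Those Authentication-Results headers are machine-readable, so you can pull out the verdicts that matter (SPF, DKIM, DMARC) with a few lines. A rough sketch, not a full RFC 8601 parser (the sample header value below is illustrative, not from the comment above):

```python
import re

def parse_auth_results(header_value):
    """Extract method=result pairs (spf, dkim, dmarc) from an
    Authentication-Results header value."""
    results = {}
    for method in ("spf", "dkim", "dmarc"):
        m = re.search(r"\b%s=(\w+)" % method, header_value)
        if m:
            results[method] = m.group(1)
    return results

hdr = ("mx.google.com; spf=pass (google.com: domain of me@example.org "
       "designates 192.0.2.1 as permitted sender) smtp.mailfrom=me@example.org; "
       "dkim=fail header.i=@example.org; dmarc=pass (p=NONE) header.from=example.org")
print(parse_auth_results(hdr))  # {'spf': 'pass', 'dkim': 'fail', 'dmarc': 'pass'}
```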

waits 2 days ago 1 reply      
I've been running my own mail server on AWS for about 3 years now. Postfix, Dovecot, and a Rails app I wrote for webmail. At first I had 0 deliverability but over time it's improved to near 100%. Just setting up SPF, DKIM, not being on a blacklist, and building up a reputation of good mail seems to have worked wonders. I've been wanting to move to DigitalOcean but I don't yet know if there will be a significant hit on my IP reputation.

Postfix occasionally drops a legitimate incoming email due to a misconfigured sender, usually from a domain that doesn't resolve to anything, but I just log those in case I miss something important.

zzzcpan 2 days ago 1 reply      
> I don't know what else to do

These days you also have to get a bunch of different VPSs and test sending e-mails from their IPs and choose the ones working. That's what e-mail marketing companies do. Because a lot of IPs and subnetworks out there have poor reputation or even completely blacklisted and that reputation is not generally recoverable.

deftnerd 2 days ago 1 reply      
Receiving email is easy. Sending it is much harder.

One of the things that is the most frustrating is that if you end up on a blacklist, or a large provider decides independently not to trust you, it's often completely silent when it blackholes all your outbound emails to that service.

I've just moved over to a hybrid of hosting my own MX servers for incoming email, and forwarding all my outgoing emails to an email-as-a-service provider for outgoing messages. Their trusted IPs usually help delivery, and they're actively paid by their users to have employees making sure that their IP ranges are whitelisted.
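The hybrid approach described here (receive yourself, relay outbound through a trusted provider) is typically just a few lines in Postfix's main.cf. A sketch; the relay hostname, port, and credentials file are placeholders for whatever your provider gives you:

```
# /etc/postfix/main.cf -- relay all outbound mail through a smarthost
relayhost = [smtp.example-provider.com]:587
smtp_sasl_auth_enable = yes
smtp_sasl_password_maps = hash:/etc/postfix/sasl_passwd
smtp_sasl_security_options = noanonymous
smtp_tls_security_level = encrypt
```

The brackets around the hostname tell Postfix to skip the MX lookup and connect to that host directly.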

galori 2 days ago 0 replies      
If you look at the aggregate of the comments here, I think you get the idea. It's possible, but you have to constantly work at keeping it going with reputation, whitelisting, etc., and you'll never get to 100% deliverability, incoming or outgoing. Probably more like 70%-80%.

The big guys whitelist each other.

There are (very expensive) services for the medium guys, such as [Return Path](https://returnpath.com/), that help you keep a good relationship with the various ISPs. With many ISPs they even have a very specific whitelisting deal where you literally pay to be whitelisted... I think that's mostly the second-tier ones like AOL, Comcast, Yahoo. Gmail, for example, doesn't play ball with that - but they will whitelist you if you follow all the rules and have a good reputation, which someone like Return Path can help with.

Is the situation shady? Yes. But it rose out of a need for dealing with a real problem (spam), which you have to admit has gotten under control over the last 10 years.

Bottom line, I would use a 3rd party service.

mifreewil 2 days ago 1 reply      
It's been many years since I've run my own mail server, but especially if you are running your mail server on a public cloud, you should make sure you aren't on any blacklists like Spamhaus: https://www.spamhaus.org/lookup/

EDIT: Looks like this is one of the things https://glockapps.com/ checks.

pja 2 days ago 0 replies      
Yes, absolutely. I do this & have done for a decade or more.

But that decade probably helps - my mail server has kept the same static IP the entire time, so has a pristine spam reputation.

I added SPF a couple of years ago, as otherwise Google was starting to look askance at some of the emails sent from my server, but I haven't bothered with DKIM (apart from adding a DKIM policy that says I don't do DKIM, that is). No problems so far.

Where are you hosting your mail server? If it's on a dynamic IP on your home internet, then you're on a hiding to nothing. A static IP on home internet might be OK, if you can fix your reverse DNS to be something sane rather than the more usual adsl123455.isp.net or something. Google generally hates reverse DNS entries that look like consumer internet connections.

If you're hosting with AWS or another cloud provider, then I believe the only way to get a server on an IP address that doesn't come with a terrible reputation for spam is to cycle through IP addresses until you find one that works - this is what the big mail delivery companies do, I believe.

hukl 1 day ago 0 replies      
I've been running my own mail server for a couple of years now, using Postfix and Dovecot. I have SPF and DKIM set up, and I'm using SpamAssassin and Roundcube.

Setting up a working mail server is one of the biggest challenges because so many components are involved these days and you need to make them play perfectly together. I'm against using out-of-the-box VM images / installers, because then you don't understand what's going on under the hood. Mail servers are the kind of beast I would suggest understanding as well as possible before letting it loose on the world.

It's like with security. There is no "just press this button / just install this software and you are secure" solution - there is no easy and convenient way to run your own mail server without getting really involved with it :)

Just saying this as a warning for everybody who has this thought. I've been through it and my mail set up is running for 5 years like a charm now. But it was a steep way to get there :)

lucb1e 2 days ago 1 reply      
> is it possible to run your own MTA, for personal use in 2016? Who, here, is doing it successfully?

Hello, I run my own mail server, though admittedly with an attitude of "your spam filter thinks I'm spam? That filter is broken; have fun reading your spambox."

Company email addresses are actually never any trouble anyway, only personal ones (the free ones like yahoo, gmail and hotmail) are the ones where people have trouble with broken filters, so I don't think I'm missing out on anything. And even those filters usually learn (after a few emails to different accounts on the service) that my IP address is not to fear.

I add SPF records but don't sign with DKIM (too much trouble; I set this up years ago when I didn't have much experience yet).

The last time I had trouble sending email was with (of course) Google Apps. Some company, whose product we were required to use by school, had no privacy policy, so I wanted to ask after it. When I sent them an email, Google's mail server outright refused to let my IP address deliver a message (this is extremely rare; usually it goes into a spambox). Google is the only one that can get away with this, given the near monopoly, without people thinking it's an issue on Google's side. In the end I just didn't send them an email. Also quite ironic that Google, of all companies, is the one standing in the way of my email trying to ask after a company's privacy policy.

This was half a year ago. Before that I can't remember having issues with anyone or anything for another year or so. Given how much I use email and how little spam I receive (catch-all with a blacklist), owning a mail server is totally worth it. Also because I don't have to accept anyone else's privacy policy or consider how many people have access to my inbox (I host at home, not colocated nor VPS).

shirro 1 day ago 0 replies      
I had been running mail servers for a few years but didn't bother with my own until about 12 years ago. Back then all I needed was an old box in a cupboard, a static IP, an MX record, and to not be an open relay. As you have discovered, things have changed a lot.

I have seen some people do cool things with qmail but unless you have a long history with it I think you are better off with postfix. Qmail requires a lot of patches to catch up with the way things are done these days.

It sounds like you have done all the best practice things. A valid PTR record, valid ssl certificate, SPF, DKIM, DMARC. You generally can't host from home anymore because your IP will likely be blacklisted so grab a vps and check your IP is clean. You will want to add ridiculous rate limiting so you don't get sin binned when someone mails to a group. Adding your IP to dnswl probably does not hurt.

Even with you doing everything else right, it won't get you instant acceptance somewhere like Hotmail. You are going to have to establish a reputation over time with them. Get an email address at all the free services, send emails, mark them as not spam, check their help websites. View the full source of received emails if they let you, to see what headers their mail system attached to your messages.

And of course protect your reputation. If you need to send legitimate bulk email, use someone like Mailchimp and let them deal with any damage.

And this is only the outgoing part. You still have to worry about incoming emails, submission, retrieval, filtering etc.

I do email setups for websites where I need to ensure spam-filter-free communication with clients, and most business mail systems I encounter don't seem to bother with half the stuff I do. They will deliver all their emails plain text with no SPF or DKIM. And they will do brain-dead things like having fake blackhole MX records as some sort of homeopathic spam remedy. And they all seem to stay in business, so perhaps we just overthink this stuff sometimes.

dbcurtis 2 days ago 0 replies      
I've been doing it for ages, but I do it a very lazy way. It's so lazy that I highly recommend it.

Before I explain what I do, I'll just mention that if you run the server that is the destination of your MX record, then you probably also want to run a back-up spooler at a remote site for when the connection to the primary server, or the server itself, is down. Down for security patching, for instance. Running a rock-solid mail service is kind of a pain, because it never fails at a convenient time.

So... I let my ISPs handle the high up-time requirement stuff, and let them be the mail spool as far as the outside world is concerned. Then I pop it down and requeue the mail on a machine that sits behind my firewall and isn't even on the front lines of the internet. I run an IMAP server on that. If it goes down, pfffft, the mail spools up at the ISP for a while and it gets popped down when my server comes back up. It actually all works pretty well, but since I use my ISP's SMTP server for outgoing, all of my e-mail clients have a rather funky asymmetric set-up. The e-mail setup wizards just don't handle it. At. All. As long as you remember how to do old-school e-mail config settings, and can convince the new-fangled e-mail client to let you do a manual config, the asymmetric server is not much of an issue.

For remote access, I port-forward IMAP in my firewall.

So I should probably modernize this whole kit, but.... I think I mentioned above that I am lazy.

jrnichols 2 days ago 0 replies      
"gmail and hotmail both mark my mails as spam."

And they will continue to do so for as long as it takes for your mail server to earn a positive reputation. I recently had this problem with gmail after moving mail servers. there's really nobody that you can contact, gmail doesn't whitelist mail servers.

You're going to run into this problem with proofpoint, barracuda, postini, etc...

virtualio 2 days ago 0 replies      
The problem is that your ISP has named your home connection with a DNS name that probably doesn't match the domain name that you're trying to receive and send mail for. There's a workaround by adding an SPF text record to your DNS. But that's no guarantee that all major mail providers will accept mail from your mail server as unmarked (rather, it'll end up as spam in the recipient's spam folder).
calpaterson 2 days ago 1 reply      
I run my own mailserver and have done since ~2012. I don't have DKIM or DMARC set up, but I am using postfix and not qmail. I'm sorry I can't help you with your delivery problem - except to say that if you didn't have e.g. rDNS set up to begin with, gmail might have a negative cache of it. I haven't had problems with delivery of outgoing mail except to a couple of poorly administered Exchange hosts run by recruitment agencies - which complain about my mail but confusingly do still deliver it.

Two real problems I have faced... I once (embarrassingly) created a unix user with test/test credentials for messing about and forgot that my postfix setup at the time reused unix credentials (ssh was locked down to only allow specific users to log in). I sent a few million spams an hour for a couple of weeks, getting my host into all the DNS blocklists. This took some time to fix (you have to apply to have your host removed from the blacklist). While I'm sure sending all that spam was annoying to many other people, it didn't actually affect delivery for my mail... so it seems other administrators aren't using DNS blocklists?

Second, after a while I started to get a lot of spam. Maybe 10 per day. I tried various things to handle this, including setting up a proper bayesian spam filter (amavis-new) and using DNS blocklists myself. None of this worked for me. Greylisting however worked great.
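Greylisting, which worked for this commenter, has a core idea small enough to sketch: temporarily reject the first delivery attempt from an unknown (client IP, sender, recipient) triplet, and accept retries after a delay. Legitimate MTAs retry; most spam cannons don't. A toy in-memory version (real implementations persist the triplets and expire old ones):

```python
import time

GREYLIST_DELAY = 300  # seconds a new triplet must wait before acceptance
seen = {}             # (client_ip, sender, recipient) -> first-seen timestamp

def check(client_ip, sender, recipient, now=None):
    """Return 'defer' for a never-seen or too-recent triplet, else 'accept'."""
    now = time.time() if now is None else now
    triplet = (client_ip, sender, recipient)
    first_seen = seen.setdefault(triplet, now)  # records the first attempt
    if now - first_seen < GREYLIST_DELAY:
        return "defer"   # SMTP 4xx: "try again later"
    return "accept"
```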

So my suggestions to you: use defence in depth for mail as well as ssh. That means, fail2ban, different creds for both, unusual ports, user whitelists, high patchlevels (auto-patch and restart is great for a personal mail server) maybe client side TLS certs...etc. If you're relying on a single layer of defence eventually you'll make a mistake with it and then you're in trouble. I guess that really applies to anything you're trying to secure.

diego_moita 2 days ago 1 reply      
I'm using Postfix & Ubuntu on a Linode server; they have a very good how-to[0]. The main problem I have is filtering incoming spam with SpamAssassin.

[0] https://www.linode.com/docs/email/postfix/email-with-postfix...

perakojotgenije 2 days ago 0 replies      
Yes, it is possible. Here's a blog entry [1] I wrote some time ago on how to set up your own email server so that it will be accepted by major MTAs. I've used my own mail server since 2004.


timdeneau 2 days ago 1 reply      
You also need a DMARC record (https://dmarc.org) along with your DKIM and SPF records.

Be careful testing your configuration when sending emails to the large providers, you can inadvertently score negative marks against your own reputation, which is hard to recover from.
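For reference, the three record types mentioned here sit side by side in your zone. A rough sketch; the domain, DKIM selector, truncated key, and policies are placeholders, and the right policy values depend on your setup:

```
example.org.                  IN TXT "v=spf1 mx a -all"
sel1._domainkey.example.org.  IN TXT "v=DKIM1; k=rsa; p=MIGfMA0G...(public key)..."
_dmarc.example.org.           IN TXT "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.org"
```

The `rua=` address in the DMARC record is where providers send aggregate reports, which is how you find out who is failing authentication on your behalf.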

pmlnr 2 days ago 0 replies      
Of course it is. I've been running mine for ages; the current setup is Postfix, Dovecot, DSPAM, OpenDKIM, OpenDMARC. The last IP change was ~1 year ago, the last domain change ~1 month ago; no issues.

A long while ago, when I wasn't running the mail server in lxc and it was a server which also hosted web frontends, I got it "hacked" once; a rogue perl script sent tens of thousands of emails within an hour. Thankfully, this was in ~2007, so removing the node from blacklists wasn't impossible.

Anyway: add DMARC, and make sure you have TLS to send/receive. The latter is probably the most important bit.

hawat 2 days ago 0 replies      
You can deploy an end-to-end solution like Zimbra (the community version is open source and free). It has all the necessary features (DMARC, SSL, ClamAV, SpamAssassin, and S/MIME), a really good webmail interface, some "cloud" file storage features (Briefcase!) and more. As more and more people depend on big providers like Google and Microsoft for their mail, we should advertise self-hosted mail solutions as much as possible, and convert as many people as we can. Mail is the last bastion of free communication, and should stay "neutral" as long as possible. Big providers are bullying smaller ones, marking mail as spam/junk or blocking entire address spaces on a "we think that we should, and we do as we think" policy.
fusiongyro 2 days ago 0 replies      
I just set up mail myself last week. Postfix on FreeBSD over at DigitalOcean. Did everything you did and was quite frustrated that my wife seemed not to be getting my email. The thing is, I sent a test email before adding SPF, and then one before adding DKIM, and Gmail figured out they were all part of the same "conversation" so because the first one seemed spammy the rest were penalized.

I made my own fresh Gmail account and messages go through fine. So I'd try that: make a fresh account, separate from the one you've been testing with, and see if mail goes through.

Jaruzel 2 days ago 0 replies      
Adding in my 2p's worth of advice here as well. The big thing I think is your IP. Even if it is 'static', if you are running the MTA from your house, your IP will be marked as residential, and probably also still 'dynamic' (just made sticky by your ISP to your Router/Modem).

The best way to give your outgoing mail 'authority' is to relay it through a smarthost. Some ISPs offer this - All my outgoing mail from the Exchange server in my Garage routes through my ISPs smarthost - it's the only way I can be sure that the big webmail hosts (hotmail/outlook, gmail, yahoo) actually get the mails. If I try to route direct, the mails get blocked.

mrbill 2 days ago 0 replies      
I've been doing so for years. Having spent almost a decade in the ISP industry, I don't trust other providers for anything but transit.

I have a 50/10 connection from Comcast Business with five static IPs. One of those hosts what used to be a colo box with an ISP in Austin (I'd worked for them years ago, and had a services-for-colo agreement that lasted until an ownership change).

For about five years now I've had no problems sending or receiving mail; just keep a common-sense best practices configuration and do regular checks to make sure that you're not relaying/sending spam and aren't on any RBLs. "Nux"'s comment is a good list.

aabajian 2 days ago 0 replies      
We've used our own mail server to send all email reminders from www.cronote.com for the past five years. I followed the following tutorial step-by-step. The hardest thing was understanding how to setup the DNS records correctly:


It'll take about three hours to get everything working right (and passing spam checks), but it's a great introduction to running your own mail server, and when you're done you can simply create an image of the machine to use in the future.

swenn 2 days ago 0 replies      
When I was in the same situation some time ago, I used https://www.mail-tester.com/. With a score around 7 or 8, Gmail didn't mark it as spam anymore.
jghn 2 days ago 0 replies      
I used to and stopped just because it was a pain in the butt to keep my emails from getting flagged as spam all over the place. Several years ago I switched to using Dreamhost which was great for a while but I'm running into the same issues again, an increasing number of popular mail hosts are flagging emails from me as spam. Likewise Dreamhost has pretty much given up providing reasonable spam blocking tools.

I'm now considering something like a google organization account, or whatever it is called, where it's really gmail but with my domain name

lisper 2 days ago 0 replies      
I've run my own mail server for >10 years. Getting it set up the first time is a bit of a chore, but once it is running it requires nearly no attention. The hardest part is keeping your IP address off the spam blacklists.

My setup:

Debian Linux + Postfix + Dovecot + a little greylist milter I wrote myself in Python. Happy to release the code if there's any interest.

I also have a script that automates the process of setting up a mail server but it's not quite ready for prime time. If anyone is interested in being an early adopter let me know.

vancan1ty 2 days ago 1 reply      
Those of you who run your own email servers -- do any of you run it from a residential connection with dynamic ip address? Or do you pay extra for a static ip address/host using a VPS?
cdysthe 2 days ago 0 replies      
Hillary Clinton is the expert here ¯\_(ツ)_/¯
bluejekyll 2 days ago 1 reply      
I used to run my own MTA, and then I got fed up staying ahead of spammers.

I chose postfix which has a lot of off the shelf support for blacklisting and such, but even so I found that I couldn't stay ahead of spammers and their (new at the time) techniques like reflection attacks.

Anyway, while it's fun to play with this, unless you want to spend time every week keeping it up to date, etc., I found it mostly a drain on other better things I could be doing.

cookiecaper 2 days ago 2 replies      
No, not really. It's a constant battle to get delisted from spam blacklists and your site keeps popping back up. Even most companies don't bother with it anymore.
cmdrfred 2 days ago 1 reply      
I did almost exactly what you describe and it works flawlessly, but I only send a small volume of mail (my own). Digital ocean droplet $10 a month, but it would run on a $5 one if you don't also have caldav/sftp/kerberos/vpn/etc there like me.

Fun fact: Microsoft Exchange has no native support for DKIM, and tons of businesses run it in house, often on 'business class' connections without correct reverse IP records.

lazyant 2 days ago 0 replies      
Yes, with the caveat of non-guaranteed deliverability (in other words: one day, out of the blue, gmail/hotmail/whoever will silently drop your emails)
saynsedit 2 days ago 0 replies      
Gmail blocks based on IP. If you're running off of a dynamic IP at home you're going to need a smarthost.

If you're running from a VPS you may need a smarthost.

dboreham 2 days ago 0 replies      
What exactly is happening to your outbound test messages? Is the recipient MTA accepting delivery but filtering? Are you seeing rejects from the MTA? If so what's the error? Try sending messages that are ordinary looking (not "Testing, testing...").

How long has the sender domain been registered?

lrusnac 2 days ago 0 replies      
You should think about security though. An interesting article you should read before deciding to use your own mail domain: https://medium.com/@N/how-i-lost-my-50-000-twitter-username-...
sliverstorm 2 days ago 0 replies      
Make sure you aren't delivering over IPv6. I had all my IPv4 rules set up, and mail worked fine - except delivering to gmail. Turns out I was delivering over IPv6. I wound up shutting off my IPv6 interface.

As a good netizen I should work to behave on IPv6, but it was just too much of a pita for my tiny, single-user server, which works fine on IPv4
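If you'd rather pin mail delivery to IPv4 than disable the whole IPv6 interface as this commenter did, Postfix has a knob for exactly that (a sketch; other MTAs have equivalent settings, and the cleaner long-term fix is SPF/rDNS records that also cover your IPv6 address):

```
# /etc/postfix/main.cf
inet_protocols = ipv4    # Postfix will neither accept nor deliver over IPv6
```

Run `postfix reload` after changing it.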

bluedino 2 days ago 0 replies      
Ars did a series of articles on running your own mail server: http://arstechnica.com/information-technology/2014/02/how-to...
gbuk2013 2 days ago 0 replies      
I have been running an Exim4 email server serving several domains for many years, mostly without issue. I do have SPF configured, and it is running on a commercial VPS server.

For spam defence I use grey-listing based on DNSBL lookups, some standards-enforcement ACLs, and then SpamAssassin via MailScanner on the messages that get through.

z3t4 2 days ago 0 replies      
The spam filter is a product. And if it doesn't work, like legitimate mail getting marked as spam, they will lose business. Just look at Yahoo, where it's currently impossible to get whitelisted; they are losing business because of this. (Yahoo mail was as big as Gmail is now about 15 or so years ago.)
mverwijs 2 days ago 3 replies      
Been running it for over 15 years. No SPF. No dkim. Never had complaints, though threads like these have me worried.
wtbob 2 days ago 0 replies      
I followed Ars Technica's article years ago, and modulo a few minor alterations, their instructions seemed to have worked pretty well. I will note that I mostly receive, rather than send email, but I've had no problems I'm aware of.
darkhorn 2 days ago 0 replies      
In addition to the other things; DNSSEC and then registering your domain for at least 3 years might help.
efesak 2 days ago 1 reply      
Sure! I run several servers (and developing them, see https://poste.io). Watch DMARC reports and give it little time, most of time it will solve itself. You can also register to feedback loop...
RIMR 2 days ago 0 replies      
Yeah, all you need is a business-class Internet connection (residential Internet services block mail services), and a server.

I run my own servers both onsite and in the cloud. I have my own little personal "corporate" network.

j45 2 days ago 0 replies      
You can run something like Zimbra. I ran my own personal server for almost 15 years and intend on going back to it.

I trusted Microsoft once with my hotmail and they somehow deleted 16 years of my emails and correspondence with a few friends who passed away.

wfunction 2 days ago 0 replies      
I'm not sure it's a great idea from a privacy viewpoint.

Why should anyone who has ever received an email from you know your IP address? Especially if it's a home server, that will give them some idea of where you live.

qwertyuiop924 2 days ago 0 replies      
Running an MTA isn't too hard, but it's annoying. Postfix, qmail, and opensmtpd are the easiest to set up. I never actually got POP/IMAP working, so I have no advice there.
yeukhon 2 days ago 2 replies      
Some ISPs don't allow SMTP at all. FWIW, if you are hosting on Amazon, for example, you can't run a mail server on port 25 from an EC2 instance until you submit a support ticket.
markvaneijk 2 days ago 1 reply      
One important thing is that for Gmail you need to send your e-mail using TLS.
chmaynard 1 day ago 0 replies      
This has to be the most arcane discussion I have ever read on HN. Fascinating and terrifying at the same time.
jack9 2 days ago 0 replies      
I run my own for my own personal use...not public available accounts. I've never even looked at SPF, DKIM, or cared if I was in blacklists. I rarely send mail and receive lots of it.
jwatte 2 days ago 0 replies      
I do that, using postfix.

You need to also run spamassassin, with auto update, and check in with a number of rbl servers on the receiving side. (This is more important when you forward aliases to gmail and such)

kazinator 2 days ago 1 reply      
I run my own mail server on a dynamic IP. Not being marked as spam is mainly a matter of how you send mail. Make your next SMTP hop a server of decent repute. Don't send mail directly.
SixSigma 2 days ago 0 replies      
Your IP block assigned to domestic suppliers is probably the thing that kills you.

My VM in the ISP has no issues, but from home, blocked at lots of places.

What you can do is use your ISP as a mail sending relay.

t3ra 2 days ago 0 replies      
Related question: how do you get push notifications from a self-hosted MTA? The closest option I know of is an app called CloudMagic.
ams6110 2 days ago 2 replies      
Are you running it out of your house (e.g. a residential ISP)? If you are, you'll probably never have much success; all the major email providers block residential IP addresses.
meej 2 days ago 0 replies      
I have been running my own email server on a VPS running slackware and sendmail for over 3.5 years now and have not had any trouble sending mail.
drudru11 2 days ago 0 replies      
Yes - I just started doing it again. I'm glad I did. I just made sure that I could send/receive mail without issue from the large email systems.
ariejan 1 day ago 0 replies      
Try Mail-in-a-Box. It works very, very well :-)
omginternets 2 days ago 0 replies      
Is there a comprehensive guide to running a homebrew end-to-end encrypted mail server? Something roughly analogous to tutanota or protonmail?
mindslight 2 days ago 0 replies      
Postfix, Mail Avenger, Linode, unison, mutt, -spf, -dkim, -dmarc. Fine for >10yr, although I'm sure the age helps with rep.
janci 2 days ago 0 replies      
Have a look at the message headers when it goes to Gmail or any other target. There will be a hint, why it gets marked as SPAM.
cdnsteve 2 days ago 0 replies      
Anyone aware of any newer MTAs built using Python, Go or Node? Maybe with nice JSON config.
meeper16 2 days ago 0 replies      
Sendmail is your friend.
dstjean 2 days ago 0 replies      
Ask Hilary Clinton
igk 2 days ago 0 replies      
Yes, am doing it successfully using the awesome iredmail.
ABorserker 2 days ago 0 replies      
Check with mxtoolbox.com
notadoc 2 days ago 0 replies      
Sure, but do you want another chore?
miguelrochefort 2 days ago 0 replies      
People like you should be punished.
tacostakohashi 2 days ago 0 replies      
Basically not, if you have other things you're trying to do with your life at the same time.

It's like trying to make a toaster from scratch, or growing your own wheat to make your own bread. It's possible, but it's also impractical and you'll end up with a worse result than you can get off the shelf.

tootie 2 days ago 0 replies      
I don't know if "run your own" means your own everything, but Amazon SES is a pretty viable option. Even if it's not, they have a pretty good checklist for keeping yourself out of spam filters: http://docs.aws.amazon.com/ses/latest/DeveloperGuide/deliver...
mindcrash 2 days ago 1 reply      
Grab https://github.com/sovereign/sovereign, follow instructions, done.
CPUs are optimized for video games moderncrypto.org
431 points by zx2c4  5 days ago   336 comments top 21
sapphireblue 5 days ago 19 replies      
This may be an unpopular opinion, but I find it completely fine and reasonable that CPUs are optimized for games and weakly optimized for crypto, because games are what people want.

Sometimes I can't help but wonder what the world would look like if there were no need to spend endless billions on "cybersecurity" and "infosec". Perhaps those billions would be used to create more value for people. I find it insane that so much money and manpower is spent on scrambling data to "secure" it from vandal-ish script kiddies (sometimes hired by governments); there is definitely something unhealthy about it.

pcwalton 5 days ago 4 replies      
Games are also representative of the apps that actually squeeze the performance out of CPUs. When you look at most desktop apps and Web servers, you see enormous wastes of CPU cycles. This is because development velocity, ease of development, and language ecosystems (Ruby on Rails, node.js, PHP, etc.) take priority over using the hardware efficiently in those domains. I don't think this is necessarily a huge problem; however, it does mean that CPU vendors are disincentivized to optimize for e.g. your startup's Ruby on Rails app, since the problem (if there is one) is that Ruby isn't using the functionality that already exists, not that the hardware doesn't have the right functionality available.
speeder 5 days ago 6 replies      
As a gamedev I found that... weird.

A CPU for games would have very fast cores, a larger cache, lower-latency branch prediction, a fast FPU, and double-precision floating point.

Few games care about multicore; many "rules" are completely serial, and more cores don't help.

Also, gigantic SIMD is nice, but most games never use it unless it is ancient, because compatibility with old machines is important for reaching a wide market.

And again, many CPU-demanding games run serial algorithms on serial data; matrices are usually only essential to the stuff the GPU is doing anyway.

To me, CPUs are instead optimized for Intel's biggest clients (server and office machines)

Narann 5 days ago 3 replies      
The real quote would have been:

> Do CPU designers spend area on niche operations such as _binary-field_ multiplication? Sometimes, yes, but not much area. Given how CPUs are actually used, CPU designers see vastly more benefit to spending area on, e.g., vectorized floating-point multipliers.

So, CPUs are not "optimized for video games"; they are optimized for "vectorized floating-point multipliers", something video games (and many other workloads) benefit from.

joseraul 5 days ago 0 replies      
TL;DR: To please the gaming market, CPUs develop large SIMD operations. ChaCha uses SIMD so it gets faster. AES needs array lookups (for its S-box) and gets stuck.
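To make the contrast concrete, here is a sketch of ChaCha's quarter-round (per RFC 8439): nothing but 32-bit adds, XORs, and fixed rotates on independent words, with no data-dependent lookups, which is exactly the shape SIMD units are good at.

```python
MASK = 0xFFFFFFFF  # work in 32-bit words

def rotl32(x, n):
    # Rotate a 32-bit word left by a fixed (not data-dependent) amount.
    return ((x << n) | (x >> (32 - n))) & MASK

def quarter_round(a, b, c, d):
    # ChaCha's core: only add, xor, rotate. No branches, no table
    # lookups, so four of these run in parallel across SIMD lanes.
    a = (a + b) & MASK; d = rotl32(d ^ a, 16)
    c = (c + d) & MASK; b = rotl32(b ^ c, 12)
    a = (a + b) & MASK; d = rotl32(d ^ a, 8)
    c = (c + d) & MASK; b = rotl32(b ^ c, 7)
    return a, b, c, d

# RFC 8439 section 2.1.1 test vector
print([hex(w) for w in quarter_round(0x11111111, 0x01020304,
                                     0x9b8d6f43, 0x01234567)])
# -> ['0xea2a92f4', '0xcb1cf8ce', '0x4581472e', '0x5881c4bb']
```

A real implementation runs the quarter-round over a 16-word state and vectorizes across four columns or diagonals at once; this sketch shows only why the operation mix is SIMD-friendly.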
wmf 5 days ago 0 replies      
Maybe a better headline would be something like "How software crypto can be as fast as hardware crypto". I was curious about this after the WireGuard announcement so thanks to DJB for the explanation.
nitwit005 5 days ago 1 reply      
Not really. Just look through the feature lists of some newer processors:

AES encryption support: https://en.wikipedia.org/wiki/AES_instruction_set

Hardware video encoding/decoding support (I presume for phones): https://en.wikipedia.org/wiki/Intel_Quick_Sync_Video

It's more that it's relatively easy to make some instruction useful to a variety of video game problems, but difficult to do the same for encryption or compression. You tend to end up with hardware support for specific standards.
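To see why AES is the awkward case, here is a sketch of its SubBytes step (assuming the FIPS 197 construction: the S-box is inversion in GF(2^8) followed by an affine transform). Applying it means one table lookup per byte, indexed by secret data, which neither maps onto SIMD lanes nor runs in constant time in software; that is what dedicated AES instructions replace.

```python
def gf_mul(a, b):
    # Multiplication in GF(2^8) modulo x^8 + x^4 + x^3 + x + 1.
    p = 0
    for _ in range(8):
        if b & 1:
            p ^= a
        carry = a & 0x80
        a = (a << 1) & 0xFF
        if carry:
            a ^= 0x1B
        b >>= 1
    return p

def build_sbox():
    # Multiplicative inverse (0 maps to 0), then the affine transform.
    inv = [0] * 256
    for x in range(1, 256):
        for y in range(1, 256):
            if gf_mul(x, y) == 1:
                inv[x] = y
                break
    sbox = []
    for x in range(256):
        b = inv[x]
        s = b
        for i in range(1, 5):
            s ^= ((b << i) | (b >> (8 - i))) & 0xFF  # 8-bit rotations
        sbox.append(s ^ 0x63)
    return sbox

SBOX = build_sbox()

def sub_bytes(state):
    # The SubBytes step: a data-dependent table lookup per byte. This
    # indexing is the part that resists vectorization and leaks timing
    # through the cache in software implementations.
    return [SBOX[byte] for byte in state]
```

FIPS 197's worked example has SubBytes(0x53) = 0xED, which this reproduces; the brute-force inverse search is for clarity, not speed.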

magila 5 days ago 1 reply      
One important aspect DJB ignores is power efficiency. ChaCha achieves its high speed by using the CPU's vector units, which consume huge amounts of power when running at peak load. Dedicated AES-GCM hardware can achieve the same performance at a fraction of the power consumption, which is an important consideration for both mobile and datacenter applications.

Gamers generally don't care about power consumption. When you've spent $1000 on the hardware an extra dollar or two on your electricity bill is no big deal.

revelation 5 days ago 6 replies      
I thought modern video games are predominantly limited by GPU performance? Maybe the argument is that while usually CPU performance isn't the most important part of the equation, video gamers base their purchasing decision on misguided benchmarks that expose it.

The big CPU hog and prime candidate for these vector operations nowadays seems to be video encoding.

joaomacp 5 days ago 1 reply      
Of course. Gamers are the biggest consumers of new, top of the line PC hardware.
milesf 5 days ago 0 replies      
And because CPUs are optimized for both gamers and Windows, the world has access to lots of cheap, powerful hardware. I'm not a Microsoft fan, but I'm very appreciative to them for making this ecosystem possible.

In fact, games have always driven the modern computer industry. Even Unix started because of a game (http://www.unix.org/what_is_unix/history_timeline.html).

rdtsc 5 days ago 2 replies      
Wonder how a POWER8 CPU would handle it or if it is optimized differently. It obviously is not geared for the gaming market.
stephenr 5 days ago 0 replies      
Isn't this exactly why HSMs exist: to provide optimised hardware crypto functionality?

Honestly I would treat this the same as e.g. Ethernet: high-end cards have hardware offload capabilities that the software stack can utilise to get better performance.

tgarma1234 5 days ago 2 replies      
I really find it hard to believe that people with such an interest in security at the CPU level would buy "retail" processors like the ones you and I have access to. I am no expert in the field, but it just seems weird that there isn't a market for and producer of specialized processors that are more militarized or something. Why does everyone have access to the same Intel chips? I doubt that's actually the case. Am I wrong?
Philipp__ 5 days ago 3 replies      
ARMA III could be a good example of CPU bottleneck. Or maybe it is badly optimized... Then we hit the hot topic of multicore vs singlecore performance.
wangchow 5 days ago 1 reply      
The form factor of laptop screens is built for media consumption, even though the square form factor is superior for productivity (I found an old Sony Vaio and its screen form factor felt very pleasant). Seems the general consumption of media has dominated CPU design, in addition to everything else in our computers.
rphlx 4 days ago 0 replies      
Perhaps that was true in the mid 90s, but today Intel optimizes x86_64 for its highest margin core business: server/datacenter workloads. Any resulting benefit to desktop PC gaming is appreciated, but it's a side effect rather than a primary design goal.
wscott 5 days ago 0 replies      
No, Intel CPUs are optimized to simulate CPUs

Some stories from back around 2000 when designing CPUs at Intel. Some people did bemoan the fact that little software actually needed the performance of the processors we were building. One of the benchmarks where the performance actually was needed was ripping DVDs. That led to the unofficial saying "The future of CPU performance is in copyright infringement." (Not seriously, mind you)

However, here is a case where the CPUs were actually modified to improve one certain program.

From: https://www.cs.rice.edu/~vardi/comp607/bentley.pdf (section 2.3)

"We ran these simulation models on either interactive workstations or compute servers initially, these were legacy IBM RS6Ks running AIX, but over the course of the project we transitioned to using mostly Pentium III based systems running Linux. The full-chip model ran at speeds ranging from 05-0.6 Hz on the oldest RS6K machines to 3-5 Hz on the Pentium III based systems (we have recently started to deploy Pentium 4 based systems into our computing pool and are seeing full-chip SRTL model simulation speeds of around 15 Hz on these machines)"

You can see that the P6-based processors (PIII) were a lot faster than the RS6Ks, and the Wmt version (P4) was faster still. That program is csim, and it is a program that does a really dumb translation of the SRTL model of the chip (think Verilog) to C code that then gets compiled with GCC (the Intel compiler choked). That code was huge and it had loops with 2M basic blocks. It totally didn't fit in any instruction cache. Most processors assume they are running from the instruction cache and stall when reading from memory. Since running csim was one of the test cases we used when evaluating performance, the frontend was designed to execute directly from memory. The frontend would pipeline cacheline fetches from memory, which the decoders would unpack in parallel. It could execute at the memory read bandwidth. This was improved more on Wmt. This behavior probably helps some other real programs now, but at the time this was the only case we saw where it really mattered.

The end of the section is unrelated but fun:

"By tapeout we were averaging 5-6 billion cycles per week and hadaccumulated over 200 billion (to be precise, 2.384 * 1011) SRTLsimulation cycles of all types. This may sound like a lot, but toput it into perspective, it is roughly equivalent to 2 minutes on asingle 1 GHz CPU!"

Games were important but at the time most of the performance came from the graphics card. In recent years Intel has improved the on-chip graphics and offloaded some of the 3D work to the processor using these vector extensions. That is to reclaim the money going to the graphics card companies.

xenadu02 5 days ago 0 replies      
tl;dr: AES depends on data-dependent table lookups and is not optimized for vectorization. Other (newer) algorithms are designed with branchless, lookup-free vectorization in mind, which makes specialized hardware instructions unnecessary.
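As a small aside on the branchless style mentioned above, a generic sketch (essentially what Python's `hmac.compare_digest` does; not specific to any cipher): accumulate differences with XOR/OR instead of returning at the first mismatch, so the running time does not depend on where the inputs differ.

```python
def ct_equal(a: bytes, b: bytes) -> bool:
    # Branchless comparison: OR together the XOR of every byte pair,
    # then test the accumulator once at the end. No early exit means
    # no timing signal about the position of the first difference.
    if len(a) != len(b):
        return False
    acc = 0
    for x, y in zip(a, b):
        acc |= x ^ y
    return acc == 0
```

In production code you would use `hmac.compare_digest` itself; this just shows the data-independent control flow that branchless designs are built around.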
Philipp__ 5 days ago 2 replies      
And what if games are better (or worse) optimised for certain types of hardware? That way, you buy a new Intel CPU every 3 years. So the point is, what if some games are badly optimised and run badly on certain hardware on purpose? Maybe it sounds like a conspiracy theory. But look: CPUs are stalling, Intel wants to sell its chips every year. What if they come to developers and say "Look, make your game run 10% better on our latest hardware and we'll give you money"?
DINKDINK 5 days ago 0 replies      
Off-topic: That's a great favicon
I'm a Judge and I Think Criminal Court Is Horrifying themarshallproject.org
430 points by juanplusjuan  2 days ago   210 comments top 23
bleachedsleet 2 days ago 1 reply      
Several years ago, I was arrested on a hacking charge and got to see first hand the appalling nature of our legal system in criminal courts.

Normally, after your arrest, you have to have a hearing within 24-48 hours, but if they arrest you on a Friday, as they did me, they are allowed to detain you for an extra day because it's the weekend. I'm sure this is a tactic used often to frighten and goad people. My hearing was exactly as the judge in this article describes. There were lots of minor, non-violent offenders in the courtroom with me, most minorities, and many couldn't speak English well at all. The judge would openly mock them and condescend. One man obviously had no idea what he was even pleading to because his English was so poor.

Once I got to higher court for my actual sentencing, it was no different. The judge didn't even read my case and the clerk forgot to have it presented and available for the judge to review. My lawyer had to give him her own copy which he briefly skimmed without adjournment. I later discovered that the prosecuting attorney was good friends with the plaintiff and the investigating FBI officer assigned to my case.

The courts in America are a joke; the legal system is in bad need of an overhaul. I couldn't believe the level of incompetence, racism, bias, and prejudice that existed there.

rayiner 2 days ago 6 replies      
The criminal justice system can be jarring, even to those of us who are lawyers. I was a clerk for a judge in Philadelphia, and saw a ton of criminal convictions - usually drug cases - come up on appeal. Two things stuck out: (1) we jail a lot of people for stupid things; (2) there are a lot of bad people in these communities.[1]

I'd see someone appealing a sentence for selling some OxyContin and think, "geez, what a waste of money to put this guy in prison." But he's got a rap sheet a mile long - theft, robbery, assault, etc.

The treatment of the criminally-accused in this country is deplorable. But it's also the product of a society that got fed up with skyrocketing crime a few decades ago and responded in a harsh and heartless manner. Crime is a lot lower today than it was in the 1990s, but even in the safest American cities murder rates are 5 times higher than in big cities in Europe that aren't even considered that safe (like Berlin). And crime is heavily concentrated in poor places like the Bronx.

And the usual canard - for profit prisons - isn't even applicable here. Private prisons are illegal in New York. This isn't lobbying at work, this is purely a product of the democratic will reacting to devilish social issues. That's what makes it so hard to fix.

[1] I was also living in downtown Baltimore during the post-Freddie Gray unrest. I was disappointed to see the acquittals. At the same time, if I were in those cops' shoes, I'm not sure I'd have the moral strength to be any different. Society needs a certain amount of order to function. In much of Baltimore, that order doesn't exist. Gangs are in charge and the law-abiding people of those neighborhoods are the biggest victims of that.

twoquestions 2 days ago 6 replies      
And I hear from family and friends about how we 'coddle' people in the system too much, and we're not 'tough' enough.

A large number of our people seem to think if we're sufficiently cruel and inhuman to people accused of breaking the law, then people will magically stop getting suspected of breaking the law. It's more than a little horrifying seeing people's eyes light up when they talk about how our cops aren't afraid to kill people and how merciless our prisons are.

This horrifying system is a symptom of our cruelty, and any move to make this more humane (or less of an atrocity) will face stiff resistance from people who get off on seeing people get punished.

I don't know what I can do, or what anyone can do.

chillwaves 2 days ago 0 replies      
I had a friend who ended up with a drug related charge. When bailing them out, the bondsman, who was a conservative, ex cop, hardened and huge, told me how my friend was lucky to get the judge they got.

There were three judges in that court and one was known to be particularly rough. A defendant (drug charges, heroin) was asking for bail to be set, had secured a bed in a locked down rehab facility and the judge denied his bond. The bondsman said he had never seen someone leave the courtroom so broken.

Here is a case where a man acknowledges his crime, says he will do his time but wants more than anything to get clean. And securing a bed in a lock down rehab facility, besides being expensive, is not easy. Here is a case where the state had every interest in sending the man to rehab, even to save the cost of housing them in a jail, but the system doesn't care. The DA, the judge, don't care. Bail denied because the man had missed a hearing due to being in another jail after being picked up on the street with dope.

The bondsman said he sees this kind of thing every day. It was the rule, not the exception. People are just cycled in and out of the system. The addict will get locked up, released and locked up again.

ryanmarsh 2 days ago 2 replies      
I've observed the criminal justice system at work in Texas a few times. A murder case, some misdemeanors, a few felonies. I've seen it from arrest to prison life.

It is a dark and deeply depressing thing. There is little compassion for victims, the accused, the convicted. I could write a thousand words but it pains me to think about it.

I pray I never get falsely accused of a crime and I'm so glad I'm not black. If all the boring white folk who couldn't fathom finding themselves so much as suspected of a crime had to go through the criminal justice system as the average YBM the system would be massively overhauled yesterday.

Most of the boring white folk I know think everyone in jail is a cold blooded psychopath who works out all day and dreams of stabbing people... "hardened criminal".

The system is just full of unlucky humans. God knows I did shit in my youth I could still be in prison for. The extremely violent type are actually quite rare.

foepys 2 days ago 3 replies      
I never understood the concept of "plea bargain" in the USA that comes up in the article. Either somebody is guilty and should be convicted by a judge or they are not guilty and should have the chance for a fair trial. I can see that there are situations where they can be useful but they seem to be the norm instead of being used in certain, limited situations. Scaring somebody into pleading guilty for a lesser sentence is simply unfair and in my eyes undemocratic.
tbihl 2 days ago 0 replies      
I have some family friends, and their son recently got arrested for a felony drug dealing charge at the end of his spring semester at college. From what I understand, the police built this whole sting to catch the guy at his school mailbox when he got his shipment from whatever Silk Road's current successor is. Anyway, he confessed everything and now he's just waiting around, unable to get a job because interviewers invariably ask why he stopped school. Apparently having a job really helps in these cases.

Anyway, they remarked one day on the phone, "we've never dealt with criminal Court, and we don't know anything about it, except the lawyer doesn't sound very optimistic. The only hope we have is because the system is so racist that being middle class and white might actually save us."

They also said something about how the police were pushing some angle about using Tor. I'm very concerned about that: I can just picture the ghost story that the prosecutors are going to tell some septuagenarian judge about this special internet for terrorists and arms dealers.

kafkaesq 2 days ago 3 replies      
I was shocked at the casual racism emanating from the bench. The judge explained a stay-away order to a Hispanic defendant by saying that "if the complainant calls and invites you over for rice and beans, you cannot go." She lectured some defendants that "most young men with names like yours have lengthy criminal records by the time they reach a certain age."

Thereby instantly disqualifying herself from the privilege of serving on the bench. Ample grounds for impeachment, by any reasonable standard.

will_brown 2 days ago 6 replies      
>Even I, as a bankruptcy judge, know that the point of bail is supposed to be simply to ensure that a person will return to court.

That is the point of posting bail, but that is not the standard of granting a defendant the right to post bail/bond.

The standard is more along the lines of: a) is the defendant a flight risk; and b) does the defendant present a danger to the community.

As to b), it is not simply enough that the charges are non-violent, which seemed to shock this Judge. For example, DUI, while not a violent crime and generally one eligible for bail/bond by default, may not be granted bail if, say, it is the 3rd or 4th DUI. Or if, say, the defendant was already out on bail/bond and picked up a new charge, a Judge may revoke the bail/bond on the 1st crime. I'm not claiming this is always how it works and the decisions are always just, but I just want to give a little more perspective.

As to the shock of the state of the courthouse, only a Federal Judge would find that shocking. Don't get me wrong it's a pleasure to practice in a gorgeous Federal Courthouse, complete with grand marble accents, but I'm of the opinion those types of luxuries are a waste of tax payer dollars.

teddyknox 2 days ago 1 reply      
There's a new HBO series called "The Night Of" that portrays the New York justice system this way.

Trailer: https://www.youtube.com/watch?v=556N5vojtp0

desdiv 2 days ago 1 reply      
>Once the court officers caught their breath from laughing, they barked at him, "Where is your belt?" Of course, it was taken from him in the lockup, he said.

I don't like to watch crime/legal drama and even I know this. How does someone who works in the legal system as a day job not notice this?

ak217 2 days ago 1 reply      
We need a better accountability system for judges. Even for those judges who are elected, the public is usually not meaningfully informed about the judge's performance.
ChuckMcM 2 days ago 1 reply      
Great, where is the follow up with the judge in chambers? Don't judges talk to each other? Standards and mores are upheld by peers not by individuals.
jostmey 2 days ago 0 replies      
After reading this article, I have to wonder if the criminal court system is underfunded and overworked. I think America's core institutions have been decaying for some time now.
marklyon 2 days ago 2 replies      
An actual criminal attorney in NYC did a far better job of deconstructing this judicial tourist's article than I can: http://blog.simplejustice.us/2016/08/13/the-dilettante-judge...
epynonymous 2 days ago 0 replies      
i must say, though i was in people's court once trying to get my security deposit back from my landlady, which is by definition the lowest level court in the land, i too have many apprehensions about the "fairness" of the judicial system in america. that's not to say that my 1 incident should be representative of the entire judicial system, as i still believe that there are lots of excellent examples of upstanding individuals, probably more so than bad ones, but the judicial system is powered by people and their interpretation of the law, and though with some checks and balances, people having unconscious biases and tend to work these around the system, just like her example of guns and butter, no one's going to penalize that judge for this and she knows it.

my particular case was in quincy, mass, i remember the faces of the judge, deputy, and clerk that waited to put my case towards the very end after about 10 cases and all other people had left, behind closed doors, seemed i didn't understand much about the system back then, but this was an arbitration, under the guise of a court order to reappear for an appeal. the landlady, a chinese national, had lost the claim originally and had ninety days to appeal, she appealed after 138 days, and the court ordered me to show up with an official court summons. in looking back, i should not have showed up, i had no need to show up, this was over a measly 700-900 usd if i recall correctly. and the way the judge, clerk, and deputy nodded their heads in contempt of me as i spoke my case was so planned and orchestrated that in my mind i decided that this was useless and to go ahead and tear up the check that she was asked to pay me for my security deposit after the first decision, which i hadn't even cashed because it was never about the money.

if these are the shenanigans in the lowest of courts, i wonder what things are like in the upper courts. my lesson from that day was 2-fold: try your best never to go the legal route, and the american system, although perfect in many ways, still cannot provide perfect justice.

tomohawk 2 days ago 0 replies      
And then you have politicians like Martin O'Malley using the system to further their careers at others' expense.


peter303 2 days ago 0 replies      
Some of the modern courthouses reduce the courthouse violence problem with a "triple entrance" system: one for all the court personnel, a second for defendants briefly released from county jail, and a third for public witnesses and spectators. You don't have the hallway encounters between the three parties you had in older facilities.
shams93 2 days ago 1 reply      
In Los Angeles it's quite a bit more professional. When I served on a jury in a criminal case the judge was very professional.
Qantourisc 2 days ago 0 replies      
A judge holding the court in contempt basically. And since the judge is part of the court, the court is holding the court in contempt.

Perhaps we need judged for judges.

mLuby 2 days ago 0 replies      
It's heartening to read so many thoughtful posts about our justice system. Gives me some small hope for the future. :)
jondubois 2 days ago 1 reply      
I wonder if that has something to do with the different level of 'leniency' which is accorded in bankruptcy cases versus criminal cases.

My gut tells me that white-collar transgressions will always be punished less often (and less severely) than outright blue collar crime.

It sounds like the bankrupcy judge who wrote the article is surprised by the fact that the justice system actually punishes people for doing bad stuff.

Maybe it's not the criminal justice system which is harsh - maybe she is the one who is too lenient with her own cases...

Or maybe both need some recalibration.

japhyr 2 days ago 1 reply      
This is what bothers me so much about the Trump campaign. Even though he'll probably lose, he's helped legitimize the kind of speech and attitude this judge demonstrates.

I'm hoping there's a strong enough backlash that this kind of speech and attitude gets called out more often. I hope the long-term effect is not to make this judge's behavior even more accepted.

Stop Treating Marijuana Like Heroin nytimes.com
331 points by okket  3 days ago   249 comments top 22
poof131 3 days ago 5 replies      
The DEA needs to go bye-bye and be replaced by the DRA, Drug Rehabilitation Agency. While I respect the risks operational agents take against crime syndicates, the war on drugs is a disaster. Even the government's own RAND Corporation concluded enforcement is far worse than rehabilitation, with rehab being 7 times more effective than policing, 400 times more effective than border interdiction, and 800 times more effective than actions in source countries.[1] Not to mention the privacy violations it's enabled. If the goal is to reduce drug use, the US policies are stupid. If the goal is for politicians to sound tough and maintain a never-ending drug war so they can keep sounding tough, then it's a success.

And letting the DEA decide what is Schedule 1 is like letting the fox guard the hen house. I've never heard of a government agency voluntarily reducing its scope and budget. Just more of our leadership deferring responsibility to so-called experts with a vested interest. A pretty rampant phenomenon these days, ranging from the banks to the military.

[1] http://www.rand.org/pubs/periodicals/rand-review/issues/RRR-...

martinald 3 days ago 33 replies      
While I'm totally for drug (all drugs - not just cannabis) legalisation/regulation, I do think the "dangers" of marijuana have been really downplayed over the last few years.

I know a lot of people at university who got into smoking it very often and basically lost 10+ years of their life to complete apathy to anything. Some have now stopped and are totally different people - just 10 years behind.

While it doesn't cause overdose, cirrhosis or criminal activity from the user, it does become very addictive for some people and causes them to be extremely unmotivated in their life.

Is this as bad as heroin or crack? No obviously not, but a lot of people end up really trapped by it.

contemporary2u 3 days ago 3 replies      
The big question is why drugs are illegal at all?

Drugs are a multi-trillion-dollar business. Hundreds of billions of dollars turning the wheels of the shadow economy every day.

Who stands to lose all that money if drugs were legalized? Who is involved in drug business? think of all of the players in the chain..

We know examples of developed countries where drugs are legalized and everyone did not become a junkie over night, or next week, or next year.

Drug money fuels a lot of things in this world. Entire countries are built and destroyed using drug money, new political systems are built and presidents elected using drug money. It is a convenient "invisible" hand that makes a lot of things happen in this world.

Drug problem is not about me or you becoming a junkie. Its about money, a lot of money, its about economical and political power that comes with it.

The day when drugs are legalized, the convenient way to make lots of money out of "thin air" would all of a sudden be gone...

Hearings on the CIA and Drug Trafficking:


John Kerry committee:


Police Officer Mike Ruppert (dead now, shot in the head 2 years ago) Confronts CIA Director John Deutch on Drug Trafficking:


will_brown 3 days ago 1 reply      
>The D.E.A. and the F.D.A. insist that there is not enough scientific evidence to justify removing marijuana from Schedule 1. This is a disingenuous argument; the government itself has made it impossible to do the kinds of trials and studies that could produce the evidence that would justify changing the drugs classification.

This cannot be overstated. I recently discussed a prior client of mine on HN who is one of the few patients who receive the federally authorized marijuana from the University of Mississippi (at a dosage of 360 joints/month). My client has a rare bone disease and has been in the federal program for over 30 years; one of my client's biggest complaints is that while he wanted studies to be conducted regarding the use/effects of marijuana, the Government has refused.

thisjustinm 3 days ago 0 replies      
The biggest change I've seen since marijuana was legalized here in Colorado is that pot isn't just one type (just like beer isn't just lagers, etc). There's hundreds of strains and some are VERY different than the others (again think Guinness vs bud light vs your favorite IPA, etc).

You can walk into a dispensary here and choose your levels and mix of THC and CBD - want 20%+ THC for an intense high? Maybe 15% CBD with no THC for almost no psychoactive high but a focus on pain relief / relaxing? All that and everything in between is available.

As research into all the other cannabinoids increases I expect to see their breakdowns included on labels as well.

And that's where the analogy to alcohol breaks down - there's typically just one active ingredient in beer but dozens in pot. And the different strains can deliver various ratios of the cannabinoids to create very different effects.

Bottom line is that I could easily see how if you only ever experienced illegal pot i.e. no choice and low quality I think you are likely to hold a different view on it than if you'd experienced the variety and nuance of legal pot.

arcticbull 3 days ago 1 reply      
Stop treating addiction as a legal problem and treat it as a public health problem, like Portugal. Decriminalize all substances and push addicted to treatment programs. And just legalize pot entirely, one of these is not like the others (at least Canada is leading the way there).
snarfy 3 days ago 3 replies      
I've been a daily user for about 30 years now. It's more like coffee and cigarettes than it is like heroin or alcohol.
Fifer82 3 days ago 0 replies      
I am in the UK and have been smoking cannabis daily now for 12 years. Legal or not.... does it look like I care?? At the end of the day, I am a nice guy, I help people out, I have held down a 10 year happy relationship and do pretty well at my job. I also keep fit and "healthy".

I don't care if I die, so take that away from me, and literally there is no valid argument to stop me enjoying a smoke.

Fuck the Police.

god_bless_texas 3 days ago 0 replies      
So we're looking to an enforcement agency to make legal one of the very things that justifies its existence?
Mz 2 days ago 0 replies      
My observation has been that drugs don't really fuck up people's lives. People with fucked up lives turn to drugs and then, when they are ready to get their act together, they also get off the drugs. Then blame the problems on the drug use.

I am pretty much a teetotaler. I am also allergic to marijuana, so I have no desire to be around it. But I have known people in person and read articles and the like. One example that comes to mind: a 16-year-old boy's brother is gruesomely murdered, he becomes an addict or alcoholic, and when he is 19 he decides to get clean and sober. So, he's had three years to process his grief and he is now a legal adult, not a helpless legal minor who can't do much about anything wrong in his life. Now, he wants to be sober and talks trash about what a bad person he was for using/drinking and blah blah blah.

My best guess: It is easier and/or more socially acceptable to blame drugs than to admit that life really shat on them, their parents are assholes, whatever. But, in most cases, it looks to me like they use for a reason and if that underlying reason gets better, then they will tend to stop using.

I am fond of the book "The truth about addiction and recovery" which basically takes this view.

aaron695 3 days ago 2 replies      
Stop demonizing heroin.

To be honest, not sure what is right here.

Is it ok to demonise interracial marriage while trying to get the minority the right to vote?

Does the end justify the means?

But if you think heroin is a demon and marijuana (or alcohol) is ok, you are sadly mistaken.

mark_l_watson 2 days ago 0 replies      
I like to listen to Catherine Austin Fitts who was the undersecretary of HUD in the George H. Bush administration. Catherine often says "follow the money."

Unfortunately there are a lot of vested interests, organizations that make a ton of money from the cruel Marijuana laws: private corporations running prisons, police unions/organizations, and I would argue the big pharmaceutical companies (because Marijuana is a good natural pain killer). The war on drugs is a high-profit business.

sandworm101 3 days ago 0 replies      
The OP speaks of a lack of studies due to a lack of supply. That's incorrect. There are plenty of studies and plenty of research-grade marijuana out there. There is just very little American material. The plant has been studied in Canada and Europe with plenty of material available from a variety of producers.

Canadian government's list of sanctioned providers:


Their trade association:


The growing supply of research-grade material:


So stop harping on about a lack of research. The fact that a medical substance hasn't been studied within the magic bounds of a particular country should be irrelevant to any reasonable person.

mahranch 2 days ago 0 replies      
Stop treating Heroin like heroin. That's the very reason why we have so many opiate addicts. As a former opiate user who almost fell down the rabbit hole (unfortunately, my brother eventually did fall down that hole), by demonizing heroin, you set it apart from other opiates like Codeine, Vicodin, and Percocet/Oxycontin. It's not any different from those drugs. I found out the hard way. All of those drugs are opiates and if you were to snort heroin and if you were to snort Oxycontin, you would almost certainly prefer the oxycontin. It's a much cleaner, longer-lasting high while the heroin high kinda blows. Taking it in pill form isn't any different. Heroin is only sorta useful when it's taken IV or smoked, but heating it up destroys the compounds which get you high so it's not a very good ingestion route.

Pure heroin is quite harmless by itself (provided you didn't take too much). People can live a full and complete life totally addicted to heroin with probably fewer risks than even pot. The problem of heroin is never direct -- it kills/harms via overdose (misjudging a dose, or doing the same amount after your tolerance drops) or the most common "overdose" we hear of is when Fentanyl gets cut into your heroin (Yes, Fentanyl is way more potent than heroin and hospitals prescribe it all the time). The other ways people die are by mixing heroin with other stuff, namely xanax. You virtually never hear of someone just dying from heroin alone. And have you ever heard of someone getting lung cancer from their heroin?

The drug itself isn't like crack or meth where doing it for an extended period of time can permanently turn you crazy (Psychosis). Or kill you from exhaustion/sleep deprivation. People can take it for decades and show no obvious signs (functioning addict). The drug is always labeled "super hard" by the same people carrying around Percocets in their purse. But that's OK because they got it from a dentist/doctor. Percocet is oxycontin mixed with Acetaminophen and Oxy is almost as strong as heroin per mg. Demonizing heroin while you're taking vicodin or oxy is so hypocritical that there should be a new word invented for it, something that means "more than hypocritical".

Heroin's problem lies in that it really doesn't have a great medical application. What it does, other drugs now do literally (not figuratively) 1000x better. But heroin is incredibly cheap to make (relatively speaking) and provides a unique rush when taken IV. There are other drugs (like Demerol) which provide that same rush but hospitals instead inject it into your fatty tissue (ass) so it's slow to release (so you can't get the rush). So why don't drug kingpins make that stuff instead? Because those pharmaceuticals (like Vicodin, Demerol, and Fentanyl) require complex processes (many steps) and expensive labs to make well, and it's way more expensive than Heroin. Not just cost-wise but also time-wise. When you're out in some back-country in Afghanistan mixing your freshly dried Opium in a vat of Acetone to dissolve it, you're on the clock to get in and out with the product or the boss is going to be pissed.

I realize I'm starting to rant but that's because I myself always thought "drugs are bad!" then I tried pot. Realized it wasn't that bad, safer than alcohol. That same "Well that's not so bad" happened to me with heroin. After snorting it I was actually disappointed and thought I was ripped off. I didn't realize it wasn't stronger than Oxycontin. I thought it was orders of magnitude stronger than these harmless little pills my dentist gave me when I got my wisdom teeth pulled. I was wrong. And that lack of understanding is what almost threw me down the rabbit hole.

mercer 2 days ago 0 replies      
Without arguing for anything in particular I'd just like to add the following:

If you feel or suspect that marijuana negatively affects your life, and if you want to quit (even temporarily), then I can strongly recommend visiting the 'leaves' subreddit (leaves.reddit.com).

In fact, even if you're just curious about how people compare living with and without weed as part of their lives, it could be interesting to read a bunch of the posts there.

It's a remarkably varied community; not everyone there sees weed as evil and quitting as a necessity.

And for alcohol there's a similar subreddit called stopdrinking. It's similarly open-minded.

Both have really helped me deal with (borderline) dependence.

davidf18 3 days ago 2 replies      
A problem is that women are smoking marijuana and may not yet know they are pregnant.

Medical marijuana laws and pregnancy: implications for public health policy. http://www.ncbi.nlm.nih.gov/pubmed/27422056

"Although there is much to learn yet about the effects of prenatal marijuana use on pregnancy and child outcome, there is enough evidence to suggest that marijuana, contrary to popular perception, is not a harmless drug, especially when used during pregnancy. Consequently, the public health system has a responsibility to educate physicians and the public about the impact of marijuana on pregnancy and to discourage the use of medical marijuana by pregnant women or women considering pregnancy." [emphasis added]

Long-term Marijuana Use and Cognitive Impairment in Middle Age. http://archinte.jamanetwork.com/article.aspx?articleid=24849...

Association Between Lifetime Marijuana Use and Cognitive Function in Middle Age: The Coronary Artery Risk Development in Young Adults (CARDIA) Study. http://archinte.jamanetwork.com/article.aspx?articleid=24849...

Marijuana use leads to increased stillbirth: (free download)

Association between stillbirth and illicit drug use and smoking during pregnancy. http://www.ncbi.nlm.nih.gov/pubmed/24463671

"CONCLUSION: Cannabis use, smoking, illicit drug use, and apparent exposure to second-hand smoke, separately or in combination, during pregnancy were associated with an increased risk of stillbirth. Because cannabis use may be increasing with increased legalization, the relevance of these findings may increase as well."

fithisux 3 days ago 2 replies      
And start treating nicotine like heroin.
kennell 3 days ago 0 replies      
More importantly: view drug use (and abuse) as a health issue, rather than a criminal justice issue.
viraptor 3 days ago 4 replies      
Do I understand it right? Even in states with legalised recreational use you can't do research on cannabis without DEA approval?
alvarosm 2 days ago 0 replies      
This is like lsd in the 60s... Hopefully in a couple of decades we'll realize marijuana is shit.
caub 2 days ago 0 replies      
just don't smoke it
youngButEager 2 days ago 4 replies      
INSIGHT: Being in a position to observe the behavior of users -- MANY users -- over a period of time, and non-users in the same settings.

Most of you probably don't have that experience.

- Parents do. Ask parents "how did your kid change after commencing use of pot?"

- Teachers do, if they know a student imbibes. These days teachers have a great chance to see the difference between regular students and those who smoke it.

- Property managers of apartment properties do.

I'm the latter. I made my Silicon Valley startup bucks and have been buying and operating apartment properties since 1993, 2 years out of college.

Here's what I experience:

1) my pot using tenants do not like following rules compared to other tenants.

2) they are defiant in their attitude to varying degrees, challenging things they initially agree to ("no smoking", "new occupants must pass the same tenant screening you did and must be added to the lease", "no guest parking", "no loud parties/noise after 10pm", etc).

RECENT EXPERIENCES:

- tenant moves in, it's a no-smoking building, they begin smoking pot in their unit EVEN THOUGH there are 'no smoking' signs everywhere and the lease clearly calls for 'no smoking cigars/cigarettes/marijuana'. THAT'S HAPPENED 18 TIMES (18 different tenants) IN THE PAST year and a half.

- tenant moves in, gets a warning for their car blocking other tenants in the parking lot, they kept blowing it off, parking and blocking others. FIVE TIMES over a 2 month period until they were evicted.

- tenant moves in someone without adding them to the lease (a requirement), we catch them, the added person does not pass the normal tenant screening process and they have to leave, the original tenant keeps them there anyway, we catch them, gave them a final warning, they ignored the final warning, and got evicted

Just a very small set of examples.

IF YOU SMOKE, you are the LAST person to know if your pot use has changed you, added some negatives to your behavior. "A doctor who treats himself has a fool for a patient."

I myself imbibed for 3 years as a teen. WHAT AN UNMITIGATED DISASTER. Normal recreation time was clouded by intoxication.

If you prefer being intoxicated in your leisure time, how would you feel telling people that?

"I like being intoxicated. It's my recreational activity."


"I like being intoxicated during my leisure time."


"When I spend free time with recreational pursuits, I like being intoxicated."

VERY FEW pot users will admit that to arbitrary others. Deep inside, we know to ourselves "I shouldn't need intoxication to enjoy myself."

You should not need to live in an altered state, intoxication, and if you are frequently choosing intoxication from pot as 'recreation', something is wrong.

AWS Application Load Balancer amazon.com
368 points by rjsamson  5 days ago   127 comments top 30
encoderer 4 days ago 6 replies      
We plan to do a blog post about this at some point, but we had the pleasure of seeing exactly how elastic the elb is when we switched Cronitor from linode to aws in February 2015. Requisite backstory: Our api traffic comes from jobs, daemons, etc, which tend to create huge hot spots at tops of each minute, quarter hour, hour and midnight of popular tz offsets like UTC, us eastern, etc. There is an emergent behavior to stacking these up and we hit peak traffic many many times our resting baseline. At the time, our median ping traffic was around 8 requests per second, with peaks around 25x that.

What's unfortunate is that in the first day after setting up the elb we didn't have problems, but soon after we started getting reports of intermittent downtime. On our end our metrics looked clean. The elb queue never backed up seriously according to cloud watch. But when we started running our own healthchecks against the elb we saw what our customers had been reporting: in the crush of traffic at the top of the hour connections to the elb were rejected despite the metrics never indicating a problem.

Once we saw the problem ourselves it seemed easy to understand. Amazon is provisioning that load balancer elastically and our traffic was more power law than normal distribution. We didn't have high enough baseline traffic to earn enough resources to service peak load. So, a cautionary tale: don't just trust the instruments in the tin when it comes to cloud IaaS -- you need your own. It's understandable that we ran into a product limitation, but unfortunate that we were not given enough visibility to see the obvious problem without our own testing rig.
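The parent's takeaway — run your own probes instead of trusting the provider's dashboards — can be sketched as a minimal TCP connect check. This is a generic illustration, not Cronitor's actual rig: the endpoint is a placeholder, and a real monitor would fire on a schedule that deliberately hits the top-of-minute hot spots described above.

```rust
use std::io;
use std::net::{TcpStream, ToSocketAddrs};
use std::time::{Duration, Instant};

// Attempt a TCP connection to the load balancer and time it. A refused
// or timed-out connect here is exactly the failure mode that, per the
// story above, never showed up in the provider's own metrics.
fn probe(addr: &str, timeout: Duration) -> io::Result<Duration> {
    let start = Instant::now();
    let sockaddr = addr
        .to_socket_addrs()?
        .next()
        .ok_or_else(|| io::Error::new(io::ErrorKind::NotFound, "no address resolved"))?;
    TcpStream::connect_timeout(&sockaddr, timeout)?;
    Ok(start.elapsed())
}

fn main() {
    // "example.com:80" stands in for the ELB's DNS name. A real check
    // would log timestamped failures so that refusals clustered at :00
    // of each minute stand out against the smoothed CloudWatch graphs.
    match probe("example.com:80", Duration::from_secs(2)) {
        Ok(latency) => println!("ok in {:?}", latency),
        Err(e) => println!("refused/timeout: {}", e),
    }
}
```

The point of the sketch is the vantage point, not the code: the probe sits outside the load balancer, so it sees what customers see rather than what the balancer reports about itself.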

ihsw 4 days ago 1 reply      
Can we agree on the terminology for Application Load Balancer and Elastic Load Balancer?

* ALB: Application Load Balancer

* ELB: Elastic Load Balancer

I have seen Application Elastic Load Balancer/AELB, Classic Load Balancer/CLB, Elastic Load Balancer (Classic)/ELBC, Elastic Load Balancer (Application)/ELBA.

In any event, I think it is great that AWS is bringing WebSockets and HTTP/2 to the forefront of web technology.

tobz 4 days ago 0 replies      
The real question: does this provide a faster elasticity component than ELBs?

At a previous employer, we punted on ever using ELBs at the edge because our traffic was just too unpredictable.

Combining together all of the internet rumors, I've been led to believe that ELBs were/are custom software running on simple EC2 instances in an ASG or something, hence being relatively slow to respond to traffic spikes.

Given that ALBs are metered, it seems like this suggests shared infrastructure (bin-packing people's ALBs onto beefy machines), which makes me wonder if that is how it actually works now, because it would seem the region/AZ-level elasticity of ALBs could actually help the elasticity of a single ALB.

If you don't have to spin up a brand new machine, but simply configure another to start helping out, or spin up a container on another which launches faster than an EC2 instance... that'd be clutch.

Deep thoughts?

0xmohit 4 days ago 4 replies      
AWS still doesn't support IPv6. Good to see them talking about HTTP/2.

Waiting for AWS to embrace IPv6.

boundlessdreamz 4 days ago 2 replies      
So this is pretty much the same as Google HTTP load balancing https://cloud.google.com/compute/docs/load-balancing/http/ + websocket & http2?
fred256 4 days ago 1 reply      
+1 for CloudFormation support on launch day.
+1 for support for ECS services with dynamic ports (finally!)
-1 for no CloudFormation support for ECS

(To configure an ECS service to use an ALB, you need to set a Target Group ARN in the ECS service, which is not exposed by CloudFormation)
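For readers who haven't touched these resources, here is a rough sketch of the shape the comment describes: CloudFormation could express the ALB, target group, and listener, but (at launch) gave `AWS::ECS::Service` no property to receive the target group ARN. All names and IDs below are invented; treat this as an illustration, not a deployable template.

```yaml
Resources:
  DemoALB:
    Type: AWS::ElasticLoadBalancingV2::LoadBalancer
    Properties:
      Subnets:              # placeholder subnet IDs
        - subnet-aaaa1111
        - subnet-bbbb2222
  DemoTargetGroup:
    Type: AWS::ElasticLoadBalancingV2::TargetGroup
    Properties:
      VpcId: vpc-cccc3333   # placeholder VPC
      Port: 8080
      Protocol: HTTP
  DemoListener:
    Type: AWS::ElasticLoadBalancingV2::Listener
    Properties:
      LoadBalancerArn: !Ref DemoALB
      Port: 80
      Protocol: HTTP
      DefaultActions:
        - Type: forward
          TargetGroupArn: !Ref DemoTargetGroup
  # The gap the comment describes: nowhere to hand
  # !Ref DemoTargetGroup to an AWS::ECS::Service.
```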

cheald 4 days ago 1 reply      
Exciting! Disappointing that you can't route based on hostname yet, though. I've got 5 ELBs set up to route to different microservices for one app, and because we couldn't do path-based routing before, that's all segmented by hostname. As soon as ALB supports hostname routing, I can collapse those all into a single LB.
agwa 4 days ago 1 reply      
> 25 connections/second with a 2 KB certificate, 3,000 active connections, and 2.22 Mbps of data transfer or

>5 connections/second with a 4 KB certificate, 3,000 active connections, and 2.22 Mbps of data transfer.

"2KB certificate" and "4KB certificate"? Is this supposed to read "2048 bit RSA" and "4096 bit RSA"?

indale 4 days ago 1 reply      
This looks pretty sweet. The next big thing for API versioning would be header- instead of URL-based routing; looking forward to 'give you access to other routing methods'.
rjsamson 5 days ago 2 replies      
They finally added support for websockets! Really looking forward to giving this a try with Phoenix.
daigoba66 4 days ago 1 reply      
These new features are cool... but they still pale in comparison to something like HAProxy.

I guess the tradeoff is that with ELB/ALB, like most PaaS, you don't have to "manage" your load balancer hosts. And it's probably cheaper than running an HAProxy cluster on EC2.

But for the power you get with HAProxy, is it worth it?

Does anyone have experience running HAProxy on EC2 at large scale?

erikcw 4 days ago 1 reply      
I'm curious if this will allow Convox to route to multiple services with just a single ALB instead of the historical default of 1 ELB per service. Would be a real cost savings for a micro-services architecture.
avitzurel 4 days ago 0 replies      
This is very good. Recently my workflow has been ELB -> NGINX -> Cluster.

Nginx was a cluster of machines that did routing based on rules into the ec2 machines. Now that the AELB has some of those capabilities it's time to evaluate it.
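For context, the NGINX tier in a setup like this typically does the path-based routing that classic ELB couldn't express — roughly the feature set ALB now absorbs. A hedged sketch (upstream names, addresses, and paths are all invented for illustration):

```nginx
# Hypothetical upstreams standing in for the EC2 clusters.
upstream api_servers {
    server 10.0.1.10:8080;
    server 10.0.1.11:8080;
}

upstream web_servers {
    server 10.0.2.10:3000;
}

server {
    listen 80;

    # Path-based routing rules of the kind classic ELB lacked:
    location /api/ {
        proxy_pass http://api_servers;
    }
    location / {
        proxy_pass http://web_servers;
    }
}
```

With ALB, `location` rules like these become listener rules on the load balancer itself, which is what makes removing the NGINX hop worth evaluating.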

archgrove 4 days ago 1 reply      
Any love for Elastic Beanstalk with these? They seem well matched. Though EB always feels a bit of a red-headed stepchild in the AWS portfolio.
dblooman 4 days ago 1 reply      
It seems that routing is done in the following way: /API/* goes to applications and expects :8080/api/ rather than the root. Would be nice to have the option to direct traffic to just :8080.
axelfontaine 4 days ago 2 replies      
It looks like the big missing piece is auto-scaling groups as target groups...
shawn-butler 4 days ago 0 replies      
Anybody know whether the new ALB handles client TLS (SSL) certificates when operating in HTTP mode?

I was trying to secure an API Gateway backend using a client certificate but found ELB doesn't currently support client side certificates when operating in HTTP mode.

There was this complicated Lambda proxy workaround solution but I gave up halfway through...


sturgill 4 days ago 1 reply      
This sentence sums up one of my main reasons for appreciating AWS:

The hourly rate for the use of an Application Load Balancer is 10% lower than the cost of a Classic Load Balancer.

They frequently introduce new features while cutting costs.

renaudg 3 days ago 0 replies      
I'm in the process of containerizing an app that includes a Websockets service, and given ECS / ELB limitations we'd just decided to go for Kubernetes as the orchestration layer.

This ALB announcement + the nicer ECS integration could tip the balance though.

Any thoughts on how likely it is that Kubernetes can/will take advantage of ALBs (as Ingress objects I suppose) soon?

kookster 4 days ago 1 reply      
As a heavy ECS user, all I can say is thank you, finally!
nodesocket 4 days ago 3 replies      
Do ALBs support more than a single SSL certificate?
manishsharan 4 days ago 0 replies      
This is definitely nicer than having to create subdomains for microservices and mapping each subdomain URL to its own Elastic Load Balancer + Elastic Beanstalk instance. But I have already gone down this path so I am unlikely to use AWS Application Load Balancer. I wish I had this option a year ago.
nailer 4 days ago 1 reply      
Nice haproxy / nginx alternative. It's got HTTP/2 support, though, which puts it ahead of haproxy.
DonFizachi 4 days ago 0 replies      
Any idea if sticky TCP sessions will be supported on ELB/ALB any time soon?
amasad 4 days ago 0 replies      
I wonder if they fixed the routing algorithm for TCP connections. It's round-robin on ELB, which performs terribly for long-lasting connections.
nodesocket 4 days ago 3 replies      
So what would be a use case for using ELBs now? Seems like ALBs do everything ELBs do, but with websocket and HTTP/2 support.
joneholland 4 days ago 0 replies      
Disappointing. I was hoping they were launching a service discovery stack to compliment ECS.
bradavogel 4 days ago 2 replies      
Does anyone know if it (finally) supports sticky websocket sessions?
merb 4 days ago 0 replies      
Virtual Host Load Balancer would be great.
NeckBeardPrince 4 days ago 0 replies      
Any idea if it's HIPAA compliant?
Shape of errors to come rust-lang.org
423 points by runesoerensen  5 days ago   161 comments top 17
cfallin 5 days ago 8 replies      
I really appreciate the user-friendliness of Rust's error messages -- I can't remember seeing a compiler tell me "maybe try this instead?" before (perhaps something from Clang, but never with the specificity of, e.g., a suggested lifetime annotation). And from a parsing / compiler-hacking perspective, it seems really hard to get the heuristics good enough to produce the "right" root cause. Kudos to the Rust team for this continued focus!
justinsaccount 5 days ago 1 reply      
In my little rust experience the suggestions the error messages had, even in the longer explanations, were useless. Mostly it came down to me trying to do something that was simply not supported, but the compiler not knowing that and leading me on a wild goose chase.

From what I remember, I was trying to use the Iterator trait, but return records that contained &str pointers. The goal was to parse a file and yield pointers to the strings from each line to avoid allocating memory and copying bytes around. Rust tries to tell you that you need lifetime specifiers, but if you try adding those, nothing compiles anyway because the iterator stuff doesn't actually support that.

I eventually got it to work by returning copies of the strings.. maybe the unsafe stuff would have done what I wanted, that's what rust-csv seems to do at least.

I concluded that Rust is definitely not a language you can learn by chasing down error messages and incrementally fixing things. If you don't 100% understand the borrow and lifetime stuff, fixing one error is just going to cause two new ones.
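For readers wondering what the parent ran into: the `Iterator` trait's `fn next(&mut self) -> Option<Self::Item>` gives the returned item no way to borrow from a buffer the iterator refills between calls (so-called "lending iterators" aren't expressible on stable Rust), which is why yielding `&str` slices of an internal read buffer won't compile however the lifetimes are annotated. A minimal sketch of the workaround the parent settled on — yielding owned `String`s — with all names invented:

```rust
// Splits an owned buffer into lines, yielding owned Strings.
// Yielding `&str` slices of a buffer that `next()` reuses would not
// compile: the Iterator trait can't tie Item's lifetime to `&mut self`.
struct Lines {
    data: String,
    pos: usize,
}

impl Iterator for Lines {
    type Item = String;

    fn next(&mut self) -> Option<String> {
        if self.pos >= self.data.len() {
            return None;
        }
        let rest = &self.data[self.pos..];
        let end = rest.find('\n').unwrap_or(rest.len());
        let line = rest[..end].to_string(); // the copy that sidesteps the lifetime problem
        self.pos += end + 1;
        Some(line)
    }
}

fn main() {
    let it = Lines { data: "a,b\nc,d\n".to_string(), pos: 0 };
    let fields: Vec<String> = it.collect();
    println!("{:?}", fields); // ["a,b", "c,d"]
}
```

(Crates like rust-csv avoid the copies with internal buffers and, in places, unsafe code — as the parent guessed.)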

dllthomas 5 days ago 1 reply      
Please put file locations (as many as might be relevant) at the start of a line in the standard format!

The other changes look valuable. Improving error reporting is great.

Edited to add emphasis: I really did mean a line, not every line.

waynenilsen 5 days ago 3 replies      
I would love to see an option for showing the error order forward or backward. My workflow is to start fixing compile time errors from the top of the `cargo` output but scrolling to the top can be fairly annoying when there are a lot of errors. Having the most relevant error at the bottom of the command line (where it automatically scrolls) would be useful as an option IMO. This probably causes some other unseen problems however
gnuvince 5 days ago 0 replies      
Very good! I always liked the content of Rust's error messages as it clearly explained the issue, but the form of those error messages was a bit problematic: they were very noisy, and it wasn't easy to see the issue by simply glancing; you had to scroll up, find the beginning of the error, and read carefully.
karysto 5 days ago 0 replies      
I always love it when we bring UX to the tooling we use, and not just to the end product in our users' hands. Everyone appreciates a delightful UX, including software engineers. I've been eyeing Rust for a while now; this just gives me another excuse to hack with it.
ColinDabritz 5 days ago 0 replies      
I love the clarity and readability of these errors. You can work on UX at lower levels, and it looks like this. Beautiful. I'm not even a Rust dev, I'm mostly in C# land these days, but I appreciate the effort this takes. Well done!
Animats 5 days ago 4 replies      
Imagine what you could do if error messages didn't have to appear as plain text. You could have arrows, lines, shaded areas, and even multiple fonts.
zalmoxes 5 days ago 3 replies      
Inspired by Elm?
IshKebab 5 days ago 0 replies      
This is great! Why keep the row/column numbers at the end of the source file though? Aren't they redundant (IDEs shouldn't be parsing command line output anyway so they don't need it).
nneonneo 5 days ago 0 replies      
This focus on compiler usability is really fantastic. C calls these "compiler diagnostics" for a reason - they should help the programmer diagnose and fix the problem. I loved it when Clang started making C errors sane (and getting GCC to introduce better messages too!), and I'm glad to see Rust take the next step. Since I'm stuck in C++ for the time being, I'm (selfishly) hoping that Clang takes a page from these new Rust errors - these sorts of diagnostics would look great on C++!
marsrover 5 days ago 3 replies      
Not related to this article, but I was looking through the Rust survey mentioned at the bottom of the article and was surprised at the amount of people using it for web development.

I'm not very knowledgeable about Rust but I guess I assumed it would not be the best technology to use in that space. Is Rust really that prevalent in web development?

mrich 5 days ago 1 reply      
Great to see an improvement over the already improved error reporting established by clang and later adopted by gcc.

However I don't understand why backticks are still being used - they tend to look ugly especially when pasting them into emails etc.

Symmetry 5 days ago 0 replies      
That looks really cool and I'll have to give learning Rust another try when this lands. Also, the title was pretty wonderful. I spent a bit thinking about it before looking at the domain and realizing what it had to be.
knodi 5 days ago 0 replies      
damn... that's nice. Can't wait to dive into it.
hyperpape 5 days ago 1 reply      
Do I understand correctly that since this is in the current nightlies, it's slated for 1.12? So, sometime in October?
miguelrochefort 5 days ago 0 replies      
Nothing that Visual Studio and C# doesn't do.
What Danes consider healthy childrens television economist.com
350 points by CraneWorm  4 days ago   190 comments top 24
skrebbel 4 days ago 11 replies      
Kaj and Andreea is fantastic!

I'm Dutch but my wife is Danish. She does her utmost to expose our kids to Danish TV (we live in Holland), so I understand a bit of what the author is on about. I was initially surprised as well.

Truth is, it _is_ entertaining! But most importantly, once you get past the absolutely amateurish way things are shot (hey, the kids don't care, why should you?), there's a lot of depth to a lot of it. You need to let that depth sink in a bit before you can see the value of it.

For example, what the article says about Kaj and Andreea:

Probably most striking, though, is another thing lacking: education. Quite simply, there is none, academic or moral. Kaj and Andrea, a pair of puppets, are sweet friends, but also goofily flawed: Kaj is terribly self-obsessed, Andrea is warbling and neurotic. When other characters do something wrong, there is little of the obvious consequence-and-lesson resolution of American shows; the results are usually left to speak for themselves.

The lovely thing here is that it is educative, but it's up to the watcher to draw the lesson. Kaj's self-obsession is usually very funny and rather ridiculous. It teaches kids "wow, it's pretty ridiculous to be so self-obsessed". And at the same time it teaches kids that even if you're flawed, that's ok - both Kaj and Andreea appreciate one another and are appreciated and respected by the human co-hosts of their TV shows. This is something that I've not really seen anywhere else.

I'm not convinced that kids learn lessons like these better when it's spelled out for them.

ups101 4 days ago 0 replies      
You missed Bamse ("Teddy"): A self-absorbed egocentric narcissist, taking advantage of his naive, little-minded friend Kylling ("chicken"), episode after episode with zero negative impact. After +20 years he's now joined by Bruno, also a teddy bear, sharing similar personal flaws except with much refined insults bordering on psychopathic abuse, bullying kids instead of chickens, believing firmly that the world exists solely to please him in every which way.

However, mostly you may have missed the subtle, educational point (yes, there is one) underpinning these characters: Critical thinking. Is it right to behave like this, even if there is no negative consequence (as is often the case in real life)? Do you empathize with someone taking advantage of their friends? In Denmark, we have this crazy idea that kids are interested in distinguishing right from wrong, even when the answer isn't spoon-fed by a morally correct television character - and maybe exactly because it isn't.

The kids don't like Ramasjang because it's a training ground for aspiring film makers. They like it because naughty is fun. And, if done right, it just may induce some critical thinking. It'll take more than a few days of watching, but just like the kids, you'll see the point :-)

Best regards from a grown-up, bottle-fed on Bamse, somewhat certain of right from wrong.

johnjuuljensen 4 days ago 0 replies      
As a Citizen of Denmark and a father of two (4 & 1), I absolutely love the danish stuff on Ramasjang and hate most of the American (foreign) stuff. The programs produced in DK are rude and fun and slow and scary, while the foreign stuff is mostly lame attempts at education (Dora the explorer is a particularly nasty example of this) or over the top animations with too much moral crap, too pretty characters and backed by toy franchises and whatnot.

It seems to me that the author actually enjoyed the shows and I mostly agree with his assessment, but he can't have been watching too much because he missed some important shows that are in fact educational.

Most prominent of the educational shows is Mr. Beard, who teaches numbers, counting, simple math, letters and simple spelling. The show is weird and quirky, but also features some great songs. The whole thing is backed by some pretty great apps, which expand on the educational stuff.

There's also a great show about kids helping animals in trouble. They don't pretend that the kids do most of the work, it's a professional that does the heavy lifting, but the kids get involved as much as possible.

I don't mind watching these shows with my kids, but I'll leave as soon as Dora or Thomas the tank engine or Chuggington starts.

If you're interested in watching some completely outrageous Danish children's TV, go watch "Carsten og Gittes vennevilla". It's created by the duo "Wulf Morgenthaler", who have produced some of the weirdest stuff on Danish TV. Particularly the one where Carsten creates a fox is a hoot.

aedron 4 days ago 6 replies      
The most astute observation is that characters on Danish kids TV are often completely, unapologetically flawed. This makes them relatable to kids, and kids enjoy watching them in all their dysfunction.

And I actually disagree with the premise that this contains no lessons or morals. Children are able to see clearly how ridiculous and ill-advised the behavior of these characters is, and so learn to recognize it in themselves and others.

Someone once pointed out that while Mickey Mouse is the mascot of Disney in the U.S. and the main character of the franchise, in Europe it was Donald Duck who became popular. Europeans like the flawed anti-hero, while moralizing America (sorry!) preferred the do-gooder know-it-all Mickey Mouse.

ulrikrasmussen 4 days ago 2 replies      
I'm a Dane, and grew up with Danish children's television, including the shows mentioned in the article. I just recently re-watched one of my favorite shows "Nanna", which you can now stream from the archives of DR. Even as an adult, I still enjoy watching it for its humor and creativity, and for the fact that it doesn't speak down to its audience. I don't think that can be said about most other children's shows.


Nanna: "What're you doing?"

Mother: "We're having sex, what're you doing?"

iamthepieman 4 days ago 0 replies      
I just turned off the television after my kids watched more television in one morning than they have in an entire month. The reason? Rio Olympics track and field. Finally had to turn it off because I wasn't getting any work done.

Other than that, in the summer time anyways, I'll usually toss them a book of matches and tell them to start a fire or go build a dam in the brook.

We usually have lofty educational goals at the start of summer but quite honestly I can't be arsed when the weather is warm and there's so much great romping to be had. When we watch TV it's usually youtube for something we're curious about as a family. The "Fun" T.V. most always leaves my kids grumpy afterwards so we avoid it, not for moral or educational or child development reasons, but because I don't like grumpy kids.

After a couple kids and being the oldest of a large family I've realized from my individual scientific survey of one that kids will develop and learn in spite of their parents. We get too much credit and too much blame.

You gotta water if the sun's hot and it won't be raining for a while and you should try to get a load of fertilizer at least once a year. I guess you should weed after the first planting but you certainly can't make them grow.

If this comment is too rambling for you maybe you should go watch something with a moral in it[0]


tronje 4 days ago 1 reply      
> But until then, they seem utterly unharmed by a childhood of hearing about the queen's bottom and watching grandma light some bodily gas on fire.

Is that really as surprising as is implied? To me, none of what the article describes seems like such a bad thing. Sure it's a little weird, but it seems right up there with the humor children enjoy. And not everything has to have a message or a moral at the end of it.

wodenokoto 4 days ago 4 replies      

> And God he says, lives in heaven with Santa Claus and their dog Marianne, implying that the Supreme Being is not only imaginary, but also gay.

That is like saying Ernie and Bert are gay. I'd appreciate if the author tried a little harder at interpreting children's television from a child's perspective.

TeMPOraL 4 days ago 5 replies      
At the risk of making a tu quoque argument - if you want to see unhealthy children's television, look no further than Cartoon Network.
ganzuul 3 days ago 0 replies      
"And God he says, lives in heaven with Santa Claus and their dog Marianne, implying that the Supreme Being is not only imaginary, but also gay."

Or, you know, a woman.

koolba 4 days ago 0 replies      
> Probably most striking, though, is another thing lacking: education. Quite simply, there is none, academic or moral.

I already love this!

If there's one thing that children need to experience at an early age, it's that the majority of what they'll deal with in life will have no redeeming qualities whatsoever. Life should be enjoyed for life ... it doesn't always have to be a backhanded way to learn to count.

silvestrov 4 days ago 1 reply      
This link to "Ramasjang" might (or might not) work from outside Denmark:


Most bakers sell "Kaj Cakes", and we don't mind eating the cute puppets: https://scontent.cdninstagram.com/t51.2885-15/e35/c72.0.495....

jonah 3 days ago 0 replies      
I'm surprised no one has mentioned Mister Rogers' Neighborhood yet. One of the best, if not the very best, children's programs in the US. It was slow-paced and deliberate like the Danish shows are described, but instead of being crass and letting things play out, he dealt with serious topics in a simple, honest, and caring way. He also encouraged curiosity and investigation and broke gender and ethnic mores way before it was the in thing to do.

Well worth checking out.

gurkendoktor 3 days ago 0 replies      
I see a lot of fellow Europeans in this thread who prefer local TV series to preachy US cartoons (me too). This is slightly off-topic, but do you have an opinion on the old Dutch show Alfred J. Kwak[1]? It was extremely political, but somehow I didn't mind when I was a kid. I wonder if it was simply surreal enough to make up for all the reality that it depicts? Has anyone re-watched this series with kids, 25 years later?

[1] https://en.wikipedia.org/wiki/Alfred_J._Kwak

goodJobWalrus 4 days ago 0 replies      
> Ramasjang is entertainment, not a replacement for parents or school. Parents are expected to know when to switch it off (but just in case, the characters go to bed at 8.00pm, and are shown sleeping until the morning) rather than pretend that it is self-improvement.

yes, I grew up watching mainly old American cartoons (like Tom and Jerry), and I don't remember them being particularly "educational", and it was well understood that we are watching them for entertainment, and the last one was at 7:15 pm on our television.

zeristor 3 days ago 0 replies      
Hi kids, don't forget we live in the future; seems odd to discuss something and not have a link to it.

Onkle Reje:


It may be in Danish, but it is for 4 year olds.

What would Rasmus Klump do?

mankash666 3 days ago 0 replies      
Denmark - Crass, potty-mouthed but entertaining kid shows. USA - Clean, preachy, educational kid shows.

Denmark - No gun crime, low homicide rate, high Human Development Index (HDI). USA - High gun crime, high homicide rate, lower HDI.

So - What exactly do kid shows have to do with long term success of the kid, or the general society they contribute to?

chvid 4 days ago 0 replies      
As a dane I was deeply traumatised by the show "Poul og Nulle i hullet".
zeristor 3 days ago 0 replies      
You can download the DR app on an Apple TV and watch Danish television.

Strangely, a lot of reshot UK programmes.

My particular favourite was putting a camera on the front of a train: Helsingør to Helsingborg the long way round.

I'll get my coat.

repler 3 days ago 0 replies      
If 95% of the population needs to be brainwashed, then 95% of the children's shows are going to be garbage.

It's just math. Make sure your kids are in that 5%.

spullara 3 days ago 0 replies      
None of my kids watch traditional television at all, everything they are interested in is on YouTube, both for entertainment and education.
galfarragem 3 days ago 0 replies      
They are using 'vaccination' logic: a small dose of reality makes kids healthier in the long term.
arethuza 4 days ago 6 replies      
"Supreme Being is not only imaginary"

I was immensely proud when my son at 4 years old decided to stand up in his pre-school class and point this out to everyone.

sixhobbits 3 days ago 1 reply      
Can we please edit the title to match the (correct) original "children's" instead of the modified (incorrect) "childrens'"
React Server react-server.io
401 points by uptown  4 days ago   137 comments top 30
pault 4 days ago 6 replies      
This is interesting, especially since I just spent the last two weeks setting up a boilerplate for a universal react/redux SPA on spec for a new client. I enjoy the flexibility but the need to develop a deep working knowledge of several independent libraries, transpilers, and build tool configuration files (each of which has several competing options with their own way of doing things) just to get to hello world is cost-prohibitive for most people, I'm sure. At the same time, I'm hesitant to go "all in" on a stack that I haven't heavily researched myself. If the developers are reading, can you go into some details about how you handle routing and data stores? Are you using off the shelf libraries or have you rolled your own?
android521 4 days ago 1 reply      
It doesn't seem like it is for beginners. The docs look like the author assumes everything is self-obvious. Need better tutorials.
idbehold 4 days ago 4 replies      
How does it handle fetching data from a path on the same origin? For instance, I only need to fetch('/api/users.json') on a certain page (/users). This means that it can either be hydrated in the initial state when performing a full page load of /users or needs to be fetched (using xhr/fetch) when navigating to that page from another page on the site (which shouldn't require a full page load).

So how exactly does the server perform that same fetch when attempting to hydrate the full page without actually making an HTTP request against itself?

ken47 4 days ago 0 replies      
If this works as advertised, this may prove to be a very useful project, that can replace numerous homegrown, half-complete implementations strewn about the internet.
mattbroekhuis 4 days ago 2 replies      
I've been banging my head against this boilerplate for the last few weeks and it's been very interesting.


How does this compare to that?

fdim 4 days ago 1 reply      
I see a bunch of failed attempts in console to connect to slack websocket. Is everything looking ok for you?

After blocking 2 iframes with adblocker I could finally inspect what was going on :)

Anyway, I can definitely feel that it is fast and seamless and worth a deeper look! In the meantime, prefetching all the content in docs or source views upon load generates quite a few requests and might explain your scaling issues. Would you mind sharing statistics for the number of users and the hardware behind it?

underwater 4 days ago 1 reply      
Based on the documentation and the design principles this seems like a really promising framework.

The data hydration, incremental HTML delivery and incremental code loading are really, really important for creating web apps that aren't load time hogs. Great to see that they were unopinionated about data fetching, too. That's one of the things that has made it difficult to drop Relay into existing applications.

Is this used in production? Are there any performance numbers that you can share?

hughbetcha 4 days ago 2 replies      
Would be useful if this project explained the difference between React and ReactServer. It seems they are as similar as Java and Javascript.

Instead of the render() method in React to output JSX, it appears that ReactServer uses getElements() for a similar purpose. So the entire model and object lifecycle is probably different as well?

gedy 4 days ago 1 reply      
Humorously this looks a bit like Apache Wicket, a Java-based client/server UI framework which has been around for about 10 years: https://wicket.apache.org
kelvin0 4 days ago 3 replies      
I just recently made the switch into the Web Dev world (coming from the C++/Python, desktop world). Since I knew of Django, I've started using it as the 'Backend/Server' part of my Web app dev stack, basically using Django to render minimal React/HTML/JS/CSS to bootstrap my single-page Web app. Wondering, what advantage would I get from using react-server instead of Django (aside from using JS across the board)?

Would appreciate feedback!

sergeym 4 days ago 0 replies      
There is a wealth of information in their documentation about making performant rendering on the server and client: https://react-server.io/docs/intro/why-react-server

I found it super interesting.
mxuribe 3 days ago 0 replies      
Mad props to a crew from the real estate industry who just pumped out cool tech for universal utility! Kudos to the folks from Redfin!
silasb 4 days ago 0 replies      
The company that I was consulting for tried to get Spring and React to play nice via Nashorn, but ended up scrapping the idea 6 weeks in because of performance issues and not enough developers knowing the stack. Nashorn was missing a lot of essentials to make this easy out of the box. So looking at this is a breath of fresh air.
mbreedlove 4 days ago 0 replies      
> One of the great things about React is its support for server-side rendering

Cool, I never knew that.

crudbug 4 days ago 0 replies      
A simple model/flow of rendering pipeline would be helpful here based upon routing logic.

As I understand, Pages are composed of Components, Components provide HTML sections. HTML sections are loaded incrementally.

encoderer 4 days ago 1 reply      
An alternative here is Hypernova by Airbnb.
zinssmeister 4 days ago 0 replies      
React Server looks fantastic!

For some reason this week I stumbled over many great React.js articles, so I started a collection here http://deepreact.com mostly to save&share things with friends, since a lot of us are getting deeper into React now.

bsimpson 4 days ago 2 replies      
I'd really love to see a demo of what they mean by "seamless transitions."
Swennemans 4 days ago 1 reply      
Are there any numbers comparing pre-rendered React versus React communicating with a JSON API? It seems to put a lot more stress on the server, which can negate the (theoretical) speed improvements.
tboyd47 4 days ago 1 reply      
Looks fascinating, I will definitely be watching this project in the coming year.

Just curious, if it's been successfully running in production for over a year, why is it only being used to serve three pages?

Amplifix 4 days ago 0 replies      
Looks very interesting, will give this a spin. I've been skimming the docs, but I assume you'll still have to install redux to handle state etc.
deegles 4 days ago 0 replies      
Could react-server build in support for AMP? https://www.ampproject.org/
antar 4 days ago 1 reply      
You have a misspelling in your homepage: ne not ne.
oelmekki 4 days ago 0 replies      
It even includes a Dockerfile and a docker-compose.yml file to get quickly started, that's cool :)
krebby 4 days ago 1 reply      
This looks cool. Any way to make it easily work with Relay?
pseudointellect 4 days ago 0 replies      
> "Blazing fast page load and seamless transitions."

Click on Get Started and no seamless transition, but rather a very abrupt page load.

kra34 4 days ago 0 replies      
Every time somebody makes a new react / angular derivative, a unicorn loses its horn.
rando832 4 days ago 1 reply      
wtf is ne?
charford 4 days ago 3 replies      
Getting random 504 timeout errors on this site. Is this hosted using react-server?
rajangdavis 4 days ago 1 reply      
Getting some weird issues, can only get to the docs by clicking on the link to the docs and then refreshing the page a couple of times...
New Leaf Is More Efficient Than Natural Photosynthesis scientificamerican.com
332 points by jseip  4 days ago   130 comments top 18
ChuckMcM 4 days ago 4 replies      
I am surprised no one has mentioned replacing the adsorption CO2 scrubbers on space craft[1] with something that is powered by electricity. The article claims 130 grams of CO2 removed from the air per kilowatt-hour. Astronauts might expel 40 - 50 grams of CO2 per hour into the air, so 500wHrs of power keeps the air breathable forever? That is a good deal. For reference the ISS has a crew of 6 and has 84 - 120kW of power capacity [2].

[1] Study problem on CO2 removal from NASA -- https://www.nasa.gov/pdf/519338main_AP_ED_Chem_CO2Removal_St...

[2] https://www.nasa.gov/mission_pages/station/structure/element...
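
A quick back-of-envelope check of the numbers above (all figures are taken from the comment and the linked article, so treat them as rough estimates):

```python
# Back-of-envelope check of the CO2-scrubber power claim above.
# All figures come from the comment and are rough estimates.
removal_g_per_kwh = 130      # grams of CO2 removed per kWh (from the article)
co2_per_astronaut_g_hr = 50  # grams of CO2 exhaled per astronaut per hour (upper end)
crew = 6                     # ISS crew size

# Continuous power needed to keep up with one astronaut's exhalation:
kw_per_astronaut = co2_per_astronaut_g_hr / removal_g_per_kwh
print(f"per astronaut: {kw_per_astronaut * 1000:.0f} W continuous")    # ~385 W

# Whole crew:
print(f"crew of {crew}: {kw_per_astronaut * crew:.2f} kW continuous")  # ~2.31 kW
```

So one astronaut needs roughly 385 W continuous (in line with the ~500 Wh per hour quoted above), and a crew of six would draw about 2.3 kW of the station's 84-120 kW budget.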

matt4077 4 days ago 9 replies      
This needs the additional information that photosynthesis is incredibly inefficient. It's <5% IIRC, so we already have solar panels almost an order of magnitude better than what nature did.

(RuBisCO as the protein at the center of the process is also quite strange: it's huge and slow. As in 'this ain't funny any more, start working' slow with about a reaction per second.)

Animats 4 days ago 3 replies      
The article seems to indicate this is electrically powered, not powered by light. Where's the photosynthesis? First they make electricity, then they crack water into oxygen and hydrogen, then they combine the hydrogen and CO2 to make hydrocarbons. If you've got electricity, why make fuel? That's wasteful.
cmccart 4 days ago 2 replies      
This got me thinking: how much CO2 is emitted by different energy sources in generating 1 kilowatt-hour?

I came across this link: http://blueskymodel.org/kilowatt-hour

Seems like solar is more or less a break even, where nuclear /wind/geothermal/hydro are pure wins, which makes sense I suppose. Could you manufacture enough of these to consume 1 kilowatt-hour without generating more CO2 in the process than they would consume in their lifecycle?

eggy 4 days ago 0 replies      
Very cool. I thought it was going to be more 'biological' i.e. less about finding a catalyst, and more about microbes and chlorophyll somehow.

I think technology creates things, sometimes problems, and then newer technology sometimes addresses these problems. I like the concept of all of these CO2 extraction-for-energy technologies that seem to be popping up lately.

Now, let's hope they can scrub some CO2 from the atmosphere, but not too much! After all, the climate models have been proven to be underestimating the rise of temperature, so the models are deemed not reliable for prediction or forecasting.

Take too much CO2 out, and we're in for a Global Winter. Sort of like the old Twilight Zone episode on TV (Ok, I'm old) where a guy is feverish, and in the background the Earth is getting too hot because of the sun getting closer? He then wakes up and it is snowing outside, and just when you think it is fine and dandy, it is the beginning of an Apocalyptic Winter!

Osiris30 4 days ago 6 replies      
Previous discussions on 'artificial leaves':


fernly 4 days ago 1 reply      
Found the paper:


Found a nice quote in the LA Times coverage:


Stay tuned because Pam and I are on a path to do nitrogen fixation in the same sort of way weve just done water splitting, [Nocera said]

taneq 4 days ago 0 replies      
It seems disingenuous to say this is "more efficient than natural photosynthesis" when it's a bioreactor using photosynthesizing microbes (and thus presumably use the same chlorophyll-based chemistry as natural photosynthesis).

On the other hand, I wonder about the potential for evolution to occur inside reactors like these, essentially self-optimising them over time.

sleepychu 4 days ago 1 reply      
I read a science fiction story set in a colony on Venus where the big problem is "how do we make artificial photosynthesis? Plants are rubbish." But it was a free self-published novel and I now cannot find it. Anyone else read it?
_pmf_ 4 days ago 0 replies      
I cannot fathom why we have large scale, highly efficient meat production facilities (remember: animal rights are only relevant to hippie tree huggers), yet there's virtually no commercial efforts to harness plants. The technology exists [0]; I don't understand what the practical or political problems are. I think small scale plant reactors have the potential of being able to be manufactured very cheaply; maintenance is probably the issue, but I'd like to hear from an expert in the field.

[0] https://en.wikipedia.org/wiki/Algae_bioreactor

lechevalierd3on 4 days ago 0 replies      
I was expecting some crazy analogy between a Nissan Leaf and natural photosynthesis.
mtgx 4 days ago 0 replies      
The question is if this is more efficient than just using solar panels + batteries to power something up. My guess is it's not, probably not even close. But perhaps there can be some niche uses for it where this system complexity and lower efficiency still makes more sense than using batteries.
frgewut 4 days ago 0 replies      
Does anyone know how does the efficiency of an entire tree compare (i.e a single leaf reflects some light, which then is absorbed by another leaf)?

I would guess that number should be higher than simple "photosynthesis efficiency".

fsiefken 4 days ago 0 replies      
So if you replace standard solar panels with this and you burn the alcohol in an engine each hour, does it produce more or less electricity each hour if we know it's 10x more efficient than natural photosynthesis?
naasking 4 days ago 0 replies      
Very neat technology for a possibly carbon neutral fossil fuel economy.
lwis 4 days ago 0 replies      
Does this present any potential improvements to solar panel efficiency?
mtgx 4 days ago 1 reply      
sanoy 4 days ago 0 replies      
This is insanely cool.
Bi-Directional Replication for PostgreSQL v1.0 2ndquadrant.com
302 points by eloycoto  4 days ago   91 comments top 16
sinatra 4 days ago 1 reply      
As the linked page doesn't describe what it is: Bi-Directional Replication for PostgreSQL (Postgres-BDR, or BDR) is an asynchronous multi-master replication system for PostgreSQL, specifically designed to allow geographically distributed clusters. Supporting more than 48 nodes, BDR is a low overhead, low maintenance technology for distributed databases. [0]

[0]: https://2ndquadrant.com/en/resources/bdr/

baq 4 days ago 1 reply      
It's not immediately clear to me when to use BDR and when I would choose XL. Can somebody do a quick comparison? I like this quote in the announcement email:



BDR is well suited for databases where:

- Data is distributed globally

- Majority of data is written to from only one node at a time (For example, the US node mostly writes changes to US customers, each branch office writes mostly to branch-office-specific data, and so on.)

- There is a need for SELECTs over complete data set (lookups and consolidation)

- There are OLTP workloads with many smaller transactions

- Transactions mostly touching non overlapping sets of data

- There is partition and latency tolerance

However, this is not a comprehensive list and use cases for BDR can vary based on database type and functionality.

In addition, BDR aids business continuity by providing increased availability during network faults. Applications can be closer to the data and more responsive for users, allowing for a much more satisfying end-user experience.


But I can't seem to find something similar for XL and especially a diff between the two.

Klathmon 4 days ago 5 replies      
So this might not be the right place for this, but i'm curious.

How do people deal with "eventual consistency"?

In my head once a transaction is done, it's everywhere and everyone has access to it.

What happens if 2 nodes try to modify the same data at the same time? Or what happens if you insert on one node, then query on another before it propagates? And if the answers to those questions are what I think they are (that bad stuff happens), how do you set up your application to avoid doing it?

jimktrains2 4 days ago 3 replies      
> Im pleased to say that weve just released Postgres-BDR 1.0, based on PostgreSQL 9.4.9.

While I'm still excited to play with it, I can't wait until pglogical comes into mainline.

DelaneyM 4 days ago 1 reply      
I wonder if it's plausible for AWS to release a customized variant of this targeted at their data centers and designed for ease of multi-zone deployment? Something like Azure (which optimizes and facilitates MySQL).

I realize that redshift already loosely meets that definition, but it doesn't quite work as a globally distributed regionally clustered web service back end. This is ideal.

koolba 4 days ago 3 replies      
Anybody in the HN crowd have experience using this? How does it perform on a WAN?
looneysquash 3 days ago 2 replies      
What I want to know is the timeframe of supporting PostgreSQL 9.5 and/or 9.6. (Though 9.6 is still in beta.)

Also, my understanding is that they're feeding patches back to postgres, and want it to eventually run on stock postgres. But it's not clear to me how progress on that is going.

I was also surprised that UDR was removed, I didn't even realize it was deprecated.

I'm not actually using the product at all right now, but I've been watching on the website, because I want to use it eventually.

I'm kind of hoping it works with stock postgres before I jump in. But if not that, I think I at least want to wait for 9.6 support.

nubela 4 days ago 2 replies      
Any reasons why I should not use this? Any other (Postgres-esque) alternatives for such a solution?
zzzeek 4 days ago 1 reply      
um, LICENSE? am I missing something


rattray 3 days ago 0 replies      
They claim performance comparable to Hot Standby:


and, in some cases, ~1.5x over Slony.

onderkalaci 3 days ago 1 reply      
Is there documentation on "How BDR works" or similar?
brightball 4 days ago 1 reply      
Can't wait to try this out.

I wonder how long it will take Heroku, RDS or another PG provider to make this available?

therealmarv 4 days ago 3 replies      
When reading this I wonder how to do this: OS or Postgres updates with one normal Postgres instance (or a typical master+slave setup) without downtime and without using Postgres BDR. Does somebody know?
rattray 3 days ago 1 reply      
It seems like this is very much a (FOSS) product of 2nd Quadrant. Does anyone here have experience with them? What is their reputation like?
therealmarv 4 days ago 1 reply      
Wouldn't this be an ideal Docker database?! But I could not find a well-maintained Docker image in the hub: https://hub.docker.com/search/?isAutomated=0&isOfficial=0&pa...
petepete 3 days ago 0 replies      
Title makes it sound like people are still writing extensions for PostgreSQL 1.0
Unsafe levels of toxic chemicals found in drinking water for 6M Americans harvard.edu
316 points by upen  5 days ago   147 comments top 24
ianlevesque 5 days ago 3 replies      
Since people seem curious how to analyze this for their locale, the best I could come up with (using the raw data at https://www.epa.gov/sites/production/files/2015-09/ucmr-3-oc... ) was:

1. Search zipcode in UCMR3_ZipCodes.txt, obtain PWSID.

2. Filter by PWSID in UCMR3_All.txt.

3. Filter that result by rows containing "=" (which means at or above minimum reporting level)

4. Don't panic.

5. Compare the AnalyticalResultsValue column to the Reference Concentration in ucmr3-data-summary-april-2016.pdf. If it's under the Reference Concentration then you're safe, within the limits of how incomplete their reference concentrations are. The document specifically states:

> The intent of the following table is to identify draft UCMR reference concentrations, where possible, to provide context around the detection of a particular UCMR contaminant above the MRL. The draft reference concentration does not represent an action level (EPA requires no particular action1,2 based simply on the fact that UCMR monitoring results exceed draft reference concentrations), nor should the draft reference concentration be interpreted as any indication of an Agency intent to establish a future drinking water regulation for the contaminant at this or any other level.

The minimal reporting level seems to be based on how small an amount is detectable, not harmful. The reference concentration appears to be a best guess at the moment for what a maximum safe amount is.

My zipcode for example came up with several of these above the MRL but below the reference concentration. Enjoy.

Edit: added link.
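
Steps 1-3 above are just a filter; here's a minimal pandas sketch of them using small stand-in frames (in real use you'd `read_csv` the EPA files named above; the column names here are assumptions based on the comment, so check them against the real file headers before relying on this):

```python
# Sketch of the lookup described above. In real use, load the EPA files, e.g.:
#   zip_codes = pd.read_csv("UCMR3_ZipCodes.txt", sep="\t", dtype=str)
#   results   = pd.read_csv("UCMR3_All.txt", sep="\t", dtype=str)
# Column names (PWSID, ZIPCODE, AnalyticalResultSign, AnalyticalResultsValue,
# Contaminant) are assumptions and may differ in the actual files.
import pandas as pd

zip_codes = pd.DataFrame({
    "PWSID":   ["CA1234567", "CA7654321"],
    "ZIPCODE": ["90210", "94103"],
})
results = pd.DataFrame({
    "PWSID":                  ["CA1234567", "CA1234567", "CA7654321"],
    "Contaminant":            ["PFOA", "PFOS", "PFOA"],
    "AnalyticalResultSign":   ["=", "<", "="],  # "=" means at/above the MRL
    "AnalyticalResultsValue": [0.03, 0.02, 0.09],
})

# Step 1: zip code -> public water system IDs (PWSID).
pwsids = zip_codes.loc[zip_codes["ZIPCODE"] == "90210", "PWSID"]

# Steps 2-3: rows for those systems, at or above the minimum reporting level.
detections = results[results["PWSID"].isin(pwsids)
                     & (results["AnalyticalResultSign"] == "=")]

# Step 5: compare each value to the reference concentration by hand.
print(detections[["Contaminant", "AnalyticalResultsValue"]])
```

Step 4 still applies: a detection at the MRL is not the same as a value above the reference concentration.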

mortehu 5 days ago 7 replies      
A small PSA: "Brita" brand water filters seem to be very popular in US, but they're also some of the worst in tests. For example in this test ...


... "Brita" filters were found to remove around 55% of PFOA, whereas "ZeroWater" filters remove more than 95%. I.e. "Brita" filters leave 9x more stuff in the water. The performance difference is similar for other unwanted stuff, like lead.
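
The "9x" figure follows directly from comparing what each filter leaves behind (percentages as quoted from the linked test):

```python
# Fraction of PFOA left in the water after each filter (from the test above).
brita_left = 1 - 0.55      # Brita removes ~55% of PFOA
zerowater_left = 1 - 0.95  # ZeroWater removes ~95%

ratio = brita_left / zerowater_left
print(round(ratio))  # -> 9, i.e. Brita leaves ~9x more PFOA behind
```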

refurb 5 days ago 2 replies      
> had at least one water sample that measured at or above the EPA safety limit of 70 parts per trillion (ng/L)

Just as a comparison, the EPA safety limits for other compounds in drinking water is[1]:

Arsenic 10,000 parts per trillion

Cyanide 200,000 parts per trillion

Carbon tetrachloride 5,000 parts per trillion

All of those compounds are known to be very toxic. I'm curious why the safety limit is so low for these PFAs if there isn't much data on them.


mox1 5 days ago 2 replies      
If you want a report like this for your own personal drinking supply check out http://www.karlabs.com/watertestkit/

I did it recently. I'm using the results to put some water filters into the house that are actually NSF (1) certified to remove the contaminants the test found.

[1] http://info.nsf.org/Certified/DWTU/

stokedmartin 5 days ago 4 replies      
To see specific zip codes, download zip[0]. UCMR3_ZipCodes.txt has all the zip codes and you can cross reference UCMR3_All.txt using PWSID to get facility ID; then cross reference UCMR3_DRT.txt using the facility ID to get the disinfection type. Details about the disinfection type can be found in a pdf within the zip file.
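
The three-file chain described above is just two joins; a minimal pandas sketch with stand-in frames instead of the real EPA files (the column names are assumptions from the comment and should be checked against the actual headers):

```python
# Chain described above: zip code -> PWSID -> FacilityID -> disinfection type.
# In practice, read UCMR3_ZipCodes.txt, UCMR3_All.txt and UCMR3_DRT.txt;
# the column names here are guesses for illustration.
import pandas as pd

zips = pd.DataFrame({"ZIPCODE": ["10001"], "PWSID": ["NY0001"]})
all_rows = pd.DataFrame({"PWSID": ["NY0001"], "FacilityID": ["F-42"]})
drt = pd.DataFrame({"FacilityID": ["F-42"], "DisinfectantType": ["CLGA"]})

chain = (zips.merge(all_rows, on="PWSID")   # zip code -> facility
             .merge(drt, on="FacilityID"))  # facility -> disinfection type
print(chain[["ZIPCODE", "DisinfectantType"]])
```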


vskarine 5 days ago 2 replies      
Original article has a generic map without much detail, but does anyone have a list of specific cities or zip codes that are affected?

Original article with map: https://www.hsph.harvard.edu/news/press-releases/toxic-chemi...
Loic 5 days ago 0 replies      
This illustrates the future unknown issues with respect to chemicals and health/environment. At the moment, we have two schools to handle this:

1. not proved bad, we allow.

2. not proved good, we disallow.

The US is more on the first side and the EU on the second, of course, like for any complex issues, the right decision is context dependent and in the middle.

I am working in the area of chemical properties, this is a hard problem and I am sure that when this class of compounds arrived on the market, smart people tried their best to figure out the health implications. We have more tools and experience 50 years later, but we are not smarter. We need to move with caution while handling an increasing market pressure... this is challenging!

ars 5 days ago 1 reply      
A simple carbon filter, of any type, will take care of these.

Lead is a much bigger problem since it's much harder to filter out.

wfunction 5 days ago 1 reply      
I saw an article on this back in January. I remember reading that CA seems to have the highest number of people affected by PFOA.

Scary and worth reading: http://www.nytimes.com/2016/01/10/magazine/the-lawyer-who-be...

jonah 5 days ago 2 replies      
Ermm, my city's well closest to me has the following:

              MRL    Tested
  chromium    0.2    0.67
  vanadium    0.2    1.1
  strontium   0.3    960
  chromium-6  0.03   0.44
  chlorate    20     190
  molybdenum  1      8.7
[Edit] to change "Allowable" to "MRL":

"The lowest amount of an analyte in a sample that can be quantitatively determined with stated, acceptable precision and accuracy under stated analytical conditions (i.e. the lower limit of quantitation). Therefore, analyses are calibrated to the MRL, or lower. To take into account day-to-day fluctuations in instrument sensitivity, analyst performance, and other factors, the MRL is established at three times the MDL (or greater)."

[Edit 2] In my town's overall distribution system there are multiple samples above the allowable Reference Concentration. The highest were:

             Tested  Allowable
  strontium  1900    1500
  chlorate   410     210

lvs 5 days ago 0 replies      
I continue to be shocked that consumer beauty products containing polyfluorinated alkyl compounds are being sold without any significant attention or regulatory oversight [1].

[1] http://www.livingproof.com/buy/our-science

hiou 5 days ago 2 replies      
Good grief. There is very little data that this stuff is harmful. Certainly not going to wipe out a town in a weekend. Can't wait for millennials to age a few more years and hopefully get a little context. Being wedged between baby boomers and millennials gets more uncomfortable by the day.
the_mitsuhiko 5 days ago 1 reply      
I'm not sure how it is in the US, but my biggest fear with water quality is never the water from the source but what happens on the way to your tap. In Vienna, Austria, for instance, they are super concerned about pH levels because some old houses still have lead piping, and if the pH drifts too far, the pipes can corrode and lead makes it into the water.

As another example the difference in water quality in the same district in Moscow and SPB from different taps is crazy.

Now that smart meters are a thing for electricity I really wonder if it would not start to make sense to work on basic water quality measurements in houses.

smaili 5 days ago 0 replies      
Anyone have personal experience with, or research on, reverse osmosis filters? I keep hearing they make the water "better" but never any hard facts on how they perform statistically.
finid 5 days ago 1 reply      
> Drinking water samples near industrial sites, military fire training areas, wastewater treatment plants have highest levels of fluorinated compounds

Areas around fracking sites must be even worse.

emilong 5 days ago 0 replies      
This appears to me to be the raw sample data: https://www.epa.gov/sites/production/files/2015-09/ucmr-3-oc...

The UCMR3_All.txt file inside looks to have a fairly nice, denormalized set of samples with location names, dates, and detections.

bronz 5 days ago 2 replies      
i rent a room in a very old house and i wouldnt be surprised if there were higher than acceptable lead levels in the tap water here. so it got me thinking about water filtering. a reverse osmosis filter and charcoal filter working in series seems to be the most thorough method. does anyone have experience with filtering their water?
tdaltonc 5 days ago 0 replies      
Any tips on how to tell what I should actually be worried about from FUD or WOO wrt/ drinking water?
Aloha 5 days ago 2 replies      
Show me more on the science on them, and I might be concerned about this being a problem.
drsim 5 days ago 0 replies      

> ...from the Faroe Islands, an island country off the coast of Denmark

Not really off the coast of Denmark...https://www.google.co.uk/maps/place/Faroe+Islands

artur_makly 5 days ago 0 replies      
What are the stats for NYC?
azinman2 4 days ago 0 replies      
Being lazy here: anyone know the data for SF?
grandalf 5 days ago 5 replies      
Anything that is public health related is bound to be full of tradeoffs and "good enoughs". The water in Flint, MI was considered good enough by Flint's officials, as is the water in many other municipalities.

Just as a private monopoly would start to reduce quality once it cornered the market, public, regulated monopolies behave similarly. Why care about quality if there is no competition and it's just another year earning a pension for the bureaucrats involved?

Consider how the bidding process for building roads results in poorly built roads and no incentive to the winner of the contract to build something that will last. Nearly every piece of American infrastructure is riddled with corruption or incompetence (or both). Even basic construction is full of various rules intended to bolster union workers, etc. and adding enormous cost.

Why we won't be selling Genuino or Arduino any more pimoroni.com
381 points by whiskers  5 days ago   242 comments top 24
nathancahill 5 days ago 42 replies      
Serious question: what are people building with these boards? The recurring projects I've seen are controlling lights, doors/locks and monitoring water in plants. I don't find these particularly compelling, am I missing something? How is this the next big thing?
chappi42 4 days ago 3 replies      
If this [untold Arduino history](http://arduinohistory.github.io/) link is true, it seems like Massimo Banzi is not only difficult to do business with but that he had a very particular way to give credit.

Better buy clones...

snarfy 4 days ago 2 replies      
The whole thing is a sham really when you consider its heritage from Wiring/Processing.


Arduino is a commercial version of somebody else's academic work.

rocky1138 5 days ago 0 replies      
This is what happens when a company crosses the line from being a small, hacky, indie, startup shop to a corporation.

It's a line that can only be seen after you cross it, making it especially onerous.

SEJeff 5 days ago 5 replies      
I really don't see the point of the arduinos when you can buy an esp8266 that is arduino compatible (you can use the arduino IDE and sketches).

You can get the bare esp8266 chips for $1-$2 from aliexpress, or buy a tricked out one from adafruit (the huzzah) for $9. They have a full wifi stack, support micropython, etc.

I think we're in a post-arduino age for microcontrollers. There are simply better options out there, of which the esp8266 is just one of the current better choices.

rootbear 5 days ago 1 reply      
The local Microcenter sells Arduino Uno R3 clones under their Inland store brand for $10, currently and frequently on sale for $6. Ideally, I'd like to support Arduino LLC, but given how absurd this whole feud has been, I don't feel especially guilty when I buy a clone. I have bought a few Arduino brand 101 boards from Microcenter as well so I'm not a total sell out!
jklinger410 5 days ago 3 replies      
TLDR: Reseller is pissed that "Genuino" has to be sold as "Arduino" in the US and that they have separate SKUs. Also Genuino seems to have poor B2B/reseller support.

Is this post really that petty or am I missing something?

herbst 5 days ago 2 replies      
Cool. Now I can buy the cheap offbrand clones from AliExpress without thinking I should have supported the actual creators.
antoniuschan99 5 days ago 2 replies      
So basically the Arduino the Software Company (IDE), and Arduino the hardware supplier broke ties?

I use the ESP8266 which I do believe is an Arduino (Hardware) Killer. However, the Arduino IDE has amazing support for the ESP8266 among other hardware components.

This isn't a bad thing, at least for consumers, since I feel like Arduino (the hardware) is a few cycles behind (eg. Particle, CHIP, bbc micro, pi zero, nodemcu/esp8266... etc)

Marazan 5 days ago 1 reply      
My conclusion in all of this is to buy Teensy instead. Lovely thing.
triplesec 4 days ago 0 replies      
It appears that greed and control issues were present right from the beginning of the Arduino project, given the experiences of Wiring's developer, Hernando Barragan, upon whose work it seems most of Arduino was based, without recognition or recompense. I therefore would have very little sympathy for Arduino LLC.

http://arduinohistory.github.io/ and HN: https://news.ycombinator.com/item?id=11212021

crumpled 4 days ago 0 replies      
This article acknowledges over and over that Arduino/Genuino is following common business practices. They are complaining that Arduino has become too big, and it's not nice for them, because Pimoroni is also big. Kind of ironic, really.

Yes, they want to cultivate a new brand while defending the old brand. It's awkward and I'm sure they aren't happy about it either.

falcolas 5 days ago 2 replies      
Dear Pimoroni,

Please add a link to your store from your blog.

Sincerely, everyone who might actually want to buy from you.

emptybits 5 days ago 0 replies      
So... asking on behalf of a friend ;-)... my local dealer stocks both Arduino and Genuino brand boards. Is this unusual or about to change?

I feel like a curtain of innocence has been torn away for me. Ignorance of the politics in the Arduino scene was bliss.

qz_ 4 days ago 0 replies      
I have a Teensy board, which is much cheaper and works pretty much exactly the same as Arduino and Genuino. The Teensy LC is great for small projects and is about 15 bucks IIRC. Never understood why you would stick to a more expensive brand just because of the name.
markhahn 4 days ago 1 reply      
Good response, and indeed the "maker vibe" seems to have been poisoned by commercial concerns...

Real makers tend to buy unbranded stuff from Chinese free-shipping places anyway. Branded Arduino is mostly a historic artifact...

rplnt 4 days ago 0 replies      
> Everyone sells worldwide now.

Yeah.. no. Not even close.

nfriedly 5 days ago 0 replies      
As far as cheap Arduino clones go, I've been pretty happy with the EDArduino ones from Electrodragon: http://www.electrodragon.com/product-category/arduino-2/ardu...
erichocean 4 days ago 3 replies      
Anyone know of any small, low-cost, battery powered programmable boards for Bluetooth LE? I've got the Pi 3 and CHIP is ordered, but I'd love something smaller if possible. Thanks!
dotBen 4 days ago 0 replies      
It's really not that unusual for resellers to be given geographical territories within which they can/can't sell. Try looking up your favorite electronics on Amazon.com and see if they will ship to Europe - most times the page will say 'for US delivery only'.

It is a shame that the Arduino guys split up but I would argue the situation would be even more of a mess if both new companies were able to sell into the same territories with the same name + similar product.

st3v3r 4 days ago 1 reply      
I'm sorry, but I don't buy it. Other stores, like Sparkfun, are quite able to do this.
xchip 5 days ago 3 replies      
Why do you want an Arduino when the ESP8266 is half the price and has WIFI?
dingo_bat 5 days ago 5 replies      
Bah! Arduino is way overpriced for the specs anyway. Raspberry pi and others are where it's at now. But I do think there is a massive power usage difference.
_pmf_ 5 days ago 1 reply      
> It's a real shame that Arduino LLC seem to have lost any of the Maker-vibe it had

What "vibe" would that be? Supply shortage with undependable lead time? Having to piece together documentation from malware infected wikis that may or may not still exist 10 days from now?

Jailbreaking the Kindle github.com
293 points by bdelay  1 day ago   80 comments top 11
mintplant 22 hours ago 3 replies      
I wouldn't use my Kindle half as much if it weren't for KOReader [0]. Ironically enough for a dedicated reading device, Amazon's built-in reader app pales in comparison to this third-party tool. The killer feature for me is on-the-fly column splitting and text reflow, with the ability to flip to the original page view by tapping a corner -- this is critical for reading academic papers, which tend to be two-column PDFs. It also features contrast adjustment, more fonts, stylesheets, wireless syncing with Calibre, and support for many more file formats including ePub.

There's also a Gargoyle [1] port for interactive fiction on the go. It's less practical due to the input lag on the Kindle keyboard, but I still pull it out every now and again.

[0] https://github.com/koreader/koreader

[1] http://www.fabiszewski.net/kindle-gargoyle/

shostack 3 minutes ago 0 replies      
All I want is to invert colors on my Paperwhite for easier reading with the lights off. Why is that so hard?
david-given 22 hours ago 2 replies      
I did some playing with a Kindle 3 a few years back --- I was writing programs that integrated into the native UI. I built an app which was a Javascript interpreter bolted onto a VT52 terminal emulator. You could type in programs and run them! Using the K3's fiddly little keyboard! Um, awesome. http://cowlark.com/kindle/javascript.html

This was on the 3.1 firmware, so it's likely all completely obsolete on modern devices.

...the 3.1 firmware was terrible. It was all Java based, but Java 1.4. No generics! No autoboxing! No foreach! People forget just how awful early versions of Java were in comparison to what we have today. I ended up building a toolchain using RetroWeaver to convert modern Java bytecode into something that would run on the Kindle.

Also, the firmware was based on the Personal Basis Profile 1.1. Think back, way into the past, before there were smartphones and Android and iOS... back to the heyday of the downloadable Java applet for your T9-based phone. Yup, that. Kindle apps were midlets, and anyone who remembers writing programs for midlets will be shuddering by now.

And it gets worse! The Kindle ran the entire UI, third-party applications included, in a single Java VM. It was as fragile as hell, and it tended to silt up with un-garbagecollectable data until it crashed and rebooted. If you left a thread running on application exit, it would crash and reboot. If your app hung you had to power cycle the device. I believe that the reason why Amazon never really opened up the Kindle to large-scale third party apps was mainly embarrassment.

Good times. Good times...

jonahx 23 hours ago 11 replies      
Could anyone knowledgeable about this explain what you can do with a jailbroken kindle that you cannot do with a locked one?
Zombieball 23 hours ago 5 replies      
Not sure if you are the author or just sharing. If you authored this I would highly recommend mentioning which "kindle" this is applicable to as the first topic. There are multiple generations of kindle e-readers and kindle tablets. It's not readily apparent up front as to which this is applicable to.
Steeeve 18 hours ago 1 reply      
The more I see writeups like this, the more I wonder if the effort being laid out by the people doing the work is compensated appropriately.

I'm not sure what Amazon pays for identifying a security flaw, but I imagine it's somewhere between $5 and $15k.

Having success monthly might yield reasonable compensation, but companies only pay when a flaw is identified, which means you don't get paid for your work, you get paid for your successful work. And you don't get to define what is successful, nor is there usually a clear definition of what successful actually means.

I understand that many people do this to get a job in security / security research, but it just seems like the effort-to-payoff ratio still favors people using their found exploits for evil dramatically.

There really should be a different pricing model around security exploits - one that encourages responsible disclosure more heavily.

pavanlimo 15 hours ago 0 replies      
I was recently researching ebook readers and found Kobo devices way better than Kindle (and cheaper too), especially for somebody who is a power user of devices. Without getting into specifics, I found that in general Kobo is more open.
geniium 2 hours ago 1 reply      
Maybe that'll make Amazon update their Kindle's experimental browser. It has an underestimated potential.

Will keep an eye on this, even if I am now using a Kobo H2O.

criddell 20 hours ago 2 replies      
All I want is an e-ink device that I can use with books purchased from Amazon, Apple, Google, B&N, and other vendors.

Why is that so hard?

enthdegree 19 hours ago 1 reply      
How do you get the background to pursue this sort of thing?

I've programmed and used Linux for a little while and I've done some simple things in assembly language (although not in much depth), but all the technical things past the CVE-2013-2842 section are impenetrable to me.

wineisfine 14 hours ago 0 replies      
Btw here is a webservice that converts and sends ePubs to your Kindle, without having to use Calibre yourself: http://www.sendepubtokindle.com
How the Arab World Came Apart nytimes.com
301 points by s3b  5 days ago   328 comments top 23
6stringmerc 5 days ago 2 replies      
How did it come apart? Almost exactly like Dick Cheney thought it would 20 years before cheering for the invasion. From an episode of Meet The Press in 2014[1]:


All right, let me ask you a couple of quick questions. I want to play for you an interesting clip of you 20 years ago about Iraq and Saddam Hussein. Take a look.


That's a very volatile part of the world. And if you take down the central government in Iraq you can easily end up seeing pieces of Iraq fly off. Part of it the Syrians would like to have to the west, part of Eastern Iraq the Iranians would like to claim, fought over for eight years. In the north you've got the Kurds. And the Kurds spin loose and join with the Kurds in Turkey then you threaten the territorial integrity of Turkey. It's a quagmire if you go that far.

[1] http://www.nbcnews.com/meet-the-press/meet-press-transcript-...

sevenless 5 days ago 4 replies      
The NYTimes itself is also a considerable part of the reason, considering Judith Miller, Friedman, Krauthammer and many others propagandized hard for invading Iraq.
return0 5 days ago 4 replies      
Long piece, but it fails in that it does not give a coherent message. Pieces of stories here and there are not history. First of all, let's stop calling it the Arab Spring. If anything it's the Arab Autumn of civil wars, where all Arabs are united only in their resentment of the West. Secondly, let's talk about people, their souls, aspirations, and culture instead of political history. Given that these states were built as Western protectorates, their history should not be a good guide for their future. Clearly the approach to colonialism/interventionism that the US takes is a failure, compared to the colonialism of the British for example (e.g. Jordan). The Arab world has always been far too divided to be able to draw clear borders around their states. Their only hope for peace is long-term economic prosperity and transition to secularism. Until then, tyranny worked.

This piece may be fun to get you through your flight, but it should have offered a more holistic perspective.

anonu 5 days ago 2 replies      
My view on the Middle East: it has been a playground for the powers of the world to muck around. The fall of the Ottoman Empire at the end WWI allowed the region to be carved up between the French and the British. Secret agreements like Sykes-Picot cemented borders that should never have been there. After WWII a considerable amount of support was put behind Israel (and rightly so). However, the ongoing conflict and lack of a 2-state solution to this day continues to be a rallying call for millions of Arabs against the West. To keep the Arabs in check - the West continuously undermines their governments (which may be led by strongmen - but this is better than the alternative which we have seen). In addition, the West assumes it understands the intricate complexities of the demographics on the ground. Mucking around only exacerbates the problems.
DanielBMarkham 5 days ago 1 reply      
So far this is good. There's a lot of depth.

One nit:

" Much as the United States Army and white settlers did with Indian tribes in the conquest of the American West, so the British and French and Italians proved adept at pitting these groups against one another, bestowing favors weapons or food or sinecures to one faction in return for fighting another. The great difference, of course, is that in the American West, the settlers stayed and the tribal system was essentially destroyed."

I think it's a mistake to only back up to the end of WWI and start running the tape there. The Arab world has a rich and nuanced history full of the exact kinds of tribal tensions we see now, going back hundreds of years. There's a reason the Ottomans were the way they were -- and it has nothing to do with colonialism. There are also great parallels between what's happening with the Arab Spring and what happened when other great powers consolidated their hold over the Arabs and then left. Just citing one example seems like a tremendous disservice to the history. Also, the meme of "It was the Sykes-Picot Agreement" has some truth but is extremely easy to lean on too much. With this amount of verbiage being produced, I'm expecting some alternative lines of reasoning to be explored.

Looking forward to more of the series!

(Apologies -- looks like the entire thing is here? Wow! I've heard of long-format writing before, but this is kindle material. Tremendous amount of work here.)

hedonistbot 4 days ago 2 replies      
This piece was a huge waste of time. Following the personal stories of a number of individuals across most of the Middle East countries does not give the reader any information about the geopolitics or power plays in the region. It just leaves the reader confused and depressed that nothing can be done and that we should probably leave these matters to people more knowledgeable. Is this some new form of journalism? And to see this in a NYT publication...

Also NY Times has so much to answer for in their coverage of the events that it kind of makes sense that they are avoiding any real analysis of the issues.

not_a_terrorist 5 days ago 4 replies      
tl;dr of 100 pages:

1) People are fucking poor and hungry (extreme wealth inequalities)

2) Salafi/Wahhabi (Saudi) funding of Islamism

3) Antediluvian hatred between people (it goes way, way, way farther back than Sykes-Picot)

pipio21 5 days ago 2 replies      
For me it is extremely simple: they have an extraordinary amount of oil (Iraq, Iran, Saudi Arabia, Kuwait) or natural gas (Libya, Iran), or they sit in the middle of strategic routes for pipelines (Syria, Afghanistan).


The West needs much more energy than it has. It has industry, and without energy its society will collapse.

Anything else is secondary. Most of those places are desert, and they don't have enough technology to protect themselves from Western (or Eastern) plundering.

Those countries can only live in peace as protectorates of powerful industrialized countries, like Saudi Arabia (de facto protectorate of the USA; its oil can only be paid for in USD), or Iran (protectorate of Russia and China), or Syria (Russia).

Libya itself had a lot of Chinese civilian presence, but not military. So UK, USA and France thought it was going to be easy to take the country by force, like they did.

They also tried with Syria, but Russia had an army there. They tried hard; remember Assad having chemical weapons, so the West needed to "save" and "free" the country? Putin reacted fast to that. Then came the need for a no-fly zone (prior to the invasion, like in Libya); again Putin reacted faster, sending his own airplanes.

Salamat 5 days ago 2 replies      
This is just another propaganda piece to obscure what is really happening. To make sure that no Arab Spring takes place, the US has sold all its allies all the weapons they might need to crush any opposition to their fiefdoms. The New York Times never explains to us who those moderate rebels are.

"The alliance says it is fighting terrorists, a name it uses for all of Mr. Assad's foes, from the extremists of the Islamic State to more moderate rebels who came out of the Arab Spring protest movement against his rule." http://www.nytimes.com/2015/10/16/world/middleeast/syrian-fo...

"Donald Trump Praises Dictators, But Hillary Clinton Befriends Them":

"Clinton has described former Egyptian dictator Hosni Mubarak and his wife as 'friends of my family.' Mubarak ruled Egypt under a perpetual state of emergency rule that involved disappearing and torturing dissidents, police killings, and persecution of LGBT people. The U.S. gave Mubarak $1.3 billion in military aid per year, and when Arab Spring protests threatened his grip on power, Clinton warned the administration not to push 'a longtime partner' out the door, according to her book Hard Choices. After Arab Spring protests unseated Mubarak and led to democratic elections, the Egyptian military, led by Abdel Fattah el-Sisi, staged a coup. El-Sisi suspended the country's 2012 Constitution, appointed officials from the former dictatorship, and moved to silence opposition. Sisi traveled to the U.S. in 2014 and met with Clinton and her husband, posing for a photo. The Obama administration last year lifted a hold on the transfer of weapons and cash to el-Sisi's government. ...

Egypt is far from the only military dictatorship that Clinton has supported. During her tenure as secretary of state, Clinton approved tens of billions of dollars of weapons transfers to Saudi Arabia, including fighter jets now being used to bomb Yemen. Clinton played a central role in legitimizing a 2009 military coup in Honduras, and once called Syrian dictator Bashir al-Assad 'a reformer.' And in return for approving arms deals to gulf state monarchies, Clinton accepted tens of millions of dollars in donations to the Clinton Foundation. Clinton has also boasted about receiving advice from Secretary of State Henry Kissinger, who was notorious for his support of dictators. According to records from the National Security Archive, Kissinger oversaw a plot to assassinate the Chilean President Salvador Allende and install the brutal dictator Augusto Pinochet."
jomamaxx 5 days ago 5 replies      
Ha ha ha ... ha ...

The 'Arab World' was never together. Ever.

All of the 'Anti-American Imperialism' kids here should remember that the bulk of the 'Arab World' is 'Arab By The Sword'.

Arabic is spoken across North Africa, in particular because of Arab Colonialism of the 9th-12th centuries.

Not since then has the 'Arab World' been anything resembling 'together'.

The Turks kept them (and there was not much of them) under the thumb, after that the Europeans tried to maintain some degree of balance, now the Americans.

The most recent and damaging decision by the US was Obama's withdrawal of troops in Iraq. Of course, invading in the first place was worse, but Obama, simply by virtue of having 10K soldiers sitting on a base 'behind the wire' doing nothing, could have kept forcing Maliki to play nice with the Sunnis. The moment Obama withdrew, Maliki purged Iraq of Sunnis, and the Sunni tribes decided that ISIS was 'less worse' than their own government, and there you have it.

As far as Syria ... this is a function of the 'Arab Spring' more than anything, and I don't think anyone can say anyone else is directly responsible for that. Other than the standard: Assad, Saudis, Iran etc...

Once things stabilize in Syria, maybe things can start to settle down.

susi22 5 days ago 1 reply      
Slightly off topic:

Clicking "Simplify Page" on the google chrome printing dialog makes this a fantastic formatted PDF. I'm impressed (be it Chrome's doing or NYT's).

bogomipz 5 days ago 3 replies      
"The Arab World" - What does that even mean? Arab is a language distinction, a language of which there are many dialects. As Arabic is spoken from Western Sahara all the way east to Oman that pretty much disqualifies Arab World from having geographical significance. Arab also does not denote religious faith as there are Arab Jews, Arab Christians(Coptic) and of course Arab Muslims.

There was once briefly a concept of Pan-Arabism but that died when Gamal Abdel Nasser died in 1970.

Does a Muslim Arabic speaker from Morocco really have any sense of kinship with an Arabic Christian (a Coptic) from Egypt? I am going to say probably not. Probably not any more than two Slavic language speakers in different parts of Europe do. Have the Saudis taken in any Arab refugee "brothers" from Syria and Iraq? No. Have the Arab Emirates? Again no.

So what is this "Arab World" that the NYTimes and the rest of the media are so fond of using as a point of reference? Countries carved up as part of the Sykes Picot agreement? Can they not come up with a more meaningful distinction? This matters.

giis 5 days ago 1 reply      
Just last week, I watched this documentary called "Saudi Arabia Uncovered" to understand its current state. Its on youtube.
punnerud 5 days ago 0 replies      
To get pictures in the article, remove the last part of the url:http://www.nytimes.com/interactive/2016/08/11/magazine/isis-...
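The URL trick above is just dropping the final path segment. A minimal sketch (the example URL below is illustrative, not the actual article URL):

```python
def drop_last_segment(url):
    """Drop everything after the final '/' in the URL."""
    return url.rsplit("/", 1)[0]

print(drop_last_segment("http://example.com/interactive/2016/story/part-2"))
# http://example.com/interactive/2016/story
```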
d23 4 days ago 1 reply      
I wish long form, informative articles like this had a way to donate a small amount of money to show appreciation. I don't read any particular news outlet enough to get an exclusive subscription with them, but I'd love to reward them for good individual pieces.
Hortinstein 5 days ago 0 replies      
I really hope this gets put into a podcast/audio format, would love to ingest it on a commute
nowey 4 days ago 0 replies      
I think it started even farther back, with the overthrow of the Shah in Iran.
Cyph0n 5 days ago 0 replies      
It's fun to see the armchair historians and political theorists pop up. HN is becoming more like Reddit by the day. Stick to discussing technology guys ;)
louwrentius 5 days ago 4 replies      
ishener 5 days ago 0 replies      
mms1973 5 days ago 4 replies      
I recommend to read Raphael Patai's classic "The Arab mind" to try to understand. Don't drink the NYT koolaid.
transfire 5 days ago 2 replies      
Just ask Lawrence. Same shit, different day.
YC Tech Stacks themacro.com
299 points by bootload  16 hours ago   89 comments top 26
sergiotapia 13 hours ago 3 replies      
If you're interested in startup tech stacks, we have a lot of different ones curated at StackShare.


Some well known startups are "verified" with a blue checkmark, which means they update their tech stack themselves so look out for that.

overcast 14 hours ago 4 replies      
So besides the boring stuff listed in this article, how about the database servers and developer environments? The actual tech stack running the show.
ktamura 15 hours ago 2 replies      
Some of the juxtapositions/language are bizarre.

>2. Of the YC companies that use hosting providers, 55% use AWS, 13% use Cloudflare and 6% use Rackspace.

Comparing Cloudflare and AWS? Where's GCP or Azure?

>3. Of the YC companies that use a CDN, 68% use Amazon CloudFront, 7% use Cloudflare and 5% use Fastly.

Okay, so there's a separate section for CDN...

>8. Only 22% of YC sites use a third party CMS. Of those, 83% use WordPress.

"3rd party" CMS? As opposed to...a custom made web app?

>11. In terms of installs, the most common Analytics providers are Google Analytics (451), Google Universal Analytics (314), Mixpanel (183) and Optimizely (129).

I thought Google Universal Analytics was an updated version of Google Analytics.

That said, this kind of market research is incredibly valuable.

westernmostcoy 14 hours ago 3 replies      
2. Of the YC companies that use hosting providers, 55% use AWS, 13% use Cloudflare and 6% use Rackspace.

Cloudflare is a CDN exclusively, right? Why are they counted among the hosting providers and not with CDNs in the next line?

thedrake 9 hours ago 0 replies      
BuiltWith has a much more extensive list of not only YC stacks but several curated lists. https://trends.builtwith.com/tech-reports

the YC stack is here: https://trends.builtwith.com/tech-reports/Y-Combinator

*click the All Technologies dropdown to see all the options

base 1 hour ago 0 replies      
Advertising technologies: I'm almost sure it's wrong. How do you know from the JavaScript used whether a company uses AdWords? My company uses AdWords extensively, and all the events and conversions are passed using Google Analytics and the AdWords API.
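Reports like this one are typically produced by crawling a page's HTML and matching known tracker script URLs, which is exactly the blind spot this comment describes: conversions passed server-side via an API never appear in the page source. A minimal sketch of the client-side detection technique; the signature list is illustrative (common public script URLs), not what any particular crawler actually matches:

```python
import re

# Illustrative signatures only; real crawlers match hundreds of patterns.
# Server-side API calls leave no trace in the HTML, which is the blind
# spot described in the comment above.
SIGNATURES = {
    "Google Analytics": r"google-analytics\.com/(ga|analytics)\.js",
    "Mixpanel":         r"cdn\.mxpnl\.com",
    "Optimizely":       r"cdn\.optimizely\.com",
    "AdWords":          r"googleadservices\.com/pagead/conversion",
}

def detect_stack(html):
    """Return the set of providers whose script URLs appear in the page."""
    return {name for name, pattern in SIGNATURES.items()
            if re.search(pattern, html)}

page = '<script src="https://cdn.optimizely.com/js/12345.js"></script>'
print(detect_stack(page))  # {'Optimizely'}
```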
1123581321 15 hours ago 4 replies      
83% WordPress usage among CMS users seems high, but 22% using a CMS at all seems low. Presumably most using a CMS just want a blog and a few pages? Any kind of complicated content structure goes straight to custom development? Does this account for static generators like Jekyll or CMSes that don't provide information about themselves?
grape_ 14 hours ago 2 replies      
I ran a similar study a while back - I was curious to see if the YC incentives for startups (Azure credits, Amazon credits, Digital Ocean credits, etc.) were significantly swaying founders to use a technology stack.

You can find a quick analysis of registrars and DNS providers here - https://www.gra.pe/?p=77

You can find architecture analysis here - https://www.gra.pe/?p=36

Interesting stuff.

minimaxir 15 hours ago 0 replies      
The chart at the bottom (now that it's fixed; thanks!), if you remove the Google outliers, shows that there is little correlation between the market share of a given SaaS within its category and the number of YC startups that use it.

As a result, the conclusion is that YC companies just use whatever the hell they want, not what is "best."

rezashirazian 14 hours ago 4 replies      
2. Of the YC companies that use hosting providers, 55% use AWS, 13% use Cloudflare and 6% use Rackspace.

No Azure? considering how much Microsoft gives away with their bizspark program, this is surprising.

serialpreneur 12 hours ago 0 replies      
I don't see tech stacks here. Mostly adtech!
buro9 5 hours ago 0 replies      
Instead of scraping the sites and trying to guess what YC companies are using...

Perhaps just ask them?

Surely every YC company would answer, or at least enough to make the information more valuable than a set of guesses.

Additionally, it would be good to know when YC companies choose other YC company products based on incentives offered (it's free to other alumni), as this disclosure allows others to evaluate whether the choice to use a product was because it was the best product for that task or was potentially skewed by the incentive offered.

ruffrey 15 hours ago 4 replies      
I was hoping it was a list of the tech stacks they used to build their software, like Node.js, C, Clojure, Redis, etc
caust1c 15 hours ago 1 reply      
CloudFlare is not a hosting provider.

Interesting analysis nonetheless.

drewvolpe 15 hours ago 4 replies      
"Only 21% of YC companies are A/B testing, but 93% use Optimizely"

This doesn't sound right. I wonder if the writer meant "93% of the 21% who do A/B testing use Optimizely".

mikeboydbrowne 3 hours ago 0 replies      
None of these numbers point to a monopoly on the bizdev/marketing side. Is that because the tools/players are new (and not yet consolidated), or is it likely to stay this fragmented?
pranaysharma 12 hours ago 3 replies      
Why does no one use services from providers like DigitalOcean and Linode? They are cheap and reliable.
zzleeper 13 hours ago 2 replies      

> AdWords comes in fourth with 130

Any reason for this?

bluedino 14 hours ago 1 reply      
>> Of the YC companies that use hosting providers, 55% use AWS, 13% use Cloudflare and 6% use Rackspace.

No love for Linode?

ggiaco 9 hours ago 0 replies      
You can also see a lot of this data over at Siftery, with much of it verified first-hand by the companies themselves (https://siftery.com/groups/y-combinator-portfolio)
wslh 15 hours ago 1 reply      
It would be interesting to have also information about the frontend. Are they using Bootstrap/Foundation/etc? What other JS libs are they using for the presentation layer?
gk1 15 hours ago 1 reply      
LinkedIn ads don't have tracking scripts or pixels, so not sure how they're getting the number for that. Certainly can't be accurate.
greyskull 12 hours ago 2 replies      
Why is AWS so popular? I don't mean that as a question with pretense, just genuinely curious.
danbmil99 8 hours ago 0 replies      
Still waiting for info about actual tech stacks.
spydertennis 15 hours ago 0 replies      
Some good data on best-in-class SaaS.
Introducing Webpack Dashboard formidable.com
418 points by thekenwheeler  23 hours ago   52 comments top 14
methyl 18 hours ago 2 replies      
The JS community will never stop surprising me, in both negative and positive ways. This is certainly a great idea, and it's no wonder it came out of JS people.

Of course we've got some fatigue in the ecosystem, but things like Webpack Dashboard strengthen my belief that the fatigue is a tradeoff worth paying.

BTW. if you like Blessed, certainly check out react-blessed: https://github.com/Yomguithereal/react-blessed

BinaryIdiot 18 hours ago 2 replies      
Webpack is supposed to be one of those things that you set up and then forget about, and it is likely not the only piece of bundling/building your application. So except for playing with this admittedly pretty neat interface, I don't see how this fits into the typical webpack workflow. I'd be curious whether they are logging metrics around how frequently it's used, to see if it's something people are actually interested in or just a neat thing everyone checks out once and then never goes back to.
colemannerd 21 hours ago 0 replies      
You should try to get this upstreamed into webpack itself. It should have a dev mode that displays this.
i_live_there 19 hours ago 1 reply      
I gotta say that I'm absolutely impressed with the blessed library. I hadn't heard of it until I saw this post. Wow, just wow.
smac8 21 hours ago 2 replies      
This is awesome! However, when I add it to my webpack config it does load the dashboard, but afterward it still produces all the scrolled output. Anyone know how to remove the scrolled output and just show the dash?
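For anyone wondering what the setup looks like: per the project's README at the time, you add a plugin to your webpack config and launch the build through a small CLI wrapper. A minimal sketch (the `webpack-dashboard/plugin` path and plugin name are taken from the README; the rest of the config is a generic placeholder):

```javascript
// webpack.config.js -- sketch of wiring up webpack-dashboard.
// The "webpack-dashboard/plugin" module path is the one documented
// by the project; entry/output here are placeholder values.
const DashboardPlugin = require("webpack-dashboard/plugin");

module.exports = {
  entry: "./src/index.js",
  output: { filename: "bundle.js" },
  plugins: [
    // The plugin hooks webpack's compiler events and reports progress,
    // logs, and module sizes to the dashboard instead of plain stdout.
    new DashboardPlugin(),
  ],
};
```

You then start the build through the wrapper, e.g. `webpack-dashboard -- webpack --watch` (or wrap `webpack-dev-server` the same way), so the dashboard process owns the terminal. Leftover scrolling output usually means some loader or script is still writing directly to stdout outside webpack's hooks.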
philip1209 21 hours ago 1 reply      
Their SSL appears broken, so if you are using "HTTPS Everywhere" you will have to circumvent to non-https.
visarga 5 hours ago 0 replies      
Looks great, and it would be nice to have it work over HTTP too. That way the UI could be polished even more.
mthoms 20 hours ago 1 reply      
Can anyone point me towards how to build something like this with pure bash? Is NCurses the answer?
rubber_duck 18 hours ago 1 reply      
I can't get this to work with awesome-typescript-loader's forked type checker. It overwrites the screen and scrolls past the end. Am I missing something obvious, or is this not supported?

When I disable the type checker it works great; it would be nice if I could use them together.

savanaly 22 hours ago 0 replies      
Haven't tried it myself yet, but my first impression is that this looks very, very cool.
mfrye0 20 hours ago 0 replies      
This looks great. I'll definitely try it out.
b34r 20 hours ago 0 replies      
Because adding GUIs to overcomplicated software with poor DUX always helps things.
dotancohen 21 hours ago 3 replies      
Nice, but useless. I suppose that I'm an old enough fart that the "noisy and scrolly" output doesn't bother me one bit.

However, the "blessed" library that Webpack Dashboard uses _is_ impressive: https://github.com/chjj/blessed

That's an ncurses-like library with some really nice features. I will be using it.

RedHat is hiring to make Linux run better on laptops gnome.org
244 points by soulbadguy  4 days ago   182 comments top 23
brightball 4 days ago 11 replies      
As much as I like Ubuntu, I have a significantly higher level of trust in Red Hat to get this right for some reason. Maybe it's just confidence in the company's core principles.

I've hit a point where I'm ready to move on from my Macbook Pro after about 10 years and I've been looking at Linux laptop options. It's mind boggling that it's so hard to find good options.

Everything seems to revolve around "get a Windows laptop and wipe it" or "buy some flavor of old Thinkpad", with warnings about EFI and compatibility. Then there are companies like System76, who have what looks like a good offering on the surface, but I keep seeing threads about bad experiences with them.

If I could order a laptop direct from Red Hat I'd do it without hesitation.

CptMauli 4 days ago 2 replies      
My ex-colleague is now at Red Hat, and he told me one anecdote. The card reader of his ThinkPad didn't work. He filed a bug in the bug tracker, got personally contacted by the guy who apparently does card reader stuff, and a day later he had a kernel where the bug was fixed.
technofiend 4 days ago 0 replies      
RedHat seems to be trying to make their OS more accessible and mainstream friendly through their $0 developer license http://developers.redhat.com/blog/2016/03/31/no-cost-rhel-de.... Not that there's anything wrong with CentOS. So it's cool that effort extends to commodity hardware, too.

Sample size of one but as part of working down the Redhat certifications track I purchased an Intel NUC http://www.intel.com/content/www/us/en/nuc/nuc-kit-nuc6i5syk... for a low power lab machine; it's really just a laptop squeezed into a cube but everything I've needed to use it for works fine. Admittedly I have not tried wireless or bluetooth.

Hopefully that means laptops built off reference Intel designs will also Just Work. Interestingly NUCs are showing up now in the Hackintosh community because despite moderate specs next to a modern desktop, they're still competitive with Apple's current hardware. No doubt Apple will refresh this year; they seem to be overdue.

AdmiralAsshat 4 days ago 8 replies      
Fedora's bleeding-edge kernel support usually means better hardware support for newer laptops, which is great. One potential drawback, however, is the decision not to include any non-FOSS drivers in the installation package. I completely understand why they do it, but it throws a wrench into the idea of loading Linux onto a laptop and having everything "just work".

For instance, getting the Broadcom wifi card that comes with my Dell XPS 13 to work on Fedora has been such a pain in the ass (proprietary driver only) that most people recommend tearing it out and replacing it with an Intel card that has a more Linux-friendly driver.

If Redhat really wants better Linux penetration in the laptop world, at some point they're probably going to have to make a decision to either go the Linux Mint route and include proprietary drivers by default, or try to engage some of the hardware manufacturers to open-source their drivers.

dxxvi 4 days ago 2 replies      
I always had both Windows and Linux (Arch) on every computer in my house (now only a few of them have Windows, because I use Windows in VirtualBox under Linux). It seems to me that wifi speed in Linux is always slower than in Windows. Are drivers the culprit? If you want people to use Linux laptops, I think you should make them really, really fast (esp. at boot) so that everybody has to wow when they just try it. No need to use GNOME or KDE; Openbox is fine as long as it's not ugly. Then wifi speed must be faster, or at least as fast as in Windows. Next is printing. In summary, most of the issues are related to drivers.
hinkley 3 days ago 0 replies      
I have a lot of mixed emotions about RedHat and to what degree they are a net positive or negative, but I'm glad someone is taking this on.

It was so frustrating for me that the 'Linux on the Desktop' effort started right after the numbers showed that everyone was trading in their desktops for laptops.

I wanted to program ON my laptop, not program my laptop.

After spending almost 18 months trying to get all of the hardware on my laptop to work with Linux (this included swapping out the wifi card, learning ACPI scripting so I could cobble together partial fixes from four other sources, and learning Crusoe CPU registers to contribute a power saving fix back to Transmeta, things I have absolutely no interest in whatsoever), I said screw it and bought a Macbook. I'm on my fourth now, and aside from some difficulty installing command line tools, it's entirely removed hardware as a source of stress and procrastination.

cm3 3 days ago 1 reply      
Please focus on reducing and avoiding regressions in the Intel gpu stack. It's gotten pretty bad in the last two years. Major regressions were introduced beginning with kernel 4.2's atomic modesetting changes, across the board.
Zenst 3 days ago 2 replies      
Wouldn't it be nice if there were a universal driver that you could use on any operating system that supported universal drivers?

Alas, I'm not aware of any initiative, or indeed any way such drivers could exist; even as binary blobs they would be a step forward.

satysin 3 days ago 1 reply      
My main machine is an oldish ThinkPad T420s. It runs Fedora 24 flawlessly. Sadly not many machines seem to run Linux (any distro) perfectly. Part of the reason I haven't upgraded to a newer machine is because this machine just works and I am very lazy so trying to get a newer machine to run as well is more work than I care for. It isn't like performance has improved massively since SandyBridge.
walterbell 4 days ago 0 replies      
Is RedHat interested in contributing to Qubes, which uses Fedora? This would help advance the state of the art in desktop security and seamless UI compositing.
soulbadguy 4 days ago 1 reply      
While this is indeed great news, are just two people enough for the wide range of laptops and devices out there in the wild?

What I would really love to see is a cross-distribution effort in the same direction: people from the main distributions coming together, identifying the main experience pain points, and fixing them upstream.

I really think there is an underserved (by both hardware manufacturers and distros) market of people who want a better Linux desktop/laptop experience. But until someone figures out a way to monetize it (like Linux does on the server side), it will be hard to build on the desktop side the same kind of momentum that Linux enjoys on servers.

dovdov 4 days ago 3 replies      
better (10 years) late than never, right? :D
arvinsim 4 days ago 0 replies      
Power Management should be top priority I think.
known 3 days ago 0 replies      
Sounds like RedHat is after https://bugs.launchpad.net/ubuntu/+bug/1
cpach 4 days ago 0 replies      
Neat! Munich seems like a nice city to live in.
cs702 3 days ago 0 replies      
This is great, because all distributions will benefit from it eventually, not just RedHat/Fedora. The folks at Canonical, in particular, seem to be very adept at leveraging the work of others for the benefit of Ubuntu.

I would hope these kinds of efforts lead to better collaboration and coordination between the different distros for improving compatibility with desktop, laptop, tablet, and even phone hardware...

...but unfortunately I don't think we should expect better collaboration and coordination, due to the usual political and quasi-religious barriers between distros.

asteriskdelete 3 days ago 0 replies      
Yeah, two people will fix it.
yobo 4 days ago 0 replies      
the year of linux on the laptop they said.
hyperion2010 3 days ago 0 replies      
I'm currently running Gentoo on a T30, T60p, and X1 Carbon (1st gen). For the X1 Carbon I actually switched because Windows 7 was causing periodic freezes. Power management and drivers are always what's missing, and getting drivers written in a timely fashion is hard. That said, if they focus on a subset of laptops then they could show some major improvements. I do wonder about the old M$ ACPI manoeuvring though; if vendors still aren't documenting features then there will be problems.
crudbug 3 days ago 0 replies      
Somebody, please improve GNOME multi-display support.

I have a triple monitor setup, and GNOME always crashes. I am using XFCE with some success on F22.

digi_owl 3 days ago 0 replies      
Welcome back, Red Hat Linux...


jamespo 4 days ago 1 reply      
I find the Arch derived distribution Apricity runs very well on my HP Spectre X360
boynamedsue 4 days ago 1 reply      
The problem with Linux on laptops is Linux itself. The notion of Linux on anything other than servers turns most consumers off. Imagine if Android were called Linux Phones instead. It would be a disaster.

Linux should be incidental to the device or laptop itself. Or it should be reserved for those who really want to know and understand more about it.

I used Linux on a laptop for a couple of years and, mostly, loved it. I hope Red Hat doesn't brand it as such.

An introduction to Japanese pomax.github.io
350 points by e-sushi  2 days ago   118 comments top 27
helloworld 1 day ago 2 replies      
Len Walsh's "Read Japanese Today" is a gem. By exploring the history behind the symbols -- using what the author calls a "pictorial mnemonic method" -- it's easy to learn many kanji in an enjoyable way.

Before a few-month stay in Japan, I sampled a variety of books and audio programs. Now, looking back many years later, Walsh's simple introduction is the one I remember best.



twoquestions 1 hour ago 0 replies      
This book looks really cool for learning the literal meanings of words and how to speak them in Japanese. I'll have to keep it bookmarked for later if I ever go there!

However, the big difficulty for me when I studied Japanese in school was how indirect you have to be when speaking. Even once I could follow along simple conversations and translate the literal meaning of everything, it turns out Japanese people generally only imply what they really mean, rather than state it directly. I have similar problems with American Southerners.

I wonder if that's a cause of someone becoming a hikikomori (basically a shut-in), as they may be unable to keep up with the huge number of unspoken social rules.

brokencup 1 day ago 2 replies      
Tae Kim's Guide to Japanese is a resource that has been around for a while that has helped a lot of beginners with the language. I'd recommend checking it out if you are interested in learning the language: http://www.guidetojapanese.org/learn/grammar
nlawalker 1 day ago 0 replies      
Cool! I'm traveling to Japan for the first time later this year. When I started making a few flash cards here and there for common phrases, I realized that learning a new language might be a good use of spare brain cycles, and Japanese seemed pretty cool, so I kind of fell into it.

From a beginner's perspective, this seems like a good overview of the "mechanics" - good to read at the beginning in order to get a basic understanding of the concepts, and a good reference, but not a good way to actually start learning the language.

My local library has the full beginner/intermediate/advanced set of Pimsleur Japanese, which has been great to use during my commute - I say "use" rather than "listen to" because the point of the Pimsleur method is that it's interactive - you're supposed to respond to the prompts out loud so you get used to physically forming the words and so you can hear yourself in comparison to the reference speaker. The Human Japanese "app textbook" is pretty good as well. Anki is a good flashcard app but making the cards yourself is painful and time consuming, I need to go find a few more premade decks.

redthrow 1 day ago 5 replies      
This doesn't seem to be a good resource to learn the language as a beginner.

For example, Prof. Victor Mair is against teaching Chinese characters to beginners:



> pay little or no attention to memorizing characters (I would have been content with actively mastering 25 or so very high frequency characters and passively recognizing at most a hundred or so high frequency characters during the first year)

> focus on pronunciation, vocabulary, grammar, particles, morphology, syntax, idioms, patterns, constructions, sentence structure, rhythm, prosody, and so forth: real language, not the script

somenomadicguy 1 day ago 11 replies      
Orthogonally, I've always wondered: why are geeks so into Japan? Slashdot and all of its children, like HN, are always full of posts by Westerners about Japanese culture, language, etc., but pretty much lacking in any interest in other foreign lands. Is it solely an anime thing?
melling 2 days ago 1 reply      
I'm collecting links for Japanese language resources, as well as other languages on Github:


Other languages:


I'd love to find more great resources.

Grue3 1 day ago 0 replies      
Oh, looks like it's time to plug my website: http://ichi.moe

It's very helpful if you're trying to learn Japanese, or read some Japanese. Just copy-paste any sentence in Japanese and it will show how to read it, and what the words mean. Read up on grammar, and you're ready to go.

Also I generated an Anki deck for learning kanji and all the words [1]. It's really big, and I'm almost halfway through it myself (about 1.5 years of study). Good luck if you try to attempt this challenge.

[1] https://ankiweb.net/shared/info/831167744

garfieldnate 13 hours ago 0 replies      
As long as we're mentioning Japanese resources, here are my favs (all paper):

* A Dictionary of Basic/Intermediate/Advanced Japanese Grammar, Makino and Tsutsui. These have 99% of all of the grammar you will ever need, and they are packed with lots of helpful example sentences.

* Remembering the Kanji, Volumes I and II (III is extra credit), James W. Heisig. Keep in mind you need about a month of dedicated brain time for volume I (great for ALT jobs!), and volume II can then be repeatedly skimmed afterwards while engaging with real texts.

* Making Sense of Japanese, Jay Rubin. He has a way of explaining concepts that seem weird to English speakers.

* Colloquial Kansai Japanese, D.C. Palter. It is not very fun to get to Japan and realize they don't talk like the text books. If you get the basics of Kansai you'll be less lost in all of western Japan, which shares a lot of features, plus you'll be more aware of the types of variation that are possible.

* Polite Fictions, Nancy Sakamoto. This book is getting old, but as long as you remember that times change and culture is a continuum this is a fantastic introduction to group culture.

Actually, the first book I learned Japanese from was "Speak Japanese Today," by Taeko Kamiya. It's meant for business travelers and gets you set up with the basics without using the native orthography. I read it when I was 13. Good memories.

ljw1001 1 day ago 0 replies      
FWIW, what may be the largest language program in Japan (called Hippo in Japan and Lex in the US) focuses to an enormous degree on spoken language and does no grammatical training at all. Instead, they try to emulate, as much as possible, learning the way children do. More info:

Hippo Japan: https://www.lexhippo.gr.jp/english/
The US branch's online store: https://audio.lexlrf.org/

On the store, you can pick a language you speak, and the languages you want to learn, and it generates a download, with playlists (iPhone) that mix the audio segments in a variety of ways: https://audio.lexlrf.org/#!Customize

(They are a non-profit and I have done some volunteer programming for them)

zylonbane 1 day ago 6 replies      
From what little I know and from reading this, Japanese seems exactly like what Microsoft would design if it needed to make its own language. Complicated, full of ambiguity, still trying to keep compatibility with an older system (but breaks it anyway), pointlessly redesigned every once in a while. Reading seems even worse because very visually dense kanji just do not survive compression as much as a couple of sticks (like letters in an English or Cyrillic alphabet). Granted, the language may have a beautiful side which you only learn about by actually learning the language, but still, spending an eternity to do so doesn't seem like a nice way to spend time.
solidsnack9000 1 day ago 0 replies      
There are some dubious examples of English usage in the section on verbs. I understood what was intended but only because I am already well familiar with the topics the author intended to illustrate.

> However, there is something funny about transitivity: some verbs, like "walk", you can only use intransitively (we don't say that we "walked the street", for instance), but many verbs can be used either intransitively or transitively, like "eat".

One can "walk the streets of Bakersfield".

> For instance, traversal verbs (such as 'walk', 'run', 'fly', 'sail', etc.) are intransitive in English, but ... while in English one does not "fly the sky" or "swim the ocean" (at the very least you'd need a preposition such as "through" or "in" to make those correct English), in Japanese this is exactly what you're doing.

Although we don't often say that someone swam the ocean, Columbus "sailed the ocean blue".

glandium 1 day ago 0 replies      
Only skimmed through the beginning of the grammar section, and it looks more technical than most of what I've seen so far online.

I must say that after I had studied Japanese for some time, reading Reiko Shimamori's systematic Japanese grammar (Grammaire Japonaise Systématique; actually, I don't think there's an English edition of this French book) was an eye opener. Japanese grammar made so much more sense to me once I heard about , , etc. And I'm glad you talk about those. As I said, though, I only skimmed, but a quick search doesn't reveal any mention of , sadly.

PS: the menu on the right overlaps with the text if the window is narrow, and that makes it impossible to read.

SomeHacker44 1 day ago 0 replies      
Thank you for this. I would like to have read this, especially since I'm on vacation in Hokkaido now, but unfortunately your "clever" right-side table of contents overlaps the actual content on my iPad Pro running the current iOS. So, it's impossible to read the last 10-15% of each line, especially once the TOC expands to most of the height of the screen.

Please give a less "clever" and more traditional UI.


gcr 1 day ago 1 reply      
Is there a way to hide the table of contents on the right? It overlaps the text pretty badly and makes the content unreadable on iPad.
mahranch 1 day ago 2 replies      
The article is better than I thought it would be (I typically don't get my hopes up), but it does skip over something in the pronunciation section: silent letters. When reading Japanese, sometimes the "u" is, or can be, silent in a word. The most common places (but not the only ones) you'll see this are in words like "desu" and "masu": the "u" is usually not spoken or sounded out, so spoken they sound like "des" or "mas". But you can see it silent in other words too, like daigaku or eikoku. Now, I'm not a linguistics or Japanese expert, so they may not use "silent" to describe not voicing the letter/vowel, but my point is that the article doesn't mention this. And given how often "desu" and "masu" are used by beginners, I'd include it at the beginning of an introduction to Japanese.
jotux 1 day ago 1 reply      
If you're looking for a book, this is what I used in my college Japanese class: https://www.amazon.com/GENKI-Integrated-Elementary-Japanese-...

With the associated workbook: https://www.amazon.com/Genki-Integrated-Elementary-Japanese-...

djent 1 day ago 2 replies      
The furigana in this is very light and small - not easy to see despite beginners relying on them.
canjobear 1 day ago 1 reply      
The content of this grammar seems good and I like it! But I have a pedantic nitpick: Seeing the kana and kanji sections listed under "syntax" is a huge red flag for anyone trained in linguistics. The writing system is not usually considered part of the syntax (or even the grammar) of a language, rather it is just a means to express the spoken language on paper. These should be in an "orthography" section; they are certainly not syntax in the modern linguistics sense of the word. Just some advice to make this impressive work more palatable to linguists :)
ktRolster 1 day ago 1 reply      
Introductions are cool. My difficulty is figuring out how to advance from 'vaguely conversant' to 'highly competent.' There should be more research into that.
natch 1 day ago 0 replies      
Wow, this is impressive. If I ever need a Japanese grammar, this looks really good. The author has said it's not intended as a tutorial, so let's get that out of the way, and just say good job on creating a resource that students of other languages can envy.
Bromskloss 1 day ago 1 reply      
Is something like this available for other languages, say, French and Russian?
oDot 1 day ago 1 reply      
For me, the hardest thing with learning a new language is immersion. I can study grammar, read books and use Duolingo all day long, but in the end it comes down to "living" it.

I wish there was an app for that.

coin 1 day ago 0 replies      
The author needs to test this page on an iPad.
mrcactu5 1 day ago 1 reply      
This is so thorough! I lived in Tokyo for 3 months at the end of 2010, so this is a great review!
partycoder 1 day ago 0 replies      
If you want to learn kana, you can drill it on this website: realkana.com. You select the kana you want to learn, then you drill them until you know them. Writing each one repeatedly also helps. Note that they have a stroke order that you are expected to respect.

Kanji is really hard, you will need to spend some good time learning them. A chrome plugin called rikaikun, or rikaichan for firefox might help you get familiar with kanji by annotating them when you hover over them as you encounter them on websites.
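The core of a kana drill like the one described above is easy to sketch yourself. A hypothetical minimal version (the kana-to-romaji pairs are the standard hiragana vowel row; the random-pick logic is purely illustrative):

```javascript
// Minimal kana drill sketch: pick a random kana, then check a typed
// romaji answer against it. Only a small subset of hiragana is shown.
const KANA = { "あ": "a", "い": "i", "う": "u", "え": "e", "お": "o" };

// Return a random kana from the table.
function pickKana() {
  const keys = Object.keys(KANA);
  return keys[Math.floor(Math.random() * keys.length)];
}

// True when the typed romaji matches the kana's reading,
// ignoring surrounding whitespace and letter case.
function check(kana, answer) {
  return KANA[kana] === answer.trim().toLowerCase();
}
```

Hook those two functions into a readline loop and you have a terminal drill; adding spaced repetition (as Anki does) is the obvious next step.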

cloudjacker 1 day ago 2 replies      
I'm familiar with Japanese, but I had a question: do the phonetic building blocks have individual names?

In western alphabets, the letters each have a name that is separate and sometimes loosely related to their sounds.

In the Japanese alphabets, I haven't noticed this yet, but I've never asked specifically.

Do they?

George Orwell, Politics and the English Language (1946) mtholyoke.edu
298 points by Tomte  4 days ago   149 comments top 27
Houshalter 4 days ago 4 replies      
I love this essay. The whole essay is good, but I really like this paragraph:

>In our time, political speech and writing are largely the defense of the indefensible. Things like the continuance of British rule in India, the Russian purges and deportations, the dropping of the atom bombs on Japan, can indeed be defended, but only by arguments which are too brutal for most people to face, and which do not square with the professed aims of the political parties. Thus political language has to consist largely of euphemism, question-begging and sheer cloudy vagueness. Defenseless villages are bombarded from the air, the inhabitants driven out into the countryside, the cattle machine-gunned, the huts set on fire with incendiary bullets: this is called pacification. Millions of peasants are robbed of their farms and sent trudging along the roads with no more than they can carry: this is called transfer of population or rectification of frontiers. People are imprisoned for years without trial, or shot in the back of the neck or sent to die of scurvy in Arctic lumber camps: this is called elimination of unreliable elements. Such phraseology is needed if one wants to name things without calling up mental pictures of them.

Animats 4 days ago 4 replies      
Ah, Orwell. This was one of his pet peeves. He spent much of WWII translating news into Basic English for transmission to British colonies. The evasions and hyperbole of political speech had to be expressed in the plain and practical words of Basic English. That's a political act. Newspeak in "1984" came from that experience.

His list of worn-out metaphors understood by few, "ring the changes on, take up the cudgel for, toe the line, ride roughshod over, stand shoulder to shoulder with, play into the hands of, no axe to grind, grist to the mill, fishing in troubled waters, on the order of the day, Achilles' heel, swan song, hotbed" is still apt. "Ring the changes" is misused in today's South China Morning Post.[1] My own favorite is "free rein", which is a horse term. (One not used by riders today; riders say "loose rein".) It often appears today as "free reign".

Today's metaphors come from popular culture rather than the classics, and age faster. This may not be an improvement.

[1] http://www.scmp.com/sport/rugby/article/2002459/mark-coeberg...

ajkjk 4 days ago 2 replies      
I consider David Foster Wallace's "Authority and the English Language" to be a spiritual successor to this piece (http://wilson.med.harvard.edu/nb204/AuthorityAndAmericanUsag...). I'd recommend it to anyone who likes Orwell's essay.
chrisdone 4 days ago 1 reply      
The below example is stunning. I feel sick to recognize the second paragraph (especially in academic writing), and I feel strong relief that the first paragraph can exist.

> Here is a well-known verse from Ecclesiastes:

> I returned and saw under the sun, that the race is not to the swift, nor the battle to the strong, neither yet bread to the wise, nor yet riches to men of understanding, nor yet favour to men of skill; but time and chance happeneth to them all.

> Here it is in modern English:

> Objective considerations of contemporary phenomena compel the conclusion that success or failure in competitive activities exhibits no tendency to be commensurate with innate capacity, but that a considerable element of the unpredictable must invariably be taken into account.

devishard 4 days ago 2 replies      
What's frustrating about Orwell's writing advice here is that it doesn't really improve society unless everyone does it. If I write using Orwell's rules and my political opponent doesn't, I'll likely lose.

If anything, I wish that the phenomenon Orwell is describing would be used by the people on the "right" side more often--but people who are doing what they think is right don't as often feel the need to obfuscate it for PR purposes.

Ultimately I think understanding the phenomenon Orwell describes is fundamental. You should be able to read "Two died after a shooting incident involving LAPD officers" and know to look further to discover that meant "LAPD shot and killed two unarmed black men". But actually following Orwell's advice on how to write puts you at a disadvantage. Playing by the rules when your opponents aren't is a fool's game.

charlesism 4 days ago 1 reply      
This essay has been on HN a few times before, but I'm upvoting this anyways. It's one of the greatest essays ever written. It will change how you write, and how you perceive the writing of others.
narrator 3 days ago 0 replies      
"Marxism and the Problem of Linguistics" (1950) by Stalin [1] is an interesting read to go alongside this. You've got Stalin saying that we shouldn't distort language so much so that it loses it's practical use in everyday matters simply because it is inherited from pre-communist ideologies. In a way he is saying that the slippery slope of language manipulation is useful for political purposes, but should not be followed all the way down into impracticality. This was a problem Stalin had as people were excessively fanatical to the point of absurdity to avoid being purged, so he had to give them the ability to limit their fanaticism by saying that some allowance for practicality in the use of language was part of the Stalinist orthodoxy and thus not "reactionary".

[1] https://www.marxists.org/reference/archive/stalin/works/1950...

cafard 4 days ago 0 replies      
On the other hand, there is Samuel Johnson's observation in his Life of Milton:

"No man forgets his original trade: the rights of nations, and of kings, sink into questions of grammar, if grammarians discuss them."

TheLarch 4 days ago 2 replies      
When I hear neologisms from the radical left or right this is the first thing that comes to mind. I don't take it as a writing style guide, rather as a guide to spotting tyranny.

The neologisms from FOX and social justice warriors are politics of language.

RodericDay 4 days ago 2 replies      
I used to like this essay, but the folks over at [UPenn's Language Log](http://languagelog.ldc.upenn.edu/nll/?p=992) do a great takedown of it, and I am inclined to agree with them. Orwell is extremely hypocritical (which many people try to claim is a "deliberate stroke of genius", with very little evidence to support it).

It's particularly a hit amongst center-left liberals who are emboldened into feeling like they are very righteous by not doing anything at all. The more accurate observation comes from commenter Mark F:

> The reason Orwell's essay makes some people angry is that it depicts violations of stylistic rules as moral violations. Use the passive, it says, and you are playing into the hands of the totalitarians. I think that's also why some people like it; people can feel like they're defending the cause of freedom by writing concisely.

> I tend to side with the former camp. I think people pick up on cant pretty well without his help, except when it's telling them something they already want to believe. And in the latter case his help is no use.

elmojenkins 3 days ago 1 reply      
I'm surrounded by people who speak 'politically'. Rather than making their statements clear and explicit, they structure their words in ways that allow them to conceal the true meaning of what they said. Using verbs to misrepresent meaning makes it tough to have good communication and solid understanding between group members.
adrianratnapala 4 days ago 0 replies      
I am enjoying the essay, to the extent that it makes C++ and its compile times seem not so bad.

But how much of the badness is really unique to the English language, or to the modern world?

My guess is that all times and places have wallowed in mushy rhetoric, and always will. If we look back at Pericles or Cicero or Jefferson and see better verbiage, that's because we are selecting for it.

conistonwater 4 days ago 4 replies      
> If it is possible to cut a word out, always cut it out.

Does anybody know if this has been (properly) studied? It's not quite so obvious that succinctness is good for readability and comprehension, and I think it could also go the other way.

Jach 4 days ago 0 replies      
Truly a great essay... When I reread it a couple years ago I made the semi-serious connection to bad but all too common object oriented programming practices as exemplified in Yegge's http://steve-yegge.blogspot.com/2006/03/execution-in-kingdom... Orwell says:

> As I have tried to show, modern writing at its worst does not consist in picking out words for the sake of their meaning and inventing images in order to make the meaning clearer. It consists in gumming together long strips of words which have already been set in order by someone else, and making the results presentable by sheer humbug. The attraction of this way of writing is that it is easy. It is easier -- even quicker, once you have the habit -- to say In my opinion it is not an unjustifiable assumption that than to say I think. If you use ready-made phrases, you not only don't have to hunt about for the words; you also don't have to bother with the rhythms of your sentences since these phrases are generally so arranged as to be more or less euphonious. When you are composing in a hurry -- when you are dictating to a stenographer, for instance, or making a public speech -- it is natural to fall into a pretentious, Latinized style. Tags like a consideration which we should do well to bear in mind or a conclusion to which all of us would readily assent will save many a sentence from coming down with a bump. By using stale metaphors, similes, and idioms, you save much mental effort, at the cost of leaving your meaning vague, not only for your reader but for yourself.

To me it's a funny, semi-serious connection. I see many Javalanders are very happy with Java, and have been for quite some time. For them it's easier, and quicker once they're in the habit, to fire up Eclipse (or IntelliJ), autocomplete and autorefactor and glue together this and that without having to think much (hey tests are green!), sling giant names and namespaces around dozens of files and directories up and down various call stacks in and out of giant systems like Spring and Hibernate, a few even try to match large programming patterns to everything... Yet still they frequently fail to convey in code just what exactly is actually happening. In many cases they just needed a few functions in a file or two with concise names that can be remembered and typed without assistance, even written faithfully on paper without having to use shorthand.
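The contrast is easy to sketch. The snippet below is illustrative only: the class names are invented to mimic the ceremony-heavy style being described, not taken from any real codebase, and the plain version does the same work in one line.

```python
# Illustrative only: the same "find users over a threshold age" logic,
# first in the ceremony-heavy style described above, then plainly.

class UserRecord:
    """Wraps a name/age pair behind getters, adding nothing."""
    def __init__(self, name, age):
        self._name = name
        self._age = age

    def get_name(self):
        return self._name

    def get_age(self):
        return self._age

class UserRecordFilteringStrategyImpl:
    """A 'strategy' class standing in for a one-line predicate."""
    def __init__(self, threshold):
        self._threshold = threshold

    def execute(self, records):
        result = []
        for record in records:
            if record.get_age() >= self._threshold:
                result.append(record.get_name())
        return result

# The concise equivalent: "a few functions with concise names".
def adults(users, threshold=18):
    return [name for name, age in users if age >= threshold]

users = [("ann", 34), ("bo", 12), ("cy", 19)]
records = [UserRecord(n, a) for n, a in users]
assert UserRecordFilteringStrategyImpl(18).execute(records) == adults(users)
```

Both versions return the same names; only one of them needs an IDE to navigate.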

Jedd 4 days ago 0 replies      
I recall reading one of Christopher Hitchens' essays on language, where he referred back to this essay of Orwell's.

(I don't believe it was in his book 'Why Orwell Matters', but maybe it was.)

Hitch is possibly best known now as a vocal anti-theist, but his writings on language (mis-)use are delightful pieces.

mk89 3 days ago 0 replies      
Though it was probably meant for something different, this reminds me of what Umberto Eco wrote in his "Ur-Fascism": [...] We kids hurried to pick up the shells, precious items, but I had also learned that freedom of speech means freedom from rhetoric.

And, sadly, nowadays we are full of rhetoric.

compil3r 4 days ago 0 replies      
Oppressive ideology in close association with bad prose: love this essay.
yks 4 days ago 2 replies      
In English language tests like IELTS, using the opulent language Orwell argues against is a sure way to increase your score.
tkfu 4 days ago 4 replies      
I detest this essay. I won't go too far into the reasons why, because David Beaver has already done an excellent job of that: http://languagelog.ldc.upenn.edu/nll/?p=992

I'd also agree with Geoff Pullum's characterization of it as "a smug, arrogant, dishonest tract full of posturing and pothering, and writing advice that ranges from idiosyncratic to irrational" (http://chronicle.com/blogs/linguafranca/2013/04/04/eliminati...)

But apart from it being a very poor source of writing advice, I don't believe it's accurate in its diagnosis of how language is deployed for political ends -- the question is a much more complicated one than he claims here.

There exist actual good books that can teach you how to write better, instead of patting you on the back and allowing you to tut-tut at those plebeians who write things that are "outright barbarous". Pinker's The Sense of Style is one of those; Anne Lamott's Bird by Bird is another. There also exist much better (and more accurate and scientific) resources about how language actually affects the way we think about things. Benjamin Bergen's Louder than Words is an excellent start, and virtually anything from Lakoff's long list of publications is worth reading -- Metaphors We Live By is the classic, but Women, Fire, and Dangerous Things in particular is excellent from his more recent work.

vivekd 3 days ago 0 replies      
I remember reading this in my high school English class; it was really the only substantive work I ever came across on how not to write badly. I've never found another how-to work on keeping your writing from being bad.
merkleme 4 days ago 0 replies      
Great essay, and a mantra to live by - "If it is possible to cut a word, cut it."
tomelders 4 days ago 4 replies      
I've often (but only casually) wondered why "Orwellianism" isn't a thing, like Marxism or Leninism or Reaganism/Thatcherism. I kind of understand why, as I think his world view outs such things as inherently abhorrent, but since when has that stopped people making idols out of people and dogma out of the things they say?

But still, it would be nice if "Orwellianism" meant "adhering to the principles of George Orwell in encouraging critical thinking and considered, reasoned observation" and if it became a school of political thought... Alas, all good ideas are corrupted in the end, but human progress walks on the stepping stones of ideologies, and it seems we've had nothing but nasty ones for a very long time.

oftenwrong 4 days ago 0 replies      
Is this essay still under US copyright?
igravious 4 days ago 0 replies      
What's the canonical way to cite this essay? Answers on the back of a bibtex napkin please.
elie_CH 4 days ago 0 replies      
The proposed French translation is awful :)
bitwize 4 days ago 1 reply      
People want to promote horrible policies without Trumping themselves.
grabcocque 4 days ago 4 replies      
Orwell's advice on how to write better English is at best naively harmful, and at worst cravenly hypocritical. He never followed his own "rules", so why in the hell should anyone else? Answer: because his rules are self-serving bullshit.

Case in point: Orwell uses passive constructions 40% more frequently than an average English corpus. This essay is full of them. Language Log did a brilliant analysis of the essay's towering inaccuracy and hypocrisy.

> If it is possible to cut a word out, always cut it out.

And wouldn't you know it, the very first sentence of Orwell's essay runs:

> Most people who bother with the matter at all would admit that the English language is in a bad way, but it is generally assumed that we cannot by conscious action do anything about it.


His rules are bullshit, and he knew it, which is why he was smart enough to ignore them completely.
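For the curious, the frequency claim can be crudely spot-checked. The sketch below is a toy pattern-match, nothing like the proper corpus analysis Language Log did; both the regex and the sentence split are rough approximations, so treat the numbers as noise with a trend.

```python
import re

# A deliberately naive passive-voice detector: a form of "to be",
# optionally followed by an adverb, then something that looks like a
# past participle. Real analyses use proper parsing; this is a toy.
PASSIVE = re.compile(
    r"\b(am|is|are|was|were|be|been|being)\b\s+(\w+ly\s+)?\w+(ed|en)\b",
    re.IGNORECASE,
)

def passive_rate(text):
    """Rough passive hits per sentence; both counts are approximations."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    hits = len(PASSIVE.findall(text))
    return hits / max(len(sentences), 1)

orwell_opening = ("Most people who bother with the matter at all would admit "
                  "that the English language is in a bad way, but it is "
                  "generally assumed that we cannot by conscious action do "
                  "anything about it.")
# The essay's opening sentence itself trips the detector
# ("is generally assumed").
assert passive_rate(orwell_opening) > 0
```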

The People's Code whitehouse.gov
278 points by jonbaer  5 days ago   60 comments top 15
thewopr 5 days ago 1 reply      
Full disclosure, I'm in a federal department that has been pushing for more open source for a while.

This is a great move by the White House. While there are a lot of groups trying to push for more openness and release of software, it can often be challenging. A lot of federal groups have been taught over the years to be very risk averse, and open software is viewed by them as a risk. Probably the most common concern is, "What happens if someone takes and misuses our software?" In a highly risk-averse federal environment, these can be challenging arguments to fight against.

If you like and support this kind of thing, one big thing you can do is to contribute and supply feedback. We frequently have to go to our superiors and justify what we are doing with regards to open source. We say things like, "this repository had X pull requests from non-federal contributors". Or, "We got Y comments and questions from non-federal users of our projects".

It could be as simple as an email saying "Hey thanks, I found this useful", to a full-on pull request fixing an issue or with a new feature request. The more fodder we have to say "open source increases engagement and creates positive feedback" the more you will see this kind of thing happening.

ideonexus 5 days ago 2 replies      
As a developer, I have been so impressed with the Obama Administrations efforts to put everything online. There is a .gov for anything now and I've been watching organization after organization digitize our commons and put them online.

This "Federal Source Code" policy is a great extension of the project-open-data initiative released a few years back:


I found the db-to-api project in this repository incredibly useful for quickly and safely exposing data from one of my applications to clients.

The only failing I see to these many many initiatives is that so few people realize these powerful free resources are out there to be taken advantage of. I hope that changes in the future.

swalsh 5 days ago 0 replies      
The cool part of this is it seems like there's a bunch of API's on here. Perhaps that means I can submit pull-requests to get features that perhaps previously that government would not have had resources to develop.

A good example, there's a recall API: https://github.com/GSA/recalls_api

This is cool, I want to use it... I have an eCommerce website where I sell food. It would be cool if I could be proactive in pulling items from my product catalog. The issue is there is no UPC in the API, so there's no easy way to correlate my products to recalled products. A cursory look at the source code shows me the source data.

If you open that up, a lot of the items have the UPC codes in there. This gives me the ability to parse the details and add the fields I need.
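A sketch of what that correlation could look like. The field name (`description`) and the sample records here are invented, since the real recalls_api source data has its own schema; the idea is just to pull UPC-looking codes out of free text and join them against a product catalog.

```python
import re

# Hypothetical sketch: "description" and the sample records are made up.
# The point is extracting 12-digit UPC-A codes from free text and
# joining them against a seller's own catalog.
UPC = re.compile(r"\b\d{12}\b")

def recalled_products(recall_records, catalog):
    """Return catalog items whose UPC appears in any recall description."""
    recalled_upcs = set()
    for record in recall_records:
        recalled_upcs.update(UPC.findall(record.get("description", "")))
    return [item for item in catalog if item["upc"] in recalled_upcs]

recalls = [{"description": "Recalled jars, UPC 012345678905, lot 7"}]
catalog = [{"sku": "A1", "upc": "012345678905"},
           {"sku": "B2", "upc": "999999999999"}]
assert [p["sku"] for p in recalled_products(recalls, catalog)] == ["A1"]
```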

Roboprog 4 days ago 2 replies      
I ran into this last year: the feds set up their own "Bootstrap" type project.

Declanomous 5 days ago 1 reply      
I think this is great. Maybe the code won't be reused, but it adds another dimension to governmental transparency, which is always a good thing. Furthermore, any code produced by the government is effectively being produced for the American people. We should have access to the code to use as we see fit.

I wonder how this will affect bids for government software projects? Will companies be upset that they have to open-source their software? Regardless of whether an individual agency will use it, I can see the initiative saving time and money, since programmers will know they can just find what they need in a repository. If there is one thing you can count on, it's programmers and government employees being lazy.

SmellTheGlove 5 days ago 2 replies      
Putting the code out there for other agencies to use is great, but getting them to actually use it will be another battle. Reuse is tantamount to taking away someone's budget, and therefore, status. The occurrence of two different groups implementing very similar things entirely separately happens more often than you'd think. And I'm not convinced they want to talk or work together, because bureaucrat/military officer X doesn't want to lose budget/people.
emilecantin 5 days ago 0 replies      
> This is, after all, the People's code.

I've been thinking this for a long time, and I'm pleasantly surprised that the US government now says this publicly. I hope this point of view becomes more prevalent in the near future.

qwertyuiop924 4 days ago 1 reply      
As a person outside the government, I'm concerned about some potential problems with this initiative:

For one, Pull Requests: If a government agency gets a PR on some code, I'm concerned there may be pressure not to accept it: auditing requirements that are so high that nobody wants to review PRs (not that audits are bad!), or policies that otherwise don't encourage PRs, meaning that improvements don't get back to the government.

Secondly, us winding up with a repeat of some of the problems that other previously proprietary projects (namely, OpenSolaris) encountered: The code that was opensourced being dependant on code that, for whatever reason, couldn't be opensourced, hampering forks, and further development outside of the organization that developed the software in the first place.

Even if issues like this, or issues that I haven't even thought of, occur, this is a huge step forward.

fludlight 4 days ago 2 replies      
This is cool, but can we call it something else? The People's * is a prefix used by totalitarian governments.
rm_-rf_slash 5 days ago 1 reply      
As helpful as this may be, the real transformative code is often proprietary. As others have mentioned here, not everybody will use every reusable component, whether it be because of ignorance, larger system incompatibility, or simply turf protection.

We should look into solutions for intellectual property that are based on an information economy instead of an industrial economy.

Personally, I think it would make sense (perhaps more for pharmaceuticals than software) to significantly shorten the time a patent is valid and/or strip the protection of monopolistic production rights, and instead allow the free market to sell the product at the lowest cost it can be made at, as long as there is a royalty fee. How the fee is determined, I'm not sure yet.

Still, it's clear that our IP system is creaky, overcomplicated, and tilted too far in the direction of big business, lawyers, and patent trolls, instead of the actual inventors and consumers.

clarkmoody 4 days ago 0 replies      
Avoiding duplication of source code across Federal agencies is nice, but it would be better to eliminate duplicate agencies and functions within the bureaucracy.
nsx147 4 days ago 1 reply      
Check out this guy's commit history: https://github.com/alex-perfilov-reisys?tab=overview
batbomb 4 days ago 0 replies      
I'm very familiar with the Department of Energy process, and there are a few considerations for every lab in the DOE.

TL;DR: The DOE encourages open source software, but it isn't the default and there are some (low) barriers.

In general, though, what you can do with your (non-export-controlled) code consists of (in order of increasing difficulty):

1. Nothing. Keep your code private. If you'd like to stop maintaining code but want to make sure it sticks around, the DOE has a software library, the ESTSC, in Oak Ridge (a division of OSTI). It may also be the case that the entity running the lab wants to claim ownership.

2. Open Source. Due diligence is needed to ensure funding agencies and MOUs are respected. Copyright is typically assigned to the contractor running the lab in question (i.e. for Berkeley lab, -> Copyright goes to UC, SLAC -> Stanford, etc...). Major international collaborations can be a bit tricky because foreign countries have their own rules. I think more work needs to be emphasized on this front going forwards.

The DOE also wants to track the popularity of Open Source software, namely downloads. GitHub has met their requirements for reporting.
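GitHub's releases API does expose a per-asset `download_count`, so that kind of report can be totalled from a single response. A minimal sketch over an already-fetched payload (the sample data is invented; a live call would fetch `https://api.github.com/repos/{owner}/{repo}/releases` first):

```python
# Sum download counts across all assets of all releases, given a payload
# in the shape GitHub's "List releases" API returns.
def total_downloads(releases):
    return sum(asset.get("download_count", 0)
               for release in releases
               for asset in release.get("assets", []))

sample = [  # abbreviated, invented sample in the API's shape
    {"tag_name": "v1.0", "assets": [{"name": "tool.tar.gz",
                                     "download_count": 120}]},
    {"tag_name": "v1.1", "assets": [{"name": "tool.tar.gz",
                                     "download_count": 45},
                                    {"name": "tool.zip",
                                     "download_count": 5}]},
]
assert total_downloads(sample) == 170
```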

The DOE discourages use of the GPL and similar licenses. The reason, as I understand it, is due to the fact that the Government (i.e. Defense) must be able to use and modify software (and give to contractors, etc...) without falling under any additional burden. I believe the BSD license is preferred most widely across the labs.

In some cases, people at labs do release software under GPL. If they didn't get special permission, they are likely violating their lab's contract with the DOE.

3. Commercialize. This is really hard. You have to first perform market research, establish the market, spin off, deal with SBIRs, etc... This is a high barrier.

I've been personally working on streamlining the process for (2) with legal for my lab, so that anybody can open source their software very easily, hopefully by just filling out a web page. I'm hoping the recent white house directives help eliminate some of the bureaucracy involved in the process. I've also been trying to reduce fragmentation across the lab. The lab has never offered an official SCM platform, and grad students/postdocs are notoriously bad at keeping important source code in their personal GitHub and then leaving after some time.

It should be noted that almost all national lab facilities are effectively run under contract, so nearly all national lab employees are not actually federal employees. So we do have a slightly different set of rules.

Finally, there is already a decent presence of labs on GitHub and Bitbucket, in case you are interested.

It should be noted this is an extremely, extremely small slice of the software that drives experiments, projects, and research in the lab. Many times software belongs to the project/research group, so there's likely a project github organization where the code naturally resides. This is sort of a consequence of labs becoming more and more multi-disciplined, i.e. the science missions of labs like SLAC and Fermilab are no longer aligned primarily around their accelerators.

OSTI is supposed to maintain an index of that software if it's reasonably important, but it's not really enforced.

PS: If someone from USDS/data.gov/18f can and would like to help out with this in any way, I'd be happy to collaborate!

afarrell 5 days ago 3 replies      
I wonder to what degree this applies to the DC city government and if it can be made useful for municipalities generally.
JimLaheyMD 4 days ago 1 reply      
How about we start with the source code for electronic voting machines?
Google Duo, a simple 1-to-1 video calling app googleblog.blogspot.com
243 points by marban  10 hours ago   249 comments top 47
dzmien 3 hours ago 6 replies      
Why didn't they just update Hangouts? I actually use Hangouts for video calls with relative frequency, and I wish it were better adapted to changes in network speed, as they say Duo is. I think it would be better if "Duo" were just integrated into Hangouts as an update. I like Hangouts because it allows me to use one app for voice, text and video.
karma_vaccum123 9 hours ago 6 replies      
I know these will be preinstalled on Android, but many friends and family use iOS and I will feel silly asking them to install Duo and Allo.

Hangouts is able to mix voice, video and text just fine. Why start requiring separate apps?

This is Google at its dumbest. They are squandering the small amount of momentum Hangouts has.

jalami 8 hours ago 4 replies      
I'm still putting my money on vector.im and matrix.org. Closed source communication apps are not appealing to me, even if they come with an E2E promise.

From a business perspective this makes sense for Google. A big problem with Skype was always the lack of ubiquity. Lots of people had it, but it required another install and explicit configuration. Now that Skype is nearly bundled with W10 and WebRTC has made skype.com trivial, the gap for Google to move in is closing. If this rolls out with Google-branded Android, people will use it irrespective of its merits (a la bundled Internet Explorer). Interop on iOS makes it stand out from FaceTime. There's always room to change terms later when it becomes a household name.

Also, with all these services adding an E2E sticker on their communications, Google's hand was forced, they're not trend setters here and they shouldn't be applauded for being extremely late to the privacy game.

helloworld 9 hours ago 9 replies      
Sounds like Duo has a simple user experience, which is great. But it doesn't solve the eye-contact problem with video calls today.

When you look at the person on the screen whom you're talking to, that person sees you looking away, because you're not looking into the camera, which is somewhere on the edge of the screen. So you don't make eye contact with the person you're talking to.

And for me, that makes video calls feel weird.

tdkl 7 hours ago 1 reply      
No desktop support, one device only, they must be joking.[1]

I just hope it can be disabled along with the rest of the Google bloatware when I buy my next phone.

BinaryIdiot 8 hours ago 2 replies      
I still think this is a huge mistake. It's great it's easy to use but now Google has two products that do video chat and they do not work together; why? If they eventually discontinue Hangouts then now we have to use two apps for texting and video?

The past several years have shown that a more integrated experience typically brings a better user experience, so this just smacks of a mistake.

aluhut 7 hours ago 0 replies      
I don't have a number on my Pad... there isn't even a SIM in there. Why can't it just use their damn Google account?! I don't even want to give my phone number to them, and when I'm in a foreign country I don't want them to switch easily between WiFi and super-expensive roaming just because I went into the wrong corner of the hotel room...
nagarjun 7 hours ago 2 replies      
It's already bad enough that I have a row of chat apps (WhatsApp, Messenger, Hangouts, Skype, Slack, Telegram etc.) on my phone because I can't get everyone I know to agree on one app and now, Google's trying to get me to add yet another icon to that list! I love Telegram but I couldn't get more than a few friends to try it out and even then, no one checks their Telegram anymore because none of THEIR friends are on it. Even though I love the tech here, I can't get enough people to try and use it.

Not sure why this had to be its own app. Could have just been included in Allo. Also, all of the other apps I mentioned above have some form of desktop app (which is, in a way, the biggest factor for me considering that's where I spend most of my time). Sigh! Great tech, terrible packaging.

ende 9 hours ago 6 replies      
Have they announced when it will be discontinued?
losteverything 2 hours ago 8 replies      
In my entire post-tech life and post-tech environment I have never seen anyone use, talk about, or desire to have a video call. (Save one, when a patron tested FaceTime on their new iPhone.)

What will make video calling desirable for the average Joe?

sidcool 9 hours ago 3 replies      
Sounds promising. But not sure what this adds on top of Hangouts, which has a great group (and of course one to one) video calling feature. The only difference seems that Duo works with phone numbers, not Google Accounts.
sotojuan 9 hours ago 2 replies      
On the same week that they discontinued Hangouts on Air in favor of YouTube? Why is Google always making and closing stuff?
Animats 8 hours ago 1 reply      
"Announces", yes; "releases", no. The Google Play Store offers only a "pre-register" button. Yet it has 4.9 stars already.

Looking forward to seeing an analysis of the protocol. Does it go through Google servers, or is it really peer to peer? How does the "end to end encryption" work? How are the keys generated and exchanged? Do the servers have the keys? Are you sure?

bfrog 57 minutes ago 0 replies      
I still long for the days of a real gtalk native app
comex 7 hours ago 1 reply      
> You shouldn't have to worry about whether your call will connect, or if your friend is using the same type of device as you are.

...From the blurb for an app that only runs on two operating systems (notably excluding Windows Phone), is proprietary (preventing third-party clients from being written for other systems), and has no support for desktop operating systems (laptops are devices too).

On the last point, I suppose mobile-only for messengers is the new normal, but I for one frequently use iMessage and FaceTime from my MacBook in addition to my phone - depending on which device is closer, mainly - so Allo would be a significant downgrade for me. For video calls, laptops have an advantage over phones if there's more than one person on your side of the call, since you can get farther away from the camera to let everyone into the picture, without awkwardly keeping your arm held out horizontally or whatever.

lazyjones 5 hours ago 1 reply      
Wasn't WebRTC going to solve all this (e.g. https://appr.tc)? What happened, has it become browser cruft?
limeyy 8 hours ago 1 reply      
Ok, just a FaceTime clone. When Apple added it to iOS it seemed redundant initially (because of Skype etc.), but now everyone is using it all the time, because it's phone-number based: it's a success.

So the real question is, if they wanted to make this, what the hell took them so long?

It really smells like a mistake, and like catch-up; almost pitiful at this point.

knocte 9 hours ago 1 reply      
End-to-end encrypted but not opensource is just trusting that it's really (well)encrypted.

Thanks, but no thanks.

seanp2k2 8 hours ago 0 replies      
No reason to get excited about this, they'll just abandon it in a year or two like the rest of their products which aren't selling ads.
buro9 5 hours ago 3 replies      
I'm going to presume that this isn't available to anyone who has a Google Apps account, as that has been the trend of nearly all product launches by Google recently.

Not available to Apps accounts:

* Google Family Library Sharing (Android apps sharing)

* Google Play Music Family Sharing (music collection sharing)

* Project Fi

* Google Spaces

* YouTube Red

* YouTube Music

Given that is most of the recent product launches, the chances of Duo and Allo being added to that list would appear to be very high.

josh_carterPDX 2 hours ago 0 replies      
Can't wait to hear how they're going to change it four times then abandon it and announce they're working more on Hangouts.
zuxfer 8 hours ago 0 replies      
I still like google talk better. It was simpler. Elegant. And it was paradise for IM lovers.
mkj 9 hours ago 0 replies      
Is this using the same parts underneath as hangouts?
noahmbarr 7 hours ago 0 replies      
Facetime benefits from Apple's consistent hardware compression.

It's why Skype and other platforms have had such a hard time touching it in terms of video quality.

guelo 8 hours ago 0 replies      
I think there's definitely space for a video conference app that's simpler than Hangouts and cross-platform, in contrast to FaceTime.

Here's the US play store link where you can pre-register https://play.google.com/store/apps/details?id=com.google.and...

soufron 6 hours ago 1 reply      
How many similar apps do we have already? We are increasingly solving problems that don't exist.
vorotato 1 hour ago 0 replies      
No desktop version? Why would anyone use this over hangouts?
jijji 1 hour ago 0 replies      
They are trying to do what viber has been doing for years
serpix 9 hours ago 6 replies      
Is there anything currently that works similarly without a Google or Facebook account? My dad is so computer/phone illiterate that he cannot/won't register for either of these.

It should work across iPhone/Android.

I ask because video calling between my son and my dad is no longer as simple as FaceTime after I switched to Android

kevindeasis 8 hours ago 0 replies      
Man this and allo looks really interesting. I really can't wait for it to come out in Canada. I definitely wanna know how these product managers/owners or product feature creators are doing it in Google.
soufron 6 hours ago 4 replies      
Reading the comments... You guys know that any 3G/4G phone with a camera can make video calls with awesome quality, right?
HaloZero 8 hours ago 1 reply      
It sounds like they're just trying to replace FaceTime. I know my parents adore FaceTime, but they don't know all about this email stuff. They just know they want to call me.
unexpected 8 hours ago 1 reply      
Is anyone able to download this? I am trying to download it on iOS, but the App Store is giving me the message "this application is not available for download in the US store".


mattkevan 5 hours ago 0 replies      
Are the same people responsible for Microsoft's endless rebranding of products and services now working at Google?

It's exhausting keeping track of what's either just been released or cancelled on any particular day.

izacus 6 hours ago 1 reply      
Does this even work from a desktop computer, or is it a mobile app only?
diimdeep 5 hours ago 0 replies      
Guy with beard and hipster glasses - check.
jmspring 9 hours ago 0 replies      
Why should a video app be required for N of M situations? Just do one app that works well.
rattler 5 hours ago 0 replies      
I like that the apps name is tachyon.
satyajeet23 4 hours ago 0 replies      
Facetime Rival? Nope.
subhrm 8 hours ago 0 replies      
Not in India :(
aandon 8 hours ago 1 reply      
aaaaaaand it's not on the App Store
techaddict009 8 hours ago 0 replies      
Aren't they a bit late to the bot party?
davidone_f 8 hours ago 0 replies      
Cool, let's try this. Oh my device (android 5.1.1 cm12) is not supported (yet? anymore?). Well, I think I can survive also without this super cool application :)
akerro 7 hours ago 0 replies      
>a feature in Duo called Knock Knock which lets you see live video of your caller before you answer, giving you a sense of what they're up to and why they want to chat

Where can I report this as a bug? That literally misses the point of a "call". The call starts before it has started, from the caller's side.

sickbeard 1 hour ago 0 replies      
Reminds me of Microsoft. Makes a bunch of products.. nobody cares.
ProxCoques 8 hours ago 7 replies      
Video calling is a great example of something you don't need, but because you can do it, you do it. It's also impossible to refuse a video call for fear of being seen as secretive or worse when in fact you just don't care to look up somebody's nose when you talk to them.

I also wonder if it's mainly nerds who care about video calling because it speaks to their otherwise rather broken social skills. "Normal people look at each other when talking, right? Look! Using this heap of high tech makes me a normal person!"

Meanwhile, I note that amongst actual normal people, interest in video calling is low because they see it for what it is: a poor substitute for the real thing. Just as no normal person is into cyber dildonics.

Fuchsia, a new operating system github.com
371 points by helloworld517  4 days ago   157 comments top 27
c3534l 4 days ago 8 replies      
Fuchsia is not a combination of pink and purple. It is the color your brain comes up with when it sees contradictory color signals (such as very high and very low wavelengths without the appropriate middle stimulation). It's the only color not in the rainbow. As you can see from this additive color program (http://trycolors.com/?try=1&ffb5d9=0&c31cff=0), pink and purple create a lavender color, whereas fuchsia is what happens when you combine colors in an unusual way (http://www.exploratorium.edu/sites/default/files/ColoredShad...). Normally I wouldn't be this pedantic, but this is Hacker News after all.
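The "lavender" claim above can be sanity-checked with a naive per-channel average of the two swatches in the linked trycolors URL (#FFB5D9 and #C31CFF). This is only a rough sketch of additive-style mixing, not how trycolors.com actually blends colors:

```python
def hex_to_rgb(h: str) -> tuple:
    """Convert a hex string like 'FFB5D9' to an (r, g, b) triple."""
    return tuple(int(h[i:i + 2], 16) for i in (0, 2, 4))

def mix(a: tuple, b: tuple) -> tuple:
    """Naively average two RGB triples channel by channel."""
    return tuple((x + y) // 2 for x, y in zip(a, b))

pink = hex_to_rgb("FFB5D9")    # (255, 181, 217), from the trycolors link
purple = hex_to_rgb("C31CFF")  # (195, 28, 255), from the trycolors link

print(mix(pink, purple))       # (225, 104, 236) -- a pale magenta
```

The averaged result, (225, 104, 236), sits in the pale-magenta/lavender neighborhood, which is consistent with the comment's point that a straightforward mix of pink and purple does not land on fuchsia.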
pavlov 4 days ago 2 replies      
The repo at https://fuchsia.googlesource.com reveals a rather interesting UI story for this new operating system.

It seems like the intention is to use Flutter [1] as the UI layer. Flutter uses the Dart language, so there's a Dart environment included in Fuchsia too [2].

For rendering, Fuchsia includes a project called Escher [3] which is described as a physically based renderer that supports soft shadows, light diffusion and other advanced effects. Looking at the source code, Escher is designed to use either OpenGL or Vulkan as the underlying graphics API. (There's an iOS example project included in Escher's source tree. Would be interesting to build that.)

It's not immediately obvious why a lightweight operating system would need a renderer that can do realtime soft shadows and light effects...! But I think the idea here is to build a UI layer that's designed from scratch for Google's Material design language. Shadows and subtle color reflections are a major part of that "layered paper" aesthetic.

So, the stack seems to be: Dart is the language for GUI apps, Flutter provides the widgets, and Escher renders the layers.

The underlying application framework is called Mojo [4]. It already offers bindings for Go, Java, JavaScript, Python and Rust in addition to Dart, but maybe those languages are meant for services rather than GUI apps. (At least I can't see an easy way to create Flutter widgets from something like Rust without loading the Dart VM.)

[1] https://flutter.io

[2] https://fuchsia.googlesource.com/dart_content_handler/

[3] https://fuchsia.googlesource.com/escher/

[4] https://fuchsia.googlesource.com/mojo/

ansible 4 days ago 2 replies      
I'm calling it now: this is for augmented reality displays and similar. You want an RTOS for low and predictable latency. And current GUIs aren't really suited to 3D environments you can walk around inside.

This is Google's next Android, with a low latency rendering pipeline for the next generation of mobile devices.

ocdtrekkie 4 days ago 4 replies      
Some useful bits from IRC:

[16:21] <ocdtrekkie_web> Why's it public (mirrored to GitHub even) but not announced or even documented what it's for?

[16:22] <@swetland> ocdtrekkie_web: the decision was made to build it open source, so might as well start there from the beginning

[16:22] <lanechr> ocdtrekkie_web: things will eventually be public, documented and announced, just not yet

[16:23] <@swetland> currently booting reasonably well on broadwell and skylake NUCs and the Acer Switch Alpha 12, though driver support is still a work in progress

[16:24] <@travisg> yeah and soon we'll have raspberry pi 3 support which should be interesting to some folk

Sidebar comment: I wonder how much more activity this thread would be getting if the subject line had "by Google" in it. LOL

fixmycode 4 days ago 6 replies      
I remember a post earlier today about how open source projects need better marketing. This is a prime example. I had to dive in to find out what the project was all about...
pavlov 4 days ago 1 reply      
> Pink + Purple == Fuchsia (a new Operating System)

Pink [1] and Purple [2] were both Apple codenames for operating systems. Probably not a coincidence, but I don't see an obvious connection...

[1] https://en.wikipedia.org/wiki/Taligent#Pink_and_Blue

[2] http://www.phonearena.com/news/Did-you-know-that-the-codenam...

helloworld517 4 days ago 0 replies      
Hosted on https://fuchsia.googlesource.com/ is what looks like an early in development operating system.

Mirrored on Github where it's described as Pink + Purple == Fuchsia (a new Operating System)

The kernel component 'Magenta' reveals it "targets modern phones and modern personal computers with fast processors, non-trivial amounts of ram with arbitrary peripherals doing open ended computation." [1]

[1] https://github.com/fuchsia-mirror/magenta/blob/master/docs/m...

kevin_thibedeau 4 days ago 3 replies      
> It is good alternative to commercial offerings like FreeRTOS [1]

FreeRTOS is GPL with an exception for static linking making it effectively free if you make no modifications. There is, however, an onerous clause prohibiting the publication of comparative benchmarks. [2]

[1] https://fuchsia.googlesource.com/magenta/+/HEAD/docs/mg_and_...

[2] http://www.freertos.org/license.txt

mcirsta 4 days ago 1 reply      
The only thing that really bothers me with this new OS is that the kernel is no longer GPL. With a GPL kernel like Linux we had a chance of getting the kernel source code for our devices (some companies don't care if it's GPL anyway), but if it's Apache or BSD, good luck with that.
colemickens 4 days ago 4 replies      
Written in C. What a shame.

edit: Thank you to all of the repliers, I had no idea that most OSes were written in C. Er, actually, I'm more than well aware of that fact and I'm familiar with the number of CVEs that have occurred over the years because of the lack of memory safety involved in that C code.

Sorry, I simply don't get the appeal of writing more operating systems and network-exposed code that isn't written in a safer language. Say like Rust; see Redox.

fredgrott 4 days ago 1 reply      
It's a new RTOS...

Basically that means more than just cell phones, as you have embedded systems that use an RTOS in vehicles, watches, medical devices, etc.

It's the kind of operating system that runs the baseband CPU/chip, otherwise known as the baseband processor.

vacri 4 days ago 1 reply      
Well, if it takes off, it'll have the side-effect of getting more people to be able to spell 'Fuchsia' correctly...
bobajeff 4 days ago 0 replies      
Does anyone have an idea of what kind of technical problems this is trying to solve?

It sounds like it's trying to be an RTOS for phones and modern hardware. But I know there has to be more to it than that.

lwis 4 days ago 0 replies      
There's a distinct lack of information in their repos' READMEs.
merb 4 days ago 1 reply      
Actually, this compiles pretty easily. It only runs on the provided qemu, however. And without adding some user-space tools you can only use kilo, which is awkward to handle under a macOS terminal :( But it's really, really easy to set up and even integrate userspace programs. Kind of extremely simple to do useful stuff on it.
zfrenchee 4 days ago 3 replies      
Put Fuchsia and Android green on a color wheel. I dare you.
hackaflocka 4 days ago 1 reply      
I consider myself fairly tech savvy, and have done a little programming here and there.

But from the linked page I couldn't figure out where any of the documentation of this "thing" is, nor how to install it, nor what platforms it's for.

Was it just me?

united893 4 days ago 1 reply      
Would someone most kindly provide a VM?
HoopleHead 3 days ago 0 replies      
Well, it may be difficult to fathom what this project is all about but, at least, I've learned one thing. I've been spelling FUCHSIA wrong all these years. I always thought it was FUSCHIA.

[As I have several growing in the garden, this is an important development]

merb 4 days ago 0 replies      
Apache2 and MIT. Sounds interesting
dart_user 4 days ago 1 reply      
>> It seems like the intention is to use Flutter [1] as the UI layer.

Is Flutter already ready for use? If Flutter is not ready for use, then how can it be used?

shankardelta 1 day ago 0 replies      
Fuchsia = DeepMind + AI

(Getting to their ultimate goal.) We can also expect an updated Google Glass too.

whyagaindavid 4 days ago 1 reply      
No screenshots?
aezell 4 days ago 1 reply      
I'm working on a Twitter client that only runs on this OS. Wait a sec... Oh, I see. Time machine was NOT set to 2010.

Working on a Pokemon GO client that only runs on this OS.

lotsoflumens 3 days ago 0 replies      
I'm (not) eagerly awaiting the day when some IoT thing "Fucks ya".
thekevan 4 days ago 0 replies      
I have not looked into this and don't have expertise, but my first split-second thought was "how are they going to keep up with security exploits?" Sad when my first thought is that someone's going to try to steal from people using it.