hacker news with inline top comments    11 Nov 2015
The Impossible Music of Black MIDI rhizome.org
135 points by GotAnyMegadeth  3 hours ago   63 comments top 23
reviseddamage 0 minutes ago 0 replies      
The weirdest thing: my son, who is autistic, is fascinated by this music and can't seem to let go. He keeps watching/listening to every "black MIDI" out there on YouTube.
baldfat 1 hour ago 4 replies      
I immediately thought of a quote from the movie Amadeus (about Mozart):

EMPEROR: Of course I do. It's very good. Of course now and then - just now and then - it gets a touch elaborate.

MOZART: What do you mean, Sire?

EMPEROR: Well, I mean occasionally it seems to have, how shall one say? [he stops in difficulty; turning to Orsini-Rosenberg] How shall one say, Director?

ORSINI-ROSENBERG: Too many notes, Your Majesty?

EMPEROR: Exactly. Very well put. Too many notes.

MOZART: I don't understand. There are just as many notes, Majesty, as are required. Neither more nor less.

EMPEROR: My dear fellow, there are in fact only so many notes the ear can hear in the course of an evening. I think I'm right in saying that, aren't I, Court Composer?

SALIERI: Yes! yes! er, on the whole, yes, Majesty.

MOZART: But this is absurd!


This certainly is an interesting area to explore musically. What could we do with a piano if we had more than ten notes to play at a time? From the samples I saw, right now it's about HOW many notes can be played while still sounding musical.

avian 1 hour ago 1 reply      
I guess the "speaking piano" could also classify as black MIDI:


hammock 2 hours ago 4 replies      
Has anyone ever created an "inverted" song? For example, take the Ode to Joy melody and play every note BUT the melody's. Maybe limit yourself to one octave so it's at least somewhat listenable. Wonder what that would sound like.
felipebueno 13 minutes ago 0 replies      
That's fascinating!

If you are reading this comment, watch this: [Black MIDI] Synthesia "Nyan Trololol" | Rainbow Tylenol & Nyan Cat Remix ~ BusiedGem


You are welcome ;)

aclissold 38 minutes ago 1 reply      
This is fascinating! They took a medium I had always classified as a cheap imitation of a real piano, and accentuated its strengths to make it something incredible! Blocks of notes for percussion, impossibly fast trills for a different timbre of sustained notes, melodies that are detectable aurally but not visually... Awesome!
userbinator 1 hour ago 0 replies      
If these were in the form of player piano rolls, they'd probably have a similar effect to this:


pdkl95 52 minutes ago 0 replies      
There's a "too many notes" version of Bad Apple? Hmm... I think I'll stick with marasy8's incredible cover[1] of the song.

Not that there is anything wrong with noise. As a fan of CCCC (Mayuko Hino)[2], I have often thought noise is best when it is played directly (analog), instead of with the digital perfection of MIDI. I like my digital noise when it's written in Forth[3].

[1] https://www.youtube.com/watch?v=hr7uwOp0Yck

[2] https://www.youtube.com/watch?v=kXbb-e0RRc0

[3] http://pelulamu.net/ibniz/ (warning: turn down the volume - raw square waves at the beginning!)

s_kilk 2 hours ago 0 replies      
Pretty cool, but the note-counts are a bit less impressive when you realize they're using big blocks of notes to simulate percussion and noise/pads.
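The "big blocks of notes" trick mentioned above can be sketched in a few lines of plain Python (the 480-ticks-per-beat resolution is an assumed, typical MIDI value, and the numbers are purely illustrative):

```python
# Toy illustration of the "block of notes" trick: one percussive
# "hit" in a black MIDI file is really every piano key struck at once.

def chord_block(tick, low=21, high=108):
    """All 88 piano keys (MIDI notes 21-108) struck on the same tick."""
    return [(tick, note) for note in range(low, high + 1)]

events = []
for beat in range(4):                        # four "drum hits", one per beat
    events.extend(chord_block(beat * 480))   # 480 ticks per beat (assumed)

# Four audible hits already cost 4 x 88 = 352 note-on events, which is
# how these files rack up note counts in the millions so quickly.
print(len(events))
```

Heard through a synth, each block sounds like a single thud, so the headline note count grows far faster than the music's audible complexity.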
DanBC 3 hours ago 1 reply      
There's a bit of discussion from 2 years ago: https://news.ycombinator.com/item?id=6640963
dwiel 2 hours ago 0 replies      
Lubomyr Melnyk does something similar by hand on a real piano.


6stringmerc 1 hour ago 1 reply      
Mmmm. Reminds me of some of the most interesting Drum & Bass that I come across in my listening / purchasing. From what I can tell, it has about the same appeal haha.

I've been legitimately impressed by the abilities of certain artists to really push the boundaries and limits - in my opinion - of packing in musicality in the D&B platform. One I can point to would be Camo & Krooked[0]. Another that sort of crosses into bass music would be Knife Party[1]. What they have in common, to my ears, is that they're able to embrace the full spectrum of available sounds. High peaks and really, really deep bass. Then, with so much digital control, they can go up and down, place certain sounds in certain tonal areas...it's just amazing to me.

One of the things that infuriates me about music commentary is the tired refrain of "rock is dead" or "music isn't original anymore" because frankly, they're not true. Rock continues to be a broad genre, and I really see electronic-production tunes like Skrillex's as an extension of rock and metal, in that they appeal to a predominantly younger audience and are very abrasive to 'traditional' ears.

Music is really going through a metamorphosis of innovation thanks to software like Ableton and the numerous brilliant synths out there. It's one thing to say "I don't like that music" but it's completely dishonest to say nothing's original anymore. Yes, there will always be 'trend jumpers' and some formulaic stuff (that goes for all genres, and specifically anything Max Martin touches), but now and then, BAM. Something shows up and moves the needle.

[0] Camo & Krooked - Let's Get Dirty - https://www.youtube.com/watch?v=IL5H38bpvFA

[1] Knife Party - Resistance - https://www.youtube.com/watch?v=DwqUGkR9yh8

andrewclunn 1 hour ago 0 replies      
I'm synesthetic and this is AMAZING! How did I not know this genre existed before now?
omginternets 3 hours ago 5 replies      
It's a shame that none of this music is ... well... enjoyable.
thebouv 1 hour ago 0 replies      
What I find particularly interesting to watch is the patterns drawn by some of those YouTube videos when the notes are struck. A visual art as well as the sound.
sideshowb 2 hours ago 0 replies      
Too many notes? https://www.youtube.com/watch?v=Q_UsmvtyxEI

(To clarify - I find the clip from Amadeus somewhat amusing in this context, but to those who critique black MIDI as music: I don't think it really is music ... it's art.)

apalmer 1 hour ago 0 replies      
Not really fundamentally interesting from a musical standpoint. The only 'interesting' part of it is that it looks funny on paper...
ph0rque 2 hours ago 0 replies      
So, it's like the high-frequency trading of music then.
GFK_of_xmaspast 2 hours ago 0 replies      
The music looks incredibly dense, but I wonder what would happen if you took a symphonic score and put all the notes on the same staff. (I also wonder if they could get a significant reduction in note count by using different instruments besides a piano.)
joeyspn 2 hours ago 2 replies      
"Less is more" - Ludwig Mies Van Der Rohe
kmenzel 59 minutes ago 0 replies      
It's amazing how complicated you can make something look with very low MIDI velocities...
Dizzying Ride May Be Ending for Startups nytimes.com
68 points by thatcherclay  3 hours ago   53 comments top 18
andy_ppp 0 minutes ago 0 replies      
I actually think the opposite, that we are in a period of history where all the software (and arguably businesses) people use day to day goes from being crap to fantastic.

A gold rush for good startups I think.

The returns from sitting the right group of people in a room and getting them to make something a few orders of magnitude better than it was before are still going to be huge.

hvs 1 hour ago 4 replies      
For those of you too young to remember, there were numerous articles written about the bubble bursting before it finally did in 2000-01. It wasn't a surprise that it did, just that no one knew precisely when it would.

My point is that arguing that people have said this bubble was about to burst and that it hasn't yet isn't an argument that it won't.

debacle 4 minutes ago 0 replies      
While things might wind up bad for the next round of unicorns, things are likely to be better for startups overall once the trend of "To the moon!" dies down just slightly. Hopefully we can return to a world where an acquisition isn't seen as a failure.
jgrahamc 1 hour ago 3 replies      
It's time for a new term: a "Pegasus" (a different kind of mythical horse than a unicorn):


 Pegasus (n) 1. Mythical winged horse; 2. Silicon Valley 'unicorn' with high gross margin. i.e. one that might actually take off.

chollida1 1 hour ago 1 reply      
Fidelity has just marked its Snapchat shares down from $30.72 at the end of June to $22.91 for the end of September.

To be fair, I think these markdowns have more to do with who is investing than the companies themselves.

VCs do portfolio valuations much less frequently than mutual funds, PE firms or hedge funds do, and they give less negative scrutiny to the valuation than the aforementioned firms do. The reason for this is that VC firms typically don't allow redemptions at monthly intervals, which means they can keep an unrealistic valuation for longer, whereas hedge funds, PE firms and mutual funds, which typically allow monthly redemption, need to properly value each holding at the end of each month.

I mean, if you are a VC, do you care if you don't write down Snapchat at the end of the month? You really have no incentive to do so.

You get paid on a quarterly basis on the size of your portfolio, so why mark it down until you are absolutely certain that it needs to be marked down? That point usually doesn't come until you actually go to sell, be it an IPO or a private equity deal.

However, if you are a hedge fund and someone wants to redeem their assets, you want to make sure you value Snapchat at what you can realistically sell it for, as that's essentially what you are doing when you allow someone to redeem their funds from your firm.

With people pulling money out of hedge funds, and PE firms on a monthly basis, this makes you have to pay attention to valuations on a much more granular time frame than historically VC firms would have.

dgreensp 1 hour ago 1 reply      
Sam Altman has already explained why late-stage private valuations -- but not earlier-stage or public valuations -- are bubble-like right now:

>To summarize: there does not appear to be a tech bubble in the public markets. There does not appear to be a bubble in early or mid stages of the private markets. There does appear to be a bubble in the late-stage private companies, but that's because people are misunderstanding these financial instruments as equity. If you reclassify those rounds as debt, then it gets hard to say where exactly the bubble is.

>At some point, I expect LPs to realize that buying debt in late-stage tech companies is not what they signed up for, and then prices in late-stage private companies will appear to correct. And I think that the entire public market is likely to go down (perhaps substantially) when interest rates materially move up, though that may be a long time away. But I expect public tech companies are likely to trade with the rest of the market and not underperform.


rdlecler1 2 hours ago 2 replies      
If Fidelity just did a 25% write down on SnapChat on the most senior portion of a $600m investment round, and assuming that Fidelity has at least a 1x liquidity pref/ratchet, then SnapChat is now valued at $462M floor, not $15 billion.
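As a back-of-the-envelope sketch of the markdown arithmetic (illustrative only — it uses the per-share marks quoted upthread, and yields a slightly different floor than the $462M figure, which presumably rests on inputs not shown here):

```python
# Mark-to-market arithmetic using the figures quoted in this thread.
# Purely illustrative; not the commenter's exact calculation.
old_mark = 30.72          # Fidelity's per-share mark at end of June
new_mark = 22.91          # Fidelity's per-share mark at end of September
round_size = 600e6        # reported size of the investment round

writedown = 1 - new_mark / old_mark               # fraction written off
implied_round_value = round_size * new_mark / old_mark

print(f"{writedown:.1%}")                         # roughly a quarter
print(f"${implied_round_value / 1e6:.0f}M")       # value implied for the round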
damon_c 1 hour ago 2 replies      
If I may speculate, it seems like the end result of this situation will be that in the future, startups will avoid taking money from mutual funds or anyone else who must attempt to accurately value their holdings publicly, whenever possible.
alp1970 1 hour ago 0 replies      
I always get scared when "delivery" based start-ups get hot. Reminds me of Kozmo, UrbanFetch, WebVan, Askville...
tmaly 1 hour ago 0 replies      
I was doing a paid internship at Intel in 1999 out in Portland, OR. I remember seeing huge numbers of new hires every week. I met people out in Portland who were hired to do VB programming with no programming experience. A few months later, the music stopped and there were too few chairs to go around. I always think of the Austrian business cycle when I see such huge upswings in things.
Alex3917 1 hour ago 1 reply      
I think a lot of companies who raised seed funding prior to 2010 or 2012 did so at excessively low valuations, and then tried to make up for it later by raising at excessively high valuations once they hit. The 'bubble' over the last couple years that's driven up pre-seed valuations should actually make the current crop of startups more stable over the long run.

Also, the decaying state of physical infrastructure in the U.S. is only going to drive more people to spend time on the Internet, where network effects are only getting exponentially more powerful as new networks are getting built on top of existing networks. These days a social startup that's "only growing as fast as Facebook" might not even be able to successfully raise a seed round. There might be a cyclical downturn, but none of the underlying trends in society point to tech being a bad investment over the longterm.

koblas 1 hour ago 0 replies      
What we're seeing is an issue with valuations and investments. TechCrunch just did a really good piece on how a raise of $150M gave a $6B valuation with a preference that guaranteed a 20% return on investment to the Series E investors (at the cost of the early investors).


So what we're seeing is that people are starting to re-think valuations in the face of these preferences.

bsg75 49 minutes ago 0 replies      
It would be a nice change if the focus was on companies that produce a product or service with long-term revenue prospects, instead of short-term wildly high margins.

The current state of highly educated people looking for get rich quick schemes (unicorns) is tiresome.

kshatrea 1 hour ago 0 replies      
Looking at India, this can be seen as part of a global-level ending of the dizzying ride. [0] gives a good overview. In short, the free lunch is now over and people are asking for results. I am sure it also has a lot to do, at an economic level, with the Fed now talking of tightening - that means interest rates are headed higher and there is more aversion to risk. I am not an economist, so others might have different opinions.

[0] https://goo.gl/9MfjBa

axis967 49 minutes ago 0 replies      
I think there are far too many startups that focus solely on growth/reach. Build a sustainable business: revenue and, more importantly, gross margin are the key metrics that need to be thought about. While VCs want fast growth, it is often not in the best interest of common stockholders to jet ahead at the pace many of these companies go.
ForHackernews 52 minutes ago 0 replies      
Oh thank god, finally. Maybe I'll be able to afford an apartment again.
billybilly1920 1 hour ago 2 replies      
It's ending again? Wasn't it supposed to end the year before, and the year before that, and the year before that? When is Google going to just drop news.google.com and have an algorithm write the same stories every year?

Next up: The next [pick top product] killer! You won't believe how [pick new or underdog product] is going to completely replace [pick top product] due to its [pick random feature in [pick new or underdog product]]

mironathetin 1 hour ago 1 reply      
Next bubble burst is close.
Distributed Machine Learning Toolkit dmtk.io
48 points by mrry  3 hours ago   discuss
IP traffic over ICMP tunneling github.com
18 points by vampire_dk  1 hour ago   16 comments top 5
piyush8311 13 minutes ago 1 reply      
I just tried iodine and icmptunnel. Can't say for sure but I think icmptunnel was faster. At least for my internet
victorhooi 21 minutes ago 1 reply      
For anybody that's tried both - how do these compare to DNS tunnels (e.g. iodine), in terms of speed and reliability?
PinguTS 38 minutes ago 1 reply      
Not the first of its kind; just look it up on Wikipedia: https://en.wikipedia.org/wiki/ICMP_tunnel

Any captive portal these days also blocks ICMP.

Most firewalls block ICMP these days, because the days of blacklisting are over and ICMP is not something that gets whitelisted. Why would it be?

The only way these days is to misuse DNS. But even that works less and less reliably.

txutxu 23 minutes ago 2 replies      
I used to restrict ICMP to echo/reply using -m icmp in iptables, but this uses exactly that kind of packet...

Is there any way to stop things like this at the corporate firewall?

de_wq912AesppE5 31 minutes ago 1 reply      
There are DNS tunneling apps which will (usually) get past those captive portals that block ICMP. It's just slower.
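The core mechanism all of these tools share — smuggling arbitrary bytes in the data field of an ICMP echo request — can be sketched in pure Python. This builds the packet only (actually sending it requires a raw socket and root); the payload string is just a stand-in for a tunneled datagram:

```python
import struct

def icmp_checksum(data: bytes) -> int:
    """RFC 1071 internet checksum: one's-complement sum of 16-bit words."""
    if len(data) % 2:
        data += b"\x00"                      # pad odd-length data
    total = sum(struct.unpack("!%dH" % (len(data) // 2), data))
    while total >> 16:                       # fold carries back in
        total = (total & 0xFFFF) + (total >> 16)
    return ~total & 0xFFFF

def build_echo_request(ident: int, seq: int, payload: bytes) -> bytes:
    """ICMP echo request (type 8, code 0) carrying arbitrary payload bytes."""
    header = struct.pack("!BBHHH", 8, 0, 0, ident, seq)  # checksum zeroed
    csum = icmp_checksum(header + payload)
    return struct.pack("!BBHHH", 8, 0, csum, ident, seq) + payload

# The tunneled bytes ride in the echo data field, which a firewall that
# merely permits echo/reply message types will wave straight through.
pkt = build_echo_request(0x1234, 1, b"an encapsulated IP datagram")
```

This is also why size/rate limits or payload inspection, not type filtering alone, are what actually hinder ICMP tunnels: the packet is a perfectly well-formed ping.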
Star of Startup.com Charged with Accounting Fraud wsj.com
18 points by harold  1 hour ago   2 comments top 2
tgb 33 minutes ago 0 replies      
Next they'll be investigating Tup.
83457 13 minutes ago 0 replies      
Oh wait... googling shows it was a real documentary. Could have sworn when I saw it so many years ago that it was not a real company.
Bare Metal Rust 2: Retarget your compiler so interrupts are not evil randomhacks.net
93 points by dbaupp  5 hours ago   11 comments top 3
jscheel 2 hours ago 3 replies      
I've been playing with Rust for os dev and emulation for a little bit. It's great to have others who are significantly more well-versed in this field sharing their knowledge with those of us who are struggling through it.
br1 2 hours ago 1 reply      
Can you just not use the first 128 bytes of the stack on an interrupt?
steveklabnik 3 hours ago 0 replies      
There's been a lot of really neat stuff focusing on beginners in the Rust OSDev space lately. Glad to see even more posts about it!
Artificial Intelligence as a flow chart motion.ai
8 points by impostervt  18 minutes ago   discuss
Habits of a Happy Node Hacker heroku.com
44 points by snodgrass23  1 hour ago   18 comments top 5
acbabis 3 minutes ago 0 replies      
#4 seems spurious. Just another example of "If everyone did it my way, then everyone would be doing it my way." What if people don't want to do it your way? What if people can actually remember to spell correctly?

EDIT: *nix has case-sensitivity so that people can actually use it.
cvburgess 14 minutes ago 1 reply      
The article suggests using node-foreman (#6); can someone explain the advantage of Procfile-based environment management? I read the docs and didn't see anything that couldn't be handled by a simple config.json or some environment variables.
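For illustration (file contents are hypothetical), a node-foreman setup centers on a Procfile listing one command per process type:

```
web: node server.js
worker: node jobs/worker.js
```

plus an uncommitted `.env` file for local values. Compared with a hand-rolled config.json, the usual argument is less about richer env handling than about parity: `nf start` runs the same process definitions locally that Heroku's dynos run in production, so the environment-variable code path gets exercised in development too.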
bshimmin 42 minutes ago 5 replies      
> Node.js is the rare example of a Linux-centric tool with great cross-platform support.

Apart from the problems with Windows, npm, and path lengths.

douche 31 minutes ago 0 replies      
Until I looked at the actual title of the post, I was a little concerned that somebody had invented a time machine.

It's been changed now, but originally was titled "10 Habits of a Happy Node Hacker (2016)"

secretagent 47 minutes ago 5 replies      
Regarding point 9, if you don't check your `node_modules` into version control, don't you run the risk of them disappearing from NPM?
Court says tracking web histories can violate Wiretap Act wired.com
73 points by Oatseller  5 hours ago   6 comments top 3
rm_-rf_slash 3 hours ago 1 reply      
The original intent of the Wiretap Act was to place a distinction between content and metadata.

The problem now is that there is JUST SO MUCH metadata that it is losing that distinction. If someone calls a known pot dealer once a week, then it doesn't matter whether you hear the call or not, you can still infer that the caller picks up every week.

A DOJ lawyer once said to me that "when surveillance is ubiquitous, the role of law enforcement becomes the role of a prison warden, where everyone is an infraction waiting to happen."

throwaway2048 3 hours ago 0 replies      
Frankly, results like this are inevitable from either the courts or Congress if advertising companies continue to refuse to self-regulate in a reasonable way.

The only reason it's not a bigger issue right now isn't that "nobody cares, privacy is dead!". It's that people for the most part do not understand the mechanism.

Explain to ordinary people exactly how ads track them: just how many do you suppose will approve of and be comfortable with the arrangement?

The entire industry is built on sand.

module17 4 hours ago 1 reply      
Analytics. Illegal!
Rare early photographs of Peking bbc.com
22 points by yitchelle  2 hours ago   4 comments top 3
coldpie 1 hour ago 0 replies      
Thomas Child died in 1898[1]. If current US copyright law, as expressed in the TPP, had existed when these photos were taken, then the copyright on these photos would have expired in 1968, seventy years after their creator's death and about a hundred years after they were taken. If they were taken as works for hire, then they would have expired around the year 2000, more than one hundred years after their creator's death and one-century-and-two-decades after they were taken.

[1] http://hpc.vcea.net/Database/Photographers?ID=39

k2enemy 11 minutes ago 0 replies      
Very neat photos. The Azure Cloud Temple in the Fragrant Hills park is one of my favorite places on earth. The park is very busy in general, but the entire path leading up to the Cloud Temple and the temple itself are typically empty, peaceful, and serene.
officialjunk 30 minutes ago 1 reply      
rant: it never was Peking. "Peking" is a phonetic spelling by westerners listening to non-Mandarin speakers, who weren't from Beijing. If they had listened to a Mandarin speaker, which most Chinese are, even at that time, we wouldn't have that horribly misspelled name.
Hoffa: TPP a punch to the gut of U.S. workers detroitnews.com
50 points by walterbell  3 hours ago   19 comments top 7
walterbell 50 minutes ago 0 replies      
Melinda St. Louis of Public Citizen's Global Trade Watch, and Ari Rabin-Havt, speaking yesterday in a 1-hour panel discussion on the TPP, http://www.youtube.com/watch?v=MFHG0fZW3_s&t=15m40s

"Of the 30 chapters of the TPP, there are only 6 that have to do with traditional trade issues ... the rest have to do with "behind the border" policies, which are basically our laws ... In the text, we see an expansion of the failed model, under NAFTA, that pits US workers against workers in other countries, this time in Vietnam, where workers earn 65 cents an hour, as well as other countries such as Malaysia, which has a huge problem with human trafficking ... modern-day slavery ... we are very concerned that this continues that race to the bottom ... it leads to an overall depression of wages and an increase of income inequality in the U.S.

It's a trade of industries: we are going to favor our pharmaceutical manufacturers ... certain content producers receive favorable status ... while giving up other industries."

clarkevans 1 hour ago 1 reply      
"Protectionism" is the dog whistle rallying those who wish to circumvent labour and environmental protections by making products elsewhere, yet, selling them here in direct competition with those who follow the rules.

If only we could fashion trade agreements to lift all boats, removing tariffs only when the production of a product is ecologically and societally sustainable.

untothebreach 1 hour ago 1 reply      
Did a double-take when I noticed the author's name. Jimmy Hoffa's disappearance[1] is a pretty well-known mystery, at least here in Detroit, and I guess I never knew that his son had taken over leadership of the Teamsters.

1: https://en.wikipedia.org/wiki/Jimmy_Hoffa#Disappearance

kailuowang 1 hour ago 2 replies      
The software industry in the U.S. had a similar experience 5-15 years ago, when "outsourcing" was the fashion.

What came out of that, it appears to me, is a booming software industry with more competent and confident local software engineers.

I am for reaching a global economic optimum.

lumberjack 1 hour ago 3 replies      
Random idea: if protectionism works, couldn't a group of people decide to create an internal economy where they purposefully trade with each other before outsiders for all products that they manufacture? They would buy goods for above real market value, but the money would remain within their internal economy. The biggest problem would be enforcing the contract.

I think this already sort of exists. If my close relatives manufacture something, I gladly buy their goods over those of their competitors, because yes, I am paying more, but the money is supporting my uncles, nephews and nieces. I think that's a net positive for me. They reciprocate the favour. As a family we are wealthier than we would be otherwise because, no, they cannot abandon their businesses and start new ones just to exploit comparative advantage. That is unrealistic. They know only one market well enough, and do not have the resources or guidance to get into a completely different market.

I guess the Amish are another example of this, and I believe Israel's kibbutzim as well.

amyjess 19 minutes ago 0 replies      
And if Hoffa gets his way, US consumers will experience a punch to the gut.

Protectionism and "Buy American" are fundamentally incompatible with affordable consumer goods. Hoffa's way will lead to you only being able to afford a new smartphone once every 20 years. Televisions will become family heirlooms because you just can't afford to buy a new one. Hope you like the outfit you're planning on buying, because you're going to be wearing it for years.

I value the modern American lifestyle.

ameister14 57 minutes ago 1 reply      
I don't really like it when people try to evoke emotional responses to things they haven't read.
Windows game developer about porting to and using OS X shiningrocksoftware.com
240 points by levifig  10 hours ago   138 comments top 19
lucozade 5 hours ago 2 replies      
> Apparently my vi command muscle memory hasn't faded.

Back in the day I did most of my dev work on Solaris. I then spent 4 years as CTO at a startup that was pretty much Windows-only.

When I subsequently went back to working at a unix shop I was initially struggling with vi as I tried to read some of the C++ code. I couldn't remember commands, was having to refer to the man pages every few mins. It was torture.

A couple of days in, I was writing up some notes in vi when someone walked past my desk and started chatting. When we finished talking I looked down at the monitor and I'd written more than I had when I was concentrating, nicely formatted, the works. Turns out "my hands" had remembered a load of what I thought I had forgotten.

For the next few days I had to keep finding ways to distract myself so that I could work efficiently. Eventually it all came to the foreground but it was the most bizarre experience while it was happening.

SXX 6 hours ago 2 replies      
> I'm not using SDL or any other library to hide those platform differences.

Once he starts working on the Linux port, he'll regret that. Every developer who starts with their own platform-specific code ends up using SDL2 anyway. Don't make that mistake.

stinos 9 hours ago 11 replies      
> Xcode isn't too bad.

I wish the author had told me more about it than just this. Can somebody comment on how it compares to recent VS editions these days? About 5 years ago I also looked into using OS X as my main OS. As I've always been using non-command-line graphical text editors and IDEs for most coding, that made Xcode the go-to environment, but I just couldn't deal with it even though I tried. I don't remember all the details, but in general it just felt inferior to VS on pretty much all fronts, with no advantages of any kind (for C++). Again, IIRC, it did annoying things like opening the same document in multiple windows, one for editing and one for debugging. Anyway, what's the state today?

curyous 8 hours ago 5 replies      
So he's got a working port on an OS he's never seen before, in only 1 week? Does that seem extraordinarily productive to anyone else?
jokoon 1 hour ago 1 reply      
Well, I sold my MacBook Pro because I wasn't able to build my Ogre project properly. For years. Also there was some OIS (not iOS, OIS) input issue. It comes from Apple force-feeding Cocoa into OpenGL apps, or something like that, which can only be remedied by using some SDL hack.

Anyway, I don't really care anymore; I bought a ThinkPad instead. Cocoa is just something I can't even.

My experience has been pretty different. I'm not a professional developer though.

lmolnar 3 hours ago 1 reply      
I recently went through a very similar process porting my screensaver [1] from Windows to Mac without using a library like SDL. Here are some additional difficulties I encountered during this process:

OpenGL on multiple monitors - this was much more difficult to do on MacOS. I had to create a separate window for each monitor, create a rendering context for each window, make sure my graphics code was issuing the drawing commands to the proper context, then have each context queue/batch "pending" rendering commands and issue them all at once at the end of a frame on a by-context basis. Whereas on Windows you can pretty much create a window that spans multiple monitors and draw to it with a single rendering context.

Input - I used DirectInput on Windows and wrangled a suitable implementation using HID Utilities on Mac, which was not easy given my lack of previous USB programming experience. A major annoyance was the lack of a device "guid" that you can get via HID Utilities to uniquely identify an input device - I had to manually construct one using (among other things) the USB port # that the device was plugged into. Not ideal.

SSE intrinsics - my experience was that Microsoft's compiler was MUCH better at generating code from SSE/SSE2 intrinsics than clang - my Windows SSE-optimized functions ran significantly faster than my "pure" C++ implementations, whereas the Mac versions ran a bit slower! My next thought was to take this particular bit of codegen responsibility away from clang and write inline assembly versions of these functions, but I took a look at the clang inline assembly syntax and decided to skip that effort. (I did write an inline assembly version using the MS syntax and squeezed an additional 15% perf over the MS intrinsic code.)

Pretty much everything else (porting audio from DirectSound to OpenAL, issuing HTTP requests, kludging up a GUI, etc.) was pretty straightforward and did not have any nasty surprises.

[1] http://www.ubernes.com/nesscreensaver.html

trymas 9 hours ago 9 replies      
For me the most interesting part (and I'd like an answer to it) is why the game runs at 1 FPS on OS X, whereas on a Windows machine with the same graphics card it runs just fine.

What can make such a considerable difference?

mavdi 8 hours ago 3 replies      
Has he developed this game all by himself? How can people be so productive? Develop, test, market... This is just crazy.
anton_gogolev 9 hours ago 3 replies      
Tangentially related: Banished is highly recommended. Quiet room, a couple glasses of whiskey, and you're guaranteed a nice evening.
stevoski 7 hours ago 0 replies      
I went the other way: last year I converted an OS X app to Windows. I hadn't used Windows for six years, and had forgotten most things.

It took two weeks to get the code compiling and running. That turned out to be the easy part. Getting the application performing well, feeling "native", and getting the bug count down took another six months.

I love Banished and I'd like to see a completed OS X port. But I'm not expecting this to be done, like, tomorrow!

885895 9 hours ago 2 replies      

Not just unix-like, OS X is certified UNIX.

shmerl 9 hours ago 0 replies      
I'd be more interested in his experience about porting to Linux (since supposedly he is working on it as well).

OpenGL on OS X is still behind the times, and so far it's not even clear if Apple will add Vulkan support when it will come out.

packersville 1 hour ago 1 reply      
> couldn't figure out how an "All Files" category was useful when browsing Finder windows

To this day I still don't see or find how it is useful.

indifferentalex 5 hours ago 0 replies      
The man's commitment to his game, considering that he works alone, is incredible. Without using terms like "10xer" and "rockstar", he's got an incredible level of perseverance and dedication, considering that he already launched the game and at this point is working on features that many consider boring and a grind, all to make a polished finished product. The fact that he documented pretty much everything in his blog is great if you need motivation or are just curious about how to make a game from zero.
maljx 4 hours ago 0 replies      
We develop our game on Mac OS X and port to Win32 and Linux. Using CMake, SDL2 and C++11 there is very little code that actually needs to be rewritten. The windows build process is just a python script that pulls, cmake configure, compiles and zips the latest build.

The code that is completely different on the platforms is stuff like HTTPS requests, open file dialog, create/delete folders.

AdmiralAsshat 2 hours ago 1 reply      
The fact that it's running at 1 FPS is a little disheartening. I know the GPU is fast enough. I've got a Windows machine with an Intel Iris Pro 5000 that runs Banished just fine, which is the same graphics hardware in my MacBook Pro. I've got my suspicions as to what's going on but I have a bunch of testing ahead of me to make sure I fix the issue properly.

Did the author buy a MacBook Pro just for this purpose? I'd assume this is his personal laptop, but his "Using a Mac" section sounds like he's not a Mac user even in his free time.

DeveloperExtras 6 hours ago 1 reply      
I do it this way, but it's based on my particular skill set:

First make the iOS version. Then, port it over to Java. Then, port it over to C# or maybe ActionScript3/Flash.

This way, I can recursively update previous versions as the 'best solution' to interesting problems becomes clear by the end of the 2nd or 3rd port. This gives the Objective-C/iOS version the attention it needs, and I can use the rapid application development features for each new port.

kevingadd 8 hours ago 2 replies      
The comment about C++ templates is baffling and I wish the author would elaborate. The behavior he describes that clang doesn't support is... how templates are specified to work. They're near-useless without that property.

> Most of these had to do with templates that expected the code inside them not to be compiled until they were instantiated. The Microsoft compiler has that behavior, while clang does not.

jupp0r 3 hours ago 3 replies      
"but then realized I could open a term window and that I was really using a unix-like system with a user interface that wasn't X Windows."

Where would one have to hide to retain this level of ignorance for so long?

Microkernels Meet Recursive Virtual Machines (1996) [pdf] brynosaurus.com
12 points by vezzy-fnord  2 hours ago   1 comment top
vezzy-fnord 16 minutes ago 0 replies      
P.S. If anyone can assist me in finding the Fluke or Flux OSKit source code, that would be appreciated. The Utah FTP server appears unresponsive from my end, so I'm looking for mirrors.
Show HN: Give a Dime Donate change from credit card purchases giveadime.org
8 points by harrisonmgordon  1 hour ago   6 comments top 3
edent 1 minute ago 0 replies      
Slightly confused - what's a "Round up charge"? Is it a US thing? Not heard of anything like that in the UK.
harrisonmgordon 1 hour ago 1 reply      
I'd love feedback on this project - it's been a labor of love for a year trying to find the best charities in the Bay Area and making it easy for anyone to donate what they can without a lot of effort. Thanks!
rch 27 minutes ago 1 reply      
Fee structure (flat $0.50 +CC txn per donation) described here:


Formal proof on a Pebble smartwatch in Ada adacore.com
14 points by iamwil  2 hours ago   discuss
Beyond TCP: The evolution of Internet transport protocols slideshare.net
51 points by jsnell  5 hours ago   13 comments top 5
jMyles 1 minute ago 0 replies      
I wish for an HN URL replacement algorithm that replaces decks with talk videos.
jorangreef 3 hours ago 1 reply      
One of the major problems with TCP often not mentioned is bufferbloat. TCP's common congestion avoidance algorithms usually have no precautions against bufferbloat and frequently induce it, i.e. when probing the congestion window. And those TCP congestion avoidance algorithms which do manage bufferbloat are often not deployed in public for fear of losing out to greedy congestion algorithms.

QUIC uses fundamentally the same basic congestion avoidance algorithm as TCP (QUIC's algorithm is a work in progress AFAIK) so even QUIC is still in the same bloat.

The problem eventually bubbles up the stack and affects most web applications, so that a single user, just using a web application on one machine, can break the Internet for all other users on the same LAN.

Try this demo for yourself:

1. Run "ping google.com" from another computer on your LAN.

2. Upload a 10-20MB file via Gmail or Dropbox from your computer.

3. Watch the ping times on the other computer skyrocket from around 100ms to upwards of 5-10 seconds.

4. Try a Google search from any other computer on your LAN while this is happening.

Web applications which use protocols such as WebSockets have no way to reduce the bufferbloat footprint of their application, other than re-implementing their own delay-sensitive congestion avoidance algorithm on top. And actually, if you want to build a robust application which does any uploading (or plenty of downloading), this is what you need to do.

For example, to prevent inducing bufferbloat, Apple's software update actually uses a variant of LEDBAT (the delay-sensitive congestion avoidance algorithm from BitTorrent's protocol) when downloading software updates.

ElijahLynn 3 hours ago 0 replies      
Is there a recording? Slides only supply maybe 50% of the information needed to form the picture. If the accompanying spoken words are left out then it is difficult to receive the message as intended.
AndrewDucker 3 hours ago 1 reply      
It really annoys me that some routing systems drop packets that use any protocol other than TCP/UDP. It makes innovation of new protocols basically impossible.
gkfasdfasdf 2 hours ago 0 replies      
I wonder why SCTP over UDP was not mentioned. This solves the problem of routers dropping unrecognized protocols.
Wood carving tools overview davidffisher.com
12 points by wiherek  1 hour ago   discuss
Footage of life in Nazi Austria, thanks to a new video archive smithsonianmag.com
6 points by Oatseller  1 hour ago   discuss
On Being Smart (2009) [pdf] epfl.ch
141 points by jonnybgood  7 hours ago   91 comments top 21
rdlecler1 4 hours ago 2 replies      
I had the good fortune of having dyslexia and ADD before they tested these kinds of things, and eventually went on to become a high school dropout. Ultimately I went to university, got five degrees including a PhD from Yale. I published two papers in Nature journals now cited together over 250 times. One, a single-author paper, overturned 10 years of high-profile theory and originally caused a falling out with my advisor, a MacArthur fellow and one of the giants of Yale. Having grown up never feeling smart I was always intellectually humble and assumed I was wrong. I found my new theoretical discovery because I noticed an anomaly -- one that apparently others had encountered before, but swept under the carpet. I, on the other hand, assumed I must have done something wrong and so I kept digging until I worked out the answer. If you're too smart you can also be too confident in your own abilities to extrapolate and interpolate.
iMark 3 hours ago 5 replies      
There's a lot here that resonates with me beyond matters of intelligence.

In my spare time I'm a contact juggler. If you don't know what that is, it involves rolling balls around the body. David Bowie in Labyrinth is usually a good reference point.

And I'm good at it. I'm good at it because I've been doing it for nearly a decade and I've put in the hours. I don't think I learned particularly quickly, or even particularly well, but I stuck with it and worked hard to improve. I'm not shy about telling people that, but many still seem to assume it's some form of innate talent, no matter how much I reassure them otherwise.

It's as though people would rather accept their own status quo than believe that effort and commitment are enough to improve their lot. Yes, it might take years to reach a level of skill in a given discipline, but those years will pass anyway. Wouldn't it be nice to have something more to show for all that time than a depression on the sofa in front of the TV?

nether 38 minutes ago 0 replies      
What I've noticed is that hard work doesn't bear the same fruit for everyone. I worked my ass off in university. I knew others who did too, but they accomplished far more with fewer mistakes. I've also met people who were dedicated in their studies and just seemed to hit a wall with grasping some concepts in math. It was really painful to see this; they weren't lazy or unmotivated, but they often had understanding that fell short of their enthusiasm. It's not just "hard work." Innate talent exists that cannot be compensated for by effort, an optimistic mindset, etc.
ph33r 3 hours ago 2 replies      
The best thing I've ever read online about 'being smart' came from a Reddit comment:


danieltillett 6 hours ago 3 replies      
I actually don't think the divide is between smartness and hard work, it is between smartness and originality. Originality is the wedge and hard work is the sledge hammer. All smartness provides is a torch to find the wedge in the darkness of our ignorance.
cmrdporcupine 3 hours ago 3 replies      
All my life I heard from people "you're so smart, you have so much potential, but you need to try" while I failed through school due to challenges focusing, and, frankly, giving a shit. In the end, I carried this "you're so smart, but" attitude around with me to both my benefit and detriment.

I have no CS degree but have elbowed my way (often with a distinct lack of grace, in retrospect) into the industry as a software developer. I now somehow work at a place that prides itself on intense meritocracy, famous for its grueling elitist interviews .... and the impostor syndrome is intense. But when I look around, most of the people around me do not seem so much 'more intelligent' as 'more adapted for the school-grades / work-politics system' which the interview process / promo process selects for.

To me intelligence and smartness are clearly cultural phenomena. Yes, some people are more adapted to certain types of intellectual activity, but whether those things are 'smart' or not is questionable to me.

As a parent I often get frustrated with myself when I instinctually reward my children with comments like "you're so smart". Unfortunately they struggle with focusing, behavioural compliance, etc. in similar ways to me, while their intellectual and artistic curiosity is intense -- I know they have a long uphill battle ahead of them.

ahussain 3 hours ago 0 replies      
A physics professor in college used to say "learning is a spiral" - you go around and around a concept a few times, getting closer to understanding each time. I quite liked that as a model for learning - don't get demoralized if you don't hit the target on the first pass.
6stringmerc 1 hour ago 0 replies      
"Ignorance is bliss."

The older I get, the more wise I get, the more I believe in the above statement. As in, not having the mental capacity or brain-power to muse over inequities both in personal and worldly topics is a less emotionally affective position to be in. I grew up in a protestant Christian faith, and while I don't actively participate, I do reflect often on some of the teachings (mostly the Beatitudes) and literature, and only in my 20s did I realize that "Eating the apple from the tree of life" is pretty much a metaphor for our evolutionary development into consciousness, of "knowing right and wrong" as a species.

Intelligence? It's a curse as much as it is a blessing. Folks can disagree with that assertion if they'd like. From my personal studies in literature and philosophy though, I think it's a pretty common understanding amongst a certain tier of thinkers. My apology if I come off sounding a little elitist, but intelligence is a bell curve, and, to quote the famous Demotivational poster, "Not everybody gets to be an astronaut when they grow up."[0]

[0]Link to photo I found via Google search: http://cdn.shopify.com/s/files/1/0535/6917/products/potentia...

fengwick3 5 hours ago 0 replies      
One must be careful of conflating the distinct effects of expectations of intelligence and intelligence itself. The former exerts psychological pressure[0], the latter is a catalyst for success. It's often possible to have one without another.

[0] http://web.stanford.edu/dept/CTL/cgi-bin/academicskillscoach...

Sealy 3 hours ago 0 replies      
This looks like a re-hash of the findings published in a 2008 book by Malcolm Gladwell called "Outliers - The Story of Success". He's the one that made the 10,000 hour theory popular.


Its a #1 Best-Seller on Amazon:http://www.amazon.co.uk/Outliers-Story-Success-Malcolm-Gladw...

vlehto 4 hours ago 4 replies      
According to the Dunning-Kruger effect, an expert should consider his accomplishments trivial in hindsight. Grothendieck and Gauss were mathematicians, not psychologists. It's likely that the premise of the article is then completely wrong.

Then there is also impostor syndrome: many students are likely to feel not smart enough.

And then there is the cultural taboo around smartness. Taking pride in one's ability is socially accepted as long as that ability is not intellectual. So in a way an idea like this turns into twisted logic: "I'm smart, my laziness in college shows it." And now you get to do that sweet guilt tripping for feeling too smart. There is a difference between pride and arrogance. So this self-deprecation seems a bit needless.

What if we don't consider pride a sin, but a natural phenomenon in the range of human emotion? Maybe pride is inevitable for the ego? Now if that is true, what should a good student be proud of? What if being openly proud of your intellect is healthy after all?

rifung 4 hours ago 0 replies      
While the article was interesting, I wonder if the 50% dropout rate is really caused by people feeling like they are not smart enough. I've met quite a few people who dropped out of good PhD programs, but most of them dropped out because they felt like they weren't in love with research and didn't think it made sense over going into industry and making a lot more money. I met one person who got into a top 10 program but never intended to finish his PhD; he just wanted to get a free Master's.

On the other hand I also have heard from many that grad school is the first time many students have to see a therapist and deal with depression.

tome 5 hours ago 2 replies      
> `The brain is ultimately just a muscle. Make it stronger by working it out.'

I like the analogy that things are "muscles". It concisely captures various phenomena that I've observed: the brain is a muscle, willpower is a muscle, trust (in another person) is a muscle ...

RivieraKid 4 hours ago 2 replies      
I think that a large part of it is psychological / personality factors - how much pleasure you find in thinking, learning new things, and solving problems.
personjerry 6 hours ago 2 replies      
We see so much of the smart vs. hard work argument. It seems heavily biased towards the argument for hard work. But I wonder if there is a way to quantify and measure these so that we might be able to settle the argument objectively?
davidiach 4 hours ago 1 reply      
It's funny how it seems that only really smart people do the hard work required for winning Nobel Prizes, Fields Medals and other such distinctions. Average people are just lazy, that's why they don't succeed!
refrigerator 3 hours ago 1 reply      
I've read similar things before, and totally agree with them, but surely 'the ability to work hard and persevere' varies a lot from person to person according to genetics and environmental factors?
gypsy_boots 4 hours ago 1 reply      
> The brain is ultimately just a muscle. Make it stronger by working it out

This seems like such an empowering way to look at learning, and one I think many of us are prone to forget. Even masters don't just arrive at their talent; they too have to work at it, over and over, day after day.

octatoan 6 hours ago 1 reply      
I posted this yesterday: https://news.ycombinator.com/item?id=10539083

It seems timezones matter a lot. :)

HiroshiSan 3 hours ago 2 replies      
By your door analogy, if the door is broken (through hard work), it is now open. The key provides an 'easier' or more obvious way to open as opposed to persistence.
tempodox 4 hours ago 0 replies      
This broadcasts an encouraging message. While I still think actual success depends on a good deal of sheer luck, we needn't despair because we're missing some irreplaceable gift. Even in this respect, all humans are created equal.
A Basic Guide for Curious Minds: Review of Thing Explainer gatesnotes.com
129 points by mhb  4 hours ago   52 comments top 10
mavhc 2 hours ago 5 replies      
Amused that the first sentence is "Terminology is an occupational hazard of philanthropy", I think he meant "hard words are often used by people who give money away"
mc32 41 minutes ago 1 reply      
Gates is spot on and Munroe extraordinary. Too often managers feel obliged to use technical language where none is needed or natural.

People talking about budgets and fiscal planning pepper their speech with hi-tech, bio-tech, etc. jargon. Anything which passively indicates that they are up to date on the cant and argot of totally unrelated fields serves as a signal of being at the edge or on the cusp of all things new, modern and professional.

Examples: using MM for millions, or K for thousands (is that metric K or computing K?), or using "spend" as a noun. Having to sound like one knows and is up to date with all the different industry terminologies must be taxing.

Best of all is when a person is actually familiar with the terminology and senses the forced nature of the out-of-context use. They can only smile at the stilted usage.

amelius 1 hour ago 3 replies      
I'm now hoping for a book titled "How to build civilization from scratch".
maweki 2 hours ago 2 replies      
Yeah, the book is real fun. Amazon seems to have sent it early and I am not complaining.

It's a lot of text in comparison to the original poster-comic and a lot of stuff seems very repetitive (it's bound to be, right?) since many principles of planes, submarines, rockets are the same (physics). Still a fun read if you don't plow through it in one sitting.

rdlecler1 2 hours ago 1 reply      
This would make a great Wikipedia! Score for brevity and using words with high frequency.
Jabbles 3 hours ago 8 replies      
What rule allowed him to use the word "goer"?
kbutler 2 hours ago 4 replies      
Although I enjoy the Saturn 5 "Up goer five" description, I doubt that "goer" is in the top 1,000 most common words...

Edit: He's taken the common word list and added different forms as he needed them. Go -> goer and goers, because he wanted to use them. Grow is allowed, but grower and growers are not, because he didn't need them.

urish 1 hour ago 1 reply      
I wonder at what point does using fewer words become detrimental to understanding.
gadders 1 hour ago 0 replies      
I would just like to take this opportunity to do my annual joke that Bill Gates should try and monetise his blog with affiliate links or he'll never make any money.
AC__ 2 hours ago 1 reply      
I wonder why this book doesn't explain the fractional reserve banking system. Oh wait, never mind, I know why.
Chamath Palihapitya Launches Rama Corp to compete with AT&T and Verizon businessinsider.com
31 points by prostoalex  4 hours ago   10 comments top 7
paragpatelone 46 minutes ago 1 reply      
I am sure he may have something up his sleeve. But how exactly is Rama going to compete - are they going to acquire real infrastructure or build cell sites? It will be pretty costly to build something from the ground up or to acquire customers from Verizon or AT&T; just ask T-Mobile and Sprint.

Unless they have some game-changing wireless technology, I don't see it happening, and investors may as well put their money down a black hole.

BI says "Part of his plan involves installing microcells in customer's homes to blanket the nation, but also making it as easy as buying a cellphone to sign up for it. Another key to the plan is a portfolio of zero-rated apps that won't cut into your data, Palihapitiya said."

^That does not sound like a very good plan. Carriers like AT&T and T-Mobile already have microcell options, and most people won't opt for them, especially if other people get to use the microcell at the cost of the owner's personal bandwidth with their ISP.

What are zero-rated apps? Isn't that similar to what T-Mobile is already doing with their video and audio streaming - whitelisting apps that will not count against data? Most carriers also have WiFi calling.

That auction that he wants to participate in - isn't that for low-band spectrum like 600MHz? That is good for extending coverage but will not increase your download speeds. Carriers like to have both low band and high band.

salimmadjd 51 minutes ago 0 replies      
I can fly to Riga, Latvia or many other places, pay $5 for a SIM card (with some amount of base call/data credit) and use it when I want or need it. But in the US you are charged whether you use your service or not. Basically it's like your electric or water meter running whether you use electricity or not. I hope Chamath will be able to solve that: have a network like other places in the world, where I buy a SIM card and top it up when I need it, and if I don't use my phone for a week, I won't be charged for not using it.
occam65 1 hour ago 0 replies      
I've got a lot of respect for Chamath and his straightforward, no-bullshit approach. With that said, this seems like a huge challenge, and I couldn't be more excited that he's taking the lead on this.

Best of luck, Chamath.

mdasen 17 minutes ago 1 reply      
The article is light on the details, but it seems to refer to the opportunity presented by the auction of 600MHz spectrum in 2016. The FCC has set aside 30MHz of spectrum for smaller carriers that don't have so much sub-1GHz spectrum. This is important because low-frequency spectrum travels farther and allows carriers to create broad, reliable coverage. In most markets, this will mean that Verizon, AT&T, or both will be excluded from bidding on that 30MHz.

But it would hardly help him "overtake" AT&T and Verizon with no network and probably only 10-20MHz of spectrum, possibly missing large markets like New York and San Francisco entirely due to the high price licenses in those markets fetch.

The auction is generally seen as most beneficial to T-Mobile who has a network, but lacks low-frequency spectrum in a lot of markets.

There have been many companies that have dreamed of entering the US wireless business. Many have bought spectrum only to let it languish for years and then sell it to an incumbent carrier. Looking at 700MHz-A licenses, a lot of them are owned by companies such as "C700-Salt Lake City-A LLC", "C700-Jacksonville-A LLC", "Cavalier Louisville, LLC", and "Cavalier Albany NY, LLC". Part of the issue is that it is expensive to build out a wireless carrier requiring lots of money and consumers demand a high level of perfection when it comes to their wireless carriers. The American market isn't one that tolerates even small carrier issues.

Some of it will depend on what licenses go for. I think most people are expecting licenses to run T-Mobile in the range of a couple billion given the 30MHz set aside, but T-Mobile is planning for up to $10B. Verizon just paid around $10B for around 10MHz of AWS-3 spectrum and the 600MHz licenses are a lot more valuable. But 30MHz is set aside with less competition from the big two.

Given that the licenses will likely cost $2B+ and cap-ex can often run $3-8B per year for carriers, $4-10B seems a little low to launch a compelling service. Part of the issue with the US is that it's such an expansive place and people don't want to be told "this service is only available in Maryland, DC, and Northern Virginia".

I guess the question is: what does he think he can do better? If it's cost, I can get 2-5GB of data with unlimited talk and text from Boost for $30/mo, taxes and fees included, on Sprint's nationwide network. His network will be worse than Sprint's, so how much of a discount can he offer off $30 to make a much worse, completely new network compelling? He won't have loads of spectrum to offer really high data caps. And if he starts offering 100GB for $25, that will have to come down fast as people start actually using it and the network becomes capacity constrained. And customers are very used to being grandfathered into plans in wireless. If he were just competing against AT&T and Verizon, he might have an opening. But even AT&T has their Cricket brand where I can get 2.5GB for $35, taxes and fees included, on AT&T's network.

The question is: what does his entry bring to the scene? It seems unlikely that he can greatly undercut prices. It seems unlikely that microcells in people's homes will "blanket the nation". With so little spectrum compared to competitors, he won't be able to offer the speed and capacity they're offering. Without a reliable network, customers will want a steep discount to move to his service. So, how much below $30/mo can he go to grab customers? Is there something else compelling? T-Mobile already offers microcells for your home and zero-rates music and now video streaming.

I'm all for increasing competition. It just seems unlikely that this will increase competition. It seems way more likely that Dish will buy some 600MHz licenses and start rolling out a network. They already have substantial spectrum holdings and the low-frequency licenses would allow them to get broad coverage without spending too much while using their higher-frequency spectrum to supplement capacity where needed. Dish also seems to think that wireless data is going to be their salvation. As we inch closer to 5G, it's likely that fixed mobile broadband from an antenna in your home could serve as competition to wired internet services. That would allow Dish to offer on-demand services and home broadband plus mobile services in an era when more people are forgoing pay-TV services.

To me, that seems like something with a strong chance of happening. The company already has a huge spectrum investment and they need wireless for their future. Rama would be competing with way less spectrum and starting from scratch. Seems like a much easier way to be profitable would be to bid on the spectrum that AT&T and Verizon can't bid on, use the big FCC discounts for new players, hold it for 4-5 years, and then re-sell it for a profit.

yohann305 2 hours ago 0 replies      
Good luck on the undertaking, it's going to be a tough bumpy road, but at the end of the day, US consumers will surely benefit from it. cheers!
logfromblammo 39 minutes ago 0 replies      
Given my frothing antipathy toward AT&T, and my lesser dislike of Verizon, I am very eager to see new competitors entering the mobile telephony market, especially one that is not a reassembled collection of Baby Bells.
mikeash 1 hour ago 0 replies      
Hell yes, fuck my country up! His words, not mine! I'd subscribe just because of that phrasing.
Square's S-1: Ratchets and Unicorn Valuations techcrunch.com
16 points by prostoalex  1 hour ago   2 comments top 2
dantillberg 29 minutes ago 0 replies      
I don't always believe in scare quotes, but we probably ought to use them around "valuations" promoted by companies raising private capital.

I own a tiny bit of common stock in a private company (by way of ESOs), and I have a hunch that virtually all of the rest of the stock is in preferred shares of some sort. And thus I have no idea how much my stake is worth. For all I know, it's zero.

Shouldn't it count as some sort of securities/financial fraud to issue preferred shares of stock, and then to hawk the "market valuation" of the company as if you'd just sold common stock for the same terms?

pbreit 11 minutes ago 0 replies      
I've defended private market valuations since we haven't been hearing much about "ratchets" but will need to back-track. Yeah, Square's last round was a "debt" round with decent upside if Square can get back on track.
How does a parasite create zombie-like behavior? experiment.com
90 points by dluan  8 hours ago   18 comments top 10
hellbanner 2 hours ago 1 reply      
If you want to lose sleep: https://www.youtube.com/watch?v=vMG-LWyNcAs

"The caterpillar, instead of building its cocoon to guard itself at night, wraps it around the larval wasps (which previously tore their way out of its body), and will continue to defend them until it starves to death.

The biggest danger for these parasitic wasps is being injected with another species of parasitic wasp".

EDIT: Wow, what a cool site - crowdfunding experiments!

mizzao 2 hours ago 0 replies      
Did anyone notice the rest of the site?

It seems surprising that this entire project was done with $4,500 of crowdfunding. That seems very cheap. I wonder how much of the project funding came from elsewhere.


fauria 2 hours ago 1 reply      
The Leucochloridium paradoxum infects snails, making their eye stalks look like caterpillars, catching birds' attention and thus spreading through them.

More info: https://en.m.wikipedia.org/wiki/Leucochloridium_paradoxum

ericjang 1 hour ago 0 replies      
Scientific content aside, I really think the web is a far superior medium for delivering technical papers. It allows for videos and interactive data exploration. I wish more scientists would publish this way.
wattengard 4 hours ago 2 replies      
Isn't this the fungus that's the inspiration for the Last of Us video game?
Angostura 4 hours ago 0 replies      
I haven't come across this crowdfunding platform for research before and I absolutely love it. I suspect quite a few of my pennies will be going here.
ignoramous 4 hours ago 1 reply      
Another very well presented albeit brief introduction to this topic is this Ed Yong TED Talk: https://www.ted.com/talks/ed_yong_suicidal_wasps_zombie_roac...
univalent 1 hour ago 0 replies      
Somewhat related: "The Girl with All the Gifts", which is on this subject, was a great read. Emotional and thought-provoking in a sea of horrible zombie-related writing/TV.
tempodox 5 hours ago 0 replies      
+1. Fascinating topic, very nicely presented.

I didn't know this platform exists.

spoiler 5 hours ago 1 reply      
tl;dr ant escapes. Bites human. So it begins; the end.

P.S: It's just a joke!

A sliding puzzle, built with Elm moroshko.github.io
40 points by michaelsbradley  6 hours ago   11 comments top 3
cokernel 4 hours ago 3 replies      
It's a nice interface. I'm glad arrow keys, the first thing I tried, work. But you're not going to trick me into looking for a sequence of moves that changes the parity of the underlying permutation.
sotojuan 1 hour ago 0 replies      
Very nice! More reasons to try Elm out.
TensorFlow Benchmarks github.com
68 points by sherjilozair  7 hours ago   37 comments top 7
mark_l_watson 3 hours ago 6 replies      
From Google's perspective it is probably more about how TensorFlow scales out horizontally. If a researcher fires off a Borg run (or whatever they use now) and the job takes a few thousand CPUs, no problem, at least for research.

They must have better optimizations for running in production, such as in-place operations.

iraphael 3 hours ago 0 replies      
I don't know how much this matters, but an issue similar to this has been raised here: https://github.com/tensorflow/tensorflow/issues/120

The responses seem to show that the way you implement things can make a big difference in runtime. Perhaps the scripts used for benchmarking can be further optimized?

That said, the lack of in-place operations might be surprising (although it has been said that they are coming)

donthateyo 3 hours ago 3 replies      
Until now, I've seen two responses to Google's TensorFlow from Facebook employees. Yann LeCun seemed to really challenge Jeff Dean about TensorFlow's scalability [1], and this benchmark puts TensorFlow down there in all the measures it tested for. I can't ignore the possibility that this criticism of TensorFlow from Facebook employees (while factually correct and constructive) might be driven by some competition and jealousy.

[1] https://www.youtube.com/watch?v=90-S1M7Ny_o&t=39m

IanCal 4 hours ago 1 reply      
Interesting benchmarks. One hopefully constructive critique: if you say things go out of memory, it'd be really useful to know what your setup is. Maybe you've got a big array of massive GPUs, or maybe you're running it on a more normal consumer GPU + box.
programnature 4 hours ago 0 replies      
In the announcement they called this the "reference implementation".
vonnik 6 hours ago 0 replies      
It takes 4x as long as CuDNN on AlexNet due to lack of in-place ops. What is up with that?
mtgx 5 hours ago 4 replies      
It's almost like Google wanted everyone to use slow obsolete software and keep the really good stuff for itself, while still making it look like they're doing a great thing for the community.
The sad state of SMTP encryption filippo.io
140 points by FiloSottile  10 hours ago   51 comments top 12
Tepix 7 hours ago 3 replies      
The article clarifies the issues that exist with SMTP encryption nicely.

Regarding the issue with certificates for the servers that the MX points to, I disagree with the author. If the MX for example.com points to mail.example.info, it implies that example.com trusts the handling of its mail to mail.example.info, so there is no issue with letting mail.example.info present its own certificate.

The article also suggests that DNSSEC with DANE will solve all issues with SMTP encryption.

However, DNSSEC is a crappy standard. It doesn't do encryption, so an eavesdropper can still collect metadata; it has unsolved issues that facilitate amplification attacks; it's overly complex; and it has seen slow adoption. In fact, before DANE arrived on the scene, there was hardly a good reason to deploy it.

If we adopt DNSSEC now we'll be stuck with it (including its lack of privacy) pretty much forever. Instead, I suggest we work on more promising initiatives such as DNSCurve (https://en.wikipedia.org/wiki/DNSCurve)
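On DANE specifically: the idea is that a DNSSEC-signed TLSA record at _25._tcp.&lt;mx-host&gt; pins the MX's certificate, so a sender can detect a man in the middle. The common "3 1 1" form (DANE-EE usage, SPKI selector, SHA-256 matching) carries a digest of the certificate's SubjectPublicKeyInfo. A minimal sketch of just the matching step; the SPKI bytes here are made up, and a real sender would fetch the record over validated DNSSEC:

```python
import hashlib

def tlsa_3_1_1(spki_der: bytes) -> str:
    # "3 1 1" = DANE-EE usage, SPKI selector, SHA-256 matching type:
    # the record stores sha256(SubjectPublicKeyInfo) of the server cert.
    return hashlib.sha256(spki_der).hexdigest()

def matches_tlsa(spki_der: bytes, record_digest_hex: str) -> bool:
    # A verifying sender hashes the SPKI it saw during the STARTTLS
    # handshake and compares it with the signed TLSA value.
    return tlsa_3_1_1(spki_der) == record_digest_hex.lower()

# Illustrative placeholder, not a real key:
fake_spki = b"\x30\x82 placeholder SubjectPublicKeyInfo bytes"
record = tlsa_3_1_1(fake_spki)  # what the zone would publish, e.g.
# _25._tcp.mail.example.info. IN TLSA 3 1 1 <record>
```

Note this also sidesteps the naming question above: the TLSA record lives under the MX hostname (mail.example.info), so it naturally pins that host's own key.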

rascul 4 hours ago 1 reply      
A while back I started requiring TLS for connections to my mail server. Not only is this not standards compliant, from what I recall, but I've noticed a disturbing number of other mail servers that apparently require plaintext connections. Most notable was password and college application information which couldn't be delivered to me because their mail servers refused to use TLS. The college in question was kind enough to look at their logs for me, see the issue, and ask me to disable TLS to receive the information. Of course this doesn't solve all the problems. Mail is still stored in plaintext, at least until I figure out a workable solution for that, and there's probably still the possibility of MITM attacks, but I feel at least requiring TLS is a step in the right direction. Until email is finally abolished, anyway.
rc4algorithm 32 minutes ago 0 replies      
This article suggests that there's no point in having a valid SMTP cert. However, consider end-users' clients, which store the SMTP domain (i.e. don't do MX lookups) and connect to it directly. For mail to users on the same email network, this is the only non-local SMTP hop. Securing this connection also prevents anyone on the end-user's local network from MitMing.
nwah1 1 hour ago 0 replies      
Daniel J Bernstein, designer of DNSCurve, proposed some alternatives to SMTP.



snori74 8 hours ago 4 replies      
In fact (as the article admits), the encryption is fine and fully effective against a passive attacker; the problem is that it's not much use against an active man-in-the-middle. But that's not something anyone but an NSA or ISP can easily do between mail servers.
devy 1 hour ago 0 replies      
The author only mentioned STARTTLS as the way to secure SMTP. However, there are at least two other ways to do it:

* TLS wrapper

* Secure Tunnel

And Amazon Web Services Simple Email Service accepts all three approaches. Granted, the latter two may not be supported by a lot of providers, but isn't that the same situation as with browser security? We deprecate old MTAs and old versions of them progressively. Just my two cents.
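For anyone unfamiliar with the first two options: a wrapper speaks TLS from the very first byte (traditionally port 465), while STARTTLS begins in plaintext and upgrades mid-session. A sketch of both with Python's stdlib (ports and hostnames are the conventional ones, nothing provider-specific):

```python
import smtplib
import ssl

# A default context verifies the server certificate and hostname.
ctx = ssl.create_default_context()

def connect_wrapped(host: str) -> smtplib.SMTP_SSL:
    # "TLS wrapper" / implicit TLS: encrypted from the first byte.
    return smtplib.SMTP_SSL(host, 465, context=ctx)

def connect_starttls(host: str) -> smtplib.SMTP:
    # STARTTLS: plaintext greeting first, then an in-band upgrade.
    conn = smtplib.SMTP(host, 587)
    conn.starttls(context=ctx)  # raises if the server refuses the upgrade
    return conn
```

The "secure tunnel" option is essentially the wrapper idea done externally, e.g. stunnel or an SSH tunnel in front of a plaintext MTA.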

teekert 7 hours ago 2 replies      
It's indeed not a nice situation. I'd love for my mail server to insert something like 'TLS not used' into the message subject when this is the case (or a plugin for Roundcube (Next?) that colors the subject; how cool would that be?), just so I'm aware of suspicious things. Enforcing TLS (and valid certs + strong encryption) is just not very practical yet, although it would definitely not affect major players' connectivity (Google, Apple, MS). Still, a lot of problems would arise; to see how many exactly, this is a rough indicator: http://www.google.com/transparencyreport/saferemail/

I predict that in my group of friends I could send/receive to/from almost everyone if I enforced TLS on my server, except that one guy who is savvy enough to have his own domain but hosts his email at a cheap, crappy provider.

yuhong 8 hours ago 0 replies      
Right now I am thinking a HSTS-like solution is the best idea for now, though I do wish for a DNSSEC2 eventually.
capt_hotpants 5 hours ago 4 replies      
A thousand times yes.

PGP and S/MIME are perfectly fine for high-security scenarios (whistleblowing and such), in other words for the 0.000001% use case.

For the 99.9% use case, all that regular folks need is for the sending MX to verify that the recipient MX owns the domain before delivery.

PGP and S/MIME, with their key-signing parties, government-owned PKIs, et cetera, are either wild overkill or so utterly complex that they defeat the purpose for the 99.9% use case.


That said, you are going to break some of my software with this.

Specifically a SMTP reverse proxy, that looks at the domain part of RCPT TO, and transparently forwards the SMTP connection to the correct customer's MX for processing.

It could easily be unbroken again - BUT that would require that Postfix get their software together and add SNI support to their TLS stack (like all? other MX software does).


Implementation proposal:

1) Use RCPT domain-part for the SNI hostname.

2) Always try SMTPS port before SMTP port. Always try STARTTLS before plaintext.

3) Actually verify the certificate, duh.

4) Support a new EHLO header that mimics Strict-Transport-Security exactly.
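Steps 2 and 3 can be sketched with Python's stdlib (mx_host is a placeholder; step 1's custom SNI and step 4's EHLO header have no direct smtplib hook, so they're omitted here):

```python
import smtplib
import ssl

def connect_secure(mx_host: str):
    # Step 2: try the SMTPS (wrapped-TLS) port before falling back to
    # STARTTLS on port 25; never fall back to plaintext.
    # Step 3: create_default_context() actually verifies the cert.
    ctx = ssl.create_default_context()
    try:
        return smtplib.SMTP_SSL(mx_host, 465, context=ctx, timeout=10)
    except OSError:
        pass  # no SMTPS listener; try the upgrade path instead
    conn = smtplib.SMTP(mx_host, 25, timeout=10)
    conn.starttls(context=ctx)  # raises if STARTTLS is refused
    return conn
```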

rnbrady 9 hours ago 0 replies      
Wow, that's an eye opener. Thanks for the write up.
thrownaway2424 8 hours ago 1 reply      
S/MIME also neatly solves the problem.
abricot 7 hours ago 2 replies      
This article motivated me to at least create a self-signed certificate for my server.
Show HN: Sodocan.js: Documentation Made Easy sodocanjs.com
41 points by serrisande  3 hours ago   32 comments top 11
codebeaker 1 hour ago 2 replies      
A shame that the page has to hijack the scroll functionality, it makes for a very unpleasant "lumpy" feeling scrolling. Chrome Version 46.0.2490.80 (64-bit) on OS X Yosemite.
IanCal 1 hour ago 1 reply      
This looks really interesting, thanks for sharing.

I'm finding it quite difficult to read through the main page, however, as there's something going on with the scrolling. It's hyper-sensitive, and when there are animations on the page they play through before jumping me really far down.

amelius 1 hour ago 1 reply      
I think it is a shame that most projects have only documentation in the form of one-liners for each function/variable/class. Those one-liners really don't mean much if you don't have a picture of the global architecture in your mind. That is far more important.
Gigablah 47 minutes ago 1 reply      
I find it frustrating to read the documentation demos you have created. Low contrast and poor choice of font. This is what it looks like on Chrome in Windows 7:


liujoycec 2 hours ago 2 replies      
If you would like to see some real examples of documentation created using Sodocan.js, check out http://jesterswilde.github.io/hashids/ and http://liujoycec.github.io/jwt-simple/ . These were both created from the Blueprints template. These templates are also open-source, and we would love for you to help us improve them and add more templates!
jand 49 minutes ago 1 reply      
Ok, what is the unwanted side-effect for the user? Since I cannot see pricing info or such - how do you monetize? Which part of my soul - as a user - do I sell by using the service?

And: The service looks nice - my way of asking questions does not imply a bad impression.

graffitici 1 hour ago 1 reply      
I'm trying to understand what need this fulfills. I understand it extracts documentation from comments and writes it to a JSON file. There have been countless tools to do this.

But the difference with Sodocan is that it then sends these JSON files to an API server, which hosts the documentation?

So basically instead of using JSDoc + static site generator, one would use this method?

And the benefit would be that the generated documentation would be crowdsourced?

trymas 45 minutes ago 0 replies      
Scroll hijacking in your landing page? Why?

Am I the only one who cannot believe that websites made by (web) developers use scroll hijacking?

sdtsui 2 hours ago 1 reply      
Hey there. Where could an OSS project owner find a 'getting started' guide?
yclept 1 hour ago 1 reply      
is it compatible with jsdoc?
lekeve 2 hours ago 1 reply      
This is awesome!
How Humans Evolved Supersize Brains quantamagazine.org
46 points by retupmoc01  7 hours ago   5 comments top 4
swamp40 16 minutes ago 0 replies      
I like this quote: "A few years later, the anthropologist Richard Wrangham built on this idea, arguing that the invention of cooking was crucial to human brain evolution. Soft, cooked foods are much easier to digest than tough raw ones, yielding more calories for less gastrointestinal work."

So along that line, the massive recent increase in high-sugar food/drinks and fast-food restaurants like McDonald's should be fueling another leap in brain size!

simiano 2 hours ago 0 replies      
There is also this awesome talk by Suzana Herculano-Houzel (mentioned in the article)


restalis 2 hours ago 1 reply      
I am a little disappointed to find no comparison to the dolphin brain in the talk about the number of neurons! Why an elephant? If it was about size, it could have been the animal with the largest brain out there (which happens to be a cetacean, like dolphins)!
CapitalistCartr 2 hours ago 0 replies      
Planet of the Apes here we come. We should have a second uplifted species of ape by mid century.
Specific Problems with Other RNGs pcg-random.org
26 points by nkurz  5 hours ago   15 comments top 6
tptacek 3 hours ago 2 replies      
I went back to the root page of this site to see what PCG is, and I don't understand it. What's the point of an RNG that is "more secure" than an insecure RNG, but less secure than a real CSPRNG? What does "Predictability: Challenging" actually mean?
carterschonwald 2 hours ago 0 replies      
I mentored a GSoC student this summer who worked on RNGs, and the only two algorithms that passed the BigCrush statistical test suite were PCG and SplitMix.
thesz 1 hour ago 0 replies      
ChaCha20 is (relatively) slow, but ChaCha8 has no known attacks and is about 2.5 times faster than ChaCha20.

Not mentioning the speed variability of the ChaCha family is a flaw in the analysis.

pettou 3 hours ago 0 replies      
What does "zero will be produced once less often than every other output", listed under the negative qualities of XorShift and XorShift*, mean? That they are not able to generate "0"?

Also, does anyone know if PCG is in use somewhere today?
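On the zero question: essentially yes, for the plain 64-bit-state variants. The xorshift state update is an invertible linear map on 64-bit values with 0 as a fixed point, so a full-period variant cycles through all 2**64 - 1 nonzero states; over one full period every nonzero output appears exactly once and zero never does, which is the "once less often" being described. A small demonstration using Marsaglia's (13, 7, 17) shift triple:

```python
MASK = (1 << 64) - 1  # work modulo 2**64, as C's uint64_t would

def xorshift64_step(x: int) -> int:
    # One step of Marsaglia's xorshift64 (shift triple 13, 7, 17).
    x ^= (x << 13) & MASK
    x ^= x >> 7
    x ^= (x << 17) & MASK
    return x

# 0 is a fixed point: if the state were ever zero, it would stay zero.
assert xorshift64_step(0) == 0

# From a nonzero seed the state never reaches 0 (each step is
# invertible, so 0's only preimage is 0 itself).
x = 1
for _ in range(100_000):
    x = xorshift64_step(x)
    assert x != 0
```

xorshift* post-multiplies the state by an odd constant, which is also invertible, so the same reasoning applies to its output.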

justcommenting 3 hours ago 0 replies      
Having encountered a number of VPS setups relying on haveged or rng-tools/virtio-rng, has anyone observed "specific problems" with the misconfiguration/misuse of haveged on VPS?
cwmma 3 hours ago 4 replies      
the knocks against the openbsd arc4random, namely

> No facility for a user-provided seed, preventing programs from getting reproducible results

> Periodically stirs the generator using kernel-provided entropy; this code must be removed if reproducible results desired (in testing its speed, I deleted this code)

seem like exactly the kinds of foot guns you really want removed from an RNG you're using for real live code.

       cached 11 November 2015 17:02:03 GMT