hacker news with inline top comments    30 Dec 2014
Why I Drilled Holes in My MacBook Pro and Put It in the Oven
29 points by ColinWright  1 hour ago   15 comments top 5
joeblau 0 minutes ago 0 replies      
This story is awesome. It would be hilarious if the OP ever tried to sell his laptop. I can imagine him trying to explain why all of the drilled holes in the bottom are _actually_ a benefit and not a problem.
petercooper 5 minutes ago 0 replies      
One time I saw it climb as high as 102°C, hot enough to boil water.

If doing anything slightly intensive, my 15" MBPR is the same; I've had it hover at 100-105C for periods. Why not scale down the CPU at crazy temperatures? Intel CPUs have a cutout, but the entire machine needs a more gradual solution. Running at 90C+ is not viable long term.

Now that it's over 2 years old, I might just buy an entry-level MBP rather than maxing out as I usually do. They seem fast enough now, and what's the point if I can't even use the full power?

gnarbarian 29 minutes ago 2 replies      
This is what happens when function plays second fiddle to form.
unvs 7 minutes ago 1 reply      
I have the same laptop model and I'm on my 4th logic board. Luckily, we have 5 years of warranty for hardware failures by law where I live.
aceperry 17 minutes ago 0 replies      
The author shows a lot of love for his laptop.
The Death of Cocoa
149 points by andrewbarba  7 hours ago   57 comments top 17
tempodox 5 minutes ago 0 replies      
I think Swift has promise, but I don't expect it to replace Cocoa anytime soon.

My main complaint about Swift is the lack of available documentation. In C languages, you have static header files which you can read and learn about APIs & types / classes / whatever. API documentation in Swift is generated on the fly when you search for a specific term that happens to be a function / type / protocol / whatever. But how do I find out what functions / methods are applicable to, e.g. a String? There is no chance to just `grep` existing headers for 'String' or some such trick, to find everything related to that type. This may seem minor to some, but for me it is a major stumbling block making Swift effectively a black box for me.

jawngee 2 hours ago 3 replies      
I like Matt, but I think he's wrong.

Objective-C isn't going anywhere, just like C++ hasn't gone anywhere, nor has C left the building either.

There are 124 public frameworks in the 10.10 sdk, 357 private ones. While a good chunk of these are written in Objective-C, a good deal of them are written in C, or Objective-C with the guts in C++. AVFoundation, for example, is mostly C++ in the "backend".

The amount of effort it would take to move those to Swift would be fairly substantial, with no real gain. This doesn't mean that new frameworks won't be written in Swift while remaining interoperable with Objective-C; that will probably happen, though I doubt it'll happen in my development lifetime. So we are looking at a gradual replacement of frameworks, or superseding them with newer ones (QuickTime -> AVFoundation -> ???).

He brings up Carbon, but erroneously: Cocoa actually predates Carbon, and Carbon was only meant to serve as a compatibility bridge from OS 9 to OS X. Carbon was meant to die.

I don't plan to switch to Swift unless I absolutely have to. I spend 14 hours a day developing for OS X, but my problem isn't Objective-C; it's the ambiguous documentation and mildly temperamental behavior of certain frameworks (*cough* AVFoundation *cough*). I didn't need Swift and wasn't looking for Swift. I have nothing against it; I'm sure it's awesome, but the language isn't my issue at this point, so I don't really have much to gain from it.

gurkendoktor 8 minutes ago 0 replies      
This has been my biggest worry since Swift was announced. I am really happy with Foundation (much more than with AppKit or post-iOS 6 UIKit). But if 2014 Apple were to reinvent it, I am sure it would suck.

Has Apple recently released anything that they dogfooded internally first, and that was not buggy? Swift itself is a good example of how Apple seems to work now: let engineers build a toy, release it as v1.0, wait for the early adopters on Twitter to sing its praises, then maybe start using it internally. Maybe.

I wish Apple had instead designed better frameworks for UI and persistence and then built a language to make working with them easier.

gilgoomesh 4 hours ago 1 reply      
Holy clickbait headline, Batman.

Yes, Carbon has been replaced in the past but that involved a $400 million acquisition and 10 years of continual complaining, kicking and screaming from established Carbon users who had no desire to change. I doubt it's an example of how future changes will occur.

Replacing entire application frameworks is hard. Super, super hard. It seems like it might be simple to start by replacing Foundation with Swift's standard library but actually, replacing Foundation would mean replacing every Cocoa framework since they all rely on it. And there's a gigantic amount of work in those frameworks; 25 years of development (all the way back to the early NeXT days).

I think this is why Swift includes such extensive support for Objective-C interoperation: Apple expect Swift will need to link against Objective-C APIs for a long, long time.

I think we're much more likely to see a major deprecation sweep through Cocoa in one or two years' time (probably once Swift finally has all the features Apple have hinted are coming). Not deleting things, per se, but simply saying "these things look ugly or silly in Swift; use these other things instead."

CookWithMe 3 hours ago 0 replies      
I had the same feeling when I did my first side project in Swift earlier this month. My main experience is with CoreData.

An example with 1:n relationships: CoreData returns and expects an (untyped) NSOrderedSet. Now I may either keep the NSOrderedSet, but have to cast each object I want to use - and my Swift code is just as bloated as Obj C would be:

  let obj = mySet[0] as MyClass
instead of:

  let obj = mySet[0]
Plus, map/reduce etc. won't work on an NSOrderedSet.

Or I create a typed Array from the NSOrderedSet, which is fine to work with in Swift - except that it is not managed by CoreData anymore. So I'd have to be careful to synchronize with CoreData manually, and it's not just saving that I have to watch out for, but also other operations like rollback etc.
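The cast-heavy pattern described above can be sketched without CoreData at all. This is a minimal illustration in modern Swift syntax (post-dating the article), using a hypothetical `MyClass` and a plain `[Any]` standing in for the untyped container CoreData returns:

```swift
// Hypothetical model class (stand-in for a CoreData-managed object).
class MyClass {
    let name: String
    init(name: String) { self.name = name }
}

// Untyped container standing in for what CoreData hands back.
let untyped: [Any] = [MyClass(name: "a"), MyClass(name: "b")]

// Each element must be cast before use, much like the NSOrderedSet case:
let first = untyped[0] as! MyClass

// Building a typed array restores map/filter/reduce, at the cost of
// losing the managed container, which is the trade-off described above:
let typed = untyped.compactMap { $0 as? MyClass }
let names = typed.map { $0.name }  // ["a", "b"]
```

Once the typed copy exists, any mutation has to be mirrored back into the managed set by hand, which is exactly the synchronization burden the comment complains about.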

Another example is NSNumber. NSNumber needs to be manually mapped to Int, while the other direction works automatically. That makes sense when it is unknown whether NSNumber is an Integer or a Floating Point Number, but in CoreData I have specified that it's an Integer 32... (Well, I think it was similar with Obj C, actually).

So, working with Swift in the Playground felt like a huge improvement over Obj C at first, but then working with some Cocoa APIs it started to feel ... more clunky again.

I think it was/is similar with Scala. If you can stay in the Scala libraries, awesome. It is a huge improvement over Java. But Scala won't automagically turn a terrible Java API in a great Scala API. Yes, you'll save a couple of semicolons and type declarations, but it's not that big of an improvement. Thankfully (in Scala), there are a lot of better frameworks or wrappers for existing frameworks by now. I guess Swift will have to go the same way...

jarjoura 4 hours ago 1 reply      
Swift has the potential to become a great systems language, but I am skeptical that it will replace Foundation/AppKit/UIKit. For example, a Swift string IS an NSString (at the moment, at least).

It's impossible to know for sure, but look back 15 years to when Cocoa was the shiny new thing next to old-trusty Carbon. Apple was actually writing software in Cocoa internally. Everything they were building towards extended from the NeXT Objective-C world. True, they didn't publicly commit to Cocoa 100% until OS X 10.4, but you had better believe that internally they were all in. The world just didn't see it until the "We're rewriting the Finder in Cocoa" campaign was announced.

At Apple right now, no one outside of the compiler team is working on anything interesting in Swift. It's still locked away from them. To be fair, it's an evolving language and will cause a lot of heartache for everyone until the language has been baked in more.

I know Mattt is excited for Swift. Plus, a lot of developers are already doing some really cool stuff. So we shall see in a couple years how the story plays out.

nailer 24 minutes ago 1 reply      
Oh god, NextStep API:

What else does JSON do if not serialization?

As opposed to JSON things that aren't objects and don't have data? The signal/noise ratio is so low.

philliphaydon 6 hours ago 3 replies      
"Apple is a company with a long view of technology. It's really difficult to tell whether a technology like JSON is going to stick, or if it's just another fad. Apple once released a framework for PubSub, which despite not being widely known or used, still has to be supported for the foreseeable future. Each technology is a gamble of engineering resources."

Apple doubted JSON would be around?

Normati 3 hours ago 0 replies      
This feature of strings is cool! It sounds like the end of the Unicode encoding mess that most languages drag the programmer into:

"One of the truly clever design choices for Swift's String is the internal use of encoding-independent Unicode characters, with exposed "views" to specific encodings:

A collection of UTF-8 code units (accessed with the string's utf8 property)

A collection of UTF-16 code units (accessed with the string's utf16 property)

A collection of 21-bit Unicode scalar values, equivalent to the string's UTF-32 encoding form (accessed with the string's unicodeScalars property)"
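The three views quoted above can be seen side by side in a short sketch (written in modern Swift syntax, which post-dates the article), using a string with one non-ASCII character:

```swift
// "é" here is a single precomposed Unicode scalar (U+00E9).
let s = "café"

// The same string, through each encoding "view":
print(s.utf8.count)           // 5 — "é" takes two bytes in UTF-8
print(s.utf16.count)          // 4 — every character here fits in one UTF-16 unit
print(s.unicodeScalars.count) // 4 — one 21-bit scalar per character
```

The string itself stays encoding-independent; only the view you pick determines what "one element" means.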

bluerobotcat 33 minutes ago 1 reply      
Slightly off-topic: I'm surprised to see the Set implementation in this article. There's really no need for such an abstraction in my opinion, and there definitely is no need to base it on Dictionary<T, Bool>. I have been using Dictionary<T, Void> instead and that works just fine.

Here's what I mean:

    var s = [Int: Void]()
    s[5] = ()
    s[10] = ()
    s[10] = nil
    assert(s[5] != nil)
    assert(s[10] == nil)
    assert(s[1] == nil)

imron 5 hours ago 2 replies      
>Will Swift unseat Javascript as the only viable web scripting language by adding interpreter to Safari?

Adding Swift to Safari would be an interesting development

masters3d 4 hours ago 0 replies      
I think vanilla cocoa is being replaced slowly by cocoa touch. They will probably rename it and then force cocoa touch onto macosx. I don't see foundation changing much though. Maybe foundation touch "universal" is coming ;)
msoad 4 hours ago 0 replies      
With Objective-C they had to do a lot of things at a higher level (the standard library or the framework). With the new shiny language, which has tons of features that would otherwise remain unused, it makes sense to slim down Cocoa and move things to the language level.

A good example is string interpolation. Swift makes it unnecessary to have things like `stringWithFormat` at the framework level.
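The point can be made concrete with a two-line sketch (hypothetical values; the Foundation equivalent is shown only as a comment):

```swift
let framework = "Cocoa"
let year = 2014

// Language-level interpolation replaces the framework-level format call:
// let message = String(format: "The death of %@, %d edition", framework, year)
let message = "The death of \(framework), \(year) edition"
```

Because interpolation is part of the language, it also type-checks its arguments at compile time, which `%@`/`%d` format strings never could.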

bovermyer 5 hours ago 1 reply      
Am I terrible for loving the minimalist design, and simultaneously wondering how I can read the archives from above the fold on the home page, instead of reading the article?
onedev 6 hours ago 0 replies      
I'm in love with the Cocoa
johnny_reilly 3 hours ago 2 replies      
I clicked on this link thinking "yeah, hot chocolate has kind of vanquished cocoa hasn't it?" Very disappointing to discover this had nothing to do with chocolatey drinks.
unicornporn 1 hour ago 0 replies      
Totally OT, but the only thing I could think of when I saw that domain name was https://en.wikipedia.org/wiki/National_Socialism_%28disambig... hipster...
Quake on an oscilloscope
692 points by markmassie  17 hours ago   58 comments top 17
peter303 14 hours ago 2 replies      
This was how we did computer graphics before there were affordable (less than $50K) frame buffers in 1980. In my 1975 MIT Digital Systems Lab, my team constructed a hardware Game of Life out of TTL gates. I designed the display as timed X-Y points on an oscilloscope. In those days a kilobyte of RAM still cost a hundred dollars, so the computer lab rationed the number of memory chips each team could use. I recall we stored the automata in a dual 64 x 64 bit frame buffer, or two kilobytes overall.

The first generation computer graphics languages were vector-oriented to support either oscilloscopes or pen-plotters.

My first color frame buffer terminal was a 512 x 512 x 8-bit AED, $30K in 1980. I think that costs less than a dollar in a low-end cellphone now.

cubano 15 hours ago 2 replies      
I'm old enough to remember when my college computer lab had an analog display for one of its systems...if I remember right, it was used for LaTeX applications and development, which during those days was still an interesting CS direction.

I used to be mesmerized by it... everything looked so clean and modern, much better than the crappy 80x25 "workstation" displays of the VAXes and PDP-11s.

mistercow 13 hours ago 1 reply      
>This means the frequencies emitted are very high (5 samples per period is 19.2 kHz) and it seems the audio output is being low pass filtered resulting in silly wobbly lines.

That effect actually looks amazing. I'd totally play a game with that aesthetic.

russnewcomer 16 hours ago 3 replies      
Always fascinated to see people use alternate output methods. I wonder if there is some possibility for steganographic applications with this? I.e., plug a sound card into an oscilloscope, play a specially crafted .wav or equivalent file, and voilà: secret message?
tenfingers 2 hours ago 1 reply      
Regarding the sound card, from the description it looks like the "crappy" sound card also employs a high-pass filter.

Is there a reason for it? What kind of circuit design in the DAC could cause low frequencies/significant distortion to require a high-pass at these frequency ranges?

Genuine curiosity here.

k_sze 9 hours ago 0 replies      
That looks really trippy. I think it would actually be a cool rendering mode to have in FPS games in general, on computer screens, particularly the cyber-/techno-punk kinds.
bluedino 15 hours ago 2 replies      
Didn't Woz have the original Apple computer outputting letters to an oscilloscope before he made the circuits to output to a television?
Kiro 2 hours ago 1 reply      
I don't know how oscilloscopes work. Is the sound in the game producing the actual image or is it generated from something else?
SCdF 4 hours ago 1 reply      
To bring this full circle, is anyone aware of a software oscilloscope for Windows / iOS that you could 'play' the wave file into to render his Quake demo via software?
frik 16 hours ago 0 replies      
Wow, great!

I can only try to imagine how it would look on an oscilloscope with more MHz and a better sound card.

mutagen 13 hours ago 0 replies      
I wonder if the limit on lines was due to 20 kHz filters on the audio output. When the 24/96 audio sampling & perception discussion went by a few weeks ago I seem to recall some testing showing that even high sample rate sound hardware had low pass filters just above 20 kHz. This kind of issue should have been easy to spot in development and testing, though, so maybe just a limitation of the medium.

I've got an old hobbyist oscilloscope, it was my former boss and mentor's first scope. I really should hack up something with it along with some of these low cost cpu boards.

lotsofmangos 14 hours ago 0 replies      
This would be a very cheap way of representing cyberspace in a b-movie. It already looks massively better than the Matrix's falling green thingies.
rndn 15 hours ago 1 reply      
Awesome idea and execution. I'm wondering whether it would work better with an edge-detection algorithm instead of using the triangle edges.
jokoon 15 hours ago 0 replies      
definitely the coolest cyberpunk thing ever

I wonder if it could be emulated with a shader...

zwegner 14 hours ago 0 replies      
Maybe it's just me, but the audio output sounds really cool.
lotsofmangos 13 hours ago 1 reply      
Given that this is a realtime 3D space squished into the bandwidth of audio, could a variation be done with a worn Kinect that feeds headphones an audio-waveform version of the current surroundings, capable of being translated by the brain back into 3D? That sounds like a possible hack for giving some form of vision to the blind.
dang 16 hours ago 5 replies      
This was posted earlier and set off the voting ring detector. I haven't looked closely, but that may have been a false positive. Since it's a good post and didn't get the attention it deserves, we won't treat this one as a duplicate.
World War II's Strangest Battle: When Americans and Germans Fought Together
50 points by smacktoward  8 hours ago   discuss
Nashville police chief shares message, responds to questions
298 points by whiddershins  12 hours ago   135 comments top 13
sfeng 9 hours ago 6 replies      
He missed the core point that the freedom to protest is important in and of itself. Whether you agree with the topic being protested or not, we should all support our right to protest in a peaceful way without fear of a disproportionate response by law enforcement.

I can say that I am never more proud of our police forces than when I see them maintain their control and treat people with respect in the face of provocation. That ability to stay in command of oneself even when provoked is a core part of maturity and something to be lauded. As he said, it's a sign of a professional.

The role of the police is to prevent violence. That's what makes a community safe. It sounds like the letter-writer, on the other hand, is looking for an agent of his or her frustration. Someone to lash out at the protestors because he or she legally can't. A community where the police can attack people with impunity is about as far from safe as one can get. It seems somewhat obvious, but a safe community is one where no one is attacking anyone.

jobu 10 hours ago 1 reply      
"It is only when we go outside that comfort zone, and subject ourselves to the discomfort of considering thoughts we don't agree with, that we can make an informed judgment on any matter. We can still disagree and maintain our opinions, but we can now do so knowing that the issue has been given consideration from all four sides. Or, if we truly give fair consideration to all points of view, we may need to swallow our pride and amend our original thoughts."

I've heard this sentiment before, but never so well written.

sriram_sun 10 hours ago 1 reply      
Here is another gem "It is somewhat perplexing when children are injected into the conversation as an attempt to bolster a position or as an attempt to thwart the position of another." I was hesitant to read this at first as it had nothing to do with programming. However, I am glad I did. I would recommend the HN community to read both the mail and the response. The response was well thought out, organized and calls out a few biases we carry around.
smtddr 10 hours ago 1 reply      
Be sure to read the whole post, as well as the email chain below it.

This was part of a complaint email sent to him:

>>I wanted to send you this email to express my frustration and outrage at how the situation of these protesters is being handled in Nashville. The first night protesters marched here after the incidents in Ferguson they never should have been allowed to shut down the interstate. Instead of at least threatening to arrest them, they were served coffee and hot chocolate.

This is how you deal with protests. Good job Nashville police; much respect. My own frustration lies with whoever sent this complaint.

awjr 44 minutes ago 0 replies      
His response email is exceptional: "As imperfect humans, we have a tendency to limit our association with other persons to those persons who are most like us. Unfortunately, there is even more of a human tendency to stay within our comfort zone by further narrowing those associations to those persons who share our thoughts and opinions. By doing this we can avoid giving consideration to thoughts and ideas different than our own. This would make us uncomfortable. By considering only the thoughts and ideas we are in agreement with, we stay in our comfort zone. Our own biases get reinforced and reflected back at us leaving no room for any opinion but our own. By doing this, we often convince ourselves that the majority of the world shares our opinion and that anyone with another opinion is, obviously, wrong.

It is only when we go outside that comfort zone, and subject ourselves to the discomfort of considering thoughts we don't agree with, that we can make an informed judgment on any matter. We can still disagree and maintain our opinions, but we can now do so knowing that the issue has been given consideration from all four sides. Or, if we truly give fair consideration to all points of view, we may need to swallow our pride and amend our original thoughts."

vertis 10 hours ago 1 reply      
Any time the police are forced to go arresting people for protesting they've failed a little bit. This is especially true when the protests are about police.

Granted there are times when they are left with no choice, but if there is a strong relationship established between the protesters and the police, such that the protesters believe that the police understand their concerns, how much less likely does this become?

femto 7 hours ago 1 reply      
> The police are merely a representative of a government formed by the people, for the people, for all people

Peelian Principles [1] explicitly say that the police force is not representing the government, as that's the job of the military. Firstly, the police force are citizens in uniform and part of the local community. The police are explicitly in place so the military can stay out of the community.

Militarised police forces around the world would do well to keep in mind that they are making themselves redundant, since if the police are indistinguishable from the military, they might as well be done away with and replaced by the military.

[1] https://en.wikipedia.org/wiki/Peelian_Principles

KayEss 7 hours ago 3 replies      
When did the plural for "person" become "persons" instead of "people"? Is this just some weird Americanism, or is there a specific difference in police jargon?
dchichkov 7 hours ago 2 replies      
"In the year 2013, our officers made over four hundred thousand vehicle stops, mostly for traffic violations. A citation was issued in only about one in six of those stops. Five of the six received warnings. This is the police exercising discretion for minor violations of the law. Few, if any, persons would argue that the police should have no discretion."

Huh. Really? Warnings five out of six times? Is that pretty common?

MistahKoala 10 hours ago 2 replies      
If the letter received was representative of the 'fringe' 5%, I'm interested in how the other 90% read. It's quite anodyne, in my opinion.
lizzard 10 hours ago 0 replies      
Perfectly pitched and perfectly constructed!
dang 11 hours ago 5 replies      
This was killed by user flags, but we unkilled it. It's surprisingly thoughtful and unusual enough to make it intellectually interesting and therefore on topic for this site.
V7Theory 9 hours ago 2 replies      
It's pretty clear the chief intended on showing the writer how well-adjusted he wasn't, but to do so he has to completely ignore the fact that protesters were not just being heard, but shutting down major highways and disrupting people's efforts to enter the mall, this after many protesters doing similar things elsewhere had been violent and criminal. Sure, if you ignore that then yeah, the writer is a total dope.
A Generation Lost in the Bazaar (2012)
191 points by bdr  10 hours ago   66 comments top 18
bojanz 21 minutes ago 1 reply      
I always understood this article as being about software design in open source.

Every project (library, application, you name it) needs to be designed by one or more people thinking deeply about the problem space. You can't expect software design to just happen; once the code has been written, it's already too late, see you in v2. You can't expect 100 people to do it; there will either be a lot of conflicting visions, or no vision at all. And you can't expect to do it in your 1 hour of "free coding time" a day, because that doesn't give you enough time for deep thinking about the problem.

If you try to bypass design and solve it from "another angle", you get libtool, a solution that is a hundred times more complex than the problem it solves.

Look at successful open source projects (Rails, Go, even Linux). They were all initially designed by someone and then handed down to the community to improve. They still have strong-minded architects leading the effort. Now compare that to those random "Let's clone X!" threads that never produce anything.

So, there's cathedral thinking even in the bazaar. And it's the only thing preventing it from eating us alive.

lutorm 9 hours ago 3 replies      
I must admit to being a bit puzzled as to what his point is.

Yeah, there are a lot of dependencies if you want to compile an entire linux distro from source. But it's just exposed because everything is right there. If you actually tried to figure out how to compile all the software on your Windows machine, would it be any better?

And to say that libtool, bloated and complex as it is, came about because someone was incompetent seems quite insulting. Confronted with a compatibility problem between different Unix-like systems, one has two choices:

(1) Coordinate changes to all Unix-like systems so the compatibility problem is removed, or

(2) Work around the problem with some sort of wrapper.

Now, in a perfect world, (1) is obviously the better alternative (but even that wouldn't help existing installations.) But the world is not perfect, and my chances of accomplishing (1) are practically zero, even if it's "just a single flag to the ld(1) command". Hence, anyone who actually wants to get something working on all those systems would have to do whatever it takes.

ChuckMcM 6 hours ago 2 replies      
I have always enjoyed that essay. It reminds me of something my Dad once said to me: "Cannibals don't realize they are cannibals." It wasn't strictly true, of course, but it tried to capture the essence that if you've grown up in a society where X is the norm, you don't know what living in a society where X is not the norm is like or can be like. Combine that with humans who like what they know more than what they don't know, and you find communities moving to high-entropy / low-energy states and staying there.

That said, there isn't anything that prevents people from having standards. Both FreeBSD and MacOS are pretty coherent UNIX type OSes. But it is important to realize that a lot of really innovative and cool stuff comes out of the amazing bubbling pot that is Linux as well. I sometimes wish it were possible to do a better mashup.

pjscott 9 hours ago 2 replies      
Earlier discussion, including a number of comments from the author, here:


lazyjones 4 hours ago 1 reply      
FWIW, the 19000 lines of "configure" for Varnish also check whether stdlib.h exists. Perhaps it's still useful today to do so in order to avoid obscure compilation issues or to catch problems on misconfigured systems early on?

As an old-timer with ~30 years of programming experience, I have similar sentiments as the author about complex projects today, yet I also often feel that too much knowledge, accumulated in sometimes cumbersome form, is being thrown away and reinvented badly. There has to be a compromise somewhere and it's no surprise that projects in an old language like C, running on evolved systems like Unix, de facto standardized on Autoconf to make it a little easier for developers. Do I want to use it myself? Certainly not, I have the luxury of being able to choose a modern language that abstracts most (not all!) platform-specific issues away at compiler installation time, at the cost of having much fewer deployment options for my code.

adambatkin 8 hours ago 2 replies      
And yet it all still works.

Technology is more complicated than even a few years ago. It can do more. It is accessible to more people (and all of their unique needs and abilities). Computers have the ability to make an almost infinite number of interconnections with other computers.

The point is that a single person can't possibly keep track of a sufficient quantity of information to direct a sufficiently complex system anymore. And with the communication and development tools available today we are able to build these complex layered solutions without always having to worry about all of the other details that we can't possibly worry about.

thu 2 hours ago 0 replies      
"Detour", to reuse the word of the last paragraph, is how things happen. Look at how we'll get our favourite programming languages in the browser: by compiling to JavaScript, which is evolving into a potent IR (e.g. browsers support asm.js).

Even if you look at the end result (the m4/configure/shell/Fortran example), and it is indeed twisted, to honestly say it is abnormal to reach such a state is to disregard any experience developing software. Any project, even one brought to life in the cathedral style, will accumulate cruft in the long run that can disappear only with effort.

scj 7 hours ago 0 replies      
There are many related problems to the one pointed out by Kamp. But I'm left asking, does the cathedral scale? Does it handle evolutionary complexity well?

I'm a believer that a much simpler/cleaner set of software tools could be created. But their wide-scale adoption would be more difficult.

f2f 8 hours ago 2 replies      

    > I updated my laptop. I have been running the development
    > version of FreeBSD for 18 years straight now, and compiling
    > even my Spartan work environment from source code takes a
    > full day, because it involves trying to make sense and
    > architecture out of Raymond's anarchistic software bazaar.
ahh, you should've tried Plan 9, phk. 130 seconds to compile all software and libraries, 6 seconds to compile a new kernel. no bazaar there...

of course, this didn't appeal to you back then, did it? ;)

sysk 6 hours ago 1 reply      
I'm a bit confused. Is this supposed to apply to software development in general or just package management / software repository systems? The author is describing his ideas at a high level of abstraction but I can't seem to make a concrete connection. For example, how would one design a web app in a cathedralesque way?
lxe 8 hours ago 0 replies      
There are a few good quality "stalls" in every bazaar, but discovering them, and maintaining some sort of order is difficult. Every package ecosystem suffers from this.
djcapelis 7 hours ago 0 replies      
Unix was never much of a cathedral, was it? The original codebase took points of pride in its jank.

MULTICS on the other hand...

Animats 5 hours ago 0 replies      
The article author points out several problems in the open source world.

The first is that it's too hard to converge things that have diverged. I pointed out an example in a Python library recently - the code for parsing ISO standard date/time stamps exists in at least 11 different versions, most of them with known, but different, bugs. I've had an issue open for two years to get a single usable version into the standard Python library.

Some of this is a tooling problem. Few source control systems allow sharing a file between different projects. (Microsoft SourceSafe is a rare exception.) So code reuse implies a fork. As the author points out, this sort of thing has resulted in a huge number of slightly different copies of standard functions.

Github is helping a little; enough projects now use Github that it's the repository of choice for open source, and Git supports pull requests from outsiders. On some projects, some of the time, they eventually get merged into the master. So at least there's some machinery for convergence. But a library has to be a project of its own for this to work. That's worth working on. A program which scanned Github for common code and proposed code merges would be useful.

Build tools remain a problem. "./configure" is rather dated, of course. The new approach is for each language to have its own packaging/build system. These tend to be kind of mysterious, with opaque caching and dependency systems that almost work. It still seems to be necessary to rebuild everything occasionally, because the dependency system isn't airtight. (It could be, if it used hashes and information about what the compiler/linker/etc. actually looked at to track dependencies. Usually, though, user-created makefiles or manifest files are required.) We've thus progressed, in 30 years, only from "make clean; make" to "cargo update; cargo build".
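The hash-based tracking imagined above can be sketched in a few lines: rebuild a target only when the content hash of an input actually changes, rather than trusting timestamps. This is a toy illustration of the idea (the stamp-file format is invented), not how make or cargo actually work:

```python
import hashlib
import json
import os

def file_hash(path):
    """Content hash of one input file."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def needs_rebuild(inputs, stamp):
    """True if any input's content changed since the last recorded build."""
    current = {p: file_hash(p) for p in inputs}
    previous = {}
    if os.path.exists(stamp):
        with open(stamp) as f:
            previous = json.load(f)
    return current != previous

def record_build(inputs, stamp):
    """Remember the input hashes after a successful build."""
    with open(stamp, "w") as f:
        json.dump({p: file_hash(p) for p in inputs}, f)
```

Touching a file without changing its bytes would no longer trigger a rebuild, which is exactly the airtightness timestamp-based systems lack.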

The interest in shared libraries is perhaps misplaced. A shared library saves memory only when 1) there are several different programs on the same machine using the same library, and 2) a significant fraction of the library code is in use. For libraries above the level of "libc", this is unlikely. Two copies of the same program on UNIX/Linux share their code space even for static libraries. Invoking a shared library not only pulls the whole thing in, it may run the initialization code for everything in the library. This is a great way to make your program start slowly. Ask yourself "is there really a win in making this a shared library?"

Shared libraries which are really big shared objects with state are, in the Linux/UNIX world, mostly a workaround for inadequate support for message passing and middleware. Linux/UNIX still sucks at programs calling programs with performance comparable to subroutine calls. (It can be done; see QNX. When done on Linux, there tend to be too many layers involved, with the overhead of inter-machine communication for a local inter-process call.)

georgemcbay 6 hours ago 2 replies      
I think the core issue the author seems to be getting at is bigger than just the Bazaar (Open Source).

Even when it comes to closed-source commercial software development, I really miss the days of code ownership. When I first started working as a programmer back in the 90's, it was common for different members of the team to "own" sections of code in a larger system (obviously I don't mean "own" in the copyright sense, just in the sense of having one clear person who knows that bit of code inside and out, and who probably wrote most of it). Of course we'd (preferably) still be beholden to code review and such, and couldn't change things willy-nilly so as not to break the code of consumers of our code, but it was clear to all who to talk to if you needed some new functionality in that module.

The last few places I've worked have been the exact opposite of this: everything is some form of "agile," nobody "owns" anything, and stories are assigned primarily based on scheduling availability as opposed to knowledge of a certain system. There is admittedly some management benefit to this -- it's easier to treat developers as cogs that can be moved around, etc. -- but my anecdotal belief is that this sort of setup results in far worse overall code quality, for a number of reasons: lots of developer cache-misses as people bounce around a very large code base making changes to various systems day to day; lots of breadth of understanding of the system among all the developers, but very little depth of understanding of any individual component (which makes gnarly bugs really hard to find when they inevitably occur); and what should be strongly defined APIs between systems getting really leaky. If nobody "owns" any bit of the code, it is easier to hack such leaks across the code than to define the APIs well, and when non-technical managers who interpret "agile" in their own worldview force developers to maintain or increase some specific "velocity," shit like this happens often.

Granted, there are some cases in which such defined ownership falls apart (the person who owns some system is a prima donna asshole and then everyone has to work around them in painful ways), but there were/are solutions to such cases, like don't hire assholes and if you do fire them.

vezzy-fnord 7 hours ago 2 replies      
His observations are sound, but his blaming it on the "bazaar philosophy" doesn't really follow. The problem of unused dependencies that he points out with the Firefox port is a failure in packaging, either due to clumsy work with the port itself, or an inability to properly support soft dependencies.

I barely understand the voodoo magic behind libtool myself, but as PHK says, it "tries to hide the fact that there is no standardized way to build a shared library in Unix". I'd wager dynamic linking inherently poses such quandaries that are easier solved through kludges.

Hey, it's still probably better than WinSxS.

cportela 5 hours ago 0 replies      
Still don't get it, but I recommend looking at the link someone posted to get the view of the author from when this was posted.
angersock 5 hours ago 2 replies      
So, a really great example of this is over in Node land. I was trying to install some basic boilerplate HAPI demo scaffolding, and I watched with horror as dependencies were pulled in (as third or fourth level deps!): isarray, isstring, three different versions of underscore and lodash, and so on and so forth.

I've never seen developers so lazy, or just so uneducated about their own language, that they blatantly pull in libraries for such trivial operations. On the server, even -- no excuse about compatibility!

Secrets of Intel Management Engine Hidden code in your chipset
28 points by mmastrac  8 hours ago   5 comments top 4
ollybee 1 hour ago 0 replies      
My second thought on reading this was: how can a server be PCI compliant with the Intel Management Engine installed? But a quick search shows that Intel has thought of this: http://www.intel.co.uk/content/dam/www/public/us/en/document...

My first thought was that it seems increasingly clear that Stallman has been right all along.

userbinator 25 minutes ago 0 replies      
Wow. SPARC and Java, two things you wouldn't ever expect Intel hardware to ship with! The mention of SOAP-based protocols is also rather surprising, since they have rather high overhead, and this means ME is not just a little 8051-class MCU but almost a fully-featured PC itself...

The amount of complexity - and the opportunities to hide things in that - has increased so much compared to earlier PCs that in some ways I think the development of computer systems is headed on a rather treacherous path. When systems are so complex that no single person can understand them entirely, it's easier to make them behave against their owner's will.

jesrui 1 hour ago 1 reply      
Slides can be downloaded without registration at http://recon.cx/2014/slides/Recon%202014%20Skochinsky.pdf
astrange 35 minutes ago 0 replies      
Why would the newest version of the ME use SPARC ISA? Does someone out there need register windows?
Python: Faster Way
17 points by kozlovsky  2 hours ago   6 comments top 3
jwl 18 minutes ago 0 replies      
Could someone please explain Test #11? How is using %s instead of %d twice as fast even though the same thing is being calculated? I always use %d because it seems like the proper way to do it.
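Rather than trusting the page's numbers, the comparison is easy to rerun; a quick `timeit` sketch (the ratio varies by CPython version and machine, so no particular speedup is claimed here):

```python
import timeit

# Both format specifiers produce identical output for integers...
assert "%s" % 42 == "%d" % 42 == "42"

# ...but they can take different paths inside the interpreter:
# %s calls str() on the operand, while %d goes through the
# integer-formatting code, so timings need not match.
t_s = timeit.timeit('"%s" % i', setup="i = 12345", number=100_000)
t_d = timeit.timeit('"%d" % i', setup="i = 12345", number=100_000)
print("%%s: %.4fs  %%d: %.4fs" % (t_s, t_d))
```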
ayrx 56 minutes ago 2 replies      
I honestly do not get the point of this page.

None of the time differences in these test cases are significant at all. Concerning yourself with this is premature optimization of the highest order, especially in a language like Python. One should definitely be more concerned with writing clear and idiomatic code.

im3w1l 19 minutes ago 1 reply      
What's up with test 11, variant 1 vs 3?
Show HN: #Startup A global startup community, on Slack
33 points by bramk  1 hour ago   34 comments top 8
davidw 37 minutes ago 1 reply      
Speaking of #startups, that channel is free to join on irc.freenode.net.
ttty 1 hour ago 1 reply      
The signup has really bad UX. Can't even use it.

There is a secret (or you will find it later):

- You have to pay $10;

- or write a "great motivation";

to sign up.

petercooper 12 minutes ago 1 reply      
I was ready to donate the $10 but found it just wants me to enter credit card details into the application form, and I was on a non-HTTPS page. It said it uses Stripe but it's tricky to tell if that's really the case (I don't believe there's bad intent here at all, but the UX for taking the payment doesn't inspire confidence).
Kudos 1 hour ago 2 replies      
The domain is dumb, Slack's hash logo is a nod to IRC and not Twitter. It doesn't represent tagging, it represents IRC channels.

Edit: hey hastagstartup.co downvote brigade, hashstartup.co is still available. So is hashstartups.com which is even better.

hayksaakian 44 minutes ago 1 reply      
The concept of a global community of startup people is nice, but why do I have to use your platform to be a part of it?

That question felt unanswered to me after 60 seconds on the landing page.

typeformer 1 hour ago 3 replies      
For those who are wondering the slick UI for the sign up was made with Typeform out of Barcelona :)
danmaz74 49 minutes ago 1 reply      

Shameless plug: if you're interested in the actual hashtag on Twitter, you might like to use one of our free embeds:


Also let me know if you'd like to get detailed analysis of the #startup hashtag, I'd gladly contribute it for free.

meesterdude 1 hour ago 1 reply      
Pretty cool! I just signed up. I think it's an interesting line of dialog to have going in the background, and I'm hopeful it will make the path to starting and running a startup clearer for me.

The signup process is kinda neat, but some of the questions made me go "errr..."

edit: i see they(you?) use typeform for the signup form; never heard of them before, but pretty neat!

Deep Learning Reading List
75 points by jmozah  5 hours ago   8 comments top 4
sabalaba 1 hour ago 0 replies      
I would add "Practical recommendations for gradient-based training of deep architectures" to the list for those who already have a feel for training multi-layer neural nets. It provides a good overview for those that want to learn more about gradient descent, hyperparameter tuning, and other practical considerations involved with training deep architectures.


therobot24 4 hours ago 1 reply      
You're going to get most of these from a simple Google search; if you're going to build a list of what to read, you should at least put some effort into it. Currently this list is missing a lot of the history behind deep learning - only 3 papers listed!

If you want a good set of papers that starts with perceptrons and Hebbian learning and goes through multi-layered neural nets to the emergence of what we now refer to as deep networks, check out http://deeplearning.cs.cmu.edu/

ratsimihah 4 hours ago 1 reply      
Automatic Speech Recognition: a Deep Learning Approach contains an excellent section about deep learning, as well as more content about ASR and hybrid deep learning methods.
userbinator 2 hours ago 0 replies      
For those interested in learning deeply about deep learning?
Working with queue and stack people
32 points by rmason  10 hours ago   4 comments top 3
johnloeber 10 minutes ago 1 reply      
Huh, this is a pretty salient observation. I'm curious about the extent to which it applies -- I certainly am a stack-person some times, and a queue-person at other times. Could it depend on context? Queue for important tasks, Stack for unimportant tasks?
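The metaphor maps directly onto the underlying data structures; a toy sketch in Python (the task names are invented for illustration):

```python
from collections import deque

# Tasks arrive in this order during the day.
tasks = ["write spec", "fix prod bug", "review PR"]

# A "queue person" services tasks in arrival order (FIFO):
q = deque(tasks)
queue_order = []
while q:
    queue_order.append(q.popleft())

# A "stack person" jumps on whatever arrived last (LIFO):
s = list(tasks)
stack_order = []
while s:
    stack_order.append(s.pop())

print(queue_order)  # ['write spec', 'fix prod bug', 'review PR']
print(stack_order)  # ['review PR', 'fix prod bug', 'write spec']
```

The context-dependence the comment suggests would amount to routing each incoming task to one structure or the other based on urgency.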
zniperr 33 minutes ago 0 replies      
The examples are very relatable, at least for me personally. Judging from this fragment, I would consider myself a stack person: "For tasks that represent pain that must be immediately alleviated, it's not that the stack feels I must fix it; rather, this must be fixed. The task is painful not because the stack must fix it, but because it exists, period."

The frustrating part about this is that it is very hard to explain to coworkers why you feel this, because you intuitively expect them to feel the same way. I can recall some cases in which this has caused some friction when working on group projects.

rdtsc 1 hour ago 0 replies      
On a funny note, for mathematicians, their "analysis vs. algebra" bent predicts whether they'll eat corn depth-first or breadth-first.


On the new Snowden documents
40 points by donmcc  11 hours ago   4 comments top
nl 1 hour ago 1 reply      
During the period in question, we know of at least one vulnerability (Heartbleed) that could have been used to extract private keys from software TLS implementations. There are still other, unreported vulnerabilities that could be used today.

His analysis that there are unreported vulnerabilities in TLS implementations sounds definitive enough to think he knows some of these vulnerabilities.

Mae Keane, the Last 'Radium Girl,' Dies at 107
51 points by benbreen  6 hours ago   13 comments top 3
Omniusaspirer 4 hours ago 3 replies      
"There was one woman who the dentist went to pull a tooth and he pulled her entire jaw out when he did it," says Blum. "Their legs broke underneath them. Their spines collapsed."

Horrific stuff.

I've always found it curious how nonchalant people are with dangerous substances before it's realized how hazardous they can be. My grandfather loaded asbestos into railcars at a factory for 30 years, I still can't quite imagine how he must have felt when he fully realized the dangers it posed.

mirkules 1 hour ago 0 replies      
There is a much more in-depth article about the Radium Girls and the ensuing cover-up: http://www.damninteresting.com/undark-and-the-radium-girls/

Interestingly, the article I linked missed the woman FTA, possibly because she quit early.

piokoch 2 hours ago 1 reply      
That's a very interesting story. Radium was a huge hit in its time. What was happening was even called "radium fever".

Radium was everywhere, including toothpastes and shampoos (as a cure for hair loss).

Fortunately only a few people could afford radium-enhanced products, since they were expensive. Side effects started to appear quite soon, too.

There were other popular uses of radiation: until the '50s many shoe shops had X-ray equipment that let customers see how shoes fit [http://en.wikipedia.org/wiki/Shoe-fitting_fluoroscope]. Not that great an idea, after all.

For me it is a kind of warning against using everything that men invent or can do. There are always things we don't fully know or understand. In the long term something could be a great danger for all of us.

Before we jump into the next great thing, it is good to stop for a while...

William Gibson: How I wrote Neuromancer
167 points by _pius  14 hours ago   59 comments top 9
abruzzi 12 hours ago 5 replies      
"The sky above the port was the color of television, tuned to a dead channel" -- That is one of the all-time classic opening lines. It is also a reference people will gradually lose, since most people no longer see that grey static.
veidr 8 hours ago 2 replies      
An interesting brief history of what is, if I was forced to pick one, (still!) the best among the thousands of novels I've read.

The whole Sprawl trilogy is fantastic, and while I agree with other commenters here that Gibson's subsequent novels have become somewhat less awesome, it's hard to complain too much about that if you believe, as I do, that the author in question's first attempt resulted in the best novel of all time.

Still, Neuromancer is indisputably dated, as any such work would inevitably be, so I am glad to have originally read it in the 1980s.

aikah 12 hours ago 5 replies      
A great book. Unfortunately, after Johnny Mnemonic and The Matrix (2 and 3), which were total garbage, I'm not sure I would want a movie based on that book.

It did, however, influence so much good stuff, like Ghost in the Shell (which is basically the same plot), Deus Ex, and others.

I enjoyed the audiobook read by Gibson himself; it was excellent.

laichzeit0 3 hours ago 0 replies      
This is very interesting to me from the "performing under pressure" point of view. I've been under the gun, so to speak, on more than one occasion and invariably I've delivered and learned most rapidly during those times.

Makes me wonder if people who "get shit done" operate on that sort of do-or-die mental state, or how long it's possible to put yourself in that mental state without either burning out or breaking down. I've read similar anecdotes from people like John Carmack and Richard Feynman (again, pressurized during WWII).

It's almost like we're operating at 50% efficiency, maybe we go to 75% when we're really focusing, but actually only when we're in the self-preservation state, we go to 90+%

yzzxy 11 hours ago 0 replies      
I find Gibson's greatest gift is naming things and coming up with vernacular. Panther Moderns and ice are obvious standouts but I particularly liked "funny" as a term for pirated 3d printed objects in The Peripheral.
crucini 12 hours ago 9 replies      
Neuromancer was amazing. I realize now that Gibson's books monotonically decreased in quality.

The books have gotten thicker, artier, more self-indulgent, and weaker.

I'm sure he'd like to recapture the magic he had at 34, but maybe it requires the fear he spoke of. And an absolute ignorance about computers and networks.

I think it shares more with The Maltese Falcon than with any SciFi.

Saw WG complaining about GamerGate recently and thought how much he's aged, and how ungracefully, since GG and Operation Disrespectful Nod reminded me of the Panther Moderns.

jessaustin 10 hours ago 0 replies      
Interesting to find that editor/author Terry Carr was so instrumental. The first scifi I ever read was his Cirque. That was a deeply weird book for an 8yo in the early 1980s (and thinking back I'm not sure whose bookshelf I could have raided to find it) but I was hooked.
hyp0 5 hours ago 0 replies      
answer to title: "I would write, then, to the audience I imagined in the future of my discovery by friendly if unimaginable forces, and to them alone."
wiredfool 12 hours ago 0 replies      

  "Is it going to be OK?" I asked, my anxiety phrasing the question. He paused on the stair, gave me a brief, memorably odd look, then smiled. "Yes," he said, "I definitely think it will."
Anxiety. Over the quality of the manuscript.

The best things and stuff of 2014
254 points by untothebreach  21 hours ago   17 comments top 8
NhanH 13 hours ago 0 replies      
I will need a follow-up blog post titled "How to be productive" or something similar.

But seriously, I can't fathom myself doing half the stuff being listed in the post in a year.

gluggymug 1 hour ago 0 replies      
"Favorite code read...Z3 (Verilog)"

As someone working in ASIC design and verification, that code is not a good read IMO. I am stunned it even synthesized. The use of "initial" and "task" is not generally for describing hardware.

cheeseprocedure 12 hours ago 0 replies      
When I see these end-of-year book lists, I wonder if I'm the slowest reader on the planet.
_nullandnull_ 14 hours ago 1 reply      
> Norwegian Wood, The Contortionist's Handbook

I hope he reads both these books consecutively. These two books are worlds apart but both equally excellent. It would be an interesting contrast.

hammerha 6 hours ago 0 replies      
That list seems like more than I've done in my whole life.
acidx 11 hours ago 1 reply      
Surprised to see that my blog post about Lwan made it onto such a list.
shoshin23 13 hours ago 1 reply      
I really liked Read-Eval-Print-Loop, I hope he goes on to publish a few issues in 2015.
polynomial 15 hours ago 1 reply      
" Number of books published: 1 "

" Number of books written: 0 "


In his garage lab, Omahan aims to bend fabric of space
35 points by rmason  6 hours ago   14 comments top 6
mjfl 2 hours ago 0 replies      
Inspiring. I'm sure there are more than a few members of this community content to knock down and debunk everything he's doing. I'm not going to do that; instead I'm going to upvote and hope more people take on similar crazy ventures.
colordrops 1 hour ago 1 reply      
I don't know what to make of this. It has a lot of the same idiosyncrasies as other pseudo-scientific endeavors: the '90s-style website design, several spelling errors, a disproportionate amount of media coverage, poorly designed charts, a jury-rigged lab setup, etc.

They always include a model of their "spacecraft" in every shot. In this video, https://www.youtube.com/watch?v=8TKTsAa4sSs, they already have a pilot, random equations on the blackboard, and a spread of meters in front of the person speaking.

Their lead scientist is a physics student who is the president of the UNO Paranormal Society.

It all seems rather fishy, but I'll withhold my judgement until someone comes and reproduces or falsifies their claims.

rmason 6 hours ago 2 replies      
There were several well funded efforts to create the first airplane, Alexander Graham Bell among them. Yet they got bested by two bicycle mechanics.

I truly believe if someone creates a warp drive it will indeed be someone like this guy operating out of a garage and not NASA.

colechristensen 1 hour ago 0 replies      
This is bollocks.

The Wright brothers did not break any barriers in basic science, they solved an engineering problem. The basics of fluid dynamics and prerequisites to flight were established in the late 19th/beginning of the 20th century. There were plenty of details to sort out and experiments to do, but the scientific foundation was there. The problem of flight had been reduced to the engineering problem of improving two ratios: Thrust/Weight and Lift/Drag.

What this man in his garage proposes he's done would be earth-shattering if the demo he's impressed upon a rather impressionable reporter (and several HN commenters) were real -- more impressive than anything the $10 billion LHC could hope for. But he hasn't, let's be clear.

What's actually happened is a guy tinkering in his garage with electromagnets has gotten himself and a few people surrounding him caught up in grandiose ideas. Magnetic fields aren't blocked by Faraday cages, which easily explains away any "warping" he thinks he's accomplished.

You shouldn't need technical knowledge to figure this out though, because the top half of that article isn't about what he's doing, it's about the underdog ignored maverick thinker making revolutions in his garage. It's in an online newspaper hosted at omaha.com. It's written by someone with obviously little scientific training.

Personal bollocks filters are important to develop.

steventhedev 1 hour ago 1 reply      
Here's his site, for those who want to start the analysis of his data/methodology:


japhyr 2 hours ago 1 reply      
His work sounds easy enough to test: hang a weight in an electric field, and the weight moves farther than it should. Has there been any effort to independently test what he claims to have discovered?
Reverse engineering a Qualcomm baseband processor [pdf]
58 points by dodders  7 hours ago   25 comments top 6
mmastrac 7 hours ago 4 replies      
This topic is close to my heart. I spent a few years immersed in the Qualcomm basebands as part of the unrevoked project and personal research. I stared at the ARM code for what must be hundreds of hours.

There are so many vulnerabilities in the baseband that it's not even funny. Even the QCOM secure boot process is full of holes. If a government agency wanted to drop a persistent baseband 'rootkit' on your device with full access to userspace, they could (unless you're using one of the few phones with separate userspace and baseband processors).

The DIAG commands are particularly fun. You can read and write memory on most phones. Some have locked it down to certain areas, but this varies wildly depending on manufacturer.

CamperBob2 6 hours ago 0 replies      
Unfortunately this is almost guaranteed to bring a legal attack from Qualcomm, with or without actual grounds. I've never encountered a more litigious company in my (long) involvement in electronics, or the tech sector in general. Whether Qualcomm employs more engineers or more lawyers is an open research topic.
therealmarv 6 hours ago 2 replies      
Are there any open-source baseband phones out there? Does an open-source baseband actually exist? So many people think they have a phone with open-source software, but so many components, especially the baseband, can give so much control over the phone.
pronoiac 3 hours ago 0 replies      
If you're wondering, iPhones have used both Qualcomm and Infineon baseband processors: https://theiphonewiki.com/wiki/Baseband_Device

According to a note in this presentation, Ralf-Philipp Weinmann has noted exploits on baseband processors from both.

jordanthoms 6 hours ago 1 reply      
So the usual view is that the capabilities we hear of the NSA having (keeping the phone on even when it appears to be off, using GPS etc. to locate the phone, transmitting from the microphone in the background, etc.) are enabled in the baseband, when it receives coded requests from the network.

It'd be interesting if reverse engineering of the baseband could find those capabilities and see what's really possible and how it works.

jcr 7 hours ago 0 replies      
Here's the video of the talk Guillaume Delugre gave on this PDF at 28C3 in 2011.


It's both fascinating and frightening.

Show HN: LambdaNet A functional neural network library written in Haskell
38 points by jbarrow  6 hours ago   3 comments top 2
im3w1l 2 hours ago 0 replies      
Looks very nice! May I ask how fast it is?
coolsunglasses 2 hours ago 1 reply      
This looks really cool. Are there any papers that describe neural networks that function in a manner similar to how this library works?
Predictions for the future from 1930
7 points by capdiz  3 hours ago   discuss
Software engineers should write
264 points by shbhrsaha  18 hours ago   128 comments top 46
edpichler 16 hours ago 5 replies      
"Even if nobody reads your essay, writing it will make an impact on you."

After reading a post on HN (https://news.ycombinator.com/item?id=5614689) entitled "Why you should write every day," I've been doing it daily in a private blog. I do it in English to improve my second language. My main language is Portuguese.

I've been doing it since 09/22/2014. I try to write about my own ideas, because I believe it is the right thing to do and the best subject for improving myself. It's not an easy task, and I don't feel I'm improving yet, but something in me tells me that I should keep doing it.

kalid 16 hours ago 4 replies      
I've always been a fan of Einstein's quote "If you can't explain it simply, you don't understand it well enough."

Writing about a topic is a good test of whether you can explain it simply.

singingfish 33 minutes ago 0 replies      
"Writing is easy: All you do is sit staring at a blank sheet of paper until drops of blood form on your forehead."

(Yeah, I've written one book with a readership of thousands, 5/8ths of a PhD with a readership of 5, and a bunch of academic articles with variable readership.)

tomphoolery 17 hours ago 4 replies      
I was always an "English kid", came close to failing my math subjects in middle school and finally in high school, I did fail Algebra I, and had to re-take it the next year. Meanwhile, I was in advanced programming courses and on my way to take an AP Computer Science course in the last semester of my sophomore year. Looking back, that experience taught me about how important modeling is to pedagogy. The fact is, my Algebra I teacher couldn't model the problem for me. He never explained anything, just expected us to memorize everything. We were never given real, tangible examples (contrived word problems don't count and never did!)...all we did was take stupid tests. My high school math experience was like a crash course in everything that's wrong with STEM education in America, and why we need to alter that if we're going to depend more on STEM in the real world.

In my opinion, programming has always been a form of writing. Just like songwriting is a form of writing. It's simply a different medium, and therefore you get a different result.

I might be looking at it from a different kind of lens though.

euphemize 17 hours ago 3 replies      
Thoroughly agreed. As a developer who spent a lot of time studying humanities, arts and other disciplines requiring constant writing for grant submissions, essays, etc. a lot of skills required to be a good developer become invaluable for a dissertation and the other way around.

  >  A core skill in both disciplines is an ability to think clearly. The best software engineers are great writers because their prose is as logical and elegant as their code.
Personally, I found that combing through my text again and again to cut down on unnecessary words, combining similar ideas, and clarifying points made a huge difference, and was very much akin to optimizing software. It was generally something other students didn't bother to do, and their writing suffered greatly for it.

Bahamut 17 hours ago 0 replies      
Highly agree with this - I am baffled why the humanities aren't taught this way in US high schools. Writing always seemed difficult to me then, but upon taking my first college class which had lots of essay writing, it dawned on me that making a persuasive argument was the most important part of an essay. This has much in common with the critical thinking touted in the hard sciences & mathematics. If English was taught this way in K-12 education, instead of enforcing arbitrary rules such as page length & word count, I would have done immensely better.

I got the important takeaway from that experience, but many people do not, and it is a shame.

joeblau 16 hours ago 2 replies      
I write a blog[1] and I try to add good documentation to my open source project[2], but I recognize that I'm in the minority. One benefit I get from writing, even though I don't get a lot of readers, is thought refinement. I usually send my blog posts to friends and family for help on word choice and better delivery. Even though Steve Jobs said people don't read, I think reading and writing are critical because you don't always have a camera or a microphone to get your message across.

[1] - https://blog.joeblau.com/

[2] - https://github.com/joeblau/gitignore.io

marktangotango 17 hours ago 2 replies      
Hey software engineers, write some m*ther f!cking documentation! Don't tell me it goes out of date; at the very least, a module-level architectural overview is better than nothing, and should remain relevant past your tenure.


dataphile 15 hours ago 1 reply      
Thanks for posting this, it is very timely for me. I have been a forum lurker for most of my life. I visit Hacker News almost every day but seldom do I post a comment, and I have never submitted an article. Same with Facebook; I'm mostly a lurker. It is my New Year's resolution to start writing and contributing more to the online communities I visit. In fact, I just finished the first draft of a blog post about my experience using Angular, LokiJS, and Ionic to make offline apps. Hopefully in the next day or two I will publish it on my blog and maybe even submit it to Hacker News. Your post encourages me to keep at it. Thanks.
stevebmark 5 hours ago 0 replies      
Writing a blog post about technology, such as teaching a library or programming language, or how to accomplish something, or even docs for your own library is really hard to get right. In the programming field I work in, I'd guess about 90% of all tutorials / tech blog posts are poorly written. Even Github READMEs are a minefield of very poorly presented ideas. I would not encourage people to continue to add to this noisy spectrum unless they are already capable of conveying ideas clearly and simply. Most technical articles, including many that front page HN, do not come close to this criteria.

I agree that writing can be helpful for many things, such as expressing emotion, or telling stories, or just a journal. In those scenarios, it's not dangerous to get it wrong. No one will lose their way in a technical project because you can't write cleanly about your dog.

Write anything except technical articles, until someone comments with something like "this was really well written!" Then you can consider adding to the painful cloud of tech articles.

If you release a tech blog post without editing (and largely re-writing) it a minimum of three times, stop doing it. Seriously. You're not helping.

Also, if you're a newcomer to a tech field and get discouraged by trying to learn about something from online resources, 90% of the time it's not you. It's the author being unable to clearly present ideas. Don't get discouraged!

henryw 15 hours ago 1 reply      
I've always admired the writing style in the Economist: http://www.economist.com/styleguide/introduction
ggambetta 17 hours ago 3 replies      
Why stop at essays or technical articles? As an engineer, I've always been fascinated by the structure and inner mechanics of stories - what makes them work.

As a hobby I've done a lot of reading around this; I've written three feature-length screenplays, and a novel you can find in Amazon[1], using very structure-centric approaches (as a result, my characters tend to be too flat).

Take a look at The Snowflake Method[2], unsurprisingly designed by a novelist who is also a theoretical physicist. Even with The Hero's Journey, there's a surprising amount of well-understood structure behind every story.

[1] http://www.amazon.com/dp/B00QPBYGFI[2] http://www.advancedfictionwriting.com/articles/snowflake-met...

0xdeadbeefbabe 15 hours ago 1 reply      
Andy Rooney complained about people who say, "I'm going to write a novel when I retire" but don't say, "I'm going to do brain surgery when I retire."

I think he might mean that writing for human consumption can be harder than most people think.

portman 14 hours ago 0 replies      
Reminds me of Steven Pinker's recently-published book on how to write well, "The Sense of Style" [1]

Pinker uses software terms to describe good writing: convert a _web_ of ideas into a _tree_ of syntax into a _string_ of words.

[1] http://www.amazon.com/dp/0670025852

pluc 17 hours ago 0 replies      
It's been said that a programmer's brain is more akin to that of a writer than a mathematician, allegedly because learning programming languages is much like learning a language - the same areas of the brain are involved.

From http://www.huffingtonpost.com/chris-parnin/scientists-begin-...

> Scientists are finding that there may be a deeper connection between programming languages and other languages than previously thought. Brain-imaging techniques, such as fMRI, allow scientists to compare and contrast different cognitive tasks by analyzing differences in brain locations that are activated by the tasks. For people that are fluent in a second language, studies have shown distinct developmental differences in language processing regions of the brain. A new study provides new evidence that programmers are using language regions of the brain when understanding code and found little activation in other regions of the brain devoted to mathematical thinking.

mastazi 7 hours ago 0 replies      
As a former English-kid (well, I grew up in Italy, so I was an "Italian-kid") who became a programmer after having worked as a journalist, I completely agree with the article. Although code can be seen merely as a series of mathematical statements, it nonetheless has its grammar, syntax, and semantics, just like any natural language does. I have noticed that in Australia (where I got my bachelor's in IT) you are required to write essays on a regular basis even if you are studying scientific subjects, and I think that's good.
jobu 17 hours ago 1 reply      
There are times I wish I could go back to my younger self and explain exactly this. Unfortunately it took me almost a decade as a software engineer to realize that not being able to communicate effectively in writing or presentations was a career barrier.

It's possible to stay as an average engineer for a long time, but if you want to try being an Architect, then at least 50% of your time is spent writing or public speaking. If you want to be an engineering manager, that's over 90%.

Fortunately a company I used to work for believed pretty strongly in cultivating these "soft skills", so they incentivized things like Tech Talks, and covered the cost of courses like Dale Carnegie.

lmm 15 hours ago 0 replies      
If you enjoy writing, then write. If writing, or improving your writing, helps you achieve the things you want to, then write.

But don't feel you "should". Essays are a bit like code - but if you want to get better at coding, you'll do better practicing coding than practicing essays. Likewise if your goal is "impact"; blog posts, particularly general ones like this, are ten-a-penny - even really good ones. Whereas really good software libraries are rare, even now - and you're more likely to write a specialist software library, with a small audience but one for whom that library is vital, than an equally specialist blog post. And while writing about something may clarify your thoughts, it's nothing next to setting that thing down in code.

Once again, do what you enjoy. If you like to paint, paint; if you like to make music, make music. But if you'd rather just code, or even just watch TV (the very epitome of unproductive wastefulness - but the typical blog probably achieves very little more), that's fine too. Don't let anyone tell you you shouldn't.

azdle 17 hours ago 0 replies      
I completely agree with this. I've started blogging for my work on some of the things that I'm actually writing code for.

I've actually found that it helps me think about more of the big picture stuff. In writing my first post about one of our APIs [1] I actually realized that there was a small omission in how we designed it.

[1] http://exosite.com/real-time-device-communication-part-1/

MichaelCrawford 7 hours ago 0 replies      
Come to http://www.kuro5hin.org/ - most of us are software engineers; all we do is write.

Before submitting a story, spend some time introducing yourself to the community - post diaries, as well as reply to the diaries and comments of other kurons.

alexggordon 15 hours ago 0 replies      
I think this is one of the main benefits I find on HN. Not only does this give me an intelligent, well educated community to talk to, but most often the community shares my hobbies and interests.

I think being able to write is extremely important, but I think the rhetoric behind writing is just as important, if not more important. When you write in a community or forum, like HN, citing your sources and defending your arguments is more important than on a blog, because if you don't, your voice simply won't be heard as loudly.

Contrast this with clickbait blogs, or blogs that write purely for shock value, and it becomes clear that having a humorous or convincing writing style is almost as important as being able to argue your point or convey a complicated idea. In my mind, though, the latter is the far more important skill in the long run.

So yes, software engineers should write, but also don't forget to do some 'code' reviews.

adrianh 13 hours ago 0 replies      
I interpreted this headline as "Software that engineers should write," as in "Engineers, please make the following bits of software."

Which in a way helps prove one of the article's points: writing and programming are alike in their need for precision and clarity. :)

vayarajesh 7 hours ago 0 replies      
I completely agree with you; writing does help the software engineer think clearly. I recently started writing short poems and it has helped me think better when writing code.

I also believe that engineers should write code every day as well. Many engineers take at least a day off during the week, and it somehow resets the mind a little.

harshbhasin 15 hours ago 0 replies      
I just published my first kindle short story: http://www.amazon.com/dp/B00RA3UD20

Also, related link on writing: https://news.ycombinator.com/item?id=8793024

amelius 12 hours ago 3 replies      
> Software engineers should write because it promotes many of the same skills required in programming.

The problem with writing is that you usually do it by serializing your thoughts in one go. Programming on the other hand is an activity where you almost randomly jump from one point to the other.

> Code and essays have a lot more in common.

What I hate about writing prose, is that you are expected to use synonyms all over the place. If you use the same word in two subsequent sentences, this is considered "bad". With programming, I have no such problem.

sigil 15 hours ago 0 replies      
To paraphrase Knuth, "programming is explaining to another human being what you want a computer to do."

Should you take it to the extreme Knuth did with Literate Programming? I personally don't. But, once I've successfully explained to the computer what it should do (my program works) I look for ways to better communicate what it's doing (my program is readable). In many cases that's harder than solving the technical problem at hand.

Concision and simplicity seem to be the key to that. I agree with the author that "like good prose, good code is concise," although for prose that's more a matter of taste. Otherwise we'd all be reading a lot more Hemingway.

fallat 14 hours ago 0 replies      
I don't know about writing essays, but just writing about your ideas to yourself and showing others is a great way to explore those ideas even further. At least that's what I find. That's why I started a blog about 8 months ago and try to get something interesting into it at least once a month. I've had some great discussions with people. I don't like to see it as a blog either (in the sense that I want everyone to see what I'm writing), but more as a commentary platform to validate and explore my thoughts.

Obligatory link: ecc-comp.blogspot.ca

mkramlich 16 hours ago 0 replies      
I agree with the article's arguments. I've been programming since age 10. Doing creative writing almost as long. As an adult ended up doing software engineering as my career. But wrote and published a fiction book two years ago (The Dread Space Pirate Richard, an adventure comedy), close to finishing its sequel (The Man in Black) and also have my first technical book (Software Performance & Scalability) under development. Also written a screenplay and many short stories.

I've found a lot of overlap, in thought process, between programming and writing. Also with music and math. Leverage everything that helps, I think.

mschip 12 hours ago 1 reply      
Great article. As somebody who has always identified with the "math" kid, I've never dreaded nor disliked writing. In fact, I often have the urge to write. My biggest issue is confidence in my vocabulary. After years of schooling for engineering I feel as though my non-technical vocabulary hasn't progressed much since high school and always seems to fall to the bottom of my personal studies.
henrik_w 16 hours ago 1 reply      
I like this quote (from Joan Didion):

"I don't know what I think until I try to write it down."

tooatui 12 hours ago 0 replies      
Writing is never easy for me even in my first language. I guess I am one of those "math kids", but some part of me wants to write, to express myself, to make an impact. I do become better at writing after a few blog posts in Chinese, and I will start to write in English, because I think English is such a beautiful language and after so many years of studying, writing might be the way for me to truly understand and use it.
mathattack 17 hours ago 0 replies      
Where I grew up the distinction didn't happen until high school, and even then there was a large overlap between AP Calculus and AP English. Computer Science was a different beast - we were a subset of the Math nerds that didn't necessarily get into English due to the imprecision.

The irony is that the precision of CS makes us better writers because we can see the inconsistencies. (How many requirements documents can be interpreted multiple ways?) Between undergrad and grad the Math/Verbal spread on my standardized tests flipped.

emcarey 16 hours ago 0 replies      
When I first moved to the valley (as a writer) I found I had so much in common with my new friends who were software engineers. Our personality traits were similar, and we wore hoodies and stayed up all night. I'm actually a brilliant math student but writing was the skill set I pursued. I'm really glad you wrote this -- there are so many fascinating parts of software engineering and bright minds whom I would love to learn more insights from!
k__ 17 hours ago 2 replies      
Thinking back, I was neither.

I wasn't good in math, languages or anything in elementary school.

I didn't want to be there and always played "sick".

This just got a little bit better, when I left elementary school and switched 2 schools afterwards. Since the second school was a lot easier than the first, I got better grades without doing anything.

But I never got really good at anything at school, better in Science than in Humanities, always a B- on average. Even my degrees got that rating...

kevinmireles 8 hours ago 0 replies      
As a writer, software product manager and dad who is trying to encourage his girls to learn to code, I really enjoyed the post as it highlights the similarities between programming and writing - and the fact that communication is truly one of the most critical skills a developer or anyone can have.

Now, I just need to learn a programming language and start coding so I can improve my human/computer communication skills :)

TylerH 13 hours ago 0 replies      
This title is a bit too vague; I read it as "Software [that] engineers should write" thinking it was going to be an exposé on how software engineers could better spend their time. Maybe change it to something like "Software engineers need to write, not just code"
friendcode 16 hours ago 0 replies      
For all the software engineers who want to write: https://www.gitbook.com
karlbeecher 12 hours ago 0 replies      
Totally agree with this post. I was a math/english kid. Still am. I went as far as writing a whole book about computer science:


normloman 14 hours ago 0 replies      
Software engineers do write. You see their rants on hacker news all the time. Most of the time, they write about why some programming language is either good or bad.

Maybe software engineers should write less.

Kalium 17 hours ago 4 replies      
I find myself uncertain about the thesis. Code that is clear and expressive to a human should not be assumed to also be efficient and optimal for a computer.

As writers say, know your audience.

exacube 15 hours ago 0 replies      
I think programming might help you write prose pieces, but there is still a lot more to literature that remains very artistic.
VLM 15 hours ago 1 reply      
Interesting discussion point, Larry Wall, linguist, invented a programming language that is not exactly considered stylish at this time. Implications?
volune 13 hours ago 0 replies      
they should do it all. there is nothing they should not do.
graycat 14 hours ago 0 replies      
<rant> YMMV:

There is little so obscure as undocumented code.

An old software joke goes, "When code is written, only the programmer and God understand it. Six months later, only God."

As a result, for continued understanding of code, documentation that explains the code to a human reader is crucial. In simple terms, to humans, code without documentation is at best a puzzle problem in translation and otherwise next to meaningless. Use of mnemonic identifier names to make the code readable has created a pidgin-like language that is usually unclear and inadequate.

Thus, writing documentation is crucial, for the next time the code needs to be read and understood, for users, etc.

Thus, net, after too many years with code and software, I claim (big letters in sky writing, please):

The most important problem, and a severe bottleneck, in computing is the need for more and better technical writing.

My suggestions for some of the best models of such technical writing are a classic text in freshman physics, a classic text in freshman calculus, and, at times, a classic text in college abstract algebra (for examples of especially high precision in technical writing). Otherwise I suggest Knuth's The Art of Computer Programming.

First rule of technical writing: a word used with a meaning not clear in an ordinary dictionary is a term in technical writing, say, a technical term. Then, before a term is used in the writing, it needs a definition, that is, it needs to have been motivated, defined precisely (maybe even mathematically), explained, and illustrated with examples. Then, whenever in doubt when using the term, include a link back to the definition. So the first rule of technical writing is never, but never, use a term without easy access to the definition. Similarly for acronyms.

Biggest bottleneck in computing.... Sorry 'bout that. YMMV. </rant>

dreamdu5t 14 hours ago 0 replies      
Don't tell me what I should do. It's condescending. You don't know me. You should learn how to write without adopting a tone of presumptuous condescension.
jmnicholson 13 hours ago 0 replies      
Software engineers should write at thewinnower.com. Here's why:

1) their writings will be preserved with the same power that libraries afford traditional scientific publishing

2) They won't just be blogging, they will be publishing.

3) They can assign a digital object identifier (DOI) at their discretion making their work "count" in the scholarly literature.

4) Their blog will be automatically formatted as a PDF.

5) https://thewinnower.com/papers/science-the-pursuit-of-the-tr...

6) https://thewinnower.com/papers/making-scientific-blogging-co...

Kyoto Tycoon for modern systems
4 points by rcarmo  2 hours ago   discuss
Scaled Inference Raises $13.6M to Build Out Machine Learning
4 points by prostoalex  3 hours ago   discuss
Reducing Lwan memory usage by 2670%
4 points by dsr12  2 hours ago   1 comment top
osmala 14 minutes ago 0 replies      
If something is X% less, or reduced by X%, then the number you take the percentage from is the value BEFORE the change, not after. So it's (820-32)/820, which is really a 96% reduction.

The correct way to state huge percentages is to say it used to be X% MORE. With "more" you can get huge percentages; with "less" you can't get beyond 100%.
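To make the arithmetic concrete, here is a small JavaScript sketch using the 820 and 32 figures quoted above (units are arbitrary):

```javascript
// Memory use before and after the optimization (units don't matter).
const before = 820;
const after = 32;

// A "reduction" is measured against the BEFORE value, so it can
// never exceed 100%.
const reduction = (before - after) / before * 100;

// A huge figure like the headline's comes from measuring against the
// AFTER value instead, i.e. "it used to use X% more" arithmetic.
const usedToBeMore = (before - after) / after * 100;

console.log(reduction.toFixed(1));    // "96.1"
console.log(usedToBeMore.toFixed(1)); // "2462.5"
```

The exact headline figure depends on which before/after numbers the article uses; the point is only that the two formulas measure against different baselines.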

A comforting lie
77 points by colinprince  15 hours ago   11 comments top 5
zw123456 6 hours ago 2 replies      
The comfort noise that he is referring to is the white noise that is intentionally introduced by modern digital cell phones. Most cell phones have a CODEC that will detect silence and rather than wasting valuable bandwidth on nothing, a silence packet is sent that says something like "quiet for the next 250ms" or something like that. The GSM specification for the AMR CODEC provides for a feature called comfort noise. It turns out that on the other end, if the listener hears total silence, they worry that the connection was lost. So instead it will provide white noise, specifically +/- 2048 (of a 16 bit word). Since a pseudo random number generator on the phone is typically used, it is making white noise, similar to the white noise you hear on an old fashioned analog phone or as the author references, a radio tuned to no station and you hear random static a percentage of which is cosmic MW background radiation from the big bang. It is ironic that with all that digital technology and people still like the comfort of white noise in the background, not complete silence. Of course the carriers could instead use some extra BW and transmit the actual background noise of whoever you are talking to.
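The sample generation the comment describes, pseudo-random values bounded at +/-2048 out of a 16-bit range, can be sketched roughly as follows. This is only an illustration of the idea: real AMR comfort noise is shaped by level and spectrum parameters carried in SID (silence descriptor) frames, not by a bare PRNG.

```javascript
// Illustrative only: fill a frame with low-level uniform white noise,
// samples in [-2048, 2048] out of the 16-bit PCM range (+/-32768).
function comfortNoise(numSamples, amplitude = 2048) {
  const frame = new Int16Array(numSamples);
  for (let i = 0; i < numSamples; i++) {
    // Uniform pseudo-random sample in [-amplitude, +amplitude].
    frame[i] = Math.floor(Math.random() * (2 * amplitude + 1)) - amplitude;
  }
  return frame;
}

// A 20 ms frame at 8 kHz narrowband sampling is 160 samples.
const frame = comfortNoise(160);
```

At +/-2048 of +/-32768 full scale, the noise sits roughly 24 dB below peak, quiet enough to reassure the listener without masking speech.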
couchand 4 hours ago 0 replies      
Remember that it's not always about comfort, sometimes it genuinely is about usability. Consider Skeuocard [0]: it's much easier to enter information when the place you're entering it looks like the place you're reading it from.

[0]: http://kenkeiter.com/skeuocard/

raintrees 3 hours ago 0 replies      
And my copy of Linux Mint still uses the shutter-click-motor winding sound for screen shots... Since the photos get dropped in the Pictures folder, it is one of the few ways I know it happened, unlike the copy of Ubuntu I am running on another box that shows the Save As interface.

So we developers keep adding the comfort noises, as well.

At least Mint didn't use the AOL's "File's done" announcement :)

ourmandave 7 hours ago 2 replies      
One "comfort noise" that Just Ain't Right is the fake engine noise they pipe over the speakers of cars that have a lot of sound proofing, etc.

It gives you feedback (which increases safety) but it raises trust issues.

RexRollman 9 hours ago 1 reply      
Sony's Magic Cap had a skeuomorphic interface as well. I always wanted to try it, given that Bill Atkinson was involved, but I never had the chance.

The Wikipedia page has a screenshot for those who haven't seen it: https://en.wikipedia.org/wiki/Magic_Cap

Math from Three to Seven: Chapters One and Three [pdf]
97 points by Mz  15 hours ago   18 comments top 7
cubano 12 hours ago 3 replies      
I believe the most important learning device used is the fact that he is sitting patiently with the kids and giving them his full attention, sending the meta-message to them that "this is important stuff and you will be rewarded with an adult's undivided interest if you work with this".

All children (and most adults) crave attention, and the giving and taking of it is perhaps the most powerful tool that a parent and/or teacher has in the arsenal.

I've always thought a lot of parents do it totally wrong...they give the misbehaving kid lots of (albeit negative) attention while the quiet, well-behaved one they ignore.

My parenting style was the exact opposite...I always stroked good behaviours and tried my best to ignore bad ones.

Animats 11 hours ago 0 replies      
(Chapter Two is left as an exercise for the student?)

The comment "What an idiot I was, I thought. That was just an axiom, it is called commutativity. One doesn't prove axioms." is interesting. What's chosen as an axiom, and why, is an advanced question. Unless you get into foundations of mathematics, that question is seldom addressed. It's way beyond most pre-college math teachers. It's the sort of question that occurs to smart kids, but there's no easy answer you can give them. The usual answers are theological, and boil down to "shut up, kid". Here's a discussion on Stack Exchange of that subject: http://math.stackexchange.com/questions/127158/in-what-sense...

(If you get into automatic theorem proving, you have to address such issues head-on. Adding an inconsistent axiom can create a contradiction and break the system. This leads to constructive mathematics, Russell and Whitehead, Boyer and Moore, and an incredible amount of grinding just to get the basics of arithmetic and number theory locked down solid. In constructive mathematics, commutativity of integer addition is a provable theorem, not an axiom.

I once spent time developing and machine-proving a constructive theory of arrays, without the "axioms" of set theory. The "axioms" of arrays are in fact provable as theorems using constructive methods. It took a lot of automated case analysis, but I was able to come up with a set of theorems which the Boyer-Moore prover could prove in sequence to get to the usual rules for arrays. Some mathematicians who looked at that result didn't like seeing so much grinding needed to prove things that seemed fundamental. This was in the 1980s; today's mathematicians would not be bothered by a need for mechanized case analysis.)
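To give a flavor of the constructive view (my own toy illustration, not from the comment or the prover work it describes): encode Peano naturals, define addition by recursion, and commutativity becomes a property to be proved by induction rather than assumed. Plain code can only spot-check it for small values, which is what a prover's induction replaces:

```javascript
// Peano naturals: zero is null, successor wraps its predecessor.
const zero = null;
const succ = (n) => ({ pred: n });

// Addition by recursion on the first argument:
//   0 + m = m;  S(n) + m = S(n + m)
const add = (n, m) => (n === zero ? m : succ(add(n.pred, m)));

// Conversions for convenience.
const toNum = (n) => (n === zero ? 0 : 1 + toNum(n.pred));
const fromNum = (k) => (k === 0 ? zero : succ(fromNum(k - 1)));

// Commutativity is a theorem here; spot-check it on small naturals.
for (let a = 0; a < 5; a++) {
  for (let b = 0; b < 5; b++) {
    const lhs = toNum(add(fromNum(a), fromNum(b)));
    const rhs = toNum(add(fromNum(b), fromNum(a)));
    if (lhs !== rhs || lhs !== a + b) throw new Error("mismatch");
  }
}
```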

otoburb 13 hours ago 1 reply      
Absorbing tales, most likely for those of us with young children in the aforementioned age bracket. First time I'd read about Jean Piaget too. Was reading through @tokenadult's[1] site earlier and didn't see much mention of Piaget, so another good reference to add to the list.

This is probably the most insightful part of the AMS narrative:

>OK, if it is so hard to teach kids the notion of a number, what am I trying to do? What is the point of my lessons? I said it many times and I am going to say it again: the meaning of the lessons is the lessons themselves. Because they are fun. Because it's fun to ask questions and look for the answers. It's a way of life.

[1] https://news.ycombinator.com/user?id=tokenadult

oldbuzzard 9 hours ago 0 replies      
Alexander Zvonkin's book is worthwhile both for math pedagogy and Soviet insight... the OP doesn't link the full book... either http://www.ams.org/bookstore-getitem/item=MCL-5 or http://www.amazon.com/Math-Three-Seven-Mathematical-Preschoo... is a great example of a preschool Math Circle.

If math pedagogy is your main interest any of the MSRI Math Circle Library books are worthwhile. This includes "Circle in a Box" which is a Math Circle starter kit freely available here http://www.mathcircles.org/GettingStartedForNewOrganizers_Wh...

abecedarius 11 hours ago 1 reply      
Summary and table of contents at http://www.ams.org/bookstore-getitem/item=MCL-5
indrax 4 hours ago 0 replies      
On coins and buttons: perhaps the children are intuitively comparing the areas of the convex hulls, something humans are good at. That comparison is an interesting mathematical problem in its own right and could be the basis of an interesting alternative mathematics.

You could have a whole set of sessions with children exploring different arrangements of coins and noting that no matter how many you add within the hull, you don't get any 'more coin' (altering the plurality may help adults understand this problem.) If you have some button[s] and much more coin[s], can you add just one coin so that you have more coin than button? How far away do you need to add it?

cphuntington97 12 hours ago 3 replies      
Do we really need to work so hard to stimulate curiosity in people? Aren't people inherently very curious?
Everything we know Google is working on for the new year
106 points by sciurus  18 hours ago   56 comments top 7
laxatives 12 hours ago 3 replies      
I've always thought that the Google Ideas projects page was a bit frightening, since many of the projects are closely tied to defense.


acheron 13 hours ago 2 replies      
I've got a crazy guess here: I bet they're working on more ways to show ads.
xasos 9 hours ago 2 replies      
I'm especially excited for their Google X projects[1] - Google Glass 2, Google X Display, and self-driving cars.

If reports[2] are true that the self-driving car will start testing next month, I predict 2015 may be the year that autonomous vehicles go big. The true goals of Lyft and Uber will finally be accomplished. I hope there won't be much red tape getting these cars into action (although there probably will be).

[1] http://arstechnica.com/gadgets/2014/12/google-tracker-2015-e...

[2] http://www.wired.com/2014/12/google-self-driving-car-prototy...

bra-ket 6 hours ago 0 replies      
they missed Google Genomics: https://cloud.google.com/genomics/
higherpurpose 12 hours ago 3 replies      
If Google wants to offer "Whatsapp competition" in 2015, it should have 2 things that Hangouts doesn't have right now:

1) fast (instant) performance

2) end-to-end encryption (I wouldn't mind if they used the same Axolotl protocol as TextSecure and Whatsapp)

If it doesn't have any of those, I won't be using it.

Zigurd 8 hours ago 0 replies      
Ara will be a crappy phone. But it will be potentially important to making Android, and perhaps the underlying Linux, portable across a range of instruction sets and SoC architectures and, of course, a wide range of peripherals.
amelius 12 hours ago 1 reply      
I see no biotech projects. Kind of strange, because it is a hot field, possibly the next big thing.
The Perl Jam: Exploiting a 20 Year-old Vulnerability [pdf]
5 points by lifthrasiir  5 hours ago   1 comment top
lxst 7 minutes ago 0 replies      
You can watch the talk here: https://www.youtube.com/watch?v=gweDBQ-9LuQ
The Life and Times of the Father of Linear Programming (2005)
27 points by the_d00d  9 hours ago   8 comments top 4
dminor 6 hours ago 1 reply      
In my first job out of college we used linear programming to solve combinatorial auctions - two things I had no idea existed until I interviewed with the company. It was sort of the nexus between OR and econometrics.

Very interesting to learn there was one man who helped give birth to both fields.

philip1209 5 hours ago 0 replies      
This is great. I had a college course that basically taught only the simplex algorithm - we did all of the iterations by hand using tableaus and matrices.

We are using operations research optimization techniques at my side project, StaffJoy. The greatest innovation in usability of OR has been the JuMP project - there is now a fairly universal way to express optimization problems that is lower-level than Excel and higher-level than C.


qwerta 2 hours ago 1 reply      
There is no mention of Antonin Svoboda. He developed a linear computer for targeting moving airplanes in pre-Nazi Czechoslovakia.


His book from 1948 is probably first book about applied programming: https://archive.org/details/ComputingMechanismsLinkages

doug1001 3 hours ago 1 reply      
Leonid Kantorovich doesn't even rate a mention in this article (or in the HN comments)? Pretty sure that LK is the one who first developed and applied the technique, and did so with more conviction than most have in their own work: in Russia, during the Siege of Leningrad, no less. He used this new technique to calculate the optimum distance between vehicles carrying food & supplies across frozen Lake Ladoga.
ECMAScript 6: new OOP features besides classes
76 points by StylifyYourBlog  18 hours ago   25 comments top 9
igl 9 hours ago 2 replies      
First we have to wait for complete implementations.

Then wait for v8 performance: http://jsperf.com/performance-frozen-object ES5 is doing good...

And at last: Wait for < IE-11 to die.

It remains to be seen if io.js will boost ES.next on the server. Since it's bound to V8, I don't expect much.

Other implementations, like the ahead-of-time compiler echojs, are becoming interesting. I am also curious how TypeScript will look at v2.0.

I am ready; however, I still don't use arrow functions... which were first heard of in 2010? 2011?

It still feels like so far away.

bradchoate 5 hours ago 0 replies      
I guess the guy who proposed an '====' operator lost to the guy who proposed an 'Object.is()' method.
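For reference, `Object.is` agrees with strict equality everywhere except the two IEEE-754 edge cases, which is presumably what a hypothetical `====` would have addressed:

```javascript
// Object.is matches === for ordinary values...
console.log(Object.is(1, 1));     // true
console.log(Object.is('a', 'b')); // false

// ...but differs on the two floating-point oddities:
console.log(NaN === NaN);         // false
console.log(Object.is(NaN, NaN)); // true

console.log(0 === -0);            // true
console.log(Object.is(0, -0));    // false
```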
Kiro 2 hours ago 2 replies      
Why so much let? Are you supposed to only use let now? I thought it had a very specific use (like wanting a local variable in a loop).
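The loop case the comment mentions is where `let` differs most visibly from `var`: each iteration gets a fresh block-scoped binding, so closures capture distinct values. A quick sketch (my example, not from the article):

```javascript
// With var there is one function-scoped i, and every callback
// sees its final value.
const withVar = [];
for (var i = 0; i < 3; i++) withVar.push(() => i);
console.log(withVar.map((f) => f())); // [3, 3, 3]

// With let each iteration gets its own binding of j.
const withLet = [];
for (let j = 0; j < 3; j++) withLet.push(() => j);
console.log(withLet.map((f) => f())); // [0, 1, 2]
```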
dpweb 9 hours ago 2 replies      
I'm sorry, but I don't find most of the sugar particularly helpful, and it can be confusing when you have more than one way to do something. Generators and the async keyword should be prioritized, not class-based inheritance. Even block scoping: maybe I'm nuts, but function scoping, if you take the time to understand it, works.
bsimpson 8 hours ago 0 replies      
I'm constantly impressed by the quality of articles on 2ality. They're always both informative and succinct.
gargarplex 10 hours ago 2 replies      
What's the timeline on everyone using ES6 in production and JavaScript developers being expected to know ES6? 2 years?
recursive 10 hours ago 1 reply      
Being able to (eventually) use all these language features in a language that's running inside a browser is blowing my mind right now. It's got iterators! It's got properties!
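For instance, any object can opt into `for...of` and spread by implementing `Symbol.iterator`, and accessor properties (getters) work in plain object literals; a minimal sketch:

```javascript
// An iterable object that yields 1, 2, 3.
const counter = {
  [Symbol.iterator]() {
    let n = 0;
    return {
      next: () =>
        n < 3 ? { value: ++n, done: false } : { value: undefined, done: true },
    };
  },
};
console.log([...counter]); // [1, 2, 3]

// An accessor property: `area` is computed on each read.
const circle = {
  radius: 2,
  get area() { return Math.PI * this.radius * this.radius; },
};
console.log(circle.area); // ~12.566
```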
kelvin0 9 hours ago 2 replies      
More makeup on a Gorilla doesn't turn it into a Lady.
Hypx 8 hours ago 0 replies      
State of ECMAScript-6 support in current browsers:


       cached 30 December 2014 11:02:01 GMT