Hacker News with inline top comments - 27 May 2014
Python 3 can revive Python medium.com
112 points by coldtea  3 hours ago   47 comments top 21
rdtsc 1 hour ago 0 replies      
That pretty much reiterates my points from last time we discussed (not many hours ago).


Besides what has already been said, it is also important to emphasize that Python 2 is already pretty good. So it is not that Python 3 is bad; it is just that it is very hard to improve on 2.

Ironing out the warts is good, but this was not the right time. This should have happened 7-10 years ago.

Nowadays I can't imagine a lot of people discounting Python because of the print statement, unicode support, division rules, or the lack of yield from.

It will be performance, concurrency, the ability to create web back-ends, package installation, testing frameworks, and IDE support.

Apart from allowing optional type annotation syntax I just don't see Python 3 providing a good enough carrot to force many projects to switch to it.

Imagine you go to a manager and tell him: "Oh, this 800K-line project we have in Python 2 will be ported to Python 3, can we have 1 month to do that?" The manager might say, "Well, we have these features to implement, but if you all say so. What will we gain by it, though, to offset the time spent (opportunity cost) and the risk of breakage?" And if the answer is "Oh, you know, print is not a statement anymore, many dictionary and sequence methods now return iterators instead of lists, and there's this new Twisted-like async library..." then you can imagine many a manager might just say, "Well, that is just not enough."

If in turn the dev team came back and said, "Oh yeah, they integrated PyPy, an STM module, the requests module, static type checking via annotations, a 3x speed improvement, and no more GIL, so we can do some CPU-intensive work if we need to," I can easily see this proverbial manager OK-ing that.
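For concreteness, the "print is not a statement anymore" list from that imagined conversation amounts to code like this (a generic sketch, not tied to any particular project):

```python
# Python 3 only: the headline "wart" fixes, sketched minimally.

# print is a function, not a statement
print("hello", "world", sep=", ")

# / is true division; // is floor division
assert 3 / 2 == 1.5
assert 3 // 2 == 1

# dict.keys()/.values()/.items() return views, not lists
d = {"a": 1, "b": 2}
assert not isinstance(d.items(), list)

# yield from delegates to a sub-generator
def inner():
    yield 1
    yield 2

def outer():
    yield 0
    yield from inner()

assert list(outer()) == [0, 1, 2]
```

All nice, but none of it answers the manager's opportunity-cost question.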

gexla 29 minutes ago 0 replies      
Python needs to be revived?

I always thought of Python as being a great utility programming language. It's not really a specialist, more of a jack of all trades.

For example, PHP is all about web development. Ruby is probably most well known for Rails and also widely used for web development. Python is widely used for web development, but that's not necessarily the first thing you think of for Python.

What's going to keep any programming language alive is the libraries that become so well entrenched that a competing library would have a serious uphill battle to even come close to matching functionality. Python has a lot of libraries like this for scientific tools.

I'm always skeptical when I hear that a developer has moved from X programming language to Go. I wonder how many of these tales are from developers who are actually referring to what they do in their spare time rather than at their day jobs. Go is still early enough that doing the sorts of things which create the most jobs is still more painful than it needs to be (and so you would probably be doing those things in a different ecosystem). It seems that the real Go job-generating stories are from start-ups which have hit some momentum, received funding, and are rewriting parts of their stack in Go.

The mass job generators are still at the Rails / Django / PHP / JS levels.

chrishenn 2 hours ago 6 replies      
One pain point I've really felt recently with Python is in the deploy step. pip installing dependencies with a requirements.txt file seems to be the recommended way, but it's far from easy. Many dependencies (such as PyTables) don't install their own dependencies automatically, so you are left with a fragile one-by-one process for getting library code in place.

It was all fine once I got it worked out, but it would be so much nicer to provide a requirements.txt file and have pip figure out the ordering and dependency stuff. That, and being able to install binary packages in a simple way from pip, would make me much happier with Python (no more waiting 10 minutes for numpy or whatever to compile).
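One workaround, assuming pip 1.4+ with wheel support: build binary wheels once with "pip wheel -r requirements.txt -w wheelhouse", then deploy offline with "pip install --no-index --find-links=wheelhouse -r requirements.txt". You still have to encode the ordering by hand in the file, though (version pins below are illustrative, not recommendations):

```text
# requirements.txt -- order matters, since pip installs top to bottom
# and PyTables' setup.py needs its build dependencies already installed.
numpy==1.8.1
numexpr==2.4     # build-time dependency of PyTables
Cython==0.20.1   # likewise
tables==3.1.1    # PyTables itself, last
```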

As far as actual language features go, however, I still find Python a joy to work in.

hmsimha 2 hours ago 0 replies      
As much as I wanted to like this article (I remain optimistic about the future of Python as Python 3) wouldn't most of the compelling additional features suggested break backwards compatibility with current versions of Python 3? This strikes me more as a proposal for a Python 4 than a revitalization of Python 3.

edit: I wanted to respond to this myself, since upon rereading I no longer get the impression that the proposed changes need to 'break' backwards compatibility per se. For the suggestion of removing the GIL specifically, though, this would necessitate such a revolution in the design of Python programs that even if, say, the libraries that had already been ported at the time of 3.2 still worked in 3.9, their implementations would be senseless by 3.9 conventions.

peter-row 2 hours ago 2 replies      
Python 3 isn't really good. It's not really bad, either. There really aren't that many magic bullets (other than proper functional programming, maybe, which isn't about to happen in Python).

People are leaving Python for Go because people have always left Python for fast compiled languages. Google ditched Python for C++ and Java. Java! I've seen a lot of projects get re-written in Java from Python, but no-one worried then.

Python 3 adds some cool stuff (async, in particular), and fixes some warts. It's a bit rude of them to force people to upgrade, but that will eventually pay off. It will add more things in the future.

The people who start new projects in Python 3 will have some short-term pain, as some libraries take time to port. There will be a long term benefit, though - future libraries will be better for Python 3, and they won't have to port their project.

The only controversial thing was the use of unicode. IMO, Python 3 made the right choice - you should make everything unicode wherever it's feasible, because it's just a mess otherwise.
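Concretely, the choice Python 3 made is that text and bytes are distinct types with an explicit boundary; a minimal sketch:

```python
# Python 3: str is text (unicode), bytes is raw data; mixing them is an error.
s = "caf\u00e9"              # a str: a sequence of code points
b = s.encode("utf-8")        # explicit boundary: text -> bytes
assert b == b"caf\xc3\xa9"
assert b.decode("utf-8") == s

# Implicit mixing, which Python 2 silently allowed, now fails loudly:
try:
    _ = "abc" + b"def"
except TypeError:
    pass  # exactly the "mess" being avoided
else:
    raise AssertionError("should not concatenate str and bytes")
```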

pekk 2 hours ago 2 replies      
The idea that people move from Python specifically to Go is one of those chestnuts of conventional wisdom that never receives any kind of backing in actual data.

If you think that Python and Go are made for the same tasks then you're really confused.

allendoerfer 1 hour ago 2 replies      
Python is not dying, and even if it were, Go would not be the reason why. Sure, Python is not functional, not compiled, not mobile, not Rails, and not in the browser.

But I cannot imagine a language that is all of those things spreading so nicely and readably from command-line scripts to scientific computing to big server applications.

Python's use cases will not go anywhere, so don't panic: Python is doing just fine and improving in many areas while holding on to its core values.

kunstmord 2 hours ago 2 replies      
> Newer programmers are not that impressed with either version of Python.

Any evidence? If it's personal experience, then mine is exactly the opposite.

I don't know if such a thing exists, but maybe a big list of the main changes would help convince people (type annotations, yield from, the forthcoming @ operator). From what I've seen and read, all of this is somewhere in the docs and release notes, of course, but I've never seen a clean, concise list of the main new features, fixes, and reasons why these features are cool.

analog31 1 hour ago 2 replies      
This isn't intended to be obstreperous, but I'm genuinely curious: How many Python developers are in a position to really care about 2 vs 3?

The people I know who use Python, including myself, range from utter beginners to experienced programmers, but are using a relatively small subset of the available libraries, and are just using whichever version we started with. I could translate my code to version 3 in a heartbeat, but have no particular reason to do so. I've translated some of my most important stuff from BASIC to Python after all.

Professional developers will do whatever is right for their projects.

My concern would be for the folks who develop and maintain the libraries -- for whose generosity I'm grateful. If there were some sort of consensus on the direction of Python, I'd hop on the bus just to make life easier for those developers. Their time would be better spent adding useful features or just combing the code for bugs, than coping with multiple Python versions.

Could a single Python interpreter somehow manage a mixture of 2 and 3 code?
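Not within one interpreter, as far as I know. The usual approximation is writing in the common subset: the __future__ imports make 2.x modules behave like 3.x for the biggest changes, so one codebase runs on both interpreters (libraries like six cover the rest). A sketch:

```python
# Runs identically under Python 2.6+ and Python 3 -- a sketch of the
# common-subset approach, not a true mixed-version interpreter.
from __future__ import print_function, division, unicode_literals

print("one interpreter per process, but one codebase for both")
assert 3 / 2 == 1.5                    # true division, as in Python 3
assert isinstance("text", type(u""))   # string literals are unicode
```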

scrollaway 2 hours ago 0 replies      
> Add types. Well, opt-in types. That you can use to speed some code up (a la Cython), or to provide assurances and help type-check (a la Dart). And add type annotations to everything in the standard library.

I think this will happen eventually, what with some of the recent PEPs; I just wish it could happen faster. Optional typing is the best of both worlds and there is no reason not to have it.
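Python 3 already has the syntax hook for this: function annotations are stored but ignored at runtime, which is exactly what makes typing opt-in. A toy sketch of the "assurances" half (the typechecked decorator here is hypothetical, not a standard library feature):

```python
import functools

def typechecked(fn):
    """Hypothetical decorator: enforce annotations at call time."""
    @functools.wraps(fn)
    def wrapper(*args):
        hints = fn.__annotations__
        # Positional parameter names come first in co_varnames.
        for name, value in zip(fn.__code__.co_varnames, args):
            if name in hints and not isinstance(value, hints[name]):
                raise TypeError("%s must be %s" % (name, hints[name].__name__))
        return fn(*args)
    return wrapper

@typechecked
def scale(x: float, factor: float) -> float:
    return x * factor

assert scale(2.0, 3.0) == 6.0
try:
    scale("oops", 2.0)      # rejected: str is not float
except TypeError:
    pass
```

A static checker or a Cython-style compiler could consume the same annotations without any runtime cost.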

thatthatis 28 minutes ago 0 replies      
I can agree with most of this, so long as it is Python 4.

Python shouldn't adopt a policy of breaking things just because Python 3 is less successful than hoped.

samirmenon 34 minutes ago 0 replies      
Python is still, by far, the most friendly language for beginners. The single biggest factor in this is the powerful data structures that Python has, especially lists. As so many others have said, Python is almost "executable pseudocode".

Python remains the language of choice for introducing programming because it is so simple. It isn't fast, and it might not be very well suited for large-scale, long-term use. That's okay.

This appeal to beginners, which the article claims is waning, is the vital force of Python; as long as it is the de facto language for beginners, it will never go the way of Perl.
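The "executable pseudocode" point is easy to illustrate; the following is a made-up beginner-level example:

```python
# "Take the scores, keep the passing ones, and report the top three" --
# the code is barely longer than the sentence.
scores = [72, 48, 95, 88, 61, 33, 79]
passing = [s for s in scores if s >= 60]
top_three = sorted(passing, reverse=True)[:3]
assert top_three == [95, 88, 79]
```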

mangecoeur 1 hour ago 0 replies      
The biggest problem with Python 3, as I see it, is that it seems to cause people to air grand opinions unencumbered by any actual data.

I have seen no evidence that Python is "dying" in any way, nor that people are dropping it because it lacks radical new feature X. Things may be more competitive now, but I don't see any stagnation in the community - and that's always been one of Python's strongest points.

marcosdumay 2 hours ago 0 replies      
Well, Python does have optional type systems. What is missing from that bullet point is choosing one and making the standard library support it.
okso 2 hours ago 0 replies      
> It's not like anyone is using Python 3 anyway, so take some chances. Break things.

Well, people are starting to use it now. I'm teaching Python classes exclusively in Python 3, and I do all my personal projects in Python 3 and love it.

I would like to finally have a "stable" Python 3 with forward compatibility. This is important for the future of the language; otherwise no one will invest in it.

ForHackernews 1 hour ago 0 replies      
Is Python in need of revival? It seems to be doing pretty well. I think it's a great workaday language. It's probably not the Absolute Best at any particular domain but it's easy to learn and use, and it serves fairly capably for sysadmin scripting, web development, scientific computing, etc.
Polarity 1 hour ago 1 reply      
Why do people tend to stick to just one tool? I mean, I try everything I can, and what works, works. It's not actually bad if something's dying. There is always new/better stuff on the horizon.
jqm 1 hour ago 0 replies      
Is Python really dying? I think the original article proclaiming its death was written by an MS guy (aren't they coming out with a new version of ASP or .NET or something?).

I use it every day and even though I love JS (and have been thinking about looking into Go because of all the positive noise) I don't see Python going anywhere for me at least. It is simply too handy and familiar. Maybe it just will be used slightly less by some?

I'd have to see some real stats that Python is dying to believe it.

dvl 1 hour ago 2 replies      
I only want print as a statement again
kolev 2 hours ago 0 replies      
Reminds me of "Waiting for Godot" by Samuel Beckett...
eudox 2 hours ago 1 reply      
Alternatively, all the effort spent reviving Python could be spent making Python go away.
Whitewood Under Siege: Wooden Shipping Pallets cabinetmagazine.org
181 points by drjohnson  7 hours ago   36 comments top 10
MisterBastahrd 5 hours ago 1 reply      
Note: I worked for a large grocery chain for 7 years

Plastic pallets are vastly superior in the long run to wooden pallets in terms of durability, and many companies have used them for almost two decades interchangeably with wooden pallets. One of the issues here is that once a pallet enters the supply chain, who knows if or when you'll get them back. When I first started working for the grocery chain, there were over 30 pallets of back inventory sitting in the warehouse (this is a very bad thing and I corrected it during my time with the company).

The main problem with wooden pallets is that they are largely made with inferior wood that can't stand up to the stresses applied to them over a long period of time. If any of the center slats break, there's no problem. If one of the ends breaks, then you're probably going to be cleaning whatever was on the pallet up off the warehouse floor. People are also more likely to walk away with a wooden pallet, since they have plenty of utility and are very inexpensive.

That isn't to say that plastic pallets are without fault, even if they are virtually indestructible. Most plastic pallets are manufactured with a diamond plate pattern. That's nice and all, but it's still plastic and therefore, slippery. Place some frozen or refrigerated food on a pallet destined for a windy area and have fun cleaning bananas out of the back of the truck.

So the best of both worlds would be a plastic pallet with a non-slip coating on the top of the pallet and the feet.

stephengillie 4 hours ago 0 replies      
Interestingly, this article completely leaves out soda (aka pop aka coke) pallets. All of the major soft drink bottlers use about the same size and shape of pallet, and it's much smaller than the traditional whitewood pallet. Often, they'll put their pallet onto a square pallet, and then that onto a whitewood pallet, when they deliver to grocery stores.

Pepsi, Coke, and other soft drink and beer distribution companies even have their own, smaller, powered pallet jacks. The powered pallet jacks that grocery and other retail stores use are usually too big for these smaller pallets.

These companies have been slowly moving to plastic pallets over the past decade, and wooden pallets of those dimensions are hard to find today. The plastic pallets offer some level of 4-way access as well.


Edit: I had forgotten about these full-size plastic pallets. They are so much cleaner and easier to use. Even new whitewood pallets leave behind a cloud of splinters and wood dust. These leave behind some shipping dust as there are no cracks for dust to fall through. They stack more securely and don't get heavier when wet like wooden pallets. And you can stack about twice as many in the same space.


dewey 3 hours ago 1 reply      
There's also an interesting system in Europe called EUR-pallet. [0]

They are usually made out of higher quality wood and quite durable. The system works by trading pallets for pallets, so if you receive some goods on a EUR pallet the driver takes an empty EUR pallet from your stack and it'll be reused at the other company.

[0] http://en.wikipedia.org/wiki/EUR-pallet

dustin1114 1 hour ago 1 reply      
I've worked in grocery logistics for several years, and can tell you from experience just how political CHEP, PECO, and iGPS pallet distribution can be. Because of the size of my organization's warehouses, CHEP especially requires stringent audits, and if a truck is sent out with CHEP pallets when it was not supposed to be, the gestapo comes after you.

I will admit, though, that CHEP pallets are the best when it comes to quality (other than iGPS, perhaps). They are far less likely to splinter, warp, and degrade. They are also much more resilient to the abuse that the supply chain puts on them.

An important point to remember is automation. The logistics industry is becoming more and more automated. Consistent, high-quality pallets are becoming a must. The typical hi-lo is becoming less and less common, while in its place are robotic automated guided vehicles, distribution conveyors, and high-bay storage and retrieval machines. I know from experience how much pain and frustration is caused by broken stringers, splinters, and warped whitewood (in our industry, we call them GMA pallets [Grocery Manufacturers Association]). Literally days of lost time annually, which can equate to millions of dollars.

I have no real opinion about what the best direction is. You either fork out more up front for the good stuff (with all of its politics), or you just deal with the bad quality and inconsistencies of whitewood.

I have to say, this was a very interesting article...a nice change from Python 2.7 vs. 3.X!

gohrt 6 hours ago 0 replies      
YC application question: "How have you hacked a system in real life?"

> What is most vexing to many recyclers is the belief that the accumulation of blue pallets in their yards is not an accident, but a deliberate CHEP strategy. After all, collecting these stray pallets takes a lot of labor, a lot of miles, and a lot of trucks. If you are CHEP, why do this work yourself if you can get someone else to do it for you, at a price that you dictate?

> In 2008, a group of recyclers filed a class action lawsuit, claiming that CHEP was leveraging its dominant market position, and violating anti-trust law, by transferring its operational costs onto recyclers. The recyclers argued that CHEP had made them into a conscripted collection army.

GFischer 6 hours ago 2 replies      
A really compelling read.

How many hidden industries like the pallets one are out there, waiting for software (or hardware) aid or disruption? I mean, 3.5 billion in pallet-related revenues :) , and millions in losses due to lack of tracking... is RFID really the best solution?

jessaustin 3 hours ago 3 replies      
Maybe this is a rare streak of jingoism for me, but it seems odd that a foreign company has been able to come in and dictate through the courts that the whole industry should work on a completely different model. Perhaps the reason "blue" is such a good deal for shippers is because much of the real cost has been transferred to third parties. ISTM the courts should force CHEP to include a significant deposit in its contracts, or else forgo the discounted forced labor of the recyclers. It wouldn't have to be a 100% deposit, but it would have to be high enough so that shipping destinations were no longer indifferent to whether the pallet was returned to CHEP or recycled or stolen.
vacri 48 minutes ago 0 replies      
The article is stacking the deck pretty heavily:

> After all, collecting these stray pallets takes a lot of labor, a lot of miles, and a lot of trucks.


> they receive blue pallets whether they want them or not

So the recyclers' hands are tied because they receive pallets mixed in with white pallets when delivered by the truckload, but then they get to turn around and talk of all the effort collecting them. It's not hard - just educate your customers - "I won't pay for blue pallets, because they're rented equipment. You can ship them to me, but I'll reduce the payout for a truckload by the number of blues". What the article is promoting is that the recyclers get to play the innocent... then directly profit off it.

Don't get me wrong, I think the idea of renting pallets is stupid, but then again, I'm not making $3.5B/year. But the article seems to gloss over the fact that selling something you do not own is not legal, regardless of how much labour you put into it (otherwise burglary and fencing would be legal). These pallets are clearly marked; it's not as if they're easy to confuse with ordinary whitewood.

lotsofmangos 4 hours ago 0 replies      
You could easily make a cheap machine that could make pallets to order out of plastic packaging scrap on-site. Judging from this article though, that might annoy a hell of a lot of people.
quinndupont 6 hours ago 2 replies      
Stunning to see Cabinet Magazine being referenced on HN. For those that don't know, Cabinet is an excellent magazine that covers many topics, but has something of a philosophy and art focus. Great article, as always.
Deep Learning From The Bottom Up metacademy.org
47 points by zercool  3 hours ago   2 comments top 2
cjrd 12 minutes ago 0 replies      
Hi, I'm one of the creators of Metacademy. I hope you find it useful. Feel free to follow our new Twitter account if you'd like low volume updates:


Also, you can register an account for an occasional email.

PS) We're completely free and open source: https://github.com/metacademy/metacademy-application

nrmn 16 minutes ago 0 replies      
For anyone actually interested in implementing DNNs, I wrote up a quick blog post (essentially a brain dump) of general guidelines to adhere to when training DNNs. The source for this information is primarily videos given by Geoffrey Hinton, as well as various papers.


Organic Cat Litter Chief Suspect In Nuclear Waste Accident npr.org
81 points by timr  5 hours ago   29 comments top 8
CapitalistCartr 5 hours ago 1 reply      
People laugh at the extreme detail the military uses to specify such items. I spent six years in the USAF and I've groaned at some of it. But we damn sure never had this happen under SAC. And we were building nukes. Details matter.
tedsanders 4 hours ago 1 reply      
I once talked to a scientist who worked on cat litter for a major corporation. It was surprisingly interesting to hear about the chemistry and geology and supply chain management of such a mundane substance. I guess the DOE scientists running this project were not aware of these subtle issues of chemistry and geology.
lotsofmangos 3 hours ago 2 replies      
Why the hell they wrote a spec with a commercial product name rather than the material contained in it is beyond me. If the contractor had known they were looking for a certain kind of clay rather than cat litter, this accident would have been nearly impossible.
ScottBurson 2 hours ago 1 reply      
The strangest thing about this is that the organic cat litters are quite a bit more expensive than clay. So it wasn't done to save money. What the reason possibly could have been I cannot fathom.
mikeash 3 hours ago 1 reply      
How is this possible?

To take a random example I'm familiar with, making small modifications outside certain small bounds requires a lot of paperwork and approval. This can be something as simple as adding a tow hook to an airplane known to be good for towing. If you're lucky and the modification has been done before and somebody went through the trouble of getting the modification certified, you can take advantage of the work they've already done, greatly reducing the trouble involved as long as you can get permission from whoever got it certified. If you're doing something totally new (or something other people have done, but nobody got it certified for general use) then you have to file a form describing what you're going to do, get it approved, do the work, get the result inspected....

All this even for small aircraft where you'd be very hard-pressed to use them to kill more than two people (including the pilot) even if for some reason you had a goal of maximizing deaths.

Yet, when handling nuclear waste, apparently people can just randomly decide to completely change an important component used in the process?

Or was the change studied and approved by an engineer, but the problem was missed? The article certainly doesn't make it sound like this happened, but it could be an omission.

jqm 4 hours ago 1 reply      
I think this is still speculative though.

From what I understand from local papers it is also believed a piece of salt fell from the ceiling (waste is housed in an old salt mine).

mikeryan 3 hours ago 5 replies      
God I read that title wrong.

I read it as

"Organic Cat Litter Chief" - "Suspect in Nuclear Waste Accident"

As opposed to

"Organic Cat Litter" - "Chief Suspect in Nuclear Waste Accident"

I kept wondering when the CEO of a Cat Litter company was going to be blamed for something.

sp332 4 hours ago 1 reply      
"How come nobody caught this and raised a red flag?" asks

It looks like they caught it to me.

The "Work For Hire" Doctrine Almost Never Works in Software Contracts metrocorpcounsel.com
27 points by EGreg  3 hours ago   18 comments top 5
patio11 1 hour ago 1 reply      
This is one of the reasons consulting clients are paranoid about IP assignment clauses. I've signed MSAs where the language about the IP assignment was as long as all other contractual terms and the NDA combined.

That was historically one of the points of highest friction at my consultancy during contract negotiation, because every lawyer had a different idea of how to totally derisk the IP assignment for the client, many of which were not compatible with me signing them and then continuing to run a consultancy or software company. (Hypothetical example: If I'm doing A/B testing for you, I am of course amenable to giving you copyright to code/copy/reports delivered to you, but I'm not going to give you exclusive rights over "all procedures and knowhow used in the production of the deliverables.")

Word to the wise: when you have your lawyer draft your standard contract, ask them "Hey can we have IP assignment happen only after SoW's associated invoices have been paid in full?" That's a valuable lever to have to encourage clients operating in good faith to prioritize getting your invoices paid expeditiously.

roberthahn 55 minutes ago 0 replies      
As with anything legal, whether this applies to you depends on where you are. The article appears to be quoting laws from the State of California. If you're not under their jurisdiction, it probably doesn't apply to you.

At the least, you should look up the work-for-hire laws in your jurisdiction. Or, you know, work with a lawyer to learn your rights.

(edit: copy tweaks)

mike_herrera 1 hour ago 2 replies      
Can anyone explain why written code couldn't be considered "a compilation (an original manner of selecting or arranging preexisting works)?" I would think common sense would define most software as an arrangement of preexisting works.

e.g. If I'm contracted to develop a web app it's an original arrangement of an existing programming language.

zanny 1 hour ago 1 reply      
Sounds like a good opportunity to develop FOSS code rather than limit it to one company.

I don't get it, though. The business wants the software - they pay to have it made. And then they also want rights to it. I really like how you have to make copyright assignment explicit in the contract, because in general it is ridiculous to write code and then lock it behind a vault door and treat it like liquid gold when other people could benefit from it.

k__ 2 hours ago 4 replies      
So, if I just go anywhere, write code, write a bill for it and get paid, the whole copyright of the code belongs to me?
Evolution of chess: Popularity of openings over time randalolson.com
37 points by rhiever  4 hours ago   18 comments top 6
dfan 2 hours ago 1 reply      
The Pirc spike in the 1850s looks interesting visually, but keep in mind that your data set back then is incredibly small. Looking at my own database, the spike seems to be entirely due to some guy named Mahescandra playing it 57 times against Cochrane. It certainly doesn't have anything to do with the increasing popularity of 1.d4 40 years later.

I've never heard of Mahescandra, but Cochrane is the guy the famous Cochrane gambit in the Petroff is named after, where White sacrifices a piece on move 4 (1.e4 e5 2.Nf3 Nf6 3.Nxe5 d6 4.Nxf7).

cven714 3 hours ago 2 replies      
Chess opening trends are like fashion: some high-profile player, always on the search for new ideas, finds a resource in an unpopular line and suddenly it's all the rage. Everyone is playing it, working out the complications, finding ways to defend against or neutralize the lines; then interest wanes until someone uncovers a fresh new plan somewhere else and the cycle repeats. Other times, new resources aren't found and a line mostly dies out, like the King's Gambit.

So what I would be interested to see from your data set is a relation between opening performance and its popularity. Did people stop playing the Pirc due to sub-par results compared to other openings at the time (like I imagine happened with the Vienna) or did it simply fall out of fashion? It would be interesting to know which lines always had good results, but just stopped being popular for whatever reason. They could be due for a revival.

adamconroy 50 minutes ago 0 replies      
The analysis is interesting. However, I'm not sure it has much practical value, due to transpositions. For example, as White I play 1.Nf3, and if Black plays d5 I play d4 and we have a d4 opening. If Black plays c5 I play c4, and depending on what Black does it will transpose into either an English Opening (1.c4), a Maroczy Sicilian (1.e4), or an Indian defence (1.d4).

So basically, my opening move would be classed as 'other', but really it is one of 1.d4, 1.e4, or 1.c4 in terms of this post's classifications.

bane 1 hour ago 3 replies      
Out of curiosity, is chess solvable by computers yet? Meaning, is it possible to simply brute-force every possible legal game up to n moves and determine all the winning and losing move sets? What does this number look like, theoretically?

(I'm sure that in the general sense games with a very large n aren't solvable, since I suppose a game could be played in perpetuity.)
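A rough back-of-the-envelope answer, using Shannon's traditional assumptions of about 35 legal moves per position over an 80-ply game (both figures are rough conventions, not exact values):

```python
# Shannon-style estimate of game-tree size: (average legal moves) ** (plies).
branching, plies = 35, 80
game_tree = branching ** plies
assert game_tree > 10 ** 120  # far beyond any conceivable brute force
print("roughly 10^%d possible games" % (len(str(game_tree)) - 1))
# prints: roughly 10^123 possible games
```

So no: full games are not brute-forceable, though endgame tablebases have solved every position with only a handful of pieces left.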

kristopolous 2 hours ago 1 reply      
I'm under the impression that introducing the Queen too early (say < 6 moves) usually leads to it being taken and that player losing.

I don't know how much of this is actually true, though. That would also be a good thing to look into.

Matetricks 2 hours ago 0 replies      
Great blog post. It's interesting to compare my knowledge of opening evolution with the historical data. During the Romantic Era, King Pawn games were clearly the norm. When Reti and Nimzowitsch introduced hypermodernism in the 1920s, Indian openings became much more popular.
Running ASP.NET vNext on OS X and Linux graemechristie.github.io
85 points by chillitom  7 hours ago   24 comments top 6
chillitom 6 hours ago 1 reply      

Followed Graeme's instructions and after about 30 mins spent building the latest Mono I was quickly able to get ASP.vNext up and running on OSX.

Found one small mistake in the instructions, the switch --feed should be --source in the kpm restore step.

rbanffy 3 hours ago 9 replies      
Microsoft sponsoring a cross-platform application environment does not make much sense. Why would they do something that is bad for them in the long run? Why would it make sense for them to release it under an open-source license?

Can anyone imagine a compelling business case for this? I am not used to corporations being overly generous.

rcarmo 4 hours ago 0 replies      
Awesome. I started looking at vNext last week, and stopped short of trying it then due to lack of a Windows machine (which I've set up in the meantime). This will make it a lot easier to experiment with.
tphan 1 hour ago 2 replies      
Disappointing article. A console application isn't ASP.NET.
MichaelGG 2 hours ago 1 reply      
> This is your ASP.NET vNext project file .. and not an angle bracket to be seen !

And no comments allowed, either! Changing formats to be hip is fun!

The Internet With A Human Face idlewords.com
111 points by NelsonMinar  8 hours ago   34 comments top 16
gone35 8 minutes ago 0 replies      
This festive map shows seismic hazard in Northern California, where pretty much all the large Internet companies are based, along with a zillion startups. The ones that aren't here have their headquarters in an even deadlier zone up in Cascadia. (...)

So even if you don't agree with my politics, maybe you'll agree with my geology. Let's not build a vast, distributed global network only to put everything in one place!

That slide[1] hits close to home. I'm painfully aware of how hard (and almost pointless/powerless) it is to reason about long-term geological risks, especially compared to less catastrophic and more (short-term) predictable hazards like hurricanes, tornadoes or blizzards; but from time to time I idly question the wisdom, from a civilizational point of view, of having so many concentrated, incredibly talented people living directly atop one of the most dangerous fault regions on earth[2].

But again, it's pointless to think about it as an individual, so better get back to work and keep living day by day, I guess. Wovon man nicht sprechen kann... ("Whereof one cannot speak...")

[1] https://static.pinboard.in/bt14/bt14.069.jpg

[2] http://peer.berkeley.edu/pdf/Senate_testimonial-8-07.pdf

ronaldx 49 minutes ago 0 replies      
I'm now more hard-line than this on data privacy:

I have come to believe that businesses should not be legally allowed to store any consumer data unless it's obvious to the consumer that it's absolutely required for the primary function of the service, and they should only be allowed to store data for that one function, with an exception if the consumer explicitly and voluntarily opts-in for each additional function.

Large internet companies have been collecting swathes of data with the claim that they are secretly using it to improve people's lives. But it seems to me that A/B testing has failed to improve anyone's life.

Example: I use search engines to search for something I'm looking for.

I do not benefit from being shown 'targeted' ads, nor from the search engine identifying the most populist answers which it uses to spoon-feed me later rather than serve what I asked for, nor from the search engine identifying which particular arrangement of pixels will leave me personally more addicted.

Businesses are welcome to use my data in ways which are in my interest, but they should not get to decide which of these uses are in my interest.

moultano 3 hours ago 1 reply      

I thought it worth noting that Google does strip personal identifiers after 18 months, which is in line with one of his proposed fixes.

gammarator 2 hours ago 1 reply      
The thesis the talk pivots around is this one, in my reading:

"Investor storytime only works if you can argue that advertising in the future is going to be effective and lucrative in ways it just isn't today. If the investors stop believing this, the money will dry up."

corford 46 minutes ago 0 replies      
Fantastic as always. Every time I read one of Maciej's talks or essays I get a little closer to throwing in the towel and pursuing a more meaningful existence. It's going to happen one day and I can't wait to read the post that forces it.
coldtea 4 hours ago 0 replies      
>There was an ad for the new Pixies album. This was the one ad that was well targeted; I love the Pixies. I got the torrent right away.

I laughed very hard at this!

In all, an excellent article. I disagree with blind faith in technology to solve all our problems and not create new ones.

People often forget that technology is tools (and not always neutral tools, as is another naive belief: some inventions have a larger inherent "harm potential"), and that policy matters as much or even more, as does the kind of cultural landscape in which we guide our use of the tools.

(Remember the classic xkcd comic: http://xkcd.com/538/ ).

thaumaturgy 5 hours ago 1 reply      
This was excellent. It described some of the reluctance I've had towards social networks since 2000 at least.

It's also a little bit funny that it was written by the guy behind pinboard.in, a nice social bookmarking service (where many people went when del.icio.us died). But that makes me trust the service more, not less.

Which probably means I am stupid.

lightyrs 6 hours ago 0 replies      
Ditto the kudos on the formatting. This piece really resonated with me. As for solutions, I have none. Hopefully someone smarter and more resourceful than me will be inspired by this talk.
ChrisNorstrom 6 hours ago 6 replies      
The formatting of this article itself is something worth studying. It's brilliantly seductive; you read, and read all of it. The pictures/slides by each paragraph were like rewards, continually luring me to the next paragraph. For the first time in a long time, I read every single word, not skimmed.

Although I disagree with the idea of regulating how long behavioral data is saved. Not all behavioral data is sensitive. Rather, we should consider fully disclosing to users how long their data will be saved, what data has been collected on them, or both. Any other regulations may be too burdensome for a startup.

=== Examples ===

His suggestion that all behavioral data be deleted after a certain period of time means every little piece of data collected must also have a timestamp, inflating databases and costing money.

A program must be written that seeks out timestamped data ready to expire and delete it.

If the deleted data is connected with other pieces of data or reports elsewhere we're going to run into complex problems.

These obligations must be handed down from company to company during acquisitions. A company selling data about to expire will get acquired for a lot less than a company with fresh data. This may in turn cause a series of unforeseen consequences in the acquisition market.
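The expiry job described in the examples above can be sketched in a few lines. This is a hypothetical illustration, not code from the article: the record shape, the `collectedAt` field, and the 18-month window (borrowed from the Google comment elsewhere in this thread) are all my own assumptions.

```javascript
// Hypothetical sketch of a retention sweep: every record carries a
// collectedAt timestamp, and the sweep keeps only records younger than
// the retention window. Field names are invented for illustration.
const RETENTION_MS = 18 * 30 * 24 * 60 * 60 * 1000; // roughly 18 months

function sweepExpired(records, now, retentionMs = RETENTION_MS) {
  // Anything at or past the retention age is dropped.
  return records.filter((r) => now - r.collectedAt < retentionMs);
}
```

Note that even this trivial version pays the costs the comment predicts: every record needs a timestamp, and the sweep has to run somewhere on a schedule.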

=== Solution ===

Rather than controlling and manipulating what can and cannot be done, it may be best to just create transparent policies and let the free market converge its way towards a compromise.

pradeep89 1 hour ago 0 replies      
> America built 75,000 kilometers of interstate highways

Liked the use of kilometers over miles.

angersock 6 hours ago 1 reply      
There's a rather hilarious portion (in an otherwise soul-crushing deck): the author is trying to figure out what this massive dragnet and mining of their information has actually gotten them, and so they look at all of the ads they get served. This brings forth this gem:

"There was an ad for the new Pixies album. This was the one ad that was well targeted; I love the Pixies. I got the torrent right away. "

davidhariri 5 hours ago 0 replies      
I really like this essay.

L_Rahman 6 hours ago 1 reply      
Still reading the talk, but as an aside wanted to point out that the way the transcript is formatted with the words alongside the slides is probably the best way I've seen a talk presented in text form on the internet.
nl 1 hour ago 2 replies      
It's too bad that Maciej Ceglowski (the author) is banned on HN, over some infraction I never understood.
quadrangle 6 hours ago 0 replies      
This tells the truth.
quadrangle 6 hours ago 0 replies      
One thing: this guy says he couldn't figure out how to block YouTube ads. Ridiculous. It was years before I learned they even had any. Adblock Plus or Adblock Edge both fully block them if you have EasyList (the default).
Show HN: Smash: The game-changing tennis wearable kickstarter.com
10 points by evjan  1 hour ago   discuss
Learning to Love Complex Numbers jeremykun.com
25 points by adbge  3 hours ago   2 comments top 2
gohrt 1 hour ago 0 replies      

Kalid Azad's oft-posted article:http://betterexplained.com/articles/a-visual-intuitive-guide...

Tristn Needham's book Visual Complex Analysis http://usf.usfca.edu/vca//

zercool 2 hours ago 0 replies      
Some fantastic visualizations to accompany his essay. I would love an ipython notebook version that I could download and play with.
A Statement About Mahbod's Annotations on Elliot Rodger's Manifesto rapgenius.com
83 points by ovechtrick  6 hours ago   71 comments top 11
tptacek 5 hours ago 9 replies      
I can appreciate RG's leadership having the self-awareness to see the need to make an Adult Decision in this case, but I think it's probably a mistake for them to continue hosting Rodger's diary.

Contrary to Moghadam's comments, the diary is not particularly well-written. It's long, repetitive, weirdly detailed (the author recounts meals eaten years ago), and studded with evidence of psychopathy.

RG's style of annotation works extremely well for some kinds of writing --- song lyrics, The Great Gatsby, TS Eliot poems. What I think those things have in common is that they're hospitable to "riffing" and cross-linking; for instance, the lyrics to the ICP song where they come out of the closet as religious are totally incongruous until RG annotations inform you that they're reprised lyrics from previous ICP songs.

But riffing on Rodger's diary doesn't serve the same purpose, at least so close to the event. It is instead a minefield; almost anything you can say risks diminishing the tragedy, or misapprehending how the mind of a deeply mentally ill person functions, or, god help us, using the output of that mind as a platform on which to build suggestions on changing our culture.

There may be some point at which RG annotations will add value to this terribly sad artifact of Elliot Rodger, but it probably won't be in 2014.

rl3 31 minutes ago 0 replies      
Obviously Mahbod's latest annotations were inappropriate. Based on his past behavior, these recent antics should not come as much of a surprise.

As an aside, it has always amazed me how many people over the years have failed to realize Rap Genius' gimmick is just that, a gimmick. It's their attempt at using an admittedly off-color flavor of comedy to build their brand. Said brand is heavily rooted in rap, which is one of the most politically incorrect and offensive mediums of pop culture in existence today.

It stands to reason that when the Rap Genius founders are in character, their behavior should not be taken literally as a reflection of who they really are as people.

A good example of this is when they were featured on stage at TechCrunch Disrupt 2013:


Most people simply took them literally, were offended, and jumped on the revulsion bandwagon. Others understood that the RG guys were essentially mocking the startup scene and the rap scene at the same time, in effect making fun of themselves.


In Mahbod's case specifically, it seemed like he was aiming for humor that went right up to the line but didn't cross it. Unfortunately, comedy is a hit-or-miss endeavor and some of the misses were bound to cross that line. Add to that his medical issues potentially adversely affecting his judgement, and it's no wonder.

Was what he said inappropriate? Absolutely.

Should he have been fired for it? Debatable.

Should we assume he's a terrible human being (as some other comments have implied)? Certainly not.

JumpCrisscross 5 hours ago 3 replies      
Context: http://gawker.com/rap-genius-co-founders-creepy-comments-on-...

Mahbod compliments several of Rodger's sentences as "artful" and/or "beautifully written". That is okay, if ill-timed. One can make a stylistic statement about Mein Kampf without endorsing its message.

He also, however, speculates that Rodger's "sister is smokin' hot." That is violently inappropriate, particularly given the misogynistic nature of Elliot's crimes.

nightpool 5 hours ago 0 replies      
Personally, I think this is a long time coming. Like Tom, I have nothing but respect for Maboo, but I think many, many people have raised the point that his, for lack of a better word, "antics" are holding the site back. I know a lot of people in the HN community have been sharply critical of him, and for good reason. Maybe that's the main driver here, and Maboo's absolutely inappropriate response is the catalyst.
minimaxir 5 hours ago 3 replies      
Mahbod has been fired (i.e. not "resigned") due to these comments.


darkrabbi 5 hours ago 1 reply      
Apparently this guy had a brain tumor, and this wasn't the first time he'd publicly embarrassed the company. He tweeted from the RapGenius account "WARREN BUFFETT CAN SUCK MY DICK".



gordonguthrie 5 hours ago 6 replies      
But they are still turning violence against women into page views.
cheetos 5 hours ago 2 replies      
Who are RapGenius' investors? Who is bankrolling these people?
arfliw 1 hour ago 0 replies      
Sounds like somebody wanted to get rid of a co-founder and found a great opportunity to do it.
onewaystreet 5 hours ago 0 replies      
If it was just the annotations I'd be surprised, but this is just another incident by Mahbod that has caused RapGenius bad press. It was the last straw.
angersock 5 hours ago 1 reply      
I wish they had shown the annotations that caused the firing. The Valleywag link elsewhere has one screenshot, but other than being in poor taste I'm not sure why this rates a firing/step down.
Boosted's Electric Skateboard (YC S12) wired.com
19 points by sethbannon  3 hours ago   8 comments top 6
nl 31 minutes ago 0 replies      
I can't wait to see what kind of vehicle Boosted starts working on next, now that they have this done.

Electric bicycles are a good, existing market (especially in Europe), but there are already some good options there, and I suspect they'd prefer something where their R&D is more directly applicable (bicycles are heavier, and human power can take a bigger proportion of the propulsion load). OTOH, it might be possible to engineer an add-on solution for existing bikes that would work well.

Electric scooters seem like a good bet. The form factor is similar, and I'd imagine many of the technologies can be directly transplanted. I think a folding electric scooter could be more practical than a longboard for many people.

There are of course more exotic options. The self-balancing electric unicycle (which was on HN the other day) was interesting, and I'd hope there are other things waiting to be invented.

Anyway - I'm very excited about the increased diversity in transport options.

xal 2 hours ago 0 replies      
I've used mine every day since I got it. It's hard to describe how amazing the riding experience is.
billmalarky 1 hour ago 0 replies      
I'm excited to see the number of players in the electric skateboard market. I'd like to see costs come down though, it seems like all of them are in the $1200-$2000 range with not much differentiation. I suspect the one who can bring costs down while maintaining quality will be the winner in this market.

I'm not sure what Boosted is bringing to the table honestly, compared to brands like [Evolve](http://evolveskateboardsusa.com/) that have been out for a while now and have better specs at a significantly lower price.

Holbein 2 hours ago 2 replies      
Boosted boards are 6.8kg. The Marbel board is even lighter at 4.5kg(!), so it's even easier to carry, and the range is greater as well (16km instead of 9.6km):


timhargis 2 hours ago 0 replies      
Great product. I was on the pre-order list at $1299 and was disappointed to then get an email a month ago saying the price went up to $2000.

Outside of that, awesome job on the product. Looks great.

dang 2 hours ago 0 replies      
The article title is a tad too linkbaity (plus slightly misleading, given its final sentence), so we changed it to the caption from the photo.
App.js: Mobile webapps made easy kik.com
33 points by tdrnd  5 hours ago   8 comments top 3
evv 45 minutes ago 0 replies      
I would be a bigger fan if there were some included phonegap utilities & tutorials.

Famo.us is making the same mistake for some reason: aggressively targeting mobile with half of the toolchain needed to actually ship mobile apps missing.

Not that I can blame the OP, who is likely not an SF startup with millions in funding.

gearoidoc 3 hours ago 2 replies      
Looks nice but I'm just wondering what this offers above Mobile Angular.js?
notduncansmith 2 hours ago 1 reply      
Not to be confused with AppJS[0], a (recently deprecated) SDK for building native desktop apps with Node.js.

[0] https://github.com/appjs/appjs

The Scroll Up Bar usabilitypost.com
123 points by muloka  11 hours ago   78 comments top 30
masterleep 8 hours ago 2 replies      
Fixed bars are annoying and should rarely be used. But what's even worse is when the mobile version helpfully removes the content or feature that you'd like to see.

Please just have one site, make it efficient, and be done with it.

Aardwolf 9 hours ago 4 replies      
Am I the only one who prefers to read websites in "desktop" mode?

I always feel totally alienated by the mobile page, there is information left out, annoying badly-implemented JS scrollers, etc...

The "desktop" site looks always more nice and familiar, and you can zoom and scroll around as you wish to read everything.

RobotCaleb 10 hours ago 3 replies      
They're a problem on non-mobile, as well. If the bar is over content, but the scrollbar isn't adjusted for it, pressing space or pagedown doesn't move down one page worth of visible content. I don't get why breaking pagedown is acceptable. It's my preferred way to read long text.
dkrich 1 hour ago 0 replies      
The problem is that he's assuming the primary goal of the site's UI is to make reading the content as enjoyable and efficient an experience as possible.

Sadly, that's rarely the case.

A site like Forbes gets paid every time you click through to another article. Once you've landed on an article and you're reading it, your value to them has been expended. Thus they have an interest almost diametrically opposed to yours: you want to read the content uninterrupted, and they want you to click another link.

SoftwareMaven 10 hours ago 2 replies      
What's worse is some sites that do this can't seem to differentiate between iPads and iPhones and seem to use percentages everywhere in their CSS. As a result, you get a huge header (which feels larger in landscape orientation).
mmphosis 9 hours ago 0 replies      
I used to call these type of things, short-lived fads really: User Interface (UI) element of the week. Fortunately, UI isn't changing that often anymore. However, Javascript and the "mobile" platform are bringing "User Interface element of the week" back.

I am seeing lots of weird scrolly things these days. And, pages that blink out and then reappear, as they get rendered and rerendered and rerendered again by Javascript. Lots of other annoying Web 2.0 thingies that do little more than annoy. If I was a better writer, I could probably write a weekly WTF about "User Interface element of the week."

Yes, "fixed bars" are annoying. We've had "fixed bars" on non-mobile for as long as I remember: Main Menu Bar, Title Bar, ooh and now a "Tab Bar" which is just the modern version of Windows MDI, maybe Windows MDI, Navigation bar(s) which sometimes takes up a quarter of the screen height I kid you not, the Bookmarks Bar which gets hidden the first time I open the browser and all corporate links get deleted what a pain.

Including your "bar" in the contents and letting it scroll away might be easiest, and the author noted medium's clever idea.

pera 10 hours ago 4 replies      
I don't get why these bars are being used everywhere, personally I find them annoying. Scrolling up takes like half a second...
WickyNilliams 3 hours ago 0 replies      
Someone has already mentioned my JS lib to handle this below, but as the author I feel compelled to mention it myself with some additional explanation.

I built headroom.js [0] to handle exactly this. It simply adds classes on scroll up or down, so you can be as fancy as you want (or not!) with the show/hide effect. You can set a custom offset (e.g. don't invoke the hide/show mechanism until 100px down the page), you can set a tolerance (e.g. must have scrolled more than 10px before hide/show), and a few other features for more advanced usage.

And for fun I built a little playground so you can explore the various features and find a configuration you like [1]

[0] http://wicky.nillia.ms/headroom.js/

[1] http://wicky.nillia.ms/headroom.js/playroom/

(meta: I submitted it here, but it never gained traction; someone else submitted it to Designer News and it absolutely blew up. Can't believe it almost has 4000 stars!)
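The core of the behavior headroom.js describes, hide on scroll down, show on scroll up, with an offset and a tolerance, can be sketched as a small pure function. This is my own reconstruction for illustration, not the library's actual source; only the option names mirror its documented offset/tolerance settings:

```javascript
// Hypothetical headroom-style decision function: given the previous and
// current scroll positions, return which class to apply ("pinned" shows
// the bar, "unpinned" hides it), or null for no change.
function headroomState(prevY, currY, { offset = 0, tolerance = 0 } = {}) {
  const delta = currY - prevY;
  if (Math.abs(delta) <= tolerance) return null;       // movement too small: ignore
  if (delta > 0 && currY >= offset) return "unpinned"; // scrolling down past offset: hide
  if (delta < 0) return "pinned";                      // scrolling up: show
  return null;                                         // scrolling down but above offset
}
```

A scroll handler would then just swap these classes on the bar element and let CSS transitions do the animation, which is why the approach stays cheap even on mobile.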

pfalke 10 hours ago 1 reply      
Worse than taking up screen space, these bars often exhibit laggy scrolling behavior and cause other UI bugs. There are very few well working implementations of such fixed bars, partly due to difficulties like the position:fixed implementation in Mobile Safari[0] - unless that has been changed in iOS7.

I highly recommend refraining from using position:fixed on mobile devices.

[0] http://remysharp.com/2012/05/24/issues-with-position-fixed-s...

JamisonM 10 hours ago 3 replies      
The thing about this is that there is, as far as I can tell, no good solution for the fact that it can take a long, long time to return to the top of a page with a lot of text on a mobile device. If I read 3/4 of a lengthy story on my phone and I want to navigate somewhere else on the same website my only practical option is to revisit the main page of the site and start navigating from there.

Do any mobile browsers have a "return to top of page" function? My keyboard has a "Home" key, my phone does not.

Pxtl 4 hours ago 0 replies      
I honestly loathe the "scroll up bar" feature in Chrome browser because it throws off my instincts about how to interact with the UI. If something auto-hides into the top of the screen, my instinct is to pull it down from the top if I want to see it again. That just brings up the Android notification system.
lnanek2 10 hours ago 1 reply      
I was all ready to complain that I want some sort of nav/menu thing always stuck at the top, but I'm actually OK with the solution presented where it just shows as soon as you scroll up without having to scroll all the way.
erso 7 hours ago 0 replies      

  An interesting way to solve the issue is to hide the bar when scrolling down, and show it when scrolling up.
This pattern is one of the many irritating things about the mobile Chrome and iOS 7 browsers that prevent me from using devices implementing either.

I typically stick to reading around the top of my device, and occasionally I want to re-read something I just read. Instead of just getting to re-read the hidden lines, I have to continue scrolling while stupid chrome or a fixed bar appears, and then finally lets me scroll the content.

It's probably the case that a lot of people love this, but I hate it. If I want to see the browser chrome or navigational elements, I'm happy to tap the top of the window to scroll me there. I don't want the browser trying to figure out what I want to do based purely on scrolling.

vxNsr 8 hours ago 0 replies      
Fixed bars and JS pop-ups are the window pop-ups of the 90s, and really I'm just waiting for someone to make a Chrome extension that takes care of 'em in the same way.

The only real issue is that mobile browsers don't allow extensions of any kind (at least the ones I've used don't). So there is no real way to add such customizations to mobile browsers, and we're left hoping they go mainstream enough that someone either creates a browser around that feature (i.e. a user-agent switcher, which is kind of annoying to use because it means copying the URL and switching apps), or a dev on a mainstream browser makes it their weekend project. Neither of these options is ideal: in the first you're left with a bunch of browsers that each do one thing, and in the latter you're going to end up with a feature that slowly stops working as the dev's main work builds and his manager tells him to drop it.

baby 8 hours ago 1 reply      
I see no problems at all with fixed top bars. They take what? 20 pixels? 50 pixels at most. It's really not a huge loss. I'm more annoyed by lateral bars, since I already use the horizontal space with Tree Style Tab.
xux 10 hours ago 2 replies      
For me, the ideal solution is just get rid of the fixed bar, and have a clickable menu drop down.

I don't want a page to analyze my every behavior and try to predict what I want to do. Sometimes I just like to scroll up and down to look for stuff, and I don't want some bar flashing in and out.

jasonkostempski 10 hours ago 1 reply      
I don't know if Android has the feature where you just tap the status bar to scroll to top but iOS does. With that feature I'd rather nav bars just stay at the top of the page, I'll just tap the status bar if I need to get back up there, which is almost never anyway.
dmalik 10 hours ago 1 reply      
I was hoping to read about a usability study done when I clicked the link. I do agree for blogs and articles but for web apps I'm not sold.
nilved 6 hours ago 1 reply      
I'm quite surprised to see anybody suggesting this pattern be used, because it's exactly why I had to stop using Firefox for Android. It makes no sense to need to scroll up and lose your place in a page to access the menu. Whichever designer suggested those two actions be bound together doesn't have any business designing.
lobo_tuerto 10 hours ago 0 replies      
The solution presented there is the same used by Google Chrome on Android, it might surprise a totally new user, but one gets used to it pretty fast. Seems like it's the best solution all around.
currysausage 6 hours ago 0 replies      
> An interesting way to solve the issue is to hide the bar when scrolling down, and show it when scrolling up.

No. When I scroll up, that's what I usually want to do: scroll up, see some content that is currently out of view. If that bar appears first, I have to swipe an inch more, which doesn't sound that horrible, but it results in an inconsistency between mental model (swipe down 1 inch, see what is 1 inch above) and technological reality (sorry, you need to scroll 2 inches!).

In the eBay Android app, where I want to quickly compare search results, this annoys the hell out of me.

One of the best things about touch interfaces is the natural mapping between mental model and technology. Let's not break this.

emsy 4 hours ago 0 replies      
The worst implementation I've seen was for a desktop only web application. The navigation bar was only expanded when the scrollbar was at the very top. For very long pages they implemented a button that scrolls the user back to the top of the page so the navigation bar expands and the user can navigate. In the meantime, the horizontal space went mainly unused. The boss saw the IBM page and wanted the menus like this.
pimlottc 6 hours ago 1 reply      
I suspect this was influenced by the browser behavior in iOS 7. Almost all the browser chrome (full address bar and status bar) disappears when you start scrolling but reappears if you scroll back up quickly.

In fact, the iOS behavior is rather more nuanced:

  * Scrolling down hides chrome
  * Swiping up quickly reveals chrome
  * Scrolling up slowly does not reveal chrome
  * Scrolling to the top of the page reveals chrome
  * Over-scrolling past the bottom reveals chrome
It's often interesting to see how much consideration Apple puts into small details like this.
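Those five rules compose into a surprisingly small state function. The sketch below is my own toy model of the listed behavior, not Apple's code; the "fast swipe" threshold and the page geometry parameters are made-up values for illustration:

```javascript
// Toy model of the iOS 7 chrome rules: given a scroll step, decide whether
// the browser chrome should be visible afterwards. wasVisible carries the
// previous state so that a slow scroll up leaves chrome unchanged.
function chromeVisible(prevY, currY, wasVisible, pageHeight, viewport, fast = 30) {
  const delta = currY - prevY;
  if (currY <= 0) return true;                      // at the top: reveal
  if (currY + viewport >= pageHeight) return true;  // over-scrolled past bottom: reveal
  if (delta > 0) return false;                      // scrolling down: hide
  if (delta < 0 && -delta >= fast) return true;     // quick swipe up: reveal
  return wasVisible;                                // slow scroll up: no change
}
```

Writing it out this way makes the "slow scroll up does not reveal" case stand out: it is the only rule that depends on history rather than position, which is exactly the nuance most site-level imitations get wrong.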

sp332 10 hours ago 0 replies      
Vimeo has an odd top-bar that hardly shows up at all when you first load the page, but if you scroll up (when you're already at the top of the page) it unfolds and shows you more videos. e.g. http://vimeo.com/28408829
lukasm 10 hours ago 2 replies      
FYI one plugin that does that is http://wicky.nillia.ms/headroom.js/
dj-wonk 10 hours ago 1 reply      
"Creeper" nav bars (an appropriate term, I think) are partly a consequence of mobile devices not letting you instantly jump to the top of a page. Mobile devices should offer a snappy, intuitive way to jump to the top or bottom of a page.
jessaustin 8 hours ago 0 replies      
Is it weird that the title of TFA is "The Scroll Up Bar" and the title of this post is "Fixed bars are becoming a new nightmare on mobile"?
jasontsui 9 hours ago 1 reply      
Fun pattern, but is it more useful? I think it's hyperbolic to call losing 160px of reading space at the top of a desktop browser a "nightmare". Something a user can see all the time has greater affordances than something that is hidden, especially if that something is as essential as navigation. Honestly, I think this is more style than usability.
julenx 9 hours ago 0 replies      
Am I the only one hating Twitter's Android app behavior in terms of fixed bars?

I tend to read tweets from oldest to newest, and the "Home/Discover/Activity" bar always gets in my way. Moreover, I don't even need the top blue bar. Just gimme the content!

dang 8 hours ago 0 replies      
We reverted the title. The submitted title was "Fixed bars are becoming a new nightmare on mobile".

The HN guidelines ask you not to rewrite titles. Especially please do not rewrite them to make them more controversial.

Altair 8800 Loading 4K BASIC with a Teletype [video] youtube.com
55 points by mmastrac  7 hours ago   18 comments top 9
kps 1 hour ago 0 replies      
Paper tape is why ASCII DEL is 0x7F: it's the only idempotent punch.

And why typing DEL is properly a forward delete operation: it eliminates the character under the cursor by converting it to a DEL (to be ignored when reading the tape) and, like any other punch, advances to the next. (I blame the VT220 for mucking this up and leading to endless confusion.)
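The arithmetic behind "idempotent punch" is worth spelling out: punching paper tape can only add holes, which is a bitwise OR, and 0x7F already has all seven data bits set, so punching DEL over any character yields DEL, and punching anything over a DEL leaves it a DEL. A quick sketch (the function names are mine):

```javascript
// Punching over an existing frame can only add holes: model it as OR.
const punch = (existing, char) => existing | char;

// DEL = 0x7F has every data bit set, so it absorbs any over-punch.
const DEL = 0x7f;

// Over-punching DEL onto every possible 7-bit character yields DEL each time.
const overpunchedAll = Array.from({ length: 128 }, (_, c) => punch(c, DEL))
  .every((result) => result === DEL);
```

Strictly it's "absorbing" rather than idempotent in the algebraic sense, but the practical point stands: DEL is the one code you can always safely punch over a mistake.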

Gracana 1 hour ago 1 reply      
I've always been disappointed that I missed out on this era of computing. I had a couple Apple IIs in the late 90s, and I enjoyed them immensely, but it was old stuff by then and much of the community was gone, and people had moved on to newer things.

I have dreams of creating a modern microcomputer. A computer for the hacker masses, inexpensive with modest specifications and simple design, with ample parallel and serial IO. Something that puts you close to the metal with few distractions and limited complexity, like an arduino but interactive and self-contained. Like what the Raspberry Pi was meant to be, but without binary blobs and complicated operating systems.

Is that an idea that appeals to anyone else? Whenever I think about it, I feel all warm and fuzzy inside. I know some of it is nostalgia, but I also think there is a lot to be said for the creativity and inspiration that arises from working in simple constrained systems.

robterrell 4 hours ago 2 replies      
I remember doing this for my dad... sitting in his computer store, entering bytes of the boot-loader program by flipping switches. In my memory, it took much longer than in the video. The computer was an Altair clone, one of these: https://www.youtube.com/watch?v=VYhbzCAzNy0 and the paper-tape reader was also different (not connected to a teletype, just a stand-alone paper tape reader with a serial port) so I guess it was a different bootloader entirely.
0x0 56 minutes ago 0 replies      
We've come a long way in just a couple of decades. The gap between the computer shown in the video and the iOS device I'm watching the video on is mind-blowing.
jeffbarr 2 hours ago 1 reply      
I have yet to watch this, but I am pretty sure that the first two bytes of the loader are 063, 307. I loaded Altair Basic dozens of times as a teenager in the late 1970s.

Am I right?

greenyoda 6 hours ago 1 reply      
At the end of the video (approx. 8:25), he tells you that the machine is actually an Altair clone made with modern technology, not a real Altair 8800.
JoeAltmaier 6 hours ago 2 replies      
My brother built one of those from a kit when I was a kid. I wrote my first computer game in 8080 assembler, in 128 BYTES of RAM.

He donated it to the San Jose tech museum I think.

vmmenon 5 hours ago 0 replies      
wow. surreal ...
Data Modeling in Graph Databases: Interview with Jim Webber and Ian Robinson infoq.com
50 points by ancatrusca  7 hours ago   14 comments top 3
glesica 6 hours ago 1 reply      
Speaking from personal experience, so YMMV, the trickiest thing about moving from a relational or document DB mindset to a graph DB mindset is remembering that you can store information implicitly in the structure of the graph.

So, as a very simple example, you don't have a Comment node with attributes for the person who wrote the comment and the article the comment is associated with. You just have edges pointing back to those things. Nowhere in the comment, or even in the edges, is there anything that looks like an ID or foreign key.

Unlike a document DB, however, you don't have weirdness once you have something like co-authorship. Just point to both authors, no need to duplicate the data or set up some kind of pseudo foreign key. Once you get the hang of it, it's a really elegant way to store data.
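A rough analogy in plain C (illustrative only; the struct and field names are made up, and a real graph DB of course does far more than this): relationships become direct references rather than IDs or foreign keys.

```c
#include <assert.h>
#include <string.h>

typedef struct { const char *name; } person;
typedef struct { const char *title; } article;

/* The comment node carries no IDs and no foreign keys; its
 * relationships are direct references, playing the role of edges. */
typedef struct {
    const char *text;
    person *author;   /* "WROTE" edge back to the person */
    article *on;      /* "ON" edge back to the article */
} comment;

/* Co-authorship is just more edges -- no duplicated data and no
 * pseudo foreign keys needed. */
typedef struct {
    const char *title;
    person *authors[2];
} coauthored_article;
```

Traversing from a comment to its author is then just a dereference, mirroring how graph queries walk edges instead of joining on keys.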

joe_the_user 6 hours ago 1 reply      
Well, I'm glad to get text with InfoQ rather than a video.

Still, "Relational databases are fine things, even for large data sets, up to the point where you have to join. And in every relational database use case that we've seen, there's always a join and in extreme cases, when an ORM has written and hidden particularly poor SQL, many indiscriminate joins."

It seems like the overall argument is for (what I see as) a step backward from the declarative model to a lower level imperative model. "You never know what memory your implicit declarations will allocate, better do everything in explicit c-like loops as your data expands."

It's almost like an argument for a return to the world of "hardware is expensive, people are cheap" and for all I know that's what's happening with really big data. But it seems a bit sad to present it as a step forward.

k__ 5 hours ago 3 replies      
Last thing I heard was that graph DBs don't scale well.

Is there any information about this?

I wanted to build a system with tagged content and thought about using a graph DB. (Soft-)realtime queries etc.

Averting Disaster: A Guide To Computer Backups (2014) anandtech.com
16 points by tambourine_man  3 hours ago   6 comments top 4
yoda_sl 1 hour ago 1 reply      
My backup strategy, which so far I am happy with:

- On my Macs, use Time Machine; even on my MBP, using a NiftyDrive with a selected set of folders that are important, since the NiftyDrive is limited (currently I have 64GB)
- Remote with CrashPlan, which I got a few years back during a Black Friday sale at a great discounted price (wish they were doing that again)

I had to use both a few times to restore some files, and in one case my MBP HD went bad... Through CrashPlan I was able to restore everything (it took almost 12 hours to re-download everything), but at the end, to my extreme surprise, I lost less than 5 min of work, since CrashPlan was quite up to date.

I do use Dropbox, of course, but I consider it a syncing service rather than a backup destination. For a few months now I have had a FileTransporter from Connected Data and have started to use it more and more, since I can store up to 1TB with no monthly fee, but I have not yet made the full jump from Dropbox.

I will be curious to hear anyone else's solution.

userbinator 1 hour ago 1 reply      
I think it should be mentioned that any form of high-capacity flash storage (USB drives, memory cards, SSDs) is not recommended for storing backups that are to be kept for a long time (>1-2 years); magnetic and optical media are preferred.

Making sure that your backups actually can be restored is also extremely important; there's not much worse than thinking that you have backups, but when you need them, find that they've become corrupted and unusable.

hga 58 minutes ago 0 replies      
While it's a bit dated, this book, Backup & Recovery: Inexpensive Backup Solutions for Open Systems (http://www.amazon.com/Backup-Recovery-Inexpensive-Solutions-...) is highly recommended. And higher level programs like BackupPC and Bacula are still excellent solutions.

It's comprehensive, covers off-line bare metal backups (which aren't exactly changing any time soon), points you at tools like rdiff-backup which you can use to get reasonably close to continuous data protection (I do it every hour), etc. etc. Along with a few good and short war stories. And preps you for the big times, if you're interested.

bcl 2 hours ago 0 replies      
I've been using BackupPC on a Linux box for years. First with RAID5 and now with RAID1, using ssh+rsync I can backup my Linux and OSX systems and have access to older backups via the web interface. For totally irreplaceable items I also back up a subset to Glacier.
Simple, Fast, Practical Non-Blocking and Blocking Concurrent Queue Algorithms rochester.edu
35 points by luu  6 hours ago   5 comments top 3
jhallenworld 4 hours ago 1 reply      
You can get better performance in the one writer thread / one reader thread case by using simple lock-free circular buffers. No special instructions are needed for this. Cache line thrashing can be minimized by allocating the pointers into four cache lines: one writer local, one for writer-to-reader communications, one reader local, and one for reader-to-writer communications.

You do need barriers: issue a write barrier before giving a copy of the write pointer to the reader. Issue a read barrier before reading the data.

Avoid cache line thrashing in a mostly full case by never allowing the circular buffer to become completely full (leave at least one cache line free).

The reader thread can quickly poll many circular buffers for work. The pointers will all reside in the reader's cache until someone has written some data (and updates the reader's copy of the write pointer).

You can get more benefits by delaying the update of the other thread's pointers: on the writer's side, until we have no more data available to write. On the reader's side, until we have read all the data (or some larger chunk of it). This allows the cache-line prefetching hardware to work (to prefetch likely used data).

Anyway, if you really want to use a linked list, at the very least allocate multiple items per cache line, and then link the cache lines together (so one link pointer or less per cache line).
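A minimal sketch of the single-writer/single-reader ring described above, using C11 atomics in place of explicit barriers (names and the int payload are illustrative; the "leave a cache line free" and delayed-pointer-update refinements are omitted for brevity):

```c
#include <stdatomic.h>
#include <stddef.h>

#define RING_SIZE 1024  /* power of two, so free-running indices wrap safely */

typedef struct {
    /* Writer-local and shared indices live on separate cache lines,
     * as described, to minimize thrashing. */
    _Alignas(64) size_t head;                 /* writer-local */
    _Alignas(64) _Atomic size_t head_shared;  /* writer -> reader */
    _Alignas(64) size_t tail;                 /* reader-local */
    _Alignas(64) _Atomic size_t tail_shared;  /* reader -> writer */
    int buf[RING_SIZE];
} spsc_ring;

/* Returns 1 on success, 0 if full. The release store is the "write
 * barrier before giving a copy of the write pointer to the reader". */
static int ring_push(spsc_ring *r, int v) {
    size_t tail = atomic_load_explicit(&r->tail_shared, memory_order_acquire);
    if (r->head - tail == RING_SIZE)
        return 0;  /* full */
    r->buf[r->head % RING_SIZE] = v;
    r->head++;
    atomic_store_explicit(&r->head_shared, r->head, memory_order_release);
    return 1;
}

/* Returns 1 on success, 0 if empty. The acquire load is the "read
 * barrier before reading the data". */
static int ring_pop(spsc_ring *r, int *out) {
    size_t head = atomic_load_explicit(&r->head_shared, memory_order_acquire);
    if (r->tail == head)
        return 0;  /* empty */
    *out = r->buf[r->tail % RING_SIZE];
    r->tail++;
    atomic_store_explicit(&r->tail_shared, r->tail, memory_order_release);
    return 1;
}
```

Because each index has exactly one writer, the local copies need no atomics at all; only the two shared copies do.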

jleahy 5 hours ago 0 replies      
This should really be tagged as 1996, especially as since this was written the world of lock-free algorithms has moved forwards a lot. At the very least Fober et al suggested an alternative that performs a tad better in 2002 (http://nedko.arnaudov.name/soft/L17_Fober.pdf).
coder23 5 hours ago 1 reply      
I see a problem with implementing the non-blocking version in C. Compare-and-swap must operate on an entire struct, but gcc, for example, only allows integer types. Quote from the gcc docs: "The definition given in the Intel documentation allows only for the use of the types int, long, long long as well as their unsigned counterparts. GCC will allow any integral scalar or pointer type that is 1, 2, 4 or 8 bytes in length."

C11 has a new header stdatomic.h which only allows atomic for integer types.

How do you perform CAS on a struct in C (without using a mutex, of course)?
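One common workaround, sketched below under the assumption that the struct fits in 8 bytes: pun it through a union with a natively CAS-able integer. (Worth noting: C11 also permits `_Atomic` on struct types, though the implementation may fall back to a lock, and GCC's `__atomic` builtins can handle 16-byte objects on some targets via libatomic.) All names here are illustrative.

```c
#include <stdatomic.h>
#include <stdint.h>

/* A "tagged reference": index into a node pool plus an ABA counter,
 * packed into 8 bytes so a plain 64-bit CAS applies. */
typedef struct {
    uint32_t node;    /* index into a preallocated node pool */
    uint32_t count;   /* ABA tag */
} tagged_ref;

typedef union {
    tagged_ref ref;
    uint64_t bits;    /* 8 bytes: CAS-able on typical 64-bit targets */
} tagged_ref_u;

static _Atomic uint64_t head_bits;

/* Try to swing head from `expected` to `desired`; 1 on success. */
static int cas_head(tagged_ref expected, tagged_ref desired) {
    tagged_ref_u e = { .ref = expected };
    tagged_ref_u d = { .ref = desired };
    return atomic_compare_exchange_strong(&head_bits, &e.bits, d.bits);
}
```

Using pool indices instead of raw pointers is what frees up the bits for the counter; the counter is what defeats the ABA problem the paper's algorithm relies on avoiding.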

Apple said to be prepping smart home software platform for WWDC techcrunch.com
49 points by JumpCrisscross  7 hours ago   60 comments top 16
bengoodger 7 hours ago 2 replies      
It will be interesting to see how much of this relies on the presence of existing hard-wired systems, many of which are controllable over the network (e.g. Lutron, many security systems, etc). Writing to other people's APIs has not typically been Apple's strong suit. At the same time, wired-in infrastructure in a home is nothing like as disposable as the consumer technology Apple focuses on. There is no way I would want a closed system baked into my house. Hard-wired infrastructure is something you're stuck with for decades.

Relevant to me as I'm in the rough-in phase of a whole-home remodel project. I struggled a lot with what level of control to use, before settling on a minimal Lutron system for some areas of the house, comfortable in the knowledge that if I get tired of the Lutron interface, I can create my own & poke it over the network. I understand most people aren't interested in this level of control, but they might be concerned when their interface becomes dated in 5 years and can't be replaced due to an incompatibility with the hard-wired infrastructure. I see this as a downside of closed systems like Control4.

k-mcgrady 7 hours ago 2 replies      
It would be interesting to see how the market would change if Apple got involved in this. As cool as home automation tech is, it's never taken off. Apple could potentially jump-start the market - or it could be a total flop.

Another interesting side effect of this will be how it affects Apple's 'coolness'. The only people willing to spend the money on home automation tech are those who own homes or who don't plan on moving any time soon - i.e. not young people.

Tiktaalik 1 hour ago 0 replies      
I'm not sure my ~600 sq ft apartment really needs automation. It's pretty simple and manageable. Maybe Apple has dreamed up a delightful solution for a problem I don't realize I have.

My garden however is a complex mess that could benefit from automation.

bluedevil2k 7 hours ago 1 reply      
One issue I see with this is that most home automation solutions (not all) run with a central hub, a server that talks to the other devices, and relays messages to/from them. This hub has an IP address that is reachable from home or away, so you can control it via your smartphone. The devices do not have individual IP addresses that are reachable. This would mean that to get a home automation system working, Apple would have to sell another device into the home. That's sometimes a hurdle for them. (Home automation server = new Apple TV??)

Yes, I know some home automation devices have the server built into every device (every plug, every camera, every light is a standalone "thing" in the Internet of Things), but this adds significantly to the cost. On the other hand, with some good software, it makes controlling everything without a central hub possible.

natch 6 hours ago 1 reply      
I've read many of the posts here from people saying they don't see how home automation could be of interest to more than a handful of super geeky people.

However, there is another side to this, and that is monitoring.

Home monitoring technology opens up a much larger group of interested consumers for this.

My point is we shouldn't just be talking about automation here. Automation is only part of the story. Monitoring may be a much bigger part.

egypturnash 6 hours ago 1 reply      
Oh man I really hope there's a reasonably open plugin architecture for this.
julianpye 7 hours ago 6 replies      
Currently most home automation products are geared towards family fathers who want to control every aspect of their families' lives - control freaks. This is a market that does not scale. Apple therefore has to present an entirely new perspective. I have led several projects in this space, and the one service that has been the most promising and was accepted by several family members was the anthropomorphization of the technology: presenting an addressable assistant that takes care of your home while you're away.
return0 6 hours ago 1 reply      
And so begins the War of the Thermostats
JimmaDaRustla 7 hours ago 0 replies      
Home Automation needs to be one of those things that is open - end points controlled and monitored from an independent system. I'd be upset if I went to buy a home and it wasn't compatible with my phone!
nitrogen 6 hours ago 0 replies      
I'm still upset that Apple bought PrimeSense and shut down OpenNI. Might this be the reason?
icpmacdo 6 hours ago 0 replies      
Do you think that they will open the home up to an app store like they did with phones?
confusedguy 7 hours ago 0 replies      
Let's see if Apple can help with home automation's popularity. A couple of months ago there were a couple of Kickstarter projects which gained quite a bit of popularity; it's a good time for an industry giant to join...
dang 7 hours ago 3 replies      
We changed the url from http://on.ft.com/1w8mTkI, which is the original source but behind a paywall.

I hate to do this, because HN strongly prefers original sources. Of course, people sometimes post a Google search url that one can click through to read the OP. But we can't make that the official URL for the post.

If anyone has a suggestion to solve this problem, please let us know.

gondo 7 hours ago 0 replies      
Can't even open the article; it asked for registration straightaway.
amirmc 6 hours ago 6 replies      
I don't trust Apple to do this well.

Firstly, they cannot (or will not) make their devices multi-user, which is a problem in the home environment (eg an Apple TV that's tied to only my account - and already exposes more info than I'd like).

Secondly, Apple has a habit of abandoning things and shutting them down when it doesn't suit them. This is mostly ok for some services but would be disastrous for me as a user if I've bought into the system.

Finally, Apple doesn't play nice with others. Perhaps they've learned to be better since they've had to interface more due to the App Store but for the most part I don't see them caring about an ecosystem other than their own.

The Basis story: an accidental sync service in Clojure for iOS devices sapient-pair.com
39 points by mphillips  8 hours ago   4 comments top 3
bsaul 1 hour ago 1 reply      
Seems to me like the merge part of the algorithm is a bit overkill:

Generate a uuid for each item on your list upon creation (on the client side), then sync the actions on the item (create, update, delete). There's never the need to "merge" on the server. All you need is to create a serial "history" of all the actions on the server side, and replay them in that order on clients. You do that by locking on the DB when a client wants to send new actions.

PS: I'm currently coding exactly that, so it's more a way for me to share my ideas than to criticize.
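A toy sketch of that serial-history idea (everything here is illustrative: integer stand-ins for the UUIDs, an in-memory log, and seq doubling as the array index):

```c
#include <stddef.h>

typedef enum { ACT_CREATE, ACT_UPDATE, ACT_DELETE } action_kind;

typedef struct {
    long seq;            /* server-assigned position in the history */
    action_kind kind;
    int uuid;            /* stand-in for the client-generated item UUID */
    const char *payload; /* new value; NULL for deletes */
} action;

#define MAX_ITEMS 16

typedef struct {
    int present[MAX_ITEMS];
    const char *value[MAX_ITEMS];
} store;

/* Replay log entries [from_seq, n) in server order. Because every
 * client applies the same totally ordered history, all replicas
 * converge without any merge step. */
static void replay(store *s, const action *log, long n, long from_seq) {
    for (long i = from_seq; i < n; i++) {
        const action *a = &log[i];
        switch (a->kind) {
        case ACT_CREATE:
        case ACT_UPDATE:
            s->present[a->uuid] = 1;   /* upsert for simplicity */
            s->value[a->uuid] = a->payload;
            break;
        case ACT_DELETE:
            s->present[a->uuid] = 0;
            break;
        }
    }
}
```

The lock the comment mentions only needs to cover the append that assigns seq; replay itself is read-only on the log, so clients can catch up from any point.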

notjosh 4 hours ago 0 replies      
> And in fact Couch DB would have been close to perfect if I could have run a Couch DB server on the client, which would be an insane thing to contemplate.

Couchbase Lite (http://developer.couchbase.com/mobile/develop/guides/couchba...) does exactly this - except it's not an actual Couch instance, but rather an SQLite database, and a client that speaks the necessary Couch protocols for sync/replication. Not so insane, and (as far as I've seen so far... besides git!) the only tool that really handles sync well across platforms.

edit: Ah, CBL (and Datomic) are mentioned in part 3 of the article. All is well :)

orandolabs 4 hours ago 0 replies      
EnduroSync (https://orandolabs.com) is a new service for syncing object stores. It has clients for iOS and Android now, with several others right around the corner.

Data is modeled with objects. The object store works offline and online. If you sign up for the service, then you get syncing.

EnduroSync also has a very nice permission model, enabling sharing of object stores in a variety of ways (per user, per app, ...).

LDC: The LLVM-based D compiler a talk by Kai Nacke at FOSDEM 2014 [video] dotsrc.org
39 points by andralex  8 hours ago   1 comment top
kolev 6 hours ago 0 replies      
Nice! Here's a link to the GitHub project: https://github.com/ldc-developers/ldc
Show HN: YubiKey HMAC-SHA1 support for KeePassX github.com
19 points by 2bluesc  5 hours ago   7 comments top 2
ef4 3 hours ago 1 reply      
Thanks, this looks interesting. I'm not clear on how the master seed gets regenerated at every save.

Presumably the secret key used to generate the HMAC never leaves the YubiKey? So when you want to change the seed, you need to ask the YubiKey to sign the new seed? So saving the database requires pushing the button on the YubiKey again?

2bluesc 5 hours ago 1 reply      
Open to criticisms, suggestions and testing :)
The Sewing Machine Patent Wars (2009) volokh.com
15 points by gwern  4 hours ago   2 comments top
shasta 2 hours ago 1 reply      
These blogs describe Howe as an NPE/patent troll, but there's a big difference between a guy that invents something, tries to market it, and then sues people who build off of his invention (fixing the flaws), and a modern day patent troll who collects a bunch of patents and just waits for someone to accidentally infringe.
EC2 California Down?
4 points by nilsbunger  14 minutes ago   2 comments top
melvinmt 11 minutes ago 1 reply      
Down for me too (us-west-1b).

AWS Status Update: 5:51 PM PDT We are investigating connectivity issues for EC2 instances and impaired EBS volumes in a single Availability Zone in the US-WEST-1 Region.

New Haxe website haxe.org
94 points by yanhick  15 hours ago   41 comments top 12
sspiff 8 hours ago 1 reply      
Here are a few bugs I've encountered over the weekend:

1. The Linux binaries on the new site are a tar.gz.gz, making `tar xf filename` fail, and besides that inconvenience, is probably a packaging error. You might want to fix that.

2. The OpenFL provided Haxe installer is broken, because it tried to download http://haxe.org/file/haxe-3.1.3-linux64.tar.gz, which is gone. Please either fix the OpenFL installer or make sure download links are backwards compatible by adding a redirect. (I'm assuming both projects are run by the same community)

I worked around problem #2 by modifying the installer script to use old.haxe.org, since the binaries on the new site are harder to use because of problem #1. But this could definitely dissuade newcomers to Haxe, despite all of its qualities (which, in my opinion, are many!)

joneil 13 hours ago 1 reply      
One of the developers of the new site here. Simon Krajewski did an amazing job writing the manual - it's a great resource and has all the gory technical detail - I'd encourage you to take a look. A number of other people have contributed to content and design (though we'd still like to give the design some more love).

The website itself is developed in Haxe, and all of the content is hosted on Github so we can encourage contribution but still keep an eye on the quality, unlike the wiki we had previously. There's a "Contribute" link at the bottom of each page that links you to the relevant file on Github.

If you have any questions let me know, I'm happy to answer. I hope you find it a valuable resource.

jdonaldson 4 hours ago 0 replies      
The conference was great as well. It was amazing to hear how fast Tivo was able to port their 1M+ loc library over to Haxe, and how they're able to compile the client to new platforms now.
hcarvalhoalves 6 hours ago 1 reply      
Interesting, never heard of it before. How can they possibly support so many targets? [1]

[1] http://haxe.org/documentation/introduction/compiler-targets....

jrpt 8 hours ago 0 replies      
I hope Haxe succeeds even more than it already has. Congrats guys.

What's a good and up-to-date Haxe tutorial for making mobile apps, targeted at Mac developers?

jimmcslim 12 hours ago 1 reply      
I think I was under the impression that Haxe was just an alternative Flash/AS3 runtime, but it appears that it is actually far more than that? Either that, or the association with Flash is obviously less important these days?

EDIT: I think that impression came from reading this [1] article, possibly found here... or possibly some other article... anyway it talked extensively about AS3 and made comparisons against Haxe.

1 - http://www.grantmathews.com/43

zschuessler 12 hours ago 5 replies      
Has anyone used Haxe in a production project? What were your experiences?

I like the end goal, but am confused why I haven't heard of it until today.

wiradikusuma 12 hours ago 3 replies      
I'm confused: what's the purpose of targeting these languages (http://haxe.org/documentation/introduction/compiler-targets....)?

I can understand e.g. people use CoffeeScript because JavaScript is more verbose. But why target Python from Haxe? Shouldn't it be the other way around?

Fizzadar 11 hours ago 3 replies      
Fantastic idea. Shame the language had to take so many cues from JavaScript!
chrismorgan 9 hours ago 1 reply      
Finding some 404s about the place.
abus 13 hours ago 3 replies      
Why does every website now use an icon-font import style that renders as broken Unicode box characters?
razaina 14 hours ago 1 reply      
Good job guys
Has the average web page actually gotten smaller? webperformancetoday.com
27 points by gbl08ma  8 hours ago   6 comments top 6
adwf 1 hour ago 0 replies      
A single data point does not make a trend. I think it's highly unlikely that websites are going to get smaller over time, for the simple reason that consumer internet speed/bandwidth gets faster and faster every year.

The real interesting statistic is page load times. If load time is remaining static whilst page sizes are increasing (mostly due to images it seems), then there isn't anything to worry about. Businesses are just keeping their websites within a certain performance envelope and are responding to greater bandwidth as it comes about. (Not to mention JS speed increases)

If on the other hand, websites are growing fatter and page load is getting slower, then there is something counter-intuitive happening. But I doubt it.

hmsimha 1 hour ago 0 replies      
One important thing to consider (and I'm not sure that the survey does) is that a lot of this weight may come from cached sources, such as javascript libraries sourced from CDNs. While more libraries = more total content weight, using common libraries may reduce the weight of the content that actually gets loaded, by eliminating the requirement for redundant compatibility boilerplate and providing DOM manipulation tools that allow much of the page content to be created client-side.
gohrt 5 hours ago 0 replies      
First off, "smaller" is comparing May 2014 to Nov 2013, a short-term effect.


> The only content type that experienced significant shrinkage was other.

> So ruling out third-party content leaves us speculating that either the shrinkage is due to a decrease in use of video (quite possible) or an undocumented change in the testing process (somewhat possible).


There is some interesting info in the article, but the headline isn't it.

breck 5 hours ago 0 replies      
I wrote a tool a while back to see whether the number and complexity of DOM elements on webpages has changed over the years - http://domtree.breckyunits.com/

I was surprised to see that at least as far as the HTML is concerned, there hasn't been much change in the past 15 years.

dalek2point3 4 hours ago 0 replies      
Separate shoutout to the HTTP Archive: http://httparchive.org/about.php

Never knew this thing existed, but seems like a really nice resource.

lingben 4 hours ago 0 replies      
I suspect it is due to CSS3 and its powerful arsenal, which has replaced a lot of JS.
Discovery's Teardrop jacook.name
123 points by runlevel1  18 hours ago   19 comments top 5
ender7 4 hours ago 0 replies      
As an aside, I would highly recommend making a trip out to see Discovery if you are ever in the DC area (the Udvar Hazy Center is just a few minutes' drive from Dulles airport). They let you get remarkably close to the shuttle (close enough to touch, although you shouldn't do so). Also, they chose not to clean her up very much when they installed her (as is somewhat evident in the linked photos). As a result she retains all of her scars from usage and re-entry, which I think makes her feel both much more real and fantastical.
joezydeco 11 hours ago 1 reply      
Nice. Now I want a font geek to look into why Discovery's name is done in a nice Helvetica-like typeface but the final "y" has that odd kink at the end of it.
dewey 16 hours ago 6 replies      
I'm always surprised at how uneven and hacked-together the surface of these shuttles looks. I'm pretty sure there's a solid reason for keeping it that way; one would just think it's easier to spot issues on the surface if it's more consistent, like the surface of a plane.
TimFogarty 11 hours ago 1 reply      
This was a beautiful read. I too feel an emotional connection to these great machines and the fantastic people who design, build, and pilot them. To me, our missions into space perfectly capture everything to be admired in Humankind. They represent our inexorable daring in the face of great adversity, they showcase our insatiable curiosity, and, since the end of the Cold War, they have united people across the globe in the greatest adventure upon which we have ever embarked as a species.

From the ground the moon and the planets seem so far away, so ethereal. Yet just knowing that we have sent people and probes to touch them underlines how obtainable the impossible can be with grit and science, per ardua ad astra. It gives me hope.

I wish that the spirit of international cooperation that has grown around this endeavour will endure over the coming decades and beyond.

notfoss 16 hours ago 0 replies      
Fascinating read. Too bad that the conclusion is still a hypothesis.

Also, this is the first time that I have taken a close look at the surface of the shuttle, and as @dewey noted, I too found it to be very uneven and blocky. From a distance, they look reasonably smooth.

Gitjk command to undo what you just did in git github.com
67 points by mapmeld  6 hours ago   28 comments top 12
_kst_ 4 hours ago 2 replies      
If you name it "git-jk" rather than "gitjk", you can invoke it as

  git jk

bilalq 4 hours ago 0 replies      
It's kind of nice that this just gives suggestions by default rather than running potentially breaking commands.

But as someone else here mentioned, you really want to understand the reflog in git. Everyone screws up at some point or another, and it's much easier to work through things when you can rely on the reflog to act as a safety net and an anchor of sanity.

jamesgeck0 4 hours ago 0 replies      
I occasionally wish git had an undo-whatever-I-just-did command, and I've been using it for years.

The git fixup page comes in quite handy. https://sethrobertson.github.io/GitFixUm/fixup.html

tetha 4 hours ago 1 reply      
Why is sudo hardcoded into so many commands? I'll run your program with privileges if it needs and earned them, thank you very much.
tjdetwiler 2 hours ago 0 replies      
I'm pretty sure your fetch logic is not right. First it assumes no arguments are passed. You assume origin/master (which is convention but not guaranteed) and also you assume a refspec isn't passed (in which case you need to rollback FETCH_HEAD).
prezjordan 4 hours ago 0 replies      
The git reflog gets you out of all kinds of messes. https://medium.com/git-tips/a4189dd88c40
burke 3 hours ago 0 replies      
rebase isn't very hard to implement; you just have to check the reflog.
thrush 2 hours ago 0 replies      
Does it work if you call it 2x or more?
lunixbochs 4 hours ago 0 replies      
If anyone wants a fish function:

    function jk
        history | head -n 10 | tail -r | gitjk_cmd
    end
Maybe a full git undo could work via automatic snapshotting (branch/stash)? It's fine unless you hit a perf wall on huge repos.

homulilly 4 hours ago 1 reply      
Uh... if you want to undo a git add shouldn't you be doing git reset?
develop7 4 hours ago 0 replies      
So it relies on shell history? Well, it won't work for me then; in my history there are only frequently-used commands.
lazzlazzlazz 4 hours ago 1 reply      
sudo hard-coded, and an intolerably stupid name. Next.