hacker news with inline top comments    .. more ..    2 Jul 2017 News
Minitel: The Online World France Built Before the Web ieee.org
56 points by sohkamyung  1 hour ago   13 comments top 7
slau 41 minutes ago 2 replies      
A company I worked for used to host Minitel services. In particular, it was a system to handle driving test allotments, reservations and cancellations between the government ("Préfecture") and the driving schools.

The service was provided for free to the government, the company organised free training sessions for government clerks, and it was the driving schools who paid for the service when dialling into our minitel servers.

It technically wasn't a monopoly, because the driving schools could still go down to the préfecture, and do everything using the forms/pen/paper.

The company tried on a number of occasions to get the driving schools to move from the Minitel service to the new web version. Every single time, there was a huge push-back from the driving school unions, about how expensive the new service was, and how "unusable" the website was, compared to the Minitel.

We even had people calling in, saying that we were extortionists. "We've been using this for over 20 years, and never paid a cent; now you want us to pay xx a month?" I guess some of them really didn't look at their phone bill.

I heard about 4 or 5 "planned terminations" of the Minitel service during my stint from 2010 to 2015. France Telecom/Orange even provides a "Minitel over IP" service these days, where a website can be enrolled into their payment service, and users pay per minute on the website. It's a superb scam tool (just have a hidden iframe open a pay-as-you-go page), and Orange is constantly fighting the fraudsters.

lloeki 17 minutes ago 0 replies      
A couple of historical anecdotes:

There was the equivalent of the hug of death multiple times every year when students were checking their results, overloading the servers as (tens/hundreds of) thousands of people tried to furiously dial in simultaneously to get results from various nationwide exams such as the infamous Baccalauréat.

In 1981 for the presidential elections, the result was broadcast live on the Minitel and showed up live on the news:

(On TV) https://youtu.be/rJHUZNlO9ao

(Remastered Minitel output) https://youtu.be/JIZ_D34J3-I

srge 34 minutes ago 0 replies      
It was great for "piracy". You had message boards where people would swap floppies. You basically copied a game (Atari ST games, of course) and sent the floppies hoping your counterpart would do the same.

It was a great time: I learnt a lot about the geek community and its culture of sharing, and got access to many games which, at age 15, I could not afford.

ekianjo 30 minutes ago 1 reply      
Is it just me, or are we getting more and more articles about the Minitel on HN these days?
fermigier 42 minutes ago 2 replies      
"and dudes (mecs in French) browsed the personal ads at 3615 MEC." <- Something was lost in the translation there.
mrkrab 51 minutes ago 1 reply      
Don't forget about Infova in Spain, too, even though that was much later.
ForHackernews 1 hour ago 0 replies      
Some previous discussion here: https://news.ycombinator.com/item?id=14577881
Memory use in CPython and MicroPython lwn.net
51 points by signa11  3 hours ago   16 comments top 2
netheril96 2 hours ago 3 replies      
When Python 3 broke compatibility, that would have been the perfect time to implement tagged pointers, small integers and other performance improvements. At least it would have been an easier sell to your boss to rewrite your code base: "we are doing this so it runs faster" is a much better argument than "we are doing it for intellectual purity" from a business standpoint.
dom0 1 hour ago 0 replies      
The comparison is interesting, but CPython's lavish use of memory (which of course has a performance impact by clogging caches, though that's one of the least problems you have with CPython, performance wise) is well known.

This is in part because there are still macros such as PyBytes_GET_SIZE which directly access struct members, and these macros are part of the stable interpreter ABI. That doesn't mean small-integer optimisations and such for length fields aren't possible, it just means they can't happen for Python 3 any more. Tagged pointers would probably break too much code to ever happen.

Well known as it may be, people are still surprised that bytes() requires at least 33 bytes (due to the implicit extra NUL byte), that an empty string is around 50 bytes, and that every item in a dict or set takes between 30 and 70 bytes. All this overhead adds up.
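These figures are easy to check from a Python prompt with sys.getsizeof. A rough sketch (exact numbers vary a little between CPython versions, and a 64-bit build is assumed):

```python
import sys

# Per-object overhead in a 64-bit CPython build:
print(sys.getsizeof(b""))   # empty bytes object: ~33 bytes (header + trailing NUL)
print(sys.getsizeof(""))    # empty str: ~49 bytes
print(sys.getsizeof({}))    # empty dict: ~64 bytes

# The fixed header dwarfs small payloads: a 10-byte bytes object is
# exactly 10 bytes of data on top of the empty-object size.
print(sys.getsizeof(b"x" * 10) - sys.getsizeof(b""))  # -> 10
```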


Borg works around these problems with a simple hash table (straight out of the textbook, with some associated issues). Even though that one is in itself inefficient in how it uses memory, it still uses only a fraction of what the equivalent dict in CPython 3.6 would use. I recently added a similar, pure-Python construct in another place (borg mount).
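A textbook open-addressing table of the kind described can be sketched in pure Python. This is only an illustration of the idea, not Borg's actual code; keys are assumed to be non-negative ints so that -1 can serve as the empty sentinel:

```python
class IntMap:
    """Textbook open-addressing hash table with linear probing.

    Keys and values live in flat arrays, so per-entry bookkeeping is a
    few machine words instead of the 30-70 bytes a dict entry costs.
    Illustrative sketch only; keys must be non-negative ints.
    """
    EMPTY = -1

    def __init__(self, capacity=64):
        self._keys = [self.EMPTY] * capacity
        self._vals = [None] * capacity
        self._n = 0

    def _slot(self, key):
        # Probe linearly until we find the key or an empty slot.
        i = hash(key) % len(self._keys)
        while self._keys[i] not in (self.EMPTY, key):
            i = (i + 1) % len(self._keys)
        return i

    def __setitem__(self, key, val):
        if self._n * 2 >= len(self._keys):   # keep load factor under 0.5
            self._grow()
        i = self._slot(key)
        if self._keys[i] == self.EMPTY:
            self._keys[i] = key
            self._n += 1
        self._vals[i] = val

    def __getitem__(self, key):
        i = self._slot(key)
        if self._keys[i] == self.EMPTY:
            raise KeyError(key)
        return self._vals[i]

    def _grow(self):
        old = [(k, v) for k, v in zip(self._keys, self._vals) if k != self.EMPTY]
        self._keys = [self.EMPTY] * (len(self._keys) * 2)
        self._vals = [None] * len(self._keys)
        self._n = 0
        for k, v in old:
            self[k] = v

m = IntMap()
m[42] = "answer"
print(m[42])   # -> answer
```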

India switches to a unified GST to replace various indirect taxes nytimes.com
44 points by nileshtrivedi  3 hours ago   13 comments top 4
allendoerfer 0 minutes ago 0 replies      
EU, take note. To me the premise is absolutely true: further unifying the market will spur growth. European start-ups starting in Europe instead of their tiny home countries will be much stronger competition for their US counterparts, which at the moment can often easily crush them because they have already conquered their rich, big, English-speaking home market.
denzil_correa 1 hour ago 1 reply      
GST will increase revenues for consuming states and decrease revenues for manufacturing states. The government will compensate the manufacturing states for the revenue loss; that's how all the states agreed to a GST. This difference will ultimately come out of taxpayers' pockets. Basically, the taxpayer pays for the government's inability to handle compliance.

Anyways, there are many products kept out of the GST purview [0] and therefore, this is like release 0.1a of GST. The idea that it will spur economic growth with the projections given is an exaggeration.

[0] http://www.hindustantimes.com/india-news/gst-adding-to-econo...

imhoguy 4 minutes ago 0 replies      
High time for unified EU VAT, otherwise we will stay behind. VATMESS and lack of unified digital market rules is what keeps cross-EU online business tough.
ultramode 1 hour ago 3 replies      
India is the only federal nation with a true GST. I am excited about its far-reaching effects.
What's in a Continuation (2016) jlongster.com
102 points by ianrtracey  7 hours ago   32 comments top 8
mbrock 4 hours ago 0 replies      
A common strategy for implementing compilers for continuation-enabled languages is transforming code into "continuation-passing style" (CPS).

That means, basically, converting it into total "callback hell", so that functions don't return, they just call other functions with callbacks.

(There's a special "exit" callback that is implicitly the callback for the main function.)

Some imagination is required to see how, say, a for loop is rendered into CPS, but if you first imagine turning the loop into functional programming then it's easier (like in JavaScript, if you want to wait for an asynchronous thing in your for loop body, you have to basically do a CPS transformation manually, or use async/await which is closely related).

So basically in Scheme, your code is always in a callback-passing style, just that the language cleverly hides it, and then lets you explicitly access the current callback (using call/cc).

If you have experience with async programming in JavaScript, it should make sense that this lets you easily implement things like concurrency, custom blocking operations, etc.

Just as JavaScript callbacks can be called several times, so can continuations. Since the callbacks are implicit in Scheme, you can make what appears to be a function that returns several times ("nondeterminism").

Callbacks can of course take several arguments. Most languages have an asymmetry where functions have several parameters but can only return one value. With continuations, it's easy to imagine calling them with several arguments. So in a language with continuations, it makes sense to have multiple return values too.
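The transformation can be sketched in Python (a hypothetical illustration; Scheme compilers perform this rewrite mechanically over the whole program):

```python
# Direct style: the call stack remembers "what happens next".
def fact(n):
    if n == 0:
        return 1
    return n * fact(n - 1)

# Continuation-passing style: "what happens next" is an explicit
# callback k. Nothing ever returns; every function finishes by
# calling its continuation instead.
def fact_cps(n, k):
    if n == 0:
        k(1)
    else:
        # The continuation of the recursive call multiplies by n,
        # then hands the product on to *our* continuation.
        fact_cps(n - 1, lambda r: k(n * r))

results = []
fact_cps(5, results.append)   # results.append plays the "exit" callback
print(results[0])             # -> 120
```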

pdelgallego 1 hour ago 0 replies      
Lisp in Small Pieces [0] has a good explanation of how to implement continuation-based compilers.

[0] https://www.amazon.com/Lisp-Small-Pieces-Christian-Queinnec/...

tonyle 5 hours ago 1 reply      
Simple continuation explanation for web developers.

 1. Write some JS code in Chrome and verify the expected behaviour.
 2. Add a debugger statement.
 3. When the debugger pops up, go down a few frames and add a breakpoint.
 4. Right-click the frame and select restart.
 5. At the new breakpoint, write some code in the console to modify the state.
 6. When you step through the code, it now does something else.
Now imagine if the language allow you to do this programmatically in the code by passing the frame around as an argument similar to functions.

I leave the rest to imagination.

convolvatron 4 hours ago 1 reply      
Reading the comments here is pretty painful, inasmuch as the call stack seems to be so central to people's perception of what computing is.

If you look at it from the assembly perspective, it's just a jump that's been augmented with state (the closure) and additional parameters. I think trying to describe it as a snapshot, or as multiple returns, is confusing, since it describes continuations in terms of their stack behaviour.

The easiest way to think about them is to add an implicit argument to each function, which is the place to return to (jump to with the context, augmented with the return value). Call it c. Then "return x" is just c(x).

There is no longer any stack or implicit return target (above me). Removing that common control-flow assumption lets you build all sorts of different plumbing and get into an arbitrarily deep amount of trouble (the good kind and the bad kind).

call/cc has a pretty natural implementation in this model (heap-allocated activation records).

But as someone else mentioned, choosing the simple continuation model makes a lot of choices for you in the runtime and the compiler. A common complaint from compiler land is that it makes it difficult to reason about reordering later on. You also lean really heavily on the GC to clean up the frames that the stack was taking care of for you (see "Cheney on the MTA").

noway421 1 hour ago 1 reply      
They are quite powerful, and not to diminish their use of course, but they do feel like glorified, annotated goto statements. Which is not that bad in the end, because even break; would be just a special case of goto.
skybrian 5 hours ago 1 reply      
The bizarre thing about continuations is that you can call a function once and return from it an arbitrary number of times. It seems like this would break invariants in any function that takes a callback argument but doesn't expect the callback to save a continuation?

If the continuation could be resumed at most once, this would be more like suspending a thread/fiber and resuming it later.
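Python has no continuations, but a generator gives a feel for this restricted, resume-once flavour (a rough analogy, not a real fiber implementation):

```python
def task():
    # Each yield suspends the frame. The paused frame acts like a
    # one-shot continuation: it can be resumed exactly once per
    # suspension, and never rewound to an earlier point.
    yield "suspended at step 1"
    yield "suspended at step 2"

t = task()
print(next(t))   # -> suspended at step 1
print(next(t))   # -> suspended at step 2
# Calling next(t) again raises StopIteration: the continuation is used up.
```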

Tarean 2 hours ago 0 replies      
Delimited continuations reify the rest of the block as a function. From within the definition of a continuation you can use all variables in scope together with that function and can mash them together however you want.

You can reimplement low level control flow with this but generally it is mostly useful as a reinversion of control. Some code (like async IO) expects callbacks so you lose control over the program flow which makes composition difficult. You can reinverse this by using futures which often just wrap continuations.
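The reinversion can be sketched with a minimal future. This is an illustration only, not how asyncio's Future is actually implemented, and async_read is a made-up callback-style API:

```python
class Future:
    """Minimal future: lets callers register "what happens next"
    instead of handing control away to the callback-taking API."""
    def __init__(self):
        self._done = False
        self._value = None
        self._callbacks = []

    def add_done_callback(self, cb):
        if self._done:
            cb(self._value)          # already resolved: fire immediately
        else:
            self._callbacks.append(cb)

    def set_result(self, value):
        self._done = True
        self._value = value
        for cb in self._callbacks:
            cb(value)

# Callback-style API (control inverted: the library calls us).
def async_read(on_done):
    on_done("data")

# Wrapped into a future (control reinverted: we choose what runs next).
def read_future():
    f = Future()
    async_read(f.set_result)
    return f

got = []
read_future().add_done_callback(got.append)
print(got)   # -> ['data']
```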

sillysaurus3 6 hours ago 9 replies      
It took me a long time to grok continuations, but there's an easy way to explain what they are:

Imagine running a program in a VM. You know how you can take a snapshot and then restore to it later? That snapshot is equivalent to a continuation.

Another way of phrasing it is, it's your program frozen in time. You can snapshot your program and restore to that point later.

To put it technically, step through each call stack frame, serialize all the local variables, and you have yourself a continuation. To invoke it, call those functions in order and set the local variables to those values, then set the program counter to wherever it was. (You don't literally do this, but maybe that makes it easier to understand what's going on with it.)

The confusion: What about a database connection? Or a network connection of any kind? An open file handle? Etc. The answer is that those things can't be saved in a continuation.

The way that this works in Scheme is that there's a special primitive called "dynamic-wind". It takes three callback functions: "before", "during", and "after", and invokes them in order. If execution leaves "during" for any reason whatsoever, then "after" is invoked.

Here's the kicker: If execution goes back into "during", then "before" is invoked. I.e. if you save a continuation inside "during," then "before" is the place that you'd put the code to re-initiate a database connection or re-open a file handle. Or fail.
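The shape of this can be sketched in Python, with the caveat that without first-class continuations only the straight-line case can be modelled; the re-entry behaviour (running "before" again when a saved continuation jumps back into "during") has no direct analogue here:

```python
def dynamic_wind(before, during, after):
    """Toy model of Scheme's dynamic-wind for the non-reentrant case:
    'before' runs on entry, 'after' runs on any exit, normal or not.
    (Real dynamic-wind also reruns 'before' on continuation re-entry,
    which plain Python cannot express.)"""
    before()
    try:
        return during()
    finally:
        after()

log = []
try:
    dynamic_wind(
        lambda: log.append("before"),
        lambda: 1 / 0,               # "during" exits abnormally...
        lambda: log.append("after")  # ...but "after" still runs
    )
except ZeroDivisionError:
    pass
print(log)   # -> ['before', 'after']
```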

And of course, no discussion of continuations would be complete without the argument for why call/cc is generally an anti-pattern: http://okmij.org/ftp/continuations/against-callcc.html

Yet "On Lisp" presents several interesting ways to use them, and they are extremely powerful. One particular use is that you can implement cut, fail, and mark for non-deterministic backtracking. If you've used emacs lisp and you've ever written an edebug specification for how to debug a macro (https://www.gnu.org/software/emacs/manual/html_node/elisp/Sp...), some of the more complex features require backtracking: https://github.com/emacs-mirror/emacs/blob/0648edf3e05e224ee...

That's an area where continuations really shine, because the implementation can be just a few dozen lines compared to hundreds.

(elisp doesn't actually use continuations -- this is just an example of the territory they're useful in.)

How I Took an API Side Project to 250M Daily Requests ipinfo.io
384 points by coderholic  14 hours ago   137 comments top 34
Tloewald 11 hours ago 0 replies      
I'd like to point out that the things he says he's doing instead of marketing are, in fact, marketing. It's "guerilla" marketing, and it's being paid for with the writer's time. Nothing wrong with that, just don't confuse "marketing" with "advertising".
rickduggan 3 hours ago 1 reply      
This is super cool. I use a similar API to provide a client-side service called IP Request Mapper (https://chrome.google.com/webstore/detail/ip-request-mapper/...). Coming soon to a Show HN near you.

What it does is show where every asset on a web page is loaded from. It allows you to visualize how many different requests go into building just one web page. While it's gotten much better, the Houston Chronicle (https://chron.com) used to make about 500 individual requests to build its home page. It's down to about 125.

It's best to run it across two different monitors, with IP Request Mapper on one monitor and your "normal" browser window on another. Then enter any URL and watch the map start populating, based on geolocating every request made by the page.

But it's projects like ipinfo.io that make these other things possible. Standing on the shoulders of giants and all that...kudos to you, coderholic.

smokybay 14 hours ago 3 replies      
The author does not say how much maintaining the service costs or what the long-term plan is. As others have already mentioned, a similar service existed before and was shut down for a simple reason: there's no point in maintaining it at a constant loss with no clear revenue plan.


westoque 14 hours ago 2 replies      
Good strategy! That's also what I did to get my side project (Cookie Inspector, a Google Chrome cookie editor) to 80,000 daily users.


I marketed it solely on Stack Overflow, where it was getting upvotes; that was all my marketing.


Another big factor is good reviews: when users like your project/product, they will market it for you.

davidivadavid 12 hours ago 13 replies      
I'm not sure why people are proud to do things without spending money on marketing.

What if spending money on marketing had made you grow twice larger? Twice faster?

When people say "I didn't spend money on marketing", the only translation is "I knowingly overlooked massive growth opportunities."

ribrars 4 hours ago 0 replies      
Great overview here on how you solved a problem and built a business around that.

I read that you use Elastic Beanstalk for your server config, but I wanted to ask:

1. What programming language did you use?

2. What, if any, configuration did you have to do to the Elastic Beanstalk config to deal with network spikes and autoscaling?


reacharavindh 12 hours ago 0 replies      
Happy user here. My GF came up to me and asked if I could somehow get country names for the IP addresses she had of her survey respondents. I Googled and found this neat little API. True, I could have downloaded the raw databases from elsewhere and worried about the SQL I'd need, and whether the data is recent or ancient or even correct. I decided that was overkill for my need, and just used this API in a throttled (1 req/s) mode and left it overnight. If I have this IP-to-location need again, I'd happily pay for this API.
unchaotic 4 hours ago 1 reply      
Crowded space. A quick Google search for any of these keywords ("ip address location api", "ip lookup API", "geolocation API by IP", etc.) shows:

- https://db-ip.com/api
- https://ipapi.co
- https://freegeoip.net
- ipinfodb.com
- https://www.iplocation.net
- http://neutrinoapi.com
- http://www.ip2location.com
- https://www.telize.com

and a few dozen more. I wonder if collectively they are serving over a few billion requests per day. Microservices & API culture FTW !

Scirra_Tom 13 hours ago 3 replies      
Where did you get the IP DB from? My understanding is that most of them don't let you resell access?
babuskov 13 hours ago 3 replies      
I'm baffled why anyone would use this when you can import the data into a database and run it on your own server.

I mean, you might spend 20 minutes more to set it up, but you are safe from having to rely on 3rd party service.

Anyway, kudos to coderholic for creating this and sharing the story.
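The roll-your-own approach the comment describes amounts to a sorted range table plus binary search. A minimal sketch (the ranges below are made up for illustration; a real table would come from an imported GeoIP dump):

```python
import bisect
import ipaddress

# Hypothetical range table: (first_ip, last_ip, country), sorted by
# first_ip. A real one would be loaded from a GeoIP database dump.
RANGES = [
    (int(ipaddress.ip_address("1.0.0.0")), int(ipaddress.ip_address("1.0.0.255")), "AU"),
    (int(ipaddress.ip_address("8.8.8.0")), int(ipaddress.ip_address("8.8.8.255")), "US"),
]
STARTS = [r[0] for r in RANGES]

def country_for(ip):
    """Return the country for an IP, or None if no range matches."""
    n = int(ipaddress.ip_address(ip))
    i = bisect.bisect_right(STARTS, n) - 1   # last range starting <= ip
    if i >= 0 and RANGES[i][0] <= n <= RANGES[i][1]:
        return RANGES[i][2]
    return None

print(country_for("8.8.8.8"))   # -> US
```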

drej 2 hours ago 0 replies      
I see it's still a thing. Back in high school, some ten+ years ago, I coded up an 'ip2country' website. Not sure why, there were dozens of those. I guess I had a free domain and a lot of time on my hands. I put some Google AdSense on it and let it go. I checked my AdSense account some six months later and found out I was cashing $20/month. Easiest money I've ever made.
jacquesm 14 hours ago 2 replies      
That's great. Question: does it make money? The words 'profit', 'money', 'income' and 'revenue' do not appear in the article.
elnygren 2 hours ago 0 replies      
The author says "I took" even though this was pure luck and coincidence. Attribution bias is strong in this one.

However, it is important to acknowledge that he did put himself into a position where he was available to become lucky (= he built the API and linked to it).

kevan 6 hours ago 0 replies      
>90% of our 250 million daily requests get handled in less than 10 milliseconds.

Minor nit, but with that level of traffic I'd expect you to be bragging about P99.99 latency, not P90.

fusionflo 9 hours ago 0 replies      
Kudos to you guys for building this. There is always a lot of scepticism from people on "why would anyone pay for this". The reality is that not everyone has the time or resources to build their own kit. There are literally thousands of businesses on the internet that are in the business of selling "time", or timesavers, and removing the risk of maintenance and ongoing support.

Keep improving this and with the rise of web personalization, the demand will continue to grow.

WA 14 hours ago 5 replies      
I use ipinfo.io mostly to see my own public-facing IP address, and for me there are 2 reasons:

- I somehow can remember that domain. I don't have to google "my ip" and dig through weird domains that change all the time

- The design is clean and simple. Not too much information, no ads, and it loads fast.

craigmi 11 hours ago 0 replies      
Pretty cool, man. I use your site all the time for ASN lookups, although I find your carrier information wildly conflicts with Digital Element's DB.
firloop 14 hours ago 0 replies      
Related, some other adventures while running an API to retrieve IP addresses.


mrskitch 13 hours ago 0 replies      
I'm employing a similar strategy for my library https://github.com/joelgriffith/navalia as I couldn't find any solution to manage headless chrome (plus the API for that protocol is pretty heavy).

Building for what folks want, even developers, is so obvious that I think we often forget about it. It's also not as glamorous as self-driving cars or rockets, so it gets dismissed easily.

Sound points though

kasbah 13 hours ago 1 reply      
Does anyone know how ipinfo compares to running your own instance of https://freegeoip.net?
diminish 13 hours ago 0 replies      
Congrats. I'm not sure, but ipinfo could be very interesting to startups and programmers. So a good idea could be writing attractive articles and posting them on HN, Reddit programming, and some other subreddits. That would bring more customers with zero marketing.

See also: https://news.ycombinator.com/from?site=ipinfo.io

larsnystrom 7 hours ago 1 reply      
Ipinfo seems to have the exact same logo as Podio (https://podio.com), a service owned by Citrix.
ge96 4 hours ago 0 replies      
That's crazy, details like lat/long. What if it's a proxy, and where does that data even come from? The ISP? Or do you take the time to build it out, i.e. research? At any rate, cool.
niko001 14 hours ago 1 reply      
This has worked well for me, too. I saw an influx of "How to offer a time-based trial version on Android" on SO and developed a trial SDK as an answer: https://www.trialy.io/
kpsychwave 12 hours ago 1 reply      
Given the fast lookup time, it would be useful if you could provide a JS API for synchronous loading.

Essentially, a blocking script in the dom <script src="...api.js" /> that prepopulates the window object. With clever error handling, this could improve perceived performance significantly.

A few questions:

1. What differentiates you from ip-api.com and other providers?

2. Do you use MaxMind?

3. Is there an option for no-throttling? 100s of simultaneous requests?

I aggregate multiple IP databases for my SaaS (https://www.geoscreenshot.com) and I need highly performant / reliable IP look ups.

SirLJ 11 hours ago 0 replies      
I see the author is posting the same thing every 20 days or so, so here is the 0 marketing...
motyar 4 hours ago 0 replies      
Such stories don't let me stay focused on my freelance work.

I got inspired and started researching and building. (BTW, failing so far.)

merb 14 hours ago 1 reply      
Well, currently my location is basically totally wrong according to https://www.iplocation.net/. I've only seen one service that tracks my location 100% correctly (the correct village); all the others are 200 or more km away from my real location.
rodionos 12 hours ago 0 replies      
Checked one of our static IPs: the country is correct, but the city is 500 miles off.
pier25 10 hours ago 0 replies      
So what's your stack? Still running PHP?
erikb 12 hours ago 1 reply      
How much money do you make per api req?
martin_hnuid 12 hours ago 0 replies      
Thanks for sharing.

I am ready to launch a startup and currently trying to figure out what to focus on (so many ideas!).

I posted an "Ask HN" earlier today. Wondering if anyone might have some thoughts or advice on this:


imaginenore 9 hours ago 1 reply      
Just some rough calculations. Assuming the worst-case scenario, everyone in the highest tiers (the cheapest per request), 250M daily requests means he makes

400 * 250M / 320K = $312,500 per month.

Or $3.75M per year.

Not counting the expenses.
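For what it's worth, the arithmetic checks out under those assumptions (the tier size and price below are the commenter's guesses, not published figures):

```python
daily_requests = 250_000_000
tier_daily_requests = 320_000   # assumed request allowance of the top tier
tier_price_per_month = 400      # assumed monthly price of that tier, USD

monthly = tier_price_per_month * daily_requests / tier_daily_requests
print(monthly)        # -> 312500.0
print(monthly * 12)   # -> 3750000.0  ($3.75M per year)
```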

kalal 11 hours ago 0 replies      
You are great! My karma goes down, please!
Pony: Combining better-than-Rust safe memory sharing with Erlang-like actors ponylang.org
42 points by samuell  4 hours ago   6 comments top 3
Animats 1 hour ago 0 replies      
Interesting. Looking forward to more comments on this.

* It looks like exceptions carry no error information. When something goes wrong, you know nothing. Is that right?

* Calling finalizers from GC is usually troublesome. They get called late, so they can't be relied on to close files and such. They also have to be prevented from making trouble by doing things you can't do during GC, or "re-animating" the object being deleted. How's that handled?

* The notion that variable type is established at initialization is becoming mainstream. How about extending that to structures? The fields of structures could get their types inferred from the structure constructor. (There was a statically typed variant of Python, Shed Skin, which did this.)

posnet 26 minutes ago 0 replies      
I think Adrian Colyer's coverage of one of the Pony papers is the best overview of the Pony capability system.

I've found that the capability system is both the most exciting part of Pony and the most difficult part to grok for a newcomer.


jeffdavis 45 minutes ago 2 replies      
What kind of runtime requirements does it have?
O'Reilly's Decision and Its DRM Implication scottmeyers.blogspot.com
109 points by SeanBoocock  9 hours ago   47 comments top 18
clumsysmurf 8 hours ago 1 reply      
Tim O'Reilly once said "Obscurity is a far greater threat to authors and creative artists than piracy". I discovered and purchased almost every Rosenfeld Media book from O'Reilly.

After O'Reilly moved to DRM-free books, their 2009 sales went up by 104% http://toc.oreilly.com/2010/01/2009-oreilly-ebook-revenue-up...

In other interviews, he seemed confident that DRM wasn't worth it: https://www.forbes.com/forbes/2011/0411/focus-tim-oreilly-me...

Perhaps some part of the equation has changed since then. I'm looking forward to a deeper analysis of the business reasons for this.

I'm also interested to hear what more authors think - I wonder how many agree with Martin Kleppmann (Designing Data Intensive Applications) https://twitter.com/martinkl/status/880336943980085248

This independence day weekend there were a lot of sales, so I purchased:

* "Programming Clojure, Third Edition" from pragprog (30% off sale)

* The entire collection of "Enthusiast's Guide to ..." from rockynook (each for $10)

* "The Quick Python Book 3e", "Serverless Architectures on AWS", "Event Streams in Action", "Get Programming with Haskell" from Manning (50% off)

These sales are the only way I can afford the volume I read. Some of that money would have gone to O'Reilly authors, but they deleted my full cart with $100 worth of stuff before I could purchase!

EDIT: O'Reilly's catalog seemed large and redundant, with publishers (Packt) offering the same materials on their own sites. Some, like Wiley / MKP, only offered very few items from their catalogs. Others, like Rosenfeld / Rocky Nook / No Starch, now provide DRM-free options directly from their sites. I'm hoping at least O'Reilly reconsiders selling their Animal books again.

rst 8 hours ago 2 replies      
Perhaps best read after the blog post from current O'Reilly Media president Laura Baldwin, which makes two important points:

1) Book sales have been consistently declining overall, in all media. It's not clear that DRM has much to do with this.

2) They'll still be selling DRM-free through at least one merchant, Google Play. (It's not clear whether this policy extends to Amazon as well, but they wouldn't be the first publisher selling DRM-free there; Tor's science fiction novels have been DRM-free through all merchants for a few years now.)

Source: https://www.linkedin.com/pulse/oreilly-mission-spreading-kno...

elcapitan 4 hours ago 1 reply      
> They can also reach buyers who want to see the full product before making a purchasing decision or who wouldn't become aware of your book through conventional marketing efforts.

This is definitely true for me, and one of the reasons why O'Reilly is one of the defining "colors" on my bookshelf. In particular, with technical literature I really need to get a good look into a book before I make a commitment and a decision to buy. I just don't want to spend money first and then be stuck with something that turns out to be rather disappointing.

So my usual way of buying books is to download various publications on a topic via Bittorrent and then buying the best one once I know what I want. This is similar to going to a public library, getting a few books, and buying the most convincing one for long-term use. If there was a micropayment way of paying for the short-term evaluation, I would be more than happy to pay for that (as I implicitly pay via library contributions, which go to the publishers to some part).

Having said that, O'Reilly traditionally had the market of being the "printed-out manual of open source software", which I'm pretty sure is dead by now, and I wonder if they can reposition completely. One thing I noticed is that they now often sell books with titles that sound very general ("Data Science for blabla") but turn out to be just tutorials/manuals for some particular framework. That's the kind of book I want to avoid. Nothing against good examples, but I don't need printed-out tutorials.

agibsonccc 3 hours ago 1 reply      
Disclaimer: I have a commercial interest with O'Reilly. I speak at a ton of their conferences and lean on them partially for marketing and lead gen.

O'Reilly author here. FWIW, a ton of us were caught off guard by this as well. At the same time, I can't say I'm surprised.

My commercial incentives for working with O'Reilly weren't about the book per se. I found a ton of value in working with them on their peripheral activities, including Safari and their Strata and AI conferences. I think other folks who write for O'Reilly tend to do these same things.

Pointing out where O'Reilly is making money: it tends to be large companies paying for access to Safari now.

They will be putting other content in there now.

Print is a dying medium. That being said, a ton of people still prefer print.

I don't think any end user or author of theirs is "happy" about this per se. One benefit I liked of the online store was the ability to point people at it for pre-releases and updates. You can't really do that with Amazon.

I may be naively hopeful in saying this, but this should allow them to invest in other distribution channels now as well.

O'Reilly showed they know how to run a distribution channel and may use that expertise in other areas.

As someone closer to this than a lot of people, I'm happy to answer general questions about the process, other ways this could affect us etc, if that helps.

sqldba 6 hours ago 0 replies      
They are my main source for PDF DRM free books (I don't care about stamping my name as long as it can be read anywhere).

If they stop selling then they lose 100% of my business which is about $100 a year.

I don't use other formats; they mangle technical books too badly. Some other publishers like Apress and MS Press do okay too, but if O'Reilly pulls out, it's quite a blow.

orbitingpluto 7 hours ago 1 reply      
I was at a garage sale and I was perusing a bunch of Wrox books. The seller offered me a ridiculous price and then I noticed he was on the cover of one of them.

I asked him about it and I think he'd rather have the latter benefits than the minuscule compensation:

"Piracy is a double-edged sword. On the one hand, it means you receive no compensation for the benefit readers get from the work you put in. On the other hand, pirated books act as implicit marketing, expanding awareness of you and your book(s)."

So I bought the books, but asked if he had another copy of his own book. He said that he did not, though he guessed he should keep one since, after all, he was the author. It must have taken a lot for him to say something like that.

joshmarinacci 7 hours ago 0 replies      
I sincerely doubt the change had anything to do with DRM. Book sales have been declining for years; developers just don't use books as their primary source of learning anymore. I believe O'Reilly is going to double down on their subscription service, Safari.
guelo 7 hours ago 1 reply      
O'Reilly used to produce the definitive bibles that you were required to own to work on different technologies, but I feel like the quality has gone down over the last 10 years or so. I don't know if that is a byproduct of developers relying on books less, or technology moving faster, or poor publishing decisions. But it has probably contributed more than DRM, one way or the other, to their decline.
daeken 8 hours ago 2 replies      
The conclusion here (to paraphrase: "no DRM isn't a big enough draw") doesn't seem to be at all supported. It may be that they want to focus on publishing; it may be that they simply had to charge too much; it may be that it wasn't bringing in enough sales to make it worthwhile, compared to other retail channels.

Without more data (or really... any), this conclusion is pure speculation.

djhworld 1 hour ago 0 replies      
I wonder why they still offer the Google Play versions as "DRM-free".

Is it just Google has more weight they can throw around and O'Reilly didn't want to 'rock the boat', or was it a technical problem?

Apparently the Google Play versions of O'Reilly books are formatted strangely and aren't a direct PDF of the physical books.

newscracker 3 hours ago 3 replies      
> Piracy is a double-edged sword. On the one hand, it means you receive no compensation for the benefit readers get from the work you put in. On the other hand, pirated books act as implicit marketing, expanding awareness of you and your book(s). They can also reach buyers who want to see the full product before making a purchasing decision or who wouldn't become aware of your book through conventional marketing efforts.

The point I've emphasized above really matters a lot for people who do read many books. Despite select chapter previews that some publishers provide, there are people who really want to do their own evaluation of something before committing to buying it.

> My feeling is that most people who choose pirated books are unlikely to pay for them, even if that's the only way to get them. As such, I'm inclined to think the marketing effect of illegal copies exceeds the lost revenue. I have no data to back me up. Maybe it's just a rationalization to help me live with the knowledge that no matter what you do, there's no way you can prevent bootleg copies of your books from showing up on the Net.

Again, the emphasized sentence above has been known for a very long time in the areas of music, movies, TV shows, books - any content, actually. In my observation, people who pirate books also tend to get into a habit of hoarding rather than reading (low disk/storage and bandwidth costs). Leaving aside the people in countries with weaker currencies who cannot really imagine buying a lot of the English-language technical content produced, I doubt that the real loss in revenue is even substantial.

Books also, depending on the subject, require investments of time, attention, memory, and repeated reference, unlike movies, TV shows, and music, which most of the time require a "one-time investment". So I would not put books in the same category as the others when it comes to piracy.

I'm not at all happy with O'Reilly's decision, and did write to O'Reilly support saying that this makes things difficult (finding DRM-free content on Amazon or elsewhere in multiple formats) and that I wouldn't be buying O'Reilly products again. I received a standard reply thanking me for the feedback and pointing me to the blog post. My guess is that the direct customer relationship and brand recognition through its website are going to be lost along with this decision.

I don't know if O'Reilly will reverse the decision, but people who value the freedom of DRM-free content in different formats must voice their opinions by writing to O'Reilly support.

djhworld 1 hour ago 0 replies      
I like reading PDFs of books on my phone, especially on my commute. It works surprisingly well, better than I thought it would.

My work bought a copy of Designing Data-Intensive Applications for the team. I've started reading it, but lugging around 1 kg of book every day gets old really quick; I wish they had included a PDF download coupon or something inside.

hackerpolicy 7 hours ago 0 replies      
No more $5 books by registering as a 'print-book owner'?
acomjean 7 hours ago 1 reply      
I bought probably about 20 books on O'Reilly's website, mainly because they are typically of good quality, but also because they're DRM-free.

I always appreciated that they were available in any format (PDF/EPUB, etc.) and thus easier to search. You can even sync them to your Dropbox automatically after purchase, and download them from there.

We had a book service at a former company and it was terrible: one page at a time, with a clunky web interface. Being able to download books and scan through them was much appreciated.

But as internet search gets better, you find quick solutions on Stack Overflow. It must be hard selling books.

a3n 7 hours ago 0 replies      
I have given a few O'Reilly DRM-free books to colleagues. Younger colleagues, who usually have never heard of an "animal book." I always point out where it came from, and suggest that if they like it, they either buy it, or buy something else from O'Reilly.

Maybe some did, I don't know. I suspect that the ones who didn't probably wouldn't have bought a similar book from anyone, not because of piracy, but because they just aren't into that particular kind of product. So no (real) harm, no foul.

reidacdc 8 hours ago 1 reply      
The death of the DRM-free model was a big concern to me, but a recent supplementary blog post from Laura Baldwin, linked-to in the article but not called out, seems to claim that O'Reilly books will still be available DRM-free from Google Books. It's not clear to me if this means the Google Play e-book store, or something else.

Blog post here: https://www.linkedin.com/pulse/oreilly-mission-spreading-kno...

Animats 2 hours ago 0 replies      
That sounds more like O'Reilly gave up on fulfillment and went with Amazon, like everybody else.
shmerl 2 hours ago 0 replies      
I didn't quite get that. Will their books still be available DRM-free or not? If not, that's a major setback.
2FA using a postcard shkspr.mobi
15 points by edent  4 hours ago   10 comments top 3
djhworld 29 minutes ago 2 replies      
I agree with the comment that the email hint is totally unnecessary, and the ambiguous 3-day expiry window is too confusing in a world where post might take 1-2 days to arrive. Additionally, the postcard aspect means people at every point along the delivery chain can read the back and front; why not just send a letter?

As a side note, I really like the way the author of this blog has constructed his "Contact Me" bit at the bottom, it's very intuitive, clever, and uses URI schemes where appropriate, nice job!

Sami_Lehtinen 33 minutes ago 3 replies      
How about verifying against the national identity database? No need to second-guess identity/address, etc. Isn't that exactly why we've got strong identity: that it's very easy to validate? I wonder why some services use all kinds of pseudo-methods when there are strong and proven methods available. A postcard is also bad because it's not registered mail, where the recipient's identity is verified. Some businesses do use that, but it's still worse than using an online ID, because it's more likely that the identity verification on receiving mail isn't done properly.
jwilk 31 minutes ago 1 reply      
How is this a 2FA?
An Adversarial Review of Adversarial Generation of Natural Language medium.com
87 points by sebg  9 hours ago   16 comments top 6
wodenokoto 46 minutes ago 0 replies      
If you want to read more discussion on this topic (and this article), see the article "A computational linguistic farce in 3 acts" and its HN discussion: https://news.ycombinator.com/item?id=14532306
slashcom 8 hours ago 1 reply      
It's worth noting this post caused a very intense Twitter debate among the NLP and DL communities, especially after Yann LeCun replied to Yoav's comments. https://www.facebook.com/yann.lecun/posts/10154498539442143
fizixer 7 hours ago 3 replies      
This is an instance of the general issue of conflict across what I like to call the salesperson-slacker spectrum.

Most researchers/academics lie somewhere on this spectrum. (Well, I guess most human beings involved in any activity probably do.)

On the one end are salespeople who love to make a mountain out of a molehill they just discovered. On the other, slackers are like the perfectionists who never get anything done because they never resolved their analysis-paralysis.

There are very few who are exactly in the middle of the spectrum. The middle is a point of unstable equilibrium. You have to work very hard to stay there and can easily fall off to one side or the other.

paganel 2 hours ago 0 replies      
I'm not going to comment on that paper being published on arXiv, and I don't generally care about the NLP vs. DL debate; I just wanted to say that those generated examples did indeed all look like rubbish to me.

As do most Google Translate outputs, even though I get the feeling that automatic translation of texts is now seen almost as a solved problem (it's not): all that Google Translate does is change some text from the original language into a second one, which is not a real language, just a language that's sometimes very close (grammatically and lexically) to one the agent/user knows.

The idea is that we should try to look harder and judge the actual results more fairly, and not get stuck on the methodologies.

a3_nm 2 hours ago 3 replies      
> This post is also an ideological action w.r.t arxiv publishing

Did anyone else think this was very weird? The author appears to be complaining that reputable people/labs can post a PDF on arXiv and be taken seriously. How is this avoidable? Without arXiv, they could just post the PDF on their website or anywhere else.

The "risk" associated to publishing crap on arXiv is the same as always: have people notice it's crap and get a bad reputation. I'm not sure what ideology has to do with it.

denzil_correa 1 hour ago 0 replies      
> Communities will naturally recognize contributions and give credit when credit is due. It's always happened that way.

"Let the market decide!"

The earliest known versions of Dennis Ritchie's first C compiler github.com
32 points by jnord  6 hours ago   1 comment top
doppioandante 3 minutes ago 0 replies      
How was this compiler bootstrapped?
Edge cases to keep in mind when working with text thedroidsonroids.com
23 points by submiter_dor  5 hours ago   7 comments top 3
ewjordan 1 hour ago 1 reply      
The Turkish situation referenced (http://gizmodo.com/382026/a-cellphones-missing-dot-kills-two...) is not an indictment of bad tech, but of a fucked up honor-based patriarchal culture.

"Ramazan went to the family's home to apologize, only to be greeted by the father, Emine, two sisters and a lot of very sharp knives."

There's no technological way to fix people that would try to kill someone over a text misunderstanding without figuring out the truth first. People like this are garbage-people murderers, let's not blame tech mistakes for the fact that some people are scum. Everyone involved knew damn well that a couple characters would make the difference between a benign text and an offensive one, and frankly, even if the text was offensive, murder was not justified. Scum.

brudgers 4 hours ago 1 reply      
Text (aka strings) exists in virtually all software projects

For me, it is useful to distinguish between text, something intended to be read by humans, and strings, serial sequences of characters that may or may not be human-readable but will be processed by one or more computing automata. For example, in C, the string "Hello World" is terminated by a null character. The null character is not part of the text the string encodes.

Or to put it another way, I find that treating strings and text as two different layers of abstraction clarifies my intent. Code that manipulates text is built on code that manipulates strings, and in between there's parsing that has to occur.

dvfjsdhgfv 1 hour ago 0 replies      
I wish more software developers kept these things in mind. At one of my customers, I worked on interfacing their online store with several other software components. The store was the only piece able to handle the names of customers (from different parts of the world) correctly; all the rest failed at some point. There are so many additional aspects you discover only when you actually work on these things.
Solving the Rubik's Cube Optimally Is NP-Complete arxiv.org
14 points by seycombi  5 hours ago   3 comments top
js8 1 hour ago 2 replies      
Interesting. If you can solve the 5x5x5 cube, then you can solve them all. I wonder: how good an approximation is using the optimal methods for the 5x5x5 to solve the NxNxN case?
The history of the IBM PC, part one: The deal of the century arstechnica.com
18 points by masswerk  7 hours ago   2 comments top 2
fuzzfactor 6 hours ago 0 replies      
Part 1a.

By this point Microsoft had also supplied the BASIC for Perkin-Elmer's very expensive Model 3600 "intelligent terminal".

These were more than just terminals capable of limited client-side processing for time-sharing systems. As stand-alone desktop units they were powerful enough for scientists to acquire data, store it, process it, and/or control data-intensive instruments, tasks which a year earlier had required a large non-desktop host, often a forklift model.

The 3600 was the first desktop to actually resemble an early IBM PC: a horizontal unit with two prominent 5 1/4" floppy drives, a detachable keyboard (which also introduced the softkey F-row), and a monitor.

Other than that, the box was simply a microprocessor and memory with two RS-232 ports (known as serial COM ports ever since their later appearance on IBM PCs) and an external instrument connection; it was not meant to be expandable or upgradable internally.

The ROM BIOS booted (without a floppy) into a novel disk-based operating system known as the Perkin-Elmer Terminal Operating System (PETOS), where a few of the DOS commands were available from ROM, but the remaining majority were expected to be present on a floppy residing in DISK0. DISK1 was usually employed for application program and data storage.

These were proprietary Perkin-Elmer programs for interfacing with their own scientific instruments, but many users wanted to develop their own programs in BASIC, as you could on competing Hewlett-Packard equipment using its built-in HP BASIC.

The team that had designed the 3600 had, of course, been dispersed across numerous more rewarding projects.

Anyway, Perkin-Elmer got Microsoft to provide the BASIC for the 3600, and as we all gained deeper knowledge of PETOS by operating this equipment, it really helped later when the IBM PC was launched, because its DOS had such an uncanny similarity.

yuhong 6 hours ago 0 replies      
"Kildall, meanwhile, often didn't even seem certain he wanted to be running a business in the first place:"

I should mention Gates wrote the Open Letter to Hobbyists. Of course, this was before "open source" or "free software" was even a term. At the time, CP/M had a "BDOS" supplied by DRI, with the OEM having to write the "BIOS". There were firms such as Lifeboat that wrote the BIOS as part of their business.

Thinking about it, I wonder if it would have been feasible for Kildall to work on CP/M full-time at Intel and release the CP/M source into the public domain instead. It would have been nice if CP/M had been the first thing ported to the 8086 back in 1978, for example.

What I Learned from Researching Coding Bootcamps medium.com
72 points by vyrotek  12 hours ago   43 comments top 10
lr4444lr 11 minutes ago 0 replies      
"My kitchen is a disaster. My whole house is just a mess. Anything that is not directly related to [the bootcamp] or to keeping me up and functioning just goes by the wayside. [...] I don't remember the last time I had a shower."

Hmm. Maybe the bootcamps do give you a more realistic preparation than I had previously thought.

soneca 9 hours ago 1 reply      
Good article (I have yet to read the paper). I believe the learning of software development offers a glimpse of the future of education: diverse paths for diverse people. A myriad of options: college, bootcamps, the internet; free, cheap, expensive; lecture videos, tutorials, documentation, blog posts, coding in the browser; teachers, tutors, colleagues, coworkers; templates, boilerplates, open source.

I think all of it is wonderful. Knowledge about software development is much more accessible than, e.g., architecture, and much more acknowledged without a degree.

But with the myriad of paths come more risks of choosing the wrong one, whether from honest mistakes or from being misled by dishonest people. And these mistakes can be very costly.

I have a hunch that sometime in the future almost all knowledge will be learnable like this. It is a good thing to study the advantages and disadvantages of this environment.

pulpwave 5 hours ago 4 replies      
I taught front-end development at a bootcamp until several months ago. What was once a great opportunity to share my 15 years of experience with various startups and Fortune 500 tech companies ended up being a nightmare because of management.

First and foremost, I liked most of my students. They came from all walks of life: background actors, wives of programmers, students, military veterans, teachers, and a hodgepodge of other characters. They were eager to learn a step-by-step treatment of HTML, CSS, and JavaScript fundamentals with a bit of web design.

What struck me as odd was that the president/founder of the bootcamp first hired me to do my job and to review the videos of their previous instructor. So I did, and found out he was too aggressive in his approach to teaching: he talked too fast and went off on awful tangents that weren't on the itinerary. I told the founder, my boss at the time, about the changes I could make to be less aggressive, with a smoother, more presentable approach to the fundamentals of front-end development, and he agreed.

In my first five days, things went well. Then the boss wanted me to stick with the previous itinerary while force-feeding JavaScript trends in one week. It wasn't enough to teach them basic syntax, array methods, loops, and other things I learned in a year of CS studies (I finished my undergrad in economics); he wanted me to feed them AJAX, jQuery, Angular, and APIs in one week. I hesitated at first, but he insisted, as his "board of directors" felt this would give my students a reason to keep buying more advanced courses at the school, whose name I don't want to give out.

I found out the real reason for my services was not just to give my students a new career perspective, but also to let the boss read the startup-idea proposals his students had, with him as the bridge to their funding. Except he's the end of the bridge: an angel investor looking to own a huge chunk of their startup before he prepares his elevator speeches to real angel investors. Another thing: he's also targeting veterans so their GI Bill money gets spent at his coding school.

I left after a month. I felt dirty and icky just looking back at it.

paradite 5 hours ago 2 replies      
> I couldnt work because I really needed to do job searching full time.

This is the part that I don't quite understand. To me, the period when you search for jobs is exactly when you should be working as a freelancer or working on side projects.

Working helps you get into the coding zone so that you are more prepared for coding sessions during interviews.

Working adds to your resume and gives you things to talk about when the interviewer asks "what have you been doing recently?"

Working should also be the natural next step after completing a bootcamp, to put all the knowledge into practice.

I can't think of why sending CVs, practicing algorithms, and attending interviews would need a full-time commitment.

udioron 1 hour ago 0 replies      
Interesting quote:

> Also, we noticed that at least four students happened to be married to programmers, and at least seven others had parents, siblings or other important people in their social circle who were programmers.

twoshoes 2 hours ago 1 reply      
Companies here value skills in specific programming languages/tech stacks over skills in core Computer Science. They are hiring bootcamp graduates (who are overall weaker), over stronger applicants, just because the bootcamp graduate will hit the ground running quicker.

It makes learning Computer Science in university less valuable, since university graduates lack knowledge in specific programming languages/tech stacks. It's like saying, "You know all that stuff you learned about memory management, virtual memory, I/O subsystems, CPU schedulers and so on? Yeah that doesn't matter if you don't know Angular X.0"

Does anyone else see a problem with this?

tmacro 3 hours ago 0 replies      
On the flip side of bootcamps are orgs like 42 (42.us.org), which, aside from being a nonprofit, is also completely free! Much emphasis is placed on code review by your fellow students; your projects must be graded by them. They just opened a new campus in Fremont that even includes a dorm room, free for about 11 months. Disclaimer: I attend 42 at the Fremont campus.
AndyMcConachie 6 hours ago 3 replies      
But do bootcamps train people to program well?

The metrics they measure against are money- and job-based, which gives me the impression that either we don't know how to determine whether people can program, or no one really cares. We need some kind of standardized test or similar. I know of the Advanced Placement CS exam[1], but we need something more practical and not focused on a particular language.

[1] https://apcentral.collegeboard.org/courses/ap-computer-scien...

mettamage 6 hours ago 0 replies      
For people who are interested in a story (technically anecdata, I know, but I can't resist chipping in), here's mine.

I have taught a coding bootcamp twice now. The selection/recruitment of students -- not done by me -- was fairly loose in some cases. The only thing the selection was strict on was that people had completed an undergraduate degree (which is financially easier to do in Europe).

__Succesful Students__

What I noticed is that there are people who will be successful. These people somehow have a strong background in logical thinking, whether from doing predicate logic in philosophy courses, being data-oriented from biology studies, or even being hardcore at knitting [1].

Another thing I've noticed is that they learn quickly and don't ask a lot of questions. They will use a search engine and only come to you with difficult questions. At the very least, they know quite well how they learn.

The final thing I've noticed is their determination. Some people come to a coding bootcamp with the expectation of "fix me, I need a job in this industry." The most successful students don't take anything for granted and know that they need to learn everything they can get their hands on.

__Unsuccessful Students__

What I noticed with students who aren't successful is that they don't think logically -- their attention/focus does not allow it, unfortunately. They may or may not ask a lot of questions, and they will completely fail the bootcamp if they don't ask fellow students for help. Also interesting is that they did not learn the terminology or the basics well enough.

In one of my groups, one of my best students was coaching one of the least successful students. It was intensive, and the explanations and questions were sharp. Yet, to my surprise, it didn't do much: at the end of the bootcamp the student was still one of the least successful. It makes me believe that not everyone can do a coding bootcamp, since the right mindset is required to start one. Teachers can only help you when you're open enough to receive the knowledge. I don't know to what extent that idea is true, but I want to find out.

__Unsuccessful Students Becoming Successful Students__

I have seen not-so-successful students become quite successful. They had an insane amount of grit and understood how they needed to learn the material. Compared to their counterparts who remained unsuccessful, these students were more structured and disciplined.

__When To Take A Bootcamp__

Even for not-so-successful students it can be a great tool, provided they self-teach for 1 to 2 months afterward. Unfortunately, though, it is not for everyone. If you don't think in a logical fashion or learn fast, then at the very least you need to make that up in determination -- or in time after the bootcamp. I feel a bootcamp is as tough as nails for a student who only learned bullet points from presentation slides during their academic years. Programming is more akin to learning a musical instrument (i.e., it takes practice), and not everyone has experience learning that way.

[1] Following knitting instructions is very algorithmic -- it's a whole new world if you've never seen it. Here is an example of a pattern: http://vintagecraftsandmore.com/wp-content/uploads/2012/07/V...

partycoder 9 hours ago 5 replies      
I worked with a large group of local graduates from a well-known coding camp. The results were horrible.

They would submit a lot of pull requests, but upon review it became very apparent that documentation was not being consulted, resulting in unnecessarily hacky solutions. The reason: programming by trial and error.

In the face of that, the only thing I could do was give them the benefit of the doubt by asking them to walk me through their problem-solving approach. This insulted them, because there was no problem-solving approach, just brute-forcing code with live reload until the feature worked.

Unfortunately, sometimes the feature did not actually work, or would not handle edge or error conditions, which made the program unstable, sometimes to the point of causing a live incident.

Since coding camp graduates graduate very frequently, and because of referral bonuses, they were a majority on our team. They used their majority to deny code reviews (not allowing people to mark their tasks as finished), and took turns pulling off microaggressions in a round-robin manner so that nobody was accountable enough to be retaliated against.

In the end, these people know they will not prevail through technical excellence, but rather by pumping out as much code as possible and by playing dirty: refer a lot of friends, become a majority, avoid situations where a relative rank can be established, and bully any opposition until they quit.

We quit our jobs, remortgaged our houses - how passion made Cuphead a reality gamesradar.com
49 points by danso  10 hours ago   40 comments top 12
jondubois 4 minutes ago 0 replies      
I'm tired of all these stories about reckless people remortgaging their houses to work on a startup. It's as though being reckless is a virtue. It's encouraging others to do the same and fall flat on their faces.
keyle 6 hours ago 4 replies      
They will do alright because of the art and the amount of press already generating sales ahead of time.

That's what you need as an indie developer: a lot of publicity and great art. It doesn't matter that you essentially reproduce the exact same recipe as any other Nintendo platformer; as long as you have great art and music, people will love your content.

Similar to hollywood movies being a rehash of each other with different actors and a different setting. It's all about the human condition.

So yeah, in the rare case of Cuphead, they will do well. Maybe enough to make the next one, if they didn't over-extend themselves.

For most small indies though, I can tell you first-hand, it's tough. Especially since the whole Hello Games / No Man's Sky debacle, every 'gamer' out there is out to hate indies by default. To them, we're scammers in sheep's clothing.

juice_bus 8 hours ago 3 replies      
I hope it pays off for them, but I can't help thinking of how many other developers/studios have done the same and failed anyway.
pcunite 7 hours ago 0 replies      
One of the most difficult things to say when explaining success is, "I just happened to be in the right place at the right time." So, lacking that, reasons must be provided to justify all the work and risk that was taken.
tunetine 6 hours ago 1 reply      
I'm curious if anyone can provide insight into what the profits of a moderately successful game would be. Let's say they've spent $600-700k over the past three years. Is there any possibility of them saving their homes without the game becoming a massive success?
rl3 6 hours ago 1 reply      
The art looks even better in motion, as the trailer shows.[0]

The resemblance to the cartoons of the era is uncanny. I really hope Disney doesn't sue them or something; hopefully they're on rock-solid legal footing. In a just world all those copyrights would be ancient and long expired, but hey, Mickey Mouse is still Disney's IP until 2023, and they'll probably figure out something to keep it going past that.

[0] https://www.youtube.com/watch?v=e5iGwE0XJ1s

fivedogit 5 hours ago 0 replies      
"I maxed out my credit cards, put it all on the hard 8, and won!"

A positive outcome does not equal a wise choice.

booleanbetrayal 6 hours ago 1 reply      
Been waiting for this game for over a year ... Think it'll make a big splash. Best of luck to the team. They're trying something new and injecting some actual art into the art form.
nopit 7 hours ago 1 reply      
Never heard of it
draw_down 7 hours ago 0 replies      
I look forward to seeing it! I still remember seeing the trailer and how impressed I was, and I was recently wondering what happened with it. If it lives up to its promise it will really be something.
dmead 6 hours ago 1 reply      
gambling addicts have similar stories
You don't need a grid framework on top of the CSS grid layout rachelandrew.co.uk
115 points by kawera  10 hours ago   37 comments top 8
Brajeshwar 6 hours ago 1 reply      
Wow, it's been a while. Back in the day[1], it was a feat to do even a simple faux three-column layout[2]. I made lots of friends and gained a following on forums and mailing lists, helping and getting help with the quirks and CSS hacks. Those in the know about such CSS hacks and their solutions got their 'guru' moments and treatment.

The ability to fully understand the CSS box model and enlighten someone about it was the pinnacle of your CSS ninja tactics.

There were the purists who would not write anything but "Strict Mode" markup, and debates on why "transitional" should be just that - transitional. There were moments of triumph when Tantek[3] himself replied to your email, discussing his tan-hacks.

Then there was my CSS mentor/superhero, Philippe Wittenbergh[4]. I would email him right away to have my work tested on the Mac browsers, especially that nasty Internet Explorer for Mac. This was before I was introduced to the Mac in 2006.

Soon enough, writing up a CSS grid system for a project became second nature and felt like ringing the bell. Hell, I even had a dead-simple CSS grid of my own that got some traction; I later added it to GitHub, and people liked it and used it for a while.

It is such a good feeling that "Grid" is no longer just CSS terminology for a pattern of classes used to build a layout, but now a keyword in CSS that the browser understands and acts on. We have come a long way.

1. I think, the early 21st century - 2001/2003-ish.

2. https://alistapart.com/article/fauxcolumns

3. https://en.wikipedia.org/wiki/Tantek_%C3%87elik

4. https://l-c-n.com/

smexy 5 hours ago 1 reply      
A fun way to familiarize yourself with grid is CSS Grid Garden:


ourmandave 7 hours ago 5 replies      
Create solid markup, uncluttered by additional elements that the past tells you that you need. Design your site using what Grid and other new methods have to offer. Then look at how you deal with the browsers without support, by serving them something slightly simpler.

Perhaps a fallback like, oh I dunno, <TABLES>?!
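For reference, a minimal sketch of the fallback approach the article recommends, without resorting to tables (the `.page` class and the column sizes here are invented for illustration). Browsers that don't understand Grid also don't understand `@supports (display: grid)`, so they simply keep the plain single-column flow:

```css
/* Baseline every browser can render: a plain, centered single column. */
.page {
  max-width: 60em;
  margin: 0 auto;
}

/* Browsers that recognize this condition also implement Grid itself. */
@supports (display: grid) {
  .page {
    display: grid;
    grid-template-columns: 1fr 3fr; /* hypothetical sidebar + content split */
    grid-gap: 1em;
  }
}
```

No extra wrapper markup is needed for the fallback; the simpler layout is just what the unenhanced markup renders as.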

TheAceOfHearts 7 hours ago 2 replies      
If you've been following CSS grid at all, or if you look up anything related to it, you'll probably encounter something written by Rachel Andrew. She's been doing an incredible job at documenting and promoting CSS grid.

In the past she wrote a CSS grid polyfill [0], but sadly it never got updated to the latest version of the spec. It'd be an easier sell if you could polyfill it reasonably on some older browsers, even if performance suffered a bit.

Right now the last holdout is Edge [1], which is funny because they were the first to ship the older spec, going back as far as IE10 or IE11. Luckily, the next version will be shipping the latest spec.

I've noticed autoprefixer [2] supports a grid option. I'd love a guide that explained any incompatibilities between the two versions. Getting a mostly working grid on IE11 and Edge would probably make it an easier sell for many.

When flexbox started growing popular there were guides showing you could support most features on IE10 and beyond with little effort if you used autoprefixer. Since there are equivalent versions of most attributes on older browsers, you just had to be aware of the gotchas. For example, the default values for flex-grow, flex-shrink, and flex-basis (and their older equivalents) changed between spec versions, so you always had to set them explicitly. On top of the incompatibilities between versions, all implementations were buggy, so you had to know the bugs and their workarounds. Luckily, they're all well documented in the flexbugs [3] repo.

Considering grid is more complicated than flexbox, I wouldn't be surprised to find it has a few bugs. Does anyone know if there's a gridbugs repo that developers can track?

EDIT: After a quick search, I found a blog post [4] talking about supporting the older version.

[0] https://github.com/FremyCompany/css-grid-polyfill/

[1] http://caniuse.com/#feat=css-grid

[2] https://github.com/postcss/autoprefixer

[3] https://github.com/philipwalton/flexbugs

[4] https://rachelandrew.co.uk/archives/2016/11/26/should-i-try-...

andrewfromx 7 hours ago 2 replies      

wow, you can say display: grid and display: flex in CSS? and then flex: 1 1 200px;, grid-row-end: span 2;, grid-column: 1 / -1; Which browsers support these? Like, is it 90% or 50%?
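To make the question concrete, here is a hedged sketch of how those exact declarations are typically used (the selectors are invented for illustration):

```css
/* Grid lays out a container in two dimensions. */
.gallery {
  display: grid;
  grid-template-columns: repeat(4, 1fr); /* four equal-width columns */
}
.gallery .tall {
  grid-row-end: span 2;   /* this item occupies two rows */
}
.gallery .banner {
  grid-column: 1 / -1;    /* from the first grid line to the last: spans all columns */
}

/* Flexbox lays out a container along one axis. */
.toolbar {
  display: flex;
}
.toolbar .search {
  flex: 1 1 200px;        /* grow: 1, shrink: 1, starting basis: 200px */
}
```

As for support, per the comments above: at the time of writing, Grid shipped in current Chrome, Firefox, and Safari but not yet in release Edge, while prefixed Flexbox reached back to IE10; caniuse.com has the exact usage-share numbers.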

tannhaeuser 4 hours ago 1 reply      
I'm wondering how the Bootstrap CSS guys feel about CSS Grid, having just spent enormous effort porting the Bootstrap 3 grid to CSS Flexbox. I guess right after the eventual release of Bootstrap 4 there's going to be a Bootstrap 5 based on CSS Grid, making "container", "row" and other divitis optional? But that's sure going to be another sh*tload of work with all the QA and corner cases.
usaphp 7 hours ago 1 reply      
I could not understand how you can set a custom height for one of your grid items, and then align it top/bottom/center, for example.
slaymaker1907 7 hours ago 3 replies      
The title is kind of misleading. It seems to be arguing that the new CSS grid layout is extremely powerful, but the title implies otherwise.
As the U.S. fantasizes, the world builds high speed rail thetransportpolitic.com
364 points by jseliger  14 hours ago   440 comments top 41
serhei 14 hours ago 7 replies      
For those who think it's because of geography / hyperloop is better technology anyways / any other red herring besides Brezhnevian political stagnation:

It's not completely implausible that, 30 years from now, most of Europe and Asia are connected by hyperloops while the US has built nothing and Internet commentators are arguing that hyperloop is old news compared to yet-unproven teleportation technology, and anyways the population density of the US doesn't support hyperloops.

gokhan 13 hours ago 1 reply      
I'm on vacation in Italy and just used one this morning, from Florence to Bologna with Trenitalia. A lot of positives: the ride was 35 minutes, there's no security theater, you can arrive at the station 10 minutes before departure and hop on within a couple of minutes, and it's comfortable, roomy, and runs from city center to almost city center. The train continued all the way to Turin, stopping at many cities including Milan, in 2-3 hours. The cost was 16ish euros, I guess (deduced from the total payment for four people).

Doing the same through air travel would add at least 2-3 hours to the whole trip. I don't know about the cost comparison, but the user satisfaction is there.

twblalock 12 hours ago 5 replies      
The problems affecting high-speed rail in the US are the same problems that prevent low-speed rail, streetcars, subways, and buses from being more common -- people don't see themselves using such things and so they don't want to pay for them.

In the Bay Area, the Caltrain commuter train runs from San Jose to San Francisco, through the downtown areas of most major cities in between. It is currently so popular that it is significantly over capacity every day. Yet it is still a constant political battle to get funding to improve the system, even though the Bay Area is one of the most educated and politically liberal parts of the country, where support for public transit is higher than many other places.

Sometimes I think we would have much more transit funding overall if we set aside part of the transit budget to send Americans to other countries on vacation, so they will return knowing how good public transit can actually be.

tptacek 13 hours ago 4 replies      
This topic comes up routinely on Hacker News, and it's no surprise why: there are a lot of Europeans interacting with a lot of Americans here, and European high speed rail is an enviable asset for the continent.

But last time we talked about this, it seemed to me that if you looked into the details, it was clear why we don't have HSR in the US. Even assuming we built a network that operated at Shanghai Maglev speeds, at the distances the network would need to operate, air travel would remain significantly more economical.

In a thread 3 years ago, I made a list of the top US cities by GDP, and then broke out the crow-flies ground distances between them:


Of the 55 edges on this graph, only 6 were 700 miles apart or less. Several of those are already served by the Acela.

There's a definite advantage to rail over air, in that rail can deposit you right in the middle of the city you're heading to. But that advantage can't make up for the fact that no train is going to compete with a plane for trips between the largest US cities.

dghughes 13 hours ago 1 reply      
I envy mainland Europe with its rail system. I wish my region of Canada would build a rail system. What we have is old and mainly was for transporting ore, steel, grain. Even just regular rail not even high speed any speed is preferable in a snowstorm.

My region is small and would be perfect for a light rail system mainly because it's got few people scattered over a wide area with no direct route.

Bombardier even makes rail cars for many countries so it's a home-grown resource we could use.

I think south eastern Canada and north eastern US could have a great interconnected rail system. I'm only 800 km (~500 miles) from Boston but I may as well be on the dark side of the moon.

Like New York City before its subway system: people were crowded into the city, but when rail was expanded people could live in the suburbs and work in the city. I think a US and Canadian rail system would open up travel and trade on the eastern coast of each country. Day trips to cities you'd never even think of visiting now, or aren't even capable of reaching in a day.

chroem- 14 hours ago 17 replies      
Unpopular opinion, but I really don't see why we should want high speed rail in the first place. It's slower than flying on a commuter airline, but the tickets cost nearly as much, and it's also enormously expensive to build. Then there's also the issue of throughput and last mile logistics. You're limited to putting people in a few train cars, as opposed to a continuous stream of people on a freeway. Being a high speed train, stops are necessarily few and far between, so once you arrive at the station you still have to figure out how to travel tens of miles to where you really want to go.

My perception is that it's a huge money pit for something that's quite frankly inferior to our current infrastructure. We would be much better off improving our current interstate system.

mc32 14 hours ago 8 replies      
Rail is not cheap. It's feasible when we have population density. It would make sense along the DC-Boston megalopolis and perhaps SF-San Diego, maybe some stretch of Texas. It makes little sense in the rest of the country.

That said, where it would make sense, like DC-Boston, we definitely should build it out. Build up the cities as the countryside is absorbed (as seen in Japan, and elsewhere in Asia) and let it become viable. Its deployment would definitely affect how cities and other communities grow and also depopulate, so we'd need to anticipate that and prepare for it.

Three things China has going for it vis-a-vis the US:

-Pop density

-Command economy (gov't can just move things through with little debate, displace 1MM people, if necessary.)

-Costs (in labor, materials, regulation, etc.)

notadoc 14 hours ago 2 replies      
I wonder if the USA will ever build and modernize its infrastructure? We're still coasting on what was built 50, 60, 70+ years ago.

Then you travel to the rest of the developed world, and wow, what a difference in infrastructure.

a_imho 35 minutes ago 0 replies      
From the little data I gathered, trains/mass transit seem to be much more efficient than cars regarding greenhouse emissions. YMMV depending on your stance on climate change, but I find it sad to see people dismissing rail so easily in this thread.
d--b 1 hour ago 0 replies      
Something that is never debated when we talk about high-speed trains in the US: would they bring development the same way the iron horse brought development in the 19th century?

I mean, yes, the US is sparsely populated (in the middle), but isn't that also because it doesn't have a fast and easy transport system?

Wouldn't a high speed line between San Fran and Portland develop the very rural regions of Northern California?

High-speed rail also means high-speed cargo transport; wouldn't that drive some economic development?

These are not rhetorical questions; I seriously have no idea of the answers, but it would be nice to see what experts think about this.

armenarmen 14 hours ago 1 reply      
Well, the fact that the government subsidized the automotive and oil industries with the Federal-Aid Highway Act of '56 led to our deprioritization of rail. Had this not been the case, chances are we'd have European-equivalent rail, and the small local trolley systems that dotted America's small and medium-sized cities would never have been torn up.
sdiq 14 hours ago 3 replies      
Every time I hear or read something about the US I realize the country is in many ways far behind Europe. When it comes to healthcare (in terms of accessibility), education (in terms of cost), infrastructure, etc, America seems to be doing much worse than these countries. Yet, ironically, America still leads these countries (and the rest of the world) in most other spheres.
jamespitts 12 hours ago 1 reply      
We have a serious problem with retrogradism in this country.

A large number of people are suffering from changes outside of their control, and they are disconnected from those at the forefront of social and technological progress. Many of these people have lost trust in the system, and even in progress itself (outside of progressions that are accessible and affordable such as games or phones). As a result, there is little enthusiasm for investing in major improvements to systems or building any major infrastructure enabling progress.

"High speed rail? What is in it for me? I work part time and can't afford these medications. I want the life I used to have back."

Perhaps a good place to start is understanding the experience of people who are voting for an imagined retrograde society. This can be difficult for those of us who have had the privilege of a better education, or better opportunities in the cities, or even all of our needs met as we build what we build. The privileged must try, and must succeed in understanding what is happening here. This is because the votes of those within what is essentially a ghetto lead to major consequences, including underfunding high speed rail. The result isn't just ridiculously under-qualified and intellectually isolated politicians that are easy to make fun of.

The underprivileged will keep voting in this way until their concerns are answered (or not).

We at the technological forefront know more about what needs to be done in terms of advancing progress, possibly even to the point of solving half of all social problems. However, we must also pay heed to the immediate, harsh reality of the people left behind. Our environment -- natural, political, or infrastructural -- depends on this.

If the ethical demand to listen and react appropriately to the suffering of others does not convince us to strongly act, watching the destructive results of their votes should.

closeparen 14 hours ago 3 replies      
The US has a strong inter-city travel network in the airlines.

The TSA severely limits its effectiveness, so it could be tempting to build a rail network just to bypass the TSA, but there's no reason to think the same screening procedures won't apply to HSR after the first incident (or just threat).

thisrod 12 hours ago 1 reply      
There is another question here. How the hell did the French build 300km of high speed track, going through central Paris, for only 10 billion dollars? If Australia could do that, the cost benefit analysis on Melbourne, Sydney and Brisbane would look very different.

Melbourne to Sydney is worth doing now, though it's a close thing. But the benefits come as time savings for rich businessmen, and Australia told them that if they really wanted it they could pay for it themselves.

faragon 10 hours ago 0 replies      
Maybe the US is doing the wiser thing. I live in Spain, where high-speed rail was pushed in the era of the housing bubble, and in my opinion it is not that big a deal, except for connecting the two biggest cities in Spain (Madrid and Barcelona). Lower-capacity routes run at a deficit, and I'm very skeptical about their long-term viability.
em3rgent0rdr 1 hour ago 0 replies      
High speed rail doesn't actually provide the eco-benefits over planes that proponents think it will. And high speed rail is only competitive against planes and cars at distances less than 500mi. Unfortunately the US is too sparsely populated and the big cities outside of the coastal corridors are too spread out for high speed rail to be economically and ecologically sensible. http://www.newsweek.com/why-high-speed-trains-dont-make-sens...
mieses 3 hours ago 0 replies      
Rail is a bad idea wrapped in shiny engineering. Read Randal O'Toole http://ti.org/antiplanner/.
rmoriz 13 hours ago 0 replies      
You can't even compare the population density of France with Germany, so why keep applying the "high speed rail" idea to the US? Imagine a high-speed rail system between large cities where people still have to own and use a car and drive 200 miles to/from the nearest station. Also, either the train stops at every small town or it's an express train that leaves the rural areas behind.

IMHO a high speed railway network is not the start but an evolution of an existing regional rail system/public transportation system that acts as a feeder and communter infrastructure.

The US lacks those public transportation systems even in mid-size towns. That's a bigger problem IMHO.

hassancf 13 hours ago 0 replies      
Even third world countries such as Morocco are building rail tracks for bullet trains...
bsaul 13 hours ago 1 reply      
A big difference between Europe and the US is also that people in Europe tend to live inside the cities, not just go there to work. The train is considered faster than the plane here in France because you can get to the train station by subway and board immediately, whereas you need to leave your place two hours before your flight.

I have the feeling that this advantage would be lost in the US, where far more people live in suburbs, so any trip starts with at least a 45-minute drive (not to mention that parking at an airport is probably more convenient than in a city center).

ptr_void 12 hours ago 0 replies      
'Why Trains Suck in America' : https://youtu.be/mbEfzuCLoAQ
ravenstine 11 hours ago 0 replies      
When other countries play a bigger role in securing the global economy with military might, maybe we can start building a high speed rail infrastructure. Otherwise, I don't see an actual need for it. It would be an improvement, for sure, but all I see is people looking at much smaller countries and assuming that America is stupid for not doing everything they do.
ams6110 14 hours ago 2 replies      
In the U.S., President Obama's initiative was met by Republican governors elected in 2010 who, for reasons that had little to do with sanity, resisted free federal money to fund the completion of intercity rail projects their (Democratic) predecessors had developed

It's not insane. Federal money is never "free": it's taken from the people and always comes with strings attached.

mnm1 12 hours ago 1 reply      
I think it's too late for rail. Yes, we lost a generation of development in rail. We also lost a generation of development for pretty much every other transportation industry, and thus our whole infrastructure. The article briefly touches on it. Transportation in general hasn't been a priority for at least thirty years. I'm not worried about rail. Rail is dead in the US and has been for a long time. I'm worried about our highways. That's our infrastructure core, without which the US cannot survive. Not building rail projects in the US is pretty normal and on par for the downward trajectory we're on.

Not building and maintaining highways and bridges shouldn't even be an option. While some upper-class, rich people can afford to live in our cities, they are a huge minority and most people rely on cars and highways. Outside of a couple of cities, good city public transportation simply doesn't exist in the US and won't exist anytime soon.

I think we need to be realistic as to what is possible in the US. High speed inter-city rail isn't possible. And even if it is, can it compete with the price of plane tickets? Doubtful. Giving our cities good public transportation isn't possible. It may have been possible in the past, but not the last few decades. Having room inside a city for all who want to live there most certainly isn't possible. Building roads and bridges has now become almost impossible in many places. I have to wonder what is the plan for the US transportation infrastructure. As far as I can see, the plan is to let it deteriorate until it doesn't exist anymore. At least in that sense, it's consistent with education, social programs, and the rest of our crumbling society.

ortusdux 13 hours ago 0 replies      
I still resent Rick Scott's decision to reject federal funds for Florida's high speed rail. It would have been the first high speed rail in the country.


mickronome 9 hours ago 0 replies      
Several comments in this thread almost appear to be constructed to prove the author right about how the debate derails. It's not much of my concern, but still I couldn't help noticing something that felt like an unusual occurrence here on HN, or maybe I'm simply seeing ghosts. I am rather tired, to be honest!

Anyway, my flawed observation:

Some sort of deadlock where, instead of discussing how to improve the situation, the discourse gets stuck debating which is the correct reason for not doing anything, rather than trying to come up with improvements?

Several times arguments are made that a non existent technology will make current investment pointless in the future, so no investment should be made now. Isn't that the argument implied by the title of the article?

Obviously, it could be true that future inventions would make it pointless, but that certainly is not something you can calculate or know off the cuff, if it's even worth speculating about. Building a high-speed rail network takes long enough that all those possible avenues can probably be explored in excruciating detail before the first shovel hits the dirt, a decade from now if everything moves quickly.

People are sceptical of hyperloop, which is understandable in many ways. But what if it worked? Wouldn't it be worth investing quite a lot of money simply to figure out whether it could?

Obviously it could potentially only solve a very specific part of the transportation puzzle, but one that could have quite some positive effects.

Positioning cars and aircraft as more or less the only viable means of transportation for the foreseeable future sounds like an awfully odd position to me, even for a very sparsely populated country. While the correct solution might not be high-speed rail, some variation of it could still be the best solution in several instances.

Maybe someone would come up with something like a tethered electric ground-effect aircraft/train that could take advantage of the sparse population if they knew there was money to be made, instead of facing massive resistance and cartloads of red tape?

pmurT 11 hours ago 0 replies      
Even if we had high-speed rail the gov would regulate it to death like everything else - they'll make it just as painful as flying. Imagine the TSA salivating for the mission creep.
cartercole 13 hours ago 0 replies      
So because every other country is subsidizing the shit out of this stuff, should we too? Economics drives our country, not the pipe dreams of people who want the taxpayers to foot the bill for their new hotness.
bpodgursky 14 hours ago 4 replies      
IMO electric and self-driving car technology is advancing rapidly enough that investment in high-speed rail is going to end up like landline telephones -- it had a time when it was useful, but countries that missed the boat will end up doing A-OK without them.

Rail is convenient, but it will never ever be as convenient as having a car take you where you want to be, carry your kids, and carry your stuff around. As soon as self-driving technology eliminates the hassles of parking and clean solar-electric tech eliminates the environmental concerns, ridership is going to tank on all the fixed train lines. It might be 10-15 years out, but I would be shocked if any of the investments made today in rail ever pay off.

giardini 14 hours ago 2 replies      
What's the justification for spending on high speed rail vs roads vs air travel vs doing nothing (Google Car is coming, remember)?
ableton 2 hours ago 0 replies      
Interestingly, a private company is trying to build a high-speed train from Dallas to Houston, TX. The great thing is that it would be privately owned, so if it's a flop, taxpayers aren't on the hook.
ensiferum 14 hours ago 8 replies      
For Americans a train is socialism. They need their V8s for crawling at walking speed on the 4-lane highway, burning a ton of fossil fuels. Actually, the bigger the truck the better, since it means freedom (or something). ;-)
exabrial 14 hours ago 0 replies      
This sounds like a title written by someone who's never visited anywhere but New York or LA. The USA is very large, and we don't have high population density except on the Eastern seaboard (where rail seems to work pretty well).

Roads are a much better, cheaper, faster, flexible option. We just need a 10x revolution in: storage density, fast charging, or efficiency.

flimflamvir 8 hours ago 0 replies      
The first one went OK, the second collapsed the Japanese rail industry. All require huge subsidies.

America is smart!

anjbe 7 hours ago 0 replies      
I'm going to share my experience with commuter (not high-speed) rail: the New Mexico Rail Runner. https://commons.wikimedia.org/wiki/File:Trainroadrunner.jpg

The Rail Runner was built in 2006 primarily due to Governor Bill Richardson's efforts. It essentially covers two cities, Albuquerque (~500,000 people) and Santa Fe (the capital, ~70,000 people), which are already connected by Interstate 25.

I love trains; I recently took Amtrak to LA and back. And I love the Rail Runner. It's my favorite way to get to Santa Fe by far. Once I arrive, being without a car is not too bad: Santa Fe is a fairly walkable city, Albuquerque has a decent bus system, and a bicycle (which I can take on the train) makes things a lot easier.

The big problem with the Rail Runner is its cost.

Richardson originally was very vague about the cost, and initial estimates were (it turns out, wildly optimistic) sub-$100 million in initial capital. The state took out a loan to pay for construction. The total cost is now estimated at about $800 million. Currently the state Department of Transportation pays about $25 million a year on the loan; as currently structured, that will slowly increase to $35 million per year until 2025 and 2026, when the payments jump to $110 million (per year!).

New Mexico is currently in a budget crisis (not just due to the Rail Runner). (http://fortune.com/2016/12/04/new-medico-budget-crisis/) There have been special legislative sessions called this year to sort things out, and there's conflict between all three branches of our state government. I have no idea where the DOT will find $80 million in its budget over the next ten years, at least not without serious cuts to our already underfunded highways.

Then there are the operating costs. This is not so bad. Revenues only cover about 10% of operating expenses. But at least the rest is covered (at the moment) by county taxes, federal grants, and payments for use of the track by Amtrak and BNSF.

I'll be cynical: my personal belief is that Richardson intentionally hid the costs and pushed the Rail Runner as a short-term publicity stunt for his 2008 Presidential run, without a care as to what it would do to the state ten years later. It is very like him. (Don't even get me started on the spaceport.) https://www.abqjournal.com/news/state/602848nm10-16-07.htm

The legislature sponsored a study to determine the feasibility of selling the Rail Runner (https://lintvkrqe.files.wordpress.com/2015/11/final-hm-127-s...). It concluded that nobody would be willing to buy it due to low revenues, high operating costs, and the plethora of exclusivity agreements that would need to be renegotiated (with BNSF, Amtrak, the pueblos, and the federal Department of Transportation). And selling it wouldn't help, since it wouldn't absolve us of the requirement to pay off the debt. At this point I don't foresee a solution other than refinancing the loan (again) to avoid those $100 million cliff payments, at the cost of further interest payments.

Like I said, I love the Rail Runner, and I really want to see it (or passenger rail of some sort) succeed in New Mexico. I do think the way the Rail Runner was handled -- intentionally hiding the costs and having no concrete plan to cover operating costs -- is completely unconscionable.

Not that it has to operate at a profit; after all, highways lose money too. But the Rail Runner loses so much money, and we're already a poor state. It is valuable to connect New Mexico's capital with its largest city. I just feel like there has to have been a better way to do it. I hope the proposed train from Las Cruces, NM to El Paso, TX (http://www.lcsun-news.com/story/news/local/2017/06/28/study-...) will learn from the mistakes made with the Rail Runner.

Whew. After all that, I'm curious: what successful rail projects have you seen, and what makes them successful?

taw55 14 hours ago 0 replies      
What about the logistics industry?
scythe 9 hours ago 0 replies      
http://en.wikipedia.org/wiki/Brightline appears to be for real in Florida and the Acela now carries a majority of traffic on some parts of the Boston-NY corridor and service to Washington. Texas has much cheaper gas and stronger car culture than anywhere in Europe or Canada. It's really just California that's lagging behind expectations; the other two projects could be better. And Florida / Texas / California / Northeast wraps up all of the locations in the US that are viable for high-speed intercity passenger rail. The only truly underserved corridor in North America is Toronto - Detroit/Windsor - Chicago, but most people don't even recognize it as a possibility because it crosses a border.

So instead of "what's wrong with the US" we should ask "what went wrong with CA HSR?".

graycat 13 hours ago 0 replies      
How much money does Amtrak in the US Northeast Corridor make each year, in ballpark, round numbers? $100 million? $1 million? $1?
graycat 12 hours ago 0 replies      
A guess: Now that Trump is talking about "infrastructure", the passenger train people are coming out of the woodwork again looking for big subsidies from the US Federal Government.

Some years ago, for a while I was a prof in Ohio. Well, there was a group all hot on connecting all the Rust Belt cities -- Chicago, Detroit, Cleveland, Columbus, Dayton, Cincinnati, Muncie, Akron, Indianapolis, South Bend, Youngstown, Toledo, etc. with passenger trains. They were really hot.

Look, guys, the US had a very good passenger rail network. You could go from one tiny crossroads to any other, all by train. And people did that. But soon that whole thing was killed off by, and may I have the envelope please? Right, the Model T, etc. Private cars. A lot of the tracks grew up with weeds.

After WWII, soon, for trips up to 1000 miles, say with the whole family, people would rather just take the family car. Just after WWII, the passenger trains were still running, but, no thanks, people would rather take the family car, e.g., from Florida all the way to Grandma's near Buffalo, NY. As soon as I got married, my wife and I went to her family farm for Christmas, 900 miles, by car, car packed with stuff. Plane? Train? Bus? No thanks.

Gee, guys, now with the TSA, no way would I want to take a carful of luggage, toys, Christmas presents, etc. past the TSA. No way.

For me, for anything like family travel, public mass transportation, no matter how fast, how roomy, how cheap, how safe, due to the TSA and all the luggage handling problems, lack of privacy, being legally under the thumb of a lot of people, rules, bureaucrats, various cases of police, being subject to being forced to wait in my seat for four hours while whatever is going on, etc., the answer is no, no way, never, don't bother to ask again.

There are a lot of people and projects there in the woodwork eager to come out with lots of publicity, reasons, and excuses and eager to scarf up Federal subsidies. A LOT of people/projects. Clearly there is a whole industry of this stuff. They are always back in the woodwork, and as soon as they smell money, and they are good at smelling money, out they come, big publicity drives, etc.

mattfrommars 14 hours ago 11 replies      
People keep on forgetting the immense size of the continental USA.


Thoughts on Improving Academic Journals douglaslcampbell.blogspot.com
4 points by frgtpsswrdlame  3 hours ago   1 comment top
dbcooper 16 minutes ago 0 replies      
Three things I want in terms of format/content:

(i) A single download that includes the article and supplementary information. Some journals are doing this now.

(ii) Articles in epub format - document reflow on my tablet!

(iii) A single download of an entire issue. Particularly desired for Nature Biotechnology.

Some kind of advanced article annotation system with an easy way to add citations + links to other sources would be great.

Dov Charney's American Dream: The rise, fall and comeback of an apparel empire retaildive.com
48 points by petercooper  12 hours ago   8 comments top 4
ChicagoBoy11 11 hours ago 1 reply      
For those wanting a really intimate and interesting look at Dov's new venture, Gimlet's "Start-up" did a great job profiling him over a few episodes. Definitely worth a listen.


eli 10 hours ago 2 replies      
Always neat to see one of our articles on HN! Especially one where the team worked really hard to make it happen. Dov doesn't give many interviews and the page layout & design is all new.
HillaryBriss 5 hours ago 1 reply      
word from a designer i know: many garments produced by American Apparel at the end of Mr. Charney's reign were overpriced; poorly sewn (often with un-hemmed un-finished edges); sometimes little more than a piece of stretch fabric fashioned into a tube or a large rectangle; inconsistently sized and fit to the customer.

also: the advertisements and commercials featured hyper-sexualized, semi-nude models who were often regular employees.

it seemed to be an innovative way to run such a business but doing that sort of thing in LA is quite possibly the wrong model nowadays.

a year or two back, the LA Times ran a story about some LA clothing manufacturers leaving the city and relocating to El Paso, TX: http://www.latimes.com/business/la-me-korean-jobber-market-2...

A Swedish rail line now scans microchip implants in addition to tickets businessinsider.com
7 points by Erwin  3 hours ago   2 comments top 2
petters 5 minutes ago 0 replies      
> For those brave enough to get an implant inserted into their hand, however, the time they save not standing in ticketing lines may be worth it.

There aren't really any ticketing line problems in Sweden. Anyone can buy their tickets online and have them delivered to their phones.

throwanem 8 minutes ago 0 replies      
Sounds creepy at first blush, but actually pretty neat. They aren't offering implants themselves, but just adding support for the implants that a couple thousand people in Sweden already have.
$185M in 5 days: Block.one sets new ICO record with its EOS token venturebeat.com
10 points by redditmigrant  1 hour ago   7 comments top 2
danr4 45 minutes ago 2 replies      
I was young during the dot com boom, but I would like to know from people who weren't: Does the ICO boom resemble the dot com boom? Because this screams bubble like nothing I've seen before.
Kiro 9 minutes ago 0 replies      
Why are these numbers so high? Why don't they set the supply to something more realistic, like $10M? How do they justify it?
A large satellite appears to be falling apart in geostationary orbit arstechnica.com
52 points by robin_reala  6 hours ago   22 comments top 5
Animats 3 hours ago 1 reply      
Cleaning up geosync orbit is going to be tough.

There's a proposal to remove small objects from low earth orbit by shining a 10 kW or so earth-based laser at them for a few hours. This exerts enough light pressure to drop their orbit deeper into the atmosphere, where they slow further, re-enter, and burn up. That's probably the most cost-effective approach proposed so far. Now that solid-state lasers in the 50 kW range are available, this looks like a viable option.
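The photon-pressure part is easy to sanity-check. A back-of-envelope sketch (all figures illustrative: a 10 kW beam fully absorbed by a 1 kg object for three hours):

```python
# Back-of-envelope: photon-pressure impulse from a ground-based laser
# on a small debris object. Assumes, hypothetically, that the full
# beam power is intercepted and absorbed; a reflective surface would
# roughly double the force.
C = 299_792_458.0  # speed of light, m/s

def photon_pressure_dv(power_w, mass_kg, seconds):
    """Delta-v imparted by pure radiation pressure (F = P/c)."""
    force = power_w / C          # newtons
    accel = force / mass_kg      # m/s^2
    return accel * seconds       # m/s

# 10 kW beam on a 1 kg object for 3 hours:
dv = photon_pressure_dv(10_000, 1.0, 3 * 3600)
print(f"delta-v ~ {dv:.3f} m/s")
```

Pure photon pressure delivers well under 1 m/s here; ablation of the object's surface, which laser-broom proposals also exploit, imparts far more impulse per watt.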

Tepix 2 hours ago 0 replies      
The relative speeds of pretty much all objects in GEO orbit are probably very small - after all, they all started out orbiting the earth in the same direction at the same speed. Shouldn't this make collecting debris much, much easier? Navigating within the orbit could be done using electric propulsion. Of course, such an undertaking would still be very expensive given that once you have collected a bunch of debris you need to deorbit it.
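That intuition is easy to quantify with Kepler's third law; a quick sketch using standard textbook constants (nothing from the article):

```python
import math

# Derive the geostationary radius and speed from Kepler's third law,
# then see how small the relative speed is between two near-GEO objects.
MU = 3.986004418e14        # Earth's gravitational parameter GM, m^3/s^2
T_SIDEREAL = 86164.1       # sidereal day, s

r_geo = (MU * T_SIDEREAL**2 / (4 * math.pi**2)) ** (1 / 3)
v_geo = math.sqrt(MU / r_geo)
print(f"GEO radius ~ {r_geo / 1e3:.0f} km, speed ~ {v_geo:.0f} m/s")

# Circular-orbit speed 50 km above GEO: the difference is a couple of
# metres per second, tiny next to the ~3 km/s orbital speed itself.
v_above = math.sqrt(MU / (r_geo + 50e3))
print(f"relative speed at GEO+50 km ~ {abs(v_geo - v_above):.1f} m/s")
```

This is the circular-orbit idealization; real GEO debris also picks up inclination drift, which adds some north-south relative velocity.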
wglass 5 hours ago 1 reply      
"space weather problem" ?
ge96 3 hours ago 0 replies      
That video is cool
yummmuy 6 hours ago 0 replies      
If only we knew the details ...
Drones may soon have to identify themselves electronically while in flight recode.net
29 points by jonbaer  7 hours ago   14 comments top 6
slaymaker1907 1 hour ago 0 replies      
I doubt the FAA can regulate this since the drone registration law was blocked in court. Congress passed a law a few years back restricting the FAA from regulating recreational model aircraft, and drones are clearly just the latest incarnation of these.
th3c47 2 hours ago 3 replies      
While this might stop honest drone owners from flying in certain zones, it will eventually create an underground drone scene (bad choice of words - I know) that will fly "under the radar" or transmit fake IDs. It's going to be interesting...
CrackpotGonzo 4 hours ago 1 reply      
Good news for uAvionix https://www.uavionix.com
zkms 5 hours ago 0 replies      
Will this be via low-power ADS-B or something else?
empressplay 4 hours ago 1 reply      
It makes sense that drones should need a transponder just like any other aircraft... it should be broadcast in a way that can be read with a smartphone though (bluetooth?)
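A smartphone-readable broadcast like the one suggested could be little more than a few packed bytes in a Bluetooth LE advertisement (legacy BLE advertising payloads top out at 31 bytes). The layout below is an invented, purely hypothetical format, not any real transponder or regulatory standard:

```python
import struct

# Hypothetical sketch of a drone-ID beacon payload: this field layout
# is made up for illustration and matches no actual specification.
def encode_beacon(drone_id: int, lat: float, lon: float, alt_m: int) -> bytes:
    # uint32 id, int32 lat/lon in 1e-7 degrees, uint16 altitude in metres
    return struct.pack("<IiiH", drone_id,
                       int(round(lat * 1e7)), int(round(lon * 1e7)), alt_m)

def decode_beacon(payload: bytes):
    drone_id, lat, lon, alt_m = struct.unpack("<IiiH", payload)
    return drone_id, lat / 1e7, lon / 1e7, alt_m

payload = encode_beacon(0xBEEF42, 48.858844, 2.294351, 120)
print(len(payload), "bytes:", decode_beacon(payload))
```

At 14 bytes, a record like this fits comfortably in a single advertisement frame, leaving room for a service UUID so phones can filter for it.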
honestoHeminway 3 hours ago 0 replies      
Oh, yeah, more energy, more batteries, more weight, more energy... arr, she flies like a cow.
Effect-Driven QuickChecking of Compilers [pdf] janmidtgaard.dk
21 points by ingve  10 hours ago   3 comments top 2
ingve 8 hours ago 0 replies      
Seems like the server couldn't handle the HN traffic.

The actual implementation is here:


0xFFC 9 hours ago 1 reply      
It redirects to

>Ups, du overskred CPU forbrugsgrænsen ("Oops, you exceeded the CPU usage limit")

It seems the link has problems. Is anybody else experiencing the same thing?

Scaling a Web Service: Load Balancing vivekpanyam.com
57 points by vpanyam  7 hours ago   1 comment top
ge96 3 hours ago 0 replies      
I haven't built anything that requires load balancing (no users); my thought was to use something like a public cloud where it's 0.01 cents/GB or something.

Not sure how I could easily "make an image" copy of my stack/code; possibly this is where serverless is nice, since you just need to throw code/traffic at something.

Again, something I'll have to cross at some point.
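For a feel of what the article's subject boils down to, here is a minimal sketch of round-robin dispatch, the simplest common balancing strategy (the backend addresses are invented):

```python
import itertools

# Round-robin load balancing: rotate incoming requests across a fixed
# pool of backends so each gets an equal share.
class RoundRobinBalancer:
    def __init__(self, backends):
        self._cycle = itertools.cycle(backends)

    def pick(self):
        """Return the backend that should serve the next request."""
        return next(self._cycle)

lb = RoundRobinBalancer(["10.0.0.1", "10.0.0.2", "10.0.0.3"])
assignments = [lb.pick() for _ in range(6)]
print(assignments)
# Each backend receives every third request in turn.
```

Production balancers layer health checks, weights, and connection counts on top of this, but the core dispatch loop is this simple.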

Saved by Alice eff.org
298 points by nsgi  1 day ago   40 comments top 8
heisenbit 17 hours ago 0 replies      
More important than innovation is often execution. Protecting shallow innovation at the cost of hampering maybe less innovative but well-executed small businesses is not serving the common good.
cyphar 5 hours ago 0 replies      
I've always found the culture around software patents incredibly toxic. Very rarely are patents used as a means to fund future work (which is their entire intention) and far more often they are used as an extortion tactic (even by non-trolls).

And even if they were being used "correctly", software development is still poorer as an industry because of them. As they say, Stallman was right all along. https://www.gnu.org/philosophy/software-patents.en.html

CalChris 18 hours ago 0 replies      
Clarified by Enfish.

Enfish v Microsoft

> The Supreme Court has suggested that claims "purport[ing] to improve the functioning of the computer itself, or improv[ing] an existing technological process" might not succumb to the abstract idea exception.


makecheck 16 hours ago 3 replies      
I can see two rules that would help immensely with trolls:

1. There should be an absolute maximum on the litigation that can be brought against any organization for any reason in a given time period (say, a year), and that total cost should not be able to exceed some tiny fraction of its total operating costs for that period (parent companies included, to avoid hiding actual illegal activities in subsidiaries). In other words, it should be impossible for someone to kill a startup in the crib simply by creating overwhelming lawsuits that are too expensive in time and money to deal with.

2. There should be a very substantial penalty for failing to convict after accusing someone of a patent violation; something like 10x the legal costs of the party that was accused, and a moratorium on any similar accusations against any party for some period (like 6 months). In other words, slow these trolls down and hit them hard when they fail, and they might not try to make a shady business out of it.

jfaucett 11 hours ago 0 replies      
Why not just make all patents illegal? Then let companies mitigate the risks of IP infringement through the purchase of insurance. The insurer could offer various policies, some that compensate for all losses due to infringement and/or also going after perpetrators, etc. This seems like a more optimal solution to me than the current lot of legislation and trolling; I think it would maximize the rate of innovation while also allowing companies to invest and have safeguards against IP theft.
ATsch 15 hours ago 1 reply      
This is meta, but it would be awesome if they'd add a slight margin to the text on mobile; it's very difficult to read when the text starts at the border of the page.
IPS3c 19 hours ago 0 replies      
Patent trolls only stifle real innovation.
lucb1e 18 hours ago 4 replies      
So... what is Alice?
How Aging Research Is Changing Our Lives nautil.us
144 points by dnetesn  16 hours ago   49 comments top 9
dahart 13 hours ago 2 replies      
Great article, this is the first one on aging I've read in a while that has its head on straight and isn't trying to make weasel arguments that suggest aging cures are just around the corner. Happy to see a clear distinction between lifespan and longevity; so many people like to leverage confusion between those concepts to argue that medical advancements have improved longevity, like Ray Kurzweil does.

> People talk about healthcare, but in essence what we have right now is not healthcare. It's sick care. Some people see their physician when they're well, but most people don't because there's not much advice that they can give you other than not to smoke and to exercise and all that.

I feel like I identify with this; I want to have conversations with my doctor all the time about routine advice seeking kind of stuff, but I don't because there's nothing wrong, plus it costs a lot. I would love to have a health care plan where conversations during and about being well were expected and included, but I don't see it coming any time soon.

GCA10 13 hours ago 2 replies      
Time to own up to the elephant in the room. Medicine so far has done a really good job of prolonging people's seventies and eighties. All the symptoms of advanced aging that used to be compressed into ages 70-82 or so, ending in death, now can be stretched out from ages 70 to 98.

Will this initiative double the amount of time that we've got the full capacity of our 20s and 30s? Or will it just allow people a prolonged, half-century tour of the twilight years, from 70 to 120?

rubber_duck 57 minutes ago 0 replies      
Hmm, he says "Exercise is an incredible anti-aging medicine." but doesn't get specific. Does this mean cardio workouts or strength training? I do very little cardio but I do strength training >5h/week - I wonder how those two stack up in terms of long-term health.

I remember reading that if you do enough ST you essentially get the same cardiovascular benefits as cardio, but I can't find the source.

snvzz 14 hours ago 2 replies      
There's the SENS foundation, working on a "repair the damage we understand as aging" approach.


dpatrick86 14 hours ago 2 replies      
Great interview. I thought his comments about the upper limits of human lifespan were especially interesting:

> There currently is an upper limit, and the upper limit is probably around 115, 120. You have a very large number -- 100 billion people to choose from, the number of people that have ever lived -- and you have only one who has made it through to 122, Jeanne Calment. The second oldest was 119. It does seem there is an upper limit. Some people have shown that in the last hundred years, even though we have progressively increased the average lifespan, the number of people who live above 115 has not increased.

I also couldn't help but think that his remarks about immortality being the naughty "I-word" is a roundabout way of addressing some of the excitement stirred by a certain slightly sensationalistic wizardly chap in the aging field.

Techowl 14 hours ago 2 replies      
The Buck Institute [0], featured in this article, is a pretty outstanding organization -- they're unique in being a sizable research center devoted to researching aging. I'm excited to see their idea of age-related diseases as biological maintenance problems gaining some traction.

I'm not entirely sure where the interviewer was going with this statement, though.

> There's a lot of Silicon-Valley buzz about longevity and many startups working to develop immortality pills.

I've yet to hear of a startup working on an "immortality pill."

[0] - https://www.buckinstitute.org/

agumonkey 14 hours ago 0 replies      
I've seen some people in this field; I don't know how mature their research is, but they're not joking. Funny that they still encounter resistance based on superficial arguments and lack of renown.
paulcole 7 hours ago 0 replies      
If they live long enough, some people may actually get paid by Nautilus.
Razengan 13 hours ago 2 replies      
So we have the following use cases for immortality:

- "Selfish" vanity, fear, etc. including accumulation of wealth, power, etc.

- A desire to remain with loved ones, including pets, and to keep loved ones around, including celebrities.

- A safeguard against unexpected "unfair" death, e.g. getting murdered, assassinated or dying in a freak accident, or terrorist attack etc.

- Carrying out long-term plans that take longer than human lifespans.

- Participating in projects that take longer than human lifespans and may not be easily restocked with new humans: e.g. traveling interstellar distances in confined spaceships.

  - The aforementioned "plans" and "projects" may simply mean the preservation and protection of certain things, ideas, cultures, rituals and records that cannot be automated or archived.

- A desire to explore more of the universe than can be done in a mortal lifespan.


And the following possible ways in which one or more of the above goals may be achieved:

- Cloning. You basically get a new person that may or may not be "as good" as the loved person/pet/celebrity you wanted to preserve.

- Mind/Memory archival/copying.

- Repairing/Rejuvenating one body for as long as you can.

- Separating the brain from the body and having it remotely control multiple "backup" bodies.

- Reincarnation. This of course assumes a "higher" plane of existence where our "true selves" actually live; e.g. this reality being a VR MMO that you can only play for as long as you paid for.

       cached 2 July 2017 10:02:02 GMT