Hacker News with inline top comments (20 Aug 2016)
1
Nasa just made all its research available online for free independent.co.uk
211 points by signa11  2 hours ago   10 comments top 8
1
thearn4 1 hour ago 0 replies      
I'm actually curious what is new within this. As it stands, NASA research pre-prints are available on the tech reports server (http://www.sti.nasa.gov/), typically right after the export control review process.

edit: maybe it's related to certain journals where there wasn't access on the STI server? Seems odd to me. For the 8 years that I've been at NASA, it's always been expected that my group's work had to be accessible.

edit2: not all rules are universal across the agency, so my experience may be too specific to Glenn Research Center/my division. In any case, the more open, the better.

2
WalterBright 1 hour ago 0 replies      
> Nasa announced it is making all its publicly funded research available online for free

This is the way things ought to be for all publicly funded research, not just NASA. Thank you, NASA, for leading the way.

Technological progress for mankind began with writing; the industrial revolution began with the printing press. Having all the world's knowledge available on your desktop, just a click away, is the beginning of another exponential leap forward.

3
codyb 1 hour ago 0 replies      
Link to the actual data [0].

Looks pretty neat actually. This seems to stem from an executive order by President Obama in 2013. Mobile browsing is okay but I'm excited to check out some of the APIs when I get back to my computer a bit later on. They're separated by category (Earth Science, Aerospace, etc.).

Seems like a lot of the data is already queryable by their APIs, and I assume there are data dumps and research papers available as well.
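For example, hitting the Astronomy Picture of the Day endpoint with the public demo key (assuming the rate-limited DEMO_KEY still works):

 curl "https://api.nasa.gov/planetary/apod?api_key=DEMO_KEY"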

Very cool. There is a serious wealth of data and APIs available for tinkerers and builders these days, from Watson to NYC data to NASA!

[0] - http://www.nasa.gov/open/researchaccess/pubspace

4
owenversteeg 1 hour ago 0 replies      
Ok, so maybe I'm missing something, but what's new here? Is it just the portal?

I've been reading NASA technical papers and publications for years, and although most of the research I was reading was very focused in a specific field, everything I wanted to see was freely available. Even some of the publications linked in the article have been online for a decent amount of time.

5
joeyrideout 1 hour ago 0 replies      
Searching the web portal leads to a PMC database query with the filter "nasa funded". Here's a link to a query with just that filter, which returns all 863 articles:

http://www.ncbi.nlm.nih.gov/pmc/?term=%22nasa+funded%22%5BFi...

6
eecks 1 hour ago 0 replies      
Has anyone looked at the resource? Are there data sets and other similar things that could be used in novel ways?
7
girishso 2 hours ago 0 replies      
At last... I remember Richard Feynman complaining exactly about this!
8
jpeg_hero 2 hours ago 1 reply      
3
Fixing JSON tbray.org
34 points by robin_reala  52 minutes ago   28 comments top 12
1
outsidetheparty 1 minute ago 0 replies      
Shameful confession: when I was first introduced to JSON, I was convinced it would go nowhere. "XML already does everything JSON does! And there's no way to differentiate between nodes and attributes! And there are no namespaces! And no schemas! What's the point of JSON?" And a couple of years later I looked up from the tangled mess of XSLT I was working on to discover that the rest of the world had moved on.

JSON is just javascript, and it leaves out everything else. That's its entire reason for existing, and it caught on because that's all that 99.44% of anyone needed.

Timestamps you can add today, without changing the protocol; just put them in your data if you need them. So I'm not sure what he's even proposing there.
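For instance, something like this works today (field name and RFC 3339 format chosen just for illustration):

 { "event": "deploy", "timestamp": "2016-08-20T19:02:01Z" }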

Schemas: OK, he doesn't like JSON Schema's anyOf. Fair enough. There's no real proposal here for how to fix it, so not much to say here.

Replacing commas with whitespace sounds to me like replacing a very minor irritant with a constant full-body rash. Stuff like his example of "IDs": [ 116 943 234 38793 ] would lead to far more confusion and errors than the occasional stray trailing comma.

So I guess I pretty much vote no on this one thanks for asking

2
philsnow 0 minutes ago 0 replies      
If you're going to drop the commas in the dictionary definitions, why not drop the colons too?

 { "key" "value" 42 37 }
It's a rhetorical question; I think removing the commas is a bad idea. Trouble while hand-editing JSON shouldn't be an overriding use case that drives the syntax.

All of those complaints about JSON and nothing about how you can't have comments?? I feel like Tim Bray and I aren't even on the same planet.

3
otterley 3 minutes ago 0 replies      
I'm not sure how dropping comma separators would work in practice, because carriage returns aren't required between array or hash elements in JSON.

So this would also have to be legal:

 { "foo": "bar" "baz": "quux" }
I'm not sure that's better, personally.

4
_greim_ 25 minutes ago 1 reply      
With the exception of trailing commas, all these things would break the fact that JSON is a subset of JavaScript. Breaking that would cause a lot more problems than it solves. It's just that the minor problems you have to live with seem worse than the major problems you don't.
5
wesleytodd 15 minutes ago 0 replies      
Personally I would rather it require trailing commas like Go does; any optional syntax is a step backwards IMHO
6
tootie 28 minutes ago 2 replies      
If you did all of that you'd still fall way short of what you have been able to do with XML for probably 10 years. XSD defines structure in terms of inheritance, composition, strong types, numerical ranges, enumerations, referential integrity, and namespaces.
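For instance, a numeric range constraint in XSD looks roughly like this (a sketch; the type name is made up):

 <xs:simpleType name="port">
   <xs:restriction base="xs:integer">
     <xs:minInclusive value="1"/>
     <xs:maxInclusive value="65535"/>
   </xs:restriction>
 </xs:simpleType>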
7
ambrop7 5 minutes ago 0 replies      
Two more issues:

- No support for comments.

- No support for NaN and Infinities (i.e. lack of full mapping to IEEE 754). Also at least on Chrome, JSON.parse(JSON.stringify(-0)) gives 0.
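Concretely (this is per the ECMAScript spec, not just a Chrome quirk):

 JSON.stringify(NaN)            // "null"
 JSON.stringify(Infinity)       // "null"
 JSON.parse(JSON.stringify(-0)) // 0, the sign is lost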

8
emilsedgh 30 minutes ago 2 replies      
Major irritant: Commas

I thought the major irritant was having to add double-quotes around keys? Commas never bothered me personally, but having to use double quotes is actually irritating.

9
dlbucci 38 minutes ago 1 reply      
At this point, I'd settle for JSON5 (http://json5.org/), which extends JSON with ECMAScript 5 syntax (including trailing commas, unquoted keys, single-quote strings, and comments). I don't think it's very popular at this point, but then I guess I've never tried to use it.
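A quick sketch of what that buys you (all illegal in plain JSON, fine in JSON5):

 {
   // comments are allowed
   unquoted: 'single-quoted',
   trailing: [1, 2, 3,],
 }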
10
nemild 31 minutes ago 0 replies      
Two quick observations:

- Getting standards passed is hard (especially for popularly used standards), but can be supported by latent usage in the community. You can create a plugin for a few languages that automatically allows the timestamps and doesn't require commas. Get a bunch of people using it, and you'll be on your way to actually getting it in the standard in the future (still debatable in this specific example, but broadly, usage never hurts for getting something into a standard)

- If your concern is your own frustrations (rather than benefits for the vast majority of people), I highly recommend a simple editor-level mapping that translates from your preferred DSL to the underlying standard

11
lhopki01 30 minutes ago 1 reply      
Just use YAML?
12
greenyoda 43 minutes ago 5 replies      
I never understood why the double quotes around property names are mandatory. For example, in JavaScript, I can write

 { foo: 1, bar: 2 }
but JSON syntax insists on

 { "foo": 1, "bar": 2 }
This makes JSON less easily readable by humans, and harder to write or edit by hand.

Anyone know why JSON was designed this way?

4
Guiding a Venture-Backed Startup to a Felony linkedin.com
33 points by sigacts  1 hour ago   12 comments top 4
1
jessaustin 1 hour ago 3 replies      
Calling this a "venture-backed startup" is a stretch. It was really just a con. One couldn't reasonably expect a single author who wrote five westerns all by herself to net $250k for her efforts. They didn't really expect to be able to pay back the investor after buying all those goodies for themselves, even if they could have gotten a single book written.
2
dmix 43 minutes ago 0 replies      
> Share a Worst Case Scenario - The prospectus for the Scoundrel deal showed a conservative, most likely and optimistic scenario. The conservative scenario showed that Mr. Brandenburg would receive all of his investment back in just two years. In fact, Mr. Brandenburg never saw any dividend or repayment on his $250,000 investment. Don't gloss over risks.

This is an interesting piece of advice. I never thought to plan the negative scenarios when doing high-level planning. You always discuss possible risks and keep them at the back of your mind but adding a failure scenario to your plans among low, medium, high outcome scenarios is a really good idea.

You don't hear much about risk management/planning for startups. If you're not monitoring your risks along with your positive KPIs (or factoring them into your KPI analysis), it's easy to gloss over them or delude yourself that they don't exist.

Otherwise, though, I question the value of drawing many lessons from this 'startup' failure. It's not very analogous to anything like the standard HN web startup. Just some outsider non-techies who wanted to make a 'web portal' and wasted a bunch of money frivolously (i.e., $100k went to the author for licensing upfront). Even ignoring their lack of technical industry experience, their business plan heavily centered around building a community in a target market where they didn't already have any traction or influence - which is always the hardest part of any community-centric product. The tech is (usually) the easy part.

3
wtvanhest 24 minutes ago 0 replies      
IANAL but...

> Keep personal expenses out of your company books.

Is likely the most important advice in this piece. If you only take the compensation you agree with your investors/board to take, and you never pass unrelated expenses back to your entity, you will at least appear to be acting morally.

The second you start commingling funds is the second you enter a period of extreme risk. Never commingle investor funds with personal funds, and avoid spending on things that you 'justify' as a business expense if they benefit you personally in any way.

I have a feeling we are about to read about more cases like this one. America is in full startup fever right now and people will act immorally when they realize that success is out of their grasp.

4
api 54 minutes ago 0 replies      
The international bureau of you can't make this stuff up delivers yet again.
5
Playing with Syntax stevelosh.com
82 points by jsnell  3 hours ago   34 comments top 10
1
reikonomusha 1 hour ago 0 replies      
I agree with the overall message of the article (Lisp is a good substrate to explore syntactic ideas) but find the article just sort of wandering aimlessly to construct some macros of dubious value.

Don't Repeat Yourself is a common tenet of programming. Lisp gives you the opportunity to virtually never repeat yourself because making syntactic abstractions is cheap and easy. It must be recognized however that abstractions range from totally useless to paradigm-shifting, and DRY isn't about saving typing but rather encapsulating ideas into reusable constructs. A common problem I see in production Lisp code bases is the existence of a programmer's set of pet syntactic abstractions that don't really have a high ceiling; you see all kinds of zaps, frobs, and lets for a reason no other than they save some typing. I do not appreciate such abstractions. The Lisp syntactic abstractions I do appreciate are ones that bring me into a new paradigm of thinking. For instance, Lisp includes a set of macros to express complex iteration readably. Without this, iteration wouldn't be much more than writing (un)conditional jumps, and by having the abstractions, you shift your thinking from the notion of jumps to the notion of traversal. Another recent example is a set of macros to simplify index wrangling in tensor algebra. These "macros" already existed in contemporary math in the form of constructs such as Einstein notation and they make reasoning and thinking about tensors easier.
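To make the iteration example concrete, here's a vanilla use of the standard LOOP macro (nothing exotic):

 (loop for x in '(1 2 3 4)
       when (evenp x)
         collect (* x x))
 ;; => (4 16)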

This article, in my opinion, treks to the destination of a macro that, while perhaps neat, isn't going to give you higher quality programs.

2
expression 3 hours ago 6 replies      
I guess I'm never actually going to get Lisp enough to appreciate its syntax.

> Aside from the prefix ordering, Common Lisp's syntax is already a bit more elegant because you can set arbitrarily many variables without repeating the assignment operator over and over again:

 ; Have to type out `=` three times
 x = 10
 y = 20
 z = 30

 ; But only one `setf` required
 (setf x 10 y 20 z 30)
I utterly fail to see the aforementioned elegance, although I certainly can't miss the line where it happens.

3
userbinator 2 hours ago 0 replies      
> It turns out that SBCL is really quite good at optimizing let statements and local variables.

That is not surprising for any dataflow-based compiler, since "a = b" by itself won't generate even one extra instruction unless those two variables' values get used independently, creating a "fork" in the path and necessitating a copy.

However --- and this is a good example of how compilers, especially for very high-level languages, are still far from optimum even for trivial optimisation --- the majority of those instructions are still unnecessary. Ultimately, the purpose of that code is to perform one addition and one modulus. Doing that should not take over two dozen instructions on any sane architecture.

At a glance I can already see that line 9 is superfluous: it writes the quotient into RBX, where it will never be used again. And there should not be any reason to check for 0 explicitly if you are going to just raise the exception anyway --- the CPU will do that automatically.

If I analyse that snippet a little more I could probably find more ways to optimise it, but it would be easier and faster to just show what a human, and presumably a better compiler, could do:

 add rax, rcx
 cqo
 idiv rdx
Assuming we are using registers for arguments and have a choice of where the caller expects the return value, those 3 instructions are all that's necessary, and the cqo is only because x86's divide takes a double-width dividend.

This is why I love Common Lisp. You can be rewriting the syntax of the language and working on a tower of abstraction one moment, and looking at x86 assembly to see what the hell the computer is actually doing the next. It's wonderful.

I want to believe that compiler optimisations are actually good enough to "collapse the abstractions" and make these languages generate nearly the same code as an expert Asm programmer, but when a simple function, even at maximum optimisation, turns into 23 instructions in 70 bytes vs. the 3 instructions in 8 bytes that I'd expect, it's rather disappointing.

4
akkartik 2 hours ago 0 replies      
It seems worth pointing out that credit for the name zap should go to Arc. (Unless it already existed somewhere else even before that?)

https://github.com/arclanguage/anarki/blob/15481f843d/arc.ar...

5
kazinator 2 hours ago 0 replies      

 (defun move-ball (ball distance screen-width)
   (zapf (ball-x ball)
         (mod (+ distance %) screen-width)))
In TXR Lisp:

 (defun move-ball (ball distance screen-width)
   (placelet ((it (ball-x ball)))
     (set it (mod (+ distance it) screen-width))))
Trivial exercise: zapf macro expanding to the placelet form.
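In Common Lisp terms, a naive version of that exercise could look like this (a sketch only; it evaluates the place's subforms twice, which a get-setf-expansion-based version would avoid):

 (defmacro zapf (place expr)
   `(setf ,place (let ((% ,place)) ,expr)))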

TXR Lisp's anaphoric operators like ifa and conda use placelet, so that their "it" can be a place referring to the original:

 ;; decrement (ball-x ball) place if it exceeds 15:
 (ifa (> (ball-x ball) 15)
   (dec it))

6
jiyinyiyong 2 hours ago 0 replies      
I would still suggest editing the AST directly: https://www.youtube.com/watch?v=g0tAVjwuc1U Text syntaxes bring a lot of complexity.
7
rch 2 hours ago 0 replies      
I'd like to see a service that lets people play with grammar specs interactively, with working snippets.
8
abritinthebay 2 hours ago 3 replies      
Every Lisp article promoting the "elegance" of its syntax will do the exact opposite to non-users of Lisp.

This is no exception.

From the first example on there is not one case where the Lisp syntax is a clearer expression of the concept and it's hard to justify that anything that is less clear is elegant in the slightest.

Personally I'd put destructuring assignment as more elegant for the first example:

 [a, b, c] = [10, 20, 30]
But that's just me.

9
qwertyuiop924 2 hours ago 0 replies      
And all I can think is that all of this would be more elegant in Scheme. In particular, Scheme's cut macro makes it practical to re-order args, as opposed to implementing the % expansion used here. And it composes.

Or you could always just use readtables to implement your own version of the Clojure lambda shorthand. I'm not clear on why Steve didn't do that. It's a more effective alternative to implementing the % syntax in every macro.
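For reference, SRFI-26's cut looks like this (a minimal sketch):

 ;; (cut + 1 <>) is shorthand for (lambda (x) (+ 1 x)):
 ((cut + 1 <>) 41) ; => 42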

10
mcphage 40 minutes ago 0 replies      
Does this article improve as it goes on? I got to:

> Aside from the prefix ordering, Common Lisp's syntax is already a bit more elegant because you can set arbitrarily many variables without repeating the assignment operator over and over again

Which is both wrong and pointless, so it didn't seem worth continuing.

6
The World Is Closer Than Ever to Eradicating Guinea Worm washingtonpost.com
41 points by dpflan  2 hours ago   12 comments top 4
1
bertil 26 minutes ago 1 reply      
I remember a story about a worm that is extracted using a stick: the worm is presented with the stick, wraps itself around it, and you can pull it out. Presumably, that image of a long animal around a stick as a healing process is what gave us the caduceus, the medical symbol of a snake around a stick.

Anyone familiar with anything like this?

2
praptak 30 minutes ago 0 replies      
I wonder how much danger there is of the worm finding a new host species - I mean an animal other than the dog, and possibly one harder to control. That would be a disastrous outcome.
3
kpwagner 32 minutes ago 0 replies      
Neil deGrasse Tyson did an interview with Jimmy Carter on his podcast (StarTalk); the interview covered the Guinea worm in detail--very interesting.
4
enraged_camel 2 hours ago 7 replies      
Going back to the conversations we've had in the Zika thread[1], why are we OK with eliminating the Guinea Worm, but not certain species of mosquitoes?

[1]https://news.ycombinator.com/item?id=12322885

A lot of people in that thread made the argument that we shouldn't exterminate a species without first fully understanding the consequences. Yet we seem to be doing that here with this parasite, and no one seems to be saying, "but... think of the ecosystem!"

7
Intel's Kaby Lake CPU: The Good, the Bad, and the Meh makeuseof.com
35 points by walterbell  2 hours ago   25 comments top 5
1
Meegul 1 hour ago 2 replies      
The author of this article says that Kaby Lake is an 'anomaly' when it comes to the tick-tock strategy that Intel has been following. However, this is not the case. Following the rollout of Intel's 14nm lineup, Intel realized that it was no longer feasible to move to a smaller processing node every 2 years. And so tick-tock died.

It is currently not expected that Intel will move back to the tick-tock model after Kaby Lake or the 14nm node. Kaby Lake is no anomaly - it's a normal architecture update in the post-14nm world.

2
smlacy 1 hour ago 2 replies      
What does it even mean for a processor to "not support Windows 7"? Does this mean it's not fully x86 backward compatible, or is this just marketing speak for "we have a strategic alliance with Microsoft, and they don't want you to run Windows 7, so we've put hooks into our hardware to prevent it"?
3
technofiend 48 minutes ago 0 replies      
The linked website makeuseof.com goes on the "never visit" list right next to Forbes.com, due to a buzzing full-screen ad with no close option.
4
protomyth 32 minutes ago 1 reply      
Microsoft could improve sales of Windows 10 by leaps and bounds by coming out with a No Analytics edition. They may think it's a techy problem, but I keep having normal people tell me they won't upgrade because "it spies on you and snoops your credit cards". This is getting silly and stupid.
5
revelation 1 hour ago 2 replies      
What's all this talk of "the CPU won't run Windows 7"? Is this just a confused blogspam site?
8
The Japanese Urban Zoning System marginalrevolution.com
95 points by Osiris30  5 hours ago   23 comments top 10
1
smcl 1 minute ago 0 replies      
"[The] great rigidity in allowed uses per zone in North American zoning means that urban planing departments must really micromanage to the smallest detail everything to have a decent city. Because if they forget to zone for enough commercial zones or schools, people cant simply build what is lacking, theyd need to change the zoning, and therefore confront the NIMBYs"

Oh my word - when playing the SimCity series as a kid I always thought "well, this is a kinda weird way to work but I guess it's just a game..." - I had no idea that this was a relatively accurate abstraction for real-life city planning in the USA.

2
codyb 3 hours ago 0 replies      
Very neat. I love reading about cities and urban planning. I have "The Death and Life of Great American Cities" by Jane Jacobs lying around which I need to read, and I read another book about architecture here in New York City and how it has evolved over time in response to things such as fires and 9/11. Really really very interesting stuff.

This was a great post. Mixed-use cities are generally much more desirable in my eyes. The American suburbs are a blight on the land that makes our nation so much more car-centric than it needs to be, spreading the population out and reducing the efficacy of our now underfunded and sprawled-out public transportation system.

Of course it's not a shocker that such a system evolved in a nation with such a staggering amount of land available to its citizens, who live and have lived in a nation with huge amounts of racial segregation.

The idea of a set of national guidelines makes sense from an engineering perspective because small towns can't afford engineers often (or may not even think to seek out their expertise) and in the end you have a group of individuals with no qualifications determining the fate of their town with arbitrarily decided rules.

And rating things by their nuisance (traffic plus noise) levels is a really neat way to quantify whether a building should exist in a given area.

Awesome! Lots of new things to entertain the mind.

3
nxc18 2 hours ago 0 replies      
If this is an interesting topic, check out the book, Seeing Like a State. It has really great coverage of the history of things like urban planning, naming of people, mapping, etc.

The standout thesis of the book is that from a birds-eye-view perspective (what the state sees), cities that were designed to meet the local needs of people look chaotic, but the ideal clean order from that perspective ignores needs of people.

For example, a neighborhood with just houses connected by a highway to a commercial district looks clean and organized on a map. Only with boots on the ground do you see what a disaster that is compared to the model of having neighborhoods and boroughs with all of the community resources (housing, work, food, shopping, parks, schools, etc.) they need.

4
irq11 3 hours ago 2 replies      
Japan's property values haven't risen in the last 20 years because their economy is in the toilet, they're struggling with deflation, and their population is declining.

When the economy was growing like crazy in the 80s, Japan had the most expensive real estate on the planet, and the same zoning they do now.

5
saosebastiao 3 hours ago 2 replies      
I'm glad Tyler is talking about this, as he tends to be the leading edge of a lot of policy discussion around newish ideas, and the Japanese zoning system is a marvel of both efficacy and simplicity.

While the Japanese system is implemented and managed on the national level, I can imagine this being more effective on a more granular level, while still being much higher up than the city level as to avoid the political repercussions of local special interest landowners.

6
masklinn 1 hour ago 1 reply      
> in the US zones tend to be exclusive but in Japan the zones limit the maximum nuisance in a zone

It's not just Japan; I'm not sure US-style exclusive Euclidean zoning is practiced anywhere else, if only for historical reasons: in most countries cities grew organically and mixed-use.

7
Cyph0n 3 hours ago 0 replies      
I read the original article when it was posted to HN a while back. It's a bit too detailed, but provides a very comprehensive analysis of why the Japanese system just works.
8
contingencies 3 hours ago 1 reply      
It's not untrue to say that Japanese society as a whole does have a culture of Confucian orthodoxy/hierarchy, i.e. people do not 'rock the boat' or 'aggressively agitate for change'. This means that tradition survives well, and a cultural premium on internalizing discontent and preserving outward peace rules the land. Urban zoning is probably one of the areas most affected by this cultural property. That said, in my mind even traditional Japanese urban architecture is pretty well optimized for high density, itself largely based on Tang Dynasty Chang'an in China, arguably one of the greatest periods of multicultural tolerance and philosophical debate in the history of the planet (terminus of the Silk Road, vast exchange of ideas).
9
pessimizer 1 hour ago 1 reply      
I'm not sure what people are getting out of this post. It's made up entirely of quotes from a very good blogpost [https://urbankchoze.blogspot.ca/2014/04/japanese-zoning.html] about Japanese zoning (that was also posted here years ago) wrapped in completely unsupported declarations of that system's superiority, followed with some odious racial conspiracy theories in the comments, which are dominated by a pretty well-known white supremacist.

This link isn't value added over the original post, it's value subtracted.

10
atemerev 3 hours ago 0 replies      
SimCity is to blame. It cemented the current system. :)
9
If English were written like Chinese (1999) zompist.com
75 points by rogerbraun  3 hours ago   8 comments top 6
1
allemagne 1 hour ago 0 replies      
The author really knocked it out of the park with this analogy. This explained things to me that I might have learned through experience but had never been explicitly told before. He even took the time at the end to point out where his own analogy breaks down. Zhongwen.com is also a fantastic site worth a look for students of Chinese. Overall extremely impressive, creative, and interesting.
2
WalterBright 28 minutes ago 0 replies      
The increasing use of icons and emoji suggests that English will become like Chinese!

(Ever try to look up an icon in a dictionary? This puts paid to the idea that icons are decipherable by people who don't know the language. Copyrighting the icons makes even that infinitely worse, as it prevents standardization.)

3
wodenokoto 2 hours ago 1 reply      
This article is actually a lot better at explaining the character system used in both China and Japan than the articles that have hit the front page today and in the last week or so.
4
Y_Y 38 minutes ago 0 replies      
I've been waiting for years for somebody to do something like this with emoji. All we'd need is a way to add radicals and a central dictionary.
5
superobserver 20 minutes ago 0 replies      
FYI we're close to this with the use of acronyms and abbreviations, not just emoticons, given English's phonetic basis, etc.
6
kwhitefoot 42 minutes ago 1 reply      
Is this meant to be serious (surely not), or is it rather like Mark Twain's take on spelling reform?
10
Algorithm Visualizer Knight's tour problem jasonpark.me
64 points by avinassh  4 hours ago   3 comments top
1
bogomipz 2 hours ago 1 reply      
This is neat.

There is something similar if you are more inclined towards python:

https://pyalgoviz.appspot.com/
