hacker news with inline top comments (21 Jun 2016)
Alan Kay has agreed to do an AMA today
1081 points by alankay1  21 hours ago   525 comments top 197
di 20 hours ago 5 replies      
Hi Alan,

In "The Power of the Context" (2004) you wrote:

 ...In programming there is a wide-spread 1st order theory that one shouldn't build one's own tools, languages, and especially operating systems. This is true: an incredible amount of time and energy has gone down these ratholes. On the 2nd hand, if you can build your own tools, languages and operating systems, then you absolutely should because the leverage that can be obtained (and often the time not wasted in trying to fix other people's not-quite-right tools) can be incredible.
I love this quote because it justifies a DIY attitude of experimentation and reverse engineering, etc., that generally I think we could use more of.

However, more often than not, I find the sentiment paralyzing. There's so much that one could probably learn to build themselves, but as things become more and more complex, one has to be able to make a rational tradeoff between spending the time and energy in the rathole, or not. I can't spend all day rebuilding everything I can simply because I can.

My question is: how does one decide when to DIY, and when to use what's already been built?

fogus 7 minutes ago 0 replies      
I can think of no better person to ask than Alan Kay:

What are the best books relevant to programming that have nothing to do with programming? (e.g. How Buildings Learn, Living Systems, etc.)?

alankay1 16 hours ago 3 replies      
Hi Folks

Thanks for all your questions, and my apologies to those I didn't answer. I got wiped out after 4½ hours of answering (I should have taken some breaks). Now I have to. I will look at more of the questions tomorrow.

Very best wishes


guelo 20 hours ago 1 reply      
When you were envisioning today's computers in the 70s you seemed to have been focused mostly on the educational benefits, but it turns out that these devices are even better for entertainment, to the point where they are dangerously addictive and steal time away from education. Do you have any thoughts on interfaces that guide the brain away from its worst impulses and towards more productive uses?
satysin 20 hours ago 2 replies      
Hi Alan,

I have three questions -

1. If you were to design a new programming paradigm today using what we have learnt about OOP what would it be?

2. With VR and AR (Hololens) becoming a reality (heh) how do you see user interfaces changing to work better with these systems? What new things need to be invented or rethought?

3. I also worked at Xerox for a number of years although not at PARC. I was always frustrated by their attitude to new ideas and lack of interest in new technologies until everyone else was doing it. Obviously businesses change over time and it has been a long time since Xerox were a technology leader. If you could pick your best and worst memories from Xerox what would they be?

Cheers for your time and all your amazing work over the years :)

sebastianconcpt 20 hours ago 1 reply      
Hi Alan,

1. What do you think about the hardware we are using as the foundation of computing today? I remember you mentioning how cool the architecture of the Burroughs B5000 [1] was, being designed to run higher-level programming languages on the metal. What should hardware vendors do to make hardware that is friendlier to higher-level programming? Would that help us be less dependent on VMs while still enjoying silicon-level performance?

2. What software technologies do you feel we're missing?

[1] https://en.wikipedia.org/wiki/Burroughs_large_systems

cozuya 18 minutes ago 0 replies      
Hey Alan, did you know my uncle Warren Teitelman at PARC? Any fun stories? He passed away a few years ago.


losvedir 19 hours ago 7 replies      
At my office a lot of the non-programmers (marketers, finance people, customer support, etc) write a fair bit of SQL. I've often wondered what it is about SQL that allows them to get over their fear of programming, since they would never drop into ruby or a "real" programming language. Things I've considered:

 * Graphical programming environment (they run the queries from pgadmin, or Postico, or some app like that)

 * Instant feedback - run the query, get useful results

 * Compilation step with some type safety - will complain if their query is malformed

 * Are tables a "natural" way to think about data for humans?

 * Job relevance
Any ideas? Can we learn from that example to make real programming environments that are more "cross functional" in that more people in a company are willing to use them?
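The "instant feedback" loop the parent describes can be sketched with Python's built-in sqlite3 standing in for pgadmin/Postico. The table, column names, and data here are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("east", 120.0), ("west", 80.0), ("east", 30.0)])

# A malformed query complains loudly and immediately (the "type safety" point):
try:
    conn.execute("SELECT amont FROM orders")   # typo in column name
except sqlite3.OperationalError as e:
    print("error:", e)

# A well-formed query returns useful results right away:
for row in conn.execute(
        "SELECT region, SUM(amount) FROM orders "
        "GROUP BY region ORDER BY region"):
    print(row)   # ('east', 150.0) then ('west', 80.0)
```

The whole loop is: type a query, run it, see a table or a clear error, with no build step or program scaffolding in between.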

ianbicking 19 hours ago 1 reply      
1. After Engelbart's group disbanded it seemed like he ended up in the wilderness for a long time, and focused his attention on management. I'll project onto him and would guess that he felt more constrained by his social or economic context than he was by technology, that he envisioned possibilities that were unattainable for reasons that weren't technical. I'm curious if you do or have felt the same way, and if have any intuitions about how to approach those problems.

2. What are your opinions on Worse Is Better (https://www.dreamsongs.com/RiseOfWorseIsBetter.html)? It seems to me like you pursue the diamond-like jewel, but maybe that's not how you see it. (Just noticed you answered this: https://news.ycombinator.com/item?id=11940276)

3. I've found the Situated Learning perspective interesting (https://en.wikipedia.org/wiki/Situated_learning). At least I think about it when I feel grumpy about all the young kids and Node.js, and I genuinely like that they are excited about what they are doing, but it seems like they are on a mission to rediscover EVERYTHING, one technology and one long discussion at a time. But they are a community of learning, and maybe everyone (or every community) does have to do that if they are to apply creativity and take ownership over the next step. Is there a better way?

IsaacL 20 hours ago 3 replies      
What do you think of Bret Victor's work? (http://worrydream.com/) Or Rich Hickey?

Who do you think are the people doing the most interesting work in user interface design today?

arloc 22 minutes ago 0 replies      
Hi Alan,

Can you confirm that a message is declarative information sent to a recipient, which can (or cannot) react to it? And what is your opinion about inheritance in OOP? Is it an absolutely essential feature in an OOP language?

LeicesterCity 20 hours ago 1 reply      
Hi Alan,

Previously you've mentioned the "Oxbridge approach" to reading, whereby, if my recollection is correct, you take four topics and delve into them as much as possible. Could you elaborate on this approach (I've searched the internet and couldn't find anything)? And do you think this structured approach has more benefits than, say, a non-structured approach of reading whatever is of interest?

Thanks for your time and generosity, Alan!

coldtea 20 hours ago 1 reply      
Hi Alan,

On the "worse is better" divide I've always considered you as someone standing near the "better" (MIT) approach, but with an understanding of the pragmatics inherent in the "worse is better" (New Jersey) approach too.

What is your actual position on the "worse is better" dichotomy?

Do you believe it is real, and if so, can there be a third alternative that combines elements from both sides?

And if not, are we always doomed (due to market forces, programming as "popular culture", etc.) to have tools that fall short of what can theoretically be achieved?

edwintorok 20 hours ago 1 reply      
Hi, I have a few questions about your STEPS project:

- Is there a project that is the continuation of the STEPS project?

- What is your opinion of the Elm language?

- How do you envision all the good research from the STEPS model could be used for building practical systems?

- STEPS focused on personal computing, do you have a vision on how something similar could be done for server-side programming?

- Where can I find all the source code for the Frank system and the DSLs described in the STEPS report?

jarmitage 19 hours ago 1 reply      
Hi Alan,

What advice would you give to those who don't have a HARC to call their own? What would you do to get set up / find a community / get funding for your adventure if you were starting out today? What advice do you have for those who are currently in an industrial/academic institution who seek the true intellectual freedom you have found? Is it just luck?!

germinalphrase 17 hours ago 1 reply      
Hi Alan,

As a high school teacher, I often find that discussions of technology in education diminish 'education' to curricular and assessment documentation and planning; however, these artifacts are only a small element of what is, fundamentally, a social process of discussion and progressive knowledge building.

If the real work and progress with my students comes from our intellectual back-and-forth (rather than static documentation of pre-existing knowledge), are there tools I can look to that have been/will be created to empower and enrich this kind of in situ interaction?

16bytes 20 hours ago 1 reply      
Hi Alan,

I'm preparing a presentation on how to build a mental model of computing by learning different computer languages. It would be great to include some of your feedback.

* What programming language maps most closely to the way that you think?

* What concept would you reify into a popular language such that it would more closely fit that mapping?

* What one existing reified language feature do you find impacts the way you write code the most, especially even in languages where it is not available?

nnq 10 hours ago 1 reply      
Hi Alan, the question that troubles me now and I want to ask you is:

Why do you think there is always a difference between:

A. the people who know best how something should be done, and

B. the people who end up doing it in a practical and economically-successful or popular way?

And should we educate our children or develop our businesses in ways that could encourage both practicality and invention? (do you think it's possible?). Or would the two tendencies cancel each other out and you'll end up with mediocre children and underperforming businesses, so the right thing to do is to pick one side and develop it at the expense of the other?

(The "two camps" are clearly obvious in the space of programming language design and UI design (imho they're the same thing: programming languages are just "UIs between programmers and machines"), as you well know and have said, with one group of people (you among them) having the right ideas of what OOP and UIs should be like, and another group inventing the technologies that succeed in industry, like C++ and Java. But the pattern happens at all levels, even business: the people with the best business ideas are almost never the ones who end up doing things, and so things get done in a "partially wrong" way most of the time, although we have the information to "do it right".)

fchopin 12 hours ago 1 reply      
Hi, Alan!

Like many here, I'm a big fan of what you've accomplished in life, and we all owe you a great debt for the great designs and features of technologies we use everyday!

The majority of us have not accomplished as much in technology, and many of us, though a minority, are in the top end of the age bell curve. I'm in that top end.

I've found over the years that I've gone from being frustrated with the churn of software/web development, to completely apathetic about it, to wanting something else- something more meaningful, and then to somewhat of an acceptance that I'm lucky just to be employed and making what I do as an older developer.

I find it very difficult to have the time and energy to focus on new technologies that come out all of the time, and less and less able as my brain perhaps is less plastic to really get into the latest JavaScript framework, etc.

I don't get excited anymore, don't have the motivation, ability, or time to keep up with things like the younger folk. Also, I've even gotten tired of mentoring them, especially as I become less able and therefore less respected.

Have you ever had or known someone that had similar feelings of futility or a serious slowdown in their career? If so, what worked/what didn't and what advice could you provide?

Thank you for taking the time to read and respond to everyone you have here. It definitely is much appreciated!

nostrademons 20 hours ago 1 reply      
What turning points in the history of computing (products that won in the marketplace, inventions that were ignored, technical decisions where the individual/company/committee could've explored a different alternative, etc.) do you wish had gone another way?
lispython 1 hour ago 1 reply      
Hi Alan, I'm an editor at Programmer Magazine in China[1], could you allow me to translate your answers into Chinese, and share with readers in China?

[1] http://programmer.com.cn/

CharlesMerriam2 20 hours ago 3 replies      
Many mainstream programming tools feel like they are moving backwards. For example, Saber-C in the 1980s allowed hot-editing without restarting processes, and graphical views of data structures. Similarly, the ability to experiment with collections of code before assembling them into a function was an advance.

Do you hold much hope for our development environments helping us think?

discreteevent 19 hours ago 1 reply      
Hi Alan,

A lot of the VPRI work involved inventing new languages (DSLs). The results were extremely impressive but there were some extremely impressive people inventing the languages. Do you think this is a practical approach for everyday programmers? You have also recommended before that there should be clear separation between meta model and model. Should there be something similar to discipline a codebase where people are inventing their own languages? Or should just e.g. OS writers invent the languages and everyone else use a lingua franca?

kartD 20 hours ago 1 reply      
Hi Alan, What do you think about the current state of language design (Swift, Rust, Go)? Anything that makes you happy/annoys you?
trsohmers 17 hours ago 1 reply      
Hi Alan,

We met at a retreat last fall, and it was a real treat for me to hear some fantastic stories/anecdotes about the last 50 years of computing (which I have only been directly involved with for about 1/10th of). Another one of my computing heroes is Seymour Cray, whom we talked about a bit, along with your time at Chippewa Falls. While a lot of HN'ers know about you talking about the Burroughs B5000, I (and I bet most others) would have had no idea that you got to work with Seymour on the CDC 6600. Do you have any particular Seymour Cray/6600 stories that you think would be of interest to the crowd?

Thanks again for doing this, and I hope to be able to talk again soon!

emaringolo 20 hours ago 1 reply      
Do you still see an advantage of using Smalltalk (like Squeak/Pharo) as a general purpose language/tool to build software or do you think that most of its original ideas were somehow "taken" by other alternatives?
asymmetric 20 hours ago 0 replies      
Do you agree with the POV that sees Erlang as implementing some of the core tenets of OOP, namely message passing and encapsulation of state? (Cf. for example http://tech.noredink.com/post/142689001488/the-most-object-o...)
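The two tenets that POV refers to can be sketched in a toy way, here in Python rather than Erlang (the `Counter` actor and its message names are invented for illustration, and this is not a claim about real Erlang semantics): state that only messages can reach, and a receiver that decides how, or whether, to react.

```python
from queue import Queue
import threading

class Counter:
    """A toy 'actor': private state, reachable only through its mailbox."""
    def __init__(self):
        self._count = 0            # encapsulated: never touched directly
        self._mailbox = Queue()
        threading.Thread(target=self._loop, daemon=True).start()

    def send(self, msg, reply=None):
        self._mailbox.put((msg, reply))

    def _loop(self):
        while True:
            msg, reply = self._mailbox.get()
            if msg == "incr":
                self._count += 1
            elif msg == "get" and reply is not None:
                reply.put(self._count)
            # unknown messages are simply dropped: the receiver chooses

c = Counter()
c.send("incr")
c.send("incr")
c.send("dance")            # not understood; silently ignored
reply = Queue()
c.send("get", reply)
print(reply.get())         # -> 2
```

The point of the sketch is that nothing outside the actor can read or write `_count` except by sending a message and waiting for a reply.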
ducklord 20 hours ago 2 replies      
Hey Alan, you once said that Lisp is the greatest single programming language ever designed. Recently, with the emergence of statically typed languages like Haskell and Scala, has that changed? Why do you think, after being around for so long, Lisp isn't as popular as mainstream languages like Java, C or Python? And lastly, what are your thoughts on MIT's switch to Python instead of Scheme to teach their undergraduate CS program?
_mhr_ 20 hours ago 0 replies      
What is HARC currently working on? Is it for now a continuation of the old CDG Labs / VPRI projects or are there already new projects planned / underway?

Also, how do you organize and record your ideas? Pen and paper? Some kind of software? What system do you use? I ask because I'm fascinated by the idea of software that aids in thought, collaboration, and programming - meshing them all together.

I've seen elsewhere (https://news.ycombinator.com/item?id=11940007) that you agreed that many mainstream "paradigms" should be "retired". Retiring implies that you replace. In particular, I'm curious what you would like to see filesystems or the Unix terminal replaced with?

erring 7 hours ago 1 reply      
In a recent talk, Ivan Sutherland spoke along the lines of: Imagine that the hardware we used today had time as a first-class concept. What would computing be like? [1]

To expand on Sutherland's point: Today's hardware does not concern itself with reflecting the realities of programming. The Commodore Amiga, which had a blitter chip that enabled high-speed bitmap writes with straightforward software implementation, brought about a whole new level in game programming. Lisp machines, running Lisp in silicon, famously enabled an incredibly powerful production environment. Evidence is mounting that the fundamental concepts we need for a new computing have to be ingrained in silicon, and programmers, saved from the useless toil of reimplementing the essentials, should be comfortable working in the (much higher and simpler) hardware level. Today, instead of striving for better infrastructure of this sort, we are toiling away at building bits of the perpetually rotting superstructure in slightly better ways.

The more radical voices in computer architecture and language design keep asserting in their various ways that a paradigm shift in how we do infrastructure will have to involve starting over with computing as we know it. Do you agree? Is it impossible to have time as a first-class concept in computing with anything short of a whole new system of computing, complete with a fundamentally new hardware design, programming environment and supporting pedagogy? Or can we get there by piling up better abstractions on top of the von Neumann baggage?

[1] This is from memory. Apologies for a possible misquotation, and corrections most welcome.

testmonkey 20 hours ago 0 replies      
Jaron Lanier mentioned you as part of the, "humanistic thread within computing." I understood him to mean folks who have a much broader appreciation of human experience than the average technologist.

Who are "humanistic technologists" you admire? Critics, artists, experimenters, even trolls... Which especially creative technologists inspire you?

I imagine people like Jonathan Harris, Ze Frank, Jaron Lanier, Ben Huh, danah boyd, Sherry Turkle, Douglas Engelbart, Douglas Rushkoff, etc....

lispython 1 hour ago 0 replies      
Recalling those past days, is there any idea that has not yet played an important role, but is about to be forgotten?

Especially as we are losing the initial generation of programmers.

juliangamble 1 hour ago 0 replies      
Hi Alan,

My understanding was that you were there at the keynote where Steve Jobs launched the iPad. From what we've heard Steve came up to you after the event and asked you what you thought (implicitly acknowledging your work on the Dynabook).

Subsequent interviews suggested you thought that the iOS range of products "were the first computers good enough to criticise".

My question is: what has to happen next for the iPad to start achieving what you wanted to do with the Dynabook?

ontouchstart 18 hours ago 1 reply      
Hi Alan,

We know you are not a big fan of the web. Regardless of how we got here, what is your view on how we should address real-world decentralization problems in the context of http://www.decentralizedweb.net/ ?

tlack 21 hours ago 1 reply      
Have you spent any time studying machine learning and how it might affect the fundamental ways we program computers? Any thoughts on how the tooling of machine learning (TensorFlow, ad hoc processes, etc) could be improved?
defvar 20 hours ago 1 reply      
Hi Alan, do you still do coding (any kind of, for any purpose) these days? If you do, what's your comfortable setup (say, language, editor, tools and etc)?
adamnemecek 20 hours ago 0 replies      
What are some opinions (CS related or not) that you completely changed your mind on?
panic 16 hours ago 1 reply      
Hi Alan,

There's a lot of economic pressure against building new systems. Making new hardware and software takes longer than building on the existing stuff. As time goes on, it gets harder and harder to match the features of the existing systems (imagine the effort involved in reimplementing a web browser from scratch in a new system, for example), not to mention the massive cost benefits of manufacturing hardware at a large scale.

Many people working in software realize the systems they use are broken, but the economics discourage people from trying to fix them. Is it possible to fix the economics? Or maybe we need more people able to resist this pressure?

iyn 20 hours ago 1 reply      
Thanks for doing this AMA.

Q: How do you think we can improve today's world (not just with technology)? What do you think is our species' way forward? How can we as a civilization 'get to a higher level'? Specifically, I'm interested in your views on ending poverty and suffering, not destroying the Earth, improving our political and social systems, improving education, etc. I understand that these are very broad topics without definitive answers, but I'd love to hear some of your thoughts about them.

Thank you and I just want to mention that I appreciate your work.

walterbell 20 hours ago 0 replies      
Do you see further research paths for metacompilers [1] to reduce code and enable customizable user interfaces?

With containers and hypervisors now in desktop OSes (Windows, Mac OS, Linux), could an open-source research OS (e.g. KSWorld) be packaged for developers and end-users who want to test your team's experimental UIs?

Is there long-term value in "private machine learning" where some data and algos are focused on user/owner interests, with "public machine learning" providing variably-trusted signals to user-owned algos for intelligence augmentation?

[1] https://news.ycombinator.com/item?id=8297996

pizza 21 hours ago 1 reply      
Hi Alan, how do you think that object-oriented programming and distributed computing will intertwine in the not-so-far future?
siteshwar 19 hours ago 1 reply      
Hi Alan,

You have an interesting reading list at http://www.squeakland.org/resources/books/readingList.jsp. However, it seems it was created a long time ago. Are there any other books that you would like to add to this list?

Atwood 1 hour ago 0 replies      
Is Duruflé's Requiem hindered or helped by a full pit? Does Chip excite you the way the OLPC XO did/does? Salutations/felicitations, and appreciation for the AMA.
snowwrestler 19 hours ago 4 replies      
I recall reading an article about 10 years ago describing a PARC research project in which networked computers with antennae were placed throughout a set of rooms, and the subject carried a small transmitter with them from room to room. As the computer in each room detected the transmitter, it triggered actions in each room. I think it was called "ambient computing."

Does this ring a bell for you? I have searched for this article recently and not been able to find it again.

diiq 20 hours ago 1 reply      
As a community, we often think on quite short time-scales ("What can we build right now, for users right now, to make money asap?"). I feel like you've always been good at stepping back, and taking a longer view.

So what should a designer or a developer be doing now, to make things better in 10 years, or 100 years?

psibi 20 hours ago 1 reply      
What do you think about functional languages like Haskell, OCaml etc ?
unimpressive 20 hours ago 0 replies      
Hi Alan,

One of the concepts I've heard you talk about before in interviews and the like is simulation. I think simulation is huge and we should be seeing products that cater towards it, but largely aren't.



Do you still think simulation is an important promise of the computer revolution, and are there any products you know of or ideas you have that are/would be a step in the right direction?

agumonkey 20 hours ago 0 replies      
Hi Sir Kay,

How do you feel about the role of computing technology in society today? Is it still important, or should we work on other domains (education, medicine, ecology, and the industrial tissue that is our interface to reality nowadays)?

While I'm at it, I just did a MOOC about Pharo (~ex-Squeak) and ST was indeed a very interesting take on OO (if I may say so ;). So thanks for your and your teammates' work over the years (from ST to STEPS).

Bystroushaak 18 hours ago 1 reply      
I have two questions:

1. It is known that you read a lot. Do you plan to write a book? You have been a big inspiration for me and I would love to read a book from you.

2. What is your opinion of the Self programming language (http://www.selflanguage.org)? I've read the STEPS Toward The Reinvention of Programming PDF and this feels related, especially with the Klein interpreter (http://kleinvm.sourceforge.net/).

s800 20 hours ago 1 reply      
Any conventional paradigms that you'd like to see retired? FS, Unix, signalling/messaging, etc.?
oooooppmba 20 hours ago 1 reply      
What is your one piece of advice to college students studying CS?
nextputall 2 hours ago 0 replies      
Hi Alan,

What do you think about the Newspeak Programming Language (http://www.newspeaklanguage.org)?

gnocchi 19 hours ago 1 reply      
Hi Alan,

I'm part of a generation who didn't grow up with the PDP but had LOGO and BASIC available on computers and calculators. With the Amstrad CPC it was possible to interrupt a program and change a few lines of code to make it do something else, which was a great way to stay interested. And with calculators it was possible to code formulas to solve/check problems.

But how would you teach programming today to a kid? Would you choose a particular medium such as a computer, a Raspberry Pi or even a tablet?

And if I may, do you recommend any reading for bedtime stories?

Thank you, Kevin

lispython 1 hour ago 0 replies      
Hi Alan,

Comparing how you think about doing research now with how you thought when you were starting out, what is the biggest change in your thinking?

brogrammer6431 20 hours ago 1 reply      
I remember a few weeks back, you said that you wanted to take a closer look at the Urbit project (www.urbit.org). Just wondering if you had gotten the chance to do so, and, if so, what your thoughts were.
filleokus 20 hours ago 1 reply      
Do you believe everyone should be taught, or exposed to, programming in school? I'm afraid that universal inclusion of programming in the curriculum would have the opposite effect and make the next generation despise programming, the same way some people feel about math today.
stop1234 5 hours ago 0 replies      
Hi Alan,

Thank you for spending some of your time here and writing your thoughts.

I would like to ask you for some advice.

The idiom "Everything old is new again" is currently picking up steam, especially in the hardware and software scene.

Amazing stuff is happening but it is being drowned in the mass pursuit of profit for mediocrity in both product and experience.

What would you say to those who are creating wonderful (and mostly educational) machines but finding it difficult to continue due to constraints and demands of modern life?

Most don't have the privilege to work at a modern day Xerox PARC. Then again there is no modern day Xerox PARC.

Thanks for all the inspiration!

pdog 20 hours ago 1 reply      

Computers have been a part of education in the United States for several decades, but you've argued that technology isn't used to its full potential.

1. Why has technology in schools failed to make an impact?

2. How would you design a curriculum for your children that uses technology effectively today?

lispython 1 hour ago 0 replies      
Hi Alan,

Have you ever made a serious mistake? One that, given the opportunity, you would start over with a different approach?

dookahku 19 hours ago 2 replies      
You invented a lot of what I'm using this very instant to compose this message.

I yearn to do great works of engineering and art, which I consider what you have done.

How do you come up with ideas?

kens 19 hours ago 1 reply      
Looking at your 1972 Dynabook paper [1], would you make any changes to the Dynabook vision now? Also, what do you see as the biggest missing pieces (software or hardware) today? What current software gets closest to the vision?

[1] Everyone should really take a look at the Dynabook paper: http://history-computer.com/Library/Kay72.pdf

qwertyuiop924 12 hours ago 1 reply      
Hey Alan,

You seem very disappointed and upset with the way computing has gone in the last decade. Speaking as a younger (15) and more satisfied (I still think the UNIX abstraction is pretty solid, despite what others may say) programmer, how do you not get depressed about the way technology is going?

Also, what do you propose to eliminate the "re-inventing the flat tire" problem? Should every programmer be forced through a decade of learning all of the significant abstractions, ideas, and paradigms of the last 50 years before they write anything? Because I don't see another solution.

vainguard 20 hours ago 1 reply      
How important is finding the right language?
dineshp2 5 hours ago 0 replies      
Hi Alan!

From your comments, it's clear that you are not happy with the state of programming languages as it stands.

You mentioned that the current languages lack safe meta-definition and also that the next generation of languages should make us think better.

Apart from the above, could you mention more properties or features of programming languages, at a high level of course, that you consider should be part of the next generation of languages?

tmerr 7 hours ago 1 reply      
Hi Alan,

I watched an OOPSLA talk where you described how tapes were used when you were in the air force. The tape readers were simple because they stupidly followed instructions on the tapes themselves to access the data. You seemed to like this arrangement better than what we have with web browsers and html, where the browser is assumed to know everything about the format. One way to interpret this is that we should have something more minimal like a bytecode format for the web, in place of html.

So I'm interested in your take: is WebAssembly a step in the right direction for the web? (Although it's not meant as a replacement for HTML, maybe it will displace it over time.)
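The tape arrangement described above can be sketched as data that ships its own decoder, so the reader needs almost no built-in knowledge of formats. This is a hypothetical toy in Python (the names and the "code first, payload second" convention are invented here), not how the air force tapes or WebAssembly actually work:

```python
def dumb_reader(message):
    """Knows nothing about payload formats: it just runs the decoder
    the message carries on the payload the message carries."""
    decoder_src, payload = message
    env = {}
    exec(decoder_src, env)     # the only fixed convention: code comes first
    return env["decode"](payload)

# A sender can ship any representation, as long as it ships the decoder too.
csv_message = (
    "def decode(payload):\n"
    "    return [line.split(',') for line in payload.splitlines()]\n",
    "alan,kay\nsmalltalk,parc",
)

print(dumb_reader(csv_message))   # [['alan', 'kay'], ['smalltalk', 'parc']]
```

Contrast this with a browser, which must have HTML, CSS, JPEG, and so on built in before it can interpret anything it receives.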

patrec 3 hours ago 1 reply      
Hi Alan (and others!),

Logo is ~50 years old now, Squeak ~20 and OLPC ~10. Do you know innovators who are now in their 20s, 30s and 40s and who at least partly credit their mental development to childhood exposure to Logo, Etoys, Mindstorms, Turtle Geometry, etc.?

olantonan 3 hours ago 0 replies      
Do you have an opinion on text-based vs visual programming languages? I think the latter is good for learning, but it feels impractical in my day-to-day job. Is there a sweet spot?
anildigital 20 hours ago 1 reply      
Do you think Java is an Object Oriented programming language?
spamfilter247 19 hours ago 1 reply      
Hi Alan. What are your thoughts on how rapidly GUIs are evolving nowadays? Many apps/services revamp their UI fairly often and this oftentimes hurts muscle memory for people who have just about gotten a workflow routine figured out.

Also, what big UI changes do you foresee in the next 10 years - or would like to see. Thanks.

wdanilo 17 hours ago 2 replies      
Hi Alan! I've got some assumptions regarding the upcoming big paradigm shift (and I believe it will happen sooner rather than later):

1. focus on data processing rather than imperative way of thinking (esp. functional programming)

2. abstraction over parallelism and distributed systems

3. interactive collaboration between developers

4. development accessible to a much broader audience, especially to domain experts, without sacrificing power users

In fact the startup I'm working in aims exactly in this direction. We have created a purely functional visual<->textual language Luna ( http://www.luna-lang.org ).

By visual<->textual I mean that you can always switch between the code and graph representations and back.

What do you think about these assumptions?

paulsutter 20 hours ago 0 replies      
Alan, what is your view of deep/machine learning as an approach to programming? The Deepmind Atari player is only 1500 lines of code, because the system learns most of the if/thens. Software is eating the world, but will learning (eventually) eat the software?
auggierose 20 hours ago 0 replies      
Hi Alan,

what do you think about interactive theorem proving (ITP)? Assuming that you are aware of it, is it something that you have tried? If yes, how was your experience, and which system did you try? If no, why not? What do you think about ITP's role in the grander scheme of things?

mempko 19 hours ago 1 reply      

You have inspired me deeply, thank you. I love working with man's greatest invention, but I have a deep sense of dread. HN is very good at projecting a fantasy about the future: that technology can solve all problems. I would love to see a world where people use computers to compute. However, global warming is a real threat, and my biggest fear is that our pop culture will prevent us from solving our problems before the chance to solve them is taken away from us.

With such a huge threat to humanity on the horizon, do you maintain a sense of optimism here? Or will humanity forget how to "compute" the same way Europeans forgot how to make Roman concrete?

nwmcsween 8 hours ago 0 replies      
Hi Alan, a few questions:

1. Do you have any recommended books to read?

2. Why do you think current programming paradigms are bad?

3. What changes to current operating systems need to happen?

Regarding [2]: my view is that you want to pass terse but informative information to a compiler in order for optimizations to take effect, and there are three roads programming languages take: abstracting by layering, which burdens the programmer with unraveling everything (C++); abstracting away from the hardware so much that specifics are hidden (most high-level languages); or something similar to C.

gorlist 1 hour ago 0 replies      
Can you name a few of today's Michelangelos?
jfaucett 20 hours ago 1 reply      
Hi Alan,

Do you think we'll ever have a programming language that isn't fully text based and gets closer to a direct mapping of our own thoughts than current systems? If so any ideas what properties this language would have?

kirkdouglas 20 hours ago 0 replies      
What are you working on now?
diiq 20 hours ago 1 reply      
How do you seek out the people you choose to work with, now or in the past? Is it an active process, or do you find interesting people naturally glom around a nucleus of interesting work?
lootsauce 18 hours ago 0 replies      
What is your take on the future of our general relationship with technology in the light of the new optimistic view of AI with recent advances in machine learning? I can't help but think we are over-estimating the upside, and under-estimating the problems (social, economic, etc.) much like the massive centralization of the net has had downsides hi-lighted by Snowden and others.
zyxzevn 14 hours ago 1 reply      
Hi Alan, since you may spend another day answering questions... I've got some more for you :-)

What do you think about the different paradigms in programming?

And what do you think about type theory, etc?

Bonus question: I am trying to develop a new general programming system for children. I was inspired by Smalltalk and ELM.

http://www.reddit.com/r/unseen_programming

It is a graphical system that uses function blocks connected with flow-logic. So basically it is functional, but much simpler. The function-blocks form a system very similar to classes/objects in Smalltalk. What do you think about such a system, or what tips do you have about designing a new language?

corysama 20 hours ago 1 reply      
What are the first 3 books I should read on the topic of teaching tech to kids? Thanks!
annasaru 16 hours ago 1 reply      
Hi Alan,

I am sometimes involved with mentoring younger people with STEM projects (Arduino etc.). It's all the buzz. But I heard one of my younger relatives lament the tendency of young people to gravitate towards a quantitative field of study/training; is there too much hype? "Learning to Code" is a general movement that is helping many youth improve their career prospects. Do you think it's being effective in improving education on a meaningful scale?

What kind of educational initiatives would you like entrepreneurs (of all shades) to come up with? Do these need to be intrinsically different in different parts of the world?

Finally, as a person who gets scared away by bureaucracy ("the school district"), what would you advise? School districts don't always make the best technology investments with precious dollars.

mathattack 19 hours ago 1 reply      
Hi Alan. A lot has been made of the visions that you and your colleagues had that ultimately became the fabric of our industry.

Did you have any ideas, predictions or visions which ultimately didn't play out? (And any ideas on why?)

Thank you very much for your contributions to our industry. Anyone blessed to be working in this field today owes you an enormous debt of gratitude. You have made a dent in the universe.

username3 18 hours ago 1 reply      
Is there any site that lists all arguments from all sides and reaches a conclusion? If they don't reach a conclusion, do they have an issue tracking system and leave the issue open for anyone to find easily and respond to?

Debates should have a programming language, have CI for new arguments, have unit tests to check logic, have issues tracked and collaborated on GitHub.

testmonkey 20 hours ago 1 reply      
What do you think about a "digital Sabbath," [1] specifically in the context of touchstones like:

Engelbart's Augmenting Human Intellect [2]
Edge's annual question, How is the Internet Changing the Way You Think? [3]
Carr's Is Google Making Us Stupid? [4]
...and other common criticisms of "information overload"

[1] http://www.sabbathmanifesto.org/
[2] http://www.dougengelbart.org/pubs/augment-3906.html
[3] https://www.edge.org/annual-question/how-is-the-internet-cha...
[4] http://www.theatlantic.com/magazine/archive/2008/07/is-googl...

melloclello 18 hours ago 0 replies      
Hi Alan, what do you think of the Unison project? [1]

On the surface it's a structured editor for a type safe language in which it's impossible to write an invalid program, but the author has some pretty lofty goals for it.

[1] http://unisonweb.org/2015-05-07/about.html

xt00 20 hours ago 0 replies      
Hi Alan, do you think privacy on the web should be guaranteed by design, or malleable such that in special cases the government can look up your Google searches and see if you follow terrorists on Twitter? When I say guaranteed by design, I mean: should people be creating a system to obfuscate, encrypt, and highly confuse the ability of people who wish to track/deduce what people are doing on the web?
OoTheNigerian 18 hours ago 1 reply      
Hi Alan,

I have read a lot about you and your work at Xerox.

Do you enjoy travel? What continents have you been to? What's your favorite country outside the US?

How many hours did you sleep per day during your most productive research years? Because I usually wonder how very productive people seem to achieve much more than others within the same 24 hours we all have.

Greetings from Lagos, Nigeria.

hydandata 18 hours ago 1 reply      
Hi Alan,

1. Do you think the area of HCI is stagnating today?

2. What are your thoughts on programming languages that encapsulate Machine Learning within language constructs and/or generally take the recent advancements in NLP and AI and integrate them as a way to augment the programmer?

ldargin 19 hours ago 1 reply      
What do you think of Chris Crawford's work on Interactive Storytelling? His latest iteration is "Siboot" http://siboot.org/

Note: He mentions being inspired for that from discussions with you at Atari.

testmonkey 20 hours ago 0 replies      
What role do people like Terry A. Davis (and his TempleOS) serve in imagining what's possible in computing? I'm thinking of Jaron Lanier's idea of society getting "locked in" after certain technical decisions become seemingly irreversible (like the MIDI standard).
ksec 6 hours ago 1 reply      
What do you think of Steve Jobs? And the current Apple? Did you have a meaningful friendship with him? Do you miss him? Any story to share?
torstenB 18 hours ago 1 reply      
Besides objects, one truly revolutionary idea in Smalltalk is the uniformity of its meta facilities: an object knowing about itself and being able to tell you.

I see so many dev resources burnt just because people build boring UIs or persistence bindings by wiring things MANUALLY in traditional languages. All this is a no-brainer when enough meta information (objects and relations) is available and a program is reflected as data, as in Smalltalk (not dead text). You can transform not only data but also your code. Pharo now takes some additional steps to enhance reflection (metalinks, slots, etc.).

What do you see as the next steps in using metadata for (meta)programming?
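The "no-brainer UI wiring" point above can be sketched even outside Smalltalk. Here is a minimal Python illustration, using only standard reflection (`vars`, `type`); the `Person`/`form_fields` names are invented for the example, not part of any real framework:

```python
class Person:
    """A plain object whose attributes double as UI metadata."""
    def __init__(self, name, age):
        self.name = name
        self.age = age

def form_fields(obj):
    """Derive a (field, type) description from an object's own metadata,
    instead of wiring each UI field by hand."""
    return [(field, type(value).__name__) for field, value in vars(obj).items()]

fields = form_fields(Person("Ada", 36))
# fields: [('name', 'str'), ('age', 'int')] -- enough for a generic form builder.
```

A generic form builder could render any such object from this description alone, which is the sense in which manual UI wiring becomes unnecessary once meta information is available.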

antoinevg 20 hours ago 1 reply      
Hi Alan,

Do you think we're yet at a position where we could catalog a set of "primitives" that are foundational to programming systems? (Where "systems" are fundamentally distributed and independent of software platform, programming language or hardware implementation)

IonoclastBrig 19 hours ago 1 reply      
I have been designing and hacking my own languages (to varying degrees of completion) for almost as long as I have been programming. A lot of the time, their genesis is a thought like, "what if language X did Y?" or, "I've never seen a language that does this, this, and that... I wonder if that's because they're insane things to do?"

When you're working on a system, how do you approach the question, "Is this really useful, or am I spinning my wheels chasing a conceit?" Is the answer as simple as trying it out to see what happens? Or do you have some sort of heuristic that your many years of experience have proven to be helpful?

icarito 20 hours ago 1 reply      
Hello Alan, in light of the poor results of the OLPC project (in reference to the Children's Machine), and leaving aside commercial factors, do you think the Sugar user interface is appropriate for the task? If not, how can it be improved? What is good/bad about it?


grincho 18 hours ago 1 reply      
I admire the compactness and power of Smalltalk. What advice would you give language designers looking to keep the cognitive load of a new language low? What was your design process like, and would you do it that way again?
GregBuchholz 18 hours ago 0 replies      
At the OOPSLA keynote speech in 1997, you mentioned that "The Art of the Metaobject Protocol" was one of the best books written in the past ten years. Any new candidates for "best" books?
rjurney 17 hours ago 1 reply      
How satisfied are you with the tablets that finally realized your vision (did they?) of a personal computer? How much were you able to infer about how they would work? Any lessons from this?
azeirah 19 hours ago 0 replies      
Hi Alan, this is a bit of a long shot, but I'd like to try anyway. I've been following CDG from early on, and am really interested in the exploratory research that's going on in there. I'm a 20 year old computer science student looking for an internship, would it at all be possible to pursue an internship at CDG? My primary selling point is that, given the right environment, I have a lot of motivation.

I understand this is not the right place to discuss these matters, but I know it's highly likely that this message will be read here, I am happy to take this topic elsewhere.

amasad 17 hours ago 0 replies      
Hi Alan,

You've been involved in visual programming environments like GRAIL and Etoys for kids. What do you think of the current state of visual programming for both kids and adults?

olantonan 6 hours ago 1 reply      
No language today is able to improve itself like Smalltalk was able to. That's pretty sad, wouldn't you say?
stuque 20 hours ago 1 reply      
What language do you think we should teach first to computing major students these days? What about non-major students?
bachback 20 hours ago 1 reply      
What do you think of Bitcoin and use of computers for money and contracts?
mythz 19 hours ago 1 reply      
Hi Alan,

You've been a long-time proponent of creating educational software (e.g. Squeak Etoys) to help teach kids how to program, and have been fairly critical of the iPad in the past. What are your thoughts on Apple's new iPad Swift Playgrounds (http://www.apple.com/swift/playgrounds/) for teaching kids to program in Swift?

Do you think UI aesthetics are important in software for kids?

gbenegas 19 hours ago 1 reply      
Hi Alan, I'm a CS student thinking about graduate school.

1. Would you suggest going into a popular field that excites me but already has a lot of brilliant students? (for example AI and ML)

Or rather into a not-so-popular field where maybe I can be of more help? (for example computational biology)

2. If I had to choose between studying my current favorite field at an average research group, or another still interesting field with a top group, would you suggest going with the latter for the overall learning experience?

testmonkey 20 hours ago 1 reply      
If you were to design your own high school curriculum for gifted students, what would it look like?
smegel 17 hours ago 0 replies      
What do you think about 4GLs - do you think they still hold any promise and/or represent a solution to today's language woes?
state_less 18 hours ago 1 reply      
When will we get better at saying what we mean? I don't think this is just important when speaking with computers, but also in human-to-human interaction.

What is the best interface for computer programming? I have settled on the keyboard with an emacs interpreter for now, but I'm curious if you believe voice, gestures, mouse or touch are or will be better ways of conveying information?

smd4 20 hours ago 1 reply      
Hi Alan - the innovation from PARC appears to be the result of a unique confluence of hardware, software, market forces, recent government research investment, and Michelangelo-level talent for bringing big ideas to fruition.

Do you think that any factors that were significant back then are going to be difficult to reproduce now, as HARC gets started? Conversely are there novel aspects of today's environment that you wished for at PARC?

poppingtonic 17 hours ago 1 reply      
Alan, thank you for doing this AMA!

I would love to hear your thoughts on how to "train" "System 1", in order to make "System 2" more powerful. Not necessarily here, due to the time factor, but if you find some time to think more deeply on this, please let me know and we can think through this together.

david927 20 hours ago 1 reply      
You have stated before that the computer revolution hasn't happened yet. It seems we stopped trying in earnest back in the early 1980's. Why?

And what could be done to re-spark interest in moving forward?

My gut feeling says that it would require a complete overhaul in almost every layer in the stack we use today and that there's reluctance to do that. Would you agree to some degree with that?

noobermin 20 hours ago 1 reply      
It seems that dynamic or at least sloppily typed languages like JavaScript and Python have become more and more popular. Do you think typeless/dynamic languages are the future? I personally really like "classless OOP" [0].

[0] https://www.youtube.com/watch?v=PSGEjv3Tqo0&t=6m
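The "classless OOP" of the linked talk builds objects from closures rather than class hierarchies. The same idea can be rendered in Python; this is a toy sketch, and the `make_counter` example is invented for illustration:

```python
def make_counter(start=0):
    """Build an object out of closures over shared state -- no class involved."""
    state = {"count": start}

    def increment():
        state["count"] += 1
        return state["count"]

    def value():
        return state["count"]

    # The returned dict *is* the object; its "methods" share `state` privately.
    return {"increment": increment, "value": value}

c = make_counter(10)
c["increment"]()  # 11
```

There is no class, no inheritance, and no way to reach `state` except through the two closures, which is the encapsulation argument the talk makes for this style.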

yan 20 hours ago 0 replies      
What most recent topic in the field of programming languages, or computing more broadly, have you changed your mind about in a substantial way?
duck 18 hours ago 0 replies      
Hi Alan, how do you keep up with technology/news? Do you subscribe to any newsletters? Visit HN regularly?
huherto 20 hours ago 1 reply      
Any advice for software engineers in the middle of their careers? How can they find challenges and leverage their experience?
adamgravitis 20 hours ago 1 reply      
Hi Alan,

I've heard you frequently compare the OOP paradigm to microbiology and molecules. It seems like even Smalltalk-like object interactions are very different from, say, protein-protein interactions.

How do you think this current paradigm of message-sending could be improved upon to enable more powerful, perhaps protein-like composition?

olantonan 20 hours ago 0 replies      
There are still no good languages for young kids, in my opinion. Should we bring LOGO back, including a turtle robot?
bouh 18 hours ago 0 replies      
Dear Alan,

What do you think about the EAST paradigm, which tries to revamp the original spirit of OOP as you stated it?

Do you think the machine learning community suffers from the syndrome of "normal considered harmful"? Like using vendor hardware instead of designing their own (FPGAs, for instance)?

sandgraham 20 hours ago 1 reply      
Hello Mr. Kay, are you still going to be active with VRI or CDG now that HARC has formed?


I once ran into you in Westwood and you invited me to check out the CDG lab. Unfortunately I missed you when I came by. I'm always tempted to try again, but I'd hate to interrupt the serious thinking of the fellows stationed up there.

josephhurtado 20 hours ago 1 reply      

What do you think the impact of cognitive computing and AI will be on the way software is built in the next 5 years?

Do you think AI may automate fully some jobs, or part of the jobs people do today in IT & Software Development?

If so what do you think is the best approach professionals should take?

Thanks in advance for the answer, and thanks for doing this AMA.


Dangeranger 19 hours ago 1 reply      
Hello Alan,

Something that I find striking about you and your work is your cross discipline approach to hardware, software, and "humanware".

Can you speak about people and subjects from fields other than computer science that have inspired you, and how they have changed you as a person and technologist?

ehudla 20 hours ago 1 reply      
What are your thoughts on work/life balance in the computing industry? On growing older in our industry?
syngrog66 17 hours ago 1 reply      
hi Alan!

Q: I've always been a big fan both of text-console UIs like CLIs and REPLs and of GUIs. In my mind they each clearly have a different mix of strengths and weaknesses. One way a user might get a bit of the "best of both worlds" is an app or client featuring a hybrid input design where all three of these modes are available for the user to drive. Any thoughts on that?

I'm writing a paper in my free time about some architectural ideas in this area and would love to hear your thoughts. Feel free to tell me this is a FAQ and that I should go read a particular book/paper of yours, and/or to get off your lawn. :-)

thank you!

wslh 18 hours ago 1 reply      
Hi Alan, are you envisioning a way to participate in or connect to YC Research as an independent researcher? I don't mean as an associate, since many of us have our daily focus on startups, but as a place where our ideas and code would be better nurtured.
msutherl 16 hours ago 1 reply      
Hi Alan,

I'd like to go deeper into your notion of "pop cultures" vs. "progress", in the context of innovation, but also the arts. Can you recommend some readings that might fill out those concepts?

anildigital 20 hours ago 0 replies      
What do you think of statement "Erlang is the only true object oriented language in use today."?
westoncb 19 hours ago 1 reply      
Hi Alan,

I'm curious whether you think it might be an important/interesting direction for program editors to depart from character sequence manipulation to something along the lines of AST editors. Or is this only a red herring, and perhaps not so deep a change?

quakeguy 16 hours ago 0 replies      
He is a great guy; I just want to thank him via this text field I am given.
blendo 20 hours ago 0 replies      
Any thoughts on UC Berkeley's "Beauty and Joy of Computing"? (http://bjc.berkeley.edu/)

Should AP's new "CS Principles" course count towards the math requirement for college admission?

pierre_d528 20 hours ago 0 replies      
Thank you very much for all you have done and will do.

How can we apply to join the HARC and make the Dynabook a reality?

cardmagic 20 hours ago 0 replies      
Why haven't machine learning and neural networks been applied to programming languages with as much interest as human languages? Wouldn't AI augmentation of writing computer code lead to faster breakthroughs in all other fields within computer science?
logicallee 20 hours ago 2 replies      
1. What do you wish someone would ask you so that you could finally share your thoughts, but nobody has broached as a subject?

2. (This question is about interactive coding, as a dialogue).

Human dialogs (conversations) are interactive. I think in the past computers were limited, and computer languages had to be very small (as compared with any human language + culture) so that a programmer could learn what the computer could do. But now that services can be connected (programming as a service?), would it make sense to have a dialogue? My example is that in the 1980s it wouldn't have made sense for any programming language to have a function called double() that just multiplies by 2. There's * 2 for that.

But in 2016, it makes sense for a beginner to write "and double it" and considerably less sense for a beginner to have to learn x *= 2 if they wanted to double a number.

Human language is also ambiguous. It would make sense for an interactive language to ask:

"Did you mean, set x equal to x multiplied by 2?" which most people would select, but maybe someone would select

"Did you mean, set x equal to the string "x" appended to the string "x"?"

For these reasons: do you think it would make sense to have an interactive programming language that is connected with a server you "talk" with interactively?

Or should programmers still have to learn a fixed programming language that has no room for interpretation, but instead a strict meaning?

Among other things, this means programmers can never write "it", "that", "which" to refer to a previous thing (since the referent could be ambiguous if the compiler doesn't confirm.) But every human language includes such shorthand.

I'd love to hear your thoughts regarding a connected, interactive programming process similar to the above (or just on whatever lines).
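A toy sketch of the disambiguation dialogue described above; everything here (the phrase table, `interpretations`, `resolve`, the candidate readings) is hypothetical, not any real system:

```python
def interpretations(phrase, var):
    """Enumerate plausible machine readings of an ambiguous instruction."""
    if phrase == "double it":
        return [
            (f"set {var} to {var} multiplied by 2", lambda x: x * 2),
            (f"set {var} to the string {var} appended to {var}",
             lambda x: str(x) + str(x)),
        ]
    return []

def resolve(phrase, var, choose):
    """Present the candidate readings to the user and apply the chosen one.

    `choose` stands in for the interactive "Did you mean...?" prompt."""
    options = interpretations(phrase, var)
    description, action = choose(options)
    return action

# A user who meant arithmetic doubling picks the first reading:
double = resolve("double it", "x", lambda opts: opts[0])
double(21)  # 42
```

The interesting part of the proposal is that `choose` would be a live dialogue with the programmer (possibly backed by a server), so ambiguity becomes a conversation rather than a compile error.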

collint 20 hours ago 0 replies      
Hello Alan,

I'm curious if you've read much about Activity Theory. (in particular, Yrjö Engeström's Learning by Expanding.) I feel like it's compatible with much of what I've heard you discuss in lectures. Is it something you have an opinion on?

acd 19 hours ago 0 replies      
Hi Alan!

Biggest thanks for helping create the modern computer and its peripherals and for advocating programming for children! Computing is what I enjoy most as a hobby and what I make my living from.

What is your vision for the future of computing?

man2525 20 hours ago 0 replies      
Is a web browser sufficient to provide rich and meaningful experiences on the Internet?
duncanawoods 20 hours ago 0 replies      
Hi Alan,

What's the next step to improve remote working? Face to face still seems to be so superior for relationship building and problem solving, despite the wealth of video conferencing, social, and collaboration tools we have. I don't want to wear goggles...


alehander42 20 hours ago 0 replies      
Hi Alan,

Do you think the object ~ biological cells metaphor can be related somehow to automated programming using GP or neural networks? (I've sometimes imagined neural networks as networks of many small objects with probability-based inheritance)

mythz 19 hours ago 0 replies      
Do you still code today? If so what's your preferred language, editor, OS?
slrigevol 11 hours ago 0 replies      
This is a spectacular thread. I gave up everything and traveled to the U of U in 1976 because alankay1 had done his thesis there. They got so much right in such a short time: Eliot Organick (Multics), Tony Hearn (Reduce, symbolic OS for the TI 92, 89). All inspired by Alan Kay.
kafkaesq 20 hours ago 0 replies      
So what do you think of Scala?
alehander42 16 hours ago 0 replies      
Is syntax important?

Do you imagine a future where there would be just several programming languages semantically using a lot more bidirectional "syntax skins"?

Ericson2314 20 hours ago 0 replies      
Alan, while trying to come up with a good question, I learned you are a musician. Great! As a fellow musician (also jazz and classical) I'm curious whether you feel this has influenced your engineering.
mti27 20 hours ago 0 replies      
What TV show or movie have you seen that has realistically portrayed advanced computer technology, or is growing into it? In other words, now that we have Amazon Echo is the Forbin Project more realistic?
arkj 16 hours ago 0 replies      
Consider a kid starting to learn programming: which language would you suggest they learn first?

Also, is there a minimal list of must-know languages?

rudedogg 19 hours ago 0 replies      
What are your favorite talks you've given? Can you link to the videos if they were recorded?

I enjoy watching your presentations, but I'm sure there are some I've missed.

corysama 20 hours ago 0 replies      
Could you recommend a small number of historic papers in computer science for undergrads to read so that they can have a bit more context for the state of modern tech? Thanks!
lpalmes 20 hours ago 0 replies      
Hi Alan, if you are familiar with Go, what do you think about its simplicity as a language? Is it something other languages should start thinking about in their design?
pnathan 16 hours ago 0 replies      

I'm curious what you think the most interesting line of research is today in the 'computering' world.

Thanks for taking the time to do this Q&A.


childintime 18 hours ago 0 replies      
Hi Alan,

What skills would your (hypothetical?) apprentice need to have?

If this were more like a partnership what would be the subject to work on?

For that matter, what are you working on now?

alehander42 20 hours ago 0 replies      
Do you think artificial human languages designed with strong logical rules (Lojban...) can be successful?

Do you think they could act in an à-la-Newspeak (the 1984 Newspeak) way?

mbrock 20 hours ago 0 replies      
Do you recall any interesting work on discussion forums or alternatives to them for promoting collaborative thinking?

Or for another approach, how do you like HN?

olantonan 17 hours ago 2 replies      
Does it suck getting old for you? Do you have the stamina to make new stuff?

I'm old; it's very hard to stay on top of all the changes.

ldargin 19 hours ago 0 replies      
Do you consider the recent advances in AR/VR a useful trend, or is their emphasis on spatial movement mostly superfluous?
musha68k 20 hours ago 1 reply      
How can I get my thinking out of the/my box?
agentgt 19 hours ago 0 replies      
What other interests do you have that are not technology related. For example what kind of music do you like? Do you like art?
dillonforrest 20 hours ago 0 replies      
What are your pet peeves within your field of work?
AndrewCrick 18 hours ago 0 replies      
Hi Alan,

I came up with an idea that seems a bit like the Dynabook. It helps the user to understand design decisions. Here's a short video about it (under 2 mins):


In this case it's about how to build a digger.

I'd love to know what you think about it.

bitmadness 18 hours ago 0 replies      
Hi Alan,

I'm a CS PhD student at Caltech. What advice do you have for young computer scientists, especially for PhD students?

testmonkey 20 hours ago 1 reply      
Any memories or thoughts about Gregory Bateson?
ehudla 20 hours ago 0 replies      
Do you like the culture of Silicon Valley?
buzzkills 20 hours ago 0 replies      
In terms of real, in use, user interfaces, what do you think are the best examples? What do you like about them?
atarian 18 hours ago 0 replies      
What is your stance on the future of AI? Is it something we should be concerned about?
osense 19 hours ago 0 replies      
What is your opinion on the so-called Function-level programming, and languages such as J?
nekopa 16 hours ago 0 replies      
Hi Alan,

What is your view on Literate Programming and why it hasn't taken off (yet)?

brebla 20 hours ago 1 reply      
What impresses you the most about American free enterprise? What most disappoints you about it?
mej10 20 hours ago 1 reply      
What is your recommendation to someone wanting to get into the kind of research you do?
olantonan 20 hours ago 0 replies      
I'm no doubt your biggest fan. What do you think of the Simula inventors' work?
anildigital 20 hours ago 0 replies      
Statically typed programming languages or Dynamically typed programming languages?
huherto 20 hours ago 0 replies      
How do you become a lifelong learner? How do you stay excited about the future?
cardmagic 20 hours ago 0 replies      
What do you believe that many programmers you know don't agree with?
0xdeadbeefbabe 20 hours ago 1 reply      
Well, what is thinking about then? What was the mistake the Greeks made? In this video [0] you said thinking is not about logic, and that that was the mistake the Greeks made.

[0] https://youtu.be/N9c7_8Gp7gI?t=45m45s

miguelrochefort 20 hours ago 1 reply      
Do you believe that the gap between consuming software and creating software will disappear at some point? That is, do you expect we will soon see some homoiconic software environment where the interface for using software is the same as the interface for creating it?

I feel like the current application paradigm cannot scale, and will only lead to further fragmentation. We all have 100+ different accounts, and 100+ different apps, none of which can interact with each other. Most people seem to think that AI will solve this, and make natural languages the main interface to AI, but I don't buy it. Speech seems so antiquated in comparison to what can be achieved through other senses (including sight and touch). How do you imagine humans will interact with future computer systems?

drzaiusapelord 20 hours ago 0 replies      
How has your relationship with technology changed, especially in regard to its use politically and socially, as you've gotten older?
jyotipuri 20 hours ago 0 replies      
Hi Alan,

What do you find most frustrating about software development these days?


akeck 19 hours ago 0 replies      
What's your most successful problem solving technique?
pyed 20 hours ago 1 reply      
Do we "really" need more programming languages?
icc97 18 hours ago 0 replies      
What do you do to keep focus during the day?
Adam-Kadmon 20 hours ago 0 replies      
What is the best language to learn OOP concepts?
BrutallyHonest 17 hours ago 0 replies      
What is the Actor Model lacking?
chews 19 hours ago 0 replies      
Given that tablets have lived up to the Dynabook concept, what do you think about seeing 3 year olds with iPads?
kev009 20 hours ago 0 replies      
What is your opinion on Operating Systems research and industry? I find the Linux monoculture tiresome.
0xdeadbeefbabe 20 hours ago 1 reply      
I get the impression from the book Dealers of Lightning that Bob Taylor played an indispensable role in creating Xerox Parc. What are the Bob Taylors of today up to, and why aren't they doing something similar?

Edit: just noticed HARC and YC-Research. I'll check it out.

seccess 20 hours ago 0 replies      
Are you familiar with (the programming language) Go? What do you think of Go's approach to objects?
philippeback 20 hours ago 0 replies      

What do you think of the Pharo project?

miguelrochefort 20 hours ago 1 reply      
What are your thoughts on the Semantic Web? Why do you think it hasn't succeeded yet?
EGreg 16 hours ago 0 replies      
What is Alan Kay doing these days?
dredmorbius 17 hours ago 0 replies      
Do you have any thoughts, or favourite authors, on the topic of technology and innovation, and on the process of innovation specifically?

I've been particularly interested lately in the works of the late John Holland, W. Brian Arthur (of PARC & Stanford), J. Doyne Farmer, Kevin Kelley, David Krakauer, and others (many of these are affiliated with the Santa Fe Institute).

In particular, they speak to modularity, technology as an evolutionary process, and other concepts which strike me being solidly reflected in software development as well. Steve McConnell's Code Complete, for example, first really hammered home to me the concept of modularity in design.

samirm 17 hours ago 0 replies      
Hi Alan,

Tabs or spaces?

BrutallyHonest 19 hours ago 0 replies      
Hi Alan,

Could you please answer:

1) What is your opinion about Actor Model? Does it have a potential? What is the next step for OOP?

2) Do you think software of the future should be end-user modifiable?

3) What would be the Dynabook of 2016? Smart contact lenses with a gesture interface?

Thank you very much!

0xdeadbeefbabe 19 hours ago 0 replies      
Did you guys ever talk about Man-Computer Symbiosis in terms of the computer unfairly benefiting some men over other men?

One example could be, give me money and I'll give you a computer that can translate English to Spanish.

Another example could be, Apple shareholders profit from iPhone sales, and the iPhone UI leads naive/normal people to think texting while driving is OK.

jsprogrammer 19 hours ago 0 replies      
You say the problem with Xerox is that they were only interested in billions (instead of trillions).

Should we currently be interested in quadrillions, upper trillions, or, perhaps, larger? Once we become interested in an appropriately large number, what preparations should we be taking so that we can operate at that level? Do we just start putting product out there and collect the value on the open markets, or, do we need to segment markets to maximize value? Can you tell us about any other mistakes you feel Xerox might have made in realizing the value of PARC?

nxzero 19 hours ago 0 replies      

What research have you been a part of that is the most promising, yet least known, and why do you feel it failed to become more well known?

miguelrochefort 20 hours ago 0 replies      
What are your thoughts on Ethereum and DAOs (Decentralized Autonomous Organizations)? Do you believe they will lead to a new way to think about and distribute software? It kind of reminds me of the "fifth generation computer", with constraint/logic programming, smart contracts and smart agents.
skull205485 15 hours ago 0 replies      
i need help because i do not know how to hack so can you help me?
alankay1 20 hours ago 4 replies      
-- I was surprised that the HN list page didn't automatically refresh in my browser (seems as though it should be live and not have to be prompted ...)
olantonan 19 hours ago 2 replies      
Reddit commenter implying this AMA is fake. How are HN accounts verified? How do we know this is a real AMA? Just curious.
Spaceship Generator github.com
843 points by mnem  2 days ago   128 comments top 34
daredevildave 2 days ago 7 replies      
Sadly it doesn't seem to generate UVs for the models. But using the cube projection in Blender seems to do an OK job.

Add a skybox and a couple of simple particle effects in PlayCanvas.

And we have some WebGL spaceships :-)


andreasklinger 2 days ago 9 replies      
I love that the ships look more like "skyscrapers in space" than the airplane-like spaceships we usually see in TV shows/movies.

This is most likely (according to multiple hard sci-fi authors) a much more realistic depiction of what spaceships are going to look like.

Example given: The Expanse - "Flip and Burn" https://www.youtube.com/watch?v=X4EiW1bHwsQ

maxander 2 days ago 4 replies      
I saw the headline and initially assumed it would be some droll Game of Life invention. Then I thought, no, maybe it's some automatic sci-fi art generation script. And I was rewarded!

Procedural generation like this is quite possibly key to the future of indie games: if you don't have the team to design large sets of art assets, it's important to be able to put something pretty out there using your own wit. (A good example would be No Man's Sky.)

duncanawoods 2 days ago 7 replies      
I don't really like most procedural generation. It has no meaning and the results are not intellectually stimulating. At best you can't spot the pattern and parameters but you usually can after a few examples.

An idea I am more interested in is that you generate requirements and use an optimiser to solve the actual design. This way, there is a hidden "why". With some study, a human might be able to discern why x is so thick or why A is attached to B. When a design has a use in mind then it has meaning.
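
For what it's worth, the "generated requirement plus optimiser" idea fits in a few lines of plain Python. Everything here (the strength/mass models and their constants) is made up for illustration; the point is only that the design's "why" is hidden in the generated requirement:

```python
import random

def solve_design(required_strength, rng, steps=2000):
    # Crude random-search "optimiser": find the lightest strut that
    # still satisfies a randomly generated requirement.
    best = None
    for _ in range(steps):
        thickness = rng.uniform(0.1, 5.0)
        strength = 40.0 * thickness ** 2   # toy strength model
        mass = 7.8 * thickness             # toy mass model
        if strength >= required_strength and (best is None or mass < best[1]):
            best = (thickness, mass)
    return best

rng = random.Random(7)
requirement = rng.uniform(50.0, 500.0)   # the hidden "why" behind the design
thickness, mass = solve_design(requirement, rng)
```

A viewer studying many such outputs could, in principle, reverse-engineer why one strut is thicker than another, which is exactly the "meaning" missing from purely random generation.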

intrasight 2 days ago 1 reply      
I find the arguments of procedural vs artistic content somewhat humorous, given that artistic content is just an attempt to create a facsimile of actual natural and human-generated artifacts. The natural world is by definition procedural (unless you believe in a Creator). Human artifacts have some artistry, but only in the broad strokes. Most buildings, and most details of architected buildings, follow a procedural semantic. The natural decay of human artifacts follows a procedural semantic.

I assume that in the near future of VR few will play "AAA games" (and hence few will exist) because they won't be able to compete with free procedurally generated environments.

smilekzs 2 days ago 0 replies      
I find Blender quite handy when it comes to (randomized/parameterized) generation of 3D objects for the purpose of rendering, although I've only done really simple things. However the API, while in Python (which is good), feels very unpythonic and clumsy, all while being severely under-documented. It would be really nice if the API could be cleaned up in a future major release...
sandworm101 2 days ago 0 replies      
Cool, but the ships are all rather samey. They all spring from the root hull, with a main body being longer than it is wide. Perhaps that is needed to conform with our terrestrial concept of "ship". But I think it could be improved by randomizing the number or shape of starting hulls. It would also be interesting to see what could be done by extruding spherical shapes rather than boxes.
mortenjorck 2 days ago 2 replies      
The selected examples got me thinking: Those eight are presumably examples the author felt turned out particularly well. Could one build a neural network trained on generated ships that "turned out well" to automatically generate better-looking random ships?

Could this sort of process be used in games where procedural generation might otherwise be rejected because it looks "too random"?

Tloewald 2 days ago 1 reply      
Very cute, although the ships look sameish (it would probably be interesting to start with different base templates, etc.).

Still, it's only a few steps away from "random British 1970s SF book cover".

tenpies 2 days ago 0 replies      
The extreme examples remind me so much of the Shivan ships in the Freespace series.

I don't think the script replaces a professional designer, but this is awesome for brainstorming ship ideas.

sargun 2 days ago 2 replies      
My physics knowledge is pretty weak. Wouldn't you want a spaceship to be closer to something like an oblate spheroid? A lower surface-area-to-volume ratio presumably makes for cheaper, lighter spaceships? The primary thing I'm unsure of is steering, but how much of a problem can that be?
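
A quick back-of-the-envelope check in plain Python bears the intuition out: for the same enclosed volume, a sphere needs noticeably less hull area than a cube:

```python
import math

def cube_surface(volume):
    side = volume ** (1 / 3)   # side length from volume
    return 6 * side ** 2

def sphere_surface(volume):
    # radius from V = (4/3) * pi * r^3, then area 4 * pi * r^2
    r = (3 * volume / (4 * math.pi)) ** (1 / 3)
    return 4 * math.pi * r ** 2

V = 1000.0
print(f"cube:   {cube_surface(V):.1f}")    # 600.0
print(f"sphere: {sphere_surface(V):.1f}")  # ~483.6, about 19% less hull
```
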
DanBC 2 days ago 1 reply      
Something that did this in Lego Digital Designer would get considerable interest.
eggy 2 days ago 0 replies      
Great work! I love a lot of the ships you show. I'll have to look at the code next weekend. I liked the city-building scripts from years ago in Blender too. Fun stuff to create a ton of assets automatically.

I was working on a procedural art generator in Blender in 2006, and I tried to use genetic programming written in Lisp to feed random parameters into a fixed generator I had written in Python, copied from a parametric formula renderer whose original author I can't remember. I couldn't get it to work well, and got sucked into Processing shortly thereafter, and other things Blender.

You could meld your script with a genetic program to present quick renders it evolves, and use neural nets to drive towards what you like, and away from what you don't, to evolve a design. This cuts the search space, and thus the time, down compared to a purely random form generator, in producing images you may want.

You've killed my next weekend!

Again, great work!
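
A minimal sketch of that selective-breeding loop in plain Python. The genome fields below are invented for illustration (the actual script derives everything from a single seed string); the user's eyeballing of renders stands in for a fitness function:

```python
import random

def random_genome(rng):
    # Hypothetical parameter genome for a ship generator.
    return {
        "hull_segments": rng.randint(3, 12),
        "asymmetry": rng.random(),
        "greeble_density": rng.random(),
    }

def mutate(genome, rng, rate=0.3):
    # Copy the genome and randomly nudge some of its fields.
    child = dict(genome)
    for key in child:
        if rng.random() < rate:
            if isinstance(child[key], int):
                child[key] = max(1, child[key] + rng.choice([-1, 1]))
            else:
                child[key] = min(1.0, max(0.0, child[key] + rng.gauss(0, 0.1)))
    return child

def breed(selected, population_size, rng):
    # Next generation: mutated copies of the designs the user liked.
    return [mutate(rng.choice(selected), rng) for _ in range(population_size)]

rng = random.Random(42)
population = [random_genome(rng) for _ in range(8)]
# Pretend the user picked designs 0 and 3 after eyeballing the renders.
next_gen = breed([population[0], population[3]], 8, rng)
```

Swapping the human picks for a trained preference model gives the neural-net variant suggested above.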

100ideas 2 days ago 0 replies      
Form follows function! No wait... I think I have it backwards
noahbradley 2 days ago 1 reply      
Awesome stuff. Would love to see concept artists incorporating procedurally generated assets in their workflow. Produce 100 samples like this, teach the computer which ones they prefer, produce 100 more, take a few and refine them by hand.
zeristor 1 day ago 0 replies      
Has anyone 3D printed one of these models?


wlievens 2 days ago 1 reply      
Thing is, /r/proceduralgeneration is running a monthly challenge for exactly this right now, but I haven't seen this listed as an entry yet. Check out the previous entries; some of them are neat (others less so).
zeristor 2 days ago 0 replies      
Dawkins wrote the biomorph software to demonstrate evolution, you got to select squiggles and breed them.

What would be cool is to evolve spaceships by selective breeding.

No doubt this being The Internet this was done by a Russian seven years ago, and I failed to GTFA.

smcameron 2 days ago 1 reply      
When I tried to run it, I got:

  Traceback (most recent call last):
    File "/spaceship_generator.py", line 737, in <module>
    File "/spaceship_generator.py", line 711, in generate_spaceship
  AttributeError: 'BevelModifier' object has no attribute 'offset_type'

I ran it with Blender 2.69

agumonkey 2 days ago 0 replies      
Very macross of you.
kordless 2 days ago 0 replies      
Someone should be working on spaceship guidance software with UI interfaces for humans. We're going to need it.
laretluval 2 days ago 0 replies      
I'm delighted by how little code this is!
hetfeld 2 days ago 1 reply      
There is an error in script.

  seed = 'tweer'
  obj = generate_spaceship(seed)

Python can't redefine functions as variables. Seed is a function.

wtbob 2 days ago 0 replies      
This is awesome. Thanks for sharing!
gravypod 2 days ago 0 replies      
If this works as "advertised" then this is amazing. I'd love to see a game implement this.
hobo_mark 2 days ago 0 replies      
For a second I had to look if this was from the "limit theory" guy but no, RIP LT.
hydroo 2 days ago 0 replies      
Awesome work! Now I need a website where I can just generate them and view them in WebGL.
lifeisstillgood 2 days ago 3 replies      
Oh wow - I can procedurally generate 3D models ?!!! Blender tutorial needed :-)
stretchwithme 2 days ago 0 replies      
I'm guessing not the algorithm used to create the Destiny.
personjerry 2 days ago 3 replies      
Why are there windows on a spaceship?
throwaway_fish 2 days ago 0 replies      
Could you do procedural cars?
planteen 2 days ago 0 replies      
This is very impressive. Nice work.
dave2000 2 days ago 3 replies      
> Start with a box.

And end with some boxes stuck together.

The Monaco Code Editor microsoft.github.io
680 points by algorithmsRcool  21 hours ago   157 comments top 26
tiles 20 hours ago 4 replies      
Does anyone know on a technical level why the Monaco editor feels so much faster than the Atom editor? Is there any mechanism Microsoft is employing that Atom could adopt, or are the two editors that fundamentally different?
satysin 20 hours ago 2 replies      
Sigh I select C and it gives me C++. That perfectly sums up Microsoft's attitude to C :(
tracker1 21 hours ago 1 reply      
It's pretty nice seeing the work to bring this out of VS Code. IIRC it started as a separate project for Visual Studio Online, but was embedded/enhanced as part of VS Code.

Either way, it's definitely one of the better performing code editors in JS/Browser usage. For that matter VS Code works surprisingly well compared to Brackets and Atom.

lobster_johnson 18 hours ago 1 reply      
This looks pretty great. I've been frustrated trying to implement ACE in a project for Markdown support; it turned out to not work on iOS and Android at all, and it has a ton of bugs elsewhere, too. I ditched it for CodeMirror, which turned out to be nearly as bad on mobile.

A quick test shows that Monaco does work on iOS, although there's apparently no selection support within the editor. Surprisingly, double space produces "." as it should, but it seems iOS autocompletion doesn't work (not sure if it can be enabled).

numlocked 19 hours ago 1 reply      
I maintain an open source project for writing SQL queries[0] that currently uses codemirror. I find it a little sluggish to load. This looks like it could be a good option -- the one thing that would make it a codemirror killer for me is the ability to resize the editor window; something text areas obviously natively support, but codemirror does not. Any idea if Monaco does?

[0] https://github.com/groveco/django-sql-explorer

amasad 19 hours ago 1 reply      
The JavaScript IntelliSense support looks really solid. Does it use TypeScript under the hood for type inference?

Also, any plans to add intellisense support for other languages?

kentor 20 hours ago 1 reply      
how does this compare to ace editor[0]?

[0]: https://ace.c9.io/

pkill17 20 hours ago 2 replies      
In their diff example, line 33 on the left and 35 on the right are shown to be unchanged, however the indent isn't the same... Seems like they've hardcoded this example incorrectly unless I'm missing something? Left line 32 is removing a bracket at indent level 2, left line 33 is unchanged bracket at indent level 1, but there's now one less bracket at level 1 in the right side, even though no bracket at level 1 was removed?
rattray 20 hours ago 4 replies      
Very impressed with the diff-editing feature.

... turns out there's a similar package for Atom: https://atom.io/packages/split-diff

Mahn 15 hours ago 0 replies      
Surprisingly responsive for a web based editor, really well done. I guess it's time to give VS Code a try!
rsrsrs86 15 hours ago 0 replies      
Just downloaded Code. Looks pretty fast. Really like the integrated terminal!
ausjke 15 hours ago 0 replies      
https://ace.c9.io/ ACE has been the choice for many; how does Monaco compare to it?
Secretmapper 21 hours ago 5 replies      
Even Microsoft doesn't use IE. Their GitHub screenshot uses Google Chrome... in incognito.

EDIT: I'm actually getting downvoted. This statement is just made in jest people, chill :)

nilved 21 hours ago 2 replies      
Interesting choice for the name since that is Apple's monospace font.
petemill 17 hours ago 0 replies      
Interesting choice of names - Monaco was the name[1] of their web-based Visual Studio editor that looked very similar to Visual Studio Code, except it came a couple of years earlier. It's pretty clear it's the same code, evolved.

[1] https://dzone.com/articles/first-look-visual-studio

GreaterFool 11 hours ago 1 reply      
I just tried the editor in Opera and I have a strange issue where the cursor inverts and turns gigantic when I hover over some elements (line numbers for instance)
deanclatworthy 18 hours ago 0 replies      
Anyone got this on a CDN yet? I can't find an actual github repo on github that contains a bundled JS file?
Matthias247 18 hours ago 1 reply      
Is there some documentation of how to integrate some kind of intellisense for a custom language to it and not only syntax highlighting?

I might need to integrate an editor for a custom DSL into a webapp soon, and Monaco could of course be an interesting alternative to codemirror or ACE.

leeoniya 20 hours ago 0 replies      
btw, if anyone has some time (i don't unfortunately, since the env setup is pretty lengthy [1]), it would be great to be able to disable the always-on semantic highlighting [2].

[1] https://github.com/Microsoft/vscode/wiki/How-to-Contribute

[2] https://github.com/Microsoft/vscode/issues/5351

blahi 20 hours ago 0 replies      
I wonder if RStudio can switch to this. Probably will never happen :(
shirro 15 hours ago 0 replies      
What is the key to get out of insert mode and into normal mode?
sdegutis 20 hours ago 1 reply      
<off-topic> You know, my first thought was that it's weird for the scroll bar to disappear like that. My instinct is that I want to know what part of the document I'm on. But then I remember that I have (scroll-bar-mode -1) in Emacs, so I guess I don't really care that much. </div>
profeta 21 hours ago 4 replies      
you see github is dead when even microsoft moves faster than they do.

Still waiting to be able to have a decent-by-1970-standards code diff on their site.

undoware 18 hours ago 1 reply      
The dropdown language picker is busted (at least in Chrome). It's got an off-by-one (or possibly 2) bug.

i.e. Pick 'javascript', it shows JS below... but says 'less'.

benologist 20 hours ago 3 replies      
Web based vscode in 10 ... 9 ...
rodionos 18 hours ago 1 reply      
How is it different from notepad++?
The Fathers of the Internet Urge Today's Software Engineers to Reinvent the Web ieee.org
631 points by jonbaer  1 day ago   222 comments top 42
pfraze 1 day ago 6 replies      
Ok, a lot of people don't know the context for this.

Last week, the Internet Archive ran the Decentralized Web Summit. It was an opportunity for new projects [1] to gather with prominent people in the industry [2]. It was productive and very fun. It also resulted in a bunch of news pieces, like this one, which have been hitting the HN FP for a week. Some of those articles have been better than others; a lot of them feel like fluff to me. (There are some semi-interesting bullet points buried at the bottom of this one.)

What was interesting, was the level of focused energy that this event was showing. The Internet Archive did a great job organizing it, and the speakers were compelling, but the real drive came from the different teams that were present. The news orgs all focus on "Recognizable name calls for new Web," but those speakers only offered spiritual guidance to something that's moving entirely on its own. And, I think they'd be the first to say so.

There was plenty of self-awareness and open discussion. Kahle gave a good talk at the end of day 1, where he pointed out that nobody quite knows what the end-user's interest is here. Are we talking about "open-source websites?" What's the big picture? Doctorow, Baker, Kahle, and Lee all talked about values. Cerf talked about Named Data Networking, which is about content-addressing, an idea that's definitely at the heart of the new work. Zooko threw cold water on everybody ("Is this just 1999 again?"). It was very interesting. A lot of it is online [3]

[1] IPFS, Dat, WebTorrent, ZeroNet, InterLedger, MediaChain, Neocities, many others

[2] Vint Cerf, Mitchell Baker, Tim Berners-Lee, Brewster Kahle, Cory Doctorow, many others. RMS even made an appearance.

[3] http://www.decentralizedweb.net/

dang 1 day ago 3 replies      
Before posting something dismissive about this, please remember how easy it would have been to dismiss the internet and the web themselves as things that couldn't ever happen, nice visions but impossible in the real world, etc. etc. and so on.

When the people who actually made these things talk about what needs making next, we should hear them with an open mind, not rush to think of objections.

sanderjd 1 day ago 7 replies      
> That utopian leveling of society, the reinvention of the systems of debate and government -- what happened to that?

Only speaking personally, what happened for me was that I noticed that these utopian online communities that we reinvented are not really particularly wonderful because a lot of the loudest people in such communities are really nasty people with really nasty things to say. On the other hand, the internet has proven incredibly useful for enabling people to keep up their deep high-trust relationships (usually forged offline) across longer distances and more life changes. It makes me a bit sad too, but it doesn't appear that being open and distributed is an important ingredient in building these types of communities, as Facebook, WhatsApp, and others have shown.

It's always amazing to me how much more down on technology we technologists seem to be than the majority of people I know, who just think it's amazing that they can stay so connected with their friends and family all the time. If I told them that Tim Berners-Lee is bummed that they're sharing pictures and liking posts instead of creating their own web pages, they wouldn't understand why, and I don't think I could really explain it to them (or myself).

Animats 1 day ago 2 replies      
The Web has already been reinvented, but not in the direction that Berners-Lee wants. We have HTTP/2 running everything through one pipe to big sites. We have JavaScript that puts the site in control of the user's machine and makes web pages display-only, like PostScript. More than half of all traffic is coming from the top 10 sites. The federated systems, email, IRC, and Usenet, have been replaced by Gmail, WhatsApp, and Facebook.
jokoon 1 day ago 4 replies      
Whatever the solution is, I really believe data must be on user's devices, not on proprietary servers.

Open source won the battle of software. The next battle will be about data itself.

It surely implies very hard problems when it comes to standardization and how data is exchanged. It might involve something like flatbuffers. How data is exchanged, what rights you have on it, how it is made secure: there is no ubiquitous idea or piece of software that can reinvent the web, because network programming is just hard and that won't change soon.

What I think could really be relevant is a database that syncs itself, in the manner of BitTorrent Sync and Syncthing. Once you have atomic data that is spread across users, nobody needs to rent servers, and the data belongs to the users. That is a true and real way to reinvent the web, and it also solves some of the controversial problems of the internet: advertising and surveillance.

pudo 1 day ago 5 replies      
The argument to re-decentralise the web wildly confuses economic, technical and political concerns. The reason that Facebook runs a centralised system isn't that it cannot figure out a technical alternative, it does so for economic reasons. To assume that making incremental advances in decentralised technologies will somehow fundamentally alter those economics is wishful thinking.

Instead, we need to recognise the fact that all of this is not ultra-new and never-seen-before, but rather an issue of market failure and growing monopolies and that there's an existing mechanism to deal with it: government regulation.

We can treat Facebook & co as utilities, as monopolies - there's a whole range of regulatory options and those in relevant agencies could really use the help of the tech community to figure out how to apply these tools.

Instead, the web community is out on the playground building DHT sandcastles with a bitcoin moat. Let's grow up.

z3t4 1 day ago 2 replies      
The web is already decentralized, but lacks the convenience of Facebook et al., where you have a virtual identity and a friend list and can choose who gets access to your images, etc.

This can however be accomplished with something like SSH keys, where your "friend" list is basically a list of public keys, plus a small daemon that will let "friends" make queries like "is this a friend of a friend?", etc.

With an identity system in place, other things get easier to solve, like spam, and micro-payments based on chains of trust and reputation. I also think it would be fairly easy to implement in current web tech: browsers and HTTP servers, e-mail servers, and chat services.

Note that your id will only be a hash (public key), and you will thus be anonymous until you tell others that this is you, and the client software could also ask the user before giving it away to a server.

It would also work with something like Tor, where your IP is hidden and the hash is your fingerprint, which you can change whenever you want.
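
As a toy sketch of the idea (real key pairs, signatures, and the daemon's network transport are all omitted; everything here is illustrative):

```python
import hashlib

def fingerprint(public_key_bytes):
    # Identity = hash of the public key, as suggested above.
    return hashlib.sha256(public_key_bytes).hexdigest()[:16]

class Node:
    def __init__(self, public_key_bytes):
        self.id = fingerprint(public_key_bytes)
        self.friends = set()  # fingerprints this node trusts

    def add_friend(self, other):
        self.friends.add(other.id)

    def is_friend_of_a_friend(self, candidate_id, network):
        # The 'is this a friend of a friend?' query against known peers.
        if candidate_id in self.friends:
            return True
        return any(candidate_id in network[fid].friends
                   for fid in self.friends if fid in network)

alice = Node(b"alice-key")
bob = Node(b"bob-key")
carol = Node(b"carol-key")
alice.add_friend(bob)
bob.add_friend(carol)
network = {n.id: n for n in (alice, bob, carol)}
print(alice.is_friend_of_a_friend(carol.id, network))  # True
```

Spam filtering and reputation then become graph queries over these fingerprints rather than trust in a central server.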

danjoc 1 day ago 1 reply      
Have they forgotten an important area in need of decentralization? I didn't see any mention of the internet networks in the article. The network is currently very centralized. Comcast, Time Warner, Cox, etc. Communities need to own their network, not rent it from a giant corporation. I can't run my own server, because my ISP says I have to pay extra for business class. All they've done is block ports on the network to my home. Municipal fiber and wifi networks have actually been outlawed by these big companies in various places around the country. When the network itself prevents running peer nodes, I don't see how any amount of software running on top of that will help.
doublerebel 1 day ago 1 reply      
I think there is a ton of validity in this desire and I've collected a long list of projects, articles, and leaders that are pointing towards a similar conclusion.

Most of us are too deep into the status quo or our view of technology to break in a different direction. Most of the projects so far aim to reinvent "from the ground up" like Urbit or IPFS -- which is an impressive goal but misses what I think is the main point: the average person should be able to grok and contribute to the Internet. We can do that with the simple tools already included with every computer. Ground-up can come later.

I think that posting global knowledge should be as accessible as posting to Twitter. And sharing that knowledge should be as simple as email or Airdrop. This is what I've been working towards with Optik.io. It's in stealth but I'm always looking for like-minded people to join with to achieve such a knowledge freedom for all of us.

michaelfeathers 1 day ago 1 reply      
The deep learning is that there are forces that lead to centralization: generally they are variations of human choice and seeking economies of scale.

For instance, it's no accident that we started with hundreds of thousands of small websites and ended with most traffic going to a few. Ask yourself how the internet could have been constructed in a way that would have prevented sites like Facebook or Twitter from winning the popularity contest. The 'Power Law' dynamics behind this are the same ones that lead to some airports being hubs and others not, to the fact that there is a backbone rather than everything going point-to-point, and to many other phenomena.

We can have decentralization but there are costs. Someone has to pay them.

LukeB42 1 day ago 2 replies      

I'm working on something else right now, but when it's feature complete I'll port synchrony to Go. The plan is to give full consideration to potential pitfalls of multiple overlay networks, the contacts list, and of course peer-to-peer streaming hypermedia.

"ENABLE_WEB_APP" is also going to be a configfile option.

Users should also be able to modify a list of domains they won't utilise overlay networks for.

It will also have to perform the necessary alpha transforms on JavaScript to prevent scripts from modifying the proxies' interface objects whilst presenting a public API, so that in-network resources can do friends-list operations for things to the tune of "network/nodeid/uid would like to play grand theft space wizards with you".

textmode 1 day ago 1 reply      
Can the "Fathers of the Internet" offer a financial incentive or even a stipend to nerds that want to work on these projects?

I was reading a book a while ago on peer-to-peer technologies written some years ago and there was a chapter by a very talented, well-known programmer.

Today, like Cerf, he's on the Google payroll. Needless to say he will not be working diligently on releasing finished projects that help to decentralize the web.

tmpanon1234act 1 day ago 0 replies      
I can say with some confidence that the blame lies almost entirely with the stewards. Until we learn how to converge faster towards consensus, things are going to remain painfully broken for long stretches of time. The inhibitor here isn't technological and it has nothing to do with the people on the ground relying on the web. It has to do with how things get run at a top level.
GroSacASacs 1 day ago 0 replies      
> Lee and other speakers at the event pointed out that a key problem of the Web today is its ephemeral nature

Why is it a problem, exactly? If something great appears on the web it will be shared, saved and discussed, and not forgotten. If something is useless or bad, it will be lost when the server stops, and that's good, right?

The same happened with paper books over the last 2000 years: great ones were replicated and shared; others were lost.

Not every website has open useful data for the long term.

tracker1 1 day ago 1 reply      
I'm thinking that there is a lot to look forward to when the browsers start supporting the likes of IPFS, along with some DNS hints for IPFS nodes... perhaps something similar to a CNAME record, that points a DNS name to an IPFS published directory... Although, that would need a relatively low TTL, as it should be possible to publish, then update said reference quickly.
jlg23 1 day ago 2 replies      
* The technology for a lot of things is there (e.g. Diaspora instead of Facebook, Etherpad instead of Google Docs), but hosting these costs money. And people don't want to pay, and they don't understand that they currently pay with their data/privacy.

* Making the data behind a commercial site open is a great, noble idea - but all of the current big, consumer facing players make money with the customer data. They have no incentive to open their data.

* Big players have absolutely no incentive to inter-operate with new competitors and this means that those who use new, decentralized services have to maintain two identities or lose contacts.

I think the only thing that has a chance to get us out of this is intervention by the government:

a) Make running your own node a human right.

b) Give every person on the planet a free node if they cannot afford one (paid for but not controlled by governments).

c) Make IPv6 mandatory (so b can work)

d) Subsidize open source efforts that enable us to have a virtual presence hosted on our own node, interconnected with our friends' nodes.

e) Elevate all electronic communication to the legal status of snail mail: If your MTA blocks my host, you have to have a damn good (security) reason, tell me exactly why and timely unblock me when I have fixed the problem (Yes, AT&T and 1&1, I am talking to you.)

f) Enforce net neutrality.

g) Force current big players to allow machine readable, convenient exports of user generated content by the user.

hNewsLover99 1 day ago 1 reply      
So... these internet-founding Einsteins actually think that Bitcoin should be part of "our" future Utopian web? Well, what do they (and all the other blockchain fan-kiddies) think of the fact that Bitcoin has now replaced Western Union as the preferred getaway car for ransomware and other extortionists around the world?

Businesses, universities, and even hospitals whose critical activities grind to a halt at financial gunpoint, and who are advised by law enforcement to roll over and pay up because nothing can be done for them - must be delighted to know that the tools of their demise are so "Utopian".

The internet isn't, never was, and never will be securable. Even the most well-resourced orgs are unable to defend their data. The founding fathers of the internet and the W3C should admit this, apologize, and stop holding out false hope for the future.

skywhopper 1 day ago 0 replies      
Some good ideas, but I find some of them at odds with the rhetoric. Decentralize the web with new centralized naming and archive systems. Come up with new ways of doing things that there are already multiple failed solutions for.

Separate content and presentation layers, URLs as names, open pub/sub systems--these all have good solutions. They haven't failed to catch on because the technology wasn't there.

Anyway, to a large extent we already have a re-invented, private, encrypted, Bitcoin-funded, de-centralized web. It's called Tor, and it's not always very pretty.

LinuxFreedom 1 day ago 0 replies      
What is the current state of diaspora and similar things?

Is there a cough centralized collection of decentralized software alternatives, like an "awesome-decentralized-net" on cough even harder github?


manju_sharma 1 day ago 0 replies      
Extra credit if we can make it so that people can make money by publishing without going through a third party.

Why is Google sponsoring this event?! Won't they lose a big chunk of revenue if this happens?

yugai 1 day ago 0 replies      
I think the internet is great in its current state. The most popular sites are among the worst in my opinion, but there are others. The internet offers diversity. The internet would be pretty much the same without Google, Facebook, and Apple because there are equivalent alternatives out there.
qwertyuiop924 1 day ago 1 reply      
I'm not optimistic about this effort succeeding, but it's doing the right things to make sure that success happens eventually. Instead of having a big get-the-corporations-to-work-with-us feel, this very much had the "screw it, let's do this shit" feel. It doesn't guarantee success, but no technical project ever succeeded by having people sit around talking.
ionised 1 day ago 0 replies      
They can't reinvent the web, they are too busy coming up with ways to fill the current incarnation with advertisements.
carlsborg 1 day ago 0 replies      
Re-posting a link to the key part of Tim Berners-Lee's talk at the summit:


tylerlarson 18 hours ago 0 replies      
I can't build all of this by myself but I would be happy to help.
bertan 1 day ago 0 replies      
Aral Balkan and his small team are already working to develop a p2p web and they need help and funding.

[1] https://ind.ie

Rathor1 1 day ago 0 replies      
I don't think we should reinvent the net; in its present-day shape it was never supposed to be monetized or mined for statistics. I'm certainly frightened of an internet 2.0.
aboveL4w 1 day ago 1 reply      
Security flaws are racking up like crazy. The DOM was designed for the 1990s. It will happen, but it's going to be one hard transition. We all know how hard small-scale migrations can be; now consider that at the scale of the internet. They need to enforce new password schemas and disable captchas (they're irritating, they make people leave, and bots get by them anyway), and Windows needs to fix its API. You can never fix a bug before someone exploits it, but you can design standards that force users to follow best practices and force tech giants to stop using oaml. Vulnerabilities on large-traffic sites should mean jail time for the person responsible. I may just activate my VPN and use Tor from now on for everything, compromising my bandwidth to protect my information. BTW, I'm sure all the cloud competition does not share data with affiliates for profit, enabling black hats to use targeting techniques to launch dox attacks against individuals. I'm not a sec major; just my 2 bytes. Also, critical 0-days need an Amber Alert-like system that forces you to update all affected sites.
zkhalique 1 day ago 1 reply      
That's funny, they are basically describing our platform.


mxuribe 22 hours ago 0 replies      
I - for one - am quite excited about this future!
peterwwillis 1 day ago 0 replies      
What is the web, essentially? Connection. Connecting communication, connecting content, connecting media, connecting communities, connecting trade. The Web is designed to connect things.

The web is already decentralized; just ask anyone who was at one time restricted to the "online service providers" like AOL, CompuServe and Prodigy. But whether or not the web is decentralized, it still connects things. I don't think decentralization will improve the connections; rather, as anyone who has ever run a large network will tell you, decentralization causes almost as many problems as it solves.

In many ways, making the web more decentralized could make it easier to defeat its simple design and raise new problems. So to my mind, we need to be addressing more specific challenges and design with the intent to address those challenges, and not simply to make change for change's sake, the way most technological improvements have haphazardly occurred.

As a very simple example of the decentralization of the web, let's look at the real new Web: mobile application platforms. On Apple's platform, pornography is not allowed. This is the result of the kind of "social protections" that our societies have traditionally been governed by. But if this was instead totally decentralized, there could be the potential for "harming children" and other persons sensitive to certain content, and as a result, governments both local and around the world may enact laws forbidding certain content on the network, or even the whole network itself.

Is the loss of certain content like pornography - specifically in a _centralized_ marketplace like the Apple Store - worth the access to such a large marketplace of content and applications? Or should we tempt society at large with unrestricted access to content? One could argue that if we were not so dependent on the internet already, modern uses could have resulted in it being banned around the globe long ago.

Here's an example of a targeted solution: an open platform, with subscriber-specific controls. Imagine a universal mobile network and platform, so apps would just run on Android, iOS, etc. But now, to find and access the applications, you would pay $1.99 a month to a company that curates the content for you. Less of what you consider garbage, more practical content. And you could use the company that restricts pornography, or the company that promotes totally unrestricted content. Suddenly there is increased freedom, choice, and universal compatibility.

Then there's questions of how connected we really want to be. YouTube comments and Twitter are some examples that to me exemplify the kind of harsh environment that the human mind is capable of creating. Will decentralizing the web further result in an increase in this kind of damaging combination of anonymity and unrestricted communication? Is humanity really ready to have an unrestricted, unlimited form of connection?

Now keeping that in mind, let's imagine a new decentralized web: platforms that provide the same content in different ways. Imagine being able to browse YouTube comments, and only see the ones flagged as positive, uplifting, and helpful - but not by YouTube users, but anyone who used that specific browsing platform. You could choose a platform that conforms with your particular world view, and thus see primarily content that you agree with. But wouldn't this simply breed new forms of closed societies that don't take into account things that you don't like, or information you wouldn't have normally wanted to see or hear? Could this not actually set humanity back by reducing exposure to the parts of life we may not like, but are ultimately real and part of society?

We are as flawed as we are complex, and the unforeseen side-effects of the changes we implement will affect the future of how humanity is connected. I think we should tread carefully.

krapp 1 day ago 2 replies      
>Think about some sort of publish/subscribe system, in which a web-page creator can regularly hit a publish command that makes it available for archiving, and various web archives can subscribe to receive updates

This seems like something we could (should?) have right now. Maybe IA should write a Wordpress plugin, if they haven't?
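The publish/subscribe idea quoted above can be sketched as a tiny in-memory hub: a site "publishes" a snapshot, and every subscribed archive receives the URL plus a content hash it can later use to verify what it stored. All the class and function names here are invented for illustration; a real system would push over the network rather than call methods directly.

```python
import hashlib

class ArchiveHub:
    """Toy pub/sub hub: publishers announce pages, archives receive them."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, archive):
        self.subscribers.append(archive)

    def publish(self, url, content):
        # Hash the snapshot so archives can later prove what they stored.
        digest = hashlib.sha256(content.encode()).hexdigest()
        for archive in self.subscribers:
            archive.receive(url, content, digest)
        return digest

class Archive:
    def __init__(self, name):
        self.name = name
        self.store = {}

    def receive(self, url, content, digest):
        self.store[url] = (content, digest)

hub = ArchiveHub()
wayback = Archive("wayback")
hub.subscribe(wayback)
digest = hub.publish("https://example.com/post/1", "<html>hello</html>")
print(wayback.store["https://example.com/post/1"][1] == digest)  # -> True
```

A blog platform's "publish" button would just call something like `hub.publish` once per post, and any number of independent archives could subscribe.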

>Think about creating an archive of software as well, that perhaps may have to include emulations of defunct hardware and operating systems to make the Web always backwards compatible.

A site that archives software and runs it in js-based emulators sounds like a great idea. It would probably be illegal, though. And it almost certainly wouldn't work properly for everyone, as long as it depended on the browser. But still a great idea. That any runtime and software can have a URL is incredibly compelling.

Maybe we need to leave the browser model for documents and come up with something else for using what amounts to streaming software?

>Change the naming system, and stop thinking of the URL as a location: it's a name, a format he picked to look like a Unix file name simply because people were comfortable with that.

YES. No TLDs, just unique arbitrary strings.
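One concrete form of "URL as name, not location" is deriving the name from the content itself, so anyone holding the bytes can verify they match the name no matter which server handed them over. This is roughly what IPFS and other content-addressed stores do; the scheme below is a toy sketch, not any real system's format.

```python
import hashlib

# Content-addressed naming: the "URL" is just a hash of the bytes.
# No TLDs, no location -- any host can serve the content, and the
# client can check it wasn't tampered with.

def name_for(content: bytes) -> str:
    return "sha256-" + hashlib.sha256(content).hexdigest()

def verify(name: str, content: bytes) -> bool:
    return name_for(content) == name

page = b"<html>my homepage</html>"
name = name_for(page)
print(verify(name, page))                 # -> True
print(verify(name, b"tampered content"))  # -> False
```

The trade-off is that names change whenever content changes, so a mutable-pointer layer (like DNS or IPNS) still has to sit on top for "the latest version of my page".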

sidcool 1 day ago 0 replies      
The Seif project is an attempt in that direction
tootie 1 day ago 1 reply      
Their goals seem at odds with the economic incentives in the entire developed world. It's a beautiful dream, but we're not sufficiently civilized to make it reality.
ObeyTheGuts 1 day ago 0 replies      
Maidsafe is coming, don't worry!
pasbesoin 22 hours ago 0 replies      
cough physical layer cough
kowdermeister 1 day ago 0 replies      
What I don't see among the raised call-to-action questions is "who pays the bills?" It's a very important question, and it explains the existence of the silos they feel sad about.

The number of people using the internet won't shrink; it will grow steadily. Facebook and Google have enormous operating costs, and if they want to offer an alternative, a better future, those costs (at least bandwidth) should be factored in. The infrastructure is not free, but Facebook and Google users are not paying for it now (well, not with money). Imagine if we say: hey, here's the new web, it's awesome: it's decentralized, privacy is baked in, and it works everywhere. You just have to pay 0.01 to access the New York Times. Per page. It would be a very different situation if costs aren't baked in from the beginning.

Then there's video. Gazillions of videos are created per day, and the volume grows exponentially as devices get better and better at recording ultra-high resolution. Now, again, YouTube pays the bills and users get it in exchange for watching advertising. How do you offer a service at least as good as YouTube, but decentralized, privacy-conscious, universally accessible, and free?

What I see here is a problem that really exists, but the proposed radical new solutions are a bit misguided. You won't convince people with a subpar (but technically better) alternative; you have to propose an iPhone level of wow, because only then can you get people's attention.

> Change the naming system, and stop thinking of the URL as a location: it's a name, a format he picked to look like a Unix file name simply because people were comfortable with that.

That's a problem again: most people use Windows. Don't assume that end users will instantly "get it" because it's more Unix-like. This leads me to my next point.

Another question I haven't seen raised is user experience. UX. Today's web is rather good at it; at least the top players embrace it very well. Most company websites now pay attention to getting it somewhat right. Startups also put a lot of effort into getting UX right.

How about baking good UX into the new web too? Today I only need to buy a $500 phone and I'm ready to consume the web. How? I type in a string and the rest is magically handled for me. I can read or watch anything. Can you do the same with the decentralized web? I don't want to install anything, download terabytes of blockchain data, or run an encrypted distributed filesystem of somebody else's cat videos that takes hours to sync. I also don't want to pay for hosting somebody else's cat videos. Torrents work well for TV shows, but what would that look like at YouTube scale?

That's the key part here. To have a radically new internet, getting the technologies right doesn't stop at replacing HTTP, HTML, CSS, and DNS; you need to replace ISPs and infrastructure providers too, or at least factor them in, so that the new system is not born dead.

nickpsecurity 1 day ago 0 replies      
Alright, a lot of statements made by bright people. Now, let's evaluate them one by one to see which get praise or reality checks. :)

re silo effect

Schneier calls this the Feudal Model of Security or Convenience, with a nice write-up here:


We can also look at it as a form of lock-in. In any case, recent discussions on Elsevier and other scientific publishers shed light on where it might take us. Many academics gripe about not knowing the state of the art or even prior work in their field, since they can't afford access to the silos it's stored in. Many, despite being customers of Elsevier et al, rushed to download all kinds of stuff from Sci-Hub when it appeared. Now let's imagine that effect applied to most knowledge or content to see how bad it could be for the progress of both knowledge and society. Even without paywalls, think of how restricted search and selective promotion can create similar effects by preventing people from connecting dots or even experiencing new things. Then we see that the siloing could have a tremendous negative impact on people in many ways. Better to switch to something similar to the old web, where all kinds of content appeared, was easily accessible, and was easy to build on.

re trading privacy for free stuff is a myth

It's actually a reality, given that users dumped their freedom, privacy, and paid offerings en masse for ad-supported web content/services. The demand side of this was so strong, and so many experimental alternatives failed, that providers were largely pushed in the direction of ad support just to survive. It also came with significant financial rewards. Good write-up here:


So, he needs to quit pretending people are ignoring some solution that works in favor of ad-supported, free-for-users content in a target market that almost exclusively goes with ad-supported offerings. The rational choice is to do what works in a market or with given demand. If they want privacy, they can pay for it or take steps to get it. It's why I have a paid MyKolab account with a GPG keyring. Many others used Fastmail or Lavabit for years. Yet the vast majority uses surveillance platforms (e.g. Gmail, Yahoo, Microsoft) that sell them out to advertisers but also reliably handle the email on the side. I can't remember the market share, but I'd put money on it massively contradicting those arguing against the ad model, in terms of what people actually do versus what they say.

re sites blinking on and off. Big problem. Needs to be eliminated in the next architecture, or at least mitigated with a Wayback Machine-style thing with greater integration/convenience. Think snapshots or rollbacks at the browser level.

re sketchy privacy controls. User's fault. They didn't care in practice. They do business with scumbags whose whole model is selling them out and who have a string of abuses. Most won't pay even $2 for private messaging app or $5/mo for private email. Yet, they gripe about privacy issues. I say stick with self-selection plus reboot a simpler, effective model for evaluation of product/service privacy or security along lines of Common Criteria. Security experts, esp experienced in realities of fielded programs, would contribute to it from many different countries to reduce risk of subversion or simply unworkable ideas. Baseline of features & assurance activities critical to privacy and security of product or service plus independent review they're implemented & trusted distribution. Nothing more unless company volunteers as differentiator.

re Vint Cerf. Good ideas across the board with products/services actively attempting to deliver all of them except copyright. That one isn't legal yet, though. The pub-subscribe is a decent idea given there's many robust implementations, even high-assurance schemes, for that sort of thing. Even military is deploying something like that now with at least one high-security demonstrator (below). Commercial/FOSS sector has things like ZeroMQ, which has other benefits. Much field experience out there in doing it right. The older & more field-proven something is, the more likely it will work right the next time. Tried and true beats novel and new.


Note: DTIC is another source of wisdom in terms of old papers with great ideas or implementations in them, if you know how to find them. Can't help there, but keep the DTIC link to anything you find that doesn't have a steady link elsewhere. A DTIC link usually stays available longer than the average website. CiteSeerX and obviously archive.org as well.

re Lee. His idea on URL has been implemented many times over. Just doesn't get acceptance due to bootstrapping problem where all the web browsers have to support the alternative but they're not adding something with little demand most of the time. Dot-archive is nice and could integrate with archive.org. Might even do it with a small fee that simultaneously supports archive.org (or its replacement) plus gives clean link in return similar to subdomains or shortcut links. "Surface the data" is idea behind Semantic Web. It was largely a failure. Market went with API's instead. They're probably better but mixing two might create interesting hybrids.

re Kahle. Decentralized clouds like Amazon's are definitely worth imitating. Google applied the principle to RDBMSs nicely with the F1 RDBMS. Awesome stuff. JavaScript will be a necessary evil due to market share, asm.js, and so on. However, there's still room for another Flash to happen across a significant chunk of the market if done well enough. Don't pin it all on blockchains: about every goal we've listed has been solved in isolation, and sometimes decentralized, without them. They're an inefficient alternative. Now, Merkle or hash trees will likely be useful at some point. Keybase.io and others are working on the public-key angle. That "Wordpress" and "Wordpress alternative" are typed into Google many times a week makes his last point solid. Even Freenet and I2P support forms of blogging.
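The Merkle/hash-tree idea mentioned above can be sketched in a few lines: hash the leaves, then repeatedly hash adjacent pairs until one root remains. The root commits to every leaf, so changing any block of data changes the root, which is why these trees keep showing up in archival, sync, and distribution systems.

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Compute a Merkle root over a list of byte strings."""
    level = [sha256(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:            # duplicate the last node if odd
            level.append(level[-1])
        level = [sha256(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0].hex()

blocks = [b"block-0", b"block-1", b"block-2"]
root = merkle_root(blocks)
print(merkle_root(blocks) == root)                                 # -> True
print(merkle_root([b"block-0", b"block-X", b"block-2"]) == root)   # -> False
```

This is a minimal sketch; real systems (Git, Bitcoin, IPFS) each pick their own leaf encoding and odd-node rule, so roots are not interchangeable across them.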

re Doctorow. Vulnerability research being legal is a must. "Computer obeys owners" is a good principle but lay owners vs technical attackers make that a weakness. Feudal model gives up control for safety with good results on Apple, etc. So, maybe an override the user can activate locally or maybe physically. I'm still a fan of jumpers or physical switches for write-protect of critical storage. :)

So, that's my take on these statements.

johan_larson 1 day ago 0 replies      
Too much rough consensus. Not enough running code.
teraformer 1 day ago 2 replies      
Job 1: Advertising interests be damned, this mass surveillance thing has got to go.
tsunamifury 1 day ago 4 replies      
Most of these guys are disconnected from where the global market is going and are proposing preposterous ideas for the West Coast tech scene's own problems.

The web was based on a Western educational reference model that is not the normal mode for 3/4ths of the planet.

The new web would need to be push-based, not pull-based. It would need to be instantly authorable and aware of people and devices, not document-based.

PayPal has demanded that we monitor data traffic seafile.de
628 points by zolder  7 hours ago   229 comments top 32
josteink 7 hours ago 5 replies      
> PayPal has demanded that we monitor data traffic as well as all our customers' files for illegal content. They have also asked us to provide them with detailed statistics about the file types our customers sync and share on https://app.seafile.de

That's a pretty big WTF right there.

I know PayPal has an overall pretty scummy reputation, but I still cannot imagine PayPal doing this because they themselves think they'll benefit from this data.

To me this seems like a demand which comes "upstream" from above PayPal, from its payment providers (Visa, MasterCard, American Express, etc.). Would I be overly paranoid to imagine these demands and claims are the result of lobbying by entities like the RIAA and MPAA? They do have a history of blocking payments to known pirate-friendly services, after all.

And as such, they clearly have too much power, and there needs to be some anti-discriminatory financial regulation to stop business-hostile practices like this from being lobbied and put in place.

Because this is just madness.

nakodari 4 hours ago 1 reply      
Another victim of Paypal here. I run Jumpshare, a file sharing and collaboration service for creative professionals. This is what Paypal sent us:

"May 8, 2016: When you signed up for your PayPal account, you agreed to our User Agreement and Acceptable Use Policy. Because some of your recent transactions violated this policy, we've had to permanently limit your account.

Please remove any references to PayPal from your website."

They never mentioned which transactions violated the policy, and we have never had any complaints from our customers. There was no prior warning. We called them, and they asked us to email them. We sent multiple emails and nobody bothered to respond. We lost 30% of our recurring monthly revenue right away!

We now use Stripe as our sole payment service provider. After this experience, we will probably never accept Paypal again.

Draiken 4 hours ago 2 replies      
I'm once again astonished at how much control over our businesses is simply out of our hands.

Looking at the practices these financial institutions use makes me wonder: what can't they do?

Everyone cites "regulations", but as far as I understand, they make the regulations. Directly or indirectly.

Take, for example, the known cases where PayPal freezes accounts holding people's money. If I took someone else's money and refused to give it back, it would be a crime in pretty much every nation. But when banks and financial institutions do that, they get away scot-free (with maybe some small rants from the internet) and keep doing it systematically, profiting in almost all cases.

If we're not bound to middle men like Stripe and PayPal, we're bound to Visa and Mastercard. Is there any way out of this madness?

rio517 7 hours ago 4 replies      
Given that we've all read about similar situations happening all over the web, I'm surprised organizations aren't including "PayPal drops us for arbitrary reasons" or "PayPal freezes our funds for arbitrary reasons" in their risk assessments when choosing vendors. In almost every case, that risk should probably push decision makers away from PayPal.

It is also a little entertaining that their "brand risk" department is probably doing so much unintentional damage to the brand.

MichaelBurge 7 hours ago 6 replies      
File sharing services are listed as requiring pre-approval, so Seafile should've sent them an email before accepting it as payment:


They're well within their rights to decline your business. If a bank told the government, "We have absolutely no idea what our customers are doing with their money or who they're sending it to. Maybe they're sending it to terrorists or drug lords, maybe they're not; it's none of our business and we respect their privacy", they'd get shut down in a heartbeat.

I can understand if Paypal doesn't want to appear on the front page of the news for funding an underground child porn ring that signed up as one of your "enterprise clients".

howfun 7 hours ago 1 reply      
Apparently the other file-sharing companies spy on user data; that is why they are still on PayPal.
Taylor_OD 7 minutes ago 0 replies      
I use PayPal often for online stores where my Discover card isn't accepted, but I can't believe they are still around and have a huge market share. The platform is so broken.
jeena 6 hours ago 1 reply      
So, surprise surprise, my six-year-old blog post continues to be spot on: https://jeena.net/paypal. That is when I deleted my PayPal account. But it was not all a bed of roses after that; surprisingly, many merchants only offer PayPal as a way to pay them, so I always have to contact them, try to explain, and ask for another way to pay. Most of the time they won't/can't help me.
contingencies 38 minutes ago 0 replies      
Had to use Paypal today to make a payment to a company who can't otherwise find a reasonable way to take credit cards online. I feel their pain, having been in that position. Paypal randomly saw that it was reasonable to demand I answer a phone in another country (though I haven't been based there for perhaps 15 years) if I wanted to log in to my account. I had to work around this by having them send a payment request, then paid about USD$1000. Wish they accepted Bitcoin, I was livid at the experience. Every time I deal with Paypal it's the same. Their PR crap a year or two back about "sorting things out" was obviously empty. Stripe isn't much better: after a reasonable start, last time I wanted to use them I couldn't because my address is in a different country to my card (ANYONE LISTENING?). These abuses are reaching a breaking point, nobody is going to deal with credit cards soon. Here in amusingly progressive mainland China, they are a minor mode of payment and shrinking: good riddance!
mootothemax 6 hours ago 1 reply      
Here's PayPal's page on what they require for file-sharing services:

>Merchants offering file-sharing programs or access to newsgroup services must monitor for and prevent access to illegal content.


raverbashing 6 hours ago 4 replies      
Paypal only exists because the current infrastructure of payments in the US is a joke

Nobody needs paypal in Europe. Of course, they try to sell themselves as "the easiest way" (which is right to a point) but it's mostly unneeded

adrianmsmith 5 hours ago 1 reply      
Presumably with TTIP harmonizing laws between the US and EU, violating the privacy of users will stop being illegal in the EU at some point.
benevol 2 hours ago 1 reply      
It's time to share & spread information about PayPal's competitors:

What has your experience or market research yielded (stripe.com and paymill.de probably being the most obvious ones)?

INTPenis 18 minutes ago 0 replies      
Well, sorry paypal, but I've seen a host of new payment services crop up lately. Now is not the time to push your clients around.
Matt3o12_ 4 hours ago 0 replies      
I wonder if legal action can be taken against PayPal for demanding this kind of information. PayPal essentially blackmailed them into breaking the law (they didn't break it, but they suffered financial loss for refusing to).

If I told a customer who absolutely depends on my business to harass/attack somebody, I would certainly be held liable as well.

reitanqild 7 hours ago 0 replies      
Improvement: PayPal at least asked first, IIRC that was not always the case.

Still I advise people not to depend solely on PayPal because of their tendency to freeze funds over nothing.

mathattack 3 hours ago 0 replies      
Since complying with this demand would violate German / European data protection laws (and also be morally wrong in our opinion) we have declined to comply with this demand.

Is everyone in Germany going to have this issue?

leommoore 4 hours ago 0 replies      
What is most disappointing is PayPal's lack of customer service. Why would any vendor want to use their services when they are treated so badly? It also throws into question the data security of other vendors' cloud storage. Is everyone looking at my stuff?!?
infodroid 1 hour ago 1 reply      
To a corporate lawyer, every file sync and team collaboration solution looks no different from Megaupload.
kriro 2 hours ago 0 replies      
That's a pretty huge WTF request.

Can you sue Paypal for basically telling you "break the law or don't use us" (I think that's a pretty bad idea on ideological grounds but I wonder if it's technically possible)? Especially since Paypal has to be regulated within the EU in some way I'd say such a request should result in at least a cursory check if the EU license (iirc. they operate as a bank) of Paypal should be revoked/suspended/investigated.

joopxiv 7 hours ago 2 replies      
I have quite a bit of personal experience with PayPal, and their policies seem quite random. If one person gets it into his or her head that something is not allowed, no voice of reason will change the decision. I wouldn't shed too many tears over it, though; there are many better and cheaper alternative forms of payment.
ohitsdom 3 hours ago 1 reply      
Has anyone made a site yet detailing all of these horror stories? Paypal's behavior is unacceptable and needs to change, yet years later things still seem just as bad for their customers.
morganvachon 4 hours ago 0 replies      
I think I just found my new cloud provider. It's rare to see a company stand firm on their values like this.
herghost 6 hours ago 0 replies      
This is the reason I stopped using Paypal.

They want to position themselves as a financial services provider equivalent to traditional banks, except that they reserve the right to arbitrarily, and without recourse, remove their services.

As a customer, this means you can be cut off from your ability to use "currency". As a business, it means you're beholden to arbitrary decisions that you can't really risk-assess against, and if you've built your business on this service, that could be devastating.

lucaspottersky 3 hours ago 0 replies      
This means that it's easier than ever to spot companies that are monitoring YOUR data: just check whether they accept PayPal or not.
mertens 1 hour ago 0 replies      
OP, are you from Zolder, Belgium???
mk89 6 hours ago 1 reply      
Another great example of the so-called "democratization".
sschueller 6 hours ago 1 reply      
A little off topic, but has anyone else noticed the PayPal website being extremely slow?
premasagar 7 hours ago 2 replies      
Shenanigans like this will only hasten the onset of a bitcoin economy.
nxzero 2 hours ago 0 replies      
Anyone have a link to the notices PayPal sent Seafile?
phyyl 2 hours ago 0 replies      
faith in humanity restored
tn13 7 hours ago 0 replies      
I stopped using file-sharing software long ago. I use S3 now.
Hello, Tensorflow oreilly.com
573 points by lobsterdog  1 day ago   38 comments top 9
ajschumacher 1 day ago 4 replies      
Wow! I was going to post this but here it is already! I wrote (with a lot of help) the article there. I also jotted down some notes on the process of writing it with O'Reilly in case anybody's interested in that side of things: http://planspace.org/20160619-writing_with_oreilly/
tromobne8vb 1 day ago 2 replies      
As I've been reading about TensorFlow lately, I feel like I'm missing something regarding distributed processing. How can TensorFlow 'scale up' easily if you are outside of Google? We have big datasets that I want to run learning on, but it seems awkward to do with TensorFlow. We're big enough that the team managing our cluster is separate from development, and it is a huge pain if we need them to go install tools on each node. Even with Spark support, it seems like the TensorFlow Python libraries need to be set up on each machine in the cluster ahead of time.

Am I missing something?

50CNT 1 day ago 5 replies      

 TensorFlow is admirably easier to install than some other frameworks
I thought most frameworks are fairly easy to install in python, usually with a single call to pip. NLTK takes one "pip install nltk" and then "python", "import nltk", "nltk.download()" to download all the corpuses and miscellaneous data. Installing tensorflow seems complicated compared to that.

 # Ubuntu/Linux 64-bit, CPU only:
 $ sudo pip install --upgrade https://storage.googleapis.com/tensorflow/linux/cpu/tensorflow-0.8.0-cp27-none-linux_x86_64.whl

 # Ubuntu/Linux 64-bit, GPU enabled. Requires CUDA toolkit 7.5 and CuDNN v4.
 # For other versions, see "Install from sources" below.
 $ sudo pip install --upgrade https://storage.googleapis.com/tensorflow/linux/gpu/tensorflow-0.8.0-cp27-none-linux_x86_64.whl
Not that either are particularly complicated, but saying other frameworks (assuming they're referring to python frameworks) are "a lot harder to install" seems disingenuous.

That said, I haven't played around with AI frameworks too much, so I might just be missing a real stinker.

anilshanbhag 22 hours ago 1 reply      
I like the clarity of thought and structure of the article. I have used TensorFlow and had to explain it to a friend. So many times, I end up assuming things which are obvious to me but not to someone getting started. As said in the article, TensorFlow stands out for ease of use and is, to the best of my knowledge, the first distributed learning framework. Theano, Torch et al. are faster but do not come with goodies like TensorBoard.
farresito 1 day ago 0 replies      
I love these short tutorials that give you an introduction to anything in an hour. They help you get interested in stuff you wouldn't have gotten interested in otherwise.
bduerst 18 hours ago 1 reply      
This is fantastic - thank you for doing this. When paired with the browser tool it makes a lot more sense: http://playground.tensorflow.org/

Is this planned to be released as an intro in a book about tensorflow?

tedmiston 1 day ago 0 replies      
> For more on basic techniques and coding your own machine learning algorithms, check out our O'Reilly Learning Path, "Machine Learning."

This learning path is also available free for Safari Books Online subscribers.


Fenntrek 1 day ago 0 replies      
Wonder when we will see the day when an O'Reilly book is written by AI.

Could be a nice little loop if the creators of the AI that accomplishes this learned part of their craft from O'Reilly books.

JPKab 1 day ago 1 reply      
Hey Aaron! Hello from one of the people you taught in your first GADS class in DC.
Bought and returned set of WiFi home security cameras; can now watch new owner reddit.com
448 points by tshtf  1 day ago   137 comments top 23
mmaunder 1 day ago 4 replies      
If something like this happens to you - where you gain unauthorized access inadvertently to something - I'd be careful. Under the CFAA you can be charged criminally and the penalties are severe.

So for example, if the OP was to casually drop a few photos the camera took and a badly worded warning in their mailbox trying to help, the 'victim' could report it to the police and an inexperienced DA might try to bag their first cyber prosecution.

I'd definitely not contact the customer. Contact the vendor instead with an email and immediately remove your own access to the system. That way you have it on record (the email) and mention in the email you immediately revoked your own access.

The CFAA is a blunt and clumsy instrument that tends to injure bystanders.

Here's an extract from the CFAA:

Whoever having knowingly accessed a computer without authorization or exceeding authorized access, and by means of such conduct having obtained information that has been determined by the United States Government pursuant to an Executive order or statute to require protection against unauthorized disclosure for reasons of national defense or foreign relations, or any restricted data, as defined in paragraph y. of section 11 of the Atomic Energy Act of 1954, with reason to believe that such information so obtained could be used to the injury of the United States, or to the advantage of any foreign nation willfully communicates, delivers, transmits, or causes to be communicated, delivered, or transmitted, or attempts to communicate, deliver, transmit or cause to be communicated, delivered, or transmitted the same to any person not entitled to receive it, or willfully retains the same and fails to deliver it to the officer or employee of the United States entitled to receive it;


matt_wulfeck 1 day ago 5 replies      
These types of exceedingly invasive products need to have their damages tested in courts. After a few lawsuits and payouts the liability will begin to increase, and that will force companies to adapt/improve or go under.

The problem is our entire generation doesn't care about privacy. They willingly hand over everything about them to an app and care not a single drop that their government spies on them without a warrant.

Mister_Snuggles 1 day ago 1 reply      
I have a handful of D-Link cameras, and plan to buy more.

D-Link offers some sort of cloud service, but I've never used it. I keep the cameras segregated onto a separate Wifi network that can't access the internet, and they work just fine in that configuration. The cameras have built-in HTTP servers and present what they see as an MJPEG stream. I use 'motion' running on a machine to handle motion detection, recording, etc. I use a VPN server to handle my remote access needs.

I get everything that the cloud stuff offers, but all hosted locally.

What's described in the article scares me, which is why I've set things up the way I have. Even if the cameras were used (they weren't) and tied to someone else's account, they can't send anything back to the cloud service.

louprado 1 day ago 1 reply      
"I'm not mistaken, anyone could get the serial number off your cameras and link them to their online account, to watch and record your every move without your permission."

There's a name for a hacking strategy where you mass-purchase products, modify them or acquire relevant information, then resell or return them. "Catch and release" comes to mind, but I can't find any references.

userbinator 1 day ago 3 replies      
I set up an online account

The title is missing an important fact: these are not traditional network cameras, they're ones that apparently stream video into the cloud.

Those cameras that do not "phone home" to a cloud service don't have this problem; the ones that you can set up with a username/password and then connect directly to from the network. Ironically it's the cheap no-name ones that usually work like this, as the company just sells the hardware and isn't one to bother with their own set of servers/accounts/etc.

IMHO these cameras that do rely on a third-party service are to be avoided, since what happens to that service is completely out of your control.

jedberg 1 day ago 1 reply      
Props to Dropcam/Nest for solving this problem.

My brother gave me his Dropcam after setting it up for himself, and I had to prove my identity and he had to prove his to get them to move the camera to my account. It was a hassle at the time, but I was glad to know that they at least had decent security.

RickS 1 day ago 1 reply      
HN readers: Do you think the engineers knew?

I ask because I've worked on various products, and single units change hands between engineers constantly. Phones for testing, accounts with shared dev passwords, the actual hardware, all kinds of test units get spun up and passed around, even on crappy products where the engineers' imaginations are the only QA.

Surely one engineer set up a camera, passed it along to another engineer, who set up the camera and encountered this error?

There are lots of classes of error that can hide in a product, but this feels like one that it's nearly impossible not to hit.

JChase2 1 day ago 3 replies      
I've tried finding a camera that has a server that can encrypt traffic, and I can't. It'd be nice to have access from outside of my network but I don't trust it. It really took me by surprise how bad at security these things are. I guess I could set up some kind of vpn but I assumed when I bought it I could enable ssl or something.
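
For what it's worth, a common stopgap is to terminate TLS on a trusted box on the LAN (e.g. a Raspberry Pi) and reverse-proxy the camera's plain-HTTP stream through it. A sketch only — the camera address, credentials file, and certificate paths below are placeholders:

```nginx
# Terminate TLS on a trusted LAN machine; the camera itself stays
# HTTP-only and unreachable from the internet.
server {
    listen 443 ssl;
    ssl_certificate     /etc/nginx/certs/cam.crt;
    ssl_certificate_key /etc/nginx/certs/cam.key;

    location / {
        auth_basic           "camera";
        auth_basic_user_file /etc/nginx/htpasswd;
        proxy_pass      http://192.168.1.50:80;  # camera's LAN address
        proxy_buffering off;                     # keep the MJPEG stream live
    }
}
```

A VPN, as other commenters suggest, avoids exposing even the proxy; this is just the lighter-weight option.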
markbnj 1 day ago 1 reply      
Systems that provide an online account tied to a physical device have to be carefully designed for transfer of ownership scenarios, and it sounds like they didn't do the work here, or else something went wrong and the resulting error state is unfortunate.
nateguchi 1 day ago 2 replies      
You can more than likely pick up the serial through the web-admin panel that these cameras expose on the local network.

God forbid they have a wireless AP with the serial number somehow encoded in the SSID.

How is it that these companies still don't give security a passing concern?

geofffox 1 day ago 0 replies      
I had the same problem with a WD home server. I returned it when it wouldn't do what it was supposed to do. Later, I started receiving emails from the server as it kept me up-to-date on its status.
mtkd 1 day ago 1 reply      
I guess the devops team can view all of them
wepple 1 day ago 0 replies      
this is a general class of problems that is only going to get bigger.

When I returned my lease car I had to have a bit of a think about what might be sync'd from my phone via bluetooth with it, and what functionality existed to erase that. The answers didn't make me feel great.

The fun pastime of buying old HDDs off ebay and carving deleted files off them to see what might be kicking about is going to get a whole lot more interesting as society moves toward everything-connected.

reiichiroh 14 hours ago 0 replies      
I can't tell, but it doesn't seem like the OP reset the devices before returning them. Isn't this his or her fault then? Like having nude selfies on a phone and returning it without wiping the phone to factory defaults?
takeda 1 day ago 0 replies      
What's with the "cloud" security systems? Why don't they just provide hardware where you store the information locally?

Ignoring the privacy implications mentioned here, and that you essentially pay monthly/yearly for storage, if your ISP has an outage your security system becomes useless. It is also a weak point for smarter thieves (just make sure that Internet access is cut).

happyslobro 1 day ago 1 reply      
Wow. You know the situation is bad when you are actually better off implementing your own security as a bunch of Arduinos with webcam shields on the LAN and a server with a feature phone in the closet.

LOL, just look at this vigilant little bastard :p http://www.arducam.com/arducam-porting-raspberry-pi/arducam-... No one is sneaking up on that without leaving a mugshot.

dboreham 1 day ago 0 replies      
fwiw I recently started using the Samsung network camera sold by Costco (SNH-V6414BN), after various homebrew and RPi solutions over the years. It has an on-camera password that is set as part of the WiFi pairing process, so it is not open to this kind of attack. This password is separate from the cloud account credentials, so provided you don't ask the web site or mobile app to retain it (optional), without that password the camera content can't be accessed remotely (of course the firmware could be compromised, and I don't know if the password is adequately protected from eavesdropping).
Aelinsaar 1 day ago 3 replies      
Until people start demanding security, and become willing to pay for it, the IoT is going to be positively defined by this kind of nonsense. That, or some kind of legislative action I guess, but that seems like pure fantasy.
nxzero 1 day ago 1 reply      
Seen this same method applied to used equipment for sale, especially if it was stolen.

Basically, someone steals a laptop, wipes it, reinstalls the OS with backdoors, sells the laptop for cash, exploits backdoor access to own other devices, exploits owned devices, etc.

andrewclunn 1 day ago 0 replies      
Holy shit. Never buying off the shelf consumer grade security equipment now.
arca_vorago 1 day ago 0 replies      
Yet people still recoil as if in horror when I try to explain that this is one of the core reasons why gplv3 is so important. Look, we've lost the hardware freedom wars so far, but we still have software, and we can work on improving our hardware side as we progress.

One of the common arguments I hear in response is, "But open source doesn't pay, and therefore doesn't innovate as much."

While the lack of funds coming in isn't ignorable, innovation is always happening in the FOSS space, often surpassing the proprietary alternatives, often falling far behind as well. It still gives you the power to control your own systems, which is the freedom you can choose to not give up.

The only way you surrender your freedom is voluntarily.

hackney 1 day ago 0 replies      
Sounds like the security part is sorely lacking. That and someone needs to get a life.
China issues demolition order on worlds largest religious town in Tibet tchrd.org
480 points by nu2ycombinator  3 days ago   233 comments top 22
kweks 3 days ago 16 replies      
I was there literally three days ago. The place is nothing short of breathtaking. Nestled at 4200m, it was a three-day motorbike ride from Chengdu. Even at its foothills, you have no concept of the scale of the town hiding in the hills.

This will truly be a tragic loss - the town is much more than the 'slum' it is represented as - it's arguably the most important Buddhist learning institution in the world.

Photos for the interested from my visit: http://travel.ninjito.com/dump/2016-06-15-Larung-Gar/index.h...

Edit: Got a good internet connection, uploaded decent photos.

oi5zkc 3 days ago 3 replies      
I visited Yaqing/Yachen, mentioned at the bottom of the article, about a month ago. The experience was surreal. In the middle of nowhere, 4 hours from the nearest city, the monastery is a sprawling complex of huts around a bend in a river in the Tibetan tundra.

Here are some pictures: http://imgur.com/a/v4gYI

The hygienic conditions are very poor - people doing their business squatting on the streets, no toilets, rubbish everywhere. I am surprised that was not an argument being made by the government for the demolition.

There is some concern about foreign - especially American and British - influence on Tibetan Buddhists in the government. I am not sure this move will serve to diminish this influence.

Ps: on another note, a surprising number of these ascetic monks had an iPhone 6s+ or Samsung S7 in their pockets!

andy_ppp 2 days ago 7 replies      
I do not understand the cowardice and fear the Chinese government seems to have of a small Buddhist settlement in the middle of nowhere, high up in the mountains of Tibet... It looks like they feel the opposite of a superpower - that a few Buddhist monks might threaten them so much they have to destroy them. It's very easy to forget how free we are by comparison to those persecuted in occupied territories.
huahaiy 2 days ago 3 replies      
Allow me to share some of my perspectives.

I grew up in Seda County in the late 70s and early 80s, and am intimately familiar with the culture there. Tibetan Buddhism is not what people in the west think it is. It is actually quite repressive and brutal. After the 1950s, many regular Tibetans were glad to worship the new religion of Chairman Mao instead. Yes, it was true. Chairman Mao was worshiped as one of the major gods in Tibet when I grew up there. Then Deng Xiaoping took power and demolished the Chairman Mao worship (one of his major blunders, on the same scale as that of the 89 Tiananmen massacre), and now we have this huge slum town of "religious learning" in a hotbed of Buddhist rebellion. Yes, left unchecked, that town would surely become a terrorist base, because Buddhist monks in that area had always been very militant and had launched numerous rebellions in the 60s and 70s. As a child, I heard all kinds of horrific stories of what Tibetan monks and their rebellious army inflicted on Chinese soldiers and civilians alike. For that reason, many Han Chinese families kept firearms at home in that area, a rare thing in China.

On the other hand, Tibetan people in general are good people. One of my cousins married a Tibetan man and we are good drinking buddies. However, the Tibetan religious upper class is representative of theocracy at its worst: greedy, deceptive and brutal.

I am surprised that this town was tolerated for so long. I guess Deng's power was still strong even after his death.

mhuffman 3 days ago 5 replies      
Wow! It is amazing that we do not consider this a human rights violation. Oh well, gotta keep them prices low at Walmart!
icebraining 3 days ago 0 replies      
Here are a bunch of photos from the place: http://www.dailymail.co.uk/news/article-2349761/Little-boxes...
tomglynch 3 days ago 0 replies      
"In 2001, Chinese authorities implemented similar crackdown on Larung Gar by destroying thousands of monastic dwellings and expulsion of monastic and lay practitioners, some of whom died of shock or resorted to suicide, while some were rendered mentally unsound."
eggy 3 days ago 0 replies      
This will bring the Free Tibet movement to the front again for the next POTUS.

I am sure China will put the 'safety' issue forward, and deflect the independence movement. Remember the earthquake of May 12, 2008 in Sichuan province, China, where over 68,000 people died, and more went missing. The focus in the media was on the houses not being up to standard building codes.

The earthquake in Nepal in 2015 surely affected Tibet too, but information was controlled by China, so the numbers are questionable. Larung Gar is a sprawl of houses for monks, worshipers, students and visitors that could be seen as a potential earthquake hazard area as spun by Chinese media.

nxzero 2 days ago 1 reply      
China's obsession with the destruction of Tibet and its culture is truly troubling.

If China has its way, hundreds of years from now, Tibet will be gone, no record of it will exist, etc.

31reasons 2 days ago 0 replies      
Of all people, the residents of the town will be the least unhappy about it, because they truly know everything is impermanent. It's their practice, and they will see it as part of nature. Today I shall not buy anything Chinese. I wish there was a supermarket that sold things not made in China.
whistlerbrk 3 days ago 1 reply      
Remember when free trade was going to democratize and open China? What happened? This is terrible
ommunist 1 day ago 0 replies      
So what? This is normal procedure. The authorities set a ceiling of no more than 5000 dwellers there. That was ignored. Face the consequences.

For those curious - China is colonising Tibet; nothing wrong in an industrial nation wiping out the weakling. When the British colonised the Americas they did the same with more than 600 First Nations. They just did it earlier.

mac01021 3 days ago 1 reply      
I didn't notice in the article any mention of why there is a population limit set for this town.

Is it explicitly a measure taken to limit solidarity in a religious/ethnic minority? Is there a concern about the food supply?

eisvogel 3 days ago 0 replies      
This is ass. It's time for a regime change.
sbose78 3 days ago 3 replies      
Any way of getting the UN involved?
colordrops 2 days ago 0 replies      
And yet people are more concerned about ancient statues blown up by muslim extremists. This seems like it shall be a greater tragedy by at least an order of magnitude.
reustle 3 days ago 4 replies      
Not saying this is right, but it sounds like the city exceeded their legal cap of 5k people? So it doesn't sound like it's a total surprise to them. Am I reading it wrong?
sova 2 days ago 0 replies      
What great sadness and folly. Oh may the future be free of such mindless pursuit of conquest.
gscott 2 days ago 1 reply      
China has some seriously evil leaders.
discardorama 2 days ago 1 reply      
Chinese are bullies. And bullies pee in their pants when faced with real resistance. That's the best way to describe it.
jmknoll 3 days ago 3 replies      
We detached this subthread from https://news.ycombinator.com/item?id=11928168 and marked it off-topic.
Were pretty happy with SQLite and not urgently interested in a fancier DBMS beets.io
388 points by samps  1 day ago   145 comments top 29
SwellJoe 1 day ago 8 replies      
I never stop being impressed at how often people will jump to odd, unsupportable conclusions like, "using MySQL will make this thing faster".

I've seen it so many times over the years regarding users and email configurations. I can't count the number of times I've dropped into someone's badly behaving mail configuration and found they had MySQL hosting the users, and explained it was for "performance" reasons. Somehow they didn't grasp that /etc/passwd fits entirely in memory, and the map files Postfix uses for various lookups and stuff are already a database (just one specifically designed for the task at hand) and also fit entirely in memory. Putting that MySQL layer in there is disastrous if performance matters; it is orders of magnitude slower for any case I've seen...still plenty fast for most cases, but it's ridiculous that this idea gets cargo-culted around that if you store your mail users in MySQL your mail server will be faster.
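
The map-file point is easy to demonstrate. As a rough analogy only (not Postfix's actual code), Python's dbm module opens the same kind of serverless key/value file that a Postfix hash: table is — a library call against a local file, no daemon in the path:

```python
import dbm
import os
import tempfile

# Build a tiny lookup map on disk (hypothetical addresses; a Postfix
# hash: table is the same idea, a Berkeley DB file built by postmap).
path = os.path.join(tempfile.mkdtemp(), "virtual")
with dbm.open(path, "c") as db:
    db[b"alice@example.com"] = b"alice"
    db[b"bob@example.com"] = b"bob"

# Lookups reopen the file read-only; no server round-trip involved.
with dbm.open(path, "r") as db:
    resolved = db[b"alice@example.com"].decode()
print(resolved)  # alice
```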

A little knowledge is a dangerous thing, is what I'm trying to say, and people who know MySQL is "fast" may not know enough to know that it's not the right tool for the job in a lot of cases...and is probably slower for many use cases. I'm pretty confident this is one of those cases. SQLite is wicked fast on small data sets, and being smaller means more of it will fit in memory; I can't think of any way MySQL could be a more performant choice for this workload.

Also, I don't even want to try to imagine shipping an installable desktop application for non-technical users that relies on MySQL!

Annatar 1 day ago 0 replies      
Wow, the article is such a fresh breath of air, primarily because the author demonstrates common sense.

He (they?) picked SQLite for all the correct reasons:

- best tool for the job for their situation;

- write-light and read-heavy;

- zero configuration;

- easy to embed;

- understanding that optimizing queries by far gives the best performance in the shortest amount of time.

As an aside, I'm currently using SQLite for Bacula and Postfix, and it's a joy to use; the only drawback I found so far is lack of REGEXP REPLACE in the SQL dialect which the database supports (must be loaded with .load /path/to/lib/libpcre.so, but it is not part of the language). I used the Oracle RDBMS for my PowerDNS deployments, but in retrospect, the way PowerDNS works, SQLite would have been an even better match. All in all, it is great to read that someone picked it for all the correct reasons, rather than some fashion trend, as is often the case in computer industry.

chjj 1 day ago 5 replies      
Premature optimization is evil, but preemptive optimization is necessary unless you want to paint yourself into a corner. I realized this after implementing a bitcoin full node.

In my bitcoin implementation, as an experiment, I tried storing the blockchain in sqlite, postgres, and leveldb. I gathered up a bunch of data from the first ~200k blocks of the blockchain and benchmarked all three databases. I queried for something like 30,000 utxos out of a set of a couple million. What took 300-400ms in leveldb took 1.6 seconds in postgres (on the repl; in my actual node it would have taken longer due to deserialization of the utxos). What took 1.6 seconds in postgres took over 30 seconds in SQLite.

Now, you can tell me I did the benchmarks wrong, and "oh, if you just did this it would be faster!", but 30+ seconds is slower to an absolutely insane level. Needless to say, I went the key-value store route, but I was still astounded at how slow sqlite was once it got a few million records in the database.

I actually like sqlite, but when you know you're going to be dealing with 70gb of data and over 10 million records, preemptive optimization is the key. If I were the author, I would consider switching to postgres if there are over 500k-1m records to be expected. That being said, if they're partial to sqlite, SQLightning (https://github.com/LMDB/sqlightning) looks pretty interesting (SQLite with an LMDB backend).

edit: To clarify, these weren't particularly scientific benchmarks. This was me timing a very specific query to get an idea of the level of data management I was up against. Don't take my word for it.
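
For readers who want to poke at this shape of workload themselves, here is a hedged sketch of a batch key lookup in SQLite (invented schema, not the actual benchmark above); whether an index exists on the key tends to dominate numbers like these:

```python
import random
import sqlite3
import time

# Load a couple hundred thousand key/value rows (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE utxo (txid TEXT, value INTEGER)")
conn.executemany(
    "INSERT INTO utxo VALUES (?, ?)",
    (("tx%08d" % i, i) for i in range(200000)),
)
# Without this index, each of the 30,000 lookups below is a full scan.
conn.execute("CREATE INDEX idx_txid ON utxo (txid)")

wanted = ["tx%08d" % random.randrange(200000) for _ in range(30000)]
t0 = time.time()
found = [
    conn.execute("SELECT value FROM utxo WHERE txid = ?", (w,)).fetchone()
    for w in wanted
]
print("%d lookups in %.2fs" % (len(found), time.time() - t0))
```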

int_19h 1 day ago 1 reply      
While we're speaking of SQLite; one thing that has little exposure that could probably use more is that it now ships with Windows as a system DLL:


Between that, and packages readily available on most Linux and BSD distros out there (and, in most cases, installed by default), it's well on its way to become a de facto standard system API for relational storage.

c-smile 1 day ago 0 replies      
If to speak about desktop applications then any embedded DB will be unbeatable.

So I am speaking about embeddable DBs here.

Konstantin Knizhnik has implemented an impressive set of various embedded DBs: http://garret.ru/databases.html

His POST++, for example, has a direct mapping to C++ classes, so if you use C++ then you don't need any ORM.

In my Sciter[1] Engine I am using his DyBase library [3] as built-in persistence for Sciter's script [2] (JavaScript++).

With DyBase in script you have features similar to MongoDB (noSQL free-form DB) but without any need for an ORM or DAL - you can declare some root object to be persistable and access those data trees as if they are JavaScript objects. The engine pumps objects from the DB into memory when they are needed:

 var storage = Storage.open(...);
 var dataRoot = storage.root; // all things inside are persistable
 dataRoot.topics = []; // flat persistable list
 dataRoot.topics.push({ foo:1, bar:2 }); // storing object
 /* create indexed collection with string keys,
    keys can be unique or not */
 dataRoot.titles = storage.createIndex(#string);
DyBase has Python bindings too.

[1] http://sciter.com - multiplatform HTML/CSS UI Engine for Desktop and Mobile Application

[2] TIScript - http://www.codeproject.com/Articles/33662/TIScript-Language-...

[3] DyBase - http://www.garret.ru/dybase.html

niftich 1 day ago 4 replies      
I was unfamiliar with this project and assumed it was a hosted service at first. Not so, this is a local application, so an embedded database makes sense.

It took until the very last paragraph for the blog post to make that point.

Feneric 1 day ago 3 replies      
SQLite also does remarkably well with recovering from all manner of power loss / crashes / worst case scenarios. We created a "power loss" rig just to test this facility for one particular system. Really SQLite's biggest weakness is concurrency, and if your app needs that in any serious amount you probably ought to look elsewhere. If you're just dealing with occasional concurrency though SQLite shouldn't be dismissed out-of-hand.
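
Handling "occasional concurrency" in SQLite usually means turning on write-ahead logging. A minimal sketch — journal_mode is a standard SQLite pragma, though the table here is invented:

```python
import os
import sqlite3
import tempfile

# WAL lets readers proceed while a single writer holds the database.
path = os.path.join(tempfile.mkdtemp(), "app.db")
writer = sqlite3.connect(path)
mode = writer.execute("PRAGMA journal_mode=WAL").fetchone()[0]
writer.execute("CREATE TABLE t (x INTEGER)")
writer.execute("INSERT INTO t VALUES (1)")
writer.commit()

# A second connection reads concurrently without blocking on the writer.
reader = sqlite3.connect(path)
value = reader.execute("SELECT x FROM t").fetchone()[0]
print(mode, value)  # wal 1
```

WAL still allows only one writer at a time, so write-heavy multi-client workloads remain the case to look elsewhere for.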
coleifer 1 day ago 0 replies      
Classic HN bait.

You don't even need to read the comments to know what people will say:

"SQLite is a great fit for this type of application. It's a replacement for fopen people. fopen."

"What about ALTER TABLE?"

"It's just a toy database, it doesn't even support concurrent writers"

----- "WAL mode"

"Hey, golang, rqlite"

----- "Whoa I wrote something similar for a ..."

----- "Why would you use this? Just use postgres"

"SQLite is the best database ever"

"SQLite is the worst database ever"

omarforgotpwd 1 day ago 1 reply      
For any database that isn't huge, a library embedded into your application is going to be faster than anything that has to communicate with a server over a socket connection. Though both execute SQL queries, SQLite is completely different from relational database servers and appropriate in many places where running a full RDBMS is not. For example, you can't run MySQL or Postgres on the iPhone, but you can use SQLite.
chillacy 1 day ago 3 replies      
> were read-heavy and write-light

> we have almost no indices, no principled denormalization

Sounds like an easy win. People are probably suggesting a database switch because they're finding issues with the current speed, but they're not using their current database to its full potential yet.
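
That win is cheap to verify with SQLite's EXPLAIN QUERY PLAN; a sketch with a hypothetical table (not beets' real schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (album TEXT, title TEXT)")

def plan(sql):
    # The last column of each EXPLAIN QUERY PLAN row is the detail text.
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " ".join(str(r[-1]) for r in rows)

before = plan("SELECT * FROM items WHERE album = 'x'")
conn.execute("CREATE INDEX idx_album ON items (album)")
after = plan("SELECT * FROM items WHERE album = 'x'")
print(before)  # a full-table scan
print(after)   # a search using idx_album
```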

jwatte 1 day ago 0 replies      
SQLite is fine for small-scale systems. It is not a "web scale" database, but not every web site is "web scale." SQLite does have performance limits, and will break at a certain load, but until then, it's okay. For single-user databases, like desktop applications, SQLite is awesome! What the others bring to the table is concurrent server performance, user management, and such. There's nothing surprising about this, right?
cyberferret 1 day ago 1 reply      
I'm a long time user and lover of SQLite, since way back when. Use it in a lot of our projects (web and Win32) that require local databases for logging etc.

Sure, for larger databases or concurrent user access we use other databases, but nothing beats the 'zero footprint' install of SQLite. I even enjoyed the fact that earlier versions of SQLite forced me to think about optimising my queries due to the lack of nested SELECTs.

SQLite still kind of reminds me of my early days in MySQL. I was recently trying to downgrade MySQL 5.7 to 5.6 after a slew of issues, which forced me to reminisce about how simple things used to be when MySQL was a lot like SQLite still is now...

pskisf 1 day ago 0 replies      
You're doing it right for your application! MySQL or PostgreSQL would most probably be slower and introduce a lot more overhead as they are client/server oriented systems. Don't listen to those armchair architects!
oppositelock 1 day ago 1 reply      
I don't get the point of this article. SQLite is fine, especially as an embedded database, but once you have concurrent access it starts to suffer because it has very coarse-grained locks, so a "real" database is better for a distributed single-DB design. It's more about using the right tool for the job, and the author seems to be talking himself out of some kind of guilt about SQLite being the right tool for him.
jedberg 1 day ago 0 replies      
I can't blame them. I've been a huge fan of SQLite for years. Anytime I need storage it's my default choice unless there is a specific reason to use something else.

Another nice advantage of it is if you are distributing something that requires a small dataset[0][1]. If I give you both the code and the data already imported into a sqlite database, then you can use the code right away, or you can dump the data to a different database very easily.

[0] https://github.com/jedberg/wordgen

[1] https://github.com/jedberg/Postcodes

zaphar 1 day ago 0 replies      
Do people actually seriously suggest that a desktop application needs more than SQLite offers in the way of databases?

Desktop apps are like the sweet spot for sqlite. It's practically made for them.

qwertyuiop924 1 day ago 0 replies      
Suggesting you add a server dependancy to your desktop app as a solution to a problem that isn't there is pretty braindead.
therealdrag0 20 hours ago 0 replies      
For everyone loving SQLite, you should consider donating to them. I remember a post this last year about the maintainers working on it full time, but making much less than most of us probably do.
tedmiston 1 day ago 2 replies      
I have used SQLite for similar use cases, but occasionally it's led to a corrupted db. I had a cron task writing to it once a day, but an issue with the scheduler led to 2 tasks one day with the latter one finishing before the former.

Of course I can add locking or something in my code, but I'd prefer to handle it at a lower level: for example, have SQLite take the latest write without corrupting. I'm hoping someone has solved this problem with SQLite elegantly.
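For what it's worth, SQLite ships knobs for roughly this scenario. A sketch (the schema, file name, and values are made up): WAL journaling plus a busy timeout makes a second writer wait on the lock instead of failing, and `INSERT OR REPLACE` keeps the latest write for a given key:

```python
import sqlite3

conn = sqlite3.connect("tasks.db", timeout=30.0)  # wait up to 30s on a locked db
conn.execute("PRAGMA journal_mode=WAL")    # readers no longer block the writer
conn.execute("PRAGMA busy_timeout=30000")  # writers retry for 30s before erroring
conn.execute("CREATE TABLE IF NOT EXISTS runs (day TEXT PRIMARY KEY, result TEXT)")
# If two tasks race on the same day, the later commit simply wins.
conn.execute("INSERT OR REPLACE INTO runs VALUES (?, ?)", ("2016-06-21", "ok"))
conn.commit()
conn.close()
```

This doesn't prevent two tasks from running, but it does keep overlapping writers from erroring out or stepping on each other mid-transaction.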

franciscop 1 day ago 0 replies      
I didn't know beet, but it looks exactly like what I've been wanting for years.
kefka_p 1 day ago 1 reply      
Some people are unfamiliar with the phrase "right tool for the job".

Being the developers behind the project, the authors are surely in the best position to determine which tool is appropriate.

chmike 1 day ago 1 reply      
SQLite is Ok, but write access must be synchronized. I used it for my Flask (Python) application and was forced to switch to PostgreSQL because of synchronization problems. I would prefer sticking with SQLite which was simpler to manage.

The author doesn't say a word about synchronization when writing to SQLite.
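A common workaround for that synchronization problem (a sketch of the general technique, not anything the author describes) is to funnel every write through a single thread, so SQLite only ever sees one writer no matter how many request handlers are running:

```python
import queue
import sqlite3
import threading

write_q = queue.Queue()

def writer():
    # The one and only writing connection; SQLite is happy with any
    # number of readers as long as writes are serialized.
    conn = sqlite3.connect("app.db")  # hypothetical database file
    conn.execute("CREATE TABLE IF NOT EXISTS log (msg TEXT)")
    while True:
        item = write_q.get()
        if item is None:          # sentinel: shut down cleanly
            break
        conn.execute("INSERT INTO log VALUES (?)", (item,))
        conn.commit()
    conn.close()

t = threading.Thread(target=writer)
t.start()
write_q.put("hello")   # request handlers just enqueue and return
write_q.put(None)
t.join()
```

In a Flask app the queue would live for the life of the process and the sentinel would only be sent at shutdown.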

ww520 1 day ago 2 replies      
I wish HTML5 storage had standardized on SQLite. The inconsistent story on HTML5 storage across browsers is kind of sad.
nickysielicki 1 day ago 0 replies      
Beet is an awesome program; you should really check it out if you're still among the minority of people who actually have a music collection and don't rent access from spotify/itunes/etc.

I'm glad to see this post; one of the reasons that I like beet so much is that everything is self-contained.

Ultimatt 1 day ago 0 replies      
The bigger news here is that they aren't using an ORM to make moving between databases trivial.
partycoder 1 day ago 0 replies      
Even a file can be convenient. It's all about how you integrate it into the system.
known 1 day ago 0 replies      
Firefox uses SQLite
smegel 1 day ago 1 reply      
This was a useful and informative post.

Where on earth do you think the author is asking for a 'pat on the back' in this?

coleifer 1 day ago 0 replies      
I have written several "Use SQLite!" posts that have made the rounds on Hacker News... Reading this watered-down post, which is devoid of any new, surprising, or usable info, it strikes me that repping SQLite has achieved meme status.

If you want tangible info you can actually use, read sqlites documentation. There's a wealth of information there.

Here are some of my posts, for the Python crowd:

When everything else fails, amateur radio will still be there and thriving arstechnica.com
355 points by Tomte  1 day ago   176 comments top 22
davidwihl 1 day ago 0 replies      
I was a volunteer ham working a first aid station during the 2013 Boston Marathon bombing. The cell bandwidth was rapidly overwhelmed. Ham continued to work, including amazing professionalism on the part of Net Control when things got really busy. It's battle proven.

FMI http://www.arrl.org/news/radio-amateurs-provide-communicatio...

grandalf 1 day ago 4 replies      
Amateur Radio is a great hobby if you're interested in radio wave propagation, RF engineering, radiosport, digital modes, satellites, etc.

It's surprisingly easy to get a license, and you'll find that many of the older generation of radio amateurs are among the most young-at-heart oldsters you'll encounter.

walrus01 1 day ago 4 replies      
When everything else fails because a massive catastrophe has hit your local area, even something as big as a 9.2 earthquake destroying most of California... Everything satellite based (that is not dependent on teleports in CA) will still function fine. Amateur radio is nice and all for voice communication, but for IP data you could still use:

a) C, Ku and Ka-band VSAT terminals via geostationary satellite, to earth station anywhere else in the same hemisphere. Example: 1.2m VSAT in CA, teleport in TX.

There are all sorts of mobile VSAT systems including auto-aim/auto tracking antennas and military grade ones that will fit into a large backpack.

b) Handheld satellite phones: Iridium phones will work fine after a huge clusterfuck disaster. And run on a lot less power than a ham radio rig. They use a LEO satellite network. The Inmarsat iSatphone talks to the I-4 series of geostationary satellites and will work fine.

c) Portable L and S-band laptop sized Inmarsat terminals (BGAN), again speaking to the I-4 series satellites. These are about the size of a fat laptop and also require a lot less power than a ham radio setup. Speeds from 100 to 500 kbps depending on spot beam capacity/utilization and TDMA contention ratio. Some have built in wifi hotspots, others have a 100BaseTX interface to plug in your own router.

You can do all sorts of useful VoIP tricks with Iridium and Inmarsat satellite phones - both services offer regular US NPA DIDs that ring on your phone, and it's easy to set up a phone with a short 50 ohm coaxial cable to an exterior roof antenna if you need to semi-permanently install one on the desk of an indoor command center/disaster relief comms post.

edit: The major use of geostationary satellite in a disaster is to repair and bring back online a broken/islanded TCP/IP network. You can show up to a completely off-net command center (for example: Disaster operations HQ for City of San Francisco) and bring it back online to the outside world by parking a 1 to 2 meter sized VSAT dish on the roof and connecting a satellite modem to the WAN uplink of their router. Satellite serves a different and complementary purpose to ham radio which is almost purely analog voice in a disaster scenario. Two people can carry the equipment needed to bring a 5 Mbps x 5 Mbps pipe with 0.0% packet loss.

slr555 1 day ago 1 reply      
For me the best thing about HAM is that it makes you learn before you can play. For a non-engineer such as myself, having to learn the electronics, physics, antenna design, FCC rules, etc. forced me to acquire knowledge I probably wouldn't have otherwise gone out of my way for as an adult. I got my General class license a few years ago and keep trying to get myself motivated to go for my Amateur Extra license, but doing HAM in NYC is hard. Unless you are lucky enough to have someplace you can put up some kind of antenna, you are largely limited to a 5-watt HT. There is so much exciting stuff happening in HAM today: SDR systems are very exciting, as are all the internet-hybrid modes. It is super fun and I do hope a new generation gets psyched about it and drives innovation.
brian-armstrong 1 day ago 1 reply      
One of the things I like about amateur radio is that it teaches you to respect a common good (in this case, the spectrum). Hams seem to really understand this concept. It's easy to be defeatist and cry about tragedy of the commons but in amateur radio people are largely respectful and abide by the etiquette.
kqr 1 day ago 2 replies      
So for someone completely uninitiated: what's the smallest and cheapest possible step into this world? I'm not ready to dive into it fully, but I feel like a good first step might be to just get portable, cheap equipment that lets me tune in and listen to broadcasts on various frequencies. Does my thinking make sense?

Background: I have ADHD so I have to force myself to not jump in at the deep end whenever I hear of something novel and cool.

peterkshultz 1 day ago 0 replies      
My father introduced me to amateur radio at a young age. I got my license as a teenager.

I play with a mode of communication called Earth-Moon-Earth, or EME. The idea is to bounce signals off the moon and have them get picked up by a pre-arranged partner back on Earth. It feels cutting-edge.

Were more people exposed to such off-the-wall applications of ham radio, I think there'd be a resurgence in the hobby.

qwertyuiop924 1 day ago 2 replies      
I got my license in the 6th grade. Which wasn't all that long ago for me. We helped establish a local radio club. Like programming, amateur radio can be very intimidating, but isn't ultimately that complicated.

On a side note, megabit speeds on HAMNET? Holy Crap. Most packet radio only talks maybe 9600 baud max. Hmm. Come to think of it, Linux does have kernel-level AX.25 networking support... Anybody up for Quake over radio? :-P
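For anyone tempted by that side note, a rough setup sketch for the kernel AX.25 stack (assuming the ax25-tools package, a KISS TNC on /dev/ttyUSB0, and a placeholder callsign and amateur-radio IP address; all of those are hypothetical):

```shell
# /etc/ax25/axports needs one line per port:
#   portname  callsign  speed  paclen  window  description
#   radio     N0CALL-1  9600   255     2       Packet port (placeholder callsign)

# Attach the TNC as a kernel network interface (creates ax0) and bring it up:
sudo kissattach /dev/ttyUSB0 radio 44.0.0.1   # 44.0.0.0/8 is amateur IP space
sudo ifconfig ax0 netmask 255.0.0.0 up
```

At that point ax0 behaves like any other (very slow) IP interface, which is what makes things like Quake-over-radio at least theoretically possible.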

whamlastxmas 1 day ago 4 replies      
Amateur radio will thrive but only relatively speaking - it will still be pretty useless.

1. Without repeaters, which in the best situations only have enough battery for less than a day, you will not be able to reliably communicate farther than 10-20 miles in most circumstances. With handheld devices, only a couple of miles.

2. For repeaters that do manage to stay up, even less than a day, they are usually exclusively for emergency response use only.

3. Anyone you need to talk to has to have a radio. Most people don't. Most people don't even know someone who has one.

I looked into amateur radio as a tool for emergency situations and found that its usefulness was pretty limited. If a natural disaster hit my city and I needed to communicate with family in a city 300 miles away, it would be pretty complicated and expensive to do so without a repeater, and repeaters can't really be relied upon in situations like that. My state even has a repeater network that reaches most major cities in the state, but given that only a single person can talk on it at a given time, the opportunity to talk to my family over it during a natural disaster seems pretty unlikely.

mikegioia 1 day ago 5 replies      
This is really cool. The article mentions that encryption is illegal over these radio frequencies, but why is that? Are people actively detecting encrypted data?

It would be cool to experiment with these radios but have it all communicate using TLS or something.

vanous 1 day ago 1 reply      
I have an antenna ready to be raised and a station in a box... but other than feeling like a prepper I have no practical use for it. The network effect applies here too, like for any other social network. Only old people here on the waves, checking the weather daily. I was excited years back, but now it feels like a dying breed.
agumonkey 1 day ago 1 reply      
There are videos on youtube where some guy scans very long range (across continents, maybe leveraging atmospheric reflections) and randomly connects with dudes up high somewhere in Siberia. I felt like I was living 2001: A Space Odyssey in real time. Since then I've wanted to get into HAM.
gp10 1 day ago 0 replies      
The intersection of Ham radio and SDR (software defined radio) is proving to be quite interesting.
joeyspn 1 day ago 3 replies      
My neighbour (next door) is a radio aficionado, and a thing that worries me is that sometimes, when I have headphones or speakers connected to the computer, I can hear him speaking... Does anyone know how this is possible if the MacBook does not have a receiver? This has me puzzled...

I find amateur radio somewhat interesting, but on the dev level. I was about to buy a HackRF, and I'll probably do it when I have more free time...

carlesfe 1 day ago 1 reply      
It seems there are some ham radio operators here; I hope someone can help. I'd like to get introduced to this world: is there some guide I can read to learn what I'll need to study and which equipment I'd need to buy in order to get started?
imglorp 1 day ago 2 replies      
> When everything else fails...

What about nuclear war or a solar mass ejection event? I'm thinking a big fat EMP will smoke the semiconductors in most amateur rigs: they often have big antennae and sensitive pre-amps.

Is there any RAD-hard amateur gear?

wprapido 1 day ago 0 replies      
Amateur radio was used in the civil wars in Yugoslavia back in the '90s as a way to spread news, and as a way for friends and relatives on opposing sides to kind of stay in touch.


omginternets 1 day ago 2 replies      
This might seem like a silly question, but what does one do with a HAM radio license? Just talk to other people?

I have trouble seeing where the creative, build-cool-shit part comes in, though I'd love to be wrong about this!

tmpanon1234act 1 day ago 0 replies      
Pretty fun read. I've been into amateur radio for a while, so here's hoping it one day proves more useful than just a frivolous hobby :)
elcapitan 1 day ago 0 replies      
When everything else fails, I'll write the message on a piece of vellum.
nxzero 1 day ago 1 reply      
Given the lack of privacy, security, etc. - using amateur radio is a no go for me.
nateguchi 1 day ago 0 replies      
*if everything else fails
Docker Betas for AWS, Azure, Mac, and Windows docker.com
304 points by petemill  22 hours ago   97 comments top 13
dang 20 hours ago 0 replies      
We merged the threads on these announcements. The Mac beta is at https://www.docker.com/products/docker#/mac and the Windows one at https://www.docker.com/products/docker#/windows.
parent5446 21 hours ago 6 replies      
What happened to the good ol' Unix philosophy? The docker command used to be about containers, not service and network scaling in the cloud.
julienchastang 20 hours ago 5 replies      
Slightly off topic. What are people doing about user data persistence on the cloud/Docker? Specifically, we are porting a desktop application to the cloud via application streaming technology with Docker, but we would like the user's data and preferences to not go "poof" when the cloud instance disappears. Ideally, we would like some automagic way to attach, say, the user's dropbox account or the equivalent to the cloud instance. Is anyone working on that problem?
Sanddancer 15 hours ago 1 reply      
They need to put a note on their page that Docker for Windows is not compatible with Windows containers. I've been playing around with Windows containers for a few days, then saw this and thought, "cool, an update." I installed it and discovered that while the client is compatible between the two, each daemon cannot see the containers created by the other. MS and Docker need to sort this stuff out, because right now Windows containers are nicer for the few images that have been released, but Docker for Windows allows for the full Docker ecosystem.
mwambua 6 hours ago 0 replies      
IBM's Bluemix has supported Docker Containers for a while now, but hasn't gotten much limelight... probably as a result of the size of their community and Bluemix's sketchy [but improving] stability. Does anyone have any experience using their container service? And would this be a big improvement?
andor436 21 hours ago 2 replies      
As usual with AWS (or anything I guess) there's more than one way to accomplish a particular goal. How much of Amazon's Elastic Container Service is replicated by Docker for AWS? I'm currently using ECS + Docker but this looks potentially simpler.
spilk 11 hours ago 1 reply      
Not a fan of the new Windows version as it requires you to enable Hyper-V, which stops any other virtualization (Virtualbox, VMware, etc.) from functioning. The only workaround I've found is rebooting to enable/disable it on demand.
jowiar 21 hours ago 2 replies      
Long-term, where does using something like this make sense vs. Mesos?
GordonS 8 hours ago 1 reply      
And the Windows version is still Windows 10 only :/

Many large organisations are going to be tied to Windows 7 for a good while yet...

FloNeu 6 hours ago 0 replies      
Now even Docker wants me to upgrade to Windows 10 ^^ Keep up the great work!
tacos 19 hours ago 2 replies      
They never emailed me from the last "private beta" before announcing this. And now deploying this on AWS or Azure requires me to sign up for yet another "we'll get back to you..." private beta.

Love the tech, hate all this marketing runaround.

EDIT to add: either your beta is ready or it ain't. I understand a gentle initial seed to verify it's not an utter catastrophe but no need to play nanny with my bits across 20+ beta releases. I'm a grown up. Make a disclaimer and let me assess the risk. Your corporate logo looks like a 1970s Carvel ice cream cake--I know what I'm getting myself into.

zymhan 20 hours ago 1 reply      
Am I the only one getting SSL errors trying to connect to their blog?
pgz 20 hours ago 2 replies      
I don't understand why Docker for Mac is a GUI application. I'd rather get the same features from the CLI.
Y Combinator's Xerox Alto: restoring the legendary 1970s GUI computer righto.com
298 points by kens  2 days ago   105 comments top 15
Animats 2 days ago 1 reply      
I haven't used one of those in a long time.

Stanford had several Alto machines, but they didn't have Smalltalk, due to some licensing issue. They just ran standalone Mesa programs. When I was at Stanford, few people wanted to use the obsolete Altos, so time on them was available. So I did a small project on them.

Bravo was used as both the text editor and the word processor. The file format was plain text, then a control-Z, then the formatting info. The compiler stopped at control-Z. So you could use bold and italic in your programs, and make the source code look good.
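That split is easy to picture; a toy sketch in Python (the sample bytes are invented, not a real Bravo file):

```python
# A Bravo file: plain text, then a Ctrl-Z (0x1A), then formatting info.
raw = b"IF x > 0 THEN y := 1;\x1a{bold: 0-2}"   # invented sample bytes

# A compiler that stops at Ctrl-Z sees only the first part;
# Bravo itself also reads the formatting runs after it.
source, _, formatting = raw.partition(b"\x1a")
```

So bold and italic source code cost the compiler nothing, since it never read past the separator.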

As in the picture shown, the Stanford machines had the keyboard and display on top of the computer. This isn't required, and it's really annoying to type on. The keyboard is great; it's a massive casting around clicky keys.

Altos talk PUP, the PARC Universal Packet protocol, over 3 Mbit/s coax Ethernet. Stanford had gateways to connect this to the wider world.

I think I still have some of the Alto manuals.

The vision statement for the Dynabook is in "Personal Dynamic Media"[1] This is worth re-reading every few years.

[1] http://www.newmediareader.com/book_samples/nmr-26-kay.pdf

dang 2 days ago 1 reply      
We've all been fans of Ken's blog for years, so were thrilled that he took an interest in this project. Not only is Ken doing these amazing writeups, he gathered together the master restorers and engineers, some of whom worked on Altos at the time, who are now working on this one. Seeing them set to it, inspecting the Alto and figuring out what would be needed, was a real lesson in self-organization. I felt honored just to watch from the side.

We have two goals. One is to have the restoration chronicled as it goes along, in a way the HN community can discuss and participate in. Obviously we hit the jackpot there, with one of the best technical bloggers in the world.

The other goal is to do something with the Alto that the community will find interesting once it's running. A couple ideas are to make it fetch and render the front page of HN (we'd happily write whatever code was needed to serve it in a suitable format, since HTML is probably a bridge too far), or if we could find a second Alto to communicate with, play Maze War on them (http://www.digibarn.com/collections/games/xerox-maze-war/#ma...). But we'll be eager to hear any suggestions the community comes up with!

gumby 2 days ago 4 replies      
I used the Alto at MIT (and for fun when I worked at PARC -- we had more powerful machines by then).

There are two other things about the alto that have really stuck in my mind. First, the whole thing uses only 300 SSI and MSI TTL chips! No higher order chips (no LSI, much less VLSI).

The other is that the bus bandwidth was only 3/2 the screen update rate. Updating the screen was really important: this was a user-centered, IO-focussed machine which was super radical for its time. If you wanted to do a lot of computation you could steal cycles from the screen update, causing it to go black (in just the bottom half or so IIRC which I probably don't).

Error in the article: I do believe the Alto was the origin of the BITBLT instruction, but it was based on the PDP-10 (PDP-6) block transfer instruction BLT, and the expression blitting was current before the Alto was developed. In fact PARC had a PDP-10 which was the standard research computer at the time -- homemade as well (clones) because at that time Xerox was in the computer business and wanted PARC to use an SDS. (Again this is before my time though MAXC was still running when I was there -- with an Alto as its front end!)

Also contrary to what the article says, the Alto display was not unusual in being portrait mode -- most glass TTYs (think ADM-3A, Hazeltine, VT-52, and I believe the 3270 as well) were taller than they were wide, like a piece of paper. The Alto display was unique, as mentioned, by being bitmapped and black on white. Because of the Alto, bitmapped portrait mode was standard for workstations such as the CADR Lisp machines, Bell's Blit terminal, the Three Rivers PERQ, and of course the later PARC computers we used: Dolphins, Dorados (all ECL logic!), Dandelion (sold as the Star). I remember vividly the first landscape machine I used, the Symbolics 3600 in 1985. I didn't, and still don't, appreciate the wasted space of landscape displays.

Three-button mice with the mouse buttons arrayed horizontally was also standard because of the Alto. The first time I saw the Macintosh mouse in 84 I was shocked: how could someone use only a one-button mouse? There was a lot of mouse (originally called the "bug") experimentation in the 70s on button count and layout.

The microcode of the Alto was compatible with the DG Nova, as that was the computer used at PARC before the Alto was developed (before my time!).

edit: forgot to mention the origin of blitting.

kabdib 2 days ago 0 replies      
We had some Altos (and a laser printer) at the Bureau of Standards, when I interned there in the late 1970s and early 80s. They had a number of games; one of them was written in Smalltalk, which you could break into, and then muck around in. Some screenshots from BYTE and a few papers gave us syntax hints, and we were off.

At one point we had some questions about Smalltalk-76 and called up Xerox PARC. Managed to get hold of Adele Goldberg, who answered our questions but was not terribly amused. I think Alan Kay would have been friendlier to us kids :-)

dom96 2 days ago 2 replies      
My first impressions of it are that the portrait oriented monitor actually looks very stylish. There is something almost futuristic about it.
progman 2 days ago 4 replies      
There is a lot of documentation about the Alto. Are there also complete circuit diagrams? I am just curious because the processor was made in TTL at a time before the 6502 and Z80 were born.

Sooner or later the last functional Xerox Alto will cease to work (sadly). In that case it could make sense to replace the dysfunctional parts with modern retro circuits. I wonder if a project to build a functional Alto clone (with a TFT as the screen) would take less effort than the famous MOnSter 6502 which was presented recently.

pmarin 2 days ago 0 replies      
The restoration is also being documented in Marc's youtube channel.


intrasight 2 days ago 0 replies      
Xerox Alto brings back some memories indeed. I used one at CMU as a frosh to do engineering drawings for the Terragator (http://gizmodo.com/5072167/25-years-of-strange-and-wonderful...). Being new to computers, I didn't really get that a mouse and GUI were revolutionary. It just seemed so obvious. But that is the beauty of innovation done right - that to users it just seems obvious and natural.
krylon 2 days ago 2 replies      
Total envy! There is an Alto in the Heinz-Nixdorf-Museum, but it is not functional.

Seeing one of these machines in action would be awesome. (Is there an emulator available?)

kilroy123 2 days ago 2 replies      
My uncle worked at PARC when this was being made. I probably wouldn't have gone into tech if it weren't for him.

This makes me want to message him and ask him about his time there.

pjmlp 2 days ago 2 replies      
This was a great system.

The more I research Xerox's papers and manuals for the Interlisp-D, Smalltalk and Mesa/Cedar systems, the more convinced I become that the adoption of inferior systems like UNIX was a big step back for the industry.

Thankfully many traces of those ideas are now in Windows, Mac OS X, Android and iOS, Language Playgrounds and many IDE workflows.

e12e 2 days ago 2 replies      
> The disk drive at the top of the cabinet takes a removable 2.5 megabyte disk cartridge. The small capacity of the disk was a problem for users, but files could also be accessed over the Ethernet from file servers.


> The Alto was introduced in 1973.

2.5 megabytes of removable/swappable storage? In the 70s? I'm amazed that users found it constraining! That's more than even the Amiga managed to fit on 3.5" floppy disks. (Unlike the PC, which generally could only format them to 1.44 MB, the Amiga typically fit ~1.8 MB on the raw 2 MB HD 3.5" floppies.)
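Those floppy numbers check out against the standard published disk geometries; a quick sanity check:

```python
# Standard HD 3.5" floppy geometries: cylinders x heads x sectors x bytes/sector.
# The Amiga's custom controller packed 22 sectors per track vs the PC's 18.
amiga_hd = 80 * 2 * 22 * 512   # raw capacity in bytes, ~1.8 decimal MB
pc_hd    = 80 * 2 * 18 * 512   # 1,474,560 bytes, marketed as "1.44 MB"
```

Either way, both are well under the Alto cartridge's 2.5 MB, some 15 years earlier.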

protomyth 2 days ago 1 reply      
There are so many points in the computer industry where one person's or a small group's decisions changed everything. After listening to "Dealers in Lightning" you wonder about quite a lot of decisions. One of the later ones got me thinking: what would have happened if Xerox had not gotten rid of its pre-IPO Apple stock and had allowed the Lisa team to license Smalltalk? I wonder how the sequence would have played out for the Mac at that point.
fabled_giraffe 2 days ago 2 replies      
The original Macintosh really seems like it borrowed a lot aesthetically, in addition to the similarity in operation:



acd 2 days ago 1 reply      
Did the Alto have a graphical chat system?
Facebook is wrong, text is deathless kottke.org
306 points by danso  3 days ago   167 comments top 58
gumby 3 days ago 8 replies      
> Human brains process it absurdly well considering there's nothing really built-in for it.

That's unclear. The sheer bitrate of reading suggests that it might tap into some deep structures -- hacking some parts of the visual and speech systems, if you prefer.

I don't click on HN video links because I find video a slow and frustrating way to learn almost anything. Text is so random access -- you can skip over boring bits, re-read hard bits, luxuriate in the really wonderful bits... all of which is hard in video. And in fact, because the visual channel is so complex, I find reading more multimedia than video -- it's hard to feel cold when watching someone march through the snow, though a well-written book can make me shiver with cold, even on a summer day.

toomanythings2 3 days ago 4 replies      
Well ... uh ... I think ... I think ... uh ... everyone can hear me? ... I'll wait till the guys in the back ... oh and the girls, too, right hahahahahah ... OK ... let's get started...

Sure. I know Facebook wouldn't put out videos like that, but it seems 80% of all videos linked to do start out like that, and that's the quality you're going to get from anyone worth listening to. Even the polished ones, though, blather on about things I don't care about, but I can't skim ahead in the video without worrying I'm skipping something I really do want to hear.

Thus, the advantage of text. To the point. Skimmable, both forward and backward with the ability to understand what you are skipping over.

Have you ever read transcripts from talks given? Don't you wish someone would have edited out all the garbage talk beforehand? And doesn't your mouse wheel finger hurt scrolling down the page, only to be told "to hear the rest of the video, click here".

Now let's talk about the weight of video files ....

sly010 3 days ago 3 replies      
A lot of us here work with information for a living. Of course we prefer the leaner, more information-dense medium.

Facebook is not a platform for communicating interesting ideas succinctly. I apologize for not coming up with a better way of saying this, but Facebook is catering to the not-so-sophisticated. The majority of users probably can't scan/process text very fast.

Facebook is TV. It wants to be TV.

terryf 3 days ago 3 replies      
I guess maybe I'm too old to understand (38) but to me, reading some text is simply way faster than watching someone read that text in a video... but they are saying the exact opposite.

This is completely baffling to me.

shiven 3 days ago 1 reply      
I can't stand video content unless the visual component is absolutely critical to transmission of the idea that is being communicated. I'd take a podcast/audiobook (with playback speed control) over 99% of videos that are shared on HN and a gazillion other sites.

The "information density" to "bandwidth" ratio (is there a term for it?) is seldom justified for the majority of video content.

Mendenhall 3 days ago 1 reply      
I can't stand videos for most information. By the time the video loads and the probably slow-speaking individual gets through just the introduction, I could have already read a more informative article/post.
notliketherest 3 days ago 1 reply      
This article borrows heavily from an article that Graydon Hoare (founder of the Rust programming language) posted a couple years ago - http://graydon.livejournal.com/196162.html
hanniabu 2 days ago 3 replies      
It'd be great if we could extract subtitles from video and put them into a nice text format that's easy to read, so it's skimmable. Then when you get to an area of interest, you could click the part you're interested in and a video would pop up and start from there. Then when you've heard what you wanted to hear, you can minimize the video and skim the text for the next area of interest, and so on.
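If the subtitles already exist as an .srt track (extracted with, e.g., ffmpeg), the text half of this idea is mostly mechanical. A simplified sketch with an invented sample; real SRT parsing would also need to strip formatting tags and keep the timestamps around for the click-to-jump part:

```python
import re

def srt_to_text(srt):
    """Drop cue numbers and timestamp lines from an SRT dump,
    keeping the dialogue so the result reads like a transcript."""
    lines = []
    for block in re.split(r"\n\s*\n", srt.strip()):
        rows = block.splitlines()
        # rows[0] is the cue number, rows[1] the "00:00:01 --> 00:00:04" range
        lines.extend(rows[2:])
    return lines

sample = """1
00:00:01,000 --> 00:00:04,000
Text is deathless.

2
00:00:05,000 --> 00:00:08,000
Video, not so much."""

print(srt_to_text(sample))
```

Mapping each kept line back to its cue's start time would give exactly the "click here to jump into the video" behavior described above.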
kalleboo 3 days ago 1 reply      
How much Facebook browsing is done discreetly in school, at work, in meetings, on public transport, when waiting for someone? Will video work in these situations? I doubt they'll be able to replace their current text with video.

It sounds more to me like they're looking at the bigger ad dollars YouTube is getting and want to absorb their market.

The only way for Facebook to grow now is to get out of the "friends and family" market and take over Twitter and YouTube's "celebrity/popular people" market. It seems like this could be a difficult pivot.

matchagaucho 2 days ago 0 replies      
The 90 / 9 / 1 rule is finally taking hold on Facebook.

With only 1% of the users generating content, 9% simply liking/commenting, and 90% logging in to just watch.... FB is desperate to satiate the immediate gratification needs of 1B+ people.

Video is the quick fix.


ivv 3 days ago 0 replies      
One way to read it is "people are increasingly preferring video over text, so that's where Facebook is going". I think what it really means is that Facebook sees more ad money in video than in text, so that's where they are steering the ship.
l33tbro 3 days ago 0 replies      
This is anecdotal, but I've still never watched one of those auto-play videos you see at the top of news articles. It's just so much more efficient to just skim the text below and find the relevant information you were after.
heisenbit 3 days ago 0 replies      
Humans are wired to seek validation.

Consumption of media that expresses resonant emotions is to a degree validating.

However, expressing and understanding myself, and seeing myself emotionally and intellectually understood by peers, is validation on another, deeper level. The composition and writing of text has been shown to excite other areas of the brain than simple speech does (there is a whole school of writing therapy).

There is always going to be a need for an immediate way to consuming and reacting. That market is served by twitter and snap-chat. Then there is the need for longer, carefully considered deeper thought. Thought where emotions have been deliberately moderated to provide breathing space for facts.

Video may provide more bits per second and, via the eyes, is more directly wired to our decision system. But the emotional space is already taken by Twitter and Snapchat. The deliberation space is taken by text. It is not clear to me that video will grow beyond an extended Snapchat.

0xCMP 3 days ago 0 replies      
This will force media companies to do something which they should have done a long time ago.

Think of all the videos you ignore on Forbes, WSJ, Bloomberg, etc. because you can't view them muted. Think of all the videos from Buzzfeed and others which aren't of such high caliber but are so easy to consume that you do.

This will force those with real content to publish that content in an accessible way. I'm fine with videos, especially muted ones, because if the current trends stay I think they'll be more useful to me and everyone else.

Text IS great. But some things, like a presidential speech or a short interview, need the visual element. Some things don't need the visual part, but a media company can make them better with it. The key will be keeping it short, because as others mention in the comments it's very difficult to skip around in videos for what you care about. Most of the videos I mentioned are already pretty short though, so I'm guessing that won't be much of an issue for them to adapt.

partycoder 3 days ago 2 replies      
Until now publishing content has had limitations.

Now anyone can publish as much content as they want, however irrelevant, and that is becoming a problem: proliferation of irrelevant content (irrelevant from the perspective of the reader).

So I think the next challenge is keeping content concise, relevant, and aligned with your interests. Twitter took a stab at that, but it's not there yet.

Having a machine to filter and produce summarized versions of whatever endless feed you are reading, as well as remembering seen entries (a bit like Snapchat) is the next frontier.

Another key issue is selective ignorance, biases, and the like. Only exploring stuff related to your interests can trap people in a state detached from reality.

Bahamut 3 days ago 1 reply      
There are a lot of people who like communicating through image memes & other short soundbites - I am not one of them.

I value text. I like reading deeper insights from people much more than cheap flyby memes or time consuming videos. If communication regresses like that, I'll probably withdraw from using those features. It's simply what I don't want in a social network.

tracker1 3 days ago 1 reply      
As FB has removed and reduced features, including messaging from their mobile web app, I've been using them ever less... I'm not sure they aren't alienating as many people as they're actively engaging, and in the effort to keep the new millennials, they're pushing everyone else away.
combatentropy 3 days ago 0 replies      
Yes, for almost everything online, I prefer text: programming tutorials, the news, discussions like this on Hacker News. Imagine if each reply here had to be an uploaded video of the member talking.

For some other things, I prefer a video: how to cook something, how to repair something, how to tie a tie, an interview with a person whom I admire. Even then it can depend on my mood, and if I'm in a hurry I am like, "Oh, just cut to the chase, or put it in a one-paragraph article."

ape4 3 days ago 1 reply      
Is it out of fashion to still want Facebook to die?
mark_l_watson 1 day ago 0 replies      
I don't agree with the article. As others have already said here, text is random access and makes it easy to process and retain information.

I am going to drift off topic: the future is AI that understands what we say, recognizes our facial expressions, and generally "gets us." While there are obvious potential downsides, the upsides involve getting notified of things in the order of most useful and entertaining first. AIs will use text, still photos, and videos to show us what we want to know and experience. This whole idea creeps me out a bit, but it is probably the future. At some point in time, effective computer-to-human neural I/O connectors will be invented, and the effect on civilization will be interesting. So, video -> direct neural implants.

welanes 3 days ago 0 replies      
"stats showed the written word becoming all but obsolete, replaced by moving images and speech."

Obsolete would be a stretch; however, for many, the web is all about consumption. Mobile devices cement this point. And videos are a very efficient way to consume information:

- Read the book VS watch the TED talk.

- Go to the recipe site VS watch this Tastemade video.

- Read the Foreign Affairs article on the complexities of the Syrian war VS Watch this cool graphic filled Vox video.

And even with the written word, text is becoming more terse:

- Read this article vs Read this set of tweets

But people will never stop writing so perhaps all Facebook is saying is that the written word will become less relevant to their business model as they slowly turn into something resembling Snapchat

x4m 2 days ago 0 replies      
I think an interesting topic is "will we create programs as text for a long time?" Projectional IDEs seem to have stalled. Some time ago I was thinking about a visual query language for databases (domain-specific) and composed this list of text advantages:

1. Version control systems (VCS) out of the box. Software developers have been using safe source code tracking for many decades. Since queries are the main tools for data analysts, they should be treated the same way, but developing all VCS features in a custom query editor is not economically viable.

2. Portability. Text queries can be written even on a whiteboard or in a notepad. One of the great advantages of text queries is that they are unambiguous: there are no hidden parts in a text query.

3. Detachability. Text queries can be run in different warehouses. They should not depend on identifiers, conditions, and environment variables.

4. Fragmentation. An analyst can extract a feature from a query and pass part of it to colleagues.

5. Embedding. Software developers can easily embed queries into autotests, and side subsystems can embed query parts into their code or resources.

6. Specification. DSQL can be part of a system's application programming interface (API) for third-party systems if the DSQL specification is precise enough.

andrewfromx 3 days ago 0 replies      
I always send emails in plain text mode. I hate the idea of sending an entire HTML doc for a simple text-only job.
aklemm 3 days ago 0 replies      
They're probably thinking of memes and emojis as well. It can't be argued that well-structured text is a strong suit among individuals communicating on Facebook.
PhasmaFelis 3 days ago 0 replies      
> Mendelsohn went further, suggesting that stats showed the written word becoming all but obsolete, replaced by moving images and speech. "The best way to tell stories in this world, where so much information is coming at us, actually is video," Mendelsohn said.

This is going to go down in history alongside the mythical "640K ought to be enough for anybody" and the guy who thought the internet was just a fad.

sverige 3 days ago 1 reply      
While video has the ability to communicate certain things more efficiently than text, it is far inferior in other ways.

For example, video is great at showing the physical relationship between car parts - what they look like, where they go - so if I'm trying to figure out how to get to that sixth spark plug on a 1990 Bronco II, I'm going to look for a video.

Video is also good at pacing the timing of emotional reactions. If it weren't, no one would watch movies.

If, on the other hand, I'm trying to learn or experience something complicated, like how to code in a new language, or an overview of the history of the idea of revolution, or any kind of theology, I'm going to read about it. I'm not going to watch some talking head give me far fewer words in the same amount of time, even if the video has some nice music and pans across a few pictures of cathedrals while the narrator speaks.

Words are the fundamental unit of thinking, and video is piss-poor at communicating words, especially when compared to writing.

chiefalchemist 2 days ago 0 replies      
FB is missing three significant differences.

1 - Consuming video is time consuming. There is no quick scan for key words, etc. If you're locked in on video X there's no scrolling on; your experience, in a way, stops.

2 - Video is passive. Reading takes engagement. Those will affect the brain very differently.

3 - Video is not conversational. It's traditional top-down broadcasting.

Sure FB might become more video but that doesn't mean it's going to result in the same attachment to the product, at least for adults.

Animats 3 days ago 0 replies      
It's not clear what Facebook is thinking here. Video is useful for showing what you did, but not useful for communicating what you want to do. Maybe this reflects that people don't plan trips on Facebook any more; they use Instagram for that. One could interpret this as Facebook abandoning communication in favor of being a collection of public scrapbooks.
fuzzfactor 2 days ago 0 replies      
As a consumer/dependent-based institution that benefits from increased control strives for growth and succeeds, it eventually reaches a point where the institution benefits most from less literate followers/subjects.
milesf 3 days ago 0 replies      
The folks at Facebook are not stupid. This is likely laying the groundwork to push more bandwidth through their Aquila project https://info.internet.org

More bandwidth means more money, which means convincing more of the public to invest more in Facebook.

rathish_g 3 days ago 1 reply      
I work in eLearning and we create a lot of visual content, as it's easier to convey ideas through video than through text.

The cost of video production is at least 50X or 100X more than text, and hence we take extra caution to make sure that it's precise and accurate. What is said in 1500 words is trimmed down to 250 words to create a video. It goes through multiple hands (the motion graphics team, anchors, etc.) before it hits the screen, and hence it's interesting to watch. Text, captions, etc. are just there for SEO. Bandwidth is the only constraint, especially in developing nations.

And then there are low-quality mugshot videos, which take less budget than what it takes to present the same content in text.

The quality ones will replace text forever.

foobarbecue 3 days ago 0 replies      
Interesting that the author chose "deathless" over "immortal."
marxidad 3 days ago 0 replies      
I think that the only thing that will make text obsolete is synthetic, panpsychic telepathy.
sametmax 3 days ago 0 replies      
Text is an efficient medium to express something complex. But the average Facebook user is not expressing something complex. They share quick reactions, emotions, stuff that triggers anger or fun, not things that require deep analysis.

For that, video is perfect.

So in the context of Facebook, it makes sense. In the context of ads, it makes sense.

Yes, we don't want to believe it because it sounds like hell to people reading HN religiously, where text is king, where people debate, where people don't use smileys, and where ideas get evaluated.

But that's not what Facebook is for. And that's not what the people on Facebook want.

Illniyar 3 days ago 1 reply      
Wasn't there a report from Facebook saying 80% of videos are watched without sound?

Doesn't it mean that most of these videos are watched with subtitles?

How is that taken into account with the "death" of text?

gedrap 3 days ago 0 replies      
I don't think Facebook is wrong.

It just so happens that video works better on Facebook (and probably other social networks), where the stream of information is huge. People, most of them anyway, suck at describing things. Most of them can snap a picture of something nice, exciting, cute, whatever. Not everyone is a writer, nor should one have to be to share some everyday thing.

Text has its uses. It's just that other media usually tend to work better on Facebook.

And that's fine.

dandare 3 days ago 0 replies      
Video is good for showing people falling (or naked). When I look for information (news, tutorials, knowledge) I avoid video like the plague. Three words: Low Information Density.
spazzpp2 3 days ago 0 replies      
First, FB assimilated Wikipedia (just text, few images). FB also assimilated IMDB. Then they discovered that most links aren't to Wikipedia anymore but to YouTube. So they assimilate a video service so that they, and not YouTube again, will be quoted when the next hyped video comes up. Aaand Facebook will continue to assimilate. Google's AI features should be their next aim.

Facebook is Borg. Faceborg.

sbmassey 3 days ago 0 replies      
To play devil's advocate, though: what if automatic transcription of video actually worked reliably? You might then get many of the advantages of text in video form: the ability to search, copy-paste, and fast-forward and rewind while still seeing what is going on. You would still see the presenter's ugly mouth flapping up and down, however, which might be thought a negative.
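The search-within-video idea the comment describes reduces to searching a timestamped transcript for seek points. A minimal sketch (the transcript data, captions, and timestamps are made up for illustration; a real system would get them from a speech-to-text pass):

```python
# A timestamped transcript: (seconds_from_start, caption) pairs,
# as a speech-to-text pass over a video might produce.
transcript = [
    (0.0,  "welcome to the talk"),
    (12.5, "first we cover the history of the protocol"),
    (47.0, "now the interesting part: the handshake"),
    (93.2, "questions from the audience"),
]

def seek_points(query: str, transcript) -> list:
    """Return the timestamps whose caption mentions the query,
    i.e. the positions a player could jump to."""
    q = query.lower()
    return [t for t, caption in transcript if q in caption.lower()]

print(seek_points("handshake", transcript))  # [47.0]
```

This is the whole trick: once text carries the timing, the video inherits text's searchability.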
Clubber 3 days ago 0 replies      
I just read most of the comments, but I didn't read the article. I was much more interested in the comments.

Having said that, if these comments were an audiobook, or a word for word movie script, I'd probably only be through about 5 of them.

Words rule!

On the other hand, if a picture is worth 1,000 words, a movie must be worth billions! Unfortunately it takes a lot of study of a picture to get 1,000 words out of it. That's a lot of pausing.

pixelmonkey 2 days ago 0 replies      
It's interesting to consider how painful a "video-only Wikipedia" would be to use. If the point is to distract, then video is certainly at an advantage. If the point is to communicate or inform, then certainly text is the winner.

I guess that explains FB's hope for a rise of video content.

petra 3 days ago 0 replies      
Just today we had a "Children's Illustrated Guide to Kubernetes" receiving lots of upvotes (243), and some comments saying it's a much better way to learn stuff.

So I'd say text isn't the ideal; it's just a matter of economics that we are surrounded by text all the time, and maybe Facebook can shift that.

On a side note, I'm curious why there aren't aggregators for well-illustrated content?

enobrev 3 days ago 0 replies      
I was the first employee and lead engineer at a startup that was primarily focused on video, which eventually got acquired by a very well known video sharing platform. If there's one thing I learned in those 5 years, it's that I don't care much about video. I had every opportunity to find a way to care. I enjoyed creating the technologies that we created. I enjoyed managing the systems we built and meshed. I enjoyed inventing ways to store and retrieve terabytes of short videos for thousands of people. I even enjoyed building a prototype video renderer in Javascript (which was eventually ported to C). But as far as consumption is concerned, I couldn't be bothered with the vast majority of it.

The applications that we built were an attempt to resolve some of this. We were building editing / curation tools to help improve video, and some of the people using our software created incredible things, that I absolutely loved.

I especially dislike the trend in the past (5?) years toward programming tutorials in video. Give me a written tutorial. Give me a blog post. Give me a well-commented repo, or at least a poorly commented one with a decent README file. When teaching anything involving a textual artform / profession, give me Anything but a visual / audial medium. I write and read code for a living, and otherwise I write and read about problems solved with code all day. When it comes to programming, video is a square peg in a knife fight.

That said, I have an enormous collection of TV shows and movies, and I have a great deal of appreciation for the medium in general. One of my close friends is an editor in Hollywood and I have such an absolute respect for him and the creative sides of his industry. I love to see his work and hear about the intricacies therein. When edited and curated, video is splendid. I can dive in and forget the world exists and appreciate nuance and symmetry of visuals, sounds, and ideas. And despite my feelings about programming videos, I know it's possible to learn from video, depending on the subject.

I don't want to watch videos of everyone I know doing everyday things. I don't want to watch unedited, uncurated crap in real time. Every time a "Live" video link shows up in my FB notifications, I skip it. I don't want to see you live. If I did, I'd video chat with you, or I'd call you up to hang out. But if you want me to take some time out to watch your broadcast, then I expect some planning, editing, and an overall respect for my time. This is video, after all. It's all-consuming. I have to watch AND listen in real time, which means I'm not doing anything else. Asking that much of me requires respect.

Of course another benefit of text is that it's so easy to edit. I don't know if you've actually enjoyed reading this post, but I assure you the first version was much, much worse.

And if you just skimmed it and ended up here, at this sentence, well there it is: my favorite part about text over video.

fuzzywalrus 3 days ago 1 reply      
I'm guessing this is a pivot that is actually moving away from everyday user-generated content, as Facebook has already become increasingly a news and image-macro aggregator/sharer. Why not video, I suppose? I imagine forcing video advertisements pays significantly better.
arkj 3 days ago 0 replies      
Temporal disillusions caused by a feeling of being in control. A desire to rush to novel/radical claims with a cloak of pretentious knowledge. Sometimes it's such a strain to wade through. And yes, go ahead, downvote.
jacquesm 3 days ago 1 reply      
This suggests that video lectures such as offered by coursera could be improved on by providing a transcript for every video lecture. Has this been done by any of the other online teaching institutions?
sklogic 3 days ago 1 reply      
> "It conveys so much more information in a much quicker period."

What a pathetic load of hogwash. Video is the slowest possible way of conveying any meaningful information.

hacksonx 3 days ago 1 reply      
Am I the only one who stops reading an informative English article when the word "because" is used to begin a sentence? Maybe text does deserve to die.
salmonet 3 days ago 0 replies      
Videos can be made to be extremely entertaining and take very little effort to consume. It makes sense that video will be increasingly important for Facebook.
bikamonki 3 days ago 0 replies      
I am the kind that reads thrice before hitting send. It would be too slow/too weird to do that with video or voice notes. Text is deathless.
threepipeproblm 3 days ago 0 replies      
I would say the information density of text is much higher than that of video, at least for my purposes.
erikb 3 days ago 0 replies      
Maybe a reason for declining text on FB is that people who like to talk to other people don't use FB anymore.
aout 3 days ago 3 replies      
"Plenty of people can deal with text better than they can spoken language." That's a joke, right?
potlee 3 days ago 0 replies      
Maybe you should have made a video of this?
aaron695 3 days ago 0 replies      
This is why it makes me laugh when idiots think the Minority Report interface could ever work.

Text will rule for a long while yet.

mxuribe 3 days ago 0 replies      
So would some sort of animated ascii art be considered video or text?
nelmaven 3 days ago 0 replies      
You don't need sound for text.
known 3 days ago 0 replies      
Irrational exuberance by FB
Valve funding VR projects, exclusivity-free, with pre-paid Steam revenue reddit.com
297 points by Doolwind  3 days ago   110 comments top 8
corysama 3 days ago 6 replies      
The actual source email is shorter than any of the articles in the citation chain.


If you need context:

Oculus has been offering funds to VR devs in exchange for limited time exclusivity to the Oculus Store. This wouldn't be such a stink if the news didn't immediately follow Oculus Store DRM adding checks for the Oculus headset hardware. So, a game bought through Steam can play on either a Vive or a Rift depending on support put in by the dev. But, a game bought on the Oculus Store is blocked from running on a Vive.

IMHO, the conversation about this topic is very muddied between the four issues of funding, store exclusivity, DRM, and hardware blocking. But, AFAICT, the PC gaming openness advocates are clearly OK with Steam DRM and mostly fine with funded, temporary store exclusivity. However, the hardware block is a serious issue. (At least in its intent. In practice, it was immediately worked around.)

Kuiper 3 days ago 3 replies      
Gabe Newell's phrasing makes it sound like Valve is basically offering a cash advance to some developers (similar to trade publishing, where authors get an advance check which they must then "earn out" before they can receive future royalty checks).

It's basically an interest-free loan which doesn't need to be paid back if the project is a failure (i.e. fails to "earn out" its advance). Quite generous, assuming Valve isn't taking any more than their usual cut of the Steam revenue as part of the arrangements.
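The "earn out" mechanics read like standard advance accounting. A sketch of how the math works (the 30% store cut and the dollar figures are illustrative assumptions, not Valve's actual terms):

```python
def earn_out(advance: float, steam_revenue: float, store_cut: float = 0.30):
    """Royalties are the developer's share after the store's cut.
    The advance is recouped from royalties before any further
    checks are cut; if royalties never cover it, nothing is owed back."""
    royalties = steam_revenue * (1 - store_cut)
    recouped = min(royalties, advance)          # applied against the advance
    owed_to_dev = max(0.0, royalties - advance)  # paid out once earned out
    return recouped, owed_to_dev

# A $100k advance against $200k of Steam sales at an assumed 30% cut:
recouped, owed = earn_out(100_000.0, 200_000.0)
assert recouped == 100_000.0  # advance fully earned out
assert owed == 40_000.0       # $140k royalties minus the $100k advance
```

The "interest-free loan that forgives itself on failure" framing falls out directly: `earn_out(100_000.0, 50_000.0)` recoups only $35k and the developer owes nothing further.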

curiousgal 3 days ago 4 replies      
So basically, Facebook is buying out devs? Surprising practices...

I remember when I had to decide on a Samsung monitor over an Asus monitor because Samsung had the exclusives I wanted.

Wait...that didn't happen because that would be insane.

agar 3 days ago 4 replies      
For those new to VR drama, be aware that VR has become infested with the kind of dogma, fandom, agenda setting, and selection bias that once was limited to game consoles. At this point, Oculus could give their headsets away for free and certain people would complain that Facebook is using unfair business practices to create a monopoly.

Most of the (fairly reasonable) statements Oculus has made about its practices are ignored, dismissed, or called outright lies. Actual developers have tried to explain that Oculus's behavior isn't evil, designed to split the VR community, etc., but they are called liars, accused of doing damage control, or are somehow paid off by Oculus.

The reality is that Oculus has 100% funded the development of certain games, and contributed engineering talent and best practices based on their VR research. These include Chronos, Edge of Nowhere, The Climb, and others - titles at major developers that would not have otherwise existed. In return, those games must be sold through the Oculus store. However, the studio (i.e., Insomniac, Crytek, etc.) maintains ownership of the IP. Any future game built by those studios (including sequels) can be sold anywhere, using the VR expertise they otherwise wouldn't have.

Oculus also offers development grants to independent developers. An indie that is starved for cash and may otherwise need to release a game early to recoup their investment now has the option to spend extra time on the game. In return, the game must be released first on the Oculus Store; afterwards, it can be released on Steam or anywhere else.

Many call this "buying exclusivity" - but thus far, every developer that has taken advantage of this has admitted that they were farther away from release than appeared to the public, were running low on cash, and/or needed the assistance that Oculus could provide.

Also, most people don't seem to realize that Valve's offer of pre-paid royalties is just another kind of store exclusivity. Developers won't host their app outside of Steam until the advance is paid off, as those non-Steam royalties don't count towards the pre-pay.

Accepting the pre-pay also forces them to use the OpenVR SDK, which means they cannot host the game on the Oculus store.

Valve isn't altruistic. They merely have a more sophisticated strategy and a technology (the OpenVR SDK that can wrap the Oculus SDK) that allows them to position themselves as taking the high road while they focus on boxing out a competing software platform.

JustUhThought 3 days ago 2 replies      
How can one change the game anymore without selling out to the VCs or Google or Facebook? It's sad really. To be expected, in a maturing industry, but very sad.
pandaman 3 days ago 1 reply      
It's funny that people in the comments mention they did not buy games exclusive to a monitor manufacturer. It's true, but it also reminded me of the 3D movie wars of 2010. Major CE manufacturers all got exclusive rights to parts of a small 3D movie library and made those titles available only with a HW purchase. E.g. Panasonic had Avatar, so to watch Avatar in 3D you had to buy a Panasonic TV; Samsung had How To Train Your Dragon; Sony had a bunch of Sony titles; and I don't remember if LG had anything. We all know how this turned out for home 3D.
kendallpark 3 days ago 3 replies      
Two thoughts on this:

1. I'm still not sold that VR is the future of gaming. I do agree that exclusivity will hurt more than it can help, similar to the way Tesla opened their patents because the adoption of electric cars by the populace is better for their business than protecting their IP.

2. It makes sense that Valve is investing so much in this because it's just another excuse to not make video games anymore. RIP HL3. ;)

EDIT: oh, the down votes. Heaven forbid there be any sarcasm on HN.

dang 3 days ago 0 replies      
Url changed from http://www.vg247.com/2016/06/17/valve-offers-vr-developers-f..., which points to http://www.pcgamesn.com/valve-vr-funding, which points to this. The latter article seems to have the clearest title so we took that.
If ICANN only charges 18¢ per domain name, why am I paying $10? (2014) stackexchange.com
292 points by rms_returns  3 days ago   112 comments top 19
thedevil 2 days ago 17 replies      
An idea I've been tossing around: What if domains cost $100 when more than one person wants them? The idea here is to squeeze the squatters holding thousands of domains.

Whenever I want a domain, most of the ones I want are taken. Not by people who are making use of it, but by people who are squatting in hopes of extorting anyone who would actually use the domain to produce value.

I'd actually be happy if Verisign or any other private company or any government extracted such an unfair price for useful domains because it would free up so many more useful domains for use.

Edit: I'd love to hear good arguments against this; I'm partly throwing this out there to see others' thoughts.
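The rule the comment proposes is simple enough to state as a pricing function. A sketch (the $10 base price and the way "interest" is counted are assumptions for illustration; the comment leaves both open):

```python
def yearly_price(interested_parties: int,
                 base: float = 10.0,
                 contested: float = 100.0) -> float:
    """Price one domain-year: the base rate when only one party wants
    the name, a flat contested rate once more than one party does."""
    return contested if interested_parties > 1 else base

# Uncontested names stay cheap; contested ones cost 10x.
assert yearly_price(1) == 10.0
assert yearly_price(3) == 100.0

# The squeeze: a squatter holding 10,000 contested names would owe
# $1,000,000/yr instead of $100,000/yr under this rule.
assert 10_000 * yearly_price(2) == 1_000_000.0
```

The open design question, of course, is how "more than one person wants it" gets measured without itself being gamed.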

stymaar 2 days ago 0 replies      
Laurent Chemla, the founder of gandi.net, a French registrar, wrote a book [1] on this topic, describing himself as a thief for selling stuff that has no cost in the first place (domain names).

[1] confession d'un voleur (confession of a thief) : http://www.confessions-voleur.net/confessions/ (in French)

ajosh 3 days ago 5 replies      
I don't claim to know what the fair price for the .com registry is. I will say this, though: the hosting isn't free. There are millions of requests per second across all networks from all across the world. They must respond with low latency. The major registries are nearly always under DDoS attack. There is a reason Verisign has some of the best DDoS protection.

All the data centers across the world, the network connectivity, the DDoS hardware, servers, custom software and staff to run it all must be expensive.

kukx 3 days ago 3 replies      
Ignoring the fact that the revenue goes to a private company, there's one good reason why 18¢ would be a bad idea: already too many domains are hoarded by a small group just for resale; I imagine it would be even worse then.
RileyKyeden 3 days ago 0 replies      
This page is stuck in a redirect loop for me. This one works: http://webmasters.stackexchange.com/questions/61467/if-icann...
jasode 3 days ago 1 reply      
Previous thread where I explain that the stackexchange accepted answer (now with 78+ votes instead of 41) outlines more of the history rather than answering the question in a technical way for a technical audience:


(small correction: my previous answer had a typo of "17 million" instead of "11.7 million")

shitgoose 3 days ago 0 replies      
this is a classic example of how a monopoly position is monetised. nowadays monopolies are established and maintained by the government. there is a lot of money to be made being a monopoly, but government cannot show this on its books (joe the plumber may not like that). so they create a commercial proxy entity that accumulates most of the profits. the part left in the shadow is how money makes it back to the individuals in the government who arranged this deal, but judging by how vigorously government protects the proxy, there is little doubt that this is happening. for starters i would run a cross check between names of Verisign subcontractors and names of family and friends of top ICANN officials.
james_pm 3 days ago 0 replies      
There are a few different places where the money goes in a domain registration transaction.

ICANN gets $0.18 per gTLD registration.

The registry (Verisign, Neustar, Donuts, Radix, etc.) sets a wholesale price that the registrar pays the registry for each domain year. That can be anything from a few bucks to hundreds or thousands of dollars (see .cars, for example).

The registrar adds a markup to what they charge the registrant on top of that wholesale fee.

In general, ICANN gets very little, the registry gets the most and the registrar somewhere in between. The registrar supplements this small yearly amount with add-on services like WHOIS privacy, or by selling email or hosting alongside the domain.

The registrar also pays a yearly fee to be accredited by each registry, which also adds to the cost to consumers.
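The money flow described above is simple arithmetic. A sketch of the split for a typical .com (the $7.85 wholesale figure is the Verisign price cited elsewhere in this thread; the $10 retail price is an assumption):

```python
def retail_breakdown(retail: float, wholesale: float, icann_fee: float = 0.18) -> dict:
    """Split a retail domain price into the three shares the comment
    describes: ICANN's per-registration fee, the registry's wholesale
    price, and whatever margin the registrar keeps on top."""
    registrar_margin = round(retail - wholesale - icann_fee, 2)
    return {"icann": icann_fee, "registry": wholesale, "registrar": registrar_margin}

# A $10 .com against the $7.85 wholesale price leaves the registrar
# under $2 per year, before its own accreditation and support costs:
print(retail_breakdown(10.00, 7.85))
# {'icann': 0.18, 'registry': 7.85, 'registrar': 1.97}
```

Which is why, as the comment notes, registrars lean on add-on services like WHOIS privacy, email, and hosting rather than the domain itself.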

wl 2 days ago 0 replies      
After Verisign's SiteFinder stunt, I was shocked to hear their contract was renewed. This is the first I've read about Verisign suing ICANN over the matter and getting a settlement giving them the contract renewal.
tszming 2 days ago 0 replies      
Sadly, if Verisign is not earning $7.85 per domain, I guess every possible combination of meaningful dot com domains is already exhausted.
robalfonso 2 days ago 0 replies      
I operate a registrar. I'll outline a couple of pieces of the puzzle here and the justification for the price. (I couldn't read the Stack Exchange article; redirect issue.)

First, it's recognized these are digital goods where the incremental price very nearly approaches 0. Technical aspects of domain registration have very little to do with the price.

That said, ICANN has various requirements for registrars and registries that do require more than just keeping an entry in a database. For registries, you have to not only have a hot-failover data center location for your registry but you are contractually obligated to test it (I believe every 6 mos., but it's not my part of the industry). For registrars, you have to escrow your entire domain settings both incrementally and fully (daily and weekly) for all domains; this way, if you disappeared as a business tomorrow, they could recover everything. This is audited regularly and swift action is taken against those who are delinquent.

Those are the technical issues, the rest of the cost is highly related to the administrative burden of domain management.

Some of these are:

- DMCA takedown requests
- Generic legal requests
- UDRP claims
- Governmental abuse claims
- NGO abuse claims (i.e. other hosting firms etc.)
- WHOIS verification claims
We get these kinds of requests on a very regular basis (i.e. hourly/daily); it takes a team of people to manage them and it's a never-ending torrent. Some of them can be disposed of quickly; others take a lot of time.

There is also a huge amount of what I'd term "misguided requests". For example:

A company that made an ICANN complaint because, after a UDRP ruling said they had won a domain and access to it was provided, they failed to renew it, let it expire, and another company got the domain. This happened and turned out as you would think. It still took one person a week of investigation and back-and-forth to dispose of.

A complaint that a domain registrant did not receive expiration notices when in fact there is no history that they'd ever been a registrant of the domain.

People who file false WHOIS complaints with ICANN because they don't like the domain's owner. WHOIS complaints MUST be verified or the domain is taken offline, so these complaints create a huge burden.

All of these requests have rules about procedure, deadlines that must be complied with, and a form of investigation that must be adhered to.

While many domains are nice and quiet and don't need much attention, the legal, abuse, and governmental drivers that run the modern internet create a ton of overhead for registrars. That overhead is manifested in your fees.

This is only about normal fees. Premium domains are strictly market-driven, and the "scarcity" of a domain/TLD is highly subjective. Costs there are less about overhead and more about what the market bears.

Hope this has been insightful for anyone who read this far!

betaby 2 days ago 1 reply      
A distributed DNS-like service would solve most of the problems. Something similar to a DHT, but for DNS.
bikamonki 3 days ago 0 replies      
The retail price of goods and services is determined by supply and demand, not cost plus markup. Registrars will sell you both $0.99 domains that nobody wants and premium domains at thousands of dollars. Supply just increased thanks to new TLDs, yet demand seems to be moving slowly to these new options. When more of the new TLDs are registered and adopted by the market, overall prices will go down. Eventually, if a blockchain-driven domain registry takes hold, ICANN domains may become free or even disappear.
superasn 3 days ago 4 replies      
I think if Google and Firefox joined forces and created a DNS override in their browsers that resolves domains without querying the root servers, it could be a real game changer. I remember this used to be an option in very old versions of IE, where some company did it using a plugin. This could put an end to all such monopolistic tactics and domain squatting.
ck2 2 days ago 0 replies      
It still blows my mind they let Verisign retain .com despite all the obvious corruption.
dopkew 2 days ago 0 replies      
Why do registrars charge more per year if I try to register a domain for more than one year?
curiousgal 3 days ago 4 replies      
Why do we even need domain names? I mean unless it's a brand, we rarely type in a domain name, it's usually bookmarked or linked.
gscott 2 days ago 0 replies      
Domains used to cost $100 a year. I don't see any reason to complain (although before they cost $100 they were free).
curioussavage 2 days ago 0 replies      
In short the domain name system is a racket.
A Third of Valve Is Now Working on VR uploadvr.com
255 points by jn1234  12 hours ago   126 comments top 20
partiallypro 1 hour ago 5 replies      
In all of my experience with the new VR products, I am firmly of the position that VR is not ready for public consumption yet and won't be for quite some time. The only real reason it's being pushed so hard is that people have a fear of missing out. Meanwhile, if I were a company I'd be focusing on AR, because at least there you can push for enterprise customers, who won't need the full immersion that a general consumer will clamor for. (I also think AR has a much brighter future.)

I expect a lot of VR units are shown off to friends and then thrown into the closet or put on a shelf to collect dust. It's something you show off, but not something you (or at least 98% of people) will actually use.

jn1234 12 hours ago 6 replies      
By far the most interesting comment from the thread is this one (it's by Alan Yates, who works on Vive/SteamVR): https://www.reddit.com/r/Vive/comments/4osav8/lighthouse_tra....

>Of course. We want AR/VR/MR to be ubiquitous. Over the past four years or so I've seen many companies big and small bring their demos to show and tell. They all have bits and pieces of the larger puzzle. Good eye tracking, interesting haptic techniques, next generation display technologies. But most of them are narrowly focused on their thing, and struggle alone to make a successful product. Partially this was just because the market didn't exist, but also many of them were/are just trying to boil the ocean. The minimum viable product is now a pretty high bar and that can stifle innovation. We can offer a running start, the traditionally "hard" parts of HMD technology, the things other than GPUs that kept VR niche for so long. In return we ask that your device leveraging our technology works with our platform. And mostly that is it. We won't ask that it only works on our platform, we won't stop you from targeting other industries. This gives both you and your users freedom of choice and security that isn't dependent on either party's future decisions. It is a pretty good deal really. Our platform has a rapidly growing collection of great content for your end-users, so your product won't be an orphan and you don't need to convince anyone to author for it. Day one, people can fire up Tilt Brush and have their minds blown by your awesome new hardware.

If Valve games are "locked" to SteamVR and won't play on Oculus, then nobody is going to buy an Oculus. Does Facebook really think that people are going to choose Lucky's Tale over Portal 3 or Half-Life 3? Facebook is going to have to capitulate and focus on their hardware advantage.

baldfat 2 hours ago 4 replies      
Half-Life 3 on VR would be the killer app. It would be this generation's Lotus 1-2-3.

Though I personally feel VR for video games is a lot like the Wii: it's awesome for a while, then it's collecting dust throughout the world. I really think the future is augmented reality, and VR will be mostly for media consumption.

danso 9 hours ago 3 replies      
The Valve publications page has a few slides and documents relating to Valve presentations regarding VR:


One of my favorite tidbits comes in the presentation, "Lessons learned porting Team Fortress 2 to Virtual Reality", on preventing VR motion sickness: http://media.steampowered.com/apps/valve/2013/Team_Fortress_...

> Don't change the user's horizon line, ever. You can see here how the camera follows the motion and rotation of the character's head and so it rolls. Your actual head isn't going to roll when you get killed by an Eyelander, so the mismatch will make you sick.

Here's the presentation's video, bookmarked at the aforementioned insight:


For those of you non-TF2 players, the "Eyelander" is the name of a player-wieldable sword, and when it connects, the victim's head flies off and rolls around the ground. Apparently simulating that effect (changing the user's "horizon line") will make people very sick.

vocatus_gate 2 hours ago 0 replies      
I wish they would get back to what they used to do: make actual computer games. I still can't shake the nagging suspicion that the recent VR hype is just the product of the every-decade-or-so fad cycle (similar to what happened with it in the '80s).
contingencies 49 minutes ago 0 replies      
I've just spent a week here in Shenzhen, China. What was the most impressive thing? There are literally entire floors in the electronics markets here filled with VR headsets. Even if it's early stage hardware, someone has to be buying them.

As for killer apps, like every other technology it's a fair bet that commercial success #1 will be porn.

From a more cognitive standpoint, I've long felt that what segregates spatial awareness from other senses is the sheer volume of data that can be presented, reasoned with and remembered. As old school hunter-gatherer-wanderer primates, it's our highest bandwidth input. This reality will eventually be utilized for problem solving (eg. VR excel spreadsheet visualizations and black box / static code analysis may become a non-gimmick norm).

kendallpark 10 hours ago 2 replies      
Of all the gaming companies, Valve makes the most sense to be this heavily invested in VR, given their stake in the PC industry.

Or maybe too many devs thought VR was the coolest project to work on and moved to it. (Valve is known for its flat structure: http://www.valvesoftware.com/company/Valve_Handbook_LowRes.p...)

joezydeco 11 hours ago 3 replies      
Jeri Ellsworth of CastAR (ex-Valve AR/VR hardware guru) did an interview recently* where she mentioned that Valve was kind of "painting themselves in a corner" by gearing their system's performance to play AAA-title games.

Now owners of those consoles will expect every game to be an AAA title.

* http://embedded.fm/episodes/156 @ 53:20

blastofpast 1 hour ago 2 replies      
I wish Valve was publicly traded so that I can invest.
intrasight 1 hour ago 0 replies      
After reading this thread, I have two comments:

1. Exclusivity on a headset? Replace "headset" with "monitor" and you get how childish, stupid, and impractical that will be.

2. I don't think monitor based games will transition well to headsets.

ksec 7 hours ago 2 replies      
Side observation.

So Valve now has a better VR set than Oculus, and all of a sudden all the news on VR seems to be flowing in Valve's direction.

This reminds me a lot of the early days, when we moved from id's Doom to Valve's Half-Life.

Note: (John Carmack = CTO of Oculus and founder of id)

alimbada 5 hours ago 0 replies      
I've come to realise that Valve is no longer a game developer.
JanneVee 6 hours ago 0 replies      
Why wouldn't they? The computer industry is stagnating because people can use phones and tablets for most of their computing needs. VR has the potential to drive sales of high-end hardware again. And Valve has skin in that game.
josefdlange 1 hour ago 0 replies      
Half-Life 3 Confirmed
diziet 10 hours ago 2 replies      
VR is amazing and the opportunity is incredible. However, I am worried that the technology to produce high enough resolution displays will take some time to get here. Without smartphones driving the demand, will we get to 5-10k dpi displays for VR tech? We have 800 or so dpi displays rolling out, with current generation devices filling their FOV with about 500 dpi screens. We're driving 1.3-1.8M pixels per eye, but that is not enough if you want to pretend to gaze at something 20 meters away. The pixel density, especially in the center area of the display, should be much higher. Otherwise only abstract low-polygon-count content will work.
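The shortfall the comment describes can be checked with rough arithmetic. The figures below are assumptions, not from the comment: 1080x1200 pixels per eye and a ~110 degree horizontal FOV, roughly what 2016-era headsets shipped with.

```python
# Rough angular-resolution estimate for a 2016-era VR headset.
# Assumed figures: 1080x1200 pixels per eye, ~110 degree horizontal FOV.
pixels_per_eye = 1080 * 1200        # ~1.3M, the low end of the 1.3-1.8M quoted
horizontal_fov_deg = 110
pixels_per_degree = 1080 / horizontal_fov_deg

# Human foveal acuity is commonly cited as ~60 pixels per degree
# ("retina" density), so these headsets fall well short of it.
shortfall = 60 / pixels_per_degree

print(pixels_per_eye)                # 1296000
print(round(pixels_per_degree, 1))   # 9.8
print(round(shortfall, 1))           # 6.1
```

Roughly 6x short per linear degree, i.e. ~36x short in total pixel count, which is why only low-detail content holds up at a distance.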
giskarda 7 hours ago 1 reply      
And in the meantime no Half Life 3. What about priorities?
techdragon 10 hours ago 2 replies      
So one third of Valve is working on VR... absolutely definitely not working on Half Life 3 or Portal 3. Which pushes their possible/probable/hypothetical release dates even further into the future. :-(
totoz 10 hours ago 0 replies      
ProAm 11 hours ago 0 replies      
erikb 6 hours ago 0 replies      
Half Life 3 will have VR confirmed!
Ethereum is Doomed nakamotoinstitute.org
276 points by kushti  17 hours ago   199 comments top 27
DennisP 15 hours ago 4 replies      
I have actually tested this attack technique (using my own contracts on a local test chain), and I've been in discussions with some Solidity developers and the guy who first published the attack. The situation is not as bad as this article claims.

For starters, you can use address.send(x) instead of address.call.value(x)(). All computations on Ethereum have to be funded with "gas" (transaction fees), and send() only forwards a small amount of gas. The recipient can write to the log but that's about it. TheDAO used .value() which forwards all the available gas.

If you do use .value(), then you can do it safely by doing it only once per method, as the very last step, and only using it in top-level methods. Then any sort of reentrant callback will find any required state changes already done. E.g. subtract from the user's balance, send the funds, and if the send fails then throw. The recipient can call back but the balance is already decremented. Using a mutex is another option. I've tested all this and it works.
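The ordering described above (change state before the external call, so a reentrant callback sees the updated balance) can be modeled outside the EVM. This is a hypothetical Python simulation, not Solidity: the `recipient` callback stands in for a contract's fallback function, and the attack re-enters once, the way TheDAO drain worked.

```python
class Vault:
    """Toy model of a contract holding user balances."""
    def __init__(self):
        self.balances = {}

    def withdraw_unsafe(self, user, recipient):
        # BUG: the external call happens before the state change,
        # so a reentrant callback still sees the old balance.
        amount = self.balances[user]
        if amount > 0:
            recipient(amount)            # attacker re-enters here
            self.balances[user] = 0

    def withdraw_safe(self, user, recipient):
        # Checks-effects-interactions: zero the balance first,
        # then interact with the outside world.
        amount = self.balances[user]
        self.balances[user] = 0
        if amount > 0:
            recipient(amount)

def drain(vault, method):
    """Reentrant 'recipient' that calls back into the vault once."""
    stolen = []
    depth = [0]
    def recipient(amount):
        stolen.append(amount)
        if depth[0] < 1:                 # recurse once, like TheDAO attack
            depth[0] += 1
            method(vault, "mallory", recipient)
    vault.balances["mallory"] = 100
    method(vault, "mallory", recipient)
    return sum(stolen)

print(drain(Vault(), Vault.withdraw_unsafe))  # 200 -- paid out twice
print(drain(Vault(), Vault.withdraw_safe))    # 100 -- reentry sees balance 0
```

The same structure explains why a mutex also works: either way, the second entry finds the contract already in its post-withdrawal state.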

There are a lot of reasons Ethereum is designed this way. The most common reason for a contract to run code when receiving funds is to prevent users from accidentally losing their money. E.g. in any contract that holds ether and maintains a ledger with balances for multiple users, if anyone sends ether to the contract address, they'll lose it, unless the contract either throws an exception or automatically adds to the user's balance on its ledger.

User addresses and contract addresses are interchangeable, because it lets you do things like use a multisig timelocked vault to receive funds and anything can send to it just as if it were a regular user account.

All that said, the devs are working on improvements too.

SlipperySlope 16 hours ago 3 replies      
The programming classic "The Mythical Man-Month" had a chapter on the "second-system effect". It's where developers put lots of complicated features into the second system in a series. It's a pattern for failure.

In this case, Ethereum developers built a much more complicated cryptocurrency by making it Turing complete - to the extent of writing their own new computer language and virtual machine to run it.

They skipped over the halfway step of declaring parameters in the payment transactions that open-source enterprise library code (e.g. Java) could execute for the required behavior. No. They made the new code itself a user-contributed (and immutable!) part of the system.

munificent 14 hours ago 2 replies      
> Solidity is like programming in JavaScript, except with your bank account accessible through the Document Object Model.

Such a beautiful, terrifying sentence.

rwallace 7 hours ago 1 reply      
The problem with Turing completeness is that if it's not there from the start, people inevitably end up needing features that aren't present in the restricted language, and then a Turing complete language is added to either the original system or a successor, and being an afterthought it usually ends up being a hack job.

Examples: SQL was supplemented with procedural languages for stored procedures, Excel was supplemented with VBA, HTML was supplemented with JavaScript and, well, Bitcoin was supplemented with Ethereum.

So providing a Turing complete language was the right decision as far as it goes, but it's not enough by itself. If you need reliability - which this does - it needs to be the case that typical code does not involve arbitrarily complex computation. Typical code needs to fit stereotyped patterns whose behavior is predictable.

If Ethereum is to continue, it's clear that exhorting people to try even harder not to make mistakes in the current version of Solidity won't cut it. The language needs to be supplemented or replaced with something that makes it easy to write correct code for the kind of things people typically want to do.

erikpukinskis 11 hours ago 1 reply      
People (rightly) make the same critique of Bitcoin: it is very easy for regular people to lose their money. To keep it safe you need to be a security expert, and even then there are no guarantees, and mistakes are forever.

My response is the same for both Bitcoin and Ethereum:

Even if these technologies are only used by trained professionals, they can still be revolutionary. Almost no one can safely operate a printing press, and yet that invention has toppled empires.

At worst, these are back-office technologies. At best, in a decade or two, someone will figure out how to design a good enough GUI that a novice can safely explore a powerful subset of the technology, trusting that bumpers are in place around the sharp edges.

mmanfrin 15 hours ago 3 replies      
Having a little bit of trouble parsing how the attack worked -- so when a transfer happens, the destination account gets to run a command; they then ran a command that called the source's send-ether command, which ran because it was still in 'we owe destination account money' mode because the full initial contract had not concluded?

Then at some point it terminates itself before emptying the source account (and thereby causing a rollback that would cascade back?)

3pt14159 15 hours ago 8 replies      
Ethereum isn't doomed. I don't think it will ever have as much backing as Bitcoin because of institutional buy-in; but the world needs programmable smart contracts. Some rich soon-to-be-dead billionaire tech nerds want to be able to do something every year without worrying about a judge taking that away from them. Individual will is what cryptocurrencies are about. Bitcoin in terms of international payments and store-of-value; and Ethereum in terms of contracts and computing.
aakilfernandes 15 hours ago 1 reply      
Here's a better explanation of the attack: http://vessenes.com/more-ethereum-attacks-race-to-empty-is-t...

Ethereum is not doomed. But devs do need to be educated of this attack.

dkarapetyan 15 hours ago 5 replies      
I didn't know the language was Turing complete. Isn't this computing 101? If you want a secure thing, then you must be able to reason about it statically. Making things Turing complete means no non-trivial property of program correctness can have a generic solution, so you've just opened yourself up to a world of hurt.
bithive123 13 hours ago 0 replies      
"I thought again about my early plan of a new language or writing-system of reason, which could serve as a communication tool for all different nations... If we had such an universal tool, we could discuss the problems of the metaphysical or the questions of ethics in the same way as the problems and questions of mathematics or geometry. That was my aim: Every misunderstanding should be nothing more than a miscalculation (...), easily corrected by the grammatical laws of that new language. Thus, in the case of a controversial discussion, two philosophers could sit down at a table and just calculating, like two mathematicians, they could say, 'Let us check it up ...'" --Gottfried Wilhelm Leibniz
AceJohnny2 14 hours ago 0 replies      
I'm getting a strong Accelerando deja-vu here.

(ref: http://www.antipope.org/charlie/blog-static/fiction/accelera... )

odrekf 14 hours ago 0 replies      
As expected, "Ethereum is turing-complete" turned out to be just a marketing meme (or a horrible idea at best), just like "Litecoin is silver", "Doge is community", and all the others.
endergen 14 hours ago 2 replies      
I love that Smart Contracts act basically as bug bounties to make the Ethereum systems, community knowledge, and open contracts improve dramatically.
bobjones201606 10 hours ago 2 replies      
Could the person who got the DAO's ETH create a contract to pay out miners who vote against a hard fork with the "stolen" ETH?
anaphor 14 hours ago 2 replies      
What if they had used a language without the possibility of recursion and without loops? I'm being serious. Languages that disallow certain types of recursion and disallow loops exist. You'd think they would've used something that actually allows for total functions that are guaranteed to terminate in a certain way, instead of a JavaScript clone...
pt1988 7 hours ago 0 replies      
It's too bad. I really thought Vitalik was the Chosen One, but I think he reacted incorrectly here by panicking and showing concern.

If I had been him I would have let DAO fail without even discussing any fork. Be as emotionless about it as the code itself. Admit mistakes were made, somebody (DAO investors) will lose some money in the short term but stay true to the principles of Ethereum.

PLUS, even after all this drama, $ETH has a market cap of ~$1B as well as independence (fingers crossed) from government intervention, making it an alternative to fiat currency, as well as a much better block rate than BTC, lending itself to mining on consumer-grade GPUs for the moment at least. I'm looking at Litecoin (LTC) as my next crypto-bubble investment as well.

There's still lots of good to be said about Ethereum. DAO is toast, long live DAO2! Ethereum will come out stronger long term.

thelollies 5 hours ago 1 reply      
I wonder if a language like Haskell, with a strong type system and strict typing, could help, if you were able to express the exchange of currency in the type system.
barisser 15 hours ago 1 reply      
Software is almost always flawed to a certain extent.

The problem with Ethereum is that you can't push a fix to your contract.

apapli 12 hours ago 0 replies      
Surely an issue here, irrespective of technology flaws, is the economic impact of the time taken to write smart contracts.

If it now takes substantially more dev time to write a smart contract that is "secure" (i.e. by spending lots and lots of time testing for vulnerabilities), this will invariably come at a cost.

So while the theory of smart contracts is brilliant, this recent hack might go down in history as a case study in why writing contracts is too expensive to do in all but the top 1% of situations.

lquist 9 hours ago 0 replies      
The Coinbase founder who a month ago called Ethereum the future of cryptocurrency looks kind of silly now.
Animats 9 hours ago 0 replies      
It's now clear that Ethereum's contract system has major security vulnerabilities. This is a killer flaw for something which exists only to secure transactions between mutually mistrustful parties. Maybe someone else will do this again, better. Ethereum itself is probably doomed.

There are also hard questions to ask the people involved. The same people seem to be behind Ethereum, the DAO, and the programmable door-lock startup which the DAO was supposed to fund. This is suspicious. This might be an inside job. At least three times in the Bitcoin world, some Bitcoin exchange claimed they lost money due to a "hack", but in the end it turned out to be an inside job.

xoa 11 hours ago 1 reply      
I wrote this originally in response to a post by cyphar (https://news.ycombinator.com/reply?id=11942889), but I'd really love a response from anyone with more knowledge or thoughts on how the law might interpret this situation in relation to already existing "virtual legal systems". cyphar wrote:

 "It was definitely fraud. Either it was a contract, in which case "taking $50 million without anyone's consent" doesn't pass the officious bystander test. Otherwise it wasn't a contract, so the default is still fraud (though in that case, the DAO would be the thing being defrauded rather than the investors)."
I'd like significantly more details on your legal theory for how this was "definitely fraud", or how "the default is fraud" if there was no legal contract, the split executor was a direct party to the agreement, and the terms of the agreement were followed. "$50 million" was not taken; 3.6 million or so "ether" was split out from the original smart contract/account, which may or may not end up having some dollar value.

In at least two previous threads [1, 2], comparisons were made to the MMOG EVE Online (developed by CCP Games), and I think the analogy raises some interesting questions. EVE is a self-contained artificial system that, so long as the system framework itself is not exploited, is ruled entirely by code. It has a digital currency (ISK) that has an official conversion rate from real-world fiat currency to ISK, and an unofficial ToS-breaking (but non-criminal AFAIK) conversion rate from ISK back to fiat. CCP has nothing to do with this exchange, but exchanges for ether to other currencies are also 3rd party. Virtual items as well as ISK itself thus have calculable value, both within the system, in terms of time/rarity, and in terms of nation-state issued money. There are person-actor driven "contracts" of various sorts, and people-driven markets. There is fraud, piracy, and pure chaotic destruction as well, and that's considered entirely within the system framework so long as the overall code rules of the world are not violated.

Given that, what exactly is the legal difference between the DAO and EVE Online? In EVE, if Alice attacks Bob, or infiltrates his organization, and steals/destroys a few hundred billion in ISK (equivalent to a few thousand dollars at current exchange rate), or sets up a fraudulent bank/scheme that makes off with hundreds of billions or trillions (this has actually happened), do you think Bob should be able to sue in a real-world court of law? The essence of the situation looks similar/identical to me: the parties are operating under a non-contract, non-legal agreement to be governed within a specific set of world rules, and therefore any interactions within those world rules, however angry they might get about it on a personal level and whatever colloquial English they might use to describe it, create no legal standing of any sort outside. If the EVE client/server were hacked (the world rules distorted) it'd be an issue (though still not necessarily a legal one), just as if the Ethereum VM was hacked it'd be an issue, but if those were operating entirely as they should be, then what's the basis of legal complaint? What's the generalized objective basis for court involvement in people agreeing to mess around with made-up numbers on computers, even if other people decide to try to exchange those numbers for money?

1: https://news.ycombinator.com/item?id=11925904

2: https://news.ycombinator.com/item?id=11929208

baybal2 11 hours ago 1 reply      
FYI, I recently came upon this company http://m.imgur.com/Hv9n7fk,55JgdgK that wants to sell ETH to Chinese buyers. I checked the company registry and found out that the guy had registered a proprietorship back in 2013.

Apparently a friend of Vitalik.

yarrel 10 hours ago 0 replies      
....to succeed. ;-)
teslaberry 13 hours ago 0 replies      
Run for the Bitcoin hills!
cheez 15 hours ago 3 replies      
> They created a situation in which bugs would be expected to arise in an environment in which bugs are legally exploitable. That is hacker heaven.

Every single system has this problem. You think banks aren't hacked?

heliumcraft 13 hours ago 0 replies      
Absolute trash article.
JSON Web Tokens vs. Sessions float-middle.com
314 points by darth_mastah  2 days ago   165 comments top 28
StevePerkins 2 days ago 11 replies      
For people using JWT as a substitute for stateful sessions, how do you handle renewal (or revocation)?

With a traditional session, the token is set to expire after some period of inactivity (e.g. one hour). Subsequent requests push out the expiration... so that it's always one hour from the last activity, rather than one hour from initial authentication.

With JWT, the expiration time is baked into the token and seems effectively immutable. To extend the session, you have to either:

1. Re-authenticate from the browser every hour and store a new JWT token, which is kind of an awful user experience, or

2. Renew the JWT token from the server side every hour. At least in Google's implementation, this requires the user to grant "offline access" (another awful user experience)... and you'd need some hacky approach for replacing the JWT token in the user's browser.

So with all the recent discussion about not using JWT for sessions, what do you guys do? Do you simply make users re-authenticate every hour? Is there another magic trick that no one has brought up?

In my own shop, we're using the browser's JWT token as a server-side cache key... and storing the "real" expiration in cache so that it can be extended (or revoked) as needed. I would be interested to know if others take a similar approach, or have issues with that?
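For what it's worth, the cache-key scheme described in the last paragraph can be sketched in a few lines. This is an illustrative in-memory model (in practice the store would be Redis or similar; the names and TTL are invented, not any library's API):

```python
import time

SESSION_TTL = 3600  # one hour of inactivity, as in the example above

# In production this would be Redis/memcached; a dict stands in here.
session_store = {}

def touch(jwt_token):
    """Record activity: slide the server-side expiration forward."""
    session_store[jwt_token] = time.time() + SESSION_TTL

def is_active(jwt_token):
    """The JWT's own exp claim is ignored; the store is authoritative."""
    expires_at = session_store.get(jwt_token)
    return expires_at is not None and time.time() < expires_at

def revoke(jwt_token):
    """Server-side logout/revocation, which a stateless JWT alone can't do."""
    session_store.pop(jwt_token, None)

touch("token-abc")
assert is_active("token-abc")
revoke("token-abc")
assert not is_active("token-abc")
```

Because the store is authoritative, this restores both sliding expiration and revocation, at the cost of reintroducing server-side state.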

DenisM 2 days ago 3 replies      
IIRC tptacek has been beating this drum for a while, but it seems that he got tired of it, so I should pick up the drumsticks in his stead:

Use less crypto. The less crypto is being used, the fewer mistakes are being made.

When it comes to sessions, generate a secure random 256-bit token and use that as a session id. Store it in a database or in-memory store. Sticky sessions + local session token storage will fix your network latency problems when you start scaling out.

Federation becomes moderately difficult. Perhaps you could store a copy of the session id on each node, and when a session id is compromised, proactively reach out to all nodes and ask them to purge the session. This allows immediate mitigation of a session id leak, and since it doesn't rely on timeouts, there is no vulnerability window for data exfiltration upon a breach. And no crypto.
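Generating the 256-bit token described above is one line in most standard libraries. A Python sketch:

```python
import secrets

def new_session_id():
    # 32 bytes = 256 bits from the OS CSPRNG, URL-safe base64 encoded.
    # No signing, no crypto to get wrong: the random value itself is the secret.
    return secrets.token_urlsafe(32)

sid = new_session_id()
print(len(sid))  # 43 base64 characters for 32 random bytes
```

The only things to get right are using a CSPRNG (not `random`) and keeping the lookup store fast.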

jordz 2 days ago 2 replies      
Also, worth mentioning this article as an opinion:


Edit: Title is Stop using JWT for sessions

nstart 2 days ago 5 replies      
That last part, where he talks about logging out being the responsibility of the client, is rather key. Basically, I can't invalidate the key from the server side. So if a user's account is compromised and they recover it on their mobile app, for example, I can't sign the user out everywhere else too. It's what has given me pause about JWT so far and has held me back from using it. I find the cookie is generally good enough to hold most other information I need about the user.
mkagenius 2 days ago 1 reply      
tootie 2 days ago 1 reply      
If you need to validate the Authorization header on every request, that's not really different from the session tokens we've been using for the past 15 years. JWT is just a formalized way of managing cookies. Which is nice, and I like it, but it doesn't actually enable anything that couldn't be done before, albeit with a more ad hoc approach.
saynsedit 2 days ago 1 reply      

  [headerB64, payloadB64, signatureB64] = jwt.split('.');
  if (atob(signatureB64) === signatureCreatingFunction(headerB64 + '.' + payloadB64)) {
    // good
  } else {
    // no good
  }
You really need a constant time compare for the signature, else you leak information about the correct signature in the timing of the response.
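In Python terms, the fix is `hmac.compare_digest`; most platforms have an equivalent (Node, for instance, ships `crypto.timingSafeEqual`). A sketch of an HS256-style verify with a constant-time comparison (the key and token material below are made up):

```python
import hashlib
import hmac

SECRET = b"server-side-signing-key"  # illustrative, not a real key

def sign(signing_input: bytes) -> bytes:
    # HS256-style MAC over "header.payload"
    return hmac.new(SECRET, signing_input, hashlib.sha256).digest()

def verify(signing_input: bytes, signature: bytes) -> bool:
    # compare_digest takes the same time whether the first byte
    # or the last byte differs, so the timing leaks nothing.
    return hmac.compare_digest(sign(signing_input), signature)

msg = b"eyJhbGciOiJIUzI1NiJ9.eyJzdWIiOiIxMjMifQ"
assert verify(msg, sign(msg))
assert not verify(msg, b"\x00" * 32)
```

A plain `===`/`==` bails out at the first differing byte, which is exactly the per-byte timing signal an attacker measures.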

hit8run 2 days ago 1 reply      
A similar approach is encrypted cookies. You can take data and sign it. The server can then check if the cookie data is correctly signed and accept or reject the data in the cookie. This approach also scales horizontally. I've been using it for a while now in Go (http://www.gorillatoolkit.org/pkg/sessions). If you want, you can also encrypt the expiry date into the secured cookie.
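Outside Go, the same sign-then-verify idea comes down to an HMAC over the cookie value. A bare-bones sketch (tamper-proofing only, no encryption; the key is a placeholder, not the gorilla/sessions API):

```python
import base64
import hashlib
import hmac

KEY = b"cookie-signing-key"  # hypothetical key; store and rotate securely

def encode_cookie(value: bytes) -> str:
    mac = hmac.new(KEY, value, hashlib.sha256).digest()
    return (base64.urlsafe_b64encode(value).decode() + "." +
            base64.urlsafe_b64encode(mac).decode())

def decode_cookie(cookie: str):
    value_b64, mac_b64 = cookie.split(".")
    value = base64.urlsafe_b64decode(value_b64)
    mac = base64.urlsafe_b64decode(mac_b64)
    expected = hmac.new(KEY, value, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, mac):
        return None  # reject tampered data
    return value

c = encode_cookie(b"user=42;expires=1466500000")
assert decode_cookie(c) == b"user=42;expires=1466500000"
assert decode_cookie("dGFtcGVyZWQ=." + c.split(".")[1]) is None
```

The server stays stateless, like JWT, but the format is as simple as it gets: one value, one MAC.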
zatkin 2 days ago 1 reply      
Aren't cookies the safest approach for storing authorization tokens? I recently found out that both Google and Facebook use cookies for authorization, so it seems like the way to go, though I've read that it gives programmers headaches.
palmdeezy 2 days ago 1 reply      
This is the 3rd or 4th article in the last 3 weeks on JWT. Each has argued that JWT is either secure or totally useless. What is the deal?
jcoffland 2 days ago 1 reply      
This is a practical implementation of something I described on stackoverflow in 2013. http://stackoverflow.com/questions/319530/restful-authentica...

The discussion there is rather interesting. The problem of invalidating logins is discussed. I have not found any satisfactory solution to this problem. You can set a timeout on tokens but then the user would have to log back in periodically. If the software can renew the token automatically then there is nothing to stop an attacker with a stolen token from doing the same, indefinitely. Still, in many situations these problems are no worse than compromised session based logins.

carapace 2 days ago 1 reply      
Reminds me of "macaroons".

http://research.google.com/pubs/pub41892.html "Macaroons: Cookies with Contextual Caveats for Decentralized Authorization in the Cloud"

skybrian 2 days ago 0 replies      
I didn't understand this part:

"if your application maintained a list of key/algorithm pairs, and each of the pairs had a name (id), you could add that key id to the header and then during verification of the JWT you would have more confidence in picking the algorithm"

This implies there's a security benefit, but I don't understand how it's better than checking the alg parameter against a whitelist. Perhaps if you're using non-standard names for algorithms, that guards against mistakes?
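One way to read the suggestion: the server keeps its own `kid` -> (key, algorithm) table and requires the token's `alg` to match what was registered, which amounts to a whitelist keyed by `kid`. A minimal, hypothetical illustration (the table contents are invented):

```python
# Server-side registry: key id -> (secret, algorithm). Anything not
# listed here is rejected, which is what makes this a whitelist.
KEYS = {
    "2016-06-key": (b"current-secret", "HS256"),
    "2016-01-key": (b"old-secret-kept-for-rotation", "HS256"),
}

def select_key(header: dict) -> bytes:
    entry = KEYS.get(header.get("kid"))
    if entry is None:
        raise ValueError("unknown key id")
    secret, registered_alg = entry
    # The token's own alg claim must match what *we* registered;
    # an attacker downgrading to "none" (or to HS256 with an RSA
    # public key) fails right here, before any verification runs.
    if header.get("alg") != registered_alg:
        raise ValueError("algorithm mismatch")
    return secret

assert select_key({"kid": "2016-06-key", "alg": "HS256"}) == b"current-secret"
```

Compared with a bare `alg` whitelist, this also buys key rotation: old keys stay in the table until their tokens expire.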

jcadam 2 days ago 0 replies      
So, I've started working on a new project recently, and thought I'd use JWT (instead of Ye Olde cookie-based sessions). After working with it for a bit, I've decided that I can probably get away with using it for authentication, but definitely NOT for authorization.

I'm using a kind of RBAC and storing Roles in the JWT just seems like a bad idea. Header size is one issue, but also there is the problem of granting/revoking a role and having that change reflected immediately (rather than waiting on the token refresh window).

So, now my API requests happen thusly: "Ok, I have a valid token for User X, so I accept that this request came from User X. Now, let's check User X's roles in the database to see if they have permission to perform action Y on resource Z..."

Hmm... I'm not sure this feels right.
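That flow, tokens for authentication plus a fresh database lookup for authorization, can be sketched like this (the role and permission tables are hypothetical stand-ins for the database):

```python
# Hypothetical stand-ins for database tables: user -> roles, role -> permissions.
USER_ROLES = {"user-x": {"editor"}}
ROLE_PERMS = {"editor": {("edit", "article"), ("read", "article")}}


def authorize(user_id, action, resource_type):
    """The token already proved *who* the caller is; roles are looked up
    fresh on every request, so granting or revoking a role takes effect
    immediately instead of waiting for the token refresh window."""
    for role in USER_ROLES.get(user_id, ()):
        if (action, resource_type) in ROLE_PERMS.get(role, ()):
            return True
    return False
```

The extra database round-trip per request is the price paid for immediate revocation, which is the trade-off described above.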

Bino 2 days ago 2 replies      
Why the obsession with 3-letter short names? Why "typ" and not "type"? I'm sure the overhead can be ignored, and the parser doesn't care.
carsongross 2 days ago 0 replies      
I recently added support for reading/writing localStorage variables in intercooler.js to support things like this:


moyok 2 days ago 2 replies      
The thing that scares me about using JWT is that all security completely relies on the one secret that is used to sign tokens - any person with access to that secret has got potentially unlimited access to the app. They can now impersonate any user and do basically anything.
stephenr 2 days ago 0 replies      
I've had this "them: we should use JWT, it's easier to scale; me: the customer has 200 clients, they don't need to scale shit, sessions are fine" argument before, and honestly at this point I'm pretty convinced JWT isn't a replacement for session auth.

The one place I can see it might be useful, is when you need to give a third-party system some kind of access to your API, as an alternative to storing a secondary hashed field(s) for api keys.

spriggan3 2 days ago 0 replies      
A few years later : "JWT in place of sessions considered harmful".
hharnisch 2 days ago 1 reply      
JWT is especially useful for validating requests in a microservice architecture. You can pass the token around and embed roles in it. No need to keep a session store with them!
erpellan 2 days ago 0 replies      
The assertion that cookies require server state is wrong. A JWT would easily fit inside a cookie. Or a URL for that matter.
ams6110 1 day ago 0 replies      
Functionally, sounds a lot like http basic authentication, only more complicated.
smaili 2 days ago 2 replies      
Correct me if I'm wrong, but the benefits sound very similar to a good old-fashioned cookie, except that you're not limited to 4kb.
geuis 2 days ago 1 reply      
How does this avoid the problem of a 3rd party getting your jwt token? Then they can do any request as you.
blazespin 2 days ago 0 replies      
jwt are the capability model. They are not forgeable and they expire. You can revoke them by telling the database to ignore them by the token id. If you have a services based model you can pass them along and they can verify your capability without going to the db.
davb 2 days ago 0 replies      
XHR? Sessions.

REST? Bearer tokens or OAuth.

Third party trust? JWT.

tszming 2 days ago 0 replies      
People don't realize this is not a proper comparison, as JWT is only the format/spec; you could still achieve a stateless client session by encrypting an XML payload (e.g. a user id) in the browser cookie. Storing data on the client and verifying it by signature is not a new thing.
lukeh 2 days ago 0 replies      
Ah, Kerberos :-)
Deconstructing the DAO Attack: A Brief Code Tour vessenes.com
252 points by bpierre  3 days ago   157 comments top 17
RobertoG 2 days ago 1 reply      
I'm going to be a little polemic here.

Is this an attack?

Isn't the problem those technologies try to solve precisely getting rid of subjectivity? In a way, this could be interpreted as trying to be free of politics.

If my understanding of what they are trying to accomplish here is correct, then if the system allows it, it is, by definition, legal.

If you require a framework where something allowed by the code but with unexpected consequences is illegal, you are right back where you started.

winteriscoming 3 days ago 5 replies      
The apparent typo is odd, IMO. I don't know the language that program is written in, but looking at that snippet, "Transfer" takes 3 arguments whereas the "transfer" function takes 2 arguments. Isn't there any code review involved before this makes it into the codebase? Assuming there was some code review and the reviewer just missed it (which is very much possible), a basic unit test would have easily caught this bug, wouldn't it? After all, the test would have checked the balance etc. after execution of these functions. Aren't there any test cases around this?
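For what it's worth, a toy Python model (not the actual Solidity code) shows how trivially a balance-checking unit test would have flagged the typo: capital-T `Transfer` only records an event, while lower-case `transfer` actually moves the balance:

```python
class Ledger:
    """Toy model of the DAO balance logic. The names mirror the Solidity
    code, but this is illustrative Python, not the real contract."""

    def __init__(self, balances):
        self.balances = dict(balances)
        self.events = []

    def transfer(self, frm, to):
        # lower-case transfer: actually moves the sender's whole balance
        amount = self.balances.get(frm, 0)
        self.balances[frm] = 0
        self.balances[to] = self.balances.get(to, 0) + amount
        self.Transfer(frm, to, amount)

    def Transfer(self, frm, to, amount):
        # capital-T Transfer: only logs an event, moves nothing
        self.events.append((frm, to, amount))
```

A test that asserts the sender's balance is zero after the call fails immediately when only the capital-T event is emitted, which is exactly the check a post-execution balance assertion would have provided.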
jacquesm 3 days ago 7 replies      
Whoever thought it was a good idea to have case-sensitive function names, where two names can be identical except for case yet completely different in function?

Major fuck-up there.

That should have never passed the concept stage, nor the review stage.

Function names should describe what a function does.

seibelj 3 days ago 0 replies      
Bored computer scientists: analyze contracts that are less popular but still have money attached to them and see if you can locate a bug. Apparently you can get the money legally and the only way to stop you is to hard fork the entire currency.
lordnacho 2 days ago 0 replies      
It reads like the debugging that you've done a million times if you are working with ordinary, mutable, C-family code.

To start off with, you think you're developing a valuable skill in being able to follow a call stack along with a bunch of variables. It certainly has its uses, but after a while you come to wonder why it always ends up this way. Sure, you can understand every bug eventually, but surely there's some reason why you're constantly reasoning about the state of the variables and the order of the function calls. You can write unit tests, but of course you can't think of all the undesired behaviours beforehand.

The author does mention that something functional with strong types is necessary. Probably a good idea.

As for the split function, there's a question of whether it is really necessary. I wouldn't have thought an investment fund would need a split function. An ordinary fund doesn't; if you're in, you're in for whatever proportion that you've put in. Why not just make a second fund with the same (reduced) code if you have some other group of investors? More functions = bigger attack surface.

tdaltonc 2 days ago 0 replies      
It looks like this is a typo. The question is, "does that matter?" It seems to me that either the code means what it says, or it doesn't. If a cabal of humans are able to overrule the machines, then what is the point of Ethereum? Wouldn't we just be trading one group of authorities for another?

I want to see Ethereum work, and it is extremely unfortunate that basically THE first high-profile smart contract has gone sideways. But I'm not sure I want there to be anything we can do about it. The whole point of this project is that the blockchain is supposed to be like an indifferent force of nature.

harmegido 3 days ago 0 replies      
I'm curious, are there no tests for the code running this? It seems like some of these bugs would be caught with them (especially the typo). And it seems like insanity to not have them.
jcfrei 3 days ago 1 reply      
So, this review makes me wonder: Will the hard fork of the ethereum blockchain be of any use at all? The DAO appears to be fundamentally flawed, are all the coins there simply lost? What prevents an attacker from exploiting the DAO after the hard fork? Or will they simply invalidate all the transactions that went to the address of the DAO?
winteriscoming 2 days ago 1 reply      
This function will reduce user balances, before the vulnerable withdraw function is called. So, instead of the logging function, we should have:

if (!transfer(0 , balances[msg.sender])) { throw; }

This would <snip>.... also reduce the tokens available to the user later on

The more I think of the typo and the explanation about it in that article, the more unclear I am about that whole code.

Setting this hack aside for a moment: if that is indeed a typo and it should have called the lower-case transfer method to actually reduce the user's tokens, then does this mean that all the DAO contracts (or whatever the right term is) executed to date have been affected by it, and that the entire ecosystem state was already messed up before this hack?

Nothing in that article suggests that this code flow, where this apparent typo is present, is applicable only for this specific hack.

amaks 2 days ago 1 reply      
The biggest problem I see with the code at https://github.com/slockit/DAO/ is the absence of unit tests. The code looks complex; how the hell can you make sure that it's right?

The Solidity wiki doesn't have a single entry on testing either:


htns 2 days ago 0 replies      
I don't get how the transfer vs Transfer typo is supposed to have enabled a larger attack. The token balance is zeroed in the splitDAO function anyway, and presumably the sanity checks in transfer are also redundant, or else there would have been no need for recursion.
altern8tif 3 days ago 1 reply      
So all this hoo-ha is really just because of a capitalised "T"? If the code were fixed, is the DAO model still valid in the real world?

(I'm a cryptocurrency noob... Some of technical stuff just went over my head)

beachstartup 2 days ago 1 reply      
if you could 100% accurately and reliably model a contractual understanding in any kind of language, it would have been done eons ago. human language (i'm fond of english in particular), is extremely expressive.

but there's a reason we still have courts and judges. it's because language is the tool through which humans express their desires and fears (computer language included, because it is written by humans, so far), it's not a machine.

willtim 3 days ago 8 replies      
Modelling financial contracts in an imperative event-driven paradigm just seems like an accident waiting to happen.

EDIT: To the downvoters, if you disagree, I would very much like to understand why, please reply with a comment.

EDIT2: My position is that it is very difficult to reason about correctness and maintain invariants in a highly imperative setting. IMHO, it would be more desirable to use a declarative, or functional language. There are many successful commercial applications of functional programming to financial contracts, e.g. Lexifi, which is based on a subset of OCaml and various in-house solutions developed by banks. Using a functional programming language would also mean the code is far more amenable to formal methods and static verification. One new startup I was very impressed with, aestheticintegration.com, has had success rewriting financial algorithms in a functional programming language in order to prove and verify numerous correctness properties about them. It is a huge shame that folk do not reach straight for the functional programming toolbox when faced with these types of problems.

tinco 3 days ago 4 replies      
In addition to this the language is not explicit enough. Even standard Haskell wouldn't be safe enough for a program that manages $250MM directly without safe guards.

In addition to typesafety, it would have to be both explicitly typed at the lowest possible granularity and annotated with pre and post conditions.

Not only did this code allow a bunch of tokens to be transferred without compensation, it left the whole accounting system in a corrupted state! One that could have easily been detected and verified at any point.

This line:

 Transfer(msg.sender, 0, balances[msg.sender]);
Should have looked like this:

 // postcondition: balances[msg.sender] == 0
 Transfer(msg.sender, 0, balances[msg.sender]) :: Action<Transfer>;
 // Compiler result:
 // # Error 1: line XX expected type Action<Transfer> but instead got type Transfer
 // # Error 2: line XX violates postcondition, expected balances[msg.sender] == 0,
 //            got balances[msg.sender] != 0
And the whole function 'splitDao' should have a type annotation that completely captures the possible actions it might encompass, and pre- and postconditions that at the very least constrain the result of the function to not corrupt the accounting of the DAO (i.e. it should still have as many tokens as it believes it has).

It's nice that Vitalik Buterin is a genius, but it shows that this guy is only 22 and dropped out of university, because anyone with a degree in computer science knows about this stuff and its importance in high-reliability systems. He should have known the contract language needed typechecked side effects, and that proper pre- and postconditions were the least that should have been done.

fabled_giraffe 3 days ago 15 replies      
I know I'll get napalmed for saying this, but why is this all over HN? Why are people obsessed with cryptocurrencies?

It's bad enough that those with sufficient computing speed/power can already rip off the stock market; why does the world need another way for people to rip each other off?

Since we've determined over history that some people will take advantage of weakness for their own gain, no matter what system you create to transfer goods, resources, services or value representing those, it will be exploited. So- why even spend time on it?

If everyone who cares to maintain their own accounting data does so, regardless of what technologies we are using, we really could just trade in goods, services, and coin and completely get rid of all virtual currency. Investments could go back literally to providing coins, goods, resources, or services to those that we'd like to support, possibly in exchange for reciprocity.

Since that won't happen right away, if you really want to secure your financial future, major S&P 500 index tracking funds have been one of the smartest things to invest in over their history: http://finance.yahoo.com/echarts?s=%5EGSPC+Interactive#symbo...

While some of you made money on bitcoin, many lost. Now the same game has been played again with people believing that someone can write code to replace our currency system and if they get in early enough, they'll be kings or queens. Well- again, it didn't happen. What really is to gain from this fantasy world of cryptocurrencies? It is harming more than it is helping, from what I see.

And if you downvote this, tell me why, otherwise I'll just assume you are with those that want to exploit me and others.

homero 3 days ago 0 replies      
Glad I never touched alts. My bitcoin are safe and increasing
Docker 1.12: Now with Built-In Orchestration docker.com
267 points by sashk  22 hours ago   151 comments top 21
d33 21 hours ago 12 replies      
I chose Docker for my project hoping that it would help me create an easily reproducible development environment... and I'm not really sure it was a good idea. There are so many moving parts, and I can't name one that actually works well. Configuring storage is hell (running out of disk space can lead to errors that are really difficult to debug), and the dance with bridges makes configuring a firewall a terrible experience. I definitely wouldn't recommend it as a stable container solution.
csears 21 hours ago 3 replies      
Just based on the details mentioned in the keynote, the services API and swarm features look very similar to what's offered by Kubernetes... service discovery, rolling updates, load balancing, health checks, auto-healing, advanced scheduling.

I guess this was to be expected, but it's also kind of sad. I think it would have been a more strategic move to embrace Kubernetes instead of trying to compete with it.

tapirl 13 minutes ago 0 replies      
Docker is great (wonderful in fact) for building development/CI environment. I never use docker for production.
namidark 20 hours ago 2 replies      
I wish they would address stability issues first, like this bug[1] that has been around 2 years and is still happening on the latest versions.


ShakataGaNai 20 hours ago 3 replies      
This seems like a really exciting, albeit natural, evolution of the Docker "platform". It's really true that no one cares about containers, everyone just wants their apps to work.

That being said, the one giant omission from Docker still seems to be management of volumes/data. Great, we can run 100 nodes on AWS in a matter of minutes, but if your system has data storage requirements (e.g. almost every database ever)... you're kinda left on your own still. How does Docker orchestration migrate the data volumes?

They really tried to sell this as "You don't need to do anything but the AWS beta, the DAB and a few commands" which would be wonderful. However with the need for reliable data storage... you're still stuck doing everything "the old fashion way".

(Edit: No, I don't mean store data IN the container; I presume no one is that silly. I meant the attached volumes. No volume mgmt = greatly less helpful.)

boulos 8 hours ago 1 reply      
I find it super amusing that GCE isn't mentioned, that ostensibly Kubernetes is the source of lock-in (despite being open-source and likely having more cycles on EC2 than on GCE/GKE), and that this proudly uses gRPC for http/2 and protobuf goodness. So which is it, is Google an evil vendor trying to lock you in, or are we actually doing work in the open and just hoping you'll choose us when you want infrastructure?

Disclaimer: I work on Compute Engine, and think of the Kubernetes folks as friends.

waitwhatwhoa 21 hours ago 1 reply      
At first I was worried that this is yet another Raft implementation, but it seems they're building on etcd's version [1] which should be a win for everyone trying to prove correctness for the various implementations.

1. https://github.com/docker/swarmkit/blob/a78a157ef66adf0978f0...

geodel 21 hours ago 2 replies      
I also noticed that now 'Docker for mac' is available without beta invite. I just downloaded and installed though its version is 1.12.0-rc2.

Maybe they will make formal announcement when 1.12 for mac is released.

eddd 19 hours ago 7 replies      
I wish Docker released a lite version: only containerisation, no networking configuration or weird cache mechanisms, just something small and easy to use locally.

No one sane will run docker on prod anyway.

lvh 20 hours ago 1 reply      
The new orchestration looks based on Swarm and possibly Compose; have they resolved the issue where the lack of lifecycle semantics means that your services don't work if they have any sort of cluster semantics, like Riak, Cassandra or Zookeeper? These previously always required outside hacks to work, and work flawlessly with Kubernetes. It looks like there's some service discovery support; which may or may not fit the bill.
foxylion 18 hours ago 1 reply      
If someone else also want to see the keynote, they have a recording: http://www.ustream.tv/recorded/88610090
cdnsteve 17 hours ago 0 replies      
I have two code bases that are microservices. I thought using Docker would make it easy to set up and share a Redis instance between these services. After spending a day trying various things I just gave up, did 'brew install redis', and got back to real app dev. Docker isn't streamlined enough, doesn't have great examples, and is still changing a lot. Sometimes doing it the old-fashioned way is what lets you ship features faster.
estefan 7 hours ago 0 replies      
Have they sorted out storing secrets yet? That's one thing I noticed that Kubernetes manages but Docker is still scratching its head over.

But this looks like it could be great progress. Orchestration was always the main pain that stopped me from using docker.

kstenerud 20 hours ago 2 replies      
Omg, their navigation bar covers half my screen! Seriously, why do companies insist upon reducing my screen real estate for stupid shit?
heurist 18 hours ago 0 replies      
Great, now make it a first class citizen in compose and mix in docker-machine's provisioning capabilities!
Annatar 9 hours ago 0 replies      
The article is lots of blah blah plus "lock-in" for good measure, with exactly zero code or configuration examples, and it reads like a snake oil salesman brochure. I'm still skeptical about the purported advantage of Docker / Linux over zones / SmartOS.
ssijak 17 hours ago 1 reply      
I was using the OS X beta but uninstalled it for now because Docker was eating all my space without recovering it after deleting containers. Anybody using rkt for dev want to share their experiences? I am thinking about trying it but would not invest the time if it is not ready yet.
hosh 21 hours ago 5 replies      
I'm not surprised the Kubernetes primitives got copied into this. I had stated before that Kubernetes was what Docker wants to be when it grows up -- and maybe that time is here. Having used Kubernetes in production, I don't know how robust the Docker orchestration primitives are in comparison. But I'll probably find out soon.

The big advantage of having built-in Docker orchestration is that Kubernetes is painful to install from scratch. (Yes, there are scripts to help mitigate this; yes, GKE is effectively hosted Kubernetes.) I'm involved in another Dockerization project, but we don't know if we want to invest the time in setting up Kubernetes (though GKE is an option). This would be a good time to check things out.

Just to show how quickly things change. In 2015, I tried the docker compose, AWS ECS, and Kubernetes for production deployment, and of all of them, Kubernetes addressed the pain points in a practical way. The Kubernetes project ended in late 2015, and now six months later, things changed again ...

rcarmo 19 hours ago 1 reply      
This is great. I can't wait for an ARM build, though.
curiousDog 20 hours ago 1 reply      
Maybe i'm not uptodate or misunderstanding, but doesn't Mesos/Mesosphere do something similar?
webaholic 16 hours ago 0 replies      
is docker as good as rkt nowadays?
Scammers Game Amazon A-Z Policy By Replacing iPhones With Clay coryklein.com
293 points by coryfklein  21 hours ago   221 comments top 38
jliptzin 18 hours ago 5 replies      
This happened to me about 5 years ago on ebay when selling an iPhone. I was an eBay member since 1999, 100% positive feedback, occasional seller (about once a month). I sold a perfectly fine iPhone, buyer with no feedback history complains to eBay he never received the phone despite tracking showing it was delivered. He also leaves me negative feedback (in broken English). eBay sided with the buyer immediately and attempted to withdraw the money out of my paypal account which I'd fortunately already withdrawn. They demanded for over a year that I pay them the money they claim I owe them or they'd destroy my credit. Fuck eBay, fuck PayPal, terrible companies that should have gone out of business long ago.
soyiuz 21 hours ago 6 replies      
I've had a similar thing happen to me a few years back when I sold a no-longer-needed but still unopened Windows Vista DVD that I purchased directly from the MS Store (on Microsoft campus). The buyer returned the package, but the returned merchandise was clearly counterfeit. The box was made of a light material, missing holograms, faded colors, etc. The buyer's history was full of similar purchases.

I've been careful about documenting shipments since. Overall, I am reluctant to sell on Amazon these days because of the A-Z guarantee, which is only sustainable for large sellers.

Fraud is a sticky problem at scale. A few bad actors poison the well for everyone. As a buyer I appreciate Amazon's policies, but as an occasional seller I find them completely draconian. There is very little you can do to contest a decision against you.

ericabiz 19 hours ago 0 replies      
I wrote about this back in 2013 (and made it to the front page of HN): "Scammed By Amazons 'A-to-Z Guarantee'"


tl;dr I sold my old cell phone on Amazon and the buyer claimed it was stolen. Amazon refunded him in full and took the money out of my bank account.

I also sold a computer on Amazon around the same time and the buyer tried to do the same thing. That one I caught and argued successfully for, but it was still a pain to go back and forth.

I had another laptop I wanted to sell online, but because of these two experiences I was really gun-shy. I ended up stuffing it in a cabinet for over a year and eventually sold it to a friend for significantly less than I would have sold it for online--but at least I knew I wouldn't get scammed.

I've had better luck with the Amazon trade-in program, especially since they sometimes do incentives. I've traded in two Kindles and ended up getting a new Paperwhite for only about $40-50 after all the incentives from both old Kindles were in my account.

Someone1234 21 hours ago 2 replies      
These might be buyer scammers, or sellers themselves have been scammed upstream. This article is right you'd have to be bonkers to try and pull this on Amazon as a seller. But he's assuming the seller knows there is no iPhone in the sealed box.

Perhaps Amazon Sellers are attempting to buy iPhones on the grey market at a discount, in order to resell them on Amazon at a profit. However the person they're dealing with upstream is actually shipping them clay or similar and plans to disappear when discovered.

Don't get me wrong, buyer scams exist, and that might be what is happening here. But why even post a review at all? The A-Z guidelines don't require it. He's also making the assumption that an Amazon Seller is going to open a sealed phone, or that their channels never get fraud in them.

ChuckMcM 21 hours ago 1 reply      
As others have pointed out, dealing with fraud at scale is really, really difficult. As a result, its costs are often priced into sales (either transaction costs or product costs).

While the privacy implications are large, it is an area where more data around the transaction can identify systemic fraud (which is to say, organized groups that spread their activity across a wide area to keep it below the radar of fraud detection algorithms).

In the case of phones, the seller has a lot of power. They can record the serial number/IMEI data from the box prior to selling it, at which point you can track where the phone is and whether it's being used. The challenge has always been taking action against the scammers. So perhaps there is a startup idea for a private investigation group in large cities that would go get scammers and recover merchandise. Such action would quickly reveal whether the scammers were part of a larger organization or acting independently.

AJ007 21 hours ago 5 replies      
Interesting conclusion to the article. I stopped buying Apple items off Amazon a really long time ago because they let counterfeit chargers list as real Apple products and did jack shit about it. To some extent this has begun flowing in to other product categories for me as well.

"Meanwhile, the product page is flooded with fraudelent (sp) reviews, poisoning the well and moving future customers away from Amazon and towards trusted Apple retailers. Amazon and customers lose, but Apple possibly gets more customers."

codecamper 21 hours ago 4 replies      
Another thing that could be done... is that shippers could help out.

UPS, Fedex, USPS could all offer last step photo verification of packages, as a service. A few more dollars and you get a photo of what is inside and the handler packs the box up.

bambax 18 hours ago 1 reply      
This doesn't make much sense: why would supposedly fraudulent buyers bother with leaving a (public) review with photos of the clay??

This is not only a waste of time and energy but could potentially be harmful to them since it's usually possible to extract clues from a photograph.

Isn't it possible that the story is rather this:

- bad buyer 1 buys a phone, takes the phone out of the case, replaces it with clay, re-seals the box and returns it, claiming he didn't even open it

- seller restocks the package without verifying it

- good buyer 2 buys what they think is an iPhone and turns out to be a box full of clay, is upset and leaves a review

username3 19 hours ago 2 replies      
Trusted buyers. Amazon can have options for sellers to only sell to buyers that have bought from Amazon X amount of times or X years.
jimrandomh 21 hours ago 1 reply      
There are a couple hypotheses the author hasn't considered. It's possible the buyer stole the phone, yes, but it's also possible that it was a warehouse company employee, a shipping company employee, or a previous buyer who made a return.
bdrool 21 hours ago 6 replies      
Amazon is complete crap these days.

It's becoming more and more difficult to figure out who you are buying from, and as a result there are outright scams sitting on Amazon's store at this moment that they continue to do nothing about:


See that? That's the infamous "dangerous" counterfeit Apple Macbook charger that made the rounds[1][2] a while back. Notice how it says "by Apple-Computers"? This is a scam. It's not by Apple. It's a dangerous fire-hazard knockoff. Yet it's tagged in a way that will no doubt mislead people into thinking it's safe and "official". People were pointing this out months ago[3], yet it's still there on Amazon's store, no doubt along with countless other dangerous misleadingly-branded ripoffs.

This is fraud, pure and simple. And Amazon continues to profit from it, right out in the open. It's disgusting.

 [1] http://www.righto.com/2016/03/counterfeit-macbook-charger-teardown.html [2] https://news.ycombinator.com/item?id=11325150 [3] https://news.ycombinator.com/item?id=11325464

aandon 21 hours ago 1 reply      
My guess is sellers absorb this as a cost of doing business. As long as fraud rates remain in low single digit percentage, it's probably not worth it for the seller or Amazon to fight it.

It is always fun to dissect these scams. There's an interesting e-commerce scam prevalent in India that popped up as the country embraced e-commerce without much credit card infrastructure in place (most online orders are paid to the delivery person in cash) https://simility.com/delivery-fraud:

"The fraudster businesses ordered hundreds of products from the victims website to be delivered on a daily basis. Meanwhile, if customers came into their store asking for an out-of-stock product, they were told it would be in stock later that day. Then the fraudsters paid the delivery person in cash for the small fraction of products they had pre-sold to customers, while returning the vast majority of unsold products without paying for them at the cost of the e-commerce company, thus completing the delivery fraud cycle."

tempestn 13 hours ago 1 reply      
I doubt it would ever work in practice, but in theory this kind of thing could be solved by having both parties deposit an additional say 20% of purchase price to an escrow account. So buyer deposits 120% and seller deposits 20%. Once buyer confirms receipt, 120% is sent to seller and 20% back to buyer.

If there is a problem, buyer can send the item back to the seller, and once seller confirms (re-)receipt, they each get back their original contributions. However, if buyer claims the item wasn't received and seller claims to have sent it, the money remains in escrow. So the side that's lying doesn't benefit. They actually lose 20%. And therefore there's no rational reason to attempt this scam.
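A small sketch of the payoff table for this scheme (integer prices assumed divisible by 5; the outcome names are mine, not part of the proposal):

```python
RATE_NUM, RATE_DEN = 1, 5  # the 20% deposit rate described above


def settle(price, outcome):
    """Net cash flow (buyer, seller) after settlement, relative to where
    each party started. Buyer deposits 120% of price, seller deposits 20%."""
    deposit = price * RATE_NUM // RATE_DEN  # 20% of price
    if outcome == "delivered":   # buyer confirms receipt
        return (-price, price)   # buyer pays price, seller receives it
    if outcome == "returned":    # item sent back and re-receipt confirmed
        return (0, 0)            # both deposits refunded in full
    if outcome == "dispute":     # claims conflict, escrow keeps everything
        return (-(price + deposit), -deposit)
    raise ValueError(outcome)
```

The dispute row shows why the scam is irrational: whichever side is lying ends up strictly worse off than if it had settled honestly, but it also shows the flaw noted below, since the honest side loses its deposit too.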

However, in the real world there are several problems. The person being "scammed" would be out 120% of the value (whether they're the buyer or the seller), which wouldn't fly with most people, even if the scam is theoretically unlikely. It would also require significantly greater trust of Amazon. Something like this might work with hyper-rational actors using something like a bitcoin multi-sig address, but not for Amazon.

Which means I'm back to not seeing any perfect solution to this kind of problem.
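The incentives in this escrow scheme can be sketched as a toy payoff model (a rough illustration, not a real design; amounts are normalized to a price of 100, and the 20% bond figure comes from the comment above):

```python
PRICE = 100
BOND = 20  # each side's extra escrow deposit (20% of the price)

def escrow_outcome(outcome):
    """Net money change for (buyer, seller), ignoring the item's resale value.

    'delivered': buyer confirms receipt; seller is paid, bonds are returned.
    'returned':  buyer sends the item back; the whole trade unwinds.
    'disputed':  claims conflict; price and both bonds stay locked in escrow.
    """
    if outcome == "delivered":
        return (-PRICE, +PRICE)
    if outcome == "returned":
        return (0, 0)
    if outcome == "disputed":
        return (-(PRICE + BOND), -BOND)
    raise ValueError(outcome)

# A buyer who keeps an item worth PRICE but disputes anyway nets
# 100 - 120 = -20: worse than honestly confirming receipt (100 - 100 = 0).
```

This is the comment's point in miniature: whichever party lies forfeits its bond, so the scam has negative expected value for a rational actor.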

codecamper 21 hours ago 1 reply      
Amazon could set up some honeypots, whereby they themselves send out the product. They know it's good.

If a customer tries to scam that one... then Amazon has found their scammer.

Also... I'm surprised that Amazon just hands out these sorts of refunds on big ticket items. If I were them I'd require some sort of biometric data for the refund.

laksjd 21 hours ago 3 replies      
I'm going to guess this only works with sellers that aren't fulfilled by Amazon? It's an annoyingly effective scam. How can sellers protect themselves?
profeta 21 hours ago 1 reply      
given the amount of garbage Amazon sells nowadays, if it didn't favour the buyer they would be gone by now.

Almost everything I buy that is not books I have to return once or twice until I receive the item from a seller that is not trying to fence a counterfeit version. This happens with everything from $100 pro SD cards down to $15 Arduino clones (which I think Amazon still does not carry the original of, but all the ads show the original in the picture).

Animats 21 hours ago 3 replies      
Why is Amazon even selling new iPhones from sources other than Apple? That's eBay's job.

Looking on Amazon, I'm not seeing any iPhone 6 sellers other than Apple.

swanson 21 hours ago 1 reply      
Question: how should a seller document and defend that they indeed shipped an iPhone (not clay)?
pyrophane 18 hours ago 0 replies      
This sort of thing is why I pretty much only sell things like this locally, in-person, and transact in cash. Buyer gets the chance to check out the item, verify the serial number, and so on. I get the guarantee that if they walk out happy it is a done deal.

No one has been able to improve upon this online. Sure, it can be more convenient to sell online, and for people who sell at volume the cost of fraud may be worth it, but for individuals who occasionally sell desirable items that are likely targets of fraud, selling online is too fraught with risk.

jseliger 21 hours ago 0 replies      
Interestingly, too, I noticed in 2013 that Amazon's general marketplace is ripe for abuse: https://jakeseliger.com/2013/02/16/is-amazon-coms-marketplac.... I assume that Amazon tracks the number of returns a particular account engages in, but still, I wouldn't sell anything of real value on the site.
Paul-ish 17 hours ago 0 replies      
Do these buyers use the same shipping address multiple times? That seems like a good way to track them. Alternatively, if they use something like a PO box, you could disallow new accounts from shipping their first few things to a PO box.
capote 14 hours ago 0 replies      
Partially related: Is anyone else bothered by the volume of negative reviews on Amazon that simply target an individual seller rather than the product? All instances of any product are covered by the same set of reviews, regardless of who the seller was, so reviews like this are totally unfair in dragging the product's rating down.

Why doesn't Amazon have a policy against this?

nabaraz 21 hours ago 2 replies      
I remember this being big in the eBay days. How can retailers like Amazon/eBay verify that neither sellers nor buyers are getting scammed? I can't think of any solution except intercepting all packages and making sure the actual product is there.
banku_brougham 21 hours ago 0 replies      
somebody is going to get a ? in email soon
jedberg 19 hours ago 1 reply      
When I worked at eBay, we constantly debated whether the policies should be pro seller or pro buyer. You can't be both in a two sided market.

Throughout my time there, we switched back and forth, but ultimately decided happy sellers brought in more revenue than happy buyers (but maybe that changed again).

Apparently Amazon has decided to be pro buyer right now, which fits with their general model of making happy customers.

Scirra_Tom 21 hours ago 3 replies      
I wonder if the postal services could create an additional service that calculates a heat map of the weight distribution of a package. I imagine you could then determine very accurately whether there is actually an iPhone in there, and if someone shows a photo of clay, it will be obvious they are scamming if the weight distribution passed as an iPhone's during transit.
rietta 19 hours ago 2 replies      
Gosh. It's almost worth it to never resell anything. Just throw old electronics away even though that's bad for the environment and squanders value both for the owner of no longer needed gear and the potential new user of it. The world would be so much nicer if everyone were honest dealing.
ortusdux 21 hours ago 1 reply      
I wonder if using Amazon's fulfillment service would help prevent this. Is it possible to get merchandise delivered directly to a fulfillment center from the vendor? It would be pretty easy for a seller to battle a fraud claim if they can show that they never got within 500 miles of the product.
silveira 21 hours ago 0 replies      
It is also possible that the buyer is a seller of iPhones and is trying to get rid of competition.
hackerweb 21 hours ago 1 reply      
But a lot of the negative reviews for the linked seller seem to be about a locked or blacklisted phone, or one with the wrong amount of storage. Those customers would presumably have to send back their phones, so what's the buyer-side fraud there?
WalterBright 17 hours ago 0 replies      
You'd think with all the datamining Amazon does, this would be a worthwhile problem to solve with it. After all, it is good for Amazon if people can trust their transactions there.
vblord 21 hours ago 2 replies      
I can't believe that this works at all. I imagine that between an established seller and a brand new buyer, Amazon would be on the side of the seller. In the cases where the buyer wins, who pays the cost of the device (the seller or Amazon)?
sharms 21 hours ago 2 replies      
This isn't an Amazon-only problem; the same goes for eBay etc. I would love to hear how people can sell items online safely (for the time being I only use Craigslist for this exact reason).
w8rbt 21 hours ago 0 replies      
There are too many counterfeits and schemes like this on Amazon. Finding a reliable seller that has a legitimate product and a good reputation has become almost impossible.
lordnacho 20 hours ago 1 reply      
Maybe I'm being naive, but won't there be a record of the address the item was delivered to? And credit card details? How do you avoid that?
n-gauge 20 hours ago 0 replies      
So is the clay endorsed with the fingerprints of the scammer?
balls187 21 hours ago 0 replies      
Looking at the reviews, many have only left a single review.
brador 21 hours ago 2 replies      
The only solution is a verifying third party. Postal service could offer this for a fee to make some easy money.
DAOs, Hacks and the Law medium.com
216 points by ikeboy  3 days ago   148 comments top 15
darawk 3 days ago 5 replies      
As an investor in the DAO and believer in smart contracts and Ethereum, I agree with this article.

I think we all got over-excited and took this way too fast. But I do think it would be a mistake to be too rash in throwing out the entire concept because of this one mistake. Will there be mistakes like this in the future? Definitely. But it is my hope that the entire cryptocurrency community will be chastened by this experience into taking things a little more slowly in the future.

IMO those of us who invested in the DAO should lose our investment. We fucked up and we deserve the loss, and if people see that real money was lost, perhaps they will be more judicious in the future with their investment decisions. I certainly will be.

But I don't think it makes sense to let the thief get away with the money either. I know in some sense there is a philosophical problem that the 'code is the contract' and the 'contract is the law' and therefore the code is the law, for better or worse. But IMO allowing this to happen would just be counterproductive. There's no benefit to letting him (or her!) take the money and run, and quite a bit of harm to the ecosystem and probably to lots of people who just held some ether and didn't invest in the DAO.

I'd like this event to be seen as a learning experience. People were overzealous and they got burned. In the future, let's be more careful, but let's keep exploring the possibilities of this technology.

stevecalifornia 3 days ago 1 reply      
"DAO, I closely read your contract and agreed to execute the clause where I can withdraw eth repeatedly and only be charged for my initial withdraw.

Thank you for the $70 million. Let me know if you draw up any other contracts I can participate in.

Regards, 0x304a554a310c7e546dfe434669c62820b7d83490"

sbov 3 days ago 2 replies      
Is it just me, or would this going to court be the worst-case scenario for Ethereum?

I mean, if the "hacker" wins, then it shows how impractically dangerous "code as contract" can be - you better be damn sure it's correct.

And if the "hacker" loses, it invalidates code as contract completely. The DAO claimed the code, and only the code matters. But what the DAO claims doesn't mean shit if courts say that is not true. Your whole idea is now just bullshit.

curiousgal 3 days ago 2 replies      
I believe people are looking in the wrong direction. Whoever did this already made millions shorting ETH right before the "attack".


jacques_chester 3 days ago 5 replies      
I have to admit, I share some of the author's smugness.

The world is very, very complex.

That is why the law is very, very complex. It covers everything humans do, have done, or will do. Alone, together, in small groups or large groups. As private individuals or public bodies. With real objects or imaginary objects. In their homes, on the street, in public buildings, in private parks. On the ground, under the ground, on the water, under the water, in the air, in orbit, out to the limits of human space.

Every day people come to the courts with potentially totally novel combinations of people and events, and the courts guarantee they will make a decision.

The courts have been doing this for nearly a thousand years and are still chugging along solving new problems. This should indicate that this is not a permanently solvable problem. The law is an adaptive, dynamic system.

All of this is why, as a software engineer who once studied (and mercifully quit) law, I am sometimes bemused by the idea that bodies of law can be ignored or swept away by code.

The law doesn't see it that way and in this game, the law gets the final move.

abalone 3 days ago 4 replies      
This armchair legal theory is rather pleasantly eviscerated by Matt Levine today on Bloomberg.[1] In short, you do not get immunity from real world contract law with a one paragraph disclaimer.

[1] "Blockchain Company's Smart Contracts Were Dumb" http://www.bloomberg.com/view/articles/2016-06-17/blockchain...

biot 3 days ago 3 replies      
The DAO needed to be battle-tested in EVE Online for a year before being let loose on the real world where real money is at stake. Much like Ponzi schemes and other exploits of the past, the EVE Online developers just consider it part of the game. Caveat emptor.
lpage 3 days ago 0 replies      
Good lawyers don't write overly complicated contracts, and they don't speak in legalese. The concepts covered within the contract might be complex, but the writing itself is deliberately readable. Complex things are brittle. Clear, well written contracts are more likely to be interpreted correctly by all parties if something unforeseen happens. Contrast this to the alternative - writing a contract by enumerating every possible outcome, hoping that you don't miss one, knowing that if you do, every party will argue for the interpretation that's most favorable to them. The adversarial aspect makes things that much harder.

Contracts in something as flexible as Ethereum strike me as the ultimate in fragility. There's a great use case for anything that looks like a smallish FSM - formal methods will yield something very usable and provably correct. Being able to do that on a system with a state space the size of Ethereum + The DAO - yeah, we're a ways away from that one.

brianpgordon 3 days ago 0 replies      
> according to the DAO's own legal contract

Why does it even matter what's on the DAO's website? They don't control the DAO, and you don't need to have gone through the DAO website to have invested in the original offering or in the spot market afterwards. What legal force would their website have anyway?

api 3 days ago 2 replies      
What would happen if the hacker went public, hired legal counsel, and asserted their right to the funds as per the terms of the contract?

Now that would be interesting. I could see a top legal team taking it simply for the sake of an opportunity to set legal precedent.

cel1ne 3 days ago 1 reply      
An exam question a German lawyer told me, to illustrate how a layman's understanding of law, and expectation of logic therein, often doesn't apply:

"Your employer tells you to break into an opponent's office and steal something. You do that, jump out of the window and break a leg. Is your workplace insurance legally obliged to cover the medical cost?"

noonespecial 3 days ago 1 reply      
Sounds like a case of:


In regular law there's actually a way to say "that's not what I meant and you knew that's not what I meant". Prove it and the law is with you.

baby 3 days ago 0 replies      
Interestingly, this story has attracted numerous people to Ethereum. It was a pricey advertising but it worked.
dang 3 days ago 1 reply      
Url changed from http://www.bloomberg.com/view/articles/2016-06-17/blockchain..., which points to this, which is arguably a dupe of https://news.ycombinator.com/item?id=11921900, but if people want to discuss it separately we'll leave it up.
jabgrabdthrow 3 days ago 1 reply      
Personal attacks and name-calling aren't allowed in HN comments.

You've posted quite a few uncivil comments to HN, unfortunately. Please don't do that anymore.


OpenAI technical goals openai.com
212 points by runesoerensen  23 hours ago   60 comments top 9
Houshalter 20 hours ago 8 replies      
I'm concerned that none of these goals involve AI safety. Instead all of their goals are nearly the exact opposite, of accelerating AI technology as much as possible.

Safety was one of the main goals they promoted when it was founded. 2 of the 4 authors listed have publicly spoken about their belief of AI as an existential risk.

I'm not saying that a game-playing AI is going to take over the world. But it does demonstrate the risk - we still have no idea how to control such an AI. We can train it to get high scores. But it won't want to do anything other than get high scores. And it will do whatever it takes to get the highest score possible, even if it means exploiting the game, hurting other players, or disobeying its masters.

Now imagine they succeed in making smarter AIs. And their research spawns new research, which inspires new research, etc. Perhaps over several decades we could have AIs that are a lot more formidable than being able to play Pac Man. But we may still not have made any progress on the ability to control them.

JoshTriplett 21 hours ago 0 replies      
#4, solving multiple games with one agent, seems like a reasonably interesting step towards more general intelligence. An agent that can play a new game without any game-specific information other than "what can I do" and "what should I value" starts to sound a lot more like an agent that can solve non-game problems. Especially if that agent can play games that model real-world problems.
ktta 21 hours ago 2 replies      
I find it really interesting that Goal 4 is a game playing agent. Deepmind has been focussing on this since the beginning[0] and actually has great progress as far as Atari games go[1].

And DeepMind is able to use a single agent rather than a different specific one for each game. I wonder if OpenAI wants to go in a different direction although RL has considerable success. Whereas the other goals definitely need a lot of work before they are "real-world" functional, especially Goal 3. But of course that depends on their definition of 'useful'.



feral 18 hours ago 1 reply      

"Build a household robot" is high up the list. That doesn't seem inherently 'general'.

Certainly, people have been working on that for years; there are all sorts of subproblems like vision, contextual reasoning etc.

It could be treated as a general problem, requiring a lot of 'common sense'.

But a team which sets out to optimize that particular goal, could spend years on relatively narrow tasks that get good performance returns on household chores (e.g. developing version 10 of the floor cleaning algorithm), but don't really make progress towards the problem of general intelligence.

For me, what was really interesting about the benchmarks that Deepmind chose (the choice of a selection of Atari games) was that they were inherently somewhat general.

Are you not worried that by putting a narrow domain fairly high up, you'll get distracted by narrow tasks, rather than making progress towards what's really interesting - generality? Won't it introduce tension to try and keep the general focus in the presence of a narrow goal, where you can get good returns by overfitting?

jimfleming 21 hours ago 5 replies      
This is great! The goals seem reasonably ambitious and mostly doable over a few years.

I am surprised by #2: "Build a household robot". It's my understanding that efficient actuation and power are largely unsolved problems outside of the software realm. What's the plan for tackling stairs, variable height targets, manipulator dexterity, power supply, etc. in a general purpose robot with off-the-shelf parts? (Answering these questions may be part of that goal but maybe someone knows more on the subject.)

fitzwatermellow 21 hours ago 0 replies      
Goal 5: Inventing the Next Paradigm

Any impetus into actually dreaming up what may come next after GPUs and Async DRL? Non-neural models, quantum computing based AI, optogenetic hacking ;)

Otherwise, excellent list!

mrfusion 14 hours ago 0 replies      
I wish they had mentioned Asimov's three laws.
lowglow 18 hours ago 0 replies      
Are the objectives of "OpenAI" conflicting with the interests of the startups applying to YC? OpenAI is building their own products/platforms and insights acquired from AI/Robotics startups applying and sharing information about what they're working on might be used as a competitive advantage.

Is there an information firewall between what startups are sharing in the hope of investment and what is shared to advance OpenAI? If there is a firewall, how is it enforced?

nxzero 21 hours ago 2 replies      
Is OpenAI willing to support true AI having basic rights like those given to humans?

If so, why is this not one of the fundamental technical goals?

If not, why?

Simple Contracts are Better Contracts: the Meltdown of the DAO blockstack.org
188 points by jackaltman  2 days ago   102 comments top 16
sandworm101 2 days ago 8 replies      
How many of the TheDAO Curator members are lawyers?

Contracts are agreements that are meant to be legally enforceable. The enforcer has always been the King or a local governmental authority: a third party. The very concept of a contract assumes the neutral third party. That third party is to interpret the contract, identify potential scoundrels, nullify illegal contracts, and generally make sure everyone isn't playing games. Smart contracts seek to sidestep that ancient structure by replacing the neutral third party with an inflexible machine. Good luck with that.

Contract language is also meant as a manifestation of intent. Smart contracts seek a perfect manifestation, dismissing all notions of imperfect knowledge or misunderstanding. Typos rarely matter in real contracts. Intent can trump language where appropriate. But in smart contracts typos are everything. Good luck with that too.

grellas 2 days ago 1 reply      
Can code both embody and replace law for the exact function for which it is set up?

DAO strives to execute through code an idealized pooled investment system by which contract issues are resolved entirely by code and wholly apart from any external societal legal or enforcement mechanisms.

All well and good but, where people are involved, code simply cannot define all the relations needed to capture what the law does (and, indeed, and in spite of its flaws, does very well indeed).

Consider the argument that the exploit here is not a flaw at all but just another variation on what the code does, with the result that investors who suddenly are $50M lighter in their wallets have not been harmed at all and should have no recourse to any remedy to restore their funds to them. The idea here is that the code is the contract and, if that is what the code does, well, that is what you bargained for, whether this is good or bad from any particular moral perspective. Right at the entry point of the system is a prominent disclaimer that says this in exact words. So a contract is a contract. If you don't like the result, tough.

The participants here are wealthy and presumably sophisticated investors. What if they aren't? What if this were marketed to a lot of gullible small investors who were induced to part with their money through various representations stating that their funds were entirely safe, subject only to normal investment risks relating to the underlying companies they funded? What does society do when people like this lose their life savings when some newly discovered "feature" of the code allows a sharpie to walk away with their funds? Are they to have no legal recourse because a "contract is a contract," especially if it embodied in code?

And what happens if a system is set up and the person or persons who find the new "feature" enabling them to walk away with other people's funds are the very people who organized the fund? Does law from the broader world step in to provide a remedy to those who lost their money? Or does the "contract is a contract, especially in code" logic work to deny any remedy to the participants here as well?

And, setting aside any of the more extreme examples, what if it is simply the case that those who did participate had reasonable expectations that any code that would define and limit their rights would do all that was expected in terms of defining their investments but would include safeguards that would prevent anyone from simply coming in to remove their funds altogether (dare I say "steal")? What if they were misled into having such expectations by promoters of the venture who said or implied that such safeguards existed? Is it enough to say that none of this matters because of some disclaimer buried in fine print? Is all of this simply irrelevant just because a "contract is a contract, especially in code"?

Contracts are part of any system of law that includes private property, and a very important part at that.

But contracts can never define the totality of the law that applies to a given situation, even if the parties swear up and down that that is their intent.

That is why securities laws exist, to help investors who get swindled by sharpies with well-honed contracts.

That is why the laws relating to fraud exist, to help those who are misled by others to their financial detriment.

Indeed, that is why a sophisticated body of laws exists relating to contracts themselves, to cover cases where the intent of the parties is sometimes so frustrated by one thing or another as to make it inequitable to enforce a contract.

Law is and always has existed in multiple layers. Legislatures pass statutes but courts exist to interpret them to cover specific cases as disputes arise. The same with administrative regulations promulgated by agencies. Even within the courts themselves, common law courts would declare legal "rules" only to have courts of equity intervene to correct things where the "rules" led to harsh or inequitable results.

Basically, all of this is another way of saying that human relations are complex and any system of laws and justice needs to be able to handle such complexity if it is to be worthy of being a system of justice.

Perhaps in narrow cases, things such as DAO can be set up to create a rich guy's playground of sorts in which, for the overwhelming number of cases, outside laws play no part within the self-contained system. Perhaps there is even an ideal of some type to be realized here (get rid of lawyers, etc.).

But no such system can ever be utterly divorced from the rules of the broader society. Ideal or no ideal, this is just not how the law works. Apart perhaps from some survivalist society or other, people simply cannot exempt themselves from the general rules of law no matter how much they desire to do so. They can limit the application of such broader laws to a degree but, when key bounds are transgressed, the law will apply in its full force regardless of their intentions.

So, I would say that the curators here probably had no choice. It was either do what they did or watch as lawsuits followed, probably in abundance. This may have violated some ideal in play here but it was a pragmatic necessity given how law in reality works (and always will work).

Animats 2 days ago 1 reply      
There are two fundamental problems with Etherium contracts.

1. They're executable programs. They could have been a set of declarative rules listed in priority order, but no, the designers went overboard and made them general programs with loops and recursion. There are straightforward ways to analyze sets of rules; they're usually amenable to case analysis. It's hard to analyze programs.

Writing a declarative contract language is a challenge. But doing so forces the designers to think through what they want the system to be able to do, and what they don't want it to do. Doing contracts as executable programs is punting on the problem. It says "we don't know how to do this, so we'll dump the problem on the users."

2. The stack overflow problem is idiotic. The system should have been designed so that if a program aborts, anything it did is rolled back. That's the design flaw this attack exploits.
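The rollback semantics argued for in point 2 can be illustrated with a tiny transactional wrapper. This is a hedged sketch in plain Python, not actual EVM behavior; all names are hypothetical:

```python
import copy

def run_transactional(state, program):
    """Run `program` against a scratch copy of `state`.

    If the program raises, every write it made is discarded (rolled back);
    only a program that completes normally gets its writes committed.
    """
    scratch = copy.deepcopy(state)
    try:
        program(scratch)
    except Exception:
        return  # abort: `state` is untouched
    state.clear()
    state.update(scratch)  # commit, all-or-nothing

def faulty_transfer(s):
    s["a"] -= 50
    raise RuntimeError("aborted mid-transfer")  # e.g. some limit was hit

ledger = {"a": 100, "b": 0}
run_transactional(ledger, faulty_transfer)
# ledger is still {"a": 100, "b": 0}: the partial debit did not stick
```

The design choice is simply that partial execution never becomes visible, which removes the class of attacks that rely on a program aborting halfway through its state changes.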

alistproducer2 2 days ago 0 replies      
Bailing out the DAO undermines the core value propositions of Ethereum - contract immutability (stability) and decentralization. I understand that the argument is "this is a special (ie, too big to fail) case; however, who can be sure?

IMO the better way to handle this is to acknowledge the mistake and let it fail. It's embarrassing I get it. Honestly, the big bank types who threw millions at this tech without doing due diligence deserve to lose their shirt. It's called speculation for a reason.

If the core team cares about the long term credibility of the project with the people who really matter - the tech community - they will not bail it out.


For anyone interested, there's a really great discussion on this subject at the Ethereum reddit: https://www.reddit.com/r/ethereum/comments/4oiqj7/critical_u...

louprado 2 days ago 1 reply      
"Simple Contracts are Better Contracts" has always been the mantra of the Ethereum and DAO team. Most times when a security question was raised, "simple contracts" was their defacto answer [1].

This exploit suggests that the most competent developers in this space, who always preached simple contracts, are not yet able to consistently write secure contracts.

Also, the OP states the importance of being able to update a contract. As of last year that meant the original contract MUST include a self-modifying code provision. Self-modifying code doesn't align well with keeping your code simple.

As an aside, "contracts" are Ethereum's raison d'être and the Ether currency value is largely based on adoption. Even though this exploit did not expose a flaw in the Ethereum block chain, the Ether sell-off is an expected consequence.

Lastly, does anyone have a link to the original contract code, and how could it be rewritten so that it isn't vulnerable to this exploit?

[1] https://www.youtube.com/watch?v=cahj4WJtp20 (the Q&A at 42m44s is relevant).

Edit: corrected time stamp for above video
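On that last question: the flaw was widely reported as a reentrancy bug, where the contract paid out before updating the caller's balance. Here is a toy Python model of the pattern and of the usual fix (update state before the external call). This is an illustrative sketch, not Solidity, and all names are invented:

```python
class Ledger:
    def __init__(self, deposits):
        self.balances = dict(deposits)
        self.paid_out = 0  # total value that has left the "contract"

    def withdraw_unsafe(self, who, receive_hook):
        amount = self.balances.get(who, 0)
        if amount > 0:
            self.paid_out += amount   # interaction first: "send" the money...
            receive_hook(self, who)   # ...payee code runs, balance still intact
            self.balances[who] = 0    # ...and the effect lands too late

    def withdraw_safe(self, who, receive_hook):
        amount = self.balances.get(who, 0)
        if amount > 0:
            self.balances[who] = 0    # effects first: zero the balance
            self.paid_out += amount
            receive_hook(self, who)   # interaction last: re-entry sees 0

def make_greedy_hook(method_name, tries=2):
    """A payee whose 'receive' handler re-enters the withdraw method."""
    count = {"n": 0}
    def hook(ledger, who):
        if count["n"] < tries:
            count["n"] += 1
            getattr(ledger, method_name)(who, hook)
    return hook

drained = Ledger({"attacker": 10})
drained.withdraw_unsafe("attacker", make_greedy_hook("withdraw_unsafe"))
# drained.paid_out == 30: one deposit of 10 was withdrawn three times

fixed = Ledger({"attacker": 10})
fixed.withdraw_safe("attacker", make_greedy_hook("withdraw_safe"))
# fixed.paid_out == 10: the re-entrant calls see a zero balance
```

The reordering is the whole fix: once the balance is zeroed before control is handed to the payee, re-entering the function finds nothing left to withdraw.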

alttab 2 days ago 3 replies      
Simple contracts then are only as scalable, reliable, and secure as the code that runs off the blockchain.

Arguably, that defeats the whole purpose because it is then who controls the code (since it is no longer decentralized), controls the contract.

If I'm reading this right (I'm not 100% sure of that), this is the equivalent almost of not running a blockchain at all (if the idea is taken to its finality).

Storing the who and the what of contracts has never really been the issue; it's been the execution and the honoring of the contract that man has not yet solved.

But centralizing the code that runs the contracts, and taking it off the blockchain doesn't sound like the way to do it.

Aelinsaar 2 days ago 5 replies      
Just from reading commentators here at HN, it doesn't seem like these issues were unforeseen. Rather, it seemed that whatever intense optimism exists around cryptocurrencies is capable of overwhelming investor sense in return for the promise of some ideological "win".
nzoschke 2 days ago 0 replies      
As a software engineer everything here rings true to me.

Use tools as simple as possible when programming, and offer many ways for mere humans to change code, review it for correctness, and review, approve, and roll back critical transactions.

But this sounds effectively like the status quo with credit cards and Kickstarter.

So I'm not sure what a blockchain adds other than a different platform and maintainers than the existing financial and group purchasing corporations.

brbsix 2 days ago 1 reply      
The AI equivalent of a recursive call bug is self-replicating Von Neumann probe paperclip maximizers that consume the entire universe. We're going to be in a world of hurt if we aren't able to sort this out. It's pretty essential that machines are able to discern our intent, or the spirit of our contracts, one way or another.
cyrillic 2 days ago 1 reply      
If the contract code can be upgraded by the majority of involved parties, it would be simple to buy 51% of the voting power and change the code to pay out everything else. Each takeover would double your wallet. Am I missing something here?
simpleblend 2 days ago 0 replies      
I ended up writing an article explaining my position on the whole thing: https://blog.simpleblend.net/dao-attack-whos-blame/
jcoffland 2 days ago 0 replies      
If simple contracts, where much of the code is off-chain, are to be the way of things, it raises the question of why we even need Turing-complete contracts in the first place. A blockchain could be created with a few fixed rules that enable the basics of a DAO with much lower risk.
modarts 2 days ago 0 replies      
Thought this was a commentary on the poor API contracts exposed by data access objects
jawatson 2 days ago 0 replies      
I may be stepping outside of my area of expertise, but doesn't this seem like a perfect place to apply formal verification tools? As long as the contract isn't too long, it should be possible to ensure that the implementation exactly matches the specification.
draw_down 2 days ago 0 replies      
Code is law! Ohh, ummm, err, uhhh, except when we say it's not.
ybroze 2 days ago 0 replies      
I really wanted to know about the Data Access Object.
How Hired Hackers Got Complete Control of Palantir buzzfeed.com
215 points by minimaxir  2 days ago   78 comments top 16
nikcub 2 days ago 5 replies      
Shaming companies for carrying out pentests is counter-productive; I'm more interested in who is leaking against Palantir and why.

This is now the third story William Alden has written about Palantir that appears to be based on internal documents[0].

His profile of the company a month ago opened with:

> A trove of internal documents and insider interviews has pulled back the curtain on one of Silicon Valley's most secretive and highly valued companies, Palantir Technologies.

There isn't much public interest in large parts of the profile or the follow-up stories, so it has the feel of a disgruntled employee. That's a really difficult class of threat to defend against and stop, but each additional story and leak provides a few more bits of data that can narrow down the suspect pool.

I really hope the leaker and journalist in this case know what they're getting themselves into - because based on the pentest report the infosec team at Palantir appear capable of tracking the leak down.

[0] https://www.buzzfeed.com/williamalden?language=en

[1] https://www.buzzfeed.com/williamalden/inside-palantir-silico...

hendzen 2 days ago 4 replies      
I don't think Palantir should be shamed for this. It's laudable that they invested in penetration testing - better they find out this way than by an actual APT/hacking group.
tptacek 2 days ago 4 replies      
I don't understand why we're reading this. Was it leaked? Pretty much every big tech firm does this twice a year, but nobody releases the reports.
arca_vorago 2 days ago 1 reply      
Honestly, it's as if people have forgotten one of the fundamental principles of strategy and tactics: the attacker almost always has the advantage. It's true in The Book of Five Rings, it's true in The Art of War; it's just a basic principle.

Now, that's not to say an advanced attacker can't still be defended against. But as a sysadmin who has seen the inside of companies from law firms to publicly traded big guys to the IT firms themselves, I can tell you that almost no management has put forward the personnel, the budgets, or the culture needed to really secure things.

Hell, a family member of mine recently got a tour of SpaceX and was appalled at the security. If Musk and his money don't do it right or well, almost nobody does.

I've basically told people who run Windows systems for business that they're most likely already compromised, and that the best thing to do is run a HIDS and good log analysis so you catch an intrusion quickly when it happens... but you probably aren't going to stop any kind of semi-advanced attacker.

So to be frank, it completely makes sense to me that a company like Palantir would be massively vulnerable from the inside. The edge of the sword they live on cuts both ways. These days, it's about response time and forensic after-action.

"When Palantir's information security employees finally discovered the intruders, they provided a rapid network response in which they identified and mitigated the majority of the red team's actions within days," the report says. Compared with other large companies, this defensive response was unusually robust, the industry source said, based on a reading of the report.

ScottBurson 2 days ago 2 replies      
Wow, this is sobering.

I've certainly heard it said that if you're a big tech company, you are already infiltrated by state-sponsored hackers. But since I've never seen one of these red-team reports, this synopsis provides a lot of color on how people can get around inside the network, once they get in, that I wasn't aware of (obviously I don't work in opsec).

Too bad for Palantir that this got leaked, but perhaps it can be instructive for many of us.

leroy_masochist 2 days ago 2 replies      
How does Palantir's response stack up?

Having read the article, I'm not sure whether to read it as "the extent to which the red team was able to exploit the network is a sign that Palantir's network security is bad" or as "despite Palantir following best practices and having a lot of smart people on its security team, the red team was still able to do a lot, which bodes really badly for other companies that don't have the same internal resources."

swingbridge 1 day ago 0 replies      
Every major company performs these sorts of "red team" tests, and virtually no major company passes with flying colors... so it's not surprising that Palantir has its fair share of issues. What is surprising is that all this stuff leaks out about Palantir while other companies manage to keep things under tighter wraps.
utefan001 2 days ago 0 replies      
This recent post from Microsoft describes how to mitigate some of the risk that got Palantir.


"I can't stress how important this change is: an administrator who connects using normal RDP exposes his or her credentials to the remote system with every connection. RDPRA, on the other hand, ensures that credentials aren't exposed to the attacker on the remote computer being managed."

SCHiM 2 days ago 0 replies      
> A palantír is a dangerous tool, Saruman. They are not all accounted for, the lost Seeing Stones. We do not know who else may be watching!

It's weird that this story is considered newsworthy. I skimmed over the article and it looks about normal for the industry. As sad as it sounds, they aren't doing much worse than many other companies out there.

kriro 2 days ago 1 reply      
What are the standard measures taken to protect against spear phishing? Mostly educating users and trying to filter out the mails?

Palantir basically started the red team in a position where they had already successfully spear-phished someone, and that seems to be common practice. Is trying to protect against it just a waste of time? Should the resources instead be invested in proper segmentation to protect against the successful spear-phish case? Or are people usually 80/20ing this, taking anti-spear-phishing measures but only to an extent that covers a lot of ground at relatively low cost/time?

jacquesm 2 days ago 0 replies      
So, who will take the bet that it was only the red team doing the pentest that managed to get this level of access?

The one saving grace here is that the red team had to be 'let in', in other words, they started from a position that is substantially different from being a complete outsider.

It also makes you wonder what else was leaked besides the report.

bobedybobbob 2 days ago 0 replies      
With time, creativity and motivation a good offensive security team will always win. All we can do as defenders is to find ways to raise the cost of such an attack.

A bit disappointed they seem to have started on the internal network rather then coming in from the outside :)

2 days ago   2 replies      
Ha! ... quick turnaround by Fortune, http://fortune.com/2016/06/18/palantir-hack-buzzfeed/

Bizarre to watch a company move like that ... a bit unnerving in an Ender's Game sort of way. Information manipulation, and suddenly you wonder who's wagging the dog.

nerdponx 2 days ago 0 replies      
What a ridiculous clickbait title
Bromskloss 2 days ago 0 replies      
What does Palantir do, really?
graycat 2 days ago 0 replies      
Gee, we have to wonder if some good anomaly detection would have detected that intrusion?

American society increasingly mistakes intelligence for human worth theatlantic.com
222 points by guscost  3 days ago   277 comments top 50
guscost 3 days ago 5 replies      
I'm a huge fan of Mike Rowe. He's done some amazing journalism with "Dirty Jobs" and his mikeroweWORKS foundation is terrific. He claims that unconditional encouragement toward intellectual dreams does not help most people, and argues that more kids should be pursuing professional trades.

This overlaps with one of the better suggestions from this article. In general I think these are hit or miss, and maybe too academic, but the problem is a real one and it's not going to go away tomorrow. People have this notion that robots will just replace all manufacturing jobs next year or whatever, but as usual it's a lot more complex and subtle than that. And it's a cultural problem as much as an economic one.

Of course I'm not able to say this from firsthand experience, having had little difficulty in school, and having accumulated enough skill with computers to find plenty of good work. And I can't really call for some given change before understanding more about the people in worse situations. I did go to a trade school though, and more of those sound like the best idea so far.

kstenerud 3 days ago 7 replies      
This is pretty silly. In any society, we must decide how to distribute the resources generated by that society. So you make the rules, set up enforcement, and plod along, making changes as you go, as far as you dare in the face of the inevitable political machinations of everyone involved.

The more intelligent and clever always benefit, because they know how to use the system to their own purpose. This has been so since the beginning. The less intelligent either end up in a position where their labor can be exploited, or they're left to slow decline or even death, depending on how much society values keeping them alive.

That's any given society, in a nutshell. They may dress it up in value statements, laws, philosophies, religions, and so on, but in the end it comes down to power, and who has enough of it to extract the resources and status he wants.

The difference is that now the "menial" jobs are almost gone, and the higher up jobs are on the chopping block. And so the first to die off will be those without power (as usual). We can come up with all sorts of platitudes about the value of human life, but in the end it always comes down to this. Welcome to the true nature of man.

Of course, nowadays we have middle class folks who can experience the philosophical angst of the less fortunate, and come up with all sorts of rules, decide which words are acceptable or not, and police outward appearances and speech.

You wouldn't think twice in deciding what to call someone who implemented a password policy with a max of 8 characters, or stored the passwords unsalted or even plaintext behind a system that hasn't had a security update in 10 years. But soon you will, because "stupid" is not an acceptable word anymore, nor is any word that compares intelligence or ability. We're all equal, after all, and that person deserves his job and the income it generates by virtue of his being human and equal to every other human. You wouldn't want him to starve now, would you?

manachar 3 days ago 3 replies      
Alternatively, American society values a human's worth by economic output and the economy has shifted to value humans who can successfully navigate the new more intelligence-based economy.
eggy 3 days ago 2 replies      
I grew up in a poor, working-class, crime-ridden neighborhood of Brooklyn in the 1970s, and I remember that being smart as a kid was a reason to harass, and sometimes beat on, somebody who was seen as too smart in class. This wasn't happening in the wealthier, more educated neighborhoods in NYC (I had friends there afterwards). Although in both kinds of neighborhood, all parents wanted their kids to excel.

I think there was also some anger and negative fallout against college, since a lot of working class folks went to Vietnam, and saw college-educated protesters as disrespectful, and actually not so smart.

WalterBright 3 days ago 2 replies      
Anti-intellectualism runs deep in American society. Consider the words nerd, wonk, geek, brainiac, college boy, the negative Hollywood stereotypes, etc. The Big Bang Theory is the latest incarnation of that.

My father learned to keep his book collection out of the living room.

zizzles 3 days ago 3 replies      
Are you guys living in the same world that I am?

Nerdy geek-type intellectuals are at the BOTTOM of the totem pole in terms of respect and human worth in the 21st century. They are ostracized and bullied in school, most are awkward virgins (intellect is the antithesis of being a "womanizer" or "stud"), and it's only getting worse as society progresses.

tbabb 3 days ago 0 replies      
Intelligence is a perfectly fair criterion by which to judge a candidate, as in most jobs it will have a direct effect on the candidate's ability to perform their duties.

If an employer over-weights intelligence to the exclusion of other criteria, like work ethic or social aptitude, then she may wind up with a candidate deficient in some other important area to her own detriment.

It is important that everyone, regardless of their abilities or what they can offer for society, have access to certain basic standards of living. But it's absurd to suggest, as the author does, that we should hobble economic output just to favor creating meaningless busy work over automation. If that's important to us, we should just offer a basic standard of living directly.

If intelligence isn't a valid/fair selection criterion for job candidates, then there is no such thing.

WalterSear 3 days ago 1 reply      
No, it confuses material success with human worth.

And this is not a new human behaviour, it's just exacerbated by increasing inequality, and made more obvious when the rules of success change, favouring different social groups.

American society's infatuation with gung-ho capitalism certainly plays a part (thanks, Edward Bernays), but I'm convinced it's a hard-wired behaviour, like our preference for attractive people. Associating ourselves with both the attractive and the powerful provides an evolutionary advantage.

CodexArcanum 2 days ago 1 reply      
There are two rather egregious flaws in this line of thinking from the very start.

The first is that "intelligence" is purely an inborn trait, that "being stupid" is a condition on par with serious mental disability. Stupidity isn't mocked by the smart to belittle the dumb. It's mocked by the majority of society to promote feelings of superiority in the mocker, yes, but also to share lessons about what not to do.

Anyone (barring mental disability, and even then) can learn and can be taught. I've certainly encountered my share of the intelligentsia too brilliant to lower themselves to the common level, but I've encountered far more gifted people who wanted deeply to share their knowledge with those around them. Yes, our society fetishises intelligence, but the vast teeming masses of "dumb people" are a lot smarter than most would believe.

The second fallacy here is that financial earnings, or being able to survive at all, are somehow connected to "human worth," which is connected to intelligence. As we automate more and more fields of labor, we are moving towards a post-job society. The issue isn't that Americans overvalue intelligence; it's that we overvalue the concept of "hard work," especially as enshrined in the old parental advice, "you have to work hard to get ahead in the world."

Well, no, you don't. Working smart helps a lot, but soon you won't have to work at all. Or you won't be able to. Forget our negative perception of "the dumb." We need to fix our negative perception of "the lazy." We must learn how to value people for something beyond their contributions to labor, and maybe we must also teach people how to have and feel valuable beyond labor.

And we should do it fast, because after we automate away the drivers, the merchants, and the laborers, it won't be long before we've automated away the lawyers, artists, programmers, and doctors. And then there will be nothing left but politicians, and is that the world you want to live in?

osazuwa 2 days ago 0 replies      
I don't buy this.

This is not a problem. Anti-intellectualism is the problem. Our culture encourages people not to be smart. In a sentiment captured by Marco Rubio when he said it is better to be a welder than a philosopher, we teach young people that non-intellectual pursuits are more noble (so long as they make money). We teach men that it is more manly to work with your hands. We teach women that service-sector jobs, or jobs that require more social acumen, are more womanly. We teach people the false dichotomy of left-right brain dominance, making it easy to believe that one is simply unequipped for logic and quantitative mental work. We teach black youth that their role models should be athletes and pop stars, not people successful in STEM fields (personal experience). Our popular culture characterizes smart people as weird, eccentric, and disconnected from the rest of humanity -- those tropes are obvious in the article's examples of "Big Bang" and Sherlock Holmes.

When my wife learned coding, all her old college girlfriends were amazed that she turned out to be so "smart". She was shocked, as she never saw herself as particularly smart (though she is), successfully learning to code is just a function of putting in the practice. It became apparent that her friends were socialized to believe that this coding was the exclusive domain of this mythical class of brainy people.

Sure, some people are naturally smarter, and thus it is relatively easier for them to do mentally laborious work. However, if the economy increasingly favors people who can carry a heavier mental workload, then the first step to preparing citizens for participating in that economy is overcoming our culture's resistant strain of anti-intellectualism.

robg 3 days ago 1 reply      
And gets it very, very wrong. "Intelligence" moves around with nutrition, stress, sleep, pollution. The brain is organic. We miss out on human potential in all the neighborhoods around the world that aren't safe and supported. The good news is we are headed in the right direction as a planet, but surprisingly this article doesn't mention it [1].


Aloha 3 days ago 1 reply      
I learned when I was a child that it was dangerous to be the smartest person in the room. It's much safer to appear to be the village idiot, or at least less bright than the brightest.

I do disagree fundamentally with the concept of the evils of merit. Merit is so much more than smarts and formal education; it's a mix of capacity for the role, agreeability and eagerness, productivity, effort expended, grooming and appearance, and a dozen other factors, most of which are intangible. You don't have to be information-work levels of smart to achieve these qualities either.

The problem is a belief that certain kinds of education make one more suitable for a role. I'd take someone with 4 more years of real world experience over that college degree any day.

throw2016 3 days ago 1 reply      
American society worships financial success and the mythology of supermen and superwomen to the exclusion of nearly everything else, and this extracts a huge price in the individual and social sense of well-being.

Since capitalism will rarely allow more than 20-30% 'success' that leaves a huge percentage of your population feeling inadequate.

Sometimes it feels the focus on individual brilliance without context of social value impoverishes rather than nourishes.

It can induce an adversarial state of hyper-competition, in-groups and out-groups, and a heightened frenzy to belong that leaves little room for niceties like empathy and community.

The folklore of America's success reinforces the mythology of individual brilliance, but rational heads need to ask, without taking away from individual brilliance past and present or the necessity of it in the future, how much of that success came from the benefits of colonizing an entire continent.

The downside to this single-minded focus is the simple fact that individuals are disempowered. They cannot effect any change. It takes real community and society to effect even the smallest change, and that requires a functioning social setup with community, cooperation, empathy, and mutual respect. Any communal action, for instance protests or activism, will require large numbers of Americans to cooperate; at that point Americans will discover their social structure does not allow for it and will not deliver. There is very little, if any, sense of community. It's been engineered out. It's unlikely individuals who have been devalued and left feeling worthless will make common cause in a hyper-adversarial social structure.

sixtypoundhound 3 days ago 0 replies      
Part of the problem is inserting the same standard across multiple roles; for example, mandating all managers have a college education.

You don't need a college degree for most sales, customer service, and operations roles. Some of the best people managers and sales people I've met have sketchy degrees (or none at all).

On the other hand, putting someone in charge of a process which is highly reliant on statistics or technology, who doesn't have a college degree (or real-world experience which proves ample ability), can be a total disaster. I've seen what happens when you take a non-statistical director and put them over a data science process... they hide in familiar details (project plan! communications!) and can't understand the harder decisions involved in the role, like whether the magical black-box system even works.

sandworm101 3 days ago 2 replies      
>> The 2010s, in contrast, are a terrible time to not be brainy.

That sums it up. Intelligence doesn't matter; rather, the appearance of intelligence gets one ahead. Appearing "brainy" counts more than actually being brainy. The Big Bang Theory is part of that trend. Intelligent people are expected to project a particular image. IRL that never happens. Many very smart people couldn't care less for Trek, comics, or any of the other stereotypes. And conversely, there are many absolutely not-smart people who do very well by simply dressing the part. I say bring on the standardized tests. Let us separate the smart from the fashionable.

kartan 3 days ago 0 replies      
The premise is erroneous and it creates a difficult debate.

> American society increasingly mistakes intelligence for human worth

I think that we are just in the same place that we have always been.

There were intelligence tests in the 1900s whose results people still commonly mistake for innate intelligence. "Yerkes intelligence exams (alpha, beta and individual) were culturally biased, taken under markedly different conditions and tended to reflect years in the U.S. and familiarity with dominant culture, rather than innate intelligence" http://www.understandingrace.org/history/science/race_intel....

Even innate intelligence is more complex than a single number, as there are different components to account for.

So where the article reads "American society increasingly mistakes intelligence for human worth," it should read "Americans are increasingly immature when judging others."

I blame this phenomenon on the lack of education quality. Memorization and mental calculation are far from what a good education should give you. Critical thinking, the capacity for self-learning, and an appreciation of universal human values should be the main parts of that education.

And though it says "America," I don't think this is restricted to any one country. Education could be a lot better everywhere. But shrinking education budgets all around the world, because "crisis," are jeopardizing our future.

BuckRogers 3 days ago 1 reply      
That's a huge improvement over where we were a few years ago, when Steve Jobs died and I noted all the human misery and pain he caused with his employment of Foxconn sweatshops.

My coworker said Jobs' life was worth more than those factory workers'. I reminded him that everyone has equal worth, and he seemed stunned at my forceful response. Note that your life has to feel pretty hopeless to jump out the window of the factory you work and live in.

I guess being a well-to-do white boy in Chicago afforded him quite the luxury of deciding the world pecking order. Except I was also a white boy in Chicago calling him out on it. Considering we had multiple 2nd-generation immigrant coworkers around us, who should in theory empathize with 3rd-world work conditions, I was the only one who spoke up.

Probably because the Steve Jobs > all guy was a fairly influential architect. But almost nothing stops me from piping up anyway. Still, judging human worth over intelligence is definitely a bit better than judging based on entrepreneurial capacity as my colleague was.

Maybe in a couple more years we'll all agree that humanity has equal worth and dignity. Then all life, and we'll be set.

rrecuero 3 days ago 0 replies      
As I see it, the world, and America in particular, is worshipping the culture of the intellect to the extreme, forgetting other innate and arguably more important human qualities.

As philosophers like Jacob Needleman have said before, there is a line of knowledge and a line of being. If one progresses too far ahead of the other, it becomes useless and even harmful. We see examples every day of people who are really smart but can't treat others like humans.

The canonical example is a great coder that is a total jackass. There is a dire need for empathy, now more than ever...

jackgavigan 3 days ago 3 replies      
A meritocracy can reward qualities other than intelligence: integrity, talent, dedication, diligence, courage (both physical and moral), kindness, generosity, willingness to help others...
selectron 3 days ago 1 reply      
I don't have any problems with a society that is a meritocracy. This article is nonsense. If anyone wonders why so many people don't like political correctness just point them to this article.
emptybits 3 days ago 2 replies      
I'm having difficulty reconciling this claim with how I see American society judging the worth of political candidates. (Also in the journalistic outlets American society values the most.)
Hondor 3 days ago 4 replies      
It makes a valuable point that our popular culture that we think makes us "civilized" is really just a sham and we're no different from olden-days racists and homophobes but with a few extra rules applied. We're very careful not to abuse people because of their race, religion or certain types of sexuality, but for all the non-taboo classes of people, the gloves are off and we're no different from slave owners who justified mistreating blacks because they were sub-human. We can't call people niggers but we can call them retards. We can't lock up homosexuals but we can lock up pedophiles and zoophiles. We can't deny jobs or voting rights to women but we can deny jobs and voting rights to foreigners.

We really aren't any more moral than we were 100's of years ago, we just fool ourselves into thinking we are.

gravypod 2 days ago 0 replies      
I'd just like to chime in and say that the word meritocracy was not invented in the '50s.

You can clearly see that here: https://books.google.com/ngrams/graph?content=meritocracy&ye...

frozenport 3 days ago 0 replies      
>> hiring decisions were based on a candidate having a critical skill or two and on soft factors such as eagerness, appearance, family background, and physical characteristics.

Because this would be a much better world?

teraformer 3 days ago 0 replies      
In America, lots of excessively toxic, obnoxious bastards get much farther along in life than so many others by trading on unsophisticated deception, intimidation and betrayal, and not much else.

After a while, you get tired of being smart and getting nowhere for all the effort you expend.

Then, knowing what you know, and having noticed what's surrounded you for so long, well... intellectuals can decide for themselves a logical corollary to this.

powera 3 days ago 1 reply      
Was this article ghost written by Diana Moon Glampers?

I disagree with basically every sentence in the article and don't even know where to begin.

stronglikedan 3 days ago 0 replies      
I wonder how the stats and facts in this article would compare to other developed countries. I know it's not a comparison article, but more of a call to action for America, written by Americans, but I would still be interested in the comparison. Especially since they mention a couple of worldwide examples in the Darwin Awards and Reddit.
jondubois 3 days ago 0 replies      
A terrible thing about society today is our diminishing ability for self-reflection (which is a form of emotional intelligence). Most of the stuff we do as individuals is driven by marketing and social pressure - So much so that many of us have become oblivious to our true internal needs. Maybe that's why so many people go to psychologists nowadays.

In the past, people put more value on self-reflection, philosophical thinking and other types of emotional intelligence (e.g. wisdom) over raw I.Q.

mcguire 3 days ago 0 replies      
I am not sure that the author isn't confusing America's wealth disparity with differential treatment of intelligence (which still seems to remain undefined).

Sure, those with high SAT scores have greater access to the means for economic success, but I don't think that is because of any intrinsic difference in high-SATers and the run of the mill dummy. Rather, I'd suggest it is due to the winner-take-all nature of the current American society.

vasilipupkin 3 days ago 0 replies      
Incentivizing companies to resist automation seems like a terrible idea that would make everyone poorer, including the "less brainy"
djokkataja 3 days ago 0 replies      
Sortition[1] (randomly selecting from the population for government positions) combined with direct democracy where possible would fundamentally address problems of inequality regardless of their source. Meritocracy, plutocracy, nepotism--you'll end up with a small group of people in power in any system unless the system is carefully designed to prevent it.

Put another way, the article mentions several possible measures to try to provide for better outcomes for everyone who isn't "smart"; these are bandaids for cancer. These are top-down methods for "the elite" in various forms to provide for "everyone else" so that the have-nots have at least enough to avoid offending the sensibilities of the powerful. If fairness is a genuine concern and central goal for a society, then positions of power should be filled by a statistically representative sample of the population. Until then, inequality will be the status quo.

As I see it, the primary benefit in this specific situation (technological unemployment or underemployment for the "not smart enough") would be faster reaction time. You might see similar solutions to those proposed in this article, but you wouldn't have to wait for people like Mike Rowe to write about problems of inequality (and you wouldn't have to wait even longer for people in positions of power to decide to do something). Random selection would produce many people who deeply care about the "not smart" because that describes about 50% of the people selected and many people they personally know.

[1] https://en.wikipedia.org/wiki/Sortition

jacquesm 3 days ago 1 reply      
I'd substitute 'wealth' for 'intelligence', and it goes much further than just American society.
osazuwa 2 days ago 0 replies      
Reading illiteracy is no longer a big problem. STEM-illiteracy is the new problem.

It is easier for some to learn how to read than it is for others. Yet as a society we don't give people who find difficulty in learning to read a pass on reading. Nor should we for STEM fields.

gwern 3 days ago 3 replies      
I disagree. American society has always had something of a love-hate relationship with intelligence, and there have always been examples like _Big Bang Theory_ venerating intelligence ('Sputnik moment' and astronauts, anyone?). Intelligence correlates with success cross-culturally and over time, so that's not in question.

What is in question is the 'increasingly'; the only evidence offered, aside from anecdote, is that the returns from education have increased. This is inadequate. The research literature I'm aware of (http://www.gwern.net/Embryo%20selection#iqincome-bibliograph...) shows that intelligence has consistently been correlated with SES/income/occupation and it's not that clear it has been increasing at all. Particularly relevant is Strenze 2007, "Intelligence and socioeconomic success: A meta-analytic review of longitudinal research" http://emilkirkegaard.dk/en/wp-content/uploads/Intelligence-... who says:

> Several studies have investigated changes in the association between intelligence and success during past decades. Although Herrnstein and Murray concluded that the main point seems beyond dispute (1994: 52) and some studies have found support for this point (Murnane, Willett, & Levy, 1995), there are still serious reasons to doubt that the importance of intelligence is or has been growing. Neither the meta-analysis by Bowles et al. (2001) nor the review by Jencks et al. (1979) found any clear trend in the correlations between intelligence and success. The same conclusion was reached by Flynn (2004) and Hauser and Huang (1997). Breen and Goldthorpe (2001) found that the association between intelligence and occupational status in England is, if anything, declining

Strenze's own meta-analytic results using ~44 studies are on pg13 Table 2, where the correlation of intelligence with education/occupation/income over the 20th century show no clear pattern of increase or decrease:

> The influence of the third moderator variable, year of success (i.e., year of the measurement of success), is analyzed in the third section of Table 2. Year of success ranges from 1929 to 2003 in the present meta-analysis. Judging by the sample size weighted corrected correlations (ρ), there appears to be no historical trend for any one of the moderator variables: correlations with education and occupation remain more or less stable throughout the period under study; correlations with income fluctuate more but without any obvious direction. Quite surprisingly, if unweighted and uncorrected correlations (r) are observed instead, then the correlations with education and occupation exhibit a declining trend.

So, if American society 'increasingly mistakes intelligence for human worth' and 'as recently as the 1950s, possessing only middling intelligence was not likely to severely limit your life's trajectory', why are the more intelligent, if anything, less likely to be successful when we go from 1920 to 2000?

anortons 3 days ago 1 reply      
"The qualifications for a good job, whether on an assembly line or behind a desk, mostly revolved around integrity, work ethic, and a knack for getting along"*

* also being male, White, straight, Christian

...As though the 50's were the bastion of fair hiring practices.

aplummer 3 days ago 0 replies      
"based on a candidate having a critical skill or two and on soft factors such as eagerness, appearance, family background, and physical characteristics..."

I don't know if the author is trying to make this sound _better_ than today, considering "physical characteristics" often meant "white and male".

arisAlexis 3 days ago 0 replies      
In the European Commission, aka the public sector, the only test you take to go into any position, even a clerical one, is an IQ test. I can't comprehend this; it should be the other way around, since maximizing profit is not so important but serving the public is.
maker1138 3 days ago 1 reply      
After all of human history, where jocks ruled the earth, they're now upset that nerds are becoming important?
DominikD 3 days ago 0 replies      
Society historically confusing religious convictions with human worth is somehow becoming worse by confusing intelligence with human worth. Fascinating case of rationalization.
ken47 3 days ago 0 replies      
For those who count themselves among the intelligent, we should not doubt that in a generation or few, our skills may also be automated away.
leecarraher 2 days ago 0 replies      
Worthington's law: more money = better than. https://www.youtube.com/watch?v=ke9iShKzZmM
exratione 3 days ago 1 reply      
In a world that is increasingly data, the ability to process and react to that data becomes worth.

Marking worth is just more data processing, a declaration of subjective value in the scheme of your choice. Nothing special.

The nature of your interactions with those of lesser worth, and your views of them, however...well, there is a defining thing for a human being.

ss108 3 days ago 1 reply      
This hardly seems to be the case if you follow politics and talk to people about it.
threepipeproblm 3 days ago 0 replies      
Don't be stupid vs. don't be stupidist?
cmurf 3 days ago 0 replies      
If only this were true in elections.
PeterStuer 3 days ago 3 replies      
As a non-American, and so 'judging' America only through the media, it does feel like the complete opposite. Deep ingrained anti-intellectualism, idolization of stupidity and love of the brazen and the rich. Scary decline of the sciences and rapid expansion of superstitions. Of course this does not apply to this here readership, but at least from an outside perspective we see 'idiocracy' surging onto the stage at breakneck speed.
known 3 days ago 0 replies      
marks != merit

voting != democracy

darpa_escapee 3 days ago 1 reply      
Behold, the most Hacker News post on Hacker News.
3 days ago 3 replies      
This is the worst kind of off-topic tangent that we see on HN: trotting out ideological talking points, then slathering on flamewar fodder. It's so ridiculously out of place here that we're banning this account.

I think we banned your previous account for doing this as well. If you don't want to be banned on HN, it needs to stop. If you want to commit to keeping this sort of comment off HN, you can email hn@ycombinator.com and we'll look into unbanning your original account.

milesward 3 days ago 2 replies      
Uhm, STFU. Geniuses = good. Sexism = bad.
CIA Director John Brennan Pretends Foreign Cryptography Doesn't Exist schneier.com
193 points by CapitalistCartr  5 hours ago   58 comments top 7
chopin 5 hours ago 4 replies      
As commented somewhere in the Schneier thread:

Presumably Brennan refers to the big players: Apple, Google, Microsoft, Facebook, Intel (ME) et al. These companies deliver crypto and/or hardware for the masses and could perhaps be subverted, with small short-term effect.

Of course, the terrorists will switch. And Mr. Brennan knows that. This is the revealing part: It's about mass surveillance of people who are not terrorists by any means.

yarper 4 hours ago 3 replies      
Interestingly, it seems that GPG is originally from Germany [0] (a headline act from the first crypto war; I expect there are many other examples!)

[0] https://en.wikipedia.org/wiki/GNU_Privacy_Guard

vaadu 1 hour ago 1 reply      
This is SOP for this administration. If you deny a problem exists it does not exist. The same state of denial exists when they refuse to use the words Islamic terrorists.
Rathor1 4 hours ago 1 reply      
He is not dumb, he is deliberately lying.
Bromskloss 4 hours ago 2 replies      
> US companies dominate the international market as far as encryption technologies

When do people turn to "companies" for encryption instead of using publicly available libraries or applications?

golergka 3 hours ago 1 reply      
Well, if he's talking about consumer products, he's absolutely right; Telegram is about the only widely popular non-US consumer product offering encrypted communication that I know of.
nxzero 2 hours ago 1 reply      
Really depends on the requirements for encryption. Unbreakable encryption is easy to do with basic math.
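(Editorial aside: the "basic math" here is presumably the one-time pad, the classic information-theoretically unbreakable scheme. A minimal sketch, not from the comment; the function names are invented, and the scheme is only unbreakable if the key is truly random, as long as the message, kept secret, and never reused.)

```python
import secrets

def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """XOR each plaintext byte with the corresponding key byte."""
    assert len(key) >= len(plaintext), "key must be at least message length"
    return bytes(p ^ k for p, k in zip(plaintext, key))

# XOR is its own inverse, so decryption is the same operation.
otp_decrypt = otp_encrypt

msg = b"attack at dawn"
key = secrets.token_bytes(len(msg))  # fresh random key, used once
ct = otp_encrypt(msg, key)
assert otp_decrypt(ct, key) == msg
```

The catch, of course, is key distribution: you need a secure channel to share a key as long as the message, which is why practical systems use computational ciphers instead.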
Member of The European Union gowers.wordpress.com
326 points by matthewrudy  1 day ago   195 comments top 23
RcouF1uZ4gsC 23 hours ago 11 replies      
By this logic, wouldn't the most logical thing be for the EU members to all petition to join the United States of America as states (USAE - United States of America and Europe)?

After all, would this group not have much more influence than even the EU? In addition, there is a strong tradition of subsidiarity in the United States (see states rights).

I think most Europeans would be horrified at the idea of joining the United States. Why? Because there is not enough shared identity.

The United States works because the shared American identity is very strong. Even if I live in Iowa, my identity is more strongly American than Iowan. This American identity was forged through centuries of migration and assimilation (i.e. the Westward expansion where people would leave their state to go out West but would still be American) and in fighting and dying together alongside fellow Americans in the various wars over the years.

While the elite of Europe may have a stronger European identity than that of their country of citizenship, what you are seeing with the recent polls and movements in Europe is that for many of the common people, their national identity is stronger than their European identity. This is to be expected as for many of these countries, the national identity was forged over hundreds of years, while the modern European identity is only a few decades old.

Thus, for many people, they see the European identity as seeking to assert primacy over their national identity and they are wanting out.

rhaps0dy 23 hours ago 5 replies      
>Many people think that a country is better off if its workers are decently paid, do not work excessively long hours, and work in a safe environment. (If you are sufficiently right wing, then you may disagree, but that just means that you will need other examples to illustrate the abstract principle.)

Would it be possible to resist stabs at the "other side" of politics, if you are trying to explain something? People in the other side will be distracted by this, and think less of the author, or (like me) think it's a misguided or untrue example.

rrggrr 23 hours ago 2 replies      
Gower's thesis ignores the rent-seeking, grabbing hand of government. Yes, many government policies are defensible, if not desirable, when framed for the common good, common ideals, common values, etc.

But we know that many policies only advance the interests of those in government, their friends and those who hold power of politicians. The grabbing hand of government engages in rent-seeking at the local level, federal level, and the common body level; and at each step the citizen in Leeds or London is further removed from the policy-maker.

Against all the arguments to remain in the Euro-zone, there is this argument... Its not enough for capital, people and ideas to be free to travel physical borders. There must be sufficient sovereign diversity that they can travel to countries where there is less institutional resistance to a status quo that keeps them from flourishing.

Even better if they do not have to travel anywhere to flourish because the people chose to exit a restrictive, occasionally repressive federation.

fixermark 23 hours ago 6 replies      
It feels a little like there should be a simple, rational argument from politics and economics for retaining the EU as a single unit.

"The world markets are dominated by a few forces. One is a country with the bulk of the world's population; its population is about sixteen times the population of the most populous European country. Another is a single country with territory approximately fifteen times the square mileage of the largest single European nation. Unifying the voices of all people in Europe under one political entity doesn't zero out the playing field, but it may very well decrease the absurd order-of-magnitude scales of difference between political entities as these two juggernauts stomp inexorably towards their desired world order while the individual nations of Europe debate their own individual selfish interests."

But people aren't often rational about national ties.

seomis 23 hours ago 0 replies      
I thought this was going to be about set theory.
cja 7 hours ago 0 replies      
The EU is protectionist, weakens democracy, removes decision making from the populace, is barely accountable, is mysterious (most citizens don't even know who their MEPs are, let alone how the EU government works) and is a paradise for lobbyists.

It pretends (at least to the anti-federalist British) to be a big family of European nations, protecting human rights and stronger together than apart. This is the fantasy we are sold by our politicians.

In reality the EU is a project to create a single European state. This is not hidden in France, Belgium or Germany, only in the UK.

Also, the EU has created circumstances very friendly to big companies. The remoteness of the lawmaking process from the people makes it easy for lobbyists to have considerable influence over it. And the free movement of labour allows the mass movement of cheap labour from poor countries to rich countries, causing problems in both the source and destination countries.

pithic 22 hours ago 0 replies      
The problem with supra-nations is that, as bigger monopolies on government, they are much harder to vote against with your feet. You know, in case they become captured by special interests and impose burdensome taxes, oppressive laws, and smothering regulations.
UK-AL 23 hours ago 7 replies      
So you want the EU to intentionally cripple the UK's competitiveness in order to work towards your own moral values, which may be right or wrong? In these particular cases my values happen to align with yours, but there are many cases where people's values differ. You are using the supremacy of EU laws to force your values onto them.

If these values are truly universal, there is nothing stopping you from getting some form of treaty between all these countries outside of the EU, leaving other things more open.

It's a very simple individuals/small-groups vs. large-groups issue. There are two ways to resolve it: allow each individual to make their own decision about it, or take a side and force your values onto everyone.

Companies treat workers well because they have to compete to get them, not because the government forces their hand. There's always a way around government regulation, so it's not really effective; there's always a loophole. So why do companies offer workers good conditions? Because competition forces them to. You solve the problem of companies treating workers badly by developing a surging economy, not by laws.

These third-world countries with sweatshops have actually gotten better standards than what came before. As spare capacity decreases, companies are forced to offer better conditions. You can see this now in China, where companies are going elsewhere because workers are demanding a lot more.

The tax argument only works if you think government is more efficient with money in the first place.

bitL 23 hours ago 2 replies      
If we had algorithmic laws that would adapt to the situation at any given time, then having a non-profit organization managing them could be beneficial. But when the bureaucracy in Brussels is strikingly similar to the one in Moscow during its "glory" days, I am not so sure we really need that. What we have now is that all the politicians the member states wanted to get rid of, due to their incompetence and other vices, end up in Brussels. You can gauge the quality of representatives from this.

As for a potential Brexit, what would happen is that GB would enter TTIP earlier than other EU countries and get deeper integration with the US much sooner. So in a way it could give the UK an edge (if TTIP proves in any way beneficial to the UK and not only to the US). The UK's main industry is finance anyway; once TTIP is signed between the EU and USA there won't be much in the way of the UK either.

danmaz74 22 hours ago 0 replies      
> Similarly, a little while ago I heard a fisherman talking about how his livelihood suffered as a result of EU fishing quotas, and how he hoped that Britain would leave the EU and let him fish more. He didn't put it quite that crudely, but that was basically what he was saying. And yet without quotas, the fishing stock would rapidly decline and that very same fisherman's livelihood would vanish completely.

The problem wouldn't just be the fishing stock disappearing. The UK fisherman could start a pricing war with EU fishermen, and that would be followed by an import tariffs war, which would for sure expand to other sectors. The end result? Not good for the EU, but even worse for the UK.

scottmsul 21 hours ago 0 replies      
One problem I have with this argument is that personal freedom isn't taken into account. Every time a decision is moved up one subsidiarity level, members at lower levels sacrifice a little bit of freedom.

For example, suppose the EU recognizes that gas-consuming vehicles contribute to climate change. From a utilitarian standpoint, it makes sense to ban all cars unless the economic advantage from driving outweighs the ecological damage from climate change. So perhaps the EU should ban all cars by default, but allow citizens to fill out a petition every time they need to drive. Except this is ridiculous, because we shouldn't need permission to drive. This is an assumed basic liberty (of course drivers should still pass a driver's test to get a license, but this only needs to be done once).

I think the utilitarian argument can still hold, but only if personal freedom is taken into account. It should be weighed like everything else, and recognized as a tradeoff. The United States allows freedom of speech, in exchange for giving people the right to post hateful comments online. The utilitarian might argue this is a bad thing, since society might be better off if hateful speech was removed from the internet. But in this case, I personally believe the benefits of freedom outweigh the drawbacks of offensive speech.

Every time a higher authority recognizes a prisoner's dilemma and enforces a rule, they are forbidding the players from playing the game themselves. Even if the societal benefits outweigh the drawbacks, there are hidden costs associated with sacrificing freedom.

dominiek 20 hours ago 0 replies      
I am a Dutch person and I've lived abroad for about half of my adult life, in places like Japan and the USA. I identify as both a Dutchman and a European. Just two weeks ago I shared a beer with an unknown Frenchman while watching the Eurocup in an Irish pub, rooting for his country (the Netherlands did not qualify). I am European and I love Europe. The things I miss most about home are my friends in Amsterdam (many of whom have different nationalities) and European food (the seasonal stuff you find in small villages in Southern Europe).

Even though I have these strong feelings towards Europe, I am against the EU as a super-state. I have been a Euro Skeptic ever since I started reading about liberty and became a libertarian. Unfortunately, being a Euro Skeptic and/or believing in smaller governments gets you cast into the Right-wing Nutbag camp in Europe (even by my own parents).

A lot of Remainians have strong personal feelings towards Europe, just like me. They've benefited from some EU initiative at a university, or perhaps they married a fellow European. Also, they enjoy the benefits of the Schengen treaty, being able to travel and settle freely, without spending years of legal fees and stress on immigration like me. All of these things can be accomplished by treaties and standards without a monolithic EU super-nation. Trade and treaties (like Schengen) are great, and they are perfectly achievable without an EU super-state.

There are countless arguments around the democracy and economics of the EU, but at the end of the day the EU is an idealistic construct. It was created with noble intentions, to create peace on a war-torn continent, and accelerated quickly after the fall of the Berlin Wall (to address a strong and unified Germany).

Yet once again, the idealism of Europe (and, in my opinion, the disregard for the fundamentals of liberty) is creating a new cycle of instability on the continent. Whether there's going to be a Brexit or a Bremain, a storm is coming.

mathgenius 22 hours ago 0 replies      
I'd like to see more mathematicians / scientists stick their neck out like this. Too often these people hide behind castle walls, because they have been taught to stay in their own speciality. Yes, it is quite likely that they will make a rookie mistake, but they will also learn at a fantastic rate.
clarkmoody 21 hours ago 0 replies      
> In the abstract, the case for supranational organizations is almost too obvious to be worth making: just as it often benefits individual people to form groups and agree to restrict their behaviour in certain ways, so it can benefit nations to join groups and agree to restrict their behaviour in certain ways.

But nations aren't anything more than collections of people. Nations, like societies and governments, cannot do anything. Rather, the people within are the ones taking action.

Supernational organizations that constrain member states without the consent of the citizens of those states are just another form of tyranny.

You don't need giant bureaucracies to have free trade or free movement of people (or prosperity, for that matter).

mattmanser 22 hours ago 1 reply      
I think the supranational-organisations argument is a bad one to make, given that throughout the world everyone else is devolving and splitting into smaller countries, or presently trying to split off.

Off the top of my head, countries/peoples recently or actively trying to split or setting up a new regional government, several that are actually in the EU:

Czechoslovakia, Yugoslavia, Catalonia, Scotland, Wales, N. Ireland, the Kurds.
I'm sure there are many more as I really don't keep up with African or South American politics.

So why is everywhere else in the world seemingly moving towards devolution, apart from Europe, which is trying to create a super-state out of very, very different peoples?

And I know that they're very different. I'm half-Dutch, half-English, my Dad lives in France, and I've been to Belgium, Spain, Italy and Germany. Each of these peoples has a very different outlook. And the Germans, quite alarmingly as one of the key leaders of this change, still consider themselves more German than global citizens [1] compared to other countries. And don't misinterpret this: it has nothing to do with Nazis, and everything to do with the fact that they're obviously not doing it out of love for the EU.

So why the need for the super-state? Look at what happened to poor old Spain and Greece, forced into what is now a very disparate economic relationship with Germany; the Germans have even effectively rescinded Greece's national economic decision-making.

There don't seem to be any rational arguments coming out of either camp, and of all the arguments, the "supranational organisation" one has got to be the most ridiculous. To make it a quasi-mathematical one is even more absurd: our countries have much better labour laws than America, which is bound by exactly that sort of organisation, and we had the bulk of them before the EU was involved. The social democracies of Europe came well before the EU even existed, before the EC, before the EEC. Before the EU we certainly didn't have zero-hour contracts, so there's clear evidence his quasi-mathematical arguments don't hold water.

And I say all this as someone who is pro-Remain. Posts like this honestly don't help, because they are partisan bullshit dressed up as 'logic', with little in the way of facts.

[1] http://www.bbc.co.uk/news/world-36139904

Qantourisc 22 hours ago 2 replies      
Personally my biggest beef with the EU seems to be the lack of optional participation.

Some rules just work better if applied to the entire EU, but that's not essential for a lot of them. Take the vacuum-cleaner rules, for example: do we seriously need those enforced from the EU?

There are of course rules that would require everyone to participate to work.

Also, IMO a country should be able to select which parts of the EU it wishes to participate in: monetary, military, the Schengen zone, product safety rules (and thus easy import and export), ...

weddpros 21 hours ago 0 replies      
Read David Ellis for sound and clear counter-arguments.


atmosx 21 hours ago 0 replies      
I agree with the ideas this post tries to perpetuate, but it's written as if the author lives in a different reality. Post-2008, the EU has been the mother of all evil for PIIGS and non-PIIGS alike.

> In the abstract, the case for supranational organizations is almost too obvious to be worth making: [...]

Most Euro-sceptics, myself included, do not see the EU as a supranational organisation that acts in my best interest. What I see is a body of non-elected puppets taking important decisions about my life, mishandling crisis after crisis (the debt crisis, Ukraine, Syria, etc.), and the French sitting quietly while Germany abuses everyone.

The EU is deeply, deeply undemocratic. There are no published discussions of the Eurogroup. So basically a finance minister goes there and comes back with new laws, but there's nothing published on paper or broadcast about who said what and why.

Every time an anti-popular law is passed, it's "because of the EU", as if the EU were something abstract (exactly as this post describes) in which we, the people, have no say. So if we really have no say, why on earth would we support it? Because the opposite is worse? Sorry, but this ain't gonna cut it.

The EU has not solved the basic macroeconomic problems that every union or federal country faces, like surplus recycling: you can't have Germany with a huge surplus without having Greece with a huge pile of debt! You just can't! It's macro 101.

Why do you think that every time a referendum has been called in relation to the EU (France, the Netherlands, Greece and now the UK), the anti-EU choice has dominated? I mean, the EU is turning unpopular even in Germany, the country which profited most from the single currency. That was to be expected: as the biggest exporting power in Europe, it has a surplus vis-à-vis every state that can't devalue.

There is something fundamentally broken within the EU. The Germans are clearly unfit to lead, the French are too weak to react when it matters most, and the others simply don't count. The whole thing is going to blow up for sure.

Would you rather have 50 EUR and be able to decide what to do with them, or 500 EUR with another country telling you what to do with them?

Now, Prof. Varoufakis, whom I admire deeply, believes that what comes after a break-up is an even greater depression and possibly war. I don't like this narrative, not one bit, but I don't see how we can avoid it by staying within the EU.

The fact that we have neo-Nazis in the Greek parliament and the extreme right rising in Austria, France and Hungary is not pure chance. It's the result of EU policies promoting unemployment, stagnation and hatred.

So until the EU starts resembling a union more and a bullying squad less, there's no chance.

[1] Those were the exact words Schäuble used to explain to Varoufakis that Greece should continue with austerity as-is, although it was clearly visible that everything in the Greek program was wrong.

amelius 22 hours ago 0 replies      
Now try to explain that to the average voter :)
swalsh 23 hours ago 1 reply      
I upvoted this by accident; a one-letter title requires extreme clicking precision. I think HN should enforce a minimum title length.
sgnelson 23 hours ago 5 replies      
Could the title of this article be any less informative?
SlipperySlope 23 hours ago 1 reply      
Religious flamewar comments (and this is not the first one you've posted) are not allowed on Hacker News. Please don't do this again.

We detached this comment from https://news.ycombinator.com/item?id=11938702 and marked it off-topic.

putzdown 22 hours ago 0 replies      
People are generally selfish and lazy. We act in immediately efficient ways even if those ways have high long-term costs (balloon payments and so forth). We do this individually and we do it when we band together in companies and nations.

At the extremes, there are two ways to deal with selfishness and laziness. You can let it go, allowing people to make their own choices and face their own consequences, hoping they'll have enough foresight to avoid ballooning dangers. In economics and government this is generally associated with "conservative" politics. Or you can control people through rule-making and enforcement, preventing them from making choices they should know will end badly and forcing them to take the right actions. This is associated, somewhat unexpectedly, with the "liberal" approach.

It's pretty clear that either extreme is a bad choice. Pure conservatism allows the wealthy and powerful few to exploit the oppressed many, the large corporation to crush the individual, the effective short-term competitor to ignore long-term costs to self and others. Pure liberalism crushes individual choice, presumes the wise choice of those who rule, sustains complex legal and enforcement systems, and adds the profound inefficiency of government to the many expenses a society must bear.

Yet anything in between the two extremes is difficult. Ambiguity abounds. No single choice of a conservative or liberal approach (in health care, for example, or gun control, or airport security, or child rearing) is clearly correct and without serious downsides. People are difficult. Living together is hard.

If there is a clear error, therefore, it is an over-enthusiasm for one extreme or the other, an over-pessimism for the other side of the aisle.

What's more, one's own self-bias is all but impossible to see. This article begins with pretensions of objectivity that the author compares, if vaguely, to mathematics, yet the bias of the author shines through unmistakably. He searches for objective premises from which to argue, yet even the way he frames the questions, even the premises he chooses to expound, all but guarantee his conclusions.

And so here we are: difficult people, trying to live together, discussing (or pretending to discuss) how best to do so, yet chock-full of our own biases and unwilling (because we are selfish and lazy) to really hear the other side. And that's too bad, because the right answer is surely a blend, a case-by-case blend, of freedom and legislation.

When you compare the laws of the EU to nations contemporary and historical, I think it's pretty clear that it skews liberal and that there is a great deal of legislation: legislation "for their own good." It's possible that the EU has good legislation, a good amount of legislation, and that it is effective in enforcing that legislation such that the member nations have done, are doing, and will do better than they would without the union. (Not just economically better but morally, ecologically, etc., whatever values we ought to care about. Though that's not clear.) But it is also reasonable to wonder whether the EU is incorrectly skewed toward too much legislation, or of the wrong kind, or with bad enforcement, so that the member nations, or perhaps a single member nation, would do "better" (by some sound standard) removing itself from these laws. It helps in making these speculations that the EU, after all, is quite new in historical terms; Britons can look back even in their own memories and know what life without the EU might look like. There's no going back really, of course, but there is some historical basis for speculation. They can also look at their own values and culture and look for broad compatibilities or incompatibilities that might bode well or ill for the ongoing relationship.

It seems to me that the choice comes down to those kinds of questions. Were Britons better off before the EU, and if so, do they have a reasonable chance of returning, post-EU, to a similar or better situation? Is the rule of the EU over the UK, considered historically, abnormally invasive, complex, or incompetent, or is it on the other hand effective in legislation and enforcement?

So I think it's a historical, not mathematical or evenquitelogical question.

Study suggests that the brain training industry may be a placebo arstechnica.com
212 points by Aelinsaar  19 hours ago   129 comments top 26
serg_chernata 19 hours ago 12 replies      
Maybe it's just me, but I've always thought that being "smart" has everything to do with having a wealth of information and being able to draw connections between various facts and ideas.

Repeatedly solving a simple puzzle only makes you good at repeatedly solving a simple puzzle.

bjourne 16 hours ago 1 reply      
Even if you can't "train your brain" what you can train is your ability to focus at a task. For example, if you play chess you probably do not "get smarter." All you get is better at visualizing future board positions and maybe you learn some general strategy and opening theory. But 99% of chess is raw calculation.

Most people, when they begin playing, can't stand games longer than fifteen minutes or so. They have not trained their brains to focus over extended periods of time. But the more you play, the better you become at sitting still and concentrating on the position on the board. Eventually, after years of practice, you'll be able to play games where both players have 120 minutes+increment and use your time without getting bored.

Replace chess with any other brain teaser, like a quiz or an iq test. If you can sit and stare at the same problem uninterrupted for hours trying to work out a solution, then you will condition your brain to be able to do just that.

I firmly believe that is useful in all parts of life. Random example: you are thinking about divorcing your wife. If you are able to spend several hours analyzing the possible ramifications of that decision, you will be better off than if you can only think about it for twenty minutes without losing your train of thought.

So I don't think it is all placebo. Ability to concentrate and perseverance through difficulties are two very important skills to have.

TheSisb2 18 hours ago 1 reply      
I've read of some people who were finally set free after a long stretch in solitary confinement, and many of them attributed the preservation of their sanity to simple math or word exercises. I believe one would just count powers of two as far as he could go, and another would try remembering every word in every language they knew.

Not sure of similarities to brain training games, but I would bet there is something there.

CharlesMerriam2 19 hours ago 5 replies      
This study argues that people recruited to use Brain Training score higher on fluid intelligence than people recruited to participate in a study for a reward. That is, Brain Training works for people who believe it works.

So, shouldn't it be used by people who believe it works?

matznerd 15 hours ago 0 replies      
What really matters is cross-over. Does the skill you are training have application to what is being simulated? It's kind of like you get what you measure, in that you get what you train. For example, through training head-to-head in math-based games where speed is important, I have noticed that I am much more able to do quick calculations on checks.

There is a game in Elevate where you are shown avatars of people, then told information about them and their names. You have to remember what they look like and basic information about them through various rounds and comparisons to the other players. I have noticed that when I play that game frequently, it helps with memorizing names of people.

Dual-n-back is said to help increase working memory, and I believe it because in that game you have to constantly keep in your head the last auditory and visual records at the same time and integrate new information with them. I am annoyed by people constantly saying this type of training does not work...
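The dual n-back mechanic described above can be sketched in a few lines: the player watches two independent streams (visual positions and spoken letters) and must flag, for each stream separately, when the current stimulus matches the one n steps back. This is only an illustrative sketch; the function names and the example stimuli are made up, not taken from any particular brain training app.

```python
def nback_matches(stimuli, n):
    """Return the indices where a stimulus repeats the one n steps back."""
    return [i for i in range(n, len(stimuli)) if stimuli[i] == stimuli[i - n]]

def dual_round(positions, sounds, n=2):
    """Score a dual n-back round: two independent streams (grid positions
    and audio letters) are tracked at the same time, which is what makes
    the dual variant harder than tracking a single stream."""
    return {
        "visual_matches": nback_matches(positions, n),
        "audio_matches": nback_matches(sounds, n),
    }

# With n=2, positions index 3 matches (4 == positions[1]) and index 4
# matches (3 == positions[2]); the audio stream matches at 2 and 5.
positions = [1, 4, 3, 4, 3, 7]
sounds = ["A", "B", "A", "C", "B", "C"]
print(dual_round(positions, sounds))
# → {'visual_matches': [3, 4], 'audio_matches': [2, 5]}
```

A real trainer would present the streams in real time and compare the player's key presses against these computed match indices.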
ultramancool 13 hours ago 0 replies      
I seem to remember studies from a few years back showing Lumosity as less effective than World of Warcraft at "brain training". At least WoW can engage a few social and competitive aspects rather than just mindless puzzle solving.

CBC's Marketplace also did a show about this, pretty much crapping on the evidence for the whole thing:


andriesm 18 hours ago 2 replies      
"Dual N-Back" remains one of the best - it may be the only one that really works, unlike the bulk of other brain games that truly are scams.

This one has an extensive research literature, and there are many free/ad-supported and cheap versions on the Android and iPhone app stores.

For a critical discussion of the research see: https://www.quora.com/How-does-dual-n-back-actually-increase...

It seems to be one of the best if you're interested in this sort of thing.

Claims include: improved working memory, improved executive functioning (great for ADHD/ADD), improved emotional control, self control and regulation, minor bump in IQ.

jrockway 17 hours ago 1 reply      
Yeah, the human body seems to be very good at doing the minimum required to meet the provided training stimulus. Ride a bike fast, you get better at riding a bike fast.

I started doing crossword puzzles a few months ago. Mainly I've gotten better at doing crossword puzzles. When I started, I had to think about each clue, but now I know the common words that always show up and don't even have to read the clues to get those. Meanwhile, as I never write about "MDs" or the "EPA", this has had no useful impact on my life.

gabe_smith 18 hours ago 0 replies      
At least to me, it seems like doing repetitive puzzles isn't likely to do much more than make you good at that type of puzzle, rather than building actual problem-solving skills. I used to play these games with little actual impact beyond my score...

In math and logic, I've found a pretty great resource in https://brilliant.org/ with many quizzes and wikis that focus on the underlying connections and general problem-solving approaches, which has made me better at solving actual problems.

zizzles 17 hours ago 3 replies      
The only real "brain training" is reading, learning practical and technical skills, and perhaps living a healthy lifestyle.

Other than that, forget it. Brain power and being a quick-witted genius is all about GENETICS (like most things in this life)

hmahncke 15 hours ago 0 replies      
This research article and the arstechnica commentary are examples of exactly the overreach that they aim to criticize.

A number of large-scale multi-site randomized controlled trials have shown that specific types of brain training generalize to untrained measures of cognitive function and real-world activities. Here's two examples:

ACTIVE: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3934012/

IMPACT: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4169294/

Some brain training works, and some doesn't. Of the ones that work, some show certain types of benefits; and others show other types of benefits. Throwing the whole field out is like saying "laetrile doesn't treat cancer, so no molecules treat cancer."

disclosure: I was an investigator on IMPACT; and I work at a brain training company (Posit Science)

Madmallard 18 hours ago 3 replies      
I've had sharpness and focus improvements taking energy metabolism supplements. Could be that I'm deficient in something, or it could be that they'd also help others.

CoQ10, Nicotinamide Riboside, L-Carnitine, Magnesium Malate, B complex, N-Acetyl Cysteine

In terms of just being smarter and quicker to learn, I'm not really sure; that seems to be a combination of experience and natural factors. The experience part you can obviously change, but the other one, who knows.

GoToRO 15 hours ago 1 reply      
The "brain training" industry may be a placebo, but brain training is real. I have yet to understand why people think that you can have a very good brain if you don't train. In sports it's a given: if you want to run fast, you have to train, you have to eat right, you have to sleep right, you have to take breaks to not destroy your body.

But somehow people who work with their brain think that the brain, the organ, does not work like the rest of the body, and that you are smart no matter what, and so on. You are smart if you train, if you sleep right, if you eat right, and in general if you take care of yourself.

People say "it's just a puzzle." It's like a football player saying, "Why do I have to do squats?! I never have to do one while I play." Sure, at some point the puzzle becomes too easy. That's why they do squats with weights.

Cacti 14 hours ago 1 reply      
I thought it was well-proven a year or two ago that these apps/offers/classes were a scam?
davycro 5 hours ago 1 reply      
Studies show that surgeons who play first person shooter video games are faster and better with laparoscopic and robotic procedures. Does that not count as brain training?
bunkydoo 18 hours ago 0 replies      
Well, this doesn't really come as a huge surprise. These brain training apps seem to target logical, repetitive tasks that would only train your brain to be better at doing just that: logical, repetitive tasks. I do however think it is very possible to train the brain; it just hasn't been put in the form of an app or autonomous computer software yet.
gtrubetskoy 17 hours ago 1 reply      
Unlike "brain training", good old "learning" and "education" are definitely _not_ placebos.
Hondor 15 hours ago 0 replies      
Chewing gum can also increase intelligence. So it's easy to see that the effect of games can be hard to isolate from other factors.


elcapitan 6 hours ago 0 replies      
Brain suggests that the "study" industry might be a placebo.
kingkawn 18 hours ago 1 reply      
Isn't the point of brain training sort of to intentionally induce placebo?
mattfrommars 16 hours ago 0 replies      
I find N Back training to be actually effective from personal experience.
nxzero 18 hours ago 0 replies      
An interesting thing about placebos is that not all placebos are equal, which to me calls into question what a placebo really is.
BurningFrog 17 hours ago 0 replies      
The intelligence is all in my mind?
dredmorbius 16 hours ago 0 replies      
This has been both obvious and revalidated numerous times.

I'd like to know why public broadcasters, particularly PBS and NPR in the US, aren't called to task for advertising this crap heavily during their sponsorship drives.

Associating yourself with crap spreads the stains.

hackney 16 hours ago 1 reply      
Looks like you've mastered the art of having no thoughts, and with so many words even.
arkj 15 hours ago 0 replies      
The funny part is why does the placebo work? I know, there you go again....
Show HN: A Fortran web framework github.com
218 points by mapmeld  1 day ago   80 comments top 12
projectramo 23 hours ago 9 replies      
I would love it if these projects were accompanied by a little blog post stating who the person is, why they decided to do it, etc., for projects where it is obviously a labor of love.

Especially because of the effort it would take to get this working on a "dead" language. Like who are you? How did you decide to work on this for the pure joy of it? Why not Cobol? Why a web framework?

cpr 23 hours ago 0 replies      
Seeing this gave me a subtle but definite frisson of nostalgia for Fortran (my first language, learned in the late '60s).
ufmace 16 hours ago 0 replies      
This reminds me of the time I actually found myself charged with integrating a chunk of Fortran code compiled into a Windows DLL with an Excel sheet via VBA. The VBA side was straightforward if a little icky, but handling strings in Fortran was really giving me headaches. Documentation on Fortran, even down to basic syntax, seemed inconsistent and hard to find online. I think I got the project mostly working before I left that company for unrelated reasons.
mhd 19 hours ago 0 replies      
I'm mostly used to Fortran 77, and that was ages ago; this actually looks quite readable...
supernintendo 20 hours ago 1 reply      
Nice work! That home page, while admittedly simple, renders incredibly fast. Have you done any benchmarks / performance comparisons to other web frameworks?
k__ 22 hours ago 0 replies      
haifeng 23 hours ago 2 replies      
Just because you can?
jchomali 19 hours ago 1 reply      
I am definitely going to try this! Would love to see a bit more about the creator!
antidaily 13 hours ago 0 replies      
nickpsecurity 17 hours ago 1 reply      
For those wondering why it's used, here's a write-up I found on Eric Raymond's blog comparing Fortran and C:


Some comparisons are dated. However, I think the older variants still have advantages in readability, more clear semantics, semantics closer to algorithm itself (esp mathematical functions), less guessing by compiler for optimizations, and maybe a few others. Those I listed seem to still be advantages over C in general or as it's commonly used.

ha470 23 hours ago 0 replies      
An accomplishment, certainly, but...


       cached 21 June 2016 15:11:01 GMT