hacker news with inline top comments    .. more ..    12 Jul 2017 News
How Discord Scaled Elixir to 5M Concurrent Users discordapp.com
288 points by b1naryth1ef  3 hours ago   97 comments top 24
iagooar 2 hours ago 3 replies      
This writeup makes me even more convinced that Elixir will become one of the big players when it comes to hugely scaling applications.

If there is one thing I truly love about Elixir, it is the ease of getting started while standing on the shoulders of the giant that is the Erlang VM. You can start by building a simple, not very demanding application with it, yet once you hit a large scale, there are plenty of battle-proven tools to save you massive headaches and costly rewrites.

Still, I feel that using Elixir is, today, still a large bet. You need to convince your colleagues as much as your bosses / customers to take the risk. But you can rest assured it will not fail you when you need to push it to the next level.

Nothing comes for free, and at the right scale, even the Erlang VM is not a silver bullet and will require your engineering team to invest their talent, time and effort to fine-tune it. Yet, once you dig deep enough into it, you'll find plenty of ways to solve your problem at a lower cost compared to other solutions.

I see a bright future for Elixir, and a breath of fresh air for Erlang. It's such a great time to be alive!

jakebasile 2 hours ago 4 replies      
I'm continually impressed with Discord, and their technical blogs contribute to my respect for them. I use it in both my personal life (I run a small server for online friends, plus large game-centric servers) and my professional life (instead of Slack). It's a delight to use, the voice chat is extremely high quality, text chat is fast and searchable, and notifications actually work. Discord has become the de facto place for many gaming communities to organize, which is a big deal considering how discriminating and exacting PC gamers can be.

My only concern is their long-term viability, and I don't just mean money-wise. I'm concerned they'll have to sacrifice the user experience to either achieve sustainability or consent to a buyout by a larger company that only wants the users and brand. I hope I'm wrong, and I bought a year of Nitro to do my part.

jlouis 31 minutes ago 1 reply      
A fun idea is to do away with the "guild" servers in the architecture and simply run message passes from the websocket process over the Manifold system. A little bit of ETS work should make this doable, and now an eager sending process is paying for the work itself, slowing it down. This is exactly the behavior you want. If you are a bit more sinister, you can also format most of the message in the sending process and turn it into a binary. This ensures data is passed by reference and not copied in the system. It ought to bring message sends down to about function-call overhead if done right.

It is probably not a solution for current Discord as they rely on linearizability, but I toyed with building an IRCd in Erlang years ago, and there we managed to avoid having a process per channel in the system via the above trick.

As for the "hoops you have to jump through", it is usually true in any language. When a system experiences pressure, how easy it is to deal with that pressure is usually what matters. Other languages are "phase shifts" and while certain things become simpler in that language, other things become much harder to pull off.

didibus 1 hour ago 2 replies      
So, at this point, every language has been scaled to very high concurrent loads. What does that tell us? Sounds to me like languages don't matter for scale. In fact, that makes sense: scale is all about parallel processes, and horizontally distributing work can be achieved in any language. Scale is not like performance, where if you need it, you are restricted to a few languages only.

That's why I'd like to hear more about productivity and ease now. Is it faster and more fun to scale things in certain languages than in others? BEAM is modeled on actors and offers no alternatives. Java offers all sorts of models, including actors, but if actors are currently the most fun and productive way to scale, that doesn't matter.

Anyway, learning how teams scaled is interesting, but it's clear to me now that languages aren't limiting factors for scale.

Cieplak 2 hours ago 3 replies      
I know that the JVM is a modern marvel of software engineering, so I'm always surprised when my Erlang apps consume less than 10MB of RAM, start up nearly instantaneously, respond to HTTP requests in less than 10ms and run forever, while my Java apps take 2 minutes to start up, have several-hundred-millisecond HTTP response latency and hoard memory. Granted, it's more an issue with Spring than with Java, and Parallel Universe's Quasar is basically OTP for Java, so I know logically that Java is basically a superset of Erlang at this point, but perhaps there's an element of "less is more" going on here.

Also, we're looking for Erlang folks with payments experience.


rdtsc 3 hours ago 3 replies      
Good stuff. Erlang VM FTW!

> mochiglobal, a module that exploits a feature of the VM: if Erlang sees a function that always returns the same constant data, it puts that data into a read-only shared heap that processes can access without copying the data

There is a nice new OTP 20.0 optimization - now the value doesn't get copied even on message sends on the local node.

Jesper L. Andersen (jlouis) talked about it in his blog: https://medium.com/@jlouis666/an-erlang-otp-20-0-optimizatio...

> After some research we stumbled upon :ets.update_counter/4

Might not help in this case, but 20.0 adds select_replace, so you can do a full-on CAS (compare-and-swap) pattern: http://erlang.org/doc/man/ets.html#select_replace-2 . So something like acquiring a lock would be much easier to do.

> We found that the wall clock time of a single send/2 call could range from 30µs to 70µs due to Erlang de-scheduling the calling process.

There are a few tricks the VM uses there and it's pretty configurable.

For example, sending to a process with a long message queue will add a bit of backpressure to the sender and un-schedule them.

There are tons of configuration settings for the scheduler. There is an option to bind schedulers to physical cores to reduce the chance of scheduler threads jumping around between cores: http://erlang.org/doc/man/erl.html#+sbt Sometimes it helps, sometimes it doesn't.

Another general trick is to build the VM with the lcnt feature. This adds performance counters for locks / semaphores in the VM, so you can then check for the hotspots and know where to optimize.


mbesto 3 hours ago 1 reply      
This is one of those few instances where getting the technology choice right actually has an impact on cost of operations, service reliability, and overall experience of a product. For like 80% of all the other cases, it doesn't matter what you use as long as your devs are comfortable with it.
jmcgough 3 hours ago 0 replies      
Great to see more posts like this promoting Elixir. I've been really enjoying the language and how much power it gets from BEAM.

Hopefully more companies see success stories like this and take the plunge - I'm working on an Elixir project right now at my startup and am loving it.

joonoro 2 hours ago 1 reply      
Elixir was one of the reasons I started using Discord in the first place. I figured if they were smart enough to use Elixir for a program like this then they would probably have a bright future ahead of them.

In practice, Discord hasn't been completely reliable for my group. Lately messages have been dropping out or being sent multiple times. Voice gets messed up (robot voice) at least a couple times per week and we have to switch servers to make it work again. A few times a person's voice connection has stopped working completely for several minutes and there's nothing we can do about it.

I don't know if these problems have anything to do with the Elixir backend or the server.

EDIT: Grammar

ramchip 25 minutes ago 0 replies      
Very interesting article! One thing I'm curious about is how to ensure a given guild's process only runs on one node at a time, and the ring is consistent between nodes.

Do you use an external system like zookeeper? Or do you have very reliable networking and consider netsplits a tolerable risk?

danso 3 hours ago 1 reply      
According to Wikipedia, Discord's initial release was March 2015. Elixir hit 1.0 in September 2014 [0]. That's impressively early for adoption of a language for prototyping and for production.

[0] https://github.com/elixir-lang/elixir/releases/tag/v1.0.0

ShaneWilton 3 hours ago 1 reply      
Thanks for putting this writeup together! I use Elixir and Erlang every day at work, and the Discord blog has been incredibly useful in terms of pointing me towards the right tooling when I run into a weird performance bottleneck.

FastGlobal in particular looks like it nicely solves a problem I've manually had to work around in the past. I'll probably be pulling that into our codebase soon.

ConanRus 2 hours ago 1 reply      
I don't see anything Elixir-specific there; it is all basically Erlang VM/OTP stuff. When you use Erlang, you think in terms of actors/processes and message passing, and this is (IMHO) a natural way of thinking about distributed systems. So this article is a perfect example of how simple solutions can solve scalability issues if you're using the right platform.
_ar7 3 hours ago 0 replies      
Really liked the blog post. Elixir and the capabilities of the BEAM VM seem really awesome, but I can't find an excuse to use them in my day-to-day anywhere.
alberth 32 minutes ago 2 replies      
Is there any update on BEAMJIT?

It was super promising 3 or so years ago. But I haven't seen an update.

Erlang is amazing in numerous ways but raw performance is not one of them. BEAMJIT is a project to address exactly that.


brian_herman 3 hours ago 0 replies      
I love Discord's posts; they are very informative and easy to read.
myth_drannon 3 hours ago 1 reply      
It's interesting how on StackOverflow Jobs Elixir knowledge is required more often than Erlang.


jaequery 2 hours ago 6 replies      
Anyone know if Phoenix/Elixir has something similar to Ruby's better_errors gem? I see Phoenix has a built-in error stack trace page, which looks like a clone of better_errors, but it doesn't have the real-time console inside of it.

Also, I wish they had an ORM like Sequel. These two are really what is holding me back from going all in on Elixir. Anyone care to comment on this?

framp 1 hour ago 0 replies      
Really lovely post!

I wonder how Cloud Haskell would fare in such a scenario

brightball 3 hours ago 1 reply      
I so appreciate write ups that get into details of microsecond size performance gains at that scale. It's a huge help for the community.
zitterbewegung 2 hours ago 1 reply      
Compared to Slack, Discord is a much better service for large groups. Facebook uses them for React.
marlokk 1 hour ago 0 replies      
"How Discord Scaled Elixir to 5M Concurrent Users"

click link

[Error 504 Gateway time-out]

only on Hacker News

khanan 1 hour ago 0 replies      
Problem is that Discord sucks since it does not have a dedicated server. Sorry, move along.
orliesaurus 2 hours ago 1 reply      
Unlike Discord's design team, who seem to just copy all of Slack's designs and assets, the engineering team seems to have their shit together. It is delightful to read your Elixir blog posts. Good job!
Doppio: JVM written in JavaScript plasma-umass.github.io
18 points by api  41 minutes ago   7 comments top 6
the_duke 10 minutes ago 0 replies      
But... why??


> This paper presents DOPPIO, a JavaScript-based runtime system that makes it possible to run unaltered applications written in general-purpose languages directly inside the browser.

Someone should really have told them about webassembly...

Koshkin 10 minutes ago 0 replies      
> (Read the academic paper)

I admire the effort, but: doesn't "academic" mean "scientific"? Can there possibly be any "science" in having a well-known VM reimplemented in a well-known programming language?

bwidlar 19 minutes ago 1 reply      
JVM written in JavaScript, what could go wrong?
Scarbutt 10 minutes ago 0 replies      
Impressive, loaded a clojure.jar, got a repl and wrote/called some silly functions, it worked...
mehrdada 30 minutes ago 0 replies      
flukus 32 minutes ago 0 replies      
Does it run java applets?
ECMAScript 2017 Language Specification ecma-international.org
484 points by samerbuna  9 hours ago   190 comments top 25
thomasfoster96 9 hours ago 2 replies      
Proposals [0] that made it into ES8 (what's new):

* Object.values/Object.entries - https://github.com/tc39/proposal-object-values-entries

* String padding - https://github.com/tc39/proposal-string-pad-start-end

* Object.getOwnPropertyDescriptors - https://github.com/ljharb/proposal-object-getownpropertydesc...

* Trailing commas - https://github.com/tc39/proposal-trailing-function-commas

* Async functions - https://github.com/tc39/ecmascript-asyncawait

* Shared memory and atomics - https://github.com/tc39/ecmascript_sharedmem

The first five have been available via Babel and/or polyfills for ~18 months or so, so they've been used for a while now.

[0] https://github.com/tc39/proposals/blob/master/finished-propo...
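A minimal tour of the first five proposals above — a sketch assuming Node 8+ (which shipped all of them), nothing project-specific:

```javascript
// A quick tour of the first five ES2017 additions listed above.
const scores = { alice: 97, bob: 84 };

// Object.values / Object.entries
console.log(Object.values(scores));    // [ 97, 84 ]
console.log(Object.entries(scores));   // [ [ 'alice', 97 ], [ 'bob', 84 ] ]

// String padding
console.log('7'.padStart(3, '0'));     // '007'
console.log('ab'.padEnd(5, '.'));      // 'ab...'

// Object.getOwnPropertyDescriptors: all descriptors in one call
console.log(Object.getOwnPropertyDescriptors(scores).alice.value); // 97

// Trailing commas in parameter lists no longer throw
function sum(a, b,) { return a + b; }

// Async functions: sequential-looking asynchronous code
async function main() {
  console.log(await Promise.resolve(sum(1, 2))); // 3
}
main();
```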

callumlocke 9 hours ago 3 replies      
This is mostly symbolic. The annual ECMAScript 'editions' aren't very significant now except as a talking point.

What matters is the ongoing standardisation process. New JS features are proposed, then graduate through four stages. Once at stage four, they are "done" and guaranteed to be in the next annual ES edition write-up. Engines can confidently implement features as soon as they hit stage 4, which can happen at any time of year.

For example, async functions just missed the ES2016 boat. They reached stage 4 last July [1]. So they're officially part of ES2017 but they've been "done" for almost a year, and landed in Chrome and Node stable quite a while ago.

[1] https://ecmascript-daily.github.io/2016/07/29/move-async-fun...

flavio81 7 hours ago 2 replies      
What I wish ECMAScript had was true support for number types other than the default 64-bit float. I can use 32- and 64-bit integers using "asm.js", but this introduces other complications of its own -- basically, having to program in a much lower-level language.

It would be nice if EcmaScript could give us a middle ground -- ability to use 32/64 bit integers without having to go all the way down to asm.js or wasm.

HugoDaniel 9 hours ago 5 replies      
I would really love to see an object map function. I know it is easy to implement, but since they seem to be gaining ranks through syntax sugar, why not just have an obj.map( (prop, value) => ... ) ? :)
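The wished-for map is indeed a few lines on top of Object.entries (ES2017). A sketch — `mapObject` is a hypothetical name, not part of any TC39 proposal:

```javascript
// A plain-function version of the obj.map( (prop, value) => ... ) idea above.
function mapObject(obj, fn) {
  return Object.entries(obj).reduce((acc, [prop, value]) => {
    acc[prop] = fn(prop, value); // same (prop, value) signature as wished for
    return acc;
  }, {});
}

console.log(mapObject({ a: 1, b: 2 }, (prop, value) => value * 10));
// { a: 10, b: 20 }
```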
ihsw2 8 hours ago 2 replies      
Notably, with shared memory and atomics, pthreads support is on the horizon.


Granted it may be limited to consumption via Emscripten, it is nevertheless now within the realm of possibility.

For those who cannot grok the gravity of this -- proper concurrent/parallel execution just got a lot closer for those targeting the browser.

pi-rat 9 hours ago 5 replies      
Really hate the naming for JS standards.. ES2017, ES8, ECMA-262. Way to confuse people :/
baron816 7 hours ago 0 replies      
Regardless of what gets included in the spec, I hope people think critically about what to use and what not to use before they jump in. Just because something is shiny and new in JS, it doesn't mean you have to use it or that it's some sort of "best practice."
drinchev 9 hours ago 1 reply      
For anyone wondering about Node.js's support of ES8:

Everything is supported, except "Shared memory and atomics"

[1] http://node.green

speg 9 hours ago 1 reply      
Is there a "What's new" section?
pgl 9 hours ago 2 replies      
Here's what's in it: https://github.com/tc39/proposals/blob/master/finished-propo...

And some interesting tweets by Kent C. Dodds: https://twitter.com/kentcdodds/status/880121426824630273

Edit: fixed KCD's name. Edit #2: No, really.

wilgertvelinga 2 hours ago 2 replies      
Really interesting how bad the only JavaScript code used on their own site is: https://www.ecma-international.org/js/loadImg.js
43224gg252 9 hours ago 5 replies      
Can anyone recommend a good book or guide for someone who knows pre-ES6 javascript but wants to learn all the latest ES6+ features in depth?
rpedela 9 hours ago 2 replies      
Has there been any progress on supporting 64-bit integers?
jadbox 9 hours ago 1 reply      
I wish this-binding sugar would get promoted into stage 1.
ascom 8 hours ago 1 reply      
Looks like ECMA's site is overloaded. Here's a Wayback Machine link for the lazy: https://web.archive.org/web/20170711055957/https://www.ecma-...
gregjw 9 hours ago 1 reply      
I should really learn ES6
emehrkay 8 hours ago 2 replies      
I'd like to be able to capture object modifications like Python's magic __getattr__ __setattr__ __delattr__ and calling methods that do not exist on objects. In the meantime I am writing a get, set, delete method on my object and using those instead
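For what it's worth, the ES2015 `Proxy` already covers much of this ground: its `get`, `set`, and `deleteProperty` traps are rough analogues of Python's `__getattr__`/`__setattr__`/`__delattr__`. A sketch (`magic` is just an illustrative object, not a library API):

```javascript
// A rough JS analogue of Python's attribute magic using an ES2015 Proxy.
const magic = new Proxy({}, {
  // like __getattr__: intercepts every property read, including misses
  get(obj, prop) {
    return prop in obj ? obj[prop] : `no such attribute: ${String(prop)}`;
  },
  // like __setattr__: intercepts every property write
  set(obj, prop, value) {
    obj[prop] = value;
    return true; // signal success
  },
  // like __delattr__: intercepts `delete magic.x`
  deleteProperty(obj, prop) {
    delete obj[prop];
    return true;
  },
});

magic.x = 42;
console.log(magic.x);        // 42
console.log(magic.missing);  // no such attribute: missing
delete magic.x;
console.log(magic.x);        // no such attribute: x
```

Calling methods that don't exist can be emulated the same way, by returning a function from the `get` trap.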
komali2 4 hours ago 0 replies      
>AWB: Alternatively we could add this to a standard Dict module.

>BT: Assuming we get standard modules?

>AWB: We'll get them.


espadrine 8 hours ago 0 replies      
I made a short sum-up of changes in this specification here: http://espadrine.github.io/New-In-A-Spec/es2017/
lukasm 9 hours ago 1 reply      
What is up with decorators?
j0e1 7 hours ago 1 reply      
> Kindly note that the normative copy is the HTML version;

Am I the only one who finds this ironic..

idibidiart 3 hours ago 0 replies      
Wait, so async generators and web streams are 2018 or 2016?
Swizec 9 hours ago 3 replies      
Time to update https://es6cheatsheet.com

What's the feature you're most excited about?

cies 9 hours ago 2 replies      
Nice 90s style website ECMA!
bitL 7 hours ago 2 replies      
Heh, maybe JS becomes finally usable just before WebAssembly takes off, rendering it obsolete :-D
Cloudflares fight with a patent troll could alter the game techcrunch.com
394 points by Stanleyc23  7 hours ago   132 comments top 21
jgrahamc 5 hours ago 0 replies      
More detail on what we are doing from three blog posts:

Standing Up to a Dangerous New Breed of Patent Troll: https://blog.cloudflare.com/standing-up-to-a-dangerous-new-b...

Project Jengo: https://blog.cloudflare.com/project-jengo/

Patent Troll Battle Update: Doubling Down on Project Jengo: https://blog.cloudflare.com/patent-troll-battle-update-doubl...

avodonosov 2 minutes ago 0 replies      
I've read the patent. But what part of CloudFlare's services does it claim to cover?

Also, the patent applies the same way to almost any proxy server (ICAP and similar https://en.wikipedia.org/wiki/Internet_Content_Adaptation_Pr...)

JumpCrisscross 4 hours ago 2 replies      
I've used Latham & Watkins. Just made a call to let a partner there know what I think about his firm's alumna and how it colors my opinion of him and his firm.

Encourage everyone to check with your firm's General Counsel about this. If you use Latham, or Kirkland or Weil, encourage your GC to reach out and make your views heard. It's despicable that these lawyers are harassing their firms' former and potential clients.

tracker1 5 hours ago 2 replies      
I think that this is absolutely brilliant. I've been against the patenting of generalistic ideas and basic processes for a very long time. Nothing in software should really be patentable; unless there is a concrete implementation of an invention, it's not an invention, it's a set of instructions.

Let software work under trade secrets, but not patents. Anyone can implement something they think through. It's usually a clear example of a need. That said, I think the types of patent trolling law firms such as this deserve every bit of backlash against them that they get.

notyourday 6 hours ago 3 replies      
It is all about finding a correct pressure point.

A long time ago, certain Philadelphia-area law firms decided to represent vegan protesters who created a major mess in a couple of high-end restaurants.

A certain flamboyant owner of one of the restaurants targeted decided to have a good time applying his version of asymmetric warfare. The next time partners from those law firms showed up to wine and dine their clients in the establishment, the establishment(s) politely refused them service, to the utter horror of the lawyers.

Needless to say, the foie gras won...

[Edit: spelling]

siliconc0w 6 hours ago 1 reply      
I'm not a fan of the argument that if Blackbird weren't an NPE it'd be okay, because Cloudflare could then aim its 150-strong patent portfolio cannon back at them. It's basically saying incumbents like Cloudflare don't really want to fix the system; they want to keep the untenable 'cold war' status quo, which protects them but burdens new entrants.
oskarth 5 hours ago 5 replies      
> So-called non-practicing entities or holders of a patent for a process or product that they don't plan to develop often use them to sue companies that would sooner settle rather than pay what can add up to $1 million by the time a case reaches a courtroom.

Why on earth aren't non-practicing-entity patent lawsuits outlawed? Seems like a no-brainer, and I can't imagine these firms being big enough to have any serious lobbying power.

mabbo 6 hours ago 1 reply      
> [Is Blackbird] doing anything that is illegal or unethical? continues Cheng. For the most part, it's unethical. But it's probably not illegal.

If it's not illegal, more work needs to be done to make it illegal. Inventors always have avenues, moreso today than ever before.

ovi256 7 hours ago 3 replies      
I've noticed a TechCrunch comment that makes this fight about software patents and states that forbidding them would be a good solution. I think that's a very wrong view to take. The software patent fight is worth fighting, but do not conflate the two issues. Abuse by patent trolls or non-practicing entities can happen even without software patents.

The law patch that shuts down patent trolls will have no effect on software patents, and vice-versa.

fhrow4484 2 hours ago 0 replies      
What is the state of "anti-patent troll" laws in different states? I know, for instance, Washington state has had a law like this in effect since July 2015 [1][2]. What is it like in other states, specifically California?

[1] http://www.atg.wa.gov/news/news-releases/attorney-general-s-...

[2] http://app.leg.wa.gov/RCW/default.aspx?cite=19.350&full=true

shmerl 6 hours ago 2 replies      
Someone should figure out a way to put these extortionists in prison for protection racketeering.
redm 5 hours ago 0 replies      
It would be great if the "game" were really altered, but I've heard that statement and that hope many times over the last 10 years. While there has been some progress, patent trolling continues. Here's hoping...
SaturateDK 5 hours ago 0 replies      
This is great, I guess I'm going "Prior art searching" right away.
danschumann 5 hours ago 0 replies      
Can I create 5 more HN accounts just to +1 this some more?
FussyZeus 7 hours ago 3 replies      
I've never heard a good argument against this, so I'll say it here: require that the plaintiff in these cases show demonstrable, actual, and quantifiable loss from the activity of the defendant. It seems like such a no-brainer that a business suing for damage to its business prospects after someone stole their idea would have to actually show how it was damaged. Even allowing very flimsy evidence would do a lot to dissuade most trolls because, as every article points out, they don't make anything. And if they don't make or sell a product, then patent or not, they haven't lost anything or been damaged in any way.
kelukelugames 6 hours ago 1 reply      
I'm in tech but not in the valley. How accurate is HBO's representation of patent trolls?
unityByFreedom 2 hours ago 0 replies      
> Blackbird is a new, especially dangerous breed of patent troll... Blackbird combines both a law firm and intellectual property rights holder into a single entity. In doing so, they remove legal fees from their cost structure and can bring lawsuits of potentially dubious merit without having to bear any meaningful cost

That's not new. It's exactly what Intellectual Ventures was (or is?) doing.

draw_down 4 hours ago 0 replies      
Unfortunately, I think this is written in a way that makes it hard to understand what exactly Cloudflare is doing against the troll. They're crowdsourcing prior art and petitioning the USPTO?
dsfyu404ed 6 hours ago 1 reply      
subhrm 6 hours ago 1 reply      
Long live patents!
ivanbakel 7 hours ago 2 replies      
I don't see anything game-changing about their approach. Fighting instead of settling should definitely be praised, but the only differences between this legal challenge and any of the previous ones are the result of recent changes in the law or the judiciary, which are beyond Cloudflare's control. Nothing suggests that patent-trolling itself as a "game" is going to shift or go away after this, and until that is made to happen, it's going to be as lucrative as ever.
Students Are Better Off Without a Laptop in the Classroom scientificamerican.com
222 points by thearn4  8 hours ago   154 comments top 41
zeta0134 5 hours ago 5 replies      
Oh, okay, I thought the study was going to be on the benefits of attempting to use the laptop itself for classroom purposes, not for social media distractions. This would be more accurately titled, "Students Are Better Off Without Distractions in the Classroom." Though I suppose, it wouldn't make a very catchy headline.

I found my laptop to be very beneficial in my classroom learning during college, but only when I made it so. My secret was to avoid even connecting to the internet. I opened up a word processor, focused my eyes on the professor's slides or visual aids, and typed everything I saw, adding notes and annotations based on the professor's lecture.

This had the opposite effect of what this article describes: by focusing my distracted efforts on formatting the notes and making them more coherent, I kept myself focused and could much more easily engage with the class. Something about the menial task of taking the notes (which I found I rarely needed to review) prevented me from losing focus and wandering off to perform some unrelated activity.

I realize my experience is anecdotal, but then again, isn't everyone's? I think each student should evaluate their own style of learning, and decide how to best use the tools available to them. If the laptop is a distraction? Remove it! Goodness though, you're paying several hundred (/thousand) dollars per credit hour, best try to do everything you can to make that investment pay off.

makecheck 7 hours ago 7 replies      
If students aren't engaged, they aren't going to become star pupils once you take away their distractions. Perhaps kids attend more lectures than before knowing that they can always listen in while futzing with other things (and otherwise, they may skip some of the classes entirely).

The lecture format is what needs changing. You need a reason to go to class, and there was nothing worse than a professor showing slides from the pages of his own book (say) or droning through anything that could be Googled and read in less time. If there isn't some live demonstration, or lecture-only material, regular quizzes or other hook, you can't expect students to fully engage.

ourmandave 6 hours ago 5 replies      
This reminds me of the running gag in some college movie where the first day all the students show up.

The next cut some students come to class, put a recorder on their desk and leave, then pick it up later.

Eventually there's a scene of the professor lecturing to a bunch of empty desks with just recorders.

And the final scene there's the professor's tape player playing to the student's recorders.

njarboe 4 hours ago 1 reply      
This is a summary of an article titled "Logged In and Zoned Out: How Laptop Internet Use Relates to Classroom Learning," published in Psychological Science in 2017; the DOI is 10.1177/0956797616677314 if you want to check out the details.

Abstract: Laptop computers are widely prevalent in university classrooms. Although laptops are a valuable tool, they offer access to a distracting temptation: the Internet. In the study reported here, we assessed the relationship between classroom performance and actual Internet usage for academic and nonacademic purposes. Students who were enrolled in an introductory psychology course logged into a proxy server that monitored their online activity during class. Past research relied on self-report, but the current methodology objectively measured time, frequency, and browsing history of participants' Internet usage. In addition, we assessed whether intelligence, motivation, and interest in course material could account for the relationship between Internet use and performance. Our results showed that nonacademic Internet use was common among students who brought laptops to class and was inversely related to class performance. This relationship was upheld after we accounted for motivation, interest, and intelligence. Class-related Internet use was not associated with a benefit to classroom performance.

shahbaby 2 hours ago 0 replies      
"Thus, there seems to be little upside to laptop use in class, while there is clearly a downside."

Thanks to BS articles like this that try to overgeneralize their results, I was unsure if I "needed" a laptop when returning to school.

Got a Surface Book and here's what I've experienced over the last 2 semesters:

- Going paperless, I'm more organized than ever. I just need to make sure I bring my Surface with me wherever I go and I'm good.

- Record lectures, tutorials, office hours, etc. Although I still take notes to keep myself focused, I can go back and review things with 100% accuracy thanks to this.

- Being at 2 places at once, i.e.: make last-minute changes before submitting an assignment for class A, or attend the review lecture to prepare for next week's quiz in class B? I can leave the Surface in class B to record the lecture while I finish up the assignment for class A.

If you can't control yourself from browsing the internet during a lecture then the problem is not with your laptop...

imgabe 5 hours ago 4 replies      
I went to college just as laptops were starting to become ubiquitous, but I never saw the point of them in class. I still think they're pretty useless for math, engineering, and science classes where you need to draw symbols and diagrams that you can't easily type. Even for topics where you can write prose notes, I always found it more helpful to be able to arrange them spatially in a way that made sense rather than the limited order of a text editor or word processor.
kgilpin 21 minutes ago 0 replies      
It sounds like what students need are better teachers. I haven't been to school in a while but I had plenty of classes that were more interesting than surfing YouTube; and some that weren't.

The same is true for meetings at work. In a good session, people are using their laptops to look up contributing information. In a bad one... well... you know.

stevemk14ebr 7 hours ago 2 replies      
I think this is a highly personal topic. As a student myself, I find a laptop in class very nice: I can type my notes faster and organize them better. Most of my professors' lectures are scatterbrained, and I frequently have to go back to a previous section and annotate or insert new sections. With a computer I just go back and type; with pen and paper I have to scribble, or write in the margins. Of course computers can be distractions, but that is the student's responsibility; let natural selection take its course and stop hindering my ability to learn how I do best (I am a CS major, so computers are >= paper to me). If you cannot do your work with a computer, then don't bring one yourself; don't ban them for everyone.
baron816 6 hours ago 0 replies      
Why are lectures still being conducted in the classroom? Students shouldn't just be sitting there copying what the teacher writes on the board anyway. They should be having discussions, working together or independently on practice problems, teaching each other the material, or just doing anything that's actually engaging. Lecturing should be done at home via YouTube.
rdtsc 6 hours ago 2 replies      
I had a laptop and left it at home most of the time, and just stuck with taking notes with a pen and sitting up front.

I took lots of notes. Some people claim it's pointless and distracts from learning, but for me the act of taking notes is what helped solidify the concepts better. Heck, due to my horrible handwriting I couldn't even read some of the notes later. But it was still worth it. Typing them out just wasn't the same.

alkonaut 6 hours ago 0 replies      
This is the same as laptops not being allowed in meetings. A company where it's common for meeting participants to "take notes" on a laptop is dysfunctional. Laptops need to be banned in meetings (and smartphones in meetings and lectures).

Also, re: other comments: a video lecture is to a physical lecture what a conference call is to a proper meeting. A professor rambling for 3 hours is still miles better than watching the same thing on YouTube. The same holds for TV versus watching a film on a movie screen.

Zero distractions and complete immersion. Maybe VR will allow it some day.

brightball 6 hours ago 1 reply      
Shocker. I remember being part of Clemson's laptop pilot program in 1998. If you were ever presenting you basically had to ask everyone to close their laptops or their eyes would never even look up.
exabrial 16 minutes ago 0 replies      
Students are best off with the fewest distractions
tsumnia 6 hours ago 1 reply      
I think it's a double-edged sword; it's not just paper > laptop or laptop > paper. As many people have already stated, it's about engagement. Since coming back for my PhD, I've subscribed to the pencil-and-paper approach as a simple show of respect to the instructor. Despite what we think, professors are human and flawed, and having been in their shoes, it can be disheartening to not be able to feed off your audience.

That being said, you can't control them; however, I like to look at different performance styles. What makes someone binge-watch Netflix episodes but want to nod off during a lecture? Sure, one has less cognitive load, but replace the Netflix binge with anything. People are willing to engage, as long as the medium is engaging (this doesn't mean easy or funny, simply engaging).

[Purely anecdotal, opinion-based discussion] This is one of the reasons I think flipping the classroom does work; they can't tune out. But if it's purely them doing work, what's your purpose there? To babysit? There needs to be a happy medium between work and lecture.

I like to look at the class time in an episodic structure. Pick a show and you'll notice there's a pattern to how the shows work. By maintaining a consistency in the classroom, the students know what to expect.

To tie it back to the article, the laptop is a great tool to use when you need them to do something on the computer. However, they should be looking at you, and you should be drawing their attention. Otherwise, you're just reading your PowerPoint slides.

fatso784 3 hours ago 0 replies      
There's another study showing that students around you with laptops harm your ability to concentrate, even if you're not on a laptop yourself. This is in my opinion a stronger argument against laptops, because it harms those not privileged enough to have a laptop. (not enough time to find study but you can find it if you search!)
zengid 6 hours ago 0 replies      
Please excuse me for relating an experience, but it's relevant. To get into my IT grad program I had to take a few undergrad courses (my degree is in music, and I didn't have all of the pre-reqs). One course was Intro to Computer Science, which unfortunately had to be taught in the computer lab used for the programming courses. It was sad to see how undisciplined the students were. Barely anyone paid attention to the lectures as they googled the most random shit (one kid spent a whole lecture searching through images of vegetables). The final exam was open-book. I feel a little guilty, but I enjoyed seeing most of the students nervously flip through the chapters the whole time, while it took me 25 minutes to finish (the questions were nearly identical to those from previous exams).
LaikaF 4 hours ago 0 replies      
My high school did the one-laptop loan-out thing (later got sued for it), and I can tell you it was useless as a learning tool, at least in the way intended. I learned quite a bit, mainly about navigating around the blocks and rules they put in place. In high school my friends and I ran our own image board, learned about reverse proxying via meebo repeater, hosted our own domains to dodge filtering, and much more. As for what I used them for in class: if I needed to take notes, I was there with notebook and pen. If I didn't, I used the laptop to do homework for other classes while in class. I had a reputation among my teachers for handing in assignments the day they were assigned.

In college I slid into the pattern they saw here. I started spending more time on social media, paying less attention in class, slacking on my assignments. As my burnout increased, the actual class times became less a thing I learned from and more just something I was required to sit in. One of my college classes literally just required me to show up. It was one of the few electives in the college at a large university. The students were frustrated they had to be there, and the teacher was tired of teaching students who just didn't care.

Overall I left college burnt out and pissed at the whole experience. I went in wanting to learn it just didn't work out.

emptybits 5 hours ago 0 replies      
It makes sense that during a lecture, simple transcription (associated with typing) yields worse results than cognition (associated with writing). So pardon my ignorance (long out of the formal student loop):

Are students taught how to take notes effectively (with laptops) early in their academic lives? Before we throw laptops out of classrooms, could we be improving the situation by putting students through a "How To Take Notes" course, with emphasis on effective laptopping?

It's akin to "how to listen to music" and "how to read a book" courses -- much to be gained IMO.

kyle-rb 6 hours ago 0 replies      
>students spent less than 5 minutes on average using the internet for class-related purposes (e.g., accessing the syllabus, reviewing course-related slides or supplemental materials, searching for content related to the lecture)

I wonder if that could be skewed, because it only takes one request to pull up a course syllabus, but if I have Facebook Messenger open in another tab, it could be receiving updates periodically, leading to more time recorded in this experiment.

Fomite 3 hours ago 1 reply      
Just personally, for me it was often a choice between "Laptop-based Distractions" or "Fall Asleep in Morning Lecture".

The former was definitely the superior of the two options.

Bearwithme 5 hours ago 0 replies      
They should try this study again, but with laptops heavily locked down: disable just about everything that isn't productive, including a strict web filter. I am willing to bet the results would be much better for the kids with laptops. Of course, if you give them free rein they are going to be more interested in entertainment than productivity.
free_everybody 3 hours ago 0 replies      
I find that having my laptop out is great for my learning, even during lectures. If something's not clear or I want more context, I can quickly look up some information without interrupting the teacher. Also, paper notes don't travel well. If everything is on my laptop and backed up online, I know that if I have my laptop, I can study anything I want. Even if I don't have my laptop, I could use another computer to access my notes and documents. This is a HUGE benefit.
vblord 5 hours ago 0 replies      
During indoor recess at my kids' school, kids don't eat their lunch and just throw it away because of the Chromebooks. There are only a few computers, and they are first come, first served. Kids would rather go without lunch to be able to play on the internet for 20 minutes.
jessepage1989 1 hour ago 0 replies      
I find taking paper notes and then reorganizing on the computer works best. The repetition helps memorization.
wccrawford 7 hours ago 3 replies      
I'd be more impressed if they also did the same study with notepads and doodles and daydreams, and compared the numbers.

I have a feeling that the people not paying attention weren't going to pay attention anyhow.

However, I'd also guess that at least some people use the computer to look up additional information instead of stopping the class and asking, which helps everyone involved.

zitterbewegung 3 hours ago 0 replies      
When I was in college I would take notes with pen and paper in a notebook. I audited some classes with my laptop using LaTeX, but most of the time I used a notebook. Also, sometimes I would just go to class without a notebook and absorb the information that way. It also helped that I didn't have a smartphone with cellular data for half the time I was in school.
homie 7 hours ago 0 replies      
Instructors are also better off without computers in the classroom. Lecture has been reduced to staring at a projector while each and every student's eyes roll to the back of their skull.
zokier 5 hours ago 1 reply      
I love how any education-related topic brings the armchair pedagogues out of the woodwork. Of course, a big aspect is that everyone has encountered some amount of education, and in particular both courses they enjoyed and courses they disliked. And there is of course the "think of the children" aspect.

To avoid making a purely meta comment: in my opinion the ship has already sailed; we are going to have computers in classrooms, for better or worse. So the big question is how we can make the best use of that situation.

nerpderp83 7 hours ago 1 reply      
Paying attention requires work; we need to be purposeful about using tools that are also distractions.
erikb 6 hours ago 0 replies      
I'd argue that students are better off without a classroom as long as they have a laptop (and internet, but that is often also better at home/cafe than in the classroom).
polote 4 hours ago 0 replies      
Well, it depends on what you do in the classroom. When class is mandatory but you are not able to learn this way (by listening to a teacher), having a laptop lets you do other things and use your time efficiently: doing administrative work, sending email, coding...

Some students are, of course, better off with a laptop in the classroom.

marlokk 4 hours ago 0 replies      
Students are better off with instructors who don't bore students into bringing out their laptops.
TazeTSchnitzel 6 hours ago 0 replies      
> In contrast with their heavy nonacademic internet use, students spent less than 5 minutes on average using the internet for class-related purposes

This is a potential methodological flaw. It takes me 5 minutes to log onto my university's VLE and download the course materials. I then read them offline. Likewise, taking notes in class happens offline.

Internet use does not reflect computer use.

Kenji 6 hours ago 0 replies      
If you keep your laptop open during class, you're not just distracting yourself, you're distracting everyone behind you (that's how human attention works: if you see a bright display with moving things, your attention is drawn toward it), and that's not right. That's why at my uni there was an unspoken (de facto) policy that if you kept your laptop open during lectures, you sat in the back rows, especially if you played games or did stuff like that. It worked great: I was always in the front row with pen & paper.

However, a laptop is very useful to get work done during breaks or labs when you're actually supposed to use it.

Glyptodon 6 hours ago 2 replies      
I feel like the conclusion is a bit off base: that students lack the self-control to restrict the use of laptops to class-related activities is somehow a sign that the problem is the laptop and not the students? I think it's very possible that younger generations have big issues with self-control and instant gratification. But I think it's wrong to conclude that laptops are the faulty party.
ChiliDogSwirl 5 hours ago 1 reply      
Maybe it would be helpful if our operating systems were optimised for working and learning rather than for selling us crap and mining our data.
partycoder 4 hours ago 1 reply      
I think VR will be the future of education.
rokhayakebe 6 hours ago 1 reply      
We really need to begin ditching most studies. We now have the ability to collect vast amounts of data and use that to draw conclusions from millions of data points, not just 10, 100, or 1,000 pieces of information.
FussyZeus 6 hours ago 0 replies      
Disengaged and uninterested students will find a distraction. Yes, perhaps a laptop makes it easier, but my education in distraction-seeking during middle school, well before laptops were anywhere near schools, shows that the lack of a computer in front of me was no obstacle to locating something more interesting to put my attention to.

The real solution is to engage students so they don't feel the urge to get distracted in the first place. Then you could give them completely unfiltered internet and they would still be learning (perhaps even faster, using additional resources). There's no substitute for an urge to learn; even if you strap students to their chairs and pin their eyeballs open, it won't do anything. It just makes school less interesting, less fun, and less appealing, which by extension makes learning less fun, less appealing, and less interesting.

microcolonel 6 hours ago 1 reply      
Students are also better off without compulsory teachers' unions and federal curriculum mandates; no chance of hearing about that.

Maybe the best way out of this mess is vouchers.

If the schools are functioning, it should be obvious to them that the laptops are not working out.

bitJericho 7 hours ago 1 reply      
The schools are so messed up in the US. Best to just educate children yourself as best you can. As for college kids, best to travel abroad.
Math education: It's not about numbers, it's about learning how to think nwaonline.com
393 points by CarolineW  11 hours ago   249 comments top 41
d3ckard 10 hours ago 15 replies      
Maybe I'm wrong, but I have always believed that if you want people to be good at math, it's their first years of education that are important, not the last ones. In other words, the push for STEM should be present in kindergartens and elementary schools. By the time people get to high school it is too late.

I never had any problems with math until I went to university, so I was merely a passive observer of some people's everyday struggle. I honestly believe that foundations are the key. Either you're taught to think critically, see patterns, and focus on the train of thought, or you focus on numbers and memorization.

The latter obviously fails at some point, in many cases sufficiently late to make it really hard to go back and relearn everything.

Math is extremely hierarchical and I believe schools do not do enough to make sure students are on the same page. If we want to fix teaching math, I would start there, instead of working on motivation and general attitude. Those are consequences, not the reasons.

gusmd 8 hours ago 3 replies      
I studied Mechanical Engineering, and it was my experience that several professors are only interested in having the students learn how to solve problems (which in the end boil down to math and applying equations), instead of actually learning the interesting and important concepts behind them.

My wife went to school for Architecture, where she learned "basic" structural mechanics and some Calculus, but still cannot explain to me in simple words what an integral or a derivative is. Not her fault at all: her Calculus professor had them calculate polynomial derivatives for 3 months without ever making them understand the concept of "rate of change" or what "infinitesimal" means.

For me that's a big failure of our current "science" education system: too much focus on stupid application of equations and formulas, and too little focus on actually comprehending the abstract concepts behind them.

Koshkin 10 hours ago 9 replies      
Learning "how to think" is just one part of it. The other part, the one that makes it much more difficult for many, if not most, people to learn math (especially its more abstract branches), is learning to think about math specifically. The reason is that mathematics creates its own universe of concepts and ideas, and these notions are so different from what we deal with every day that learning them takes years of intensive experience with mathematical structures of one kind or another. It should come as no surprise that people have difficulty learning math.
monic_binomial 40 minutes ago 0 replies      
I was a math teacher for 10 years. I had to give it up when I came to realize that "how to think" is about 90% biological and strongly correlated to what we measure with IQ tests.

This may be grave heresy in the Temple of Tabula Rasa where most education policy is concocted, but nonetheless every teacher I ever knew was ultimately forced to choose between teaching a real math class with a ~30% pass rate or a watered-down math Kabuki show with a pass rate just high enough to keep their admins' complaints to a low grumble.

In the end we teachers would all go about loudly professing to each other that "It's not about numbers, it's about learning how to think" in a desperate bid to quash our private suspicions that there's actually precious little that can be done to teach "how to think."

J_Sherz 9 hours ago 2 replies      
My problem with Math education was always that speed was an enormous factor in testing. You can methodically go through each question aiming for 100% accuracy and not finish the test paper, while other students can comfortably breeze through all the questions and get 80% accuracy but ultimately score higher on the test. This kind of penalizing for a lack of speed can lead to younger kids who are maximizing for grades to move away from Math for the wrong reasons.

Source: I'm slow but good at Math and ended up dropping it as soon as I could because it would not get me the grades I needed to enter a top tier university.

spodek 10 hours ago 1 reply      
> it's about learning how to think

It's about learning a set of thinking skills, not how to think. Many people who know no math can think and function very well in their domains and many people who know lots of math function and think poorly outside of math.

quantum_state 4 hours ago 0 replies      
Wow ... this blows me away ... in a few short hours, so many people chimed in sharing thoughts ... It is great ... I would like to share mine as well.

Fundamentally, math to me is like a language. It's meant to help us describe things a bit more quantitatively and to reason a bit more abstractly and consistently ... if it can be made mechanical and reduce the burden on one's brain, it would be ideal. Since it's like a language, as long as one knows the basics, such as some basic set theory, functions, etc., one should be ready to explore the world with it. Math is often perceived as a set of concepts, theorems, rules, etc. But if one gets behind the scenes to learn some of the original stories behind these things, it becomes very natural. At some point, one's mind is liberated and one starts to use or create math the way we use day-to-day languages such as English.
lucidguppy 23 minutes ago 0 replies      
Why aren't people taught how to think explicitly? The Greeks and the Romans thought it was a good idea.
brendan_a_b 5 hours ago 0 replies      
My mind was blown when I came across this Github repo that demonstrates mathematical notation by showing comparisons with JavaScript code https://github.com/Jam3/math-as-code

I think I often struggled with, or was intimidated by, the syntax of math. I started web development after years of thinking I just wasn't a math person. When looking at this repo, I was surprised at how much more easily and naturally I was able to grasp concepts in code compared to being introduced to them in math classes.
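For anyone who hasn't clicked through, the flavor is roughly this (my own illustrative sketch in the spirit of that repo, not code copied from it): capital-sigma summation, which looks intimidating on paper, is just an accumulator loop.

```javascript
// Sigma notation as code: S = sum of i^2 for i from 1 to n.
// In math this is written with a capital sigma over the terms;
// in JavaScript it's a plain for loop with an accumulator.
function sumOfSquares(n) {
  let total = 0;
  for (let i = 1; i <= n; i++) {
    total += i * i; // each term of the sum
  }
  return total;
}

console.log(sumOfSquares(4)); // 1 + 4 + 9 + 16 = 30
```

Once you've seen the two side by side, the notation stops being a wall and starts being shorthand for something you already know how to write.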

g9yuayon 8 hours ago 2 replies      
Is this a US thing? Why would people still think that math is about numbers? Math is about patterns, which our teachers drilled into us in primary school. I really don't understand how the US education system can fuck up so badly on a fundamental subject like math.
jtreagan 5 hours ago 0 replies      
You say "it's not about numbers, it's about learning how to think," but the truth is it's about both. Without the number skills and the memorization of all those number facts and formulas, a person is handicapped both in learning other subjects and skills and in succeeding and progressing in their work and daily life. The two concepts -- number skills and thinking skills -- go hand in hand. Thinking skills can't grow if the number skills aren't there as a foundation. That's what's wrong with the Common Core and all the other fads that are driving math education these days. They push thinking skills and shove a calculator at you for the number skills -- and you stall, crash and burn.

The article brings out a good point about math anxiety. I have had to deal with it a lot in my years of teaching math. Sometimes my classroom has seemed so full of math anxiety that you could cut it with a butter knife. I read one comment that advocated starting our children out even earlier on learning these skills, but the truth is the root of math anxiety in most people lies in being forced to try to learn it at too early an age. Most children's brains are not cognitively developed enough in the early grades to learn the concepts we are pushing at them, so when a child finds failure at being asked to do something he/she is not capable of doing, anxiety results and eventually becomes habit, a part of their basic self-concept and personality. What we should instead do is delay starting school until age 8 or even 9. Some people don't develop cognitively until 12. Sweden recently raised their mandatory school age to 7 because of what the research has been telling us about this.

WheelsAtLarge 3 hours ago 0 replies      
True, Math is ultimately about how to think but students need to memorize and grasp the basics in addition to making sure that new material is truly understood. That's where things fall apart. We are bombarded with new concepts before we ultimately know how to use what we learned. How many people use imaginary numbers in their daily life? Need I say more?

We don't communicate in math jargon every day, so it's ultimately a losing battle. We learn new concepts, but we lose them since we don't use them. Additionally, a large number of students get lost and frustrated and finally give up, which makes math a poor method for teaching thinking, since only a few students attain the ultimate benefits.

Yes, math is important and needs to be taught, but if we want to use it as a way to learn how to think, there are better methods. Programming is a great one: students can learn it in one semester, use it for life, and keep building on what they already know.

Also, exploring literature and discussing what the author tries to convey is a great way to learn how to think. All those hours in English class trying to interpret what the author meant were more about exploring your mind and your peers' thoughts than about what the author actually meant. The author lost his sphere of influence once the book was published; it's up to the readers of every generation to interpret the work. So literature is a very strong way to teach students how to think.

jeffdavis 8 hours ago 1 reply      
My theory is that math anxiety is really anxiety about a cold assessment.

In other subjects you can rationalize to yourself in various ways: the teacher doesn't like me, or I got unlucky and they only asked the history questions I didn't know.

But with math, no rationalization is possible. There's no hope the teacher will go easy on you, or be happy that you got the gist of the solution.

Failure in math is often (but not always) a sign that education has failed in general. Teachers can be lazy or too nice and give good grades in art or history or reading to any student. But when the standardized math test comes around, there's no hiding from it (teacher or student).

BrandiATMuhkuh 9 hours ago 0 replies      
Disclaimer: I'm CTO of https://www.amy.ac, an online math tutor.

From our experience, most people struggle with math because they forgot or missed a certain math skill they should have learned a year or two before. But most teaching methods just tell the students to practise more of the same. Watching good tutors, we could see that a tutor observes a student and then teaches them the missing skill before actually turning to the problem the student wanted help with. That seems to be a useful, working approach.

taneq 11 hours ago 6 replies      
As my old boss once said, "never confuse mathematics with mere arithmetic."
JoshTriplett 10 hours ago 0 replies      
One of the most critical skills I see differentiating people around me (co-workers and otherwise) who succeed and those who don't is an analytical, pattern-recognizing and pattern-applying mindset. Math itself is quite useful, but I really like the way this particular article highlights the mental blocks and misconceptions that seem to particularly crop up around mathematics; those same blocks and misconceptions tend to get applied to other topics as well, just less overtly.
Nihilartikel 9 hours ago 0 replies      
This is something I've been pondering quite a bit recently. It is my firm belief that mathematical skill and general numeracy are actually a small subset of abstract thought. Am I wrong in thinking that school math is the closest thing to deliberate training in abstract reasoning that one finds in public education?

Abstract reasoning, intuition, and creativity, to me, represent the underpinnings of software engineering, and really of most engineering and science, but they are taught more by osmosis alongside the unintuitive, often boring mechanics of subjects. The difference between a good engineer of any sort and one that 'just knows the formulas' is the ability to fluently manipulate and reason with symbols and effects that don't necessarily have any relation or simple metaphor in the tangible world. And taking it further, creativity and intuition beyond dull calculation are the crucial art behind choosing the right hypothesis to investigate. Essentially, learning to 'see' in this non-spatial space of relations.

When I'm doing systems engineering work, I don't think in terms of X Gb/s throughput and Y FLOPS (until later at least), but in my mind I have a model of the information and data structures clicking and buzzing, like watching the gears of a clock, and I sort of visualize working with this, playing with changes. It wouldn't surprise me if most knowledge workers arrive at similar mental models of their own. But what I have observed is that people who have trouble with mathematics or coding aren't primed at all to 'see' abstractions in their mind's eye. This skill takes years to cultivate, yet its cultivation is left entirely to chance by orthodox STEM education.

I was just thinking that this sort of thing could be approached a lot more deliberately and could yield very broad positive results in STEM teaching.

simias 10 hours ago 1 reply      
I completely agree. I think we start all wrong too: the first memories I have of maths at school are of learning how to compute an addition, a subtraction, and later a multiplication and division. Then we had to memorize the multiplication tables by heart.

That can be useful of course (especially back then when we didn't carry computers in our pockets at all times) but I think it sends some pupils on a bad path with regards to mathematics.

Maths shouldn't be mainly about memorizing tables and "dumbly" applying algorithms without understanding what they mean. That's how you end up with kids who can answer "what's 36 divided by 4" but not "you have 36 candies that you want to split equally with 3 other people, how many candies do you end up with?"

And that goes beyond pure maths too. In physics if you pay attention to the relationship between the various units you probably won't have to memorize many equations, it'll just make sense. You'll also be much more likely to spot errors. "Wait, I want to compute a speed and I'm multiplying amperes and moles, does that really make sense?".
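That unit bookkeeping can even be made mechanical. Here is a toy sketch of the idea (my own hypothetical example, not a real library): represent a quantity as a value plus exponents over base units, and multiplication simply adds the exponents, so "amperes times moles" visibly fails to produce a speed.

```javascript
// Toy dimensional-analysis checker (illustrative sketch only).
// A quantity is a value plus base-unit exponents, e.g. { m: 1, s: -1 } for m/s.
function quantity(value, units) {
  return { value, units };
}

// Multiplying quantities multiplies values and adds unit exponents.
function multiply(a, b) {
  const units = { ...a.units };
  for (const [u, p] of Object.entries(b.units)) {
    units[u] = (units[u] || 0) + p;
    if (units[u] === 0) delete units[u]; // drop cancelled units
  }
  return quantity(a.value * b.value, units);
}

// Two quantities are dimensionally compatible if exponents match exactly.
function sameUnits(a, b) {
  const ka = Object.keys(a.units).sort();
  const kb = Object.keys(b.units).sort();
  return ka.length === kb.length && ka.every(u => a.units[u] === b.units[u]);
}

// 100 m times 1/20 per second is a speed (m/s)...
const speed = multiply(quantity(100, { m: 1 }), quantity(1 / 20, { s: -1 }));
console.log(sameUnits(speed, quantity(1, { m: 1, s: -1 }))); // true

// ...but amperes times moles is not, no matter the numbers involved.
const bogus = multiply(quantity(2, { A: 1 }), quantity(3, { mol: 1 }));
console.log(sameUnits(bogus, quantity(1, { m: 1, s: -1 }))); // false
```

The point isn't the code itself; it's that the "does this even make sense?" check the parent describes is a purely structural one, which is exactly why students can learn it without memorizing every formula.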

jrells 9 hours ago 0 replies      
I often worry that mathematics education is strongly supported on the grounds that it is about "learning how to think", yet the way it is executed rarely prioritizes this goal. What would it look like if math curriculum were redesigned to be super focused on "learning how to think"? Different, for sure.
lordnacho 10 hours ago 4 replies      
I think a major issue with math problems in school is that they're obvious.

By that I don't mean it's easy. But when you're grappling with some problem, whatever it is, eg find some angle or integrate some function, if you don't find the answer, someone will show you, and you'll think "OMG why didn't I think of that?"

And you won't have any excuses for why you didn't think of it. Because math is a bunch of little logical steps. If you'd followed them, you'd have gotten everything right.

Which is a good reason to feel stupid.

But don't worry. There are things that mathematicians, real ones with PhDs, will discover in the future. By taking a number of little logical steps that haven't been taken yet. They could have gone that way towards the next big theorem, but they haven't done it yet for whatever reason (eg there's a LOT of connections to be made).

dahart 10 hours ago 4 replies      
I wonder if a large part of our math problem is our legacy fixation on Greek letters. Would math be more approachable to English speakers if we just used English?

I like to think about math as language, rather than thought or logic or formulas or numbers. The Greek letters are part of that language, and part of why learning math is learning a completely foreign language, even though so many people who say they can't do math practice mathematical concepts without Greek letters. All of the math we do on computers, symbolic and numeric, analytic and approximations, can be done using a Turing machine that starts with only symbols and no built-in concept of a number.
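To illustrate that last claim with a sketch (my own toy example, not from any library): arithmetic can be built from bare symbols, Peano-style, with no native numbers involved until we choose to display one.

```javascript
// Numbers from symbols alone, Peano-style (illustrative sketch).
// "zero" is a bare symbol; every other number is "successor of" some number.
const zero = { succ: null };
const succ = n => ({ succ: n });

// Addition defined purely by structure: a + b is b if a is zero,
// otherwise successor of ((a - 1) + b). No built-in arithmetic used.
function add(a, b) {
  return a.succ === null ? b : succ(add(a.succ, b));
}

// Only for display do we translate symbols back into an ordinary number.
function toInt(n) {
  let count = 0;
  while (n.succ !== null) { count++; n = n.succ; }
  return count;
}

const two = succ(succ(zero));
const three = succ(two);
console.log(toInt(add(two, three))); // 5
```

The machine never "knows" what 2 or 3 is; it only shuffles symbols, which is the sense in which numbers are a convention layered on top of the language.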

ouid 8 hours ago 0 replies      
When people talk about the failure of mathematics education, we often talk about it in terms of the students inability to "think mathematically".

It's impossible to tell whether students are capable of thinking mathematically, however, because I have not met a single (non-mathlete) student who could give me the mathematical definition of... anything. How can we evaluate students' mathematical reasoning ability if they have zero mathematical objects about which to reason?

dbcurtis 8 hours ago 0 replies      
Permit me to make a tangentially related comment of interest to parents reading this thread: This camp for 11-14 y/o kids: http://www.mathpath.org/ is absolutely excellent. My kid loved it so much they attended three years. Great faculty... John Conway, Francis Su, many others. If you have a math-loving kid of middle-school age, I encourage you to check it out.
listentojohan 10 hours ago 0 replies      
The true eye-opener for me was reading Number - The Language of Science by Tobias Dantzig. The philosophy part of math as an abstraction layer for what is observed or deducted was a nice touch.
alexandercrohde 6 hours ago 0 replies      
Enough "I" statements already. It's ironic how many people seem to think their personal experience is somehow relevant on a post about "critical thinking."

The ONLY sane way to answer these questions:

- Does math increase critical thinking?
- Does critical thinking lead to more career earnings/happiness/etc.?
- When does math education increase critical thinking most?
- What kind of math education increases critical thinking?

Is with a large-scale research study that defines an objective way to measure critical thinking and controls for relevant variables.

Meaning you don't get to offer an anecdotal opinion on the matter based on your study-of-1, no-control-group, no-objective-measure personal experience.

k__ 6 hours ago 0 replies      
I always had the feeling I failed to grasp math because I never got good at mid level things.

It took me reeeally long to grasp things like linear algebra and calculus, and I was never any good at them.

It was a struggle to get my CS degree.

Funny thing is, I'm really good at the low level elementary school stuff so most people think I'm good at math...

cosinetau 7 hours ago 0 replies      
As someone with a degree in applied mathematics, I feel the problem with learning mathematics is more often than not a fault of the instructor.

Many instructors approach the subject with a very broad understanding of it, and it's very difficult (more difficult than the math itself) to step back from that understanding and break it into understandable chunks of knowledge or reasoning.

archeantus 8 hours ago 0 replies      
If we want to teach people how to think, I propose that math isn't the best way to do it. I can't tell you how many times I complained about how senseless math was. The real-world application is very limited, for the most part.

Contrast that to if I had learned programming instead. Programming definitely teaches you how to think, but it also has immense value and definite real-world application.

keymone 10 hours ago 1 reply      
I always found munging numbers and memorizing formulas discouraging. I think physics classes teach kids more math than math classes do, and in more interesting ways (or at least have the potential to).
humbleMouse 6 hours ago 0 replies      
On a somewhat related tangent, I think about programming the same way.

I always tell people programming and syntax are easy - it's learning to think in a systems and design mindset that is the hard part.

CoolNickname 7 hours ago 0 replies      
School is not about learning but about learning how to think. The way it is now, it's more about showing off than about anything actually useful. They don't reward effort, they reward talent.
calebm 6 hours ago 0 replies      
I agree, but have a small caveat: math does typically strongly involve numbers, so in a way, it is about numbers, though it's definitely not about just memorizing things or blindly applying formulas.

It just bugs me sometimes when people make hyperbolic statements like that. I remember coworkers saying things like "software consulting isn't about programming". Yes it is! The primary skill involved is programming, even if programming is not the ONLY required skill.

crb002 10 hours ago 2 replies      
Programming needs to be taught alongside Algebra I. Especially in a language like Haskell or Scheme where algebraic refactoring of type signatures looks like normal algebra notation.
gxs 1 hour ago 0 replies      
Late to the party but wanted to share my experience.

I was an Applied Math major at Berkeley. Why?

When I was in 7th grade, I had an old school Russian math teacher. She was tough, not one for niceties, but extremely fair.

One day, being the typical smart ass that I was, I said, why the hell do I need to do this, I have 0 interest in Geometry.

Her answer completely changed my outlook and eventually was the reason why I took extensive math in HS and majored in math in college.

Instead of dismissing me, instead of just telling me to shut up and sit down, she explained things to me very calmly.

She said that doing math, beyond improving your math skills, improves your reasoning ability. It's a workout for your brain and helps develop your logical thinking. Studying it now at a young age will help it become part of your intuition, so that in the future you can reason about complex topics that require more than a moment's thought.

She really reached me on that day, took me a while to realize it. Wish I could have said thank you.

Wherever you are Ms. Zavesova, thank you.

Other benefits: doing hard math really builds up your tolerance for tackling hard problems. Reasoning through long problems, trying and failing, really requires a certain kind of stamina. My major definitely gave me this. I am a product manager now, and while I don't code, I have an extremely easy time working with engineers to get stuff done.

jmml97 9 hours ago 1 reply      
I'm studying math right now and I have that problem. Theorems and propositions are just vomited at us in class instead of making us think. There's not a single subject dedicated to learning the process of thinking in maths. So I think we're learning the wrong (the hard) way.
yellowapple 9 hours ago 0 replies      
I wish school curricula would embrace that "learning how to think" bit.

With the sole exception of Geometry, every single math class I took in middle and high school was an absolutely miserable time of rote memorization and soul-crushing "do this same problem 100 times" busy work. Geometry, meanwhile, taught me about proofs and theorems v. postulates and actually using logical reasoning. Unsurprisingly, Geometry was the one and only math class I ever actually enjoyed.

0xFFC 10 hours ago 0 replies      
Exactly. As an ordinary hacker I was always afraid of math. But after taking mathematical analysis I realized how wonderful math is. These days I am in love with pure mathematics. It literally corrected my brain pipeline in so many ways, and it continues to do so further and further.

I have thought about changing my major to pure mathematics too.

pklausler 7 hours ago 0 replies      
How do you "learn to think" without numbers?


EGreg 8 hours ago 0 replies      
There just needs to be faster feedback than once per test.


yequalsx 8 hours ago 2 replies      
I teach math at a community college. I've tried many times to teach my courses in such a way that understanding the concepts and thinking were the goals. Perhaps I'm jaded by the failures I encountered but students do not want to think. They want to see a set of problem types that need to be mimicked.

In our lowest-level course we teach beginning algebra. Almost everyone has an intuition that 2x + 3x should be 5x. It's very difficult to get them to understand that there is a rule for this that makes sense, and that it is the application of this rule that allows you to conclude that 2x + 3x is 5x. Furthermore, and here is the difficulty, that same rule is why 3x + ax is (3+a)x.
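
Spelled out (my notation, added for concreteness), the rule in question is distributivity, read right to left:

```latex
% Distributivity: a(b + c) = ab + ac, read right to left as bx + cx = (b + c)x.
2x + 3x = (2 + 3)x = 5x
\qquad\text{and, by the same rule,}\qquad
3x + ax = (3 + a)x
```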

I believe that for most people mathematics is just brainwashing via familiarity. Most people end up understanding math by collecting knowledge about problem types, tricks, and becoming situationally aware. Very few people actually discover a problem type on their own. Very few people are willing, or have been trained to be willing, to really contemplate a new problem type or situation.

Math education in its practice has nothing to do with learning how to think. At least in my experience and as I understand what it means to learn how to think.

bitwize 8 hours ago 0 replies      
Only really a problem in the USA. In civilized countries, there's no particular aversion to math or to disciplined thinking in general.
The History of GeoWorks, Microsoft Windows Upstart 90s Competitor (2016) atlasobscura.com
55 points by FollowSteph3  3 hours ago   16 comments top 8
Lerc 53 minutes ago 0 replies      
I was blown away by Geoworks. It made my 25MHz 286 feel like a power machine. I remember printing a page with a giant lower-case e on it to see if it scaled up to a full page nicely. It printed two pages. The second just had a tiny triangle of black on it. Looking at my document, I saw the end of the e had just clipped past the edge of the page.

At the time, WYSIWYG was a bullet-point promise that never delivered. Seeing it actually happen was amazing, and it came in a product that had the feel of "Of course it does it that way, because that's how it should be done."

I often lament that it was never an open-source project. It got passed around companies looking to use it in some niche or other while it slowly decayed. It had enough enthusiasts that as an open project it would have kept developing.

mnm1 2 hours ago 0 replies      
I loved Geoworks. It did things on my 386 that Windows 3.1 had no hope of doing ever (like loading and working). Also, the banner program wasn't just useful with dot-matrix printers. I had an inkjet and remember many birthdays printing out banners for my family (we used Scotch tape to put them together). Good times.
pacaro 2 hours ago 0 replies      
I (briefly) had to write code to run on GeoWorks GEOS. The code was written in a dialect of C with object oriented features that was compiled from .goc and .goh files to .c files in a manner similar to cfront. It was not a pleasant experience.

Edit: This was in 1997. We were writing a web browser for the Brother Geobook which was a device with late 90s PDA capabilities in the form factor of a late 90s laptop. I don't think that it was a particularly successful product.

compsciphd 3 hours ago 2 replies      
I remember running GeoWorks on an 8086 with 1MB of RAM. It ran reasonably well and came with the proto-AOL (for some reason I thought it came with a Quantum Link client that looked like an AOL client, but according to Wikipedia, Quantum Link had already been renamed to AOL by that time).

It was a pretty amazing piece of code that made that 8086 very usable for a few more years.

TazeTSchnitzel 37 minutes ago 1 reply      
> That computer wasn't super-fast (what, with its 40-megabyte hard drive and one megabyte of RAM) and, as a result, it really benefited from the lightweight, object-oriented approach of GeoWorks.

Am I the only one who did a double-take at this? I don't associate OOP with being lightweight. It's either oxymoronic or irrelevant.

rrdharan 40 minutes ago 0 replies      
I remember reading about (or at least seeing cool-looking ads for) GeoWorks, DESQview (and DESQview/X!), and GEM in magazines like Byte in the early 90s. I was always sad that I never got to try any of them on my machine.
0x445442 2 hours ago 2 replies      
By the early 90s PC clones were much cheaper than $2K, and I'd reckon OS/2 was a much bigger competitor to Windows than GEOS.
moonbug22 2 hours ago 0 replies      
Used to swear by my HP OmniGo 100.
How to GraphQL A Fullstack Tutorial for GraphQL howtographql.com
242 points by sorenbs  7 hours ago   63 comments top 18
Cieplak 4 hours ago 5 replies      
Just wanted to give a huge shout-out to PostgREST:



You can get many of the benefits of GraphQL using postgrest's resource embedding:


We're using it in production.

PS: To be clear, you can't expose it directly to your users. We wrap it in a proxy service that provides authentication and authorization, and parses and transforms the users' URL queries destined for PostgREST. We also apply some transformations to the data coming back from PostgREST, such as encoding our internal UUIDs. It may sound complicated, but it's actually only about 200 lines of Erlang.
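
For readers who haven't seen it, PostgREST's resource embedding lets a single request pull in related rows through foreign keys. A rough sketch of the shape of a request and response (the table and column names here are illustrative, not from our system):

```text
GET /films?select=title,year,directors(first_name,last_name)

[
  {
    "title": "Some Film",
    "year": 1995,
    "directors": { "first_name": "Ada", "last_name": "Example" }
  }
]
```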

sayurichick 2 hours ago 1 reply      
I like USING GraphQL (for existing services), like Github's API.

However, 99% of the tutorials on GraphQL, this one included, fail to show a real-life use case. What I mean by that is a working example with a SQL database from start to finish.

So this tutorial was very cool, but not very useful. Just like the rest of them.

I've yet to find a recent tutorial that covers full-stack Node.js + PostgreSQL/MySQL + whatever front end. It's always MongoDB, or it only covers the concepts of GraphQL.

nikolasburk 7 hours ago 0 replies      
Hey everyone

We're super excited to finally launch this resource that we've worked on together with amazing members of the GraphQL community! The goal of How to GraphQL is to provide an entry-point for all developers to get started with GraphQL - no matter what their background is.

The whole site is open-source and completely free to use! If you want to contribute a tutorial because your favorite language is still missing, please get in touch with us!

Here's the official announcement blog post on the Graphcool blog: https://www.graph.cool/blog/2017-07-11-howtographql-xaixed1a...

If you find a bug or another problem, create an issue or submit a PR on the GitHub repo: https://github.com/howtographql/howtographql

Follow us on Twitter to be informed about new content that's added to the site: https://twitter.com/graphcool

slaymaker1907 2 hours ago 1 reply      
I've looked a bit through the GraphQL syntax. While it is an immense improvement over plain REST, I dislike the abuse of JSON as a query language. I think it would be better if it were a true DSL like SQL or some sort of Lisp.

Instead of allPersons, I think it would be cleaner and easier to understand as

(all Person)

Which makes the generic nature of "all" explicit.
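
For concreteness, an allPersons-style query looks roughly like this (field names follow the tutorial's examples; the s-expression form below it is purely hypothetical):

```graphql
# GraphQL's actual syntax: braces and fields, JSON-shaped but not JSON.
{
  allPersons(last: 2) {
    name
    age
  }
}

# The hypothetical Lisp-style equivalent suggested above:
#   ((all Person :last 2) name age)
```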

renke1 3 hours ago 0 replies      
I recently started using GraphQL and I love it.

The best thing about GraphQL is having a standard interface for queries (and more) and all tools that built upon it (such as Apollo). To name a few existing and upcoming (?) features from Apollo: Query batching, real-time updates (WebSockets + subscriptions), caching, optimistic UI, polling, pagination, live queries and many more.

Also, GraphiQL is pretty cool, too, basically Swagger for free.

schrockn 1 hour ago 0 replies      
Great job. Awesome to see all the focus on documentation and conceptual explanations. Clear messaging and spreading understanding are just as important as the tech.
notheguyouthink 4 hours ago 6 replies      
What I've not understood about GraphQL is how to map it to MySQL/Postgres/etc. The info about resolving specific fields seems... complex, and difficult to optimize to, say, reduce SQL calls.

Is there a library, say for Golang, that helps translate a GraphQL query into SQL statements to actually get the data?
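
To illustrate the concern: a naive per-field resolver issues one SQL query per parent row (the classic N+1 pattern), while a batching resolver collects keys and issues a single IN query. A toy sketch in Python with sqlite3 (hypothetical schema, not tied to any particular GraphQL library):

```python
import sqlite3

# Hypothetical schema: users and their posts (illustrative only).
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE users(id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE posts(id INTEGER PRIMARY KEY, user_id INTEGER, title TEXT);
    INSERT INTO users VALUES (1, 'ada'), (2, 'lin');
    INSERT INTO posts VALUES (1, 1, 'p1'), (2, 1, 'p2'), (3, 2, 'p3');
""")

queries = []  # record every SQL statement issued

def run(sql, args=()):
    queries.append(sql)
    return db.execute(sql, args).fetchall()

def resolve_naive():
    """Resolve users { name, posts { title } } field by field: N+1 queries."""
    users = run("SELECT id, name FROM users ORDER BY id")
    return [{"name": name,
             "posts": [t for (t,) in run(
                 "SELECT title FROM posts WHERE user_id = ? ORDER BY id",
                 (uid,))]}
            for uid, name in users]

def resolve_batched():
    """Collect all parent keys first, then fetch children in one IN query."""
    users = run("SELECT id, name FROM users ORDER BY id")
    ids = [uid for uid, _ in users]
    marks = ",".join("?" * len(ids))
    rows = run("SELECT user_id, title FROM posts "
               f"WHERE user_id IN ({marks}) ORDER BY id", ids)
    by_user = {}
    for uid, title in rows:
        by_user.setdefault(uid, []).append(title)
    return [{"name": name, "posts": by_user.get(uid, [])}
            for uid, name in users]

queries.clear(); naive = resolve_naive(); naive_queries = len(queries)        # 3
queries.clear(); batched = resolve_batched(); batched_queries = len(queries)  # 2
```

Dataloader-style libraries automate exactly this batching, and as far as I know there are tools (e.g. join-monster for Node) that go further and compile a GraphQL query into a single SQL join.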

alexchamberlain 4 hours ago 1 reply      
It's fantastic to see a tutorial that separates the frontend from the backend. As someone that _doesn't_ work in web technologies, it can be tough to follow basic tutorials or take them to the next level when you're trying to understand node, npm and webpack, as well as the concept you want to learn: GraphQL, which although hyped, does look like it brings substantial value to the table.
Dirlewanger 4 hours ago 4 replies      
I've only given a cursory look into it, and with regards to building a JSON API in Rails (of which the many extant standards and their Rails libraries are still immature), is this just another fad, or an actual API standard that has staying power? It seems like the latter, but given that it's Facebook-backed, as soon as they get tired of it, they could pull a Google on it.
j_s 3 hours ago 0 replies      
I am curious to hear about any experiences implementing a GraphQL backend in production on .NET (non-Core) + SQL Server. Even React feels left out here in my neck of the woods.
bdibs 4 hours ago 0 replies      
I just read more and played with GraphQL just the other day, and it's really an amazing and powerful tool.

Save requests and bandwidth, no need for explicit version control, and to top it off an easy to use syntax? Sign me up!

I'll definitely be using this in future projects.

shroom 5 hours ago 1 reply      
Awesome resource! I've been playing around with GraphQL on a Node/React+Apollo project and found it very useful and fun to use.

My background in backend is mostly PHP. Any plans on adding a PHP guide to the backend section, or is there no good GraphQL server implementation for PHP?

soflete 5 hours ago 2 replies      
I have recently started learning React and I am following the lessons on www.howtographql.com. I am trying to decide which GraphQL library to use for the practical exercise. Should I go for Relay or Apollo?

Also, I have been working as an Android developer for the last couple of years, and I was wondering how similar are the React and Android implementations of Apollo.

zimme 7 hours ago 0 replies      
Awesome! I'll share it with GraphQL Stockholm.
ai_ia 4 hours ago 0 replies      
Exactly what I was looking for. Thanks team behind this.
chaaau 5 hours ago 2 replies      
Heyhey! I heard a lot about GraphQL before, but I'm not quite sure what it is. Is it similar to Neo4j?
justforFranz 2 hours ago 0 replies      
The GraphQL has a diversity problem. :)
Huvik 4 hours ago 0 replies      
Nice job! Great source for learning GraphQL :)
Introducing Gradient Ventures gradient.google
32 points by framschwartz  3 hours ago   5 comments top 3
iandanforth 1 hour ago 1 reply      
IMO this has significant appeal. Lots of firms can provide capital, but few can provide relevant expertise let alone training.

They seem to be trying to cover all bases with the Google Brain Residency, the Machine Learning Ninja program, standard VC funding, and now this. If you have talent in ML/AI, there is a way Google can help you succeed in the style of your choice. Want to be a founder? Excellent! Want to be a founder but also kinda part of Google? Sure! Are you super talented and experienced in other disciplines and want to explore AI and maybe contribute a 2-5% improvement to one of our models' performance? Yes! We have that!

claytonjy 11 minutes ago 0 replies      
I thought this was odd, from the about page:

> We can help you find and incorporate data sets into your first models. From cleaning data to extracting the most important features, our team can help you get your production models to market.

While realizing the hardest part of a startup is everything but the tech, it seems odd they're telling AI companies they'll help with the hardest parts of the technical side, the ones that need to be done right well before anyone can tell if your tech has any merit.

I'd hate to be a first-pass reviewer for all the pitches they're gonna get. "I have this amazing idea, I just need someone else to build the AI behind it!"

forgotmysn 43 minutes ago 0 replies      
Google (and other large tech companies that have invested in AI internally) are probably the most valuable investors, because they have access to, and can purchase, the largest and best data sets available. It's hard for AI start-ups to do anything without access to the right data sets, and large companies have better access to that data.
Tiny Apps tinyapps.org
15 points by lsh  1 hour ago   1 comment top
lsh 1 hour ago 0 replies      
I was curious about how small (file size) and efficient (memory) a program with a GUI could be, and came across this old gem. Most of them have links to their source.
Echo devices are Amazon Prime Days best sellers techcrunch.com
21 points by janober  2 hours ago   26 comments top 9
AlexB138 1 hour ago 1 reply      
I'm not surprised. There are very few mass appeal deals in Prime Day, and the Echo devices are heavily subsidized. I think it's more a statement of how lackluster Prime Day continues to be than it is a success of Echo.
hkmurakami 1 hour ago 0 replies      
I bought a Camelbak portable coffee mug that's scheduled to be delivered between early and mid August. Not exactly what "Prime" advertises itself to be in most cases, when it comes with a 3-week shipping lead time (I'm in no hurry, so I didn't mind the delay).
kevin_thibedeau 34 minutes ago 0 replies      
Eagerly awaiting the day-after-prime-day so I can make a purchase I was prevented from making yesterday.
Sindrome 1 hour ago 2 replies      
Wish Alexa came out before I spent $500 on a Sonos. Now I don't even use my Sonos . . .
notyourwork 1 hour ago 0 replies      
I wonder how Google Home installs compare to those of Amazon Alexa hardware?
pasbesoin 10 minutes ago 0 replies      
The Motorola G5 4 GB / 64 GB for $180 appealed -- until I saw that it comes with Amazon's ad-ware at the lock screen (and wherever else, I guess...).

Thought about searching to find out how hackable the Amazon version is... Then decided I have better things to do.

kome 58 minutes ago 1 reply      
I am puzzled... are they useful?
unclebucknasty 57 minutes ago 2 replies      
Ah, Echo: awesome "AI" for consuming Amazon services, including buying stuff from Amazon.

Not so great for much else.

The real trick is that they manage to get people to pay them anything for these little trojan horses.

muninn_ 1 hour ago 2 replies      
Of course. Amazon subsidizes them to be extremely cheap and people get excited about a new thing.

Personally, I have no use for one. "Alexa turn off the lights", or just get up or use your phone? That's what I do at least.

Pentagon Tiling Proof Solves Century-Old Math Problem quantamagazine.org
102 points by petethomas  7 hours ago   26 comments top 11
GregBuchholz 3 hours ago 2 replies      
When it came to the description of "einsteins" (a single-tile aperiodic tessellation), I couldn't help but think of the images in the 3rd edition of The Scheme Programming Language:



...Are there holes in those "tilings", or are the tiles not all the same shape, or am I misunderstanding what non-periodic means in this context?

And what is the name for those types of "self-surrounding" tiles on the cover:


v64 40 minutes ago 0 replies      
Here's [1] an article from 2015 describing Casey Mann, Jennifer McLoud, and David Von Derau's discovery of the 15th type of pentagon.

[1] https://www.theguardian.com/science/alexs-adventures-in-numb...

kleer001 10 minutes ago 0 replies      
I wonder if it has a Conway's Life glider like Penrose tiles.
lordnacho 4 hours ago 1 reply      
I love how there's a mix of simple things like why you can't tile things with more than 6 edges and really complex things like what the headline is about.
euyyn 4 hours ago 1 reply      
I didn't know about Marjorie Rice; interesting and uplifting "underdog" story.
pierrebai 4 hours ago 2 replies      
When they talk about the einstein, I assume they mean a shape that can only tile the plane non-periodically. If the tile were allowed to tile the plane both periodically and non-periodically, the solution would be obvious.
bmc7505 5 hours ago 3 replies      
I was recently watching some Computerphile videos and was surprised to learn that several geometric problems have fundamental applications outside the physical sciences, such as geometric sphere packing and error correcting codes. Does tiling research have any known applications in CS or information theory?
Aissen 5 hours ago 0 replies      
Fascinating. And the fact that it's been confirmed by a competing team makes it really believable.
newtem0 4 hours ago 0 replies      
I would really like to see a website dedicated to showcasing beautiful visual manifestations of tiling
droithomme 3 hours ago 0 replies      
Marjorie Jeuck Rice, who was the real key to this breakthrough by finding new tilings that had been claimed impossible, passed away only last week.
microcolonel 6 hours ago 1 reply      
I love that pattern where the pentagons make up hexagons. When I own a house that's going in the kitchen.
3D scanning by dipping into a liquid edu.cn
248 points by jakobegger  11 hours ago   75 comments top 22
zellyn 10 hours ago 4 replies      
I expected they'd be either visually watching the changing contours of an opaque liquid, or somehow using refraction to get multiple visual angles of the same features, but

they're repeatedly dipping it, and using the volume displacement to reconstruct the shape. Amazing. The site is hammered right now so I can't get more details: anyone see how many dips are required to get the highest-detail models they show on the landing page?
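
A toy sketch of the idea for a single orientation (my own numbers, not the paper's): record the displaced volume V(h) at increasing dip depths h; the derivative dV/dh recovers the cross-sectional area at each depth. The paper's dip transform repeats this over many orientations to reconstruct the full shape.

```python
import numpy as np

# Simulate dipping a cylinder of radius 0.5, axis down, to depths 0..1.
heights = np.linspace(0.0, 1.0, 101)
true_area = np.pi * 0.5 ** 2
volumes = true_area * heights          # measured displaced volume V(h)

# Recover the area profile: A(h) = dV/dh (numerical derivative).
areas = np.gradient(volumes, heights)
```

For this linear V(h) the finite differences are exact and the recovered profile is the constant cylinder cross-section; a real object gives a varying profile per orientation.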

azernik 12 minutes ago 0 replies      
Comment on the Hacker News system - most combinations of {edu,ac,gov,mil}.{$ccTLD} should probably be collectively treated as a TLD for site-display purposes. e.g. sdu.edu.cn (Shandong University) would be more descriptive than plan edu.cn (some academic institution in China).
AndrewKemendo 8 hours ago 1 reply      
That is amazing. I think I've looked at every photogrammetry, deconstruction, hand-modeling, etc. technique for 3D reconstruction, and this one takes the cake for ingenuity, quality, and capability.

Not sure how practical it is right now, but I wonder: if you could do this with air volume at a high enough measurement resolution, you might get some amazing results.

randyrand 1 hour ago 0 replies      
I wonder if they are taking water cohesion into account.

e.g., some of the water will stick to the sides of the object.

papercrane 10 hours ago 1 reply      
Found a presentation on this technique on youtube:


anfractuosity 10 hours ago 2 replies      
Wow this is very impressive. I was thinking originally they were using the milk scanning technique - http://www.instructables.com/id/GotMesh-the-Most-Cheap-and-S...
xixixao 3 hours ago 0 replies      
Suggestion for increasing applicability: start with the optical scan, then only use the method to nail the occluded parts. And instead of just gathering data, consider what angle will give you the biggest amount of new information next. Not sure if the authors tried either.
gene-h 7 hours ago 2 replies      
I once scanned myself at a maker fair in a similar manner. A swimming pool of blue dye with a camera mounted above was used, so that objects could be scanned by looking at their outline in the blue dye as they were dipped in (a different approach to the volume transforms presented here). Now, doing this with a person involved strapping that person to a board and slowly dunking them in. Overall, the experience was unpleasant and what I imagine waterboarding is like, but hey, at least I got a 3D scan of myself.
proee 6 hours ago 0 replies      
My friend created an advanced fluid scanner using a dodecahedron. His method is novel in that it:

1. Does not require rotation of the DUT, but instead uses just rising fluid level.

2. Uses permeable fluid so it achieves full density scans.

He spent a number of years trying to get the product to market as a startup, but ran out of personal funding.

He believes Archimedes may have used the Roman dodecahedron as a fluid scanner to test the quality of their projectiles to improve accuracy.

See http://www.romansystemsengineering.com/our_product.html

beagle3 6 hours ago 1 reply      
Isn't this "dip transform" basically the (inverse) Radon transform[0] used in CT and MRI?

[0] https://en.wikipedia.org/wiki/Radon_transform

simon_acca 8 hours ago 2 replies      
Hey, if you add a force sensor to the dipping arm, couldn't you, in principle, obtain a 3D density map of the scanned object as well, using Archimedes' principle?
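
In principle, yes, at least for bulk density: with the submerged volume known from the dip and the support force measured at the arm, Archimedes' principle gives the object's mass and hence its density. A back-of-the-envelope sketch (SI units, my own illustrative numbers):

```python
# Fully submerged object held by the arm:
#   F_support = m*g - rho_fluid * V_submerged * g   (Archimedes)
#   => m = F_support / g + rho_fluid * V_submerged
def bulk_density(support_force_n, submerged_volume_m3,
                 rho_fluid=1000.0, g=9.81):
    mass_kg = support_force_n / g + rho_fluid * submerged_volume_m3
    return mass_kg / submerged_volume_m3

# A 1-liter object needing 16.677 N of support in water:
rho = bulk_density(16.677, 0.001)   # ~2700 kg/m^3, aluminium-ish
```

Getting a spatial density map rather than a single number would need the per-slice force increments across many dips and orientations, much like the volume reconstruction itself.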
dingo_bat 8 hours ago 2 replies      
Awesome technique! But I cannot imagine this ever being a fast process. 1000 dips for a small model, and you cannot dip it with force.
hemmer 7 hours ago 1 reply      
I wonder how much of a role wetting/capillary effects play in this? The liquid interface will distort as it approaches the object, and will try to meet it at a certain contact angle (based on surface tensions, etc.). Correcting for this might help improve the resolution of the scans?
skykooler 10 hours ago 0 replies      
This is a really clever solution!
anotheryou 10 hours ago 2 replies      
Hugged to death :/

How do they handle overhangs that trap bubbles?

Maybe shaking and scanning in reverse? (It can still cause weird effects when the air can't get back in, but those should be more detectable.)

antman 8 hours ago 1 reply      
That seems equivalent to trying to get a joint distribution from its marginal distributions. So the constraint is probably that either the object needs to be convex, or you need a prior estimate of the object's cavities, which means you need to know the 3D shape beforehand to have a mathematically guaranteed measurement.
dre85 8 hours ago 0 replies      
Is the software open sourced? This looks like it would translate into a fun hack-a-day project. At first sight, the required hardware seems pretty basic. It would be awesome if someone replicated it with a Raspberry Pi or something and posted a step-by-step tutorial.
rsp1984 7 hours ago 1 reply      
Note that this only works for objects that don't have any flexible parts and don't interact with the water in any other way than pushing it aside (e.g., soaking it up).
nom 7 hours ago 0 replies      
How does one come up with something like this? The method is anything but straightforward, and not practical at all, but it still produces good results. Amazing work.
deepnet 10 hours ago 1 reply      
samstave 7 hours ago 1 reply      
How does it determine shape when measuring volume displacement? Is it only measuring the displacement of the top surface-tension layer of water, as if it were a slice?
SAFEs are not bad for entrepreneurs ycombinator.com
120 points by janober  7 hours ago   73 comments top 16
anotherfounder 6 hours ago 3 replies      
So, for founders raising, let's say, a seed round (with a Series A 18 months down the line), is the recommendation still to raise on a SAFE (with some cap and/or discount), and then price it at the A?

It would be useful for the founder community (especially outside the YC network) to have examples of how different recent startups have done it - offered a discount or cap or both, how they determined the cap, the experience at the A, experience with SAFEs when dealing with angels/micro VCs, etc.

Any founder willing to share that here?

oyeanuj 6 hours ago 4 replies      
> and the industry standard is that companies pay for BOTH their own legal counsel and the investors' legal fees.

Serious question - how is this still the case, and how does it make any sense? Wouldn't it be in the investor's interest that the company not spend $60K of their raise on this, and spend it on hires, product, etc. instead?

And given how standard a process this must be for every VC firm, I imagine they would have a well-negotiated rate, which for them is an incremental cost of investing?

I'd like to believe that there are firms out there that don't do this, and that this turns out to be some sort of advantage for them (a form of founder-friendly/company-friendly, if you will).

djrogers 6 hours ago 2 replies      
For everyone who had no idea what a SAFE is (beyond the big metal thing you put valuables in), I eventually found this with some digging:


ryandamm 6 hours ago 0 replies      
The article this is responding to is extremely misleading and is pure clickbait.

SAFEs are great. They're easy to understand, and any semi-quantitative founder should be able to build dilution spreadsheets without the help of a lawyer.
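
The arithmetic is simple enough to sketch. A toy example of how a cap-and-discount SAFE converts in a priced round (illustrative numbers and a simplified capitalization; real documents define the capitalization more carefully and vary by SAFE flavor):

```python
def safe_shares(investment, cap, discount, pre_money, pre_round_shares):
    """Shares a SAFE holder receives when the priced round closes."""
    round_price = pre_money / pre_round_shares      # Series A price/share
    cap_price = cap / pre_round_shares              # price implied by the cap
    discount_price = round_price * (1 - discount)
    # The SAFE converts at whichever price is better for the investor.
    return investment / min(cap_price, discount_price)

# $500k SAFE, $5M cap, 20% discount, into a $10M pre-money round:
shares = safe_shares(500_000, 5_000_000, 0.20, 10_000_000, 10_000_000)
# The cap wins ($0.50/share vs. $0.80 discounted): 1,000,000 shares,
# i.e. twice what the same money buys at the round price.
```

Running the same function with a higher cap (say $20M) makes the discount the binding term instead, which is exactly the kind of scenario a dilution spreadsheet lets you explore.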

pmarreck 46 minutes ago 0 replies      
Is there a good resource (short of getting a finance degree) that would permit me to fully understand this instrument, the problems it solves, and its caveats? (heck, even a youtube video)
yesimahuman 4 hours ago 1 reply      
We did a SAFE in our first VC round (and then converted it in a priced round later), and it was an easy, fast way to close. I had to convince our investor to do one because it was their first, but they also had a very positive reaction to it. Deals die because of time, and we were able to get the deal done and get back to work. I would definitely do one again, and I think the downsides are overblown.
danieltillett 3 hours ago 1 reply      
For those of you who are running Aussie startups we don't have any equivalent of the YC SAFE. We are stuck with pricing rounds even at the seed stage which wastes a huge amount of everyone's time.

I actually tried to get some lawyers here in Australia to convert over the YC SAFE agreements to Australian law and I could not find one. Apparently the big blocking point is none of the law firms wanted to take responsibility for the legal liability. This is one area where our "innovation" government could get involved to sort out.

lpolovets 6 hours ago 2 replies      
This is a good post, but the one thing that stuck out at me is the $60k Series A figure. I have no doubt it's true, but comparing priced Series A legal costs to seed stage SAFE legal costs is apples to oranges. No one is debating using SAFEs for Series A's, as those are already priced rounds ~100% of the time. For a priced seed round, I've heard legal costs can vary from $5k to $20k or so. That's not insignificant, but it's way lower than $60k.
jeremyt 3 hours ago 0 replies      
SAFEs are not bad for entrepreneurs, they're bad for investors.

I won't make one anymore.

I've done two deals that involved a SAFE, and it's been almost 2 years and the companies are still looking to raise a round. If they do, I'm looking at a 10-20% return.

It's not worth it for the risk.

Indeed, at least convertible notes are debt and give you some standing if there's a liquidation. A SAFE doesn't even give you that.

middleout 2 hours ago 0 replies      
I'm not a fan of YC (I find it elitist, and imo they crowd out the 99.5% of startups that get rejected), but I can't find fault with the SAFE as an instrument.

To the extent that founders suffer too much dilution while raising via a SAFE, it's more likely the case that there was something not working with the business.

Sure, if the founders could have raised a priced equity round from the get-go, they probably should have done that over a SAFE, but more likely the legal expenses would have been too onerous for that to have been an option...

nodesocket 2 hours ago 1 reply      
If you go with SAFEs for an early angel round (< $1M), how do you distribute equity for the founders, early (first 3-5) employees, advisors?
elmar 5 hours ago 0 replies      
The SAFE by design contains full-ratchet anti-dilution protection for the investor. If you have raised a lot of money, normally by stacking SAFEs on different dates, and then the next converting round is priced at a low valuation, you can find yourself completely diluted.
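
The stacking effect elmar describes can be sketched numerically. This is a hypothetical Python illustration (the amounts, caps, and valuation below are made up), showing how SAFEs that each convert at the lower of their cap and the round valuation pile up in a down round:

```python
# Hypothetical sketch of stacked SAFEs converting in a low-valuation
# priced round. Each SAFE converts at the lower of its cap and the
# round valuation (the full-ratchet-like behaviour described above),
# so a down round makes every SAFE convert at the low round price.

def safe_ratio(amount: float, cap: float, pre_money: float) -> float:
    """Share ratio (relative to founders) a SAFE buys at conversion."""
    return amount / min(cap, pre_money)

# Three SAFEs raised on different dates: (amount, cap)
safes = [(500_000, 4_000_000), (750_000, 6_000_000), (1_000_000, 8_000_000)]
pre_money = 3_000_000  # converting round priced below every cap

ratios = sum(safe_ratio(amount, cap, pre_money) for amount, cap in safes)
founders_after_conversion = 1 / (1 + ratios)
print(round(founders_after_conversion, 3))  # 0.571
```

Under these made-up numbers the founders are down to about 57% from the SAFE conversions alone, before the priced round's own dilution, which is the trap being pointed at.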
thefahim 7 hours ago 4 replies      
The cost of priced rounds is often brought up as a reason to choose a SAFE. Why are priced rounds so expensive in the first place?
HelgeSeetzen 4 hours ago 2 replies      
As somebody who has been on both sides of the investment table, I can confirm that very few founders understand the complexities of convertible notes (SAFE or otherwise). But I think the authors of both the pro and con argument are covering only one of the points. Yes, first time founders often don't intuitively understand the impact of convertible notes on their cap table. But that's not that hard to model. Much harder to understand are the secondary impacts of convertible notes. I have raised, led and participated in dozens of rounds and, frankly, still get caught out by those.

In general, the problem is that most benefits that investors enjoy are properties of their shares rather than the money that they invested. For an equity round this is one and the same. Not so much for convertible notes. A simple example:

An entrepreneur raised a $1M convertible note with a $5M cap. Ignore discount, interest and other factors for now. She then raises a $5M round at a valuation of $20M. That yields a dilution of 20% for the round plus a "hidden" dilution of ~17% for the note conversion (1/6). That's the blurry issue that both authors discuss. But if anything the share rights are even blurrier. Let's say that the equity round came with what is commonly referred to as a 1x liquidation preference (non-participating). So they would get $5M back before other shareholders get anything. Even though I just worded that as matching the money that they put in, it is generally a property of the share class that the investors hold. For example, their $5M might have bought 5M shares at $1/share, each of which says "redeemable for $1 or convertible to common shares". Our note investors also hold those shares now. But instead of holding one per dollar, they now hold four per dollar (since they pay 1/4 the price for such a share). Suddenly, they effectively have a 4x liquidation preference benefit and the company has to return a full $9M before common shareholders/founders see a penny of payout (despite only having $6M in the bank).
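
The arithmetic in the example above can be checked with a short sketch. This is a hypothetical Python illustration of exactly those numbers (discount, interest, and ESOP effects ignored, as in the example):

```python
# Checking the numbers in the example above: a $1M note with a $5M cap
# converting into a $5M equity round at a $20M pre-money valuation.

def note_ownership(note: float, cap: float) -> float:
    """Fraction of the pre-round company the note converts into.

    $1M at a $5M cap buys note/cap = 1/5 as many shares as the
    founders hold, i.e. (1/5) / (1 + 1/5) = 1/6 of the company.
    """
    ratio = note / cap
    return ratio / (1 + ratio)

note, cap = 1_000_000, 5_000_000
round_size, pre_money = 5_000_000, 20_000_000

hidden_dilution = note_ownership(note, cap)              # ~16.7%, the "1/6"
round_dilution = round_size / (round_size + pre_money)   # 20%

# The note holders bought the preferred shares at cap/pre_money = 1/4 of
# the round price, so each note dollar carries pre_money/cap dollars of
# the share class's $1-per-share liquidation preference.
pref_multiple = pre_money // cap                         # 4x
total_preference = round_size + note * pref_multiple     # $9M owed before common
cash_in_bank = round_size + note                         # only $6M raised

print(round(hidden_dilution, 3), round(round_dilution, 2),
      pref_multiple, total_preference, cash_in_bank)
# 0.167 0.2 4 9000000 6000000
```

The sketch reproduces all three figures from the paragraph above: the ~17% hidden dilution, the effective 4x preference, and the $9M owed against $6M in the bank.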

Interest rates, pre-round ESOP increases, and many other factors in convertible notes make this problem worse. And it affects just about all aspects of the cap table including voting rights, protective provisions, redemption rights, etc. Basically, the bigger the gap between the cap and the eventual round, the bigger the privilege the note investors pick up. Not just in economic benefit where you would expect it, but also in power/insurance/protections/etc. where it isn't obvious at all. Nowhere in your term sheet for the note or equity round will it mention a 4x liquidation preference. Doing so would cause instant rejection of the deal by even the most inexperienced founder! But that's exactly what is going to happen once all the conversion mechanics are executed. And that can catch even seasoned entrepreneurs off guard (and seasoned investors, including plenty of note holders who never understood that they would get these benefits).

Convertible notes - SAFE or otherwise - have a role to play in venture financing. But they are complex instruments and should be used carefully. Anything else is just a recipe for pain in the long run.

laser 6 hours ago 1 reply      
Are these supposed founders that haven't done the basic dilution math on the funds they're raising fit to be raising funds at all?
bmh_ca 7 hours ago 4 replies      
> The safe and its predecessor, the convertible note, have almost identical conversion features

A SAFE _is_ a convertible note, with standardized language.

Saying Sorry When Things Go Wrong [pdf] nhsla.com
60 points by DanBC  5 hours ago   25 comments top 14
verelo 1 hour ago 0 replies      
Literally 15 minutes ago I went to the neighbor behind my house [who complained in writing to me today] and apologized for letting construction workers work past 7pm in my backyard. I knew working past 7 was wrong, but no one had said anything and the project is behind schedule...so I let it slide. Today I had to go back and acknowledge that I was wrong to let that happen and that it won't happen again.

I first wrote a note, but opted to deliver the message in person as they were home [also handed them the note at the end of it as it had my phone number and email on it should they need me]. At the end of the day, I think I've turned what was potentially the beginning of a bad situation into something that brought us closer together and more likely to communicate effectively in the future.

Edit: One thing some people are saying further down is regarding admissions of guilt and I forgot to touch on that, but it's clearly what I did here...

I think we live in a society that is often scared that doing the right thing [often in the form of apologizing] will get us in trouble. Plenty of times this holds true, but I think if we all did it more often it might be for the greater good, plus sometimes getting in trouble teaches us a valuable lesson. Doing the right thing should be your priority, but when you mess up I feel it's very important to correct it or you will often suffer small but longer-lasting side-effects [stress, bad relationships, etc].

ineedasername 3 hours ago 1 reply      
I know the context of the PDF is a little different, but towards making the sentiment more broadly applicable, I'd suggest that an ability to own up to a mistake of any sort is a very useful professional "skill". Lacking that skill, it can be much more difficult to correct a problem before it becomes an issue (You hesitate, because a change implies the pre-change situation was "wrong") Once there's an issue, it also makes the cleanup much harder, more awkward, etc.

If you practice being comfortable saying something like, "I'm sorry, I overlooked X. I'm fixing it now, and will do my best to make sure it doesn't happen again," then your professional and personal life may be just a tad smoother. All of this implies, and requires, a mindset that is self-aware enough to evaluate itself. That can be the difficult part: it requires regular practice and resisting the instinct to go on the defensive and justify something, even if it was "right" at the time, if circumstances now show otherwise.

jessriedel 4 hours ago 0 replies      
1. It says that an apology is not of itself an admission of guilt. But it doesn't say that an apology can't be used as one piece of evidence of negligence/wrong-doing.

2. This document conflates negligent/avoidable mistakes with situations where "there has been an unintended or unexpected event" and "includes recognised complications referred to in the consent process". In the latter cases, "I'm sorry" is an expression of sympathy, not a true apology.

3. These sort of apologies are required by statute. How meaningful can they be?

smcg 4 hours ago 1 reply      
Note that the above is a UK publication and the laws around apologies are different in other countries. This document I have here (1996, pdf) is a rather good journal article on the subject for US legalities. http://scholarship.law.missouri.edu/cgi/viewcontent.cgi?arti...

tl;dr apologies are usually a good thing in your personal life and can prevent litigation in legal matters and are rarely admissible as proof of guilt. I am not a lawyer.

roceasta 43 minutes ago 0 replies      
The corollary of this is: don't use people's apologies against them. If they are sincere then accept. Don't exploit for political or financial gain or to make it look like you 'won'.
brandon272 3 hours ago 0 replies      
As a Canadian I already know all of this.

Seriously though, I find myself apologizing profusely all of the time in ordinary conversation. Ordinarily I would think that it's a habit I need to curb, but the sorries are all genuine!

imhelpingu 1 hour ago 0 replies      
If I realize someone I know is one of those "never say you're sorry" people, I lose a little bit of respect for them.
quirkot 4 hours ago 0 replies      
Knowing how to apologize well, being successfully heartfelt, is a perennially too uncommon skill
magic_beans 4 hours ago 1 reply      
This is really not relevant to anyone but healthcare professionals in the UK...
logicallee 3 hours ago 0 replies      
(This comment is totally serious.)

There is a chart with "Do say" and "Don't say" listing appropriate and inappropriate phrasing examples.

I will ask an extreme question about it then make it a bit less reductionist.

The chart doesn't say whether "I'm sorry I caused your son's death" falls under "Do say" or "Don't say".

The thing is, although the above line exaggerates, if you are being transparent then there are a LOT of statements that reduce to "I (we) caused your son's death", but which are much more technical, i.e. regarding what was done.

In this case it is unclear whether these are to be avoided or can be mentioned? It says "These steps include informing people about the incident" but it is not totally clear whether they mean it.

Malpractice is the third-leading cause of death in the United States[1], so my question isn't an idle one.

The PDF could be far more specific here.

[1] https://www.google.com/search?q=malpractice+as+cause+of+deat...

valbaca 5 hours ago 1 reply      
I'm sorry, but what's the point of this?
karmellad 4 hours ago 0 replies      
Doctors don't learn to say sorry in the us?
Kenji 3 hours ago 2 replies      
Why be sorry? It's an admission of guilt and weakness. Take a page out of Donald Trump's book - never be sorry for anything. You do what you do and the others can deal with it. That gets you the presidency in today's world, it seems. I don't even want to get political here but the US elections have shown me how you get ahead in life.
brian_herman 5 hours ago 0 replies      
Toward a Reasonably Secure Laptop qubes-os.org
274 points by doener  13 hours ago   82 comments top 11
HugoDaniel 10 hours ago 2 replies      
"Finally, we are going to require that Qubes-certified hardware does not have any built-in USB-connected microphones (e.g. as part of a USB-connected built-in camera) that cannot be easily physically disabled by the user, e.g. via a convenient mechanical switch. However, it should be noted that the majority of laptops on the market that we have seen satisfy this condition out of the box, because their built-in microphones are typically connected to the internal audio device, which itself is a PCIe type of device. This is important, because such PCIe audio devices are by default assigned to Qubes (trusted) dom0 and exposed through our carefully designed protocol only to select AppVMs when the user explicitly chooses to do so."

This made me download Qubes. Amazing project that seems to care.

x86insecure 11 hours ago 4 replies      
There are things we can do to help get us out of this Intel ME rut.

* Let AMD know that open-sourcing/disabling PSP is important to you [1].

* Contribute to RISC-V. You can buy a RISC-V SoC today [2]. Does your favorite compiler have a RISC-V backend?

[1] https://www.reddit.com/r/linux/comments/5xvn4i/update_corebo...

[2] https://www.sifive.com/products/hifive1/

cyphar 12 hours ago 0 replies      
> Another important requirement we're introducing today is that Qubes-certified hardware should run only open-source boot firmware (aka the BIOS), such as coreboot.

I recently flashed coreboot on my X220 (and it worked surprisingly enough). However, I couldn't find any solid guides on how to set up TianoCore (UEFI) as a payload -- does Qubes require Trusted Boot to be supported on their platforms (I would hope so)? And if so, is there any documentation on how to set up TianoCore as a payload (the documentation is _sparse_ at best, with weird references to VBOOT2 and U-Boot)?

Otherwise I'm not sure how a vendor could fulfill both sets of requirements.

d33 12 hours ago 10 replies      
If I read that right, they're allowing Intel ME, which sounds like a sad compromise to me. Given that it's a pretty big complex black box that one can't easily disable, would you agree that x86 is doomed when it comes to security? If that's the case, is there any hope we could have a CPU with competitive capabilities? (For example, is there an i7 alternative for ARM?)

What could one do to make it possible to have ME-less x86 in the future?

Taek 11 hours ago 3 replies      
Is this something we could achieve with a corporate alliance? I know a lot of tech companies would like to give their employees secure laptops. I also know that there are large costs associated with making hardware, especially if you are talking about dropping ME.

A dozen companies with 1000 employees each and a budget of $2,500 per employee gets you $30 million, which is surely enough to get a decent, qubes-secure laptop with no ME. You aren't going to be designing your own chips at that point, but you could grab power8 or sparc or arm.

Are there companies that would reasonably be willing to throw in a few million to fund a secure laptop? I imagine at least a few. And maybe we could get a Google or someone to put in $10m plus.

ashleysmithgpu 12 hours ago 5 replies      
Looks like Qubes makes you pay to get certified: https://puri.sm/posts/ "The costs involved, requiring a supplementary technical consulting contract with Qubes/ITL (as per their new Commercial Hardware Goals proposal document), are not financially justifiable for us."
Aissen 10 hours ago 1 reply      
> The vendor will also have to be willing to freeze the configuration of the laptop for at least one year.

This is one of the most important points. The speed at which laptop vendors are releasing new SKUs is staggering. I know the whole supply chain is to blame, but apart from a few models, the number of different SKUs is way too high.

digi_owl 7 hours ago 1 reply      
Once more i get the impression that computer security people are off in a different universe where a computer at the bottom of the ocean is a "reasonable" way to do computing.
notacissp 7 hours ago 0 replies      
This article helped me get up and running with Qubes:


listic 5 hours ago 0 replies      
Looks like even Purism is not interested in certifying compatibility with Qubes anymore. That's sad.
awinter-py 8 hours ago 0 replies      
It's a shame that chromebook's boot verification isn't easily extensible to open source.
The farthest star syfy.com
23 points by yitchelle  4 hours ago   5 comments top 3
kirykl 1 minute ago 0 replies      
Amazing to me is the causal influence this star has on the Earth from 9 billion light years away. Physically the influence is zero, but the meaning humans are able to give it makes it huge.
chiefofgxbxl 39 minutes ago 0 replies      
An amazing discovery. If we can see things that far away, it's a wonder what we'll discover when we keep improving telescopes, introducing new telescopes (e.g. Webb space telescope), and even build devices on other celestial bodies! I'm looking forward to a permanent telescope on the far side of the moon where Earthly signals are no distraction.

As for the article, I was distracted by the poor writing style (emphasis mine). It's frustrating to hear people say "like" and "literally" all the time in speech, but far worse to see it in written pieces:

- "...point source (literally, a dot of light)"

- "...galaxy, a star, you, me bends space, literally warps it"

- "...too faint to see. Like, hundreds of times too faint"

pmoriarty 1 hour ago 2 replies      
Here's a nice size comparison of objects in the universe.[1] According to that video (and a Wikipedia article on it[2]), the observable universe is 93 billion light years in diameter.

Yet the "farthest star" article says the farthest star is only 9 billion light years away, which is ten times closer than the diameter of the observable universe.

[1] - https://www.youtube.com/watch?v=4S69zZwYrx0

[2] - https://en.wikipedia.org/wiki/Observable_universe

Paying Professors: Inside Google's Academic Influence Campaign wsj.com
90 points by NN88  9 hours ago   45 comments top 9
gnicholas 5 hours ago 5 replies      
> The money didn't influence his work, Mr. Heald said, and Google issued no conditions: They said, "If you take this $20,000 and open up a doughnut shop with it, we'll never give you any more money, but that's fine."

At a glance, this seems like the funds are no-strings-attached. But when you think for a minute, you realize it's the exact opposite.

Google is saying that if they don't like what you do with the money, they won't give you any more but if they do like what you do with it then you might get more. This incentivizes the professor to use the money to do things that Google would like, which is the opposite of no-strings-attached.

There technically are no strings attached to this money, but the possibility of future payments (which ranged from $5k to $400k) is a pretty big enticement.

amoorthy 1 hour ago 2 replies      
Hi folks - below is an article I read recently which opened my eyes to the risks of corporate funded research. Companies have long funded research to back their interests that can have serious ramifications on public safety and use of public resources.

Long read but enjoyable and informative.

[1]: The Most Important Scientist You've Never Heard Of: http://mentalfloss.com/article/94569/clair-patterson-scienti...

ucaetano 1 hour ago 2 replies      
Wait, a profit-driven company is spending money supporting research into areas related to the company's interests?

Why is this even news? Is there a single for-profit company that funds research contrary to the company's interests?

surveilmebro 5 hours ago 1 reply      
To be fair, similar tactics are standard practice in many non-tech fields: pharmaceuticals, law, and agronomy to name a few. What's perhaps different here is that researchers may not be accustomed to disclosing financial support that is only weakly connected to the research in question.
mankash666 2 hours ago 1 reply      
Ridiculous! This is how ALL academic funding works. The headline might as well read "NSF/NIH is paying professors for propagating 'views'". Given that Murdoch owns WSJ, fundamental science like evolution and global warming morphs into "views", not facts/axioms.
frgtpsswrdlame 3 hours ago 0 replies      
A company using the skyhigh profits it makes from its market dominance to fund academic research arguing that it doesn't abuse its market dominance? Perfect.

If you're looking for arguments for antitrust in this area beyond consumer welfare you've found them. The concentrated wealth produced by big monopolistic firms has a gravity field of its own, distorting public information and opinion.

nl 1 hour ago 0 replies      
Several papers argued that Google's search engine should be allowed to link to books and other intellectual property that authors and publishers say should be paid for - a group that includes News Corp, which owns the Journal. News Corp formally complained to European regulators about Google's handling of news articles in search results.


And all those graphs showing how big Google is have nothing to do with the story. News Corp wants an anti-trust investigation into Google in the US too.

NN88 9 hours ago 0 replies      
ocdtrekkie 8 hours ago 0 replies      
Sadly, this has been a known fact for a long time, and it never gets a lot of attention. Joshua Wright, the former FTC Commissioner, was one of the professors previously paid to write 'academic studies' for Google.
Show HN: Hosted JavaScript dependency tree graphs for GitHub READMEs github.com
20 points by fiatjaf  3 hours ago   2 comments top
me_bx 1 hour ago 1 reply      
Nice one.

Glitch (the hosting service) is returning 504 errors, but it's easy to run it locally:

  cd /tmp
  git clone https://github.com/fiatjaf/node-dependencies-view.git
  cd node-dependencies-view/
  npm install
  npm start
Then in the browser


* In repos containing many modules, the SVG is really too wide, even when decreasing the ratio to 0.1. A more space-efficient layout could possibly be found. Vertical instead of horizontal, perhaps?

* Fails to render anything whenever a module is not found (e.g. `require('./params')`). Proper fallback may be implemented.

Transit Detection of a Starshade at the Inner Lagrange Point of an Exoplanet arxiv.org
89 points by sanxiyn  10 hours ago   48 comments top 8
ChuckMcM 8 hours ago 5 replies      
Here I was hoping that Kepler had found a star shade.

Interesting point that at some point your observing apparatus gets good enough that you can 'see' the structures built by sufficiently advanced civilizations (sure they cloak their ships in orbit but you can see how they make their home world comfy!)

At one of the SETI seminars there was a discussion about when would be the "right" time to alert a newly discovered intelligent species that they aren't alone in the universe. There was a lot of back and forth about indigenous tribes in the Amazon, some of whom learned of other tribes by the arrival of missionaries, some by loggers, and some who were out walkabout and came upon the strangers. How you meet outsiders has a different impact on how it affects you.

So if you were aliens and you didn't want to 'alarm' or 'damage' humans, what would you use as a signal that it was probably a good time to say "Hello"? I've always felt that once you could detect they were having conversations on other planets you would now "know" we weren't alone and someone could appear in orbit and say Hi. Others felt it would only be safe if humans felt reasonably confident in their own ability to meet them at their level (so perhaps at least colonies on other solar system bodies). One person at our table was firmly in the "only when it is unavoidable" camp, which is to say they are about to send a probe to an inhabited planet or come across a construction like a station that is not easily concealed or moved.

yodon 9 hours ago 1 reply      
The article calculates that solar-radiation reducing shields for earth-sized exoplanets (like some propose for mitigating climate change) will be detectable by the next generation of astronomical telescopes.
ansible 5 hours ago 0 replies      
This is some interesting speculation, and I applaud the researchers who think of things like this to look for.

I still estimate that by the time a civilization has planetary-scale engineering capability, they won't need to make things like starshades.

If you have molecular nanotechnology, you can either adapt yourselves to whatever location you find, or just skip the biological body business, and directly upload your consciousness to a computer network.

The 2nd option is far more mass and energy efficient to support large numbers of sophonts, and I expect that any civilization to endure long enough will have the majority of its population living online instead of offline. If that even ends up being a thing, and they all don't just merge into a single entity (going in the direction of Star Trek's borg).

btilly 8 hours ago 3 replies      
You know, this sounds like a great idea.

Maybe we should be building ourselves one of these...

a_gopher 8 hours ago 3 replies      
The paper doesn't address what measures the aliens might use to disguise any obvious signature of the starshade.

You'd think that the aliens might not be too keen to create a huge beacon advertising their presence to all and sundry...

mmjaa 8 hours ago 0 replies      
Looking for umbrellas in space. What will we think of next?
graycat 4 hours ago 1 reply      
So why has ET not used something like a Dyson sphere or a starshade to send us light signals as a form of "Hello"?

Maybe because by the time a planet is able to send such signals, they have discovered a better means: faster-than-light communications.

So, the "Hello" communications have been coming to us for maybe millions of years, but like our planet before we understood radio, we can't detect their communications.

E6300 5 hours ago 0 replies      
I know I'm gonna get downvoted for saying this, but this is kinda dumb. Might as well look for Dyson spheres by looking for gravitational lensing around dark spots in the sky.
.NET Core Support in AWS CodeStar and AWS Codebuild amazon.com
64 points by janober  9 hours ago   11 comments top 2
LyalinDotCom 5 hours ago 4 replies      
Just going to mention it here since many folks don't seem to know this, but we here at Microsoft also have a cloud build (CI/CD), Git source control, Kanban (lots of agile tools), etc., and it's all free for up to 5 users.

The build system is also geared for "any developer, any platform" with support for Xcode, Android, iOS, Java, .NET, and other types of applications using our Windows or multi-platform agent.

We also do unlimited private repos for those 5 users, which I know is super important to people.

Details on VSTS: https://www.visualstudio.com/team-services/

hsod 3 hours ago 1 reply      
I tried out CodeStar a month or so ago and it was decidedly not ready for primetime. In particular, I recall it being absurdly difficult to debug/troubleshoot failing builds.
A Huge Diamond Mine That Helped Build The Soviet Union (2014) gizmodo.com
99 points by mitul_45  11 hours ago   46 comments top 7
jpatokal 17 minutes ago 0 replies      
This is one of many insane places on Koryo's "Abandoned Russia" tours, which have long been on my bucket list:


From $7,300/person, but you'll need to wait until next year since this year's kicked off today.

ortusdux 8 hours ago 1 reply      
This mine is the first thing I zoom in on when checking out a new procedurally generated 3D world map service. It is easy to spot because of the nearby Vilyuy Reservoir built to power the mine. Everyone's algorithm can handle mountains, but I've yet to see one that correctly renders this crater. Even Google Earth shows the mine as flat.
willvarfar 9 hours ago 4 replies      
> These diamonds were all of a uniform size and shape and were dubbed 'Silver Bears'. While DeBeers could not understand how the Soviets were producing such a large quantity of gem diamonds of such uniform size, and supposedly from one mine that by DeBeers surveys should not be capable of such diamond production, they were, nevertheless, pressured to purchase them all lest the Soviets simply dump the diamonds on the open market, thus flooding it and bringing down diamond prices.

What's the speculation as to their surprising abundance and uniformity?

supahfly_remix 9 hours ago 1 reply      

 Helicopters can't fly over it; the downward force of the air would pull them in.
Can someone explain the physics of this? The air above the hole is at the same pressure as that beside the hole, otherwise there would be a constant wind. Also, above a certain altitude helicopters don't rely on ground effects.

EA 9 hours ago 0 replies      
thriftwy 10 hours ago 1 reply      
In 1955 the Soviet Union was already built. The peak of the Soviet period is 1965, which is pretty close.
ocschwar 4 hours ago 0 replies      
Diamonds. Is there anything they won't do?

They can prop up Stalinist regimes. They can prop up apartheid. They can prop up the likes of Mobutu Sese Seko.

So lovely.

Google bans its ads on sites that use annoying pop-unders techcrunch.com
94 points by janober  7 hours ago   48 comments top 8
fredley 3 hours ago 1 reply      
The APIs used to control focus should just be put behind a permission, as microphone/camera access etc. are. If a site wants to control window focus, I should have to explicitly allow it.
paulpauper 5 hours ago 2 replies      
Wouldn't it be better to just find a way to prevent pop-unders in the first place? Why is it so hard to thwart them? All these sites use the same script that evades the popup blocker built into Chrome, yet none of Google's thousands of engineers can do anything about it, apparently.
paulpauper 5 hours ago 0 replies      
Pop-unders are extremely annoying, not only do they slow the browser but they are always filled with malware. Good move by google.
natch 1 hour ago 0 replies      
That's a great step. I can think of a few more steps they could take, such as removing ads from fake news sites.

Of course that gets into difficult judgement calls, whereas this pop-under case is pretty clear cut.

It makes sense if they want to move slowly and deliberately, but I hope they won't stop here.

grillvogel 5 hours ago 4 replies      
this is not because google is your friend, this is because they want you to have less reason to use adblock.
Sir_Substance 3 hours ago 0 replies      
I didn't know these had become a thing, due to my use of ad blocking.

Thanks, ad blocking.

smegel 3 hours ago 1 reply      
Or fix Chrome that allows this kind of abuse? Google doesn't give a shit about protecting users.
cronjobber 5 hours ago 2 replies      
Antitrust regulators should look into this one.

EDIT: ...because the policy punishes advertisers' use of other advertising providers.

Bind broker tedunangst.com
69 points by donmcc  9 hours ago   10 comments top 6
profwick 3 hours ago 0 replies      
Rather than proxying data, why wouldn't you just bind the socket, and then transfer the file descriptor over the UNIX domain socket (using sendmsg/recvmsg)?

Or accept() incoming connections, and then pass the connection's file descriptor.
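
The descriptor-passing approach suggested above can be sketched in Python, whose `socket.send_fds`/`socket.recv_fds` (3.9+) wrap exactly the `sendmsg`/`recvmsg` `SCM_RIGHTS` dance. A minimal single-process demo, where a socketpair stands in for the broker's UNIX socket and an ephemeral port stands in for a privileged one:

```python
# Passing a bound listening socket's file descriptor over a UNIX
# domain socket, instead of proxying the data. Python 3.9+ only;
# send_fds/recv_fds wrap sendmsg/recvmsg with SCM_RIGHTS.
import socket

def send_listener(chan: socket.socket, listener: socket.socket) -> None:
    # At least one byte of normal data must accompany the ancillary fds.
    socket.send_fds(chan, [b"fd"], [listener.fileno()])

def recv_listener(chan: socket.socket) -> socket.socket:
    msg, fds, flags, addr = socket.recv_fds(chan, 16, 1)
    # Rewrap the received descriptor as a socket object.
    return socket.socket(socket.AF_INET, socket.SOCK_STREAM, fileno=fds[0])

parent, child = socket.socketpair(socket.AF_UNIX, socket.SOCK_STREAM)
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))  # ephemeral port; a real broker would bind :80 etc.
listener.listen(8)

send_listener(parent, listener)
received = recv_listener(child)
print(received.getsockname() == listener.getsockname())  # True: same bound socket
```

The receiving process ends up with a descriptor referring to the very same bound socket, so it can accept() connections directly with no data copying in the broker.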

tyingq 8 hours ago 1 reply      
User space proxying for protocols other than http, though, is a bit tricky. If you aren't exposing things like source ip, listen queue depth, buffer sizes, errno from failures, etc...you are potentially limiting how well it works. Plus the read/write overhead. I'm not convinced this is any better than other approaches. Brokered iptables (or pf,etc) port rewrites seems cleaner, though it has issues as well.
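The brokered port-rewrite approach mentioned at the end could be as small as a couple of nat rules (port numbers here are illustrative; a real broker would add and remove rules per user on request):

```shell
# Redirect privileged port 80 to an unprivileged listener on 8080,
# so the service never needs root to bind.
iptables -t nat -A PREROUTING -p tcp --dport 80 -j REDIRECT --to-ports 8080
# Locally originated connections bypass PREROUTING; cover them too:
iptables -t nat -A OUTPUT -o lo -p tcp --dport 80 -j REDIRECT --to-ports 8080
```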
nhaehnle 4 hours ago 0 replies      
This is neat. There's a minor race condition at startup because scanhostdir calls scanportdir before watchdir. Reversing the order of calls would close the gap and shouldn't have any adverse effects.
elFarto 6 hours ago 0 replies      
It sounds like you should probably just go the whole hog and give each user their own network namespace and bridge them to the main network. Then they can run DHCP and get their own address and do with it what they like.

Wouldn't really work for Internet accessible IPv4 addresses, but IPv6 would be fine.
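A rough sketch of that namespace-plus-bridge setup (Linux, needs root; the user, interface, and bridge names are made up for illustration):

```shell
# Give user "alice" her own network namespace, bridged onto the main network.
ip netns add alice
ip link add veth-alice type veth peer name veth-alice-br
ip link set veth-alice netns alice
ip link set veth-alice-br master br0 up        # br0: an existing bridge to the LAN
ip netns exec alice ip link set lo up
ip netns exec alice ip link set veth-alice up
ip netns exec alice dhclient veth-alice        # the namespace gets its own address via DHCP
```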

zokier 8 hours ago 1 reply      
I think these days I would approach the problem by creating per-user network namespaces and hack the privileged port limitation away from kernel (is there a sysctl for that/why not?)
Apple and Google embody two alternative models of capitalism theatlantic.com
134 points by nthuser  12 hours ago   88 comments top 17
eldavido 9 hours ago 10 replies      
There is so much wrong with this article I hardly know where to begin.

First, it presumes a 19th-century separation of "capital" and "labor" where "capital" is a bunch of greedy pigs trying their damndest to exploit labor, with little crossover between the two groups. The modern reality is way more complicated. Almost every member of "labor" has some form of pension, 401(k), IRA, or personal stock holding, and even if they don't, their governments do. Huge pension funds like CalPERS are heavily invested in the stock market, which matters because (a) many state employees rely on them for income, and even if you don't work for the state, (b) your taxes are directly tied to the investment performance of these funds. Bottom line, it's complete folly to suggest the stock market is a "rich person's problem" even if you're poor. Anyone invested in the S&P 500 is going to have a large position (relatively) in Apple.

Second, this article makes no mention of Google's hiring of Ruth Porat or the recent moves to put better capital allocation processes in place. I, for one, wish Google would behave more like Apple. I think it shows admirable restraint that Apple can pay so much cash out without wasting it on dumb things.

Third, it's just a sloppy article in general. They make no mention of whether the "performance" of the two includes the cash thrown off by dividends, which in Apple's case is significant. They also didn't mention the complex back-story of why the Irish subsidiary is used [1], nor any of the academic finance research suggesting that "short-term" decision making actually benefits investors long-term.

[1] https://stratechery.com/2016/apples-eu-tax-problem-how-apple...

frgtpsswrdlame 10 hours ago 5 replies      
It's worth mentioning on a post like this that there is no legal (or historical) basis for the idea that maximizing shareholder value is the primary concern of a corporation. See these two sources:


[pdf] http://scholarship.law.cornell.edu/cgi/viewcontent.cgi?artic...

skywhopper 8 hours ago 3 replies      
I'm disappointed that two other large pieces of the economic puzzle are left out: workers and government. Corporations have more cash than they know what to do with? That means that 1) wages are too low, and 2) taxes are too low.

More specific to Apple's case, the hoarding of cash overseas to avoid paying US taxes on it is one of many poisonous symptoms of our international capitalism. The amount of tax Apple would pay if those profits did come to the US would make a not-inconsequential dent in the federal deficit. The whole take-on-debt-to-pay-dividends strategy is so skeevy that while I don't doubt it's legal, it's very questionably ethical.

All that said, I think US tax policy contributes to the problem. During the Bush administration, an effort was made to argue that taxes on dividends amounted to double taxation: the corporation had already paid taxes on that money, so why should the investors also pay tax on it? And while dividend income was not made tax free, it is now (or at least was for a while, I haven't kept up) taxed at a significantly lower rate than "earned" income. But the fix that makes the most sense to me, and which would solve Apple's problem, is to exempt corporations from paying taxes on the money they then pay out as dividends, and tax the individuals earning the dividends at their normal marginal tax rate. This would encourage corporations to pay more dividends, end the "double" taxation, solve some percentage of off-shore hoarding, increase US government revenue, and put more money into the economy and not in corporate bank accounts.
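The double-taxation arithmetic in that proposal is easy to sanity-check with toy numbers (the rates below are purely illustrative, not actual US tax law):

```python
# Hypothetical rates, only to illustrate the "double taxation" argument above.
corp_rate = 0.35       # corporate income tax
dividend_rate = 0.15   # preferential dividend tax rate
marginal_rate = 0.35   # investor's ordinary marginal rate

profit = 100.0

# Current scheme: the corporation is taxed first, then the dividend is taxed again.
after_corp = profit * (1 - corp_rate)
investor_gets_now = after_corp * (1 - dividend_rate)   # about 55.25

# Proposed scheme: dividends deductible at the corporate level,
# taxed once at the investor's marginal rate.
investor_gets_proposed = profit * (1 - marginal_rate)  # about 65.00

print(investor_gets_now, investor_gets_proposed)
```

With these made-up rates the single-taxation scheme leaves the investor more per dollar of profit even at the higher marginal rate, which is the comment's point about incentives to pay dividends.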

jtraffic 10 hours ago 2 replies      
> they embody two alternative models of capitalism, and the one that wins out will shape the future of the economy.

Even in such a vague form, this is a touch sensationalist. The impact of 'the one that wins' may be negligible. There also may be no winner.

theonemind 8 hours ago 0 replies      
Apple got to this point by notoriously never paying out dividends. Paying out dividends obviously takes away money usable for growth. I don't see any competition between these two models. You issue stock to get capital to compete with other behemoths. Other than having contributed cash to the endeavor, the investors function more like parasites that want to extract the maximum they can from the host. Such massive payouts will probably stop Apple from becoming a major conglomerate with varied tech/science/engineering interests in the distant future and limit them to high-end consumer electronics. They got to this point by acting more like Google, and they will likely degrade like HP or IBM now.

A very silly article, comparing two companies with similar money-management histories and acting like some competition exists between the models because investors recently got the upper hand at Apple. They wouldn't have gotten to their market position doing this. It's like comparing two athletes when one recently acquired a disability, and suggesting that the disability contributed to their historical success and points to a new model for the sport going forward.

_nalply 10 hours ago 1 reply      
When I read the title I thought about the difference that Apple sells to users while Google sells to advertisers. Those are two different kinds of capitalism, too.
l5870uoo9y 10 hours ago 0 replies      
> More importantly, though, how do these strategies impact the lives of everyday people? A capitalist system aims for the efficient allocation of capital, and indeed, workers have a better shot at seeing median wages increase when money is being put to its most productive use. So to an extent, how they fare under each system has to do with who is deciding where and how profits get invested. When managers reallocate profits, that reallocation benefits from the capabilities and knowledge that companies have built over decades, but suffers from the possibly poor incentives of managers. When investors are the ones reallocating profits, however, the scope of the reallocation can be broader, theoretically leading to more innovation; at the same time, those investors don't have preexisting organisational capabilities and they may suffer from their own short-term time horizons.

In the end the economist unwittingly reveals that the actual ramifications are highly theoretical (bordering on nonsense), and leaves the reader to conclude that this won't, as stated, "decide the future of capitalism". In discussing the future economy I would be much more inclined to ask: "How do we create a model where wealth and power are distributed broadly across society?", "What constitutes infrastructure in a modern economy?", "How do we know we aren't underperforming?" and so on.

johnsmith21006 5 hours ago 0 replies      
Google broke a $100B market cap at seven years old. Apple, meanwhile, did it at 29 years old. Google has never declined for a single quarter YoY since day 1. Not a single time.

Apple had $2.33 EPS for Q2 2015 and for Q2 2017 reported $1.90 EPS. So EPS declined over the last two years.

Google's "other" (non-ad) revenues were over $10B for 2016 and growing at 50%. Apple's total revenues in 2004, when it was 28 years old, were less than Google's "other" segment alone. Just sayin'.

Btw, Google holds the record for getting to a $100B cap faster than any other company, even accounting for inflation.

auserperson 10 hours ago 1 reply      
I don't respect any article about the future of the economy that does not take into consideration climate change, sustainability, and the Anthropocene. Capitalism will have to change dramatically soon; our world is collapsing. But sure, let's talk about Apple vs Google. I am not a fan of capitalism, but I'm not even criticizing capitalism per se, only that all big companies nowadays exist in a world, and a way of producing, that will surely destroy itself in at most 100 years. So that is the future, not Apple's way of managing. I feel like saying "wake up, sheeple", because that's how it feels reading an article like this. I have no idea what will happen and hope for the best, but let's start accepting that major changes are needed and are going to happen whether we plan them or not. Climate change/mass extinction/deforestation/etc. is not just about polar bears; it's about our energy use and our ways of consuming and producing goods.
stretchwithme 51 minutes ago 0 replies      
How can anybody look at how much Apple executives are currently taking home and think that lush perks are being limited by large shareholders?
thedevil 10 hours ago 1 reply      
tldr: Google founders have more voting rights than Apple executives. In case you didn't know, this impacts corporate behavior, especially dividends.

I thought this would be about the way each interacted with customers, which is far more interesting.

wageslaving 10 hours ago 0 replies      
The issue with yielding to investors is that investors are primarily interested in making money for themselves, rather than growing the companies they invest in. An investor will always vote to have a large company take on loans to buy back, at a 25% increase over current market value, the hundreds of millions of dollars' worth of shares they just purchased, or to get a lump sum in the form of newly issued dividends, as it was in Apple's case. And then they take the money they made, reinvest it in another large company they can leverage, and do it again.

There's absolutely no reason to seek growth based returns which carry risk while this approach is available. Dividends and stock-buybacks represent a no-value-created system of incentives for the richest people in the world, directly extracting the surplus value of laborers at the expense of workers and long-term investors. Only when a company starts to topple does there seem to be any interest in moving into new markets or improving their existing lines of business.

spectrum1234 9 hours ago 0 replies      
I was expecting this to be about open source vs closed source. After reading this, I really wish it had been.
HugoDaniel 10 hours ago 0 replies      
"Google is, like Apple, making loads of money. From 2013 to March 2017, it generated $114 billion in operating cash flow. How much has the company distributed to shareholders? In contrast to Apple's 72 percent payout rate, Google has only distributed 6 percent of that money to shareholders."

Apple is way older than Google; maybe that has an influence on their different approaches...

mschuster91 4 hours ago 0 replies      
There is a third way of capitalism: the Musk way. Whatever money he made, he invested it in a way that benefits society:

- by building a payment service that's way cheaper and easier than e.g. Western Union or banks

- by using literally every last cent in his pockets to make SpaceX work (which benefits the whole world in terms of cheap, reliable, environmentally friendly, Russian-free launches as well as the planned Mars colony)

- by launching the maybe most successful pure electric car, which soon goes into mass-market price range

This is what I as a socialist see as the one example of capitalism that actually WORKS.

not meant to be racist at all, but political - given the obvious tensions between Russia and the Western world, e.g. by Russian meddling in US elections, financing at least the German and French neonazi parties and invading Ukraine, it's simply unacceptable to depend on Russia honoring their rocket engine delivery contracts.

gaius 3 hours ago 2 replies      
Neither pay their taxes
shmerl 8 hours ago 0 replies      
I wonder if it influences Apple's common nasty behavior.
Show HN: BotEngine Easy tool for creating chatbots botengine.ai
95 points by ajaskiewicz  10 hours ago   46 comments top 15
zerop 8 hours ago 3 replies      
I have lots of text about a particular topic (50K articles crawled from the internet about travel experiences). Now I want to create a chatbot whose chats/answers come from this text. Basically, I want to feed my text to this bot service and have it create a chatbot for that text. Is anyone aware of how to do this? Not sure if I made it clear.
konradkpl 10 hours ago 1 reply      
Imho the most intuitive user interface for bot creation software: http://wstaw.org/m/2017/07/11/Screen_Shot_2017-07-11_at_16.2...
anotheryou 9 hours ago 1 reply      
A bit off-topic, but I want custom bots per person via facebook or mail for something like:

You still owe me back book XY. Type "snooze" to be reminded again in 7 days. Or be reminded again tomorrow. Type "done" if you got this reminder despite having done it (same as "snooze", but I'll manually review if I should remove the reminder)

matthoiland 5 hours ago 1 reply      
Props for a fresh, clean design. The docs are fantastic, easy to read, and don't contain verbose technical bloat. Overall excellent design and UX execution.
phatbyte 4 hours ago 1 reply      
Honest question: are chatbots really used, or are they just a nice thing to have? When I visit a website and a chatbot pops up, it really feels like it disrupts MY browsing; I immediately turn it off. I'd really like to know if chatbots can translate into new sales.
water42 10 hours ago 1 reply      
>BotEngine allows you to create a chatbot for any service.

except for the ones that are not implemented. would be interested to try this out once it has telegram support.

wiradikusuma 9 hours ago 1 reply      
How is it different from other (seemingly similar) offerings, e.g. api.ai, wit.ai, Chatfuel, ManyChat? I've used wit.ai, which is buggy (randomly loses stuff after you save it), and api.ai (so far OK).
cgrs 10 hours ago 2 replies      
Is it 100% free, or does it have some premium paid features?
adventured 2 hours ago 0 replies      
You guys desperately (I'd like to type that 37 times in a row for emphasis) need a serious, one-click to start, interactive product demonstration (without requiring my email first). I should be able to instantly step into serious business use examples catering to multiple industries (eg whatever industries you guys might choose to target first), so you can show me what it can actually do.

I clicked on product tour, entered a name, and then got this ridiculousness:

Me: Johnny Cash

Bot: That's interesting

Bot: Ready for an adventure?

Me: No

Bot: Oh no, why? It will be fun!

Bot: Are you sure? :(

Me: (pre-scripted) ok, I can try

Bot: Great, so let's try again.

Bot: Do you want to do something relaxing or should we go crazy?


That's where I quit. Unintentionally creepy bot is slightly creepy. Am I talking to a serious product bot, or is this get trashed and sleep on my couch party bot? Wild and crazy times ahead.

Potential customers should be able to dive right into a conversation with your bot tech. You should be extremely eager to show me what it can do in a live conversation, and you should have stellar pre-built examples for that purpose, all available from one click on the home page. My takeaway from my experience is that your bot tech can't do much, so you're not immediately getting into showing off its capabilities (I don't know if that's the case or not, but if this were any other site, that would be my takeaway, and I'd never return).

yenoham 10 hours ago 2 replies      
OT but I really like the website - do you mind if I ask who created it?
option_greek 10 hours ago 1 reply      
Is there a way to just get the user's response and forward it to a webhook? (And of course send back a reply received from the web service.)
victormustar 5 hours ago 2 replies      
No pricing? IMO you lose users without a pricing page.
pantulis 9 hours ago 2 replies      
Really cool. Is it possible to publish the bot to a website as a JS widget?
Everula 10 hours ago 1 reply      
Nice! Any integration with Intercom planned? Also, found a small typo here: http://prntscr.com/fuckqw
frgtpsswrdlame 9 hours ago 1 reply      
Why SaaS?
Sizing Up Servers: Intel's Skylake-SP Xeon versus AMD's EPYC 7000 anandtech.com
109 points by zdw  7 hours ago   32 comments top 8
slizard 6 hours ago 1 reply      
This looks like a solid if not amazing comeback for AMD in the server market. Sure, single-threaded performance may not beat Skylake-SP, nor will LINPACK (and most wide-SIMD/FMA-heavy) performance, but that still leaves most HPC/engineering applications whose workloads either do not lend themselves to heavy vectorization or are simply not tuned for it (and won't be overnight).

All in all, when you have a server that comes so close to Intel's performance for less money while consuming less power, I can't imagine that EPYC won't see broad adoption and that Intel won't be squeezed.

I'm glad AMD is back and there is renewed competition in the server market!

exmadscientist 5 hours ago 3 replies      
Very nice analysis. A couple of things stand out:

1. The mesh interconnect looks like a big loser for the smaller parts. It's a big jump up in complexity (there's an academic paper floating around which describes the guts of an early-stage version) and seems to be a power and performance drain. I can't imagine they got the clock speeds they wanted out of it. Sure, it's probably necessary for the high-core-count SKUs, but the ring bus probably would have done a lot better for the smaller ones.

2. There's almost nothing in here for high-end workstations (which typically have launched with the server parts). Sure, AMD has Threadripper coming soon, but this looks like Intel's full lineup... so where are the parts? We've bought plenty of Xeon E5-1650s and 1660s around here, and it doesn't look like there's anything here to replace them. That's unexpected. The "Gold 5122" (ugh what a silly name) is comparable, but at $1221 is priced just about double what an E5-1650v4 runs.

Workstations are a bit of an interesting case because their loads look a lot more like a "gaming desktop" than a server: a few cores loaded most of the time with occasional bursts of high-thread-count loads. That typically favors big caches, fewer cores, and aggressive clock boosting. If you're only running max thread count every now and then, you can afford a huge frequency hit when you do. But since these are business systems we try to avoid anything that doesn't say "Xeon" on it (or "Opteron", in years past) as reliability is paramount. To see nothing here from Intel in this launch is discouraging, to say the least. I have an upgrade budget and it looks like it'll be heading nVidia's way at this point.

andrenotgiant 6 hours ago 1 reply      
If you want to try using the new Skylake chips, DigitalOcean just launched high-CPU droplets that run the Intel Skylake 8168 https://blog.digitalocean.com/introducing-high-cpu-droplets/

<disclaimer, I work for DO>

DuskStar 6 hours ago 3 replies      
Looks like die-to-die latency isn't all that great on EPYC, as expected:

"What does this mean to the end user? The 64 MB L3 on the spec sheet does not really exist. In fact even the 16 MB L3 on a single Zeppelin die consists of two 8 MB L3-caches. There is no cache that truly functions as single, unified L3-cache on the MCM; instead there are eight separate 8 MB L3-caches."


"AMD's unloaded latency is very competitive under 8 MB, and is a vast improvement over previous AMD server CPUs. Unfortunately, accessing more 8 MB incurs worse latency than a Broadwell core accessing DRAM. Due to the slow L3-cache access, AMD's DRAM access is also the slowest. The importance of unloaded DRAM latency should of course not be exaggerated: in most applications most of the loads are done in the caches. Still, it is bad news for applications with pointer chasing or other latency-sensitive operations."

I was kind of expecting this, but it's still disappointing to see. Looks like if you need a lot of L3, Intel is still the best/only option. Not to say that AMD hasn't made massive improvements though - and it's also worth noting that while AMD's memory latency is generally worse, throughput is also typically better than Intel.

gbrown_ 4 hours ago 0 replies      
A nitpick regarding the comment on the 8XXX series, which is targeted pretty much only at 8-socket systems (or 4-socket in non-fully-populated configs).

> This pricing seems crazy, but it is worth pointing out a couple of things. The companies that buy these parts, namely the big HPC clients, do not pay these prices.

We in HPC would not touch these outside big memory systems which is even niche for us. The consumers of these are far more likely to be those with data warehouse style needs (a.k.a Oracle customers).

Much like the rest of the world, 2-socket systems are by far the most common in HPC.

zokier 5 hours ago 1 reply      
EPYC sure does look good on paper. But the big question in my mind is how the OEMs will react to it. Will it be offered on equal footing in actual server systems from the major brands (HP, Dell, etc.)? Most people won't be buying CPUs by themselves, so the list prices are mostly a moot point. I do seem to recall that K8-era Opterons didn't do as well on the market as they could have based on the HW alone. I fear we might see a reprise of that play again.
dis-sys 6 hours ago 2 replies      
There are some interesting numbers on the "memory subsystem: bandwidth" page. Basically, Skylake-SP has pretty low single-thread bandwidth (12 GB/sec) to start with, just 40% of what you can get with a single pinned thread on EPYC, but it increases almost linearly as you add threads.

Wondering, other than some sparse-matrix applications known to be memory-bandwidth bound, what kind of performance impact this is going to cause. Are there any truly memory-bandwidth-bound applications other than the ML/AI workloads run by the big Internet names?
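Whether a given box is bandwidth-limited is easy to probe crudely: copying a buffer much larger than any L3 cache is dominated by DRAM traffic. A rough single-thread sketch (illustrative only; serious measurements should use STREAM or similar):

```python
import time

# Crude single-thread memory bandwidth probe.
N = 256 * 1024 * 1024          # 256 MiB, well past any L3 cache
src = bytearray(N)

t0 = time.perf_counter()
dst = bytes(src)               # one read pass + one write pass over N bytes
elapsed = time.perf_counter() - t0

gbps = 2 * N / elapsed / 1e9   # bytes read + bytes written, in GB/s
print(f"~{gbps:.1f} GB/s effective copy bandwidth")
```

If a workload's runtime scales with numbers like this rather than with core clocks, it is bandwidth-bound, and the per-thread figures in the review matter more than peak FLOPS.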

valarauca1 6 hours ago 0 replies      
So the single thread performance isn't _amazing_. The power consumption and multithreaded benchmarks AMD quoted were mostly correct.

Looks pretty solid. Sure not everything scales linearly with corecount but if your task does, it looks like AMD might be worth considering.

       cached 12 July 2017 01:02:01 GMT