hacker news with inline top comments    26 Jan 2017 News
Paralyzed man regains use of arms and hands after experimental stem cell therapy usc.edu
357 points by jpgvm  4 hours ago   77 comments top 17
Xcelerate 3 hours ago 4 replies      
This is very exciting. This type of work can even benefit people who aren't paralyzed: damaged cartilage can also be repaired this way.

When I ran track in college, I somehow developed focal cartilage defects in both knees. This brought my running career to a halt and made walking extremely painful for about a year. In an attempt to fix this, I had a type of surgery known as the OATS procedure performed. This is where the surgeon takes a plug of undamaged articular cartilage from a low load bearing region of the knee and swaps it with the damaged cartilage. Walking is mostly pain-free for me now, but it still hurts too much if I attempt to run.

As though one cartilage injury weren't enough, I somewhat stupidly decided to take up weight lifting after I couldn't run anymore and attempted to set down a barbell that was way too heavy for me. In doing so, I triggered a mild lower lumbar disc herniation. So now I have two permanent injuries. Luckily, neither injury is very severe, so some days I don't even notice the pain while other days it approaches mildly annoying "background noise".

These types of cartilage injuries are common, and arthritis is even more common. But the issue with cartilage is that once it's damaged, it doesn't heal on its own because cartilage has no vascular system. You can break all the bones you want and eventually they will heal, but damaged hyaline cartilage will not. The best that your body can do is to produce "low-quality" fibrocartilage in place of the damaged hyaline cartilage.

Fortunately, there's been a lot of research over the last decade on using mesenchymal stem cells (taken from your own bone marrow) to regrow true hyaline cartilage as opposed to fibrocartilage. The stem cells have actually been shown to differentiate into hyaline cartilage. For me, this has the potential to permanently alleviate both knee and back pain. Moving this research away from clinical trials seems to be taking forever for some reason though...

tokenadult 5 minutes ago 0 replies      
The patient's age being twenty-one may have made a difference in this happy news story. My dad had a slip and fall on ice at age seventy-two that left him paralyzed from the chin down until he died six years later. He had had a similar injury from a car crash (back when cars didn't have seat belts) at age eighteen, from which he recovered fully (although he wasn't so paralyzed from the first injury). So when he had his second injury, he at first thought he would also recover from that injury. Maybe because the second injury aggravated damage to his spine still remaining from his first injury, or maybe just because he was a lot older when injured the second time, he never recovered much at all from the second injury. His experience reminds me how many other people in a family are affected by spinal cord injuries, and thus how important it is to find better treatments for them.

So it's hard to say how wide a range of patients will be treatable with the new technique, but that's what medical research is for: to find out what helps for which patients. I hope further research continues on this and other treatments for spinal cord injuries.

neuronexmachina 3 hours ago 3 replies      
Given that their methodology uses embryonic stem cells, it'll be interesting to see how this plays out in the US with opponent Tom Price as head of Health & Human Services.

> The stem cell procedure Kris received is part of a Phase 1/2a clinical trial that is evaluating the safety and efficacy of escalating doses of AST-OPC1 cells developed by Fremont, California-based Asterias Biotherapeutics. AST-OPC1 cells are made from embryonic stem cells by carefully converting them into oligodendrocyte progenitor cells (OPCs), which are cells found in the brain and spinal cord that support the healthy functioning of nerve cells. In previous laboratory studies, AST-OPC1 was shown to produce neurotrophic factors, stimulate vascularization and induce remyelination of denuded axons. All are critical factors in the survival, regrowth and conduction of nerve impulses through axons at the injury site, according to Edward D. Wirth III, MD, PhD, chief medical director of Asterias and lead investigator of the study, dubbed SCiStar.

warcher 7 minutes ago 0 replies      
One really important caveat here: nature abhors a vacuum, and spinal cord injuries are no exception. A chronic injury to the spinal column will result in scar tissue coming in over the wound, which is a totally separate issue from the initial injury. If your spinal cord injury is over a month old, you have a completely separate problem, i.e., how to clear out the scar tissue so that nerve regeneration is even possible.
startupdiscuss 4 hours ago 2 replies      
Despite all the cynicism, there are still wonderful things about good old medicine and science.

90 days! Paralysis to utility!

Is someone going to tell me something like: oh, the nerve wasn't completely severed so recovery might have happened anyway?

Well, go ahead, but in the meantime I am enjoying this news.

obeone 3 hours ago 0 replies      
Here is the December 14, 2016 update to that story: http://keck.usc.edu/stem-cell-therapy-gives-paralyzed-man-se...
willholloway 1 hour ago 2 replies      
Stem cells are showing more and more promise. One thing we know about them is that young stem cells are better than old ones. I think there is a lot of promise in stem cell banking.

I haven't pulled the trigger yet, but I do plan on banking my own stem cells while I'm in my early 30s, because parts of me will inevitably start to break down in the coming decades, and I really like the idea of tapping my own young cells when I am old to heal some of that.

The only company I know of that is doing this is Forever Labs, https://www.foreverlabs.co/. I'm not associated with them; I just think they're on the right track with stem-cell banking. I've spoken to one of the founders, was pretty excited about what they were doing, and think it's something worth supporting, which is why I'm writing this comment.

cpncrunch 2 hours ago 0 replies      
It says it is a clinical trial, but I can't find the actual study, only another press release:


We really need to wait until the clinical trial publishes its results to know whether or not the treatment works. This person might have recovered naturally without any treatment.

pdimitar 3 hours ago 3 replies      
Can somebody explain stem cell therapy to a complete layman like myself? I don't mean links to science articles; I mean the description you would attempt if we were having a beer.

For me, stem cells are some sort of magical Wolverine regeneration sauce. I've never understood why they even work.

overcast 4 hours ago 0 replies      
It's exciting to know that paralysis could be a thing of the past, potentially in our near future. Deafness/blindness/cancer/etc. all benefit from these studies.
avenoir 1 hour ago 0 replies      
I believe the same article was already posted here a few months back. Regardless, it would be interesting to know how much functionality this young man will gain after his rehabilitation. Stuff like this is hard to believe to be 100% effective, at least at this stage of our understanding and use of stem cells. But man, this is absolutely astounding if it helps him regain even, say, 30% use of his limbs.
narrator 2 hours ago 2 replies      
Meanwhile, 9 years ago in China, people were already receiving this kind of therapy, but nobody believed them[1]. As Gibson has said, "The future is already here, it's just not evenly distributed."


keeganjw 3 hours ago 0 replies      
Congrats! I feel so happy for this guy. Awesome news!
rce123 4 hours ago 1 reply      
After an accident such as this, is there a point at which it's too late to perform this type of injection? Asking for someone in a similar situation.
rudolf0 4 hours ago 1 reply      
Could stem cell therapy be used to potentially improve existing muscular or neurological connections in normally functioning individuals? Kind of like a whole-body neurotropic.
uptownfunk 3 hours ago 1 reply      
Is there a greatly increased risk for cancer from stem cell use given these are cells that have a high rate of proliferation?
DamnInteresting 3 hours ago 0 replies      
Note that this was published several months ago, in September 2016. So if this news sounds familiar, that's probably why.
The foundation of a more secure web: Google Trust Services googleblog.com
143 points by noinsight  2 hours ago   79 comments top 14
aduffy 57 minutes ago 4 replies      
You can now have a website secured by a certificate issued by a Google CA, hosted on Google web infrastructure, with a domain registered using Google Domains, resolved using Google Public DNS, going over Google Fiber, in Google Chrome on a Google Chromebook. Google has officially vertically integrated the Internet.
niftich 1 hour ago 2 replies      
I don't think this is a bad thing. Instead of a third-party you trust (or rather, your user-agent trusts) vouching that Google's indeed Google, it's now Google vouching for itself, and you trust them by the virtue that they're Google.

This ought not be surprising: presumably, who better to say that Google is indeed Google than Google itself?

The reason everyone doesn't run a root CA is that it's difficult to coordinate trust between parties that may not know about each other ahead of time, and each and every root CA adds more maintenance burden on the part of trust-stores. When I self-sign my cert, I am effectively my own root CA, but I lack a compelling value proposition for everyone to add it into their trust-stores, and of course there's the initial difficulty of me propagating my key fingerprint over a tamper-proof "out-of-band" channel ahead of time, where you have assurance that it's coming from me.

Google, on the other hand, is fairly easy to verify as indeed being Google, considering they just published their public keys on their own website. By having a prior web property that's already trusted, they have bootstrapped the trust necessary for fingerprint distribution, and the rest should follow.

When Google's CAs start issuing certs to non-Google parties, we can revisit the 'eggs-in-basket' question.
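The "fingerprint over an out-of-band channel" idea described above is essentially certificate pinning. A minimal sketch in Python (function names and the fake DER bytes are mine, for illustration only):

```python
import hashlib

def fingerprint(cert_der: bytes) -> str:
    """SHA-256 fingerprint of a DER-encoded certificate, in the
    colon-separated hex form most TLS tools print."""
    digest = hashlib.sha256(cert_der).hexdigest().upper()
    return ":".join(digest[i:i + 2] for i in range(0, len(digest), 2))

def matches_pin(cert_der: bytes, pinned_fingerprint: str) -> bool:
    """Accept the certificate only if it hashes to a fingerprint we
    obtained ahead of time over a trusted channel."""
    return fingerprint(cert_der) == pinned_fingerprint.upper()
```

The trust-store question then reduces to distributing `pinned_fingerprint` securely, which is exactly the bootstrapping problem the comment describes.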

reddiric 2 hours ago 4 replies      
"If you are building products that intends to connect to a Google property moving forward you need to at a minimum include the above Root Certificates."

The foundation of a more secure web apparently requires you to trust Google with the entire internet, using their properties as leverage to force it to be so.

algesten 2 hours ago 5 replies      
I have no love for most of the major CAs I've interacted with, but this feels wrong, though I can't quite pinpoint why.

Perhaps just a general feeling that all the internet eggs are being put, one by one, in one single alphabet basket.

jhugg 2 hours ago 2 replies      
I love that you can just buy a CA and devices will trust the new owner. That's not messed up or anything.
wbond 1 hour ago 1 reply      
It is interesting to see that Google decided to opt for the NIST P-384 curve for the root certs, which will be valid until 2036.

Brian Smith has argued for supporting only P-256, P-384 and Curve25519: https://briansmith.org/GFp-0. That said, Mozilla decided to continue to advertise support for P-521 in NSS (https://bugzilla.mozilla.org/show_bug.cgi?id=1128792).

P-256 and P-384 are widely supported in various TLS libraries (SChannel, SecureTransport, OpenSSL, NSS), whereas Curve25519 doesn't yet seem present in Microsoft's or Apple's libraries. Perhaps we'll see it implemented along with TLS 1.3 support?

Unfortunately it seems none of the NIST curves (P-*) are considered safe by DJB and Tanja Lange: https://safecurves.cr.yp.to/.

gergles 6 minutes ago 0 replies      
I'm going to be 'that guy', but: Great, another page that requires JavaScript to display text. Hey, Google, you know what's more secure? Browsing plaintext sites without JavaScript.
zokier 2 hours ago 1 reply      
It feels like a new age of the internet when we have stuff like Google's private .goog gTLD with domains signed by Google's private Root CA. It's not strictly bad (and I'm not complaining), but it feels a bit silly/weird/scary/... .
Endy 2 hours ago 1 reply      
All I can say is this - what if I don't trust Google one iota?
asdfasdfasdfa12 2 hours ago 1 reply      
I think SSL certificates need to be replaced. Security can NOT be designed with the 'good guy' in mind. If it can be broken at all, we need an alternative.
andy_ppp 2 hours ago 3 replies      
I mean people don't trust Google's motives but I trust the certificate authorities less...

How do we (or Google) know that the CIA and FBI can't create certificates from all the CAs because they have stolen/demanded the Root CA for them?

If I was a TLA I'd want the ability to perfectly MITM anyone.

I think these questions imply that there needs to be a better way to think about security and trust for web endpoints in the days of the state as a bad actor.

scottm84 1 hour ago 0 replies      
Unless they launch a satellite far from earth with locked keys on it, I don't see how this is anything more than a corporate NSL
finid 2 hours ago 0 replies      
The proverbial fox guarding the hen house.
forrestthewoods 1 hour ago 0 replies      
I don't trust Google.
Introducing React Navigation for React Native reactnavigation.org
56 points by evv  1 hour ago   11 comments top 6
fersho311 5 minutes ago 0 replies      
Definitely going to try this out. Having a consistent navigation library seems to be the last missing puzzle piece when building cross-platform apps. Currently our project is set up to use the same Redux reducer and actions codebase for both the mobile and web platforms. The goal is to share the whole codebase, so that adding a new platform is simply creating a new view component. Unfortunately, the lack of a unified navigation library has caused us to create all kinds of hacks to make things work. We use React Router on web and react-native-router-flux on native.

I'm part of a group working together on a project to teach ourselves coding so if anybody has any existing solution or alternatives, please share!

a13n 7 minutes ago 0 replies      
Awesome! A better navigator is the #2 feature request for React Native - this solves a huge pain point. https://productpains.com/product/react-native

In my experience this was one of the worst parts about using React Native - that there wasn't a simple, easy router (like React Router for web).

Looking at the GitHub repo (https://github.com/react-community/react-navigation), it looks like this was a collaboration between FB engineers, Exponent (great product + team!), and the open source community. So thankful for these incredibly smart folks working on software we can all use for free. <3

cridenour 15 minutes ago 0 replies      
Wow! Looks great. Love the simplicity of drawer[1] and tab[2] navigation as well. Compared to current offerings, a lot less intimidating.

[1] https://reactnavigation.org/docs/navigators/drawer
[2] https://reactnavigation.org/docs/navigators/tab

matthewvincent 38 minutes ago 1 reply      
This addresses probably my number one pain point jumping between React - React Native apps. Looking forward to trying it out!
arvinsim 18 minutes ago 0 replies      
Great! Coming from using NavigationExperimental, I am happy that we have converged on a solution that is much easier to implement.
sagivo 11 minutes ago 1 reply      
What's the relation of this to the official RN library? We have `NavigationExperimental`, `Navigator`, `NavigatorIOS` and tons of other navigation libraries out there.
Using Immutable Caching to Speed Up the Web mozilla.org
202 points by discreditable  5 hours ago   88 comments top 14
achairapart 2 hours ago 4 replies      
Maybe it's time for browsers to go beyond the cache concept and implement a common, standard package manager. Download once, keep forever. Truly immutable.

As developers, we try every day to squeeze out the last byte and optimize things. We all know how important performance is.

So why download the same assets for every website: React, jQuery, libraries, CSS utils, you name it? What a waste!

btilly 3 hours ago 3 replies      
What I really want is the exact opposite. I'd like to see a flush-before header to have a particular web page NOT pull older static resources.

The reason is simple. Websites have lots of static content that seldom changes. But you don't know in advance when it is going to change. However after the fact you know that it did. So you either set long expiry times and deal with weird behavior and obscure bugs after a website update, or set short ones and generate extra load and slowness for yourself.

Instead I'd like to have the main request send information that it does not want static resources older than a certain age. That header can be set on the server to the last time you did a code release, and a wide variety of problems involving new code and stale JS, CSS, etc go away.
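On the client side, the proposed check would amount to a simple freshness comparison: anything cached before the server's last release is treated as stale. A sketch of that logic (the header and mechanism are the parent's proposal, not an existing HTTP feature):

```python
from datetime import datetime

def needs_revalidation(cached_at: datetime, last_release: datetime) -> bool:
    """A static resource fetched before the last code release could pair
    stale JS/CSS with new pages, so force a revalidation for it."""
    return cached_at < last_release
```

Everything fetched after the release keeps its long expiry, so the extra load only hits immediately after a deploy.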

zokier 4 hours ago 5 replies      
I'd be very wary of using any HTTP headers with permanent effects. They seem like a way to get easily burned by accident. For immutable caching in particular, I'd probably try to utilize some variation of content-based addressing, eg having the url have the hash of the content.

See also: http://jacquesmattheij.com/301-redirects-a-dangerous-one-way... and the related HN thread with good discussion
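Content-based addressing as described is easy to sketch: derive the URL from a hash of the bytes, so an "immutable" URL can never silently change meaning (the domain and naming scheme below are made up):

```python
import hashlib

def asset_url(content: bytes, name: str) -> str:
    """Build a cache-safe URL: the path embeds a hash of the content,
    so changed content always gets a brand-new URL."""
    digest = hashlib.sha256(content).hexdigest()[:16]
    return f"https://static.example.com/{digest}/{name}"
```

Because the URL changes whenever the bytes change, a permanent cache lifetime becomes safe without any special header semantics.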

Animats 19 minutes ago 0 replies      
This should be done using subresource integrity. Then, you know it hasn't changed. There should be some convention for encoding the hash into the URL, so that any later change to an "immutable" resource will be detected.

With subresource integrity hashes, you don't have to encrypt public content. Less time wasted in TLS handshakes.
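The subresource integrity hash mentioned here is the value of an HTML `integrity="..."` attribute: an algorithm prefix plus the base64 of the raw digest, with sha384 being the commonly recommended choice. A small sketch:

```python
import base64
import hashlib

def sri_value(body: bytes) -> str:
    """Compute the value for an HTML integrity="..." attribute:
    algorithm prefix plus base64 of the raw digest (sha384 here)."""
    return "sha384-" + base64.b64encode(hashlib.sha384(body).digest()).decode("ascii")
```

The browser refuses to run the fetched resource if its hash doesn't match, which is what makes the "detect any later change" guarantee possible.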

georgeaf99 3 hours ago 1 reply      
The concept of immutable assets and links is at the core of IPFS, a distributed alternative to HTTP. Since Firefox inplements the concept of immutable assets now, it would be totally reasonable to load these assets in the browser peer to peer (see WebRTC and webtorrent). I think this would be a great way to retrofit some decentralization into webpages!
whack 2 hours ago 1 reply      
Sorry if this is a dumb question. How is immutable caching any different from cache-control headers with a max age of 100 years?
roddux 4 hours ago 0 replies      
This is related to the Chrome caching update; as discussed here: https://news.ycombinator.com/item?id=13492483

Two wholly different strategies, which have ultimately split how the browsers handle caching.

mixedbit 3 hours ago 2 replies      
I just realized that until http is completely replaced with https, private mode should always be used to browse the Internet on an untrusted wifi network. Otherwise malicious content injected on such a network can be cached and reused by the browser forever.
nicolaslem 5 hours ago 6 replies      
What is the difference between an immutable resource and setting the resource to expire in 10 years?

Many websites already do that, they change the URL each time the content changes.

romaniv 3 hours ago 1 reply      
HTTP caching is a mess. I wonder why no one proposed a properly redesigned and negotiable protocol that covers all the edge cases. (And maybe supports partial caching/partial re-validation of pages.)
jjoe 4 hours ago 2 replies      
With WiFi hotspots dropping connections more often than not, how many people would know they need to CTRL-F5 to "fix" a broken page/image/JS/CSS?

I just hope the draft as-is expires and never makes it to an RFC.

mnarayan01 4 hours ago 2 replies      
I wonder if this will bring back the "hard refresh".
niftich 2 hours ago 0 replies      
What a mess, but perhaps a happy ending. I made two other comments prior to this one in this thread, but then I read the Bugzilla thread [1] opened by Facebook that laid out the issue and Mozilla's defense. It's a highly enlightening read; I can't recommend it enough.

To summarize, the issue is that Facebook was seeing a higher rate of cache validation requests than they'd expect, and looked into it. Chrome produced an updated chart documenting different refresh behaviors [2], the spiritual successor of this now-outdated Stack Overflow answer from 2010 [3], and in response to Facebook's requests has re-evaluated some of its refresh logic.

In this thread, Firefox was being asked to do the same, but they pushed back on adding yet another heuristic and in turn proposed a cache-control extension. Meanwhile, Facebook proposed the same thing on the IETF httpbis list, where the response was not enthusiastic [4]; the feeling was largely that this is metadata about the content rather than a prescriptive cache behavior, and that the HTTP spec already accounts for freshness with age. One of Mark Nottingham's responses [5]:

(...) From time to time, we've had people ask for "Cache-Control: Infinity-I-really-will-never-change-this." I suspect that often they don't understand how caches work, and that assigning a one-year lifetime is more than adequate for this purpose, but nevertheless, we could define that so that it worked and gave you the semantics you want too.

To keep it backwards compatible, you'd need something like:

Cache-Control: max-age=31536000, static

(or whatever we call it)

[1] https://bugzilla.mozilla.org/show_bug.cgi?id=1267474
[2] https://docs.google.com/document/d/1vwx8WiUASKyC2I-j2smNhaJa...
[3] http://stackoverflow.com/questions/385367/what-requests-do-b...
[4] https://www.ietf.org/mail-archive/web/httpbisa/current/msg25...
[5] https://www.ietf.org/mail-archive/web/httpbisa/current/msg25...

nachtigall 3 hours ago 0 replies      
The last image about Squid proxy is included too small in the post, here it is in a readable way: https://hacks.mozilla.org/files/2016/12/sq.png
Stop Checking Email So Often nytimes.com
40 points by colinprince  1 hour ago   30 comments top 15
liquidise 1 hour ago 2 replies      
I don't mean to hijack the focus from email specifically, but I find Slack suffers from the same, if not a worse, issue. Without exception, the most productive days I have had since my team adopted Slack are the days when I forget to open it in the morning and achieve hours of uninterrupted work.

All of these conversations, be it open-office/email/slack/etc., are a rehashing of something we all know: fewer distractions lead to increased productivity. The trick is to balance that with collaboration and help channels, so that the product and collective productivity of a team is improved, not just an individual's.

paulmd 4 minutes ago 0 replies      
I think this is especially important for developers. Our jobs are very intellectually complex, we have to constantly maintain a mental model of the code as we trace a bug back from the place it manifests to where it originated, as well as when formulating a solution to correct the bug that won't break anything else in the program.

When you get popped out of "the zone", you don't just lose the two minutes it takes to answer your coworker's question; you also lose the 15-30 minutes of context you built up about the behavior of the problem.

Donald Knuth phrased it thusly: "Email is a wonderful thing for people whose role in life is to be on top of things. But not for me; my role is to be on the bottom of things. What I do takes long hours of studying and uninterruptible concentration. I try to learn certain areas of computer science exhaustively; then I try to digest that knowledge into a form that is accessible to people who don't have time for such study."

It's bothered me in the past; now I'm thinking about doing something about it, like disabling desktop notifications in Outlook for anything that's not flagged Important.

cagrimmett 1 hour ago 1 reply      
My method to stop compulsively checking messages yet still be responsive to my clients' support emails was to institute the Pomodoro Method (25 minutes on a task, 5 minutes off, repeat for the whole work day). I close off all communication (Slack, email, Skype, etc) for the 25 on-task minutes and only check them during the 5 minute breaks. If something urgent comes up, I dedicate my next 25-minute task period to dealing with that item.

This has broken me of my involuntary email/Slack/IM checking and instead keeps me focused on the task at hand.
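The schedule described above (25 minutes on, 5 off) is simple enough to generate programmatically; a toy sketch, purely to illustrate the rhythm:

```python
from datetime import datetime, timedelta

def pomodoro_schedule(start: datetime, cycles: int):
    """Yield (kind, begin, end) intervals: 25 minutes of focused work,
    then a 5-minute break for checking messages, repeated."""
    t = start
    for _ in range(cycles):
        yield "work", t, t + timedelta(minutes=25)
        t += timedelta(minutes=25)
        yield "break", t, t + timedelta(minutes=5)
        t += timedelta(minutes=5)
```

The point of the fixed intervals is that urgent items get picked up within at most 25 minutes, without interrupting the task in flight.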

zakki 1 minute ago 0 replies      
I should add "don't check HN too often" to myself.
jimmaswell 8 minutes ago 0 replies      
I just look at my notifications when I pick up my phone to check what new emails came in. If it's important I'll reply to it there; if I need a PC to reply, I'll leave the notification up so I remember later. I don't have any issue with this and it seems pretty much perfect to me. I don't get the problem.
v64 1 hour ago 3 replies      
When my manager stops freaking out when an email goes unanswered for 10 minutes (or I get a new job), then I'll stop checking my email so often.
bborud 32 minutes ago 0 replies      
I have no problems not checking my email. Email has become the dumping ground for ephemera: clingy bullshit from companies you have bought stuff from, people who whine, automated emails from so many different places despite adding new filters every week.

Want my attention? A handwritten note is actually the best way nowadays. Because I only check my work email once or twice per day, and I only check my personal email every two days or so.

unicornporn 55 minutes ago 1 reply      
Seriously. In this day and age, is email really the big threat? Email feels like bliss to me, compared to the constant bombardment of information via different instant messaging channels. Riot, Signal, Telegram, Wire and WhatsApp: yes, they're all installed on my phone, and they are constantly polling for my attention by blinking that notification LED on my Android phone. Yuk.
drops 1 hour ago 0 replies      
Nowadays there's not much reason to actually check the email manually because of all the mail-checking browser extensions.

This of course brings up another problem: getting emails as they arrive can be as distracting as the constant stream of notifications from messengers and team chats like Slack.

makecheck 1 hour ago 3 replies      
It might be neat to apply machine learning to something like incoming messages (whether E-mail or text or otherwise), to automatically determine the real importance of a communication based on how you classify other messages.

After awhile, it should be possible to achieve minimal interruption, based on what you consider worthy of an interruption.

I know that I don't like simple solutions like marking E-mail as important. In my experience, some people will always mark their stuff as important, and their definition is not my definition.
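The classify-by-example idea in the parent could be sketched as a tiny Naive Bayes over message words. Everything below is hypothetical toy code (the labels, class name, and training messages are mine), not a real mail filter:

```python
import math
from collections import Counter

class ImportanceModel:
    """Toy Naive Bayes: learn from messages the user has already labeled
    (e.g. "interrupt" vs "ignore") and predict a label for new ones."""

    def __init__(self):
        self.words = {}          # label -> Counter of word frequencies
        self.labels = Counter()  # label -> number of training messages

    def train(self, text: str, label: str) -> None:
        self.words.setdefault(label, Counter()).update(text.lower().split())
        self.labels[label] += 1

    def classify(self, text: str) -> str:
        tokens = text.lower().split()

        def log_score(label: str) -> float:
            counts = self.words[label]
            total = sum(counts.values())
            # Log prior plus Laplace-smoothed log likelihood per token.
            score = math.log(self.labels[label] / sum(self.labels.values()))
            for tok in tokens:
                score += math.log((counts[tok] + 1) / (total + len(counts) + 1))
            return score

        return max(self.labels, key=log_score)
```

Because the model learns from the recipient's own labels, it sidesteps the problem the parent raises: the sender's definition of "important" never enters into it.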

spraak 1 hour ago 0 replies      
s/Email/HN/g for me
6stringmerc 57 minutes ago 0 replies      
For a while Xobni was elite at being able to see metrics and dashboard-like stats for email communications. Not sure about the Infinity Version though...
edgarvaldes 1 hour ago 0 replies      
I check my email once an hour. I hate the feeling of imminent distraction, so I close the email client and open it again one hour later.
acbabis 40 minutes ago 0 replies      
When I saw this in the HN feed, my first thought was "I should see if I got that email"
jayajay 1 hour ago 0 replies      
If you are checking a feed like email, or chat, you are expecting a new item. If you fail to get a new item 9/10 times you check the feed, it certainly makes sense that you may feel stressed or feel like you have less control over things. You can't control when the items come into your feed. 9/10 times, you are hyping yourself up to expect something, and then getting disappointed when it's not there. I think this kind of result is true for anything that requires waiting for something out of your control. The train, the doctor's office, replies to your comment, etc.
Alphabet Announces Fourth Quarter and Fiscal Year 2016 Results abc.xyz
38 points by maverick_iceman  1 hour ago   24 comments top 4
theelfismike 53 minutes ago 2 replies      
Interesting that the effective tax rate went from 5% last year to 22% this year.

What would cause that?

MarkMc 27 minutes ago 1 reply      
Revenue is up 22% but EPS is up only 7%. I would normally expect Google's EPS growth to be higher than revenue growth. Where are they spending all that extra income?
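A quick back-of-the-envelope check of those two growth numbers (ignoring share count changes and one-off tax effects, which this simple ratio does not capture):

```python
revenue_growth = 0.22  # +22% revenue, per the report
eps_growth = 0.07      # +7% EPS

# If EPS grew slower than revenue, profit per revenue dollar shrank.
relative_margin_change = (1 + eps_growth) / (1 + revenue_growth) - 1
print(f"implied relative margin change: {relative_margin_change:.1%}")
```

So margins compressed by roughly an eighth in relative terms, which is the gap the parent is asking about.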
zitterbewegung 1 hour ago 1 reply      
Also, Alphabet (GOOG) missed expectations. https://finance.yahoo.com/m/e866d713-3773-3b26-83ce-07bb214b...
aresant 49 minutes ago 6 replies      

Paid clicks on Google properties +43% - "yaaay we're driving more paid search!"

Cost-per-click on Google properties -16% - "ouch our advertisers are seeing less value on these add'l clicks!"

To me the second part of this is going to be most interesting to watch - if the clicks they are onboarding are lower quality, this is going to be a net negative.

Craig Newmark donates $500k to reduce harassment on Wikipedia wikimedia.org
42 points by The_ed17  2 hours ago   18 comments top 7
vdnkh 27 minutes ago 2 replies      
> Blocking making it more difficult for someone who is blocked from the site to return

If they're talking about IP bans on viewing Wikipedia here, this is a terrible idea. If some troll gets banned on a college campus, that will result in the inadvertent ban of thousands of others connected to the same network. This line strikes me as naive.

netman21 27 minutes ago 0 replies      
I wish he would donate $500K to filtering scams from Craig's list.
Sir_Cmpwn 55 minutes ago 3 replies      
I can't believe half a million dollars is necessary to reduce harassment. I'm all for reducing harassment online, but that much money could make a serious difference applied in a different way. Look at Wikimedia's report on harassment (which this is in response to):


There's room for improvement, but not half a million dollars' worth of room for improvement. I suppose people will donate to what's important to them, though.

WhitneyLand 12 minutes ago 0 replies      
Is this talking about harassment by moderators? Serious.
akjainaj 10 minutes ago 0 replies      
Wikipedia is the worst online community I've seen in my entire life, and believe me, I've seen a lot. It's like a steel cage death match between autists and narcissists. I doubt there's enough money in this world to fix something like that.
sparkzilla 25 minutes ago 0 replies      
It's interesting that after 15 years of operation, Wikipedia does not apparently have decent tools to detect harassment /sarc.

There will be a lot of discussion about the symptoms here, but the cause is straightforward: Wikis are built through conflict, and much of that conflict involves harassment, doxxing etc. Ask anyone who has tried to edit any major page.

The real solutions to harassment are counterintuitive: enforce full anonymity, take measures to stop people and gangs "owning" pages, stop using a system that lets any user at any level veto other users' edits, have a proper editorial workflow, and many more. But none of these will ever happen, so the harassment will continue.

It should also be noted that the Wikimedia Foundation just raised millions of dollars in its latest fundraising drive, and has millions more in the bank, so it really doesn't need the money.

throwaway420 28 minutes ago 0 replies      
Not a huge fan of this.

The editors on Wikipedia wield a large amount of power in shaping the site.

When they go and make arbitrary decisions about the content on the site, and users start calling out the editors for bias and bogus decisions, well now all of a sudden the crooked editor can just cry "harassment and cyberbullying!" and go a long way to shutting down rational criticism.

Google Brings AI to Raspberry Pi bbc.co.uk
134 points by Lio  4 hours ago   40 comments top 13
general_ai 1 hour ago 3 replies      
Since the article doesn't have any information whatsoever, here's my prediction: by "bringing AI to Raspberry Pi" they mean being able to call their cloud APIs from there.

TensorFlow is not suitable for anything practical on the Pi. You can certainly get it to run there, but CPU vector math on resource constrained devices is not going to be a forte for a framework designed primarily for quickly iterating over models on a GPU workstation or a multi-GPU server. TF very much likes to have a very beefy GPU.
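If the prediction above is right and "AI on the Pi" means calling Google's cloud APIs, the Pi-side code is trivial: capture an image, base64 it, and POST it. Here's a minimal sketch against the Cloud Vision v1 REST endpoint; the request shape follows the public Vision API docs, but the API key is a placeholder you'd supply yourself, and network access is obviously required for the actual call.

```python
import base64
import json
import urllib.request

# Cloud Vision v1 annotate endpoint (per Google's REST docs).
VISION_ENDPOINT = "https://vision.googleapis.com/v1/images:annotate"

def build_label_request(image_bytes, max_results=5):
    """Build the JSON body for a single label-detection request."""
    return {
        "requests": [{
            "image": {"content": base64.b64encode(image_bytes).decode("ascii")},
            "features": [{"type": "LABEL_DETECTION", "maxResults": max_results}],
        }]
    }

def annotate(image_bytes, api_key):
    """POST the request to Cloud Vision; needs a valid key and network."""
    body = json.dumps(build_label_request(image_bytes)).encode("utf-8")
    req = urllib.request.Request(
        VISION_ENDPOINT + "?key=" + api_key,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

The Pi does nothing compute-heavy here, which is exactly the point: the "AI" lives on Google's servers.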

toisanji 3 hours ago 4 replies      
I built a robot with tensorflow running on the raspberry pi to do autonomous driving and computer vision: http://www.jtoy.net/portfolio/

I'm working on the next version to make it more useful, but all the technology is not there yet, I want the robot to be able to understand speech and talk back to users. I also want the robot to be able to play games with people. I think the platform has a lot of potential. I want Google to release a low power tensor processing unit made for the pi to make this more useful. This will open up a lot of doors for robot and AI enthusiasts. I'm looking to turn this into a platform, contact me if this is of interest to you.

madenine 4 hours ago 1 reply      
"Google has asked makers to complete a survey about what smart tools would be 'most helpful'.

And it suggests tools to aid face and emotion recognition, speech-to-text translation, natural language processing and sentiment analysis.

Google has previously developed a range of tools for machine learning, internet of things devices, wearables, robotics and home automation."

That's the meat of it. Google put out a survey - speculation ensues.

bloaf 50 minutes ago 0 replies      
Wolfram would probably argue that they've already brought AI to the Raspberry Pi.



ungzd 4 hours ago 1 reply      
The article conveys almost no information. Why are speech synthesis, NLP and so on called AI? What's special about the Raspberry Pi; isn't it a regular general-purpose computer, just small? Will it run locally or on Google's servers? Why does the BBC publish articles of such poor quality?
zython 4 hours ago 0 replies      
Kind of a misleading title; nothing is released per se yet.

They've just announced that they might release something ML/AI related in 2017 for the Pi.

reitanqild 4 hours ago 1 reply      
Wondering if they just mean providing an api that RPis can use or if they are somehow going to get meaningful "AI" running on a RPi?
b1gtuna 3 hours ago 0 replies      
Glad I am not the only one who thought the article lacked real info.
anfractuosity 4 hours ago 2 replies      
Sounds cool! I wonder if this will be with TensorFlow. If so, I wonder, out of interest, whether the RPi's VideoCore has any capability to accelerate such things.
thetrevdev 4 hours ago 0 replies      
Tired of seeing this article when the word AI doesn't appear once in it and it's just an announcement of untold mysteries sometime in the future....
awqrre 2 hours ago 1 reply      
I don't want Android on my RaspberryPi...
shahbaby 4 hours ago 0 replies      
Thank you, commenters, for saving me from wasting my time reading the article. This is why I love Hacker News.
g123g 2 hours ago 0 replies      
More fake news from BBC.
Instrumentation: The First Four Things You Measure honeycomb.io
100 points by cyen  4 hours ago   21 comments top 8
ackerman80 1 hour ago 1 reply      
Came across this which gives good insight into the 4 golden signals for a top-level health tracking: https://blog.netsil.com/the-4-golden-signals-of-api-health-a...

One thing of note in the graph is the tracking of response size. This would be very useful for 200 responses with "Error" in the text. Because then the response size would drop drastically below a normal successful response payload size.

In addition to Latency, Error Rates, Throughput and Saturation, folks like Brendan Gregg @ Netflix have recommended tracking capacity.

anw 2 hours ago 0 replies      
While this is good advice, I feel it is a bit too over-simplified.

Counting incoming and outgoing requests misses a lot of potential data points when determining "is this my fault?"

I work mainly in system integrations. If I check for the ratio for input:output, then I may miss that some service providers return a 200 with a body of "<message>Error</message>".

A better approach is to make sure your systems know how data is received from downstream submissions, and to have a universal way of translating that feedback into a format your own service understands.

HTTP codes are (pretty much) universal. But let's say you forgot to include a header, or forgot to base64-encode login details, or are simply using the wrong value for an API key. If your system knows that "this XML element means Y for provider X, and means Z in our own system", then you can better gauge issues as they come up, instead of waiting for customers to complain. This is also where tools like Splunk are handy, so you can be alerted to these kinds of errors as they come up.
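The per-provider translation idea above can be sketched in a few lines: each provider gets a small rule that maps its raw response (even a 200 with an error body) onto one internal status vocabulary. Provider names, the XML shape, and the status labels here are all made up for illustration.

```python
import re

def classify_acme(status_code, body):
    # Hypothetical "ACME" provider returns HTTP 200 even on failure;
    # the truth is in the XML body.
    if re.search(r"<message>\s*Error\s*</message>", body):
        return "provider_error"
    return "ok" if status_code == 200 else "transport_error"

def classify_generic(status_code, body):
    # Fallback for providers that use HTTP status codes honestly.
    if 200 <= status_code < 300:
        return "ok"
    return "client_error" if status_code < 500 else "provider_error"

CLASSIFIERS = {"acme": classify_acme}

def classify(provider, status_code, body):
    """Translate any provider's response into our internal status."""
    return CLASSIFIERS.get(provider, classify_generic)(status_code, body)
```

With this in place, alerting (Splunk or otherwise) keys off the internal status rather than off each provider's quirks.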

bbrazil 3 hours ago 2 replies      
> A histogram of the duration it took to serve a response to a request, also labelled by successes or errors.

I recommend against this, rather have one overall duration metric and another metric tracking a count of failures.

The reason for this is that very often just the success latency will end up being graphed, and high overall latency due to timing-out failed requests will be missed.

The more information you put on a dashboard, the more chance someone will miss a subtlety like this in the interpretation. Particularly if debugging distributed systems isn't their forte, or they've been woken up in the middle of the night by a page.

This guide only covers what I'd consider online serving systems, I'd suggest a look at the Prometheus instrumentation guidelines on what sort of things to monitor for other types of systems: https://prometheus.io/docs/practices/instrumentation/
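The pattern the parent recommends, one duration histogram covering all requests plus a separate failure counter, can be sketched without any particular metrics library (bucket boundaries below are hypothetical; in practice you'd use your metrics client's histogram type):

```python
import bisect

# Hypothetical latency buckets, in seconds.
BUCKETS = [0.05, 0.1, 0.25, 0.5, 1.0, 2.5, 5.0, float("inf")]

class Metrics:
    def __init__(self):
        self.bucket_counts = [0] * len(BUCKETS)
        self.failures = 0

    def observe(self, seconds, failed=False):
        # Every request lands in the same histogram, so a timing-out
        # failure still drags the overall latency picture where it
        # belongs, instead of vanishing from a success-only graph.
        self.bucket_counts[bisect.bisect_left(BUCKETS, seconds)] += 1
        if failed:
            self.failures += 1

m = Metrics()
m.observe(0.04)              # fast success
m.observe(5.0, failed=True)  # timed-out failure, still counted in latency
```

Graphing overall latency from this histogram and error rate from the counter avoids the "success-only latency" blind spot described above.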

siliconc0w 1 hour ago 0 replies      
Every request to and from the app should be instrumented. Paying attention to the requests to the app is a good start, but you really need detailed instrumentation of all downstream dependencies your service uses to process its requests to understand where the issue is. It's often likely you're slow or throwing errors because a dependency you use is slow or throwing errors. Or maybe the upstream service complaining has changed its request pattern and is making more expensive queries. There is often a small minority of requests that are responsible for most of the performance issues, so even if the overall volume hasn't changed, the composition and type of requests matter as well.
techbio 3 hours ago 4 replies      
Author appears to use "downstream" and "upstream" to refer to "further down the stack" and "further up the stack".

Is this normal usage? Seems reversed to me.

vaishaksuresh 3 hours ago 1 reply      
Off Topic: Does anyone know what tools the author uses to make the diagrams?
jdormit 3 hours ago 1 reply      
Is the last paragraph a joke? If so, could someone explain it?
New Study Finds Performance-Enhancing Drugs for Chess worldchess.com
82 points by EvgeniyZh  3 hours ago   77 comments top 15
the_watcher 1 hour ago 1 reply      
The title is somewhat misleading - I assumed they discovered a new type of drug that was enhancing performance. Actually, it's just the first study to actually bear out the (not particularly shocking) conclusion that drugs that enhance cognitive performance help in a game that depends on cognitive performance.
WhitneyLand 15 minutes ago 1 reply      
How do they know it's not due to confidence boost?

These drugs can make you feel "good" and confident, which all top competitors spend a lot of time trying to build up.

zw123456 32 minutes ago 2 replies      
I actually did a test myself with pot a while back, also playing against a computer; my thinking there was that it would provide the most consistent opponent skill level. I tried varying amounts. I saw a fairly significant improvement with small doses, less than what someone would use for entertainment purposes, an amount just before what could be called a high. I think it may increase concentration at the right dosage level.
towaway 1 hour ago 1 reply      
Hah. Oddly enough this tallies with my experience - about 5 years ago I ended up buying a box of modafinil and took them periodically before work. Someone asked me what I thought of them and I reflected that I wasn't sure if they helped my work or not, but that I could beat my phone chess app on a harder difficulty setting than usual when I played on the way to work.
aantix 2 hours ago 6 replies      
I've been taking 400mg of Magnesium Glycinate two times a day along with 200mg of Theanine (also twice a day) with my usual two cups of coffee.

I feel fantastically focused without the jittery edge that accompanies caffeine alone.

cven714 3 hours ago 3 replies      
The study was done with rapid games, against computers, where they discarded human losses on time. Not very confident in those results. But hopefully the researcher gets the chance to try and reproduce and use classical time controls, and human vs. human opponents.
nv-vn 43 minutes ago 1 reply      
It would be interesting to see the effects of microdosing psychedelics for chess. In theory, it should improve problem-solving skills. While the effects of stimulants seem rather expected in this situation, I think data on microdoses of LSD or psilocybin would be very interesting because they could change performance in different ways.
beefman 1 hour ago 0 replies      
To actually demonstrate this, same-individual comparisons with and without the drug (on different days) should be made using IPRs.[1] I expect enhancement may be seen in novice players but not in players with a stable Elo rating.

[1] https://rjlipton.wordpress.com/2016/12/08/magnus-and-the-tur...

eutectic 3 hours ago 0 replies      
I wonder how nicotine would fare.
tedsanders 2 hours ago 2 replies      
Does this study actually provide strong evidence for performance enhancement?

Folks who took the drugs took longer to play their moves. They lost more games on time, but ignoring those games, their play was better and they consequently won more games.

But imagine an alternate study where players were given the instructions to take more time on their moves. In this study, they would have lost more games on time, but when they didn't lose on time, their play would have been stronger and they consequently would have won more games.

In such an alternate study, I wouldn't say their brains got better. They just made a different choice to play at a different point on their efficient time-quality frontier.

I wonder - how much did these drugs push the frontier outward, rather than just trading off to move along it?

Nonetheless, it's interesting that this study tried to answer why/how these nootropics work (maybe they make you more reflective) and also understand in what circumstances such drugs won't help (in situations where you're under time pressure).
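The frontier-vs-along-the-frontier question above is just arithmetic once you condition on finishing: expected score = P(not flagging) × P(win | finished). With some toy numbers (hypothetical, not from the study), you can see how a drug could raise quality-conditional win rate, also raise time losses, and still come out ahead:

```python
def expected_score(p_flag, p_win_if_finished):
    """Expected score when losses on time count as losses."""
    return (1 - p_flag) * p_win_if_finished

# Hypothetical: slower, deeper play wins more finished games but
# flags more often. Net effect is still positive here.
baseline = expected_score(p_flag=0.02, p_win_if_finished=0.50)  # 0.49
on_drug  = expected_score(p_flag=0.10, p_win_if_finished=0.58)  # 0.522
```

The same arithmetic shows why the effect should shrink or reverse under heavy time pressure: as p_flag climbs, the (1 - p_flag) factor eats the quality gain.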

mistercow 3 hours ago 2 replies      
Transhumanism sure has a long road ahead as far as people's attitudes toward self-enhancement.
mtw 1 hour ago 0 replies      
Ha! Should we ban coffee?!?
fdsfsaa 2 hours ago 2 replies      
Why is it surprising that nootropics exist? Half the good people in tech are on them. And there's nothing fucking wrong with that. "My body, my choice", as the slogan goes.
yarou 1 hour ago 2 replies      
Anecdotally, Modafinil improved the functioning of the right hemisphere of my brain. I noticed this type of enhancement with Phenibut as well, my "emotional intelligence" was increased by one or two orders of magnitude.

On the other hand, Adderall increased my raw intelligence by several orders of magnitude. I strongly believe that amphetamines should be decriminalized and unscheduled, as they are the only class of pharmaceuticals that are proven to increase cognition with a fairly limited side effect profile.

Show HN: Ship 2.0 A macOS Native Interface to GitHub Issues realartists.com
128 points by kogir  4 hours ago   55 comments top 15
thedjinn 1 hour ago 3 replies      
This app installed webhooks on all of my repos with no automated way to remove them. Very annoying.
rsanheim 2 hours ago 2 replies      
This is an impressive amount of work to keep things fast and to avoid hammering GitHub with redundant requests, which can get you rate limited[1] pretty quickly.

Couple questions:

* I'm assuming you are using the actual end user's token for auth, so at least you have the 5000 requests per user per hour?

* It sounds like the Ship server has a lot of (potentially private) user data sitting around to keep things fast. What sort of security do you have in place to prevent data leaks or malicious access?

[1] - https://developer.github.com/v3/#rate-limiting

synthecypher 41 minutes ago 0 replies      
Nice. I use Bee, which supports GitHub and JIRA.


Unfortunately this application supports only GitHub issues.

kogir 4 hours ago 5 replies      
Fred, James and I are in the thread and happy to answer any questions about the product and how we built it. There's definitely more detail than made sense to include in the blog post.
nicky0 24 minutes ago 1 reply      
I got as far as the screen where it asks for full read/write access to my code, including all my private repos. I'm fairly sure you won't mess anything up or steal my secrets, but why does it need this?
bsimpson 1 hour ago 0 replies      
I'd be a lot more curious about it if it handled Pull Requests. (Can't find anything in the screenshots or docs to indicate that it does.) The biggest pain point I've seen with GitHub's current UI is that it's hard to track PRs as they evolve over time - old revisions and comments are often hidden away, with no way to indicate what's been resolved and what hasn't.

Though to be honest, I'd be skeptical even if it supported that. I imagine finding enough open source people who will pay $9 a month for a wrapper around a free tool is a challenging business model, and I'd hate to become reliant on it only to find it's been sunsetted due to the cost model.

Best of luck to the team! I don't mean to be pessimistic, just aware of the realities of the developer tools market.

koolba 1 hour ago 2 replies      
> Ship 2.0 is two parts, a hybrid Cocoa/ObjC/CoreData/JavaScript/React application that runs on your Mac, and a C#/MSSQL/Orleans server that runs in Azure. Oh, and a little bit of glue written in Python running on AWS lambda. Yes, it's truly an unholy alliance.

What compels someone to write a net new application (from scratch with no legacy DB) that uses MSSQL?

In my book it's in front of only DB2 and Oracle on the "Will not use unless you're already using it, migration is impossible, and you're going to have to seriously pay me to deal with this" list.

QuentinM 4 hours ago 1 reply      
If only this would support PRs / code review with revisions ... !
TAForObvReasons 3 hours ago 1 reply      
How does it compare to the official Github app: https://desktop.github.com/
zacmps 3 hours ago 1 reply      
Why did you develop just for OSX and would you consider supporting other platforms (Linux!) in future?
mcescalante 3 hours ago 2 replies      
For a new user, they may have no idea that this is a 30 day trial until they download and run the app, and I don't think that there is pricing information on the website anywhere. Not necessary to have any of this, just some feedback :)
davidcollantes 2 hours ago 0 replies      
Very nice, but $9/month to use?
niemyjski 2 hours ago 0 replies      
Brew support?
dbg31415 3 hours ago 1 reply      
Given all the 3rd-party integrations available to the web interface (ZenHub, Harvest, to... Grammarly even), what's the point of having a native MacOS-only interface? It seems limiting / backwards to have native applications when there's already a universal web interface.

Are there features that the web interface doesn't have?

mosselman 3 hours ago 1 reply      
I just tried the app and I think the typography of the issue list makes it hard to look at for me. Also, I doubt that the app is useful to me. What kinds of teams is it aimed at?
Tesla sues ex-Autopilot director for taking proprietary info, poaching employees techcrunch.com
18 points by dwynings  1 hour ago   10 comments top 2
DannyBee 19 minutes ago 3 replies      
"The suit accuses Anderson of having tried to recruit away employees from Tesla"...I hope not, since that's not a thing that's illegal.

Tesla should certainly know that most non-solicits are not enforceable in California (and I only say most because occasionally one is found valid, but the vast majority are not)

marricks 19 minutes ago 0 replies      
It's strange how, besides the blatant theft, this worked out pretty well for Tesla. They only lost three employees after the (alleged) attempted poaching, and they now have the guy who managed Swift for a while running Autopilot, who's exceedingly competent.
Launch HN: Voodoo Manufacturing (YC W17) AWS for Manufacturing voodoomfg.com
106 points by jschwartz11  5 hours ago   44 comments top 15
jschwartz11 5 hours ago 0 replies      
Hi Hacker News! I'm one of the co-founders of Voodoo Manufacturing (https://voodoomfg.com/) in the current YC W17 batch. Voodoo Manufacturing is an automated 3D printing factory capable of producing anywhere from 1-10,000 plastic parts in 2 weeks or less, at injection molding prices. The difference between us and other 3D printing services is that we use lower-end commoditized 3D printers rather than commercial or industrial machines. Today we have 160 MakerBot Replicator 2s. We started Voodoo because we believed these machines were capable of making high-enough quality parts and products for various use cases. Obviously we're not (yet) producing any high-performance parts for airplanes, etc., but we can make cost-competitive hardware components, and surprisingly we make a lot of promotional products (event swag, trophies, memorabilia, etc.)

Since launching Voodoo we've come to realize that traditional manufacturing is still really hard for most people and companies, and definitely falls short when you're producing parts in runs of less than 10,000 units. More than 3D printing, we're excited about the future of fully digital manufacturing -- going from a digital design to physical parts with virtually no upfront costs or overhead. Today we're cost-competitive with injection molding for runs of 1-10k units (the exact cross-over point depends on the part), but our goal is to reduce costs by 90% over the next 5 years, and thus become competitive for runs of up to 100k units. We also believe that 3D printing and other digital manufacturing technologies will continue to improve, from materials to machines, and so over time we'll become capable of producing high-quality and high-performance parts that can be used for more applications.

A couple of projects we've worked on include this current marketing campaign with Dixie (http://www.crushtomizer.com/) where they're using our API to submit orders to be manufactured and drop-shipped on-demand, and a pet-project where we printed a life-size body model in less than 24 hours (http://blog.voodoomfg.com/2016/06/10/a-life-size-human-model...). Today, we've worked with over 1,200 customers, from large brands to small startups, to help people make things that wouldn't be possible or economical with other manufacturing methods. We're excited about all the things currently happening within manufacturing, and are proud to be one of the companies trying to push forward and build a new type of factory, right here in the US.

We'd love to hear your thoughts on what we're doing, what you might want that we don't currently offer, or help you out if you have a project in mind. Thanks so much!

jc4p 3 hours ago 3 replies      
Are you just doing FDM printing or can you also do milling?

I'm very very interested in multiple possible $300-$1000 budget things for both. Edit: Looking at the other comments (I haven't priced anything out using the site yet, scared to do so b/c i don't like drip campaigns), some of my projects might be below this budget using you! Wow!

Is the goal that I use this service to make beautiful enclosures for my projects, or to do all the hardware needs for my project from design to complete?

I've mostly used Oshpark/Shapeways. I was looking into getting a desktop milling machine like an Othermill or Shapeoko, but from my testing they're still way too loud to have in my home office.

Do you cater to one-off things from people like me, or is the hope to be doing mostly B2B?

froindt 4 hours ago 1 reply      
Good to see a manufacturing startup in YC! I'm currently a grad student in industrial engineering and have been working with geometric analysis for manufacturability for around 2 years now.

What types of analysis are you doing on the parts to determine cost? Slicing, orientation optimization, volume analysis, full toolpaths with estimated time?

Does your analysis automatically look at the tradeoff between injection molding and 3d printing for a particular design which was uploaded?

I noticed the current interface requires all parts to be in the same unit. Any chance of a per-part unit selection (while keeping the batch)?

fudged71 3 hours ago 1 reply      
What made you choose to go through YC a second time after your startup Layer By Layer? Isn't the network there for you after the program?
antoniuschan99 3 hours ago 2 replies      
I did a price comparison vs Shapeways of the direct print which was 20 units.

Total cost was $158 for Voodoo and Total cost was $332 for Shapeways.

Is it true that you guys are cheaper and any reason why?

physcab 3 hours ago 1 reply      
"we make a lot of promotional products (event swag, trophies, memorabilia, etc.)"

Is there a gallery of products you've made? Also, do you do your own drop-shipping?

Looks cool!

qume 3 hours ago 2 replies      
We use 3D printing extensively, for large sized parts, and there are very few options out there.

Consider buying a few machines that can do very large parts. We would use you literally today if you could do that. It's a very underserved part of the market.

I'm talking apx 700mm x 400mm x 300mm

cindywu123 4 hours ago 1 reply      
I just got one of the Voodoo manufactured "coffee stoppers" from http://www.crushtomizer.com/ in the mail!
spraak 1 hour ago 0 replies      
Did anyone else think of 'OMFG' in the URL? I.e. I read it as 'Voodoo OMFG'
manglav 4 hours ago 1 reply      
This looks great! I just bought a Chinese 3D printer myself to prototype some parts. One thing I noticed was that QA was difficult. If I did a run of 10k parts, how would you QA them?
hengheng 2 hours ago 1 reply      
How do you compare to protomold?
killerham 4 hours ago 1 reply      
What's the pricing like? Or is it quoted per order?
ph0rque 4 hours ago 1 reply      
Really cool! I signed up for your mailing list, eagerly awaiting when you can offer materials other than plastic.
notnot 2 hours ago 1 reply      
Why no .SLDPRT support?
LinuxFreedom 3 hours ago 1 reply      
Please develop a different, biodegradable material - we already have too much plastic in the ocean and it is a major problem - you have certainly read about it; this is commonly known nowadays. I wonder how anybody would build a new business on the idea of flooding the world with even more poison.
Raspberry Pi 3 based home automation with Node.js and React Native github.com
250 points by sconxu  9 hours ago   56 comments top 13
pjungwir 6 hours ago 5 replies      
This looks like a lot of fun! Using a RaspPi is really attractive to me because you can make everything interoperate, and nothing depends on some company's servers.

I've done two Pi projects now that I keep meaning to blog about. One was a sprinkler control system. My old system was dying, and not very flexible, so I decided to run everything off a Pi. The tricky part was driving 13 sprinkler lines with 24V AC current. I bought a 16-relay board and eventually got it wired up. For someone with little electronics experience, there was a lot to learn. [1] is an attempt before I realized I had to use the relays. Eventually I got it working and used it all summer. With cron, I can schedule things however I want!

The second is a security camera for a vacation rental home, and is not quite done yet. The hardware side was not challenging at all, but I still need to work out how to copy the images up to S3 or a Linode. I'm using MotionEyeOS and it doesn't seem to know how to do that itself. One of the big reasons I went with a Pi is I didn't want to pay or rely on someone else's servers. Also I wanted to avoid the security problems that have been in the news lately. I don't want inbound traffic to my LAN; I'd rather push the video somewhere else.

It took me a long time to figure out worthwhile uses for a Pi. A friend of mine loves using these things for media servers and CI servers and whatnot, but to me it's only satisfying if it's something where you actually need the miniature scale. Also a Pi really hits the sweet spot for me in terms of hardware-vs-software. I'm sure I could have done the first project with an Arduino, but using ssh, cron, and python was really nice.

[1] http://raspberrypi.stackexchange.com/questions/50435/driving...
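The "with cron, I can schedule things however I want" part of the sprinkler setup might look like the crontab sketch below; the script path and its flags are hypothetical stand-ins for whatever Python script actually drives the relay board.

```shell
# m  h  dom mon dow  command
# Run zone 3 for 10 minutes at 6:00 on Mon/Wed/Fri, then zone 4 at 6:10.
0  6  *  *  1,3,5  /home/pi/sprinkler.py --zone 3 --minutes 10
10 6  *  *  1,3,5  /home/pi/sprinkler.py --zone 4 --minutes 10
```

Edit with `crontab -e` on the Pi; each zone run is just another line, which is the flexibility the old controller lacked.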

linker3000 2 hours ago 0 replies      
There's some really useful stuff going on there - nice write-up.

Anyone getting into this field should take a look at Peter Scargill's Tech Blog - he has published details and code for a home control system centered around a Pi using MQTT with a range of modules (mostly ESP8266). The most interesting recent stuff is on control, monitoring and dashboard design for phone and Web apps - his work on the dials and gauges is very good.

Pete also takes a regular look at other non-Pi platforms from an IoT control perspective.


redsummer 7 hours ago 3 replies      
I managed to get bilingual voice activation (Alexa and Siri/HomeKit - maybe Google Home in future) working with Home Assistant, homebridge, pi-mote, raspberry pi 3 and four energenie sockets. (In the US I guess you could use etekcity sockets)





I wouldn't call it simple to set up, but it was cheap - about 70 (not including Alexa device, which could even be the same pi - https://github.com/alexa/alexa-avs-sample-app/wiki/Raspberry... )

deepsyx 6 hours ago 2 replies      
Hello and thanks for the post! I'm the author of this repo. I would be happy to answer any questions :) Also any feedback/ideas are greatly appreciated!
Freestyler_3 1 hour ago 2 replies      
Ok, here is what I have been thinking of:

A Pi, many sensors, and controls. The Pi to do the things the Pi always does; the sensors to sense room temp in each room. And, this is the hardest one, flow control per radiator.

I want to be able to set room temp schedules, and to go off schedule using the app (manual intervention). When the current temp in any room is below the set temp for that room, the heating system turns on. The rooms that are already above their set temp have their radiators turned down.

Why? Because I don't like wasting heat on a room I don't enter 90% of the day. And when my living room has reached the target temp, there is always a room that is either still stone cold or feels like a sauna.

The hardest part about this is the controllable radiator valves, the rest already exists.
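The software half of this is simple enough to sketch (all names hypothetical): fire the boiler if any room is under its setpoint, and close the radiator valve in rooms that are already warm enough. The hard part, as noted, is the actuated valves, not this logic.

```python
def control_step(rooms):
    """rooms: {name: {"temp": current, "setpoint": target}}
    Returns (boiler_on, {name: valve_open})."""
    boiler_on = any(r["temp"] < r["setpoint"] for r in rooms.values())
    # Open the valve only in rooms still below target; rooms already
    # above setpoint get their radiator turned down.
    valves = {name: r["temp"] < r["setpoint"] for name, r in rooms.items()}
    return boiler_on, valves

state = {
    "living": {"temp": 21.5, "setpoint": 21.0},  # warm enough: valve closed
    "office": {"temp": 17.0, "setpoint": 20.0},  # cold: valve open, boiler on
}
boiler, valves = control_step(state)
```

A real version would add hysteresis (e.g. a 0.5 degree dead band) so the boiler doesn't short-cycle around the setpoint.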

dnadler 5 hours ago 2 replies      
Cool! I'm doing something similar, but more from the data-analysis side of things for my condo's efficiency.

I've been trying to keep a blog of my progress, if anyone is interested, though please forgive the poor grammar / stream-of-consciousness in the posts... I've been writing quickly to get caught up.


ryfm 2 hours ago 0 replies      
Cool, working on a similar project, but having a different stack

1. Z-wave switches/outlets/locks - all lamps, receptacles and locks controlled by Vera Edge;
2. DSC alarm system - door/flood sensors, integrated with Vera;
3. Nest cameras - not integrated;
4. Nest thermostat - integrated with Vera and Alexa.

Currently trying to integrate Vera and Alexa to have a fully voice-controlled home.

wiradikusuma 6 hours ago 1 reply      
Anyone know a good but cheap WiFi/Bluetooth-enabled adapter(?) for lightbulbs? E.g. so I can use any machine (not just a Pi) that supports WiFi/Bluetooth to control the bulb (at least on/off).

so: wall --> adapter --> lightbulb

z3t4 4 hours ago 1 reply      
Any ideas how to run cables? I'm currently using radio but it can be noisy with many units.
nialv7 4 hours ago 1 reply      
Why is everyone trying to write everything in JS nowadays...?
dvh 7 hours ago 4 replies      
I'm gonna repost comment made by niftich half year ago because it's spot on:

 Internet of Things isn't about people hooking up jailbroken Kindles to one-way mirrors to show the weather. It's not about Ardunios and Raspberry Pis being used to collect some data, move some servos, and make a blog post about it. It's about big money to be made by introducing new monetization channels in places there were none before.

aarondf 7 hours ago 2 replies      
FYI: This was submitted two other times in the past 19 hours, making this the third.


brian_herman 8 hours ago 0 replies      
RRRRR GGGGG BBBBBB this is awesome!
A Deep Dive into HandBrake and Video Transcoding robservatory.com
114 points by tambourine_man  6 hours ago   38 comments top 12
msimpson 3 hours ago 4 replies      
This isn't really a deep dive into HandBrake or video transcoding; it's more of an examination of the presets that the wrapper provides.

You'd be better off diving into the foundation of HandBrake and using it directly:


Also, in before, "ffmpeg is too complicated for the average user ..." It isn't, you just need to RTFM.

doc_holliday 5 hours ago 1 reply      
Some interesting analysis here, but I think you are analysing the video far too heavily in the spatial domain, i.e. at frame level; this doesn't give particularly reliable results, as compression and hence artifacts vary from frame to frame over time.

Video ideally needs to be analysed in the temporal domain (i.e. as a sequence). I see no mention of the GOP structure or length chosen for the encode, which would need to be considered.

For example, the one frame you have chosen to compare could be an I-frame in some of the encodes and a P- or B-frame in others, which would result in slight variance in quality and artifacts.

KerrickStaley 5 hours ago 3 replies      
When I open Handbrake, it's usually because I have a video that won't play in some device/program. For example, I have an mkv but my Chromecast only supports MP4. I found out recently that Handbrake isn't always the best option.

Oftentimes you want to remux the video, which means copying it to a new container format without touching the underlying stream. Handbrake doesn't support remuxing (no idea why); you have to turn to ffmpeg on the command line [1]. Remuxing works if the video and audio codecs are supported by your target but the container isn't, and is very fast, about as fast as copying the file.

With remuxing you still have the option of changing the order of the audio tracks. For example, you can take a Cantonese film with a Mandarin dub and make the Mandarin dub primary so you don't have to fiddle with it (protip if you're learning Mandarin: all the good movies are in Cantonese but they all have Mandarin dubs).

[1] There are GUI options that I haven't explored, see below.
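The remux the comment describes is a one-liner with ffmpeg's standard `-c copy` stream copy; reordering audio tracks (e.g. making the Mandarin dub primary) uses `-map`. A small sketch that builds those command lines (file names are placeholders; actually running them requires ffmpeg on PATH):

```python
import subprocess

def remux_cmd(src, dst):
    """Copy all streams into a new container without re-encoding."""
    return ["ffmpeg", "-i", src, "-c", "copy", dst]

def remux_swap_audio_cmd(src, dst):
    """Remux, promoting the second audio track (0:a:1) to primary."""
    return ["ffmpeg", "-i", src,
            "-map", "0:v:0", "-map", "0:a:1", "-map", "0:a:0",
            "-c", "copy", dst]

def run(cmd):
    # Requires ffmpeg installed; raises if the command fails.
    subprocess.run(cmd, check=True)
```

Because no transcoding happens, either command finishes in roughly the time it takes to copy the file.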

billyhoffman 1 hour ago 1 reply      
It is really confusing and super annoying that the author keeps calls 2 totally different tasks "ripping".

First, the author calls extracting the video/audio of the movie from the Blu Ray (and circumventing the DRM) "ripping". Quote from Article "On the Blu-ray, the music video is 2:18 long. When I ripped it to the hard drive using MakeMKV, the final file size was 279MB."

But then the author also calls encoding the extracted video/audio using HandBrake or transcode-video "ripping". Quotes from the article: "I then fed this .mkv file to both HandBrake and transcode-video, running it through all 26 different conversion options. This table shows the results, sorted by ripping app first..." and "I then copied that same frame from all 26 of the ripped versions of the music video..."

No! No you didn't! You ripped the movie once, and then encoded that 26 times, each time using different options, to compare encode-time, size, and attempted to compare the output quality.

Honestly, confusing these 2 fundamental concepts makes me seriously question your entire experiment.

wolfgang42 45 minutes ago 0 replies      
Handbrake settings are hard. A few years ago I did a project that involved finding the appropriate settings to use[1]; it took me several weeks to go through all of the settings, look them up, figure out what they did, and test them to see what the correct value was. Some of the settings are (or were) really poorly documented--for example, why wouldn't you want iPod 5G support? Well, apparently it "breaks compatibility with certain players," though I couldn't find any data on which players would break.

[1]: http://www.linestarve.com/blog/post/rendering-html5-video-wi...

JamesSwift 1 hour ago 0 replies      
I really can't say enough about how great the transcode-video library is in terms of "set it and forget it".

When I cut the cord I started down the path of figuring out how to do all the optimizations myself, but quickly realized just how deep and technical that was. I stumbled upon Don Melton's library and have used it as a key piece of my batch processing pipeline ever since.

Definitely check it out if you are looking to convert arbitrary video files to be played back later on arbitrary devices:


SloopJon 24 minutes ago 0 replies      
In another post, the author mentions a particular Pioneer Blu-ray drive that he uses with his Mac. Are they all pretty much the same, or are some models preferred?
Jaruzel 5 hours ago 3 replies      
I've done shedloads (500+ discs) of rips through HandBrake, mostly DVDs and some Blu-Rays.

What I discovered was that unless you are ripping losslessly, your results will vary greatly from film to film.

The advice I give out is:

1. Start with the recommended defaults in Handbrake.

2. Rip a film you know well.

3. View that film on ALL platforms you are likely to view it on. [a]

If you are happy, you are done - keep ripping discs.

If not:

4. Adjust quality up or down and goto #2. Repeat until happy.

Do the above for each GENRE of film - Bright Action movies need different ripping specs compared to period dramas.


[a] I did not do this step originally, and the first batch of rips I did, looked good on my 24" monitor, good on my 50" plasma, and awful on my 9ft projector screen!
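For anyone following this loop from a terminal, steps 1 and 4 might look something like this (a sketch using HandBrake's CLI; the file names are placeholders, and preset names vary between HandBrake versions):

```shell
# Step 1: rip a film you know well with a recommended default preset.
HandBrakeCLI -i movie.mkv -o movie.mp4 --preset "HQ 1080p30 Surround"

# Step 4: re-encode with quality nudged up.
# HandBrake's -q is a constant-quality RF value: LOWER numbers mean HIGHER quality.
HandBrakeCLI -i movie.mkv -o movie-rf18.mp4 --preset "HQ 1080p30 Surround" -q 18
```

Keeping one known test film and only changing `-q` between runs makes the quality comparison in step 3 much easier.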

PaulHoule 1 hour ago 0 replies      
The critical problem I see with encoded films is what happens in high-action scenes. Often the detail is "good enough" for me until all of a sudden things start happening and you can really see degradation.

Thus I don't find their method of looking at stills to be all that compelling.

j_s 5 hours ago 0 replies      
Google Cache since I'm seeing a 500 error page:


relics443 4 hours ago 0 replies      
Ah I thought this would be more about how it works. Oh well, pretty informative nonetheless.
lossolo 5 hours ago 1 reply      
Both of these tools use the same encoders under the hood; what you see here is just a comparison of different settings in the default presets they provide. If you would like to experiment with x264 encoding yourself, you can find descriptions of the possible options here: http://www.chaneru.com/Roku/HLS/X264_Settings.htm
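Since both tools drive x264 underneath, experimenting directly is straightforward. A sketch via ffmpeg's libx264 wrapper (file names are placeholders):

```shell
# Encode with x264: -preset trades encode speed for compression efficiency,
# -crf trades file size for quality (lower = better quality; ~18-28 is typical).
ffmpeg -i input.mkv -c:v libx264 -preset slow -crf 20 -c:a copy output.mkv

# Pass raw x264 options through, e.g. a longer GOP (keyframe interval):
ffmpeg -i input.mkv -c:v libx264 -crf 20 \
  -x264-params keyint=240:min-keyint=24 -c:a copy output2.mkv
```

Changing one knob at a time against the same source is the only reliable way to see what each setting actually buys you.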
Show HN: Easily connect to a VPN in a country of your choice github.com
64 points by rodrigogs  5 hours ago   32 comments top 3
kobayashi 2 hours ago 3 replies      
It's bad enough of an idea to connect to an open/untrusted WiFi network - now we're showing HN how to connect to random VPNs all over the world? My cursory opinion is that this is the worst idea I've ever seen make HN's front page.
hathawsh 1 hour ago 1 reply      
I guess this tool is intended for command line operation and scripting. If all you want to do is get around regional restrictions to watch videos, see http://hola.org/ .

EDIT: Also, you should assume that any anonymous VPN service has a good chance of being spyware or even malware, so you should sandbox it in a virtual machine or similar.

simlevesque 4 hours ago 9 replies      
> Automatically connect you to a random VPN in a country of your choice.

Why would someone do such a thing?

Open-RethinkDB meeting notes #4 docs.google.com
82 points by deepanchor  5 hours ago   17 comments top 6
grizzles 40 minutes ago 0 replies      
The investors should get an automatic tax benefit for releasing the IP into the public domain. There are great companies that you've never heard of that die every day and nothing ever happens to their IP. It's a massive economic waste.
overcast 4 hours ago 0 replies      
This project is just too awesome to let die, I hope it continues development for years to come.
williamstein 4 hours ago 3 replies      
News: "Our initial plan was to acquire the intellectual property left over from the RethinkDB company, enabling us to relicense the code and use the name RethinkDB. After aggressively pursuing this plan since the company shut down in September, little progress has been made." Dang.
yeasayer 2 hours ago 0 replies      
Oh wow, 2.4 is out next week? I thought we'd be stuck with 2.3 for a long time. Top 1 daily Hacker News submission incoming.
jwr 3 hours ago 0 replies      
Very happy to see progress being made. There is nothing comparable to RethinkDB changefeeds out there.
tehchromic 45 minutes ago 0 replies      
Spin wants to bring dock-less bike sharing to the US techcrunch.com
50 points by yurisagalov  3 hours ago   36 comments top 10
vcarl 49 minutes ago 0 replies      
Hey, I tried to do this! I was CTO of a little student startup called A2B Bikeshare for a while. We called it "stationless" though :)

We got a prototype working with a custom Android device with touch, 3G, GPS, and a built-in credit card swiper in Lansing, MI, but ultimately we bit off more than we could chew. The team collapsed in mid-2015, and the company was just recently completely shuttered. It's validating to see a more skilled team put together a solid product, but damn it would have been fun to be that team haha.

Best of luck!

newy 3 hours ago 5 replies      
Hello HN! I'm Euwyn, one of the co-founders of Spin (also, 2x YC alum). We're excited about bringing the dock-less bikeshare model to our home city of SF.

I've lived in SOMA for years, and have always wanted to bike around, except: (a) it was always a hassle to deal with a bike after going around town, because I would inevitably end up Uber-ing somewhere after my first stop and (b) my bikes would get stolen :(

Would love to hear any questions you guys had about the business, and look forward to you guys riding our bikes soon.

ps. If you want to follow along or have any questions about Spin or startups in general, I'm 'euwyn' on Whale and Snapchat.

dsr_ 57 minutes ago 2 replies      
The problem I've always had: what I want from a bikeshare is to be able to exit the train station, grab a bike, bike the remaining kilometer or two to work and drop the bike. Later in the day I want to do the same in reverse.

That only works in a hub program if there's a dock next to my office building. (You can count on a dock being next to the train station.) Or I could pay for all-day rental of the bike, when I'm only using it for two ten-minute periods.

In a dockless system, does it actually get any better for my use case?

colept 1 hour ago 1 reply      
How is this better than docking stations?

I tried bike sharing and gave it up because the bikes were poorly maintained and replaced with cheaper bikes (thanks Citibike). Bikes fail, sometimes missing brakes, other times a flat tire.

Why are docking stations important? Because a couple times a week I notice the company pull up with a trailer and maintain the bikes. They swap out the broken ones and fill the stations.

Is Spin going to drive around the city tracking individual bikes to replace them?

Eridrus 1 hour ago 0 replies      
This reminds me of the April fools video Google put out about self-driving bikes. It would be neat if self-driving bikes could go to where they were needed.
mgberlin 3 hours ago 3 replies      
Your bikes are going to be stolen, stripped, and scrapped.
kmicklas 48 minutes ago 0 replies      
There are already more Citi Bike docks in NYC than I feel like is necessary for the service to be convenient. Within one block of my apartment there are already 3!
cjbenedikt 2 hours ago 1 reply      
Has worked in Germany for years. Sponsored by Deutsche Bahn.
RandomMaker 2 hours ago 1 reply      
This is a fantastic idea! Can't wait for it to go live :)
begotles123 2 hours ago 0 replies      
this looks amazing!
Run for Office: Find all elected offices you are eligible to run for runforoffice.org
343 points by smacktoward  6 hours ago   125 comments top 40
jMyles 5 hours ago 7 replies      
This doesn't appear to include student offices, which, especially at large state universities, are very powerful and misunderstood.

If you are a student, consider running for student office. It's likely (in fact almost certain) that your student government makes more decisions that impact your life and spends more of your money than the national government does.

They typically have multi-million dollar budgets, seats on powerful state-wide committees where tens or hundreds of millions are spent, and the ability to impact academic policy for students of the next generation.

Some also run sprawling field campuses, nature preserves or camps. I'm particularly fond of one that is run by the Albany Student Gov: Dippikill. Very special place.

Students in decades past fought very hard for the power and placement that today's student governments enjoy, and education "administrators" are always looking to carve them out and take them away. They need to be continually defended and used.

davidw 5 hours ago 3 replies      
I applied as a candidate for the local planning commission because I think that restrictive zoning is driving some portion of inequality in this country, and also has a lot of problems in terms of long term financial stability (see: Strong Towns). I've never done anything like that before; we'll see how it goes.
gknoy 4 hours ago 0 replies      
On a related note, Shea Silverman posted on his blog [0] about how he ran for office in an extremely cost effective manner. It was linked on HN about two months ago [1].

> This year I ran for Florida House of Representatives District 49 (Orlando). I lost, but I got 31% of the vote and I only spent $3000. My opponent got 69% of the vote and spent $100,000

He open sourced [2] several tools that he used to do this.

He advises participation in local hob-nobbing events (I presume this was a local Florida thing, but your local community may have similar), and More Facebook Advertising. He reminds us that signs should be big, and WILL be stolen. He also advised cultivating good relations with local media by doing press releases.

While he was running for a state office, you could probably use similar tactics for more local things.

0: http://blog.sheasilverman.com/2016/11/how-to-run-for-florida...

1: https://news.ycombinator.com/item?id=13008071

2: https://github.com/SheaSilverman

LeifCarrotson 5 hours ago 5 replies      
This would be much more useful if they displayed some election statistics and information on incumbents and political parties. The stated purpose of the tool is:

> Today, 40% of state legislature races go uncontested and the problem is worse at the local level. There is a crisis of leadership occurring in our democracy. We need more people to lead. We need more people to run for office.

> Run for Office is a free service that provides all the tools you need to launch a successful campaign whether you are a seasoned veteran or firsttime campaigner.

You won't "launch a successful campaign" if you are a nobody running against an incumbent who is supported by the dominant political party in a bright red or deep blue state. You'd need to be a seasoned veteran with wide name recognition who is very well connected, well liked, and also wealthy, but then you're not likely to be using this tool.

It's in the races where the incumbent is not running for reelection, the position is nonpartisan, or there's a lot of turnover that someone could say "Hey, I have a chance at that spot."

jimterrapin 5 hours ago 9 replies      
My name is Jim Cupples and I work on RunForOffice.org Thank you for the suggestions and comments.

I'm a politics nerd that likes local government, and believe it has dormant power for movements. I wrote a piece called The Bottom Carries the Top that explains my thoughts on that stuff.

Anyway, I agree with many of the comments, and we think the university positions suggestion is a great idea.

If you like, feel free to email me at jimcupples@gmail.com or cupples@nationbuilder.com

Hope to hear from you

GeneralMayhem 5 hours ago 1 reply      
I know this might run somewhat counter to the point of the site, but I'd love more information about what the effective qualifications for filling the positions are. Obviously any moron can get elected, but that doesn't mean I'd feel particularly good about, for instance, running for SF Public Defender without being a lawyer, or city treasurer without any background or knowledge in finance.
OliverJones 5 hours ago 2 replies      
Stale data, unfortunately. I could run for the US federal legislature, with an election day of Nov 8, 2016.

Also, no local elective offices in my jurisdiction.

tptacek 5 hours ago 2 replies      
There's a powerful social permission / imposter syndrome thing that works against people running for office, so an effective thing you can do even if you yourself can't bring yourself to stand for office is to convince friends to do it, and offer to support their candidacies.

I've found it's remarkably easy to have that conversation with people. Ultimately, telling someone you'd support them if they ran for elected office is a pretty major compliment. And you don't really need any context or permission to bring it up; like saying "that's a nice jacket", you can say it any time. More importantly: when someone they know gives them permission, people actually will consider running.

I would much rather get 10 of my friends into local office around the country than run myself, so that's the goal I've set for myself. I'm making decent progress!

You or your friends should consider (and probably prefer) local administrative offices, like park districts or water reclamation. You don't need to be a subject matter expert to run for these --- that's not expected of you (source: friends who have these jobs). If you're wondering "why bother?", the answer is that having any kind of elected office magnifies your influence with other representatives and stakeholders. It's also great practice.

Running for the kinds of offices everyone has heard of --- Congress and Senate --- is extremely expensive (in the multiple hundreds of thousands of dollars) and, depending on the state you're in, might have a large component of building relationships with your state party. I say this not to discourage people from trying --- especially if you're thinking of running D in an unopposed R district or vice versa --- but to set expectations. Running your first time is easier than you probably think it is, as long as you pick the right office to run for.

If you're on a (recreational) Slack right now, consider opening up a channel and inviting people into it and spontaneously getting a couple people interested in running. I was on IRC channels in the 1990s that started companies that later sold for 9 figures (and the #!w00w00 people can tell you better stories). Getting a couple friends elected to their local library board seems like an extremely reasonable expectation to set for yourself.

If you're organizing something to get people to run for local office, I'm interested in talking/helping. I think there are a bunch of useful applications to be built here, and also a lot of opportunities for people to get together to share encouragement and notes.

saurik 1 hour ago 0 replies      
Most people know they can run for federal and state offices, even if they don't know what district they are in (and this barely matters, as the election office will tell you); and while they don't know some of the more esoteric offices, those offices tend to not have much power.

What people usually don't know about is all of the local government positions that affect their daily lives: many of the things you care most about exist below the level of a city, and yet websites like this throw aside all of the water districts and park districts and community service districts as if they don't matter :/.

Honestly, as someone deeply involved in politics at this point, who burns a lot of time trying to educate people as to how government works and how to get involved, I believe this website--which is so bad it didn't even include the city council of a smaller city as a local result, returning only internal Democratic party positions--can only do more harm than good.

saycheese 3 hours ago 1 reply      
Anyone know of training for running for office?

For example, for women, there's this course:


jedberg 2 hours ago 0 replies      
This is handy, having a list of all the positions in one place with their filing deadline!

However, it is showing me a lot of "Next Election" times that were last year. I guess the data needs some cleanup.

Also, President seems to be missing. :)

CM30 4 hours ago 0 replies      
Interesting idea for a site. However, is there a version of this for people in countries outside of the US? Because as someone in the UK, I'd love to know what elected offices I'd be eligible to run for over here.

And I'm sure people in places like Canada, Australia, France, etc would be interested in a similar site for their country as well.

Still, seems pretty neat so far.

EternalData 4 hours ago 0 replies      
I always love technological solutions that ease the path forward to meaningful action.

I wonder if there's a way to algorithmically determine how a candidate with subpar funding could beat somebody established with much more (a playbook of sorts). That would be the next step forward -- sure, you can run, but it flips the dynamic when you're smart about running to win.

Bedon292 5 hours ago 0 replies      
I don't think just accepting an address is enough. Many of these positions have age and residency requirements. Or other requirements. I know they want to make it quick, but I am not eligible (yet) for most of the positions they listed, and would rather focus on a position I can actually run for.
robbyking 4 hours ago 0 replies      
I searched for Athens, GA, and two positions in Chicago were included in the results:


tedchs 3 hours ago 0 replies      
I was previously appointed to one of my town's local boards, reviewing aesthetics of new development. It was very rewarding in that I got to advocate for "fair" outcomes, sometimes as a lone dissenting voice. I also learned a ton about both government and architectural design. If I could do this in my mid-20s, so could many others.
kibwen 4 hours ago 0 replies      
This is a great demonstration of the ways that people with technical skills can contribute to improve their communities. Enabling others to exercise their civic duties is just as valuable as exercising them yourself.
a3n 3 hours ago 0 replies      
After recent marches and protests, and the real but not necessarily self-sustaining enthusiasm that they generated, people ask "but what now?"

And the answer to that is to run for your state legislature. It's there that voting rules, and support for incumbent national members of Congress, are set and enforced.

Join the smoke filled room that's close to home.

xwowsersx 5 hours ago 3 replies      
Is this a (clever) marketing stunt for NationBuilder?
1024core 5 hours ago 1 reply      
I work fulltime. Can we filter for positions that do not require a fulltime commitment?
jimterrapin 5 hours ago 1 reply      
My name is Jim Cupples and I work on RunForOffice.org

Thank you for the comments and suggestions. We agree with the comment on student governments, by the way, and would be happy to include them.

Anyone can email me at jimcupples@gmail.com if they like

tedchs 3 hours ago 1 reply      
Is this run by Nation Builder (very much not a non-profit)? There are a couple links there from the homepage.
buckhx 3 hours ago 1 reply      
I've thought about this in the past. If I wanted to run for governor of my home state in 20 years, what are things that I could be doing today to move towards that goal? I'd assume it would be running for smaller offices to gain some political experience and creating relationships.
throwaway729 5 hours ago 0 replies      
Is there any contact information for the site itself? The "APPLICATION GUIDELINES" and "submit application to" information for one of the positions I just viewed is inaccurate...
dmschulman 5 hours ago 1 reply      
Very cool! I like how you also provide resources to get one's candidacy started.
codingdave 2 hours ago 0 replies      
This does not seem to include any local offices... it begins at the state level for me. There are many city and county positions that are better beginnings.
jimterrapin 4 hours ago 0 replies      
I have to get to work, but would love to hear from anyone that's interested in this stuff. We are looking forward to building an API and helping people continue to build.

Mostly, we want people to simply know what they can run for and how to get on the ballot.

Inaccuracies on the site are my fault. I work with amazing students, mostly from USC, and we put it together as best we can. The shapefile stuff is also sometimes wonky. A GIS student from UCLA is the awesome person behind that.

downandout 1 hour ago 0 replies      
Just a technical note: it shows that the "next election date" for several offices in my area is November 8, 2016.
anigbrowl 2 hours ago 0 replies      
* unless you're a legal un-person in which case we'll exclude you from the political process as much as possible.
MichaelGG 5 hours ago 0 replies      
Doesn't seem to list president.
dsfyu404ed 3 hours ago 0 replies      
Granted I'm not sure the impact (if any) it has but holy crap that's a pretty one sided sponsors list for something that's not a partisan issue.
dragonwriter 3 hours ago 0 replies      
Some glaring bugs, like showing some "next" election dates that are in the past, but definitely a good idea.
timdorr 5 hours ago 1 reply      
Oddly, it gave me an office in Chicago when I searched my home address in Atlanta...
codeddesign 4 hours ago 0 replies      
After looking at NC, I was astonished to realize that a congressman makes $50k more than the state's governor.
felipelemos 4 hours ago 0 replies      
I find this incredible because politicians should not be a profession.
rc_bhg 4 hours ago 0 replies      
Is there a site like this except it tells me the exact day to go vote?
kuschku 4 hours ago 0 replies      
@mods: Could the title be changed? This seems to be US-only.
sbw 3 hours ago 0 replies      
What about local offices?
EGreg 5 hours ago 3 replies      
Why would you want to become a cog in a machine when you can build your own?

Many people run for office thinking they can change the fundamental culture entrenched around them. But then they realize that there are so many existing systems safeguarding the status quo, that one person can't do it. You'd need to hire a whole new administration.

Instead, what if organizations would contract with one another for services and respect each other's cultures as a take-it-or-leave-it kind of thing? Then market discipline would apply.

What's new in Swift 3.1 hackingwithswift.com
73 points by ihsw  6 hours ago   25 comments top 7
chris_7 5 hours ago 2 replies      
Great language-level features (especially the concrete extensions, finally), but I'm most interested in whether the compiler still crashes, whether it can correctly report error messages in code using type inference and closures, and whether incremental compilation works properly.
matt2000 4 hours ago 1 reply      
The original statement regarding Swift 4 was "the primary goals are to deliver on the promise of source stability from 3.0 on, and to provide ABI stability for the standard library." I hope this statement is still true. The Swift 3 upgrade was painful, I hope to see more stability out of the language from here on out.
adamnemecek 5 hours ago 1 reply      
Nice to see features I'm pumped about in a decimal version.
Entangled 46 minutes ago 1 reply      
I'm coding all my apps in Swift and I am never going back. Server, desktop, mobile, TV, watch, Linux, embedded, etc. One language to rule them all.
moflome 5 hours ago 0 replies      
I find myself now thinking in Swift and coding in ES6... looks like SwiftJS is dead [0] but notice the Cocos2D folks just released a Swift-to-Android feature [1]. I really wouldn't mind Swift uber alles... someday.

[0] https://github.com/shift-js/shift-js[1] http://forum.cocos2d-objc.org/t/swift-version-is-ported-to-a...

paxcoder 5 hours ago 4 replies      
They should finish the free Foundation implementation[1] already which is the true standard library. Swift is not very useful without it, and they're talking about 4.0. It seems like Apple's giving their minimum to the community


shujito 5 hours ago 0 replies      
I hope that after swift 4 new changes are non breaking
Tech Creation and Corporate Survival: Why the Shutdown of Vine Matters thetechladder.com
37 points by Heffay626  3 hours ago   2 comments top
justinucd 3 hours ago 1 reply      
"Vines business problems throughout the years poor management from directors leaving every year, competitors such as Instagram implementing their own short video features, massive lack of innovations to keep users, and their biggest stars from jumping to other platforms." Seems to sum it up there: lack of innovation, adaptation, and forward thinking.
I Am a Recruiter (a.k.a Spammer) at Amazon devway.xyz
59 points by amirsaboury  2 hours ago   46 comments top 18
joshstrange 1 hour ago 5 replies      
People in this thread complaining about "humble bragging", "You're lucky to get recruiter emails", "People wish they were in your position" are missing the point.

This is NOT a "All these amazing companies are trying to throw money at me but I don't want it" it's a "These people keep offering me a job I don't want, given their requirements, and I'm not at all guaranteed it even if I do accept their requirements". This is akin to getting credit card offers in the mail IMHO. Getting recruiter "spam" (calls or email) is 99.99% of the time NOT an indication that you are a great developer but rather you have made some sort of an email list.

Sure, if companies were contacting me with better jobs than I currently have, or jobs that I'm a shoo-in for (i.e. they've done their research on me and they want ME, not just someone who has certain keywords on their LinkedIn profile), then I'd be super happy and a blog post about that would be a humble brag. But this is different. These recruiters spam thousands of people with the same form letter so they can pull in a handful and get a percentage of their salary for the first 3/6/12 months they are employed at their new job.

mabbo 1 hour ago 3 replies      
It's the bullshit that gets me. The lies.

"I read your profile and I think you'd be perfect for this role" - Amazon recruiter, who didn't notice I'd stopped working at Amazon 4 weeks before, as clearly indicated on the profile they 'read'.

They aren't even outside of the norm. Technical recruiters are about 90% bullshitters who would say anything at all if it meant they got you into an interview. I'm with my current company in large part because our recruiter is in the other 10%.

koolba 1 hour ago 1 reply      
> I want to remind the Amazon human resource and recruitment team that you have many talented software developers in your company. And, these folks can create a database of do-not-want-to-move candidates for you, and you can ask your recruiting team not to spam them! A simple cross join!

That'd be an anti-join, not a cross join.

On a more serious note, there should be a "Do Not Call List" for email addresses. It'd be damn near impossible to implement for everything as I doubt Nigerian princes will bother checking it, but a "name and shame" campaign against companies and their recruiting affiliates that ignore it would work.

Course the other problem is that having such a list and such a service also provides a way to verify the validity of an email address.
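The do-not-contact filter described above really is an anti-join, and it doesn't even need a database. A sketch with plain shell tools, assuming hypothetical files with one email address per line:

```shell
# Keep only the candidates NOT on the do-not-contact list (an anti-join).
# -f: read patterns from a file, -F: treat them as fixed strings,
# -x: match whole lines only, -v: invert the match.
grep -v -x -F -f do_not_contact.txt candidates.txt
```

The `-F` and `-x` flags matter here: without them, an address on the list could be misread as a regex or match as a substring of a longer address.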

cwmma 14 minutes ago 1 reply      
Usually I ignore recruiter spam, but a while back I got this one that started off with the line

> I recently came across your blog (one picture of your cat had me dying).

I'm fairly certain the recruiter did not read the whole blog post [1], because it is a eulogy for my dead cat. Normally I'm fairly calm and try to be understanding and empathetic, especially towards people just doing their job, but this put me in such a white hot rage. Not only did I email the recruiter back telling them off, but this was the kind of rage where you are no longer acting impulsively, but have come back around the other side to being calm and rational in your rage, so I tracked them down on Google, figured out who their boss was, and CC'd them on it. I felt a little bad about it later; I've had shitty sales jobs too. But man, rule 1, page 1 of bland icebreaker email openers has got to be "make sure it's not a god damn eulogy".

1. http://calvinmetcalf.com/post/94533627852/a-thousand-days-of...

jaypaulynice 9 minutes ago 0 replies      
The worst a recruiter can do is call your company's main line and ask to speak to you (found your name on LinkedIn). And then they're like: "Hi Jay, how are you?" like you're best buddies.
CaptSpify 1 hour ago 0 replies      
> What bothers me is the fact that, on average, every three months they approach to the same potential passive candidate for the same type of position!

This isn't at all exclusive to Amazon. I get these all the time from other places. Bad recruiters abound inside and outside of Amazon

ffef 1 hour ago 0 replies      
If only there was a way to filter emails you don't want to receive.
monster_group 40 minutes ago 0 replies      
I don't mind getting such emails. Every now and then there's a free happy hour invitation in them. I try to attend as many as I can. Usually they are good opportunities to see what companies in town are up to (and of course free beer and snacks). Some manager at the happy hour ends up asking me "Why are you looking for a change?" to which I reply "I am not. Your company invited me, so I came. Now it is up to you to convince me to interview for your company".
b_t_s 34 minutes ago 1 reply      
Yea, Amazon's the most spammy of any known company, but the one that really blows me away is Apple. They occasionally send me recruiter spam at my work email. The work email that they got from my team membership in my employer's iOS developer account. That's pretty brazen.
pfarnsworth 28 minutes ago 0 replies      
I've threatened spamming recruiters with the CAN-SPAM Act. I have no idea if I have legal grounds to do it, but it certainly stopped these particular recruiters from contacting me again, at least for now.
andrewstuart 44 minutes ago 1 reply      
The reason they do it is because it works.
laddng 1 hour ago 2 replies      
I kind of like to look at these kinds of posts as if they were a very attractive person complaining that so many people are hitting on them. There are a lot of people who would do anything to be in your position, with recruiters constantly offering them interviews.
PaulHoule 1 hour ago 1 reply      
At my local startup accelerator I find that AMZN has contacted many other devs in my area. I talked to the hiring manager, they are a nice bunch of folks working on a revolutionary problem but, alas, I am not moving to Seattle either.
astdb 32 minutes ago 0 replies      
From what I hear, Amazon and AWS have totally different cultures (AWS being much better)
anon987 37 minutes ago 0 replies      
Recruiters are salespeople. Treat them as such.
blazespin 50 minutes ago 0 replies      
Do what I do. Just accept it, let them fly you in, enjoy a day of a nice hotel and some interviews. I'm not particularly serious about accepting, but they don't seem serious about listening to me so, hell, I just gave up and enjoy the ride.

Make sure you keep your receipts and bill them, though. Eat out at a nice restaurant!

And if you flub the interviews, who cares. It's great practice.

lago 1 hour ago 2 replies      
humble brag, move along
zython 1 hour ago 1 reply      
What is the problem, exactly?

Don't want to receive emails? Filter them out.

Don't want to be hired by Amazon? Contact their HR directly, send a restraining order, whatever; ranting on Medium will not solve the author's problem.

This makes me question whether the author is secretly just clickbaiting for attention.

Proofs are Programs: 19th Century Logic and 21st Century Computing (2000) [pdf] ed.ac.uk
111 points by michaelsbradley  8 hours ago   62 comments top 10
tpetricek 7 hours ago 3 replies      
People like to see logic and computing in this way and the author is a great storyteller. But it is a somewhat misleading perspective that is kind of similar to "Whig historiography" (https://en.wikipedia.org/wiki/Whig_history):

> Whig history is an approach to historiography that presents the past as an inevitable progression towards ever greater liberty and enlightenment, culminating in modern forms of liberal democracy and constitutional monarchy.

Just replace the "greater liberty" bit with what computer scientists (and the author) argue is the right way of doing computer science today.

I'd love to read a historical treatment of the topic that is written by someone who is a historian rather than computer scientist twisting history to support their world view.

jwtadvice 3 hours ago 4 replies      
Anyone really familiar with this: are non-constructive proofs, things like Cantor diagonalization or Erdős's probabilistic method, programs? Are they just non-terminating programs?

I can see Cantor Diagonalization as a non-terminating program really easily, but I don't understand what the Probabilistic Method looks like as a program.
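
Interestingly, Cantor diagonalization needs no non-constructive step at all: read as a program it is terminating, not non-terminating. A minimal sketch in Python (untyped, so this shows only the computational half of the correspondence; the names and the sample enumeration are mine, not from the paper):

```python
def weaken(x):
    # Proof of A -> (B -> A): given evidence for A, ignore evidence for B.
    return lambda _y: x

def diag(f):
    # Cantor's diagonal argument as a terminating program: given an
    # enumeration f of infinite 0/1 sequences (f(k)(j) is row k, column j),
    # return a sequence that differs from every row.
    return lambda n: not f(n)(n)  # flip the n-th entry of the n-th row

# An arbitrary sample enumeration: row k is j -> (k + j) is even.
rows = lambda k: (lambda j: (k + j) % 2 == 0)
d = diag(rows)

# d differs from row n at position n, for every n we sample:
print([d(n) != rows(n)(n) for n in range(5)])  # [True, True, True, True, True]
```

The probabilistic method is the harder case: in its usual form it proves existence without exhibiting a witness, so (roughly speaking) it reads as a proof in a classical, non-constructive logic rather than as a program you can run, unless the particular argument happens to be derandomizable.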

agentultra 7 hours ago 2 replies      
I love Wadler's papers. He did one on Monads that was approachable and revealing for me. I also love logic and the predicate calculus.

Highly recommend reading this one. :)

cr0sh 6 hours ago 5 replies      
This may be slightly OT - but the one thing I have always wondered about the 19th century is why Babbage didn't go down the route of Boolean logic/math for computation. Furthermore, why didn't he incorporate and use relays (from telegraphy)? For that matter, why didn't Hollerith?

Ultimately, to me it seems all connected to this "fixed" idea (that is, thinking "inside the box") that computation was about numbers - that there was only calculation and tabulation - counting, iow.

While Boole came up with the math/logic - few pursued the idea of making machines to work with it (a few -did- exist, but were obscure, iirc); embodying logic using electrical circuits would have to wait. Then there was the "revelation" of computation being about symbols, not numbers - which opened everything up.

I'm just kinda rambling here; my apologies. And, hindsight is always easier than foresight, of course. It just bugs me, and makes me wonder what "obvious" ideas we are missing today, that will change the world, that we won't know until we look back. I guess I wonder why many of the most world-changing ideas (in the long run) are often very simple, and why it took so long for humanity to see/discover them. I wonder if anyone has studied this - or if it is something that can be studied...?

mrcactu5 1 hour ago 0 replies      
i am having my doubts these days.

i would like to read my programs as proofs. then I ask myself what is the Amazon.com front page proving?

we talk about proof writing in Haskell or Agda. What about Python? Can we prove things there?

And finally I think about the statements I would actually like to prove. Much of mathematics requires second-order logic or beyond, such as the Fundamental Theorem of Calculus.

So for me the triad between Math and Logic and Computer Science is less than air-tight and will remain that way.

imode 3 hours ago 0 replies      
Curry-Howard and Wadler's "Propositions as Types" talk at Strange Loop opened my eyes pretty early on to the sweet, sweet utility of logic in my everyday programming. It sounds odd to start with, but I never realized why these things mattered prior to viewing this talk.

props to wadler. best ~40 minutes I've ever spent.


27182818284 8 hours ago 0 replies      
I had math professors that would express this sentiment in undergrad. It is very true.

However, problem sets generally aren't graded like they're programs, haha.

didibus 5 hours ago 2 replies      
Write a program to prove another one. It's pretty cool in concept, but is it useful in practice?
pron 7 hours ago 5 replies      
That's certainly a very language-and-logic perspective on things. But one of the reasons Church missed the general concept of computation whereas Turing didn't was precisely because Church was stuck in the formalism of logic, and couldn't recognize that computation extends well beyond it. So proofs are programs, but so are, say, metabolic networks, neuron networks, and just about any discrete dynamical system. But are metabolic networks proofs? That's not so clear, and therefore it's not so clear that there is a universal equivalence here.
anentropic 6 hours ago 0 replies      
shame about the misplaced apostrophes
KDE Slimbook slimbook.es
135 points by bananaoomarang  9 hours ago   88 comments top 21
dotancohen 2 hours ago 1 reply      
Why does the KDE Slimbook have a Windows key?
http://kde.slimbook.es/images/xheader.png.pagespeed.ic.WfFJL...

Could they not get a Tux or even the KDE Gear icon printed on the keyboard?

tedmiston 11 minutes ago 0 replies      
Just curious if anyone considering this has compared it to the Chromebook Pixel 2015 model. The Pixel is discontinued now but still available (and for half of its retail price). It seems like the two machines target a similar audience.
asciimo 6 hours ago 4 replies      
Here's another source: https://liliputing.com/2017/01/kde-slimbook-linux-powered-la... The i7 upgrade with 16GB looks good, though I'd prefer better screen resolution.
rocky1138 7 hours ago 3 replies      
This is great. The only trouble is that it's only got two USB ports and no Ethernet. Second, the USB ports are only 3.0, not 3.1.

Other than that, having a stable KDE Neon hardware platform is really exciting to me. I run KDE Neon full-time on my Macbook Pro.

Any other KDE Neon fans out there?

brudgers 7 hours ago 1 reply      
KDE announcement: https://dot.kde.org/2017/01/26/kde-and-slimbook-release-lapt...

Slimbook site seems to have collapsed under the internet's hug.

gravypod 6 hours ago 0 replies      
AdmiralAsshat 7 hours ago 2 replies      

Feel like for the price, you could get a comparable laptop that's a hair lighter and just throw KDE onto it. The only thing I see on there that makes it inherently more FOSS-friendly is the inclusion of an Intel wifi card, which has a kernel-supported driver in Linux.

I'm also not too keen on the bezel size. I guess I've been spoiled by my XPS 13.

ris 2 hours ago 2 replies      
Not again. I do wish the KDE project would stop getting distracted by quixotic hardware and/or mobile projects and concentrate on building a killer desktop. I hate to say it but parts of KDE today do feel worse than what we had ten years ago e.g. kmail2/akonadi which causes me pain on a daily basis.
johnnycarcin 5 hours ago 8 replies      
I was super excited about this until I saw it was another 13" screen. Does anyone enjoy using a screen that small (honest question)? To me anything under 15" just doesn't make sense, especially if 50% of my time is spent without an external monitor.
DeepYogurt 7 hours ago 0 replies      
There seems to be no mention of firmware on the site. Can someone in the know comment on if coreboot is supported?
Roritharr 5 hours ago 2 replies      
Please give me a 32GB Ram option.

If not possible, please tell me why. :(

olympus 3 hours ago 1 reply      
This looks remarkably similar to the HP Envy that I bought in 2015: https://www.amazon.com/gp/product/B015AD1ZFA/ The HP has a slightly different port arrangement on the sides and a higher resolution screen.

I'm sure the cost supports the KDE project, but you could buy the HP, install Linux yourself (if you're buying one of these you are probably capable of installing your own OS), and donate $50 to the KDE project and come out ahead-- assuming you live in the USA and can get the old Envy for $715.

RazrFalcon 1 hour ago 0 replies      
>CD-ROM/DVD - Not available

>Package Contents - 1 x drivers disk


shmerl 3 hours ago 1 reply      
KDE is great, though it feels like things take a long time to implement (such as the move to Wayland).
jasonkostempski 1 hour ago 2 replies      
Are there Linux users out there that would actually use a pre-installed OS?
tormeh 4 hours ago 0 replies      
I would be a lot more interested in a laptop from Red Hat or Canonical.
Splendor 3 hours ago 0 replies      
Very nice. I would buy this over a MacBook Air assuming two things:

* The trackpad isn't terrible

* The battery life is decent (6+ hours)

cJ0th 6 hours ago 2 replies      
The picture provided isn't big enough to be 100% sure but as far as I can tell the super key features the windows logo...
nmca 7 hours ago 0 replies      
This looks great; may well be my next laptop.
cies 7 hours ago 2 replies      
slimbook.es is down, anyone has the pricing info?
brilliantcode 3 hours ago 0 replies      
> €729,00 = $1021 CAD!

There's just no way this is going to compete with a Chromebook, on which you can install Ubuntu as well, or with buying a slightly older laptop and installing Ubuntu.

What is the minimum cost of such "slim" devices? I'm optimistic that Chromebooks will come down in price; armed with built-in cellular connectivity and an ARM processor for all-day battery life, they'd be the ultimate dev tool.

I enjoy using last year's MacBook but I equally enjoyed using Ubuntu on my older laptop for development purposes.

Nostalgia for Now wilsonquarterly.com
33 points by pepys  4 hours ago   5 comments top 3
ashark 2 hours ago 1 reply      
Spot on with our ability to preserve the past being the cause of this.

Someone born in 1940 didn't grow up watching their favorite children's movies over and over and over on VHS, you know? You missed something then, it was often just gone. Maybe there'd be a picture or two of the event (or stills from the film) in a book somewhere in the library. (granted, yes, now they can go back and see that stuff, probably, but they couldn't in, say, the 60s or 70s, for the most part)

It's been trending this way since the camera and phonograph were invented, but things really took off right around the '80s: the Walkman, home camcorders becoming more affordable and the media cheaper, the VCR, the cassette tape, and so on.

It's only going to get worse. Kids born today will very likely be able to walk around entire recorded 3D environments from their youth. Grandma's house at Christmastime when you were 9 years old? Done. Maybe we'll even be able to reconstruct semi-interactive versions of the people who were there. Your junior year high school dance? Don't see pictures and video of it, live it again. We'll see what effects that has on culture when it comes, I guess.

ENTP 2 hours ago 0 replies      
Nostalgia isn't what it used to be!
Nadya 1 hour ago 1 reply      
>A few seconds later, I looked back at the family to see how their picture was turning out. They were gone. They didn't want to see Niagara Falls. They wanted a picture so that they could be seen forever standing in front of Niagara Falls.

This line jumped out to me - because it is how my mother is. She wants a photo for the memory but never pauses for the memory itself. She was confused when I visited a foreign country for the first time and didn't take a single picture. Myself and the author seem to share the same feeling of wanting to experience something, not keep proof of it contained in a photo.

I still take photographs. But I do so to remember how things were. The photos I have of my family are mostly candid shots of them doing some mundane task and the idea of "3-2-1 cheese" is missing entirely. My grandmother bringing a birthday cake to my nephew, a smile on their faces. My sister investigating her sea monkeys with a look of curiosity. Things like that. Because they might not remember the experience - but I'm sure they'll remember being there. The experience, to me, is the important thing to photograph and not the fact they were there.

It's for that reason I find "selfie culture" to be somber. They aren't photographing the experience - they're documenting that "they were there" and they have proof they were because they have a selfie. But they know they were there so what is the selfie for?

Athelas: Our Road from Hack to Product ycombinator.com
44 points by craigcannon  4 hours ago   12 comments top 6
mifeng 2 hours ago 0 replies      
As a product geek, I love this. It's an awesome application of computer vision to solve an ancient and expensive problem in a new way. Just building the product and getting enough labeled data to a level when you can demonstrate proof of concept must have been a monumental achievement. Kudos!
avip 2 hours ago 1 reply      
I'm confused. You've submitted a 510(k), you call the device "pre-cleared", and you're distributing? Please elaborate.
dsuchr39 2 hours ago 0 replies      
The applications are quite far ranging. I am really impressed by the transparency of the Athelas founders.
ttandon 2 hours ago 0 replies      
Hi HN! Athelas cofounder Tanay here. Happy to share any more stories/answer questions about the post.
bcatanzaro 3 hours ago 2 replies      
Would love to hear about how this is different from Theranos.
eps 2 hours ago 1 reply      
This is easily one of the most impressive products that came out of YC in a while. Fantastic work. Break a leg with the FDA approval.
Show HN: Command Line Challenge cmdchallenge.com
90 points by zoidb  6 hours ago   45 comments top 9
glandium 21 minutes ago 1 reply      
It doesn't handle glob expansion.

 bash(0)> awk '{print $1}' access.log*
 + awk '{print $1}' 'access.log*'
 awk: cannot open access.log* (No such file or directory)

ianmcgowan 1 hour ago 0 replies      
This is a really fun idea, and I enjoyed going thru it. I like the fact that it doesn't matter how you get the answer, as long as it's right (seq 100 | tr '\n' ' ').

This would be a really useful product as part of the interview process for technical people. I'd ask candidates to self-identify their expertise level, let them skip questions, and consult Google as much as they wanted. You could have different subjects and maybe mix and match - shell, PowerShell, Python, Postgres, etc.

zoidb 6 hours ago 6 replies      
This was a small side project I completed recently, would love to hear feedback. Thanks!
teddyh 37 minutes ago 0 replies      

 python: command not found

the_watcher 3 hours ago 0 replies      
I like this a lot, thanks for putting together. Definitely made me think hard about how much of the command line I actually know.
_kst_ 2 hours ago 1 reply      

 bash(0)> pwd
 Internal Server Error

rando832 2 hours ago 1 reply      
Interestingly, locale can change one answer.

with LC_COLLATE=en_US.UTF-8,

sort faces.txt | uniq -c

5 ()

uniq treats () as equal to ().

Copied the file to my machine to test.

joelthelion 3 hours ago 2 replies      

 bash(0)> grep GET !$
 + grep GET '!$'
 grep: !$: No such file or directory
Mmm. Fake bash.

cleech 4 hours ago 1 reply      
fun, although I did cheat a bit on the corrupted text and pulled the expected output from README instead of the input file :)
Two Infants Treated with Universal Immune Cells Have Their Cancer Vanish technologyreview.com
244 points by phr4ts  6 hours ago   107 comments top 10
jackweirdy 4 hours ago 6 replies      
> Although the cases drew wide media attention in Britain, some researchers said that because the London team also gave the children standard chemotherapy, they failed to show the cell treatment actually cured the kids.

Could someone explain what the process for proving it would be in this case? I presume some kind of animal testing?

Seems pretty immoral to give someone a wildcard treatment and not back it up with something known to work. Not disparaging this treatment but given we don't know if it works yet (and how aggressive cancer can be) it seems totally fair to administer it alongside chemo.

eliben 4 hours ago 4 replies      
Does anyone here know which companies are looking for software engineers to help with this kind of stuff?
mediocrejoker 4 hours ago 4 replies      
> "We estimate the cost to manufacture a dose would be about $4,000," she says. That's compared to a cost of around $50,000 to alter a patient's cells and return them.

> Either type of treatment is likely to cost insurers half a million dollars or more if they reach the market.

Can someone more familiar with the pharmaceutical industry explain this?

rdlecler1 4 hours ago 1 reply      
>And I guarantee you even if things were equal, which they are not, you would want your own stuff, not someone else's cells.

This sounds like wishful thinking from a VC who just put a lot of money into the bespoke solution. The idea of immunotherapy is that you're injecting cells that can attack the cancer. What does it matter if they're your own cells, especially when the cost is an order of magnitude bigger?

gt565k 5 hours ago 4 replies      
'Although the cases drew wide media attention in Britain, some researchers said that because the London team also gave the children standard chemotherapy, they failed to show the cell treatment actually cured the kids. "There is a hint of efficacy but no proof," says Stephan Grupp, director of cancer immunotherapy at the Children's Hospital of Philadelphia, who collaborates with Novartis. "It would be great if it works, but that just hasn't been shown yet."'

So the title is basically click-bait, as the treatment is unconfirmed.

CodeSheikh 3 hours ago 1 reply      
"Rights to the London treatment were sold to the biotech company Cellectis, and the treatment is now being further developed by the drug companies Servier and Pfizer."

And that's when it will get sunk into an abyss of bureaucratic processes and this treatment probably won't see sunlight for another ten years.

blisterpeanuts 4 hours ago 1 reply      
Happy babies! As a point of interest, this type of engineered T-cell treatment has its roots in research at the Weizmann Institute in Rehovot. They demonstrated the use of CAR-modified T cells in curing leukemic disease in rats and mice several years ago.

Their research was expanded to human trials at Univ. of Penn., where 27 of 29 patients with incurable leukemia, most of whom had a prognosis of death within a few months, went into remission and showed no sign of the disease.

This modality may work with other forms of cancer; engineered T-cells that can enter every capillary in the body could potentially wipe out entire colonies of cancer cells. The potential is enormous, as are the challenges; cancers can be very difficult to differentiate from healthy tissue.


ryeguy_24 4 hours ago 6 replies      
"The ready-made approach could pose a challenge to companies including Juno Therapeutics and Novartis, each of which has spent tens of millions of dollars pioneering treatments that require collecting a patients own blood cells, engineering them, and then re-infusing them."

Is the author serious with this statement? Two kids' lives were saved with a potentially inexpensive technique, and the author thinks this point is relevant? If 500 companies go bankrupt because cancer is cured, that is a monstrous win.

FT_intern 3 hours ago 0 replies      
Isn't this how 'I Am Legend' started?
guard-of-terra 4 hours ago 1 reply      
It's hard to cure cancer definitively because cancer cells will always be trying to out-evolve any treatment. Anything that quickly kills 99% of cells, unfortunately, lets them select for survival traits really fast.