hacker news with inline top comments    13 Oct 2016
1
It's time to reconsider going on-prem gravitational.com
49 points by twakefield  2 hours ago   21 comments top 12
1
chrissnell 3 minutes ago 0 replies      
I'm a huge believer in colocation/on-prem in the post-Kubernetes era. I manage technical operations at a SaaS company and we migrated out of public cloud and into our own private, dedicated gear almost two years ago. Kubernetes and--especially--CoreOS has been a game changer for us. Our Kube environment achieves a density that simply isn't possible in a public cloud environment with individual app server instances. We're running 150+ service containers on each 12-core, 512 GB RAM server. Our Kubernetes farm--six servers configured like this--is barely at 10% capacity and I suspect that we will continue to grow on this gear for quite some time.

CoreOS, however, is the real game-changer for us. The automatic updates and ease of management are what took us from a mess of 400+ ill-maintained OpenStack instances to a beautiful environment where servers automatically update themselves and everything "just works". We've built automation around our CoreOS baremetal deployment, our Docker container building (Jenkins + custom Groovy), our monitoring (Datadog-based), and soon, our F5-based hardware load balancing. I'm being completely serious when I say that this software has made it fun to be a sysadmin again. It's disposed of the rote, shitty aspects of running infrastructure and replaced them with exciting engineering projects with high ROI, huge efficiency improvements and more satisfying work for the ops engineering team.

2
agibsonccc 4 minutes ago 0 replies      
Note: This may not generalize to your use case. We mainly serve big data customers, including banks and telcos, and have also seen other environments similar to the "air gapped environments" listed below.

That being said: I'd just like to add some color to this.

>> However, not every customer that wants on-premise is a government agency with air gapped servers.

This is the bulk of our customer base and still a very large portion of the market. I deliver software via DVD (neither flash drives nor wifi are allowed).

A few notes from these kinds of customers: They won't let you just install anything.

Docker is great but doesn't have a lot of enterprise adoption (despite the self perpetuating hype cycle) outside the companies that already have mature software engineering teams as a core competency.

They are often still running CentOS 6.5 or older.

A lot of these environments still require deb/rpm based installation.

Admins at companies that run on prem installations tend to be very reserved about their tech stack. Docker looks like the wild west to folks like that.

Our core demographic: We do a lot of Hadoop-related work. We have a dockerized version of [1] that we deploy for customers.

We have also been forced to go the more traditional ssh based yum/deb route. We have automation for both.

They are right that many "on prem" accounts are now "someone else's AWS account".

We also have to run stuff on Windows Server. Docker won't fly in that kind of environment either. Microsoft still has a large market share in the enterprise and will for a long time.

K8s and co is great where I can use it, but it shouldn't be assumed that it will work everywhere let alone in most places in the wild yet. Hopefully that changes in the coming years.

Again: This is one anecdote. There are different slices of the market.

[1]: http://www.forbes.com/sites/curtissilver/2016/10/03/skyminds...

3
dantiberian 25 minutes ago 1 reply      
This is a great article, but it would be good to state up-front that the author works for a company that sells a service designed to help you go on-prem. It's not totally clear until later in the article that this is the case, and saying so earlier would put the article in better context.
4
jacques_chester 8 minutes ago 0 replies      
I work for Pivotal, which has a slightly different horse in this race: Pivotal Cloud Foundry. It's based on the OSS Cloud Foundry, to which Pivotal is the majority donor of engineering.

Lots of customers want multi-cloud capability: they want to be able, relatively easily, to push their apps to a Cloud Foundry instance that's in a public IaaS or a private IaaS. They want to be able to choose which apps go where, or have the flexibility to keep baseload computing on-prem and spin up extra capacity in a public IaaS when necessary.

It also happens that lots of CIOs have painful lockins to commercial RDBMSes, and they don't want to repeat the experience. They want to avoid being locked into AWS, or Azure, or GCP, or vSphere, or even OpenStack.

CF is designed to achieve all of these. The official GCP writeup for Cloud Foundry[1] literally says "Developers will not be able to differentiate which infrastructure provider their applications are running in..." (can't say I completely agree, GCP's networking is pretty fast).

If I push an app to PCFDev -- a Cloud Foundry on my laptop -- it will run the same way on a Cloud Foundry running on AWS, GCP, Azure, vSphere, OpenStack and RackHD.

[1] https://cloud.google.com/solutions/cloud-foundry-on-gcp

5
sytse 46 minutes ago 1 reply      
I agree that Kubernetes is a game changer which makes it much easier to run your own applications. If all you have is VMs then the services (RDS, EFS, etc.) offered by AWS are more effective. With a container scheduler there is less maintenance and the decision is harder.

BTW Shoutout to Ev from Gravitational, they are proper Kubernetes experts and we appreciate their help on GitLab, especially making https://github.com/gravitational/console-demo for our k8s integration efforts.

6
HorizonXP 13 minutes ago 0 replies      
We've been running a hybrid on-prem solution for nearly 3 years now. It's been challenging, but Kubernetes has drastically simplified it for us. It now means we can spin up a client site in 1-2 business days, provided we have a server on-site ready to go.
7
mey 1 hour ago 2 replies      
Quick note: they define Private Cloud as "Customer's private cloud environment on a cloud provider", but that is not the definition I go by. Just a heads up while reading.

I would define Private Cloud as a non-bare-metal solution on-prem in a traditional colo setting, where the hardware is owned by the company in question. Hybrid Cloud would be a Private Cloud and a Public Cloud bridged in some way.

8
twakefield 1 hour ago 0 replies      
Inspired by a previous blog post popular on HN [1]: John E. Vincent's "So You Wanna Go On-prem Do Ya" [2].

[1] https://news.ycombinator.com/item?id=11757669
[2] http://blog.lusis.org/blog/2016/05/15/so-you-wanna-go-onprem...

9
ddorian43 1 hour ago 2 replies      
Can't you just, like, go dedicated first? Or better, start dedicated and turn up cloud capacity every day at 6pm when your traffic is 10x. (Really, no one explained how they can scale their db that fast (because you can't), only the app tier, which is probably badly designed if it's that slow.)
10
gomox 31 minutes ago 1 reply      
I think unless there is a big turn of the tide, supporting on-prem is just trouble these days. Most large enterprise customers are willing to use SaaS already.

Forget about the huge issues in the support organization for a second: the impact on-prem has on your release cycle has consequences that are hard to fully grasp. So much for "continuous development and release" if you have to keep supporting old versions of software for a year.

Build your stack so that you can easily migrate clouds (i.e. don't use all the super high-level AWS APIs). It's a good idea in general, and it should make going on-prem doable enough if you are worried about having that option at all.
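One common way to follow that advice is to hide provider-specific calls behind a thin interface that the rest of the app depends on, so moving from AWS to on-prem touches a single module. A minimal sketch (every name here is illustrative, not from the comment above):

```javascript
// The app only ever sees this put/get shape. An S3-backed class would wrap
// the AWS SDK with the same methods; on-prem it could wrap the filesystem
// or a Ceph/Swift cluster.
class InMemoryBlobStore {
  constructor() { this.blobs = new Map(); }
  put(key, data) { this.blobs.set(key, data); }
  get(key) { return this.blobs.get(key); }
}

// Application code never mentions a provider by name.
function saveReport(store, id, body) {
  store.put(`reports/${id}`, body);
}
```

Swapping clouds then means writing one new BlobStore implementation rather than chasing SDK calls through the codebase.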

11
partycoder 2 minutes ago 0 replies      
Nope, it is not.

Going on-premise might have some advantages, but it comes with completely different problems that you hadn't even thought about. Some of them:

* You thought of all the power redundancy, ideal cooling, humidity, etc... but then your office gets robbed and all your computers stolen.

* Network wiring... some people are lousy at it: they create an entire spaghetti mess, use a crappier type of cable, there's crosstalk from other equipment, or someone stepped on a cable and damaged it. Which one is it? Good luck finding out.

* Hardware fails, power supplies fail, disks fail, everything can fail.

You can pick these battles, or you can focus on your software based service. I suggest the latter.

12
fouc 24 minutes ago 0 replies      
This can help get your SaaS to be on-premises: https://jidoteki.com
2
Show HN: Tesseract.js Pure JavaScript OCR for 60 Languages github.com
470 points by bijection  9 hours ago   60 comments top 19
1
xigency 5 hours ago 2 replies      
To anyone screen capturing small fonts as a demonstration, or capturing digital text especially at a small resolution, I don't believe that that is the purpose of this OCR library. (As a specialized problem, that might be easier to solve depending on the typeface.)

A much better example that works quite well is a picture of someone holding a book: http://i.imgur.com/3JWs64x.jpg

 Magic . Read this to yourself. Read it silently Don't move your lips. Dont make a suund Listen to yourself. Listen without hearing What a wonderfully weird thing, huh? NOW MAKE THIS PART LOUD! SCREAM IT IN YOUR MIND! DROWN EVERYTHING OUT. Now, hear a whisper. A tiny whisper. New, read this next line with your best crotchety old-man voice: Hello there, sonny. Does your town have apost 0 Awesome! Who was that? Whose voice was that? It sure wasnt yours! How do you do that? How?! Must be magic.
Problems with this text: misspelled 'sound' as 'suund', didn't recognize the word 'anything', and mis-recognized 'a post office' as 'apost 0'.

Not bad. Especially since two of three mistakes are on the edge of the page.

2
pyronite 8 hours ago 5 replies      
The text detection is lacking in comparison to Google's Vision API. Here is a real-life comparison between Tesseract and Google's Vision API, based on a PDF a user of our website uploaded.

Original text [http://i.imgur.com/CZGhKhn.png]:

> I am also a top professional on Thumbtack which is a site for people looking for professional services like on gig salad. Please see my reviews from my clients there as well

Google detects [http://i.imgur.com/pSJym1x.png]:

> I am also a top professional on Thumbtack which is a site for people looking for professional services like on gig salad. Please see my reviews from my clients there as well

Tesseract detects [http://i.imgur.com/wwbLU6g.png]:

> \ am also a mp pmfesslonzl on Thummack wmcn Is a sue 1m peope \ookmg (or professmnasemces We on glg salad Pezse see my rewews 1mm my cuems were as weH

3
iplaw 4 hours ago 4 replies      
HOW is there not a better, almost 100% accurate OCR tool?

I routinely (daily) need to OCR PDF files. The PDF files are not scans. They are PDF files created from a Word file. The text is 100% clear, the lines are 100% straight, and the type is 100% uniform.

And, yet, Microsoft and Google OCR spits out gibberish that is full of critical errors.

From a problem solving perspective, this seems like an incredibly easy problem to solve in this exact use case. That is, PDFs generated from text files. Identify a uniform font size (prevent o-to-O and o-to-0 errors), identify a font-family (serif, sans-serif, narrow to particular fonts), and OCR the damn thing. And yet, the output is useless in my field.

4
AgentME 6 hours ago 1 reply      
Why the promise-like interface? If it returned a promise with a this-returning progress method monkey-patched onto it, then you could use it otherwise like a regular promise:

 Tesseract.recognize(myImage)
   .progress(function(message){ console.log(message) })
   .then(function(result){ console.log(result) })
   .catch(function(err){ console.error(err) });
or

 Tesseract.recognize(myImage)
   .progress(function(message){ console.log(message) })
   .then(
     function(result){ console.log(result) },
     function(err){ console.error(err) }
   );
I guess I just still have bad memories of jQuery's old almost-like-real promises. I'd rather never have to think ever again about whether I'm dealing with a real promise or one that's going to surprise me and break at run-time because I tried to use it like a real one.
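The interface AgentME sketches, a real Promise with a chainable .progress() patched on, can be written in a few lines. This is only a sketch; withProgress and its executor signature are my own names, not part of Tesseract.js:

```javascript
// A real Promise with a chainable .progress() method attached. Anything you
// can do with a native Promise (then/catch/await) still works.
function withProgress(executor) {
  const listeners = [];
  const notify = message => listeners.forEach(fn => fn(message));
  const promise = new Promise((resolve, reject) => executor(resolve, reject, notify));
  promise.progress = fn => { listeners.push(fn); return promise; };  // chainable
  return promise;
}

// Usage mirrors the snippets above: .progress() chains straight into .then().
const seen = [];
const job = withProgress((resolve, reject, report) => {
  setTimeout(() => { report('recognizing text'); resolve('OCR result'); }, 0);
});
job.progress(message => seen.push(message))
   .then(result => console.log(result, seen));
```

Because the returned object is a genuine Promise, none of the jQuery-style almost-promise surprises apply.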

5
jameslk 7 hours ago 1 reply      
For all those reporting issues with reading text from a screenshot of this page, note that this is more an issue with the original Tesseract library than with this library (which appears to wrap Tesseract compiled through Emscripten). I remember having a similar issue when I used the original Tesseract. The quick hack I found to fix it was to rescale any small text input images 3x before feeding them to Tesseract. I'm sure there are more intelligent solutions to mitigate that problem.
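The 3x rescale hack can be sketched like this. The drawing half is browser-only (it needs a canvas), and the function names are mine, not the library's:

```javascript
// Pure helper: the enlarged dimensions (3x by default, per the hack above).
function upscaledDims(width, height, factor = 3) {
  return [width * factor, height * factor];
}

// Browser-only: draw the source image onto a larger canvas before handing it
// to Tesseract, so small glyphs cover more pixels.
function upscale(img, factor = 3) {
  const [w, h] = upscaledDims(img.width, img.height, factor);
  const canvas = document.createElement('canvas');
  canvas.width = w;
  canvas.height = h;
  canvas.getContext('2d').drawImage(img, 0, 0, w, h);
  return canvas;
}

// e.g. Tesseract.recognize(upscale(smallScreenshot)).then(...)
```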
6
greenpizza13 8 hours ago 1 reply      
Excited about this... but the OCR quality seems to be very bad. Maybe it's not optimized for recognizing black text on a white background.

For example, I took a screenshot of this comment and ran it through the demo and got this:

Excited ehent this... but the OCR enenty Seems te be very bad. Maybe it's het Dptimized farrecngnizing black text an e white heckgmnhe.EDI example, 1 tank e Screenshnt at this cement ehe teh it. thmneh the den ehd get this:

It seems to recognize the bounding boxes just fine but mangles the words.

7
gentleteblor 8 hours ago 1 reply      
I've always wanted to use Tesseract on .NET projects but it was always clumsy (wrappers). This looks like it'll make things easier.

Thanks for putting this out.

8
goatslacker 6 hours ago 0 replies      
I've been using this library to read screenshots of Pokemon Go to automatically calculate Individual Values for each Pokemon [1]. It's worked great on desktop, but on mobile Safari, where it matters most, the library causes the browser to crash :(

1: https://github.com/goatslacker/pokemon-go-iv-calculator/blob...

9
jaytaylor 2 hours ago 0 replies      
For those who may be interested:

I threw together a quick proof-of-concept in Go for exposing tesseract via a web API:

https://github.com/jaytaylor/tesseract-web

10
yankyou 8 hours ago 1 reply      
> Drop an English image on this page to OCR it!

This looks great, and I'd really love to but

> Uncaught ReferenceError: progress is not defined

EDIT: works now!

11
mdani 8 hours ago 1 reply      
Languages list link is broken - getting 404 for the following: https://github.com/naptha/tesseract.js/blob/master/tesseract...
12
KiwiCoder 8 hours ago 1 reply      
Impressive that this is pure JS, however trying an image cut from the page itself gave this result

> Dropan Enghsh Wage on (Ms page to OCR m

Should be

> Drop an English image on this page to OCR it!

13
maaaats 5 hours ago 1 reply      
Does it block while it works and do the work in several setTimeouts? How do they get it to report progress without freezing everything?
14
slajax 8 hours ago 0 replies      
Pretty cool. I screen captured the text in the bottom right corner of the page and it had some issues. Here's a screenshot of what happened: http://io.kc.io/hkeM
15
zhte415 7 hours ago 0 replies      
Does this include taking a text and for example, when viewing it, 'wiping' the text in the logical native language order?

For languages that don't employ much whitespace, this would be nice.

16
mrcactu5 7 hours ago 1 reply      
Tesseract is not specific to JavaScript, right? I do recall there being a version for Python.
17
ckluis 8 hours ago 2 replies      
What license? It doesn't mention one.
18
newtons_bodkin 8 hours ago 0 replies      
How long did this take to build?
19
employee8000 8 hours ago 2 replies      
Is this at all affiliated with the already-existing Tesseract OCR library? It doesn't seem to be, from my cursory check, so if not you need to rename your library, because you're ripping off their name.

https://github.com/tesseract-ocr/tesseract

3
OpenBSD vmm enabled undeadly.org
53 points by transpute  3 hours ago   1 comment top
4
Handheld DIY EMP Generator can destroy phones hackaday.com
62 points by adius  4 hours ago   41 comments top 7
1
M_Grey 1 hour ago 2 replies      
Or, "How to kill people with pacemakers". Seriously, there are a lot of bad uses for this which have nothing to do with phones.
2
jacquesm 2 hours ago 4 replies      
This won't destroy your phone. Phones are extensively tested when it comes to both creating and withstanding EM interference. Best it could do - maybe - is crash one that is on the edge of its spec and you would probably have to get pretty close for that (fall off as the square (edited) of the distance).
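The falloff jacquesm mentions is easy to put numbers on: an inverse-square law means doubling your distance cuts the received intensity to a quarter. A tiny sketch (a simplification; near-field behavior is messier than this):

```javascript
// Relative intensity at distance d, normalized to the intensity at d0,
// assuming an inverse-square falloff.
const relativeIntensity = (d0, d) => (d0 / d) ** 2;
// relativeIntensity(1, 2) === 0.25: twice as far, a quarter of the effect.
```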
3
tlarkworthy 2 hours ago 1 reply      
Things like that were used to cheat slot machines http://empjammer.com/how-a-emp-jammer-working
4
giardini 29 minutes ago 0 replies      
Why a second external coil? It isn't aligned with the first coil and so the induced EMF is minimized (usually such coils are co-axial).
5
jaytaylor 1 hour ago 0 replies      
How would the design need to be changed to make it possible to zap a phone from a distance of 1-10 meters?
6
phkahler 2 hours ago 1 reply      
I've often thought about using a wok as a dish and putting a magnetron at the focus. High power directed microwaves would probably damage a lot of things. And made from kitchen hardware...
7
mcguire 2 hours ago 0 replies      
"18650 lithium battery cell"

Or, possibly, explode into flaming death.

5
Show HN: Your Social Media Fingerprint (maybe NSFW) robinlinus.github.io
687 points by Capira  12 hours ago   206 comments top 50
1
Pxtl 6 hours ago 5 replies      
FYI, it's very NSFW in the back-end. Your browser is sending requests to obvious porn servers when you hit this link so it can test if you're logged in to them.
2
zerognowl 10 hours ago 14 replies      
This is why I use 'browser isolation', which is a way to separate different types of surfing activity into different buckets. Currently the best way to do this in Firefox is to create multiple profiles, or in Chrome, you can simply add a different user/persona.

Having one profile, or even an entire dedicated browser just for Twitter/FB ensures the login is not spilled over into other sites. If you're surfing the web heavily, I would recommend spawning a new private window so cookies, and other artefacts are not bleeding into your session.

It sounds like common sense, but many people have cookies and login information persisting for years at a time in their browsing sessions. The Mozilla Firefox team are planning to introduce a feature which makes compartmented surfing sessions a lot more user-friendly by separating sessions into tabs. Currently, the 'profiles' feature of Firefox is not user friendly and requires a bit of tinkering with the filesystem.

3
the8472 5 hours ago 0 replies      
The Firefox and Tor devs are cooperating to upstream a Tor Browser feature that isolates cookie stores and similar things based on the domain shown in the URL bar[0]. Available in Nightly by enabling privacy.firstparty.isolate = true in about:config.

Additionally, they're also working on a more customizable version of that called contextual identities[1], which eventually will also be manageable by extensions[2].

And of course addons that block cookies in cross-origin requests or cross origin requests in general such as matrix[3] also plug this hole.

[0] https://bugzilla.mozilla.org/show_bug.cgi?id=1260931

[1] https://blog.mozilla.org/tanvi/2016/06/16/contextual-identit...

[2] https://bugzilla.mozilla.org/show_bug.cgi?id=1302697

[3] https://github.com/gorhill/uMatrix

4
diegorbaquero 11 hours ago 8 replies      
In Chrome: Settings > Privacy > Content Settings > Tick 'Block third-party cookies and site data'

Also set 'Send a "Do Not Track" request with your browsing traffic'

And install uBlock Origin, ofc.

5
tomvangoethem 1 hour ago 0 replies      
Attaching cookies to third-party requests is the source of many issues. In a similar demonstration [0], I showed that browser-based timing attacks (which can probably be considered as wont-fix as well) can be used to extract more specific information from social networks (e.g. one's political preference based on who they're following).

[0]: https://labs.tom.vg/browser-based-timing-attacks/

6
aswanson 16 minutes ago 0 replies      
Google is basically omniscient on a user-profile basis, with years of search, Gmail, and YouTube data on users. They should just write an algorithm and let it send out job offers with no human intervention, just like search.
7
spacemanmatt 10 hours ago 3 replies      
TIL YouPorn is considered social media
8
EJTH 8 hours ago 0 replies      
Very simple and cool exploit. I wouldn't be surprised if this technique is already in use on various ad platforms. A really simple pitfall I think most of us can confess to having fallen into in the past (redirect attributes are pretty common in the wild).
9
dorianm 12 hours ago 0 replies      
So, loading favicon.ico via a redirect-type parameter:

 <img onload="alert('logged in to fb')"
      onerror="alert('not logged in to fb')"
      src="https://www.facebook.com/login.php?next=https%3A%2F%2Fwww.facebook.com%2Ffavicon.ico">
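The same trick generalizes to any site whose login page honors a redirect parameter. A hedged sketch (the helper names are mine; only the Facebook URL above comes from the page):

```javascript
// Build a login URL that bounces straight to the site's favicon.
function probeUrl(loginPage, redirectParam, faviconUrl) {
  return `${loginPage}?${redirectParam}=${encodeURIComponent(faviconUrl)}`;
}

// Browser-only: the favicon loads (onload) only if the login page issued the
// redirect, i.e. only if the visitor already has a session on that site.
function checkLogin(url, onResult) {
  const img = new Image();
  img.onload = () => onResult(true);    // redirect happened: logged in
  img.onerror = () => onResult(false);  // got an HTML login page: not an image
  img.src = url;
}
```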

10
Joof 2 hours ago 0 replies      
Can't get this to work. I turned off uBlock Origin, but I'm still using HTTPS Everywhere and blocking third-party cookies (for a recently discovered attack that utilizes cookies).
11
ge96 3 hours ago 1 reply      
How does this work?

I think I get the basic concept of calling redirects to various sites from the page, probably back-end like with php, CURL maybe?

I just don't get how you'd keep track of where it goes after the redirect (trying a link) since you would now be on Facebook's site for example

12
bhauer 5 hours ago 1 reply      
This is the first I had heard of GETs to login pages executing a redirect when the user is already logged in. I wasn't aware that so many did this.

Virtually every application I have built will render a simple response saying "You are already logged in" if you GET the login URL with an active session. As I understand the exploit, if a non-image is returned, the script assumes you are not logged in.

What value is there in redirecting a GET if you're already logged in? You redirect when the login form is submitted as a POST.

13
denzil_correa 12 hours ago 1 reply      
Apparently, I am not logged into anything. I tried it on Opera (along with the internal ad blocker) and I'm not using Privacy Badger.
14
a3n 3 hours ago 0 replies      
So, did I just make all those sites that I'm not logged in to aware of my IP address? And if I didn't have ad blocking, would I then be seeing ads "of interest to" people who visit those sites?
15
rasz_pl 11 minutes ago 0 replies      
Is this a spoof? It is 100% WRONG for me on the Vivaldi browser.

Says I'm logged in to FB and nothing more. I don't even have a bookface account, but I do have gmail/YT/github/reddit and a few others open in the adjacent tabs and fully logged in.

16
mdesq 10 hours ago 2 replies      
Using uBlock Origin and Privacy Badger defaults, it only showed me as logged into Hacker News.
17
amelius 10 hours ago 2 replies      
Shouldn't a browser not send cookies when the request comes from a different domain? That would seem like the most sensible solution to me. Unless somebody can show a caveat of course.
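This is roughly what the SameSite cookie attribute, a draft proposal already shipping in Chrome at the time, is meant to do: the server marks its session cookie so the browser withholds it from cross-site requests. A sketch of such a header (the helper name is illustrative):

```javascript
// Build a Set-Cookie header value whose cookie is withheld from cross-site
// subresource requests. SameSite=Lax keeps ordinary top-level navigation
// working while defeating cross-origin <img>-based probes like this demo.
function sessionCookieHeader(name, value) {
  return `${name}=${value}; Secure; HttpOnly; SameSite=Lax`;
}
```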
18
instakill 11 hours ago 2 replies      
Scary. Netflix is showing logged out though, whereas I'm actually still logged in.
19
im_dario 1 hour ago 0 replies      
Using Brave Browser, it gets Reddit and Flickr wrong for me. I'm not even logged in on these.

On the other hand, it doesn't detect Facebook. It only got Twitter right.

20
cha-cho 11 hours ago 1 reply      
Pretty compelling information. Two observations: 1) No LinkedIn. Are they on top of the problem? 2) I had fun results with the Epic Privacy Browser.
21
morinted 7 hours ago 0 replies      
Nifty, with Firefox containers each one shows the "mode" I'm in. Hackernews for default container, personal has my Google world + open source + Dropbox, work has my work's Gmail world, and shopping has my Amazon account. It's like a verification that containers work!
22
throwaway049 5 hours ago 0 replies      
It says I'm not logged into any of its sites. Chrome on Android 6. No special privacy measures. I am logged into a few sites in the browser, including this one.
23
Scirra_Tom 12 hours ago 2 replies      
Very good demonstration thank you.

Some interesting (and unethical) potential marketing opportunities here. For example, at the bottom of articles, only show share actions for social platforms the reader is logged into.

24
nodesocket 5 hours ago 1 reply      
Couldn't this be fixed by storing a cookie instead of using ?next= in the query string?

For example:

 if (!auth) {
   setCookie('next', '/url-here', 1h);
 }
 redirect(login);
Login page action:

 if (cookieExists('next')) {
   next = getCookie('next');
   deleteCookie('next');
   redirect(next);
 } else {
   redirect('dashboard');
 }
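A runnable version of that flow, with an in-memory map standing in for the browser's cookie jar (all helper names are illustrative):

```javascript
// In-memory stand-in for real HTTP cookies.
const jar = new Map();
const setCookie = (k, v) => jar.set(k, v);
const getCookie = k => jar.get(k);
const deleteCookie = k => jar.delete(k);

// Unauthenticated hit on a protected page: remember the target in a cookie
// (instead of a ?next= query parameter an attacker can point at a favicon)
// and bounce the visitor to the login page.
function requestProtected(path, isAuthed) {
  if (!isAuthed) {
    setCookie('next', path);
    return '/login';
  }
  return path;
}

// Login page action after a successful login: consume the cookie.
function afterLogin() {
  const next = getCookie('next');
  if (next !== undefined) {
    deleteCookie('next');
    return next;
  }
  return '/dashboard';
}
```

Because the redirect target never appears in the URL, there is nothing for a hostile page to point at the favicon.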

25
edibleEnergy 5 hours ago 0 replies      
Recorded the network requests (from incognito) for fun with BugReplay, (the webapp I've been building for a bit over a year) here: https://app.bugreplay.com/shared/report/acf38fbd-f2e1-41c7-9...
26
mp3geek 4 hours ago 0 replies      
Not sure how many false positives this will cause, but it's fixed in the Enhanced Tracking list.

https://github.com/ryanbr/fanboy-adblock/commit/2385fb0b2b28...

27
sua_3000 11 hours ago 2 replies      
Can someone explain how this is NSFW? Is it because it's scraping for logins which looks suspicious?
28
CapitalistCartr 11 hours ago 1 reply      
Well, it's good to see it's partly wrong for me. It shows HN correctly, but also shows me logged in to Facebook and Tumblr, which is not correct, and not logged in to Gmail, which I am. Still, it's a dangerous flaw.
29
owenversteeg 6 hours ago 0 replies      
Hmm weird, it correctly detected everything except for the false negatives of PayPal, Tumblr, and Spotify. Taking a look at the mechanism I have no idea why this would happen, and opening the relevant links in my browser gives the favicon as it should. Weird.
30
stabbles 6 hours ago 0 replies      
Maybe you could add this leak to your list as well: https://news.ycombinator.com/item?id=12695451
31
rosalinekarr 11 hours ago 1 reply      
This 'fingerprint' changes as you log in and out of various services, so it's not very reliable for uniquely identifying users. Regardless, it could still be used to profile you and then target content accordingly. For example, if you're logged into Hacker News, you're probably a programmer and probably more interested in an ad for web hosting than wedding dresses, and vice versa for Pinterest.
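The coarse profile being described can be pictured as a bit vector over the probed services; this framing is mine, not the demo page's actual code:

```javascript
// Encode login status across a fixed list of services as one integer.
// Users log in and out, so the value drifts over time: weak as a unique
// identifier, but plenty for the kind of profiling described above.
function fingerprint(services, loggedIn) {
  return services.reduce(
    (bits, service, i) => (loggedIn.has(service) ? bits | (1 << i) : bits), 0);
}

// e.g. fingerprint(['hn', 'facebook', 'pinterest'], new Set(['hn'])) encodes
// "Hacker News only", a hint the visitor is probably a programmer.
```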
32
K0nserv 11 hours ago 1 reply      
I have uBlock Origin in 3rd party deny mode and privacy badger and it still detects me as logged in to HN, Reddit, Slack and Stack Overflow.

EDIT: Following diegorbaquero's advice[0] solved it

0: https://news.ycombinator.com/item?id=12692485

33
kchoudhu 5 hours ago 0 replies      
Who the hell makes accounts on porn sites?
34
bcheung 6 hours ago 0 replies      
Haha, I like how you added just one porn tube site so that you can add NSFW in the title. Nice click baiting. lol
35
fitzwatermellow 11 hours ago 1 reply      
Quick fix: embed favicon in data-uri ;)
36
Anagmate 4 hours ago 0 replies      
For me, it throws several false alerts (Twitter, Flickr and a few others). Is it possible that it's caused by my browser extensions (uBlock Origin, Disconnect)?
37
eonw 8 hours ago 0 replies      
What is happening is not legal in the US, and a large porn website was sued for doing it: they were printing hidden links on the page, then checking the color with JS to see if you had visited the destination URL or not. The judge didn't think it was a fair business practice. Maybe these companies are not fixing this because of that legal precedent and figured no one was doing it?
38
caoilte 5 hours ago 1 reply      
That's a fun website to look at through Gorhill's uMatrix plugin.
39
alexholehouse 7 hours ago 2 replies      
So, interestingly, it had me logged in to reddit, but I don't actually have a reddit account at all. Thoughts?
40
user5994461 5 hours ago 0 replies      
Good news! It's blocked by uBlock Origin and noscript.
41
JoeAltmaier 8 hours ago 0 replies      
Works mostly! I'm logged into HN of course; it says I'm not. Also Steam.

It got Facebook, Gmail, Youtube, Dropbox right.

Using default browser IE 11 on Win7

42
dhimes 8 hours ago 0 replies      
Hmm. This works in Firefox 49, but gets it quite wrong in Google Chrome 53. I'm on Linux Mint 17.2 64 bit.
43
Retr0spectrum 11 hours ago 0 replies      
It would be interesting to keep track of how common each particular fingerprint is. It could potentially be used to identify an individual user.
44
eriknstr 11 hours ago 0 replies      
>You are logged in to:

>No platform

>(or you're using something like Privacy Badger)

I'm using uMatrix and uBlock Origin :)

45
eximius 9 hours ago 0 replies      
Hm. Doesn't seem to work on Chrome on Android.
46
metastart 10 hours ago 0 replies      
Nothing shows up in my Epic Privacy Browser ;-D!!
47
dimino 8 hours ago 0 replies      
> without your consent

Untrue. I have given my consent. Why are these privacy posts always using some kind of nefarious and negative language?

48
stanislavb 11 hours ago 0 replies      
Nice work!
49
fwn 12 hours ago 4 replies      
Keep in mind that it doesn't show the icons at all if you're using a content blocker and have activated Fanboy's Annoyance List.

This is because the critical resource is named "/socialmedia-leak/socialmedia-leak.js".

50
cs0 11 hours ago 7 replies      
Nice, so now by using this I have an NSFW site logged in my workplace's DNS log. Be careful if your employer checks such things.
6
Barack Obama on A.I., Autonomous Cars, and the Future of Humanity wired.com
37 points by gflandre  2 hours ago   26 comments top 3
1
msvan 16 minutes ago 3 replies      
What does Hacker News think about AI? Is it real this time, or are we in for another winter? I'm seeing a lot of grand claims, and it certainly seems like there are plenty of applications, but I'm still not totally convinced that it will turn the entire economy upside down.

Given the enormous amount of press, tweets, blog posts, conferences, degree programs, seminars and interviews popping up it seems like there has to be something more than just hot air here. Still, the most outrageous predictions are hinged on breakthroughs in unsupervised learning happening. Taking the pessimistic view on science, what if we don't get there?

2
cjbprime 59 minutes ago 5 replies      
> OBAMA: The way I've been thinking about the regulatory structure as AI emerges is that, early in a technology, a thousand flowers should bloom.

I gotta say, I thought he'd be more familiar with that metaphor.

https://en.m.wikipedia.org/wiki/Hundred_Flowers_Campaign

3
notliketherest 1 hour ago 5 replies      
"But it also has some downsides that were gonna have to figure out in terms of not eliminating jobs. It could increase inequality. It could suppress wages."

No doubt that advances in technology have always done away with jobs. We're almost to the point at which the biggest blue collar industry (truck drivers) is about to be wiped out by self driving trucks. What I'm concerned about is the government stifling innovation such as driverless trucks to retain those jobs, or some sort of regulation that stifles the technology's potential. What is the alternative?

7
Decades-Old Mystery Put to Rest: Why Are There X's in the Desert? npr.org
187 points by aaron987  6 hours ago   52 comments top 13
1
grkvlt 2 hours ago 2 replies      
Also interesting are the resolution targets, using sets of black and white bars at varying widths in both X and Y orientations, to determine the spatial resolution, i.e. how high-frequency a component could still be imaged separately as lines, similar to TV test cards. These are discussed in various blogs [1][2] with some amazing pictures. I think these were for both satellite imaging and spy planes (U2 and SR-71, plus less exotic surveillance systems), and not just the USA. This article [3] shows some satellite test targets in the Gobi desert, presumably for Chinese (PRC) spy satellites, and also has a cool picture of the world's largest compass rose, at the Edwards dry lake bed, as well as explaining the crosses from the original article and talking about radar altimeter targets (another dry lake bed) that are mapped to centimetre accuracy in altitude for calibrating GPS and other systems.

I find it a really interesting area of industrial/scientific archaeology, with some fascinating stories.

[1] http://www.clui.org/newsletter/winter-2013/photo-calibration...

[2] http://www.bldgblog.com/2013/02/optical-calibration-targets/

[3] http://www.atlasobscura.com/articles/landscapes-made-for-sat...

2
CoryOndrejka 29 minutes ago 0 replies      
My dad was project photogrammetrist on Corona when he worked at Itek in the 60s. Lots of stories around focus and aiming challenges, since Corona was used to build maps of inaccessible regions of the world (e.g. Soviet Union ICBM sites). Focus targets gave high contrast, known images to detect what kind of focus problems were being encountered -- ranging from image smear from forward motion compensation failing or stretching the film; film sticking, stretching and/or lifting off of the focal plane; star camera inaccuracies; thermal distortion of camera, spacecraft, star camera, or film; etc. A few good books on Corona (https://www.cia.gov/library/publications/intelligence-histor...) and Itek (https://books.google.com/books/about/Spy_Capitalism.html) out there. Same teams worked on subsequent KH projects (Gambit, Hexagon/Big Bird/BMF), as well as Apollo and Viking camera systems.
3
mslev 6 hours ago 2 replies      
I love stuff like this, I just think it's so cool. Similar to this are the large cement arrows across the US:

http://www.cntraveler.com/stories/2013-06-17/transcontinenta...

4
127001brewer 6 minutes ago 0 replies      
Interesting: I submitted this story over a day ago (https://news.ycombinator.com/item?id=12684118) with a clean URL, yet this submission is the one that got noticed?

Yes, good luck and good timing, but still - ha!

5
strictnein 5 hours ago 2 replies      
One of the most interesting things about the Corona satellites was that they shot on film and that film was developed back on earth.

So how'd they get that film back down from space?

https://en.wikipedia.org/wiki/Corona_(satellite)#Recovery

6
LoSboccacc 1 hour ago 0 replies      
"Those X's were used to calibrate spy satellite cameras" #stopclickbait
7
okket 5 hours ago 0 replies      
The movie about the Corona Project linked in the story is an amazing historical document.

https://archive.org/details/gov.archives.arc.1678526

8
zerooneinfinity 5 hours ago 2 replies      
Can someone explain to my dumbass how they helped calibrate the photos or satellites?
9
beamatronic 5 hours ago 3 replies      
Could you infer the orbit of the satellite from the location of the X's?
10
dredmorbius 5 hours ago 1 reply      
The Corona project. A/K/A the KeyHole satellites. If my count is right, 135 satellite launches, though not all were successful.

Fun fact: this was the early 1960s. CCD technology and IP transmission bitrates were a bit primitive[1], so the film cameras would eject capsules after they'd been shot, which would re-enter the atmosphere and be recovered, most through mid-air retrieval. The project was active from 1959 to 1972.

https://en.m.wikipedia.org/wiki/Mid-air_retrieval

https://en.m.wikipedia.org/wiki/Corona_(satellite)

Video of recovery: http://petapixel.com/2014/08/31/us-spy-satellites-used-drop-...

______________________________

Notes:

1. Well, technically they didn't exist.

11
SCHiM 6 hours ago 4 replies      
tl/dr: U.S. Military spy satellite camera calibration.
12
vizzah 2 hours ago 0 replies      
They could have embedded LOVE instead of X's and it wouldn't have created any mystery and unneeded attention, as everyone would know it was hipsters.. no, I mean hippies.. =D
13
ythl 5 hours ago 1 reply      
8
Windows 93 (2014) windows93.net
685 points by hxw  14 hours ago   101 comments top 46
1
qwertyuiop924 10 hours ago 2 replies      
It's pretty amazing how much classic software actually runs, and works pretty well. The Wolf3D clone is totally playable, and you can actually use LSDJ (one of my all-time favorite pieces of software), and it seems to be actually running the real LSDJ, too, which is pretty impressive, considering that it means the site actually has an embedded Game Boy emulator.
3
indonesia 13 hours ago 1 reply      
This seems so amazing that I had to close it immediately, scared of how much time I would lose with it.
4
vito 7 hours ago 2 replies      
Solitude even has the victory animation: http://i.imgur.com/IlSm5pE.png

Funnily enough, if you leave the tab in the background, the events queue up, and when you tab back there are a bunch of cards bouncing around at once: http://i.imgur.com/dNObRGb.png

5
csmattryder 14 hours ago 0 replies      
Awesome site, make sure you play the Castle Wolfenstein game and check out chapter 2 - "Operation Stallman".

It's the most /g/ thing I've seen all day.

6
jgw 12 hours ago 3 replies      
Clever and artful, with a little nostalgia thrown in.

1993 was the year I started my undergrad and would not see a "web browser" until the following year. It's nice to re-capture a little of the spirit of what computing was like back then.

Tip: Be sure to click on the virus and after what happens happens, fling the icons around. Moments of mindless fun.

7
Frogolocalypse 13 hours ago 3 replies      
It has probably been shown plenty of times before, but here is a Linux command line in JavaScript, too.

http://bellard.org/jslinux/

8
discrisknbisque 9 hours ago 0 replies      
This is bringing back strong memories of messing around with QBASIC and Clickteam's The Games Factory.

My first game was called "Money 4 Nothing" and you moved a mouse-cursor locked guy around collecting floating cash piles and avoiding guards who usually just shot you on a loop.

9
delegate 12 hours ago 0 replies      
The trick in the "simulator" is to drink coffee, smoke a cigarette, smoke weed, then take lots of acid and procrastinate until the operating system is finished. Then launch it to finish the game. I got 194 #Hero

You can also go to Paris at some point.

10
kowdermeister 14 hours ago 1 reply      
In the trash, there's a zip that contains a link to this album:

https://jankenpopp.bandcamp.com/album/poire-c-poire-v :)

11
FreeFull 13 hours ago 1 reply      
I think it's funny to find an old bytebeat formula I made in this.
12
nostromo 7 hours ago 0 replies      
This is by hacker/musician Jankenpopp, which is why it links to their music.

http://jankenpopp.com/

Music:

https://www.youtube.com/watch?v=C7fKoamz0nY (for example)

http://jankenpopp.bandcamp.com/ (full free albums)

13
alexroan 3 hours ago 0 replies      
I'm sure I'm not the only one who created a load of recursive virtual PC's inside one another. PCeption.
14
jschwartzi 1 hour ago 0 replies      
Does everyone else have their Microsoft lottery winner certificate? I'm itching to cash mine in for that 250k.
15
cestith 7 hours ago 3 replies      
If you open the recycle bin there's a zip file that actually downloads through your real browser to your real computer. I didn't bother opening it. It has a filename not everyone will immediately know how to delete. That's really not a funny thing to put out there.
16
asimuvPR 10 hours ago 1 reply      
The snake game music when you click to defrag... Felt young and old at the same time.
17
RankingMember 11 hours ago 0 replies      
Pretty brilliant site. Unfortunately it locked up Chrome tight after I ran the 3d program and tried to close it along with running Wolfenstein.
18
Cyph0n 14 hours ago 0 replies      
The PlayStation boot sound kind of surprised me...
19
snake_plissken 10 hours ago 0 replies      
I forgot about that hamster dance website! Talk about something that was ahead of its time...

I love you, Internet.

20
dbancajas 9 hours ago 0 replies      
so what's the stack on this? curious how this is done. Is this like the http://copy.sh/v86/ ?
21
lawless123 13 hours ago 2 replies      
something kinda vaporwave about this
22
__jal 7 hours ago 0 replies      
I love the "VirtualPC" application. Recursive metahumor for the win!
23
entelarust 7 hours ago 0 replies      
Needs sheep.exe
24
Crystalin 7 hours ago 0 replies      
You can cheat in the "Solitude" game (which I've finished a few times already) by double-clicking on hidden cards to add them to the top if they are the matching cards.
25
cbaleanu 9 hours ago 0 replies      
I actually had Virtual Girl running for a few days in my teenage years :)
26
gchokov 13 hours ago 3 replies      
Windows 93 does not work on mobile Safari :/
27
zhte415 11 hours ago 2 replies      
The Window Manager and opening/closing/dragging/resizing effects were really impressive for 1993. Way more than Mac at the time, or even Linux 3-4 years later.
28
BHSPitMonkey 14 hours ago 0 replies      
This is art.
29
jschwartzi 12 hours ago 0 replies      
The corgi doesn't work as expected. Can I return this for a refund?
30
maxwellito 11 hours ago 1 reply      
My eyes and mind were not ready for that! This is brilliant!
31
anacleto 3 hours ago 0 replies      
Old but gold.
32
hitekker 11 hours ago 0 replies      
I must be crazy, but when I opened this up on my phone, I thought "with a few more tweaks, this could be my go-to mobile UI"
33
cokernel 10 hours ago 0 replies      
It's tricky to win at SOLITUDE when the first deuce played to a foundation just disappears.
34
kar1181 9 hours ago 0 replies      
Love that there's an Atari ST and ZX emulator running inside of this.
35
tomw1808 12 hours ago 1 reply      
Hell yeah! That must have taken a whoooole lot of fun programming time to develop.

Someone have a guesstimate?

36
loeber 14 hours ago 0 replies      
I love the individual soundtracks for all the applications. They're so well-done.
37
l0c0b0x 8 hours ago 0 replies      
This is pretty amazing.. well done!
38
AdmiralAsshat 10 hours ago 1 reply      
Why does it play the PS1 startup noise?
39
tadp 6 hours ago 0 replies      
Is this magic?
40
entelarust 7 hours ago 0 replies      
needs sheep.exe
41
messel 12 hours ago 0 replies      
Pre IE, no simulation levels
42
Hydraulix989 10 hours ago 0 replies      
I love the Vaporwave vibes
43
KiDD 11 hours ago 0 replies      
I love this!
44
microcolonel 12 hours ago 0 replies      
Kinda disappointed that Puke Data isn't interactive. ;- )
45
smegel 13 hours ago 0 replies      
c:\libs has all the javascript you need ;)
46
Frogolocalypse 13 hours ago 0 replies      
I love these things.
9
Why our laws will move from natural language to code backchannel.com
11 points by steven  1 hour ago   6 comments top 6
1
sdrothrock 9 minutes ago 0 replies      
I can't help but completely disagree with both the idea and the probability that this will ever happen.

Modern law is supposed to be open, so that anyone can read it. Reencoding laws, for whatever reason, strikes me as being fundamentally undemocratic.

The argument can be made that if legal "code" is taught in schools, then it's no more undemocratic than writing down law, but the chances of that happening seem slim to none.

Another argument is that law is already written in "special" English that most people can't figure out, but I think that having it in English alone is a big step toward it being readable by almost anyone.

It would be neat to have law written as code and to have AI be able to parse it for you in lay language -- that is, you could ask it questions or pose situations to get a legally-binding answer. But then the question for me is, why don't we just have said AI read the current legalese and parse it?

2
kazinator 12 minutes ago 0 replies      
Laws are not written in anything that can be called a natural language; they are written in a code called "legalese".

Legalese doesn't have to be formalized into program-like code. Rather, perhaps into mathematical language, with some of the notations that go with it. It needs to use clear logic, and set theoretic descriptions and reasoning. Law is all about logic and sets: what rule applies under what conditions, and what is included and excluded and so forth.
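To illustrate the set-theoretic flavor being described (a completely made-up rule, shown only as an example of the notation):

```latex
% Hypothetical statute rendered in logical/set notation:
% "Every resident adult not in the exempt class must file a return."
\forall p \in \mathit{Residents}:\;
  \bigl(\mathit{age}(p) \ge 18 \;\wedge\; p \notin \mathit{Exempt}\bigr)
  \;\Rightarrow\; \mathit{MustFile}(p)
```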

4
flukus 2 minutes ago 0 replies      
Why not keep it in English and add some unit tests that demonstrate the intent of the law?
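A minimal sketch of that idea (the statute, the 40-hour/1.5x numbers, and the function name `overtime_pay_due` are all invented for illustration): write the rule as an executable predicate, then pin down its intent with unit-test-style examples.

```python
# Hypothetical "law as code": a toy overtime rule expressed as a function,
# with assertions documenting legislative intent via concrete cases.

def overtime_pay_due(hours_worked: float, hourly_rate: float) -> float:
    """Toy rule: hours beyond 40 per week are paid at 1.5x the base rate."""
    overtime_hours = max(0.0, hours_worked - 40.0)
    return overtime_hours * hourly_rate * 1.5

# Unit tests demonstrating the intent of the rule:
assert overtime_pay_due(40, 20.0) == 0.0    # no overtime at exactly 40h
assert overtime_pay_due(45, 20.0) == 150.0  # 5h * $20 * 1.5
assert overtime_pay_due(30, 20.0) == 0.0    # part-time week: nothing due
```

The English text would remain authoritative; the tests would only record worked examples of how the drafters meant the rule to apply.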
5
hprotagonist 10 minutes ago 0 replies      
"There can be no Justice so long as Laws are absolute".
6
joshmarinacci 8 minutes ago 0 replies      
Isn't this a Wolfram blog? Why is it here?
10
Wells Fargo CEO John Stumpf Steps Down wsj.com
214 points by smaili  3 hours ago   151 comments top 30
1
Smerity 3 hours ago 9 replies      
I highly recommend listening to a recent episode[1] of Planet Money. This was by no means an accident. As a short summary of one employee's plight:

Ashley, working for them and making only $35k per year in San Francisco, was continually harassed to sign people up for accounts they didn't want. An old man comes in, pensioner, $200 in overdraft fees due to being duped into excess accounts. She dips into her own savings to get him back in the black. She reports the incident to the internal ethics line. Nothing. Tries again. Nothing. She refuses to fraudulently push excess accounts onto people. Fired. Worse, Wells Fargo put her onto a permanent blacklist that others in the industry pay attention to - she can't get a job anywhere else.

Imagine making $35k per year in San Francisco - insanely low given the region - and dipping into your own pockets to help fix a situation your own company created. Then, as thanks, being eventually fired by that company for not continuing the practice and being blacklisted in your field of work. I'm desperately hoping Ashley sues Wells Fargo in a defamation suit but I fear the likelihood of that is low - even if it's not the first time it has happened to Wells Fargo[2] ...

At best, upper management were willfully negligent of the impact that their insane sales goals had on the ethics of the company. At worst, upper management were actively trading any ethical notions they could get hold of for money, ripping apart the lives of employees and customers on the way.

[1]: http://www.npr.org/sections/money/2016/10/07/497084491/episo...

[2]: http://www.forbes.com/sites/billsinger/2011/12/15/wells-farg...

2
webaholic 3 hours ago 3 replies      
Go Elizabeth Warren! Without her prodding, I am sure this would never have happened. He was very happy with firing the 5k employees and not taking any blame on himself for what was essentially his push.
3
rm999 3 hours ago 0 replies      
Non-paywalled article with basically the same information: https://www.washingtonpost.com/news/business/wp/2016/10/12/w...
4
jackmott 3 hours ago 4 replies      
Why isn't anyone in jail again? We will choke a guy to death for selling cigarettes incorrectly, and can't put people stealing millions in jail.
5
JumpCrisscross 1 hour ago 0 replies      
The New York Times this morning about his supposed successor:

"As chief administrative officer from 2010 to 2011, however, Mr. Sloan's role included overseeing Wells Fargo's human resources and reputation management. He then became finance chief for three years. And one of his direct reports when he was promoted to chief operating officer last year was Carrie Tolstedt, who ran the offending community-banking division until earlier this year.

That makes Mr. Sloan a member of the inner circle that would have known about the wrongdoing from its early days and tried to deal with it. This group hardly covered itself in glory: It was still handing out pink slips in 2016, five years after the first bankers were shown the door. Mr. Stumpf and Ms. Tolstedt have already ceded compensation for the mess. Investigations by the board and regulators may yet implicate Mr. Sloan and others."

http://www.nytimes.com/2016/10/12/business/dealbook/wells-fa...

6
slantedview 1 hour ago 0 replies      
Elizabeth Warren deserves a lot of credit for putting the heat on Stumpf. Coincidentally, one of the leaked Democratic e-mails from this week features banking lobbyists complaining to Democratic party officials about Warren:

https://theintercept.com/2016/10/11/warren-goldman-dccc/

The apparent coziness between the party and the banks is a whole other can of worms, but that Warren doesn't hesitate to go against the party grain and attack folks like Stumpf is a nice thing to see. Good on her.

7
ourmandave 2 hours ago 0 replies      
A previous article said he could leave with $200M which is not a golden parachute but built up over 30 years of working there.

This article says they could take back $41M in stock options, another $4M in bonuses, and probably $20M in 2016 pay. (They paid him $19.3M in 2015.)

$200M - $65M = f*ck me

8
taurath 1 hour ago 0 replies      
My understanding is that they created a system of unrealistic expectations as far as accounts per customer, had managers from the top that pushed this on people, and fired those who didn't comply.

Then, in the case where employees started reporting that they were forced to do the activity to keep their jobs, nothing was done. This seems more like a lever that can be pulled here:

Prevent banks from having ineffective whistleblowing processes by mandating use of a FINRA whistleblowing program. The person in the top comment who was fired would be protected by FINRA for her future employability, and Wells Fargo and other banks would take whistleblowing far more seriously. I admit it sounds very simplistic, but is there something crucial I'm missing here? I can't see how one would file charges against the top executives for "incompetently making an internal ethics line" or "setting too difficult sales goals, causing an unintended fraudulent effect".

9
hvmonk 1 hour ago 0 replies      
Senator Elizabeth Warren's questioning at the hearing is worth watching: https://www.youtube.com/watch?v=xJhkX74D10M
10
Bud 3 hours ago 6 replies      
Whatever. Will he get a nine-figure golden parachute like the other senior exec who just left? If not, this does not impress me. Will any of his earnings and bonuses be clawed back? If not, this does not impress me. What about the other execs responsible?
11
erickhill 3 hours ago 4 replies      
I'm surprised no individuals at the executive level have been charged with crimes, to be honest. Or perhaps it's simply too soon and that's the next shoe to drop.
12
carsongross 3 hours ago 1 reply      
Until the headline reads "Wells Fargo CEO John Stumpf Goes To Jail", nothing changes.
13
suprgeek 2 hours ago 0 replies      
What is not mentioned in the very thin article is that this scumbag who ruined many people's lives gets to walk away with $134 million [1].

[1] http://www.usatoday.com/story/money/markets/2016/10/12/wells...

Yet if a common man had committed any of the fraud, they would very likely be in jail.

14
evanpw 1 hour ago 2 replies      
Let me just link to Matt Levine's take: https://www.bloomberg.com/view/articles/2016-09-09/wells-far....

TL/DR: The bank probably didn't benefit on net from the fraud, even before the fines. This is more a case of management setting unreasonable sales goals and creating a terrible work environment than a conspiracy to commit fraud against bank customers.

15
GCA10 3 hours ago 1 reply      
Bravo to the directors for waking up. Any time there's a long-serving CEO of a big public company, it's as if the directors have been chloroformed. They generally show no ability to question the boss, let alone hold him/her accountable for anything.

Most of the other boards that should be waking up ... probably won't. But at least they've got a reminder that it's possible.

16
maxerickson 2 hours ago 0 replies      
I just got a robocall painting the CFPB as anti-consumer and a waste of tax dollars.
17
mixmastamyk 58 minutes ago 0 replies      
My perception is that WF took a big turn for the worse ethics-wise when they merged with Norwest in Minneapolis. Never heard a negative word about them until that time.
18
tomrod 3 hours ago 0 replies      
I would expect no less. Given the size of the issue, it's virtually impossible that he didn't know about it. I'm assuming competency here, as he was leading one of the largest banks in the world.
19
ArkyBeagle 2 hours ago 1 reply      
I watched a lot of his testimony before the Financial Services Committee on CSPAN. So I am unsurprised. He took a fairly savage beating.

For some reason, the fact that Wells Fargo is a really big enterprise cut no ice with the various Congresscritters.

FWIW, I am not a "break up the banks" guy[1] mainly because of things written and said in podcasts by Charles Calomiris, author of "Fragile By Design" along with Stephen H. Haber. It is all more nuanced and complicated than that, and the small bank lobby in the US really is a thing and a thing that has caused problems.

[1] Canada has very close to only one bank and has had no financial crises to speak of.

20
pessimizer 2 hours ago 0 replies      
The USA Today headline is better: "Wells Fargo CEO Stumpf retires with $134M"
21
trhway 1 hour ago 0 replies      
He directed and benefited from massive, organized criminal activity - it looks like a RICO case, doesn't it?
22
honkhonkpants 3 hours ago 3 replies      
Honestly I still don't understand why this organization continues to exist. Banks are chartered by the government for the purpose of safeguarding deposits. This bank was engaged in fraud on a huge scale. Their charter should be terminated.
23
tlogan 2 hours ago 0 replies      
The title should be: "Stumpf gets away with $134M"
24
bogomipz 2 hours ago 0 replies      
I couldn't read the article because of the paywall. Is there any news on Carrie Tolstedt, the woman who oversaw the unit that created all the fraudulent accounts and who gets to retire in July at age 56 with a $124 million pay package? This is in addition to the $9 million she took home last year.

http://money.cnn.com/2016/09/12/investing/wells-fargo-fake-a...

25
st3v3r 2 hours ago 0 replies      
He's still not actually being punished. He lost his job. Big deal; he's got a golden parachute. He's not going to go through any of the stress that any of the people he fired for this went through when they lost their jobs. He's not getting fined, he's not going to jail. The worst thing that happened to him is he got a tongue lashing from Warren.
26
partycoder 2 hours ago 0 replies      
The apology didn't work and clients started demanding accountability. I am glad this happened.
27
RKoutnik 3 hours ago 7 replies      
Meta: Looks like WSJ is smart enough now to detect the `web` links. I went directly to the site, hit paywall, backed out. Then went via the `web` link, same paywall. Went to `web` link in incognito, was able to read the (fairly anemic) article.
28
Bud 2 hours ago 2 replies      
#downvotedwhiletrue
29
dbg31415 3 hours ago 0 replies      
Headline tomorrow: "John Stumpf appointed chairman of the board of Wells Fargo."
30
miguelrochefort 3 hours ago 0 replies      
Shit's about to go down.
11
Zenefits, a Rocket That Fell to Earth, Tries to Launch Again nytimes.com
49 points by bilifuduo  5 hours ago   21 comments top 4
1
jimmywanger 2 hours ago 1 reply      
Just curious, what is the benefit of staying with the Zenefits name and brand?

It's fairly toxic, as it is associated with debauchery and fraud. Wouldn't you want to raze the brand to the ground and start with something else, even if you're keeping the same tech stack and sales contracts?

2
Nothorized 3 hours ago 2 replies      
How many companies have succeeded at reinventing themselves after a scandal?
3
quizme2000 2 hours ago 0 replies      
NBC replayed a John Kerry cybersecurity sound bite and the Zenefits logo was in the background. They were a sponsor of the Virtuous Circle Conference. Not sure if Zenefits should have been asked to sponsor this event.

"A virtuous circle is often described as a self-reinforcing system that creates positive benefits throughout the economy."

4
ditonal 3 hours ago 3 replies      
12
So You Want to Learn Physics susanjfowler.com
302 points by bootload  13 hours ago   98 comments top 21
1
Koshkin 9 hours ago 13 replies      
This may seem somewhat unorthodox, but what I would strongly recommend is to start not by reading books on physics or math but by first reading some books on the history of physics (and math). This will give you some intangible basic knowledge, or a sense, of what scientific research is all about, so that many things that might otherwise puzzle you when you come to learn the "hard science" won't. One recommendation I can make is Rhodes' The Making of the Atomic Bomb.
2
lisper 1 hour ago 1 reply      
Physics is made much harder than it needs to be by the fact that physics pedagogy is generally terrible. Physics texts start by just throwing equations at you, telling you "This is how it is" with no background or foundation about how we know that this is the way it is, or what it means that this is the way it is. There are some very good popularizations out there (like David Mermin's "Boojums all the way through") but very little that bridges the gap between these and "real" physics books. One of the things on my to-do list is to write a book to try to fill this void, at least for quantum mechanics.
3
amelius 9 hours ago 2 replies      
See also Leonard Susskind's series, [1].

> The Theoretical Minimum is a series of Stanford Continuing Studies courses taught by world renowned physicist Leonard Susskind. These courses collectively teach everything required to gain a basic understanding of each area of modern physics including all of the fundamental mathematics.

[1] http://theoreticalminimum.com/about

4
westoncb 7 hours ago 3 replies      
I've been thinking about reading The Feynman Lectures on Physics recently, but I always thought they were essentially textbooks; I was surprised to see them described as 'popular' here. I remember reading something about their origin, that some universities tried adopting them with the result being that students found them too difficult (and many professors considered the material to be a sort of fresh take on classical subjects).
5
ivan_ah 8 hours ago 3 replies      
That's a great list. The only thing that's missing would be a linear algebra course. The OP mentions it in passing, but a good understanding of LA goes a long way. I did UGRAD in engineering, and when I switched to physics everything was over my head, but my knowledge of LA still managed to keep me afloat. Also, matrix quantum mechanics is essentially straight up linear algebra (vectors, unitaries, projections, etc.)

Now switching to shameless plug mode, I'll mention my math+mech+calc book, which would be a good addition to section 1, Introduction to Mechanics. Chapter 2 of the book (on topic) is part of the preview: https://minireference.com/static/excerpts/noBSguide_v5_previ...
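As a tiny illustration of the point about matrix quantum mechanics being straight-up linear algebra (plain Python with no libraries, so the matrix arithmetic is spelled out by hand; the specific matrices are just examples): a projection operator P satisfies P^2 = P, and a unitary preserves the norm of a state vector.

```python
import math

def matmul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def apply(A, v):
    """Apply a 2x2 matrix to a 2-vector."""
    return [A[0][0]*v[0] + A[0][1]*v[1], A[1][0]*v[0] + A[1][1]*v[1]]

# P projects onto the first basis vector: it is idempotent, P^2 = P.
P = [[1.0, 0.0], [0.0, 0.0]]
assert matmul(P, P) == P

# U is a rotation (a real unitary): applying it preserves vector norms.
theta = 0.3
U = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]
v = [0.6, 0.8]                      # a normalized "state vector"
w = apply(U, v)
assert abs(math.hypot(*w) - 1.0) < 1e-12
```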

6
racl101 8 hours ago 3 replies      
I've always loved physics but I don't think it loves me back.

That is, I've always found it fascinating since high school but once you need calculus to understand some of the more advanced stuff I feel that I get lost in the math (which, admittedly, I suck at) and lose the intuition for what's really going on. Then it just becomes a giant math problem that prevents me from seeing the bigger picture.

It's just this problem I've had that I always sweat the small things and sometimes miss the bigger picture or the main concept when I get frustrated that I can't understand the details.

7
vortico 3 hours ago 0 replies      
't Hooft's guide is really fantastic as a complement to an undergraduate curriculum.

http://www.staff.science.uu.nl/~001/goodtheorist/index.html

Edit: Looks like this has already been posted. But it's so good it needs another bump.

8
M_Grey 8 hours ago 1 reply      
I can't recommend the book, 'Prime Obsession' by John Derbyshire enough. 'Gravity' by Hartle is invaluable and quite accessible. If you have a strong background in calculus, you can also check out 'Gravitation' by Misner, Thorne, and Wheeler.
9
taktoa 8 hours ago 0 replies      
I haven't fully worked through it yet, but I've been really enjoying reading John Baez's Gauge Fields, Knots, and Gravity [1].

[1]: http://www.worldscientific.com/worldscibooks/10.1142/2324

10
kapilkaisare 49 minutes ago 0 replies      
I would love to see similar lists for Chemistry, Biology, Architecture, Urban planning...

The world of autodidactism needs a list of list of textbooks, providing learning paths for all sorts of subjects.

11
godelski 4 hours ago 1 reply      
There is a classical mechanics text missing from the grad school section. One of the primary books used is by Goldstein.

I also recommend Classical Dynamics of Particles and Systems by Marion and Thornton.

As was mentioned in another post Linear Algebra is a must, and I think David Lay's book is a great one to start with.

As the author mentions, to learn physics you MUST DO PROBLEMS.

On another note, I can't seem to find anyone who has mnemonic techniques for learning equations. So if anyone comes across a good method I'd like to hear it. And I'm not just talking about something like "low d high minus high d low, square the bottom and away we go", but about more complex equations, like "memorize Einstein's field equation." A method that could potentially work for any arbitrary equation.
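For reference, the mnemonic quoted above ("low d high minus high d low, square the bottom") encodes the quotient rule:

```latex
\frac{d}{dx}\left(\frac{f(x)}{g(x)}\right)
  = \frac{g(x)\,f'(x) - f(x)\,g'(x)}{g(x)^{2}}
```

Here f is the "high" (numerator) and g the "low" (denominator), so "low d high" is g f' and "high d low" is f g'.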

13
sizzzzlerz 8 hours ago 3 replies      
Learning physics requires more than simply reading text books. A significant portion of actually understanding the concepts laid out in the book is performing demonstrations and experiments in the lab. In college, we had a 3 hour lab each week to go with 3 1-hour classes and each was critical to learning. I certainly admire anyone who wants to learn physics on their own, especially without already having a strong mathematical education, but to really grasp the meaning of the words in a book requires practical exposure in a lab.
14
Hydraulix989 34 minutes ago 0 replies      
For GR, I really liked Bernard Schutz's "A First Course in General Relativity" -- I read it from cover to cover.

Extremely lucid explanations of some very complex topics, and reading it for the first time blew my mind.

This book "teaches" you well (compared to other books where I feel like I really am putting in a ton of mental effort just to learn what the book is trying to say, much like reading mathematics articles on Wikipedia), and it still manages to move fast.

15
camikazeg 6 hours ago 0 replies      
I understand the appeal of self learning physics for the sake of knowledge, but is there a market for self taught physicists in the same way that there is for self taught web devs?

I feel like you'd have to go through the university system. If not, what would that pathway look like?

16
edtechdev 9 hours ago 0 replies      
Start by playing with the physics simulations at http://phet.colorado.edu
17
FarhadG 3 hours ago 0 replies      
Congrats, Susan! I remember having a philosophy class with you (many) years ago.
19
arcanus 6 hours ago 1 reply      
> If you work through all of the textbooks in the Undergraduate Physics list of this post, and master each of the topics, you'll have gained the knowledge equivalent of a Bachelor's Degree in Physics (and will be able to score well on the Physics GRE).

I am not so confident: the physics GRE is a notoriously difficult test, and is a significant barrier to acceptance to any Ph.d. program.

20
Szel 6 hours ago 1 reply      
Is there something similar for math?
21
melling 3 hours ago 0 replies      
HN post about the best way to learn physics:

https://news.ycombinator.com/item?id=11216668

13
Most drivers who own cars with built-in GPS systems use phones for directions cnn.com
211 points by bmark757  12 hours ago   257 comments top 54
1
bsder 4 hours ago 2 replies      
How about: "Can't set new route while car is in motion?"

If you ever block me from doing something on a computer that I know it can do, I will hate you forever.

You know, I might have a passenger who could work the nav system, but, oh, no, the car is moving so you can't enter a new destination.

So many people disabled this "feature" on the Prius nav system that Toyota removed the ability to disable it. I kid you not.

And then they wonder why everybody uses their phone.

2
pwthornton 5 hours ago 6 replies      
Smartphone maps are so superior because they are always up to date, can provide real-time traffic data, can tell you to take an alternative route and can provide other data about places to eat, etc.

While a lot of car GPSes have some of this, they are nowhere near what Google Maps or Apple Maps (or a targeted app like Waze) offer. I bought a new car last year, and intentionally didn't get the GPS, even though my car has a 7-inch touchscreen. I would never use those crappy maps when I have my phone with me. If the car supported CarPlay, I would 100% use maps through that.

This is why Carplay and Android Auto are the future. Their apps, APIs and data are so superior to whatever car companies can come up with.

3
Someone1234 10 hours ago 5 replies      
This is why Carplay/Android Auto are so key. They're safer than using your actual phone, they offer legitimate maps apps (Google Maps/Bing Maps/Apple Maps/etc), and are somewhat future proof.

Too bad the MirrorLink consortium dropped the ball so epically. MirrorLink arguably does the same thing as Carplay/Android Auto and has been deployed to millions of vehicles, but nobody uses MirrorLink 1.1. Why? Because getting your app certified takes tens of thousands of dollars, months, and tons of paperwork.

Carplay/Android Auto literally exist because the MirrorLink group created so many rules, regulations, and nonsense in the name of safety that MirrorLink 1.1 has like twenty apps total after two years(!). So if your vehicle has MirrorLink on the feature list, just laugh and forget it exists, you won't be using it.

PS - MirrorLink 1.0 allowed two way screen sharing, which was legitimately useful. MirrorLink 1.1 is a very different beast, most newer cars and phones only have MirrorLink 1.1 (no 1.0 at all). 1.1 defines things like how big buttons have to be, what kind of animations can play, how many button presses to reach each task, etc. Then everything has to be certified by an independent auditor.

PPS - Most depressing part is: MirrorLink could certify Carplay/Android Auto themselves, and instantly add both to millions of existing vehicles on the road. But they're never going to simply because they're effectively in competition with both.

4
jasode 11 hours ago 6 replies      
Another reason not mentioned by the article is map updates. Lexus charges $169+ to update the map data and requires a dealer appointment. A smartphone with google/Apple maps is always more up-to-date.

If you're buying a new car today, the reason to get navigation is for the LCD screen and not for the GPS. The LCD is used to see the rear view camera image for parking. Also, playing music shows the song titles.

5
dsfyu404ed 9 hours ago 7 replies      
Semi unrelated question: Am I the only one who wished navigation apps allowed more precise control of route complexity?

Just a slider that goes between "minimum number of turns" and "most efficient" would be nice. Optimizing a route for complexity vs. efficiency is a very important consideration when planning one, and a way to do that in a navigation app would be really welcome. I appreciate the intention, but taking four extra turns and a one-way street or two to save one minute when going somewhere that's one turn off a main road is rarely a good idea. If I'm driving rural state highways for three hours I'd much prefer to go 150mi on two roads than 120mi on ten roads.
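The slider idea above could amount to a weighted cost function. Here's a minimal Python sketch; the function names, candidate routes, and penalty values are all hypothetical, just to illustrate the trade-off:

```python
# Score candidate routes by distance plus a mileage-equivalent penalty
# per turn; the hypothetical slider sets the penalty weight
# (0 = pure efficiency, larger = favor fewer turns).

def route_cost(distance_mi, num_turns, turn_penalty_mi):
    """Effective cost: distance plus a fixed penalty per turn."""
    return distance_mi + turn_penalty_mi * num_turns

# Made-up routes mirroring the comment's example: longer-but-simple
# vs. shorter-but-fiddly.
routes = {
    "two roads": (150, 2),
    "ten roads": (120, 10),
}

def best_route(turn_penalty_mi):
    return min(routes, key=lambda r: route_cost(*routes[r], turn_penalty_mi))

print(best_route(0))   # "ten roads" - the "most efficient" end of the slider
print(best_route(5))   # "two roads" - penalizing turns flips the choice
```

With a per-turn penalty of 5 "miles", the simple route costs 150 + 2*5 = 160 against 120 + 10*5 = 170 for the fiddly one, so the slider flips the answer.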

6
RankingMember 11 hours ago 6 replies      
It never made sense to me why you'd want to have a built-in GPS these days when modular devices are available that are A. generally better-designed and B. replaceable. People buying used cars in the coming years will be stuck with these big-screened dinosaurs in their dash that are essentially wasted space.
7
korethr 4 hours ago 6 replies      
As a counterpoint to the comments here bringing up the point that the maps on one's phone are always up to date, I'll bring up the point of offline access.

Yes, often the maps are bad, out of date, and require one to go the dealer and pay money to get an update. However, once out of the city, and cellular data ceases to be a thing, Google maps and similar applications built with an assumption of an always-available Internet connection become kind of useless. Yes, you can add maps for the areas you are planning to be to your offline areas, but that requires planning ahead. And if you ended up somewhere you didn't plan to be, well, good luck.

The UI for these in-car navigation systems is bad, but at least you have _a map_, instead of a featureless void with a blue You Are Here in the middle. Personally, I like a map book for such situations. Sure, it's ancient and obsolete technology and quickly goes out of date, but the UI is quick and easy to learn, and it doesn't require power to use.

8
eric_the_read 3 hours ago 1 reply      
My new Tacoma has a hybrid system called Scout GPS. The idea is that you run the app on your phone, and it does all the heavy lifting WRT GPS, computing routes, etc., and the dashboard display is basically a moderately dumb UI that displays maps on the screen.

In theory, it's a best of both worlds situation; the UI doesn't really need to change very often; what is most important is the maps/POI information, which can be downloaded either in-app or via an update. Practically, the UI is just irritating enough that I end up using Google Maps anyway, and just letting the voice directions get me where I'm going.

(edit: punctuation, because my inner copyeditor won't shut up)

9
nomailing 2 hours ago 1 reply      
Why not buy one of the cheap aftermarket Chinese Android head units? When I was looking to buy a car I specifically looked for double DIN compatibility so that I can later easily install my own head unit. I would not want to support a car manufacturer who is abandoning the DIN standard for aftermarket head units. I want this central piece in the car to be upgradable. A car without double DIN is like a computer case without any slot to put an HDD.

There are very nice units with a native Android experience. They usually have either a 7" or 10" screen. There are also some discussions of the various Chinese units on the xda forums in case someone here is interested...

10
Cybernetic 1 hour ago 0 replies      
Native GPS systems are painfully slow, in my experience. I have a 2014 Subaru Impreza. Its GPS usually tracks a full block behind where I actually am. If I'm not paying attention and make a turn based on the voice guidance alone, it's often a wrong turn. Re-routing takes a long time as well; too long to be useful. Additionally, I have to pay for annual map updates if I want them. To someone's earlier point, all interaction with the map is disabled while the car is in motion. It would be nice if it were unlocked so a passenger could use it.

So I use Google Maps on my phone. It's significantly faster, shows more meta-data relative to my route, e.g., delays and alternate route suggestions, and if it has to re-route it's usually immediate.

11
tracker1 5 hours ago 0 replies      
Even though my car stereo has Pandora on it, I usually just use Bluetooth playback with Pandora or Skype from my phone, as the Google Maps UX is much better and the directions work while listening.

Honestly, the car UX for entertainment etc. in general feels like something that would have been state of the art half a decade ago... it's too slow, unresponsive and irritating imho, despite positive reviews.

For reference, it's the fullscreen Uconnect interface on a 2016 Dodge Challenger. Also, the fact that the antenna is 3G instead of LTE makes the mobile hotspot option worthless and not even a consideration.

12
nashashmi 1 hour ago 0 replies      
I have looked at the comments here and everyone is talking about the superiority of their phones and how carplay and Android auto are so much better.

And this leads me to having a Steve Jobs moment: I wish the car dash computers would work as simply as smart watches where they are just dumb remote controls for phones.

I don't have Android auto or carplay. But I have used moto360. And I find that is all I really ever need when driving. The voice recognition is awesome. The interface is simple. And like all simple things it leads the mind to wish for more.

It seems like the car manufacturers tried to swallow too much at once and sucked at all of it.

13
derekp7 2 hours ago 0 replies      
That was my biggest problem when I recently bought a car -- almost all of the ones with the options I wanted (out of dealer stock) also included the $800 manufacturer's GPS, which is completely useless to me since I could get by perfectly fine with Android Auto. So I ended up paying for a feature that I would not only never use, but that was of such inferior quality it can't locate either my home address or my work address (for home, it routes me to a house several miles away, and for work it sends me to the movie theater 3 blocks east).
14
jmcdiesel 11 hours ago 1 reply      
Bad UI, quickly out-of-date data, required trip to dealer to update said data, and lack of traffic awareness makes them just useless compared to something like Waze.
15
woobar 10 hours ago 0 replies      
After reading the article, I think the non-clickbait title should be "Most drivers who own cars with built-in GPS systems _sometimes_ use phones for directions".

I am pretty happy with my built-in system - it is fast, always on, integrates well with the car, provides dual screens (in-dash plus a bigger console screen), free quarterly OTA map updates, uses Google for POI search, and has live traffic. Most importantly, I don't need to fish my phone out of my pocket, mount it somewhere, and connect a cable for longer trips.

Yet I also sometimes (1%?) use my phone for directions. Most of the time it is when I need to look up something nearby and I am not in the car.

16
gwbas1c 11 hours ago 1 reply      
I got so frustrated with horrible GPS systems that I just bought a Sony car stereo that has a phone holder on it: http://www.androidpolice.com/2014/05/25/sonys-xspn1bt-double...
17
agjacobson 45 minutes ago 0 replies      
The navigation system on my 2013 Acura is unusable. My iPhone sits on the seat and does the job.

The navigation system on 2016 Volvos is usable.

18
asplake 10 hours ago 0 replies      
I cited my experience with the new Rav4 satnav in a recent blog post [1] on the difference between "requirements" and "needs". Tweeted it with "If you've lost something in your modern Toyota, it's probably hiding behind a settings menu". The current version isn't terrible; the version the car was delivered with truly was. Inexcusable. Although there are benefits to integration, unless the delivery model changes to a more phone-like one I can't see myself buying a built-in satnav again.

[1] https://blog.agendashift.com/2016/10/06/better-user-stories-...

19
nojvek 9 hours ago 3 replies      
Toyota GPS is just garbage. It hangs all the time. I strongly believe car makers just don't see software as a critical component. The exception is Tesla.
20
ingenium 9 hours ago 0 replies      
The Audi navigation works well sometimes, and I really like the integration into the car (using the most up to date maps, they give free map updates for 3 years).

However, the routes are not the most efficient, and although it supposedly uses Sirius for traffic, the data isn't very accurate. Google Maps often gives me a much more efficient route that can be twice as fast given current traffic conditions, but is more "unusual".

Audi's GPS likes to stick to main roads that become heavily clogged. And even though I have an update that came out within the last 2 months, it still doesn't include a major construction project that was completed a year ago, and always tries to route you around it.

So basically I use Google Maps for navigation within the city or areas that I'm familiar with, and Audi's navigation for long trips (ie inter city highway trips) where the route will mostly be the same on both systems.

That being said, the GPS in my car ALWAYS gets an accurate lock right away. My phone (Nexus 6P) has an awful GPS that doesn't work in hilly areas or around tall buildings and constantly loses the signal. It barely works inside the car, and even then the accuracy isn't great. Guess the all metal phone really kills the GPS signal, since the Nexus 6 had an amazingly good GPS with high accuracy in the same conditions.

21
bleair 3 hours ago 0 replies      
It's really simple - how hard is the UX of my phone vs. the car's system. I've _never_ seen a builtin car version have a UX that wasn't ridiculously complex and frustrating to use.

There are two concrete, real examples of a routing UX that anyone can just try out - just grab an Android phone or an iPhone and directly copy the experience. Just sit down and try the most basic goal of using a nav system - enter an address and start navigating - and do it on your car system and on either phone.

Oh, wait, differing goals. I, as a user, want a usable system. The makers of these awful nav systems have an entirely different goal. They are trying to sell an "option" upgrade to car makers who then include it in "option packages" with the car... usability is way, way down the list after bells/whistles/marketing/subscription sales.

22
jrgoj 11 hours ago 0 replies      
As much as the size, placement, and car integration of my vehicle's (2016 CX-5) dash GPS are ideal, the fact that it cannot display Waze renders it mostly useless for my everyday commuting. Until I can do so via CarPlay or Android Auto, I'll be using gaudy cell phone mounts.
23
sliken 3 hours ago 0 replies      
Seems like the best advantage of in-car navigation is the GPS receiver itself: many car units are rock solid, lock quickly, and can handle short outages because they know the car's speed and compass direction.

Maybe car windows are tinted more often now? I often get a marginal signal in my car, seems worse than in my house.

Anyone had luck with a magnetic attached bluetooth GPS they can put on their car roof or similar to improve a smartphone's GPS signal inside a car?

24
Zelphyr 10 hours ago 1 reply      
My wife just bought a brand new Toyota Highlander and I can confirm that the nav system is terrible. I can only imagine how much that turd increased the purchase price.
25
JoshGlazebrook 10 hours ago 1 reply      
If Waze came to CarPlay and Android Auto I imagine a lot of people would use it. The only navigation I currently use is Waze, and I use it every single time I'm driving, even if I do not need directions. The ETA, routing, and the alerts are useful even if you know where you are going.
26
grecy 10 hours ago 4 replies      
Out of curiosity, are there any Tesla owners that can chime in about their experience?

Do you use the in-built GPS, or just use your phone?

I would think if anyone has it right, it would be Tesla.

27
diegorbaquero 11 hours ago 0 replies      
It simply makes no sense to use in-car GPS. Updates are free (some car makers charge for them) and fast for mobile apps. There's a little tradeoff (privacy) though.
28
SpikeDad 4 hours ago 0 replies      
I don't. My 2013 Prius has an excellent GPS and I'm not too concerned that the maps are a bit out of date. I still get accurate traffic data from XM.

There are advantages to the built-in navigation - it works when there's no GPS signal due to inertial navigation (compass and tire rotation). Mine has a head-up display so I can see navigation cues without taking my eyes off the road. The voice synth is much better than Apple Maps or Google Maps.

I've only had to pull out my phone a couple of times in 3 years.

29
salmonlogs 5 hours ago 2 replies      
Reading the comments I appear to be the exception. I lease a 2015 BMW 4-series which I added "Professional Media" to and the nav works really well. I've driven plenty of other cars where the nav is useless, but not all are bad.

Reasons to use it over my phone:

* 11" screen shows me a high-quality map overlaid with current traffic, with a 1/3 split for lane guidance when needed

* Directions also shown directly on the dash - no need to look far away

* Spoken instructions dim the music volume

* Pretty accurate traffic information with automatic OTA updates

Downsides:

* Voice control is shitty; saying "set destination to Leamington Spa" will tune the radio to a random frequency

* Slightly awkward to send routes from PC to car, has to be via Remote Control App

30
jameskilton 11 hours ago 1 reply      
Volkswagen wants to charge me ~$250 for an update to my in-car GPS. Yeah, no. My phone's GPS is free (effectively) and constantly up-to-date.
31
sashk 11 hours ago 0 replies      
When I bought my latest car, the navigation package included a few safety features I wanted, so I had to get it to get a somewhat safer car. My car, a Mazda, comes with free map updates for 3 years. I have yet to update it, because I'm lazy. It also comes with an optional paid traffic service. Why would I waste my money on that, when I can get free (I know, I know) Waze, Apple Maps or Google Maps, which will get me where I need to go while avoiding traffic? Despite being a 2016 model year vehicle, its GPS/entertainment center feels like it's from 2010 or so. So yes, I use my phone for navigation most of the time.
32
gbuk2013 2 hours ago 0 replies      
I am one of these drivers - my car has navigation (didn't get a choice since I bought it secondhand) but I use Waze.

That said, having it has saved me on several occasions when my phone ran out of power and when there was no signal and I missed a turn.

33
raitom 3 hours ago 0 replies      
I have a 2015 Mustang with GPS and no CarPlay, and I always use Google Maps on my iPhone (and everyone in San Diego assumes I'm a tourist because of that, haha).

Same for my friends with their cars; I never see them using their built-in GPS.

I have used CarPlay in another car, but Apple's maps on it are not as good as Google Maps.

34
tibbon 4 hours ago 1 reply      
Why is Tesla one of the few car manufacturers that actually pushes real updates to their cars' dashboard systems (and for free; others charge absurd fees if they give minor updates at all)?

I hope that by 2020 it is the expectation that your car will continue to update and improve its interface over the years you own it.

I say this because people point to smartphones as always being more up to date - and there's no reason it has to be that way.

35
celticninja 11 hours ago 0 replies      
My maps app on my phone gets much more frequent updates than the maps function in my car. That said, I do use my in-car nav system; I just rely on Google Maps for traffic data and redirections.
36
conductr 3 hours ago 1 reply      
You can't buy car features a la carte like the "build your car" tool on every auto maker's site suggests you can. You're forced to buy the "tech package" which includes the GPS because you want something as simple as Bluetooth.
37
lanius 4 hours ago 0 replies      
Haven't tried built-in GPS, but I prefer using my phone over my standalone GPS because it's much faster. My standalone unit takes forever to boot up, has too-long delays when navigating between screens, and overall is just frustratingly sluggish to use.
38
kovrik 5 hours ago 0 replies      
I can't use my car's GPS because the car is from Japan: it's Japanese-language only and only has maps of Japan. You can't just download a map of any other country, and you can't even change the language.

So, yes, I'd rather use my phone's GPS: maps are always up-to-date, and I can choose between Google Maps and Apple Maps (or any other app) and select any language.

39
antisthenes 11 hours ago 0 replies      
Built-in car systems are usually at least 5 years behind the current tech, whether it was CD players, a USB port or whatever GPS system they started putting in.

As an example - I have a car with a USB port, but the music player interface is so atrocious and is missing key features that it's actually worse than my Sansa Clip from 2008. The car is a 2012 model.

40
ryanlm 2 hours ago 0 replies      
The problem with these bait-and-switch car GPSes is that they actually charge you for updates. That should be against the law.
41
SteveNuts 11 hours ago 1 reply      
I have to update my car GPS using an SD card. It's just too inconvenient when I can just use my phone.
42
macjohnmcc 5 hours ago 0 replies      
My car's map was so out of date when I bought it that it instantly became useless to me. My wife, sitting in the passenger seat, cannot select destinations while we are driving, so it is far easier to just use a phone, with the added benefits of up-to-date maps and traffic.
43
n-gauge 4 hours ago 0 replies      
Can confirm. If the UI / method of entering a postcode wasn't so faffy on the in-car satnav, it would suffice.
44
vermontdevil 11 hours ago 0 replies      
Have you seen the UI on some of these built-ins? Awful.
45
honkhonkpants 1 hour ago 0 replies      
Really the thing you want here is for your smartphone to have access to the car's GPS receiver, which is huge and sensitive and can use all the power it wants. The phone can just treat it as a peripheral.
46
CalChris 9 hours ago 0 replies      
CarWings on the Nissan Leaf is unusable. Its only advantage is that the map is built in and so doesn't rely on downloading.

You can preload Google Maps onto the iPhone for say Point Reyes. You won't be able to find a route but GPS location and the map should be good enough.

47
anindha 9 hours ago 0 replies      
Why not just have a nice phone holder built into cars rather than these systems? It would integrate with the speaker/mic in the car and charge my phone.
48
agumonkey 11 hours ago 0 replies      
The taxi's in-car GPS was faulty so often that the driver resorted to a smartphone fallback. When Google Maps owns you, you have a problem.
49
gambiting 10 hours ago 2 replies      
Because built-in GPS is terrible. I recently bought a 2016 Mercedes-AMG vehicle that has sat nav as standard (on non-AMG models it's a $1000 (!!!!!!) option). The satnav is made by Garmin, and it's just awful. Truly horrible in terms of speed and usability, and it only gets updates once a year (you get a new SD card when you go in for the annual service).

In the meantime, my TomTom 5000 satnav is still unbeaten - free lifetime upgrades, clear, fast interface, free and constant internet connection in every country of the world, with accurate traffic and speed camera updates. And it only cost me ~$250 new.

I just don't understand why anyone would get a built-in satnav over a dedicated device.

50
drivingmenuts 11 hours ago 1 reply      
I'd be perfectly happy if my car just had a dock for my iPhone, with an amp for the speakers. I use my phone for anything music, gps or phone-related while driving.

The only time I don't use it is when I can't get the friggin' bluetooth to connect right away.

51
zeveb 4 hours ago 0 replies      
My issue isn't so much the pain of entry (although it is a pain) as the fact that Google Maps provides better directions, taking into account current traffic conditions, than my car's system.

OTOH, my car's system knows about highway amenities like food & gas, and has a neat feature where it displays a preview of what certain turns look like.

I would love to integrate them somehow.

52
j45 10 hours ago 0 replies      
The quality of phone-based GPS directions has improved drastically in the last 3 years. I had a cutting-edge GPS in my car about 5 years ago that did better than my phone. Now there's no comparison, and I wish one could simply mirror their iPhone/Android display onto the car's touchscreen without the pageantry of needing Android Auto or CarPlay.
53
viggity 10 hours ago 3 replies      
I've always hated built-in GPS for two reasons. First and most important - the damn maps are always viewed from directly above the car, so the map is 2D and North is always at the top. There is a lot of effin' mental work to try and figure out which damn way your car is pointing and whether you need to turn left or turn right, because you may be driving "down" the map and shit is reversed. On my phone or on my Garmin I get a perspective view of my car and its position on the earth. There is no confusion about which way I actually need to turn.

Secondly, the vast majority of built-in GPSes sit on a console a good 6 to 12 inches below the windshield, meaning I have to take my eyes far off the road in order to look at anything. My Garmin mounts to my windshield or dash. It is a quick glance and then eyes back on the road.

54
toyg 10 hours ago 0 replies      
A built-in GPS was pretty much the only requirement I had for the last few cars I got (both VW), and tbh I'm not that disappointed. The only thing really lacking is decent search; once you have an address to input it's mostly ok, but it's true that the temptation to just say "fuck it, I've found it on Maps, let's just use this" is there. If there was an easy way to copy or "stream" the address to the car via Bluetooth, I think most people would use it.

It looks like the field is in flux anyway, VW keeps dramatically changing the UI in every new car.

14
Copyright Law Shouldn't Punish Research and Repair eff.org
108 points by lelf  5 hours ago   4 comments top 4
1
nerdponx 3 minutes ago 0 replies      
Would be nice to have a more detailed explanation of what exactly they're opposing here, and some data to back up their claims about it.
2
nialv7 1 hour ago 0 replies      
I'm happy that we have EFF fighting on our side. But this looks like a war us consumers are losing...
3
Mathnerd314 3 hours ago 0 replies      
I guess this is a follow-up to the lawsuit: https://www.eff.org/press/releases/eff-lawsuit-takes-dmca-se...
4
ktRolster 47 minutes ago 0 replies      
I signed it.
15
Why Unicode Won't Work on the Internet (2001) hastingsresearch.com
62 points by jordigh  5 hours ago   58 comments top 10
1
lcuff 5 hours ago 3 replies      
This article was written before UTF-8 became the de-facto standard. According to Wikipedia, UTF-8 encodes each of the 1,112,064 valid code points. Much more than Goundry's (the author's) 170,000. Goundry's only complaint against UTF-8 is that at the time, it was one of three possible encoding formats that might work. Since it has now been widely embraced, the complaint is no longer valid.

In short, Unicode will work just fine on the internet in 2016 as far as encoding all the characters goes. Problems having to do with how ordinal numbers are used, right-to-left languages, upper-case/lower-case anomalies, different glyphs being used for the same letter depending on the letter's position in the word (and many other realities of language and script differences) all need to be in the forefront of a developer's mind when trying to build a multi-lingual site.
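The arithmetic behind that 1,112,064 figure, plus UTF-8's variable-width encoding, can be sanity-checked in a few lines of Python (an illustration, not part of the comment; the sample characters are arbitrary):

```python
# Valid Unicode scalar values: code points U+0000..U+10FFFF minus the
# UTF-16 surrogate range U+D800..U+DFFF, which is never encodable.
NUM_CODE_POINTS = 0x110000
NUM_SURROGATES = 0xE000 - 0xD800  # 2,048
valid = NUM_CODE_POINTS - NUM_SURROGATES
print(valid)  # 1112064

# UTF-8 spends 1-4 bytes per code point depending on its magnitude.
for ch in ["A", "é", "中", "😀"]:
    print(ch, len(ch.encode("utf-8")))  # 1, 2, 3, 4 bytes respectively
```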

2
TazeTSchnitzel 5 hours ago 4 replies      
UTF-16, and non-BMP planes, were devised in 1996. The author seems to have been 5 years late to the party.

> The current permutation of Unicode gives a theoretical maximum of approximately 65,000 characters

No, UTF-16 with surrogate pairs can encode 1,112,064 characters (the 65,536 BMP code points plus 2^20 supplementary code points, minus the 2,048 surrogates).

> Clearly, 32 bits (4 octets) would have been more than adequate if they were a contiguous block. Indeed, "18 bits wide" (262,144 variations) would be enough to address the worlds characters if a contiguous block.

UTF-16 provides 21 bits, 3 more than the author wants.

Except they're not in a contiguous block:

> But two separate 16 bit blocks do not solve the problem at all.

The author doesn't explain why having multiple blocks is a problem. This works just fine, and has enabled Unicode to accommodate the hundreds of thousands of extra characters the author said it ought to.

Though maybe there's a hint in this later comment:

> One can easily formulate new standards using 4 octet blocks (ad infinitum) but piggybacking them on top of Unicode 3.1 simply exacerbates the complexity of font mapping, as Unicode 3.1 has increased the complexity of UCS-2.

They would have preferred if backwards-compatibility had been broken and everyone switched to a new format that's like UTF-32/UCS-4, but not called Unicode, I guess?
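The surrogate-pair mechanism behind those "two separate 16 bit blocks" can be worked through by hand; here's a Python sketch using U+1D11E (MUSICAL SYMBOL G CLEF) as an arbitrary supplementary-plane example:

```python
# Encode a supplementary-plane code point as a UTF-16 surrogate pair
# manually, then cross-check against Python's utf-16-be codec.
cp = 0x1D11E                  # MUSICAL SYMBOL G CLEF, outside the BMP
v = cp - 0x10000              # 20 bits remain after subtracting 0x10000
high = 0xD800 + (v >> 10)     # high (lead) surrogate: top 10 bits
low = 0xDC00 + (v & 0x3FF)    # low (trail) surrogate: bottom 10 bits
print(hex(high), hex(low))    # 0xd834 0xdd1e

assert chr(cp).encode("utf-16-be") == bytes(
    [high >> 8, high & 0xFF, low >> 8, low & 0xFF]
)
```

Two 10-bit payloads on top of the BMP are exactly how the non-contiguous surrogate blocks yield over a million code points without needing a contiguous 32-bit space.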

3
jimjimjim 4 hours ago 1 reply      
http://utf8everywhere.org/

A very useful site, especially when having to explain what UTF-8 is to other devs when working in a Windows shop.

4
mcaruso 5 hours ago 0 replies      
More like, "Why UCS-2 Won't Work on the Internet".
5
herge 5 hours ago 4 replies      
Man, UCS-2 is the pits. I still remember fighting with 'slim-builds' of python back in the day.

Any critique of Unicode that doesn't assume UTF-8 (which allows for more than 1 million code points) is a bit suspect in my opinion. The biggest point against UTF-8 might be that it takes more space than 'local' encodings for Asian languages.

6
khaled 2 hours ago 0 replies      
An extensive and very informative, though a bit sarcastic, rebuttal (from 2001 as well): https://features.slashdot.org/story/01/06/06/0132203/why-uni... (via https://twitter.com/FakeUnicode/status/786324531828838400).
7
zoom6628 1 hour ago 0 replies      
The paper is far more interesting for its informative background on the use of character sets in the CJK region.
8
reality_czech 2 hours ago 2 replies      
Hilarious, a document from 2001 talking about why Unicode is unsuitable to "the orient." At the end, I half expected to read that "Negroes have also proved to be most unfavorable to it."
9
oconnor663 5 hours ago 6 replies      
> Thus is can be said that Hiragana can form pictures but Katakana can only form sounds

That sounds really weird to me. Does that sound right to any native Japanese speakers here?

10
jbmorgado 2 hours ago 1 reply      
Why "640K ought to be enough to everyone"
16
Hybrid computing using a neural network with dynamic external memory nature.com
94 points by idunning  7 hours ago   23 comments top 12
1
nl 9 minutes ago 0 replies      
This is probably the most important research direction in modern neural network research.

Neural networks are great at pattern recognition. Things like LSTMs allow pattern recognition through time, so they can develop "memories". This is useful in things like understanding text (the meaning of one word often depends on the previous few words).

But how can a neural network know "facts"?

Humans have things like books, or the ability to ask others for things they don't know. How would we build something analogous to that for neural network-powered "AIs"?

There's been a strand of research mostly coming out of Jason Weston's Memory Networks research[1]. This extends on that by using a new form of memory, and shows how it can perform at some pretty difficult tasks. These included graph tasks like London underground traversal.

One good quote showing how well it works:

In this case, the best LSTM network we found in an extensive hyper-parameter search failed to complete the first level of its training curriculum of even the easiest task (traversal), reaching an average of only 37% accuracy after almost two million training examples; DNCs reached an average of 98.8% accuracy on the final lesson of the same curriculum after around one million training examples.

[1] https://arxiv.org/pdf/1410.3916v11.pdf

2
the_decider 6 hours ago 1 reply      
Some interesting ideas, sadly blocked behind a paywalled journal, all for the purpose of boosting a researcher's prestige because they now hold a "Nature" publication. Thankfully, this article is easily accessible via Sci-Hub. http://www.nature.com.sci-hub.cc/nature/journal/vaop/ncurren...
3
idunning 7 hours ago 0 replies      
4
triplefloat 5 hours ago 0 replies      
Very exciting extension of Neural Turing Machines. As a side note: Gated Graph Sequence Neural Networks (https://arxiv.org/abs/1511.05493) perform similarly or better on the bAbI tasks mentioned in the paper. The comparison to existing graph neural network models apparently didn't make it into the paper (sadly).
5
gallerdude 7 hours ago 1 reply      
Can someone explain what the full implications of this are? This seems really cool, but I can't really wrap my head around it.

From what I can tell you can give the DNC simple inputs and it can derive complex answers.

6
zardo 2 hours ago 0 replies      
Instead of saving the data, you could think of using a memory address as applying the identity function and saving the data.

Could it learn to use addresses that perform more interesting functions than f(x)=x?
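To make the framing above concrete (a hypothetical toy, not anything from the paper): an ordinary memory slot is the identity function on its write/read round trip, while an "interesting" address could apply some f on readout.

```python
# Hypothetical sketch: a plain slot behaves as the identity function
# (read(write(x)) == x), while a "fancier" address could attach a
# transform that is applied when the value is read back.
class FunctionalMemory:
    def __init__(self):
        self.slots = {}  # address -> (value, function applied on read)

    def write(self, addr, value, f=lambda x: x):
        self.slots[addr] = (value, f)

    def read(self, addr):
        value, f = self.slots[addr]
        return f(value)

mem = FunctionalMemory()
mem.write(0, 21)                      # ordinary slot: identity
mem.write(1, 21, f=lambda x: x * 2)   # slot that doubles on readout

plain = mem.read(0)   # 21
fancy = mem.read(1)   # 42
```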

7
bra-ket 7 hours ago 0 replies      
if you're interested in this check out "Reasoning, Attention, Memory (RAM)" NIPS Workshop 2015 organized by Jason Weston (Facebook Research): http://www.thespermwhale.com/jaseweston/ram/
8
gallerdude 4 hours ago 1 reply      
Does this mean we could get way better versions of char-rnn?
9
bluetwo 4 hours ago 1 reply      
One of the examples given is a block puzzle (reorder 8 pieces in a 3x3 grid back into order)

Has this been a problem for AI and CNNs?

10
kylek 6 hours ago 3 replies      
I'm probably totally off base here (neural networks/AI is not my wheelhouse), but is having "memory" in neural networks a new thing? Isn't this just a different application of a more typical 'feedback loop' in the network?
11
ktamiola 6 hours ago 1 reply      
This is remarkable!
12
0xdeadbeefbabe 6 hours ago 1 reply      
> a DNC can complete a moving blocks puzzle in which changing goals are specified by sequences of symbols

A neural network without memory can't do that or can't do it as well perhaps?

17
Alpine Edge has switched to libressl alpinelinux.org
218 points by gbrown_  14 hours ago   73 comments top 12
1
montyedwards 10 hours ago 1 reply      
I'm currently evaluating LibreSSL for use in data protection software I licensed to a large company.

The optional libtls API bundled with LibreSSL is a really simple wrapper API that is secure by default. And it was a breeze to build on Windows because they use cmake (you just need to download the released bundle rather than build from git to avoid problems). A couple of the optional libtls functions don't work on Windows (tls_load_file), but 100% of the OpenSSL-1.0.1+ API functions I tried so far worked fine.

For me, the biggest downside is LibreSSL doesn't support X25519 yet, while BoringSSL and OpenSSL both support it. And BoringSSL is starting to get easier to use with other software like nginx without messy patches.

Hopefully, X25519 will be added as a beta feature during LibreSSL 2.5.x and released as stable in 2.6.

If you have time, take a look at https://github.com/libressl-portable/openbsd

And email patches to: tech@openbsd.org

Edited: tls_load_file instead of tls_read

2
cm3 12 hours ago 5 replies      
I'm glad that VoidLinux is no longer the only distro in town that's switched to LibreSSL. And with Docker defaulting to Alpine, more OpenSSL/LibreSSL compatibility fixes will trickle into upstream projects. This is good news.

EDIT: Next, I predict a linux distro will, after having switched to musl, also support llvm/compiler-rt/libunwind/libc++ as base toolchain instead of gcc/libgcc/libstdc++

3
marios 13 hours ago 4 replies      
I recently built LibreSSL to replace OpenSSL on my laptop that runs ArchLinux. After installing, pretty much everything works seamlessly so far. I rebuilt Python because apparently the ssl module looks for RAND_egd (or something along those lines that LibreSSL has removed - and I didn't compile it with a shim). Other than that, dig is broken on my system ("ENGINE_by_id failed"), although I have not bothered to fix it since drill works fine.

It's nice to see LibreSSL being picked up by Linux distributions. I wish other major distros did this (I'm looking at you, Debian). IIRC, Alpine was often used to build Docker images. If that's still the case, I'd say it's good news.
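As an aside, a quick way to check which TLS library a given Python build was linked against (relevant to the rebuild mentioned above) is the ssl module's version constants:

```python
import ssl

# Reports the TLS library this interpreter was compiled against.
# Against LibreSSL the string starts with "LibreSSL"; against stock
# OpenSSL it reads something like "OpenSSL 1.0.2j  26 Sep 2016".
print(ssl.OPENSSL_VERSION)
print(ssl.OPENSSL_VERSION_INFO)  # same information as a 5-tuple
```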

4
jcoffland 6 hours ago 1 reply      
It's worth remembering that OpenSSL has faithfully served the community for many years. Most of those years with almost no financial support. Few projects would survive the scrutiny they have undergone. These guys deserve some credit.
6
_asummers 12 hours ago 2 replies      
How has LibreSSL stood up lately to the relatively frequent CVEs in OpenSSL over the past few months? I know the initial months were a frenzy of removing garbage and classes of problems (#yadf) that preempted a few CVEs, but I haven't been paying attention to the commit logs to know if it was also susceptible to them.
7
nailer 12 hours ago 1 reply      
I tried libre on OS X (it's on Homebrew), the binary is around half the size of the equivalent OpenSSL release. Kudos to Libre for stripping out so much junk - OSs that people don't use and ciphers that people shouldn't be using - and producing a more auditable sensible codebase.
8
pilif 12 hours ago 0 replies      
I'm running nginx linked against LibreSSL on our frontend since February. We've not seen any issues and LibreSSL has been a perfect drop-in replacement for OpenSSL so far.
9
bluejekyll 9 hours ago 1 reply      
This is a really short note. I'm curious, was BoringSSL evaluated at all as an alternate substitute?
10
gtt 12 hours ago 1 reply      
The only thing I miss in LibreSSL is SRP. How would the HN crowd recommend picking an alternative to SRP?
11
StreamBright 12 hours ago 1 reply      
I hope LibreSSL gets the traction it needs in the coming months or years. There are great wins in using it, and it is worth moving the FOSS stack over to LibreSSL instead of OpenSSL to reduce the number of security bugs we are facing today.
12
qwertyuiop924 11 hours ago 0 replies      
Oh good. Finally.

I want to see OSSL burn. Because it sure as heck doesn't deserve to live.

18
State of the OpenVMS Port to x86_64 [pdf] vmssoftware.com
99 points by knivek  10 hours ago   47 comments top 14
1
dmm 8 hours ago 3 replies      
I have a bad habit where instead of doing something relevant I play with old stuff I read about but didn't have access to as a kid.

VMS always interested me since I first read about it in the jargon file. I recently found an alphaserver on craigslist and I just requested an openvms hobbyist license so I could play with it.

An octane running irix is another expression of this problem.

2
laurentdc 9 hours ago 2 replies      
As an old time OpenVMS user (on VAX DECstations first and on AlphaServer later) I'm looking forward to this.

I doubt it will ever come even remotely close to a "mainstream" OS, but Alpha to x86 is a much better migration path than Alpha to Itanium.

And yes, there are still many OpenVMS installs out there in the wild, from airport logistics to assembly lines, so this may make sense from a financial standpoint (versus a complete software rewrite for Linux/AIX/whatever) - provided they can market it well enough to the old users.

3
SloopJon 8 hours ago 1 reply      
I work with an old DEC guy who co-wrote a book about the BLISS compiler back in the day. He thought it made a much better "portable assembly language" than C.

He also likes to talk about the binary compatibility on VMS; e.g., VAX binaries that run unchanged on Itanium. Unfortunately, that commitment to compatibility has wavered a bit since HP offshored VMS maintenance. We've had to work around some breaking changes recently.

If I'm understanding these slides correctly, AMD64 VMS would require recompiling VAX, Alpha, and Itanium applications from source. I kind of see a chicken-and-egg problem: application vendors won't port without demand, and customers won't adopt without applications.

4
icedchai 27 minutes ago 0 replies      
I have OpenVMS running under the SIMH emulator... It takes me back to the late 80's / early 90's. Hobbyist licenses are available for people who want to tinker. Fun stuff...
5
strlen 4 hours ago 0 replies      
For those itching to try, simh is capable of emulating a VAX and running OpenVMS (or NetBSD/vax) on it. One can also get a hobbyist license and media from HPE (https://www.hpe.com/h41268/live/index_e.aspx?qid=24548, http://www.openvmshobbyist.com) and play around with it.

It is really quite a fascinating OS (I've had it running under a MicroVAX 4000/60 for fun -- and yes, I can truly say fun!) and feels quite modern even today.

6
epynonymous 7 hours ago 0 replies      
long live DEC OpenVMS, it's really rare to see an article on hacker news about OpenVMS, but it has happened before. long live DCL, ASTs. i started work in 2000, but was still able to play around with vax and alpha clusters, sometimes if i recall correctly, a mix of vax and alpha nodes in the same cluster. OpenVMS was such a stable operating system, by 2000, most of the good DEC engineers had left already for companies like Microsoft (Dave Cutler, etc), but that legacy code was still quite amazing. long live ZKO and DEC, i learned a lot from that job, great to see this coming, but not sure if it's still relevant these days with all my development on linux. could you imagine running this port on aws, like some customer migrating to the cloud all their legacy infrastructure? that would be such a niche market.
7
pklausler 8 hours ago 1 reply      
Why not translate the PDP-11 or VAX binaries to x86-64 and emulate the memory-mapped device interfaces? Seems like less work, and more likely to produce a cheap solution for legacy installations worrying about how to run an ancient critical application on old gear.
9
WallWextra 9 hours ago 1 reply      
I do believe that MACRO in the slides refers to MACRO-11, a PDP assembler. Or maybe MACRO-32, for VAX. In either case, they have been compiling this "assembly" language for Alpha (and now I guess Itanium) since VMS has run on that architecture.
10
johnohara 8 hours ago 0 replies      
Tip of the hat to Wendin, Inc. circa 1988.

$ show developer /user=steve_jones /source=linkedin /url=https://www.linkedin.com/in/stevejonesinwashington

edit: corrected url

11
hsivonen 4 hours ago 1 reply      
Interesting that they have a non-clang C front end. Did DEC have weird vendor-specific extensions to C?
12
youdontknowtho 2 hours ago 0 replies      
Looking forward to it!
13
dekhn 9 hours ago 1 reply      
ugh. i switched from vms to unix in '88 because the telnet in vms was crappy. other than the wonderful help system, I never found VMS attractive.
14
leandrod 9 hours ago 2 replies      
Call me when they free it.
19
Introducing NodeJS-Dashboard formidable.com
93 points by weareformidable  9 hours ago   4 comments top 3
1
STRML 5 hours ago 0 replies      
A nice little dash. Surprisingly, I found the most valuable part of it is honestly just splitting stdout/stderr, as the rest of the data is relatively easy to access (except event loop lag, but there are packages like toobusy[https://www.npmjs.com/package/toobusy-js] to help with this).

It doesn't work in combination with nodemon or babel-watch (yet), but the code looks very clean & simple so I assume it'll be an easy update.

If you're looking for more granular reporting & monitoring I can heartily recommend pm2's web dashes [https://github.com/Unitech/pm2]. There are also terminal dashes around like pm2-gui [https://www.npmjs.com/package/pm2-gui].

Nice (and clean) work @formidable.

2
michaelmior 5 hours ago 0 replies      
Anyone interested in this sort of thing should definitely check out blessed-contrib[0] which this is based on.

[0] https://github.com/yaronn/blessed-contrib

3
afarrell 4 hours ago 1 reply      
Is there anyone interested in improving the UX of the stock nodejs in-terminal debugger but who just needs money to do so? Because the experience of "run this single test and pop into the debugger at line 14 to poke around" has several warts that I'm surprised haven't been fixed yet:

1) Having to call `cont` at the beginning rather than the debugger stopping at the actual place where the `debugger` line is placed.

2) Having to call `repl` in order to start printing things

3) After calling `repl`, not being able to get back to the mode where you can go to the next line and jump into functions.

4) If jumping into an express.js route, inspecting the request sometimes just results in the message "no frames".

20
Facebook React.js License elcaminolegal.com
401 points by maxsavin  11 hours ago   182 comments top 33
1
lucb1e 9 hours ago 8 replies      
The way I read it, it's not evil.

It's a known and deliberate shortcoming of many licenses (e.g. BSD) not to include patent stuff because it makes everything unnecessarily complex. There was recently an article about why BSD and MIT are so popular, and it's because they're concise and understandable. There is a reason WTFPL exists and some developers resort to it as a way to avoid legalese.

Facebook clearly was aware of this "shortcoming" and being a big player, they might have wanted to be nice and say "we won't sue you for patent infringement if it turns out we have a patent on something React does". Then the managers went "but what if they sue us? Patents are not only for offense but also our defense, we would weaken our defense." And so the clause of "except if you sue us first" came into being.

And now this fuss about the patent part making it not an open source license? Oh come on.

I really don't like Facebook as a company, but this bickering is silly.

2
mjg59 10 hours ago 5 replies      
Version 2 of the Apple Public Source License includes the following termination clause:

12.1 Termination. This License and the rights granted hereunder will terminate:

(c) automatically without notice from Apple if You, at any time during the term of this License, commence an action for patent infringement against Apple; provided that Apple did not first commence an action for patent infringement against You in that instance.

Like the React patent grant, this applies to any patent suit, not just ones that allege that the covered software infringes. The Open Source Initiative considers APSLv2 an Open Source license, and the Free Software Foundation considers it a Free Software license. Note that this clause terminates your copyright license, not merely your patent license - it's significantly stronger than the React rider.

So I think the claim that it's not open source is a bit strong, even though I find this sort of language pretty repugnant.

3
Noseshine 9 hours ago 2 replies      
So I checked StackExchange's law site and found this question - thus far unanswered, but those would be my questions exactly:

http://law.stackexchange.com/questions/14337/q-about-consequ...

It comes down to two questions (quoted from the linked question) - note that those are questions, not assertions:

1)

 > ... if we use any of Facebook's open source projects Facebook can violate *our patents* (of any kind) pretty much with impunity: If we try to sue them we lose the right to patents covering their open source projects(?)
2)

 > I have read opinions that other open source projects that don't have such a clause, for example those from Microsoft or Google, nevertheless have the exact same problem, only that it isn't explicitly stated. Is that true? Is my situation not any better when I only use open source projects without such a clause?
I think that is a good point. The many opinions I see are almost all from people who don't have their own patents to think about, but what happens if you are a company and you do? Would you basically allow Facebook to use any of your patents, because for all practical purposes you can't defend them if you rely on their open source projects?

4
zpao 8 hours ago 7 replies      
Hi, Paul from the React team here. There have been lots of questions about the license+patents combo we use. Recently our legal team answered some of those questions.

https://code.facebook.com/license-faq

5
Egidius 9 hours ago 1 reply      
I agree. I was working at a somewhat large IT company (~30k employees) this year. I made a plea for React and while our development section agreed, it got bounced by legal because of these points.

If Facebook is really serious about Open Source, they also have to make their licence so that every organisation is free to use it.

6
ryanswapp 6 hours ago 0 replies      
So, assuming you actually have a patent, and Facebook actually decides to infringe on that patent, the worst case scenario is that you lose your license to use React. There are principles of fairness and equity in the law that would allow you to stop using React in a reasonable amount of time. So write your frontend in Elm. It probably needed a rewrite anyways.
7
giancarlostoro 1 hour ago 0 replies      
I'm not sure why they didn't just use the MS-PL; it sounds like the same thing. I don't understand why they used BSD if the MS-PL achieves what they want and is backed by Microsoft (surely it would be in their best interest to defend their own license).

https://opensource.org/licenses/MS-PL

8
zelon88 10 hours ago 1 reply      
Does this mean that react could be considered "available source" instead of "open source?"
9
morgante 7 hours ago 3 replies      
These conspiracy theories are really getting old.

Do people really think Facebook developed and released React for the sole, or even primary purpose of gaining patent rights? It's preposterous that so many top engineers would be working on such a goal.

It seems obvious that Facebook just has some overly cautious lawyers. I highly doubt that means Facebook is going to use your usage of React as an excuse to steal your patents.

10
the_duke 9 hours ago 1 reply      
While we are on the subject:

I constantly find the need to read up on licensing. Usually with various blog posts or online information, which never gives me the feeling that I fully understood the legal implications or the context.

Can anyone recommend a book covering software licenses in depth? (ideally not only US centric)

11
bad_user 10 hours ago 2 replies      
Facebook's license is even weaker than a BSD/MIT license without any PATENTS license attached at all. Because in that case the patents grant can be considered implicit, depending on jurisdiction. By including a PATENTS license in that repository, Facebook nullifies the possibility of such a defense.
12
yladiz 3 hours ago 0 replies      
One interesting thing to note is that the actual license makes no specific references to the patents rider, and in fact the patents grant rider is a separate file completely. Does that mean that I have to follow it, if it's not directly in the license? If we look at the license as a contract, shouldn't it be in the license directly, even if it's referencing something outside?
13
guelo 7 hours ago 2 replies      
To me it's the GPL of patents; it's a viral anti-patent license. Once you use React you must disarm in the destructive patent wars. I wish more popular software were released with it.
14
DannyBee 5 hours ago 0 replies      
"A. The Additional Grant of Patent Rights in Unnecessary."

This is legally incorrect.

". But Ive never heard any lawyer postulate that that document does not grant a license to fully exploit the licensed software under all of the licensors intellectual property. Anyone who pushes that view is thinking too hard."

Nobody has pushed this view.

However, the author seems to miss that such rights are likely not sublicensable, because they are implied, and implied rights are pretty much never sublicensable.

That is, i may have gotten the rights. That does not mean i can give someone else the same rights.

Now, there are other possible principles, such as exhaustion, that may take care of this (it's a grey area)

But it's definitely not the case that implied patent rights are somehow going to be better than an explicit grant.

They are for people using software. They are not for people distributing software.

15
falcolas 10 hours ago 0 replies      
I worked with a development manager headhunted from Microsoft, who was quite worried about simple "Taint" from open source software (i.e. ideas gained from viewing open source code making their way into closed source software). I also worked with a company which wouldn't accept code contributions to their OS project; they would do clean room implementations to avoid the legal hassle of incorporating code which wasn't written for hire. So I can certainly see large companies being leery of utilizing software with licenses which don't include patent grants.

Perhaps it's less of an issue with the BSD style licenses, as explicitly called out in the article.

16
woah 10 hours ago 1 reply      
If it becomes a problem, you could drop in a react-compatible library pretty quickly. React native might be tougher but still would probably appear if there was a case with a lot of publicity.
17
andrewvijay 10 hours ago 0 replies      
Frankly, many developers don't really care about the licenses or clauses. Software veterans and corporations care about them more than anyone else. Even the article is kind of hard to grasp in one go; I had to read it a couple of times. Finding the nuances in an OSS license and thinking and acting on them is not easy for a lot of non-native English speakers. More posts like these are needed to make people read and know about these serious issues.
18
omouse 10 hours ago 0 replies      
I wonder what the OSI and Software Freedom Law Center say about this.
19
coding123 8 hours ago 2 replies      
Email I just sent the OSI:

Please see here:https://news.ycombinator.com/item?id=12692552

It turns out companies are now "bastardizing" the license terms. I would love for the OSI to re-evaluate if these licences are truly open source. Open source covers freedom, and I should think that these clauses abridge that freedom since it is very well possible for a company to be required to sue Apple or Facebook over patents. If that unrelated lawsuit "strips" your legal right to use software, that is NOT freedom.

Thanks

20
appleflaxen 8 hours ago 1 reply      
if a person offers two licenses, and you only need to accept one of them to be licensed, then you should be all set.

If I sue facebook and they countersue, then my defense is simply "i am licensed under BSD". The fact that you offered an additional license (they even call it "additional") does not mean that I am required to accept it when the first license stands alone.

Right?

21
nmblackburn 6 hours ago 0 replies      
This has been a major issue for me from the get-go. It goes against open source culture, but that's no surprise, because that is what Facebook loves to do (as it has consistently proved).

They pick something upcoming and recreate it, injecting their own ideals while knocking the original. They then release it to a sea of "pseudo developers" who latch onto it with a "well, it's good because Facebook" mentality and aggressively defend it, giving Facebook more leverage.

Then they rinse and repeat until they have replaced everything the community has created with their equivalent instead of contributing back to those projects like a true supporter of open source would.

Open source is much more than having code on a repo, it's a culture that Facebook is hell bent on "changing".

22
bedros 4 hours ago 0 replies      
Interesting article about why startups should never use React.js because of its license:

https://medium.com/bits-and-pixels/a-compelling-reason-not-t...

23
Twirrim 8 hours ago 1 reply      
Unrelated to the actual content of the page, but why does this site require 1.25 MB of JavaScript to load? It makes up nearly 70% of the entire page, and is responsible for almost 60% of the requests needed to render the page. Do you really need to use that much JavaScript just to render a blog post? Why?

Those webfonts also took multiple seconds to retrieve leaving the page essentially completely blank to visitors until their browsers finally pulled them down. It paused long enough I was wondering if HN had sent sufficient traffic to bring the site down.

Here's an output to a quick stab at loading the page using pingdom: https://tools.pingdom.com/#!/cpZuGy/http://www.elcaminolegal...

24
falsestprophet 10 hours ago 1 reply      
After Alice v. CLS, what patents about React could be enforced?

edit: After Alice there has been a software patent massacre: https://en.wikipedia.org/wiki/Software_patents_under_United_...

25
briankwest 8 hours ago 1 reply      
How is this any different than the MPL 1.1 license? Section 8 of the MPL 1.1 has similar language.

/b

26
shadowmint 10 hours ago 2 replies      
If companies fear that if they try to enforce their patents they will lose access to significant commercial opportunities as a result of not being able to use projects such as react...

...basically, I welcome it.

Patents are harmful.

The FSF has, to my knowledge, made no meaningful progress in significant patent reform.

If this helps, then bring it on.

27
draw_down 7 hours ago 1 reply      
I'm not a lawyer so I can't say with certainty, but I'm sure Facebook could make React's license a bit nicer in some ways.

But I can't buy it as a reason for not using React, that sounds bogus to me. Facebook isn't gonna come sue your beer-ranking app company over a patent beef.

28
microcolonel 10 hours ago 1 reply      
Similar approach to the patchwork of various patent grants on Opus implementations; it's still open source, it just might not be free for 100% of the purposes you could think of.

I think Robert doesn't understand that open source refers to the source code being open to use, derivation, and study. The BSD license also includes a warranty disclaimer, which is the exact same kind of protective language as the patent grant. The Facebook arrangement meets all of those requirements, with the one stipulation that you forfeit the license when you enter patent litigation against Facebook, whether a counterclaim on the granted patents or primary litigation over unrelated patents. I don't consider countersuing Facebook over patents applying to React, while USING REACT, to be a serious fundamental software freedom.

29
iplaw 10 hours ago 1 reply      
So this attorney's problem is that rather than ambiguously granting a license to the patent claims necessary to implement this software, they decided to explicitly grant such rights? I see no problems.
30
JoshTriplett 11 hours ago 4 replies      
Other Open Source licenses have patent termination provisions. Apache 2.0 (which React used to use) says "If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed.". There are other licenses as well that include more extensive patent termination provisions, such as the APSL.
31
pitaj 10 hours ago 1 reply      
I feel like if React doesn't "count" as open source software, then neither should anything licensed as GPLv3.
32
romanovcode 11 hours ago 5 replies      
React.js is basically licensed so that you cannot make anything that can compete with the Facebook corporation.

Example:

1. You create something similar to Instagram using React.

2. It gets popular

3. Facebook sues you and takes all your reacts and reducers

33
cjbprime 9 hours ago 2 replies      
The argument that "This is Not Open Source Software" feels unsupported and very sloppy.

> Thus, the licensee pays a price to use the library. It is not a price paid with money. [..] I could be missing something, but I have never seen any other software license use such a condition and also claim to be an open source license.

This just isn't thinking creatively. The GPL also requires a "price to be paid, but not with money" -- you give up your right to keep changes you make secret (if you distribute them). Yet no-one seriously argues that the GPL isn't an open source license.

If there is something about giving up the right to file patent lawsuits that is totally different to giving up the right to keep your changes secret, the article doesn't say what that difference is. Giving up the right to keep your changes secret is surely more stringent than giving up your right to file patent infringement lawsuits against one company. Why, then, should the latter be a dealbreaker for an open source license?

21
Scaling in Bitbucket Cloud: new features and reliability numbers bitbucket.org
74 points by kannonboy  8 hours ago   27 comments top 5
1
ronnier 6 hours ago 4 replies      
The Bitbucket UI is a mess.

* It takes over your browser's native find feature.

* It's javascript heavy and has multiple panes that scroll, collapse, and expand.

* Viewing diffs requires you to click each file that changed (unlike every other diff/code browser I've used where it has one long page with all the changes)

2
netcraft 8 hours ago 2 replies      
I've used and loved bitbucket for years for private project hosting, but for the occasion where I need to put more devs on a particular project I'm not sure why I would pay for it now over using gitlab for free. It might be different if it was 5 users per project for free, but 5 collaborators across all of my projects is just too limiting - especially when I can get everything bb has and more from gitlab.

At the same time I understand bb needs to make money - hopefully they'll make enough to keep up the competition.

3
xroche 5 hours ago 0 replies      
Bitbucket is nice, but why the 1 GB/2 GB repository limit, especially for paid users (enterprise version)? I never understood this limit...
4
falsedan 7 hours ago 1 reply      
I was recently looking at BitBucket Data Centre, to move our in-house git server to something with better clustering, high availability, and failover (rather than doing it ourselves). I was a little disappointed to find out the HA features are provided by storing your repos on an NFS server and detaching/attaching it to the primary node.
5
vegardx 7 hours ago 1 reply      
Doesn't really matter that they've made Pipelines available for all users when it's lacking some pretty basic functionality, like the ability to do something on failed builds.

But other than that it looks really good and responsive. Going to be interesting to see how the influx of new users will impact the average build time.

22
Releasing Serverless Framework V.1, and Fundraising serverless.com
156 points by matlock  13 hours ago   46 comments top 12
1
southpolesteve 9 hours ago 1 reply      
Congrats on the raise, but I'll be honest: it really rubs me the wrong way that they renamed from JAWS to "Serverless Framework". They should have picked a different name. The word "serverless", for better or worse, was what the community settled on. Lots of companies and individuals use it in many different ways, and did before this project existed. But they are trying to own it[1]. Not being a good community member, IMO.

Edit: An example of the kind of thing I am concerned will happen[2]. Use the word "serverless" in a project? Get a takedown notice.

[1]http://tmsearch.uspto.gov/bin/showfield?f=doc&state=4802:t7t...

[2]https://twitter.com/sindresorhus/status/776142274564616192

2
bjacobel 12 hours ago 6 replies      
Are there other examples of an open-source community project taking on several million dollars in venture capital funding? It seems a bit odd to me.

I'm not sure I'm comfortable using a presumably open-source, free (beer and speech) tool knowing that the group behind it will have to find a way to monetize their users in order to justify the investment from VCs at some point down the road. Open-source developers should of course be able to be compensated for their work, and the project has to find a way to sustain itself (I work for a company whose main product is open-source so I know this better than most), but the venture capital model doesn't seem like a good fit with the interests of the community, in my opinion.

That said, Serverless is a great tool, and congrats on the 1.0. Thanks to the team for their hard work.

3
benrockwood 11 hours ago 2 replies      
I recently started using APEX. It doesn't rely on CloudFormation and supports hacking in Golang support. It's worth a look if you're getting more serious about Lambda development and are interested in other options. http://apex.run/
4
IanCal 2 hours ago 0 replies      
Good work on the funding.

I saw a link on the side to sign up for a beta of the serverless platform, so I did but just keep getting redirected back to the blog post. Is anything supposed to happen?

5
cphoover 10 hours ago 1 reply      
I really love the new development being done to simplify cloud deployments of stateless horizontally scalable services.

Once Serverless supports other *aaS offerings I think this will really take off.

In addition to the big boys (Amazon, IBM, Google, Microsoft), I'd love to see some alternative stable open source providers come about. Maybe something built on top of kubernetes or docker swarm.

6
rco8786 7 hours ago 2 replies      
How does this work when a database/persistence layer gets involved? Having a DB connection per function that calls it seems bad.
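One common answer to this concern, in the Node.js Lambda runtime at least, is that module-scope state survives across "warm" invocations of the same container, so a connection can be cached rather than reopened on every call. A minimal sketch of that pattern follows; `connectToDb` is a hypothetical stand-in for a real driver's connect function, not part of any framework's API.

```javascript
// Sketch (assumed names, not Serverless Framework API): cache the DB
// connection outside the per-request path so warm invocations reuse it.
function makeHandler(connectToDb) {
  let dbPromise = null; // survives across warm invocations of this container

  return async function handler(event) {
    if (!dbPromise) {
      dbPromise = connectToDb(); // only the first (cold) invocation connects
    }
    const db = await dbPromise;
    return db.query('SELECT 1', event);
  };
}

// In a real Lambda you would export it, e.g.:
//   exports.handler = makeHandler(connectToRealDb);
```

Note this only helps within one container; many concurrent containers still mean many connections, which is why a pooler in front of the database is often suggested as well.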
7
musha68k 12 hours ago 0 replies      
Great project - best of luck to team Serverless!
8
dolzenko 12 hours ago 3 replies      
So it's the "Serverless" Framework that seems to be considerably tied to AWS servers.

EDIT: Only writing this because it's in fact possible to imagine a serverLESS framework these days (web workers, p2p, etc.), but this seems to be just about more volatile servers.

9
miguelrochefort 12 hours ago 1 reply      
This is definitely a step forward for software development. I'm interested to see what people build with it.
10
mxuribe 12 hours ago 0 replies      
Best of luck!
11
michaelmior 5 hours ago 0 replies      
Does anyone else find the period in "V.1" odd? It seems like it should be V0.1, not V1.0.
12
syngrog66 3 hours ago 0 replies      
horrible name
23
Russian ship loitering near undersea cables hisutton.com
72 points by willvarfar  2 hours ago   30 comments top 12
1
leggomylibro 2 hours ago 4 replies      
Neal Stephenson wrote a bit about this in Cryptonomicon; laying new undersea cable is both expensive and time consuming, but the cost of cutting existing ones is fairly low. So if any sufficiently-funded individual, corporation, or nation-state wanted to hold a gun to the world's head, cutting undersea data cables wouldn't be a bad way to do it.

The problem is, you can't make that kind of threat in a subtle way, so to consider something like it you would have to be some kind of international pariah with a warmongering streak and a history of 'lying in plain sight' about your own nefarious deeds.

Edit: Okay, we're in better shape today than we were in the '90s and cutting off Cyprus' internet wouldn't cripple the world, but we still don't have THAT many cables running across the Atlantic and Pacific.

2
themodelplumber 1 hour ago 2 replies      
Russia is doing a lot of fearmongering in the run-up to the US elections. I stumbled across some agitprop on Twitter yesterday, where I read that Russia was recalling all students who were studying abroad. Out of curiosity (to say the least) I tracked down the source article, which was a) not available in English, and b) just Russians criticizing privileged Russians who sent their kids abroad to study. Looking around the rest of the agitprop stuff that I could find at a cursory glance, there is a ton of literal FUD going around right now, to the tune of "WWIII imminent."

Wonder who they're trying so hard to influence?

3
willvarfar 2 hours ago 0 replies      
The US famously spied on Russian underseas cables in Operation Ivy Bells https://en.wikipedia.org/wiki/Operation_Ivy_Bells during the cold war. A spy sold the secret to the Russians and they recovered the recorders.

Of course they used submarines, so there was no plot on normal publicly accessible marine maps.

The Russians have several subs with mini-subs too and are building more; there is a good list of links at the bottom of the article!

4
dfsegoat 1 hour ago 0 replies      
Looks like the ship is currently off the coast of Lebanon.

You can watch/follow this ship in semi-real time as long as its transponder is active/in-range of a receiver (and not being spoofed):

http://www.marinetraffic.com/en/ais/details/ships/shipid:121...

5
kbuck 2 hours ago 3 replies      
6
gozur88 1 hour ago 0 replies      
Seems like at this point you're really depending on encryption to keep your data safe. You have to assume one or more entities is poring through your bits looking for interesting information.
7
adriancooney 1 hour ago 0 replies      
Ars Technica had a great piece on how these undersea cables work: http://arstechnica.com/information-technology/2016/05/how-th...
8
xXx_swat21_xXx 1 hour ago 2 replies      
There is no evidence that the Russians are "tapping" or "sabotaging" undersea cables; the Americans and Israelis would also tap those cables, and I think it's far more likely the Russians are doing counter-espionage of taps on Turkish, Syrian and Lebanese cables, to win favour with their governments.
9
okket 2 hours ago 1 reply      
Cyprus must have an extremely good internet connection :)
10
platz 2 hours ago 0 replies      
the ship appears between lebanon/syria and cyprus, not turkey and cyprus.
11
ageofwant 49 minutes ago 1 reply      
Ships do not "loiter" in international waters. "Loiter" implies the loiterer is violating some rights the accuser imagines himself to have. Russia may be simply legally installing its surveillance equipment on cables it found in international waters. There's no 'loitering' involved here.

I invite you to study the speeches of Anson Chan, lest you believe that this irritating political weasel-wording we are so used to currently is universal.

12
aw3c2 2 hours ago 1 reply      
Well, how accurate are those cable lines?
24
How One 19-Year-Old Illinois Man Is Distorting National Polling Averages nytimes.com
175 points by ptrkrlsrd  5 hours ago   63 comments top 12
1
aetherson 4 hours ago 5 replies      
Nate Silver at 538 talks about the whole dissecting polls deal, or "unskewing" them. All polls must make methodological choices, and all of those choices have advantages and disadvantages. Spending a lot of time trying to dissect those choices and passing judgment on them is not as productive as:

1. Looking at aggregates of lots of polls.

2. Looking at the variance that a poll captures from one iteration of it to another.

Or at least, so he claims. Obviously, he runs a poll aggregator, using a model that heavily weights the trendline of individual polls, so he has a dog in this fight.

2
tankenmate 4 hours ago 0 replies      
One thing I noted is that the NY Times says most polls have four categories of age and five categories of education; except these aren't really categories, they are ordinal variables.

Age and level of education are slightly co-variant (you don't get many 18 year olds who have a PhD). Because the age classification and education levels are ordinal you should use an ordinal smoothing [0] function to turn them into pseudo continuous variables. Given the continuous and co-variant independent variables (as well as other categorical independent variables) and a categorical dependent variable the best analysis is probably to use a quadratic discriminant analysis (QDA).

[0] http://epub.ub.uni-muenchen.de/2100/1/tr015.pdf and http://cran.r-project.org/web/packages/ordPens/ordPens.pdf

3
jakub_g 3 hours ago 2 replies      
Side rant: what the heck did they implement on NYTimes.com?! When I quickly click the text (which I compulsively do to select a paragraph etc.), it changes the font size (sigh)
4
matthewbauer 4 hours ago 2 replies      
This is really fascinating. I get why the poll creators made these decisions, but the results of the weighting lead to a ridiculous result compared to other polls. Supposedly this poll was extremely accurate in 2012, so who knows?
5
bbctol 3 hours ago 0 replies      
I'm not sure whether he knows his role, and I'm curious how tracking polls like this try to account for the large media attention paid to the poll and its methodology. This guy is known to stats nerds, and they've been tracking his moves and (rather mean-spiritedly) calling him "Carlton" for a while now.
6
eatbitseveryday 3 hours ago 1 reply      
Will the era of click-bait titles come to an end?
7
artursapek 4 hours ago 0 replies      
This makes me feel so much less confused about that poll
8
sytelus 4 hours ago 0 replies      
TLDR: 19-year-old black men are a very small demographic and so have a small sample size. Apparently, the sample for the LA Times poll includes an outlier who favors Trump, who then gets weighted disproportionately, leading to the conclusion that Trump is favored by young black voters.
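The mechanism is easy to reproduce with made-up numbers: a poll topline is a weighted average, so one respondent from a tiny demographic cell who carries a very large post-stratification weight can move the whole result. A toy sketch (illustrative numbers only, not the LA Times poll's actual data or weights):

```javascript
// Toy illustration of disproportionate weighting: the topline is the
// weight-weighted share of respondents who support Trump.
function weightedTrumpShare(respondents) {
  let support = 0;
  let total = 0;
  for (const r of respondents) {
    total += r.weight;
    if (r.supportsTrump) support += r.weight;
  }
  return support / total;
}

// 99 respondents at weight 1, roughly evenly split...
const sample = [];
for (let i = 0; i < 99; i++) {
  sample.push({ weight: 1, supportsTrump: i % 2 === 0 });
}
const unweighted = weightedTrumpShare(sample); // 50/99, about 0.505

// ...plus one outlier from a tiny cell, up-weighted 30x.
sample.push({ weight: 30, supportsTrump: true });
const skewed = weightedTrumpShare(sample); // 80/129, about 0.62
```

A single respondent shifts the estimate by more than ten points, which is the behavior the article describes.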
9
throwawayqwe 4 hours ago 3 replies      
I'm posting under a throwaway account just to say I agree with you; if that says anything.
10
LyalinDotCom 4 hours ago 2 replies      
11
blank_dan 4 hours ago 1 reply      
Let's go find the really heavily weighted members of _all_ the polls and dox them too. This way we can screw them all up. Not just one that is influenced by a potential Trump voter.
12
MrZongle2 4 hours ago 5 replies      
Regarding the Times' decision to run this article, I wonder how much of it was based upon "hey, polling is kind of goofy" and how much of it was "look! Here's another way we can show that Trump isn't really resonating with voters!".
25
JavaScript singing synthesis library github.com
73 points by gattilorenz  10 hours ago   19 comments top 9
1
Animats 6 hours ago 1 reply      
Cute as a Javascript hack, but not going to compete with Vocaloid or Festival Singer.

Somebody really needs to crack singing synthesis. Vocaloid from Yamaha is good, but it works by having a live singer sing a prescribed set of phrases, which are then reassembled. Automatic singer generation is needed.

Figure out some way to use machine learning to extract a singer model from recorded music and generate cover songs automatically. Drive the RIAA nuts. Get rich.

2
david-given 3 hours ago 0 replies      
<nostalgia>

Way back in the mid-1980s in the United Kingdom, and there were few places more 80s than that, Superior Software produced Speech!, a software speech synthesis program for the BBC Micro, a 6502-based machine running at 2MHz which didn't have PCM audio. It could reasonably reliably read out ordinary English text in a fairly robotic voice.

It was 7.5kB of 6502 machine code.

There's a writeup from the author here: http://rk.nvg.ntnu.no/bbc/doc/Speech.html and a demo here: https://www.youtube.com/watch?v=t8wyUsaDAyI

It was an utter sensation (featuring, among other places, as the computer voice in Roger Waters' Radio Kaos).

It's obviously not going to win awards, being barely intelligible, but if you can achieve that with a table of 49 phonemes each of 128 4-bit samples, then producing basic speech isn't that hard. I think that mespeak.js, which is what this demo is based on (and which is pretty cool, BTW), works on the same principle, although with obviously better samples.

(Unlike producing human-sounding speech, which is appallingly difficult.)

3
usdivad 1 hour ago 0 replies      
Project author here, just want to say thanks gattilorenz for sharing (was quite the pleasant surprise to see this on the front page!) and everyone for the feedback + fascinating projects, ideas, links etc. Really cool to see so much enthusiasm for speech+singing synthesis and Web Audio!
4
vortico 3 hours ago 0 replies      
It's been a good year for the English singing synthesizer world, with the launch of chipspeech. (https://www.plogue.com/products/chipspeech) But I'm pretty interested in whether more realistic singing synthesizers will be made, since there are a few recent new voices by Acapela Group and others developed for non-singing speech.
5
ohitsdom 4 hours ago 0 replies      
This is great! I'm in the very early stages [0] of creating a framework to automate and control physical instruments through hardware & software. Never thought voice would be possible, I'll have to check out integrating this! Thanks!

[0] https://github.com/fotijr/MetroDom

6
grimmdude 10 hours ago 4 replies      
This is great, nice job! I'm working on a midi player in JavaScript; it would be interesting to use this as the sound font. Maybe assigning certain words to certain pitches. https://github.com/grimmdude/MidiPlayerJS
7
RodgerTheGreat 3 hours ago 1 reply      
On Safari, instead of using the normal "AudioContext" constructor you must create a "webkitAudioContext"; a feature-detection check for this would be a nice addition.
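The feature detection this comment asks for can be sketched in a few lines: pick whichever constructor the browser exposes. Taking the global object as a parameter (rather than touching `window` directly) is an assumption made here to keep the helper testable; it is not how the project currently does it.

```javascript
// Sketch of the suggested feature detection: older Safari exposes the
// prefixed webkitAudioContext constructor instead of AudioContext.
function getAudioContextCtor(globalObj) {
  return globalObj.AudioContext || globalObj.webkitAudioContext || null;
}

// Browser usage (sketch):
//   const Ctor = getAudioContextCtor(window);
//   if (Ctor) { const ctx = new Ctor(); }
//   else { /* Web Audio unsupported */ }
```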
8
tisryno 9 hours ago 1 reply      
The demo doesn't allow you to put in your own lyrics; it keeps loading exactly the same set of words. Really awesome project though.
9
dmitripopov 4 hours ago 0 replies      
Sounds pretty creepy, especially right after watching "Westworld" :)
26
What are malicious USB keys and how to create a realistic one? elie.net
94 points by liotier  11 hours ago   57 comments top 11
1
stcredzero 7 hours ago 2 replies      
A HID-based attack has to spawn a terminal and very quickly inject a set of commands that is very visible but only for a short period of time. Once the attack has been carried out, there is nothing left to see, so this type of attack is less obvious than the social engineering one.

MEMS microphones are tiny. It should be possible to combine data from one of those and a light sensor, which would make an HID-based attack far less likely to be detected. The frequency profiles of keystrokes and someone pushing away an office chair should be fairly easy to discern. You'd want to make it more likely that the attack would occur after someone had left their desk. (Hopefully with the machine unlocked.)

EDIT: Another idea: The USB key uses autorun to pop up what looks like a spammy ad for a PC "cleaner" utility, or something you'd expect on a USB key conference swag item. Its actual purpose is to cover up the shell's window, or to contain the exploit itself.

2
s_kilk 6 hours ago 1 reply      
A few years ago, a co-worker came back from a MongoDB event with a bag of swag, including a MongoDB branded USB drive.

When he plugged it in, it acted as a keyboard and managed to open the browser to a MongoDB promotional page.

Needless to say he freaked out a bit. Some marketing goons (not restricted to the people at MongoDB) seem to think this kind of thing is a great promotional tool.

Now that I think of it, my wife has a similar story about her University distributing this same kind of USB/botnet thing to students.

3
jotux 9 hours ago 2 replies      
Making a mold and casting a custom case seems incredibly cumbersome compared to just buying a USB key case and fitting the board into it: http://www.mouser.com/New-Age-Enclosures/Enclosures/Enclosur...
4
achr2 2 hours ago 1 reply      
A really sneaky method would be to create a USB suppository, that looked like a large USB key but would leave the active part inserted inside the socket after the target pulls out the seemingly defective key.
5
Sanddancer 3 hours ago 0 replies      
A zero-day key wouldn't have hardware costs that would be significantly different from an HID key. A lot of microcontrollers will gladly announce themselves to be whatever sort of device class you want them to be, and which vendor and device IDs to use. Also, if you're worried about bulk, putting an LED or something that flashes and looks pretty would also lower peoples' guards as to why it's so big. USB hacking is an area where there are wide open fields to play with.
6
rwmj 7 hours ago 9 replies      
If one finds a USB key in a parking lot, is there any safe way to find out what's on it?
7
Palomides 9 hours ago 1 reply      
It's interesting to see mention of 0-days against USB drivers as a vector; the article says "AFAIK, none of those have been publicly discussed." It seems very likely there are vulnerable drivers.
8
DeepYogurt 7 hours ago 0 replies      
9
mschuster91 6 hours ago 2 replies      
There's a fourth class of malicious USB keys: those which discharge -100V across the USB data lines, aka "usb killers". Drop a couple of these on a huge parking lot to create a boatload of damage.
10
PokeAcer 4 hours ago 0 replies      
I'm surprised nobody has made something for testing; how much would it take to make something you plug a USB key into to see what it does?

(I.E. "USB is typing"/"USB is doing")

11
gwu78 8 hours ago 3 replies      
"(I didn't know about it!)"

The reference is to the existence of /dev/tcp when using the "Bourne again" shell. Some other large shells, and gawk, have this "feature" as well.

Then I noticed he is head of something technical at Google.

We are always reading about the rigor of this company's interviews in testing candidates for practical knowledge.

I guess knowledge of important capabilities of widely/universally installed software is not something they are testing for?

I mean, I am sure there are probably hundreds of employees there who know these things. And they have some legendary programmers on the payroll. It is like a miniature Hall of Fame of computer programming.

I am not even sure what this all means, but I find it interesting to see the gaps in knowledge considering jobs with this company are so highly sought after.

And they are entrusted with protecting an enormous quantity of other people's data.

27
Desktop Notifications for console logs in browser github.com
83 points by harkirat96  10 hours ago   10 comments top 5
1
sheer_horror 8 hours ago 2 replies      
During development, I have to check the browser's inspector periodically to see what my console.log()s are saying. This leads to having two browser windows open: the browser and the inspector. And in the inspector, I usually only need to see the console. With these desktop notifications, I can develop and debug web apps with just two open windows: a single browser window and a terminal. And it's only adding ~100 lines to your project.
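The core idea can be sketched in a handful of lines: wrap `console.log` so each call also fires a desktop notification. This is a sketch of the concept, not the linked project's actual code; the `notify` callback is injected here, where in a browser it would be something like `msg => new Notification('console.log', { body: msg })` after `Notification.requestPermission()` has been granted.

```javascript
// Sketch: wrap console.log so every call also emits a notification,
// while still forwarding to the real console.
function makeNotifyingLog(baseLog, notify) {
  return function log(...args) {
    const message = args.map(String).join(' ');
    notify(message);   // desktop notification in the browser
    baseLog(...args);  // normal console output, untouched
  };
}

// Browser usage (sketch):
//   console.log = makeNotifyingLog(console.log.bind(console),
//                                  msg => new Notification('log', { body: msg }));
```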
2
GnarlyWhale 9 hours ago 1 reply      
This is actually a really clever project, not sure why this is the first time I'm seeing something like this.
3
brak1 6 hours ago 1 reply      
Am i missing something here?

Wouldn't it be easier as a browser extension that just gives a notification every time console.log is called?

4
newtons_bodkin 8 hours ago 0 replies      
Very neat. Plan to take a look at the source code later today. I was thinking of doing a chrome extension that accomplished the same goal, but just used a small pane on the bottom right instead of notifications.
5
Globz 7 hours ago 0 replies      
Great job, really useful!
29
PyPy3 5.5.0 released morepypy.blogspot.com
203 points by wclax04  11 hours ago   23 comments top 6
1
quantumhobbit 10 hours ago 2 replies      
I really want pypy to be the future of Python along with python3. I haven't used it in a while but the performance improvements I could get by switching from python somecode.py to pypy somecode.py were magical.

So what is the best way I can help support pypy3? Are there any easy issues I can help contribute to?

2
oliwarner 9 hours ago 1 reply      
When I was still locked to Python 2 I really wanted to use PyPy but couldn't due to CPython requirements. Now we're using Python 3.5 async stuff everywhere.

I wonder if I'll ever get to use PyPy in anger or will something in Python 3.6+ distract me (those f-strings look pretty fancy, dammit).

3
sumitgt 3 hours ago 2 replies      
I am new to the Python ecosystem. I use Python for solving problems on HackerRank and noticed that switching from python3 to pypy3 can mean the difference between timing out on some problems vs. getting them to work.

Can someone ELI5 how PyPy achieves such a difference and why that improvement cannot be contributed back to Python 3?

4
joelS 10 hours ago 0 replies      
Great news, looking forward to full 3.5 support.
5
Alex3917 10 hours ago 1 reply      
Is going straight to Python 3.5 easier than releasing a 3.4 compatible version first?
6
mozumder 10 hours ago 2 replies      
Would like to see a benchmark of common algorithms or server-side functions between PyPy3, NodeJS, & Go.
       cached 13 October 2016 01:02:01 GMT