Hacker News with inline top comments (Best, 15 Oct 2015)
Twitter announces layoffs sec.gov
577 points by uptown  1 day ago   378 comments top 68
Sidnicious 1 day ago 14 replies      
> Emails like this are usually riddled with corporate speak so I'm going to give it to you straight.

I tried rewriting his email to live up to this promise:

- - -


We're cutting our workforce to strengthen Twitter as a company.

The team has been deciding how to best streamline Twitter, Vine, and Periscope to put their focus on the projects which will have the greatest impact. Moments, which we launched last week, is a great beginning. It's a peek into the future of how people will see what's going on in the world.

We plan to cut up to 336 people. This was a tough decision, and we'll offer each person a generous exit package and help finding a new job. Product and Engineering are going to make the most changes. Engineering will be smaller but remain the biggest percentage of the organization, and other departments will be cut in parallel.

This isn't easy. We'll honor those who we're losing with our service to all the people who use Twitter. We'll do it with a more purpose-built team. Thank you all for your trust and understanding here. As always, please reach out to me directly with any ideas or questions.


celticninja 1 day ago 12 replies      
> The world needs a strong Twitter

Really? Does it? I think Twitter needs a strong Twitter; shareholders need a strong Twitter; even Twitter employees need it. However, the world is, at best, ambivalent about Twitter. If it disappeared tomorrow, a replacement would spring up within a few weeks, if the world really needed a way to shotgun messages into the ether.

ghshephard 1 day ago 4 replies      
The number, 336, is roughly 10% of their employees - which is pretty much exactly the number that Jack Welch recommended turning over each year to improve the work force.

I often wonder whether these "layoffs" aren't actually layoffs, but simply performance based assessments. It's not like Twitter is shutting down an entire office, or abandoning some technology, and letting everyone associated with that office/technology go - presumably they are being selective on other factors as to who they let go - and I'm guessing that performance is likely a key factor.

If, over the next year, Twitter doesn't hire back that 10%, or hires employees in different technologies/positions (i.e. web developers instead of thick-client developers, salespeople instead of developers, etc.), then this is a layoff. But if headcount returns to the same number, in roughly the same job areas, then this is just a performance-based annual rank-and-yank process.

uptown 1 day ago 4 replies      
"we plan to part ways with up to 336 people from across the company. We are doing this with the utmost respect for each and every person."

Bart might disagree:


danso 1 day ago 2 replies      
> The roadmap is focused on the experiences which will have the greatest impact. We launched the first of these experiences last week with Moments, a great beginning, and a bold peek into the future of how people will see what's going on in the world.

That Moments is mentioned so high up in the email isn't particularly reassuring...since it means they haven't launched many other initiatives of note recently. Moments as a feature is extremely disappointing given the years of interesting discoveries that Twitter has yielded algorithmically via its, well, "Discover" tab. What's on Moments looks like a half-baked newspaper front page except when you click on an item, you go to tweets about that item instead of a full story.

I don't want to pile on the project as it is new...but it should've been given more thought and design time given how much prominence "Moments" has on the interface (it is one of four main icons on the menubar)...Nearly all of the stories are hours old...e.g. "Wave of terror attacks hits Jerusalem" and "Playboy covers up"..."FedEx truck splits in two", granted, is news to me...but not something that makes Twitter unique to me.

There's so much more potential in the Trends section...OK, maybe Twitter wants to filter out potentially visually NSFW topics like #NoBraDay...but things like #MH17 and #VMworld and #ILoveYouAboutAsMuchAs...just show me an automated feed of tweets by reputable sources (rather than spambots or random kids) so I can understand why these topics are suddenly trending without having to click through the trends tag and sort through an overwhelming timeline.

edit: that said, I like all the other products...besides core Twitter, Vine and Periscope are standouts (at least, as a consumer)...I just think that "Moments" isn't worth putting into the spotlight, unless there is literally nothing else to be proud of publicly.

acaloiar 1 day ago 7 replies      
> Emails like this are usually riddled with corporate speak so I'm going to give it to you straight.

Well, since you said it that way, I should assume that what comes next will not sound like a steamy pile of meandering corporate speak, right?

> The team has been working around the clock to produce streamlined roadmap for Twitter, Vine, and Periscope and they are shaping up to be strong. The roadmap is focused on the experiences which will have the greatest impact.

A roadmap focused on high-impact experiences. Got it. I hope your firings go really well, Bob.

ChrisLTD 1 day ago 5 replies      
"The world needs a strong Twitter, and this is another step to get there."

Let's not get carried away here. Twitter is great. I use it too much of the day. But the world hardly needs Twitter.

MattBearman 1 day ago 5 replies      

 "Emails like this are usually riddled with corporate speak so I'm going to give it to you straight."
Three paragraphs later...

 "So we have made an extremely tough decision: we plan to part ways with up to 336 people from across the company."
Edit: My bad, I should have been clearer in what I meant. There isn't really any corporate speak, but I wouldn't call 3 paragraphs of fluff 'giving it to you straight'

dang 1 day ago 0 replies      
This was discussed pre-announcement at https://news.ycombinator.com/item?id=10364197.
antirez 1 day ago 1 reply      
> Emails like this are usually riddled with corporate speak so I'm going to give it to you straight.


> We will honor them by doing our best to serve all the people that use Twitter.


mrweasel 1 day ago 3 replies      
I'm a little surprised that they're only firing 336 people. Unless this is just the first round of layoffs, and more will follow once products and management have been streamlined/trimmed, whatever you want to call it.
ngoel36 1 day ago 1 reply      
This truly sucks - I'm sorry to hear that. If you're one of the unlucky engineers that got caught up in all this - reach out to the email address in my profile. We're hiring tons of awesome engineers at Uber, and if that's not the right fit for you I can help get you connected to other SF companies as well.
josefresco 1 day ago 1 reply      
The "Moments" feature will be a failure. They're essentially building an editorial model on top of Twitter - something they (as the platform creators) shouldn't be worrying about.

The decentralized model of Twitter's content creation is an asset. If you group those into a more traditional top down model, you lose the uniqueness and power of the Twitter platform.

TheMagicHorsey 1 day ago 0 replies      
I'm seriously impressed by Twitter's service ... doing fan-out for so many popular celebrities, so seamlessly, for so many readers is an accomplishment.

But I'm seriously curious why they have 4,000+ employees.

What in the hell are all those people doing?

~300 people being laid off is nothing. I wouldn't have been shocked if they said they were laying off 1,000+ people. I think entire departments probably need to go.

There has to be a lot of dead weight at Twitter.

xmpir 1 day ago 3 replies      
Why is this email on sec.gov?
GuiA 1 day ago 0 replies      
> Let's take this time to express our gratitude to all of those who are leaving us.

Gratitude in the form of not telling employees that they were laid off, and letting them find out when they try to check their email in the morning? [0]

Fuck that noise.

[0] https://twitter.com/bartt/status/653946266938818561 + exact same thing happened to a good friend of mine who didn't tweet about it + hearing reports of it happening to others

petercooper 1 day ago 0 replies      
"[..] Engineering will move [..] faster with a smaller [..] team [..] we [will] part ways with [..] 336 people. [..] with the utmost respect [..] the world needs a strong Twitter, and this is another step to get there."

Or basically, Twitter is weaker with you in it.

chipgap98 1 day ago 0 replies      
There seems to be a large disconnect in this thread between what actually counts as corporate speak and what is simply avoiding being blunt and insensitive.
Wintamute 1 day ago 0 replies      
Slightly OT, but honestly I think the world really doesn't need Twitter. If you view the evolution of the internet as a phenomenon fundamentally linked to the emergence of a global cultural consensus, or even consciousness, then Twitter should be viewed as harmful: it reduces a sizable fraction of that bandwidth to 140 chars, vicious echo chambers, and a communication mechanism custom-designed to bump people's thoughts out of context for the purposes of ridicule. I hope some of these coming changes directly address the harm current Twitter does to the quality of human communication on the net.
piratebroadcast 1 day ago 0 replies      
I wonder what this means for the Boston Twitter offices (Crashlytics and Bluefin Labs) - no mention of them, whereas Vine and Periscope are mentioned.
jackgavigan 1 day ago 0 replies      
Well, the stock's up 4.35% today.

That's roughly $2.7m per fired employee!

Kristine1975 1 day ago 1 reply      
TL;DR: We're making some changes to the company. We're firing the 336 of you we don't need anymore. Thanks for your work.

Everything else is fluff. But I guess they have to sugar-coat it a bit with "utmost respect" and "tough decision".

vegancap 1 day ago 4 replies      
What's the reason for this e-mail being housed on a .gov TLD?
spikels 1 day ago 0 replies      
Revenue per employee is an interesting performance metric, although not usually applied to growth companies. Twitter's was growing fast relative to other big public tech firms but was still much lower than most.


ducuboy 1 day ago 0 replies      
Focused on what exactly?

> We launched the first of these experiences last week with Moments, a great beginning, and a bold peek into the future of how people will see what's going on in the world.

Wonder what's next, because with Moments it feels like they still have no idea what to do with this platform. It would be such a pity to turn Twitter into TV-like manually curated breaking news channels.

leothekim 1 day ago 1 reply      
"up to 336" -- that's a very specific number, suggesting this was somewhat surgical. Most other leadership types are of the "any manager worth her salt should be able to cut 10% of her staff, so do it" school. Layoffs are never easy, but it sounds like Jack is doing his best to make the right cuts and take care of those affected.
grandalf 1 day ago 0 replies      
When 80% of the promotional mail in my inbox is from Twitter trying to drive engagement, something is going badly.
RyanMcGreal 1 day ago 1 reply      
> Emails like this are usually riddled with corporate speak so I'm going to give it to you straight.


> we plan to part ways with up to 336 people from across the company

Nothing says "give it to you straight" like using "part ways" to mean "you no longer have a job".

ape4 1 day ago 1 reply      
I guess the message was a bit too long for a tweet.
mobileexpert 1 day ago 0 replies      
What about Twitter's non-core products? GNIP and Fabric (Crashlytics and Answers)?
oldmanjay 1 day ago 1 reply      
The sheer vulture-like behavior of recruiters around this event is a sight to behold! Truly recruiting is the occupation for the shameless.
jhwhite 1 day ago 1 reply      
Maybe I'm a little self centered but this line:

> We will honor them by doing our best to serve all the people that use Twitter.

seems a little pep talky to me for the people left.

I feel saying this would have been better:

> We will honor them with the utmost respect for each and every person. Twitter will go to great lengths to take care of each individual by providing generous exit packages and help finding a new job.

cubano 1 day ago 2 replies      
On the bright side, Twitter will only fire 140 people at a time, giving the others time to prepare.
jjzieve 1 day ago 0 replies      
I feel like this could spark a massive bubble pop. I mean if investors have lost confidence in a company with one of the largest user-bases in the world, what does that say about all the startups that are valued so high and will likely never make a dime, unless they're bought out.
tony_b 1 day ago 0 replies      
Good thing that his email about a roadmap for moving forward with a restructuring for a nimbler team, and an organization streamlined in parallel, as well as an invitation to reach out, wasn't riddled with corporate speak the way those emails usually are.
kenko 1 day ago 0 replies      
"We are moving forward with a restructuring of our workforce so we can put our company on a stronger path to grow. Emails like this are usually riddled with corporate speak so I'm going to give it to you straight."

Riddled with corporate speak like ... the very first sentence.

myth_buster 1 day ago 0 replies      
Tech community reaching out with job postings.


hartator 1 day ago 0 replies      
https://about.twitter.com/careers/positions Still doing a lot of hiring.
idibidiart 1 day ago 0 replies      
"Dorsey is no Jobs" sounds very fitting now.
ausjke 1 day ago 1 reply      
How many employees does it have? I recall it's about 4000 or so, so this is like a 10% cut?

It's better to swing a big axe once instead of slicing gradually. Will this be it?

Been there done that, and it sucks.

DrNuke 1 day ago 0 replies      
Tbh twitter is pretty good professionally, if you follow the right people / organisations in your field and write accordingly. Much better than spammy Linkedin too.
lfender6445 1 day ago 0 replies      
If anyone who's part of the layoff is looking for the opportunity to work from home (ruby + javascript), let me know -- me [at] gmail.com
jacques_chester 1 day ago 1 reply      
I think the engineers will be in a good position.

The rest, I'm not as sure.

swalsh 1 day ago 1 reply      
If you're impacted, know ruby, want to make healthcare better, and are open to a position in Boston let me know! email in profile.
perlpimp 1 day ago 0 replies      
Not sure how related it is, but a few days ago Twitter demanded that I change my password. After I changed it, they locked my account, demanding that I provide a phone number to tie to the account - yet none of the numbers I provided work.

May I suppose they'll be even more focused on collecting various marketing data from their users, given how little leverage they have over users' personal lives?

fjordames 1 day ago 0 replies      
Oh god. My roommate was literally offered a position at their Boulder office last week. I wonder how systemic the cuts will be?
curiousjorge 1 day ago 1 reply      
Have a feeling that Twitter is one of the unicorns to go next year.

He's approaching this as a simple restructure-and-pray with engineering teams, when in fact the problem is much more serious: there's a loss of confidence in Twitter from investors.

I guess cutting when investors feel like it's due is a good way to appear like you are making changes, when in fact the problem with Twitter is much more deep-rooted and a fundamental flaw.

1) Investors realize twitter is horrible for monetization

2) Investors are out of patience or trust

3) Twitter scrambles to find a sustainable revenue source.

4) Twitter cuts off Hootsuite and launches competing business

5) Twitter's massive botnets disappear, revealing that only a small number of its userbase is active, sparking SEC involvement.

coderjames 1 day ago 0 replies      
A more focused Twitter is much needed if the company ever hopes to become profitable, and not just remain a money pit.
tarekkurdy 18 hours ago 0 replies      
Forgets to remove the corporate speak.
grandalf 1 day ago 0 replies      
This is great for the startup ecosystem because likely many of the engineers are very talented.
gketuma 1 day ago 0 replies      
Does this mean Twitter Bootstrap 4.0 release will be delayed?
ThomPete 1 day ago 0 replies      
Wait why did the subject change?

Isn't the correct headline the subject line of the email?

whatok 1 day ago 2 replies      
Any info on whether this is just to appease shareholders or actual redundancies?
sjg007 1 day ago 0 replies      
It would have been better if it was a total of 124 characters.
SneakerXZ 1 day ago 0 replies      
I feel sorry for the people that were laid off, but to be honest, does Twitter need 4,100 employees? I cannot imagine what all these people do for a product that isn't so complicated.
ComputerGuru 1 day ago 2 replies      
Wow, these aren't proofread by anyone?

> The team has been working around the clock to produce streamlined roadmap for Twitter,

"to produce streamlined roadmap" Really?

mirap 1 day ago 0 replies      
Actually, this is really well written.
smaili 1 day ago 0 replies      
If anyone who's part of the layoff is around SF and is looking for a new opportunity, let me know -- me [at] smaili.org
santialbo 1 day ago 0 replies      
Yesterday TWTR went down almost 7%.
mahouse 1 day ago 0 replies      
>We launched the first of these experiences last week with Moments, a great beginning, and a bold peek into the future of how people will see what's going on in the world.

I am terrified about the future of Twitter.

vishalzone2002 1 day ago 2 replies      
any idea what is 336 as a percentage of their tech workforce?
cdelsolar 1 day ago 0 replies      
Message me if you've been impacted and want to join us at Leftronic!
moron4hire 1 day ago 0 replies      
Here's what "giving it straight" really looks like, while also having a chance to save face:

 Everyone,

 As part of a restructuring of our workforce, we must lay off 336 people from across the company. This is an extremely difficult decision. We believe this is a necessary step to put our company on a stronger path towards growth.

 The team has been working around the clock to produce a streamlined roadmap for Twitter, Vine, and Periscope, and they are shaping up to be strong. With the utmost respect for each and every person, Twitter will go to great lengths to take care of each individual by providing generous exit packages and help finding a new job.

 The roadmap is a plan to change how we work, and what we need to do that work. Product and Engineering will make the most significant structural changes to reflect our plan ahead, focused on the experiences which will have the greatest impact. We feel strongly that Engineering will move much faster with a smaller and nimbler team, while remaining the biggest percentage of our workforce. And the rest of the organization will be streamlined in parallel.

 Let's take this time to express our gratitude to all of those who are leaving us. We will honor them by doing our best to serve all the people that use Twitter. We do so with a more purpose-built team, which we'll continue to build strength into over time, as we are now enabled to reinvest in our most impactful priorities.

 As always, please reach out to me directly with any ideas or questions.

 Jack
Notice I left out Moments, because I think it's in really poor form to mention efforts made before the layoff, with those 336 people, as being a part of this new roadmap that includes laying off those 336 people. Really, really poor move.

moron4hire 1 day ago 1 reply      
Can someone explain to me how these places figure out that it's engineering's fault for failing to figure out how to monetize passive aggression, 140 characters at a time?

I mean, if I were asking myself "why did we fail to achieve our expected growth potential", am I going to blame the people who did what I told them to do, or am I going to blame what I told them to do?

Well, clearly, if I'm an MBA, I'll blame the stupid proles.

foobarbecue 1 day ago 2 replies      
nanoojaboo 1 day ago 0 replies      
Dear Mr Jack, please do not cut my job. I have a wife and two kids and a big mortgage. Thank you, NanooJaboo
signaler 1 day ago 0 replies      
As a hobbyist coding small projects like Twitter in my spare time, I feel their pain: I have consistently had to re-adjust the code base and the number of project contributors. This is observable on the micro scale, and I would loathe to think how it plays out at the scale of Twitter, where unbridled and unchecked growth was allowed to take over the company, causing them to lose focus.

Twitter is essentially one big DevOps success story / failure after another, and I have faith they can start to focus again. One motif / question I have seen in every pundit's post about Twitter as a company is: why has the market (up until now, perhaps) not decided Twitter's fate? If it really is the case that Twitter is a big data company, then how come 90% (random estimate) of their users are fembots / fake accounts?

smonff 1 day ago 0 replies      
The IT world needs a better class consciousness. The class struggle isn't something from the past. Twitter is one of the biggest companies of the net economy, and it can actually fire 336 employees because "the world needs a strong Twitter". How can a big company like this one be authorized to fire people this way? How are the employees going to react? Are they going to fight?

Ok, we are not working at the mine; we are working on servers and data, and concepts and communication tools, comfortably sitting in white offices. But these companies make a huge amount of money from our work and then throw employees away like garbage? Noooo, this is not acceptable. Jack, do you think that people will take your generous exit package and feel fine? No: some will have difficulties finding a new job, some will divorce, some will be obliged to sell what they built to survive, some will get depressed because of unemployment, some might even commit suicide. The price to pay for this can't be equal to your exit packages.

There is a serious problem here. And we are not organized at all to fight this. But workers could unite again. After all, organizing a server strike is not that hard. I wonder why it doesn't happen.

Workers of the world, unite!

Flux is the new WndProc bitquabit.com
524 points by gecko  1 day ago   124 comments top 25
Todd 1 day ago 1 reply      
I've also observed this similarity. The msg is like the actionType, the wParam and/or lParam are like the polymorphic objects that you pass with your action.

The dispatcher is also not the most efficient model, where every store is registered to listen to every event. This is a bit like multiple windows on an event loop. The difference is that in Windows, messages are almost always targeted to a particular window's handle (hwnd). This doesn't make sense in Flux, since it's more of an observer pattern. The logic of interpreting the meaning of an action is left to each store, which is really just a cache.

The biggest problem I have with Flux relates to this polymorphism. I use TypeScript where possible, and this is the one place where it always breaks down. I understand the appeal of JS objects, but the only way to ensure your Flux-based system is stable is to have lots of unit tests around your actions and stores.

Redux is a more straightforward take on caching. I can also use type annotations on the reducers and associated store structure, so this helps ensure structural consistency. It also solves the isomorphism problem of server side rendering because each request can get its own state. There is no out of the box solution for this with Flux, since stores are singletons by default.

Minor nit: stores are just caches with observers. I'm not sure why they weren't just called caches.
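The parent's WndProc analogy can be sketched in a few lines of plain JavaScript. This is an illustrative toy, not Facebook's actual Flux implementation; the names (`dispatcher`, `counterStore`) are made up for the example:

```javascript
// A WndProc-style switch: every registered store sees every action,
// just as every window procedure switches on its incoming msg.
const dispatcher = {
  callbacks: [],
  register(cb) { this.callbacks.push(cb); },
  dispatch(action) { this.callbacks.forEach(cb => cb(action)); },
};

// A "store" is really just a cache with an observer hook.
const counterStore = { count: 0 };

dispatcher.register(action => {
  switch (action.actionType) {      // cf. switch (msg) in a WndProc
    case 'INCREMENT':
      counterStore.count += action.payload; // payload plays the wParam/lParam role
      break;
    case 'RESET':
      counterStore.count = 0;
      break;
    // unknown actions fall through, like DefWindowProc
  }
});

dispatcher.dispatch({ actionType: 'INCREMENT', payload: 2 });
dispatcher.dispatch({ actionType: 'INCREMENT', payload: 3 });
```

Note the inefficiency the parent mentions: the dispatcher broadcasts to every store, with no hwnd-style targeting; each store decides for itself whether an action is relevant.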

unoti 1 day ago 3 replies      
The big idea from old school windows that is shared with Flux is the idea of little views that render themselves and manage their own state. In Windows we called those Controls or Window Classes. It is a good idea, and one worthy of preserving.
mpweiher 1 day ago 1 reply      
A couple of corrections:

1) Mac OS X does not store a bitmap for every widget; that's iOS's architecture. It stores a bitmap for every window. Having a layer (GPU-stored bitmap) was only introduced once CoreAnimation was ported to OS X. It was and is optional.

2) OS X Views also have a -drawRect: method that works the same way.

3) In fact that's how MVC works. See http://blog.metaobject.com/2015/04/model-widget-controller-m...

And React and frameworks like it just duplicated this; see http://blog.metaobject.com/2015/04/reactnative-isn.html In fact, when I first read about React (non-native), my first thought was "hey, finally they came up with a good equivalent of NSView + drawRect:".

jowiar 1 day ago 3 replies      
As someone who has written several things with Flux and Flux-esque architecture, I see it as a step in the middle, rather than where things are ending. It's not a large step from Flux (Stores update themselves in response to actions) to Redux (Model the entire application as reducers on a sequence of Actions) to RxJS Observables.

What's shared in there is the idea that unidirectional data flow is a whole lot easier to reason about, model, and simulate than 2-way data flow. Everything else is semantics.
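The Flux-to-Redux step the parent describes, modeling the whole app as a reduction over the action sequence, fits in a few lines of plain JavaScript (the reducer and action names here are illustrative, not from any real app):

```javascript
// Redux's core idea: state is a left fold (reduce) over every action ever dispatched.
const reducer = (state, action) => {
  switch (action.type) {
    case 'ADD_TODO':
      return { ...state, todos: [...state.todos, action.text] }; // no mutation
    case 'CLEAR':
      return { ...state, todos: [] };
    default:
      return state; // unidirectional flow: unknown actions change nothing
  }
};

const actions = [
  { type: 'ADD_TODO', text: 'ship roadmap' },
  { type: 'ADD_TODO', text: 'fix dispatcher' },
];

// Replaying the action log reconstructs the state. This is what makes
// time-travel debugging and per-request server-side state cheap.
const finalState = actions.reduce(reducer, { todos: [] });
```

Because the reducer is a pure function, a fresh state per server request is just a fresh `reduce` — no singleton stores to reset.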

ajsharp 1 day ago 2 replies      
There are some great things going on in React / Flux, but the part that needs to be emphasized about Flux, that Facebook doesn't address explicitly anywhere, and that most people eager to always be on the cutting edge will never admit, is that this stuff was designed to solve problems for very complex applications. Complexity is relative, and the solutions that reduce complexity and friction in the development process for Facebook may increase it for another organization. That is to say, Flux / React et al is by no means simple. Not even a little bit. But it probably simplified a lot of things for the Facebook team. However, YMMV for your 6 person startup engineering team.
estefan 1 day ago 3 replies      
...and so for those of us who aren't Windows developers, what lessons can we apply to Flux to make it better?
jxm262 1 day ago 2 replies      
This was an awesome read. We use React and Flux daily at work so I'm going to share this with coworkers. I'm a little confused on what the author's concern is though.

> I've just felt... well, weird. Something seemed off

Is there anything substantively wrong with the flux pattern or drawbacks?

narrator 1 day ago 3 replies      
So what is Angular then? Angular seems to me to be more like an ORM for the view where there's dirty checking of the model and then update events are dispatched to the external system which is the DOM instead of the DB. Is there something similar in the GUI toolkit world?
geowa4 9 hours ago 0 replies      
I've never liked the comparison of Flux to functional reactive programming. It's really just good ol' object-oriented design. Actions are akin to the Command pattern and the Dispatcher feels like a Mediator. Passing callbacks instead of objects and making a mostly directed graph does not yield FRP.

In my latest project, I used React with rx-react (https://github.com/fdecampredon/rx-react) and RxJS. That combination definitely made for some FRP fun.

arijun 1 day ago 0 replies      
danellis 1 day ago 2 replies      
I share the author's feeling of déjà vu. I feel like I've seen this article already. It was a comment posted on HN earlier today. It's kind of fascinating how someone's comment can get promoted to someone else's blog post in a few hours.
pducks32 1 day ago 1 reply      
See, I think Flux is too low-level. I think it's too hard to reason about from the top level. Not that the architecture is inherently bad (people are using it a ton), but things get out of hand way too fast. Regardless, I can't wait to see web development in a year!
hoprocker 1 day ago 0 replies      
I love the correlation between modern in-browser development and programming early personal computers. It's akin to how digital logic abstracts away the tyranny of E&M physics, but several layers higher, and this time just between instruction sets/runtimes.

ChromeOS is kind of making this leap, but I really wonder when web browser ASICs (or equivalent) will start popping up.

dustingetz 1 day ago 6 replies      
The author does not understand React :(

> React by itself doesn't actually solve how to propagate changes

It does, actually - you update the state, then React propagates the changes for you through its props mechanism. Flux is an extra layer of indirection over state changes if you need it: https://twitter.com/floydophone/status/649786438330945536 (edit: I regret my tone here; there is clearly ongoing work in this area and no widely accepted best practice yet)

Flux is not message passing, React components do not redraw themselves, React components do not pass messages to each other, Flux only superficially looks like winapi because of the switch statement in that particular example.

React provides the view as a function of state. winapi is nothing like that.

React is a giant step towards functional programming. winapi is definitely nothing like that.

edit: Windows -> winapi
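"The view as a function of state" can be illustrated without React at all. A minimal sketch, where the hypothetical `view` function stands in for a React component:

```javascript
// In React terms: the UI is a pure function of state. No messages between
// components, no repainting by hand. Change the state, re-run the function.
const view = state =>
  `<ul>${state.items.map(item => `<li>${item}</li>`).join('')}</ul>`;

let state = { items: ['a'] };
const before = view(state);

// "Updating the state" yields a new description of the UI. A library like
// React would diff the two descriptions and patch the real DOM accordingly.
state = { items: ['a', 'b'] };
const after = view(state);
```

A WndProc, by contrast, mutates pixels and widgets imperatively in response to each message; there is no single expression describing what the window should look like for a given state.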

pducks32 1 day ago 0 replies      
Does anyone know of a good place to learn about these different approaches. I find this so fascinating.
amelius 1 day ago 0 replies      
Stated more simply, React is just like "rebooting" your computer after you have changed the config files. It is, in this respect, quite ancient technology, except that the framework hits the "reset" button for you.
antoaravinth 1 day ago 0 replies      
What a great article. I was asking in my previous thread, what framework should I use React/Angular : https://news.ycombinator.com/item?id=10359497

Clearly, from what I have heard from HN and from this blog post, React with Flux is just the old way of doing web development today! That's great!

sovande 1 day ago 0 replies      
The big dispatcher switch in Flux is eerily reminiscent of how we used to program AWT widgets back in the Java 1.0 days. This architecture was improved greatly in Java 1.1 with a delegation model. If history is to repeat itself, as the OP so eloquently argues, then, if you want to see where Flux will be going in the next couple of years, start using knockout.js now and for once stay ahead of the curve.
thewarrior 1 day ago 1 reply      
Which is the best model to date for complex UI ?

Cocoa + Interface Builder or XAML/WPF ?

Have used Cocoa + Interface Builder, and it's quite a joy compared to web dev.

This has some thoughts on it: http://stackoverflow.com/questions/2442340/how-does-cocoa-co...

avodonosov 1 day ago 0 replies      
In this line of reinventing the wheel of UI programming in web dev, I am waiting for Borland Delphi reincarnation.
iMark 1 day ago 0 replies      
I've only looked into iOS programming a little, but is this not similar to how views are handled there too?
jsprogrammer 1 day ago 1 reply      
And Node is essentially the Windows message loop [0].

[0] https://en.wikipedia.org/wiki/Message_loop_in_Microsoft_Wind...

jesstaa 1 day ago 0 replies      
Also, Ruby on Rails is Flux.
whatever_dude 1 day ago 1 reply      
The writer really likes the word "idempotent".
underwater 1 day ago 0 replies      
WndProc is how the windows manager communicates with Windows UI code. Flux is how the UI communicates actions back to the data layer of the application. They're completely different.
Judge: NYC Seizing Thousands of Cars Without Warrants Is Unconstitutional amny.com
273 points by bane  1 day ago   100 comments top 17
zaroth 1 day ago 1 reply      
I agree 100%. This is a perfect example of where we the people rely entirely on the judiciary to provide a remedy. That such an obviously illegal practice could continue for years unfortunately does not reflect well on any hope of swift justice.

It should be possible to get a temporary restraining order against the city in cases like this within days of the first contested case. It should be easy to demonstrate there is no imminent harm in telling the city, "you have to stop doing this until we decide whether it's OK." Quite the opposite: cars are an essential and significant asset, and this policy placed a potentially massive burden on the citizens it effected.

In one of the examples, by the time the victim prevailed against the illegal seizure backed by zero evidence or investigation of any kind, they had already sold off his car, and offered nothing in return. A pretty large part of the population doesn't have a spare $2,000 in cash to get their own car back while the city makes them prove in front of a Kangaroo Court that they were driving their own family to the airport... Missing from the article -- is there any hope of any kind of restitution? Can the victims now pursue a civil case against the city?

mapt 22 hours ago 2 replies      
You: The city is stealing my car without probable cause in an attempt to extort money from me.

City DA: No they're not.

What's your recourse here? Call the FBI or federal prosecutor and report an organized crime syndicate being run by corrupt law enforcement professionals? Because... isn't that what this is?

Is there any onus, or even incentive, for them to listen and investigate? Is the only way to redress the problems a civil lawsuit against the City citing Bivens and various appellate court principles like malicious prosecution? Because grand theft auto, extortion, racketeering, and fabrication of evidence / perjury are not civil offenses, and conservative readings of the concept of 'standing', as I understand it, make it rather difficult to challenge the authors of a failed / withdrawn prosecution in order to get at the legal principles which triggered it.

Concepts like this one, as well as things like civil asset forfeiture, are so clearly in direct violation of the Constitution that at some point, it's not legitimate to shelter enforcers under cover of "just following orders". We still have laws (Constitutional and common), and Peabody, Minnesota doesn't have the right to do things like put all the gay residents to death by legislative fiat & judicial compliance; If you found this occurring, you wouldn't need to file a lawsuit alleging that a constitutional overreach has been committed and demanding merely that the policy cease to be in effect. Instead, you would get some overriding authority, like the state police or the FBI, to run in with SWAT teams and arrest and prosecute every last person peripherally attached to the Peabody legislature or judiciary or law enforcement. For murder.

No amount of 'adopting selective prosecution based on what we can win, since the courts recognized a valid affirmative defence' or 'changing training programs to be more in line with civil rights' or 'firing/reprimanding the officers involved and settling a civil suit' makes killing the gay population of Peabody less of a crime, and no amount of lawsuit would be required to get that recognized.

maehwasu 1 day ago 0 replies      
And once again, the nice thing about living in not America is that bribes are significantly cheaper.
grecy 1 day ago 1 reply      
>Probable cause is not a talismanic phrase that can be waved like a wand to justify the seizure of any property without a warrant

Does that apply to civil forfeiture as well? Sounds like it should.

aswanson 1 day ago 2 replies      
Why is the regular news reading more and more like my Onion RSS feed? I have a feeling things were always this absurd, if not more so, but the idiocy gets amplified now by the channels being so connected.
thoman23 1 day ago 0 replies      
So the government should not arbitrarily seize property from its own citizenry? I'm sure they will take that under advisement.
peeters 23 hours ago 0 replies      
I think the most interesting, or scary, part of all of this is the justification for this warrantless search and seizure: to stop Uber from operating in the city. Usually the government has to invoke public safety to try to justify removing individual rights. Now they can just invoke the taxi lobby I guess.
avoutthere 1 day ago 3 replies      
Wow, how was this ever legal to begin with?
dandare 23 hours ago 2 replies      
This is one of the things that fascinates me about the US. Such blatant injustice would be unthinkable in Europe.
dannysu 1 day ago 2 replies      
I was getting redirected to http://www.forbes.com/forbes/welcome/ if I clicked the link on HN.

If I copy & paste the link into a new tab, it works for me.

dools 1 day ago 0 replies      
Just another example of how prohibiting human behaviour instead of regulating it leads to overzealous police and undue burden on law-abiding citizens.

They should take a page out of London's book and allow minicabs to operate.

AdmiralAsshat 1 day ago 1 reply      
Is the lack of any page displaying with adblocking turned on intentional, or is it simply bad design?
timtas 1 day ago 0 replies      
Yet another reason why I've stopped using plural pronouns to refer to the state at any level.
briandear 20 hours ago 3 replies      
Has anyone ever died because of an unlicensed limo? Is it really a threat to public safety? If consenting adults agree to a transaction, I am not sure how that's the government's business. However, if an unlicensed vehicle was portraying itself as a licensed vehicle, then you have a fraud issue, not a public safety one.
pbreit 1 day ago 2 replies      
Is this an Uber thing? I didn't see it mentioned.
c22 20 hours ago 3 replies      
This is a stupid grammar nitpick I only offer in the hopes that you find it useful for your writing. No dismissal of your arguments or denigration of your character is intended.

I think you want the word "affected" in your second paragraph. Effects are the result of causation, whereas affect refers to the causation. The citizens were affected by the effects of this policy.

bsder 1 day ago 4 replies      
What's the deal with all the Forbes links redirecting to welcome? How do I stop this?

I tried checking the "Warn me when websites try to redirect or reload the page" box in Firefox, but it doesn't appear to be stopping it.

Presumably too many people are starting to use things like "Google Sent Me".

Square files for IPO squareup.com
283 points by nikunjk  7 hours ago   171 comments top 26
myth_buster 7 hours ago 17 replies      

> We generated net losses of $85.2 million, $104.5 million, and $154.1 million in 2012, 2013, and 2014, respectively. As of December 31, 2014, we had an accumulated deficit of $395.6 million. For the six months ended June 30, 2015, we generated a net loss of $77.6 million. As of June 30, 2015, we had an accumulated deficit of $473.2 million.
Could someone explain the motive to go public while operating at a loss? Isn't the public market more averse to companies operating in the negative?

Also wouldn't the timing act more as a distraction for Jack?

kzhahou 7 hours ago 6 replies      
> All executive officers and directors as a group (13 persons) .... 61.5%

LOL 13 people in the company own more than half. The thousands of employees get to split what's left after investors.

13 people will become BILLIONAIRES and centi/multi-millionaires, while the rest of the company that toiled for years gets (maybe) a down payment on an 1,800 sqft house on the Peninsula.

Meanwhile everyone rails against Wall Street inequity and the Walton family and whoever else...

tinkerrr 7 hours ago 4 replies      
> In the third quarter of 2012, we signed an agreement to process credit and debit card payment transactions for all Starbucks-owned stores in the United States. The agreement was amended in August 2015 to eliminate the exclusivity provision in order to permit Starbucks to begin transitioning to another payment processor starting October 1, 2015. Under the amendment, Starbucks also agreed to pay increased processing rates to us for as long as they continue to process transactions with us. We anticipate that Starbucks will transition to another payment processor and will cease using our payment processing services prior to the scheduled expiration of the agreement in the third quarter of 2016, and, in any event, we do not intend to renew it when it expires.

In addition to the $150 million loss in 2014, the revenue side doesn't look too good either. From their operating data,

Total revenue = $707.8 million

Starbucks revenue = $123 million

So they would likely lose > 17% revenue very soon.

uptown 7 hours ago 1 reply      
"I believe so much in the potential of this company to drive positive impact in my lifetime that over the past two years I have given over 15 million shares, or 20% of my own equity, back to both Square and the Start Small Foundation, a new organization I created to meaningfully invest in the folks who inspire us: artists, musicians, and local businesses, with a special focus on underserved communities around the world. The shares being made available for the directed share program in this offering are being sold by the Start Small Foundation, giving Square customers the ability to buy equity to support the Foundation. I have also committed to give 40 million more of my shares, an additional 10% of the company, to invest in this cause. Id rather have a smaller part of something big than a bigger part of something small.

We intend to make this big! Thank you for your support and potential investment in Square.


jasondc 7 hours ago 0 replies      
Jack Dorsey still owns close to 25% of Square, pretty incredible:


lquist 7 hours ago 4 replies      
Fuck. Seriously. Fuck.

This could be the IPO that ends the party. One of the top candidates for first Unicorpse (see also Evernote).

pnathan 7 hours ago 2 replies      
Square is incredibly common these days at farmers markets and coffeeshops. This class of products has a lot of win going for it; I don't know if Square is going to own that market, but they have a huge advantage right now. I look forward to reading their financials.

edit: a roundup from the WSJ, Fortune, and other random news places suggests strong concern over the CEO situation. That seems fair. I would definitely discount the value of a non-profitable company with a part-time CEO. :-/

That said, having a "stupid simple and ubiquitous payment platform" should be a really easy way to print money. Why hasn't it? I think that story is the interesting one.

iamleppert 7 hours ago 2 replies      
Look at those numbers! And for those who say that Square is innovative... their reader was basically all they had back in the day. Now that everyone has created a Square clone, payment processing is a race to the bottom, all about brokering deals with large merchants while hoping the banks and processing networks don't eat your lunch.

Throw in all the regulatory hassle of dealing with money and its transfer, as well as liability for fraud, and competing in an industry as exciting as refrigerators...

kochb 6 hours ago 0 replies      
misiti3780 5 hours ago 3 replies      
So apparently Square and Box have both raised about the same amount of money, roughly $550-600MM [1][2], but somehow Jack Dorsey still holds 25% and Aaron Levie wound up with ~ 4% - Does anyone else find that surprising?



gsibble 7 hours ago 2 replies      
Their pace of losses is increasing by about 50% per year. Most companies that file to go public are at least losing less money over time, with a path to profitability. I don't see how this bodes well for a strong IPO.
austenallred 7 hours ago 3 replies      
How many people have simultaneously been CEO of two separate public companies? The list has to be pretty small. No wonder Twitter waited so long to make Jack the official CEO.
gmisra 5 hours ago 1 reply      
Does anybody else get the feeling that the primary goal of large scale venture capital these days is to pump-and-dump IPOs, regardless of the actual stability/validity of the underlying business?
rpedela 7 hours ago 0 replies      
I like Square's product but the growing net losses concern me.
qopp 7 hours ago 3 replies      
Isn't this good for startups in general?

Every time a startup IPOs the investors now have a chance to exit and re-invest in other startups with the fresh capital.

dsugarman 7 hours ago 2 replies      
Doesn't the roadshow require an enormous amount of time from the CEO? Can you really do this and anything else?
spike021 7 hours ago 2 replies      
Would it make more sense for a larger company to buy Square before it goes public? I thought I'd heard people saying Apple or Google should but I'm not sure how much benefit there would be since they have payment systems now.
sjg007 6 hours ago 0 replies      
The money in credit card processing is in high interest loans fronted to the merchant that are paid back as a percent of the transaction. SquareUp for instance.
rokhayakebe 7 hours ago 0 replies      
Jack, the Roger Bannister of his field.
jgalt212 7 hours ago 0 replies      
It is really disheartening that a company with such horrible numbers thinks it can go public. It's one thing to lose money before going public, and it's entirely another to lose money at an increasing rate and think you can find an audience for those shares.
goodcjw2 7 hours ago 0 replies      
Does this actually come faster than expected? Will Jack actually leave?
marincounty 6 hours ago 1 reply      
Does anyone know, off hand, if Square has any important patents? I did a little searching, but couldn't find much.

I do like the company. Just curious if they have any patents that will prevent competition?

curiousjorge 6 hours ago 0 replies      
November 2016: Square files for bankruptcy. Do the simple math: they lose money year after year in bigger amounts.

This will buy Square some time while it shops around for a buyer but that's assuming the capital market is still liquid and happy and there's no market downturn.

harryh 7 hours ago 0 replies      
You talk about a super high PE and the fact that they have negative earnings in the same sentence. Do you know what these words mean?
urda 5 hours ago 1 reply      
Do you have an actual comment regarding the IPO or a point to make? What you have posted appears to be nothing but noise.
manchco 6 hours ago 0 replies      
I don't know how he does it. http://imgur.com/QfNiUz3
Mattermost 1.0 released open-source Slack alternative mattermost.org
310 points by shuoli84  13 hours ago   131 comments top 30
finnn 11 hours ago 6 replies      
http://getkaiwa.com/ is another Slack alternative that uses an XMPP backend, which IMO is much better than a custom backend. So far it's the only open source Slack clone I've seen that uses an existing standard for the backend.
SEJeff 11 hours ago 1 reply      
This is great. Also see: https://zulip.org

And the blog on why Dropbox decided to OSS it:


cdnsteve 11 hours ago 2 replies      
It's good to have options.

The takeaway I'm getting from this story, and Mattermost, is:

1. Export your critical data from SaaS services if your business cannot exist without them.

2. Test that this works before putting years of data into a service.

There's nothing wrong with SaaS services; they just mean users must do their due diligence in any business partnership. I can't see how a game company can put their resources into delivering this as an open source project with no future plans for monetization. Frankly, without monetization, open source projects generally wither up and disappear. Then you're no further ahead.


jgrowl 11 hours ago 2 replies      
Props for open-sourcing, but I'm putting my money on http://matrix.org/
lorenzhs 12 hours ago 1 reply      
If what you really want is a pretty web-based way to access IRC, then you might want to check out Glowing Bear -- it connects to your WeeChat IRC client via websockets and works nicely on the Desktop and on mobile. It doesn't have a voice recorder, but it gives you the infinite possibilities of a mature IRC client. It's a project I've been contributing to for a while now and I still absolutely love using it.


ju-st 11 hours ago 5 replies      
Why do I have to go to slack.com to learn what this is?

I clicked on this link: no explanation. I checked mattermost.org: no explanation. I went to slack.com: no explanation. Then I clicked on a "Product" link at the top of the webpage. Finally, some information about what this actually is.

Even open source projects could benefit from a little bit of marketing.

pbreit 11 hours ago 6 replies      
There was a time when I thought something like this was a good idea. But after using Slack for about a week, there's no way I would give up all the benefits of a well-integrated centrally controlled service. All the clients work together perfectly. We have Slack channels with customers. It just all works much better than I can imagine any self-hosted, decentralized service would.
netcraft 12 hours ago 1 reply      
I think Slack serves its stated purpose very well (smaller, business-oriented teams), but many groups have started using it for larger communities, mostly because it has unlimited users for free. But it isn't made for that, and there is no way that most of these groups would ever be able to pay for a premium subscription due to the per-user costs. The 10K-message limit across all channels is surprisingly easy to hit, you need the ability to ignore users, etc. I think this project has great potential to fill that niche if it is marketed properly. Slack is so close to working well in that area but really needs to pivot to be able to serve it well and make money doing it.
kentt 12 hours ago 2 replies      
I'm trying to decide if this is better than Zulip. They're both open source, backed by someone trusted, and I can run it on my own server.
DannoHung 12 hours ago 0 replies      
Interesting that it will be a default feature of Gitlab.

That's a move that seems like it may push Gitlab ahead of GitHub in some ways (well, to me at least).

mugsie 9 hours ago 2 replies      
Well.... this is depressing -

Mattermost server is made available under two separate licensing options:

- Free Software Foundation's GNU AGPL v3.0, subject to the exceptions outlined in this policy; or
- Commercial licenses available from Mattermost, Inc. by contacting commercial@mattermost.com

"To simplify licensing, weve responded to community feedback and the compiled version of Mattermost v1.0 is now under the MIT open source license" (Emphasis mine)

Why just the compiled version?

giovannibonetti 11 hours ago 0 replies      
Since we are talking about open source software, maybe the guys that own the Mattermost account on Github could create a placeholder repo for Android (I wonder if this idea would work for iPhone, too) and accept Pull Requests until there is at least a beta native app.
bachmeier 9 hours ago 0 replies      
Doesn't this have some heavy hardware requirements? Three machines with at least 2 GB of RAM? Is that really necessary if I'm going to chat with five people?
ywecur 11 hours ago 0 replies      
Would be happy to move over to an open source alternative, but at the moment they don't seem to support mobile apps.

It would be very difficult for us to move because of this; we talk a lot on the move.

djmashko2 10 hours ago 0 replies      
I wonder how this compares to Rocket.Chat, another open-source alternative: https://rocket.chat/
ChicagoDave 8 hours ago 1 reply      
I've spent a couple of hours trying to get Docker running on my linode Ubuntu server to no avail.

A non-docker implementation would be nice.

e12e 10 hours ago 1 reply      
This looks very nice. Are there any plans for an API/client protocol? A web client is all well and good, but I'd want a solid console client, as well as some command line tools (e.g. 'echo "Some message" | xmpp user@host', where the equivalent for Mattermost would allow setting the topic, or messaging a group via a bot, etc.).
rpedela 11 hours ago 3 replies      
Overall I like it because they closely follow Slack's UI. However I question the choice of fully supporting Markdown. A comment isn't supposed to be documentation. Supporting things like bold, italic makes sense for emphasis or making code easier to read. But headings? When would one ever want really large text in a comment?
BHSPitMonkey 7 hours ago 0 replies      
The blog post is dated October 2nd; Is HN just learning of this announcement late, or is their blog displaying the draft date rather than the publication date?
fsiefken 11 hours ago 0 replies      
If it supports SSL XMPP it can be a drop-in replacement for a lot of companies.
jeffjose 12 hours ago 1 reply      
In my past job, I was desperately looking for an open source Slack alternative. The ones I tested then (a few months back) didn't hold up nicely against Slack. I'm happy to see that there's finally some good competition.
lucaspottersky 11 hours ago 0 replies      
Feature idea: a canvas where people could draw instead of typing a text.
pionar 12 hours ago 1 reply      
So, what does this offer over Slack, besides being open-source? I see no mention of any actual features, besides basic chat features.
yannis 11 hours ago 1 reply      
Besides being an excellent application, this is a valuable resource for anyone studying Golang.
artribou 12 hours ago 0 replies      
Does anyone know who the old provider was that locked in their data?
pwenzel 11 hours ago 0 replies      
Can it send push notifications or other alerts to my phone?
mholt 11 hours ago 1 reply      
Congratulations, Mattermost team! Huge accomplishment :
jhildings 12 hours ago 7 replies      
Why not just use IRC?
api 10 hours ago 0 replies      
There are many OSS alternatives to Slack. Some are clones and some are different approaches and many of them are quite good.

The thing these and all other similar efforts miss is the importance of network effects. Everyone uses Slack because everyone uses Slack.

The real problem that needs to be tackled is one layer down: providing an open, distributed alternative for authentication, identity management, and data interchange that is secure and robust enough to provide a backplane for things like this and that is easy enough for anyone to use that it can be pushed out to the mass market. I can't stress the last point enough. It must be stupid simple easy to use or it will fail. It also must offer a good and simple developer experience (DX) or it will fail. DX is part of UX. Things like XMPP are nightmares for devs and sysadmins and fail badly here.

This is a huge missing piece of the web.

copsarebastards 9 hours ago 0 replies      
Just what we need, another solution to a problem that was solved two decades ago!
Complete LiDAR Scan of England Publicly Available environmentagency.blog.gov.uk
298 points by alibarber  1 day ago   66 comments top 21
hanoz 1 day ago 6 replies      
I took a look at this after it was mentioned on Hacker News two weeks ago and ended up building this map of all the DSM 1m data:


I've been quite fascinated to discover how many mysterious lumps and bumps are to be found all over the country, often with no apparent explanation in aerial photo maps, and to my great surprise I find myself cultivating an interest in armchair archaeology. I've stumbled across a few features which on further research have turned out to be sites of note, a couple of which were only discovered in recent years, which is quite exciting. Next mission is to discover something completely unknown. In fact I could do with some help interpreting some features if anyone here has any experience in this area.

Here's a couple of well known sites:

https://houseprices.io/lab/lidar/map?ref=SU1224642189 (Stonehenge)

https://houseprices.io/lab/lidar/map?ref=SU1025569962 (Avebury)

A few of my 'discoveries':

https://houseprices.io/lab/lidar/map?ref=ST5895844810 (Medieval and Iron Age/Roman field systems near Croscombe, Somerset)

https://houseprices.io/lab/lidar/map?ref=NY7217242430 (Potential henge near Alston, Cumbria)

https://houseprices.io/lab/lidar/map?ref=SX1025261066 (Roman Fort near Restormel Castle, Cornwall)
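Incidentally, the ref= parameter in these URLs is an Ordnance Survey grid reference, and decoding one into full easting/northing coordinates is mechanical (a sketch assuming the standard OSGB letter-pair scheme; this is not code from the site itself):

```python
def osgb_ref_to_en(ref):
    """Decode an OS National Grid reference (e.g. 'SU1224642189')
    into a full (easting, northing) in metres."""
    def idx(letter):
        i = ord(letter) - ord("A")
        return i - 1 if i > 8 else i        # the letter I is skipped in the grid

    l1, l2 = idx(ref[0]), idx(ref[1])
    e100 = ((l1 - 2) % 5) * 5 + (l2 % 5)    # 100 km square easting
    n100 = (19 - (l1 // 5) * 5) - (l2 // 5) # 100 km square northing
    digits = ref[2:]
    half = len(digits) // 2
    scale = 10 ** (5 - half)                # pad shorter refs out to metres
    e = e100 * 100_000 + int(digits[:half]) * scale
    n = n100 * 100_000 + int(digits[half:]) * scale
    return e, n

stonehenge = osgb_ref_to_en("SU1224642189")  # -> (412246, 142189)
```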

A couple of things I'm not sure about:



Doctor_Fegg 1 day ago 0 replies      
People have been experimenting with using this to contribute to OpenStreetMap for a couple of weeks now. Here's one writeup: http://chris-osm.blogspot.co.uk/2015/09/extracting-building-...
praseodym 1 day ago 1 reply      
There is a similar dataset available for The Netherlands. A potree point cloud visualisation can be seen at http://ahn2.pointclouds.nl/.
wielebny 1 day ago 1 reply      
LIDAR scans of Poland have been publicly available for some time: http://geoportal.gov.pl/dane/numeryczne-modele-wysokosciowe
JorgeGT 1 day ago 1 reply      
An almost complete LiDAR scan of Spain is also publicly available. I wrote about it here and included a few samples: http://wechoosethemoon.es/2015/09/05/lidar-espana-3D/

Sadly it is in Spanish but I hope available areas and pictures of expected results are clear enough! LiDAR data is provided as 2 Km x 2 Km squares of RGB-colored points in *.laz format. If someone is interested I can translate into English or point to the sources.

Tepix 1 day ago 0 replies      
This is amazing. I love that they mention Minecraft as one of the use cases:

"LIDAR data some surprising uses:"

"Computer games: Minecraft players have requested our LIDAR data to help them build virtual worlds: the data could be useful to anyone creating realistic 3D worlds."

cwal37 1 day ago 0 replies      
If you're interested in LiDAR data from the United States, you should have a look at this Wikipedia page and its corresponding links [0]. Most states have some kind of data freely available from the most recent survey, although it's neither uniform nor always clear in terms of how to access it. The structure of the program allowed individual states to tackle their own territory differently in both surveying and data dissemination, so there's no easy and official central repo as far as I understand.

I was in grad school at Indiana and working at the geological survey while they were finalizing some of the state's pieces of this, and it was really fascinating to see some of the early products that people in the geography and geology departments were producing. I mucked around a bit with it myself, but never really produced anything useful. I can speak to finding the data fairly easy to acquire and quite comprehensive at the time, uncertain if that's changed, but it might be a decent starting point [1].

[0] https://en.wikipedia.org/wiki/National_Lidar_Dataset_(United...

[1] http://gis.iu.edu/datasetInfo/statewide/in_2011.php

joosters 1 day ago 1 reply      
Can anyone recommend any 3D viewing programs for this data? This is all new to me but I'd love to try experimenting with it. The download zipfiles contain .asc files.
NickHaflinger 1 day ago 1 reply      
'All 11 terabytes of our LIDAR data (that's roughly equivalent to 2,750,000 MP3 songs)' or a stack of paper 513 kilometers high.
Schwolop 1 day ago 0 replies      
At one point in time[1], I hired a helicopter to act as a surrogate remote sensor doing data fusion with a ground robot. I flew as a passenger and told the pilot where to fly based on the ground robot's need for data.

Since we had to pay for the helicopter's time anyway, and the field trial was spread over two days, we left our equipment attached to the helicopter when it returned to its airfield that night. The next day we had a 78km long data set of LIDAR, GPS, visual imaging, and inertial measures, all from an altitude of about 25m giving us about +/- 2mm for the LIDAR's range data.

The sad end to this anecdote is that I have no idea what happened to that data. It's presumably sitting on a dusty server somewhere in academia.

[1] This point in time, as it happens: http://www.drtomallen.com/uploads/1/2/0/2/12026356/3361016_o...

deskamess 1 day ago 0 replies      
Any idea about the cost of doing a LIDAR scan for a region? Let's say you have 1200 square km (assume a rectangular area). How much would that cost?
dougbinks 1 day ago 0 replies      
Format of the data is listed as Arc/Info ASCII Grid (AAIGrid) which is an ASCII Esri grid https://en.wikipedia.org/wiki/Esri_grid.
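The format is simple enough to read without GDAL: a short header (ncols, nrows, xllcorner, yllcorner, cellsize, NODATA_value) followed by rows of space-separated values. A minimal Python reader, assuming the common six-line header:

```python
def read_aaigrid(text):
    """Parse an Esri/Arc ASCII grid into (header dict, rows of floats)."""
    lines = text.strip().splitlines()
    header = {}
    for line in lines[:6]:                 # six header lines assumed
        key, value = line.split()
        header[key.lower()] = float(value)
    nodata = header.get("nodata_value")
    rows = [[None if float(v) == nodata else float(v)
             for v in line.split()]
            for line in lines[6:]]
    return header, rows

# A tiny hand-written tile for illustration:
sample = """ncols 3
nrows 2
xllcorner 400000
yllcorner 100000
cellsize 1.0
NODATA_value -9999
12.1 12.3 -9999
11.8 12.0 12.2"""
header, grid = read_aaigrid(sample)
```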
JonnieCache 1 day ago 4 replies      
Hell yes! This will be useful in my long-term aim of procedurally generating rolling English hills for video game purposes...

EDIT: with the resolution of this thing, maybe I won't need to generate them; maybe I can just set the game in the real England...

bsykora 1 day ago 0 replies      
LIDAR is also being used at NASA to measure atmospheric CO2 concentrations.



Animats 1 day ago 1 reply      
Do they have "first and last" LIDAR data, or just one value per point? It's common to capture the distance to both the first and the last reflection. This often indicates the top of vegetation and the ground level. With that, you can easily identify trees, brush, and crops.
groth 1 day ago 1 reply      
Anyone know how burial mounds/Roman roads are found? Would love to see that reproduced for the non-academic world.
chatman 1 day ago 0 replies      
This will be great as a base layer in OSM!
scuba7183 1 day ago 2 replies      
Awesome! Does anyone know if similar resources for the US are available?

Edit: possibly https://lta.cr.usgs.gov/LIDAR -- still looking for more.

tibbon 1 day ago 0 replies      
Could do some neat things with drone piloting with this.
vanous 1 day ago 0 replies      
Does anyone know of publicly available LIDAR data for the Czech Republic?
alphapapa 1 day ago 0 replies      
I was hoping to find some explanation of how they capture the data. I'm guessing it's from aircraft? It'd be interesting to read about how they stitch together and correct the data captured from a moving platform like that. And I wonder how long it takes to capture the whole country.
Tesla Model S Autopilot Features teslamotors.com
257 points by extesy  5 hours ago   227 comments top 23
ChicagoBoy11 5 hours ago 10 replies      
A common phrase in aircraft cockpits nowadays is "What the heck is it doing now?" as pilots have migrated from actually flying the plane to simply being glorified systems managers.

While planes have become so, so, so much safer because of all this automation, pilots' uncertainty regarding autopilot functioning is a major concern nowadays, and the reason for several accidents.

There are very interesting HCI challenges around properly communicating to the pilot/driver "what the heck it is doing" and clearly communicating just how much control the human has or doesn't have at any given point.

This "announcement" certainly doesn't inspire any confidence that they have really thought this through deeply enough (I think they probably have, but it should be communicated like it). As a huge Tesla fan, I can't help but feel like I need to hold my breath now and make sure something terrible doesn't happen because of this, and it ends up leading to even more regs setting us back on the path to full car automation.

dognotdog 4 hours ago 4 replies      
While the over-the-air update is novel, these features all exist on current luxury and even some middle class vehicles as part of driver assistance option packages.

They're typically called Lane Keeping Assistant, Adaptive Cruise Control, Blindspot Warning, Automated Parking, Traffic Sign Recognition, etc.

The emergency steering bit is interesting, though no further details are provided. It requires the car to ensure that there is a safe space to steer into, which is dicey for a forward-collision emergency braking system, so I'd conjecture it is connected to the side collision warning and allows collision avoidance only when there is enough space in the current lane.

lightcatcher 4 hours ago 3 replies      
What sensors does the Model S have? I'm surprised that Tesla sold a car with enough sensors for semi-autonomous operation without the actual software until now.

For those with more knowledge about cars, how does the sensor array in the Model S compare with similar models from companies such as BMW, Audi, Mercedes-Benz? I'm interested in knowing if it's software or the already installed hardware holding back recent luxury cars from similar capabilities.

Also, does anyone know anything about the (digital) security features of the Tesla? This announcement from Tesla makes it clear that the actual control of the vehicle can be modified by an over-the-air software update. With the recent Jeep hack[0] in mind, does anyone know if something similar is possible on a Tesla, or if there are safeguards such as signed updates? As one of the most computerized cars on the market, I tend to think that the Tesla cars might also be some of the most (maliciously) hackable cars on the market.

[0] http://www.wired.com/2015/07/hackers-remotely-kill-jeep-high...

joosters 4 hours ago 2 replies      
Releasing driving assistance features as a 'beta'? What on earth does that mean here? Are the features ready to use or not? Do Tesla warrant that they work and are safe?

Maybe they expect drivers to treat it like beta software - "Please don't use these features in production cars. Make sure you keep backups of all drivers and passengers in case of bugs."

verelo 3 hours ago 1 reply      
All these controls sound very similar to those in my current-year Mercedes...although I would be hopeful that the autosteer on offer here is better than the Distronic Plus "lane assist" in the Merc, which, while OK, does not do a great job on less-than-gentle turns above 50km/hr (but it's actually great below that speed - to the point I wonder why I'm even in the seat, particularly in stop-start traffic situations). It certainly sounds similar, given the "hands must be on the wheel" requirement.

I look forward to the next step up from all the car makers, which is clearly the car driving on its own in a much more confident way, with the driver simply there to manage exceptions as opposed to being 'assisted' by technology as is with the current implementations.

Animats 30 minutes ago 0 replies      
This is similar to what other high-end cars have, lane-keeping and smart cruise control, usable only in freeway-type situations. "Drivers must keep their hands on the steering wheel." Mercedes calls this "Active Lane Keeping Assist", and has offered it for several years now. Here's someone using it with a can taped to the steering wheel to defeat the "hands on steering wheel" requirement.[1] All the major manufacturers have demoed this.

This is NHTSA Level 2 automation (Combined Function Automation).[2] ("An example ... is adaptive cruise control in combination with lane centering.") Google is at Level 3 (Limited Self-Driving Automation), and going for Level 4 (Full Self-Driving Automation).

The big problem at Level 2 is keeping drivers from using it when they shouldn't. Level 2 doesn't understand intersections at all, for example. Or pedestrians, bicycles, baby carriages, deer, snow, etc. That's why the major manufacturers are being so cautious about launching it into a world of driving idiots.

Volvo has now officially taken the position that if an autonomous car of theirs gets into a crash, it's Volvo's fault and they will accept liability.[3] Now that Volvo has said that, other car manufacturers will probably have to commit to that as well.

[1] https://www.youtube.com/watch?v=Kv9JYqhFV-M
[2] http://www.nhtsa.gov/About+NHTSA/Press+Releases/U.S.+Departm...
[3] http://www.extremetech.com/extreme/215832-volvo-well-take-th...

jzwinck 5 hours ago 5 replies      
> Tesla requires drivers to remain engaged and aware when Autosteer is enabled. Drivers must keep their hands on the steering wheel.

Which sense of "must" is used here? The car seems to play an unwinnable game with the driver: keep your hands on the wheel or I'll...what? Disengage autosteer and perhaps crash? With no enforcement mechanism, drivers are incentivized to "abuse" (aka "use") the system as much as it allows.

OopsCriticality 4 hours ago 1 reply      
I was surprised to find that the Autopilot feature is a paid $2500 upgrade, according to one source.[0] I'm not surprised that Tesla is charging for the upgrade, but that in all the press and enthusiast coverage of Tesla, I don't recall it being mentioned before.

[0] http://blog.caranddriver.com/elon-take-the-wheel-we-test-tes...

Mizza 5 hours ago 4 replies      
This seems insanely dangerous to me. They're introducing a feature which could, potentially, cause massive highway accidents, but providing documentation that amounts to little more than a glorified README file?:

> Auto Lane Change

> Changing lanes when Autosteer is engaged is simple: engage the turn signal and Model S will move itself to the adjacent lane when it's safe to do so.

A single sentence! What's the point of having driver's license lessons and testing if the fundamental operation of the vehicle can change so drastically?

Am I being a luddite, or does anybody else feel this way?

grecy 3 hours ago 0 replies      
Videos are starting to show up on youtube.


VERY impressive.

mmerkes 5 hours ago 1 reply      
The auto-park feature would be super handy, but I don't see an auto-unpark feature... I look forward to seeing Teslas stuck in amazingly small parking spots!
waterlesscloud 5 hours ago 1 reply      
Here's a video of version 7 in action that someone linked in /r/selfdrivingcars last night. Not super-informative, but interesting to watch anyway.


NN88 5 hours ago 5 replies      
How is this different from Mercedes-Benz's "self driving?"
derek 5 hours ago 4 replies      
> Drivers must keep their hands on the steering wheel.

This seems odd, my understanding was that drivers needed to "check in" every so often, not handle the wheel at all times.

abalone 3 hours ago 0 replies      
Any thoughts on the potential manufacturer liability for software bugs that lead to accidents?

Certainly there are a lot of precedents with anti-lock braking systems, cruise control, etc. But this stuff seems like such a massive expansion of the complexity of software control that I wonder what will go down in the courts when the inevitable happens.

mixmastamyk 5 hours ago 3 replies      
Unfortunately there's little mention of front collision avoidance (automatic braking), an important safety feature which I've been waiting for on Teslas for what must be years now.

In the forums there's always the guy that says we should "drive better" instead. With that logic, there's little use for safety features at all.

51Cards 2 hours ago 1 reply      
"Drivers must keep their hands on the steering wheel."

This video would seem to indicate otherwise?


thoman23 2 hours ago 0 replies      
"Autosteering (Beta)"

That must be the single most frightening use of the Beta label in history.

spoon16 5 hours ago 1 reply      
Anyone know how well the lane change feature works in heavy traffic?
mathrawka 5 hours ago 3 replies      
As someone who spends a fair amount of time traveling between countries that drive on different sides of the road... I am always getting the turn signal and windshield wipers mixed up. So I doubt I can use the auto lane change feature.
devit 5 hours ago 1 reply      
Is it smart enough to not change to a lane going in the opposite direction or change to a "lane" that is actually a ditch off the road?
capkutay 3 hours ago 0 replies      
This is a cool technical achievement, but I don't see the practical use, nor does it seem like a big win for Tesla drivers. So it allows drivers to kind of tune out while driving on the freeway?
sandworm101 4 hours ago 8 replies      
Note two words absent from the OP: "Speed limit".

This machine will keep pace with traffic. OK. Does that mean it will break speed limits? Unless it is scanning for each and every potential road sign, it simply cannot respond to arbitrary/temporary limits. The determination of the legal limit on a piece of road is a complex task. Road construction, local conditions, sunrise/set, time of year (school zones) and even weather can be a factor. And let us not forget "Speed limit X when children on road". You need some serious CPU time to work out whether that person walking along the road is a schoolgirl or a construction worker.

Imho any system not capable of determining the speed limit accurately is a legal liability. Have fun with the tickets.

>eliminating the need for drivers to worry about complex and difficult parking maneuvers.

No. Parallel parking is neither a complex nor a difficult maneuver. It is total beginner territory. No lives are at risk. With a decent bumper, even the risk of property damage is minimal. Anyone not capable of learning to parallel park probably shouldn't be behind the wheel of much of anything. Anyone buying this car to avoid such mundane tasks isn't someone with whom I want to share the road.

Show HN: Expos a static site generator for photos and videos github.com
534 points by Jack000  2 days ago   70 comments top 35
Jack000 2 days ago 5 replies      
so this is just a bit of glue code for imagemagick/ffmpeg that I use to generate my blog.

Last time I was on HN there was some interest in what I was using for the backend, so I cleaned up the code a bit and put it on github

eric-hu 2 days ago 1 reply      
This looks amazing. Thank you for generalizing this and open sourcing it.

For anyone looking to use the polygon word wrap feature as in the demo on the Github page with the Eiffel Tower, take a look at its responsive behavior before you make that post. As you shrink the window down on jack.ventures, some text can be cut off, or no longer show up on a clear portion of the image like a wall.

This isn't a bug with the software so much as a flaw with the magazine "words on image clearing" style layout that doesn't translate perfectly to the web medium.

superic 2 days ago 0 replies      
Beautiful work and amazing photography!

Interested in making things at Flickr? Drop me a line at eric at flickr.com :)

ausjke 2 days ago 0 replies      
really cool and I also learned a new way to check dependencies under bash:

command -v convert >/dev/null 2>&1 || { echo "ImageMagick is a required dependency, aborting..." >&2; exit 1; }
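The same early-exit pattern can be sketched in Python, where `shutil.which` plays the role of `command -v` (the tool names below are examples, not the script's actual dependencies):

```python
import shutil

# Probe PATH for each required external tool, like the bash one-liner above;
# shutil.which is the Python analogue of `command -v`
def missing_deps(tools):
    return [t for t in tools if shutil.which(t) is None]

# "sh" should exist on any POSIX system; the second name is deliberately fake
missing = missing_deps(["sh", "definitely-not-a-real-tool"])
```

A caller would then abort with an error message if `missing` is non-empty.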

bobfunk 2 days ago 0 replies      
Added it to staticgen.com now - love these small non-general purpose static gens :)

Reminds me of ThumbsUp: https://github.com/rprieto/thumbsup

mlapeter 2 days ago 2 replies      
This is really great! Is it open source? I couldn't see a license or anything in the readme so wasn't sure.
mettamage 2 days ago 2 replies      
Got a question: for fun I ran this over about 400 to 500 photos, partly also to stress-test it. But the HTML file is empty. Did I do something wrong? The rest of the folders are there.

Here is the output if someone is interested to brainstorm about this problem. I edited the output a bit to make it slightly more readable.

<begin output>

Scanning directories.

Populating nav.

Reading files...........................

_name/Downloads/Expose-master/expose.sh: line 139:

/usr/bin/sed: Argument list too long

Starting encode

</end output>

Anyways, I checked the image, which was indeed corrupted, so ImageMagick was right on the money on that one. I still don't get why there's no HTML in the HTML file, though.
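For what it's worth, the `/usr/bin/sed: Argument list too long` line is the kernel's ARG_MAX limit being hit: the script apparently passed a very large string to sed on the command line. A minimal sketch of the usual workaround (assuming a POSIX `sed` on PATH) is to feed the data through stdin instead of argv:

```python
import subprocess

# Input far larger than what could safely be passed as a single argv entry
big_text = "\n".join("line %d" % i for i in range(100000))

# Piping through stdin sidesteps ARG_MAX, which is what triggers
# "Argument list too long" when huge strings are passed as arguments
result = subprocess.run(
    ["sed", "s/^line/row/"], input=big_text, capture_output=True, text=True
)
```

Whether that failure also explains the empty HTML file depends on what line 139 of expose.sh was building when it died; the script's internals aren't shown here.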

cataflam 2 days ago 0 replies      
This looks amazing.

I was using a tool one of my friends made, with a similar idea, just put images and videos in directories and have a script automatically generate a nice output. It can be found at https://github.com/jlaine/django-coconuts, but yours looks spectacular. I'm tempted to switch.

anderspitman 1 day ago 0 replies      
Very cool. Somewhat off topic: I've been looking recently for a good way to host many GBs of photos and videos for my friends and family to browse and download. Basically an open source version of Google Drive's file browser, with thumbnails and image and video previews. Any suggestions?
desireco42 2 days ago 1 reply      
Thank you, this is exactly what I was looking for (I was actually thinking about how to cobble something together myself). So I guess I am the target audience :)

I want to add that the sites look awesome and this is perfect for a large number of people. I just want to thank you one more time for making this; it is also an excellent starting point for photo sites.

3stripe 2 days ago 0 replies      
Lovely. Would be swell if you could plug it straight into a Dropbox folder.
callmeed 2 days ago 0 replies      
This is very cool. Love that it's just a shell script.

In a similar vein, I have a Jekyll plugin that reads folders of images and can create galleries: https://github.com/callmeed/jekyll-image-set

Nice that you have video support. Can't wait to give this a try.

raimue 2 days ago 0 replies      
Thank you very much!

I appreciate the use of simple bash shell script instead of a scripting language with lots of runtime dependencies.

nathancahill 2 days ago 0 replies      
Phenomenal. I've bounced between Tumblr/Wordpress with photoblog themes, Flickr, 500px and Instagram for sharing my film photography with friends. This is better than all of those, especially with the captioning.

Since most of my photographs are organized by trip or theme, this is perfect.

lemming 2 days ago 0 replies      
This is really nice! Getting photos and especially video online in an easy way, while still having it be beautiful, has been really tricky. I'll definitely use this.
hobonumber1 2 days ago 1 reply      
This looks cool. Is there a demo link that I can check out?
jaza 2 days ago 0 replies      
Nice work. I guess that for a sitegen so focused on photos and videos, a bash script works well - lets you do whatever you need directly with imagemagick and ffmpeg, no beating around the bush.

And I like your sites! Now... all I need is to be a better photographer, then I'd be able to actually create something nice with your script. Hahaha.

lucaspottersky 2 days ago 0 replies      
I like the results. Now it would be very useful to have a friendly GUI to edit those things and generate the YML files for you.
falcolas 2 days ago 1 reply      
Given that it's a static site, why the JS requirement? Most browsers already intelligently handle keeping the "right" number of assets in memory, do you believe your JavaScript handles it better?

Also, it's not at all responsive to different viewport sizes; this might be a good thing to address.

EDIT: Oof, that's a kick in the karma. Aah, well.

thieving_magpie 2 days ago 0 replies      
I didn't know that I needed this, but now that I know it exists I really need this. Awesome job, thank you.
giancarlostoro 2 days ago 0 replies      
Would be cool if you could do something similar to this with git repositories as well... Hmmm..
dyogenez 2 days ago 1 reply      
This is seriously cool! I'm going to have to check out the code for this one to see how some of this was handled.

Do you have any other sites doing photo/video stories which have been an inspiration for this project?

marcfowler 2 days ago 0 replies      
This looks amazing - really great work. I'll definitely check this out properly.
Omnipresent 2 days ago 0 replies      
Looks beautiful. Support for google photos would make this more versatile.
areohbe 2 days ago 0 replies      
This is wonderful. Superb work.
caiowilsonb 7 hours ago 0 replies      
Thank you so much for sharing.
cmstoken 1 day ago 0 replies      
Beautiful work! I'm in love with your site. I'm definitely going to be using the software. Thank you Jack!
flanbiscuit 2 days ago 0 replies      
I just got back from a trip and I have a lot of photos and some videos. This is perfect! I'm going to try this out as soon as I get home tonight
chadscira 2 days ago 2 replies      
I really don't understand the appeal of these static site generators. Why don't people just toss something like CloudFlare in front of their dynamic sites and turn the edge caching up. I mean this is free, the bandwidth expenses are covered, and you now have a globally accessible site.

Unless you're just trying to get away with hosting your whole site as a GitHub page ;)

noahbradley 2 days ago 0 replies      
Love, love, love this. I travel and shoot a lot of photos, so this would be perfect for putting those out there.
thekevan 2 days ago 0 replies      
This is an excellent way to present images and videos. Also, reading about your work is an inspiration. Thanks!
therealmarv 2 days ago 1 reply      
Looks great, but man... this seems like websites for the top 10% of the world with great internet speed. These sites are really not good on slow(er) internet. Also, look at the amount of traffic needed for the full example at http://jack.ventures - I don't even want to spend my mobile traffic on that example (although LTE is fast enough).
NKCSS 1 day ago 0 replies      
Very cool.

I have little else to add, but I wanted to let you know anyway :) Keep it up!

evantahler 2 days ago 0 replies      
rekshaw 2 days ago 0 replies      
Wow, love your bio: "These days I mostly travel and work on random stuff that I find interesting. I'm not really looking for employment, but I've always wanted to work at NASA and/or Google. So if you're NASA and/or Google HR, drop me a line ;]"

I wish I had that freedom.

Profile of Margaret Hamilton, programmer of the Apollo software wired.com
250 points by doppp  1 day ago   58 comments top 10
nickpsecurity 1 day ago 6 replies      
Many programmers talk about Ada Lovelace, who was certainly awesome, but should be talking about Margaret Hamilton. Before anyone heard of Dijkstra's work, Hamilton was straight-up inventing everything from real software engineering to properly handling fault tolerance and human interfaces. Here's what she was doing (see Apollo Beginnings rather than USL):


Her Wikipedia page gives her due credit and shows just how much she and her team pulled off:


Also note the size of that printout of her program. Try to imagine how many people could produce something like that which doesn't fail in the field even when parts of its environment and hardware are failing. And all while using low-level code with little to no tool support on 1960s-era embedded hardware rather than a JVM, etc. Margaret Hamilton was the first in the field of high-assurance programming that I've seen. A lot of firsts.

So, I give her credit where possible. The industry should even more as she wasn't a theoretical programmer like Ada Lovelace: she engineered actual software, co-invented many of the best practices for it, deployed them successfully in production, and then made a tool that automated much of that from requirement specs to correct code.

While programmers and industry ignore her, at least NASA gave her positive mention as the founder of software engineering and ultra-reliable software along with the biggest check they ever cut to an innovator:


Where would our systems and tools be if more attention was put into her methods, even just principles, than informal use of C? ;)

lumberjack 1 day ago 6 replies      
Only tangentially related, but it seems to me that if you want to work on cool software that does something novel and exciting, it's better to graduate with a degree in math or physics.

Also interesting if you visit her company's website: http://www.htius.com/

It's a view into a software industry that is virtually never reported on in the news, at least not on HN. The client list is impressive. It's not really clear what they actually do. It seems to me like they maintain a development environment and sell support contracts for it?

Animats 1 day ago 1 reply      
She and Saydean Zeldin used to have a company called Higher Order Software. I met them decades ago, when they were promoting that. They have an unusual formalism which never caught on.

There was a lot of interest in formal techniques in the late 1970s and early 1980s, but the technology didn't go that way.

danso 1 day ago 0 replies      
I knew Hamilton's achievements in the Apollo program were remarkable...but I hadn't known that she had started while as young as 24 -- and as a mother. That's just incredible. I can't even imagine going through school as a father and being able to balance my time, never mind having to carry a child. Never mind being a star programmer at MIT. And her husband was going through law school at the time, which means not only did she not have a stay-at-home husband to take over the parenting duties, she was also the breadwinner.
hoorayimhelping 1 day ago 0 replies      
Excellent chapter from the documentary Moon Machines about the Apollo guidance software (worth getting the whole thing on Amazon if you like this sort of thing), told from the point of view of the people who built the machines, rather than the astronauts.



ghaff 1 day ago 1 reply      
From the article:

>Once the code was solid, it would be shipped off to a nearby Raytheon facility where a group of women, expert seamstresses known to the Apollo program as the Little Old Ladies, threaded copper wires through magnetic rings (a wire going through a core was a 1; a wire going around the core was a 0). Forget about RAM or disk drives; on Apollo, memory was literally hardwired and very nearly indestructible.

I was at a talk by Richard Battin, also of Draper on the Apollo program, a few years back. One of the stories he told was of a number of the Apollo astronauts visiting Raytheon (where the core memory was being "sewn") and the general gist of the visit was "be really careful with your work or these nice young boys could die."

davegauer 1 day ago 1 reply      
A fascinating read about the incredible capabilities of the Apollo software is _Digital Apollo: Human and Machine in Spaceflight_ by David A. Mindell.

Those craft were quite likely capable of a _lot_ more than they were ever allowed to do by the astronauts - though I guess we'll never know!

ourmandave 1 day ago 2 replies      
Is she a candidate for the $10 bill?
oldmanjay 1 day ago 1 reply      
>why the gender inequality of the Mad Men era persists to this day

Well, that's demonstrably untrue, but the narrative, it must be pushed at all costs.

Kilogram conflict resolved at last nature.com
286 points by ColinWright  18 hours ago   120 comments top 21
Asbostos 16 hours ago 1 reply      
The best part about this batch of changes is that it pushes the mole and Avogadro's constant out on their own where they belong, not linked to any other units. Now we'll have only a single mass unit (kg) instead of the two (kg and the unified atomic mass unit) that we have now. This will knock carbon-12 off its perch as the definition of the "other" mass unit, which has been essential to using SI's mole but was never actually SI itself.
jakeogh 17 hours ago 1 reply      
silicon-28 sphere(s?): https://www.youtube.com/watch?v=ZMByI4s-D-Y (yep, they let him palm it)

watt balance: https://www.youtube.com/watch?v=VlJSwb4i_uQ

zb 15 hours ago 1 reply      
I was amused to read this:

"They never found the cause for the disagreement, but in late 2014 the NIST team achieved a match with the other two"

at a time when this story, also from Nature, is also on the front page: https://news.ycombinator.com/item?id=10383984

LoSboccacc 16 hours ago 0 replies      
so the proposed definition was set by fixing the numerical value of the Planck constant to 6.62606X × 10^-34 s^-1 m^2 kg

and the conundrum was that they still needed to have a precise enough measurement of that constant because it's an experimental measurement.
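Put schematically, once the numerical value of h is fixed, the kilogram is whatever mass makes the relation hold (X stands for the digits still to be pinned down by experiment):

```latex
% Fixing h = 6.62606X \times 10^{-34}\ \mathrm{kg\,m^2\,s^{-1}} inverts to:
1\,\mathrm{kg} = \frac{h}{6.62606X \times 10^{-34}\ \mathrm{m^2\,s^{-1}}}
```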



MarcusP 17 hours ago 4 replies      
Are metric measurements all derived from the value 1kg? If so, does this mean that the entire metric weight range can now be officially based on mathematics?
dfc 16 hours ago 2 replies      
The kilogram is still the only base unit that contains an SI prefix in the base unit's name.
tinkerdol 14 hours ago 0 replies      
Reminds me of this movie: https://www.youtube.com/watch?v=5dPnFO_JCdc (haven't seen it, but it looks interesting)
pervycreeper 14 hours ago 2 replies      
What level of accuracy are they aiming for? If it entails having some uncertainty about the precise number of atoms in the silicon sphere, then how did they choose this level of accuracy?
Animats 10 hours ago 3 replies      
Ah, they went with the "electric kilogram". The other plan was to build up a regular structure with a known number of silicon atoms. That idea was to make a perfect crystal and count the number of atoms on each face. Apparently that's almost possible, although hard to do.
Aoyagi 16 hours ago 1 reply      
Here I thought a kilogram was defined by water... oh well, looks like that definition is slightly outdated.


novaleaf 11 hours ago 0 replies      
If you are interested in hearing more expert commentary, NPR Science Friday did a piece on this in July:


spydum 15 hours ago 1 reply      
I was hoping this was going to explain the kg differences between the original and the copies. Instead it just resolves them by changing the standard. Good for science, I guess; sad for my curiosity.
unwind 17 hours ago 0 replies      
Duplicate, very close in time: https://news.ycombinator.com/item?id=10385743.
thieving_magpie 11 hours ago 0 replies      
A planet money podcast from a few years ago on the kilogram: http://www.npr.org/templates/story/story.php?storyId=1120033...
lifeformed 10 hours ago 1 reply      
Previously, why didn't they have a reference gram instead of a kilogram? Seems like it'd be easier to create and maintain, and transport.
justhw 12 hours ago 1 reply      
There's a good Radiolab episode related to this.


acqq 15 hours ago 2 replies      
I didn't understand: which will then be used, the Si sphere or the Watt balance?
bartvbl 16 hours ago 3 replies      
I wonder: the article states that the SI unit kg was, up to this point, defined using a single object. Doesn't this definition also involve the fact that it's placed on Earth, thus requiring two objects for its definition?
TomGullen 15 hours ago 0 replies      
Lived next door to an NPL physicist with a PhD who was working on this a few years ago. I think they ended up handing the project over to Canada or somewhere like that, IIRC. Fascinating project and guy.
mtgx 16 hours ago 6 replies      
Now even the U.S. can adopt it.
castratikron 15 hours ago 0 replies      
Strange to see Planck's constant used that way, defining a kilogram. Planck's constant usually only shows up when you're doing quantum mechanics and the things you're working with are really small.
Comparison R vs. Python: head to head data analysis dataquest.io
269 points by emre  15 hours ago   187 comments top 29
mbreese 15 hours ago 6 replies      
This is interesting, but not really an R vs. Python comparison. It's an R vs. Pandas/Numpy comparison. For basic (or even advanced) stats, R wins hands down. And it's really hard to beat ggplot. And CRAN is much better for finding other statistical or data analysis packages.

But when you start having to massage the data in the language (database lookups, integrating datasets, more complicated logic), Python is the better "general-purpose" language. It is a pretty steep learning curve to grok the R internal data representations and how things work.

The better part of this comparison, in my opinion, is how to perform similar tasks in each language. It would be more beneficial to have a comparison of "here is where Python/Pandas is good, here is where R is better, and how to switch between them." Another way of saying this is figuring out when something is too hard in R and it's time to flip to Python for a while...
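As a tiny illustration of the kind of "massaging" that favors a general-purpose language, here is a dict-lookup join in plain Python (the datasets and names are made up for the example):

```python
# Two small, made-up datasets to integrate: per-player stats plus a
# lookup table of team names
stats = [("curry", "GSW", 30.1), ("harden", "HOU", 29.0)]
teams = {"GSW": "Golden State Warriors", "HOU": "Houston Rockets"}

# A dict-lookup "join" in one comprehension, done before the data ever
# reaches pandas or R
joined = [(player, teams[code], ppg) for player, code, ppg in stats]
```

The same thing is possible in R, of course, but this sort of ad-hoc integration logic tends to read more naturally in Python.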

bigtunacan 13 hours ago 5 replies      
R is certainly a unique language, but when it comes to statistics I haven't seen anything else that compares. Often I see this R vs. Python comparison being made (not that this particular article has that slant) as a "come drink the Python Kool-Aid; it tastes better."

Yes; Python is a better general purpose language. It is inferior though when it comes specifically to statistical analysis. Personally I don't even try to use R as a general purpose language. I use it for data processing, statistics, and static visualizations. If I want dynamic visualizations I process in R then typically do a hand off to JavaScript and use D3.

Another clear advantage of R is that it is embedded into so many other tools. Ruby, C++, Java, Postgres, SQL Server (2016); I'm sure there are others.

phillipamann 12 hours ago 1 reply      
R is a wonderful language if you choose to get used to it. I love it. I've even used R in production quality assurance to check for regressions in data (not the statistical regressions). I see countless R posts where people try to compare it to Python to find the one true language for working with data. Article after article, there clearly isn't a winner. People like R and Python for different reasons. I think it's actually quite intuitive to think about everything in terms of vectors with R. I like the functional aspects of R. I wish R was a bit faster, but I am pretty sure the people who maintain R are working on that. You can't beat the enormous library that R has.
danso 14 hours ago 3 replies      
I spent a few weeks learning R a few months ago. It's not a bad language, and yes, the plotting is currently second-to-none, at least based on my limited experience with matplotlib and seaborn.

There are scant few articles on going from Python to R...and I think that has given me a lot of reason to hesitate. One of the big assets of R is Hadley Wickham...the amount and variety of work he has contributed is prodigious (not just ggplot2, but everything from data cleaning, web scraping, dev tools, time-handling a la moment.js, and books). But that's not just evidence of how generous and talented Wickham is, but of how relatively little dev support there is in R. If something breaks in ggplot2 -- or any of the many libraries he's involved in -- he's often the one to respond to the ticket. He's only one person. There are many talented developers in R, but it's not quite a deep open-source ecosystem and community yet.

Also word-of-warning: ggplot2 (as of 2014[1]) is in maintenance mode and Wickham is focused on ggvis, which will be a web visualization library. I don't know if there has been much talk about non-Hadley-Wickham people taking over ggplot2 and expanding it...it seems more that people are content to follow him into ggvis, even though a static viz library is still very valuable.

[1] https://groups.google.com/forum/#!topic/ggplot2/SSxt8B8QLfo/...

sweezyjeezy 14 hours ago 5 replies      
This is just a series of incredibly generic operations on an already cleaned dataset in csv format. In reality, you probably need to retrieve and clean the dataset yourself from, say, a database, and you you may well need to do something non-standard with the data, which needs an external library with good documentation. Python is better equipped in both regards. Not to mention, if you're building this into any sort of product rather than just exploring, R is a bad choice. Disclaimer, I learned R before Python, and won't go back.
c3534l 10 hours ago 2 replies      
I like Python better as a language, but Python's libraries take more work to understand and the APIs aren't very unified. R is much more regular and the documentation is better. Even complicated and obscure machine learning tasks have good support in R. BUT the performance of R can be very, very annoying. Assignment is slow as all hell and it can often take work to figure out how to rephrase complicated functions in a way that R can execute efficiently. I think being much more functional than Python works well for data. I mean, the L in LISP stands for list! Visualizations are also easier and more intuitive in R, too, IMO. Especially since half the time you can just wrap some data in "plot" and R will figure out which one it should use.

I think the conclusion of the article is correct. R is more pleasant for mathier-type stuff, while Python is the better general-purpose language. If your job involves showing people PowerPoint presentations of the mathematical analysis you've done, you'd probably want to use R. If, on the other hand, you're prototyping data-driven applications, Python would probably be better.

That said, I really like Julia, but can't justify really diving into it at this point. :\

Mikeb85 11 hours ago 0 replies      
The reason I like R - it just makes data exploration and analysis too damn easy.

You've got R Studio, which is one of the best environments ever for exploring data, visualisation, and it manages all your R packages, projects, and version control effortlessly.

Then you've got the plethora of packages - if you're any of the following fields: statistics, finance, economics, bioinformatics, and probably a few others, there's packages that instantly make your life easier.

The environment is perfect for data exploration - it saves all the data in your 'environment', allows you to define multiple environments, and your project can be saved at any point, with all the global data intact.

If I want some extra speed, I can create C++ modules from within R Studio, compile and link them, as easily as simply creating a new R script. Fortran is a tiny bit more work, still easy enough however.

Want multicore or to spread tasks over a cluster? R has built in functions that do that for you. As easy as calling mcapply, parApply, or clusterApply. Heck, you can even write your function in another language, then R handles applying that over however many cores you want.
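
For comparison, the Python side covers the same apply-over-cores pattern with the stdlib `multiprocessing.Pool` (the function and worker count here are illustrative):

```python
from multiprocessing import Pool

def slow_square(x):
    # Stand-in for an expensive per-element computation.
    return x * x

if __name__ == "__main__":
    with Pool(4) as pool:  # analogous in spirit to mclapply/parApply
        results = pool.map(slow_square, range(8))
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```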

Want to install and manage packages, update them, create them, etc...? All can be done from R Studio's interface.

Knitr can create markdown/HTML/pdf/MS Word files from R markdown, or you can simply compile everything to a 'notebook' style HTML page.

And all this is done incredibly easily, all from a single package (R Studio) which itself is easy to get and install.

Oh yeah, visualisation, nothing really beats R.

And while there are quirks to the language, for non-programmers this isn't really an obstacle, since they aren't already used to any particular paradigm.

As for Python, I'm sure it's great (I've used it a little), but I really don't see how it can compare. R's entire environment is geared towards data analysis and exploration, towards interfacing with the compiled languages most used for HPC, and running tasks over the hardware you will most likely be using.

evanpw 12 hours ago 3 replies      
If you only have time to learn one language, learn Python, because it's better for non-statistical purposes (I don't think that's very controversial).

If you need cutting-edge or esoteric statistics, use R. If it exists, there is an R implementation, but the major Python packages really only cover the most popular techniques.

If neither of those apply, it's mostly a matter of taste which one you use, and they interact pretty well with each other anyway.

acaloiar 14 hours ago 2 replies      
I have always considered R the best tool for both simple and complex analytics. But, it should not go unmentioned that the features responsible for R's usability often manifest as poor performance. As a result, I have some experience rewriting the underlying C code in other languages. What one finds under the hood is not often pretty. It would be interesting to see a performance comparison between Python and R.
ggrothendieck 11 hours ago 0 replies      
For R:

  (1) instead of `sapply(nba, mean, na.rm = TRUE)` use `colMeans(nba, na.rm = TRUE)`
  (2) instead of `nba[, c("ast", "fg", "trb")]` use `nba[c("ast", "fg", "trb")]`
  (3) instead of `sum(is.na(col)) == 0` use `!anyNA(col)`
  (4) instead of `sample(1:nrow(nba), trainRowCount)` use `sample(nrow(nba), trainRowCount)`
  (5) instead of tons of code use `library(XML); readHTMLTable(url, stringsAsFactors = FALSE)`
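For readers following along in Python, rough pandas equivalents of these idioms (a sketch; the small `nba` frame here is a made-up stand-in for the article's dataset):

```python
import pandas as pd

nba = pd.DataFrame({"ast": [5.0, None, 3.0],
                    "fg": [4.0, 5.0, 6.0],
                    "trb": [7.0, 8.0, 9.0]})

col_means = nba.mean()                    # per-column means; NaNs skipped, like na.rm = TRUE
subset = nba[["ast", "fg", "trb"]]        # select columns by name
has_no_na = not nba["ast"].isna().any()   # analogue of !anyNA(col)
train = nba.sample(n=2, random_state=0)   # sample rows without replacement
# tables = pd.read_html(url)              # analogue of readHTMLTable (needs lxml; not run here)
print(col_means["ast"])  # 4.0
```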
mojoe 13 hours ago 3 replies      
The one thing that sometimes gets overlooked when people decide whether to use R or Python is how robust the language and libraries are. I've programmed professionally in both, and R is really bad for production environments. The packages (and even language internals sometimes) break fairly often for certain use cases, and doing regression testing on R is not as easy as Python. If you're doing one-off analyses, R is great -- for anything else I'd recommend Python/Pandas/Scikit.
The13thDoc 14 hours ago 0 replies      
The "cheat sheet" comparison between R and Python is helpful. The presentation is well done.

The conclusions state what we already know: Python is object oriented; R is functional.

The Last Word appropriately tells us your opinion that Python is stronger in more areas.

falicon 12 hours ago 0 replies      
Language comparisons are equiv. to religion comparisons...you aren't going to find a universal answer or truth, it's an individual/faith sort of thing.

That being said - all the serious math/data people I know love both R and Python...R for the heavy math, Python for the simplicity, glue, and organization.

acomjean 15 hours ago 3 replies      
I work with biologists. They seem to take to R, which seems strange to me. I think some of it is RStudio, the IDE, which shows in-memory variables in the sidebar; you can click to inspect them. It makes everything really accessible for those who aren't programmers. It seems to replace Excel for generating plots.

I've grown to appreciate R, especially its plotting ability (ggplot).

xname2 14 hours ago 0 replies      
"data analysis" means differently in R and Python. In R, it's all kinds of statistical analyses. In Python, it's basic statistical analysis plus data mining stuff. There are too many statistical analyses only exist in R.
fsiefken 14 hours ago 0 replies      
It would be nice to compare JuliaStats and Clojure based Incanter with Python Pandas/NumPy/SciPy. http://juliastats.github.io/
willpearse 14 hours ago 0 replies      
Very picky, but beware constantly using "set.seed" throughout your R scripts. Always using the same random seed is not necessarily helpful for stats, and it makes the R code look a lot trickier than it needs to be.
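The same pitfall is easy to demonstrate in Python: reseeding with a constant before every draw collapses the randomness entirely, while seeding once keeps the run reproducible without that side effect. A stdlib sketch:

```python
import random

# Anti-pattern: reseeding before each draw makes every "random" number identical.
draws = []
for _ in range(3):
    random.seed(42)
    draws.append(random.random())
print(len(set(draws)))  # 1 -- all three draws are the same value

# Seed once at the top instead: draws differ, but the whole run is reproducible.
random.seed(42)
fresh = [random.random() for _ in range(3)]
print(len(set(fresh)))  # 3
```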
zitterbewegung 15 hours ago 1 reply      
This is interesting not just as a comparison, but also for people who know R or Python and want to go from one to the other.
wesm 12 hours ago 1 reply      
I hope you all know that the people who have invested most in actually building this software care the least about this discussion.
daveorzach 12 hours ago 1 reply      
In manufacturing Minitab and JMP are used for data analysis (histograms, control charts, DOE analysis, etc.) They are much easier to use and provide helpful tutorials on the actual analysis.

What features or workflow does R or Pandas/NumPy offer to manufacturing that Minitab & JMP can't?

thebelal 10 hours ago 1 reply      
The rvest implementation was the main thing that seemed like an R port of the python implementation rather than best use of rvest.

An alternate (simpler) implementation of the rvest web scraping example is at https://gist.github.com/jimhester/01087e190618cc91a213

It would be even simpler, but basketball-reference designs its tables for humans rather than for easy scraping.

dekhn 8 hours ago 0 replies      
In general, if I have to choose between two languages, one designed specifically for statistics and one that is more general, I will choose the more general one.

R's value is in the implementation of its libraries, but there is no technical reason a really OCD person couldn't implement libraries of equally high quality in Python.

xixi77 12 hours ago 1 reply      
Really, syntax "nba.head(1)" is not any more "object-oriented" than "head(nba, 1)" -- it's just syntax, and the R statement is in fact an application of R's object system (there are several of them).

IMO, R's system is actually more powerful and intuitive -- e.g. it is fairly straightforward to write a generic function dosomething(x,y) that would dispatch specific code depending on classes of both x and y.
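For contrast, Python's stdlib offers `functools.singledispatch`, which dispatches on the type of the first argument only; there is no stdlib equivalent of dispatching on the classes of both `x` and `y` as described above. A sketch (the function and return values are made up for illustration):

```python
from functools import singledispatch

@singledispatch
def dosomething(x, y):
    return "generic"

@dosomething.register
def _(x: int, y):
    return "int-first"

@dosomething.register
def _(x: str, y):
    return "str-first"

print(dosomething(1, "a"))   # int-first
print(dosomething("a", 1))   # str-first
print(dosomething(1.5, 1))   # generic -- and note y's type never matters
```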

andyjgarcia 9 hours ago 0 replies      
The comparison is R to Python+pandas.

The equivalent comparison should be R+dplyr to Python+pandas.

Base R is quite verbose and convoluted compared to using dplyr. Likewise data analysis in Python is painful compared to using pandas.

vineet7kumar 12 hours ago 1 reply      
It would be nice to also have some notes about performance of both the languages for each of the tasks compared. I believe pandas would be faster due to its implementation in C. The last time I checked R was an interpreted language with its interpreter written in R.
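On the Python side, the stdlib `timeit` module is the usual tool for collecting such numbers; a sketch comparing an interpreted Python loop with the C-implemented built-in `sum` (a rough proxy for the interpreted-vs-compiled gap being discussed):

```python
import timeit

data = list(range(10_000))

def loop_sum(xs):
    total = 0
    for x in xs:  # interpreted, bytecode-dispatched loop
        total += x
    return total

t_loop = timeit.timeit(lambda: loop_sum(data), number=100)
t_builtin = timeit.timeit(lambda: sum(data), number=100)  # C-implemented inner loop
print(f"loop: {t_loop:.4f}s  builtin: {t_builtin:.4f}s")
```

On CPython the built-in is typically much faster, which is the same effect the parent describes for pandas versus plain interpreted code.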
jkyle 9 hours ago 0 replies      
Caret is a great package for a lot of utility functions and tuning in R. For example, the sampling example can be done using Caret's createDataPartition which maintains the relative distributions of the target classes and is more 'terse'.

  > library(caret)
  > data(iris)
  > idx <- caret::createDataPartition(iris$Species, p = 0.7, list = F)
  > summary(iris$Species)
        setosa versicolor  virginica
            50         50         50
  > summary(iris[idx,]$Species)
        setosa versicolor  virginica
            35         35         35
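A stdlib-Python analogue of that stratified split, for comparison (a sketch of the idea, not caret's exact algorithm): group row indices by class, then sample the same fraction from each group.

```python
import random
from collections import defaultdict

def stratified_partition(rows, label_of, p=0.7, seed=0):
    """Return sorted indices of a p-fraction sample taken per class."""
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for i, row in enumerate(rows):
        by_class[label_of(row)].append(i)
    picked = []
    for idxs in by_class.values():
        picked.extend(rng.sample(idxs, round(p * len(idxs))))
    return sorted(picked)

# Toy stand-in for iris: 10 rows per species.
rows = [("setosa", i) for i in range(10)] + [("virginica", i) for i in range(10)]
idx = stratified_partition(rows, label_of=lambda r: r[0])
print(len(idx))  # 14 -- 7 from each class, preserving the class balance
```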

hogu 11 hours ago 1 reply      
If you do your stuff in R, how do you move it into production? Or do you not need to?
vegabook 13 hours ago 0 replies      
Python's main problem is that it's moving in a CS direction and not a data science direction.

The "weekend hack" that was Python, a philosophy carried into 2.x, made it a supremely pragmatic language, which the data scientists love. They want to think algorithms and maths. The language must not get in the way.

3.x wants to be serious. It wants to take on Golang, JavaScript, Java. It wants to be taken seriously: enterprise and the web. There is nothing in 3.x for data scientists other than the fig leaf of the @ operator. It's more complicated to do simple stuff in 3.x. It's more robust from a theoretical point of view, maybe, but it also imposes a cognitive overhead on people whose minds are already FULL of their algo problems and who just want to get from a -> b as easily as possible, without CS purity or implementation elegance putting up barriers to pragmatism (I give you Unicode v ASCII, print() v print, xrange v range, 01 v 1 (the first is an error in 3.x. Why exactly?), the focus on concurrency rather than raw parallelism; the list goes on).

R wants to get things done, and is vectors-first. Vectors are what big data is typically all about (if not matrices and tensors). It's an order of magnitude higher dimensionality in the default, canonical data structure. Applies and indexing in R, vector-wise, feel natural. NumPy makes a good effort, but must still operate in the scalar/OO world of its host language, and inconsistencies inevitably creep in, even in pandas.

As a final point, I'll suggest that R is much closer to the vectorised future, and that even if it is tragically slow, it will train your mind in the first steps towards "thinking parallel".

k8tte 15 hours ago 2 replies      
I tried to help my wife, who uses R in school, only to get quickly lost. I also attended a ~1 hour R course at university.

To me, R was a waste of time, and I really don't understand why it's so popular in academia. If you already have some programming knowledge, go with Python + SciPy instead.

EDIT: R is even more useless without RStudio, http://www.rstudio.com/. And NO, don't go build a website in R!

There's No DRM in JPEG Let's Keep It That Way eff.org
221 points by DiabloD3  1 day ago   118 comments top 14
Pxtl 1 day ago 2 replies      
To be fair, ask webcomics guys and photographers about piracy - they get the worst of it. Big companies that would never dream of encouraging you to pirate videos and songs functionally encourage you to swap images around constantly, stripped and re-watermarked and whatnot.

So yes, I do feel a bit bad for small independent artists who watch the standards bodies work themselves into a fury to protect video and audio content while they have to deal with Google Image Search and 9gag.

bytesandbots 1 day ago 1 reply      
DRM is protection of content from the consumer itself. The consumer is going to consume the data through an analogue channel. This channel will always be the source of extracting redistributable content. The very premise of DRM rules out any 100%-solution and sets it as obfuscation.

I feel it might be a stupid idea, but it is not impractical.

The effort of extracting content should be less than the maximum value that can be generated by redistribution. Thus, returns from piracy diminish as you go from software to video to image to text. Extracting an image is too easy via the analogue hole. This assumes an open technology ecosystem -- not exactly the RMS world, but at least Linus's, or perhaps Mozilla's. The enforcers of DRM do the sensible thing of spreading their proprietary black boxes to as many people as possible, until they can shut their doors to the rest of the community. That is precisely when certain open-source foundations, too, had to back down. That is how you can enjoy Netflix only within your Chrome browser.

What bothers me is why they are trying to make it into the standards. If it is built into the standards, it will be built into the downloaders as well. Remember what happened with HLS AES encryption: it is now built into the video downloaders themselves. While I understand the benefits of standardization and how it has given shape to tech, that might not be true of something so un-technical as DRM. If you do want obfuscation, at least do not make it a standard procedure. You know very well that even the strongest DRM cannot be technically secure.

endgame 1 day ago 1 reply      
I'm so glad I grew up in an era where people's only concern was making things work AT ALL. Trying to make things work for licensed users only, or only for certain devices, or anything else is just bullshit.
anon4 1 day ago 1 reply      
This seems like a technical solution to what's a political problem. Those don't usually work out as one would like, or worse, get enshrined in some standard that doesn't solve anything and which makes things worse for the few people that have supporting programs.
scotty79 1 day ago 6 replies      
If we could get rid of copyright there'd be much less resistance toward embedding information about who created the work.

I'd very much like it if we could abolish fines for copying but keep fines for stripping the author's signature from a work, or for not propagating the original author's signature to derived works.

This way you would have a trail to reach the actual author of the part of a work that you find awesome, to commission some new work from them.

This could be much more valuable for way more people than current copyright schemes that only seem to benefit fatcats.

impostervt 1 day ago 1 reply      
One of my side projects is a photo watermarking SaaS. I was surprised when people actually started paying for it years ago, as I figured, "there's a ton of watermarking apps out there". But it turns out there's a lot of demand from amateur and semi-pro photographers who believe, rightly or wrongly, that they're being ripped off (and want a simple way to watermark their photos). For pros, there are other services out there that actively scan the internet looking for infringers and send DMCA takedown, or similar, notices. These services are generally too pricey for the type of customers my side project has.

I guess my point is -- there is a pretty big demand to protect images online. I suspect DRM will end up being implemented in some form or another.

EvanAnderson 1 day ago 1 reply      
How long until this is used to lock down the independent "publishing" of images? This seems like a great foundation upon which to build software ecosystems that discourage user-generated content w/o the imprimatur of an authorized publisher attached.
atom_enger 1 day ago 5 replies      
Couldn't we just stop using JPEG? I realize this is a can of worms, but it's an option, right?
Spivak 1 day ago 3 replies      
Wouldn't this DRM require every implementation of the JPEG standard to honor the DRM or am I missing something?
LoSboccacc 1 day ago 1 reply      
Isn't this problem better solved by watermarking anyway? People want their work distributed publicly but want attribution to attract users. Buyers most probably need a redistributable license, as they are interested in the media as part of some communication effort.

DRM is not going to help after a buyer redistributes the purchased work in any way, especially if there is a medium conversion involved -- i.e. a printed issue.

Nadya 1 day ago 4 replies      
Would this stop me from screenshotting the image, saving it as a .png, and distributing that?

Because I and many people would do just that. Sure, the DRM might work for my grandparents and a few other non-techies, but over time I can teach my grandparents how to screenshot an image, and others would catch on. People would even make Chrome apps to "click a picture and resave it in a shareable format".

I'm not sure what this DRM would solve, if anything, other than pissing off users and giving photographers and other digital-sharing artists a false sense of security.

angersock 1 day ago 0 replies      
More attempted fencing off of the commons. Yay.
throwaway2048 1 day ago 2 replies      
countdown until mozilla folds like wet cardboard
togusa 1 day ago 0 replies      
WebP anyone?
WebKit removes the 350ms click delay for iOS webkit.org
205 points by asyncwords  6 hours ago   62 comments top 14
ksenzee 5 hours ago 3 replies      
The change applies only to unscalable viewports. That's a shame, because it means some developers will disable pinch-to-zoom to get a faster click response. That makes this yet another unfortunate conflict between usability and accessibility. The older I get, the more I appreciate being able to zoom (I'm viewing this page at 125% on desktop right now).
zkhalique 20 minutes ago 0 replies      
In our framework, we have for a very long time had a Q.Pointer class which contains functionality to normalize things between touchscreens and non-touchscreens. Among other things, it has the "fastclick" event: https://github.com/Qbix/Platform/blob/master/platform/plugin...

There is far more to it than simply relying on a "click" on touchscreens. For example, the "touchclick" event is for those times when the keyboard disappears because focus has been lost in a textbox, but the click will still succeed: https://github.com/Qbix/Platform/blob/master/platform/plugin...

Also, drag-and-drop is broken in touchscreen WebKit, so you have to roll your own, and much more.

You're better off using a library.

jordanlev 5 hours ago 1 reply      
Ugh... This change is of course totally logical in isolation, but I fear that this will motivate designers and developers to disable pinch-zooming on their sites (more than they already are). I hate when websites do this, and it is generally considered terrible for accessibility.
untog 5 hours ago 6 replies      
While I do sympathise with those lamenting the lack of pinch-to-zoom, I'm confused: apps don't offer pinch-to-zoom, so how do you use them? If you can use an app fine without pinch-to-zoom, you should really be able to use a mobile website fine too.

It seems to me that this is an either/or proposition: either you have a non-mobile, pinch-to-zoom-able website, or you have a mobile-specific site with an app-like viewport that does not allow pinch-to-zoom. Both of these seem like fine options to me, and I don't think it's a huge loss to lose the middle ground.

escherize 5 hours ago 4 replies      
I really don't understand the lamentation around pinch-to-zoom. There's a fantastic OS-level zoom built into iOS! Set it up and three-finger-tap to activate. And it works great.
paulvs 2 hours ago 1 reply      
As I see it, the 350ms delay was added to support zooming via double-tap. What I don't understand is why double-tap zooming is necessary when we have pinch-to-zoom. Can't zoom-via-double-tap be sacrificed for instant clicks, so everyone is happy?
RoboTeddy 4 hours ago 1 reply      
How long until this makes it into the Mobile Safari on most people's iOS devices?
nailer 5 hours ago 1 reply      

Typing this on an iOS 9 device, and I, as a human, cannot tap fast enough for iOS to register a "fast tap" and skip the delay. Try it here: http://output.jsbin.com/xiculayadu

mozumder 2 hours ago 0 replies      
Any idea when this makes it into an iOS release? Does Apple usually implement this in point releases? Or do we wait until iOS10 next year?
jamesrom 5 hours ago 1 reply      
A lot of commenters here are afraid of developers disabling user scaling to get better performance. That incorrectly assumes that user scaling is a good thing for every kind of website.

If a 350ms click delay is actually a performance bottleneck in the app you are building, it's very likely user scaling is something you want disabled anyway.

dkonofalski 5 hours ago 1 reply      
What's the intended function of the previous functionality? Didn't double-tapping zoom in and out to a specific section? What problem does the delay solve that isn't present on unscalable viewports?
kristianp 5 hours ago 2 replies      
Can someone explain what this means for the non-iOS developers amongst us?
outside1234 5 hours ago 2 replies      
Remind me: They originally had the 350ms delay in there to distinguish between a tap and a pinch, correct?
joeyspn 5 hours ago 0 replies      
Good news for hybrid apps devs...
The world needs at least 600M new jobs in the next decade for young people bloomberg.com
215 points by cryoshon  1 day ago   471 comments top 44
downandout 1 day ago 11 replies      
Most of the comments here focus on how people aren't trying hard enough to get jobs. This article indicates that the jobs don't exist and that the problem is likely to get far worse. You can try as hard as you want to get a job - if no one is hiring, you aren't going to get one.

The reality is that as time goes on, the world's needs can and will be met by fewer and fewer people. This should be a good thing, but it won't work under most existing economic systems. Our entire economy has to change to accommodate the new reality that a significant percentage of the population will be unemployed.

jonathanjaeger 1 day ago 14 replies      
Disclaimer: This is purely anecdotal, and not backed by any data.

I'm in my mid-twenties and just recently started interviewing people for the team I work on. It's amazing how little effort many seemingly qualified people put in to secure an entry-level job. Whether it be hustle to learn more about a business, the specifics about the company you might work at, or finding someone to give a second set of eyes on a cover letter or resume, most people really drop the ball. If job prospects are grim, you'd at least hope people would put in more effort.

maresca 1 day ago 6 replies      
Student loan debt surpassed credit card and auto loan debt in the US last year. Many college grads graduate with large sums of debt and can't find relevant jobs. Since student loan debt isn't forgivable, it'll be interesting to see the effect of this over the next decade. I have a hunch that the next big market crash will be caused by student loan debt.
NoGravitas 1 day ago 5 replies      
> The world needs at least 600 million new jobs in the next decade for young people

Or, perhaps, the world needs to stop coupling basic human needs for subsistence and dignity to wage labour, and find some better way of doing things.

jbob2000 1 day ago 0 replies      
This is anecdotal at best, but I feel like there is an apathy epidemic. It's fucking impossible to get people to even do "fun" things, much less a "boring" job. Everyone just wants to sit at home in front of a screen. It could just be the people I surround myself with, but that's the feeling I get.
sosuke 1 day ago 3 replies      
What qualifies as an adult anymore?

> people 15 to 29 years old are at least twice as likely as adults to be unemployed
30 is adulthood in their interpretation of the data.

laurentoget 1 day ago 0 replies      
This is quite a contradiction of the recent talk about demographic pressure leading to a wage turnaround.


" Taking just wage growth, simply put (and for more detail follow the links above), an end to the global labour glut should see real wages (wages accounting for the change in prices) start to rise at a faster pace. An ONS report of 2014 found that UK real wages in the 1970s and 1980s grew by an average of 2.9% a year."

So which is it? Too many people or not enough?

fensterblick 1 day ago 1 reply      
As the article highlights, the Arab revolutions were led by the youth. I wonder what, if anything, will happen in the USA when the current generation, saddled with seemingly insurmountable college debt, comes to the realization that it cannot find stable work or afford decent housing.

I truly believe that moment will be an earthquake for the current political environment; how we characterize Republicans and Democrats today will dramatically change (just as it did after the Civil War and also the Civil Rights movement).

ausjke 1 day ago 0 replies      
I don't know how this is going to work; I actually think the root cause of the Mid-East crisis is closely related to youth joblessness.

Young people without jobs will lead to bad things; in the meantime, technology/AI/robotic factories are making more people "unneeded". Will it be a utopia-come-true or a revolution?

ChuckMcM 1 day ago 2 replies      
These stories are always interesting to read, both for what they say and what they don't say. For example, did you know that worldwide there is a shortage of people in various trades roles[1]? (Welding, masonry, carpentry, electricians, etc.) And why are there young people loading up on debt they can't afford to go to Ivy League schools when they could be just as successful going to state schools? How much part-time employment might be found if there wasn't a floor on minimum wage? [2] Since we don't have the category of 'extra' or 'part time' job like we used to, current minimum wage policy is geared toward making every job pay a living wage. That prices a lot of jobs out of competitiveness for humans and spurs the development of robotic replacements. Not that those jobs are career paths, but they do offer people a bit of extra change in their pocket.

A more intriguing question is to what end might you employ two or three hundred million people? Imagine they are sitting outside your window waiting for your command. Assuming you are paying them a living wage, what economic output could they accomplish that would be "worth" say 5 to 15 trillion dollars a year?

[1] http://facilityexecutive.com/2015/05/u-s-employers-suffer-la...

[2] https://www.cbo.gov/publication/44995

cies 1 day ago 0 replies      
Who needs a job? We need "goods", and to obtain them we usually trade in part of our monetary income; but that does not need to come from a job.

I believe in "mincome" or "basic minimal income", as provided by a form of government to all citizens; to be paid for by tax money. This will be low, but enough to sustain yourself (simple shelter + food). If you want more then that you need to either find a job or walk a more entrepreneurial path and make a job.

The amount people receive as "mincome" will be an important number for politicians to control. It will have a strong effect on the then-emerging "post-mincome unemployment rate": all the people who are looking to supplement their mincome but currently have not found the means to do so.

I think a mincome society will find a lot more people being entrepreneurial, as a safety net is in place.

The "jobs" that the article speaks of are only going to be created if there is a strict need for them. A business will usually only create a job in last resort, as employing people costs money and brings risks.

maerF0x0 1 day ago 1 reply      
There are a few ways to create many openings:

1. Legislate maximum working hours (probably < 40). 3 jobs at 40 hours per week become 4 jobs at 30 per week; 33% expansion, problem solved.

2. Allow humans to undercut automation in price competition (eg abolish minimum wage)

3. Expand government to employ people for whatever, just print the money you pay them.

Clearly these all come with unsatisfactory side-effects.

Maybe the fix is to end our obsession with creating jobs and with jobs being the form of survival we offer our species. Imagine if we just created 600M jobs automating all the things so that those jobs would never (or nearly never) need to be done again? Future generations would need billions of jobs! But no one would be worse off for it. It's like when the dishwasher became commonplace: suddenly kids were free to do more homework or Facebook or Xbox etc. Suddenly parents were more free not to have kids (to complete household chores). Etc etc. By automating and reducing the work that humankind has to do, we're enabling better lives for all involved, including the displaced workers. Change hurts in the short run, but can bring utopia in the long run.

morgante 1 day ago 1 reply      
> people 15 to 29 years old are at least twice as likely as adults to be unemployed.

Today I learned that 28-year-olds aren't adults.

PebblesHD 1 day ago 0 replies      
This is a truly massive global issue, and it hits quite close to home for me. I've been absolutely lucky in going to university and getting a reasonably promising role at a financial institution whilst studying. The flip side has happened to my brother, who at this point is stuck doing menial jobs to pay for food and transport to get to university whilst he collects debt for going. I've had conversations to the effect of 'I actually cannot afford to go to class today or I'll be losing another few hundred dollars I need for food', which is not a thought ANY young person should have to face. The system needs immediate change for the future of our current society.
forrestthewoods 1 day ago 0 replies      
Central Asia and South-East Europe are lumped together? Ouch! That feels like a rather stinging critique of the European Union.
sprucely 1 day ago 0 replies      
Meanwhile the more menial / labor-intensive jobs are being replaced by automation. I read somewhere that automation was supposed to be the great liberator, enabling an ever-increasing amount of leisure time. But at some point attitudes started shifting so that people must justify their existence by continuously working hard; and if things don't happen to work out for them, well they just weren't motivated enough. This attitude is apparent in our [US] welfare system which has a huge administrative overhead in place to prevent freeloading.
cousin_it 1 day ago 2 replies      
Solution: https://en.wikipedia.org/wiki/Works_Progress_Administration

It's been done. It worked. Do it again.

DrNuke 1 day ago 0 replies      
The point is there will be fewer and fewer global-economy jobs (because of automation and the insane productivity it allows very few skilled people) and more, but not enough, local-community jobs (caring, agriculture, menial work and so on). In a fair deal of the so-called first world, too many educated people are already reverting to local-community jobs, competing with the uneducated and with migrants. This is not going to end peacefully if some sort of basic income is not introduced soon.
dm03514 1 day ago 0 replies      
Fewer jobs, more food. Grow food at whatever scale is available: pots in your room, pots on the balcony/porch, small gardens, side gardens, public spaces, large gardens.
geff82 1 day ago 3 replies      
It is grim, except when you live in Germany or Switzerland.
peg_leg 1 day ago 0 replies      
Another idea: how are people today who do so-called 'work' contributing to the human race anyway? In my occupation, what I call 'work' contributes minimally. I help build software to make corporations more money -- almost a negative for the human race. My saving grace is that I make music in my spare time. That is my real contribution to the world.
sogen 1 day ago 1 reply      
Is this a plain, out-in-the-open "ideology injection in the brain" from above (the rich) to deter emigration to better countries?
tmaly 1 day ago 0 replies      
If we had space exploration capability like in Star Trek, we could think about a different approach. But we are constrained to Earth and we have limited resources. Capitalism is the best system available to allocate resources. What we have right now is not really Capitalism.
sudo-i 1 day ago 0 replies      
Hey, how valid is this information? I didn't quite grasp whether they factored in other things, for example, people in current jobs who will pass away. There's also data such as the fact that baby boomers are getting older and will create markets in currently stagnant areas.
peg_leg 1 day ago 1 reply      
The young people of today are different. They are on the cusp of potentially something wonderful and strange for the human race. Older people don't recognize it. The values are different. Maybe the idea of 'work' will change to suit them.
Kinnard 1 day ago 0 replies      
Once we no longer need to work we can occupy ourselves with love, learning, passion and play!
tdsamardzhiev 1 day ago 0 replies      
Better think positive, guys - it's only going to get worse as you get older ;)
moron4hire 1 day ago 0 replies      
I'm struck by the fact that, if 600M people were living together in one area, they'd spontaneously create jobs around the fact that 600M people will have to figure out ways to interact with each other in a civil way.

I tend to believe that the reason we don't have enough jobs right now is market distortions that place unequal value on certain expensive things, like college educations, personal vehicles of far greater passenger capacity than strictly necessary, private dwellings of extremely large size, and the latest and greatest smartphones every two years. Our "betters" have successfully created a scenario where people willingly enter into debt slavery to acquire what they believe is their entitlement.

Because, sans weird pricing, there is real need for work to be done, that is not getting done, in our current environments. There are roads that are falling apart. There is food that is not getting to hungry people. There are children who are not learning what they need to learn to be successful. There are hydrocarbons that are continuing to be burnt. There are routine medical physical exams that aren't being performed.

There are things that people want, but can't acquire at a price that is reasonable. This could be a function of the constituent inputs being too expensive, but I doubt this. Arbitrage is a powerful force for innovation. I suspect there is a much stronger force at work that is preventing the goods and services that people need from being created: mega-corporate-backed government regulation.

There are people in 1st world countries who are going hungry, who don't have heat, who don't have doors on their house. I have seen this with my own damn eyes. Yes, they are poor, but is their poverty their fault? And even if, in some extremely twisted way, it is their fault, does it justify forcing them to live in squalor? Should the laziest of lazy people be forced to live in literal shit-holes?

As long as there are people willing to work but incapable of moving, I think a little "undeserved" compassion is a good enough reason to create a job. Just because some vanishingly few poor people are slovenly doesn't justify completely writing off the entire class.

nathan_f77 1 day ago 0 replies      
Alternatively, let's rethink capitalism now that automation is taking over jobs. Maybe we all don't need to work so hard anymore.
oconnor663 1 day ago 0 replies      
The world has confronted this problem every decade since the beginning of time. Is there any reason to believe This Time Is Different?
dba7dba 1 day ago 3 replies      
We should be honest and talk not just about creating more jobs BUT about having FEWER babies.
oneJob 1 day ago 0 replies      
...because we don't have enough stuff and we always have to be doing something? How about, work less, live more.
collyw 1 day ago 1 reply      
Or just redistribute the wealth more equally.
mygodtou 1 day ago 0 replies      
Lots of people have great ideas, but most governments stifle small business with excessive regulations and fees.
faragon 1 day ago 0 replies      
Let's produce more. Let's make a rich world for everyone :-)
mrdrozdov 1 day ago 0 replies      
Sounds like a good time to create an education business. :)
peterwwillis 1 day ago 0 replies      
Job hack: open volunteer trade schools in impoverished urban areas, fund them with both government and private money, and give tax incentives to those who fund them or volunteer to work there. This could be anything from computer jobs to specialized manufacturing (Foxconn-esque).

Not only could this provide us with a 'cheap labor' manufacturing workforce that corporations love, tech jobs that could be done remotely would also be easy to train for, and thus our country's very limited transportation options wouldn't be such a barrier to getting work. Areas of high crime or gang violence could begin to get kids off the streets and into a stable job.

mwhuang2 1 day ago 0 replies      
Extra schooling only delays reality and leads to more debt. What really matters is simple supply and demand - whether people have the skills that others are willing to pay for.
pinaceae 1 day ago 0 replies      
I don't fully understand this claim.

The job market has people coming in, but also, in parallel, people exiting out of it.

Developed markets, especially in Europe and Japan, will see massive attrition due to people retiring or dying off. The baby boomer generation in the US is retiring as well. All those jobs need to be backfilled, and all those old people will need services.

As the world population is stabilizing, it should not be that bad, no?

Mz 1 day ago 0 replies      
Or, we need 600m new small businesses, consultants, etc. The world can change, adapt.
AnimalMuppet 1 day ago 0 replies      
I recently saw an article (don't recall where, but I think it was based on a World Bank report) that indicated that the fraction of the population aged 18-65 had peaked in 2012. That was part of the problem - more people were of working age. But demographically, that's going to be less and less true as we move forward; perhaps that will soften the conclusions of this article.
greengarstudios 1 day ago 1 reply      
Start a startup and create your own job?
fredgrott 1 day ago 1 reply      
The reality is that with $31 billion in mobile app sales and rising, those jobs will come from small businesses building mobile apps, as every year we have a terabyte of free data to organize into mobile services that our current programming languages cannot learn to organize on their own.

Yes, there will still be net job losses as tech progress eliminates jobs... the new job is the small business you set up.

theworstshill 1 day ago 0 replies      
As difficult as it would be to find money for it, I would propose a one-time entrepreneurship grant to all college graduates equal to an average yearly salary in the profession (this is an approximation; experts should figure out which variables to adjust for the best amount). That would allow several things to happen:

1. New graduates with a strong drive for entrepreneurship can start working on their ideas right away and do not have to spend several years working for corporations, picking up anti-patterns.

2. New graduates who are unable to find professional work can have a cushion while they search, and can potentially become lesser partners to people in the first category.

Jobs and careers are created by businesses, so the more small-medium size businesses there are, places that are still flexible in their mindset - the more work there will be.

Auto-Generating Clickbait with Recurrent Neural Networks larseidnes.com
240 points by lars  1 day ago   64 comments top 24
thenomad 1 day ago 2 replies      
If I could feed this an article and have it generate headlines based on the text of that article (and they were any good), there is a solid chance I would pay real money for that service.

Headlines are an absolute pain, and as the article says, they're decidedly unoriginal most of the time. I can't see an obvious reason why an AI would be much worse at creating them than a human.

blisterpeanuts 1 day ago 4 replies      
I like the notion of swamping the Internet with fake click-bait headlines, to dilute the attractiveness of this (to me, odious) form.

Give me sincere, honest news and discussion, or else shut up.

Unfortunately, someone out there must really have a craving for "weird old tricks" and "shocking conclusions".

It's a sort of race-to-the-bottom, least common denominator effect.

Maybe someone will write a browser extension that filters out obvious click-bait headlines. Now that would be clever!

rndn 1 day ago 0 replies      
Could this RNN model perhaps be used to filter clickbait headlines from HN automatically? Perhaps one could perform some sort of backward beam search to figure out how likely it is that a particular headline would have been produced by the model. If there are words in a headline that the model doesn't know, one could perhaps just replace them with ones it knows.
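A toy version of that likelihood filter, with a bigram model standing in for the RNN (the training headlines, smoothing, and scoring here are all invented for illustration; a real filter would score under the trained network itself):

```python
import math
from collections import Counter

# Score how probable a headline is under a model trained on clickbait;
# high-scoring headlines get flagged. A bigram model stands in for the RNN.
clickbait = [
    "you won't believe what happens next",
    "you won't believe what she did",
    "what happens next will shock you",
]

bigrams = Counter()
unigrams = Counter()
for line in clickbait:
    words = ["<s>"] + line.split()
    for a, b in zip(words, words[1:]):
        bigrams[(a, b)] += 1
        unigrams[a] += 1

vocab = len(unigrams) + 1  # +1 so unseen words still get some mass

def avg_logprob(headline):
    """Average log P(word | previous word), add-one smoothed."""
    words = ["<s>"] + headline.split()
    total = 0.0
    for a, b in zip(words, words[1:]):
        total += math.log((bigrams[(a, b)] + 1) / (unigrams[a] + vocab))
    return total / (len(words) - 1)

baity = avg_logprob("you won't believe what happens next")
plain = avg_logprob("emacs maintainer steps down")
```

Under this toy model the clickbait-style headline scores noticeably higher than the plain one, which is the signal a filter would threshold on.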
oneJob 1 day ago 0 replies      
Now if we can just teach AI to get sidetracked reading all this content we'd also prevent Judgement Day.

SkyNet: (speaking to self?) "Unleash hell on humans. Launch all missiles."

SkyNet: (responding to self?) "Not now, not now. Let me finish this article on John Stamos's belly button."

clickok 1 day ago 0 replies      
Nice! I've wanted to do something like this for awhile, too, but haven't had the time yet.

What's interesting to me, from a research point of view, is the degree of nuance the network uncovers for the clickbait. We all know that <person> is going to be doing <intriguing action>, but for each person these actions are slightly different. The sentence completions for "Barack Obama Says..." are mainly politics related while "Kim Kardashian Says..." involve Kim commenting on herself.

So it might not really understand what it's saying, but it captures the fact those two people will tend to produce different headlines.

Neat idea: what if we tried the same thing with headlines from the New York Times (or maybe a basket of newspapers)? We would likely find that the Clickbait RNN's vision of Obama is a lot different from the Newspaper RNN's Obama. Teasing apart the differences would likely give you a lot more insight into how the two readerships view the president than any number of polls would.

ChuckMcM 1 day ago 1 reply      

I really find RNNs to be pretty cool. When they are combined with a natural human tendency to see patterns they are hilarious. So perhaps we need to update our million monkeys hypothesis to a million RNNs with typewriters coming up with all the works of Shakespeare.

mikkom 1 day ago 2 replies      
What surprises me most is that the headlines seem to be not much better than your average Markov chain output
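For reference, the word-level Markov chain being compared against fits in a few lines (the training headlines below are made up; a fair comparison would train both models on the article's corpus):

```python
import random
from collections import defaultdict

# Word-level Markov chain: pick the next word uniformly at random among
# the words that followed the current word in the training headlines.
headlines = [
    "10 things you need to know today",
    "10 celebrities you need to see",
    "things celebrities do to stay famous",
]

successors = defaultdict(list)
for line in headlines:
    words = ["<s>"] + line.split() + ["</s>"]
    for a, b in zip(words, words[1:]):
        successors[a].append(b)

def generate(rng):
    """Walk the chain from <s> until the end-of-headline marker."""
    word, out = "<s>", []
    while True:
        word = rng.choice(successors[word])
        if word == "</s>":
            return " ".join(out)
        out.append(word)

rng = random.Random(0)
sample = generate(rng)
```

The output stitches together fragments of the training headlines, which is exactly the "locally plausible, globally incoherent" quality the RNN is being measured against.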
rlu 1 day ago 2 replies      
> The training converges after a few days of number crunching on a GTX980 GPU. Let's take a look at the results.

Stupid question: why is the GPU important here? I would have thought this was more of a CPU task..??

(then again, as I typed this I remembered that bitcoin mining is supposed to be GPU intensive, so I'm guessing the "why" for that is the same as this)

flashman 23 hours ago 0 replies      
I used a simpler technique (character level language modelling) to come up with an Australian real estate listing generator: http://electronsoup.net/realtybot

This is pre-generated, not live, for performance reasons. There are a few hundred thousand items though, so the effect is similar.

The data source is several tens of thousands of real estate listings that I scraped and parsed.

juddlyon 1 day ago 1 reply      
I can't stop laughing at these. Check out the Click-o-tron site: http://clickotron.com/
OhHeyItsE 1 day ago 0 replies      
This is simply brilliant.

(Ranking algorithm baked into a stored procedure notwithstanding. [ducks])

neikos 1 day ago 1 reply      
I am not sure how much credit I would give to the idea that the neural network 'gets' anything, as it is written in the article.

> Yet, the network knows that the Romney Camp criticizing the president is a plausible headline.

I am pretty certain that the network does not know any of this and instead just happens to be understood by us as making sense.

billconan 1 day ago 0 replies      
I can't understand the first two layers of the RNN, which according to the author optimize the word vectors.

it says:

During training, we can follow the gradient down into these word vectors and fine-tune the vector representations specifically for the task of generating clickbait, thus further improving the generalization accuracy of the complete model.

how do you follow the gradient down into these word vectors?

If word vectors are the input of the network, don't we only train the weights of the network? How come the input vectors get optimized during the process?
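One way to see the answer: the word vectors are not fixed inputs but rows of a lookup table that is itself a parameter of the model. The network's real input is a word index; the lookup selects a row, and the chain rule gives a gradient for that row just like for any other weight. A minimal sketch (a single linear unit stands in for the RNN, purely for illustration):

```python
import random

# E is a trainable embedding matrix; "following the gradient down into
# the word vectors" just means updating the looked-up row of E.
random.seed(0)
dim = 3
E = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(5)]
w = [random.uniform(-1, 1) for _ in range(dim)]
target, lr = 1.0, 0.05

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

word = 2                              # the network's "input" is an index
before_loss = (dot(E[word], w) - target) ** 2

for _ in range(100):
    x = list(E[word])                 # embedding lookup (copy old values)
    y = dot(x, w)                     # toy forward pass: one linear unit
    g = 2 * (y - target)              # d(squared error)/dy
    for i in range(dim):
        E[word][i] -= lr * g * w[i]   # gradient flows into the word vector
        w[i] -= lr * g * x[i]         # the ordinary weights update too

after_loss = (dot(E[word], w) - target) ** 2
```

After training, the embedding row for `word` has moved and the loss has dropped: the "input" vector was optimized because it was a parameter all along.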

andrewtbham 1 day ago 1 reply      
tl;dr: guy uses an RNN (LSTM) to create a link-bait site.

Hopes crowdsourcing will filter out the nonsense.


chipgap98 1 day ago 0 replies      
"Tips From Two And A Half Men : Getting Real" is great. Some of the generated titles are incredible
indiv0 19 hours ago 0 replies      
Reminds me of Headline Smasher [0].

Some pretty fun ones there but it doesn't use RNNs. It just merges existing headlines.

[0]: http://www.headlinesmasher.com/best/all

alkonaut 1 day ago 0 replies      
Missed opportunity for HN headline.

This program generates random clickbait headlines. You won't believe what happens next. You'll love #7.

kidgorgeous 1 day ago 0 replies      
Great tutorial. Been looking to do something like this for a while. Bookmarked!
smpetrey 1 day ago 1 reply      
I think this one is my favorite:

Life Is About Or Still Didnt Know Me

CephalopodMD 1 day ago 1 reply      
Your main site is down. Bottle can't handle serving files scalably or something? Point is, it broke.
hilti 1 day ago 0 replies      
Interesting blog post, but the site is down. How much traffic do you get from HN?
joshdance 1 day ago 1 reply      
500 Internal Server Error on the site where you could upvote em.
imaginenore 1 day ago 1 reply      
Getting this error:

  Error: 500 Internal Server Error
  Sorry, the requested URL 'http://clickotron.com/' caused an error: Internal Server Error
  Exception: IOError(24, 'Too many open files')
  Traceback (most recent call last):
    File "/usr/local/lib/python2.7/dist-packages/bottle.py", line 862, in _handle
      return route.call(**args)
    File "/usr/local/lib/python2.7/dist-packages/bottle.py", line 1732, in wrapper
      rv = callback(*a, **ka)
    File "server.py", line 69, in index
      return template('index', left_articles=left_articles, right_articles=right_articles)
    File "/usr/local/lib/python2.7/dist-packages/bottle.py", line 3595, in template
      return TEMPLATES[tplid].render(kwargs)
    File "/usr/local/lib/python2.7/dist-packages/bottle.py", line 3399, in render
      self.execute(stdout, env)
    File "/usr/local/lib/python2.7/dist-packages/bottle.py", line 3386, in execute
      eval(self.co, env)
    File "/usr/local/lib/python2.7/dist-packages/bottle.py", line 189, in __get__
      value = obj.__dict__[self.func.__name__] = self.func(obj)
    File "/usr/local/lib/python2.7/dist-packages/bottle.py", line 3344, in co
      return compile(self.code, self.filename or '<string>', 'exec')
    File "/usr/local/lib/python2.7/dist-packages/bottle.py", line 189, in __get__
      value = obj.__dict__[self.func.__name__] = self.func(obj)
    File "/usr/local/lib/python2.7/dist-packages/bottle.py", line 3350, in code
      with open(self.filename, 'rb') as f:
  IOError: [Errno 24] Too many open files: '/home/ubuntu/clickotron/views/index.tpl'

VLM 1 day ago 0 replies      
This was an enjoyable article. There is an obvious extension which is to mturk the results and feed the mturk data back into the net. Just give the turkers 5 headlines and ask them which they would click first, repeat a hundred times per a thousand turkers or whatever.

Years ago I considered applying for DoD grant money to implement something reminiscent of all this for military propaganda. That went approximately nowhere, not even past the first steps. Someone else should try this (insert obvious famous news network joke here, although I was serious about the proposal). To save time I'll point out I never got beyond the earliest steps because there is a vaguely infinite pool of clickbaitable English speakers on the turk, but the pool of bilingual Arabic (or whatever) speakers with good taste in pro-usa propaganda is extremely small, so the tech side was easy to scale but the mandatory human side simply couldn't scale enough to make the output realistically anything but a joke.

Appeals court hits largest public patent troll with $1.4M fee arstechnica.com
185 points by solveforall  2 days ago   48 comments top 7
kelukelugames 2 days ago 4 replies      
I was hoping for Intellectual Ventures.
ww520 2 days ago 3 replies      
$1,025 per hour for partners, $750 for associates, and $310 for paralegals.

Those are kind of high. Did they actually charge those rates? Or retroactively bumped up the rates once they knew the case was dismissed?

xacaxulu 2 days ago 0 replies      
The title alone makes me all "Yissssssssssssss".
shmerl 2 days ago 0 replies      
Good, but I was hoping for Intellectual Vultures to be busted.
ccvannorman 1 day ago 0 replies      
A good start.
curiousjorge 2 days ago 4 replies      
>$1,025-an-hour partners.

TIL some people make literally 100 times what the average person makes.

winter_blue 2 days ago 2 replies      
$1.4 million seems small in comparison to the billions[1] that Microsoft was fined by the EU for bundling certain software with Windows.

Acacia did something that would be considered illegal and exploitative in almost every jurisdiction, whereas bundling a browser straddles a legal gray area. Apple does it on OS X and iOS, Google does it on Android and Chromebooks, and nearly every Linux distro does it. On iOS, you can't even use a browser engine other than Safari's WebKit. And none of these companies has gotten into trouble.

It just seems unfair to me that when a company does something slightly unfairly competitive (like Microsoft) they get hit with huge fines, but when a company like Acacia does something that's outright evil and illegal, the fines are a joke.

I do think Microsoft should be paying even bigger fines for patent-trolling Android manufacturers with false patent claims. And a judicial decision or executive act ordering Apple to allow users to install their own software on iOS, and removing the ban on interpreters/browser engines/etc on the App Store would be appropriate.

[1] $794 million in 2003, $449 million in 2006, $1.44 billion in 2008, and $765 million (561 million) in 2013 -- a total of over $3.4 billion, all for bundling standard software with Windows that all other OSes also bundle. And this money, paid in fines to the European Court, goes back into the EU budget. (TBH, this smells strangely like a revenue-generating move by the Commission.) See: https://en.wikipedia.org/wiki/Microsoft_Corp_v_Commission

Engineer builds 'working' Thor's hammer that only he can lift cnet.com
183 points by davidiach  9 hours ago   46 comments top 16
copsarebastards 9 minutes ago 0 replies      
I'd have picked a different legend: the sword in the stone is closer to how this works. The sword can be wielded by anyone after the king pulls it out of the stone, and this hammer can be wielded by anyone after the engineer pulls it off of the magnet. Thor's hammer can only ever be wielded by him.

But it's still awesome.

Jemaclus 8 hours ago 3 replies      
This is pretty clever. The major improvements I'd want to make are some sort of RFID chip that deactivates the magnet when I'm close enough, instead of a fingerprint scanner. It seems like anyone who gets close enough can see the scanner, so I'd prefer to have something more invisible.

And the second thing would be just to improve the lag time between grasping the handle and the deactivation of the magnet, so I can just lean down and casually grab it, instead of having to hold it for a second before bringing it up. The more magic, the better.

Still, this is pretty awesome!

magicseth 22 minutes ago 0 replies      
Magician Robert-Houdin performed this trick in 1846 (without the fingerprint reader) [1]

He used the "Light and Heavy Chest" to demonstrate his ability to remove the strength of men for political ends.

[1] http://www.themagicdetective.com/2012/05/politics-magic-and-...

lifeformed 1 hour ago 1 reply      
I was hoping that instead of magnets, it would just be extremely heavy, and be able to activate an internal gyroscopic system to do something like this: https://www.youtube.com/watch?v=GeyDf4ooPdo

It'd be pretty hard to fit all that in a small package though, and probably dangerous.

carbide 8 hours ago 5 replies      
Cool idea, terrible acting. I feel like the "wow" factor in his audience was really killed by the awkward way he made it look like he was pushing a button and waiting for something to happen, instead of struggling to "lift" the hammer while he waited for the thumbprint to register.
MisterBastahrd 6 hours ago 0 replies      
Woulda been cool to add a remote shutoff so that the kids trying to lift it could have a bit of a thrill.
animex 2 hours ago 0 replies      
An NFC ring might have been a better solution than the laggy fingerprint scanner. Still, cool idea!
oakwhiz 5 hours ago 1 reply      
It needs an accelerometer to engage the magnet if the device is disturbed without the handle being touched.
jeffwass 8 hours ago 0 replies      
In 'The Illusionist', the magician Eisenheim did a similar trick to Crown Prince Leopold, except it was King Arthur's sword in the stone.
netcraft 8 hours ago 1 reply      
I think NFC or Bluetooth might have been better but neat execution nonetheless.
KM33 3 hours ago 0 replies      
This is really neat. I wonder if there is the possibility of using a similar magnet set-up as a lock? I worry about my motorcycle being stolen since it is so easy to pick up and most locks can be broken, if I had an electro-magnet like this one it might be much harder to steal.
oconnor663 6 hours ago 1 reply      
What happens if the magnet comes back on while the hammer's not touching the metal? Is there a way to make this safe?
trishume 7 hours ago 1 reply      
Neat project.

I can't help but wonder if you could beat the magnet by kicking the handle sideways; the strong impact multiplied by the lever force might be enough to beat it.

ljk 5 hours ago 1 reply      
Since it's a magnet, did it break the electronic devices the "lifters" were carrying?
brador 7 hours ago 1 reply      
Could you do something similar with cornflour mix?
ck2 7 hours ago 1 reply      
Instead of a thumbprint, he should have used a bracelet with an rfid chip, much faster response time and his hand could have been anywhere on the handle.

Or just inject the rfid chip under your finger.

Emacs maintainer steps down gnu.org
238 points by zeveb  1 day ago   70 comments top 9
nanny 1 day ago 4 replies      
Note: this happened almost a month ago now.

See these threads for the discussion on the new head maintainer:



jwr 1 day ago 1 reply      
Stefan's stewardship resulted in a much-improved Emacs. He did a very good job.
davidw 1 day ago 6 replies      
I've been using Emacs for 20 years, I realized. If you think about all the different things that come and go so quickly in this field, that's a pretty amazing run.

Thanks Stefan!

unknownzero 1 day ago 1 reply      
I would encourage anyone who clicks the link to read through the thread. Pretty heartwarming to see the goodbyes, a definite mid-day boost :)
laurentoget 1 day ago 1 reply      
Good to know there are people stepping down off of open source project leadership roles without throwing a tantrum!
seigel 1 day ago 1 reply      
Heading over to vi? :)
ilaksh 1 day ago 0 replies      
I remember in my C++ class around 1997 the professor was saying emacs was more a way of life or operating system than just an editor. He was only half-kidding.

In the past 18 years I imagine the functionality may be even more comprehensive, if that is possible?

melling 1 day ago 0 replies      
I guess this will impact the Emacs 25 release?
JeremyBanks 1 day ago 0 replies      
Is your username intended to describe the idea you're suggesting?
Smartcrop.js content-aware image cropping in JavaScript github.com
218 points by rayshan  10 hours ago   27 comments top 13
zachrose 3 minutes ago 0 replies      
How long until this has native support in CSS?
vortico 9 hours ago 4 replies      
Really awesome, and the test cases look just as good as I could do.

I would warn web designers not to blindly apply this to everything, though. It scans all the pixels of an image, which can take up to 100ms per image, especially on mobile devices. A good use case would be a file upload box that suggests a crop upon upload.

zappo2938 8 hours ago 1 reply      
Cropping images is a massive problem for social media. Here is a talk from 2013 by Christopher Chedeau, a front-end engineer at Facebook, describing some of the problems with their image layout algorithms.[1]

Initially, they tried to solve the problem by getting users to tag people inside the image and then using the locations of the tags as crop parameters. If someone is tagged in a photo, Facebook makes sure that person is always inside the cropped version.

Here is the write up from Christopher's blog.[2]

Was Instagram only using square images at one time? That would have been a brilliant way to have solved this problem.

1. http://blog.vjeux.com/2014/image/image-layout-algorithms-htm...

2. http://blog.vjeux.com/2012/image/best-cropping-position.html
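The tag-based heuristic described above can be sketched roughly like this (the coordinates and window sizes are invented for illustration):

```python
# Choose a fixed-size crop window that keeps every tagged face inside it.
def crop_covering_tags(img_w, img_h, crop_w, crop_h, tags):
    """Return (x, y) for a crop_w x crop_h window containing all tag
    points, or None if the tags are too spread out to fit."""
    xs = [x for x, y in tags]
    ys = [y for x, y in tags]
    if max(xs) - min(xs) > crop_w or max(ys) - min(ys) > crop_h:
        return None  # no single window can cover every tag
    # Center the window on the tags' bounding box, then clamp to the image.
    cx = (min(xs) + max(xs)) / 2
    cy = (min(ys) + max(ys)) / 2
    x = min(max(cx - crop_w / 2, 0), img_w - crop_w)
    y = min(max(cy - crop_h / 2, 0), img_h - crop_h)
    return int(x), int(y)

# Hypothetical 800x600 photo with two tagged faces, cropped to 300x300.
crop = crop_covering_tags(800, 600, 300, 300, [(120, 80), (260, 150)])
```

When the tags spread wider than the window, a real system would fall back to something else (scoring, center crop), which is where the saliency-based approaches in the linked posts come in.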

vjeux 9 hours ago 0 replies      
If you are interested in a similar approach by a Facebook developer: http://blog.vjeux.com/2012/image/best-cropping-position.html
emehrkay 8 hours ago 2 replies      
Damn this is cool. It's kinda amazing that it is all done in a few hundred lines of code. I see there is a skinColor method and setting defined as

 skinColor: [0.78, 0.57, 0.44]
I was curious how it worked with darker skin (I admittedly don't understand what the numbers mean without further analysis), and it came out pretty well (it may default to lighter skin, I don't know)


images found on google
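For what it's worth, one plausible reading of those numbers (an assumption on my part, not verified against the library's source) is that [0.78, 0.57, 0.44] is a target color *direction* in RGB, with skin-likeness measured by the angle between a pixel's color and that direction. That would make the score largely independent of brightness, and so of lighter vs. darker skin:

```python
import math

# Hypothetical skin-likeness score: cosine similarity between a pixel's
# RGB vector and the skinColor direction. Brightness scales the vector's
# length but not its direction, so light and dark tones score alike.
SKIN = (0.78, 0.57, 0.44)

def skin_likeness(r, g, b):
    """Cosine similarity between a pixel's RGB direction and SKIN."""
    mag = math.sqrt(r * r + g * g + b * b) or 1.0
    return (r * SKIN[0] + g * SKIN[1] + b * SKIN[2]) / (
        mag * math.sqrt(sum(c * c for c in SKIN)))

light = skin_likeness(233, 180, 140)   # a light skin tone
dark = skin_likeness(120, 85, 60)      # a darker tone, similar hue
blue = skin_likeness(40, 80, 200)      # clearly not skin
```

Both skin tones score nearly identically while the blue pixel scores far lower, which would explain why the crops came out well on darker skin.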

RichWalton 5 hours ago 0 replies      
Awesome work.

As it happens I'm working on an image cropping front end (using CropperJS [1]) - I'm going to integrate this so that the initial crop selection is set using the results from SmartCrop.

Thanks again.

[1] https://github.com/fengyuanchen/cropperjs

sam_goody 8 hours ago 1 reply      
Two others in this space:

http://thumbor.org

http://magickly.afeld.me

But neither seems to get as much love as they need, so always good to have more players.

rateofclimb 9 hours ago 0 replies      
Very cool. The Ken Burns effect application of the algorithm is particularly impressive.
IgorPartola 9 hours ago 1 reply      
I've actually been looking for something related. I need to quickly classify whether an image contains a face and also whether it contains any text. The former seems to be relatively straightforward, but I haven't found anything for detecting text, only OCR'ing it, which I don't need. Anyone seen anything like this?
sirtastic 9 hours ago 0 replies      
Worked well with the pictures I dropped in, very nice.

Wish someone would make a solid angular dropzone+cropper.

dheera 8 hours ago 0 replies      
If Imgix could implement this server-side, that could be awesome.
jessedhillon 7 hours ago 0 replies      
This is called "salient region detection" and some current approaches (there are many) include detecting the contrast between each pixel and the global or regional average color or luminosity. Areas of high contrast are likely to be regions which are considered interesting. Once you have those regions, you would have to have a separate algorithm which maximizes the placement of a rectangle (the crop) to get the greatest coverage of "interestingness".

You could also combine this with face detection, so that a picture of someone in a bikini doesn't end up cropping just to their midsection, since going by surface area, the torso could have more high-contrast pixels than the face.

Here's one approach which has open source C++ code: http://mmcheng.net/effisalobj/
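The two steps described above can be shown on a toy grayscale "image": saliency as contrast with the global mean, then a brute-force search for the fixed-size crop with the most total saliency (real implementations use integral images to make the window search fast):

```python
# Global-contrast saliency plus exhaustive crop placement, on a small
# grid of brightness values standing in for an image.
def best_crop(img, cw, ch):
    """Return (x, y) of the cw x ch window with the most total saliency."""
    h, w = len(img), len(img[0])
    mean = sum(map(sum, img)) / (w * h)
    # Saliency: how far each pixel is from the global average brightness.
    sal = [[abs(p - mean) for p in row] for row in img]
    best, best_xy = -1.0, (0, 0)
    for y in range(h - ch + 1):
        for x in range(w - cw + 1):
            score = sum(sal[y + dy][x + dx]
                        for dy in range(ch) for dx in range(cw))
            if score > best:
                best, best_xy = score, (x, y)
    return best_xy

# A flat 6x6 image with one bright 2x2 blob in the lower right.
img = [[10] * 6 for _ in range(6)]
for y in (3, 4):
    for x in (3, 4):
        img[y][x] = 200
crop = best_crop(img, 3, 3)
```

The winning 3x3 window is the first one that fully covers the bright blob, which is the "maximize coverage of interestingness" step; the face-detection weighting would then be a second term added to the score.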

donmb 8 hours ago 0 replies      
Exactly what I was looking for. Will give it a try! Thank you.
If You're Not Paranoid, You're Crazy theatlantic.com
202 points by ForHackernews  1 day ago   118 comments top 19
GigabyteCoin 1 day ago 6 replies      
>I'd driven to meet a friend at an art gallery in Hollywood, my first visit to a gallery in years. The next morning, in my inbox, several spam e-mails urged me to invest in art. That was an easy one to figure out: I'd typed the name of the gallery into Google Maps.

I don't see how the author makes the connection here.

How does searching for an art gallery on Google Maps translate into spam emails? Is he accusing Google of selling your email address and search information to spammers?

thaumaturgy 1 day ago 2 replies      
Paranoia is a specific thing. It requires irrational, unjustifiable fears and a sense of blame or persecution. You're not "paranoid" if you've changed your behavior in the wake of the Snowden documents or if you're cautious regarding the amount of information you share with third party services and devices.

I'm not being pedantic; I've seen a lot of arguments recently from actual paranoid conspiracy theorists who feel smug in the wake of Snowden. I'd hate to see people start to confuse real paranoia with informed caution.

radiorental 1 day ago 3 replies      
Somewhat ironic and paradoxical http://imgur.com/Z2BIMcw
mfoy_ 1 day ago 2 replies      
Think of how many apps you've installed which request permission to a whole laundry list of phone functions.

"Oh, it's reasonable that this app wants access to my text messages, that way when it sends me a confirmation code it can automatically read it."

"Oh, it's reasonable that this app wants access to my mic, maybe it will implement voice chat in a coming update."

"Oh, it's reasonable that this app wants access to my call history and whatnot, that way it can mute itself or pause itself when I get a phone call."

... oh, I guess it's reasonable that if I text, or talk with my phone nearby, about walnuts I'll start seeing targeted ads for walnuts.

TeMPOraL 1 day ago 1 reply      
> Had merely typing "seduction" into a search engine marked me as a rascal? Or was the formula more sophisticated? Could it be that my online choices in recent weeks -- the travel guide to Berlin that I'd perused, the Porsche convertible I'd priced, the old girlfriend to whom I'd sent a virtual birthday card -- indicated longings and frustrations that I was too deep in denial to acknowledge?

While a lot of those examples are true instances of tracking and inference, in some cases I think the author is imagining things. People have a scary capability to see patterns and intelligent agents where none exist. It's incredibly easy to cause this.

I'm running a simple IRC bot that "pretends to be human" by means of hand-tailored regular expressions matching input and some witty responses. I can't count the times I've tricked people into believing they were talking to a human. It's like, write out some simple regexes and you're 90% of the way to passing a Turing test. People prime and then fool themselves.

So yeah, I'm betting those results in the part I quoted were caused just by the "seduction techniques" search. And if he clicked on that Ashley Madison banner, he basically sealed his fate.

cubano 1 day ago 1 reply      
Reminds me of poster I saw in dude's house way way back in my stoner days...

"I know I'm paranoid, but am I paranoid enough?"

xkiwi 1 day ago 2 replies      
It is scary for me to know:

# The majority of people post photos of friends on Facebook without understanding that facial recognition always scans them.

# People prefer convenience over privacy, such as toll tags on cars.

# People follow trends.

hyperion2010 1 day ago 1 reply      
Qu'on me donne six lignes écrites de la main du plus honnête homme, j'y trouverai de quoi le faire pendre. ("Give me six lines written by the hand of the most honest man, and I will find in them something to have him hanged.") --Cardinal Richelieu
cardamomo 1 day ago 0 replies      
I really enjoyed reading this piece, not only for its discussion of privacy, but also for its poetic and reflective language. There's something more than technical about today's surveillance problem, and the author approaches this issue from a philosophical and, at times, almost spiritual angle.

These are the kinds of discussions we need to have more often: not only what's going on and what it means in practical terms, but also how today's surveillance explosion changes who we are and how we relate to ourselves.

mattmanser 1 day ago 1 reply      
If you are a sci-fi fan, an interesting new trilogy to read is the Imperial Radch trilogy by Anne Leckie starting with Ancillary Justice.

One of the interesting themes is that everyone is surveilled totally and intimately down to even their feelings, but it's not the point of the book and the protagonist treats it as totally normal and it's never really discussed nor is there any suggestion it would be better if that wasn't the case.

I really enjoyed the trilogy but as someone who's pro-privacy it was a strange read.

msutherl 1 day ago 6 replies      
I still don't quite comprehend why people feel personally bothered by such things. Yes, it is better for society to have safeguards in place to prevent certain kinds of surveillance as a check on governmental and corporate power and we need to fight for this but some data about you stored on some servers, a targeted advertisement? What exactly is the immediate personal threat?
euske 1 day ago 0 replies      
Paranoid or not, we should develop a healthy skepticism about this in the society. A scary thing to me is that most people don't know about the true capability of information linking/correlating from multiple sources. It's not intuitive for us that you can get seemingly innocuous data and combine them to magically tell one's behavior. These kinds of threats should be systematically studied and made consciously known to the public.
zbyte64 1 day ago 2 replies      
The part about voluntarily giving up confessions reminds me of something similar during the Vietnam war. People would be grouped together and would need to "confess" the allures of capitalism. You could only graduate from the program once you procured enough drama to guarantee you were a comrade.
Simulacra 1 day ago 2 replies      
Paranoia can be healthy sometimes
shogun21 1 day ago 0 replies      
People are afraid of what they don't understand. If you just thought of the cloud as a server somewhere instead of a mysterious "ghostly entity", you'd know it's really not as smart as you think it is.
geggam 1 day ago 0 replies      
Amazes me that anyone familiar with data online thinks there aren't ways to track and store everyone's internet usage.
em3rgent0rdr 20 hours ago 0 replies      
20+ trackers blocked by Privacy Badger while reading this article.
graycat 1 day ago 0 replies      
In your Web browser, be careful what cookies you are willing to accept.
powera 1 day ago 1 reply      
Actually, this guy is paranoid and crazy. It doesn't mean he isn't right, but by any definition he is both paranoid and crazy.
Convicted by Code: Defendants should be able to inspect code used in forensics slate.com
168 points by Figs  2 days ago   38 comments top 9
donkeyd 1 day ago 0 replies      
I once nearly lost a contest because of a faulty SQL query on their side. If I hadn't gotten to see the query, I wouldn't have been able to defend my entry and would've lost. Losing this contest would've been a trivial matter, but if the same thing happened at trial, it would be horrible.

The error was that a GROUP BY was used to count the unique entries, even though the entries had leading spaces that were part of their uniqueness. Their GROUP BY didn't take the leading spaces into account, leading them to get a different result than me. I think this could've happened to a lot of people, even forensic IT engineers.
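The underlying pitfall - grouping keys that differ only in whitespace - is easy to reproduce outside SQL. A minimal Python illustration (invented data, not the contest's actual query):

```python
from collections import Counter

# Entries whose leading spaces are (intentionally) part of their uniqueness.
entries = ["alpha", " alpha", "beta"]

strict = Counter(entries)                     # whitespace-sensitive grouping
sloppy = Counter(e.strip() for e in entries)  # grouping after trimming

print(len(strict))  # 3 distinct entries
print(len(sloppy))  # 2 - " alpha" and "alpha" collapse into one group
```

Whether a real database behaves like `strict` or `sloppy` depends on column type and collation, which is exactly why being able to inspect the actual query matters.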

6stringmerc 1 day ago 2 replies      
Okay, so this article is bumping up against the hysteria I'd categorize as "semi-technology literate," yet it makes some good points. It's almost like talking about how dangerous it is to walk through a minefield and then stepping on one. There's a valid point in there somewhere.

Copyright reform is one of my favorite subjects, and for a multitude of reasons. Should the prosecution be able to dump a case straight up without recourse because the "Stingray" gathering tool is too lovely to submit to review? Nope. Should breathalyzer code be held from review just because it's a product made by somebody? Nope. Should FOIA be stonewalled or pay-walled and inhibit the Constitutional freedom of the press? Nope!

Innocent until proven guilty is a very, very important premise for the US legal system. It's backed up by both the Fourth and First amendments to the Constitution. Any justification to put them aside for "War on ____" might seem reasonable on the surface, until taking a closer look at multiple murder evidence that comes from within the borders more often than on a laptop of a Citizen who just so happens to be coming back from a foreign country and gets worked for passwords under duress or has to forfeit hardware without recourse.

I dunno, maybe I sound like some kind of off-the-rocker dude by thinking about such things, but I love my country, I'm willing to sit down and think about this kind of stuff. It doesn't have to be extreme. Taking the small steps of talking with one another about what we really value is important in my opinion.

triggercut 1 day ago 2 replies      
There are similar issues with this in Structural and Mechanical engineering. Engineers are expected to rely more and more on software to execute and document complex calculations to verify designs, but how can you be sure those underlying calculations/theorems/models are correctly implemented? Some packages are constantly patching particular edge cases that get sent to them from their users. Many issue announcements to warn of bugs that could cause an incorrect result.

If a result from software led to a critical failure in a design, the onus is most likely still on the Engineer.

I have seen cases where software is formally reviewed by independent verification bodies, much in the same way your ISO 9001 compliance is. I can't see why this wouldn't apply here. Have an independent party, who has signed an appropriate NDA, assess and certify that your product does what it says on the tin, and audit it at regular intervals.

downandout 1 day ago 3 replies      
This defense attorney was creative for asking to examine the source code, but that isn't the only way to cast doubt on the accuracy of the software that DNA matched his client to the crime scene. He could simply obtain a copy of it and have an expert run tests to determine a false positive rate and also what types of scenarios cause the software to deliver false positives, then call that expert as a witness.
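The black-box testing approach he describes is straightforward in outline: feed the software known non-matching pairs and count how often it claims a match. A sketch with a random stand-in for the real matcher (everything here is invented for illustration; no real forensic tool is being modeled):

```python
import random

def dna_matcher(sample_a: str, sample_b: str) -> bool:
    """Stand-in for the proprietary software under test: pretend it
    wrongly reports a match 2% of the time on unrelated samples."""
    return random.random() < 0.02

random.seed(0)  # deterministic for the example
trials = 10_000
false_positives = sum(
    dna_matcher("suspect", f"unrelated-{i}") for i in range(trials)
)
rate = false_positives / trials
print(f"estimated false positive rate: {rate:.3f}")  # around 0.02
```

An expert witness would also probe *which* kinds of inputs trigger false positives (degraded samples, mixtures, relatives), not just the overall rate.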
finance-geek 1 day ago 1 reply      
I think things will become even worse now that criminal "scouting" and even vetting are being done via learning models. You may not even find hard filters or conditionals... instead the errors (or stereotypes?) would be embedded deep inside some neural net. I'm not even sure how one would explain that to a jury.
TazeTSchnitzel 1 day ago 0 replies      
The essential problem is that in such environments the process of doing a task must be open to inspection, but software exists as a loophole that circumvents making process public.
jhwhite 1 day ago 0 replies      
This used to be a problem in Florida with drunk driving arrests. The company that makes the code for the breathalyzers wouldn't allow it to be reviewed by defendants. A precedent was finally set that defendants couldn't mount a viable defense without reviewing the code.

So for a while people accused of a DUI could wind up getting off, under the right circumstances, by requesting the source code then getting refused by the company.

The company finally allowed pieces of the code to be reviewed by the courts.

cm2187 1 day ago 3 replies      
I'm not sure I agree with that view. Independent testing by another lab should remove any doubt about the validity of a forensic result, rather than forcing companies to open-source their technology. And of course some form of certification/random testing should ensure that the company providing the forensics isn't a bunch of conmen.
joesmo 1 day ago 0 replies      
This is what happens when you have companies profiting off the misery of others.

The biggest reason for companies wanting to protect their source code in this case is that they already know their software is broken, like pretty much every other software, and they don't want to fix it. The arguments against losing money and such are total bullshit as courts have plenty of procedures for disclosing materials only to the relevant parties present, not to the public as a whole. These companies simply don't want to spend the money auditing and making sure their code runs correctly because the only consequence of that is wrongfully convicting someone they don't give a fuck about.

I'd say, let them see the code and let the highest paid expert witness win. That is, after all, the American way.

How is NSA breaking so much crypto? freedom-to-tinker.com
182 points by sohkamyung  2 hours ago   54 comments top 13
misiti3780 46 minutes ago 3 replies      
"Since weak use of Diffie-Hellman is widespread in standards and implementations, it will be many years before the problems go away, even given existing security recommendations and our new findings. In the meantime, other large governments potentially can implement similar attacks, if they havent already."

Can someone explain to me why this can't be fixed overnight? I'm no crypto expert, but

" If a client and server are speaking Diffie-Hellman, they first need to agree on a large prime number with a particular form. "

Why can't you just switch the large prime number and then continue sending encrypted data?
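Partly because the prime isn't negotiated fresh between the two endpoints: it's baked into standards, server configs, and shipped binaries, so "just switch the prime" means updating a very long tail of software on both ends of every connection. For context, a toy sketch of the exchange (tiny numbers; real deployments use 1024-bit-plus primes), showing that p is the public, shared ingredient the precomputation attack targets:

```python
# Toy Diffie-Hellman with deliberately tiny numbers.
p = 23                        # the publicly agreed prime - the attack
g = 5                         # precomputes against p, not any one key
alice_secret, bob_secret = 6, 15

A = pow(g, alice_secret, p)   # Alice sends A to Bob
B = pow(g, bob_secret, p)     # Bob sends B to Alice

shared_alice = pow(B, alice_secret, p)
shared_bob = pow(A, bob_secret, p)
assert shared_alice == shared_bob  # both sides derive the same secret (2 here)
```

Because so many servers reuse the same few standardized primes, one expensive precomputation against a single p pays off across millions of connections.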

paulgerhardt 1 hour ago 0 replies      
See also Martin Hellman's oral history on trap doors: https://conservancy.umn.edu/bitstream/handle/11299/107353/oh...
Pyxl101 34 minutes ago 1 reply      
Some advice from the authors on how to properly deploy Diffie-Hellman:


smegel 9 minutes ago 1 reply      
Can someone explain what "breaking a prime" means? What is the output after your year of computation?
542458 1 hour ago 1 reply      
I wonder what the effort to break a 2048-bit prime would be. I suspect it's heading into "dyson sphere powered ideal computer" territory, but I'd be curious to know what it would actually be.
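A rough back-of-the-envelope using the general number field sieve's asymptotic cost (with the o(1) term dropped, so the absolute numbers are meaningless and only the ratio between key sizes is even vaguely indicative) suggests roughly a billion-fold increase from 1024 to 2048 bits - enormous, though short of Dyson-sphere territory:

```python
import math

def gnfs_cost(bits: int) -> float:
    """GNFS L-notation cost, o(1) term ignored - ratios only."""
    ln_n = bits * math.log(2)  # ln of a `bits`-bit modulus
    c = (64 / 9) ** (1 / 3)
    return math.exp(c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

ratio = gnfs_cost(2048) / gnfs_cost(1024)
print(f"2048-bit / 1024-bit cost ratio ~ {ratio:.1e}")  # on the order of 1e9
```

So if a 1024-bit precomputation costs on the order of a year on special-purpose hardware, 2048-bit is out of reach by many orders of magnitude with current algorithms; constant factors and memory costs make the gap even larger in practice.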
zmanian 1 hour ago 1 reply      
How much software has been updated to use stronger DH either ECC or 2048 bit prime field?

Is there an easy way to check if a VPN provider has updated?

The ASICs NSA built for breaking some common 1024 bit fields are probably breaking specific RSA keys now...

AnonNo15 1 hour ago 3 replies      
Crap. So what are the immediate countermeasures? Switch to elliptic curves cryptography?
agwa 1 hour ago 2 replies      
There aren't any new findings here. It's merely a rehash of the Weak DH attack (by the same researchers) that was made public in May of this year: https://weakdh.org/

Still, it's a good reminder that you should not be using 1024-bit Diffie-Hellman.

mrb 1 hour ago 0 replies      
FYI this is not really new news. The authors of that research had already disclosed their findings at https://weakdh.org about 5 months ago.

Today they simply formally presented their research at ACM CCS.

auntienomen 1 hour ago 0 replies      
Ha ha! (Seriously, nice paper.)
ape4 1 hour ago 0 replies      
Important stuff.
NN88 56 minutes ago 3 replies      

I wonder what world you all live in in which this is a bad thing. There are real threats out there, and I'd hate to live in a country that lacked the geopolitical leverage to use these tools in my nation's interests.

dogma1138 49 minutes ago 2 replies      
Breaking crypto is what the NSA was created to do; playing a cat-and-mouse game with it means you'll always lose. If the NSA cannot break crypto, it's useless, and given the two outcomes - them giving up, or them just asking for more money and being more intrusive - the latter is much more likely.

No one will get their privacy "back" by fighting the NSA through technology. Considering their mission, budget, and capabilities, they'll always win. The only way to pacify the NSA is through legislation that ensures they only use their capabilities when it's warranted.

Whats new in HAProxy 1.6 haproxy.com
189 points by oldmantaiter  15 hours ago   38 comments top 10
radoslawc 14 hours ago 2 replies      
The external check pleases me greatly, but sending emails seems like overkill to me; there are well-established ways to do this in a unified manner (log parsers, SNMP traps, etc.). Halfway through to fulfilling Zawinski's Law.
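For reference, the 1.6 mail feature being discussed looks roughly like this in a config (sketched from the 1.6 release notes; addresses and names are placeholders - check the official configuration manual for exact directive syntax):

```
mailers alert-mailers
    mailer smtp1 192.168.0.10:25

backend app
    email-alert mailers alert-mailers
    email-alert from haproxy@example.com
    email-alert to ops@example.com
    email-alert level alert
    server web1 10.0.0.1:80 check
```

Alerts then fire on events like a server going down, which is exactly the duplication of log-based alerting the parent is objecting to.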
pentium10 15 hours ago 1 reply      
Cool, now we can use Device Identification feature to route mobile users to a different backend, also love the HTTP/2 connection sharing.
wheaties 12 hours ago 2 replies      
Whenever I see a long list of features like these, especially something major like Lua integration, I always wonder: what was given up in the process of adding them? Normally when there's a performance bump, they show the numbers. In this case there are no numbers. Was there a performance hit? Was it negligible?
nailer 13 hours ago 2 replies      
As an HAProxy user, support for logging to stdout (and hence journald) would be great. Currently HAProxy users on the major Linux distros either have to use it in debug mode or have a second log server just for the purposes of running HAProxy.

Otherwise I love HAProxy!

ris 6 hours ago 1 reply      
Something I've always wanted to do, but which as far as I can see is impossible in Apache, is simply limiting the number of connections a single IP can have open at once.

Is this possible with HAProxy? If it is, the documentation doesn't make it clear how.
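For what it's worth, HAProxy can do this with stick tables; a sketch along these lines (directive names from memory of the HAProxy docs - verify against your version's manual):

```
frontend ft_web
    bind :80
    # track per-source-IP connection counts in a stick table
    stick-table type ip size 100k expire 30s store conn_cur
    tcp-request connection track-sc0 src
    # reject a client that already has more than 10 open connections
    tcp-request connection reject if { sc0_conn_cur gt 10 }
    default_backend bk_web
```

The same stick-table mechanism can also store request rates, which gets you simple per-IP rate limiting as well.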

binaryanomaly 12 hours ago 1 reply      

But no http2 so it won't get in front of my nginx instances, yet ;)

dexcs 15 hours ago 2 replies      
Nice. It supports lua and mail alerts on changing servers now...
ausjke 14 hours ago 2 replies      
I was comparing HAProxy to Squid a while ago and could not figure out HAProxy's advantage over Squid. I ended up using Squid but am still very interested in HAProxy and would like to learn more about it.

Squid remains the only one that can deal with SSL proxying (yes, it's kind of MITM, but it's needed sometimes), and it's also the real "pure" open source option. HAProxy might be a better fit for enterprises that need support?

ErikRogneby 15 hours ago 0 replies      
Lots of goodness!
wpblogheader 13 hours ago 0 replies      
Supports Lua? SWEET!
Laser Razor suspended by Kickstarter bbc.com
180 points by aram  1 day ago   123 comments top 28
beloch 1 day ago 4 replies      
To those looking for more concrete info on how this razor is supposed to work, here's the patent for it:


As far as I can tell, the idea is to use evanescent coupling to transfer light into hair follicles. There's no free space laser beam, just an optical fiber that you drag across your face. They also claim that chromophores (color bearing molecules) in hair can be severed at relatively low powers with a mixture of several specific frequencies of light.

So, what this product needs in order to work is a fiber that's durable enough to survive being dragged across skin while having very little cladding so as to allow evanescent coupling. That could be very hard to do, so the heads on these laser razors may wear out after a few shaves just like a metal razor. Second, they need to pack a high power multi-wavelength laser source and the power reserve to run it into a very tiny handle. Again, this is probably going to be pretty tricky.

There's nothing here that looks outright impossible to me. Just very, very tricky.

DannoHung 1 day ago 6 replies      
The thing that is weird about this case is that one of the principals, Morgan Gustavsson, is actually the dude who invented the real laser hair removal that is used in clinics and has been involved in dermatology since.

I don't know if he was actually involved in this project or not, but that was the one thing that made me think that this maybe wasn't 100% a scam?

Anyway, the implication they make in their pitch isn't that it's an open laser, but that it is a laser confined to a fiberoptic wire which leaks into the hair when pressed against it. Gustavsson has published some papers on this a few years ago in which he refers to the concept as a TRASER.

Of course, if this really is such a revolutionary advance, why go to Kickstarter to bring it to market? Why not traditional investors. Gotta be easier to get funding for a significant manufacturing outlay, right? Just to not have to sell a piece of the company? To justify that there is a market?

I personally don't have the background to make any judgments about this and I definitely don't understand the article he published, but I just thought it didn't completely fail the smell test.

dogma1138 1 day ago 8 replies      
It's sad that people thought it was real; forget about lasers and stuff, it's basic common sense.

An AAA battery doesn't store enough power to drive a laser capable of burning hair for any reasonable amount of time.

When the laser isn't interrupted by the hair, it has to go somewhere, which means heat is produced. If it can get something hot enough to burn hair off, it would get hot enough that you wouldn't be able to hold it, let alone put it to your face.

There's no way you could ever get the laser beam close enough to the skin for a smooth shave without burning your skin off.

And most importantly, burnt hair smells like shit...

P.S. I assume most people know at least one person who has done laser hair removal; they should know it's a very painful and long process, and it works only on dark hairs, so again, using this to shave anything but a fairly dark beard would never work.

aram 1 day ago 2 replies      
Since we all check the comments first, here are some links:

KickStarter project page (suspended):


IndieGoGo project page (they re-posted the project there after being suspended):


Demonstration video:


Sephr 1 day ago 2 replies      
> Backers received an email from Kickstarter saying the Laser Razor was "in violation of our rule requiring working prototypes of physical products that are offered as rewards".

Interesting. Was this a recent policy change? Control VR never showed any working prototypes either, and their campaign was allowed (this was in 2014). Their demonstration video was later revealed to be using another company's significantly more expensive product ($10k+ vs the $600 pledge price), with zero modifications. They never demonstrated any prototypes of the product they were claiming to develop themselves, yet the campaign went through and now they have everyone's money (>$400k) without delivering.

hoopism 1 day ago 1 reply      
hackaday.com had a good writeup that was skeptical of this.


They have generally been good at writing up some of the more sketchy kickstarts.

vicbrooker 1 day ago 1 reply      
Their prototype reminds me of concept car designs that look great but don't have enough internal space for an engine. You'd maybe fit a AAA battery in the handle but I haven't even seen a torch that works without more juice.

Also, I was kind of suspicious when I noticed that more than half of the team have beards.

ortusdux 1 day ago 1 reply      
In light of the recent actions of the Washington State Attorney General, it takes balls to try and pull a scam like this. The AG ruled against the people behind a kickstarter campaign for a card game when they did not ship rewards for two years. The original campaign raised 25k and the judgement was for 56k. The ruling was on behalf of 31 residents out of 810 backers. The company is slowly shipping units and it looks like the AG is backing off.

But this whole thing brings up many interesting questions. The fine was 1k per WA resident + reimbursement + legal fees. Knowing that the AG has your back if things go south should undoubtedly embolden WA residents, which may lead to a higher percentage of backers coming from that state, which in turn would mean a higher fine if things fall through. I honestly was going to have my sister, a WA state resident, back this for me for my birthday, if it survived to the last day of funding.




blakecallens 1 day ago 1 reply      
If Indiegogo doesn't suspend them too, it could be a real watershed moment for them. It'll brand them as the place to go for scam products.
MattGrommes 1 day ago 0 replies      
As soon as I saw that Kickstarter kicked them off I was wondering how long it would take for them to get onto Indiegogo. Then, of course I see it only took 4 hours.

If you're unaware, Indiegogo has shown that they're more than willing to be the platform of choice for scammers and nonsense products.

joshdance 1 day ago 2 replies      
I really want a Kickstarter review site, where people can post projects. Since non-backers cannot post comments there is no way to let people know that 'this project is probably not going to work'. Snopes for Kickstarter?
steven2012 1 day ago 2 replies      
This current phenomenon about crowdsourcing products is similar to the GroupOn/flash sale phenomenon, and as the products get shittier and shittier, and as more and more get disillusioned by it, the entire space will die.

Kickstarter and IndieGoGo need to do a much, much better job policing this, otherwise they will be out of business in 2 years. There are too many shitty products with great marketing videos that are taking a lot of money, and they will likely all be disappointing as hell.

ibmthrowaway271 1 day ago 0 replies      
Do not look into razor with remaining eye?
danr4 1 day ago 1 reply      
I don't like what has become of Kickstarter, but I think it's more its users' fault than the company's. They are really trying to stick to their values, principles, and mission despite having the option to go the usual "let's raise bazillions, get some flippin' growth, and extinguish the competition" route.
RUG3Y 1 day ago 0 replies      
"Don't worry, it'll be ready by spring." lol
wlesieutre 1 day ago 0 replies      

> "They have been incredibly helpful and they believe in the Skarp Razor as much as we do," the firm said of Indiegogo.

is a very polite way of saying, "Yeah, we don't believe in our own product either."

thehodge 1 day ago 0 replies      
> The Skarp razor is powered by a small laser which cuts through hair for an incredibly close shave without irritating or damaging the skin.

I would read that as them seeing that the laser is cutting the skin but that could be easily be intentionally misleading marketing speak

jakejake 1 day ago 0 replies      
I couldn't help but wonder if this would ultimately produce the same result as the "No No" shaver which, from what I read, is just a hot wire that burns the hair at the root.

The reviews make it sound like it's a rather slow and smelly affair shaving this way.

BinaryIdiot 1 day ago 0 replies      
How can you prove to potential buyers that you're able to produce the product you're selling them if you can't create a prototype demonstrating that it works?

Let's say we take everything at face value and believe 100% they can do this and that it is not a scam. If there is no prototype it's still possible that it could not work as it's unproven.

Paul_S 1 day ago 0 replies      
Reminds me of that hologram device scam, Bleen. It was so funny I couldn't tell if it was a scam or a parody making fun of crowdfunding. Unfortunately it was flexible funding, so people lost their money; and whatever you think of them, no one deserves to be scammed.
aet 1 day ago 0 replies      
A fool and his money, be soone at debate: which after with sorow, repents him to late.
jlebrech 1 day ago 0 replies      
What made me skeptical of this is that it looked like too much of a polished system, you'd expect it to look more like a braun shaver than a gillette fusion, with more battery space. The AA battery was also a dead giveaway.
Simulacra 1 day ago 0 replies      
Unpopular opinion, but I think patents should be assigned based on actual working models of something. Not just an idea, but something that has actually been tangibly created.
ipsin 1 day ago 0 replies      
The name "Skarp" -- possibly in conjunction with "Kickstarter" -- immediately made me think of "scarper", i.e. the process of fleeing, ideally with the big bag of crowdfunding money.
happywolf 1 day ago 0 replies      
Anything that heats a hair to the point it burns off, no matter what the heat source is, will give out a bad smell. Good luck if one has thick beard, and no, I am not convinced this will work cleanly.
kazinator 1 day ago 0 replies      
Laser Razor will clearly have to show actual hair removal to prove itself; fleecing countless sheep of millions is a mere metaphor.
funkaster 1 day ago 0 replies      
lol... what I found funny and somewhat ironic is that most of the people in the video seem to wear a beard? (or lack of shaving) Maybe they're waiting until they can get their hands in an actual working prototype :P
bsder 1 day ago 0 replies      
Um, folks, you're missing the obvious.

Optic fiber glass is SHARP AS HELL.

So, make the thing light up with pretty lights and colors and expose a sharp glass edge for cutting.

Works like a razor, cuts incredibly closely, and has ooh shiny for marketing.

I'm not seeing a problem with this.

Apple facing huge chip patent bill after losing case bbc.com
153 points by jnord  17 hours ago   133 comments top 16
devit 15 hours ago 4 replies      
Looks like the "idea" of the patent in the description is to use a predictor to predict when a STORE and LOAD alias and not speculate the LOAD and any instruction depending on the load (although the claims generalize this to any non-static dependency).

As it generally happens in software/hardware patents, the claimed solution seems quite obvious whenever one wants to solve that particular problem, and the hard part is the "execution", i.e. implementing it efficiently and figuring out whether the tradeoffs are worth it.

So assigning patents to things like this seems really dumb.

rayiner 14 hours ago 2 replies      
This PDF explains what I discuss below in more detail: http://moodle.technion.ac.il/pluginfile.php/315285/mod_resou.... Prediction of aliasing is discussed on slide 25.

The patent in question pertains to an optimization of what these days you'd call "memory disambiguation." In a processor executing instructions out of order, data dependencies can be known or ambiguous. A known data dependency is, for example, summing the results of two previous instructions that themselves each compute the product of two values. An ambiguous data dependency is usually a memory read after a memory write. The processor usually does not know the address of the store until it executes the store. So it can't tell whether a subsequent load must wait behind the store (if it reads from the same address), or can safely be moved ahead of it (if it reads from a different address).

If you have the appropriate machinery, you can speculatively execute that later load instruction. But you need some mechanism to ensure that if you guess wrong--that subsequent load really does read from the same address as the earlier store--you can roll back the pipeline and re-execute things in the correct order.

But flushing that work and replaying is slow. If you've got a dependent store-load pair, you want to avoid the situation where misspeculation causes you to have to flush and replay every time. The insight of the patent is that these dependent store-load pairs have temporal locality. Using a small table, you can avoid most misspeculations by tracking these pairs in the table and not speculating the subsequent load if you get a table hit. That specific use of a prediction table is what is claimed by the patent.

Maybe this is worth a patent, or maybe not. For what it's worth, I don't think anybody was doing memory disambiguation at all in 1996. Intel was one of the first (maybe the first) to do so commercially in the mid-2000's. Apple's Cyclone architecture also does it, and I think it was the first in the low-power SoC space to do it.
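The table mechanism described above can be sketched in a few lines - a toy model only, not any shipping microarchitecture (real designs such as the Alpha 21264's wait bits or Intel's disambiguation predictor differ in indexing, tagging, and reset policy):

```python
class AliasPredictor:
    """Toy store-load dependence predictor: remember the PCs of loads that
    recently misspeculated past an aliasing store, and stop speculating them."""

    def __init__(self, size: int = 64):
        self.size = size
        self.table = [None] * size       # small direct-mapped table of load PCs

    def may_speculate(self, load_pc: int) -> bool:
        # Speculate unless this load recently caused a pipeline flush.
        return self.table[load_pc % self.size] != load_pc

    def record_flush(self, load_pc: int) -> None:
        # Called on misspeculation: remember this load's PC.
        self.table[load_pc % self.size] = load_pc


pred = AliasPredictor()
assert pred.may_speculate(0x400A10)      # first encounter: speculate freely
pred.record_flush(0x400A10)              # ...it aliased; flush and record it
assert not pred.may_speculate(0x400A10)  # next time: wait for the store
assert pred.may_speculate(0x400B20)      # unrelated loads still speculate
```

Temporal locality is what makes such a tiny table effective: the same few static load instructions tend to alias again and again.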

msravi 13 hours ago 0 replies      
UWisc has always been very aggressive with its patents. I recall sometime during 2002 or thereabouts, while working for a reasonably big semiconductor company with DSP/ARM processors, one of the guys in our team with an interest in computer architecture, used the company network to download and play with a simulator or something (might have been simplescalar). A few weeks later the head of our group gets contacted by the company lawyers saying that UWisc was asking for licensing costs for using their tools (they provided the ip address that was used to download the tools). I'm not sure how it was resolved finally, but I don't think the company paid.
DannyBee 14 hours ago 1 reply      
In general, I welcome the day when universities get what is coming to them for this kind of stuff (see also: Marvell vs CMU for 300+ million, reduced from 1.5 billion on appeal, etc).

In particular, given how much industry funds them, collaborates with their professors, etc, what is going on now is a remarkably stupid approach mostly driven by tech transfer offices that want to prove their value.

Which will be "zero", once the tech industry starts cutting them off.

Kristine1975 13 hours ago 5 replies      
> The University of Wisconsin–Madison is a public research university

So it's a university [mainly] funded by the taxpayer. How can it be that this university's research isn't in the public domain? The public paid for it; the public should reap the benefits without paying again.

Sure, Apple tries its hardest not to pay taxes, but the patent isn't limited to them.

ctz 16 hours ago 1 reply      
skaevola 11 hours ago 1 reply      
I'm curious how the university could discover that Apple was using its patent. The internal characteristics of the processor must be secret, right? Do they examine die photos and reconstruct the gate netlist?
monochromatic 14 hours ago 2 replies      
How is this journalism? It doesn't even tell you the damn patent number.
DannoHung 13 hours ago 0 replies      
I have one question: Do the professors teach this technique in classes?

I mean, that'd be funny, right? Teaching students something that you patented, waiting a few years for them to go into industry and apply what they learned, then suing them for it.

bitmapbrother 12 hours ago 1 reply      
I have no sympathy for Apple in this matter. Considering the worthless, prior art ridden patents they used against their competitors they deserve the blowback. And in keeping with their modus operandi they ignored the University of Wisconsin and wilfully infringed the patent.
abluecloud 16 hours ago 4 replies      
$862m isn't that huge in the grand scheme of things. Not to mention, it's most likely not going to be $862m, my guess is it'll be less.
propter_hoc 14 hours ago 6 replies      
This is sort of a depressing precedent. Do we really want to turn our universities into patent trolls?
cozzyd 14 hours ago 0 replies      
Awesome, maybe the Brewers need a new stadium too.
mtgx 15 hours ago 1 reply      
You know what they say: Live by the patent sword...

Why doesn't Apple start lobbying for real patent reform?

werber 15 hours ago 1 reply      
I don't get how they settled out of court and then did it again; that seems really bizarre.
bwilliams18 12 hours ago 2 replies      
What if patents could only be held by individuals and not corporations?
MH17 Report bbc.co.uk
189 points by nns  1 day ago   138 comments top 11
nns 1 day ago 3 replies      
Video Reconstruction: https://www.youtube.com/watch?v=KDiLEyT9spI

At the 10:30 mark you can see the missile's impact on the plane.

danielvf 1 day ago 0 replies      
rogeryu 1 day ago 6 replies      
It's incredible that 61 airlines flew over that area during those days. That day alone, 160 airplanes flew over eastern Ukraine before MH17 was hit. Even if it's safe to fly at 10 km altitude, what happens if you have a problem with your plane, like a failing engine? Next time I fly, I'm going to check which countries we fly over.
bonkabonka 1 day ago 0 replies      
The Aviation Herald has a very good writeup: http://avherald.com/h?article=47770f9d/010&opt=0
escapologybb 1 day ago 4 replies      
Can anyone break down how they knew that the missile exploded within a one-cubic-metre volume of air, just above and to the front of the plane, please?

Do they do it just from the pattern of debris, or do they use other methods as well? I gathered that they can use the microphones in the cockpit to work out the direction the missile fragments came from, but I'm not quite sure of the details.

Thanks in advance!

kushti 1 day ago 1 reply      
Interestingly, it seems everyone has forgotten CNN top news from mid-July: http://edition.cnn.com/2015/07/15/politics/mh17-pro-russian-... . CNN did spread an outrageous lie, but nobody cares.
pms 1 day ago 2 replies      
All of this has been known, thanks to Bellingcat, for at least a year.
sschueller 1 day ago 1 reply      
The report did not go into finding out who did it. Both sides own BUKs. It is very difficult to find the truth because all sides are pumping out propaganda. Even the United States is manipulating what we hear for their own political and economic gains.

Here's a subtle but strong example: http://www.sott.net/article/302911-Sott-Exclusive-Full-unedi...

ed_blackburn 1 day ago 3 replies      
I anticipate a lot of bluster: the perpetrators perhaps being named, but ultimately nothing happening. A lot of nations do not recognise courts higher than their own sovereignty, so they will not extradite, especially if they deem the charge to be politically motivated, right or wrong. Russia will contest this report, and any other.

Basically, someone messed up with a borrowed bit of heavy kit and a tragedy ensued. There will be no justice for the victims or their families. Poor bastards; one can only hope the report is accurate and any suffering was brief.

SuddsMcDuff 1 day ago 3 replies      
I find it really fascinating that they've gone to such lengths to literally recreate a large portion of the plane from the wreckage in order to better understand what happened.

I can't help but think though, why weren't similar measures taken with Flight 93 (Pennsylvania, 2001-09-11) or Flight 77 (Pentagon, 2001-09-11)? I don't wish to allude to any of the many conspiracy theories, but I do find it interesting to see how a "real" crash investigation is done, as opposed to what we've been told about 9-11.

Playboy to Drop Nudity as Internet Fills Demand nytimes.com
151 points by stanleydrew  1 day ago   95 comments top 24
masterponomo 1 day ago 0 replies      
For boys growing up in the 60's and 70's, a lot of time and effort was spent trying to get hold of a Playboy or even just a few pages. Imagine a whole neighborhood of boys playing "ditch 'em" (a wide-ranging and violent version of hide 'n' go seek) wherein the "seekers" would while away the countdown time by gathering at one player's house to peruse his dad's magazine collection--and manage to put it back EXACTLY as it was found lest the treasure trove be locked away. Imagine a middle school music class where an ancient radiator vent in the back of the room was known as a reliable drop point for a stash of pictures, necessarily folded and ripped from being alternately jammed into and removed from the hiding place. Oh well, what we were looking for (and so much more) is blasé now. Times do change.
jusben1369 1 day ago 2 replies      
This is the key part to me: "The company now makes most of its money from licensing its ubiquitous brand and logo across the world (40 percent of that business is in China) even though the magazine is not available there". The actual magazine is kind of like Apple's desktop products. It's a direct link to their heritage and the original reason people were drawn in, but it's not where the real money is made anymore, so changes are fairly trivial to the overall business.
lvspiff 1 day ago 1 reply      
Recently, Playboy seemed to turn into more of an outlet for Photoshop skills than actual tasteful photography. It has been, and hopefully remains, a source for reporting and interviews that are intriguing and somewhat hard-hitting due to their uncensored nature. If it turns into yet another Maxim, however, it's going to continue to stall. Playboy does have a place in the magazine world, and it's good to hear Hef is open to letting it evolve.
aezell 1 day ago 2 replies      
This has nothing to do with gender issues, sexuality issues, societal changes, or Internet ubiquity. This is simply a flailing attempt to get press for a moribund flagship publication that is likely dragging down a profitable brand. This is a money and marketing decision and nothing more. Like New Coke, I can foresee Playboy returning to nude photos with some huge star in a few years.
adav 1 day ago 1 reply      
Earlier this year, a British tabloid called 'The Sun' dropped its infamous Page 3. This news immediately made headlines and everyone was talking/joking about it. A couple of days later, Page 3 returned and everyone realised that they'd been played; it was a great advertising ploy. The media forgot that it was The Sun's sister paper The Times that made the original, brilliantly sneaky, announcement! (Murdoch/News International)

I predict the same will happen in this Playboy's case...


pratyushag 1 day ago 5 replies      
Playboy should relaunch itself as a women's empowerment magazine. There is a strong sense of empowerment for women in posing nude, and Playboy helped bring about this revolution (or can take some credit for it). We don't need another Vice, but we do need a magazine that interviewed the likes of Martin Luther King and Malcolm X to do interviews with women leaders. It should be remembered that Christie Hefner directed the company for much of its history and was instrumental in its development. In many ways, Playboy is not inherently sexist or undermining of women (do we consider a nude men's magazine to undermine men? No, so the argument carries itself quite well, I think).
tyre 1 day ago 1 reply      
This is about more than nudity becoming ubiquitous. Playboy's brand now relies much more on branded goods than pornography.

Quartz has a decent overview here: http://qz.com/522672/china-not-online-porn-is-why-playboy-is...

AndrewKemendo 1 day ago 3 replies      
Vice is what playboy should have turned into 10 years ago.
kin 1 day ago 1 reply      
I subscribe to Playboy. Among all my magazines, while their issues do have the occasional great article, the issues are just really thin, and other magazines more often have better reads. What they do have is a brand. Getting rid of nudity may increase subscriptions, which may increase ad space, but they're also getting rid of the main reason many people subscribe to Playboy. We'll just have to wait and see if they can execute and deliver on what their brand demands of them. Pretty big gamble, IMO.
jerf 1 day ago 0 replies      
A good chunk of the relationship between 20th and 21st centuries captured in one headline. Beautiful.
_delirium 1 day ago 2 replies      
This bit is interesting, if the claimed causation is true:

In August of last year, its website dispensed with nudity. As a result, Playboy executives said, the average age of its reader dropped from 47 to just over 30, and its web traffic jumped to about 16 million from about four million unique users per month.

BurningFrog 1 day ago 0 replies      
"We only publish it for the articles."
brento 1 day ago 1 reply      
> You're now one click away from every sex act imaginable for free. And so it's just passé at this juncture.

Just my personal opinion, but this is a very sad truth about our society and how easy it is to see porn.

fricken4 1 day ago 0 replies      
Back in the 90s, when the print industry was still a thing, my canned statement whenever I passed a magazine rack was that Playboy should really cover up the nipples so it could sit at the front of the rack and compete with frat-boy mags like Maxim, Loaded, and FHM. Playboy had great content and a great culture built up of esteemed contributors; it's a shame legacy pride kept them from making it to the next generation. I'm skeptical there's much left to salvage at this stage.
Animats 1 day ago 0 replies      
It's surprising that Playboy, as a print magazine, is still around.

The main competitor, Penthouse, is owned by Andrew Conru, the guy who did Adult FriendFinder. He's probably the most successful spammer, after beating California's anti-spam law in 2002.[1] Conru tried to buy Playboy a few years ago, but Hefner wouldn't sell.[2]

[1] http://www.dmnews.com/news/california-spam-case-appealed-to-...[2] http://www.thewrap.com/media/column-post/penthouse-owner-mak...

xlm1717 1 day ago 4 replies      
Soon Playboy will see that, no, nobody was getting Playboy for the articles.
mickgiles 1 day ago 0 replies      
I only read it for the articles anyway
joelx 1 day ago 4 replies      
Playboy produces well written articles now, but will their business model survive removing their original reason for existing?
eggnet 1 day ago 0 replies      
I'm not sure how they can pull this off without rebranding. State, ISP, and end-user device/software filters exclude Playboy by name, right?

How can you distinguish between the old rated-X Playboy and the new PG Playboy?

werber 1 day ago 0 replies      
I really thought that when they inevitably changed their "featured content" it was going to include naked men, like a grown-up (and less hipster) version of Ryan McGinley's work.
shade23 1 day ago 0 replies      
And for the first time, I see something on HN after I saw it on Facebook.
smacktoward 1 day ago 1 reply      
The Internet may have provided the final blow, but it wasn't what killed Playboy. What killed Playboy was success.

The first issue of Playboy hit the stands in December of 1953. The magazine espoused a philosophy that was pretty radical for that time, namely that sex was not just not bad but actually good and fun and something everyone should be doing. It was "sex-positive" in a time when literally nothing else in the culture was.

This boldness served them well from the '50s to the mid-'60s, as the rest of the culture slowly started to come around to the same point of view. The problem is that by the late '60s the culture had reached the same point that Playboy had, and it didn't stop there -- sexual liberation kept galloping on, reaching points that were far beyond anything Playboy had ever advocated for. By the early '70s, for instance, American culture was so saturated with sex that pornography was just another part of the cultural landscape (see https://en.wikipedia.org/wiki/Golden_Age_of_Porn). Playboy had agitated against the censorship laws that had kept such films out of general circulation; but when those laws disappeared, it had nothing to say about the results.

Playboy, in other words, eventually got lapped by the changes it helped to create. The culture galloped along, but the magazine didn't do anything to keep up, so its relevance slipped and eventually tumbled. It was still selling a Mad Men view of sexuality in a world where divorce and cohabitation and premarital sex had all become part of everyday life. In the 1950s, Playboy was daring and edgy; by the 1970s, it was positively quaint. And while porn can be a lot of things, one thing it can't be is quaint. Nobody ever got turned on by something that was 100% safe, 100% familiar.

So Playboy started dying way back then, when it gave up its claim on cultural relevance, and the story of the decades since has just been the slow playing out of the inevitable.

homulilly 1 day ago 0 replies      
This makes some sense, as the only demographic group still interested in nudie mags is too young to legally buy them.
tiatia 1 day ago 4 replies      
Playboy is made mostly by women. I assume it is hard to think the whole day about "What do men want?"

It was more Photoshop than photos. They published naked pictures of Marge Simpson. I mean, really?

This is how it can be done differently:

[NSFW] https://www.amypink.com/de/

All the UML you need to know bsu.edu
175 points by CaRDiaK  1 day ago   102 comments top 18
m_fayer 1 day ago 8 replies      
My first real-world job in the industry was in a large-ish shop that worked as follows:

Create a full-detail schematic of the system in version-controlled UML.

At some point, "deploy" the UML by printing it into a 4 cm-thick binder of paper, then distribute these binders to the head architects.

Iterate on the UML until the architects are happy. (The architects spent many years trying to auto-generate code from the UML diagrams and have the results "fleshed out" by lowest-bidder consultants, though this never really worked. Their stated goal was to no longer have to write any code in house, but rather nothing but UML.)

Begin implementing the system in house with auto-generated code from the binder-of-UML as a baseline, after the lowest-bidder consultants had failed.

Quickly get into big fights between the coders-on-the-ground and management when it was found that the UML diagrams contained major architectural flaws and the UML-phase would not, actually, account for 80% of the project's duration and budget. Needless to say, more than half of the projects failed entirely.

This experience nearly made me leave the industry, before I discovered that there was plenty of software being written in a saner way. This was more than a decade ago, but to this day, just seeing UML diagrams turns my stomach.

putzdown 1 day ago 0 replies      
No, all the UML you need to know is this: (1) draw and label a box for each class; (2) draw an arrow from one class to another to show dependency; (3) draw a different kind of arrow from one class to another to show inheritance; (4) [bonus material, for super-geniuses like you] use regex-style symbols *, +, 1 and suchlike to mark the ends of dependency arrows in order to indicate when you have one-to-one or one-to-many relationships and so forth.

There. My 20+ years of experience in software architecture in various fields from games to networking tells me that you now know enough to work out the classes and their relationships in a large software system.

Don't fuss around with "aggregation" or "composition" or whatever. Don't spell out functions (though occasionally I'll jot one below a line to remind myself what the salient feature of the dependency is). And by no means write the class properties, their types, or their access specifiers (public, protected...): this is way too much detail. A UML diagram is useful in modeling broad object relationships in a system. If you want to work out what properties a class should have, write the damn class. Any software developer worth his salt can figure out the code from a high-level diagram; don't write the code for him. Or do, but then don't call it an architectural diagram.

I know there's a whole culture of software development where architects design code but don't dirty their hands with writing it, then hand it off to underlings who type it up for them, and so on down some kind of techno-bureaucracy Great Chain of Being. Rubbish. Code architecture is a thing and some kind of diagramming is helpful, but UML as such is the sort of busywork and IRS-style hierarchism that marks bloated government jobs, not real productivity or real teamwork.

Give UML a miss and use something very, very simple.

cjg 1 day ago 14 replies      
I've generally found UML to be a complete waste of time.

I'd rather outline the major components of a system by drawing (on real paper) simple boxes and lines, or write the code that implements the system.

Not sure what code-as-picture achieves - it generally has worse tooling (less editable, less versionable, etc.) and tends to be used by 'architects' who don't write code, only for that UML to be essentially ignored by the coders on the ground.

jasode 1 day ago 0 replies      
Diagrams/notation I've found useful:

++ E-R (entity-relationship) diagrams. I find it easier to look at boxes for each table and follow the lines signifying relationships to other boxes. The "crow's feet" can signify one-to-many. The diagram is easier than reading a sequential list of SQL CREATE TABLE statements, making a mental note of "FOREIGN KEY" strings, and mentally backtracking to the parent table.

++ swim lanes to show how the "state" of a system is supposed to change with "time". This can succinctly show how data "flows" without having to actually run a program and watch variables in a debugger to manually recreate the "swim lane" in your head.

++ truth tables to summarize combinations of valid/invalid business rules and associated side effects. A grid is easier than parsing a long (and often nested) list of if/then/else/switch statements.
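
A minimal Python sketch of the idea (the business rules and names here are invented for illustration): the nested branches collapse into a literal grid you can read row by row.

```python
# Hypothetical order-handling rules as a decision table:
# (in_stock, paid, restricted) -> outcome.
# Each row reads like one line of the grid, instead of a branch
# buried inside nested if/then/else statements.
RULES = {
    (True,  True,  False): "ship",
    (True,  True,  True):  "hold_for_review",
    (True,  False, False): "await_payment",
    (True,  False, True):  "hold_for_review",
    (False, True,  False): "backorder",
    (False, True,  True):  "reject",
    (False, False, False): "reject",
    (False, False, True):  "reject",
}

def decide(in_stock: bool, paid: bool, restricted: bool) -> str:
    # Every combination is listed explicitly, so a missing row
    # (an unhandled case) raises KeyError instead of silently
    # falling through a forgotten else-branch.
    return RULES[(in_stock, paid, restricted)]
```

Adding a rule is adding a row, and the exhaustive table makes missing combinations obvious, which is exactly the "grid is easier than nested if/else" point.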

As for UML, the notation never seems to be that succinct or helpful to me. On the surface level, it seems that UML (for code) should have the same return-on-investment as E-R (for db tables) but it doesn't in my experience.

I also wonder if there is a cultural component to UML usage. It doesn't seem like tech companies (such as Microsoft/Google/Amazon/Ubisoft/etc) internally use UML as a prerequisite step for building systems. On the other hand, I could see more UML usage at non-tech companies (such as banks/manufacturing/government) building line-of-business CRUD apps. Grady Booch (Booch & UML notation) did consulting about software methodology at non-tech companies so that may have been a factor.

edpichler 1 day ago 0 replies      
Plot twist: this article doesn't have all the UML you need to know; it's just the class diagram.

For my personal projects in particular, I use the use-case diagram to map the requirements and features my application will have, together with my prototypes. Other diagrams, like the class diagram, I usually use just to map the domain before developing the persistence layer. This is how all my projects start, even if I am working alone. It works well for me, and it's part of my creative process.

jmartinpetersen 1 day ago 2 replies      
Key quote: "Keep in mind that UML is a communication tool, and you can omit details that are not necessary for expressing your message."
skrebbel 1 day ago 1 reply      
The complete overapplication of UML for many years gave UML an undeservedly bad name. The top comments in this thread are testament to that: many programmers are simply all too happy to go "haha! UML! that's for enterprise losers in suits who prefer paper over working code!"

Thing is, the core elements of UML are very useful in communicating a design or an idea. Class diagrams are a great way to discuss an OO-ish codebase in front of a whiteboard (or any data model, really). When you do that, it really helps when everybody knows that an arrow in static UML diagram types means "dependency" and not "the data flows from here to there".

Similarly, I still haven't seen a better way to visualise state than with a UML state chart.
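
As a rough illustration of why state charts visualise so well (the turnstile states and events below are invented, not from the comment): the content of a state chart is just a transition relation, and the diagram draws this table as boxes and labelled arrows.

```python
# A hypothetical turnstile state chart, flattened into a table:
# (current_state, event) -> next_state.
# The UML state chart is the same relation drawn as two boxes
# ("locked", "unlocked") with labelled arrows between them.
TRANSITIONS = {
    ("locked",   "coin"): "unlocked",
    ("locked",   "push"): "locked",
    ("unlocked", "push"): "locked",
    ("unlocked", "coin"): "unlocked",
}

def step(state: str, event: str) -> str:
    # Undefined (state, event) pairs raise KeyError, mirroring a
    # chart where no arrow leaves that state for that event.
    return TRANSITIONS[(state, event)]
```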

It's also very nice if you can draw a UML object diagram that people understand (looks like a class diagram, except you basically draw a hypothetical runtime situation of instantiated classes and you underline the object identifier names). This works best when people understand that the picture on the left is a class diagram (design time) and the one on the right is an object diagram (runtime example) of the same classes. This is not complicated stuff, but it doesn't really work as well when half the team thinks UML is for losers.

Now, bear with me, I'll be the first to agree, UML is a bloated piece of shit. Package diagrams, wtf, who needs that? Use case diagrams that show which use cases are specified, instead of how the use cases go - seriously? Activity diagrams so you can draw a 5-line method on an entire sheet of paper, big fucking what the hell were you guys thinking?? Why do I even know what this stuff is? What a waste of time - even the decent diagram types have 60% bullshit syntax and only 40% useful stuff. And message sequence charts are nice enough for protocols but impossible to draw right.

But to dismiss UML just because some enterprise architects went a little overboard in 2002 is a bit like dismissing all of OOP because 15-level inheritance hierarchies used to be hip.

I wish we could agree on a tiny subset of UML that actually makes sense, and all learn that. This post makes a good start for class diagrams, although IMO even the ball-and-socket notation is overblown nonsense from a time long gone. Maybe we should do this, and give it a separate name.

On a mildly related note, one thing I like about OOP is that you can draw pictures of it easily. Does anyone here know of a good way to visualize functional code structure? You can draw a dependency chart of modules of functions but that only gets you so far.

mark_l_watson 1 day ago 0 replies      
Too bad he skipped sequence diagrams, saying that he had already covered them in class.

Years ago I co-authored a book on UML, but the only UML diagrams that I still use are sequence diagrams which I think are great for explaining interactions between objects or separate services.

omellet 1 day ago 0 replies      
This should have linked to a blank page.
vortico 1 day ago 1 reply      
Yup, even if you never write OOP code, this is a somewhat common language you will encounter between colleagues, so it is worth knowing exactly what is on this page.
lisper 1 day ago 0 replies      
Graphical representations are useful for representing spatial relationships because then you can take advantage of the inverse-GPU in the human visual cortex to do a lot of computation for you. But software doesn't have spatial relationships because it doesn't exist in space. So trying to represent software concepts graphically is generally doomed to fail. There are a few exceptions, like code indentation, but there's a reason that flowcharts aren't used much any more.
saiki 1 day ago 1 reply      
Personally, I use sketch-like UML quite a lot and find it very helpful for clarifying complex environments or ideas that are not yet clear. Most of the time the sketches are just boxes, circles, and lines, but those communicate and clarify the problem for others very well. I don't spend a lot of time on a sketch; a system can be described graphically very quickly, just to get the idea out, or else the sketch gets too detailed. We have also built a tool that helps with sketching systems with remote teammates (https://sketchboard.me).
makecheck 1 day ago 0 replies      
Just think of UML as the C++ of diagrams: it is sometimes the best way to produce the result that you need but you have to choose a sane subset.
DanielBMarkham 1 day ago 3 replies      
I love UML. UML is overused. More people need to know UML. The less UML you do, the better.

I believe all of these statements to be true.

I had a contract many years ago with a large insurer. Their development process basically consisted of drawing really complex UML diagrams, then hitting the Big Red Button and having the modeling tool generate 40,000 lines of framework code. The chief architect explained to me that really the only work required was just a tiny bit of business logic in the appropriate places.

Fortunately I was not part of the main dev team, which for some strange reason (at least in the lead architect's mind) had the damnedest time with this system. My job was to create an internal permissions system. Given app X, user Y, and action Z, was the action allowed or not.

I looked at the problem for a while, and no matter how I thought about it, to me I had three lookup tables and one method. Boom, I'm done.
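
A minimal Python sketch of that "three lookup tables and one method" shape (all table contents and names are invented for illustration; the comment doesn't give the original system's details):

```python
# Hypothetical permissions check: given app X, user Y, action Z,
# is the action allowed? Three lookup tables and one method.
APP_ROLES = {"billing": {"admin", "clerk"}, "reports": {"admin", "analyst"}}
USER_ROLES = {"alice": {"admin"}, "bob": {"clerk"}}
ROLE_ACTIONS = {"admin": {"read", "write"}, "clerk": {"read"}, "analyst": {"read"}}

def is_allowed(app: str, user: str, action: str) -> bool:
    # A user may act when one of their roles is enabled for the app
    # AND that role grants the requested action.
    roles = USER_ROLES.get(user, set()) & APP_ROLES.get(app, set())
    return any(action in ROLE_ACTIONS.get(role, set()) for role in roles)
```

Which rather underlines the comment's point: the whole thing fits in a dozen lines, no 40,000 lines of generated framework required.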

The lead architect wanted me to still draw a diagram with one class, push the button, and get the 40,000 lines of code. For some reason, this did not appeal to me.

Took me about 3 weeks to convince him that really 20 lines or so of code was all we needed. I still had to draw the diagram, though.

That's the horror story -- one among dozens I have. But on the flip side, I've been with teams that interviewed the customer while sketching out a domain model. Since we all understood UML, a quick and lightweight sketch using the appropriate notation got agreement on a ton of detail just taking 30 minutes or so. That would have been very difficult using a conversation or word processor. Sketching without some common lightweight understanding could have led to rookie errors.

There is nothing in this world better for getting quick technical agreement on complex problems than group sketching using lightweight UML. The trick is sticking to the absolute minimum.

cubano 1 day ago 0 replies      
I've never figured out how to use UML abstractions to fix client and/or user production bugs, or met anyone else who could, so who the hell has the time?
agentgt 1 day ago 0 replies      
I'm just curious, and I'm not trying to be snide, but is there any spec that OMG has produced that people actually like and still use? (They are also the makers of CORBA.)

I live in the Boston Rt. 128 area and pass OMG's building all the time, and I just have no idea how they are still in business (they are near TripAdvisor's awesome new building).

I wonder how many massive companies continuously donate to OMG and do not realize it.

crocal 1 day ago 0 replies      
All the UML I need to know? Frankly? Nothing. I am surprised people would still consider this thing useful.
CrLf 1 day ago 0 replies      
The general feeling about UML is that it's overkill for most projects and actively harmful when used to generate code. On the other hand, most agree that parts of UML are useful as communication tools.

However, UML was designed as a standard; near-UML is not UML. Ergo, UML is useless.

I feel better already.

Running Swift code on Android goyet.com
186 points by Ecco  12 hours ago   59 comments top 13
e28eta 12 hours ago 1 reply      
A coworker showed me this project, which does the same but also includes .NET and regular Java: http://elementscompiler.com/elements/silver/

Looking at their github, they appear to be making progress against re-implementing the Swift core library by binding to native classes with similar functionality. Like Arrays: https://github.com/remobjects/SwiftBaseLibrary/blob/master/S...

jevinskie 12 hours ago 1 reply      
Have you thought about extracting the XAR archive of LLVM bitcode from Xcode 7.0+'s libswiftCore.dylib's __LLVM,__bundle section and linking your NDK project against that? You'll probably quickly run into linking dependencies outside of libswiftCore but perhaps you could A) stub out those missing dependencies B) build them from opensource.apple.com's CF, libobjc, etc projects or C) try and extract the __LLVM,__bundle's of the dependencies themselves. Rinse and repeat until everything works? :)
autoreleasepool 12 hours ago 1 reply      
I'm excited to see what happens in the Linux and BSD space when they finally open-source Swift. I hope it finds an audience excited enough to help reimplement some of the functionality that will be missing due to the lack of Apple APIs.

It would be great to see an open source ecosystem for Swift, written in Swift on all *nix platforms.

pjmlp 10 hours ago 2 replies      
> Generally speaking, the NDK makes sense only for a small percentage of apps, so in the general case Google advise against trying to write a whole Android app using the NDK.

Yep, it is easier to make languages target DEX than suffer the pain of the NDK.

They not only advise against it, they make it really uncomfortable for us to use it (vs iOS and WP experience).

nashashmi 11 hours ago 1 reply      
> Namely, instead of generating machine code targetting a specific architecture, LLVM generates assembly code for an imaginary machine, and then converts that intermediate representation to actual code for whichever architecture we're interested in.

Why is that not possible for real-world languages? E.g., have one language which would include the specifics and complications of every language in the world, and use it as an intermediary for translating back and forth between any of them.

randyrand 9 hours ago 0 replies      
Why can't they just compile the Swift core library into the executable? I imagine some things are architecture-specific, like launching new threads (iOS calls down to Objective-C code, which then handles the system call, etc.). I imagine the problem is that you can't compile just the non-architecture-specific parts. Unless you wrote some code to strip out those parts... hmm, is that possible?

Also, how come it doesn't look like they had to provide the path to the runtime? Built into the compiler I guess?

Alupis 12 hours ago 2 replies      
Just curious why someone would want to do this?

According to the article, you would have to use the NDK to make this work and...

> And of course since we're missing the SwiftCore library this is restricted to a small subset of Swift.

So, it seems more of an academic exercise than a practical thing?

AndrewKemendo 12 hours ago 0 replies      
This is really cool! Also interesting because of Microsoft's efforts to make iOS apps cross compatible with Windows through what I would expect is a similar process of cross-compiling and porting libraries.
tpaksoy 11 hours ago 2 replies      
Having to deal with mangled symbols is pretty annoying. Is it not possible to give a directive to the Swift compiler to disable mangling, the way Rust's #[no_mangle] attribute does?

alediaferia 12 hours ago 1 reply      
This is an awesome resource. Thank you for sharing.
arsalanb 10 hours ago 0 replies      
Wow, sorry for going slightly off on a tangent, but this is really cool. Imagine if all products had a single syntactic guide to follow: the underlying tech may differ, but there would be a uniform process for creation. There are a million flaws with this, but it's just a thought.

And can I just add: WOOOOOO!! Okay. Sorry.

maykr 9 hours ago 0 replies      
Actually, you could build an Android command-line executable by using the NDK target BUILD_EXECUTABLE. That approach is, IMHO, clearer for demo purposes, since it leaves the JNI stuff out of scope.
annacollins 10 hours ago 0 replies      
Presently I am a student of the Swift language, and always in the back of my mind is the tedious thought that one day I will have to learn how to do all this for Android. Having a compiler do it for you makes sense. It would be a real blessing.
Out of the Darkness: How two psychologists and CIA devised torture program aclu.org
149 points by geetee  1 day ago   59 comments top 12
aqme28 1 day ago 3 replies      
'By early 2002, the CIA, the Justice Department, and the National Security Council were debating whether the legal and humanitarian protections of the Geneva Conventions would apply to captives suspected to be members of al-Qaida or the Taliban. After weeks of debate, and over objections from the State Department, President George W. Bush ultimately issued the final word on the matter. In a February 2002 memo, he stated that al-Qaida and Taliban detainees were not protected by the Geneva Conventions.'

This is one of the roots of the problem. Once you have a class of people without rights, you can arbitrarily identify people as part of that class and they have no chance at due process. Unless absolutely everyone has basic legal protections, no one has them.

SCAQTony 1 day ago 3 replies      
Time for war crime trials.

Blatant violation of Article 2, United Nations Convention against Torture, United States Signatory 18 April 1988, ratified 21 October 1994.

To be proven by ACLU:

"...For more than a month, Suleiman endured an incessant barrage of torture techniques designed to psychologically destroy him. His torturers repeatedly doused him with ice-cold water. They beat him and slammed him into walls. They hung him from a metal rod, his toes barely touching the floor. They chained him in other painful stress positions for days at a time. They starved him, deprived him of sleep, and stuffed him inside small boxes. With the torture came terrifying interrogation sessions in which he was grilled about what he was doing in Somalia and the names of people, all but one of whom hed never heard of. ..."

numlocked 1 day ago 3 replies      
The actual 'torture report' by the Senate intelligence committee is worth reading at least in part (it's long...) and goes into some good detail about how the psychologists were recruited and operated.


Take an hour or two to read the executive summary at the beginning and you will be better informed than just about anyone. Going to the primary sources is easy enough, and it's really enjoyable (a weird word in this context) to form opinions based on the rawest information available. Of course the actual report is a political document in itself, but that aspect is as much a part of the coverage of the report as the contents themselves.

deutronium 1 day ago 0 replies      
"Today, Abu Zubaydah is imprisoned at Guantnamo. He continues to suffer as a result of the torture. He has permanent brain damage. He suffers from searing headaches, sensitivity to noise, and seizures. He cant recall his fathers name or his own date of birth."

-- That is simply fucking abhorrent.

phren0logy 1 day ago 1 reply      
Small, but important point:

There are two APAs:

1. American Psychological Association - PhD/PsyD Psychologists - (torture scandal)

2. American Psychiatric Association - MD/DO Physicians - (unambiguously opposed to medical involvement in torture)

People get psychiatrists and psychologists confused all the time, and the distinction here is really, really important.

Bias: I'm a psychiatrist, and I am proud that our professional organization has been clear from the start that torture is unacceptable.

simon_ 1 day ago 8 replies      
I am always annoyed by the argument that torture is unacceptable AND it doesn't work. It's not totally implausible that this is true, but I think it's very likely that torture "works" in some sense and certainly everyone practicing it expects it to work.

So... for an anti-torture position to have some meat to it, you have to make it clear that you think torture is unacceptable even when it does work.

If you're not willing to sacrifice real lives and safety to avoid torture, I don't think you are meaningfully against it, and you certainly stand no chance of persuading those convinced of its efficacy.

gnu8 1 day ago 3 replies      
How do these guys still exist? Anyone who has the opportunity to do so should damage these guys. Turn off their accounts, refuse to sell them goods or services, anything not required by law.

By no means am I suggesting anything illegal or dangerous. But it's appalling that James Mitchell and Bruce Jessen are allowed to participate in society.

If you are an engineer or executive at any company, you have a duty to check for accounts belonging to these people and turn them off. If they subscribe to your services, their money is no good. If they want to buy products, they are not for sale. No credit cards, no bank accounts, no cellular phones.

They don't belong in our society and anyone in a position to eject them ought to do so.

S4M 1 day ago 0 replies      
Reading this article, I was wondering about the logistical costs of this torture (on top of the fees paid to Mitchell and Jessen). Wouldn't it have been more efficient to arrest the suspected men and try to have an honest conversation with them, along the lines of: "We suspect that you are part of a terrorist network that wants to harm us. If you speak, we can protect you and your family; if you don't, we will detain you for a couple of weeks, without harming you, to see if you change your mind, and then we will simply release you."

It may sound too innocent, but how can anyone trust information given by someone who has been driven crazy? If Suleiman forgot his father's name due to the torture, he may just as well have forgotten crucial details of the terrorist plot he was supposedly part of.

And on a totally unrelated note, please disable the autoplay of the videos when scrolling down to them. I think this is one of the most annoying misuses of JavaScript I have ever come across.

e12e 1 day ago 0 replies      
Btw, for those who aren't aware, it might be timely to mention the legendary German interrogator, Hanns Scharff:


as he is often brought up in discussions on interrogation techniques.

leroy_masochist 1 day ago 0 replies      
Compelling article. I think the extended interview Jon Stewart did with John Yoo is another good resource if you want the perspective of the people who authorized this. I don't mean that as an endorsement of what happened...


sneak 1 day ago 3 replies      
It remains a tremendous national shame that there has been absolutely no movement toward prosecution for those who perpetrated this incredible regression from civilization.
xer0x 1 day ago 0 replies      
Wow. Dark.
       cached 15 October 2015 04:11:02 GMT