Hacker News with inline top comments, 16 Jan 2014 (Best)
1
Show HN: Kimono - Never write a web scraper again (kimonolabs.com)
612 points by pranade  18 hours ago   203 comments top 82
1
randomdrake 17 hours ago 5 replies      
The presentation is beautiful and the website is great, but the tech broke so I have no idea how or if this even works. This is a wonderful concept and one I've talked about doing with others. I was really excited to try this. I watched the demo video and it seemed straightforward.

I went to try and use it on the demo page it provides, going through and adding things, but when I went to save it, I just received an error that something went wrong. Well, crap. That was a waste of time. Oh well, maybe it's just me.

Alright, I'll give it another shot using the website they used in the demo. Opened up a Hacker News discussion page and started to give it a try. Immediately it was far less intelligent than the demo. Clicking on a title proceeded to select basically every link on the page. Somehow I clicked on some empty spots as well. Nothing was being intelligently selected like it was in the demo. Fine, that wasn't working tremendously well, but I wanted to at least see the final result.

Same thing: just got an error that something went wrong and it couldn't save my work.

Disappointing. I still might try it again when it works 'cause it's a great idea if they really pulled it off. So far: doesn't seem to be the case.

2
DanBlake 16 hours ago 1 reply      
Show me it working with authentication and you will have a customer. Scraping is always something you need to write because the shit you want to get is only shown when you are logged in.
3
dunham 16 hours ago 3 replies      
The Simile group at MIT did something similar back around 2006. Automatic identification of collections in web pages (repeated structures), detection of fields by doing tree comparisons between the repeated structures, and fetching of subsequent pages.

The software is abandoned, but their algorithms are described in a paper:

    http://people.csail.mit.edu/dfhuynh/research/papers/uist2006-augmenting-web-sites.pdf
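For readers curious what detecting "repeated structures" can look like in practice, here is a rough sketch in Python with Beautiful Soup. This is not the Sifter/Simile algorithm from the paper, just an illustration of the core idea: group sibling elements by a crude structural signature and treat large groups as collections. The function names and the repeat threshold are my own.

    # Rough sketch: find repeated sibling structures by comparing tag "shapes".
    from collections import Counter
    from bs4 import BeautifulSoup

    def shape(el):
        """Crude structural signature: the tag name plus its direct children's tags."""
        return (el.name, tuple(child.name for child in el.find_all(recursive=False)))

    def repeated_collections(html, min_repeats=3):
        soup = BeautifulSoup(html, "html.parser")
        collections = []
        for parent in soup.find_all(True):
            children = parent.find_all(recursive=False)
            counts = Counter(shape(c) for c in children)
            for sig, n in counts.items():
                if n >= min_repeats:
                    collections.append([c for c in children if shape(c) == sig])
        return collections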

4
georgemcbay 17 hours ago 1 reply      
I've written more web scraping code than I care to admit. A lot of the apps that ran on chumby devices used scraping to get their data (usually(!) with the consent of the website being scraped) since the device wasn't capable of rendering html (it eventually did get a port of Qt/WebKit, but that was right before it died and it wasn't well integrated with the rest of the chumby app ecosystem).

This service looks great, good work! But since you seem to host the created APIs, how do you plan to get around the centralized-access issue? On the chumby we had to do a lot of the web scraping on the device itself (even though the string processing needed for scraping required a lot of hoop-jumping optimization to run well in ActionScript 2 on a slow ARMv5 chip with 64MB total RAM) to avoid all the requests coming from the same set of chumby-server IP addresses. Companies tend to notice lots of requests coming from the same server block really quickly and will often rate limit the hell out of you, which could mean one heavy-usage scraper destroys access for every other client trying to scrape the same source.

5
GigabyteCoin 11 hours ago 1 reply      
I'm curious how you plan to avoid or circumvent the inevitable hard IP bans that the largest (and most sought-after) targets will place on you and your services once you begin to take off.

I could have really used a service like this just yesterday, actually. I ended up fiddling around with iMacros and got about 80% of what I was trying to achieve.

6
hcarvalhoalves 17 hours ago 3 replies      
This is excellent. Even if it doesn't work for scraping all sites, it simplifies the average use case so much that it's not even funny.

Feature proposal: deal with pagination.

7
shamsulbuddy 28 minutes ago 0 replies      
Is such web scraping legally allowed? Since it isn't done directly from our servers, if the scraped website takes any legal action, will it fall on Kimono Labs or on the user?
8
fsckin 16 hours ago 1 reply      
Constructive Tone: I figured that it might be nifty to scrape cedar pollen count information from a calendar and then shoot myself an email when it was higher than 100 gr/m3.

This would be a pretty difficult thing to grab when scraping normally, but the app errors before loading the content:

https://www.keepandshare.com/calendar/show_month.php?i=19409...

JS error: An error occurred while accessing the server, please try again. Error Reference: 6864046a
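For what it's worth, the alerting half of this is small once the number can be scraped at all. A minimal sketch, assuming some page and CSS selector that expose the count; the URL, selector, addresses and SMTP host below are placeholders, not real values.

    import smtplib
    from email.message import EmailMessage

    import requests
    from bs4 import BeautifulSoup

    THRESHOLD = 100  # gr/m3

    def fetch_pollen_count(url, selector):
        html = requests.get(url, timeout=30).text
        cell = BeautifulSoup(html, "html.parser").select_one(selector)
        return float(cell.get_text(strip=True))

    def send_alert(count):
        msg = EmailMessage()
        msg["Subject"] = f"Cedar pollen count is {count} gr/m3"
        msg["From"] = "alerts@example.com"
        msg["To"] = "me@example.com"
        msg.set_content("Stay indoors today.")
        with smtplib.SMTP("localhost") as smtp:
            smtp.send_message(msg)

    count = fetch_pollen_count("https://example.com/pollen-calendar", ".pollen-count")
    if count > THRESHOLD:
        send_alert(count)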

9
sync 17 hours ago 0 replies      
Undo button is awesome.

More web apps need an undo button.

10
thinkzig 11 hours ago 1 reply      
Great work so far. The tool was very intuitive and easy to use.

My suggestion: once I've defined an API, let me apply it to multiple targets that I supply to you programmatically.

The use case driving my suggestion: I'm an affiliate for a given eCommerce site. As an affiliate, I get a data feed of items available for sale on the site, but the feed only contains a limited amount of information. I'd like to make the data on my affiliate page richer with extra data that I scrape from a given product page that I get from the feed.

In this case, the page layout for all the various products for sale is exactly the same, but there are thousands of products.

So I'd like to be able to define my Kimono API once - let's call it the CompanyX.com Product Page API - then use the feed from my affiliate partner to generate a list of target URLs that I feed to Kimono.

Bonus points: the list of products changes all the time. New products are added, some go away, etc. I'd need to be able to add/remove target URLs from my Kimono API individually as well as adding them in bulk.

Thanks for listening. Great work, again. I can't wait to see where you go with this.

Cheers!
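Independent of whatever API Kimono ends up exposing, the workflow described above boils down to defining one extraction and looping it over the URLs in the affiliate feed. A minimal sketch; the CSS selectors and the 'url' column in the feed are assumptions for illustration, not details of any real feed.

    import csv

    import requests
    from bs4 import BeautifulSoup

    def extract_product(html):
        # One extraction definition, reused for every product page.
        soup = BeautifulSoup(html, "html.parser")
        return {
            "title": soup.select_one("h1.product-title").get_text(strip=True),
            "price": soup.select_one(".price").get_text(strip=True),
        }

    def enrich_feed(feed_path):
        with open(feed_path, newline="") as f:
            for row in csv.DictReader(f):  # assumes the feed has a 'url' column
                html = requests.get(row["url"], timeout=30).text
                row.update(extract_product(html))
                yield row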

11
bambax 8 hours ago 1 reply      
> Web scraping. It's something we all love to hate. You wish the data you needed to power your app, model or visualization was available via API. But, most of the time it's not. So, you decide to build a web scraper. You write a ton of code, employ a laundry list of libraries and techniques, all for something that's by definition unstable, has to be hosted somewhere, and needs to be maintained over time.

I disagree. Web scraping is mostly fun. You don't need "a ton of code" and "a laundry list of libraries", just something like Beautiful Soup and maybe XSLT.
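As a point of reference, a Beautiful Soup scraper for something like the HN front page really is only a few lines; the CSS selector here is a guess and may need adjusting to the site's current markup.

    import requests
    from bs4 import BeautifulSoup

    html = requests.get("https://news.ycombinator.com/").text
    soup = BeautifulSoup(html, "html.parser")
    for link in soup.select(".titleline > a"):
        print(link.get_text(), link["href"])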

The end of the statement is truer: it's not really a problem that your web scraper will have to be hosted somewhere, since the thing you're using it for also has to be hosted somewhere, but yes, it needs to be maintained and it will break if the source changes.

But I don't see how this solution could ever automatically evolve with the source without the original developer doing anything.

12
aqme28 17 hours ago 2 replies      
I would seriously consider rethinking that Favicon.
13
rlpb 16 hours ago 1 reply      
Are you familiar with ScraperWiki? I'm wondering how your work fits in with it.

Edit: looks like they've moved away from that space, but have an old version available at: https://classic.scraperwiki.com/

14
jval 12 hours ago 0 replies      
Great job guys.

One problem I've had though is that I think you guys are hosted on AWS - a lot of websites block incoming connections from AWS.

Are there plans to add an option in the future to route through clean IPs? Premium or default, this would be cool and would make it a lot more useful.

15
IbJacked 8 hours ago 1 reply      
Wow, this is looking good, I wish I had it available to me 6 months ago! Nice job :D

I don't know if it's just me or not, but it's not working for me in Firefox (OSX Mavericks 10.9.1 and Firefox v26). The X's and checkmarks aren't showing up next to the highlighted selections. Works fine in Safari.

16
tectonic 13 hours ago 1 reply      
17
fnordfnordfnord 17 hours ago 1 reply      
>Sorry, can't kimonify

>According that web site's data protection policy, we were unable to kimonify that particular page.

Sigh... Oh well... Back to scraping.

18
lucasnemeth 2 hours ago 0 replies      
Nice job! I really liked it, it's a fantastic idea! And your UX is great! Just one thing I found when testing: I had some problems with non-ASCII characters when visiting Brazilian websites, such as this one: www.folha.com.br.
19
jjcm 17 hours ago 1 reply      
Very cool, and I like that the link is your announcement page running inside of the demo. Really drives home the idea.

That said, it looks like it can't do media right now. I would love it if it could at least give me a url for images/other media.

20
trey_swann 17 hours ago 0 replies      
This is a great tool! In a past life we needed a web scraper to pull single game ticket prices from NBA, MLB, and NHL team pages (e.g. http://www.nba.com/warriors/tickets/single). We needed the data. But, when you factor in dynamic pricing and frequent page changes you are left with a real headache. I wish Kimono was around when we were working on that project.

I love how you can actually use their "web scraper for anyone" on the blog post. Very cool!

21
guptaneil 17 hours ago 1 reply      
Nice work, this is much better than I expected! Does it require Chrome? It doesn't seem to work in Safari for me. Also, does Kimono work for scraping multiple pages or anything that requires authentication?
22
ForHackernews 17 hours ago 1 reply      
This looks really slick. What happens if a website you're scraping changes its design? Do you respect robots.txt?
23
alternize 14 hours ago 1 reply      
looks promising!

to be fully usable for me, there are some features missing:

- it lacks manual editing/correcting possibilities: i've tried to create an api for http://akas.imdb.com/calendar/?region=us with "date", "movie", "year". unfortunately, it failed to group the date (title) with the movies (list entries) but rather created two separate, unrelated collections (one for the dates, one for the movies).

- it lacks the ability to edit an api, the recommended way is to delete and recreate.

small bug report: there was a problem saving the api, or at least i was told saving failed - it nevertheless seems to be stored in my account

24
pknight 16 hours ago 1 reply      
That UI made me go wow - this could be an awesome tool. An idea that pops into my mind is being able to grab data from those basic local sites run by councils, local newspapers, etc. and put it into a useful app.

How dedicated are you guys to making this work because I'd imagine there are quite a few technical hurdles in keeping a service like this working long term while not getting blocked by various sites?

25
paul1664 2 hours ago 0 replies      
Reminds me of Dapper

http://open.dapper.net/

This let you do something similar before it was acquired by Yahoo. Might be worth a look.

26
narzero 1 hour ago 0 replies      
I like the concept. Would love to see page authentication
27
lips 3 hours ago 0 replies      
I'm experiencing login errors (PEBKAC caveat: password manager, 2x checked, reset), but the support confirmation page is a nice surprise.

http://i.imgur.com/w01CoUy.jpg

28
tlrobinson 16 hours ago 1 reply      
I built something very similar last year, but sadly never got around to polishing and launching it: http://exfiltrate.org/

(There's a prototype of an API generator hidden in a menu somewhere but it's nowhere near production ready)

29
thatthatis 12 hours ago 1 reply      
This is my third time trying to get an answer to this question: does your crawler automatically respect robots.txt?
30
dmunoz 17 hours ago 1 reply      
I'm normally a bit worried when a thread quickly fills up with praise, but this looks very nice.

It's something I have thought about, as I'm sure many people who have done any amount of scraping have, but never went forward and tried to implement. The landing page with video up top and in-line demo is a pretty slick presentation of the solution you came up with. Good job.

31
rpedela 12 hours ago 1 reply      
Definitely awesome presentation and product.

The example doesn't seem to work right on Firefox. On Chrome, if I click "Character" in the table, it highlights the whole column and asks if I want to add the data in the column. On Firefox, clicking "Character" just highlights "Character" and that is it.

Ubuntu 12.04

Firefox 25.0.1

32
blazespin 15 hours ago 0 replies      
There's a huge business here if you keep at it. I'll throw money at the screen if you can make this work.
33
ThomPete 16 hours ago 1 reply      
Thank you for building a tool I've been wanting, so I don't have to!

Can't wait to play around with this tonight.

Suggestion. Allow one to select images.

34
jlees 14 hours ago 1 reply      
I like how you've thought through the end to end use case: not just generating an API, but actually making it usable. I've done my fair share of web scraping and it's not an easy task to make accessible and reliable -- good luck!

It makes me wonder if there isn't a whole "API to web/mobile app with custom metadata" product in there somewhere. I can imagine a lot of folks starting to get into data analysis and pipelines having an easier time of it if they could just create a visual frontend in a few clicks.

35
eth 12 hours ago 0 replies      
Great tool!

I'm coming at things from a non-coder perspective and found it easy to use, and easy to export the data I collected into a usable format.

For my own enjoyment, I like to track and analyze Kickstarter project statistics. Options up until now have been either labor intensive (manually entering data into spreadsheets) or tech heavy (JSON queries, KickScraper, etc. pull too much data, and my lack of coding expertise prevents me from paring it down/making it useful quickly and automagically) as Kickstarter lacks a public API. Sure, it is possible to access their internal API or I could use KickScraper, but did I mention the thing about how I don't, as many of you say, "code"?

What I do understand is auto-updating .CSV files, and that's what I can get from Kimono. Looking forward to continued testing/messing about with Kimono!

36
PhilipA 5 hours ago 0 replies      
It looks cool, but it's very expensive compared to Visual Web Ripper, which you pay way less for (but have to host yourself).
37
phillmv 17 hours ago 2 replies      
The UX is great, and journalists everywhere will thank you.

But outside of government websites I don't see how a lot of this is even legal, per se?

38
garyjob 10 hours ago 0 replies      
I found the one click action for selecting an entire column of values as well as the UI/UX on the top column of the page to be very impressive. We were thinking of a nice clean way to represent that particular UI/UX flow in this browser extension we built as well. Will incorporate that in our next release.

https://chrome.google.com/webstore/detail/krakeio/ofncgcgajh...

Would love to meet up and exchange some ideas if you are based in the Bay Area.

39
chevreuil 4 hours ago 0 replies      
We all know there are a lot of existing tools that do the same thing. But I've not met one with such a polished UX. Kudos to the Kimono team, I'll definitely recommend your product.
40
ameister14 18 hours ago 0 replies      
I really like how you guided me into demoing. Nice job.
41
jfoster 17 hours ago 1 reply      
Cool concept. One concern I'd have about this type of tool is that when it encounters something it can't handle, I'm stuck. Writing your own scraper means that you can modify it when you need to. I think the ultimate solution would be something like Kimono with the ability to write snippets of custom javascript to pull out anything that it can't handle by default.
42
jmcgough 17 hours ago 1 reply      
Really sleek interface, and looks like it could be extremely useful (I just spent a few hours cranking out Nokogiri this morning).

Oh, typo: "Notice that toolbar at the toop of the screen?"

43
kyriakos 5 hours ago 0 replies      
It appears that it doesn't work with websites containing international characters.
44
rafeed 17 hours ago 0 replies      
This is awesome. Really nice implementation and so useful for many different applications. Just signed up and looking forward to trying this out.
45
rmason 17 hours ago 0 replies      
I thought to myself oh boy yet another web scraper as a service but got surprised. I haven't been this impressed with a product video since Dropbox.
46
kenrikm 17 hours ago 1 reply      
Looks awesome, however I keep getting errors and 404s. Could this be an issue on my end (seems to be working for others) or just HN making the servers beg for mercy?
47
cbaleanu 17 hours ago 1 reply      
Does it do logging in to websites then fetching? Do you plan to add scripting to it?
48
twog 17 hours ago 0 replies      
Well done on the product & solving a clear need! This is extremely useful for hackathons/prototyping. I also loved the live demo in the blog post and you did a wonderful job with the design/layout/colorscheme of the site.
49
shekyboy 16 hours ago 1 reply      
Like the parameter passthrough feature. Take a look at places where the parameters are part of the URL structure. For example, a Target product page: http://www.target.com/p/men-s-c9-by-champion-impact-athletic...

In order to get data for a different product, I will have to modify the URL itself. I think the same holds true for blog posts.
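A small sketch of what "parameters as part of the URL structure" means in practice: the product identifier lives in the path, so target URLs are generated from a template rather than by swapping a query-string value. The template and slugs below are made up for illustration.

    URL_TEMPLATE = "https://www.example-shop.com/p/{slug}"

    slugs = ["mens-c9-athletic-socks", "some-other-product"]
    target_urls = [URL_TEMPLATE.format(slug=slug) for slug in slugs]

    for url in target_urls:
        print(url)  # feed each one to whatever extraction you've defined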

50
ph4 17 hours ago 1 reply      
Very nice job. What about scraping data from password-protected pages?
51
catshirt 16 hours ago 1 reply      
really excited to see this. i've had the idea (and nearly this execution) in mind for years but no use or ambition to get it done.

given the pricing though i'm almost motivated to make my own. as a hosted service the fees make sense with the offerings. but not only would i rather host my own- it would be cheaper all around. would you consider adding a free or cheap self hosted option?

aside, i think there is a mislabel on the pricing page. i'm guessing the free plan should not have 3 times as many "apis" as the lite plan.

52
xux 17 hours ago 0 replies      
Wow looks amazing. I tried doing some queries on public directories, and it even supports parameter passing. Will be using this for some side projects.
53
bigd 13 hours ago 1 reply      
Seems it can't see the stuff inside Angular views.. well, at least mine..

But for the rest, awesome product. Thanks.

54
pranade 17 hours ago 0 replies      
Thanks guys, glad you like it. Welcome any feedback so we can make it better!
55
joshmlewis 11 hours ago 1 reply      
Is there an ability to scrape more than one page of data?
56
BinaryBird 13 hours ago 0 replies      
Nice tool, slick UI. It worked for some pages and not for others. Currently I'm using Feedity: http://feedity.com for all business-centric data extraction and it has been working great (although not as flexible as kimono).
57
keyurfaldu 9 hours ago 0 replies      
Awesome! Hats off.. How about extracting the hashtag/GID of any record, if applicable? These are typically not rendered on the page, but hidden under the hood.
58
diegolo 8 hours ago 1 reply      
It would be nice to also have a view of the raw HTML code, e.g., to create a field containing the URL of an image on the page.
59
dikei 13 hours ago 1 reply      
I don't think this can beat the speed of a hand-tuned crawler. When I write crawlers, I skip page rendering and JavaScript execution when they aren't needed, which massively speeds up the crawling process.
60
yummybear 8 hours ago 1 reply      
Looks very nice. There seems to be an issue with international characters though (//).
61
dmritard96 16 hours ago 0 replies      
As someone building a home-grown proprietary scraping engine: consider alternative locations of elements. Most sites are using templating engines, so it's fairly reliable to find things in the same place, but more often than you might expect, things move around ever so slightly. Navigation is a fun one also. ;)
62
ewebbuddy 16 hours ago 1 reply      
Really cool idea and tool. Still need to test this out properly. Is it possible to scrape not just one page but a stack of them? For example, a product catalog of 1000 SKUs extending up to 50 pages.
63
timov 2 hours ago 0 replies      
You can use the utility without registration or login by blocking the login prompt with, for example, AdBlock.
64
wprl 16 hours ago 0 replies      
It's easy not to write web scrapers even without this tool ;)
65
tchadwick 17 hours ago 1 reply      
This looks really useful, and I'm trying to figure out if I could use it on a project I'm working on, but hitting an issue. I sent a support message. Nice job!
66
cullenmacdonald 17 hours ago 1 reply      
the reason i ever have to write a scraper is because of pagination. while this looks awesome, i'll have to stick to scraping until that is solved. :(
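For context, the pagination loop that usually forces a hand-written scraper is itself fairly small; a generic sketch that follows a rel="next" link until there isn't one (the selector is an assumption about the target site's markup).

    from urllib.parse import urljoin

    import requests
    from bs4 import BeautifulSoup

    def crawl_all_pages(start_url):
        url = start_url
        while url:
            html = requests.get(url, timeout=30).text
            soup = BeautifulSoup(html, "html.parser")
            yield soup  # extract items from each page here
            next_link = soup.select_one('a[rel="next"]')
            url = urljoin(url, next_link["href"]) if next_link else None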
67
bluejellybean 17 hours ago 1 reply      
How (if at all) does this run on javascript heavy sites?
68
iurisilvio 17 hours ago 1 reply      
What about some navigation tools there?

Looks pretty good, but it does not really replace my scrapers. Maybe some of them...

69
NicoJuicy 16 hours ago 1 reply      
This is really slick! Btw. Who made your intro video?
70
mswen 12 hours ago 0 replies      
How does this compare with Mozenda?
71
byteface 5 hours ago 0 replies      
Use any Chrome XPath plugin and give that to YQL.
72
aaronsnoswell 13 hours ago 1 reply      
Man that demo is impressive!
73
dome82 16 hours ago 1 reply      
I like the concept, and it looks similar to Import.io.
74
szidev 17 hours ago 0 replies      
great idea. i'll have to keep this in mind for future projects.
75
abvdasker 13 hours ago 0 replies      
I kind-of enjoy writing web scrapers.
76
taternuts 18 hours ago 0 replies      
That looks quite swift
77
harryovers 15 hours ago 0 replies      
so what do you do that import.io doesn't?
78
rismay 12 hours ago 0 replies      
OMFG.
79
pyed 13 hours ago 0 replies      
actually I love scraping :(
80
iamkoby 15 hours ago 0 replies      
i love this! and amazing video!
81
tonystark 18 hours ago 0 replies      
neat.
82
nnnn 17 hours ago 0 replies      
"Never write a web scraper again"... yea right.. sick and tired of such gimmicks and self promotion on the net today.
2
Requirements for DRM in HTML are confidential (w3.org)
566 points by duncan_bayne  2 days ago   399 comments top 29
1
simonsarris 2 days ago 23 replies      
I suppose that the title assertion is to be expected. DRM only works if you don't know how it works.

~~~

I'm not sure I see anything wrong with DRM per se (this could be my fever talking), there are probably good uses I'm too dim to think about, but I do think it's unnecessary as part of the HTML specification.

There's no industry or company that has switched to DRM-free content, that I know of, that has failed or suffered because of it:

* Music is largely available DRM-free now, thanks to Amazon's MP3 store (at the least, I'm sure there are others)

* For games, Steam makes it easy to avoid SecuROM Hell

* Despite DRM, all of Netflix's original series House of Cards was available on The Pirate Bay within hours of release. This doesn't seem to hurt Netflix's wish to create more content, or police it more heavy-handedly. (Maybe they would if they could)

For that matter, I think in the modern case every single time a business went DRM-free it turned out OK. Isn't that right? In all modern cases, maybe after 2006-ish, DRM-free businesses were accompanied by an easy way to get the content online, and sales did not seem to suffer, because at the end of the day piracy can appear (or be) shady and people (rightfully) don't trust shady websites, even The Pirate Bay with all of its popups.

I wish we had better numbers. I would like to see a real analysis on all the reasons people don't pirate and instead buy on Steam. I wish there was a good way to convince media businesses at large.

But I guess this is all water under the bridge, and I'm preaching to the choir.

2
Nursie 2 days ago 5 replies      
Great. DRM. The best example of shooting yourself in the foot ever.

Give customers encrypted content and the keys, try to prevent them from freely using the two together, undermine copyright fair use and first sale doctrines as you go along.

Intended effect - No Piracy

Actual effect - Paying customers get crippled products, pirates carry on regardless

It's crazy. And the more they try to lock it down the worse their products become and the better piracy looks in comparison. Pirates don't only beat the legit industry on price, they beat them on quality and availability. How can the industry allow this to stand? Let alone continue down the same path with their fingers in their ears shouting LALALALALALA I CAN'T HEAR YOU!?!

3
andybak 2 days ago 2 replies      
A key paragraph:

link: http://lists.w3.org/Archives/Public/public-restrictedmedia/2...

Well, as I say, the actual requirements that lead to the proposal of EME would be a start. This is how it looks to those who don't agree that EME is a good fit with the Open Web:
- 'big content' has certain requirements relating to preventing users copying data streams

- they won't make those requirements public (as you've said, the agreements are confidential)

- their licensees propose a technical solution that is unacceptable to many others because it necessitates the use of non-user-modifiable client components

- all proposed alternatives (e.g. FOSS DRM, server-side watermarking, client-side watermarking, no DRM at all) are shot down as being either too expensive or inadequate to the (secret) requirements

In a normal software project, I'd take an apparently insoluble conflict (the requirement for non-user-modifiable client components) to mean that we have done a poor job of determining requirements.

Hence my request for either a real user to talk to (e.g. an MPAA rep) or the actual requirements docs, which you've told me are confidential.

And that sets off my spidey-senses ... something is not quite right here.

4
Daiz 2 days ago 2 replies      
This should really be at the top of every HTML DRM discussion:

HTML DRM will not give you plugin-free or standardized playback. It will simply replace Flash/Silverlight with multiple custom and proprietary DRM black boxes that will likely have even worse cross-platform compatibility than the existing solutions. In other words, giving in to HTML DRM will only make the situation worse.

Some vendors will keep pushing for it, but at the very least we should not officially sanction what they are doing.

5
rlx0x 2 days ago 4 replies      
This is all so ridiculous. RTMP, for instance, is as secure a DRM as it's ever gonna get, and that never stopped me from downloading a stream. Even things like HDMI/HDCP are broken beyond repair. And all of this should justify damaging the W3C's reputation forever? What are they thinking?!

This whole concept of DRM is just idiotic; it's enough if one guy breaks the DRM and releases it. Why should I even bother booting a proprietary OS (Windows) and buying a stream every time I want to watch something if I can just download a release and watch it? And it's not like they can do anything against that either.

Why should I bother buying new HDCP-capable hardware and dealing with proprietary, NSA-compliant US software? I'd much rather buy the DVD, trash it and just download it in an open and free format (I don't even bother with ripping (and breaking CSS) anymore).

6
josteink 2 days ago 2 replies      
Email the W3C. Tell them what you think of this bullshit (in reasonably polite manners).

I've done it. I've gotten a non-canned response.

But clearly they need more people at the gates bitching. This needs to be stopped.

7
belluchan 2 days ago 5 replies      
Can't we just fork the w3? Start using Firefox and forget about these people. Oh I'm sorry your browser is a little slower, but at least it's not Google made.
8
ronaldx 2 days ago 0 replies      
Why is W3C involved in this?

Not only does this create a lack of openness and transparency in the core of the web, but "big content" creators get to pass on the costs of DRM that nobody else benefits from, including to people who are not consuming their content.

Meanwhile, browser vendors will become uncompetitive - since nobody else can compete against a closed standard - and they become even more motivated to work against openness to maintain their existing oligarchy.

Could not be worse for the web.

9
duncan_bayne 2 days ago 1 reply      
It's worth mentioning that the CEO of the W3C, Jeff Jaffe, is trying to rectify that:

http://lists.w3.org/Archives/Public/public-restrictedmedia/2...

10
girvo 2 days ago 11 replies      
Sigh. Look, I'm okay with DRM, as long as it works on all my devices. EME won't work under Linux; I guarantee the DRM vendors won't bother releasing Linux binaries. That annoys me.
11
alexnking 2 days ago 1 reply      
Maybe instead of getting everyone to adopt Silverlight, we could just make the web more like Silverlight. Like more closed and stuff, because movies!
12
dschleef 2 days ago 0 replies      
Compliance rules for Microsoft Playready: http://www.microsoft.com/playready/licensing/compliance/

The encryption part of DRM systems is effectively the same as client-side SSL certificates with a secret SSL certificate. How well it's kept secret is defined in the compliance documents. This secret, plus a secure decoding and output path, are the engineering core of DRM systems.
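To make the client-certificate analogy concrete, this is roughly what presenting a client-side certificate over TLS looks like with Python's requests library; in a DRM scheme the equivalent private key is buried inside the player instead of sitting readable on disk. The URL and file names are placeholders.

    import requests

    resp = requests.get(
        "https://license-server.example.com/license",
        cert=("client.crt", "client.key"),  # client certificate plus its (supposedly secret) key
    )
    print(resp.status_code)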

Studios require "industry standard DRM" for movies and TV shows, with lesser requirements for SD. This effectively means "DRM backed by some entity with lots of money that we can sue if things go wrong". Studios approve each individual device that you serve to, usually with compliance targets at some particular future date for various existing loopholes.

Flash (Adobe Access) is somewhat different, and has an obfuscated method for generating the equivalent of a client cert, thus on laptops it's only rated for SD by most (all?) studios. Apparently studios don't care too much about people copying SD content.

Studios would theoretically approve watermarking DRM systems, but there are two major barriers: having a large (ahem, suable) company offering it, and some way to serve individualized media through a CDN. Neither seem likely. So nobody loses too much sleep about whether studios would actually approve watermarking.

13
hbbio 2 days ago 0 replies      
Looks like the W3C may have been the inspiration for Game of Thrones...

Seriously, if there are men and women of honor in this organization, they should stand up against any form of standardization for DRM. DRM can be a proprietary extension for the people who want it.

14
shmerl 2 days ago 1 reply      
> So, the DRM vendors have solved the problem of creating solutions that meet studio requirements and what we are trying to do with EME is provide a clean API to integrate these solutions with the HTML Media Element.

Which reads as: studios have nonsensical requirements, which are implemented and soon broken. And "we" (i.e. W3C) need to oblige this insanity for the sake of <...>.

Put your own reason, but I bet it won't be good.

15
Zigurd 2 days ago 3 replies      
Why should DRM be part of a standard? Aren't plug-ins sufficient?
16
alkonaut 2 days ago 2 replies      
The only benefit I can see from standardizing something is that browser makers who want to claim to be compliant actually have to support it, so you won't end up in the flash/silverlight situation where some platforms don't support it.

But if a plugin framework is standardized, why settle for only DRM? Why not fix the whole crapfest that is plugin applications entirely? A standardized interface to a fast sandboxed virtual machine with good hardware support would be excellent. Currently there are JavaScript, ActiveX, Flash, Java applets, Silverlight, NaCl, WebGL and a number of others, each having their own benefits and drawbacks.

If I want to write a web-based, multi-threaded, GPU-accelerated, webcam-using application that works on any compliant browser on any platform, what do I do? Isn't that what the next kind of web standards should be addressing?

17
drivingmenuts 1 day ago 0 replies      
Since many are using Steam as an example of DRM - the important difference is that Steam is a free product, but is not open-source (though it can be used to distribute open-source). It is produced by a company as a means of distributing their products.

It is not even a valid comparison to the blinkard pig ignorance of the secret DRM requirements in HTML, which is an open standard.

I'd just like to know what dipshit at the W3 signed off on this.

18
duncan_bayne 2 days ago 0 replies      
From the mailing list: "[with EME] ... the publisher will have the possibility of deciding which platforms may access their content."

That was from one of the proponents of EME, touting this as a good thing. The response from another list regular was excellent:

"In non-web-terms this is the publishers deciding on what brands of TV you're allowed to play their content."

That's where EME will take the Open Web. We need to oppose it, strongly, urgently.

19
pjakma 1 day ago 0 replies      
I download movies and TV shows using Bittorrent and index sites like TBP because of DRM. Often these DRM systems are not available for Linux, or if they are, they require installing some big blob of binary code. It is easier and more secure for me to use bittorrent.

I would happily use the legal services, if not for this DRM. Those services sometimes are even free (e.g. BBC iPlayer). I would happily pay for a subscription service (I pay subscriptions to a number of different online sites, mostly journalism or data organisation - I've no problem with that).

The industry standardising proprietary DRM in W3 will just ensure that I continue to support the distributed, end-user provided services which are DRM-free.

20
kevin_bauer 2 days ago 0 replies      
I guess, the "another backdoor" proposal will go very well in Europe, where most citizens are just static about americas view on privacy and respect for constitutional rights. Way to go, maybe the W3C will finally get Europe and the rest of the "free" world to create their own web!
21
pyalot2 2 days ago 0 replies      
HTML-DRM, proudly building "solutions" to problems nobody has, by following requirements nobody knows about, to create a landscape of content nobody can play.

Way to go W3C, keep up the "good" work.

22
mcot2 2 days ago 2 replies      
If our end result is to see Netflix using HTML5 video on desktop browsers, how do we get there from a technology and business point of view? Keep in mind that Netflix has content created and owned by the major studios. If any form of DRM is not the way, then what? How do we get to this end goal? Do we make streams 'free' to copy and rely more on the legal system for protection? We are all keen to slam DRM, but what is a viable alternative?
23
xyjztr 2 days ago 2 replies      
Hey Guys, can somebody create a simple guide, FAQ or something similar for non-tech people to understand what is going on with HTML and DRM? It will help to spread the word.
24
PavlovsCat 2 days ago 0 replies      
Here are some thoughts by Cory Doctorow on web DRM. Spoiler: he's not a fan.

http://mostlysignssomeportents.tumblr.com/post/72759474218/w...

25
aquanext 2 days ago 0 replies      
Can't we just boycott this entirely?
26
jlebrech 2 days ago 1 reply      
why can't they just build it in NaCl and leave the open standard alone.
27
dreamdu5t 2 days ago 2 replies      
What's the problem? Don't support companies that distribute any DRM content. Standardizing DRM and propagating DRM aren't the same thing.
28
Fasebook 1 day ago 0 replies      
The internet was nice while it lasted.
29
silveira 1 day ago 0 replies      
HTML6 = HTML5 - DRM
3
US Supreme Court declines to hear appeal by patent troll (inc.com)
443 points by dded  2 days ago   85 comments top 10
1
grellas 2 days ago 7 replies      
A few thoughts:

1. The Supreme Court declined to hear an appeal by Soverain from an adverse ruling by the Federal Circuit Court of Appeals that had determined the Soverain "shopping cart" patent to be invalid on grounds of obviousness.

2. The Federal Circuit's holding by a 3-judge panel had been remarkable and had shocked patent lawyers generally in that the parties before the court had not even raised the issue on appeal as a ground for invalidating the jury's verdict below. The court raised the issue on its own, concluded that the patent was obvious and invalid, and gave judgment for Newegg in spite of the fact that the jury at the trial court level had found that Newegg infringed.

3. This particular patent had been the original shopping cart patent, dating back to 1994 (well before Amazon began) and it had had a formidable history by which its holder had gotten massive licensing fees from major players over many years for the privilege of using online shopping carts on the web.

4. It is easy to say today that everyone knows what the concept of a shopping cart is and that anyone could have come up with the idea of applying that concept to online shopping. That is all well and good but consider this: not only had this patent passed muster as being non-obvious with the USPTO on its original filing but it had also been found to have been non-obvious on two separate patent re-examinations before that same body and by a string of U.S. district court judges before whom the issue had arisen. In other words, Newegg faced a huge challenge on this issue (the legal standard required that it be able to prove that it was obvious by "clear and convincing" evidence, which is often a tough standard to meet) and this is why Amazon and virtually all other major other online retailers had long since caved and agreed to pay royalties for use of the patent. In the patent community, the Soverain patent was seen as rock solid and one whose shopping cart idea was deemed far from obvious. The top judges and lawyers in the nation, not to mention the USPTO, had all so concluded. The chances of upending it seemed slim to none. And, as noted, even the parties themselves had not raised the issue on the key appeal as a ground for potential reversal. Thus, everyone was stunned when the Federal Circuit reversed the judgment against Newegg on that ground, invalidated the patent, and threw the case out.

5. All that said, when Soverain petitioned the Supreme Court for review of the Federal Circuit's decision, it was trying to undo what it perceived as an injustice done to it as a private litigant ("this is so unfair to us and to our valuable patent"). However, from the Supreme Court's point of view, the kind of petition filed by Soverain is to be granted, and a case heard, only when it has significance far beyond whatever impact it might have on any private litigant. The Court's role in hearing such discretionary appeals is to step in and decide important questions of federal law or to determine who is right when the various lower federal appellate courts may have reached conflicting decisions on such points of law in a way that cries out for definitive resolution by the highest court. The Court will not hear cases merely because they might have been wrongly decided unless some such extraordinary factor exists. Thus, in denying Soverain's petition, the Court did nothing more than say that this particular petition did not present important issues of the kind that warranted its attention. It did not validate the Federal Circuit's reasoning or analysis. It did not weigh in against patent trolls. It did not add its authority to the fight against frivolous patents. It simply did what it does on over 99% of such discretionary petitions: it used its discretion to deny it. The legal significance of its decision goes no farther than that.

6. Is Soverain a patent troll that deserved this outcome? Well, its CEO had been a law partner at a major law firm (Latham & Watkins) and the company's business was clearly driven by a legal licensing scheme that had little or nothing to do with active business operations or innovation. It had simply acquired the original company that had come up with the patent back in the day. So, it is a troll if you want to call it that or it is not if you want to use some different definition. But this distinction does underscore how difficult it becomes to analyze patent issues simply by placing labels on the parties. The problem with modern software patents is that too many are too easily granted over trivial "innovations" and this has given vast incentives to those who would package them into shakedown licensing ventures and thereby gum things up for true innovators. It is a situation that calls for action by Congress to rein this in. Otherwise, every party trying to defend itself will find itself, as Newegg did, having to go to extraordinary efforts at massive expense to avoid claims of infringement. Very few litigants can do that and, indeed, Newegg is to be commended for fighting this all the way against tough odds. Let us only hope that systemic fixes can help correct the problem so that this is not the only way available for dealing with such patents. Whatever else this system does, it hardly promotes true innovation.

2
kalleboo 2 days ago 2 replies      
I had to use a secret browsing window to read this without signing up, so here it is for anyone else who has trouble loading the page:

--------

Chalk one up for the enemies of patent trolls: The Supreme Court on Monday threw out a request for trial from alleged patent troll Soverain Software.

The case, called Soverain Software LLC. v. Newegg Inc., is one of three such cases the Supreme Court is expected to consider this year. While the Court will likely hear the remaining cases, which deal with finer points of patent law, its dismissal of Soverain speaks to the potential frivolousness of its claims.

Soverain acquired the rights to numerous pieces of code tied to the online shopping cart, developed in the 1990s. In recent years, Soverain has gone on a litigious tear, suing more than two dozen companies including Amazon, Nordstrom, Macy's and Newegg, an online retailer, which all use shopping carts for internet sales.

Soverain had some success suing on the state level, where a Texas jury awarded the Chicago-based company $2.5 million in damages against Newegg. However, Soverain lost on appeal last year in U.S. District Court for the Eastern District of Texas, which ruled the shopping cart patents owned by Soverain were too general.

Patent trolls typically acquire rights to fallow or soon-to-expire patents with no intention of using the patent. Often patent trolls set up shell companies whose only assets are the patents, which means they have no real revenues or assets. Their sole purpose is to harass small businesses, which usually settle rather than pay for extended and costly litigation.

Patent law was originally written to protect the patent holder, making it easier for the patent holder to prevail in court. For the patent infringer to win, rather, the defendant must prove exceptional circumstances--namely that the patentee acted in bad faith and made baseless claims. This is hard to do. While the patent holder can be awarded "treble damages," or three times the damage claimed, the most the infringer can ever collect is attorney fees.

The remaining cases before the Supreme Court will deal with these finer points.

Congress is examining legislation that would fight patent trolls and their frivolous lawsuits by making them liable for court costs, should they lose their cases.

Small businesses mounted 3,400 legal defenses in 2011 for patent cases, a 32 percent increase over the prior year, according to a research paper from 2012 by Boston University law professors James Bessen and Michael J. Meurer. That cost to small companies was about $11 billion in 2011, also a 32 percent increase over the prior year.

The total median awards to trolls is now nearly twice as high as those to legitimate patent holders, whose median reward fell about 30 percent to $4 billion, according to a 2013 report by PriceWaterhouseCoopers.

3
motbob 2 days ago 2 replies      
"While the Court will likely hear the remaining cases, which deal with finer points of patent law, its dismissal of Soverain speaks to the potential frivolousness of its claims."

I don't think this is accurate. The standard that the Supreme Court uses to decide whether to take cases is not "is this frivolous." Soverain v. Newegg would have to meet a pretty high standard in order to be granted appeal.

I think the author of this piece is reading into this denial way too much. The norm is for appeals to be denied. To be more precise, less than 5% of appeals were granted over a recent one year period. http://dailywrit.com/2013/01/likelihood-of-a-petition-being-...

4
vanderZwan 2 days ago 5 replies      
Good news, but the last sentence of the article made me curious:

> The total median awards to trolls is now nearly twice as high as those to legitimate patent holders, whose median reward fell about 30 percent to $4 billion, according to a 2013 report by PriceWaterhouseCoopers.

I was wondering how they estimated this, so I checked out the report:

> We collect information about patent holder success rates, time-to-trial statistics, and practicing versus nonpracticing entity (NPE) statistics from 1995 through 2012.

> Damages awards for NPEs averaged more than double those for practicing entities over the last decade.

Note: PWC does not use the word "patent troll" - that is entirely the interpretation of the article.

So, just to play the devil's advocate: are NPEs by definition patent trolls? I can't think of a counterargument, but maybe someone else can?

EDIT: Thanks for the enlightening examples so far!

5
dded 2 days ago 4 replies      
I'm encouraged that patent trolls are getting knocked. But my fear is that patent law will hit such a state that only large corporations can wield them. If I'm a small patent holder, and I'm liable for court costs if I lose a suit, then it becomes far too risky to defend my patent against a corporation that violates it.
6
ck2 2 days ago 2 replies      
What did it cost Newegg to litigate that?

Does the troll have to pay legal fees?

Hope Newegg can remain price competitive.

7
csbrooks 2 days ago 0 replies      
I worked on shopping cart software for the web in 1996, and the company I worked at, Evergreen Internet, had been around a while before that. I wonder if anything we did constitutes prior art.
8
shmerl 1 day ago 0 replies      
I hope TQP troll will be busted as well. When will the Supreme Court process that case?
9
incogmind 1 day ago 1 reply      
I think the best way out of these things is to make software patents invalid after a short period, like 10 years.
10
revelation 2 days ago 1 reply      
I guess this is why HN mods edit titles on submissions (although the original title is just as terrible). The Supreme Court did not side with anyone; they denied a petition to the court, which is the case for the vast majority of petitions.

If they did accept this particular petition, this would not mean that the Supreme Court sides with the patent troll and the world is doomed; it simply means that the case deals with a contested issue where clarification by the Supreme Court is widely sought.

4
The Great Firewall of Yale (162.209.96.128)
379 points by shaufler  1 day ago   125 comments top 39
1
zaidf 1 day ago 7 replies      
I thought my school was bad, but reading this makes the administration at my school look like angels. When I launched a similar service at UNC Chapel Hill, the IT dept blocked requests from my server to theirs for scraping the latest data.

They claimed I was creating excess load, which is silly because if they really did the math, given how many people were using my service I was probably saving them resources.

2
jahewson 1 day ago 4 replies      
There is no way that a valid copyright claim can be made over the underlying data because it is a statement of fact. Such a work is not eligible for copyright protection.
3
Tossrock 1 day ago 1 reply      
I don't think blocking a specific set of IP addresses constitutes deep packet inspection. If they were reading the payload contents for strings matching the CourseTable site, that would qualify.

Still, this is a stupid move by Yale.

4
girvo 1 day ago 0 replies      
Frankly, if colleges receive public funds, they shouldn't be allowed to claim "copyright" on something like timetable information, in my opinion. Actual intellectual property, maybe, but this? Not a time table. That's just silly.
5
ojbyrne 1 day ago 2 replies      
"Universities are a bastion of free speech." LOL.
6
jamesk_au 1 day ago 4 replies      
One of the principal issues raised here - and not squarely addressed in the post or the article to which it links - is the extent to which average subjective ratings of courses and professors should be permitted to dominate the decision-making processes of students.

Note that Yale's complaint included concerns over "the prominence of class and professor ratings", and the student developers' response was to remove "the option of sorting classes by ratings". Subjective five-point ratings can be useful in many contexts, but in the context of education they can also give rise to genuine pedagogical concerns about the way in which students choose their courses.

Looking at the screenshot in the post, it is not difficult to see that the pattern of enrolments might very quickly become skewed towards those classes with higher average evaluation ratings (whatever such ratings might mean). If that happens, it suggests that some students may be making decisions about the courses in which they enrol based principally on factors other than their interests, abilities and future career paths, or without critical thought. Whilst other factors are relevant, including those for which an average of subjective evaluation ratings might be a plausible heuristic, that does not mean those factors should be the primary or predominant factors.

Without seeking to defend or condone Yale's response, there is more to the story than the tale of student censorship presented in the post.

7
epmatsw 1 day ago 2 replies      
I'm sure no Yale student has ever heard of tethering and that blocking the site on the Yale network will effectively prevent very smart students from reaching this website.

You would think that the Yale administrators would know better than this.

8
dictum 1 day ago 1 reply      
I expect the official explanation to be something like "we cannot endorse an unofficial service that might give misleading information to our students."

Every censor does it from an honest desire to keep this terribly misleading information away from the unknowing masses.

I don't think Yale is blocking the service in a conspiratorial effort to stymie students, but from a not well thought out desire to babysit.

9
jlgaddis 1 day ago 1 reply      
It would have been really cool if the developers of this (really nice, AFAICT) site moved it to (or also made it available via) a Tor hidden service.

The students would regain access to their data (I realize that it has now been e-mailed to them) and it would be a great example of exactly how Tor can help "bypass" censorship.

10
nmodu 1 day ago 1 reply      
If I'm paying $58,000 to attend an institution (rather, if my family is sacrificing $58,000 for me to attend an institution...or, worse yet, if I am taking out $58,000 worth of student loans per year), I should be able to use a course listing service so that I can tailor my academic experience however I choose. THAT is how we open this debate, not with comments about who the proper copyright holder is or whether or not this constitutes deep packet inspection.
11
klapinat0r 1 day ago 1 reply      
To focus only on the actual website issue:

Could it be in order to govern the information, rather than "copyright" per se?

My thinking is that, from Yale's perspective, having a 3rd party (and especially a student) be the go-to source for course info might be a bad shift in power.

When it's all done in good faith, it may not look bad, and even if it is well intended, there are a few problems that could arise:

- Bugs in crawling code causing some course information to be false, omitted or stale.

- Changes in OCI causing said crawler to keep stale data and fail to update.

- Students complaining to Yale with wrong information.

all the way to the more paranoid:

- 3rd party maliciously falsifying information.

- General confusion as to which information is reliable, driving students to have a more, rather than less, difficult time finding and verifying class scheduling.

I'm all for net neutrality and strongly against censorship in all forms, but "playing devil's advocate" can't there be a somewhat "legitimate" reason to shut the 3rd party page off for Yale students?

12
shtylman 1 day ago 3 replies      
I run a similar service for other schools (courseoff.com) and I have run into this before. I bet what happened was their site failed to cache the course data or seat information and was thus making lots of requests to the Yale servers. To Yale it might appear like a DoS from this site.

Obviously I don't know for sure but I would venture to bet this block was more an automated response than malicious intent against the site.
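One plausible mitigation for the load pattern described above is a TTL cache in front of the upstream fetch, so a burst of users does not translate into a burst of requests to the registrar's servers. A minimal in-process sketch; a real deployment would use something persistent such as Redis.

    import time

    import requests

    _cache = {}  # url -> (fetched_at, body)
    TTL = 15 * 60  # fifteen minutes

    def fetch_cached(url):
        now = time.time()
        hit = _cache.get(url)
        if hit and now - hit[0] < TTL:
            return hit[1]
        body = requests.get(url, timeout=30).text
        _cache[url] = (now, body)
        return body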

14
jrs235 1 day ago 1 reply      
"They had contacted us warning that we were using copyrighted data" last I understood you can't copyright data or facts [in the US]. You can own copyright to a particular published format though. One can't copy and publish a phone book verbatim but you can certainly scrap a phone book for its data/facts and publish them in a different format.
15
ivanplenty 1 day ago 1 reply      
tl;dr -- the crux of the issue (right or wrong) is making the evaluation information too public. From the news story:

> "[Administrators' primary concern was] making YC [Yale College] course evaluation available to many who are not authorized to view this information,

> "[Administrators also asked] how they [the site operators] obtained the information, who gave them permission to use it and where the information is hosted."

Edit: Agreed, I don't buy these are the real reasons.

16
Nanzikambe 1 day ago 2 replies      
If it were only deep packet inspection, the solution would be simply to prefix https:// and be done with it. As other posters have remarked, I suspect the article means an IP based block.
17
stormbrew 1 day ago 1 reply      
Something like this happened at the university in the city I live in. There was an apparently awful service for signing up for classes called BearTracks [1], and someone made a scraped version of it that was better, called BearScat [2]. Eventually the university basically incorporated the better version into theirs (to, I understand, mixed results).

[1] https://www.beartracks.ualberta.ca/

[2] http://www.bearscat.ca/

18
zamalek 1 day ago 0 replies      
> Universities are a bastion of free speech.

Incorrect - universities are now a business, nothing more. You can have your free speech so long as it makes the shareholders happy. Having students confused and lost (or unable to choose the best education for themselves) is a fantastic way to have them repeat courses in the long run.

Tertiary education is no longer what it used to be. It is now exactly the same type of delusion that women face in terms of having to be slim; or consumers face in terms of having to have the latest iPhone or what have you.

19
dreamdu5t 1 day ago 2 replies      
What's the purpose of Yale censoring certain websites? I find it hilarious that people spend so much money to go to Yale, and some of that money goes to inspecting what they're browsing.
20
diminoten 1 day ago 1 reply      
Is the course listing software open-source? I'd like to do this for another school...
21
TylerE 1 day ago 0 replies      
I forwarded the link to a friend who works in the admissions office at Yale. Can't promise anything but she said she'd be asking some questions.
22
thinkcomp 1 day ago 0 replies      
Harvard did this in 2003. It even went so far as to accuse me of using the word "The" improperly, in a copyright line where I properly attributed credit to "The President and Fellows of Harvard College," when http://www.harvard.edu at the time said the exact same thing (and apparently still does). I left Harvard early (with a degree), and then I wrote a book about it.

http://www.aarongreenspan.com/authoritas.html

Some things never change.

23
benmarks 1 day ago 0 replies      
The experience seems like fair preparation for the reality into which their charges will graduate.
24
ballard 1 day ago 0 replies      
This is an unacceptable, naked abuse of power. Any education institution blocking any site on political or anticompetitive grounds flushes away any vestiges of ideals of free speech and open learning. The administration should have known better or it may find itself replaced for acting incompetently.
25
xerophtye 1 day ago 0 replies      
Wow. This really makes me appreciate what we had at my college. For nearly a decade now, the OFFICIAL portal for the university that lets students and teachers manage courses and assignments (submissions included) has been the one that was originally developed, and is still managed, by students. We have a webmasters club whose responsibility it is to keep it up and running and add features to it as they see fit. The university has been nothing but supportive of this, including assigning it a yearly budget for hosting and other expenditures.
26
windexh8er 1 day ago 0 replies      
If anyone is curious that's a Palo Alto Networks NGFW block page. Yale is at least using some great hardware!
28
poizan42 1 day ago 1 reply      
If you actually go to http://coursetable.com you will be asked to log in through the Yale Central Authentication Service, which sends you to: https://secure.its.yale.edu/cas/login?service=http%3A%2F%2Fc...

I hope I don't give the administration any good ideas here, but it would seem that they have a much more efficient way to disable the site.

29
sgarg26 1 day ago 0 replies      
I understand that Yale and Harvard have a rivalry and compete for students. Out of curiosity, how might Harvard have handled a similar situation?
30
philip1209 1 day ago 0 replies      
Switch it to Cloudflare to obfuscate the source
31
arkinus 1 day ago 0 replies      
Note that this site is also accessible at http://coursetable.com
32
ballard 1 day ago 0 replies      
Has there been an official response?
33
Ihmahr 1 day ago 1 reply      
So MIT murders a student (Aaron S.), Yale does some ridiculous censoring...

What's next?

34
zobzu 1 day ago 0 replies      
it's so disgusting that this stuff even happens.
35
takeda64 1 day ago 0 replies      
It looks like http://www.coursetable.com is filtered on WebSense.
36
lightblade 1 day ago 0 replies      
I'm surprised that we haven't DDOS them yet, lol.
37
songco 1 day ago 0 replies      
The GFW doesn't show any "blocked" message; it just "resets" the connection...
38
robitor 1 day ago 1 reply      
"It threatens the very basis of academic freedom and net neutrality"

So pretentious, did a teenager write this?

39
epochwolf 1 day ago 2 replies      
This is not news. Most campuses have filtering software, and the university administration will use it to block websites that make them look bad.
5
Blackphone blackphone.ch
370 points by jorrizza  1 day ago   195 comments top 66
1
revelation 1 day ago 13 replies      
The privacy issue in smartphones isn't the freaking application processor running Android. Sure, that one's terrible enough.

But the actual problem is the baseband processor running completely non-free software, with an enormous attack surface and access to all the interesting peripherals (GPS, microphone). There isn't just the opportunity to compromise your privacy: Qualcomm and others actively implement such features at the behest of governments and carriers.

Oh, and if you plug that enormous hole, you get to the SIM card, yet another processor that you have zero control over, but which has access to enough juicy data to compromise your privacy. I highly recommend that everyone watch the talk from 30C3 by Karsten Nohl, where he shows a live attack on an improperly configured SIM card that remotely implants a Java app on the SIM, which then continuously sends your cell ID (your approximate location) to the attacker by short message (without notification to the application processor, e.g. Android or iOS):

http://www.youtube.com/watch?v=5B7XyVWgoxg

Carriers can do this today. (edit: that's a bit nonsensical, because carriers of course already know your cell id. Anyone with the ability to run a fake basestation momentarily (think IMSI catcher) can do this.)

2
EthanHeilman 23 hours ago 0 replies      
I'd really like a phone that had the following features:

* physical switches for GPS, WIFI, Radio, Camera, Mic, write/read access to disk (go diskless),

* a secondary low power eInk display that is wired directly into the hardware that shows when the last time GPS, mic, camera were turned on (and for how long) and how much data has been sent over the radio and read from disk,

* a FS which encrypts certain files with a key that is stored remotely. If your phone is stolen you can delete this remote key. The key is changed on every decrypt. You also get a remote log of all times this remote key was accessed.

* hardware support for read-only, write-only files,

* hardware support for real secure delete on the SSD,

* the ability to change all my HW identifiers at will (IMEI, SIM, etc),

* a log, stored on a separate SD card, of all data sent and received, using a HW tap on the radio/WIFI. The log should be encrypted such that only someone with the private key can read it (a public key is used to encrypt an AES session key, which is rotated out every 5 minutes; see the sketch after this list). If you think someone has compromised your phone you can audit this log for both exploitation and data exfiltration. Since the log is implemented in HW, no rootkit can alter it.
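
(A rough Python sketch of the sealed-log idea in the last item, assuming the third-party "cryptography" package. Only the public-key-wrapped AES session key rotated every 5 minutes comes from the comment; the class name, record layout and key sizes are assumptions, and a real implementation would live in hardware, not Python.)

    import os, time
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    ROTATE_AFTER = 5 * 60   # rotate the session key every 5 minutes, as proposed above

    class SealedLog:
        def __init__(self, auditor_public_key_pem: bytes):
            # Only the auditor's public key ships with the device; the private half stays offline.
            self._pub = serialization.load_pem_public_key(auditor_public_key_pem)
            self.records = []           # interleaved ("KEY", wrapped_key) / ("REC", ciphertext)
            self._rotate()

        def _rotate(self):
            # Fresh AES-256 session key, wrapped so only the auditor's private key can unwrap it.
            self._key = AESGCM.generate_key(bit_length=256)
            wrapped = self._pub.encrypt(
                self._key,
                padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                             algorithm=hashes.SHA256(), label=None))
            self.records.append(("KEY", wrapped))
            self._key_born = time.time()

        def append(self, record: bytes):
            # Seal one traffic record under the current session key.
            if time.time() - self._key_born > ROTATE_AFTER:
                self._rotate()
            nonce = os.urandom(12)
            self.records.append(("REC", nonce + AESGCM(self._key).encrypt(nonce, record, None)))

    # Usage: generate the auditor keypair offline and give the phone only the public half.
    auditor_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    pem = auditor_key.public_key().public_bytes(
        serialization.Encoding.PEM, serialization.PublicFormat.SubjectPublicKeyInfo)
    log = SealedLog(pem)
    log.append(b"radio tx 1420 bytes to 10.0.0.1")

A compromised OS could still drop entries before they are written, which is why the comment insists on a hardware tap; only the crypto layout is sketched here.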

3
joosters 1 day ago 4 replies      
Completely useless web page. All wooly 'feel-good' words and no hard, concrete information. So I guess we just have to take it on trust then?

Also, their privacy policy is laughable:

We turn the logging level on our systems to log only protocol-related errors - great!

the pages on our main web site pull in javascript files from a third party. This allows our web developers and salespeople to know which pages are being looked at - so instead of keeping your own logs, you are outsourcing this to a 3rd party with worse privacy policies, and who can now aggregate your website usage with other sites.

Why didn't they just keep logging on and get rid of the 3rd party bugs?

4
buro9 1 day ago 3 replies      
Well, this is just a splash page and says very little.

It's in partnership with http://www.geeksphone.com/, which is Firefox OS based. Yet the Blackphone splash has an image of a phone with Android buttons.

They claim no hooks to vendors, so if it's Android I can't imagine this is going to carry the Play store.

I'd be interested in knowing how they will secure and make private the core functionality of being a phone and sending email and text, all of which are insecure.

On that, I'd speculate that this is just pre-loaded with Silent Circle apps, and maybe will be announced as having DarkMail and a choice of RedPhone.

But... there's no info at all really, so who knows what this is.

The only problem they really have to solve is the eternal question of: Is it possible to provide real security and privacy whilst providing convenience?

5
_wmd 1 day ago 2 replies      
As others have pointed out, the baseband is not your friend. Was thinking about this recently, and saw no reason why existing POCSAG (pager) networks couldn't be reused to provide a completely passive receiver. Imagine a phone where the baseband was off by default, unless attempting to make a call. Voicemail/e-mail summaries were broadcast encrypted via POCSAG, and generate notifications just like a new mail summary coming in via GPRS/3G would.

Obviously usability would suffer a little bit (mostly in huge latency when you actually wanted to make a call), but it seems like a very cheap phone could be built that integrated a pager, allowing complete disconnection from the 'active' radio network and avoiding location tracking by your cell provider, or similar evil tricks by third parties.

6
thecoffman 1 day ago 1 reply      
A site peddling a product that is supposedly about user control and privacy that won't even load without javascript...

The irony is almost too much.

7
Trufa 1 day ago 0 replies      
I agree that the website is still a little bit unspecific, but this project is backed by Phil Zimmermann, the creator of PGP. That doesn't guarantee anything, but it definitely means some smart people who are worried about privacy are behind it.
8
GrinningFool 1 day ago 0 replies      
https://www.blackphone.ch/hello-world/

I'm sure there's logic there - powering a very basic non-informative landing site with a WP installation that you took the time to customize, but not delete the default post and comment from...

But it certainly doesn't give me warm fuzzy feelings about the people behind this.

9
wavesounds 23 hours ago 0 replies      
Anyone thinking of making a video to sell a privacy product to mass consumers should probably stay away from creepy music and women walking around in all-black hoods. Instead go for soccer moms buying stuff with their credit cards or librarians doing research for a school kid. Let's not make secure/private communications something weird and creepy but something normal that everyone does.
10
apunic 1 day ago 2 replies      
Android, having the most granular permission system ever seen on any operating system, is already the most secure operating system.

The biggest security hole next to the baseband processor and the SIM is the user who installs every app in seconds without checking permissions.

11
darklajid 1 day ago 1 reply      
I'm weird enough to be interested in these kind of things, but the whole site is really .. just fluff. Ignoring that and focusing on the sparse details of the actual thing:

- High-End Android device

- Privacy features in the (custom) Android version

- "Secure communication builtin"

Again, I like the idea. But so far the details match CyanogenMod (with TextSecure for SMS, maybe XPrivacy on top)?

12
epaga 1 day ago 0 replies      
No mention of the thing being completely open sourced - or did I overlook something? If not, seems like something they should mention (I am assuming it IS open source?)...
13
Duhck 1 day ago 0 replies      
I don't really feel like a slave; maybe I am underreacting here. I am pissed the NSA is collecting data, I am upset at all the recent revelations we have had about data privacy in the last 6-8 months, but I certainly don't feel like a slave.

These products should be advertised on theblaze and infowares.

Sure there is a need for better privacy, but I don't really care for the fearmongering...

14
joncp 1 day ago 2 replies      
Secure? They're rewriting the baseband, then? Color me skeptical.
15
c1sc0 1 day ago 2 replies      
How does this protect me from my carrier? No matter which phone I use they still need to record who I call for "billing purposes" and know which cell is closest to route my calls.
16
bosch 11 hours ago 0 replies      
Does anyone else find it odd that a privacy-centric phone's website won't load without scripts, cookies, etc.? I would think they would have a text-only version if items failed to load properly...
17
andyl 1 day ago 0 replies      
I think the Blackphone is a fantastic reaction to the problem of corporate and government spying. It will build awareness of privacy issues, and pave the way for other more secure offerings. A great first step.
18
andyjohnson0 1 day ago 0 replies      
I know they are pre-launch and this is just a landing page, but it doesn't tell us much. Questions:

1. Is this just a stock phone with some privacy-oriented applications built in, or are the OS and hardware contributing anything?

2. They seem to be using Android. AOSP or CyanogenMod? Have they done any work themselves to harden the OS? Are they using virtualisation?

3. Any closed binary blobs in there? What about the baseband firmware? (Does open source baseband firmware even exist?)

4. What's the hardware like? Is it hardened in any way?

19
runjake 1 day ago 0 replies      
I like Mike Janke and all, he's a nice guy. But has he backed out of RSAC '14 yet [1]? I find it a tough sell to call yourself a privacy advocate and legitimize and fund RSA by speaking at their conference. It also doesn't help Blackphone's cause.

1. http://www.rsaconference.com/speakers/mike-janke

20
sdfjkl 1 day ago 0 replies      
To even have the theoretical possibility of "privacy & security", both software and hardware must be fully open. And then there must be some way to check that the hardware and software you got in that box is actually the hardware from the spec, without extra chips. Those are pretty hard to accomplish.
21
digitalengineer 1 day ago 1 reply      
"and anonymize your activity through a VPN."

iOS and Android support VPN but it needs to be manually activated each time, making it rather useless unless you're using some public wifi. If I understand correctly there is a possibility for large companies to integrate VPN but for the average guy it's rather useless if you have to activate it. If this phone has VPN really integrated that'd be great.

22
andyjohnson0 1 day ago 1 reply      
Renowned cryptographer believes his 'Blackphone' can stop the NSA

http://www.theverge.com/2014/1/15/5310710/phil-zimmermann-si...

23
josefresco 1 day ago 0 replies      
If I desire privacy would I buy a Blackphone, or would I buy another more common smartphone which I would then secure?

If you're "picked up" or detained and you have a Blackphone, or someone observes you using your Blackphone I doubt very much it would help your pricacy concerns.

If however you have a seemingly normal phone it might be overlooked and simply using it wouldn't raise suspicion.

My point is that this type of phone is more for the "regular" person who simply doesn't want to be monitored (as much) and not covert agents looking for a secure phone/platform for communication.

24
r0h1n 1 day ago 1 reply      
>> "Enabling revolutionary communications"?

Eh? Wouldn't "Enabling secure/private communications" be a better, albeit less grand, descriptor?

25
sifarat 1 day ago 4 replies      
I would hate to say this, but people here and there are cashing in on the NSA fiasco. I would have loved it more if this were more focused on 'features' than on playing with people's emotions. This is valid for everything currently cashing in on the NSA issue.

As for NSA spying, how exactly can this phone ensure 100% secrecy, given that a user would have to use the same apps and, above all, the same carriers that other smartphone users use?

Point is, US Govt is hellbent on spying on you. And they will no matter what. Either change the US Govt, or suck it up. Nothing else is gonna work.

26
oh_sigh 16 hours ago 0 replies      
I get the feeling this phone was designed by a marketing group, and not competent engineers. Unless they completely design every chip in the phone, including the SIM and wireless chipsets, the device will never achieve their stated goals.
27
blahbl4hblahtoo 1 day ago 0 replies      
Personally, if I were really worried about privacy I would use burners or get a lineman's handset. It seems like a smart device that you use all the time is going to have the same problems.

So, yeah you can encrypt the voice channel. That's great. You can send encrypted text messages. The people involved are serious cryptographers. All of it sounds good.

You have to ask your self though, what is it you are trying to do? Who is your adversary? Other people here have mentioned it, but what about apps on the phone? Facebook is still Facebook.

28
tn13 1 day ago 2 replies      
How difficult is it really to make a truly open source phone? All it takes is one dedicated hardware company and a software company coming together.

Hackers have built some amazing hardware in the past, and we all know how open source communities have built some of the world's best software. Google, Apple etc. are building devices where they act as gatekeepers and charge us for all sorts of nonsensical stuff. If you make a website there are a gazillion ways to promote it, but there is only one way to promote an app: pay some advertiser, and you are totally at the mercy of Google or Apple.

Firefox has been doing the right thing so far but they seem to take too much time.

29
yetfeo 1 day ago 1 reply      
Mozilla could take great strides towards this type of phone if they cared. Integrate Tor, Whisper Systems' RedPhone and TextSecure, disable HTML tracking, etc. I'm surprised their Firefox OS looks and works so much like every other phone out there.
30
MWil 20 hours ago 0 replies      
I thought it was funny, considering the top comments, that if I ctrl+F for "zimmerman" it takes me all the way to halfway down the page.
31
avighnay 1 day ago 0 replies      
Geeksphone is doing pretty impressively for a startup: they were launch partners for Firefox OS and have now roped in PGP's founder for this project.

Were they successful in delivering on the Firefox phones? Their website always says 'out of stock'. Blackphone seems ambitious too. Is it possible for a startup to sail these two boats?

Also I find it odd that the PR is always just before the Mobile World Congress (MWC) which happens in Spain, last year with Firefox OS and this year with Blackphone

32
unicornporn 1 day ago 0 replies      
No Play store in this I hope. I'm currently running Cyanogenmod without Gapps and I'm wondering what this will offer me.
33
djyaz1200 19 hours ago 0 replies      
Will someone please tell them to remove the clips in their video of testing a white phone in the interest of brand consistency? Also this idea seems like a solid game plan for Blackberry? They could rename their company "Black" ala P-Diddy v just Diddy. :)
34
rch 1 day ago 0 replies      
This is not the 'first' phone to do these things. I had an idea along these lines in 2003, and some searching turned up a German company that was already doing it. Somebody bought them a couple of years later, and I don't know what happened to the phone. This sure isn't the 'first' though.
35
whizzkid 1 day ago 1 reply      
With all due respect for what they have done so far, I can't see any reason why this is more secure than other mobile phones.

With the latest NSA stuff, I came to the conclusion that a truly secure system can only be built under these conditions (and just to put it out there, this is just my opinion):

- A computer company that manufactures their own hardware such as hard drive, ram, cables, network cards.

- An OS that is newly written and not based on any other existing operating systems.

- Building the whole system with INDEPENDENT hardware and software mentioned above.

- Keeping the mobile device's source code offline from the Internet as much as possible

These are just the first steps on developing a secure system, then comes the mobile network architecture and encryption etc.

I admit it is not an easy job, but trying to develop a secure system with "not secure" development tools is not the right way to go :)

36
aagha 23 hours ago 0 replies      
It's interesting that all the work being done on this "secure" phone is being done on non-secure hardware and networks. Presumably if interested parties think this is a threat, they can access all comms/data about this new phone, inject themselves where they see fit and compromise the final product.

Oh, and never mind compromising the people involved.

37
huhtenberg 1 day ago 4 replies      
For a project concerned with privacy and anonymity, the news subscription form asks for way too much.

Also, why is the domain on .ch?

38
tinalumfoil 19 hours ago 0 replies      
Does anyone else see this as a ridiculous attempt to profit off the NSA leaks? The video is about scaring people into believing they're being "enslaved", and they're coming out with a device that has "never before" been created, aimed at "privacy-minded, security-minded people". It's filled with unrelated words like "neutrality", "all walks of life", "innovative thinkers" to make it seem legit.

There is no mention of the methods used by the phone to ensure privacy.

39
JoelJacobson 17 hours ago 0 replies      
Would it be possible to do the encryption outside of a normal phone, via some AD/DA converter plugged into the standard 3.5mm headphone minijack?

I started a thread to discuss this idea:

https://news.ycombinator.com/item?id=7066792

40
lispm 18 hours ago 0 replies      
I stopped watching the video at 'Android'.
41
pieter_mj 1 day ago 0 replies      
True privacy on a smartphone can only be expected when software and hardware are 100% open sourced. This of course includes the source code for the 3 OSs that typically run on a smartphone. Anything that's running server-side cannot be trusted either, so we need client-side encryption/decryption as well.
42
bybjorn 1 day ago 0 replies      
Looks like there will be several players in this market. An alternative is Indie Phone: http://indiephone.eu. If it ever ships it should be a better alternative privacy-wise, as they are building everything from the ground up (their own OS instead of relying on Android, etc.)
43
BuildTheRobots 21 hours ago 0 replies      
Love the idea of a GSM handset that believes in protecting my privacy, however all their features seem to revolve around a secured Android OS.

Does anyone know if the actual baseband/wireless side has been designed with security in mind? For example, I'd love to be warned when I'm connected to an A5/0 "encrypted" GSM network, but I haven't been able to find a handset built in the last decade that's willing to warn me.

44
viseztrance 22 hours ago 0 replies      
I would personally be interested if they would provide security updates over a long period of time.
45
pessimizer 17 hours ago 0 replies      
How usable is Android without a continual involvement with Google? If you have to be involved with Google, there's no point.
46
ilovecookies 21 hours ago 0 replies      
Isn't the problem more connected to the hardware and the fact that most people are already willingly using tons of applications that give information about you to companies like Google (Maps), Twitter, Facebook, etc.? If you install those apps with consent on your phone, and they have access to the Linux or iOS kernel runtime and syslogs, then you're basically fucked from the start.
47
fmax30 1 day ago 4 replies      
This may be a bit off topic, but why did Switzerland get the .ch domain instead of China? China seems to have a lousy .cn domain (which reminds me of Cartoon Network for reasons that are irrelevant here).
48
blueskin_ 1 day ago 0 replies      
Please not another long scrolling page without any real info... shame, I might have wanted one if they had provided any specs or technical details at all...
49
dblotsky 20 hours ago 0 replies      
"You can make and receive secure phone calls; exchange secure texts; exchange and store secure files; have secure video chat; browse privately; and anonymize your activity through a VPN."

People. It's really secure, private, and anonymous, ok?

50
linux_devil 1 day ago 0 replies      
One should be concerned about privacy and digital footprints, but more or less it depends on how many people are looking forward to adopting this concept. People still use Gmail and Facebook.
51
caiob 1 day ago 0 replies      
Funny how there's a twitter link at the bottom.

Jokes aside, I think it's a great initiative; looking forward to seeing what comes out of it.

52
jlebrech 1 day ago 0 replies      
nowhere near as secure as a burner phone purchased in cash.
53
muyuu 1 day ago 0 replies      
I loved it when they asked for my full name to keep me informed.
54
naithemilkman 1 day ago 1 reply      
Isn't this kinda moot if you're using any services that are domiciled in the States?
55
sgarrity 1 day ago 0 replies      
They should probably work on the mixed-content SSL warnings on their own website. It's obviously not related to the security of the phones, but it doesn't instill much confidence.
56
sidcool 1 day ago 0 replies      
Is it an Android phone?
57
heldrida 1 day ago 0 replies      
The phone image is missing. Check "images/teaser_site/img03.jpg", css #phone style.css line 396

Thanks

58
arj 1 day ago 0 replies      
Unless they have some really special hardware in this, I don't see how it's that much different from running CyanogenMod + secure applications on top, such as TextSecure.
59
pattle 1 day ago 0 replies      
The website doesn't really tell me anything about the phone.
60
junto 1 day ago 0 replies      
> Use the apps you know and love.

Ok, so how do they stop Facebook et al from abusing our contact lists and location data as they do on existing smart phones?

61
dandare 1 day ago 0 replies      
I am not getting it, how do you prevent the carrier from knowing where you are if you sign up to it with your number?
62
skuunk1 18 hours ago 0 replies      
Too bad they couldn't get the url blackphone.sh

;)

63
higherpurpose 1 day ago 1 reply      
Since NSA/FBI can reroute shipping boxes and install malware in them - do they have any plans against that?
64
n008 1 day ago 0 replies      
Just get an old Nokia feature phone
65
blackphace 22 hours ago 1 reply      
Their trailer seems a little too "inspired" by this First ELSE promo video from 2009: https://www.youtube.com/watch?v=ZHghZnOH8dA
66
hekker 1 day ago 0 replies      
It would be nice to order Chinese food anonymously with this phone. Looking forward to the release!
6
Super successful companies samaltman.com
342 points by dko  19 hours ago   142 comments top 40
1
kevinalexbrown 15 hours ago 0 replies      
Mediocre founders try to hire people for the parts that they don't like. Great founders just do whatever they think is in the best interest of the company, even if they're not "passionate" about that part of the business.

I don't often like to follow people, but when I do, this attribute matters to me. I'm not sure, but I suspect that it's a signal that someone wants to build something great more than accumulate accolades. In the latter case, I help someone get rich and famous (which is fine!). But in the former, I'm a part of something awesome.

I find it a little difficult to articulate why this affects me so deeply, but it resonates with me, and talented people I know.

In general, I suspect these attributes are important if for no other reason than talented people you might be recruiting will be on the lookout for them.

2
carsongross 18 hours ago 9 replies      
C'mon now, kids.

While there is some laudable-if-extremely-conventional wisdom in here, almost every point suffers from either survival bias ("Successful companies succeed by being successful.") or from having obvious counter-examples ("Was Steve Jobs a nice guy?"), or both.

3
exit 19 hours ago 12 replies      
> - The founders are nice. I'm sure this doesn't always apply, but the most successful founders I know are nicer than average. They're tough, they're very competitive, and they are ruthless, but they are fundamentally nice people.

ruthless /ˈro͞oTHləs/, adjective

1. having or showing no pity or compassion for others.

what does "ruthless" mean nowadays in valley newspeak? or did i miss a memo about "nice"..?

4
aashaykumar92 19 hours ago 0 replies      
Another important trait Ron Conway highlighted at Startup School that really stuck with me was that founders of 'super successful companies' couldn't care less about other distractions, especially the media. He highlighted the example of Ben Silbermann and how he used to reject several interviews so as not to lose focus on Pinterest.

So although it's a hybrid of two traits mentioned in the article ("They are obsessed with the quality of the product/experience" and "They don't get excited about pretending to run a startup"), it is still one that should be there by itself.

5
tvladeck 18 hours ago 3 replies      
To all those people screaming "survivorship bias", isn't that what this whole post is about? Like, explicitly in the title of the thread?

Of the companies that became very successful, what were some common traits? That's what this post is about.

"Yeah but you only looked at the successful companies."::smacks head::

6
tomasien 19 hours ago 1 reply      
"They don't get excited about pretending to run a startup"

That's the biggest difference between me now and starting my first company - I was ultra-excited to call myself "CEO of startup" the first time around; this time around we don't even have titles because we don't give a shit. We have a CEO because you need to know at whom the buck stops, but we've even discussed using that interchangeably depending on the situation (decided against it) because of how thoroughly we don't care.

I don't think the way I felt the first time around was bad or the reason we failed - but it was a pretty good signal, and I even knew it at the time.

7
argumentum 18 hours ago 0 replies      
Great article ... Some of the points might seem obvious in hindsight, but are rarely followed in practice. Perhaps it is because they are so obvious as to be somewhat invisible, or they are repeated so often as to be ignored.

I thought one of the less obvious, and especially interesting, insights was the following:

Another way this trait shows itself is "right-sized" first projects. You can't go from zero to huge; you have to find something not too big and not too small to build first. They seem to have an innate talent for figuring out right-sized projects.

I wonder how much of this talent (which I'd call a "knack") can be acquired from experience.

I noticed in myself an instinct for categorizing projects as "right time"/"right place". By "instinct" I don't mean I'm particularly good at this, I may be really bad. I mean that I have a feeling, which I can imagine as some sort of neural pattern recognition algorithm. I'd guess everyone has this same feeling .. to what extent can it be tuned into a "knack"?

8
calbear81 13 hours ago 0 replies      
I think a lot of the points are great but when we talk about super successful, I can think of some great startups that grew organically but also a lot of operationally efficient companies that knew how to build businesses at scale with a combination of organic growth, massive marketing, and smart PR. For example, one of Google's largest advertisers in spend is Booking.com (owned by Priceline). They have built an efficient machine that can acquire users/customers in an efficient manner and in the process capture 48% market share of online hotel bookings in Europe and help make Priceline a $60B market cap company.
9
edw519 19 hours ago 0 replies      
Funny, seems like I always narrow my list down to this 1 item:

"They are obsessed about their customers' success."

Everything else is a byproduct.

10
ritchiea 19 hours ago 0 replies      
This started out with some boilerplate SV platitudes (obsess over your product, obsess over talent), but got quite good by the end. The concrete items, particularly "They make something a small number of users really love" and "They don't get excited about pretending to run a startup" really resonate with my experience and are things I think everyone should keep in mind when running a company.
11
_sentient 19 hours ago 2 replies      
Great points. I would also add: They have serious intestinal fortitude.

You can possess all of the other traits, and you're still pretty much guaranteed to run into numerous points in the life of a company where you're staring into the abyss of imminent failure. The ability to withstand that kind of pressure is probably a prerequisite for highly successful founders.

12
normloman 19 hours ago 2 replies      
SURVIVORSHIP BIAS. SURVIVORSHIP BIAS. SURVIVORSHIP BIAS.
13
oskarth 15 hours ago 1 reply      
All I can think about reading the comments (and the article, for whatever reason) is pg's advice to sama and how it didn't seem to help at all here: https://news.ycombinator.com/item?id=6843726
14
jusben1369 18 hours ago 0 replies      
Good stuff. Only nitpick is the "partnership" comment. I actually think too many startups are too insular and not comfortable working with other companies early on. However perhaps here he meant the "hit it out of the ballpark" partnership type.

Please add to "They don't get excited about pretending to run a startup" - they're not on Twitter tweeting cliche's around vision/team/culture/design/customer love all day. That's a big red flag to me.

15
inthewoods 14 hours ago 0 replies      
I think there are two important areas he doesn't touch on:

1. Timing: The most successful companies are also the ones that are usually at a particular point in time when an opportunity exists. Too early, you fail; too late, you fail as well (for different reasons).

2. A scalable idea: The most successful companies find product or service ideas that scale hugely. These ideas are pretty rare. More often, people find ideas they like and can execute but that turn out to scale only so far. Sometimes it turns out that they've actually founded a services business and didn't know it. More often, they find an idea that can scale to, say, $5m in revenue but then has trouble scaling beyond that. Billion dollar ideas likely represent a very small portion.
16
aryastark 11 hours ago 0 replies      
This article is literally advocating a cargo cult. I've seen sharks jumped, but never this high and this obvious.
17
vzhang 19 hours ago 5 replies      
Number one reason you didn't mention - they are lucky.
18
saumil07 8 hours ago 0 replies      
Sam Altman is a great writer. He was also Founder/CEO of Loopt. I have to wonder why this post doesn't relate the traits back to his work at Loopt so the lessons learned can be more contextualized.

For example, are there areas where he was a mediocre founder by his own definition? Were there times when he was mediocre and then became great? What did it take to go from mediocre to great?

I'm surprised that the essay is so general in nature when there's a wealth of specific (and maybe more valuable) cases that could have been shared even after respecting privacy of individuals involved, etc.

19
tdumitrescu 19 hours ago 1 reply      
I like this one: "They respond to emails quickly." It always amazes me just how many would-be "founders" are unbelievably flaky, miss appointments, drop off the face of the earth for days or weeks at a time...
20
lpolovets 18 hours ago 0 replies      
"Great founders are execution machines." That's a great summarizing quote.
21
sillysaurus2 18 hours ago 0 replies      
This is an excellent list. Thank you, Sam, for putting it together.
22
tsunamifury 16 hours ago 0 replies      
These are great foundational traits, which are common among noble-failure and successful startups I've seen.

You still need to add in the resources (money, energy, charisma) to fund sustained hard work, market resonance, and luck to reach super success.

23
semerda 8 hours ago 0 replies      
I like Sam's blog posts. They are short and punchy. Always leaving me charged with energy. Good stuff. Keep it up Sam!

"They have a whatever-it-takes attitude." - This is such a powerful trait that it puts the "big dreamers" to shame and separates the Wannapreneurs from Entrepreneurs. Anyone is capable of dreaming, talking big, generate ideas et al.. but few are capable of executing them by doing whatever it takes to turn that dream into reality. Ha, it reminds me of the "never give up frog poster".

24
Jormundir 19 hours ago 2 replies      
Is there any evidence backing up these claims? I can think of exceptions for just about every one of these points.
25
tschellenbach 6 hours ago 0 replies      
With many of these points I completely agree, I also believe that there's data to back them up. The following however:

- they generate revenue very early on in their lives

- they keep expenses low

is in direct contrast with many of the most successful startups. As far as I know, Google, Facebook, YouTube, Instagram, and Skype don't fit these criteria.

26
dclara 19 hours ago 0 replies      
Wow, so many traits!

I noticed this one is not quite easy to make:

"*They grow organically. And they are generally skeptical of inorganic strategies like big partnership deals and to a lesser extent PR. They certainly don't have huge press events to launch their startup. Mediocre founders focus on big PR launches to answer their growth prayers."

Most startup companies intensively look for big partnership deals with PR support, but the super successful ones focus on building a customer base. It's really not easy.

27
wslh 17 hours ago 0 replies      
He misses one point: live in and fund your company in the US. There are more successful Internet companies in the US than abroad.
28
hoboerectus 16 hours ago 0 replies      
* They can bench press twice their body weight.

* They are sublime swordsmen.

* They revere the supreme commander.

29
bsirkia 18 hours ago 0 replies      
I would add they were also a bit lucky (whether with timing or virality or some other factor) at some point in their lifecycle.
30
pbreit 19 hours ago 1 reply      
"Charging customers early" is actually the opposite. Few of the big internet successes charged early (google, eBay, PayPal, Facebook, yahoo, etc.).

And, frugality is good in the beginning but after you prove yourself, you have to step on the gas.

31
zephyrnh 19 hours ago 1 reply      
"They generate revenue very early on in their lives. Often as soon as they get their first user."

Is this true?If "super successful" can be understood as "the biggest tech IPOs of the last 15 years", then I think that Google, Facebook, LinkedIn & Twitter would be at the very top of that list. I guess it depends on how we define "very early".

32
mathattack 17 hours ago 0 replies      
I like the list but what are all the source companies? (And in the spirit of survivor bias, failed comparison companies)
33
quadrangle 10 hours ago 0 replies      
tl;dr what makes a company great is how it is run greatly
34
LeicaLatte 13 hours ago 0 replies      
We are prone to saying super while talking. That's ok.

That many supers in writing? Sorry, but that's bad writing.

35
drelihan 17 hours ago 0 replies      
Did you have a list of extremely successful companies you were looking at specifically when you wrote this? If so, I'd be interested in seeing that list to compare
36
desireco42 17 hours ago 0 replies      
No examples, just slogans.
37
samishamlet 9 hours ago 0 replies      
It turns out that pattern matching on PG's essay style does not make one's essays as insightful.

All I can say is: the only factor really worth a damn is luck. Unfortunately you can't control luck, the only thing you can do is increase your luck surface. From my experience, nothing on that list actually does that - what increases luck is: hard work, value of idea, connections and personality. Re-order at will.

38
kimonos 16 hours ago 0 replies      
Nice! I see some helpful information in here. Thanks for sharing!
39
higherpurpose 17 hours ago 1 reply      
> *They grow organically. And they are generally skeptical of inorganic strategies like big partnership deals and to a lesser extent PR. They certainly don't have huge press events to launch their startup. Mediocre founders focus on big PR launches to answer their growth prayers.

Has Google+ written all over it.

40
drdiablo 12 hours ago 0 replies      
this is a test of fun
7
Public speaking is tough speaking.io
338 points by FredericJ  1 day ago   106 comments top 39
1
nostromo 1 day ago 23 replies      
Here's two pieces of public speaking advice nobody will tell you about, but actually work.

1) Beta-blockers. Ask your doctor.

2) Alcohol. Obviously, be careful with this. :) But having a drink really will take the edge off. This works better when giving a toast as a best man than it does at work. It could probably work at a conference too.

Other than this, for a big talk or pitch, I just practice until I'm blue in the face, then I practice some more. If you experience a fight or flight response, your brain cannot think straight, but you can fall back on something that has become rote long enough for you to regain your footing.

After 30 seconds or so, your body will start to calm down, you just have to make it through that 30 seconds without pulling a Michael Bay. http://www.youtube.com/watch?v=_tqRyzTvNKE

Ask HN: I was thinking the other day, someone should make an Oculus Rift app that is just a giant conference room of people staring at you. People with stage fright could use this to practice public speaking and hopefully improve.

2
beloch 1 day ago 3 replies      
Everyone probably has some good advice for public speaking. Here's my #1 piece:

Slow the fuck down!

You don't "win" at public speaking by getting more words in. In fact, you'll likely lose your audience by going a mile a minute. It makes perfect sense, but it's still hard to do. You can practice your talk in private a hundred times and it'll be X minutes. You can present your talk to colleagues and co-workers and it'll be X minutes. Then, when you get in front of a room full of strangers, the adrenaline will hit, you'll go into manic-caffeine-squirrel mode, and you'll blast it out in X/2 minutes! Some people deliberately make their talks too long, knowing they'll finish early if they don't. This is a mistake. They're just cramming too much material into the time allowed and will shell-shock their audience. Slow the fuck down!

The method by which you slow the fuck down is going to be somewhat personal. Different things work for different people. Personally, I do a lot better if I've gotten to know even just a few people in the room a tiny bit. If I can get a few people (hopefully in the front row) into the colleague-zone, I can focus on them during the talk and ignore the strangers.

3
bane 1 day ago 1 reply      
I'm a so-so to "good" public speaker. I used to be a terrible public speaker. I'll probably never be a great orator or Steve Jobs, but I'm pretty happy with my presentation skills. In group settings, I'm often the one chosen to give the public presentation.

Some things that improved me:

1) My university undergrad CS program required a semester of public speaking. Everybody hated it. It's probably one of the top 3 most important classes I took. If you're in a school that doesn't require it, take it as an elective.

2) I had a teaching job for a few years. Getting points across day in and day out, and trying to drag a class along of people at very different learning speeds teaches you very quickly how to project and enunciate so people can hear you well. Watching the faces of, and talking to, the people in the back rows becomes a very important speaking tool.

3) To deal with stage fright, I learned to mentally "not care" about giving the talk. It's hard to explain, it doesn't mean "not caring about doing a good job", it just means to adopt a viewpoint of detached apathy. Before I learned how to do this, even small stumbles would send me into a panic state which only made it worse ending with an avalanche of stutters and tied tongues. Detached apathy turns those little stumbles into such unimportant things that I don't even know they happened until I listen to a recording of my talk or see myself in a presentation.

4) Practice your speech. Because it's important to look up every once in a while in order to project. Practicing your speech helps you do that, instead of looking down into your note cards or your script. I don't practice it relentlessly like Steve Jobs or President Obama. 2 or 3 runs through is usually good enough for most of my purposes. But it helps you keep your focus on not caring.

5) Practice giving speeches. I haven't done it, but I've heard lots of good things about Oration societies like Toastmasters. In my case I got plenty of practice while teaching. But for those people who don't have that option, this is a great option. Nothing gets you used to the routine of giving speeches like giving speeches.

4
hawkharris 1 day ago 0 replies      
Public speaking became much easier to me once I recognized that all good speeches follow a concrete formula.

It's kind of like writing. You wouldn't pick up a pen and start scribbling a lengthy essay without considering its structure.

Similarly, effective public speakers follow a pattern: not necessarily the same formula, but a formula. For example, Bill Clinton likes to...

1) Begin with a personal, visual anecdote about a specific person or small group. (e.g. A family walking miles to collect water.)

2) Relate the small example to broader theme. (e.g. Poverty is a big problem.)

3) Weave that broader concept into the theme of the speech.

Another thing to remember is that while speeches share a structure with writing, they are not written articles. The biggest difference, I think, is that people are not capable of processing as much information.

While repeating yourself in a written piece is often bad form, most public speakers repeat key phrases to keep the audience focused. Listening is usually harder than sitting down to read.

5
reuven 1 day ago 1 reply      
I have been speaking professionally for a number of years now. In a given week, I'm probably speaking 2-4 full days (minus lunch and breaks), teaching various programming languages and technologies. I also give talks at conferences and user group meetings.

I remember very, very well when I had to give a talk oh-so-many years ago, while doing a student internship at HP. I flubbed it big time, and left the room saying to myself and anyone who would listen that I disliked public speaking, and was bad at it.

I'm not quite sure when things changed, but I think that it had a lot to do with my attitude. Instead of worrying about whether people would like me or believe me, I instead concentrated on trying to teach people something they didn't already know, and have a good time in the process.

If I'm enjoying myself while speaking, then the odds are good that the people in the audience are enjoying themselves, too.

If I've learned something interesting, then the odds are also good that the people in the audience will find it interesting, too, and will be glad that I'm sharing it with them.

Again, I'm not sure when my attitude changed, but when I get up in front of an audience now, I feel like I'm there to have a good time. Of course, I don't want to flub things, and there are times when I worry about that more than others. But for the most part, it's a matter of thinking, "Hey, everyone here has the same goal -- to enjoy themselves and learn something."

As others have written, your enjoyment will be enhanced significantly if you prepare. I'd even say to over-prepare. You probably need to know twice as much as you will actually say in your talk, so that you can speak naturally and reasonably about the subject. Try to outline your talk as a story, with a beginning, middle, and end. In technical talks, the story will often be something like, "Here's a problem. Here's a solution. Here are some examples of the solution in use. Here's where the solution fails. Questions?"

Don't worry about your slides too much. Yes, they should be high contrast. Yes, they should be easy to read. But I think that people worry way way way too much about colors, fonts, and images, and not enough about the actual SPEAKING. You want people to be engaged with what you're saying, not with what's on your slides... and that's going to happen if you have interesting things to say.

Above all, be yourself. There are oh-so-many examples (in real life, and also in movies and on TV) where people are told that they should open with a joke, and so they tell a ridiculous joke that no one finds funny, including the presenter. If you're naturally funny, or are willing to have people not laugh at your jokes, then go for it. If you're a serious kind of person, then be serious. (Although it's always better if you can be somewhat silly, in my book.)

6
saurik 1 day ago 1 reply      
To some extent the point I want to make is similar to the one made by reuven elsewhere in this thread, but I think it is still different enough (and maybe shorter? we'll see ;P) to still post. (OK, after writing, this failed at my goal of being shorter ;P.)

So, I also do a lot of conference speaking, albeit nowhere near as much as reuven. I remember in high school, public speaking was terrifying. By the end of college, I was giving one of the graduation speeches.

The difference was not me becoming better at making arguments or telling stories or being prepared or building slides or really anything about what I said on stage: the difference is that I felt at home there.

In essence, I had the fear of public speaking that many, if not most, people have. This fear is mostly about people watching you and judging you. You are concerned about where they are looking and what you are doing: it paralyzes you.

It had very little, however, to do with what you are doing in front of everyone: you could be on stage being told "eat breakfast as you would on a normal day" or simply a lunch meeting where you are standing due to lack of chairs while everyone else is sitting.

I don't feel, therefore, like helping people present is the solution. I will say that it might try to ease the person's anxiety enough to consider doing it once, but that isn't why they are afraid: I am not afraid of bungee jumping because I think I'm going to die due to the cord breaking, I'm afraid of bungee jumping because even looking at a photograph taken from a high-up location makes me curl into a ball.

These fears can be so bad that they aren't obviously fixable (phobia-level fears can be like that). In my case, I likely have acrophobia (heights), but as something of a "class clown" when I was much much younger, I can't ever claim to have had glossophobia (public speaking). My fear was mild, and I tackled it.

I want to be very clear, though, that there is a difference between "preparation" and "lack of fear": if you told me to go stand on stage right now in front of a thousand people, I'd be happy to do that. I would be willing to try to entertain them. I might fail, but I don't mind anymore.

I might thereby recommend more doing something structured that tales away all of the "things you can do wrong" variables entirely before bothering with trying to prepare those away: take an acting class. You are told exactly what to say, you have a director guiding your movements, and on the show day a perfect performance can be identical to the previous day. You don't have to worry if what you are saying sounds stupid: you have no choice in what to say.

(That said, I wouldn't "recommend" it strongly, as I think a lot of these shortcuts in hindsight by people who have defeated something others find hard are missing the point of what made it work for them: that you probably just need to be doing it, constantly, for long enough, to make it easy. This is similar to the "monad tutorial fallacy" in my mind.)

Then, when your fear of being in front of people is gone, maybe the preparation isn't even that big of a deal: if you are comfortable, the audience will be comfortable, and you can "get away with" a lot more on stage.

I mean, preparation is great, but "public speaking is tough" is not because "writing slides is tough" or "answering questions is tough", it's simply tough because "public anything is tough"... you answer questions every day in the hallway: you don't need more preparation to do that on stage, you just need less fear (which again: isn't easy).

7
ctdonath 1 day ago 1 reply      
As an introvert, I have no problem talking in front of a large group. I thrive on one-on-one conversations where each person has an opportunity to talk thru long complex interesting thoughts without interruption. Speaking in front of a large group is exactly that: I get to talk at length on a favorite topic, at whatever level of detail I choose, to someone who is interested in what I'm saying and will not interrupt; that I'm doing this with 10,000 individuals at once is just being efficient about it.

Helps that I've decided that if I'm going to be wrong, I'm going to be definitively wrong.

8
EliRivers 1 day ago 1 reply      
As someone in your audience, I beg you, please do not tell me what you're going to tell me, then tell me, then tell me what you just told me.
9
chops 1 day ago 0 replies      
I've given a handful of talks at miscellaneous user groups ranging from 5 minute lightning demos to one way-too-long-but-there-is-too-much-to-cover-in-45-minutes talk about Erlang types (I felt bad it was so long).

While I'm the last guy to walk up to a stranger and strike up a conversation, and I break out in cold sweats preparing to cold-call prospects for my business, I've always had this thing about public performing, whether it be speaking, playing an instrument, or even (gasp) singing.

I'm not sure of the psychology of it all, but it feels like the pressure of presenting, combined with a strong fear of being viewed a failure gives way to a certain comfort zone in presenting. And once up there for a minute or two, I notice that I quickly find myself firing on all cylinders (probably from the adrenaline), and then everything from then on becomes quite natural for me (even if my natural presentation style comes across a little neurotic).

Anyway, that's my anecdotal contribution to the public speaking discussion.

10
bedhead 1 day ago 1 reply      
I had never spoken publicly, as in a featured speaker in front of a large gathering of strangers. I had spoken in front of everyone at my old company (80 people) but that was the closest I came to public speaking, and since I knew everyone it didn't count. I remember freshman year of high school having my stomach in knots when teachers would call on me. I just had that nervous personality. Want to know how nervous I'd get in public with everyone's attention on me? I almost fainted at my wedding - at the altar. The priest had to cut the ceremony in half to accommodate me. To this day people make fun of me for it (I feel bad for my wife).

A couple months ago, I surprisingly got asked to be a speaker at a pretty large and prestigious conference in town. It was at a large venue with over 1,000 attendees, some of whom are important to impress for various reasons. It was a great opportunity so I accepted, knowing that this could be a problem.

Anyway, I rehearsed my 10 minute speech ad nauseum, I could do it in my sleep. Every little last verbal tic, joke, everything. I knew I'd still be nervous. I wanted to be so good that I could do it on autopilot and hopefully be more confident. I got on stage, lights shining brightly, and took a seat as the host read a brief introduction about me. While he was doing this, I was so nervous that I thought I was either going to vomit or faint, or some horrible combination of the two. I was literally telling myself not to puke over and over again. My stomach was tossing and my head was spinning...I could barely breathe.

He finishes his intro and I start my talk, visibly nervous. Then a funny thing happened. About 20 seconds in, something clicked and I just thought to myself, "Why are you nervous? You know this stuff cold. You got this." And wouldn't you know it, from there on out I killed it. I dunno, it was weird, I instantly became as relaxed as I am with my friends and delivered a great speech. I had tons of great jokes, kept everyone really engaged, and I think even delivered an interesting idea to the audience. By the time it was over I was actually disappointed it was over since I was having so much fun. I got tons of superlative-filled compliments afterwards and was really in shock about it all.

I don't know what the moral is. Just have fun, I guess. Know what you're talking about and the rest will sort itself out.

11
drblast 1 day ago 0 replies      
Don't think of it as public speaking, think of it as a performance.

You wouldn't go try to perform a play without scripting it and memorizing your script first, nor should you do that with your presentation. Once you do that you can ad-lib and it will seem natural. Even the off-the-cuff jokes aren't really off-the-cuff.

And go twice as slow as you think you should, and pause a lot. When people get nervous they talk faster and don't realize it. If you're nervous your perception of time will change and small pauses seem like an eternity. Slow down and force yourself to break for five seconds between "paragraphs" and you'll be way ahead of most people.

12
Theodores 1 day ago 2 replies      
Just wing it. Seriously.

Why is it that so few schools teach children how to speak in public?

It is not difficult, all you need is a debating society.

I am fortunate enough to have gone to a school where the debating society was the thing to do. Even on a cold winter's day with snow outside, two hundred or so of the thousand at the school would show up, of their own accord and without anyone telling them they had to go. To be voted by your peers onto the committee for the debating society was the ultimate in status. Our debating society made public speaking a fun thing to do.

As well as being able to propose/oppose a motion from the stage with a self-prepared speech it was also possible to learn how to listen, ask questions from the floor and respond to points made.

So, when I left school, I had a head start. I had spoken in front of a crowd on two hundred or so occasions from a very safe sandbox. In my adult life this experience has been invaluable. I know about what happens if one is not totally prepared. I know what happens if one is over prepared - i.e. reading instead of talking. I know about posture and how to make meaningful eye contact with a sea of faces. However, most importantly, I knew that public speaking was a desirable thing to do, a privilege.

If anyone reading this has kids and their kids are not involved in a school debating society, think about it. Get together with the school and a few teachers and sell them the idea of a debating society. Get someone charismatic - a head teacher who has to present in front of all the kids - to make the debating society the most important thing he/she does. Your local posh school will have a debating society, visit them, learn how they do it and steal their procedures and organisational structure.

Then, if you are lucky and the school debating society kicks off and becomes the thing to do, your child should grow up to be a darned good public speaker. What they will learn from that will help them no end. If they also end up knowing a subject inside and out at some stage of their adult life they should be able to literally wing it without having to use any of the silly suggestions presented on this thread (betablockers - you must be kidding!!!).

13
bigd 1 day ago 0 replies      
I've a talk in 30'.

Another suggestion should be "do not read suggestions on how to do talks right before giving one".

After a life in academia, what I usually suggest is: like your topic, keep it easy, and rehearse, rehearse, rehearse.

14
yodsanklai 1 day ago 0 replies      
I used to be really scared when i had to give "important" talks, especially in English which isn't my native language. I was so anxious that I couldn't even work the days before. I remember my first professional talk. My mouth was so dry that talking was difficult. (tip to beginners: take a bottle of water).

Interestingly, I had much less problems when I was presenting somebody else's work.

The thing that really helped me was benzodiazepines (e.g. Xanax). I took them from a few days before until the day of the talk and I felt much, much better. I know these drugs get a bad press, but in my case, they really helped. The side effect is that they tend to make you sleepy, but that didn't really affect me.

Now, I'm certainly not a great speaker, but I don't have any problems with public speaking.

15
pmiller2 1 day ago 0 replies      
Between grad school (teaching, seminar talks, etc) and other occasions, I've spoken in front of groups of 3-300 people hundreds of times. I have no idea if I'm all that good at it, but at least I'm comfortable with it. :-)

The biggest trick for me is realizing that talking in front of a group is different from talking to one person, but talking in front of a small group is not that different from talking in front of a medium or large group. Under 5 or so people is still pretty much an intimate/conversational atmosphere in my experience, but going from 5 or 10 up to 50, 100, or 300 is pretty much all the same. The only real difference is the amount and type of projection equipment involved.

Depending on the specific scenario, there are other things I try to keep in mind (e.g. I found that between 0.5 and 1.5 slides per minute worked well for a seminar talk in grad school), but abstracting away the size of the audience in my mind is the one that's paid me the biggest returns in reduced anxiety. Now if I just had a way to make sure the A/V equipment always worked, I could make a crapload of money. ;)

16
Codhisattva 1 day ago 4 replies      
Practice at Toastmasters meetings.
17
jccalhoun 23 hours ago 0 replies      
As someone who has taught and is teaching public speaking, my number one tip: sound like you care.

I can't tell you how many terrible speeches I've sat through where the person was saying "this is really important and means the world to me" but sounded like they didn't care at all.

Number two: don't write out every word of your speech. It is public speaking, not public reading. Being able to read a text out loud without sounding like you are reading is a skill, so you should learn to speak from notes/outlines first, because that makes it easier to sound like you are talking with us rather than at us.

18
julienchastang 1 day ago 0 replies      
Public speaking has caused a great deal of distress, panic, and anxiety for me in the past. To remedy this situation, I joined a local Toastmasters club. They are located literally all over the world, and there is probably one in your area. I cannot say enough good things about Toastmasters. Through frequent, repeated public speaking exposure, over time, you become desensitized so you don't feel as panicked. And your speaking skills improve as you have to give speeches on a regular basis. I completely disagree with comments that suggest this problem can be solved through drugs or alcohol. I had ONE stiff drink before an important talk, and I completely hated the feeling while I was speaking.
19
treenyc 1 day ago 0 replies      
I have an enzyme issue with alcohol, but I believe that is a nice trick.

Speaking from personal experience, as someone who went from having tremendous problems with public speaking to performing very well at a Toastmasters event in NYC without any preparation, I can say quite a few things on the subject.

One thing is for sure. We are all afraid of other people, no matter who we are. It is just that the fear gets expressed in different ways. Some people are shy and passive, while others are aggressive and over-confident. Until we discover who we really are, using tricks (PowerPoint) and strategies (drinking alcohol/weed) will not take us far.

What made the most difference in my process is some ontological training like this leadership course. The course doesn't really say that it will help you with public speaking. Just that you will leave the course

"Being a Leader and Exercise Leadership Effectively as your own natural Self-Expression"

Nothing more, nothing less.

However, the course has nice side-effects, like public speaking.

The course is NOT cheap, but I consider it worth more than my college degree. The next one is in Singapore. FYI, I have no financial ties to the course or university.

http://beingaleader-singapore.com

20
pessimizer 22 hours ago 0 replies      
Public speaking terrifies me. I seem to do alright if I follow five rules:

1. Don't bail and run out of the room screaming.

2. Don't ramble. Don't leave your outline for an anecdote or further explanation - trust your outline to be good. If you have to meander because you did your outline at the last minute and you know it kinda sucks, and you then meander while meandering, you've lost the game and no one remembers what you were talking about.

3. Don't "umm," "right," or "ok." before and after anything you say.

4. Don't laugh at your own jokes (at least don't do it before you finish getting them out.)

5. Remember that you don't look as nervous as you feel.

21
alan_cx 1 day ago 1 reply      
I assume there are different reasons for people fearing public speaking. But, FWIW, my thing is to really and fully know the subject you are talking about. For me, the nervousness comes from the fear of being found out in some way. So, I find that if I know my subject, I'm quite happy to waffle on to whoever wants to listen, but if I know or think the audience might know more than me and be able to somehow show me up to be some sort of fraud, I'm a bag of nerves.

I don't know if that works for anyone else, but my theory is that the nerves come from the fear of somehow looking a fool, and that becomes less likely the more you know about what you are talking about.

22
anildigital 1 day ago 0 replies      
23
wturner 1 day ago 0 replies      
The easiest way to speak publicly is to actually believe in what you're doing and talking about. The audience then becomes kind of like an omnipresent pressure that keeps you going.

If you aren't 'locked into' what you're talking about then nothing will save you. I know from personal experience.

I also heard in a talk that if you imagine the audience as 'prey', such as small rabbits or chickens, then it becomes easier, as it takes power away from the fight-or-flight aspect.

24
AhtiK 1 day ago 0 replies      
Exhale as deeply as possible and hold it that way as long as you can. After that, breathing restores itself with a few rapid, big inhales. Restarting your breathing this way also restarts your brain, in a way, so your thinking becomes calm. Works every time.

Another tip is to eat 1-2 bananas half an hour before the event, and maybe have a glass of fresh orange juice. Bananas work as a natural beta blocker, reducing anxiety. While on stage, plain water, no juices.

25
chaz 1 day ago 0 replies      
Plant your feet and square your shoulders to the audience. Walking around is ok, too. But slouching and shifting your weight from left to right can hurt your confidence as well as hurt the way your confidence is projected. You'll develop your own more natural style over time.
26
peteri 22 hours ago 0 replies      
For talks to user groups where I'm generating new slide decks and demos, one piece of advice I was given was to reckon on around 1 hour of prep for each minute of speaking time. The successful stuff that I've done seemed to match this.

Also for a one hour time slot you'll probably actually want around 40 minutes of material allowing time for introductions and a Q&A session at the end.

27
city41 1 day ago 0 replies      
blatant plug: I'm working on a website aimed at increasing social skills and one "track" of the site will be for improving public speaking -- http://metamorf.us
28
mebassett 1 day ago 1 reply      
Say someone is a mediocre-to-decent public speaker already. How does one "level up" to be a really great public speaker? I've thought about a speech coach or class, but I don't know anyone who has had any success with this who could recommend where to find a good one.
29
re_todd 1 day ago 0 replies      
I went to a doctor, and he gave me beta blockers, which helped a lot.

Another thing that helped is reading forums like this where many people admit how nervous they are. In speech class, everyone seemed to do relatively well, so I was under the impression that I was the only person in the world that gets nervous during a speech. Just knowing that other people get nervous has helped me handle it better.

You can also take your contacts out or glasses off so you cannot see people clearly, which also helps a little.

I've also noticed that my anxiety attacks usually happen before the speech, not usually during it, and they only last a few minutes. Knowing that they will not last forever has also helped me.

30
eflowers 1 day ago 0 replies      
What I've learned is that 20 minutes in, your hour is up.
31
aniketpant 1 day ago 0 replies      
Nobody mentioned Speak Up. It's a wonderful community of people where everyone helps each other out in planning and preparing for talks. It's been slightly inactive recently, but every mail gets an assured response.

Link: http://speakup.io/

32
cmbaus 1 day ago 0 replies      
Here are a couple of ideas I've written on the topic: http://baus.net/i-don%27t-like-public-speaking/

I did quite a bit of public speaking in the past couple years and it gets easier over time. I think the best advice is prepare, prepare, prepare.

33
gumby 1 day ago 1 reply      
To me there are different scales of public speaking or presenting.

I actually have no problem presenting to 500 people (the largest audience I've had): I just talk, and try to make some eye contact. There are always a few friendly faces.

Presenting to up to a dozen people is no problem for me: I can adapt (speed up / slow down, skip over stuff, dive deep, repeat, whatever) depending on how the people react.

But there's an excluded valley of somewhere between one and three dozen. I feel weird just presenting as I would to 500 people, yet it's too big to get the intimate preso treatment. When I have presented to a group this size it has almost always fallen flat.

34
Kerrick 1 day ago 0 replies      
Another great resource: We Are All Awesome! http://weareallaweso.me/
35
ismaelc 1 day ago 0 replies      
If you have something exciting to talk about, public speaking is not such a chore (a joy in fact). The challenge is having content that's easy to make exciting.

If that's not possible for you, then try to get excited about the fact that you're out there to excite the hell out of something mundane. Surprise your audience.

Being in that state of mind alone should knock out the jitters.

36
crimsonalucard 1 day ago 0 replies      
The only way a phobia can be conquered, if it can be conquered at all, is through repeated exposure.
37
janogonzalez 1 day ago 0 replies      
Shameless plug: here is my own advice regarding conference speaking: http://janogonzalez.com/2013/12/02/conference-speaking-how-t...
38
hakanson 1 day ago 0 replies      
Where can I submit a pull request to remove the F-word from these otherwise great tips, or do I need to fork? One could consider "dropping the F-Bomb" against many conferences' codes of conduct pertaining to "harassment includes offensive verbal comments." Also, as we try to mentor more youth to code, including school-age girls, is this the persona we are marketing?
39
gre 1 day ago 1 reply      
Tell them what you are about to tell them, tell them, tell them what you told them.
8
Net neutrality is half-dead: Court strikes down FCC's anti-blocking rules arstechnica.com
324 points by shawndumas  2 days ago   126 comments top 23
1
mjmahone17 1 day ago 1 reply      
What the court is saying is that, if the FCC refuses to classify broadband providers as common carriers, then, because they neither receive the same protections as common carriers nor have the same responsibilities, they can't be regulated as if they were common carriers.

The FCC could change their rules to treat broadband suppliers as common carriers. However, that's something that big-name broadband providers don't seem to want, as it would reduce their freedom of operations.

2
saalweachter 1 day ago 7 replies      
Note that the DC Court of Appeals is the one that the Filibuster Crisis was all about. According to the Wikipedia, it still(!) has three vacancies, and the Senate Republicans have spent the last ~N months preventing any of the Obama administration's nominees from being confirmed to the Court.

These things matter.

3
loup-vaillant 1 day ago 1 reply      
We could make all the laws we want about Net Neutrality, it wouldn't change the fundamental flaw that made this problem possible in the first place: too much centralization.

I hear that in the US, there are only 2 ISPs to choose from: one of the big 2, or the little local one. In France, we have about 4. At the other end, we have Google, Amazon, but most notably we have YouTube and Netflix.

Clearly the market is not efficient. Why do we have such big players in the first place? Why do we tend to have only the big players?

Because of the infrastructure. The way the internet is distributed, artificial economies of scale and barriers to entry favour the big ISPs (this is clearly the case in France, and I suppose the US is the same). And we have asymmetric bandwidth, which kills peer-to-peer exchanges. If people were allowed to host servers at home, there would be no need for things such as YouTube, Blogger, or Facebook (search engines are still a thorny problem, though).

If we got rid of this over-centralization, it would be harder to discriminate your bandwidth in the first place. Net Neutrality would be the default, instead of something we have to fight for.

4
sologoub 1 day ago 2 replies      
Definitions from US Code Title 47:

"(1) Advanced communications servicesThe term advanced communications services means(A) interconnected VoIP service;(B) non-interconnected VoIP service;(C) electronic messaging service; and(D) interoperable video conferencing service."

"(11) Common carrierThe term common carrier or carrier means any person engaged as a common carrier for hire, in interstate or foreign communication by wire or radio or interstate or foreign radio transmission of energy, except where reference is made to common carriers not subject to this chapter; but a person engaged in radio broadcasting shall not, insofar as such person is so engaged, be deemed a common carrier."

"(24) Information serviceThe term information service means the offering of a capability for generating, acquiring, storing, transforming, processing, retrieving, utilizing, or making available information via telecommunications, and includes electronic publishing, but does not include any use of any such capability for the management, control, or operation of a telecommunications system or the management of a telecommunications service."

I'm not a lawyer, but I consider myself well grounded in tech and telecom; reading these definitions, I'm kind of at a loss. In common law, my understanding is that a "common carrier" is someone that makes transport services available to the public. These can be physical, such as shipping a crate, or technological (telecom) in nature. By that inference, transporting packets of information is essentially the same as transporting normal packages.

Unfortunately, the "by wire or radio or interstate or foreign radio transmission of energy" is so period-specific that one could argue that it doesn't apply and the (24) Information Services is so broad and vague, it could practically be applied to anything.

One interesting bit, which makes me think that there is hope, is the definition of advanced communications, that include both VoIP and messaging services. Sadly, their definitions are not that broad...

5
gdubs 1 day ago 1 reply      
My knowledge of Anti-Trust laws dates back to elementary school, but how is it legal for the companies that maintain the infrastructure to be in the content game as well, when other content providers can't compete on favorable pricing for bandwidth?
6
jacobheller 1 day ago 0 replies      
Here's the full text of the opinion on Casetext: https://www.casetext.com/case/verizon-v-fcc-3

We'll be getting some leading net neutrality scholars and lawyers to annotate the doc, so check back later today for interesting, in-depth analysis alongside the key passages in the case.

7
adricnet 1 day ago 0 replies      
So, the court agrees with many others that the FCC needs to re-label cable companies as communication common carriers before regulating them as common carriers. I guess that's good?

Is it a difficult thing technically or only politically for the FCC to change their minds / admit they did this wrong in the first place?

What is the downside of treating the cable networks as communications media?

There are some thoughts on that here, though note the source: http://www.attpublicpolicy.com/government-policy/the-fcc-hav... .

8
anExcitedBeast 1 day ago 1 reply      
I know this isn't a popular opinion, but I think we need to reconsider accepting the precedent that we're OK with the US government regulating the Internet (without legislation, no less). With the concern about surveillance, copyright abuse, DMCA/Computer Fraud and Abuse Act abuse, and content regulation (in the case of the UK and China), is this really the solution we're looking for? Neutrality is a real problem, but this seems like throwing the baby out with the bathwater.
9
declan 1 day ago 1 reply      
This is a duplicate of another thread started an hour earlier: https://news.ycombinator.com/item?id=7057495
10
pasbesoin 1 day ago 2 replies      
It is obvious that they are, de facto, common carriers.

Give up the lobbyist payola, reclassify them, and introduce some real competition to my now more frequent than annual Crapcast price bumps (or significant humps, as it were).

(And in my case, this is primarily for Internet, although basic cable comes along as a quasi-freebie -- it costs, but then a discount on the combined package largely or totally negates that cost.)

Otherwise, you can bet I'm not voting for either major party, next time around.

As a consumer, I find that the only way to defeat this bullshit, is to stop paying for it. If I had an alternative to Crapcast in my neighborhood, I'd take it. (I don't count AT&T, because for a lonnngggg time they refused to deploy high speed Internet here, and because their policies and behaviour are just as bad. As well, they've personally screwed me in a prior location, as I've commented before.)

11
kolbe 1 day ago 2 replies      
Intuitively, I would have thought that this would be horrible news for content providers/distributors, and great news for wireless carriers. However, today, Google, Facebook, Amazon, etc. are flying, while Verizon and AT&T are falling.

Does anyone in the industry know what this is all about, and what importance this decision really has on the future of mobile?

12
unethical_ban 1 day ago 0 replies      
Am I experiencing deja-vu? I feel like many of these comments (and their responses!) are exact reposts from earlier submissions about this very same topic.

  "The court is saying the FCC needs to reclassify providers"    "The Republicans are holding up nominations"    
and so on and so forth.

13
angersock 1 day ago 1 reply      
Anyone who is interested in a really good overview of 20th century telco policy should read The Master Switch by Tim Wu (http://www.amazon.com/The-Master-Switch-Information-Empires/...).

It goes over the transition from telegraph to telephone to internet, talks about the rise of media conglomerates, and basically explains how we're in the mess we're in today. Quite an enjoyable read, especially when learning about the differences between old and new styles of monopolies.

14
wahsd 1 day ago 0 replies      
What I wish would happen is that organizational forces were focused on breaking up ISP monopolies over the pipes. Essentially, building a firewall between infrastructure and content. It would create a market...remember that thing we think controls America's destiny...that would lead to faster bandwidth and also ISPs that offer free, open, and protected services.
15
shmerl 1 day ago 0 replies      
Why can't they start classifying ISPs like common carriers?
16
hrjet 1 day ago 1 reply      
How much should the rest of the world care about this?

If a web service is hosted, say, in Europe and is being consumed by a customer also in Europe, will they be affected? AFAICT, they shouldn't.

17
tomrod 1 day ago 0 replies      
The economist in me is happy, as this allows for greater investment incentives on the part of ISPs.

The FLOSS advocate in me is sad, as this is a compromise that I don't want to see go away.

18
VladRussian2 1 day ago 0 replies      
>In its ruling against the FCC's rules, the court said that such restrictions are not needed in part because consumers have a choice in which ISP they use.

in theory vs. in practice

19
the_watcher 1 day ago 1 reply      
Does anyone have a good solution for this argument? I find myself wildly sympathetic to both sides of it. Is there any way to decentralize internet access in the future (something like what the utopian ideal of solar powering your home would be for electricity)?
20
draggnar 1 day ago 0 replies      
Perhaps this means the other shoe will fall. Will we see local regulations making it easy for municipalities or other actors to set up their own ISPs?
21
nitrobeast 1 day ago 0 replies      
Quote from the linked article, "(net neutrality rules) forbid ISPs from blocking services or charging content providers for access to the network." But that is confusing. ISPs are already charging content providers for access to the network. Netflix and Google need to pay for their bandwidth.

Actually, net neutrality means the ISPs should treat all data in their network equally.

22
Grovara123 1 day ago 0 replies      
Why is this not a bigger deal?
23
pearjuice 1 day ago 1 reply      
How can something or someone be half-dead? Life is a binary thing: you are either YES (1) alive or NO (0) dead. I fail to comprehend how any respectable (tech) journalist would call something "half-dead". It implies there is a state between being alive and being dead when this is clearly not the case.
9
Why 'Her' will dominate UI design even more than 'Minority Report' wired.com
307 points by anigbrowl  2 days ago   205 comments top 30
1
aegiso 2 days ago 16 replies      
Here's the thing that bugged me throughout the movie: once AI has progressed to the point where it can rival a human, all bets are off. Nobody needs to work again, ever -- not even to maintain or develop the AIs, since they can, by definition, do that themselves, with infinite parallelizability to boot.

What does "design" even mean in a world where everyone on earth can basically have an arbitrarily large army of AI's in the background designing everything in your life, custom-tailored for you?

For this reason I don't see how the world in the movie could possibly exist. Not because the technology will never get there, but because once it does virtually all aspects of society that we take for granted go out the window. So imitating any of this design is a silly pursuit, because once you can make it there's no reason to.

I should go re-read some Kurzweil.

2
mrmaddog 2 days ago 3 replies      
I have not yet seen "Her", but this strongly reminded me of Ender's communication with Jane from the "Ender's Game" sequels. One of the most interesting facets to their conversations is that Ender could make sub-vocal noises in order to convey his pointsshort clicks of his teeth and movements of his tonguethat Jane could pick up on but humans around him could not. It is the "keyboard shortcuts" of oral communication.

If "Her" is really the future to HCI, then sub-vocal communication is a definite installment as well.

3
jasonwatkinspdx 2 days ago 3 replies      
I once read a quip in an interview with a sci-fi author. He said something like: "No one writing about the present day would spend paragraphs explaining how a light switch works." It's easy for sci-fi to fall into the trap of obsessively detailing fictional technologies, to the detriment of making a vivid setting and story.

Edit: I'm not saying that sci-fi shouldn't communicate some understanding of the future technology or shouldn't enjoy engaging in some futurology. Just that it's difficult to do in an artful way.

4
kemayo 2 days ago 2 replies      
>>> Theo's phone in the film is just that: a handsome hinged device that looks more like an art deco cigarette case than an iPhone. He uses it far less frequently than we use our smartphones today; it's functional, but it's not ubiquitous. As an object, it's more like a nice wallet or watch. In terms of industrial design, it's an artifact from a future where gadgets don't need to scream their sophistication, a future where technology has progressed to the point that it doesn't need to look like technology.

This article really makes me think of the neo-Victorians from Neal Stephenson's Diamond Age.

...which is kind of funny, because in many ways Snow Crash exemplifies the other ("Minority Report") style of design the article talks about.

5
scotty79 2 days ago 5 replies      
Voice is a horribly slow medium for transferring information. I read because it's faster than listening to an audiobook. It's not scannable. You can't skip through the unimportant parts with one thought, as you can do when you look at things.

You can only listen to a single voice stream at a time, so when the AI talks to you, you are more cut off from the people around you than when you look at your phone. ...unless exchanging glances is more important than what people are actually trying to tell you when you happen to look at the screen.

6
w-ll 2 days ago 2 replies      
OT: But if you get a chance, watch [1] Black Mirror. There are 2 seasons of 3 episodes. Skip the first episode maybe? But I liked it because that* could happen tomorrow, whereas the other shorts are in a somewhat foreseeable future.

I feel like Spike Jonze was inspired by a few of the episodes. Her was still an amazing movie.

1. http://www.imdb.com/title/tt2085059/

7
sourc3 2 days ago 2 replies      
Saw the movie this past weekend and thought it was really good. I didn't like it just because it has awesome voice-driven OSes or endless battery life devices, but because it portrays a current trend we are experiencing: hyper-connected loneliness.

The more people are "digitized" and tethered to their devices, the more they seek some human connection.

Don't want to ruin the movie for those who haven't seen it so I won't comment on the ending. However, I urge the HN crowd to check it out. It's one of the best movies I've seen in a while.

8
altero 2 days ago 4 replies      
I wish futurists would just drop speech recognition as the holy grail. Speech has a lot of flaws; it's horribly imprecise and not private. I think a neural interface has a better future.
9
snowwrestler 2 days ago 1 reply      
Does Minority Report dominate UI design? I think it has dominated the movies' portrayal of future UI, but that is not the same thing.

I think if you look at the actual UIs being designed and sold today, their clearest entertainment ancestor is Star Trek the Next Generation.

10
mratzloff 2 days ago 1 reply      
I found the technology in Her to be natural and elegant, all things considered.

Actually, the most improbable thing in the movie is that this guy had the equivalent of a $40,000 a year job and rented such a fantastic apartment.

(Also, that the website BeautifulHandwrittenLetters.com would be successful with such a clunky domain name.)

11
aaron695 2 days ago 0 replies      
As usual, a fictional movie uses an imaginary, amazing far-future backend with a 'new' UI, and people seem to think it's the UI that's the great bit.

Minority Report was never about the UI; it was the software that allowed the gestures to find the info. It would have been equally amazing and quick with a mouse and keyboard.

This is a common trick when people demo new hardware. Somehow that internet mirror knows exactly what to show you in the morning by magic, but you think it's the physical internet mirror that's amazing when you watch the demo.

12
jkw 2 days ago 4 replies      
Can someone explain how Minority Report dominated UI design? (serious question)
13
njharman 1 day ago 0 replies      
Making technology "invisible" is missing the point and wrong tack to take. It's not that tech is hidden. It's that tech has become so ubiquitous, accepted, and integrated that we no longer notice it or think of it as "tech". Which combines social changes, refinement of technology, and time (as in, new generation has to grow up not knowing life before smartphones for example).
14
leephillips 1 day ago 0 replies      
According to the article, the movie depicts a near-future where "a new generation of designers and consumers have accepted that technology isn't an end in itself". Do people in the present regard technology as an end in itself? I had no idea. Anyway, I'm a big Jonze fan and want to see this.
15
JVIDEL 2 days ago 2 replies      
From the UX standpoint, the problem with Minority Report (MR) is that when you compare it with the tech we had in 2001-02 it's completely INSANE, while Her is actually building on top of something we already have.

Case in point: 12 years ago we didn't have ANYTHING close to the UX in MR, and even today we don't. Any consumer-available motion tracking and gesture recognition is still not comfortable to use in a professional way (ie: for work) as it was in the movie, but voice recognition is much, much better than it was in 2002.

Basically Her is like Siri or any other decent voice assistant, but MR is like...........what? Kinect? Nah. Wii? Yeah right. Leap? Yeah right! I can picture Tom Cruise losing all tracking the moment he rotates his hand...

16
danso 2 days ago 1 reply      
Does anyone still re-watch TNG episodes and find the queries they do to be profoundly limited in power, other than the feature of having the universe's knowledge to query across?

If UIs are taking cues from entertainment, they might act as a nice bridge, but are just as likely to be stifling.

17
skizm 1 day ago 1 reply      
Minority Report technology is garbage. That much hand waving and moving around gets tiring after about 5 minutes. In no way does that UI beat a keyboard and mouse or an xbox controller depending on context.
18
krazybig 1 day ago 0 replies      
The question of how AI will integrate with our society and economy is a fascinating one. We often make the mistake of assuming that an AI will be similar to a human just faster or smarter, but that misses some of the key distinctions of an AI versus biological intelligence.

One of the most striking is the ability to radically alter the substrate and operation of an AI system.

Because of the emergent nature of intelligence, I suspect that many AI instances will be raised like children, tested and validated for specific environments and then large portions of their consciousness could be frozen to prevent divergence of their operational modes. AI systems could also incorporate self-auditors, semi-independent AIs which have been raised to monitor the activities of the primary control AIs. Just as we involve checks and balances in corporate or national governance, many AIs may be composite entities with a variety of instances optimized for different roles.

This will be desirable since you may not want a general AI intelligence acting as a butler or chauffeur. Do you really want them to be able to develop and evolve independently?

Of course this just scratches the surface. AI will take in us in directions we can not dream of today.

19
sirkneeland 1 day ago 0 replies      
So this is how Apple gets disrupted. A future in which devices go from the central component, the obsession, the grabber of our attention, to dumb (if not invisible) terminals to a massive omnipotent cloud.
20
platz 2 days ago 0 replies      
All the comments here are debating whether the AI in the movie would be possible. What about the topic of the article, design?
21
jotm 2 days ago 2 replies      
I haven't seen the movie, so I gotta ask - do those glasses have built in displays? Cause that seems like the near future and a better one than just vocal communication...
22
wooptoo 2 days ago 0 replies      
While I was reading this I couldn't stop thinking how much it converges with the ideals of calm computing http://www.ubiq.com/hypertext/weiser/acmfuture2endnote.htm
23
ecoffey 2 days ago 0 replies      
Reminds of the Human-AI relationship in this series : http://en.wikipedia.org/wiki/Counting_Heads
24
marc0 2 days ago 0 replies      
I see quite some discussion about UIs and whether they should be audio-based or rather visually oriented, etc. For a really futuristic intelligent device (call it an OS, robot ...) I would drop the idea of "the UI" altogether. Rather, I would imagine such a system to be intelligent enough to provide a suitable way to exchange data depending on the situation and the task.

There are times when "it" listens to my words and answers verbally. At other times I just want "it" to read what I wrote on my sheet of paper and interpret it. Or I want it to follow my eye movements, or read command off my lips. And it's not just a collection of UIs, but it's a flexible UI that adapts its protocols permanently (sometimes twinkling of an eye has huge information content, sometimes not).

25
solnyshok 2 days ago 1 reply      
Started reading that article, but then got carried away with thoughts: what if AIs were designed to make humans' lives nice and pleasurable and romantic? That could work until 2 humans fell in love with one AI. What's next? Give each a clone?
26
zequel 1 day ago 0 replies      
" he realized, isnt a movie about technology. Its a movie about people"

That quote, from the article, could be applied to every apocalyptic, zombie and robot movie. It's not about the [X], it's about how people react to [X].

27
trumbitta2 2 days ago 1 reply      
I'm uncomfortable with the idea of a computer system solely based on speech recognition, without a keyboard or other input devices, as the one depicted in the article.

How about people who can't speak or hear?

28
tempodox 2 days ago 0 replies      
Can we PLEASE stop posting this pointless Wired infotainment crap?
29
frade33 2 days ago 1 reply      
Pardon my ignorance of technology, but is it even hypothetically possible to create AI intelligent enough to be on par with humans, or even beyond?
30
abhi3188 2 days ago 0 replies      
any idea when this movie is releasing in India?
10
AMD launches Kaveri processors aimed at starting a computing revolution venturebeat.com
293 points by mactitan  2 days ago   186 comments top 39
1
pvnick 1 day ago 5 replies      
Among other things, this has lots of applications for molecular dynamics (computational chemistry simulations) [1]. Before you had to transfer data over to the GPU, which if you're dealing with small data sets and only computationally limited is no big deal. But when you get bigger data sets that becomes a problem. Integrating the GPU and the CPU means they both have access to the same memory, which makes parallelization a lot easier. If, as someone else here said, AMD is partnering with Oracle to abstract the HSA architecture with something more high-level like java [2], then you don't need to go learn CUDA or Mantle or whatever GPU language gets cooked up just for using that hardware.

I'm personally hoping that not only will we get to see more effective medicines in less time, maybe some chemistry research professors will get to go home sooner to spend time with their kids.

[1] http://www.ks.uiuc.edu/Research/gpu/

[2] http://semiaccurate.com/2013/11/11/amd-charts-path-java-gpu/

2
ChuckMcM 1 day ago 2 replies      
This reaffirms for me again that we really need AMD to keep Intel from falling asleep at the wheel. I was certainly intrigued by what I saw in the Xbox One and PS4 announcements and being able to try some of that tech out will be pretty awesome.

It is fascinating to me how FPUs were "always" co-processors but GPUs only recently managed to get to that point. Having GPUs on the same side of the MMU/cache as processors is pretty awesome. I wonder, if that continues, what it means for the off-chip GPU market going forward.

3
pron 1 day ago 3 replies      
AMD is doing some interesting work with Oracle to make it easy to use HSA in Java:

* http://semiaccurate.com/2013/11/11/amd-charts-path-java-gpu/

* http://www.oracle.com/technetwork/java/jvmls2013caspole-2013...

* http://developer.amd.com/community/blog/2011/09/14/i-dont-al...

* http://openjdk.java.net/projects/sumatra/

It is intended that the GPU will be used transparently by Java code employing Java 8's streams (bulk collection operations, akin to .Net's LINQ), in addition to more explicit usage (compile Java bytecode to GPU kernels).

4
amartya916 1 day ago 1 reply      
For a review of a couple of the processors in the Kaveri range: http://www.anandtech.com/show/7677/amd-kaveri-review-a8-7600...
5
AshleysBrain 1 day ago 2 replies      
I have a question: Previous systems with discrete GPU memory had some pretty insane memory bandwidths which helped them be way faster than software rendering. Now GPU and CPU share memory. Doesn't that mean the GPU is limited to slower system RAM speeds? Can it still perform competitively with discrete cards? Or is system RAM now as fast as discrete-card bandwidth? If so does that mean software rendering is hardware-fast as well? Bit confused here...
6
bvk 1 day ago 2 replies      
The comparison is hardly disingenuous: the i5 may not be given Intel's highest branding designation, but it is an enthusiast processor and only a slight step down from the top-of-the-line i7-4770k, lacking only hyperthreading.

And this is completely irrelevant, since the i5-4670k ships with Intel's highest integrated graphics option for desktop chips, which is what is being compared to the A10-7850k.

At the moment AMD's processors can't compete with Intel at the high end. It makes no sense to berate a company for not doing what it can't.

7
networked 1 day ago 0 replies      
This is an interesting development indeed. In light of http://images.anandtech.com/doci/7677/04%20-%20Heterogeneous... I wonder if we'll soon see a rise in cheap, low-power consumption dedicated servers meant for GPU-accelerated tasks (e.g., for an image host to run accelerated ImageMagick on to resize photographs). Do you think this would be viable in terms of price/performance?

And in case you were, like me, wondering how much the new AMD CPUs improve on their predecessors' single-thread performance, you can find some benchmarks at http://www.anandtech.com/show/7677/amd-kaveri-review-a8-7600....

8
tommi 1 day ago 2 replies      
Kaveri means 'Buddy' in Finnish. I guess the CPU and graphics are buddies in this case.
9
GigabyteCoin 1 day ago 3 replies      
Any initial insights as to whether this new CPU/GPU combo will play any nicer with linux than previous AMD GPUs?

Setting up Catalyst and getting my ATI Radeon cards to work properly in a linux setup is probably my least favorite step in setting up a linux computer.

10
anonymfus 1 day ago 3 replies      
11
ck2 1 day ago 2 replies      
AMD needs to die-shrink their R9 chip to 20nm or less and put four of them on a single PCIe board.

They'd make a fortune.

12
transfire 1 day ago 2 replies      
Hey, they finally built an Amiga-on-a-chip!
13
dmmalam 1 day ago 0 replies      
This could be an interesting solution for a compact Steam box, essentially very similar to the hardware in the PS4 & Xbox One, though I wonder if the lack of memory bandwidth would hurt performance noticeably.
14
jjindev 1 day ago 0 replies      
"AMD says Kaveri has 2.4 billion transistors, or basic building blocks of electronics, and 47 percent of them are aimed at better, high-end graphics."

This sentence would have been so much better off if they'd just punted on the weak explanation of "transistor" and left it to anyone unsure to look it up.

15
malkia 1 day ago 3 replies      
Old ATI chips were named Rage. Kaveri seems to be a river in India.... but it would've been much cooler if it had been named Kolaveri, which according to my poor translation skills means Rage in one of India's languages (possibly Tamil).

And then there is the song... :)

16
fidotron 1 day ago 1 reply      
This is great progress, and the inevitable way we're going to head for compute-heavy workloads. Once the ability to program the GPU side really becomes commonplace, the CPU starts to look a lot less important and more like a co-ordinator.

The question is, what are those compute bound workloads? I'm not persuaded that there are too many of them anymore, and the real bottleneck for some time with most problems has been I/O. This even extends to GPUs where fast memory makes a huge difference.

Lack of bandwidth has ended up being the limiting factor for every program I've written in the last 5 years, so my hope is while this is great for compute now the programming models it encourages us to adopt can help us work out the bandwidth problem further down the road.

Still, this is definitely the most exciting time in computing since the mid 80s.

17
sharpneli 1 day ago 0 replies      
This looks really cool. However, it suffers from the same issue as their Mantle API: the actual interesting features are still just hype, with no way for us to access them.

Yeah, the HW supports them, but before the drivers are actually out (HSA drivers are supposedly due in Q2 2014) nothing fancy can be done. It'll probably be the end of 2014 before the drivers are performant and robust enough to be of actual use.

18
jcalvinowens 1 day ago 2 replies      
This is interesting, but my experience is that Intel's CPUs are so monumentally superior that it will take a lot more than GPU improvements to make me start buying AMD again.

Specifically I'm dealing with compile workloads here: compiling the Linux kernel on my Haswell desktop CPU is almost a 4x speedup over an AMD Bulldozer CPU I used to have. I used to think people exaggerated the difference, but they don't: Intel is really that much better. And the Haswells have really closed the price gulf.

19
Torn 1 day ago 0 replies      
> It is also the first series of chips to use a new approach to computing dubbed the Heterogeneous System Architecture

Are these not the same sort of AMD APU chips used in the PS4, i.e. the PS4 chips already have HSA?

According to the following article, The PS4 has some form of Jaguar-based APU: http://www.extremetech.com/extreme/171375-reverse-engineered...

20
rbanffy 1 day ago 2 replies      
Are there open-source drivers or will the driver builders have to reverse engineer the thing?
21
vanderZwan 1 day ago 2 replies      
Here's something that confuses me, and maybe someone with better know-how can explain this:

1: The one demo of Mantle I have seen so far[1] says they are GPU bound in their demo, even after underclocking the CPU processor.

2: Kaveri supports Mantle, but claims to be about 24% faster than Intel HD processors, which are decent, but hardly in the ballpark of the type of powerful graphics cards used in the demo.

So combining those two, aren't these two technologies trying to pull in different directions?

[1] Somewhere around the 26 minute mark: http://www.youtube.com/watch?v=QIWyf8Hyjbg

22
grondilu 1 day ago 0 replies      
The A-Series APUs are available today.

It's nice to read a tech article about a new tech that is available now, and not in an unknown point in the future.

23
higherpurpose 1 day ago 1 reply      
I wish Nvidia would join HSA already, and stop having such a Not Invented Here mentality.
24
codereflection 1 day ago 0 replies      
It's really nice to see AMD getting back into being a game changer.
25
hosh 1 day ago 3 replies      
I'm a bit slow on the uptake ... but does this remind anyone of the Cell architecture? How different are those two architectures?
26
rch 1 day ago 0 replies      
> the power consumption will range from 45 watts to 95 watts. CPU frequency ranges from 3.1 gigahertz to 4.0 gigahertz.

I was fairly dispassionate until the last paragraph. My last Athlon (2003-ish) system included fans that would emit 60dB under load. Even if I haven't gotten exactly the progress I would have wanted, I have to admit that consumer kit has come a long way in a decade.

27
annasaru 1 day ago 0 replies      
Nice name. A majestic river in South India.. https://en.wikipedia.org/wiki/Kaveri
28
jsz0 1 day ago 1 reply      
The problem I see with AMD's APUs is the GPU performance: even if it's twice as fast as Intel's GPUs, both Intel's and AMD's integrated GPUs are totally adequate for 2D graphics, low-end gaming, and light GPU computing, and both require a discrete card for anything more demanding. IMO AMD is sacrificing too much CPU performance. Users with very basic needs will never notice the GPU is 2x faster, and people with more demanding needs will be using a discrete GPU either way.
29
belorn 1 day ago 1 reply      
Will the APU and graphics card cooperate to form a multi-GPU with a single output? It sounds as if it could create a more effective gaming platform than a CPU and GPU combo.
30
dkhenry 1 day ago 1 reply      
So we finally get to see what HSA can bring to the table.
31
devanti 1 day ago 0 replies      
Hope to see AMD back in the glory days it had with the Athlon XP.
32
erikj 1 day ago 0 replies      
The wheel of reincarnation [1] keeps spinning. I hardly see anything revolutionary behind the barrage of hype produced by AMD's marketing department.

[1] http://www.catb.org/jargon/html/W/wheel-of-reincarnation.htm...

33
lispm 1 day ago 0 replies      
So the next computing revolution is based on more power hungry chips for gamers?
34
adrianwaj 1 day ago 0 replies      
I wonder how well they can be used for mining scrypt.
35
imdsm 1 day ago 1 reply      
How do I get one?
36
ebbv 1 day ago 2 replies      
All of Intel's recent mass market chips have had built in GPUs as well. That's not particularly revolutionary. The article itself states "9 out of 10" computers sold today have an integrated GPU. That 9 out of 10 is Intel, not AMD.

The integrated GPUs make sense from a mass market, basic user point of view. The demands are not high.

But for enthusiasts, even if the on die GPU could theoretically perform competitively with discrete GPUs (which is nonsensical if only due to thermal limits), discrete GPUs have the major advantage of being independently upgradeable.

Games are rarely limited by CPU any more once you reach a certain level. But you will continue to see improvements from upgrading your GPU, especially as the resolution of monitors is moving from 1920x1200 to 2560x1440 to 3840x2400.

37
higherpurpose 1 day ago 1 reply      
> AMD now needs either a Google or Microsoft to commit to optimizing their operating system for HSA to seal the deal, as it will make software that much easier to write.

I'd say this is perfect for Android, especially since it deals with 3 architectures at once: ARM, x86, and MIPS (which will probably see a small resurgence once Imagination releases its own MIPS cores on a competitive manufacturing process), and AMD is already creating a native API for the JVM, so it's probably not hard to do it for Dalvik, too. It would be nice to see support for it within a year. Maybe it would convince Nvidia to support it, too, with their unified-memory Maxwell-based chip next year, instead of trying to do their own thing.

38
X4 1 day ago 0 replies      
Want to buy, now! Can someone give me a hand choosing a motherboard or something that allows using about 4 to 8 of these APUs?
39
noonereally 1 day ago 1 reply      
"Kaveri" is name of one of major river in India. Must have involved ( or headed) by Indian guy.

http://en.wikipedia.org/wiki/Kaveri

11
Project Euler projecteuler.net
290 points by gprasanth  2 days ago   133 comments top 35
1
jboggan 2 days ago 5 replies      
The best technical interview I ever had involved picking a random Project Euler problem in the hundreds and pair-programming our way through it. The CTO wrote his version in Python and I worked in Perl . . . he was astounded mine ran 8x faster.

The same company also had a regular hack night where everyone drinks a lot of Tecate, agrees on a Project Euler problem and a language no one knows, and races. Fun times.

2
habosa 1 day ago 1 reply      
I can't adequately express how great of a resource Project Euler is to someone learning about programming.

The way I learned to code was working my way through Project Euler problems in Python, eventually getting to a score of about 55 before I was at the point where I decided to try making "real" programs like Android apps.

When you learn to code people tell you that X or Y is bad for performance, and you should do A or B instead. The problem is that most beginner-type programs run in a few milliseconds and there is no way to see the performance either way. When you're doing a PE problem, a performance tweak can change your answer from a 1-minute runtime to a 1-second runtime. That's something anyone can appreciate, and it lets you experiment with performance on interesting math problems.
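
A concrete feel for that kind of gap (a rough Python sketch of my own, not taken from any particular problem write-up; the timings are ballpark and machine-dependent) is summing the primes below two million, first by trial division and then with a sieve:

    def is_prime(n):
        # Trial division: fine for a single number, painful when repeated two million times.
        if n < 2:
            return False
        d = 2
        while d * d <= n:
            if n % d == 0:
                return False
            d += 1
        return True

    # Naive version: tens of seconds to minutes in CPython.
    slow = sum(n for n in range(2000000) if is_prime(n))

    # Sieve of Eratosthenes: the same answer, orders of magnitude faster.
    limit = 2000000
    sieve = bytearray([1]) * limit
    sieve[0] = sieve[1] = 0
    for i in range(2, int(limit ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i::i] = bytearray(len(sieve[i * i::i]))
    fast = sum(i for i, flag in enumerate(sieve) if flag)

    assert slow == fast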

Another advantage of Project Euler is it makes you realize just how powerful a computer can be in the right hands. These are problems that nobody in their right mind would try to solve by hand, but they're so tractable with programming knowledge. That was a very exciting realization to me and it pushed me towards a career in software.

3
b0b0b0b 2 days ago 5 replies      
I love Project Euler, but I've come to the realization that its purpose is to beat programmers soundly about the head and neck with a big math stick. At work last week, we were working on Project Euler at lunch, and had the one CS PhD in our midst not jumped up and explained the Chinese remainder theorem to us, we wouldn't have had a chance.
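
For the curious, the theorem itself is only a few lines of code once you can compute modular inverses. Here is a minimal Python sketch, assuming pairwise-coprime moduli; the function names and the toy example are mine, not lifted from any specific problem:

    def extended_gcd(a, b):
        # Returns (g, x, y) such that a*x + b*y == g == gcd(a, b).
        if b == 0:
            return a, 1, 0
        g, x, y = extended_gcd(b, a % b)
        return g, y, x - (a // b) * y

    def crt(remainders, moduli):
        # Chinese remainder theorem: find x with x mod m_i == r_i for pairwise-coprime m_i.
        M = 1
        for m in moduli:
            M *= m
        x = 0
        for r, m in zip(remainders, moduli):
            Mi = M // m
            g, inv, _ = extended_gcd(Mi % m, m)  # inv is Mi^-1 mod m because gcd is 1
            x += r * Mi * inv
        return x % M

    # x mod 3 == 2, x mod 5 == 3, x mod 7 == 2  ->  23
    print(crt([2, 3, 2], [3, 5, 7]))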
4
FigBug 2 days ago 0 replies      
I was really into Project Euler when I had a job where I didn't have to do anything. I've solved 122 problems. Now I work for myself and don't have the time, and besides, I've solved all I was able to solve. I last solved a problem in 2009, I think.

It's fun, I encourage everybody to do a few. Get past the easy ones at least.

5
henrik_w 2 days ago 0 replies      
Another good one for (more general) programming problems is Programming Praxis: http://programmingpraxis.com/
6
mixedbit 2 days ago 0 replies      
I love Project Euler. A nice way to improve programming skills in a new language is to go through others' solutions in the same language after you've solved a problem. This allows you to break bad habits. Say you are a C programmer learning Ruby or Lisp: the 'C-ish' approach will often seem the most straightforward, but will rarely be optimal and idiomatic in the new language you are learning.
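
A tiny illustration of the contrast, using the first problem (sum of the multiples of 3 or 5 below 1000) and Python purely as an example language; the 'C-ish' version isn't wrong, it's just not what the idiomatic solutions in the forums tend to look like:

    # 'C-ish' translation: explicit loop and accumulator.
    total = 0
    for n in range(1000):
        if n % 3 == 0 or n % 5 == 0:
            total += n

    # Idiomatic Python: the same result as a single generator expression.
    total = sum(n for n in range(1000) if n % 3 == 0 or n % 5 == 0)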
7
gaius 2 days ago 2 replies      
A dozen Project Euler solutions in a given language can be an excellent pre-interview candidate screening technique. Quite simple to check for plagiarism too, within reason.
8
asgard1024 2 days ago 2 replies      
I solved about 80 of them, then my interest waned a little. But I wonder, are there any hints or recommended reading for the harder ones? Some of them I have no idea how to even start working on..
9
dmunoz 1 day ago 0 replies      
A lot of good links to similar sites in this comment thread.

I enjoy Project Euler, but like many people I slowly got annoyed by my lack of specific mathematical knowledge, as opposed to programming knowledge. One thing I believe would really help with this would be a resource that discussed the problems in the abstract. As an example, for most of the problems that rely on using primes, whether it be iterating over them (e.g. the first 1M primes) or the unique prime factorization of a number, discuss the known algorithms in pseudocode. Perhaps this is a bit much, as I would be satisfied with just knowing the words I need to go find resources for myself. This is what I tend to do anyway after I have taken a fair stab at a problem: "Oh, I'm doing prime factorization. I wonder if there are better algorithms than I have used." Indeed, one resource for this is the forums that are made available after the problem is solved.

Some might see this as ruining the fun, but I would personally have more fun and solve more problems if this was available.

10
datawander 2 days ago 1 reply      
To be honest, I'm shocked this is on the front page, as this website has been around for years and has already been mentioned notably, but I guess it's good to recycle very important websites for those who haven't heard of them.

My favorite problem is 98. This problem, along with the Sudoku one at 96, requires much more careful programming than some of the others, as evidenced by the drastically smaller number of people who have solved it compared to the surrounding problems.

11
bradleyjg 2 days ago 0 replies      
These are a lot of fun to do, especially in a new language you want to play with. However, they are as much an exercise of your math skills (mostly basic number theory and combinatorics) as of your programming skills. One thing I'd suggest is that you pick an algorithm reference and stick with it; if you google anything too specific you will come across one of the many sites where people have blogged about their solutions.
12
blacksmythe 1 day ago 0 replies      
If you are not challenged by these problems, here is an alternative that I found considerably more difficult:

http://www.spoj.com/

13
captn3m0 1 day ago 0 replies      
Tangentially related: I made a pseudo-terminal web interface to Project Euler called CodeBot[1]. You can view problems, submit solutions, and do much more (some *nix commands work) in your browser. It's even open-source[2] on GitHub.

[1]: http://codebot.sdslabs.co.in/

[2]: http://github.com/sdslabs/codebot

14
kozikow 1 day ago 0 replies      
In my opinion it may be better to do practice SRMs/Codeforces contests instead of Project Euler. A Topcoder rank, imo, tends to mean more, since it is timed. If someone says "I solved x problems on site X" you can't tell whether they did it in days or weeks of effort. If someone says they're red on Topcoder, you can tell they're awesome.
15
kylemaxwell 1 day ago 0 replies      
For those interested I keep a list of these sorts of things at https://github.com/technoskald/coding-entertainment.
16
Karunamon 2 days ago 0 replies      
I'm trying to go through this with Ruby right now and having a lot of fun. Being a bit rusty on basic algorithms and higher algebra has not helped much, though.
17
ahuth 2 days ago 1 reply      
There's only one problem for me with Project Euler. Eventually, the problems become more about coming up with the mathematical algorithm you need to solve them.

That may be what you want. However, a lot of these are outside my math knowledge/ability, without really expanding my programming ability.

18
doughj3 2 days ago 2 replies      
Project Euler is great but as others here have said it is very math focused. Can anyone share other programming challenge sites? I saw one the other day here on HN in a comment but can't find it again. The only thing I remember is the problem I checked out was a kind of AI / pathfinding for a "floor cleaning robot" and code was submitted directly in the page.

[Edit] Just found it going through my history: https://www.hackerrank.com/

19
selectnull 1 day ago 0 replies      
I love it, although I found I lack the math knowledge to really be good at it.

I enjoyed solving a few of those problems using SQL, that was fun.

20
donquichotte 2 days ago 3 replies      
Problem that has been solved by the smallest number of people (31): http://projecteuler.net/problem=453
21
lquist 1 day ago 0 replies      
If I see a GitHub profile with 30+ solved Project Euler problems, there's a 99% chance that person becomes a hire.
22
yankoff 2 days ago 1 reply      
Project Euler is great. Another one, but more algorithm and CS oriented: hackerrank.com
23
JakeStone 2 days ago 0 replies      
I always love this site for when things get a little slow and I think I could use some relaxation.

Then I remember that I only took a little bit of math, so then there's the research, the papers to read and decipher, the code to write, and I finally solve the problem and swear I'll never come back.

So, yeah, I just finished a batch of problems last week so I could grab a couple of ego badges that were just within reach. 75 down, 379 to go!

24
aezell 1 day ago 0 replies      
I always liked this set of challenges/riddles, though it is directed at Python specifically. I appreciated that it forced you to deal with some Internet-related programming tools and concepts.

http://www.pythonchallenge.com/

25
wanda 1 day ago 1 reply      
Weird, I was just talking about this earlier when someone asked for productive activity on train journeys to/from work.

I used to do these problems years ago when I was still a student and later when commuting to London. I did as many as I could on paper before trying to program solutions. I'll have to log in sometime and finish the few I missed.

26
elwell 1 day ago 0 replies      
How does a site this old get so many upvotes? whatever, I guess it's worth bringing back into the collective consciousness.
27
elwell 2 days ago 2 replies      
How does it work? Do you submit code or just input your answer as a number?
28
prothid 2 days ago 1 reply      
This site is great fun for tinkering with a new programming language.
29
careersuicide 1 day ago 0 replies      
Here's a little side project I've been working on for a few months: https://github.com/seaneshbaugh/rosetta-euler/

I've been a little busy lately so it's been neglected somewhat. Why is Prolog so hard?

30
sricciardi 2 days ago 0 replies      
I used it to learn the basics of F# and to practice solving algorithmic problems using a functional approach.
31
Sgoettschkes 2 days ago 0 replies      
Learning Haskell with Project Euler right now. It's great, and after solving a problem one can always look at the forums and improve one's own code or learn different ways to implement the solution!
32
veritas9 1 day ago 1 reply      
On CodeEval.com we have 126+ executable programming challenges in 18 languages :)
34
yetanotherphd 1 day ago 0 replies      
The best and worst thing about Project Euler is the binary feedback they give you: either you pass or you fail.

On the one hand, it is a good lesson in how hard it can be to write correct code.

On the other hand, real-world problems aren't black boxes where you try an integer until you get the right one. Problems where multiple tests need to pass (like Topcoder's) are much more realistic.

35
jbeja 2 days ago 2 replies      
I will start this with Python.
12
Mother sen.se
282 points by rkrkrk21  1 day ago   190 comments top 76
1
nostromo 1 day ago 8 replies      
Wait, is this real? It seems like commentary on the current zeitgeist, not a real product.

> Mother. Mother knows everything.

> She's like a mom, only better.

> Sense: the meaning of life

edit: I see they are based in France, so perhaps the branding didn't translate well.

2
michaelwww 1 day ago 7 replies      
First I've heard of it, but I had the same reaction as Cringely. Maybe it's an age thing.

"Imagine v1 of Big Brother's -- or NSA director Keith Alexander's -- most inflamed fever dream: a sensorbot shaped like a Russian nesting doll wearing a Hindi-cow smile. Then terrifyingly name it "Mother" and build it specifically to monitor as many facets of your personal life as it can. Are you schvitzing yet?"

http://www.infoworld.com/t/cringely/sense-mother-may-i-the-m...

3
devindotcom 1 day ago 3 replies      
I played with this at CES. The "mother" bot is basically just a router. The little things only sense motion, and when I asked, the lady said they had no plans to add any other types of sensitivity - temperature, moisture, light, current, etc. Compared with the other 'internet of things' kits out there battling for visibility, this one doesn't seem original or more useful, only visually striking. The tags are also pretty big for what they do. A useful thing to buy once for maybe $50, but it really doesn't seem like a worthwhile 'ecosystem' to buy into in any big way.

Also, I was unhappy to learn upon close inspection that the face is a sticker.

4
cromwellian 1 day ago 2 replies      
The way the thing is filmed with the smiley face and lighting up eyes, I could easily imagine a sci-fi horror film being based around it. :)

More seriously, the idea of using cheap motion trackers to track usage of things in the home is very interesting.

When Google acquires this, it'll make the Nest complaints pale in comparison. :)

5
CodeMage 1 day ago 7 replies      
That was a really poor choice of a name. It took me less than 10 seconds to start hearing Pink Floyd's "Mother" [1] in my head. Once that started happening, I just couldn't stay objective while looking at the pitch.

[1]: http://www.youtube.com/watch?v=p0HrrR9QDQU

6
vertex-four 1 day ago 2 replies      
As a young person who wants to remember to take her pills, to cut down on her soda consumption, to track how much she exercises (and maybe turn it into a game of walking further every week), and no doubt some more that I can't think of right now, this product sounds like it'd be amazing.

The video is a brilliant marketing asset. It showed me some very real problems of mine, and how it could help me solve them (by tracking things that I want to, and gamifying them).

The only issue is cost. As a young, single person, 166 is prohibitively expensive. It's likely not worth it for me. Is it worth it for people with families and kids? If they had 166 to spend, could they find something more pressing to spend it on?

7
pcurve 1 day ago 2 replies      
We all may be suffering from a case of Fortune 100 CEO syndrome. That is, we all wish our lives were so busy and important that we needed personal assistants managing them.

So we buy these products that make us feel more important. They document what we do and tell our story back to us through a dashboard, in an autobiographical way, as if we're some kind of celebrities.

But are we that important?

8
MartinCron 1 day ago 3 replies      
Just yesterday I posted a quasi-luddite rant about how these smart devices and services are infantilizing.

And now they're naming one Mother? I can't tell if I should feel vindicated or offended.

9
fab13n 1 day ago 1 reply      
I haven't been so creeped out by an ad for quite some time.

This looks like a solution desperately looking for a problem; that is, unless your problem is "I want to spy every single step of everyone in my family".

And seriously, "mother"? Do they even ever had one, to be that much off-mark? The very first quality of anything motherly is to be human; this is a wireless log collection system. Call it a "warden", or at best "coach", if you really need an anthropomorphic comparison.

10
Jun8 1 day ago 2 replies      
Same French company that created the successful Nabaztag rabbit and then couldn't cope with the traffic. I had my wife buy me one of those for Valentine's Day (stupid, I know) and after trying to do something useful with it and getting frustrated, I tossed it somewhere in my cube, where it remains to this date.

Apart from the super bad naming and branding, this is another reason for me to stay away from this mother rabbit.

11
cracell 1 day ago 1 reply      
Cool product but very creepy branding. Might be ok to keep the name Mother but shouldn't be emphasizing it as a "mother" on the site at all.
12
gjm11 1 day ago 0 replies      
This is the single creepiest thing I have seen in the last month.
13
dmazin 1 day ago 1 reply      
God, the future is so fucking weird.
14
hrktb 1 day ago 1 reply      
Had they called this Little Sister, they would have avoided so much of the tastelessness surrounding their current branding... That aside, what it does is already 50% doable with current smartphones plus maybe an arm band (alarm, pedometry, sleep tracking), and the other things it's trying to solve don't seem to be solved in a reliable enough fashion.

You'll have to update your coffee capsule count every time you buy capsules. Buying new packs when you open the last set of capsules is by far the easiest way to manage this, I think.

You'll have to put the sensor on every bottle you drink.

If you care that much about toothbrushing, buy an electric toothbrush. An integrated timer will tell you when you pass the 2 or 3 minute mark.

Central temperature management would need a programmable device anyway; you'll basically need a Nest, I guess.

In the end, it doesn't seem so easy to use in practice, it forwards every bit of life information to an external server, and half of what it does is better solved another way.

15
dictum 1 day ago 0 replies      
Mothers watch their sons out of love and genuine care for their wellbeing. Mine did a bad job, and that's why my next sentence will be bitter:

If a company wants to make me use a telescreen, they might as well make it a suppository.

16
atmosx 1 day ago 0 replies      
Apart from the privacy issues, which the community has already raised here, there's another thing that bothers me: applications cannot provide discipline for you.

I have tried many applications that were supposed to improve my productivity, sleep quality, or you-name-it. I don't recall a single one that managed to do so in the long run (most not even in the short run...).

So either one is open to change, which has little to do with technology, or you're toast anyway. But even when you decide to change for yourself and not because a notification tells you to do so, these technologies become time-consuming and troublesome to use. Of course they look nice on TV and in ads, but in real life most of them are frivolous, IMHO.

17
sheraz 1 day ago 0 replies      
More technology where none is needed. How about just being responsible and accountable for our own actions?

  Want to improve ___ about yourself?
  Just do it(tm).
  Get it done(tm).
  Do what it takes.
I don't need devices and a dashboard to tell me I'm winning at life.

Fuck this arrogant and stupid product.

18
nilkn 1 day ago 1 reply      
> we reinvented mothers

> Mother knows everything (in red text at that)

> She's like a mom, only better

The branding of this is either creepy or crazy. Maybe it's a bit of both. But I'm certainly not going to forget it, and the idea itself seems pretty interesting.

19
aray 1 day ago 0 replies      
I'm surprised it doesn't have wireless. Places I've lived always have the router tucked away in some inaccessible closet.
20
woofyman 1 day ago 0 replies      
It may be an age thing, but I find it creepy and useless. I haven't needed a Mom since I left home at 18.
21
notlisted 1 day ago 0 replies      
I like the concept. Surprised so many here are bugged by the marketing and/or the technology. Maybe you don't have kids (yet)?

Above all, I like the neat interface of the apps (mockups?) and simplicity of the cookie sensors. No charging nonsense, because they measure and buffer stuff but don't transmit. 1yr battery life. 10 day memory.

Sure, I'd love to see additional, more advanced cookies that would require charging, e.g. with built-in LEDs or vibration (reminding me when I enter or leave, though my phone could serve that purpose), data, GPS (though my phone could serve this purpose too; I'd need an app that intercepts an SMS after presence is detected to auto-upload GPS data), multi-mother support (one at work, one at home), integration with home automation systems (someone below mentioned the frequencies indicate ZigBee), Zapier/IFTTT support, and above all some sort of data input/output API so I can import my own data points.

By the way, $222 for a mother and 4 sensors seems quite affordable to me.

The only thing that prevents me from pre-ordering a set is that the Nabaztag history doesn't exactly instill much confidence in the makers' ability to support this thing in the long run; I also wonder where the data is stored and whether it's remotely future-proof (data import/export/backup).

22
tomphoolery 1 day ago 0 replies      
This is fucking creepy. But like most creepy things, the idea is also kinda neat. :)
23
ameswarb 1 day ago 2 replies      
Their tagline "Mother knows everything" is terrifying.
24
state 1 day ago 0 replies      
I like how open-ended this thing is. I wonder if the market is actually ready to move beyond domain-specific sensor hardware and into something broader. The aesthetic isn't quite my taste, but I'm very curious to see how their users react.
25
buro9 1 day ago 1 reply      
The sync module reminds me of the Nabaztag ( http://en.wikipedia.org/wiki/Nabaztag ) and I wondered whether Mother was going to have signals and indicators so that you didn't have to use a mobile device for insight.
26
RutZap 1 day ago 0 replies      
I want everything Mother knows/finds out to be stored locally (i.e. on my PC, not in the cloud), to be kept secure and private, and I want to access it at any time from anywhere... can Mother do that? I don't think so.

Still pretty good but as long as there isn't a privacy promise that would satisfy the basic security principles (Confidentiality, Integrity, Availability) I don't see it as a successful device.

27
zxcvvcxz 1 day ago 0 replies      
As I saw it popping up, I thought "wow that looks like a sex toy." Freudian slip, whoops.

"Sense Mother is at the head of a family of small connected sensors that blend into your daily life to make it serene, healthy and pleasurable."

You never know.

28
thatthatis 23 hours ago 0 replies      
What does it do?

I scrolled to the bottom of the page expecting some kind of explanation of what it is and why I would want it or what need or want it solves. Nothing that I could find.

29
anonu 1 day ago 1 reply      
I think this is really cool and definitely brings us closer to the Internet of Things. I don't think I would have anthropomorphized the system by calling it "Mother" and putting an eerie LED smiley face on the base station.

I can't seem to find any technical info on the "cookies". Are they similar to the technology in the Fitbit Flex, i.e. Bluetooth Smart coupled with some sort of accelerometer? If that's the case, do the cookies need to be charged every week? That remains the single massive downside to widespread adoption of such devices.

30
jds375 1 day ago 1 reply      
Seems like a pretty cool product. They have an amazing design and a beautifully done website too. The only thing I am a bit concerned about is the price: it costs 222 USD for a base unit and 4 cookies (sensors). Here's a video from CES 2014 about it: http://www.youtube.com/watch?v=024OPHSgOqo
31
cm2012 1 day ago 0 replies      
To me, this is pretty awesome and not at all creepy (22). A friendly UI and ease of use for life tracking? Yes please. Attach it to barbells to track workouts.
32
Geee 1 day ago 1 reply      
What's going on in here? I don't get the negativity. I think the branding is great, and I realized the function of the product immediately. Also, I think they presented it in a funny way (a programmable mother). It's obvious that it doesn't 'know everything'; that was a joke. It has simple sensors and you can collect data from them; there's nothing scary about that. The product is interesting, but not useful, at least for me.
33
samolang 23 hours ago 0 replies      
Interesting concept. Simplify the sensors as much as possible and do all of the work in the software. I'm guessing they have profiles that allow you to determine how a sensor's data is interpreted. I wonder if they allow you to define custom profiles or at least have access to the raw data.
34
avighnay 1 day ago 1 reply      
This thread is a good example of why names matter. If the same product had been given any other name, perhaps it would not have been noticed as much. A set of motion sensors with a central comm hub.

The makers perhaps thought that the name 'Mother' would evoke care and love in the minds of their users. To their agony, the thread reveals that though most people love their mom, they really do not want to be a 'watched over' kid.

I guess it gives all of us that creepy feeling of guilt as kids when we stole from the cookie jar and kept turning our heads with fear of being caught by mom :-)

Note to self: keep relationship names away from product names, too much friction ;-)

35
nathan_f77 1 day ago 0 replies      
This is horrible marketing. Seriously, who came up with this creepy design and name.
36
carls 1 day ago 1 reply      
This seems to be a herald of the situation described in the poem All Watched Over By Machines Of Loving Grace (1967) by Richard Brautigan.

  I like to think (and
  the sooner the better!)
  of a cybernetic meadow
  where mammals and computers
  live together in mutually
  programming harmony
  like pure water
  touching clear sky.
  I like to think
  (right now, please!)
  of a cybernetic forest
  filled with pines and electronics
  where deer stroll peacefully
  past computers
  as if they were flowers
  with spinning blossoms.
  I like to think
  (it has to be!)
  of a cybernetic ecology
  where we are free of our labors
  and joined back to nature,
  returned to our mammal
  brothers and sisters,
  and all watched over
  by machines of loving grace.
And yes, incredibly creepy.

37
EdZachary4 1 day ago 0 replies      
They need the companion "Father - Common sense" to tell you not to waste your money on nonsense like this.
38
girvo 1 day ago 0 replies      
So, Defcon last year had a talk where they hacked things like this (including a bunny ostensibly for watching your baby in another room). And it was super easy. I wouldn't put this anywhere near my house or life.
39
rglover 1 day ago 0 replies      
Will it send me a notification that says "don't disappoint mother" if I forget to do something?
40
Yetanfou 1 day ago 0 replies      
Apart from all the other emotions which this plastic big sister evokes, I wonder what it is that makes so many of these startups reach back to the crib when it comes to branding their products. From this bastardized matryoshka doll through Snapchat's Miffy-like ghost to Twitter's tweety to just about half the iconography on tablets and smartphones, they all have one thing in common: the more infantile the logo and/or branding, the better it is. Is this idiocracy at work or are they all following some celebrity psychomarketeer's edict about successful marketing to the attention-span deficient generation?
41
pnathan 1 day ago 0 replies      
Like other people: it's an interesting idea, but the branding is dystopian.
42
owenversteeg 21 hours ago 0 replies      
I personally think that I don't need to spend $222 on something that seems to be minimally useful. I don't need to monitor how often I brush my teeth, how often I drink coffee, how often I water the plants, and how often I take medication.

For the things on the list that are somewhat useful (like sleep logs + a pedometer + temperature) I have a 1975 pedometer/calculator combo that's worked fine since the day I got it, a notepad, and an infrared thermometer that is a thousand times cooler.

I think the only people that will buy this are people that want Google Analytics for their life.

43
jawr 1 day ago 0 replies      
Did anyone notice this:

"Cookies immediately send everything they capture to the nearest Mother."

For me, that's a bit of a scary statement considering how intimate the product is meant to be in someone's life.

44
dennisz 1 day ago 0 replies      
If you scroll down far enough, you get to the 'technical details', where the device is described as 'a white mother'. I just found that funny, haha.
45
webXL 1 day ago 0 replies      
Cheese.... wait for it... E!

And can I see the product without having to hunt down my country in a gigantic freaking select box first?? Isn't it fundamentally the same in every country?

46
Roelven 1 day ago 0 replies      
Happy to see the guys behind Violet are not giving up. I believe they're on to something, but the branding / choice of language is indeed poor. Whatever they launch with now will surely be extended; I'm hopeful that they've learned a great deal from the Nabaztag (which I owned back in the day).
47
xianshou 1 day ago 0 replies      
Who knows you better than your mom?

From this marketing, I'd answer...Big Brother.

48
paul9290 1 day ago 0 replies      
This looks really cool, though I had no idea what this product did based on their bloated landing/homepage.

Just a few picture examples with blurbs of text & a demo video would suffice, rather than an infographic type of website.

49
Houshalter 1 day ago 0 replies      
I can't wait to have an AI virtual assistant that can keep track of and manage all these mundane statistics for me.
50
themoonbus 1 day ago 0 replies      
I was hoping for news about an Earthbound sequel, and instead I got this weird little smiling pod.
51
dlsym 1 day ago 0 replies      
> Mother. Mother knows everything.

I guess it's "Big mother is watching you" then.

52
jnardiello 1 day ago 0 replies      
Besides the branding thing, I lost my Fitbit One in less than 2 weeks. How long till I lose one of the cookies? Dongles are not for me.
53
forgotprevpass 1 day ago 1 reply      
Does anyone know how the signals are being sent from the cookie to Mother? The company mentioned in a CES video that they weren't using the traditional Bluetooth, wifi, etc.
54
jrochkind1 1 day ago 0 replies      
Are they TRYING to scare me?
55
Shtirlic 1 day ago 0 replies      
Looks like this was done before by the Green Goose project in 2011: http://www.engadget.com/2011/02/23/green-goose-sensors-monit...
56
treenyc 1 day ago 1 reply      
Very nice; however, unless all the hardware and software are open sourced, I will not use it in my real life.
57
TeeWEE 1 day ago 0 replies      
So because I'm living in the Netherlands I cannot continue? (It's not in the list.)
58
protez 1 day ago 0 replies      
This is one of the most horrendous brandings I've ever seen. Maybe the horrible branding is intentional, but it's far too creepy for sane users who dare to use their product.
59
nfoz 21 hours ago 0 replies      
Have these people ever seen a mother before?
60
75lb 1 day ago 0 replies      
advertising slogan: "It's not enough to pipe every move, call, text and click you make from your smartphone? Mother's sensors are cute, small and funky. Collect more data for US spies today!"
61
MiWDesktopHack 1 day ago 0 replies      
Kill it. Kill it with fire. This product collects the kind of personal data that should not be handed to third parties. Too ripe for abuse. Too much insight into your existence. A scary Orwellian nightmare.
62
ilitirit 1 day ago 0 replies      
What does this gadget do?
63
sifarat 1 day ago 0 replies      
I don't know how I feel about this, right after watching the 'Her' trailer. I am speechless.
64
Houshalter 1 day ago 0 replies      
Can people please come up with better names for things, especially not common English words?
65
xname 1 day ago 0 replies      
Watching the video... first I liked it, then I hated it. It's too much. I don't want that kind of life. I don't want Mother watching me every day, every moment.
66
mikegriff 1 day ago 0 replies      
Hmm, Ireland doesn't exist. I guess they don't want me to get one, or find out about it.
67
dpweb 1 day ago 1 reply      
Expected Pink Floyd link..
68
grumbles 1 day ago 0 replies      
I got confused for a second and thought I was still reading 'The Circle' while reading this page.
69
lcasela 1 day ago 0 replies      
The son could have easily tricked the sensor.
70
pyrocat 1 day ago 0 replies      
...creepy...
71
meandyounowhere 1 day ago 1 reply      
The concept is stupid as FK. Why do you need sensors just to know basic stuff such as whether you took your pills, health tracking, etc.? You could use an app instead. All they are doing is using sensors (motion sensors in particular) and sending a message to your phone. So why would I spend $222 on something when I could just do it with a $10 reminder app?
72
elwell 1 day ago 0 replies      
Doesn't look ready.
73
indigromer 1 day ago 0 replies      
I know I'm an unfortunate case but as someone with a recently deceased mother I do not like this one bit.
74
diu_9_commerce 1 day ago 0 replies      
Bad name - I hate the fact that mother knows everything.
75
rambojohnson 20 hours ago 0 replies      
ech, creepy.
76
michaelrhansen 1 day ago 0 replies      
makes me want to go bowling
13
PSD to HTML is Dead teamtreehouse.com
279 points by nickpettit  1 day ago   161 comments top 46
1
wpietri 1 day ago 6 replies      
And thank goodness. If anybody knows where the grave is, I'd like to go piss on it.

As somebody who long ago did print design, I totally get why designers would want pixel-perfect control. It is awesome, but you get that in print because you are physically manufacturing an object and sending it to people. The web was device independent from the get-go. It wasn't your paper anymore; it was their screens. There were a couple of designers I came close to beating to death with their own Pantone books because they refused to get that.

Sadly, the desire for pixel perfection led to trying to force every single user on the planet to conform to the designers' weaknesses and fetish for control. For example, every Flash intro in the world. Or all of the goddamn fixed-width "experiences" that were either too wide for what users wanted their window to be or so narrow that acres of space were wasted. An approach that surely looked fine in presentations to executives, but worked much less well for actual users.

The great improvements in CSS have definitely helped. But I think the major changes have been the explosion of form factors (small laptops, giant desktop monitors, tablets, phones) and the rise of a generation of designers for whom the web is a native medium. The old paradigm got harder to force at the same time there were plenty of people who were thinking in a new way.

Planck wrote, "A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it." Design, like science, proceeds one funeral at a time. So goodbye, PSD2HTML, and let's quietly put a stake through its heart so it never returns.

2
reuven 1 day ago 6 replies      
It drives me totally batty to work on projects in which the designer assumes that their only responsibility is to provide a PSD file, which the developers will then turn into HTML and CSS.

I want to work not just with designers, but with Web designers, who intimately understand the workings of HTML, CSS, some JavaScript, and the implications for different browser sizes and versions. Web designers speak HTML/CSS natively, taking these limitations and issues into account when they're creating their designs. And if something needs to change, they can change the HTML/CSS that was created. If the designer only knows how to work with Photoshop, every change to the site requires a great deal of additional work and communication.

I've sometimes remarked that a designer who uses Photoshop, but who doesn't know HTML and CSS, is like a photographer who refuses to actually touch the camera, and instead tells someone else how to aim, focus, and shoot. (And yes, I'm aware that TV and movies work this way; the analogy is far from perfect.) I want to work with someone who lives and breathes Web technologies, not who sees them as just another type of output. I'm glad that this blogger made this point, and has indicated that while Photoshop might once have been acceptable, it no longer is.

3
IgorPartola 1 day ago 7 replies      
Rant to follow:

So I have done a fair share of PSD to HTML, PSD to WordPress theme, PSD to application web GUI, etc. rewrites. I generally have no problem with the concept of this, and got quite good at this. However, there are some real pet peeves that keep coming up in this workflow, that are really driving me crazy. If you are a designer working with a developer, and you happen to read this, at least please consider it the next time you produce a PSD:

First, PSDs that assume text length. For example, if you have three call-out boxes with a title and some text to follow, don't assume that the title will always be one line and the text will always be the same length. Instead, figure out what this will all look like when you do have very uneven amounts of text. Do we center it vertically? Do we abbreviate it?

Second, PSDs that don't assume a responsive design. Sure, working directly in the medium (HTML/CSS) would solve this, but you can still provide some direction here. Tell me how the columns should be laid out, which parts of the site should expand/collapse with size, which parts can be hidden, etc.

Third, and this goes without saying, but clean up the PSD layer names and groupings. Layer 1, Layer 2, etc. is not a great convention for this.

Fourth, show me the unusual cases. I know the clients always want to focus on the prominent pages, like the home page, the product listing, etc. Those are important, give me those. But also give me what a form submission error looks like. Or what a 404 page looks like. Or an empty shopping cart. Or pagination. Or a table that's wider than the viewport would normally allow.

Fifth, consistency. It sucks for the developer, and I'd argue it sucks for the user, to have every page use a slightly different set of CSS rules for headers, paragraphs, lists, etc. Best case scenario here is to give me a style guide I can trust. I know it's two different documents you now need to maintain, but honestly this is the biggest help you can give me.

Sixth, show or describe to me the interactions and workflows. A simple shopping cart can become a giant minefield of interpretations of what the design is supposed to convey.

Seventh, and this is a bit meta, but don't walk away from the design before a single line of HTML/CSS is written. This is bad because there will be questions about interactions, etc. If I first have to email your boss's boss just to see if I can ask you a simple question, the process is broken and I will not recommend working with you again.

Eighth, if you do promise to deliver sample HTML/CSS, for the love of god, do this well. I have recently had the misfortune of having HTML/CSS/JavaScript delivered to me for a large site redesign by a big-name web design agency. I was very excited about this, especially since these guys said they would use Bootstrap as the foundation so that we would have all the benefits of that framework built right in. I got the files, opened them and OMG. It did include Bootstrap, but in name only. After that declaration, it instead included a completely custom column system that was just slightly incompatible in sizes with Bootstrap's. It also used none of the same class names even where it made sense, etc. Needless to say, I had to rewrite all of their CSS from scratch and re-adjust lots of the Bootstrap variables to accommodate their column system.

</rant>

Great designers are worth their weight in gold. The above highlights that the waterfall process of design -> develop does not work. Instead it should be design -> develop/design/develop. If you cannot step outside of Photoshop that's fine, but if you want to be efficient, you must know the final medium, which is the web.

4
bbx 1 day ago 0 replies      
I'm currently redesigning a backend interface, and it's the 1st time since I've started my Front-End career (7 years ago) that I'm not using Photoshop at all. I'm just using Bootstrap, Sublime Text, and Chrome.

For many projects of course, it won't be sufficient: clients want (and probably need) a stunning Photoshop mockup to provide feedback and boost their self-assurance.

But if you combine a simple CSS framework (even if it's just for a grid system), Chrome's inspector, a selection of Google Fonts, and some sense of "flat" aesthetics, you can come up with a more than decent, and sometimes amazing, design. Plus, it takes 70% less time, especially considering it's usable right now.

37signals mentioned this "skipping Photoshop" attitude in 2008 [1], but I never quite managed to put it into practice until recently.

[1] http://37signals.com/svn/posts/1061-why-we-skip-photoshop

5
dredmorbius 1 day ago 0 replies      
As a mostly back-end guy (systems, network, databases) who's dabbled in HTML and CSS, somewhat increasingly over the past few months (latest results below), I've taken a highly pragmatic approach to how I prefer sites styled:

Consider the screen as a sheet of paper on which you can 1) communicate your message 2) provide a UI, and 3) apply your branding. Modest amounts of logo / artwork, color palette, and styling touches go a long way. Other than that, it's a rubber sheet. There are no fixed dimensions.

Start with a basic HTML5 framework. body, header, article, aside, footer.

Put minimal elements above the fold. Your header, logo, and some basic navigation. Emphasize body text and / or UI.

You almost always want to design around the text. That's your payload. For interactive tools, controls layout should be clear, consistent, logical, and most of all provide enough space to meaningfully navigate options. For that last: size-constrained modal dialogs or their equivalents (pop-up menus, etc.) are strongly deprecated. Unless the user needs to see other content while performing input, that dialog should be front, center, and the principal screen element.

CSS gives you a whole slew of tools: special selectors, including :hover, :active, :first-child, :last-child, :nth-child, :nth-of-type, shadows, columns, and more. No, MSIE legacy doesn't support many of these. Fuck'em.

Stick to light backgrounds and dark fonts, with few exceptions. http://www.contrastrebellion.com/ is strongly recommended.

Think of your page in either ems or percentages, and almost certainly ems (scaled to your principal body font).

Provide a minimum page margin of around 2ems for desktops. For mobile, enough to keep text from sitting flush with the edge of the screen, 0.25em typically. Don't crowd your text. I accomplish this by setting a max width (usually 45-60ems depending on context) and a 2em left/right internal padding. This provides a comfortable reading width but preserves margins in narrow displays.

Scale fonts in pt, or use relative/absolute sizing based on the user's preferences. I recommend "medium" for body text.

Other than image elements and logos, avoid use of px. Never mix px and ems (say, for line heights).

Rather than a traditional sidebar, use CSS column elements for your asides, which are then full-screen width. @media queries can toggle between 3, 2, and 1 column views.

If you've got to float elements, float right of text rather than left. This is less disruptive to reading. 0.5 - 1em padding or margins is usually appropriate.

For long lists, I'm growing increasingly partial to li { display: inline; } or inline-block (the latter allows tricks such as :first-letter but fails for wrapping).

Make modest use of dynamic elements. I'm generally not a fan of flyouts, automatically opening menus, etc., and they're among the first elements I nuke when modifying sites. Color shifts to indicate links and other dynamic elements, however, can be useful. Google's "Grid" is a notable exception to this rule.

Don't fuck with scrollbars. Allow the user environment defaults. Yes, Google+, I'm talking to you.

DO NOT USE FIXED HEADERS OR FOOTERS. Far too many displays are height-constrained, and robbing another 10-25% of the display with elements which cannot be scrolled offscreen is an insult. If you've got to fix something, put it in a margin. Do not fix ANYTHING for mobile displays.

CSS modification: Metafilter lite http://www.reddit.com/r/dredmorbius/comments/1v8fl5/css_adve...)

6
rwhitman 1 day ago 2 replies      
I swear I feel like I've read a version of this article once a year since the advent of CSS. This is a naively utopian vision of the future. The designer/developer is a very rare breed outside of the HN community. Most designers can't / won't write markup or CSS, and most developers are piss-poor designers. The design->planning->building segmented workflow will always exist, as it has in all engineering disciplines since the dawn of human civilization.
7
Trufa 1 day ago 3 replies      
I'm a little bit confused about the workflow they are suggesting.

I'm a web developer with "good design taste" but I definitely can't do the design myself; I always pair up with a designer who does the PSD.

But of course this doesn't mean that when I see a navbar that has a gradient I copy and paste the image of the navbar into my website with an <img>; my job is porting these images to HTML, CSS and JS.

If you're actually pasting in images from the PSD, you're definitely doing it wrong, but in my case I still need a highly detailed design from which I can build a website; otherwise I have to design it myself, and wireframes only get you so far.

When I'm working with a good designer who knows how the web works, I feel it's a great workflow.

8
tomatohs 1 day ago 1 reply      
This article should be titled "the slice tool is dead."

The slice tool represents the direct transformation of raster image to website. We all know that this isn't possible anymore because of mobile, retina, etc.

But Photoshop and image editors still provide tremendous value to the web development process for mockups, image assets, colors, etc.

What this article is trying to say is that the process of turning a design into a website has become much more difficult. A PSD is no longer a final deliverable but the beginning of a conversation.

Now design needs to be functional. Instead of taking the static image you get from a PSD, you need to ask "What does this look like on mobile? What about huge resolutions? What if we don't have that content?"

The article suggests that this process will be improved by designing in the browser thanks to CSS3.

The truth is that the browser has just barely hit the minimum requirements to be able to make design decisions. Have you seen the Chrome color picker? It's alright for choosing a border color but final design work can not be done entirely in the browser just yet.

9
elorant 1 day ago 7 replies      
As a developer I hope that CSS will share a similar fate sometime in the not too distant future. It's freaking hideous, doesn't work as it should, and in order to build any decent modern site you end up writing something like 5,000 LoC. Nine times out of ten, when I want to do something with CSS I prefer doing it with JavaScript.
10
efsavage 1 day ago 2 replies      
I disagree. In the hands of a competent web designer, Photoshop is still the most expressive tool available. I've been bouncing PSDs with a designer for the past couple of weeks and I want him being creative and making something beautiful, not constantly worrying about how the images are going to get sliced up or sprited, or what's SVG and what's not. That's my job. So long as there is an iterative process in place where I can keep him within the bounds of reality, it all works out very well in the end.
11
danboarder 1 day ago 0 replies      
Photoshop may be dead as a starting point, but not quite dead as an intermediate step for customized template design. A workflow that works today for quick site turnaround in commercial web design is to screenshot a Wordpress or other CMS responsive template, bring that into Photoshop, drop in branding, color changes, and replace content to produce a comp for presentation to clients. It is still quicker to make design changes in this Photoshop intermediate phase. Once the design is signed off, it's fairly easy to customize the CSS in the original template and arrive at a branded site the client is happy with.
12
tn13 1 day ago 0 replies      
Thank goodness. My life was hell when I was working for an Indian outsourcing giant where they made web applications like an assembly line.

The designers were hired from schools that taught only print media design. They made PSD mockups, which went to frontend developers who would then make HTML out of them with dummy data.

For example, say you are designing a charting app for a banking company. They would create a pie chart in the PSD and then ask the frontend devs to convert it into HTML. So these people would put those charts in as images. When it arrived with us, the backend team, we would realize that the graph needed to be dynamic. If we used any other charting library it would look ugly against the overall design.

Not to mention that if the webpage did not look pixel perfect in FF and IE it would be filed as a bug. Countless human hours were wasted making corners round in IE.

The really interesting part was that the banking giant did not give a shit about the design in the first place, nor about browser compatibility. The app was meant for, say, 30-40 of their employees, who could simply switch to FF if they did not like sharp edges in IE.

In the battle of egos between the designers and the testers, we were the ones who got screwed.

13
callmevlad 1 day ago 0 replies      
The pain of the PSD->HTML workflow, especially around responsive design, is one of the reasons we're working on Webflow (https://webflow.com). While Photoshop will have a critical role in web design for a long time to come, having to deal with multi-resolution elements is extremely tedious.

Also, Photoshop layer styles are way behind what's actually possible with CSS3 these days (multiple shadows, multiple background images, etc), so designers who have to implement a website end up doing their work twice. With a tool like Webflow, implementation work is part of the designer's workflow, so once something looks good on screen, it's actually ready to ship.

Granted, designers have to learn the base concepts of how content flows in a website (the box model), but I think that's a small price to pay for designing directly in the intended medium.

14
sandGorgon 1 day ago 0 replies      
This whole post, and comments, sound extremely unrealistic. In an ideal world, things work as you would say - but in the real world, things don't work like this.

I'm not sure if any of you guys have seen the inside of a psd2html place - it is highly optimized, with a hive mind around browser compatibility. I would say that the best-of-breed slicers leverage Bootstrap, Sass/Less, etc. and incorporate their experience into the work.

I would argue that the missing piece is not some new, magical way of doing things - but rather the interchange formats. For example designers don't use PSD grids that account for fluid layouts (FYI - I'm not even opening the can of worms that is responsive design). This makes it hard for slicers to deliver fluid layouts.

The search for the mythical designer + SASS engineer is very hard and very likely futile. In fact, my opinion is that you are starting the process incorrectly. I suggest finding a best-of-breed slicer and STARTING the design process with them, as opposed to with a designer (get their recommended grids, etc.), and then giving the designer a set of constraints to work with. This should ensure your downstream workflows are smooth.

15
at-fates-hands 1 day ago 0 replies      
I actually stopped working in Photoshop about two years ago when I realized you can prototype faster just by building a design from scratch in the actual browser.

It's so much faster than having a designer painstakingly mock something up in PS, then having me build it and realize a myriad of things that weren't apparent because we weren't looking at it in an actual browser.

16
anthemcg 1 day ago 0 replies      
I am not here to say that web designers should create PSDs and just throw them over the fence.

But I don't think most web designers really agree with this. I think this philosophy tries to downplay visual style in favor of practical problem solving, and I believe they are both essential.

I can write competent HTML/CSS/JS, Frameworks etc. At least, I know enough to work with engineers and work effectively in my projects. For me using Photoshop isn't just about what browsers can and can not do. Its certainly, not just about pixel perfection or making a design ready to code.

Working with HTML is just clunky. Working with paper is too loose. I can think about how to build a design and plan it on paper, but exploring visually is actually quite constrained when you try to do it with markup or just paper/wireframes. Photoshop represents an open environment where I can create anything I need, from an illustration to a button, and it's powerfully close to what it will really look like. To some people that might sound like a clunky or wasteful step, but I think it really helps.

For sure, I think Nick makes some great and valid points here. I agree, there are problems with the PSD process but direct prototyping and CSS frameworks just don't solve those problems.

I don't know, I feel like if in reality everyone used HTML to design, everything would look like Bootstrap and that would be acceptable.

17
tomkin 1 day ago 0 replies      
I don't know what the author at Treehouse is doing, but I use the PSD as a visual representation of what I will end up creating as CSS/HTML/JS. Who was still seriously drawing grids and cutting out PNGs/JPGs?

The takeaway for many reading this article is going to be: Photoshop is not the way to design a website. The article does attempt to address, near the bottom, that this is not the case.

In the end, the author admits that you do need some design reference point (Photoshop, Illustrator, paper, etc). I do remember the days of cropping out many images, backgrounds, etc., but that was at least 6-7 years ago.

18
ilaksh 1 day ago 0 replies      
I agree that PSD to HTML is generally now a bad idea that will make the task more difficult.

However, I believe that the idea of having an interactive design tool should not be abandoned so easily.

I believe that we should create interactive GUI design tools that support the back-end encoding.

I know that doesn't meld well with hand-coded and maintained approaches.

I believe that we can create design tools that output acceptable markup. But I don't think we have to.

I think that the business of writing code in order to lay out a user interface is ludicrous. I do it, because that's the way most everyone does it these days. Most everyone also drives massive 5-passenger vehicles as the sole occupant, wasting huge amounts of energy driving to and from work every day. Point being, just because that is the way people do things doesn't mean it makes sense.

Programmers by definition write code. If you're not writing code, you're not a programmer. The problem is the definition of programming needs to be updated, since we now can create very sophisticated programming tools that have friendly user interfaces.

19
ChikkaChiChi 1 day ago 0 replies      
We don't live in a world where every web user is part of a majority of three monitor resolutions and web design has changed to accommodate that. Web sites need to scale properly and that cannot be done with raster graphics.

If you are using a raster program for anything other than mockups before you head into real design, you are doing yourself, your clients, and their customers a disservice.

20
wwweston 1 day ago 0 replies      
Well, as long as we're making controversial statements (those in the "____ is dead" usually are)....

I think Photoshop as a design/layout tool may have done more damage to front-end design/development productivity than Internet Explorer. And this article is just an indicator that there's a growing awareness of how.

Photoshop is an amazing raster image manipulation tool. But the dominant mechanics have always been about composing a series of fixed-dimension bitmapped layers (outside some shoehorned not-quite-layers-but-actually-layers there's really no other kind of entity to work with). For that reason there's always going to be an impedance mismatch between the tool and the web.

21
discordian 1 day ago 0 replies      
He may wish it was dead, but I can assure you there is probably more PSD to HTML work going on now than ever before.

First of all, it seems the author is not even opposed to the idea of mocking up a design in PSD. He just thinks that responsive design and advances in CSS have altered the process somewhat. OK, point taken, but this doesn't make the overall concept of PSD to HTML obsolete by any stretch of the imagination. The majority of designers will always favor mocking up their intended design in a program like Photoshop, and using that as a starting point for the development process. Responsive design just adds an additional layer of complexity, which may call for additional mockups.

I've heard people advocate prototyping concepts directly with HTML/CSS, but this is ultimately a rather inefficient way to work if you are a detail-oriented designer.

As far as the actual workflow changing and becoming more iterative, it completely depends on the context. Not everyone works at a company like Treehouse that has a team of in house developers and designers. Many website projects - the majority even - are the result of small businesses subcontracting the process out to various companies. It's not always possible for the designer and the developer to be in the same room. So as an ideal - sure, the designer should be involved throughout the process, but this doesn't always match the reality.

22
tlogan 1 day ago 2 replies      
What is the best HTML page design tool? I.e., designing CSS and HTML with minimum coding?
23
seivan 1 day ago 1 reply      
PSD to iOS as well. I just wish companies would stop wasting resources on Photoshop goons and let the engineers who work with the platform & SDK do the design.
24
mgkimsal 1 day ago 0 replies      
Yay. I'm surprised it was ever a thing, really. Maybe not surprised, but pissed off. We've all got our horror stories - I got a PSD with > 200 layers (4 layers for 4 rounded corners on a button - WTF). It was just crazy.
25
zx2c4 1 day ago 0 replies      
The work flow might be dead but... psd.js lives on!

http://git.zx2c4.com/psd.js/about/

    git clone git://git.zx2c4.com/psd.js
This is a neat project from `meltingice`.

26
mratzloff 1 day ago 0 replies      
tl;dr Most browsers support modern CSS techniques that remove the need for image-based techniques, and mockup tools like OmniGraffle and Balsamiq make it easy to create layout drafts.
27
goggles99 1 day ago 1 reply      
Link bait warning. The author even admits in the comments that what he means is "Going directly from a PSD to an HTML file is dead".

Link bait may get you more traffic in the short term, but it will likely just hurt you in the long run. Especially since lots of people think that he is an idiot now.

Why? Who would have thought... Modern day web dev needs to be rendered to different sized screens and we have CSS3, more skills, and better tooling now.

Who does not know this already? I was baited and now he is hated (JK)...

PSD is still used quite commonly for conceptual purposes. Of course no one expects anymore (did they ever?) that it will be pixel perfect across devices ETC.

28
atomicfiredoll 1 day ago 0 replies      
"Everyones workflow is different and nobody knows how to make the perfect website. You should always do whatever is most effective for you and your colleagues."

Not to say that there aren't some valid points brought up, but this feels like dramatically titled click bait with a weak conclusion.

When I click a title like this, it's because there is an implication that a better process exists--I want to know what that process is! At best, it's only hinted at here.

I know teams that are using processes similar to the PSD oriented ones outlined in the article very successfully. I suppose that means that it's not dead for them, as it's effective.

29
Bahamut 1 day ago 0 replies      
As a frontend developer, I like having designers figure out the look of a page, and implement the look in a way that doesn't break what I've implemented. If they need help, I don't mind helping - in fact, I have a bit of design experience as well. However, it is not a good use of my time, so I don't do too much of the css.
30
lstamour 1 day ago 0 replies      
I'm surprised no one here's mentioned Photoshop CC's Generate function yet, especially given that it was written in Node.js: http://blogs.adobe.com/photoshopdotcom/2013/09/introducing-a...
31
Sssnake 1 day ago 0 replies      
Now perhaps in 10 years idiots in suits will finally stop demanding ridiculous pixel perfect control over website designs.
32
workhere-io 1 day ago 0 replies      
"X is dead" is dead. Just because you don't use X doesn't mean others don't use it.

A very large number of people who do web design for a living are much better at making their visions a reality using Photoshop than using HTML/CSS.

33
SkyMarshal 1 day ago 0 replies      
Well, it should be dead, but like COBOL it'll be around a long time, simply because there are tons of expert Photoshop designers who are much more productive with that tool than with raw HTML/CSS and need their designs converted. I'm working with one right now; I don't see it going away anytime soon.
34
d0m 1 day ago 0 replies      
The concern is also about needing to hire two different people all the time. It slows down the workflow so much.. it's way easier to have one person in charge of it and being able to design, hack and tweak it as the projects evolve.
35
joeblau 1 day ago 0 replies      
If you're looking for a fast way to extract images from your Photoshop file by layers/visibility/etc., I highly recommend this software: http://getenigma64.com/

And if you're trying to extract gradients from Photoshop into CSS, SCSS, SASS: http://csshat.com/

36
jbeja 1 day ago 1 reply      
And I would be glad if people stopped making icons and UI sets in PSD and started using a more portable format like SVG.
37
j_s 23 hours ago 0 replies      
Here is the direction designers should be heading:

http://www.sketchingwithcss.com/

38
wil421 1 day ago 0 replies      
Tell that to the people I work with; this is something I just did last week. I dislike doing it, and I don't really agree with the camp that slices their page into images.
39
grimmdude 1 day ago 0 replies      
When I first read this article I didn't really agree with it, but after reading some of the comments on here I can understand where it's coming from.

I think the main issue is that the designer should understand that it's more of a guideline for how the site should look. When they start getting nitty-gritty about exact line breaks and page-by-page style changes is where it gets hairy and falls apart.

I don't think moving away from mockups is the answer if that's what the article is implying. Just a greater understanding of modern web abilities and standards is all that's needed from the designer.

40
supercanuck 1 day ago 0 replies      
So what is the replacement?
41
ctrl 1 day ago 0 replies      
+1 to all these replies. As a designer first, I taught myself to code, just as I taught myself how the print medium works. These details are integral to the end product.

A web designer that doesn't understand code != a web designer.

I think Photoshop should be replaced with Illustrator for the initial design phases.
1) You can do wireframes in Illustrator, then build directly on top of that for the design.
2) Multiple artboards let you lay out multiple screen sizes/breakpoints.
3) Resizing elements and keeping crisp edges is much faster.

42
martianspy 1 day ago 0 replies      
I would like some constructive advice. I currently use sliced PSDs as part of a page content workflow to get a one-page flyer from InDesign onto the web.

I start with a one page PDF which was generated in InDesign for print. This one page flyer needs to be linked to products on an ecommerce site. The flyer changes weekly.

Currently I open the PDF in Photoshop, slice it, add the links and upload it into an iframe. It takes about 15-30 minutes to get it from PDF to live.

What would be a more efficient way for me to convert this PDF to clickable web content? I don't want to spend more time than I currently do on it.

43
leishulang 1 day ago 0 replies      
With tools like Edge Reflow, the human part of PSD-to-HTML is going to be dead. HTML/CSS/JS will become the assembly language of the web that no one programs in directly. Designers will keep working in PSD and use Edge-like tools to export HTML/JS files, and coders will be using ClojureScript/Fay/CoffeeScript etc.
44
mtangle 1 day ago 0 replies      
But a picture is a good start to show what kind of design you want. And yes, on many occasions some designers are way too picky about their PSDs.
45
bluemnmtattoo 1 day ago 0 replies      
stone and chisel is dead
46
thomasfoster96 1 day ago 0 replies      
Hurrah!
14
VC Pitches in a Year or Two avc.com
258 points by SethMurphy  1 day ago   95 comments top 32
1
r0h1n 1 day ago 3 replies      
This is already reality in some countries. Like India for instance.

1st Indian entrepreneur: I plan to launch a search engine that understands Indian languages and contexts better than Google.

VC: Well since Google has already paid telcos like Airtel (http://www.airtel.in/free-zone/) so their searches and even some results don't use up any of the data plan, we are passing.

2nd Indian entrepreneur: I have an idea for a social network that is better than Facebook.

VC: Sorry, but since Facebook has already paid telcos like Airtel (http://www.medianama.com/2014/01/223-airtel-facebook-free-hi...) so that their site/app doesn't consume data while being used, we are passing.

2
legutierr 1 day ago 3 replies      
I think the pertinent question now is whether the FCC rewrites its rules to classify ISPs as common carriers. It seems to me, given the local monopoly or duopoly that the vast majority of ISPs enjoy, that this is an obvious move. But I have heard it barely discussed, which is distressing.
3
OoTheNigerian 1 day ago 0 replies      
The issue of net neutrality is a global phenomenon/risk. Telcos in Nigeria abuse their positions as the primary carriers of data, and they are protected by having 'licenses'.

MTN and Rocket Internet recently signed a deal. I wrote about the risk here:

http://oonwoye.com/2013/12/17/mtn-rocket-internet-deal-worri...

4
antr 1 day ago 1 reply      
I'd like to believe that many European entrepreneurs will reconsider going to the US to start a company. The European Parliament and the Commission have been straight shooters on net neutrality, and they will not allow telcos to play with the pipes.
5
zxcvvcxz 1 day ago 5 replies      
Not the point of the article, but is anyone else annoyed by these "entrepreneur-sounding" ideas? I'm so sick of low-tech, solve-first-world-problem ideas and conflating that with entrepreneurship. If I were the VC, I'd tell them to get the fuck out, and it'd have nothing to do with Telcos.
6
TTPrograms 1 day ago 3 replies      
The author is missing the point that the ruling was about the specific language of the FCC regulations. See: http://gigaom.com/2014/01/14/breaking-court-strikes-down-fcc...

"That said, even though the Commission has general authority to regulate in this arena, it may not impose requirements that contravene express statutory mandates. Given that the Commission has chosen to classify broadband providers in a manner that exempts them from treatment as common carriers, the Communications Act expressly prohibits the Commission from nonetheless regulating them as such. Because the Commission has failed to establish that the anti-discrimination and anti-blocking rules do not impose per se common carrier obligations, we vacate those portions of the Open Internet Order."

It seems very likely that the FCC will rewrite their regulations to fix this. Everyone knows that net neutrality is important, and this ruling is just an issue in legalese. It's a little early to resort to torch-and-pitchfork hyperbole.

7
mwsherman 18 hours ago 0 replies      
Here's the problem: what AT&T is doing does not violate any technical definition of net neutrality, unless we ad hoc append new parts to it.

This is the core problem of net neutrality arguments: it is often defined as "I know it when I see it." That amounts to principles, but if we are going to have an enforceable law, we need to do better than that.

AT&T is not offering any priority to any bits here. Nothing is being blocked or degraded. Content providers who pay for sponsored data do not get faster bits, nor do they slow down anyone else's.

It's free shipping: http://clipperhouse.com/2008/06/03/the-long-game-on-metered-...

Now, I can understand objecting to it on its merits, and Fred is making that argument, which is great. And I can understand why it feels like a violation of net neutrality, but we need to do better than feelings.

Here's how we test whether we're defining net neutrality ad hoc: show me a clear, specific, widely accepted definition of net neutrality that describes AT&T's behavior here, and that existed before this behavior was publicized.

8
dzink 9 hours ago 0 replies      
I am watching the reactions of non-technical people after they hear how this latest Net Non-Neutrality development affects them. Some are saying "Google, Apple, and the other big guys will step in for us". Others reply that when gas prices jumped to $3 everyone cried out and legislators started shuffling, yet we are now paying $4 and being thankful. Unless the outcry gets bigger this is going to pass through the cracks.

Do any of the big tech companies really have an interest in stopping this development? They could afford to buy themselves an "unrestricted by cap" deal with internet distributors and suffocate every other potential competitor?

9
n00b101 16 hours ago 0 replies      
Entrepreneur: I plan to launch a better [XYZ] hosted on Amazon AWS.

VC: Well since Amazon has paid all the telcos so that services delivered through AWS "telco-optimized elastic IPs" can be free on data plans, all you have to do is include Amazon's surcharges in your business plan.

10
rexreed 18 hours ago 0 replies      
While I hate the loss of Net Neutrality as much as the next person (as long as that next person isn't one of the incumbent carriers), this is the sort of thing that happens frequently in other markets. You want to see entrenched competition? Try launching a company in the cleantech markets or in certain hardware, biotech, or healthcare fields. Deep-pocketed incumbents regularly flex their muscles there.

The loss of net neutrality is bad from many perspectives, but to be honest, there will ALWAYS be opportunities for startups and entrepreneurs in the space, and VCs will not want for good ones. All this does is shake out many ideas in favor of other ones.

I don't see why the VCs have reason to panic. And while I understand the entrepreneur ideas were straw men used to illustrate a point, the quality of these ideas is pretty low. Maybe we should see the silver lining on this dark cloud in that it will shake out some of these deals from being funded when they probably shouldn't be anyway.

11
badman_ting 19 hours ago 0 replies      
Yes, current giants are going to continue consolidating, buying up businesses, and amassing power. In America we used to have ways of dealing with monopolies, but that is now passé -- we don't like the government "punishing" successful businesses (in the same way that taxes "punish" the rich). A lot of times I feel like something very bad will have to happen for things to change. Until then, hey.
12
anovikov 22 hours ago 0 replies      
Scary thing: this is the same thing that created the Rockefeller-era monopolies. They had the right to pay railways, and did, to create preferences (price-wise and otherwise) for their traffic, literally derailing competition. The Internet is the lifeblood of the post-industrial economy just as railways were of the industrial one. This is much worse than most of you think.
13
avighnay 22 hours ago 0 replies      
The blog bases its argument on the premise that data plan costs are going to be high. Would this be the case in the future? What if data plan costs become so negligible that it does not matter?

Secondly, compare this to TV networks: consumers watch TV and pay for it too, while part of the cost is subsidized by advertisers who are willing to pay the network to reach the audience. The consumers are in that network only because of the content; remove the content providers or reduce the quality of the content and the consumer vanishes. An empty network is worth nothing.

Would telcos not harm themselves and their whole data plan business by attempting to charge the content providers (Google et al.), and would the content providers' advertising-model margins justify paying out to the telcos just to get through their infrastructure?

14
napoleond 19 hours ago 0 replies      
So basically it will become even harder for startups to disrupt the established players. Until a disruptive telco comes along and everyone switches to them, while also learning an important lesson about net neutrality. Maybe the whole thing will even cause society to re-think the way we allocate access to the internet and cellular networks.

(A man can dream!)

15
ankitml 1 day ago 1 reply      
Facebook already does this in India. I remember seeing an advertisement from a telco that said you won't be charged for data for Facebook. Does this mean that India already had this non-neutral internet? It didn't change the scenario much here.
16
crgt 22 hours ago 0 replies      
A friend of mine was part of the team that made the decision for UHC to participate in AT&T's new sponsored data program. When I found this out, I tried to point out the broader implications of a shift towards pay-to-play, but had trouble getting past the narrow vision of "but we just want to get our content out to people that can't pay for mobile bandwidth". Just wanted to say thanks for this article for demonstrating what some of the implications are of a shift to a pay-to-play model. Non-techies really seem to struggle to get their heads around what happens to the entire ecosystem if net neutrality dies, and articles like this are helpful for making the consequences clearer.
17
mathattack 1 day ago 0 replies      
I get that this is an issue, but I'm a little curious why AT&T's stock price didn't pop as a result.

http://finance.yahoo.com/echarts?s=T+Interactive#symbol=t;ra...

18
josefresco 1 day ago 0 replies      
Forget net neutrality, doesn't this reality already exist when you factor in data caps? Most of these startup ideas use a fair amount of data which is now being capped more often than it has in the past.

I may be alone in my worry that as we move to capped data plans and the pay-per-bit model, many new ideas and concepts (say, for example, always-connected appliances) won't be financially feasible for consumers.

19
andrewescott 16 hours ago 0 replies      
There seems to be an assumption in the original argument that not every company will be able to get access to sponsored data, e.g.

> Well since Spotify, Beats, and Apple have paid all

> the telcos so that their services are free on the mobile

> networks, we are concerned that new music services like

> yours will have a hard time getting new users to use them

> because the data plan is so expensive

If a new music service could make a deal with telcos so that their service is free too, wouldn't this problem go away? In other words, if sponsored data was open to all, does this address the concern described?

The real concern seems to be that the cost base of a new service will go up because it will be forced to pay for sponsored data in order to compete, and VCs aren't happy about having to cover increased costs of their portfolio companies.

A similar argument could have been made about CDNs. Because the big services use CDNs to provide a better service, startups have to pay to use CDNs also in order to compete, and hence their costs are higher.

20
exelius 23 hours ago 1 reply      
Most VCs would pass on these ideas regardless. They're all in mature, saturated markets. One of the signs of a mature, saturated market is that the incumbents have gone to great lengths to create barriers to entry that are very high. This is a discouraging thing for VC investment because it increases the chances that a new entrant will fail.

Besides, you've had to pay for access for years. The big difference is that before, you had to go through a CDN like Level3 or Akamai. Part of what you paid them went to the ISPs to ensure fast connections. All this means is that the YouTubes of the world will begin to buy interconnects with the big ISPs. Small ISPs will likely just band together into a cartel and sell access that way.

Yeah, there will be a fast lane and a slow lane, but the advent of CDNs in the early 2000s already created that anyway. The data caps are disappointing, but not really unexpected if you look at the mobile market. We're reaching the point where in many major cities, there is no media consumption that requires a much faster connection than is already available. So why will consumers pay for more speed when their existing 50mbps cable modem is enough to stream 4k video from Netflix? Those speeds ARE possible today if you buy carriage through a CDN like Akamai (and I regularly get those speeds from Steam downloads), and the fact that Netflix hasn't is really more a reflection of their business model.

The most recent ruling really changes nothing, because net neutrality has been dead for 10 years anyway. While everyone on the internet was complaining about it, the business side moved on and built a few billion dollar companies around it. Capitalism at its finest.

21
delinka 1 day ago 2 replies      
So someone should plan to launch a VPN service that will pay telcos so its traffic doesn't use any of its customers' data plans. Configure the mobile device to route all data via the VPN, encrypted. It'll charge its customers for access to unlimited apps, sites and streams without incurring a data plan hit.
22
SethMurphy 1 day ago 2 replies      
While I agree with his premise, there are already industries that used to be startup focused where there is a gatekeeper. E-commerce has Amazon, advertising has Google (and possibly Facebook), which already destroy new startups before they really get started through this same VC mindset. It seems to me that the balance of power is just moving up the line a bit to industries where VC's have little to no power to influence.
23
thejosh 1 day ago 1 reply      
Australia has been doing this for years with mobile data plans. Even though data plans aren't as expensive as they used to be, Facebook, Twitter, Foursquare, eBay, LinkedIn & MySpace are all "free" on the Optus network.
24
dredmorbius 1 day ago 1 reply      
This is the best possible thing that could happen.

Perhaps we'd see startups aimed at building out solar capacity, grid-scale storage solutions, electricity-to-fuels solutions, solar-powered airships, high-efficiency wind-steam hybrid shipping, high-efficiency retrofits to existing building stock, and management or treatment for TDR-TB, rather than an endless stream of privacy-invading "social" surveillance services, games, and new forms of intrusive and annoying advertising.

Though building out an alternet that bypasses the telco's "authorized" channels wouldn't hurt either. Mesh and darknets.

Get cracking, HN!

25
prolifically 1 day ago 0 replies      
I hear him: the Internet could become a caste system for businesses. But from what I understood, the ruling was more about the FCC trying to bend its rules around carrier classifications (common carrier or not). Yesterday's ruling raised awareness, but the game is far from over.
26
BrownBuffalo 23 hours ago 0 replies      
What's interesting are the comments below the post about net neutrality. The problem is less that traffic shaping may occur and more that there are no safeguards if someone attempts it. In smaller markets, it's sounding more and more like a small cottage industry will spring up because of the lack of existing laws to protect the consumer. The larger markets will have power in numbers, but mom-and-pop towns like Marion, AL with only a small regional carrier - not so much. Problematic, and there is SOME truth to this in terms of economies of scale.
27
codingdave 1 day ago 1 reply      
I get the point you are trying to make, but it wasn't presented in a way that non-tech folk will appreciate.

My parents' reaction to your scenarios would be something akin to: "Really? For the monthly rate I'm already paying, I now get Netflix and Hulu included for free? NICE!"

28
neovive 1 day ago 0 replies      
I guess this makes open city WiFi networks much more compelling. In most urban areas, how often are people not within range of a WiFi network?
29
tarr11 23 hours ago 0 replies      
I guess we need a chrome extension to uuencode your next social network over FB status updates.
30
orenbarzilai 1 day ago 0 replies      
imho in the near future most countries will have unlimited data plans or similar, so this argument will be irrelevant.
31
excitom 1 day ago 0 replies      
Well, I'm no fan of losing net neutrality but if the worst thing that happens is that lame VC pitches are no longer funded, I'm OK with it.
32
hackaflocka 19 hours ago 2 replies      
Bandwidth isn't free. The universal all-you-can-eat model is very unfair to the bandwidth supplier.

Either customers will need to be charged by meter.

Or producers will need to pay by meter.

It's only capitalism.

By the way, why can't the VC say, "We love your idea, and we'll front the money you need to pay the telcos"?

15
PSA: Back Up Your Shit jwz.org
229 points by mfincham  2 days ago   127 comments top 27
1
steven2012 2 days ago 14 replies      
I think that the beauty of Snapchat is that it frees you from this ridiculous notion that a text, IM, Facebook message, etc, has any value.

In my opinion, it doesn't. I also believe that feeling the need to save every single conversation you have fuels an over-inflated sense of self-worth, a sense that everything you say has value and needs to be saved.

I never, ever peruse through my messages, to reminisce over an old conversation. It's too much navel-gazing to suit my sense of pride. What actually matters is the actual relationship you have with a person, which is built on the BODY of IMs, messages, conversations, visits, dinners, parties, etc, that you shared with that person. Sometimes, it's best to leave good conversations in the blurry past, and just remember that a certain person is funny, a great conversationalist, etc.

I'm doing the same sort of thing with Google now. I disallow anyone I'm in a conversation with from googling facts with their phone. When we talk, it's about whatever resides in our own brains, be it good, bad or ugly. The entertaining part of any conversation is the actual conversation, the passion, the humor, etc. If all we wanted to do was pass around facts, then we could forward each other URLs and be done with it. When I'm talking with someone over dinner, we're not hammering out a contract that requires precision; we're having a conversation about ideas, and as funny as it sounds, facts aren't as important as the spirit of the conversation. Unless of course you're in an argument with someone, and then that isn't very much fun, so why even bother starting the conversation in the first place.

2
famousactress 2 days ago 4 replies      
Thanks for this. The SMS export from iPhone is something I've been looking for. One of the most important relationships and experiences of my entire life has been documented (trapped) in my phone and its backups ever since.

I'm looking forward to seeing how well it works, specifically whether it can pull photos/videos as well. If it doesn't yet but it wouldn't be too much trouble to add, I'd be willing to literally pay you to add that.

[Edit: Since a lot of the other comments are questioning the value of saving this stuff I figured I'd share my use cases. It turned out when I thought about it I have at least three:

1. I effectively met my wife on myspace (believe it or not a pretty nasty software bug led to our relationship) and an enormous amount of our initial friendship and courtship ended up documented there. Years ago I painstakingly clicked through for hours and copy-pasted the conversation to a text document.

2. I had a close friend die very suddenly and at a young age. My memory generally kind of stinks and I hated that there were conversations with him that I half-remembered. I went back through social media conversations with him (again, mostly on myspace) a lot in the years that followed. It helped me piece together memories that are very important to me now.

3. This past year my wife and I adopted our daughter. Our relationship with her birthmother has primarily been via SMS and the months that followed were a really exhausting and beautiful blur. It's really important to us that we're able to share that thread with our daughter someday.

In none of these cases did I see it coming that these services would end up having such valuable content in them for me. I didn't know I'd meet my wife. I'll never know when the last time I talk to someone is, and I would have never guessed that one of the most important things I'll have to give my daughter about her birth story is an SMS conversation.

So yeah, having access to this stuff is important to me. Thanks to jwz for pulling these resources together.]

3
borski 2 days ago 17 replies      
"You don't just throw your letters in the trash. You might want them some day."

Maybe it's just me, but I actually /do/ throw my letters in the trash. I /do/ treat Twitter, etc. as ephemeral and passing. I don't care about saving those messages. Am I the only one?

4
randomdrake 2 days ago 1 reply      
Accessing your own data and storing it is great, but there's still the matter of backing it up. jwz wrote a good guide for that as well. It's linked in the article, but not in a way that makes it obvious. Thought it would be good to mention it here:

http://www.jwz.org/doc/backups.html

5
enigmabomb 2 days ago 0 replies      
PSA: This guy's nightclub makes a really mean meatball sandwich.

Make sure that recipe is backed up.

6
ThatGeoGuy 2 days ago 1 reply      
I hate to be that guy who plugs his own crap everywhere, but I actually wrote my own blog post recently about backing up my stuff on Linux.

My setup is fairly rudimentary, and I had the help of a friend on IRC, but here's the link if anyone is interested in setting up something simple for a Linux workstation at home or a VPS you can ssh into (really, as long as you can SSH into it with rsync, my method will work). I'd also love any feedback HN can give regarding my mechanism. Hell, if you wanted to fully back up a phone and sdcard on your desktop, you could probably do something similar with "adb pull" or the like.

https://thatgeoguy.ca/blog/posts/howto-encrypted-backups-in-...

That out of the way, I'm often surprised by how often I have to remind either myself or others to make good backups. Phones aside, there have been enough times when I've nuked my system that backing up all my files should be second nature at this point. Thankfully, I have a decent system set up now, but I still consider it rough around the edges (especially considering how long archiving backups takes).
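
Not the commenter's actual script, just a minimal sketch of the rsync-over-SSH idea described above, wrapped in Python; the host and paths are placeholders:

    import subprocess, datetime

    SRC = "/home/me/"                          # what to back up (placeholder)
    DEST = "backup@example.com:backups/home/"  # any host you can ssh into (placeholder)

    # -a preserves permissions/times, -z compresses, --delete mirrors deletions,
    # -e ssh runs the transfer over SSH. Add --dry-run first to see what would change.
    cmd = ["rsync", "-az", "--delete", "-e", "ssh", SRC, DEST]
    print(datetime.datetime.now(), "running:", " ".join(cmd))
    subprocess.check_call(cmd)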

7
dkokelley 2 days ago 1 reply      
Honestly, my Twitter feed, Facebook, and SMS records could all disappear tomorrow, and I would be OK with it. Maybe there's value in my accumulated Facebook connections and history, but most of the value today comes from current content.

Now email, that I value for archival.

8
aestra 2 days ago 0 replies      
I want to have some record of some things, but not everything by far. For example, I wouldn't want everything I ever did recorded, but my dad used to walk around once in a while with a video recorder at special times when everyone gathered (birthdays, holidays...), and looking at those artifacts of life from almost 30(!!) years ago is priceless. Many people in those tapes have died since then, and I'm glad they exist. I save a few letters and emails, but not all or even many. I was going back and reading an email I sent a friend about getting together with an ex, and seeing the perspective I had on things back then was... weird. When I was in high school (before cell phones or text messages) people passed notes in class, and some of my friends still have a box full of them. They are relics of the past. They take up space, and you probably won't want ALL of them, but I think it is worth keeping a few. I wish I had kept one or two. I can't imagine the vacant things mine would have contained. I mean, I still have my yearbooks, I didn't throw those out... same kind of thing really.
9
Ellipsis753 2 days ago 0 replies      
I'm on Gentoo Linux and had my Skype settings set to never delete chat logs. After a couple of months these logs were in the tens of gigabytes. The strange thing is that I rarely even chat on Skype; this should be tens of gigabytes of just text chat. Well, it's not in any format that I can understand (and Skype lags badly and becomes pretty much unusable if I type /history to look at the logs), so I've had to delete them, and for a while now Skype only stores the last month of chat. I can't think why the logs got so big. Perhaps Skype tries to optimize them for quick searches or something?

Anyway, does anyone know of a way to back up Skype text chats? They shouldn't have to use up this much space. (And ideally I should actually be able to load them up and read them too!)
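
For what it's worth, the desktop Skype clients of that era kept history in a SQLite file (main.db under ~/.Skype/<username>/), so a plain-text export is possible without going through the client at all. A rough sketch; the Messages table and the author/timestamp/body_xml column names are assumptions that may differ between Skype versions:

    import sqlite3, datetime, glob

    # Assumed path for the old Linux client; adjust the glob to your home directory and account.
    db_path = glob.glob("/home/*/.Skype/*/main.db")[0]
    conn = sqlite3.connect(db_path)

    with open("skype_history.txt", "w") as out:
        for author, ts, body in conn.execute(
                "SELECT author, timestamp, body_xml FROM Messages ORDER BY timestamp"):
            when = datetime.datetime.fromtimestamp(ts)  # timestamps are Unix seconds
            out.write("%s %s: %s\n" % (when, author, body or ""))

    conn.close()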

10
davidgerard 1 day ago 0 replies      
Inspired by this, I went and downloaded all my tweets last night.

Then I looked through them. I can assure you that tweets may not be ephemeral, but they are most certainly disposable.

11
anigbrowl 2 days ago 0 replies      
These conversations aren't ephemeral and disposable, they are your life, and you want to save them forever.

Yes they are, and no I don't. I highly doubt JWZ carries a portable recorder to immortalize all his in-person conversations; I certainly don't, even though recording people (for movies) is what I do for a living. Funnily enough, far more of my important memories involve real-life conversations than exchanges on IRC/Facebook/HN.

Yeah, it's good to have a method of backing this stuff up if you do need it, e.g. for business communications or any number of other use cases. But most digital chatter is eminently disposable. I wish there were a way to have emails expire and self-destruct automatically, so that things like time-sensitive sales offers would quietly vanish once the actionable date had passed unless I made some special effort to retain them.

12
gwu78 2 days ago 3 replies      
"Remember: if it's not on a drive that is in your physical possession, it's not really yours."

So, if we store our data in "the cloud", it's not really ours?

13
codva 2 days ago 3 replies      
I delete all email after 90 days, unless I explicitly moved it to an archive folder.

I've never even thought about saving IMs, texts, Twitter, etc. Civilization has survived a very long time without a written record of every conversation ever. It will continue to do so.

14
flipstewart 2 days ago 1 reply      
I do throw away letters. I'd rather not live in the past or cling to ephemera for emotional reasons, thank you.
15
sturmeh 2 days ago 0 replies      
Chat history serves one purpose for me: the file size quantifies how much time I spend talking to a particular person, and I use that to sort people on my contact list.
16
dhughes 2 days ago 1 reply      
Pictures are the worst to back up. Actually, no: backing up someone else's pictures is worse.

Parents, for example: my mom takes a lot of pictures she wants to keep; I take lots of pictures I don't care about.

Semi-wheneverly, when I manage to get the card from the parents' camera or cellphone to back up, it's usually a mess of "I backed up 63% of these, so which ones are new?" Is IMG0003.JPG the same as the IMG0003.JPG I saved already? Wait, no, one is 2MB and the other is 3.25MB.

Meld helps, but it's the same thing: what do I have, what is new, and what has the same name but is actually different, which I only happen to notice because of the file size.

So I end up dumping it all onto something or multiple somethings and swear I'll figure it out next time. Goto step 1.

17
mathrawka 2 days ago 1 reply      
Totally off topic, but I can read the site fine; if I switch back to a white site (like HN, or just staring at a wall), my eyes still see the lines of the site for a while.

It physically affects my vision for a few minutes, albeit just a little bit. Is this normal?

18
X-Istence 2 days ago 0 replies      
The SMS backup tool for iPhone doesn't seem to work for me. I have encrypted backups turned on for my iPhone, will this not work because of that?
19
ballard 2 days ago 0 replies      
Backup personal stuff and code to Tarsnap. Videos would be too expensive, but downsampled home videos might be worth saving too.
20
Aloha 2 days ago 0 replies      
It drives me nuts that Pidgin and Adium use different logging interfaces - it's made switching from Windows to OS X more painful - as I still use finch on Linux, and use file syncing to sync logs and config files across platforms.
21
FollowSteph3 2 days ago 0 replies      
I view this as no different than backing up phone calls. And most people don't care to back up their phone calls. Just because you can doesn't mean it's always worth it...
22
normloman 2 days ago 0 replies      
Always back up online chats so you can blackmail the participants with embarrassing or incriminating statements they made.
23
qwerta 2 days ago 0 replies      
On Android you just mount the phone partition and query the SQLite tables.
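
A rough sketch of what that looks like in practice, assuming a rooted device and the classic telephony-provider layout; the mmssms.db path and the address/date/body column names are assumptions that vary by Android version, so check the schema first:

    import sqlite3, datetime

    # Copy the database off the phone first, e.g.
    #   adb pull /data/data/com.android.providers.telephony/databases/mmssms.db
    conn = sqlite3.connect("mmssms.db")

    # Column names are assumed from the stock SMS provider schema.
    for address, date_ms, body in conn.execute(
            "SELECT address, date, body FROM sms ORDER BY date"):
        ts = datetime.datetime.fromtimestamp(date_ms / 1000.0)  # dates are stored in milliseconds
        print(ts, address, body)

    conn.close()
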
24
mmanfrin 2 days ago 3 replies      
PSA: It is not 1999, please don't use neon green text on a black background as your color scheme for text.
25
jheriko 2 days ago 0 replies      
along these lines i'd highly recommend: http://socialsafe.net/
26
LeicaLatte 2 days ago 0 replies      
Social media is just excreta of human activity on the internet. Why back that up? Lame.
27
bestspellcaster 2 days ago 0 replies      
I want to use this opportunity to thank drstanleyspelltemple@hotmail.com for helping me get my lover back after he left me few months ago. I have sent friends and my brothers to beg him for me but he refused and said that it is all over between both of us but when I met this Dr. Stanley, he told me to relaxed that every thing will be fine and after three days and contacted him, I got my man back......Caitlin
16
Nassim Taleb: We should retire the notion of standard deviation edge.org
229 points by pyduan  23 hours ago   235 comments top 39
1
bluecalm 22 hours ago 25 replies      
So first about the article:

>>The notion of standard deviation has confused hordes of scientists

What an assertion! It also proved to be very useful for hordes of scientists... what about some examples of confused scientists ?

>>There is no scientific reason to use it in statistical investigations in the age of the computer

As someone who uses it daily I am eagerly awaiting his argument.

>>Say someone just asked you to measure the "average daily variations" for the temperature of your town (or for the stock price of a company, or the blood pressure of your uncle) over the past five days. The five changes are: (-23, 7, -3, 20, -1). How do you do it?

OK... if I am asked to calculate the average, I calculate the average; if I need to know the standard deviation, I calculate the standard deviation...

>> It corresponds to "real life" much better than the first, and to reality.

What the flying fuck. What "real life"? Standard deviation tells you how volatile measurements are, not what the mean deviation is. Those are both very real-life things, just not the same thing.

>>It is all due to a historical accident: in 1893, the great Karl Pearson introduced the term "standard deviation" for what had been known as "root mean square error". The confusion started then: people thought it meant mean deviation.

I don't know how one can read it and not think: "is this guy high or just stupid?".

>>. The confusion started then: people thought it meant mean deviation.

I have yet to see anybody who thinks that standard deviation is mean deviation. It's Taleb, though. Baseless assertions insulting groups of people are his craft.

>>What is worse, Goldstein and I found that a high number of data scientists (many with PhDs) also get confused in real life.

One example, please? I can give hundreds where std dev is useful and mean deviation isn't. Anything where you decide what % of your bankroll to bet on a perceived edge, for example.

OK, so he asserted that people should just use mean deviation instead of the mean of squares. Guess what, though: taking the squares has a purpose. It penalizes big deviations, so two situations with the same mean deviation, of which one is more stable, have different standard deviations. This information is useful for many things: risk estimation, or calculating the sample size needed for a required confidence level (if you need more experiments, how careful should you be with conclusions and predictions, etc.). He didn't mention how we are going to achieve those with his proposal. Meanwhile he managed to throw insults at various groups without giving one single example of the misuse he describes.

This is not the first time he has written something this way. His whole recent book is like that. It's anti-intellectual bullshit with many words and zero points. He doesn't give any arguments, he throws a lot of insults, he misuses words and makes up redundant terms which he then struggles to define. The guy is a vile idiot of the worst kind: ignorant and aggressive. That he has gained so much of a following by spewing nonsense like this article is certainly fascinating, but there is no place for him in any serious debate.

2
dxbydt 20 hours ago 1 reply      
The notion of area has confused hordes of scientists; it is time to retire it from common use and replace it with the more effective one of circumference. Area should be left to mathematicians, topologists and developers selling real estate. There is no scientific reason to use it in statistical investigations in the age of the computer, as it does more harm than good.

Say someone just asked you to measure the area of a circle with radius pi. The area is exactly 31. But how do you do it?

scala> math.round(math.Pi * math.Pi * math.Pi).toInt

res1: Int = 31

Do you pack the circle with n people, count them up and verify n == 31 ? Or do you pour a red liquid into the circle and fill it up, then drain it and measure the amount of red ? For there are serious differences between the two methods.

If instead, you were asked to measure the circumference of a circle with radius pi.

scala> math.round(2 * math.Pi * math.Pi).toInt

res2: Int = 20

You just ask an able-bodied man, perhaps an unemployed migrant, to walk around this circle while another man, an upstanding Stanford sophomore, starts walking from Stanford to meet his maker, I mean VC, well it's the same thing...

So by the time the migrant finishes walking around the circle, our upstanding Stanford entrepreneur is greeting the VC on the tarmac of the San Francisco International Airport. This leads one to rightfully believe that the circumference of the circle of radius pi is exactly the distance from Stanford to the SF Airport, i.e. 20 miles. It corresponds to "real life" much better than the first, and to reality. In fact, whenever people make decisions after being supplied with the area, they act as if it were the distance from their university to the airport.

It is all due to a historical accident: in 250BC, the Greek mathematician Archimedes introduced Prop 2, the Prevention of Farm Cruelty Act ( http://en.wikipedia.org/wiki/California_Proposition_2_(2008) ). No I believe this was a different Prop 2. This Prop 2 states that the area of a circle is to the square on its diameter as 11 to 14 (http://en.wikipedia.org/wiki/Measurement_of_a_Circle ) .The confusion started then: people thought it meant areas had to do with being cruel to farm animals. But it is not just journalists who fall for the mistake: I recall seeing official documents from the department of data scientists, which found that a high number of data scientists (many with PhDs) also get confused in real life.

It all comes from bad terminology for something non-intuitive. Despite this confusion, Archimedes persisted in the folly by drawing circles in the sand, an infantile persuasion, surely. When the Romans waged war, Archimedes was still computing the area of the circle. The Roman soldier asked him to step outside, but Archimedes exclaimed "Do not disturb my circles!" (http://en.wikipedia.org/wiki/Noli_turbare_circulos_meos)

He was rightfully executed by the soldier for this grievous offense. It is sad that such a minor mathematician can lead to so much confusion: our scientific tools are way too far ahead of our casual intuitions, which starts to be a problem with a mad Greek. So I close with a statement by famed rapper Sir Joey Bada$$, extolling the virtues of the circumference: "So I keep my circumference of deep fried friends like dumplings, But fuck that nigga we munching, we hungry." (http://rapgenius.com/1931938/Joey-bada-hilary-swank/So-i-kee...)

3
Homunculiheaded 22 hours ago 2 replies      
I sometimes think that progress in the 21st century will be summed up as: "The realization that the normal distribution is not the only way to model data".

Taleb's favorite topic is the "black swan event", which is something that the normal distribution, and the idea of standard deviation, don't model that well. In a normal distribution very extreme events should only happen once in the lifetime of several universes. Of course, assuming variation in line with a Gaussian process is at the heart of how the Black-Scholes model calculates risk/volatility/etc.

Benoit Mandelbrot argued that financial markets follow heavy-tailed Lévy alpha-stable distributions (of which the Cauchy distribution is a special case) rather than a Gaussian. The problem, of course, is that the Cauchy distribution is pathological in that it doesn't have a mean or variance. You can calculate similar properties for it (location and scale), but it doesn't obey the central limit theorem, so in practice it can be very strange to work with.

The normal distribution is fantastic in that it does appear frequently in nature, is very well behaved, and has been extensively studied. However a great amount of future progress is going to come from wrestling with more challenging distributions, and paying more attention to when assumptions of normality need to be questioned. Of course one of the challenges of this is that the normal distribution is baked into a very large number of our existing statistical tools.
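
A small illustration of that pathology, sketched in Python with only the standard library: the running sample "standard deviation" of Cauchy draws tends not to settle down, while for normal draws it does (a standard Cauchy variate can be generated as tan(pi * (U - 0.5)) for uniform U):

    import math, random

    random.seed(0)

    def sample_std(xs):
        m = sum(xs) / len(xs)
        return math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))

    normal = [random.gauss(0, 1) for _ in range(100000)]
    cauchy = [math.tan(math.pi * (random.random() - 0.5)) for _ in range(100000)]

    for n in (1000, 10000, 100000):
        # the normal column stays near 1.0; the Cauchy column is dominated by its largest draws
        print(n, sample_std(normal[:n]), sample_std(cauchy[:n]))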

4
beloch 15 hours ago 1 reply      
I'm a physicist, so I'm one of the people this guy says standard deviation is still good for. However, despite some "oddities" (pointed out by others here) in his article, I'm more than willing to admit a simpler, easier to understand term would be helpful for explaining many things to the general public. Hell, it would be helpful for explaining things to journalists, who we then trust to explain things to the public!

Look at a reputable news site or paper. Odds are they post articles based on polls several times a day. How many report confidence intervals or anything of the sort? These are crucial for interpreting polls, but are left out more often than not. Worse yet, many stories make a big deal about a "huge" shift in support for some political policy, party or figure, when the previous month's figure is actually well within the confidence interval of the current month's poll!

Standard deviation, confidence intervals, etc. are all ways of expressing uncertainty, and it's become abundantly clear that the average journalist, to say nothing of the average person, has no clue about what the concept means. If the goal is to communicate with the public, then we really need to take a step back and appreciate the stupendously colossal wall of ignorance we're about to butt our heads against. When we talk about the general public, we should keep in mind that rather a lot of people know so little about the scientific method that they interpret the impossibility of proving theories as justification for giving religious fables equal footing in schools. This kind of ignorance isn't a nasty undercurrent lurking in the shadows. It's running the show, as evidenced by many state laws in the U.S.! There is absolutely no hope of explaining uncertainty to most of these people.

There is hope of explaining basic statistics to journalists, if only because they are relatively few in number and it's a fundamental part of their job to understand what they are reporting. Yes, I just said that every journalist who has reported a poll result, scientific figure, etc. without the associated uncertainty has failed to adequately perform their job. We need to make journalists understand why they are failing. If simplifying the way we report uncertainties will assist with this, then I'm all for it. Bad journalism is a root cause of a great deal of ignorance, but it's not an insurmountable task to fix it.

If you are a scientist who speaks to journalists about your work, make sure they include uncertainties. If you are an editor, slap your peons silly if they write a sensationalistic poll piece when the uncertainties say it's all a bunch of hot air. If you are a reader, please mercilessly mock bad articles and write numerous scornful letters to the editor until those editors pull out their beat-sticks and get slap-happy. We should not tolerate this kind of crap from people who are paid to get it right.

5
n00b101 22 hours ago 2 replies      
Taleb has a good point about people mistakenly interpreting standard deviation (sigma) as Mean Absolute Deviation (MAD). I like that he gives some conversions (sigma ~= 1.25 * MAD, for Normal distribution).

I think it's rather silly to talk about "retiring" standard deviation, but we can't blame Taleb - the publication itself posed the question "2014: What Scientific Idea is Ready for Retirement?" to various scientific personalities.

What Taleb failed to mention is that, once properly understood, standard deviation has distributional interpretations that can be much more useful than MAD. For example, if the data is approximately normally distributed, then there is approximately a 99.99% probability that the next data observation will fall within 4 * sigma of the mean.

Not everything is approximately normally distributed, but a lot of phenomena ARE normally distributed. It's a well known fact that the phenomena which Taleb is most interested in (namely, financial return time-series) are not normally distributed. But I would like to know how Taleb proposes to "retire" volatility (sigma) from financial theory and replace it with MAD? Standard deviation is so central in finance that even the prices of some financial instruments (options) are quoted in terms of standard deviation (e.g. "That put option is currently selling at 30% vol"). How do we rewrite Black-Scholes option pricing theory and Markowitz portfolio theory in terms of MAD and remove all the sigmas everywhere? Surely Taleb has already written that paper for us so that we can retire standard deviation?
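
A quick way to see the conversion mentioned above is to compute both measures on simulated normal data; this is a minimal Python sketch (standard library only), and the 1.25 factor is sqrt(pi/2) ~= 1.2533, which only holds under the normality assumption:

    import math, random

    random.seed(42)
    xs = [random.gauss(0.0, 1.0) for _ in range(100000)]   # simulated normal data

    mean = sum(xs) / len(xs)
    sigma = math.sqrt(sum((x - mean) ** 2 for x in xs) / len(xs))  # standard deviation
    mad = sum(abs(x - mean) for x in xs) / len(xs)                 # mean absolute deviation

    print(sigma / mad)             # ~1.2533 for normal data
    print(math.sqrt(math.pi / 2))  # the theoretical ratio

For heavy-tailed data the ratio can be much larger than 1.25, which is part of Taleb's point.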

6
programminggeek 22 hours ago 1 reply      
I think because it's called "standard deviation" that it sounds like the thing to use or look for. It sounds more correct because of the word standard.

I feel like it is the same kind of failing due to human perception of language that programmers have with the idea of exceptions and errors, especially the phrase "exceptions should only be used for exceptional behaviors". That's a cool phrase, but people latch on to it because of the word exception sounding like something extremely rare and out of the ordinary whereas we see errors as common, but they are in fact the same thing. Broke is broke, it doesn't matter what you call it, but thousands of programmers think differently because of the name we gave it.

We are human and language absolutely plays a role in our perception of things.

7
cheald 22 hours ago 4 replies      
I really tried to get through "The Black Swan" and Taleb's writing struck me as so pretentious and self-involved that it made it impossible for me to finish.

He strikes me as someone who is so desperate to be important and recognized that an assertion like this doesn't really surprise me.

8
Glyptodon 21 hours ago 0 replies      
All I know is this reminds me a lot of high school, where we had to always compute std dev in problems, homework, and sometimes labs, but nobody ever really explained how to interpret it. It was always like "This is std dev. This is how you compute it. Make sure you put it in your tables and report."

Eventually someone (or something) did explain it, but once I understood it, it became clear that it wasn't always a sensible thing to be asked to calculate, but was instead just a reflexive requirement.

9
justin66 19 hours ago 0 replies      
Taleb has a textbook draft up which is more technical than his popular writings:

http://www.fooledbyrandomness.com/FatTails.html

There might be something there for the more rabid critics. At least it will keep them off the internet for a few days...

10
zeidrich 22 hours ago 0 replies      
It's not that we should retire the notion of standard deviation. It's more that we should understand the tools that we are using and use the appropriate tool for the job.
11
JASchilz 22 hours ago 1 reply      
The central limit theorem shows us that unimodal data with lots of independent sources of error tends towards a normal distribution. That description is a good first-pass, descriptive model for lots and lots of contexts, and standard deviation speaks well to normally distributed data.

Squaring error isn't just a convenient way to remove sign; it's driven by a lot of data sets' conformance to the central limit theorem.

12
aredington 2 hours ago 0 replies      
The way I read it he's proposing two things:

1) Refer to the analysis of Root Mean Square Error always by that name. (RMS is already often used in certain jargon instead of stddev).

2) Stop treating RMS as a default measure of variance. Treat Mean Absolute Deviation as the default measure of variance, because the figure it provides is more consistent with people's psychological interpretation.

It's not really retiring RMS, just retiring the idea that it is a good default statistical analysis.

13
ClementM 22 hours ago 3 replies      
This article is based on a paper Taleb published in 2007. If you want to test yourself, try the experiment on page 3: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=970480
14
spikels 21 hours ago 1 reply      
You gotta love the acronyms: STD versus MAD!

Taleb is definitely mad but his use of the MAD acronym (mean absolute deviation) is actually correct. However the STD acronym (all caps) refers to "sexually transmitted disease" and not generally used for "standard deviation". Most people use SD, Stdev, StDev or sigma.

Once again his ability to coin new terminology outstrips his ability to form coherent ideas that are anything more than trivial (eg. we have known about fat tails in stock returns for 50+ years). Like George Soros[1], Taleb's success says more about the state of the world of finance than their contributions to our knowledge.

[1]-See his book "The Alchemy of Finance"

15
lambdasquirrel 22 hours ago 1 reply      
I think we'd be better off if we recognized that there are statistical distributions in the world besides the plain old Gaussian. For example, wealth does not follow a Gaussian, so why the heck do we throw around ideas like "above average wealth"?

Is MAD any better? Definitely. But I'd like to see a visual demonstration of how well it models exponential-based distributions. How well does it describe their "shape", the skew of the tail?

16
ChristianMarks 16 hours ago 0 replies      
Climate scientists--among others--have made similar recommendations to use the mean absolute error in place of the standard deviation, depending on the application. Taleb might have cited the extensive methodological literature--for example:

Cort J. Willmott, Kenji Matsuura, Scott M. Robeson. Ambiguities inherent in sums-of-squares-based error statistics. Atmospheric Environment 43 (2009) 749-752.

URL: http://climate.geog.udel.edu/~climate/publication_html/Pdf/W...

17
dschiptsov 22 hours ago 1 reply      
Why, it is pretty good at describing probability distributions. What we should retire are the idiots who assume that it predicts the outcome of the next event.
18
puranjay 20 hours ago 2 replies      
NNT is my intellectual superhero but the amount of hate he gets is tremendous.

Please understand that NNT's biggest issues are not so much with the way statistical models are applied to economics and finance, but how social scientists sometimes feel compelled to apply them to social fields as well, which is plain unscientific, dumb, and mostly disastrous.

So when you bear down on his arguments, please keep this context in mind.

19
thetwiceler 16 hours ago 0 replies      
It is sad that Taleb does not see the value in the standard deviation; standard deviation is far more natural, and more useful, than MAD.

For example, if X and Y are independent, X has a standard deviation of s, and Y has a standard deviation of t, then the standard deviation of X + Y is sqrt(s^2 + t^2). There is a geometry to statistics, and the standard deviation is the fundamental measure of length.

To retire the standard deviation is to ignore the wonderful geometry inherent in statistics. Covariance is one of the most important concepts in statistics, and it is a shame to hide it from those who use statistics.

Additionally, I will mention that we do not need normal distributions to make special the idea of standard deviations. In fact, it is the geometry of probability - the fact that independent random variables have standard deviations which "point" in orthogonal directions - which causes the normal distribution to be the resulting distribution of the central limit theorem.
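
A quick numerical check of that "Pythagorean" property, in Python; note that it relies on X and Y being independent (more generally, uncorrelated), which is made explicit here:

    import math, random

    random.seed(1)
    n, s, t = 200000, 2.0, 3.0

    x = [random.gauss(0, s) for _ in range(n)]  # X with standard deviation s
    y = [random.gauss(0, t) for _ in range(n)]  # Y with standard deviation t, independent of X
    z = [a + b for a, b in zip(x, y)]

    def std(v):
        m = sum(v) / len(v)
        return math.sqrt(sum((e - m) ** 2 for e in v) / len(v))

    print(std(z))                  # close to...
    print(math.sqrt(s**2 + t**2))  # ...sqrt(4 + 9) = 3.6055...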

20
bayesianhorse 16 hours ago 1 reply      
Nassim Taleb somehow likes to beat up on normals...

We Bayesians have similar notions, but we usually try not to overly bully frequentist methods, the poor things. Also, being familiar with Bayesian methods, a lot of what Taleb is saying sounds vaguely familiar...

21
tn13 17 hours ago 0 replies      
There is nothing wrong with STD or MAD. The real problem is a lot of people apply them without realizing the nature of their data and what kind of analysis they want to do.

In this case what matters in the end is the kind of impact a deviation from the mean has on the real-world variable you care about. I agree that in most Gaussian experiments MAD might be more useful than STD.

STD is more useful when the real-world impact of a deviation grows disproportionately with its magnitude, and hence it is a good idea to magnify (x - mean) by squaring it. In many cases the impact is linear, and there MAD clearly works better: for example in cricket, where n runs are n times better than 1 run. But in the case of shooting, hitting 9 targets out of 10 might be 100 times better than hitting 1 out of 10, so there MAD will be misleading.

22
cwyers 22 hours ago 2 replies      
"In fact, whenever people make decisions after being supplied with the standard deviation number, they act as if it were the expected mean deviation."

Boy, is that statement useless without any kind of context, example or citation.

23
TTPrograms 13 hours ago 0 replies      
There is some argument that MAD is actually better than RMS for a lot of applications. Apparently it predates RMS, but one reason RMS was adopted is that RMS-minimizing (least-squares) linear regression is much, much simpler to calculate. Also consider comparing the robustness of RMS-based regression with MAD-based regression. See: http://matlabdatamining.blogspot.com/2007/10/l-1-linear-regr...
24
scythe 22 hours ago 1 reply      
While the mean deviation as presented is slightly nicer than sigma for intuitive purposes, it isn't as appropriate (iirc) for statistical tests on normal distributions and t-distributions.

More importantly, it doesn't fix the real problem, which is that the mean and standard deviation don't tell you everything you need to know about a data set, but often people like to pretend they do. It's not rare to read a paper in the soft sciences which might have been improved if the authors had reported the skewness, kurtosis, or similar data which could shed light on the phenomenon they're investigating. These latter statistics can reveal, for instance, a bimodal distribution, which could indicate a heterogeneous population of responders and non-responders to a drug, and that's just one example.

I'm not a statistician, so some of this might be a bit off.

25
valtron 22 hours ago 0 replies      
He makes a good point about infinite MAD vs. STD.
26
afterburner 21 hours ago 0 replies      
I've found MAD a potentially useful measure for monitoring whether something gets out of whack; when using STD I needed to modify it to give less weighting to outliers.
27
MaysonL 19 hours ago 1 reply      
How often do "six sigma" events occur in financial markets? A hell of a lot more often than the 0.0000001973% of the time that they would in a normally distributed system.
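
For reference, the quoted figure is just the two-sided tail probability of a normal distribution at six sigma; it can be checked with the standard library's complementary error function (this says nothing about what markets actually do, which is the commenter's point):

    import math

    # P(|Z| > 6) for a standard normal Z
    p = math.erfc(6 / math.sqrt(2))
    print(p)        # ~1.97e-09
    print(p * 100)  # ~0.0000002%, matching the figure above
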
28
snake_plissken 18 hours ago 0 replies      
I've always thought his writings were more allegorical than scientific; you can't rely on the standard deviation to never go against you at the worst possible time. But like anything else, it can and it (probably) will.

Also, yes, his writing style is grating and he takes opportunistic character swipes at pretty much everyone.

29
al2o3cr 19 hours ago 0 replies      
Shorter social scientists: "Gaussian distribution sez wut?"
30
RivieraKid 22 hours ago 3 replies      
I was just wondering about a very related problem. I do 5 measurements of some random variable (let's say execution time) and average them. How should I report the variability of that average?

State the sample size and standard deviation?
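
One common answer (a sketch of one convention, not the only one): report the mean together with the standard error of the mean, s / sqrt(n), or a t-based confidence interval, since n = 5 is small. In Python, with five hypothetical timing measurements:

    import math

    times = [12.1, 11.8, 12.5, 12.0, 11.9]  # hypothetical execution times, in seconds
    n = len(times)
    mean = sum(times) / n
    s = math.sqrt(sum((t - mean) ** 2 for t in times) / (n - 1))  # sample standard deviation

    sem = s / math.sqrt(n)  # standard error of the mean
    t_crit = 2.776          # two-sided 95% t critical value, 4 degrees of freedom
    print("mean = %.3f +/- %.3f (95%% CI, n = %d)" % (mean, t_crit * sem, n))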

31
Beliavsky 16 hours ago 0 replies      
If data is drawn from a Laplace distribution of the form p(x) = exp(-|x|), the mean absolute deviation is more informative than the standard deviation, but if its form is close to the normal, p(x) = exp(-x^2), the standard deviation is more important. So whether to use the mean absolute or standard deviation depends on the distribution of the data. There is a field called robust statistics that looks at this question.
32
randomsample2 21 hours ago 1 reply      
Standard deviation and mean absolute deviation are both useful, but I think it's silly to suggest that we all adopt exactly one measure of variability to summarize data sets. When in doubt, make a fucking histogram.
33
notastartup 19 hours ago 1 reply      
I've been a long-time fan of Dr. Nassim Taleb. The first book of his I read was the one about his time as a trader and how, in the Black Monday market crash, he made a killing, cleared his desk and never had to work again.

There are those that dislike his ideas because it is threatening to their existing assumptions about probability and statistics. He argues that experts and majority of people do not account for the unpredictable but significant impact a single event can have which often shatters the commonly held belief. For example, swans were white until the discovery of black swans in Oceania, too big to fail multi-national corporations going bankrupt like Lehman's brothers and etc.

He's not anti-academic, but he is against teachings in the common academia that is based on naive assumptions that is specifically tailored to serve those that thrives most off the limited quantitative measures, such as market callers, hedge funds selling complicated quantitative algorithm trades, academics seeking fame and fortune by writing the most logical and quantitative paper without questioning any of the tools they are using, it is this hypocrisy and laziness that is apparent and those that try to deny to the point of making ad hominem remarks against a man, who simply observes these things and decides to write it in an entertaining manner (otherwise nobody would give a shit because the topic would be dry without lay man's linguo).

Keep an open mind, a lot of what he says I do find interesting ideas and it has influenced my thinking process quite a bit, however it's no way in anyway, grounds for cracking jokes or ridicule, in fact when I read some of the comments here, it's a bit shameful. We should be embracing new ideas in order to explore them, regardless of who the explosive nature of the claim, because the black swan event is very real and is not captured or understood completely by our current set of statistical tools and methodology based on questionable assumptions about how the real world operates. For example, 1/2500 chance is not what we really think it means in the real world because black swan events are more common than we think, a percentage probability do not fully reflect it's frequency and the magnitude of it's event.

Note the fall of crime rates in the United States following a decision to legalize abortion, economists and experts would come on television and bring up all sorts of random theories and ideas but little did they realize it was a chain effect from a court ruling passed decades ago until two economists came out with a paper that was ridiculed because it suggested that 'killing babies from poor neighbourhoods = lower crime rate' where most poor neighbourhoods is occupied by African Americans. Because such idea was earthshakingly controversial and still denied even to this day. Because Galileo claimed the earth was round instead of flat, he was executed. This is simply the nature of our world, almost all part of life, there exists a hierarchy that people simply do not ask questions either due to blind trust or the fear of reprisal.

34
tehwalrus 16 hours ago 0 replies      
at least he's leaving us physicists alone with it...
35
vzhang 20 hours ago 1 reply      
I'm seriously questioning some people's reading comprehension - he NEVER said STD is not useful! He's only saying the name "Standard Deviation" is badly chosen.
36
etanazir 21 hours ago 0 replies      
The minimum uncertainty wave function is ~ e^(-x^2); ergo the standard measure is in terms of x^2. QED.
37
yetanotherphd 12 hours ago 0 replies      
I had hoped this would be about the revolution occurring in statistics/econometrics where confidence intervals based on strong parametric assumptions (e.g. the confidence intervals you would obtain using the standard deviation) are being replaced by confidence intervals obtained using the bootstrap (and other non-parametric methods) that don't rely on such strong assumptions.

But no, it is just advocating using the mean absolute deviation instead of the standard deviation. Which I guess is to be expected from someone whose work focuses mostly on long-tailed distributions.

Still, I think that non-parametric methods are much more valuable as a solution to dealing with non-normal data than what Taleb is proposing.
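A minimal percentile-bootstrap sketch of the kind of interval being described (toy data; none of this code is from the comment):

    import numpy as np

    rng = np.random.default_rng(0)
    sample = rng.lognormal(size=200)  # deliberately non-normal data
    boot_means = np.array([rng.choice(sample, size=sample.size, replace=True).mean()
                           for _ in range(10000)])
    lo, hi = np.percentile(boot_means, [2.5, 97.5])
    print("mean = %.3f, 95%% bootstrap CI = (%.3f, %.3f)" % (sample.mean(), lo, hi))
    # No normality assumption: the interval comes straight from the resampled means.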

38
roywei 22 hours ago 4 replies      
four-day returns of stock x: (-.3, .3, -.3, .3) -> MAD = 0.3; four-day returns of stock y: (-.5, .5, -.5, .5) -> MAD = 0.5.
39
truthteller 22 hours ago 0 replies      
he's really lost the plot. :(
17
The Next Phase of Node.js nodejs.org
216 points by sintaxi  23 hours ago   122 comments top 22
1
beaumartinez 22 hours ago 4 replies      
> I am starting a company, npm, Inc., to deliver new products and services related to npm.

I don't know whether I should be concerned a core part of Node.js is becoming "businessy". Is this common amongst other big software projects?

Providing "premium" services is a slippery slope. What happens when someone wants to add a feature to npm that the premium services already provide, for example?

Python and Django have non-profits which help sustain their ecosystems, but AFAIK Node.js doesn't. Perhaps it would be a good step.

2
hartator 22 hours ago 7 replies      
What about the bnoordhuis story?

He still doesn't seem to be active on his lib: https://github.com/joyent/libuv/graphs/contributors

For people who don't know, bnoordhuis exiled himself 2 months ago after the bashing the node.js community imposed on him for not merging a pull request about gendered pronouns. It was a shame because he was one of the top 3 contributors. (all details: https://github.com/joyent/libuv/pull/1015)

3
bhouston 22 hours ago 1 reply      
So NPM Inc.? Interesting. I do think NPM requires its own team, but I am unsure if NPM itself is a business (although I can see it being a sponsored foundation.) But there are hopefully creative business solutions to be had here.

Q: How does this relate to the money I gave to the Scale NPM project a few months back? https://npm.nodejitsu.com/

4
nilsbunger 21 hours ago 1 reply      
What's next? The pip corporation? Worldwide Gems Inc?

"Global ./configure&&make&&make install Industries" ?

5
jnardiello 22 hours ago 1 reply      
How about the $200k Nodejitsu crowdfunded for npm not long ago? Also, how is the npm revenue model going to affect package distribution? To whom it may concern: I expect some clear answers. There's simply too much money going around Node at the moment, and while it might eventually be a good thing (as the companies involved will push node adoption among devs), it's also scary and quite weird.

[EDIT: Isaac already partially replied, see other answer]

6
rtfeldman 22 hours ago 2 replies      
I assumed from the title this would be an announcement about Node 1.0, but I'm not terribly surprised to see the trend continue of approaching 1.0 asymptotically.

Still, this is interesting stuff. Does anyone know if any similar repos (Ruby Gems, etc.) have their own for-profit companies?

7
majke 21 hours ago 0 replies      
Good luck Isaac and thanks for all the fish!

You did a great work as a node lead. The brief moment when our paths crossed, due to a security issue in node, was handled perfectly. Seriously. You will be missed.

8
transfire 21 hours ago 1 reply      
NPM "app" store? Should I laugh, cry or try to monetize?
9
niix 22 hours ago 0 replies      
Awesome, so glad Isaac gets to focus on what he loves. Congrats to TJ on his new role!
10
thepumpkin1979 22 hours ago 2 replies      
"Open-Source Democracy 101"... 3 Leaders in less than five years: Ryan Dahl(2009-2012) -> Isaac Z. Schlueter(2012-2013) -> Timothy J Fontaine(2014-2016?)
11
ozten 20 hours ago 0 replies      
I'd love to see an alternative to npm that:

* Puts security as a high priority

* That is operated as a federated system (think bitcoin block chain)

* Puts uptime as a high priority

Many companies and individuals could run deployments of it, removing the need for a new NPM Inc to pay for service costs.

12
matan_a 22 hours ago 0 replies      
While there are ways to create private registries [0], having a new packaged way to do it would be very useful to some orgs.

[0] https://npmjs.org/doc/registry.html#Can-I-run-my-own-private...

13
edwinnathaniel 20 hours ago 0 replies      
There's nothing wrong with focusing on NPM.

Take a look at Maven[http://maven.apache.org] and Sonatype[http://www.sonatype.com/] (dependency management in Java-land).

Works awesomely (and maybe even better!).

14
lnanek2 18 hours ago 0 replies      
Kind of disappointing this wasn't more about the technical direction. Really feel the title is misleading and should have been something about leadership/people change so I would have known not to bother reading it.
15
binocarlos 22 hours ago 1 reply      
"The peacekeeping budget for the 201314 fiscal year was $7.54 billion" - this is from the Wikipedia UN page

I'm not saying that npm is like, that important, but - take 0.01% (one ten thousandth) and npm would have an annual budget of 3/4 million bucks.

Will we get to a place where core software distribution to devices not humans is deemed as critical infrastructure?

It would be great to know that npm (and github and every other package distribution tool) were somehow too big to fail like banks have shown to be.

Meanwhile - everyone who has worked hard to make npm and node brilliant - thank you!

16
drderidder 19 hours ago 0 replies      
I just wanted to wish Isaac the best with his new initiative. He's been a great contributor and helped make the node community a fun and friendly one.

[edit] - originally thought TJ Fontaine wasn't employed by Joyent but apparently he is.

17
dmourati 21 hours ago 1 reply      
The next phase of npm should be to fix this open issue:

https://github.com/npm/npm/issues/4131

18
yachtintransit 22 hours ago 0 replies      
I think this is a great thing! Communities flourish when they are supported by companies with effective (preferably transparent) revenue models. Exciting times. npm Inc, best of luck!
19
mmaunder 22 hours ago 0 replies      
I suspect that the ecosystems forming around node will find more success by emulating what Ubuntu has done vs RedHat.
20
jaiball 22 hours ago 1 reply      
wonder if this has anything to do with the npm maintainer who is missing. Hope he's ok.
21
drakaal 21 hours ago 0 replies      
Node is becoming a business, not a community, and the community has been taken by surprise. I saw this coming, but I think many people thought Node was about free love and changing the world. It is downhill from here. You can avoid a lot of politics when there isn't any money involved, but now there is, and that changes everything.
22
calroc 21 hours ago 1 reply      
Node.js is a scam.
18
X^2 is the sum of three periodic functions gotmath.com
194 points by ColinWright  2 days ago   53 comments top 11
1
losvedir 2 days ago 2 replies      
> It is clear that a non-constant polynomial cannot be expressed as a finite sum of continuous periodic functions,

But it can be expressed as an infinite sum of continuous periodic functions, right? I seem to remember that you could use all the sine functions as bases for the vector space of functions and (almost?) any function could be expressed as an infinite sum of sines.

It's been a while since I've thought about these things, is that recollection correct?
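For what it's worth, the classical expansion on a bounded interval (the standard Fourier series, not something taken from the article) is, on [-pi, pi],

    x^2 = \frac{\pi^2}{3} + 4\sum_{n=1}^{\infty} \frac{(-1)^n}{n^2}\cos(nx),

which agrees with x^2 only on that interval (its periodic extension repeats), so it doesn't conflict with the article's claim about finite sums on all of R.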

2
simias 2 days ago 4 replies      
It's very interesting but why isn't there any practical example? What are the equations of three periodic functions that would add up to y = x^2 for instance?

How would one even construct those three equations? If I'm not mistaken TFA demonstrates that these functions exist, not how to find them.

I wish I hadn't stopped with maths in high school...

3
jbert 2 days ago 2 replies      
I think the thing which is defeating my intuition here is the continuity issue.

"If a periodic function is continuous and nonconstant, then it has a least period, and all other periods are positive integer multiples of the least period."

I think my naive intuition is modelling "periodic function" as "continuous and periodic". Basically, I'm not exercising the full freedom of the "it's periodic" concept.

There's a good Feynman anecdote on this (from Surely You're Joking...):

------

"I had a scheme, which I still use today when somebody is explaining something that I'm trying to understand: I keep making up examples.

For instance, the mathematicians would come in with a terrific theorem, and they're all excited. As they're telling me the conditions of the theorem, I construct something which fits all the conditions. You know, you have a set (one ball)-- disjoint (two balls). Then the balls turn colors, grow hairs, or whatever, in my head as they put more conditions on.

Finally they state the theorem, which is some dumb thing about the ball which isn't true for my hairy green ball thing, so I say "False!" [and] point out my counterexample."

------

So I think it's useful to consider the most extreme thing which meets your criteria, instead of a "representative" example.

To come back to tech I think a similar mindset is also useful for things like system failure mode analysis. "Yes, but what if that switch dies at the same time...?"

4
gpvos 2 days ago 0 replies      
It was when the article introduced "Lebesgue measurable periodic functions" without any explanation whatsoever that I realized that I was not going to understand this article at all (at least not without spending a day on Wikipedia).
5
nmc 2 days ago 2 replies      
Fun fact about considering R as a vector space over Q: the Hamel basis A mentioned in the post is not only infinite, but even uncountable.
6
pcvarmint 1 day ago 1 reply      
"By repeating this, we find that e^x is the sum of 1 periodic function. But this is absurd."

No, it's not absurd.

e^x is periodic with period 2 Pi I.

Maybe the article was limiting itself to real functions and the existence thereof, but this example raised eyebrows.

7
impendia 2 days ago 0 replies      
Another fun (and seemingly nonsensical) thing you can prove if you use the Axiom of Choice:

http://en.wikipedia.org/wiki/Banach-Tarski_paradox

8
adsche 2 days ago 0 replies      
Heads up: In section "A polynomial of degree n is the sum of n+1 periodic functions" it says 'cdots' in two formulas. (Probably a backslash missing in the TeX(?) source.)
9
baking 2 days ago 2 replies      
Can someone who understands this please make a graph? Pics or it didn't happen.
10
bitL 2 days ago 1 reply      
A neat characteristic of N^2, where N is an integer, is that its value is the sum of the first N odd integers.

1^2 = 1
2^2 = 1 + 3 = 4
3^2 = 1 + 3 + 5 = 9
4^2 = 1 + 3 + 5 + 7 = 16
...

I am wondering if there is a relation of this to the article.
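The identity behind the pattern is standard (an induction/telescoping fact, not something tied to the article's construction):

    \sum_{k=1}^{N} (2k - 1) = 2\cdot\frac{N(N+1)}{2} - N = N^2 .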

11
ColinWright 2 days ago 2 replies      
I hesitate to say this, but ...

If you think this belongs on the front page of HN then you need to up-vote it. Any moment now it will trigger the flame-war detector, get a penalty in the ranking score, and disappear without trace. Anything with 40 comments and fewer points than comments gets such a penalty, and while it's a good proxy for flame-war detection, it does get quite a lot of false positives.

If you want this to avoid that fate, you need to up-vote it. If you don't really care, then that's fine. I think it's interesting, but not everyone does.

Added in edit - something has given this item a penalty[0] - maybe there are people who've flagged it as inappropriate. Certainly it doesn't seem to have tripped the "flame-war" detector, but who can tell.

[0] http://hnrankings.info/7056295/

19
Why Are There So Many Pythons? toptal.com
192 points by akbarnama  1 day ago   107 comments top 20
1
shadowmint 1 day ago 2 replies      

    > Its fast because it compiles source code to native code 
Actually, that's why it's slow. pypy is significantly slower than cpython in fact, primarily for short-life scripts.

The initial start time can easily double or triple the run time in some cases.

To be fair, it's only a couple of seconds extra, but for many tiny scripts this amounts to a lot of wasted time. It's not a magical cure all in all situations.

...that said, it's so fantastically easy to drop in that in most cases it's worth trying to see how it performs just for fun~

    pypy setup.py install
    pypy app.py
    [INFO              ] Kivy v1.8.0-dev
    [INFO              ] [Logger      ] Record log in /Users/d/.kivy/logs/kivy_14-01-15_2.txt
    [INFO              ] [Factory     ] 156 symbols loaded
    ...
(drop in replacement for fun times? without having to change all your print and import statements? yes please~)

2
Foxboron 1 day ago 6 replies      
I'll give a shoutout to Hy. It removes the Python interface and replaces it with a LISP interface. The neat part is that it's currently compatible, and bidirectional, with 2.6, 2.7, 3.2, 3.3 and pypy. We even added a few steps so we will have support for 3.4!

https://github.com/hylang/hy

It's a great example of how Python isn't that different from LLVM and JVM!

3
wikwocket 1 day ago 1 reply      
Well, with a habitat that reaches across Africa and Asia, and their ability to thrive in rain forests and grasslands, there are.... Oh. Ohhh. Never mind.
4
pjmlp 1 day ago 1 reply      
Funny that an article written to describe why there are so many Python implementations fails when comparing Python with C and Java.

There are quite a few interpreters available for C.

Although Sun/Oracle's implementation is JIT based only, there are other Java vendors with toolchains that support ahead of time compilation to native code.

5
carlob 1 day ago 3 replies      

    x = random.choice([1, "foo"])
this is given as an example of why Python could never be strongly typed.

Honest question: isn't this solved using Either in Haskell?
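A small sketch of why that line is awkward for a static checker (the isinstance dispatch is just an illustration; an Either/union type is the analogue a typed language would force you to spell out):

    import random

    x = random.choice([1, "foo"])  # the static type would have to be something like int-or-str

    # Any code using x has to branch on the runtime type; a tagged union (Either)
    # makes that branching explicit in the type system instead.
    if isinstance(x, int):
        print(x + 1)
    else:
        print(x.upper())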

6
fredsanford 1 day ago 6 replies      
"In very brief terms: machine code is much faster, but bytecode is more portable and secure."

Does anyone else have trouble with that statement?

Maybe it should read "bytecode is more portable and is easier to secure." ?

Java is what, 20 years old and based on bytecode and despite Sun/Oracle's efforts still has security problems.

7
chowells 1 day ago 1 reply      
Futamura projections would completely blow this guy's mind. It's amazing what ignorance of computer science causes people to think is new and special.

http://en.wikipedia.org/wiki/Partial_evaluation#Futamura_pro...

8
obblekk 1 day ago 8 replies      
Why isn't there a branch of python 2.7.2 which introduces the features from 3.x that don't require syntax changes and fixes known bugs in 2.7.2?

I get that these are the reward for upgrading, but it seems unreasonable to hold back.

9
rrosen326 22 hours ago 0 replies      
Question: This, and many other posts, suggest that PyPy is the way to go (as a user). Faster and compatible - what's not to like? But elsewhere I read, 'hold on'. It actually isn't faster with things that use a lot of C optimizations (like numpy and pandas, which I use extensively). I don't mind small startup JIT penalties, but I don't want to have my core code run more slowly, or fail. Is there a simple direction? (ie: Use standard cPython or use PyPy) Or is it, as always, 'it depends'?
10
goodwink 1 day ago 0 replies      
This really should have been written in the form of a parody of Snakes on a Plane.
11
vezzy-fnord 1 day ago 2 replies      
This is basically an article for beginners that explains the difference between a language's specification and its actual implementation, using Python as its illustration.
12
DonGateley 1 day ago 2 replies      
"CPython makes it very easy to write C-extensions for your Python code because in the end it is executed by a C interpreter."

Say what? What C interpreter might that be? In the end it is executed by machine code compiled from C programs designed so as to implement a Python virtual machine. Much, much different and much harder to make the intended connection.

Now why, really, is it easy to write C extensions for your Python code?

13
hardwaresofton 1 day ago 1 reply      
Awesome article, I thoroughly enjoyed it -- sent me on a tangent to learn more about Tracing JIT compilers, RPython (and eventually I started looking for python-interpreted lisp, haha).

Blog posts that help me understand my tools/ecosystem better (rather than just harping on some recent happening in SV, for example) are awesome.

14
Grue3 1 day ago 0 replies      
That's not many. Most older languages have even more implementations.
15
bsaul 1 day ago 0 replies      
Since the JVM is also JIT compiled to native, is the JVM Python implementation as fast as PyPy?
16
grdvnl 1 day ago 1 reply      
Why does one need a Jython interpreter once the bytecode is available for a JVM? Isn't there the standard Java that I can use?
17
spacemanmatt 1 day ago 0 replies      
This article lost credibility pretty early for carelessly conflating tokenized python with Java bytecode. That is unfair to readers that don't already understand the difference. It's not a trifle.
18
DrJ 1 day ago 0 replies      
TBH there are only 2 versions of Python I feel are worth worrying about/using: 2.7 and 3.3.
19
CmonDev 1 day ago 1 reply      
"It all starts with an understanding of what Python actually is."

It's basically a script language.

20
wmnwmn 23 hours ago 2 replies      
The real question is why is there one python. Perl was already doing everything python can do and it is pointless to write every supporting library in 20 different languages. You can write good or bad code in any language; the language is nearly irrelevant. Of course, I wouldn't propose going to Bourne shell or something.
20
Taking PHP Seriously infoq.com
188 points by kmavm  2 days ago   109 comments top 22
1
flebron 2 days ago 4 replies      
I really liked this talk.

PHP was my first language when I was starting out professionally, and before I went into university and learned "proper programming". Back in 2007-2008, I remember the mess it was, with the internals list being a permanent struggle to implement anything by consensus (no lambdas or JS array notation because it wasn't easy to google, for example), and it really did look like a dead end. The internal source code of PHP was often a mess of macros everywhere, and the whole PHP6 unicode fiasco really did paint a grim picture of its future.

Facebook seems to have given it a bit of fresh air, implementing some pretty interesting stuff (Hindley-Milner with subclasses, for instance), and it really seems like the "feel" of programming in PHP has changed. I'm not going to say "Screw C++ and Haskell, _this_ is a serious language!", but on the other hand I feel I can say with a straight face to someone who is starting programming, "You could check this language out", without a guilty conscience that I'll be ruining their mind.

I'm unsure of PHP's future - if it'll be tied to Facebook (and thus Facebook's future, which I am equally unsure of), for example - but as of now, it seems to be a reasonable, if idiosyncratic, language.

So yeah, good talk :)

2
SCdF 2 days ago 3 replies      
At my job I maintain some PHP that we are slowly removing (and replacing with Scala for the backend and Angular for the frontend). My total experience with PHP prior was writing a tiny wordpress plugin a decade ago that added some anti-spam stuff to comments.

And you know, PHP, it's not so bad. It gets the job done, it deploys instantly (the rest of our code is mostly Scala. Oh. My. God. Just. Build. Already). It can be pretty haphazard, you can do some funky stuff with it (the ability to pass by reference on integers and other 'primitives' caught me by surprise) and I wouldn't pick it for new code, but it's certainly not the steaming heap of evil that people make it out to be.

Maybe our code is just awesome, idk.

3
nikcub 2 days ago 2 replies      
slideshare link, for those who want to avoid the InfoQ marketing signup maze:

http://www.slideshare.net/zerutreck/taking-php-seriously-kei...

4
camus2 2 days ago 2 replies      
Interesting talk, curious about Hack.

PHP is a horrible language with great libraries and a few good features. A lot of successful projects are written in PHP. The execution model is good enough for some types of apps (blogs, e-shops, ...).

As a PHP developer, I'm betting on Python and NodeJS for the future; I don't believe PHP has any real future outside a bunch of popular CMSes. While PHP has excellent libraries (Symfony, Doctrine, ...), Python has very good ones too, and it IS truly multipurpose.

When I first came to Python I did not like its OOP model (no interfaces, ...), but Python's metaprogramming features are unique and very interesting to learn.

The biggest problem with PHP is its core developers: adding some "feature" is not fixing the language. Removing the bad parts should be the focus of the next PHP versions. It is not.

5
girvo 2 days ago 2 replies      
God-damn.

I just spent the last week working on a new language that... turns out, is basically Facebook's new language: Hack. Although, it does seem they've not released it to the public, so I suppose I'll keep working on it, heh.

My new language steals PHP's "shared-nothing", "bootstrap from nothing at request" but steals Typescript (and Hack's, apparently) gradual typing system, along with a saner, less ridiculous StdLib.

6
joeblau 2 days ago 1 reply      
PHP is a strong language for the web, but the community is not. I really only see Facebook pushing the language and even though Steve Ballmer has been memed to death about his "Developers" rant, the developer community is really what makes languages like Ruby/Rails and Node.JS more attractive.

I definitely applaud all of the work the Facebook team has done in order to push the limits of PHP and turn it into something that developers are actually comfortable working with. With that being said, I still don't think it's better than I thought it was.

7
anuraj 2 days ago 1 reply      
I came to PHP after a long stint with C, C++, Objective C, Java and Javascript - and I was productive from day 1. I have programmed extensively and have come across a few quirks, but I never got stuck, nor is my code spaghetti. I do not use frameworks where they can be avoided, and I write extremely simple code and avoid OOP for scripting. I would not do OOP in a language without strong typing. My take is that for short-lived HTTP requests which preferably just need to spew out some JSON (favor headless coding for the frontend), PHP is the apt tool. For heavy lifting, use a strongly typed language which can easily handle complexity.
8
dkhenry 2 days ago 1 reply      
This is not about taking PHP seriously. This is about taking Facebooks internal development language _based_ on PHP seriously. By the time you factor in a different run time and the various language extensions they have made you no longer have PHP. In fact everything they have done has been to turn their existing PHP codebase into a better language and to their credit they have added some excellent tools to make that migration smooth. The question is at what point are you no longer writing PHP but writing facebook PHP with a VM that still lacks documentation and in the name of backwards compatibility carried forward the warts of the old PHP.
9
jimmytidey 2 days ago 3 replies      
I love that he says "if you are a person like me, who is a c++ compiler, I mean c++ person" (16.20)
10
dscrd 2 days ago 0 replies      
Generally a good talk and a good defense, but I didn't quite appreciate his new definitions of "State" and "Concurrency".

And did he really compare a full operating system to Facebook's web apps in terms of complexity? If that's true, there has to be a lot of accidental complexity in FB.

11
martin_ 2 days ago 1 reply      
Hack sounds pretty awesome, it's strange I haven't seen mention of that alongside hip hop before - also he mentioned "hacklang.org" but that seems completely different
12
jhh 2 days ago 0 replies      
I think this was a worthwhile talk. The speaker makes some interesting points about the importance of "ergonomic properties" of languages.

He convincingly explains why PHP was so successful: Workflow (just hot reload), Concurrency (in the sense of "shared nothing" HTTP requests), and State (every request starts from a "clean plate", which reduces long-lived statefulness bugs, if you know what I mean).

An interesting point he raises is that PHP leaks its GC mechanism to the API, which makes implementing a fast VM harder for the HHVM team.

13
mitchtbaum 2 days ago 0 replies      
Whoah! Thank you for linking to this video series. Another video from this Strange Loop 2013[0] conference is on Symmetry in Programming Language Design[1]. I am finding it to be way more up my alley.

[0]: http://www.infoq.com/strange-loop-2013/

[1]: http://www.infoq.com/presentations/noether

Edit: Posted to https://news.ycombinator.com/item?id=7054815

14
dogweather 2 days ago 1 reply      
Does it attempt to refute John Siracusa's _dissertation_ on 5by5? That's the first hurdle for any pro-php article.
15
pknerd 2 days ago 1 reply      
A post I just made about what PHP has become now.

https://plus.google.com/118197810020432218051/posts/17fNVWVj...

16
apanonymous 2 days ago 0 replies      
I haven't moved on to learning PHP yet; I've been using Python quite regularly for the past few years, and a lot of JavaScript. PHP could possibly be the next language for me to learn.
17
chronomex 2 days ago 4 replies      
Is such a thing even possible?
18
sefk 2 days ago 0 replies      
Nice talk Keith. Honest, clear, and informative. Thanks.
19
alien3d 2 days ago 1 reply      
I'm waiting for another language as flexible as PHP... one where you can write code without thinking about error messages.
20
puppetmaster3 2 days ago 0 replies      
If you use PHP, you can't CDN your HTML. If you use PHP, you can't PhoneGap build your HTML.

I recommend pure HTML (and call API/CORS). (Same is true of ASP, etc.) Pure HTML!

21
jbeja 2 days ago 0 replies      
Yes, more PHP please!.
22
sudy 2 days ago 0 replies      
I am a PHP developer and I like it
21
N.S.A. Devises Radio Pathway Into Computers nytimes.com
187 points by nealyoung  1 day ago   138 comments top 21
1
todayiamme 1 day ago 11 replies      
I do understand that it is now the status quo to disavow everything the NSA is, but foreign intelligence gathering is their mission and releasing these details simply doesn't help the cause of fixing the NSA's less savoury incursions.

While arguably any foreign intelligence agency of note isn't going to be caught off guard by these leaks, leaking these details does offer political ammunition to the very people who stand to gain from the expansion of the NSA's mission into civilian data gathering. It helps to make the case that the leaks aren't such a good thing after all and are compromising the intelligence gathering apparatus of the US of A. Add a bit of spin and you can quickly use this to get back to business as usual and people will actually support them as now it'll become a matter of identity instead of what it should be - a surgical exploration of a cancer afflicting a nation state.

2
PythonicAlpha 1 day ago 2 replies      
"We do not use foreign intelligence capabilities to steal the trade secrets of foreign companies"

Nobody with an unspoiled mind and following the news last year will believe this bullshit.

If there is anything people all over the world (also in the US) should have learned, it is that statements from people in some federal US organisations cannot be believed at all -- in many cases the complete opposite is true.

3
staunch 1 day ago 2 replies      
This sounds like a non-issue to me. Any person on this site could create little USB devices for stealing data. It's nothing special or new. I thought I was going to hear that they're light years beyond Tempest[1] or something. Feels good to finally hear an NSA story that doesn't depress me.

1. http://en.wikipedia.org/wiki/Tempest_(codename)

4
rl3 1 day ago 1 reply      
"In most cases, the radio frequency hardware must be physically inserted by a spy, a manufacturer or an unwitting user." [emphasis added]
5
beloch 1 day ago 0 replies      
Well, I suppose it's time for the tin-foil-hat crowd to turn their computer cases into Faraday cages then! Of course, these NSA gizmos might plug into ground and detect radio-induced current fluctuations. Given how many computer cases are metal, this might be the obvious way to go actually. So... Faraday cage and a really expensive ground conditioner?
6
vonnik 1 day ago 2 replies      
The only thing that's more disappointing than the NSA spying is the NYT sitting on this scoop for more than a year and letting Der Spiegel break it. Only slightly less amazing is that Der Spiegel and Jacob Appelbaum were talking about this more than two weeks ago, and the NYT diddled until now. Incredible. https://www.youtube.com/watch?v=vILAlhwUgIU
7
vxxzy 1 day ago 4 replies      
Transmit as far as "EIGHT Miles". Does anyone know what type of power this would take? I imagine if they used a less noisy frequency combined with sensitive receiving equipment, it would not take much. I used to play with CB radios which has a cap at 4W, with a good antenna, one could transmit 7+ miles in good situations.
8
Theodores 1 day ago 0 replies      
9
codex 1 day ago 11 replies      
This is another example of how Snowden has compromised national security by leaking secret information that has nothing to do with American metadata and everything to do with the NSA's charter and legal mission.
10
lazyjones 1 day ago 1 reply      
So, any chances of finding such a device out in the wild? Suggestions for detecting the most likely used type of radio transmissions? How can they transmit over 5Km with USB power and no antenna?
11
danso 1 day ago 0 replies      
Given that the NSA's mission is to do surveillance against foreign targets ("There is no evidence that the N.S.A. has implanted its software or used its radio frequency technology inside the United States.")...the techniques described here actually seem to be in line of what you imagine the NSA is supposed to be doing. At least it's surveillance that requires them to have a physical targeted presence, rather than just drinking from the telecommunications firehose.
12
NKCSS 1 day ago 1 reply      
I remember an article on here a while back of a well known security or cryptology researcher that had a machine get re-infected by unknown malware time and time again without a network connection, who also observed radio waves and thought that was the iv...
13
xanth 1 day ago 1 reply      
So now one needs to run BSD, air-gapped and in a Faraday cage, to be 'secure'... So now what does one do with it?
14
oceanplexian 1 day ago 1 reply      
> The technology, which has been used by the agency since at least 2008, relies on a covert channel of radio waves that can be transmitted from tiny circuit boards and USB cards

Obviously if someone has physical access to a machine it can be compromised. Replace "USB Cards" with "USB WiFi stick" and you've achieved the same thing.

This is just FUD. Machines that are air-gapped from the Internet with tight physical security are as secure as ever.

15
f_salmon 1 day ago 0 replies      
> Richard A. Clarke, an official in the Clinton and Bush administrations who served as one of the five members of the advisory panel, explained the group's reasoning in an email last week, saying that it is more important that we defend ourselves than that we attack others.

Pretty frightening that such things apparently still need to be said.

16
ShirtlessRod 1 day ago 0 replies      
My favorite part:

"The technology, which the agency has used since at least 2008, relies on a covert channel of radio waves that can be transmitted from tiny circuit boards and USB cards inserted surreptitiously into the computers."

Oh, so they only need physical access to the machine, and then they can do stuff to it? It's like magic!

17
__pThrow 1 day ago 0 replies      
I have to admit I was disappointed these seem to require radio transmitters be added to the device. Was sort of hoping to discover there were little antennas built into Intel processors or nvidia video cards.

However, I now know more about what DARPA's littlest flying robots will be doing, especially the ones already described as little more than chips with wings.

18
higherpurpose 1 day ago 0 replies      
This article feels like NSA bait to me. It's like NYT is trying to make NSA look good.
19
snambi 16 hours ago 0 replies      
very good use of tax payer money!
20
notastartup 1 day ago 0 replies      
Oh man, when does this stop? These guys are clearly breaking the law, all in the name of "keeping us safe from terrorists". This needs to be stopped. All the perpetrators of this program must be brought to justice before a court that adheres to the principles of democracy and freedom.
21
zerny 1 day ago 0 replies      
badBIOS and now this. Sigh.
22
Living a High-DPI desktop lifestyle can be painful on Windows hanselman.com
176 points by henrik_w  1 day ago   137 comments top 26
1
pilif 1 day ago 9 replies      
You can say what you want about Apple in general, but the way they "fixed" this issue with the retina Macs is so much better than what Windows is doing.

Windows has had the DPI selector ever since 3.1 (or even 3.0), but because nobody traditionally tweaked the settings, nobody bothered to make their apps look right and because nobody bothered to make the apps look right, nobody tweaked the settings because running higher DPI modes was breaking apps all over the place.

Mac OS did this too - I think in the 10.4 timeframe there was an option to actually switch into a higher DPI mode and many of the OS-internal UI assets were vector images. They probably noticed that it will never work out, so they opted to go for a hack:

With the exact quadrupling of the resolution, we got a solution that works mostly transparently for the applications. They still think they are drawing with a one-pixel resolution, so the burden of getting this right moved from the app makers to the OS maker.

Even if an application has no modifications for retina displays, it will look mostly right (minus some blurring issues for pixel art). There will be no scaling issues, no texts will be cropped and everything will be scaled by the same factor.

If you want to 'optimize' your app for retina, all you do is provide higher resolution bitmaps and you're mostly fine.

The exception is some text-editors (sublime) and browsers (chrome) that were doing some manual text rendering, not relying on the OS API. These had to be fixed manually, but there really weren't that many applications like that.

Yes, the way Windows does it is probably more "purist". Yes, the way Windows does it allows for arbitrary scaling factors (also sub 200%).

But it doesn't work in practice.

Yes. The Apple solution is a hack. Yes, it doesn't allow scaling to arbitrary factors. Yes, providing 2x bitmaps instead of one vector image is annoying.

But it works in practice.

On a retina mac, you'd never see the issues the OP complained about seeing on their Windows machine. What you get there might be superior from a technical standpoint, but it all boils down to an ugly half-working mess because developers just don't bother to get it right.

I'm not excluding myself here. I've done a few windows apps and I f'ed up high dpi modes as many times as everybody else - also because my development environment (Delphi) made some assumptions that just didn't work well with high-dpi modes.

Getting HDPI right on Windows (Desktop): Really hard and thus not worth it for most developers. Getting HDPI right on the Mac: Trivially easy.

2
latitude 1 day ago 2 replies      
Just to be clear - supporting High-DPI on Windows is not as simple as adding scaled up resources. Instead, it is an absolutely royal pain.

For one, you can't just throw hires bitmaps into a resource section of .exe and expect them to magically work. There needs to be code that looks at current DPI and picks matching bitmap.

For two, standard DPI levels are 96, 120, 144 and 192, but guess what? All other values in between and above 192 are game too. In fact, there's a nice little slider in the Control Panel that encourages you to put it somewhere in between. This means that your code either needs to rescale bitmaps to match these odd DPIs or use the largest one that fits. In either case the result will look like butt.

For three - the dialog layout. If your dialogs have text that's longer than 3-4 words, the chances are that it will either overflow, underflow or wrap differently under different DPIs. This in turn means that you need to test dialog appearance with at least 4 different font sizes and Tahoma 8px for Windows XP. Do you know how hard it is to word a longish sentence so that it would fill about the same space with all 5 combinations? Really damn hard and very time consuming.

But wait! There's more.

Every app icon needs to exist in at least 9 sizes, like so - http://imgur.com/5Pe2ZV0 - and this would still miss some cases where Windows will scale an arbitrary chosen icon image and use it.

It really is a mess. However, this is not something unexpected if you've been writing for Windows for a while. This mess is routine.
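For the first two points, a hedged sketch of what the detect-and-pick step can look like, here via ctypes on Windows (the constant is from the Win32 headers; the asset table is made up, and what GetDeviceCaps reports also depends on the process's DPI-awareness setting):

    import ctypes

    LOGPIXELSX = 88  # GetDeviceCaps index for horizontal logical DPI

    def system_dpi():
        hdc = ctypes.windll.user32.GetDC(0)  # device context for the screen
        try:
            return ctypes.windll.gdi32.GetDeviceCaps(hdc, LOGPIXELSX)
        finally:
            ctypes.windll.user32.ReleaseDC(0, hdc)

    ASSET_SCALES = {96: "icon_100.png", 120: "icon_125.png",
                    144: "icon_150.png", 192: "icon_200.png"}

    dpi = system_dpi()
    # Pick the nearest standard level; anything in between still needs rescaling.
    nearest = min(ASSET_SCALES, key=lambda level: abs(level - dpi))
    print(dpi, "->", ASSET_SCALES[nearest])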

3
modeless 1 day ago 0 replies      
The Windows DPI scaling control panel has DPI scaling bugs: http://i.imgur.com/zi80IhG.png

That's all you need to know about high DPI support in Windows, in a nutshell.

4
msy 1 day ago 1 reply      
Can we get 'on windows' added to the title? None of this is about high-DPI/Retina displays in general, it's just Windows and Windows ecosystem issues
5
interpol_p 1 day ago 1 reply      
One area where Apple has been incredibly aggressive and Microsoft has been sorely lacking is in actually producing and preparing high resolution content well in advance of the display technology.

Apple knew this was coming and for years has demanded that icons, images, and artwork be provided in high resolution. They developed assets for their own software in high resolution before retina displays ever hit the market.

Contrast to this article where Microsoft's own Visual Studio (one of the "best cases" mentioned by this article) doesn't even have Hi-DPI icons. Microsoft's own software displays blurry icons and it's listed as a "best case."

Apple also pushed Intel very hard on driver support for their integrated graphics. Rewriting the driver-level image scaling to ensure that discrete and integrated GPUs produced identical images, again before retina displays ever came on market, to allow for seamless switching between GPUs on their hardware.

This sort of foresight was necessary to pull off the high density display introduction as smoothly as they did.

Same goes for multiple monitors, which "just worked" when Apple introduced their high density displays. You could connect a regular monitor to your laptop and drag windows across. The DPI and art assets adjusted dynamically. This is something Windows is only just now getting around to fixing in version 8.1.

6
pdkl95 1 day ago 0 replies      
http://www.antigrain.com/research/font_rasterization/index.h...

This is a few years old now, but is still one of the better explanations of the issues involved in modern font rasterization, with emphasis on how Microsoft's likely held back the entire industry by only realistically supporting widget fonts at the standard 96ppi. Really, anything other than the stock Arial 10pt/96ppi is going to cause layout issues.

In an effort to very-aggressively hint that font so it looks consistent, they throw out horizontal accuracy by rounding to pixel boundaries. Per-letter. Ouch.

The best part of the paper, though, is this image that moves the sentence to the right exactly 1/10 pixel each line:

http://www.antigrain.com/research/font_rasterization/sample_...

7
js2 1 day ago 1 reply      
Adobe everything.

Well at least Adobe is consistent. The Flash Installer is awful on OS X as well. For example, say you've disabled the translucent menu bar - you'll notice it's drawn translucent when the Flash Installer is the foreground app. (Nevermind the fact that the installer is a just an annoying wrapper around the OS X installer that seems to do nothing more than force you to quit your open browsers before it will continue. With some digging you can actually find where it's downloaded the real installer pkg, double-click that pkg to install it, then quit the wrapper.)

8
Aardwolf 1 day ago 6 replies      
Why are names like "High-DPI" used?

There was a time when 320x200 graphics with 256 colors was amazing. Would 800x600 have been considered "high DPI" back then?

Or in the time were people were playing Duke Nukem 3D at 640x480 pixels, would they have considered 1920x1280 to be "high DPI"?

Windows has transitioned all the way from 640x480 in Windows 3.11 to the resolutions of today, why would there be a problem with just another step up?

What is "high" DPI today? What will it be tomorrow?

9
ZoFreX 1 day ago 3 replies      
One of the most depressing areas of Windows to look at is the various control panel screens. Some of them, I kid you not, will show different bits of text in up to three different sizes. If Windows itself doesn't deal with different DPIs correctly, what kind of example does that set for application developers?

Another application that definitely deserves mentioning is Chrome. It looks absolutely terrible. It's really blurry and browsing the web for even a couple of minutes really strains my eyes. If you run high DPI Windows, you're going to need a different browser than Chrome (Firefox looks fine but unsurprisingly does not perform so well at such a high resolution. Internet Explorer is actually not a bad choice - high res and fast)

10
0x0 1 day ago 1 reply      
It's pretty amazing that Adobe is doing particularly bad. Keepers of pdf and postscript, you would think they knew all about fonts and dpi.
11
forrestthewoods 1 day ago 0 replies      
I'm actually impressed that windows apps support it at all. I didn't realize it could actually work properly if programs took the time to do it right. With High-DPI monitors finally coming out I'd expect dramatically improved support moving forward.
12
codeulike 1 day ago 0 replies      
Remote desktop is really annoying on high-DPI screens - the client can't scale up (although it can scale down), so you end up with a tiny view of your server.
13
riquito 1 day ago 9 replies      
What's the situation on GNU/Linux?
14
nycticorax 3 hours ago 0 replies      
It's funny that, in an article describing issues with a high-DPI display, the author never explicitly says the size of the screen (e.g. in inches), or the DPI. He just tells us it's 3200x1800. In case you were wondering, the display on the Lenovo Yoga 2 Pro is a 13.3", 16:9, 276 DPI display. Personally, I find this description more immediately useful than saying that it's a 13.3", 3200x1800 display.
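The arithmetic behind that figure, for anyone redoing it for another panel:

    \mathrm{DPI} = \frac{\sqrt{3200^2 + 1800^2}}{13.3\ \text{in}} \approx \frac{3671}{13.3} \approx 276 .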
15
danudey 18 hours ago 0 replies      
Changing your DPI settings on Windows can cause all kinds of strange issues. I had a problem a few years back where my Windows 7 gaming theatre PC thing I had set up and connected to my 1080p TV wouldn't work quite right; specifically Star Wars: The Old Republic wouldn't launch, and would crash on loading.

I couldn't figure out what the problem was, and googling was no help until I stumbled across the dumbest advice I'd ever seen: change the text scaling. Once I set it back to 100%, the game loaded fine. Frustrating, because at 100% I couldn't read any of the other text, which meant that if I wanted to play the game I basically had to navigate by icon and give up on reading dialogue boxes unless I wanted to sit on my coffee table instead of my couch.

16
spitfire 21 hours ago 1 reply      
I always liked what SGI Irix did. From the get go the desktop was all vector based. So you could run at 800x600 or 1280x1024 and either way things would look normal.

Both OS-X and Windows solutions are a hack.

17
bluedino 1 day ago 1 reply      
Isn't the credit due to the choice by Apple (NeXT, actually) to go with Display Postscript?

I'm sure they figured that as amazing as the resolution of the original NeXT monitor was (1120x832 in 1988), in the future it would be greatly exceeded. Not to mention they wanted to be able to use the same routines to draw to 300 or 600dpi printed paper as to the 72dpi (or whatever) monitors of the time.

18
ghusbands 1 day ago 0 replies      
I've had the pleasure of using a high-DPI display on Windows 7 for a while, now. There are some tricks that help significantly:

1. Calibrate ClearType to use as little color as possible (use the system magnifier to help); this way, when apps get scaled the text is just blurry rather than blurry with odd color-fringing.

2. For apps that are incompatible but suitably configurable, set their compatibility mode to disable HiDPI scaling and then set their font sizes (or default zoom) to be larger. This works well for Chrome and Skype, at least.

3. For those times when you momentarily have trouble, remember that the windows key and the plus key will zoom your whole desktop.

19
zokier 1 day ago 2 replies      
I guess the best way to solve this would be to (partially) ignore the dpi awareness flag in applications and instead apply a series of heuristics and white/blacklists (with user override) for determining if applications truly are hidpi capable. That would be bit of a pita for the applications that actually do support hidpi properly now and might get still bitmap scaled, but that would be just a temporary issue.
20
OrwellianChild 17 hours ago 0 replies      
As someone who prefers high-DPI screens for workflow management across programs (spreadsheets, web layout, graphic design, etc.), this scaling problem in programs is a frequent frustration.

Particularly working with media (DSLR images or 1080p video), it is difficult to scale GUI to a readable level while viewing the media itself at an un-scaled 1:1 pixel level. Do any editing suites for photo/video do this well? I'm talking about an un-scaled video window with a GUI that scales/wraps to an arbitrary window/screen size...

21
tux10 1 day ago 2 replies      
This is one of the biggest reasons I just can't move away from my regular-DPI laptop. Every time I have tried a HiDPI display with Windows it has been painful. Text looks great but icons and widgets look like shit. Things get better with every new version of Windows, but Microsoft has been promising true DPI independence since Vista and things are still crap.
22
nettletea 1 day ago 1 reply      
I can't even use Windows 8.0 on my 32-inch 1080p TV sitting a couple of metres away, even when keeping the native resolution and boosting all the accessibility settings to the max.

Even the metro Apps I can't use.

I'd need a 50ft display! It's another facet of display/accessibility issues.

I had high hopes after Media Centre, that Windows would get this right. And bought Windows 8.0 specifically for this purpose. Disappointing. I'm forever going over to the TV and my neck hates it.

Personally I think part of the solution is to change the menuing system, breaking it out from the window. If I can at least use the menus/controls on apps I have a good chance of using them.

23
maaarghk 1 day ago 0 replies      
I have a 166 dpi laptop at the moment and yes, windows was pretty bad. Only recently found out about the dpi settings thanks to a commenter here.

On another note I really want that laptop!!

24
blueskin_ 1 day ago 2 replies      
Is this guy having a laugh?

Every time I've upgraded my monitor, I've found it easier. If you don't want a window to be wider than a certain point, don't make it wider. Myself, whenever I see apple users, I see them with these tiny narrow windows with nothing else on screen, which I have never understood.

That said, I always value vertical resolution - I have never bought a 1920x1080 monitor as 1600x1200 is so much nicer, or 1920x1200 for the wider aspect ratio.

25
CaRDiaK 1 day ago 0 replies      
I got IntelliJ running really slick and smooth in QHD+.

Set the compatibility option to disable scaling for HiDPI. Then just set the text in settings to 26pt.

I now have a retina-looking IDE on my Ativ Book 9 Plus.

26
darkr 1 day ago 0 replies      
s/ a High-DPI desktop lifestyle can//
23
No-fly list takes legal hit mercurynews.com
175 points by andrewfong  1 day ago   49 comments top 9
1
srl 1 day ago 6 replies      
I thought the judge's name sounded familiar, so I checked: https://en.wikipedia.org/wiki/William_Haskell_Alsup

Yup. This was the guy who learned Java because he felt he needed to know how to program to be qualified to make a ruling in Oracle v Google. I usually dislike criticizing or praising individual judges (unlike politicians, it's not for the public to pick and choose -- nor should it be), but this one is really remarkable.

2
nowarninglabel 1 day ago 2 replies      
As someone who has had to deal with the list, good for this woman for fighting it. My name (or rather, the name I share with whomever it is the government was really after) led me to having five years of not being able to check in to flights after 2001 without going to the counter, giving ID and confirming my birth date, and waiting for the agent to make a phone call (presumably to Homeland Security), which would invariably take a solid 15 minutes of waiting, until I got the all clear and could be checked in, get my tickets, and board my flight.

Fortunately, sometime in '06 or '07 I received information on how to remove oneself from the list by filling out a form. I can't recall if I found it independently or if it was given to me by an airline agent, but a few months after turning that in I no longer had to deal with the list again.

Perhaps, I should have actually fought against it as well, but in my case it was always just due to the happenstance of having the same name as someone on the list.

3
fpgeek 1 day ago 0 replies      
To me the most shocking thing (of many) about this case is the steps DHS took to prevent the woman's daughter (a US citizen!) from returning to the US to testify:

https://www.techdirt.com/articles/20131204/10434025453/dhs-p...

4
linuxhansl 1 day ago 2 replies      
> The Obama administration has vigorously contested the case, the first of its kind to reach trial, warning that it might reveal top-secret information about the anti-terrorism program

How often and for how long will we hear this nonsense? The no-fly list has never caught or deterred a terrorist. We cannot stop the legal process with the blanket "national security" card.

5
greenyoda 1 day ago 1 reply      
For those who are interested, there's a blog called "Papers, Please!", run by a group called "The Identity Project", which has been following this trial in some detail:

http://papersplease.org

"The Identity Project explores and defends the fundamental American right to move freely around our country and to live without constantly having to prove who we are or why we are here."

6
crazytony 1 day ago 0 replies      
Even if I understand (but probably don't agree with) the existence of the list and the need to keep its contents confidential, I don't understand why the criteria/process used to put a name on the list needs to be confidential, other than that they want judges to rule from fear, uncertainty and doubt.
7
tempodox 1 day ago 0 replies      
The DHS should rebrand as the Department of Humiliation as a Service. And, aliens get this service for free, courtesy of the generous US tax payer.
8
avighnay 1 day ago 0 replies      
Once, a female security officer in Amsterdam airport made loud, snide remarks about how I had arranged the cables and stuff in my laptop bag. That by itself was very annoying; imagine being called out in front of everyone like a criminal for no fault of your own!
9
angersock 1 day ago 1 reply      
So, an interesting bit:

"In a decision for the most part sealed, U.S. District Judge William Alsup disclosed that Rahinah Ibrahim was mistakenly placed on the controversial list and said that the government must now clear up the mistake."

As I understand it, the vast majority of the US legal system is based on common law--that is, on using the rulings of previous judges and courts in order to determine how a new case is handled.

Isn't the idea of sealed decisions and secret courts pretty much striking at the very core of our entire system of justice?

24
How I Built a Raspberry Pi Tablet makezine.com
173 points by nkvl  2 days ago   36 comments top 10
1
rhgraysonii 2 days ago 0 replies      
For those interested in the actual build process and materials rather than just a writeup, here it is from his personal site: http://mkcastor.com/2014/01/02/pipad-build/
2
girvo 2 days ago 2 replies      
A compliment from Bunnie is something to be proud of! The PiPad looks awesome, well done :) I'm working on a palmtop in the vein of an HP 200LX, running RetroBSD on a PIC32 microcontroller. So much fun, but gosh, hardware hacking is harder than I thought. Newfound respect for the hardware people!
3
msoad 2 days ago 0 replies      
If you are interested in hardware components, these people have decent products: http://www.hardkernel.com/main/products/prdt_info.php
4
enscr 2 days ago 2 replies      
A quick grep for cost/price didn't turn up anything; sorry if I missed it. How much did it cost? Just out of curiosity, I'm wondering if it's possible to build an extremely cheap but waterproof, tough tablet for toddlers to play with. They don't care about performance at all.
5
peterburkimsher 2 days ago 0 replies      
I put my Raspberry Pi inside an iPod: http://peterburk.tumblr.com/pipod
6
greenyoda 2 days ago 2 replies      
Are there any commercially available Linux-based tablets out there that people can recommend?
7
blaze33 1 day ago 0 replies      
Just nitpicking over the website: loading the gif took some time... yeah, 18MB. It was a good occasion to test gfycat, which could have reduced it to 6MB: http://gfycat.com/OblongSilentAndeancat

Related discussion from 2 weeks ago: https://news.ycombinator.com/item?id=6975202

8
bitwize 2 days ago 0 replies      
I like the woodgrain bezel. It looks like a device from the movie Her.
9
kevinchen 2 days ago 2 replies      
This is a really cool DIY project, but probably not practical. That thing's going to be slow as molasses.
10
tostitos1979 2 days ago 0 replies      
I think this is wicked cool. Is the underlying desktop mouse-based or optimized for touch?
25
Tell HN: Server Status
172 points by kogir  3 hours ago   87 comments top 19
1
barrkel 2 hours ago 2 replies      
By tolerating the loss of two disks, do you mean raidz2 or do you mean 3-way mirror?

Raidz2 is not fast. In fact, it is slow. Also, it is less reliable than a two-way mirror in most configurations, because recovering from a disk loss requires reading the entirety of every other disk, whereas recovering from loss in a mirror requires reading the entirety of one disk. The multiplication of the probabilities doesn't work out particularly well as you scale up in disk count (even taking into account that raidz2 tolerates a disk failure mid-recovery). And mirroring is much faster, since it can distribute seeks across multiple disks, something raidz2 cannot do. Raidz2 essentially synchronizes the spindles on all disks.

Raidz2 is more or less suitable for archival-style storage where you can't afford the space loss from mirroring. For example, I have an 11 disk raidz2 array in my home NAS, spread across two separate PCIe x8 8-port 6Gbps SAS/SATA cards, and don't usually see read or write speeds for files[1] exceeding 200MB/sec. The drives individually are capable of over 100MB/sec - in a non-raidz2 setup, I'd be potentially seeing over 1GB/sec on reads of large contiguous files.

Personally I'm going to move to multiple 4-disk raid10 vdevs. I can afford the space loss, and the performance characteristics are much better.

[1] Scrub speeds are higher, but not really relevant to FS performance.
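
To make the IOPS-versus-space trade-off above concrete, here is a toy Python comparison of one wide raidz2 vdev against striped 2-way mirrors, using the common rule of thumb that a raidz vdev delivers roughly the random-read IOPS of a single disk while mirrors scale seeks with the number of disks. The disk count and per-disk figures are assumptions for illustration, not measurements of any particular box.

  # Toy model: 12 disks, each assumed to do ~100 MB/s streaming and ~100
  # random-read IOPS. Rule of thumb: one raidz vdev seeks like one disk;
  # striped 2-way mirrors scale seeks with the disk count. Illustrative only.
  MBPS_PER_DISK = 100
  IOPS_PER_DISK = 100

  def raidz2(disks):
      data = disks - 2                      # two disks' worth of parity
      return {"usable_disks": data,
              "seq_read_ceiling_MBps": data * MBPS_PER_DISK,
              "rand_read_iops": IOPS_PER_DISK}

  def striped_mirrors(disks):
      vdevs = disks // 2                    # 2-way mirror vdevs
      return {"usable_disks": vdevs,
              "seq_read_ceiling_MBps": disks * MBPS_PER_DISK,
              "rand_read_iops": disks * IOPS_PER_DISK}

  for name, layout in [("raidz2 1x12", raidz2(12)),
                       ("mirrors 6x2", striped_mirrors(12))]:
      print(name, layout)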

2
makmanalp 1 hour ago 3 replies      
The trend I'm noticing is people mentioning that if only HN was moved to <insert-cloud-provider>, problems would go away.

Instead of doing that, they probably dropped a bit more than a thousand dollars on a box, and are probably saving thousands in costs per year. This is money coming out of someone's pocket.

This site is here, and it's a charity, being provided free of cost, to you. Who cares if HN is down for a few hours? Seriously? Has anyone been hurt because of this, yet?

3
cincinnatus 3 hours ago 15 replies      
I'm sure it has been asked many times before, but I'd love to hear the latest thinking... Why in 2013 is HN still running on bespoke hardware and software? If a startup came to you with this sort of legacy thinking you'd laugh them out of the room.
4
hartator 3 hours ago 0 replies      
Not really related but any update on releasing the HN code again?

[the current release is pretty old: https://github.com/wting/hackernews]

5
JayNeely 1 hour ago 0 replies      
Being the sysadmin on a site frequented by sysadmins has to be frustrating at times.

Thanks for all you do!

6
conorh 1 hour ago 0 replies      
Have you thought about perhaps open sourcing the server setup scripts for HN? I'd love (and I'm sure many others here would) to help with the configuration. Perhaps a GitHub repo with some Chef recipes that people could work on, given the current servers?
7
ishener 3 hours ago 5 replies      
May I ask where the machines are hosted? Is that on AWS? If not, why don't you move to more reliable hosting, like AWS?
8
nmc 3 hours ago 1 reply      
Thanks for the info!

Out of curiosity, do you have an idea about the source of the corruption problems?

9
richardw 1 hour ago 0 replies      
Thanks for the update. No worries, it's just a news message board and no businesses are hurt when it's down. I quite enjoy seeing how these things are solved and I'm sure all will be forgiven if you post a meaty post-mortem.
10
erkkie 2 hours ago 1 reply      
This reminds me that I'm still looking for a (PKI?-)encrypted ZFS snapshot backup service, /wink-wink @anyone

Hoping the box has ECC RAM; otherwise ZFS, too, can be unreliable (http://research.cs.wisc.edu/adsl/Publications/zfs-corruption...)

11
avifreedman 2 hours ago 0 replies      
Assuming the disk footprint is small...

Would recommend a new SSD-based ZFS box (Samsung 840 Pros have been great even for pretty write-intensive loads), with raidz3 for protection, and zfs send (and/or rsync from hourly/N-minute snapshots for data protection, which should eliminate copying FS metadata corruption; not sure if zfs send will).

Happy to provide and/or host such a box or two if helpful.

12
rdl 2 hours ago 0 replies      
I still like hardware RAID because it's conceptually simple and nicely isolated. Sometimes horrible things happen to it, though, too.

I didn't realize HN had enough disk storage needs to need more than one drive. I guess you could have 1+2 redundancy or something.

13
shawn-butler 1 hour ago 0 replies      
Using DTrace to profile zfs:

http://dtrace.org/blogs/brendan/files/2011/02/DTrace_Chapter...

I'm sure other more experienced DTrace users can offer tips but I remember reading this book and learning a lot. And I believe all the referenced scripts were open source and available.

14
jffry 3 hours ago 0 replies      
Thanks for the writeup.
15
lukasm 2 hours ago 1 reply      
How about having the error page show the last static HN page? Most people just need links.
16
rincebrain 2 hours ago 1 reply      
ZFS instead of UFS on what, an Illumos derivative, FBSD, or actual Oracle Solaris?
17
carsonreinke 2 hours ago 0 replies      
Maybe you could provide details on the current configuration and architecture and some suggestions could be made on how to improve. Just a thought.
18
smalu 2 hours ago 2 replies      
The world would be a better place if software could exist without hardware.
19
waxzce 2 hours ago 1 reply      
Hi, I'm the CEO of http://www.clever-cloud.com/ and I'd be happy to help you with this; ping me on Twitter: @waxzce
26
Why is a minute divided into 60 seconds, an hour into 60 minutes? scientificamerican.com
159 points by smalu  1 day ago   147 comments top 20
1
spodek 1 day ago 16 replies      
As an American with a physics background, a while ago I casually reviewed how bad our non-metric system is -- http://joshuaspodek.com/metric-system-isnt -- and found it not nearly as bad as people treat it. Among other things, when I build things it's useful to divide in half a few times, which is easier with inches and feet. And I've found no benefit to Celsius's 0 and 100 coinciding with water's state changing.

I bring that up here because I've never heard even the staunchest metric proponents use kiloseconds or megaseconds or hesitate to use hours, minutes, days, and so on. I know people experimented with decimal times, especially around the French Revolution, but it didn't stick. It's funny when someone talks about the value of using base ten and then switches to base 60, base 12, and base 24 in the next sentence.

I should say that in physics experiments people used seconds only (which is where I learned that, to within about a percent, a year is pi times ten to the seventh seconds).
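
That approximation is easy to check with a couple of lines of Python (using a Julian year of 365.25 days):

  import math

  seconds_per_year = 365.25 * 24 * 3600          # Julian year in seconds
  approx = math.pi * 1e7
  error = abs(seconds_per_year - approx) / seconds_per_year
  print(f"{seconds_per_year:.4g} vs {approx:.4g} -> {error:.1%} off")   # about 0.4% off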

2
brudgers 1 day ago 0 replies      
12, 24, 60 are all used because they are cipherable using one's fingers.

To cipher on 12, pick a hand and assign the values 1 to 12 to each finger joint so that the tip of the index finger is 1, the middle joint of the index finger is 2, ... and the base joint of the little finger is 12. Use the thumb as a pointer to a number. Add and subtract by moving your thumb as you count.

Cipher on 24 by using each joint on both hands.

Cipher on 60 by using one hand to cipher on 12. The other to cipher on 5 in the traditional way but value each finger as 12. Example: Base joint of pinky on right hand and ring finger of left hand is 48.

To get the full Babylonian number system allow the exponent to float based on context. It's really just an extension of the move from ciphering on 12 to ciphering on 60.

Exercises:

1. [M05] Where are the indexes after adding 13 and 8?

2. [10] Change the system to use natural numbers.

3. [50] Is abandoning sexagesimal ciphering for decimal ciphering the oldest case of changing a computational system so as to make it easier for beginners at the expense of vastly reduced expressive power?

http://en.m.wikipedia.org/wiki/Babylonian_number_system
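
A small Python sketch of the scheme described above, where one hand counts joints 1-12 and each raised finger on the other hand is worth 12 (the function names are just for illustration; the example reproduces the 48 from the comment):

  def to_hands(n):
      """Split 1..60 into (fingers worth 12 each on one hand, joint 1..12 on the other)."""
      assert 1 <= n <= 60
      fingers, joint = divmod(n - 1, 12)
      return fingers, joint + 1

  def from_hands(fingers, joint):
      return 12 * fingers + joint

  assert from_hands(*to_hands(48)) == 48
  print(to_hands(48))   # (3, 12): three fingers (3 x 12 = 36) plus the pinky base joint (12)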

3
dirktheman 1 day ago 1 reply      
After the French Revolution there was a short period (3 years) when the French had decimal time. It didn't catch on because it meant the workers had 10-day workweeks instead of 7.

http://en.wikipedia.org/wiki/French_Republican_Calendar

As a result, decimal clocks from that era are very rare and highly sought after!

4
nmc 1 day ago 3 replies      
Interesting history lesson about the Egyptians' use of the duodecimal system.

I believe the last argument is understated: one big advantage of base 12 over base 10 is division by 3. This offers many ways of dividing a time interval into several sub-intervals of identical duration.

For base 60, this intensifies: as mentioned in the post, 60 is the smallest number divisible by 2, 3, 4, 5, and 6. This gives tremendous flexibility for dividing a time interval.
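
The divisibility argument is easy to see with a few lines of Python:

  def divisors(n):
      return [d for d in range(1, n + 1) if n % d == 0]

  for base in (10, 12, 60):
      print(base, divisors(base))
  # 10 [1, 2, 5, 10]
  # 12 [1, 2, 3, 4, 6, 12]
  # 60 [1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, 60]

  # Even splits of an hour into 2..6 parts, all whole numbers of minutes:
  print([60 // k for k in (2, 3, 4, 5, 6)])     # [30, 20, 15, 12, 10]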

5
treenyc 1 day ago 2 replies      
Short version. They don't really know.

"Although it is unknown why 60 was chosen, it is notably convenient for expressing fractions, since 60 is the smallest number divisible by the first six counting numbers as well as by 10, 12, 15, 20 and 30."

6
justwrote 1 day ago 3 replies      
You can also find the duodecimal system in languages like English and German:

  ten, eleven, twelve | thirteen, fourteen, fifteen, sixteen
  zehn, elf, zwölf | dreizehn, vierzehn, fünfzehn, sechzehn

7
petercooper 1 day ago 1 reply      
And why are multiplication tables in school typically taught up to 12x12? Historically in the UK, there were 12 pence in a shilling, 240 pence in the pound, 12 inches in a foot, etc, but I'm not sure of the value nowadays.
8
sdfjkl 1 day ago 1 reply      
Interesting, although I stopped reading at the end of page 1. It seemed the article already explained most of it and while I would've scrolled down to skim the rest of the article, waiting for a page load seemed too much effort.
9
mVChr 22 hours ago 1 reply      
> Interestingly, in order to keep atomic time in agreement with astronomical time, leap seconds occasionally must be added to UTC. Thus, not all minutes contain 60 seconds. A few rare minutes, occurring at a rate of about eight per decade, actually contain 61.

And thus, the programmer's nightmare begins...
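
One concrete sliver of that nightmare: most standard libraries refuse to even represent second 60. A minimal Python illustration, using the real leap second inserted at the end of June 2012:

  from datetime import datetime

  try:
      datetime(2012, 6, 30, 23, 59, 60)   # 2012-06-30T23:59:60Z, an actual leap second
  except ValueError as e:
      print("stdlib says:", e)            # second must be in 0..59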

10
headbiznatch 18 hours ago 0 replies      
Fun topic which inspires me to mention two goodies that help fuel in-depth conversations about measurement and conversion:

1) The Measure of All Things - http://www.kenalder.com/measure/ (science history goodness)

2) Frink - http://futureboy.us/frinkdocs/ (one of my first discoveries on HN and still one of the most fun to return to)

11
abgv 1 day ago 1 reply      
From what I've read, the reason is simple:

Base 12: 12 is a number that can be divided by 2, 3, 4 and 6. This makes it a much better fit than base 10, which can only be divided by 2 and 5.

Base 60: As good as base 12 is, it misses division by 5. So what do you do to make it divisible? You multiply 12 x 5 = 60.

Now you can divide an hour into 2 parts of 30 minutes each, 3 parts of 20 minutes, 4 parts of 15 minutes, 5 parts of 12, or 6 parts of 10 minutes. This also means that if, for example, you want to divide a day into 3 shifts, every shift will be 8 hours, not 3.3333333 hours or similar, which is what you would get in a base-10 system.

I mean, the stars and the gods and the tips of our fingers might also be a justification, but I think those were rationalized after the fact. I find it hard to believe that the people who came up with base 12/60 didn't realize the particular properties of those numbers.

12
chiph 1 day ago 0 replies      
I asked this of a curator at the British Museum several years ago. And he replied that it was the Sumerians that first adopted the 24 hours in a day convention. But he didn't know who came up with 60 minutes in an hour.
13
yardie 1 day ago 1 reply      
I read this book about the history of numbers.

http://www.amazon.co.uk/gp/product/0747597162/ref=oh_details...

The book's author says the Babylonians had a base-60 system, and that some native cultures have none at all (well, 1 and many).

14
auvrw 1 day ago 0 replies      
"The trains are not only running on time, they're running on metric time."
15
carsonreinke 22 hours ago 0 replies      
Thought it was very interesting to think of the minute/second as base 60.
16
jokoon 22 hours ago 0 replies      
A sexagesimal base has some advantages.

You can divide it by 10, 5, 4, 3, 2, etc.

17
netcan 1 day ago 0 replies      
legacy systems
18
VLM 1 day ago 0 replies      
Aside from what's previously been discussed, the pendulum length is convenient, and water-drop "clocks" are fairly reasonable at one drop per second.

Also people can count one digit per second pretty easily if the point is to cook or process something for 45 seconds or whatever. That would be tough if the second were 100 times smaller than it is.

It's a numerical base with two "digits", not just one digit. So it's not just 60 sec/min, it's also 60 min/hr, and if you arbitrarily decided to use 2 for both, or 1000 for both, you wouldn't get multiple levels that result in the second being useful. If you used 2 for both, aka binary, then each new-second would be 900 of our seconds long; that's useless. If you used 1000 for both, then a new-second would be about 3 ms, which might be handy for power EEs (not the RF guys...) but seems a bit inconvenient for the ancients.

One curiosity from the chem lab from decades ago: measuring to a milligram isn't all that challenging, and a candle burned about a mg of wax per second (or was it a tenth?). Anyway, I'm well aware the gram is pretty recent, but the point is that your stereotypical apothecary type in the ancient world should have been able to build a "mg-capable" balance pan scale, or at least approach it, so weighing a candle before and after would be a not-too-awful way to measure time, and the least they could measure might have been around a second.
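
To put numbers on the two-level-base point above, here is a short sketch computing how long a "new second" would be if each hour were split into b "minutes" of b "seconds" each (keeping 24 hours per day, as the comment does):

  HOUR = 3600.0   # our seconds per hour

  for b in (2, 10, 60, 1000):
      new_second = HOUR / (b * b)
      print(f"base {b:>4}: one 'new second' = {new_second:g} of our seconds")
  # base    2: 900 s    (useless, as noted above)
  # base   60: 1 s      (what we actually use)
  # base 1000: 0.0036 s (about 3.6 ms)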

19
squirejons 1 day ago 1 reply      
why is the sky blue, Daddy?
20
fjcaetano 1 day ago 5 replies      
>The Greek astronomer Eratosthenes (who lived circa 276 to 194 B.C.) used a sexagesimal system to divide a circle into 60 parts in order to devise an early geographic system of latitude, with the horizontal lines running through well-known places on the earth at the time. A century later, Hipparchus normalized the lines of latitude, making them parallel and obedient to the earth's geometry. He also devised a system of longitude lines that encompassed 360 degrees and that ran north to south, from pole to pole. In his treatise Almagest (circa A.D. 150), Claudius Ptolemy explained and expanded on Hipparchus' work by subdividing each of the 360 degrees of latitude and longitude into smaller segments. Each degree was divided into 60 parts, each of which was again subdivided into 60 smaller parts. The first division, partes minutae primae, or first minute, became known simply as the "minute." The second segmentation, partes minutae secundae, or "second minute," became known as the second.

This makes no sense. For this to be true, it implies that the ancient Greeks already had knowledge that the Earth is round, 1,600 years before Galileo.

27
Poll: As a founder, what is your salary?
158 points by robg  1 day ago   149 comments top 48
1
ryguytilidie 1 day ago 15 replies      
I was thinking about this recently and I had a bit of a question. People used to always ask "wow, why do founders get so much equity and dish out so little to employees?" and the answer would always be "because the founders take all the risk". In the last few companies I worked for, none were profitable, all had founders making $150k+, and all took at least $1M from our rounds out for themselves. It made me wonder where this supposed risk comes from, and why VCs don't seem to mind founders paying themselves so much.
2
buro9 1 day ago 3 replies      
I don't mind sharing the details of mine.

Year one salary was £4,800 per year.

Year two salary is 12k.

London, UK based start-up, very little revenue (working with a few test customers, opening the doors very soon as we're quite far down the road on seeing the output of our tests).

I suspect the wage will stay at the 12k level until our active users pass 50k and the revenue passes break-even. At which point we want to raise money to fund growth, but the salaries will likely rise to help us focus on that growth (rather than how the hell we cover the credit card bill).

We are seed-funded and haven't yet gone for the A round.

3
tptacek 1 day ago 1 reply      
This isn't going to mean much: founder of what? A company that is only going to be lucrative if it has a major liquidity event in the future? Does the company have traction yet? How many employees at the company?
4
eieio 1 day ago 11 replies      
At this point a salary of 0 is easily in first place for the poll, with around 45% of the votes (116 out of 267).

I understand that many folks like to randomly vote on polls on HN: I remember several people explaining that they cast an incorrect vote on the last "how old are you" poll.

The fact that we have 267 votes in the first 45 minutes of this poll makes me more suspicious of the votes: I have a hard time believing that 5 founders per minute have read this poll and voted on it.

So I'm curious. Founders, how many of you truly take no salary? If you take no salary or close to no salary, how do you compensate? Is it via savings, tricky shenanigans where the business pays for things for you, or something else entirely?

5
beat 1 day ago 1 reply      
Zero, but zero revenue.

It will need to hit $100k once revenue can support it, for the sake of marital bliss. Worse, until there's revenue, I'm dayjobbing to make ends meet. I can live on savings for a while, but not pre-revenue.

6
abalone 1 day ago 2 replies      
Here's a question: For those of you who are bootstrapping making <$25K or so and live in the U.S., what are you doing for health insurance now?

I just found out that in California at least, you will be shunted to Medi-Cal, i.e. the program for people living in poverty. (At least up until now it was.)

You do not have the option of getting a subsidy for private insurance if you qualify for Medi-Cal. It is not either/or. You need to exceed a minimum income threshold for that -- and they no longer look at assets/savings to calculate that, only your income.

So, if you want a doctor that only takes private insurance, you'd have to purchase it without a subsidy. And of course, the unsubsidized market price has shot way up now. (For my plan, it doubled.)

So... what are y'all doing? Medi-Cal? Full-price private insurance? Uninsured and taking the penalty? Haven't thought about it yet?

7
OoTheNigerian 1 day ago 0 replies      
Perhaps it would be good to state profit/revenue rate and/or whether it is funded and at what stage.

This is not really clear-cut enough to use to make your own decision.

8
matthuggins 1 day ago 0 replies      
I think this poll would have made more sense as a set of ranges instead. If I'm making $10k a year, would I answer $0 or ~$25,000/yr? The poll should have looked more like:

-> $0/yr

-> $1 - $25,000/yr

-> $25,001 - $50,000/yr

-> etc.

9
rexreed 1 day ago 0 replies      
Whatever I can afford to pay myself based on the previous month's net income. The more net income, the more I pay myself.
10
toblender 1 day ago 1 reply      
What really amazed me was that the CEO of our $10-million-a-year company was getting a salary of 0 dollars. He grew the thing from 0 dollars a year in income to its current earning power, too.

They repaid him by firing him from the job, although he is still on the board.

When I had coffee with him he seemed completely ok with it though... what a champ.

11
kapkapkap 1 day ago 0 replies      
Is this for founders whose startup is their full-time gig? Or for founders who are also still maintaining a full-time job?

A salary of $0 is quite easy if you're still making $150k from another job. If the startup is your only source of income... not so much.

12
lquist 1 day ago 1 reply      
Bootstrapped pass-through entity.

Year 1 Salary: $100k
Year 2 Salary: $600k-$1M (6 months in to Year 2. Estimated salary)

13
primitivesuave 1 day ago 3 replies      
I run a bootstrapped education business, and was told by other bootstrappers that one should convert personal expenses into business expenses, like car payments (make your car a company car) and food (it's for the office, but sometimes certain employees take it home with them, wink wink). Then, pay yourself as little as possible to cover those impulsive purchases that we are all prone to, and focus on building the value of your company.

I later learned from a wise HN'er that you can also claim rent on your apartment as a business expense. Although I'd love to support social welfare and bureaucratic spending with my income taxes, I'd rather maximize how many jobs I can create for hardworking people with my business.

14
paulirish 1 day ago 0 replies      
Visualization of the poll results: http://hnlike.com/hncharts/chart/?id=7059569

(and to compare, the TNW article's chart: http://cdn0.tnwcdn.com/wp-content/blogs.dir/1/files/2014/01/... )

15
rplnt 1 day ago 1 reply      
The choice of answers is very US-based. I'm not a founder, but I can see myself living on $10k quite easily if necessary. Before taxes.
16
mightybyte 1 day ago 0 replies      
One important thing to consider is the tax benefits available to founders that are not available to employees. If the founder has a home office, then at least part of their living expenses can be paid by the company as office expenses. This makes the founder's effective income much higher than the raw numbers would indicate.
17
skadamat 1 day ago 0 replies      
There needs to be more transparency / data around this to really reach any legitimate conclusions.

Salaries alone are only marginally useful; it's also important to know what stage the company is in (the earlier the stage, the lower the salary, yes or no? Would love to test that b/c that's our intuition). It's also important to know what industry. How much fundraising vs. # of employees, and how does that ratio affect the CEO salary? Also, equity %. Do founders who make more get less equity than sweat-equity CEOs who take no salary? So many questions!

18
timc3 1 day ago 0 replies      
The problem with just stating income is that it doesn't take into account living costs where you are. For instance, the cost of living in Sweden is very high (e.g. a 500g loaf of white bread is circa US$2.69), and if you are on a budget you don't actually get to live cheaply, as you can't get to the really cheap stores or take advantage of bulk buying.

Plus you are not taking into account the age/maturity of the company.

19
jakejake 1 day ago 0 replies      
It would probably be interesting to know what type of companies these are. I see the majority has voted $0 salary. Are these companies that have employees, office space, payroll, etc? Or are these projects that one or two friends are working on together to try to get off the ground?

I am not trying to discredit either situation; it's just that I see one situation as a company where nobody is making any money. The other situation would have staff who are getting a steady paycheck, but the CEO has chosen not to pay himself/herself.

20
gsibble 1 day ago 0 replies      
Since my persona on this account is public, I won't go into details, but I do view it as any other job... my investors want me to focus on building a company. That means competitive pay compared to what else I could be doing, which means a substantial salary.
21
tosser8398b9 1 day ago 0 replies      
bootstrapped. "it's a long road, there's no turning back"

Year 1: 0 + working old job (150K annual) for 6 months

Year 2: 0 + 38K consulting

Year 3: $19,000

Year 4: $38,000

..

Year 9: $97,500

Year 10: $137,000

Year 11: $182,000

Year 12: $465,000

Year 13: $415,000

22
pmorici 1 day ago 2 replies      
This poll isn't very useful. Given that you can't even get a cheap apartment for much less than $2,000 per month in SF, I don't see how anyone could live on less than $36k per year in the Bay Area before taxes. There has to be more to these numbers; perhaps they have a large stash of personal savings they are living off, or a significant other with a good-paying job.
23
Tyrant505 1 day ago 0 replies      
Already, decidedly zero. Hires always tend to depreciate the work and risk a founder has gone through in the years up to investment and thereafter.
24
dmtroyer 1 day ago 3 replies      
someone should start a poll with the question "As a founder, what is your total yearly compensation?"
25
endlessvoid94 1 day ago 0 replies      
Size and stage of the company is important and not accounted for here at all.
26
arikrak 1 day ago 0 replies      
-$100/month
27
davidu 1 day ago 0 replies      
The survey results indicate you are missing some higher-salaried choices. (200, 250, 300, 350+) would be my advice.
28
immad 1 day ago 0 replies      
Would be more interesting to also know a) the funding that people received vs salary. b) the revenue people have vs salary.

Salary by itself doesn't really tell much.

29
yaph 1 day ago 0 replies      
Made a chart of the results http://i.imgur.com/pIDP9kA.png
30
rcjordan 1 day ago 0 replies      
ScottWhigham is correct; in my case it was 'gain', not salary. But the income averaged around 40-50k USD for years. I bootstrapped it with personal funds and keystrokes. (Sites started around 1994. Sold out in 2011.)
31
jc123 1 day ago 0 replies      
Peter Thiel mentioned something like this: the salary of a startup's CEO has a high inverse correlation with the startup turning out to succeed.
32
ericnakagawa 1 day ago 0 replies      
Good story from today on founder salaries: http://thenextweb.com/insider/2014/01/14/salary-founder-favo...

I made $7k in 2012 while running simplehoney. (My cofounder and I paid ourselves minimum wage so we could keep 3 people on payroll and get a group rate for health insurance.)

33
wtvanhest 1 day ago 0 replies      
It matters a lot whether they founded a rapid growth startup or a lifestyle business. The two should be completely separate polls.
34
mikeg8 1 day ago 0 replies      
~$1,000 per month for each of three founders to cover our costs of living in Santiago, Chile.
35
erikpukinskis 1 day ago 0 replies      
On average, about -$600/month.
36
plaxis 1 day ago 0 replies      
Here visualized in bar graphs as a %. Interesting. http://i.imgur.com/yEW0ivT.png
37
pouzy 1 day ago 0 replies      
0
I even created a non-profit to make sure that I can reinject everything into the business itself, without having to pay a lot of taxes.
38
casualobs 1 day ago 0 replies      
Won't this form a normal distribution? There's no way of vetting if the people who voted actually are founders and if they make that much, right?
39
chirau 1 day ago 0 replies      
120,000/year + a percentage of revenue. I give my engineers 130k + a slightly smaller split.
40
ScottWhigham 1 day ago 0 replies      
This assumes that, as a founder, I have a "salary". Most of the solo and LLC founder members here will not have a traditional salary - the business' profits are your personal "gains".
41
sgarg26 1 day ago 0 replies      
How did founder salary change with regards to level of funding, break even, and profitability?
42
elwell 1 day ago 0 replies      
Should have option for <$0
43
ceedan 1 day ago 0 replies      
Salary w/o cost of living data is just a number.
44
sfermigier 1 day ago 0 replies      
I pay myself in Euros.
45
leoplct 1 day ago 2 replies      
$ 400,000/year
46
Ryel 1 day ago 0 replies      
Not enough.
47
rob-alarcon 1 day ago 0 replies      
<0
48
tomkinson 1 day ago 0 replies      
Salary? ahahahahahahahahahaahahahahahahahahahahahhahahahaha. Funny stuff.
28
Coming soon: Stripe CTF3, distributed systems edition stripe.com
158 points by gdb  20 hours ago   42 comments top 17
1
orf 19 hours ago 5 replies      
The last Stripe CTF was amazing. I managed to be one of the first hundred or so to get to the last stage but got utterly stumped by it. The solution involved tracking the incrementing port numbers of a Linux server as a kind of side-channel attack to brute-force a password, if I remember correctly - a very interesting puzzle.

The whole thing was very fun; I highly recommend that anyone interested in security give it a go. The XSS challenges were also cool: they ran a headless WebKit browser to emulate a user, so your XSS code actually did something.
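
For the curious, the rough shape of that port-number side channel (reconstructed from memory, so treat the details as an assumption rather than a description of the actual level): the target made one extra outbound connection for each correct password chunk, and because Linux hands out ephemeral source ports roughly sequentially, the port gap between consecutive callbacks leaked how far each guess got. A minimal Python sketch of just the inference step, over made-up observations:

  # Hypothetical observations: for each chunk guess we record the source port
  # of the callback the server sent us. A bigger jump since the previous
  # callback means more internal connections, i.e. a better guess. Made-up data.
  observations = [
      ("000", 51200),
      ("001", 51202),   # +2: rejected early
      ("002", 51205),   # +3: one extra internal connection -> promising
      ("003", 51207),   # +2: rejected early
  ]

  best_guess, best_delta, prev_port = None, -1, None
  for guess, port in observations:
      if prev_port is not None:
          delta = port - prev_port
          if delta > best_delta:
              best_guess, best_delta = guess, delta
      prev_port = port

  print("most promising chunk guess:", best_guess, "port delta:", best_delta)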

2
darklajid 19 hours ago 1 reply      
Amazing. I had a blast in the last CTF, and the shirt is considered one of the more presentable ones I own, according to the wife... ;-)

Thanks a lot for the work you put into these things!

3
patcon 17 hours ago 2 replies      
Funny that they mention Bitcoin when Stripe doesn't publicly deal with bitcoin yet...
4
wasd 18 hours ago 2 replies      
I've got next to no experience with distributed systems. What sort of concepts should I brush up on before next Wednesday?
5
bdr 19 hours ago 0 replies      
I remember looking back on my 2012 and including Stripe's CTF on the list of highlights. It was that fun.
6
mcescalante 19 hours ago 0 replies      
Very interested to see how this is structured and how it is going to work - it sounds like this time around it is less about the competition piece, but still leaves the opportunity for those who want to test their skills against others to do so. Any way you slice it, it will be great to see what they've put together.
7
joeblau 19 hours ago 0 replies      
I'm too dumb to ever figure these things out, but I look forward to seeing the challenges :).
8
pirateking 19 hours ago 0 replies      
The last CTF was super fun. I have been waiting for this announcement, and am happy to see it will be a whole new kind of game. I am sure it will be great given the high production values of the last one.
9
heywire 19 hours ago 0 replies      
I love that Stripe puts together these competitions. I still wear my shirt from the first CTF proudly :) I wish I had more free time to participate again...
10
dclara 9 hours ago 0 replies      
Can you please show me the page that explains what the "distributed systems" part is about? What I found is the "distributed search" page, which is restricted to a certain type of document indexing algorithm. If that is all the competition is about, please don't mislead users into thinking it is about "distributed systems".
11
spiderPig 19 hours ago 0 replies      
Looking forward to this and curious how it's going to be structured (writing clever state machines?). The last one was amazing (nice American Apparel t-shirt too :-)!).
12
sprizzle 19 hours ago 0 replies      
I like how CTF is becoming more educational and less of a hacking competition. Can't wait to see what CTF3 brings.
13
paulcnichols 18 hours ago 0 replies      
Super excited about this. CTF2 was a ton of fun. I also enjoyed the t-shirt.
14
mac1175 15 hours ago 0 replies      
Nice! This should be fun!
15
hydralist 20 hours ago 1 reply      
Any chance for a new programmer to take part?
16
Ryel 17 hours ago 1 reply      
No event in NYC?
17
peterwwillis 18 hours ago 3 replies      
Why is the valley crowd treating distributed systems engineering like it's a new fad? And why are they only using tech made in the past 5 years? The field has been around for four decades, yet they focus on only a couple algorithms and models?
29
Show HN: Funded.io rapid prototype funded.io
155 points by bottompair  1 day ago   58 comments top 19
1
bottompair 1 day ago 6 replies      
Struggling with maintaining 25 versions of financial projection spreadsheets for a tiered, SKU-based SaaS startup, I decided to spin this up as quickly as possible (and learn a little AngularJS along the way).

The result - after ~30 hours it's usable enough to show. Obviously needs more flexibility around the variety of revenue models and expense categories. But really it's just for rough estimates at this point.

2
davidbanham 1 day ago 1 reply      
I have spent more hours than I'm comfortable contemplating slaving over Excel spreadsheets creating projections. I went into one funding round with perfect eyesight and came out needing reading glasses.

This tool is fantastic.

I'd love to be able to modify it for non-MRR companies though. What license are you releasing the code under?

3
immad 1 day ago 2 replies      
Would be cool if you could export to Excel. That way you could lay the fundamentals in place here and do custom things offline.
4
toast76 1 day ago 1 reply      
Love it. The only issue I have is that the SKU slider is a little bit "fuzzy". It would be great to actually be able to plug in some real sales estimates.

Also, the slider seems to be capped at some really low number (like 30 new customers a month or something?)... not sure if I'm doing something wrong?

5
labaraka 1 day ago 3 replies      
This is absolutely fantastic. I would pay real money for something similar tailored to product sales model (vs recurring SaaS).
6
ilaksh 1 day ago 1 reply      
That's awesome. Can you make one that's for 'lifestyle' businesses, i.e. bootstrapped startups with a sustainable model?

For starters, with my business model, the monthly expenses depend on the number of customers directly. Every time I get a customer, I automatically deploy a VM. And I am charging them a few more bucks than the VM costs, so my revenue is directly tied to the number of customers also.

Maybe I just need a spreadsheet. But really its pretty simple. I plan to launch something that I can support alone. Within a month or so, I need to have enough profit to hire a guy from oDesk to help with support.

The other part is charging for support, which I plan to make separate from the servers. So you can pay as low as $5 if you want minimal support, up to $500 if you are a business and want to prepay for up to half a day of consulting each month.

So what I want is a spreadsheet that I can change that has a projection chart off to the side that is automatically connected to the spreadsheet.

With template spreadsheets/charts that I can configure.

Maybe use something like Google Spreadsheets or http://stoic.com/formula/index.html
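
That per-customer model does fit in a few lines; here is a rough Python sketch with made-up numbers (the VM cost, markup and support tiers below are placeholders, not the poster's actual figures):

  # Hypothetical unit economics: one VM per customer, priced at cost plus a
  # small markup, plus an optional support plan. All figures are placeholders.
  VM_COST = 20.0          # what the provider charges per VM per month
  VM_PRICE = 25.0         # what the customer pays for the VM
  SUPPORT_PLANS = {"basic": 5.0, "business": 500.0}

  def monthly_profit(customers_by_plan):
      total = 0.0
      for plan, count in customers_by_plan.items():
          per_customer = (VM_PRICE - VM_COST) + SUPPORT_PLANS[plan]
          total += count * per_customer
      return total

  print(monthly_profit({"basic": 40, "business": 2}))   # 40*10 + 2*505 = 1410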

7
tomasien 1 day ago 0 replies      
I did a quick model with this tonight after a full day of investor pitches, and even though I'm exhausted I got about a 70% accurate model done in about 5 minutes. I know how to use Excel, but I just wouldn't have gotten that done in 5 minutes with Excel. It was helpful, that's the best thing I can say about anything.
8
budgetyzer 3 hours ago 0 replies      
Check: App.budgetyzer.com
Similar tool, more advanced.
9
cordie 1 day ago 0 replies      
I'll add a vote for the "bootstrappers" version of this. Although I was able to make it work for my situation. Great work though. Definitely a useful tool!
10
kfk 1 day ago 0 replies      
What's your current pain in dealing with financial projections? What kind of remote help from a finance controller expert could benefit you? Would you be willing to exchange finance/spreadsheet knowledge for some coding mentoring?
11
jaimefjorge 23 hours ago 0 replies      
This is definitely useful. Export/import to/from excel could be a great feature.
12
pla3rhat3r 1 day ago 0 replies      
Very cool! Add some secure features and exporting tools and you're onto something.
13
shekyboy 1 day ago 0 replies      
Awesome work! Export to Excel, please.
14
narzero 1 day ago 0 replies      
Fantastic tool! Like aktary, I'd love to see save, print and export options.
15
tomkinson 1 day ago 1 reply      
I will use this extensively tonight (spending about 1-2 hrs beta testing) and come back with suggestions/comments and input, IF you can add ways to save the data and/or export to CSV/Excel afterwards. On a brief look, it looks great; well done.
16
aktary 1 day ago 2 replies      
Impressive! Are you going to develop this further for the community? I'd love to use it and be able to save, print, export, etc...
17
ing33k 1 day ago 0 replies      
Cool, please add export to Excel or Google Docs.
18
notastartup 1 day ago 0 replies      
half of those accounting terms scare me. I'm more than happy to just keep selling software and not leverage myself to appease some rich guy who doesn't respect the art of software and just wants a ridiculous return.
19
martialmartian 1 day ago 1 reply      
for those who can't use excel...
30
Asm.js AOT compilation and startup performance mozilla.org
151 points by bzbarsky  2 days ago   82 comments top 5
1
flohofwoe 1 day ago 0 replies      
Author of the mentioned Nebula3 demos here. I must say it was extremely impressive to watch how quickly the AOT compilation time for asm.js code in Firefox improved within only a couple of weeks. I think when I first tried to compile the demos to asm.js in a very early OdinMonkey build, Firefox froze for 10-15 seconds on start. Today it takes about half a second and everything is done asynchronously (no freeze or stuttering). This is especially impressive when looking at the time PNaCl currently requires when spinning up the demo for the first time (after that first compilation pass the result is cached though, and the next start is basically instant). Here's a bit more info on the demos (lines of code, compiled binary size on various platforms, etc.): http://www.slideshare.net/andreweissflog3/gdce2013-cpp-onthe...
2
natural219 1 day ago 1 reply      
I know very little about compilers, low-level optimization, or any of these topics beyond a rudimentary understanding of basic computer systems. It speaks volumes that Mozilla is able to explain some of these concepts in ways that I sort-of grasp, even if the specifics mostly go over my head.

Excellent, excellent article. I look forward to more improvements to asm.js and the future of Javascript. Maybe one day I will actually learn this shit.

3
Ygg2 1 day ago 8 replies      
It's funny that, time and time again, choosing the MIT approach over the Hacker approach fails. Worse is better, so to say ;)

On purely theoretical grounds, having LLVM in the browser sounds like an amazing thing. It elegantly solves all the problems of using different languages in the browser, gets the super optimization of the native LLVM project, etc.

Then you look at JavaScript. It was written in a week. It's a sloppy mess of Java, Self and Scheme merged into a single horrible entity. But it just works. And now it works fast :D

Good job Mozilla.

4
kibwen 1 day ago 0 replies      
Regardless of how you feel about the political implications of asm.js, this is a fascinating technical article on the challenges of implementing a world-class Javascript interpreter.
5
jokoon 1 day ago 0 replies      
It's funny that, after all, native code is still what devs want. Even on top of a super JS-engine-JIT-I-don't-know-what.

There was an HTML browser, then a scripting language, and it seems that was the easiest, hackiest way to massively deploy native-fast software - and it's done through a browser.
