hacker news with inline top comments    25 Oct 2013
Most people won't bryce.vc
78 points by imd23  1 hour ago   38 comments top 15
spodek 26 minutes ago 3 replies      
Everybody so far is commenting on the woman who approached the CEO.

Let's not forget the CEO, who committed and risked resources on a hunch or instinct or who-knows-what.

If I had to pick one of the two to ask how they had the nerve to act and to learn from, I'd pick him. (Of course I'd prefer both and not to belittle her gumption and skills to back it up).

- What did he see to suggest risking those resources? ... To create a team of outsiders to work on the core app?

- How likely did he expect things to work out?

- How did he explain the expenditure of flying the others in to the CFO or whomever?

- Or did he make a unilateral decision without asking others?

- Did he just get lucky?

- Had he done things like this before and succeeded? Failed?

- Was he worried about making waves in his organization? Did he?

Plenty more questions pop up...

md224 39 minutes ago 8 replies      
I've been kind of struggling lately with this sort of inspirational attitude. It seems like a good attitude, but it also comes with this hidden assumption: if you try hard and keep at it, there's a good chance you'll succeed. Is this actually true?

I mean I want it to be true. I'd like to live in a world where it's true. But I don't actually have any hard evidence besides the testimonials of people who have found success, and I'm not sure if it's survivorship bias or an accurate picture of how one can become successful.

My theory is that chances of success are incredibly variable, and trying hard and putting yourself out there will increase that chance, but I can't figure out a ballpark for the baseline.

Is it even possible to crunch the numbers on something like this? I feel like we'll never know.

Knowing you gave it a shot is the important part, I suppose.

pkulak 14 minutes ago 1 reply      
I hate this self empowerment bullshit. Do people have any idea what this sounds like to people with actual, real problems? Your life can't be solved by positive thinking and being impulsive. In fact, I can make a very good case that that kind of attitude will create far more problems than it solves. Hell, I'm sure I could come up with _two_ anecdotes, which is already twice the evidence given by this blog post.
hawkharris 21 minutes ago 0 replies      
To me the most interesting aspect of this story isn't the inspirational "don't be afraid to try" message. That's a sentiment most people have heard, and I don't think this story puts a particularly new spin on it.

The interesting bit, I think, is how the protagonist challenged the traditional relationship between employer and job seeker. Instead of pandering by praising Uber's design, she had the guts - possibly because of whiskey :) - to offer thoughtful criticism of the product.

As someone who recently finished a tough job search, I found this concept very liberating. Following the traditional process - researching a company's best features, trying to say the right things in interviews, waiting for callbacks - can feel discouraging. It can be like a bad round of speed dating.

Finding a creative, respectful way to point out a company's flaws is an innovative approach that, when done appropriately, can shift the ball back into the candidate's court.

jseliger 29 minutes ago 0 replies      
<blockquote>The statement of the Shimura-Taniyama-Weil conjecture must have sounded crazy to its creators. . . . the idea that this was true. . . must have sounded totally outrageous at the time. This was a leap of faith, in the form of a question that [Taniyama] posed at the International Symposium on Algebraic Number Theory held in Tokyo in September 1955.

I've always wondered: what did it take for him to come to <em>believe</em> that this wasn't crazy, but real? To have the courage to say it publicly?

We'll never know. Unfortunately, not long after his great discovery, in November 1958, Taniyama committed suicide. He was only thirty-one. To add to the tragedy, shortly afterward the woman whom he was planning to marry also took her life, leaving the following note:

<blockquote>We promised each other that no matter where we went, we would never be separated. Now that he is gone, I must go too in order to join him.</blockquote>

. . . In his thoughtful essay about Taniyama, Shimura made this striking comment:

<blockquote>Though he was by no means a sloppy type, he was gifted with the special capability of making many mistakes, mostly in the right direction. I envied him for this, and tried in vain to imitate him, but found it quite difficult to make good mistakes. (94) </blockquote></blockquote>

Edward Frenkel, <a href="http://www.amazon.com/Love-Math-Heart-Hidden-Reality/dp/0465..."><em>Love and Math: The Heart of Hidden Reality</em></a>, which is recommended.

What mistakes have you made lately?

ritchiea 16 minutes ago 0 replies      
Great story. That's really the right way to get a new job, tell the CEO of a company exactly how you can help her/his company. And smart of the Uber CEO to listen to her.

That said it really helps if you're already hanging around at a party with a CEO of a big, in the news, growing startup and thus have insider access to tell him exactly what you think after a few drinks.

dm8 10 minutes ago 0 replies      
I think it is about the decisiveness of a founder. It's not about asking her to come at 9. It's about having a team ready by Monday and flying people from out of town.

Although, I appreciate that the designer took his offer seriously. More importantly, it's the guts of a founder in this case that made it happen.

medell 37 minutes ago 0 replies      
Great story. Uber got shut down here in Vancouver last November, but I was curious what it looked like. To save you a few clicks: https://itunes.apple.com/ca/app/uber/id368677368?mt=8
imd23 1 hour ago 0 replies      
I couldn't identify with this more.

I've done similar things sometimes (not playing at that level, but similar in the end), and what I felt was a big discomfort and a really huge passion for something.

You want to defend your values.

These two together create a willingness to change the status quo, and wanting to make something better can move mountains. At the same time, I can say that in those seconds you are terribly fragile.

Courage isn't even needed. That's why it's so difficult to explain; it's something you feel inside and need to get out.

tomasien 35 minutes ago 0 replies      
Her use of the word "liminal" in the Elle profile he links to reminded me of the most culturally unifying thing I ever read - http://www2.fiu.edu/~ereserve/010010095-1.pdf

Amazing how societies all have this same period, whether it's college or 3 years in the woods. The rules are weirdly the same - "there's a time for everything, and it's college" translates to a random tribe in Africa almost literally.

tacoman 17 minutes ago 0 replies      
Uber should also overhaul their website. I had to go to wikipedia to figure out what this company does.
sandeshkumar 18 minutes ago 0 replies      
Uber, indeed!
useraccount 5 minutes ago 0 replies      
This is vapid as hell.
Thatguise 37 minutes ago 2 replies      
These stories are everywhere; it's like the one about the guy who bought a Porsche for spare change because the seller was the owner's ex-wife and wanted to piss her ex off.

You do know that was a coincidence, right? Or even a miracle, given how well things ended up. Had the Uber guy been a little pissed or in a bad mood, the results could have been drastically different, and yet your blog-worthy suggestion is to take a leap of faith and see if it works.

Who cares; we are telling inexperienced, unprepared, and even untalented kids to quit school and launch a "startup", whatever that means now.

axaxs 35 minutes ago 0 replies      
Liquid confidence is not special. This was a case of good luck and little motivation.
Gzip + poetry = awesome jvns.ca
19 points by jvns  34 minutes ago   4 comments top 2
CGamesPlay 3 minutes ago 0 replies      
Audio is unnecessary. The video shows a slow-motion decompression of a gzipped version of the poem. The red text between brackets is a chunk of text that was seen earlier and encoded as a back-reference (for example, "W{hile I }" means that "hile I " was previously encoded; it occurred in the substring "while I pondered"). You can see the red chunks quickly occupy the larger volume of the poem, which visually highlights the repetition in the lyrics that the computer uses to encode the file as gzip.

Pretty neat.
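If you want to see the effect of that repetition yourself, here's a quick sketch using Python's zlib module (zlib and gzip share the same DEFLATE algorithm); the verse text is just an illustrative stand-in for the poem:

```python
import random
import string
import zlib

# Repetitive verse: DEFLATE's LZ77 stage replaces repeated substrings
# with short back-references to earlier text, so repeats are nearly free.
verse = ("While I nodded, nearly napping, suddenly there came a tapping, "
         "As of some one gently rapping, rapping at my chamber door. ") * 4

# Random text of the same length has no repetition to exploit.
random.seed(0)
noise = "".join(random.choice(string.ascii_lowercase + " ")
                for _ in range(len(verse)))

verse_ratio = len(zlib.compress(verse.encode())) / len(verse)
noise_ratio = len(zlib.compress(noise.encode())) / len(noise)

print(f"repetitive verse: {verse_ratio:.2f}")
print(f"random noise:     {noise_ratio:.2f}")
```

The repetitive text should come out at a much smaller fraction of its original size than the random text.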

JacksonGariety 7 minutes ago 1 reply      
Can anyone explain what is going on here?
LinkedIn Introduces Insecurity bishopfox.com
299 points by shenoybr  8 hours ago   113 comments top 28
buro9 5 hours ago 3 replies      
One of the other subtle things they do with metadata is their fascination with IP addresses.

Intro will enable LinkedIn to have the IP address of all of your staff using it, and thus (from corp Wifi, home locations of staff, popular places your staff go) they will know which IP addresses relate to your staff members (or you individually if you are the only person on a given IP).

This means that even without logging onto LinkedIn, if you view a page on their site they can then create that "so and so viewed your profile", which is what they're selling to other users as the upgrade package to LinkedIn.

Worse than that, as a company you can pay to have LinkedIn data available when you process your log files, and from that you know which companies viewed your site. And that isn't based on vague ideas of which IPs belong to a company according to public registrar info, this is quality data as the people who visited from an IP told LinkedIn who they were.

Think of that when you're doing competitor analysis, or involved in any legal case and researching the web site of the other party.

And VPNs won't help you here, as you'd still be strongly identified on your device and leaking your IP address all the time.

There are so many reasons why this LinkedIn feature needs to die a very visible and public death, and very few why it should survive. It's a neat hack for sure, but then so were most pop-up and pop-under adverts, and the neatness of overcoming the "impossible" is no reason this should survive.
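To make the log-file scenario above concrete, here's a toy sketch of the kind of enrichment being described; the directory contents and IP addresses are made up, and this is not any actual LinkedIn product or API:

```python
# Hypothetical directory of IPs that have been tied to identified people
# or companies (in the scenario above, built from Intro traffic).
IP_DIRECTORY = {"203.0.113.7": "Acme Corp staff (office Wi-Fi)"}

def enrich(log_line: str) -> str:
    """Tag a web-server log line with who the source IP is believed to be."""
    ip = log_line.split()[0]  # access logs start with the client IP
    return f"{log_line}  <- {IP_DIRECTORY.get(ip, 'unknown visitor')}"

print(enrich('203.0.113.7 "GET /pricing HTTP/1.1"'))
print(enrich('198.51.100.1 "GET / HTTP/1.1"'))
```

Once an IP is in that directory, every later visit from it is identifiable, no login required, which is the point being made above.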

ig1 5 hours ago 8 replies      
Well, let's take these one by one:


1. Attorney-client privilege.

I'm guessing most law firms use third-party email servers, anti-virus, anti-spam, and archive/audit systems, which this would also apply to. It would also apply if you're using Rapportive, Xobni, or the like (or integrated time-tracking, billing, CRM, etc.).


2. By default, LinkedIn changes the content of your emails.

Irrelevant. Unless you read your emails in plain text every modern email client changes how email is displayed.


3. Intro breaks secure email.

Yes. Except iOS mail doesn't support crypto signatures anyway.


4. LinkedIn got owned.

Yes. LinkedIn adds an extra point of vulnerability.


5. LinkedIn is storing your email communications.

Well, metadata, but yes.


7. It's probably a gross violation of your company's security policy.

Yes. As is using Linkedin itself. Or Dropbox. Or Github. Or Evernote. Or Chrome. Or any enterprise software that uses the bottom up approach.


8. If I were the NSA

The NSA has access to your emails if they want them anyway. Email isn't a secure protocol against a well funded adversary.


9. It's not what they say, but what they don't say

This looks like a semantic dispute, but it doesn't look any more vague than say Google's privacy policy. Companies in certain circumstances are legally required to provide access to information.


10. Too many secrets

These all seem to be questions that can either be answered by testing or ones that LinkedIn would probably be happy to disclose, but unlikely to be major issues to mainstream users.


So fundamentally it comes down to two points, granting Linkedin access to your email creates a new point of attack and Linkedin themselves might use your email in ways you find undesirable.

So it's essentially a trade-off for the benefits you get from the app versus those risks. For a personal account which you use for private emails, personal banking, etc. the evaluation is obviously going to be very much different from say a salesperson's work account which they use for managing communication with leads.

In the latter case they may already be trusting LinkedIn with similar confidential information and already use multiple services (analytics, CRM, etc.) that hook into their email, so the additional relative risk might be smaller.

As people with technical expertise we shouldn't use scare-mongering to push our personal viewpoints upon those with less expertise, but rather help people understand the security/benefit trade-offs that they're making so they can decide for themselves whether to take those risks.

It's important to treat the wider non-technical community with respect and as adults capable of making their own judgements and not as kids who need to be scared into safety.

sneak 5 hours ago 3 replies      
Giving away email credentials to a third-party service, regardless of reason, should both be covered in your internal training materials and be treated as a firing offense.

This is really just a case of well-branded spearphishing. You should already be protecting against that.

jmadsen 5 hours ago 3 replies      
Is LinkedIn still working out of Mom's garage? Do they not have a single person on staff capable of looking at the current environment regarding internet privacy and saying, "Uh, guys... maybe put this one on ice for a year?"
ctide 7 hours ago 7 replies      
What's the difference between this and using an app such as Mailbox?
etchalon 7 hours ago 3 replies      
This is ridiculous. LinkedIn is offering a feature, optionally, to users who chose to install it. They have been upfront about how it works. If you don't like how it works, don't use it. Problem solved, myopic holier-than-thou rant avoided.
csmatt 7 hours ago 4 replies      
LinkedIn just seems overwhelmingly sleazy to me. How do they keep getting away with this stuff?
dclowd9901 6 hours ago 1 reply      
> 1. Attorney-client privilege.

Really? I guess you better have your own SMTP server set up then, or hope your email provider is willing to go to bat for your rights...

> 8. If I were the NSA

Yeah, it sounds like they definitely have needed it so far...

5 other of the things are basically the same point, remade in 5 different ways. This is a really weak list. There are certainly concerns, but most of these problems are symptomatic of our email system as it is. And have we all forgotten how crazy everyone went when we found out google was going to start advertising in Gmail?

siculars 1 hour ago 1 reply      
This idea is such a disaster I don't even know how it was allowed to see the light of day. The sad fact is that there are untold numbers of people who will install this monstrosity.

Serious question though: if you are an IT shop, how do you defend against this Trojan horse app?

llamataboot 4 hours ago 1 reply      
I desperately want to delete LinkedIn, but I am also looking for my first developer job in the tech field. In my former field, no one would ever ask for your LI profile. You send a resume, link to a resume, whatever. In the tech field, every single company I've interviewed with so far has looked at my LinkedIn profile before our interview and specifically requested it. Until the field changes, or I have a stronger standing as a developer, I feel I have to be there or get overlooked for someone who is.
sytelus 2 hours ago 0 replies      
I'm still not sure I read that right. Does LinkedIn really re-route your emails to their servers in their entirety? I looked at their announcement and video at http://blog.linkedin.com/2013/10/23/announcing-linkedin-intr.... There is NOT even a hint of disclosure that they are doing this. I can imagine ten ways to achieve a similar user experience without re-routing entire emails. So if this is true, LinkedIn really, really fundamentally screwed up with customer trust.
martinbc 2 hours ago 0 replies      
Seems like LinkedIn has posted an update on http://engineering.linkedin.com/mobile/linkedin-intro-doing-...:

Update, 10/24/13

We wanted to provide additional information about how LinkedIn Intro works, so that we can address some of the questions that have been raised. There are some points that we want to reinforce in order to make sure members understand how this product works:

- You have to opt-in and install Intro before you see LinkedIn profiles in any email.

- Usernames, passwords, OAuth tokens, and email contents are not permanently stored anywhere inside LinkedIn data centers. Instead, these are stored on your iPhone.

- Once you install Intro, a new Mail account is created on your iPhone. Only the email in this new Intro Mail account goes via LinkedIn; other Mail accounts are not affected in any way.

- All communication from the Mail app to the LinkedIn Intro servers is fully encrypted. Likewise, all communication from the LinkedIn Intro servers to your email provider (e.g. Gmail or Yahoo! Mail) is fully encrypted.

- Your emails are only accessed when the Mail app is retrieving emails from your email provider. LinkedIn servers automatically look up the "From" email address, so that Intro can then be inserted into the email.
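As a rough illustration of the mechanism that update describes (a proxy that looks up the "From" address and inserts profile info), here's a minimal sketch; the profile data and lookup table are hypothetical, not LinkedIn's actual implementation:

```python
from email import message_from_string

# Hypothetical server-side directory; stands in for LinkedIn's
# profile lookup, which is not a public API.
PROFILES = {"alice@example.com": "Alice Doe - Designer at Example Corp"}

def annotate(raw_email: str) -> str:
    """Insert a profile blurb into a plain-text message based on its
    From address, the way an interposed mail proxy could."""
    msg = message_from_string(raw_email)
    sender = msg.get("From", "").strip("<>")
    profile = PROFILES.get(sender)
    if profile and not msg.is_multipart():
        # Prepend the looked-up profile to the message body.
        msg.set_payload(f"[{profile}]\n\n{msg.get_payload()}")
    return msg.as_string()

print(annotate("From: alice@example.com\nSubject: Hi\n\nLet's talk."))
```

The point of the sketch is simply that any party sitting between Mail and your provider can rewrite what you read, which is what the debate above is about.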

tzury 4 hours ago 0 replies      
I wonder how Rapportive is doing this these days. That is, whether the plug-in sits in people's Gmail app and sends data out to LinkedIn or not.

After all, we are talking about the same team more or less, and surely the same company who owns Rapportive today.

If my concerns are real, one might find it ironic that Rapportive was backed by YC and Paul Buchheit, the creator of Gmail, and now this very company is violating Gmail users' privacy.

lispm 6 hours ago 2 replies      
To celebrate this, I removed LinkedIn apps from my devices.
kevinpet 6 hours ago 0 replies      
I wonder if they called it "intro" to make it impossible to google for so that no one can ever figure out what they're agreeing to when they install it.

What does the sig it appends look like? I will have to make sure to never send email to anyone who has the tell-tale "I opt into spyware" flag.

gohrt 3 hours ago 0 replies      
Is this claim true? I thought the Feds were claiming that using any hosted email (Gmail, Hotmail, etc), is considered a 3rd party subject to subpoena.

> These communications are generally legally privileged and can't be used as evidence in court, but only if you keep the messages confidential.

orenmazor 7 hours ago 1 reply      
Seriously? This is what Intro is? How is it not a bigger deal? People get upset over the littlest Facebook changes, but something this big barely shows up?
mcenedella 5 hours ago 0 replies      
Related: https://news.ycombinator.com/item?id=6430893

"LinkedIn Founder says 'all of these privacy concerns tend to be old people issues.'"

The bit about privacy starts at the 13 minute mark.

webhat 6 hours ago 0 replies      
Nicely stated. What I didn't see mentioned was the iframe it introduces into the mail; it can use this iframe to collect all kinds of additional data about you.

In the first instance I thought this was an app that was running in the background on your phone, I would have called that doing the impossible. This is just a MITM, and not a very good one at that.

iamleppert 5 hours ago 0 replies      
In other news, e-mail is an insecure protocol and most people transmit in the clear and don't have their own e-mail infrastructure anyway.

It's interesting that this "blog post" came from a professional security company that makes its money from scaring individuals and companies about security threats.

Is it just me, or is this firm even worse than LinkedIn?

natekh 6 hours ago 0 replies      
I'm not saying 1 bad turn deserves another, but this is no worse than what any company operating at scale does when they serve https through a gateway service (Scrubbers, CDN, whatever).
coldcode 2 hours ago 0 replies      
Hmm if enough people complain Apple might close this feature. At least it's opt-in. As for me, I would say no.
ninjazee124 1 hour ago 0 replies      
I just can't fathom how something so ridiculous could pass so many engineers at LinkedIn, without raising flags on how bad this is. The moment I saw the word "proxy" I cringed!
pavel_lishin 7 hours ago 1 reply      
Good thing I use gmail.
shenoybr 7 hours ago 0 replies      
I wonder how this affects BYOD at work. Corporations would be furious to have their email content scanned by LinkedIn.
cognivore 4 hours ago 0 replies      
The thing I find interesting is: if LinkedIn goes ahead and does this, how many other companies will want to join the bandwagon? We'll end up with our email being bounced around through a slew of different proxies so everyone can add their spam and ads to it.
tonylemesmer 5 hours ago 0 replies      
So make a plugin for your email client which raises a little Intro flag when you receive an email from an Intro user.
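A minimal sketch of such a plugin's core check, assuming Intro appends some recognizable footer (the exact text it appends isn't documented here, so the pattern below is a placeholder you'd adjust to the real tell-tale string):

```python
import re

# Placeholder pattern; adjust to whatever footer Intro actually appends.
INTRO_PATTERN = re.compile(r"sent with (linkedin )?intro", re.IGNORECASE)

def flags_intro(body: str) -> bool:
    """Return True if an email body appears to carry an Intro footer."""
    return bool(INTRO_PATTERN.search(body))

print(flags_intro("Hi!\n--\nSent with LinkedIn Intro"))  # True
print(flags_intro("Hi!\n--\nSent from my iPhone"))       # False
```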
Self-driving cars projected to reduce injuries by 90%, save $450B annually techspot.com
13 points by alok-g  38 minutes ago   1 comment top
mmanfrin 6 minutes ago 0 replies      
I am really excited for the possible future of self-driving electric taxis. All the pieces kind of fit together: you can use an app to dispatch and gauge the number of fleet vehicles to put out (Google just bought a huge stake in Uber), self-driving cabs can return themselves for a recharge when low on battery while another vehicle replaces them instantly, and the number of cars on the road can scale with demand fluidly. There'd be no need for a car ever in a metro area.
The PC is not dead, we just don't need new ones idiallo.com
405 points by firefoxd  11 hours ago   308 comments top 83
simonsarris 10 hours ago 17 replies      
I've felt this way since I built my last desktop in 2008. I was sort of waiting for the "gee, it's time to upgrade" mark to roll around in 3 or 4 years, but it hasn't happened yet. Any games I want to play it still runs very well, and it still feels very fast to me even compared to modern off-the-shelf systems.

When my friends ask for laptop-buying advice, I tell them that if they like the keyboard and screen, then it's just plain hard to be disappointed with anything new.

I think I can pinpoint when this happened - It was the SSD. Getting an SSD was the last upgrade I ever needed.


Above that, PCs aren't necessary for a lot of people, because people do not need $2000 Facebook and email machines. For the median person, if you bought a PC in 2006, then got an iPad (as a gift or for yourself) and started using it a lot, you might find that you stopped turning on your PC. How could you justify the price of a new one then?

Yet if there was a major cultural shift to just tablets (which are great devices in their own right), I would be very worried. It's hard(er) to create new content on a tablet, and I don't really want that becoming the default computer for any generation.

I think its extremely healthy to have the lowest bar possible to go from "Hey I like that" to "Can I do that? Can I make it myself?"

I think it's something hackers, especially those with children, should ask themselves: Would I still be me if I had grown up around primarily content-consumption computing devices instead of more general-purpose laptops and desktops?

Tablets are knocking the sales off of low-end PCs, but we as a society need the cheap PC to remain viable, if we want to turn as many children as possible into creators, engineers, tinkerers, and hackers.

fiatmoney 11 hours ago 7 replies      
"For what" is the obvious question. Web development with a remote testing environment, office applications, email, web browsing - sure, a Core 2 Duo is more than good enough if your software environment is kept in order. Audio / video / photoshop, gaming, developing software that does math, data analysis - you can never get fast enough.

The limiting factor is if your computer's feedback loop is tighter than your brain's perception loop. If you can type a letter and the letter appears, your computer is fast enough for word processing. But, if you can run a data analysis job and it's done before you release the "enter" key, it just means you should really be doing better analyses over more data. Certain use cases grow like goldfish to the limits of their environment.
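The feedback-loop idea can be made concrete with a tiny timing check; the ~100 ms figure below is a common rule-of-thumb perception threshold, not something from the comment:

```python
import time

PERCEPTION_THRESHOLD_S = 0.1  # ~100 ms, a rough "feels instant" budget

def feels_instant(fn) -> bool:
    """Time one call of fn and report whether it beats the threshold."""
    start = time.perf_counter()
    fn()
    return (time.perf_counter() - start) < PERCEPTION_THRESHOLD_S

# A word-processing-scale task easily fits inside the loop, so more
# hardware buys nothing there; a big enough analysis job will not,
# which is the point about use cases growing to fill their environment.
print(feels_instant(lambda: sum(range(10_000))))  # True
```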

UVB-76 11 hours ago 7 replies      
People snack on smartphones, dine on tablets, and cook on PCs.

A lot of people don't want to cook, so are happy with smartphones and tablets.

Why buy a desktop or laptop when an iPad will do everything you need for a fraction of the price? That's what people mean when they sound the death knell for the PC.

gtaylor 11 hours ago 5 replies      
I built a dev/gaming machine back in early 2010. It's stout, but not a ridiculously expensive (~$1,000) behemoth. The only thing I've done since then is toss some more RAM in so I could have two sets of triple channel DDR3 instead of one. I can still run just about any modern AAA game at the highest settings.

The only time I felt like I've needed an upgrade is while playing Planetside 2, which is/was very CPU bound for my setup. However, when it was initially released, Planetside 2 ran like a three-legged dog even on some higher end rigs. It's much better after a few rounds of optimizations by the developers, with more scheduled for the next month or two.

I dual boot Linux on the same machine for my day job, 5 days a week, all year. For this purpose it has actually been getting faster with time as the environment I run matures and gets optimized.

As good as it is now, I remember struggling to keep up with a two year old machine in 2003.

protomyth 10 hours ago 1 reply      
The PC market isn't dead, but then again, the Mainframe market isn't dead either.

The post-PC devices[1] (tablets/smartphones) are it for the majority of folks from here on out. They are easier to own, since the upgrade path is heading toward: buy a new device and type in my password to have all my stuff load onto it. If I want to watch something on the big screen, I just put a device on my TV. Need to type? Add a keyboard.

The scary part of all this is that some of the culture of the post-PC devices is infecting the PCs. We see the restrictions on Windows 8.x with the RT framework (both x86/ARM), the requirements on all ARM machines, and secure boot. We see OS X 10.8+ with Gatekeeper, sandboxing, and App Store requirements with iCloud.

The PC culture was defined by hobbyists before the consumers came. The post-PC world is defined by security over flexibility. Honestly, 99% of the folks are happier this way. They want their stuff to work and not be a worry, and if getting rid of the hobbyist does that then fine. PC security is still a joke and viruses are still a daily part of life even if switching the OS would mitigate some of the problems.

I truly wish someone was set to keep building something for the hobbyist[2], but I am a bit scared at the prospects.

1) Yes, I'm one of those that mark the post-PC devices as starting with the iPhone in 2007. It brought the parts we see together: tactile UI, communications, PC-like web browsing, and ecosystem (having inherited the iPods).

2) I sometimes wonder what the world would be like if the HP-16c had kept evolving.

downandout 9 hours ago 4 replies      
If everyone adopted the attitude of the author of this blog, all innovation everywhere in the world would cease instantly because, for most of us in the developed world, everything is good enough already. There are many points throughout computing history at which existing hardware was overkill for the things that we were asking our computers to do. Had we stopped innovating because of that, the world wouldn't be anywhere near where it is today.

In high school I recall lusting after a $4,500 486DX2 66Mhz machine with an astounding 16MB (not GB) of RAM, and a 250MB hard drive. A few months ago I spent a little less than that on a laptop with 2,000X that amount of RAM, 8,000X that amount of hard drive space, and a processor that would have not so long ago been considered a supercomputer.

I for one am glad that we have continued to innovate, even when things were good enough.

bluedino 10 hours ago 2 replies      
Don't worry, PC manufacturers are currently selling machines that are already obsolete.

My dad went to Walmart and bought a computer (why he didn't just ask me to either advise him, or ask if he could have one of my spare/old ones I don't know) and monitor for $399.

It's an HP powered by an AMD E1-1500. It's awfully slow and chokes on YouTube half the time. My dad is new to the online experience, so he basically uses it for watching streaming content.

I could have grabbed him a $99 Athlon X4 or C2D on craigslist and it would better than this thing. I'm not sure if he'll ever experience a faster computer so I don't think he'll ever get frustrated with this machine, but it's amazing that they sell an utter piece of shit like this as a new machine.

josefresco 11 hours ago 3 replies      
It's not that people don't need a new PC because their old PC does just as good a job as it did 5 years ago. It's also not because your average mom and pop are upgrading their own rigs themselves that new PC sales are slow.

It's that when tablets hit the scene, people realized they don't need their PC for 90% of what they do on a "computer". Email, social networking, shopping, music, video etc.

Us old geeks who swap hardware, play PC games, tweak OS settings and generally use yesterday's general purpose PC will be the ones remaining who keep buying new hardware and complete machines.

The general public meanwhile will only buy a PC if their tablet/smartphone/phablet needs expand beyond those platforms.

The market will shrink but it will turn more "pro". The quicker MS evolves into a modern IBM the better.

joshuahedlund 10 hours ago 0 replies      
What if one of the reasons we don't need new PCs yet is not that tablets and smartphones are replacing the need for them entirely (although for some people they are), and not that PCs are lasting longer on their own either (although they probably are, too), but that tablets and smartphones are helping PCs last longer by reducing the wear and tear we give them?

I'm still running fine with my 2007 Macbook, but I think my iPhone has extended its life because now my laptop almost never leaves the house and sometimes doesn't even get used in a day, whereas pre-smartphone I used to cart my laptop around rather frequently and use it every day.

seanmcdirmid 1 hour ago 0 replies      
The PC is not dead; the market for selling new PCs is just stagnant. PostPC doesn't mean the PC is dead, but it lives on more like a zombie.

I'm hoping that a new generation of largish (24-27") 4K displays will lead to a rebirth in desktop PCs, if only because we depend on them so much for professional work where they've fallen behind in experience when compared to high-end laptops, which shouldn't be the case!

zeidrich 10 hours ago 1 reply      
A tablet is a PC, especially as x86 processors start taking over from ARM processors.

Just because it doesn't sit in a big box doesn't mean it's a different class of system. The difference is really the openness of the platform, comparing something like iOS to Win 8 pro.

That said, many tablets are basically what we would have thought of as PCs before. Consider something like the Samsung 500T or similar, or the ThinkPad Helix. Components are small and cheap enough that they can be packed behind the LCD, and you have essentially a laptop that doesn't need its keyboard.

Will iPads take over PCs? No. They are too limited, not because of hardware, but because of OS limitations. Will tablets take their place though? Quite possibly. The portability is quite handy. That I can dock a tablet with a keyboard and have a normal PC experience, but have it portable when I need it is a selling feature.

The obvious caveat is that a limited OS is fine as long as the majority of data is cloud-based. In that case even development can be done on a closed platform, and the tablet becomes something more akin to a monitor or keyboard: more of a peripheral than a computing device. We might get to that point, but that's not the cause of the current trend.

gordaco 10 hours ago 1 reply      
> You rarely have the need to buy a whole new box.

This is the number one reason why I love the PC above any other kind of computing machine. Need more disk space? Sure, go get a new disk, you may not even need to remove any of the others. Want a better graphics card for that new game? Easy as pie. Your processor died because the fan was malfunctioning? Too bad, but luckily those two are the only things you'll have to pay for. The list goes on.

I bought my current PC in 2009. The previous one still had some components from 2002.

rythie 8 hours ago 0 replies      
I think people are pissed off with PCs.

They bought a Windows machine for what to them is a lot of money (more than an iPad); it didn't last long before it slowed down and accumulated extra toolbars and all sorts of rubbish. What's worse is that this happened the last time they bought a PC, and the time before, and the time before that. They are not going to add an SSD because that's not how they think, they don't know how, it feels like throwing good money after bad, and they are dubious of the benefits.

The iPad, in contrast, exceeded expectations, and in the year or two they've had it they've had a better experience. They can't get excited about another Windows machine because it's expensive, more of the same, and not really worth it.

bhouston 11 hours ago 3 replies      
CPUs have not gotten significantly faster in the last couple years, especially at the high end.

Back in Q1 2010 I got an Intel Core i7 980X which benchmarked at 8911 according to http://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i7+X+980+...

Now in Q2 2013 (3 years later) the very top of the line processor available, an Intel Xeon E5-2690 v2, is only twice as fast at 16164: http://www.cpubenchmark.net/cpu.php?cpu=Intel+Xeon+E5-2690+v...

It used to be that things got faster at a much faster rate. And until this new E5-2690 v2 was released, the fastest CPU scored only around 14000, which is less than 2x as fast.
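For a rough sense of how far this is from the old doubling-every-two-years pace, the two scores quoted above work out to about 22% annual growth, i.e. benchmark scores now doubling roughly every 3.5 years. A quick sketch of the arithmetic (scores taken from the comment as quoted, not re-verified):

```python
import math

# Benchmark scores quoted above (cpubenchmark.net, as cited in the comment)
old_score = 8911    # Intel Core i7-980X, Q1 2010
new_score = 16164   # Intel Xeon E5-2690 v2, Q2 2013
years = 3

ratio = new_score / old_score                      # ~1.81x over 3 years
cagr = ratio ** (1 / years) - 1                    # compound annual growth rate
doubling_time = years * math.log(2) / math.log(ratio)

print(f"{ratio:.2f}x overall, {cagr:.0%}/year, doubles every {doubling_time:.1f} years")
```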

jseliger 11 hours ago 0 replies      
This reminds me of a piece I wrote a couple years ago: http://jseliger.wordpress.com/2011/10/09/desktop-pcs-arent-g... , which makes a similar point. Both articles are less screechy and less likely to get readers than screaming headlines about OMG DEATH!!!
null_ptr 9 hours ago 0 replies      
I disagree with "The top of the line smart-phone or tablet you own today will be obsolete by the end of 2014 if not earlier."

I will use my 2011 smart phone until it physically breaks. If a 1.2GHz device with a 300MHz GPU, 1280x720 screen, and 1GB of RAM can't make calls and do a decent job of browsing the web, that's a problem with today's software engineering, not with the hardware.

And if Google decides to doom my perfectly good device to planned obsolescence, fuck them, I will put Ubuntu Touch or Firefox OS on it. The days of disposable mobiles are over; we have alternatives now, just like we do on PCs.

rndmize 11 hours ago 0 replies      
I think the issue is that the rate of improvement has fallen pretty hard. I remember when nvidia moved from the 5 series to the 6 series, their new flagship card doubled the performance of any current card on the market. The same thing happened with the 8 series. Processors before multicore would show direct improvements in the speed of the machine, especially if (like the average consumer) your machine filled up with useless, constantly running crap over time.

These days I just don't see that. Graphics cards seem to improve by 30-50% each generation, and because so many games are tied to consoles now, they often aren't even taking advantage of what's available. With multicore processors and the collapse of the GHz race, there's no easy selling point as far as speed, much less visible improvement (now all that useless crap can be offloaded to the second core!), and most consumers will never need more than two cores. Crysis felt like the last gasp of the old, engine-focused type of game that made you think "man, I really should upgrade to play this"... and that was released in '07. Without significant and obvious performance improvements, and software to take advantage of them, why bother upgrading?

mcgwiz 7 hours ago 0 replies      
Hmm, there seems to be the implication that we've hit some magical end state in hardware development where consumer needs are forever met.

Personally, I think of these hardware market developments with an eye toward interplay with the software market. Historically, software developers had to consider the capabilities of consumer hardware in determining feature scope and user experience. Hardware capabilities served as a restraint on the product, and ignoring them could effectively reduce market size. The effect was two-sided though, with new more demanding software driving consumers to upgrade. Currently, in this model, the hardware stagnation can be interpreted as mutually-reinforcing conditions of software developers not developing to the limit of current hardware to deliver marketable products, and consumers not feeling the need to upgrade. In a sense, the hardware demands of software have stagnated as well.

From this, I wonder if the stagnation is due to a divergence in the difficulty in developing software that can utilize modern computing power in a way that is useful/marketable from that of advancing hardware. Such a divergence can be attributed to a glut of novice programmers that lack experience in large development efforts and the increasing scarcity and exponential demand for experienced developers. Alternatively, the recent increase in the value of design over raw features could inhibit consideration of raw computing power in product innovation. Another explanation could be that changes to the software market brought about by SaaS, indie development, and app store models seem to promote smaller, simpler end-user software products (e.g. web browsers vs office suites).

I wouldn't be surprised if this stagnation is reversed in the future (5+ years from now) from increased software demands. Areas remain for high-powered consumer hardware, including home servers (an area that has been evolving for some time, with untapped potential in media storage, automation and device integration, as well as resolving increasing privacy concerns of consumer SaaS, community mesh networking and resource pooling, etc), virtual reality, and much more sophisticated, intuitive creative products (programming, motion graphics, 3d modeling, video editing, audio composition, all of which I instinctively feel are ripe for disruption).

tuananh 51 minutes ago 0 replies      
I recently bought a new PC, after 6 years. Not because my old PC was unusable, but because I needed a new one as an HTPC with very low power consumption.
mortenjorck 11 hours ago 1 reply      
Five years ago, I bought a MacBook Pro to replace my PowerBook G4, which was itself five years old. The list of obsolescences was enormous: It had only USB 1.1 in a market teeming with new USB 2.0 hardware that couldn't have existed with the slower speeds; it had a single-touch trackpad just as OS X was introducing all sorts of useful multi-touch gestures; it relied on clumsy external solutions for wi-fi and Bluetooth; it had a slow-to-warm CFL LCD that had been supplanted by bright new LED backlit screens; it was even built on a dead-end CPU architecture that Apple had traded for vastly more powerful, energy-efficient, multi-core x86 processors.

Today, the calendar says it's time for me to upgrade again. Yet the pain of obsolescence of a five-year-old laptop in 2013 just isn't the same as in 2008: USB 3.0? What new applications is it enabling? Anything I need Thunderbolt for? Not yet. New Intel architectures and SSDs at least promise less waiting in everyday use... but I'm hardly unproductive with my old machine.

JusticeK 11 minutes ago 0 replies      
4K will be the revival of PC sales, in two ways:

1. Consumer-affordable monitors. You'll need a better GPU, and probably DisplayPort. I don't expect most consumers to want a 30" 4K display; they'll want 22-27" displays at 4K resolution, a la Retina (PPI scaling). Everything stays the same size people are used to (compared to 1080p), but everything is Retina-sharp.

2. 4K adoption of multimedia on the Internet. The more 4K videos that pop up on YouTube, the more people who are going to want to upgrade their hardware. This one isn't specific to PCs though, it could apply to mobile devices as well.

Go to YouTube and find a 4K video (the quality slider goes to "Original"). Now look at the comments. Many of the comments in 4K videos are people complaining how they can't watch the 4K video because of their crappy computer (and sometimes bandwidth).
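The "you'll need a better GPU" point is easy to quantify (simple arithmetic, not from the comment): a UHD "4K" frame carries exactly four times the pixels of 1080p, so fill-rate and video-decode costs scale accordingly.

```python
# Pixel counts: UHD 4K vs. full HD 1080p
uhd = 3840 * 2160   # 8,294,400 pixels per frame
fhd = 1920 * 1080   # 2,073,600 pixels per frame
print(uhd, fhd, uhd / fhd)   # 8294400 2073600 4.0
```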

Zak 9 hours ago 0 replies      
I've been prioritizing human interface over raw power for some time with my laptop (more or less my only PC). It's semi-homebuilt - a Thinkpad T61 in a T60 chassis. I would rather work on this machine than any new one.

The CPU is slow by current standards, but a Core 2 Duo isn't slower than the low-clock CPUs in many Ultrabooks. The 3-hour battery life could be better, but I can swap batteries and many new laptops can't. The GPU sucks, but I don't play many games anyway. DDR2 is pricey these days, but I already have my 8GB. SATA2 is slower than SATA3, but I'm still regularly amazed at how much faster my SSD is than spinning rust. It's a little heavy, but really, I can lift six pounds with one finger.

So the bad parts aren't so bad, but nothing new matches the good parts. The screen is IPS, matte, 15" and 1600x1200. Aside from huge monster gaming laptops, nothing has a screen this tall (in inches, not pixels) anymore. I can have two normal-width source files or other text content side by side comfortably. The keyboard is the classic Thinkpad keyboard with 7 rows and what many people find to be the best feel on a laptop. The trackpoint has physical buttons, which are missing from the latest generation of Thinkpads. There's an LED in the screen bezel so I can view papers, credit cards and such that I might copy information from in the dark, also missing from the latest Thinkpads.

ChuckMcM 7 hours ago 1 reply      
Interesting to watch the uptick in 'retina' laptops. Basically, people don't need a new PC but will pay for a better PC 'experience': longer battery life, a 'better' screen (usually retina/IPS/etc), better ergonomics.

Interestingly, it seems like some would love to run their old OS on them. My dad sort of crystallized it when he said "I'd like to get a new laptop with a nicer screen but I can't stand the interface in Windows 8, so I'll live with this one." That was pretty amazing to me: not being able to carry your familiar OS along as a downside. It reminded me of the one set of Win98 install media I kept re-using as I upgraded processors and memory and motherboards. I think I used it on 3 or 4 versions of machines. Then a version of XP I did the same with.

I wonder if there is a market for a BeOS like player now when there wasn't before.

drawkbox 2 hours ago 0 replies      
A few things led to this, including the obvious tablet/mobile disruption, the decline of PC gaming due to console and mobile gaming, and the plateau in Moore's law and processor speeds.

I used to update for gaming and 3d almost entirely.

I also used to update more frequently for processor speed/memory that were major improvements.

If we were still getting huge advances in memory or processor speed, there would be more reason to upgrade. Mobile is also somewhat of a reset and is going through the same rise now.

platz 11 hours ago 1 reply      
Below is what I feel is a relevant excerpt from Text of SXSW2013, Closing Remarks by Bruce Sterling [1]:

---Why does nobody talk about them? Because nobody wants them, that's why. Imagine somebody brings you a personal desktop computer here at South By, they're like bringing it in on a trolley.

Look, this device is personal. It computes and it's totally personal, just for you, and you alone. It doesn't talk to the internet. No sociality. You can't share any of the content with anybody. Because it's just for you, it's private. It's yours. You can compute with it. Nobody will know! You can process text, and draw stuff, and do your accounts. It's got a spreadsheet. No modem, no broadband, no Cloud, no Facebook, Google, Amazon, no wireless. This is a dream machine. Because it's personal and it computes. And it sits on the desk. You personally compute with it. You can even write your own software for it. It faithfully executes all your commands.

So if somebody tried to give you this device, this one I just made the pitch for, a genuinely Personal Computer, it's just for you... Would you take it?

Even for free?

Would you even bend over and pick it up?

Isn't it basically the cliff house in Walnut Canyon? Isn't it the stone box?

Look, I have my own little stone box here in this canyon! I can grow my own beans and corn. I harvest some prickly pear. I'm super advanced here.

I really think I'm going to outlive the personal computer. And why not? I outlived the fax machine. I did. I was alive when people thought it was amazing to have a fax machine. Now I'm alive, and people think it's amazing to still have a fax machine.

Why not the personal computer? Why shouldn't it vanish like the cliff people vanished? Why shouldn't it vanish like Steve Jobs vanished?

It's not that we return to the status quo ante: don't get me wrong. It's not that once we had a nomad life, then we live in high-tech stone dwellings, and we return to chase the bison like we did before.

No: we return into a different kind of nomad life. A kind of Alan Kay world, where computation has vanished into the walls and ceiling, as he said many, many years ago.

Then we look back in nostalgia at the Personal Computer world. It's not that we were forced out of our stone boxes in the canyon. We weren't driven away by force. We just mysteriously left. It was like the waning of the moon.

They were too limiting, somehow. They computed, but they just didn't do enough for us. They seemed like a fantastic way forward, but somehow they were actually getting in the way of our experience.

All these machines that tore us away from lived experience, and made us stare into the square screens or hunch over the keyboards, covered with their arcane, petroglyph symbols. Control Dingbat That, backslash R M this. We never really understood that. Not really.---

[1]: http://www.wired.com/beyond_the_beyond/2013/04/text-of-sxsw2...

evo_9 11 hours ago 4 replies      
The PC is dead, it's just not dead for computer professionals, and never will be. But for the rest of the world (think mom, dad, gramps, grammy), why on earth do they need the headaches of a full PC (Mac or Windows)? A good tablet is basically enough for almost everyone else.
rayhano 9 hours ago 3 replies      
This is an over-simplification.

Yes, PCs aren't ageing as fast as they used to.

But they are obsolete beyond 'not being portable'.

Here is why tablets are winning:

1. Instant on. I can keep my thoughts intact and act on them immediately. No booting, no memory lags, no millions of tabs open in a browser.

2. Focus. Desktop interfaces seem to be desperate to put everything onto one screen. I have a PC and a Mac (both laptops). I prefer the PC to the Mac; better memory management for photoshop and browsing, and I love Snap. But that's where the usefulness stops. With an ipad, I have no distractions on the screen.

3. Bigger isn't better. That includes screens. Steve Jobs was wrong. The iPad Mini is better than the bigger variants. Hands down. Same goes for desktop screens. I want a big TV, because I'm watching with loads of people. I don't need a big screen for a PC because the resolution isn't better than an ipad and I'm using it solo. Google Glass could quite possibly be the next advancement in this theme.

4. Build quality. PCs look and feel cheap. Including my beloved Sony Vaio Z. The ipad in my hand could never be criticised for build quality.

5. Price. The ipad doesn't do more than 10% of what I need to do. But, I do those 10% of things 90% of the time. So why pay more for a PC when the ipad has no performance issues and takes care of me 90% of the time.

I used to think shoehorning a full desktop OS into a tablet is what I wanted. Seeing Surface, I can happily say I was wrong. I don't want to do the 90% of things I do 10% of the time. That's inefficient and frankly boring. PCs and Macs are boring. Tablets are fun. There's one last point why tablets are winning:

6. Always connected. It strikes me as absurd seeing laptops on the trains with dongles sticking out. It takes ages for those dongles to boot up. I used to spend 5-10 minutes of a train journey waiting for the laptop to be ready. My ipad mini with LTE is ever ready. And cheaper. And built better. And more fun.

The PC isn't dead, but it will have next to no investment going forward, so will suffer a mediocre retirement in homes and offices across the world.

Note: I love my PC. I just love my ipad mini more.

simba-hiiipower 9 hours ago 0 replies      
> Of course PC sales will be low. When you don't have enough memory, you buy more RAM. When your processor is too slow, buy a new CPU, or you get a new heat sink and over clock it. You rarely have the need to buy a whole new box.

I agree that the increased (functional) life of PCs is a contributing factor to slowing unit sales, but it's laughable to attribute it to the idea that people who once would have bought a new PC are now just buying more RAM and upgrading internals.

The percentage of people who would have any idea how to do that, or even consider it a viable option, is far too small to have any real impact on demand.

dkarl 11 hours ago 0 replies      
I was ticked off that my 2007 Mac Mini couldn't be upgraded to Mountain Lion, until I realized Snow Leopard ran all the software I needed on that box. I think I'm happy with the hardware and form factor of my phone, too, so I've got all the electronics I need for years to come. Good thing, too, because my rent just went up, and I need a new couch.
hmart 1 hour ago 0 replies      
I'm a happy owner of a Dell E1505 still working in the living room, where it has survived two little girls of 4 and 2 years. Now I want to rescue it and install Ubuntu after upgrading to an SSD.
bitemix 1 hour ago 0 replies      
It seems like the only folks who consistently upgrade their computers every 1-2 years are gamers and people working with big media files. Some friends and I run a website dedicated to helping people build and upgrade their PCs. We see about 130k visitors per month. That's a pretty low number, but it still converts to a quarter of a million in sales every month.
DigitalSea 5 hours ago 0 replies      
I actually touched upon this in a blog post I wrote last month: http://ilikekillnerds.com/2013/09/rumours-of-pcs-demise-have... and I said exactly this. A bad economy coupled with the fact people just don't need to upgrade as much any more are reasons PC sales have slowed. The PC will always be around, tablets and smartphones are great, but they're not comfortable for extended periods of time nor as capable. As I also point out, being a developer means I need a keyboard and multiple monitors to do my job and coding on a tablet is just never going to happen.
venomsnake 7 hours ago 0 replies      
Quick and dirty guide to having a decent PC:

1. Buy a mid-range processor with a lot of L2 cache.
2. Find a mobo that supports lots of RAM and stuff it to the max.
3. An SSD is a must.
4. Buy the second card of the high-tier model, i.e. the cut chip from the most recent architecture (in their day those were the 7950, 570, etc., but with NVIDIA's current branding a total mess it may require some reading if you are on team green).
5. Any slow hard drive will be enough for torrents.
6. In 2 1/2 years, upgrade the video card to the same class.

In 5 years, if the market is the same, repeat. If it is not, let's hope there are self-assembled, non-locked devices on the market.

I have been doing that since 2004 and never had a slow or expensive machine.

sdfjkl 11 hours ago 1 reply      
Mainly we don't need new ones because the 3 year old one is still doing the job. That wasn't the case a decade ago - your 3 year old PC was seriously out of date and couldn't run most games released that year and probably not install the latest OS release. This rapid progress has flattened out considerably. Now people upgrade to get nice features such as retina displays or SSD drives, but that's optional (so you don't do it if you don't have spare money laying around) and the benefit is much smaller than going from a 90 MHz Pentium to a 450 MHz Pentium III.
dankoss 11 hours ago 2 replies      
> When your processor is too slow, buy a new CPU, or you get a new heat sink and over clock it

The motherboards for PCs built 5 years ago are completely different from those built today, and the CPU sockets have changed every other year. New processors from Intel will be soldered on.

The performance of a PC from five years ago is probably adequate for web browsing and office tasks. For anything more demanding, the advances in power consumption, execution efficiency and process node are huge leaps from five years ago.

D9u 4 hours ago 0 replies      
I ran my 2008 Acer Aspire One ZG5 netbook until I got my current "Ultrabook" a couple of months ago.

The netbook handled just about everything I threw at it, and with FreeBSD and dwm it ran faster than it did when I first bought it.

Unfortunately I'm not too pleased with the HP Envy 15. The AMD A6 Vision graphics aren't so bad, but support for the Broadcom 4313 wifi card is sparse in the *nix world...

Soon I'll be tearing it apart to swap out the BCM4313 for something supported by FreeBSD, but either way, I'll not be purchasing a new PC any time soon.

dageshi 7 hours ago 0 replies      
This is pretty much dead on. What I think will happen is that PC manufacturers are going to look around for new markets and the obvious one is going to be consoles. Once SteamOS comes out I expect a slow but massive ramp up in PC-Console production in a similar vein to the way that Android powered devices have come to dominate the smartphone market (in numbers shipped).
btbuildem 7 hours ago 0 replies      
I'd argue a similar pattern is happening with laptops (well, at least ones with exchangeable parts).

My old T400 was "dying" until I put an SSD in it. Blew my mind how significant an upgrade that was. When it started "dying" again I maxed out the RAM @ 16GB.

The CPU is a bit lacking now that I want to run multiple VMs side by side, and the chassis has seen perhaps a bit too much wear, so a replacement is coming -- but I've managed to put it off for years, with relatively inexpensive upgrades.

davexunit 10 hours ago 0 replies      
I agree with the author. I built my desktop computer in 2009 (I think) and it's still going strong. I see no reason to upgrade. I also recently purchased a used Thinkpad X220. It's a few years old but has no problem doing everything that I want to do with it.

It's wasteful to be throwing away computers constantly. In the PC world, I've noticed that it's particularly prevalent among "gamers" that are convinced that they need a new computer every couple of years.

willvarfar 9 hours ago 0 replies      
> Of course PC sales will be low. When you don't have enough memory, you buy more RAM. When your processor is too slow, buy a new CPU, or you get a new heat sink and over clock it. You rarely have the need to buy a whole new box

This describes neither end-consumers nor businesses. Enthusiasts who built and upgraded their own computers were always a small market.

The article talks about upgrading repeatedly, but I don't think the author can extrapolate from his own expertise to the rest of the traditional desktop user base.

niuzeta 11 hours ago 1 reply      
The article falls into the fallacy of sampling the wrong population. Of course the author wouldn't buy a new PC, because he can upgrade his old one. Heck, almost any tech-savvy person can upgrade, or build one from scratch. If not, chances are you know at least one person who can help you, and after the first time it just gets easier.

The PC market isn't dead; it is slowly receding, and that won't stop. It's because of the new alternatives: assuming a finite budget, once you buy one of the alternatives, which cost roughly as much as a consumer-level laptop, you don't have enough left for another PC that you don't need.

The article seems extremely narrow in both its outlook and its scope. People ignore processing power not because it's a marketing gimmick, but because they simply don't care. The people who do care are the ones who know enough to care, and they will always be a minority.

padobson 11 hours ago 4 replies      
I don't know which of my two conclusions about this is more useful:

1) I don't need to buy a new PC every two years anymore

2) Someone should make a tablet with slots so it can be upgraded like a PC

basicallydan 11 hours ago 0 replies      
Good point, well made.

Personally, I upgrade incrementally, and I still use my PC on a regular basis. The machine I have now is a hodge-podge of parts from different eras. I have an Intel Q6600 but DDR3 RAM, and a modern, quite beefy graphics card that I bought when it was in the upper echelons in early 2013. It runs most modern games pretty well. I have an SSD for most software but also three big HDDs, one of which I've had since my first build in 2004.

eliben 7 hours ago 0 replies      
Hmm, I want to compile huge open source projects quickly. For this I need as many cores as possible at a reasonable price, a lot of memory and an SSD. So it's time to upgrade :)
wahsd 10 hours ago 0 replies      
That's why we all needs tablets. A tablet for you...and a tablet for you...And you get a tablet....and you get a tablet....We all get tablets..... Oh! these tablets kind of suck to actually produce or do anything on. ....... ok, back to laptops and all-in-ones.
bparsons 7 hours ago 0 replies      
I have a 13 inch Acer I purchased in early 2011. Despite its low cost, the thing has run like a charm since day 1. I literally have zero desire to replace this thing at any time in the foreseeable future. It still runs 4+ hours on a battery, which is remarkable, since I use this machine more than 5 hours a day.

I have a desktop with twice the processing speed and twice the ram, but for all intents and purposes, it runs almost exactly the same as the little Acer. Unless I am playing a game or running illustrator, I simply don't need the power.

tehwalrus 10 hours ago 0 replies      
I have bought laptops, but not a whole desktop ever in my life. I've been through two desktops, mind you, that were both built from scratch[1].

I think this article gets it about right - I've started enforcing a 3 year cycle for both phone and laptops because they were costing me too much (in a mustachian sort of way) - and I've stuck to it with laptops (I made 3.5 years on a 2009 MBP) and will be doing so with the iPhone (due for replacement spring 2015.) If the nexus devices keep getting cheaper and awesomer, then I might jump to those a bit earlier (particularly if I can sell the 32GB 4S for an appreciable fraction of the new phone cost.)

Working with the 3.5 year old laptop got slightly painful (re-down-grading back to snow leopard from lion was essential, I even tried ubuntu briefly) but perfectly bearable for coding and web browsing. I'll see how slow the phone gets, but I'm quite relaxed about not having the latest and greatest iOS features (I've not seen anything compelling since iOS 5; I only did 6 because some new app requested it.)

[1] or rather, one was, and then I gradually replaced all the parts until I had a whole spare PC to sell on ebay, and one mobo bundle later and I'm still using it with no problems, playing games etc.

codegeek 10 hours ago 0 replies      
I have always pondered this whole question of the PC being dead vs. alive. The interesting thing is that even though, with tablets and smartphones, a lot of regular people can probably get away without a PC just to surf the net, use Facebook, etc., the real question is what happens if coding/programming someday becomes a commodity and more regular people actually start coding (to whatever extent) to solve problems. Would that ever happen? What would they use then? PCs? Something else?
jebblue 2 hours ago 0 replies      
Well, I did: I used my last one for almost 8 years and got this one a few months ago. I don't have to upgrade as often, but I still have to upgrade. It's lighter, quieter, and generally more powerful: more RAM, more disk space, better graphics. These are all the reasons I ever upgraded, just not as often.
utopkara 10 hours ago 0 replies      
Part of the reason is that we have gone back to the days of terminals. The Chromebook is a good milestone marking what people do with computers and how much power they need. We are past the point where the computer as a consumer device and the computer as professional equipment parted ways. We are also lucky that the people who buy CPUs in bulk for their powerhouses still use architectures similar to the ones in our desktops and laptops, because with our weak demand for new hardware, the prices could not otherwise stay low for long.
b1daly 6 hours ago 0 replies      
It's weird, but I feel like my PCs are all too slow. I bought a Retina MacBook Pro recently expecting to be blown away, but it still feels sluggish to me. I want instantaneous response when it comes down to it. There actually is a qualitative difference between 100ms and 10ms response time. I'm surprised; I really thought we would be closer.
Lost_BiomedE 10 hours ago 0 replies      
My .02 is that Microsoft OS stopped being lead-ware. I noticed that since Win7.
mhurron 11 hours ago 1 reply      
Basically this: computers hit "good enough" a while ago; now you just have to replace parts when they die.

Yes, on paper the latest processor is faster than the one released two years ago, but you have to be doing specific types of workloads for it to really make a big difference.

meerita 8 hours ago 0 replies      
As a guy who has been involved in computers, I tend to buy something to last at least 3-4 years. Once I start feeling I'm behind, I like to upgrade.

I had a 2005 iMac before acquiring this 2011 iMac, and in between I've bought MacBooks and a MacBook Air. I'm thinking of getting my next desktop in 2015.

Thing is, when I go to my parents' house, I see 2003 computers. I think this reality applies to many families: parents don't care about speed; their needs are less computational and more casual, like browsing, Facebook, and Skype. The trend I'm seeing in Spain is that getting iPads for parents is becoming notably popular. All my friends, instead of upgrading their parents' desktop PCs, are buying iPads, and the parents love it. Are you having the same experiences?

ausjke 3 hours ago 0 replies      
This is so true. Tablets and smartphones are great portable devices, but I cannot live without a PC/laptop; it's just that I already have a few of them. My first choice would be the PC, then the smartphone, and the tablet last.
pmelendez 11 hours ago 0 replies      
Thanks! Somebody finally said it! (or at least this is the first blog post I read about it)

If anything, what is dead is the software need for Moore's law.

javajosh 10 hours ago 0 replies      
Backend devs can probably use more computer resources, particularly cores and RAM. We want to simulate whole clusters on our dev machines and instrument them with tools like Ansible and Docker, and then deploy multiple (fairly heavyweight) processes like JVMs to them. But yeah, 4 (fast) cores and 16GB of RAM is available in a laptop these days, along with an SSD and the best display you can buy, for $3k. (Of course I'm speaking of the MBPr).

Games can always use more resources. AFAIK there is still a lot of progress being made with GPUs. 60fps on a 4K display will be a good benchmark. The funny thing is that GPU makers have taken to literally just renaming and repackaging their old GPUs, e.g. the R9.[1] As for the game itself, there is a looming revolution in gaming when Carmack (or someone equally genius-y) really figures out how to coordinate multiple cores for gaming.[2]

But yeah, most everything else runs fine on machines from 2006 and on, including most development tasks. That's why Intel in particular has been focused more on efficiency than power.

[1] Tom's Hardware R9 review: http://www.tomshardware.com/reviews/radeon-r9-280x-r9-270x-r...

[2] Carmack at QuakeCon talking about functional programming (Haskell!) for games and multi-core issues: https://www.youtube.com/watch?v=1PhArSujR_A&feature=youtu.be...

malyk 9 hours ago 0 replies      
I'm seeing the same thing with my iPhone. I have a 4S, and while I like what the 5S brings, I'm just not sure it's worth upgrading now. There is just starting to be the very first hint of slowness in some things on the 4S, but it isn't anything like when I went from the 3G to the 4S. That was a huge upgrade. Now it just doesn't feel necessary to buy the next thing on the same schedule.
tn13 6 hours ago 0 replies      
Well, if the PC had to die, then on what are we going to write all our code?

Tablets and those funky phones are popular today; something else will get popular after them. The PC may never be as popular as they are, but it is here to stay.

ivanhoe 10 hours ago 0 replies      
This is all true: I can still do pretty much everything on my 2009 PC, but the truth is also that I rarely do, especially since I got a new console a few years ago and stopped playing on the PC... Everything work-related is on my laptop, playing games on a console is nicer, and PC desktops are simply not needed anymore (for what I do, and also for the majority of non-tech users).
FrankenPC 11 hours ago 0 replies      
Well, the CPU/RAM/HDD systems do last a very long time. It's the GPU that needs periodic upgrading. Roberts Space Industries, for instance, will be leveraging CryEngine 3 with nearly 10 times as many polygons as the average 3D FPS. Also, Microsoft keeps adding rendering features to its latest OSes which require hardware updates at the GPU level. I guess what I'm saying is: Nvidia will continue to be a sound stock to add to your portfolio.
dworin 11 hours ago 0 replies      
I'm typing this on a PC where I did the same thing as the author. Over the past 10 years, I've swapped out a part every two years or so to keep it running the latest and greatest. But the CPU is five years old and still running fine. I'm planning to donate it to a non-profit to replace a computer that's almost 10 years old and also still running fine.

There was a time when you felt like a new PC was obsolete the second you took it out of the box. But that was because we were just scratching the surface of what we could do with new hardware. We're now at a point where it's hard to find consumer and business applications for all the spare hardware that you can afford.

Mobile adoption has been so quick because everyone is buying devices for the first time (tablets), or there is an incentivized two-year replacement cycle (phones). But I'm still using an original iPad that works just fine, and a 3 year old cell phone with no reason to upgrade. Eventually, I think we'll start to see the same leveling off in mobile as well.

goblin89 8 hours ago 0 replies      
This article makes a similar point: http://techland.time.com/2013/04/11/sorry-pc-industry-youve-... (I think it's been posted on HN before, but I couldn't find the post.)
shmerl 11 hours ago 0 replies      
> The PC is not dead, we just don't need new ones

It's really nice when some build process takes less time because of better hardware. Also, try running some upcoming games on an old PC. Obviously the need for some hardware depends on what you are planning to do.

avenger123 9 hours ago 0 replies      
At least Microsoft is helping the PC industry.

Microsoft and its SharePoint platform will keep SharePoint developers upgrading their desktops upon every release.

abvdasker 10 hours ago 0 replies      
Yeah, I pretty much agree with that premise. In my experience, faster CPUs and RAM make little difference compared to the gains from an SSD. Hard disk drives are such a huge bottleneck compared to other components that the average user gets the biggest gains in responsiveness from upgrading to an SSD. And for a lot of PCs that doesn't even necessitate buying a new one.

For laptops it's a different story. The big push seems to be in reduction of power consumption for longer battery life, which sounds pretty sensible to me. I guess if battery life is a big concern for a PC user, then it makes sense to go to a smaller process. That does seem like a pretty small reason to upgrade, though.

Another good indicator that the PC "game" has changed is that the two major commercial PC OS's just released their latest versions (Mavericks & 8.1) for free.

linux_devil 11 hours ago 0 replies      
I still use my 5-year-old desktop (upgraded twice) for development. I like to open the box and upgrade it myself; if I want to do the same on a laptop, I think twice. The freedom to upgrade it yourself is bliss.
solnyshok 9 hours ago 0 replies      
Mostly agree; however, I think there could be more upgrade waves for home PCs, triggered by qualitative improvements in technology. My guess: once we have a reasonably powerful, totally silent (fanless, 512GB-1TB SSD), book-sized desktop PC, maybe 2-3 years from now, it might trigger a wave of home PC upgrades. After that, who knows...
jrs99 10 hours ago 0 replies      
When people say the PC is dead, they do not mean that it is not being used and people don't need one... they mean that people simply don't buy it as often and have other options to choose from, like laptops.

Saying that the PC is dead is correct. Almost everyone I know buys a laptop instead of a PC. I know a lot of people who do not have a PC, but I don't think I know a single person who doesn't have a laptop.

It's like saying the Novel is Dead. Plenty of novels are being written, but it is really not the one major form of art that people are discussing. That is being replaced by television and film. Will there be novels written fifty years from now? Most definitely. But still, the idea that the novel is the one true form where the greatest art occurs is over.

mpg33 7 hours ago 0 replies      
Average computing power and storage have gotten to a point where they can now handle the everyday stuff with relative ease. High-def video and gaming are the main areas hardware still has to keep up with.

Although one could argue that network bandwidth is still an area that affects the "everyday stuff".

akinity 9 hours ago 0 replies      
The last few times I looked at the desktops available at Targets and Walmarts in the Bay Area, there weren't very many options. Best Buy and Costco are somewhat better equipped. I think that, with the lower margins on desktops relative to laptops and the amount of floor space they consume, desktop PCs are well on their way out of being attractive to traditional brick-and-mortar retailers.

The Haswell architecture couldn't have hit the market at a better time for laptop owners, with more powerful integrated graphics and low power use. I'm sure it isn't a coincidence.

staringispolite 10 hours ago 0 replies      
Somehow I don't think my mom would trade her iPad for an e1505 with a broken display, an external monitor, plus the periodic need to upgrade the hard drive and install/upgrade Ubuntu :
zerny 2 hours ago 0 replies      
Well, PC performance has never been beaten by tablets and phones.
wainstead 9 hours ago 0 replies      
When we speak of PCs versus smartphones or tablets we're talking a lot about form factor and portability. I imagine a day when my smartphone has more horsepower than the best desktop today and it can drive a huge 4K monitor while streaming petabits at a time. You'll only need one device and it will be the CPU to all your interfaces.
Sami_Lehtinen 9 hours ago 0 replies      
I just upgraded from a Q6600 / 4GB to an i7-4770K / 32GB, but actually that Q6600 would have been enough if I had just used an SSD with it. The SSD is the key. The apps I use are Firefox, Thunderbird, Deluge and VLC.
fallingmeat 11 hours ago 2 replies      
Thinkpad T60 purchased (refurb!) in 2007. Still a rock solid machine. It does get a little warm though..
mpg33 7 hours ago 0 replies      
I think computing power/storage is becoming more necessary on the server side than the client side.
nXqd 9 hours ago 0 replies      
With all the guides from tonymac, I enjoy building my own hackintosh with cheaper and better hardware :P
alinspired 9 hours ago 0 replies      
Most consumers will not even upgrade their PCs; they'll replace them with a new PC, laptop or tablet when they're completely broken.

I'm thinking of my parents: they will use that 2000 PC until it won't boot up, and only then will they worry about upgrading.

ffrryuu 6 hours ago 0 replies      
The new fanless PCs are pretty cool.
bjoe_lewis 9 hours ago 0 replies      
If only Paul let me vote twice.
devx 10 hours ago 0 replies      
Either way, terrible news for Intel and Microsoft.
badman_ting 11 hours ago 2 replies      
Right, that's what it means to say that the market is dying. But if you need to feel clever, feel clever.
How I compiled TrueCrypt 7.1a for Win32 and matched the official binaries concordia.ca
229 points by maqr  9 hours ago   82 comments top 11
generalpf 9 hours ago 1 reply      
That's amazing work. Well done to the author.
zokier 8 hours ago 3 replies      
Just a slightly off-topic question, but WTF does TC require VC 1.52 for?
yeukhon 9 hours ago 3 replies      
"TrueCrypt is a project that doesn't provide deterministic builds."

Why? What is the benefit of not providing them when everyone wants a deterministic build?
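For what it's worth, the property being asked about is easy to state concretely. A minimal sketch in shell (the file names and contents are invented; this is not the actual TrueCrypt build process): if a build is deterministic, two independent compilations of the same source are bit-identical, so anyone can hash the official binary and compare.

```shell
# Simulate two independent "builds" producing identical output (toy files,
# not real compiler output), then verify them byte-for-byte and by digest.
printf 'pretend compiled output' > build_a.bin   # build machine A
printf 'pretend compiled output' > build_b.bin   # build machine B
sha256sum build_a.bin build_b.bin                # digests should be identical
cmp -s build_a.bin build_b.bin && echo "builds match"
```

Without that property, the comparison requires the kind of manual diffing of timestamps and compiler noise the article describes.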

CUViper 8 hours ago 2 replies      
> TrueCrypt is not backdoored in a way that is not visible from the sources

... as long as you also trust the compiler not to introduce any backdoor... (cf. Reflections on Trusting Trust)

wai1234 8 hours ago 2 replies      
This is a great first step but we're not done yet. It proves the binaries are built from the published code, but only when the published code has been thoroughly vetted can we conclude there is no backdoor.
bliker 7 hours ago 1 reply      
I am just shooting in the dark, but wouldn't it be easier to compile it twice and diff the outcomes to find out what parts change, so those can be ruled out?
proctor 5 hours ago 1 reply      
It seems to me that the relaxed GPG key verification the author uses doesn't give us any more assurance of the authenticity of the source than a simple hash offered on the website would. I think in this situation, if the author did not intend to attempt more rigorous verification of the TrueCrypt PGP key, at least cross-checking that the key offered on the site matches the key offered on a public key server (pgp.mit.edu, for example) would be prudent before signing the TrueCrypt key with your own.

  Import the .asc file in the keyring (File > Import certificates).
  Now you should mark the key as trusted: right click on the TrueCrypt Foundation public key
  in the list under Imported Certificate tab > Change Owner Trust, and set it as "I believe checks are casual".
  You should also generate your own key pair to sign this key in order to show you really trust
  it and get a nice confirmation when verifying the binary.

pamparosendo 6 hours ago 0 replies      
I entered just to say it's incredible work by this guy... it's been years since I analyzed a file in hex mode (in Norton Commander, jeje).
xbeta 3 hours ago 0 replies      
Coolest post I've read today! Good work!
smegel 4 hours ago 0 replies      
Kudos for effort.
eterm 9 hours ago 1 reply      
Tldr: Binaries didn't match, here's some handwaving at the differences.
uTorrent tricking users into changing default browser settings? utorrent.com
61 points by gantengx  2 hours ago   66 comments top 22
gilgoomesh 2 hours ago 7 replies      
Simple: don't use uTorrent. Use Transmission instead:


laureny 1 hour ago 1 reply      
First Vuze, then uTorrent. It looks like the life cycle of any popular torrent application is:

- Starts very light, bare bones, downloads torrents and that's all

- Gets bloated with more and more features that nobody wants

- Partners with a shady company

- Dies

Off to alternatives I go.

cmsimike 1 hour ago 2 replies      
Wasn't uTorrent The Best Thing Ever when it first hit the scene? I seem to recall it was this application. It was about a 93k executable that didn't need to be installed: just download and run. It was my go-to torrent client of choice during my Windows days. Sad to see it become this.
product50 2 hours ago 1 reply      
Surprising that no one is talking about Yahoo and its tactics to get more users. I am sure SearchProtect and Yahoo! have a deal here to push as many default searches as possible to drive revenue.
xanderstrike 1 hour ago 0 replies      
Switch to Deluge[1]! It's Free Software, and is so similar to uTorrent you won't notice the difference.

The day uTorrent pushed the update that tried to install a browser extension I was absolutely done with them. I do not support malware in any shape or form.

[1] http://deluge-torrent.org/

alan_cx 1 hour ago 0 replies      
If memory serves, uTorrent lost its trustworthiness when it got sold. IIRC, that means post version 1.6.1 it became a concern and began to needlessly bloat. Prior to that it was a brilliant bit of software.

1.6.1 is lightweight, unmolested, and still worth using.

lingben 1 hour ago 2 replies      
Simple solution: go back to using the old, barebones, simple, fast utorrent v 2.2.1


runs fast, no ads, no issues, just works!

znowi 1 hour ago 0 replies      
Wow, this is an unusual step for Yahoo. Who would think that a hijack process that tampers with users' browser settings is a good idea? Hello, Marissa Mayer?

As for uTorrent, it's been going down this path for a while, gradually introducing crap into the app. And this one is the last for me, as well.

Btw, apparently they turned off registration on the forum to ward off the mounting complaints. When I go to https://forum.utorrent.com/register.php, I'm greeted with "Get lost spammer, we don't need your kind here." And of course the topic is closed. Well done.

mercurialshark 1 hour ago 1 reply      
So glad someone posted on this bullshit. Not only has uTorrent started doing this, but BBEdit 10 and some other previously not-super-shady software has too.
orbitur 1 hour ago 4 replies      
Wow. People are actually angry in that thread because they didn't look closely at the setup steps.

Let's be clear here: the user was still given a choice, but the user "trusted" uTorrent to not force them to make one. Give me a break.

x0054 52 minutes ago 0 replies      
This is exactly why I have such a love/hate relationship with BitTorrent Sync. I use it a lot, and it works so great, but I so wish it were open source.
ParadisoShlee 2 hours ago 3 replies      
Even Sun installs some kind of ad toolbar in Java!
sheepz 1 hour ago 3 replies      
This is why I love Linux. Every generic piece of software comes with no BS attached. For example, on Windows, if you want to mount an ISO you have to download some shady piece of software, the installer of which comes bundled with n toolbars. In Linux it's a matter of a simple one-line command...
pbreit 1 hour ago 0 replies      
I just tried installing and after declining the offer the installer hung. Buh bye.
pavel_lishin 1 hour ago 0 replies      
Damn; and I blamed Firefox for this.
smegel 1 hour ago 0 replies      
uTorrent went bad ages ago... I've been using an old, solid version (around 2.7?) for years; no reason to change.

It was a beautiful bit of software.

vezzy-fnord 2 hours ago 1 reply      
uTorrent is horrendous bloat anyway. Personally, I used rTorrent for a while (its minimal ncurses interface was very appealing to me) but later switched to Transmission.

I've also used Deluge, but there's nothing too special about it in my eyes.

jaxbot 1 hour ago 0 replies      
Not new, though. uTorrent install has been shipping with crap for a while now.
gesman 2 hours ago 0 replies      
uTorrent started selling itself short quite a while ago.

I stopped using it about 2 yrs ago for similar reasons. It's malware-seeding garbage now.

unabridged 2 hours ago 0 replies      
This is why "freeware" can't be trusted; the threat of a fork keeps open source developers honest.
neoyagami 2 hours ago 1 reply      
Since I stopped using windowz for Mac, I use the official BitTorrent app.
sydbarrett 2 hours ago 3 replies      
I never updated uTorrent, I run 2.2.1 so either find it or I can probably send you the exe.
Primer on elliptic curve cryptography arstechnica.com
108 points by andrewfong  7 hours ago   25 comments top 8
tptacek 6 hours ago 4 replies      
Nit: the hard dependency on good randomness for ECDSA is a property of DSA in general, and not of elliptic curve cryptography. The DSA construction has what is probably the strictest randomness requirement in all of mainstream cryptography; a bias of just a few bits is, with repeated signatures, sufficient to recover private keys! (The attack that makes this work on ECDSA is extraordinarily cool).
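
The limiting case of that requirement, a nonce reused outright rather than merely biased, can be checked with plain modular arithmetic and no curve at all. A toy sketch (every number below is invented, and `r` is a stand-in scalar rather than the x-coordinate of a real curve point; the few-bits-of-bias attack tptacek alludes to is a subtler lattice problem):

```python
# DSA/ECDSA signing equation: s = k^-1 * (z + r*priv) mod n.
# If the same nonce k (hence the same r) signs two hashes z1, z2,
# an observer recovers k and then the private key directly.
n = 0xFFFFFFFF00000000FFFFFFFFFFFFFFFFBCE6FAADA7179E84F3B9CAC2FC632551  # P-256 group order
priv = 0xC0FFEE        # toy private key
k = 0x1234567          # the repeated nonce
r = pow(7, k, n)       # stand-in for the x-coordinate of k*G

def sign(z):
    return (pow(k, -1, n) * (z + r * priv)) % n

z1, z2 = 1111, 2222            # two message hashes
s1, s2 = sign(z1), sign(z2)    # both signed with the same k

# From the public values (r, s1, z1) and (r, s2, z2) alone:
k_rec = ((z1 - z2) * pow(s1 - s2, -1, n)) % n
priv_rec = ((s1 * k_rec - z1) * pow(r, -1, n)) % n
assert (k_rec, priv_rec) == (k, priv)
```

This is essentially the PS3 signing-key failure; the biased-nonce variant needs many signatures instead of two, but ends in the same place.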

The problem with NIST Dual_EC_DRBG is simpler than the article makes it sound. A good mental model for Dual_EC is that it's a CSPRNG specification with a public key baked into it (in this case, an ECC public key) --- but no private key. The "backdoor" in Dual_EC is the notion that NSA --- err, Clyde Frog --- who is confirmed to have generated Dual_EC, holds the private key and can reconstruct the internal state of the CSPRNG using it. I think this problem is simple enough that we may do a crypto challenge on a toy model of Dual_EC.

Nobody in the real world really uses Dual_EC, but that may not always have been historically true; the circumstantial evidence about it is damning.

The NIST ECC specifications are in general now totally discredited. If you want to see where the state of the art is on ECC, check out http://safecurves.cr.yp.to/.

You should never, ever, never, nevern, nervenvarn build your own production ECC code. ECC is particularly tricky to get right. But if you want to play with the concepts, a great place to start is the Explicit Formulas Database at http://www.hyperelliptic.org/EFD/; the fast routines for point multiplication are mercifully complicated, so copying them from the EFD is a fine way to start, instead of working them out from first principles.
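
In that play-with-the-concepts spirit (and emphatically not production code), the group law fits in a page over a tiny prime field. The curve, field, and base point below are invented for illustration:

```python
# Toy elliptic curve y^2 = x^3 + 2x + 3 over GF(97), with naive
# (non-constant-time) point addition and double-and-add multiplication.
P_MOD, A, B = 97, 2, 3
O = None  # point at infinity

def ec_add(p, q):
    if p is O: return q
    if q is O: return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return O                                           # p + (-p)
    if p == q:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD)   # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P_MOD)          # chord slope
    x3 = (lam * lam - x1 - x2) % P_MOD
    return (x3, (lam * (x1 - x3) - y1) % P_MOD)

def ec_mul(k, p):
    acc = O
    while k:                                               # double-and-add
        if k & 1:
            acc = ec_add(acc, p)
        p = ec_add(p, p)
        k >>= 1
    return acc

G = (3, 6)   # 6^2 = 36 = 3^3 + 2*3 + 3 (mod 97), so G is on the curve
```

Sanity checks like `ec_add(G, ec_mul(2, G)) == ec_mul(3, G)` make the group structure visible; the EFD formulas are the hardened, fast versions of exactly these slope computations.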

pbsd 6 hours ago 1 reply      
The performance comparison of ECDSA vs RSA is somewhat unfair. In ECDSA, signing is the cheapest operation, whereas in RSA it is the most expensive. If the timings chosen were signature verification time, RSA would be much faster. See:

    Doing 2048 bit private rsa's for 10s: 1266 2048 bit private RSA's in 9.98s
    Doing 256 bit sign ecdsa's for 10s: 22544 256 bit ECDSA signs in 9.97s
    Doing 2048 bit public rsa's for 10s: 42332 2048 bit public RSA's in 9.98s
    Doing 256 bit verify ecdsa's for 10s: 4751 256 bit ECDSA verify in 9.92s
A fairer comparison would probably pit DH-2048 against ECDH-256, which is more apples-to-apples.

wfunction 31 minutes ago 0 replies      
The NIST document was a backdoor; there's no question about it.


jevinskie 6 hours ago 0 replies      
I always liked the ECC segment of a Purdue crypto course: https://engineering.purdue.edu/kak/compsec/NewLectures/Lectu...
mrcactu5 3 hours ago 1 reply      
Why do ECC when RSA-2048 works just fine? One argument I keep hearing is that as we keep factoring the RSA numbers, we just move to bigger ones.

I'm not a cryptography expert, so I don't know how to respond to these arguments.

picomancer 1 hour ago 0 replies      
The article says that the two functions,

    f : x -> pow(x, pubkey) mod m
    g : x -> pow(x, privkey) mod m
being inverses of each other was a big breakthrough when it was discovered. The article implies, but does not directly state, that this "big breakthrough" was part of what separated the "classical" era of cryptography (pre-1977 as defined by the article) from the "modern" era (post-1977).

The "big breakthrough" result was actually proven by Euler hundreds of years ago! [1] The innovation of RSA was building a working public-key cryptosystem around Euler's result, not the result itself.

[1] http://en.wikipedia.org/wiki/Euler%27s_theorem
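
Euler's result is easy to check numerically with the classic textbook RSA parameters (a toy keypair, useless for real security):

```python
# Exponentiation by e and by d are mutual inverses mod m whenever
# e*d = 1 (mod phi(m)) -- Euler's theorem at work.
p, q = 61, 53
m = p * q                    # modulus 3233
phi = (p - 1) * (q - 1)      # Euler's totient of m: 3120
e = 17                       # public exponent, coprime to phi
d = pow(e, -1, phi)          # private exponent

x = 42
c = pow(x, e, m)             # f: "encrypt" with the public exponent
assert pow(c, d, m) == x     # g undoes f, recovering x
```

The RSA paper's contribution was noticing that this centuries-old inverse pair could be split into a publishable half and a secret half.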

j2kun 4 hours ago 1 reply      
From what I understand, theoretical improvements in factoring algorithms go hand in hand with theoretical improvements in discrete logarithm algorithms, and for both there are algorithms which improve slightly over the trivial approach. ECC is considered better because of the believed constant-factor slowdown in arithmetic operations on elliptic curves, not because the discrete logarithm is considered harder than factoring. This article implies the opposite quite directly.
wnevets 5 hours ago 0 replies      
Steve Gibson did a pretty good job explaining this for normal folk (like me) on his podcast IMO.

Episode #374: https://www.grc.com/securitynow.htm

Introducing Cover coverscreen.tumblr.com
156 points by timdorr  9 hours ago   64 comments top 20
zmmmmm 4 hours ago 1 reply      
So how does it work with a PIN / pattern lock?

The problem with most lock screen enhancements is that anything you put there is outside your phone security "firewall" and available to anybody who picks up your phone. The 4.2 lock screen widgets work fairly well with this (eg: you can open the camera app without unlocking the phone, but attempting to swipe over the gallery forces you to unlock). However they are (I assume) using the core framework APIs to do that and I presume support for it is coded into the apps, while this seems to be doing it for any app.

fidotron 8 hours ago 3 replies      
This is a neat idea, however, I fail to see either why it requires so much funding ($1.7m according to TechCrunch) or any server backend whatsoever.
cik 8 hours ago 8 replies      
This terrifies me. There's an instant problem of paranoia and trust here. I would never be okay with the idea of an application monitoring what I'm doing in order to reorder itself. Mind you, I say all this without knowing whether it requests network access.

How do I know you're not sending my usage patterns upstream to CoverCorp? How do I know that you're not reading the Android Music Provider database, and sharing my data back?

rcthompson 6 hours ago 1 reply      
On a semi-related topic, is it really possible for an app to properly replace the Android lockscreen? I haven't found any way to do it. As far as I can tell, all the "lock screen" apps use a hack where they disable the stock lock screen and then emulate a lock screen by asking you to make the lockscreen app your default home screen and then launching your "real" home screen when you "unlock" them. I've seen it said that they do this because it isn't truly possible to replace the lock screen. The problem is, of course, that this hack sometimes doesn't work or produces weird results often enough to discourage me from using any custom lock screen.
27182818284 6 hours ago 1 reply      
My initial reaction to this is nothing but love.

I hate the idea it needs all sorts of server connections for their business model. I don't know a way around that, but if they or another company figure out how, that's what people will gravitate toward. Especially given the paranoid climate.

jackbewley 5 hours ago 0 replies      
The Android app SayIt has a widget that does something similar to this. It learns purely from usage/recency and is generally very good at presenting you with the apps you're most likely to launch. All of the analysis is performed on the device, so no information is shared with third-party servers. It also sports very fast voice-based app launching. No affiliation, I just use it. https://play.google.com/store/apps/details?id=com.rn.sayit
radley 7 hours ago 0 replies      
They're going to have the same problem as Facebook Home. They're essentially doing an overlay activity like most of us do. It only works as they describe provided the user keeps the device unlocked in the system. You can't bypass the device lock screen without rooting the device.
wayward-yeah 1 hour ago 0 replies      
I wrote an iOS speed-dial app that does the same sort of prediction for contacts that this does for applications. It's definitely an order of magnitude less sophisticated, but I thought it might be worth sharing: http://nate-at-lightspeed.appspot.com/swiftdial
pc86 9 hours ago 0 replies      
Got my 5S in the mail last week.

This makes me want an Android. Great job, guys!

sscalia 9 hours ago 1 reply      
Neat idea, elegantly executed. Exceeds the design standards typical of Android apps.

I've never liked Android's implementation of home/app screens (widgets + some apps, tap to reveal all your apps).

I guess if you want a lot of clocks, Android is great.

This adds another app/button layer...

chrisrh 8 hours ago 0 replies      
Looks to be in competition with Aviate: http://getaviate.com/
gameguy43 3 hours ago 0 replies      
Interesting example of a useful smartphone extension that is Android-only because of limitations in the access iOS gives 3rd party apps.
nathan_f77 4 hours ago 0 replies      
Why does it need an internet connection? Why is it posting data to a server?
jfaghm 4 hours ago 1 reply      
I must've missed something because the site says "launched" but I can't find the app on the site or in Google Play. Or is it only available for certain devices? I have an HTC One Google Play edition.
ejp 8 hours ago 1 reply      
This looks really slick!

How well does it work with some kind of lock-screen security? The UX for that is always a hassle, and I'd love to find someone who is doing it well.

thoughtpalette 6 hours ago 0 replies      
Doesn't Android already have a "profiles" mode so you can switch between multiple app layouts and configurations?
unlogic 8 hours ago 0 replies      
Right, unlocking the phone prior to opening apps is so hard and boring. Let's just launch them directly from the lockscreen. Wait, apps are accidentally being launched in my pocket. Can I have a lockscreen for the lockscreen?
g3orge 5 hours ago 0 replies      
Wow, beautiful device... anyone know which one it is?
samstave 8 hours ago 1 reply      
Heh. What the Facebook phone could never be: useful.

It would be good to be able to define actions based on location (either by which wifi I connect to or GPS) - as well as time of day.

(I'd like to have my screen auto dim at 10PM)

rstevens11 7 hours ago 0 replies      
this thing is slick! Great, thoughtful design that is making me want my android back
Dwolla launches Dwolla Credit Real-time payments without interchange dwolla.com
9 points by danielbru  1 hour ago   4 comments top 4
EGreg 4 minutes ago 0 replies      
How is this different than PayPal credits or any other credits?

You don't prepay?

wmf 32 minutes ago 0 replies      
How are chargebacks handled in this system?
ascordato 56 minutes ago 0 replies      
Go Dwolla, go!
Ataub24 1 hour ago 0 replies      
New banner ads push actual Google results to bottom 12% of the screen arstechnica.com
189 points by llambda  10 hours ago   103 comments top 33
anon1385 8 hours ago 4 replies      
Page and Brin themselves once pointed out the problems of accepting ads or paid placement, with some rather ironic examples:

Furthermore, advertising income often provides an incentive to provide poor quality search results. For example, we noticed a major search engine would not return a large airline's homepage when the airline's name was given as a query. It so happened that the airline had placed an expensive ad, linked to the query that was its name. A better search engine would not have required this ad, and possibly resulted in the loss of the revenue from the airline to the search engine. In general, it could be argued from the consumer point of view that the better the search engine is, the fewer advertisements will be needed for the consumer to find what they want. This of course erodes the advertising supported business model of the existing search engines. However, there will always be money from advertisers who want a customer to switch products, or have something that is genuinely new. But we believe the issue of advertising causes enough mixed incentives that it is crucial to have a competitive search engine that is transparent and in the academic realm.


The main difference seems to be that today even getting the top organic search result doesn't provide enough clicks for advertisers, so they feel obliged to purchase ads for their own brand names even when they already rank first. If people searching for Southwest Airlines on Google aren't ending up on the Southwest Airlines website without a huge great banner ad (despite it being ranked at the top of the results) then something is going badly wrong on the Google search results page.

spankalee 5 hours ago 6 replies      
Disclaimer: I'm a Googler

I think this is a pretty disingenuous analysis of what's going on. It's obvious from the comparison to the [Virgin America] search that this is a bigger change that just adding a "banner ad".

Notice that for [Virgin America] there are _two_ spots that bring you to virginamerical.com, the ad and the first organic result. This is redundant, wastes space, and probably is confusing to some users. I don't know why a company buys ads for navigational queries where it's already the top result, but they do, and I'd argue it's bad for users.

On the [Southwest Airlines] query you can see that there's no redundant ad anymore - the navigational ad and the first organic result are combined. Calling that whole box an ad, when it contains the same content that the former top organic result did, is misleading, but makes for a much more sensational headline when you want to claim that most of the screen is ads.

I'm not sure about the experiment; that's not my area. But my guess is that this is part of an attempt to remove the ad+organic confusion for navigational queries by allowing the owner of the first result of a nav query to merge the ad with the result into a professional and official-looking box. Maybe that'll work, maybe not, which is most likely why it's an experiment.

aresant 8 hours ago 1 reply      
There was an interesting study earlier this year from eBay showing that there was ZERO value in buying their own brand keywords from Google (when their organic results ranked high).

In fact, spending money on their own brand keywords generated significant negative ROI (1).

So my guess is that this strategy from Google is designed to provide brands with a first step to generating actual value from Google search results.

I can see brands making these outsized spends when they're able to provide their customers with additional value like interactivity within the Google results, etc.

(1) http://blogs.hbr.org/2013/03/did-ebay-just-prove-that-paid/

radley 1 hour ago 0 replies      
I don't think they're doing banner ads. I bet this is the beginning of Google "pages".
spindritf 9 hours ago 10 replies      
Someone searching for Southwest Airlines is probably looking for... Southwest Airlines. So the very first result is a useful one. With sections of the official website conveniently linked and a pretty picture on top.

This sounds to me like a complete non-issue. If you don't like ads, install AdBlock. Of course if you need clicks for your website, carry on.

ColinWright 9 hours ago 1 reply      
This seems largely the same as the item submitted just 3 hours ago, still on the front page, and discussed at considerable length:


Same story (but no real discussion) was submitted here:


Call me cynical, but I suspect it will still be upvoted and discussed here because any comments on that earlier discussion will get lost in the noise of the close to 200 comments already there.

Theodores 8 hours ago 0 replies      
If you look at this from a retailer perspective this completely makes sense. Important boss man at 'Acme Widgets' wants to type 'Acme Widgets' into Google and see something impressive, with some control over it. He can now pay for the advert with Google rather than pay a bunch of SEO clowns and 'web designers' that typically go over-clever with the homepage design rendering the top search results useless.

Important boss man also wants to get good results for 'acme blue widgets', 'tough widgets Alabama', 'naughty widgets' and whatever, but only really cares about those secondary searches when someone else has told him to care about them. It is the main company name in the search box that matters.

I think this is going to work well for all concerned and I don't share the cynicism most people seem to have about this.

eliben 9 hours ago 2 replies      
What are "actual results" for Southwest though? To me it seems like links to check-in, flight status, schedules and customer service are exactly what I'd want to see there. Is it the image you find distracting? But doesn't it give you an immediate indication that you got to the right place?
aegiso 8 hours ago 1 reply      
Whoa, I just had a flash forward to 2030.

First, probe the outrage machine for banners for particular brands. Then for a huge price tag, add lightweight widgets to the SERP for brands so searchers can e.g. buy tickets from the Google Search page. This is hailed by the brands as increasing sales dramatically. Demand for this feature grows.

Once significant numbers are using the SERP widgets, make the banners/widgets part of general non-brand search. Natural next step. A little bit of outrage, but at this point it just gets muffled by the masses. Life goes on.

All of these brands are getting increasingly dependent on Google's SERP widgets, which gives Google huge leverage. One deal leads to another and before you know it Google starts buying up airlines to streamline everything.

So in 2030 we're flying Google Air using a Google phone to buy tickets to the Google Movies, to see a film made by a studio wholly owned by Google.

I'm not even saying this is a Bad Thing (tm). Just that if I were heading Google this would totally be my game plan.

muraiki 8 hours ago 0 replies      
Here's the best "malicious" reasoning behind this that I could come up with. Consider the following list of _hypothetical_ statements (as I have no research to back it up):

1. Users tend to ignore the small ads on the right (anecdote: I do)

2. Users do notice and click on search results beneath the top query, even when they originally intended to arrive at their exact branded query

3. Search results beneath the top result are for competitors

Solution: put in a huge "ad" to draw attention and also to knock competitors' listings to the very bottom of the screen or below the fold completely

If 1-3 hold true, then I could see it making sense competitively to shove those other results down the page.

Edit: aresant pointed out a good article that could explain the intent. Yay! Also, it wasn't my intent to hate on Google, just a thought experiment.

shuw 9 hours ago 0 replies      
The example they use is a navigational query for "Southwest Airlines". As far as I'm concerned, the deep links to Southwest Airlines' site such as "Flight Schedules" are actual search results.

Ignoring that, it's unfair to use one example and say that search results are 12% of the page. Is that 12% an average, a median, or 12% for navigational queries only?

scott_karana 10 hours ago 0 replies      
As far as I'm concerned, the "News" results shown on the bottom 12% aren't quite search results either, though still useful. Everything is below the fold. :(
wahsd 10 hours ago 0 replies      
Well, at least there are 6,352,596,267 results I can sift through. The last search I did, I found my solution in result 4,936,392
Mikeb85 9 hours ago 1 reply      
What did he expect when searching for Southwest Airlines? He got their website and links to flights.

I just did a few searches for educational topics, got no ads. ... I would say there isn't a problem...

LeafyGreenbriar 8 hours ago 1 reply      
I was worried when I saw this headline, and then very relieved when I saw what was actually going on.

So long as Google only returns these sponsored ads for searches for the company name, I don't see this as being a problem at all, given the fact that many users are using the address bar integrated search in place of bookmarking or typing URLs.

Where this would become a problem is if they start expanding this to searches beyond simply the company name, and I think there is a bit of a gray area there. As someone else pointed out in this thread, showing the Southwest banner in response to a search for "cheap airfare" pretty unambiguously crosses a line, but what about something like "book southwest airlines flights"? One could argue that the user was attempting to get to the Southwest Airlines website to book a flight, so showing the Southwest banner would be appropriate; however, companies like Expedia, Kayak, and so on, whose links would now be much further down the page, would likely disagree.

toddmorey 8 hours ago 0 replies      
So SWA is a pretty specific example, but what about Apple? What about when you are searching for, well, information about apples? And can SWA ever own the term "Southwest"? When you think it out, it's not as cut and dried as it first seems.
mildtrepidation 5 hours ago 0 replies      
I've been criticized more than once by designers for making references to content being "below the fold." Of course there's no actual fold, and yes, it's an old term from the newspaper world. However, it's very clearly still relevant, even if it's not as easily definable: The harder you make it for people to find your content, the less likely they are to view it or continue parsing your message, regardless of what it is.
chintan 6 hours ago 0 replies      

Speaking of "high quality ads": the second Cheap-O-Air ad is for flights to the Southwest, not on Southwest Airlines. Deceptive, IMHO.

dotcoma 7 hours ago 0 replies      
They look like Altavista in 2002. Glad I switched to DuckDuckGo three months ago. Adios, Google!
stingrae 9 hours ago 0 replies      
This doesn't seem to be a very fair comparison. You can't compare a search for "maps" with a search for an actual company, in this case "Southwest Airlines." I would expect that a search for a company even earlier in Google's history would have returned links mainly to Southwest-owned pages.
dragonwriter 9 hours ago 0 replies      
That's misleading, because they are counting the whole result box that is labelled "Sponsored", but within that box, everything except the actual graphic banner at the top is exactly the organic search result that is the top hit for the search (including the subordinate links) served to users who aren't getting the new experimental ads. So everything but the graphic (not everything but the sponsored box) is "actual Google results".
ktr100 9 hours ago 1 reply      
Google quote:

There will be no banner ads on the Google homepage or web search results pages. There will not be crazy, flashy, graphical doodads flying and popping up all over the Google site. Ever.


dm8 9 hours ago 0 replies      
Most users don't even care about going to the second page of search results. With the knowledge graph, Google gives you a precise answer right away, and it takes the full screen on mobile (nearly half of the screen on desktop). I think Google is optimizing for users rather than SEO/websites.

As someone who works in advertising, even I dislike banner ads. They are obtrusive, annoying and take away attention. Google should go back to AdWords and make them better rather than anything else.

SCdF 5 hours ago 0 replies      
This is such a non-issue. They searched for an actual brand name, and they got branded results. If I searched for "how do airplanes work?" and got a massive Southwest Airlines banner this would be something to complain about. Currently though, this is just link bait.
bsimpson 8 hours ago 0 replies      
FWIW, I have a Chromebook Pixel and the large Sponsored brand box pushes the search results entirely below the fold on my screen.
andrewhillman 6 hours ago 0 replies      
I never understood why big companies waste money on keyword campaigns for their own brand, especially since they are going to show up first anyway. These banner ads provide branding opportunities, so I understand this move.
NicoJuicy 6 hours ago 0 replies      
I'd seriously consider using the Bing search engine more with my chrome browser just to get in their statistics..

This ain't a big deal actually, it's a test to get more from their Adwords when people really search for the companies. But behold the future :( (investors, stocks, it will never be enough).

acheron 9 hours ago 1 reply      
I like the image of results from 2005. I had totally forgotten about "Froogle".
tn13 7 hours ago 0 replies      
I am not sure why this is a bad thing as such. Google does not owe us search results. Google owes advertisers a good return for their money, and they will optimize things in whatever way they can. At least they are not being like Ask or Conduit.
elwell 8 hours ago 0 replies      
I'm sure this makes DuckDuckGo happy.
pearjuice 8 hours ago 0 replies      
Easily circumvented by using proper browser plugins.
mindcrime 4 hours ago 0 replies      
Wow, that's absolute shit. Horrible, horrible, brain-dead move by Google. It won't happen overnight, but this will inevitably wind up pushing people to seek out a better search engine (read: one that doesn't display huge honkin' banner ads like this) and sooner or later, somebody will come along and offer equal (or better!) search results, nix the banner ad, and eat Google's lunch.

Google are so big and powerful that it's easy and tempting to think of them as invulnerable and immortal, but remember... people have thought that about many companies in the past, more than a few of whom are no longer with us.

Edit: OK, IF this really is only for brand names and doesn't show up for more general searches ("cheap airline tickets", etc.) then maybe it won't be received so badly. That said, I still believe that, in general, "big honkin' banner ads" are NOT going to be well received on Google search result pages. I guess time will tell.

ChrisNorstrom 6 hours ago 1 reply      
I am convinced they have monkeys for designers. WHY on earth would you allow "About 30,200,000 results (0.25 seconds)" to take up space?! Are these guys insane? That's the most useless information on the page, and it's pushing the ads and search results further down.
Microsoft reports record first-quarter revenue of $18.53 billion microsoft.com
187 points by coloneltcb  7 hours ago   182 comments top 20
ChuckMcM 7 hours ago 10 replies      
Nice boost in their search revenue; if you're wondering why Google's CPCs are going down, that is why. Microsoft has gotten serious about exploiting their search engine tech and that is having an effect [1]. Unlike 'recuter' I don't think this is their "Blackberry Moment" :-)

Google is smearing the smartphone market, at the expense of Apple's cash engine, Microsoft is smearing the Search market at the expense of Google's cash engine and Linux is smearing the operating system market at the expense of Microsoft's cash engine. Seems like there is a lot of pressure to diversify.

[1] http://techcrunch.com/2013/10/24/pricing-engine-adwords-bing...

300bps 6 hours ago 10 replies      
I am a developer at an investment bank, and I passed the Level 1 Chartered Financial Analyst exam, part of which explicitly tests your ability to read accounting statements. If you are a developer without similar training, please realize that you will probably sound as uninformed offering commentary on this topic as an equity analyst would giving their opinion on pages of C code.
mrb 5 hours ago 1 reply      
Note the keyword is "first-quarter". Usually for Microsoft the first quarter of the fiscal year is a little below other quarters. But this first quarter is still below what MS typically achieves the other 3 quarters of the year:

- 1st quarter of last year: $16.01 billion

- 2nd quarter of last year: $21.46 billion

- 3rd quarter of last year: $20.49 billion

- 4th quarter of last year: $19.90 billion

- 1st quarter of this year: $18.53 billion (the "record" one)

jusben1369 7 hours ago 2 replies      
This is impressive. They blew through their numbers. They're showing that they can offset a slowdown in the core cash-cow via other product lines. Search revenue increase was particularly impressive.
paul_f 6 hours ago 2 replies      
For those of you who continue to predict the demise of MS, it might be worth mentioning that Microsoft has a quite broad range of products and they primarily sell to enterprise customers who are notoriously non-fickle.
netpenthe 4 hours ago 0 replies      
For those people doubting the future of MSFT, here is my take:

MSFT is both a tech company and a utility.

It has growth potential (phones, surface, search, xbox) but it is also completely essential for global business (servers, AD, SQL Server, Exchange, Sharepoint).

In that sense it is a utility. If you took out all the MSFT software in the world everything basically stops. Your electricity probably doesn't work, you probably can't get on a train to get to work and if you manage to get to work you can't login to anything.

People say "but my company has BYOD!" that might be true, but MSFT is still the infrastructure it is running on. You can bring your AAPL car but you're still driving on an MSFT road.

Zigurd 6 hours ago 0 replies      
If they want to translate the great performance some parts of Microsoft are having into a Google-like stock price they should break Microsoft up into business and consumer companies.

Critics of Microsoft are wrong to call its enterprise business a dinosaur. There is no reason to think Microsoft won't continue to grow this business for decades to come.

But I would like to be able to own this as a pure play, not mixed up with XBOX. Let's call this company "Azure" and spin it off, like HP did with Agilent (which should have been called HP), and let the "devices and services" part screw around with reinventing itself.

us0r 6 hours ago 0 replies      
Why are we comparing Microsoft to RIM?

RIM was a one trick pony. Microsoft has several billion dollar businesses.

Theodores 2 hours ago 1 reply      
Here is the Google Trends graph showing the decline in the search terms 'Microsoft' and 'Windows':


For comparison there is the trend for 'iphone' and 'android'.

Sure, Microsoft is doing loads of exciting things, but people aren't typing 'Microsoft' or 'Windows' into Google's search box as much as they used to. Make of that what you will.

spoiledtechie 5 hours ago 0 replies      
In other news, they still only pay 5% taxes due to offshore accounts.
aabalkan 7 hours ago 0 replies      
Stock price just hit after hours $35.63 (up 5%), good news for Microsoft employees indeed.
eddiegroves 5 hours ago 0 replies      
The enterprise and business division is turning into a juggernaut at Microsoft that shows no sign of slowing down. Unlike the Windows team, they have a clear focus and vision guiding them.
epa 6 hours ago 2 replies      
Don't forget these are un-audited and don't really give us much information other than what they want to show us.
rch 5 hours ago 0 replies      
I switched my default search in chrome to bing when the new tab page changed. I've since fixed the tab page, but left the search provider alone for now to see what differences I notice over time. So far, it's OK, but fails to give me my geek-centered results for generically named things like orange and amber.
umeshunni 6 hours ago 2 replies      
$400M in Surface revenues. At even a generous $400 unit price, that's only 1M sold this quarter. Probably closer to 700K if you consider some of them being Surface 2 priced up to $900.
devx 7 hours ago 2 replies      
Windows revenue is (finally) down. I say finally only because many people wouldn't believe this would happen, even a few quarters ago. That could be quite a problem for Microsoft over the next few years. Right now they are offsetting that with enterprise deals, but do they really think that's safe for them? RIM did, too.
gesman 7 hours ago 0 replies      
MSFT's todo list to boost revenues:

1. Accelerate Ballmer booting out process. Why's he still there?

2. Boost Cloud.

3. Boost enterprise services and everything else.

4. Stop wasting resources on stupid consumer widgets department.

JPKab 7 hours ago 4 replies      
They will continue to milk the enterprise cow, but eventually even they will dry up.
recuter 7 hours ago 3 replies      
Headlines from Dec 18th, 2009: "BlackBerry shipments break record in Q3, RIM profits jump 59 percent".

Also known as the Wile E. Coyote Syndrome.

Paying Money To Save It And Doing More With Less brianbrunner.com
27 points by bbrunner  3 hours ago   12 comments top 8
Xorlev 2 minutes ago 0 replies      
Agreed. I used to think it was best to "do it yourself so you know it" but was quickly disabused of that notion. Do important things yourself if you can do them better -- the stuff that makes or breaks your business. SaaS is a godsend.

TRWTF is that the author knows someone who reimplemented NodeJS in house.

nahname 1 hour ago 1 reply      
Jenkins easily runs on a $20/month DigitalOcean box and should take less than 2 hours to set up. I scripted Jenkins to shut down and wake up the workers on Heroku during off hours. It took me less than five minutes to script the job and it saves us $35 a month per environment. Effectively, Jenkins costs nothing per month (it actually saves us money).

Most testing services are horribly overpriced and you don't get that level of control. Maybe you want to reconsider that one?




nfm 2 hours ago 0 replies      
I have mixed feelings about this, but that might just relate to a certain subset of SaaS products we've used - namely products that require heavy integration with your data.

I'd sooner spend engineering time writing our own report generators than writing code to push data to another service, in the format they expect, pre-empting which data we might want to have in that service down the track, understanding exactly how the different reports are calculated, and then inevitably having to write a few custom ones of our own as well.

Of course, the exception to this is services that have a huge set of useful features and take basically no time to fully integrate with, like Google Analytics.

pm 3 hours ago 1 reply      
Is NIH syndrome such a big deal with engineers that they need to implement EVERYTHING? I'm leading a team of 3, including myself, and I can't imagine a worse way to spend your time. Why would I spend precious hours re-implementing a non-core problem badly when there's someone already doing it well?

When you support a SaaS service, assuming the product is good, you're essentially paying for an entire team to work on a problem you don't like that much but need, but which they love. You're also paying for the future of that product. Unless the software is really bad, or you need something so specialised that a current solution is out of the question, how does an engineer not understand this?

You only have limited time in the world. Work on something you find meaningful.

greenyoda 1 hour ago 0 replies      
The risk of outsourcing to a SaaS is that the vendor can disappear - it may be a small startup that goes bankrupt, gets acqui-hired, pivots to a different product, etc. Or maybe they can't provide the bandwidth you need to grow or the 99.9% uptime your customers are demanding. At that point you have to scramble to find a replacement SaaS or implement the service in-house.

I have nothing against paying for software that someone else wrote, but I'd feel much more comfortable buying the software and running it on my own servers (with a contract that says that if the vendor goes out of business, I retain the right to use the software).

PaulHoule 3 hours ago 0 replies      
The "we don't buy software here" syndrome is going away.

GitHub is a big part of it. At a lot of shops you have time to drive to the next town and back to update the wiki or close out a trouble ticket. Uncompromising speed is a feature that turns your developers into winners.

melindajb 36 minutes ago 0 replies      
Love this idea. Of course it assumes one finds the right software quickly without too much time and hassle.
bluedino 1 hour ago 0 replies      
Article is extremely hard to read on my phone.
Assembly - Quirky for Software assemblymade.com
4 points by sinak  19 minutes ago   discuss
Darpa Initiative Will Focus on Advancing Deep Brain Stimulation nytimes.com
40 points by ironchief  5 hours ago   28 comments top 13
npalli 4 hours ago 3 replies      
Anyone who wants to see the dramatic impact of deep brain stimulation needs to see this video [1]. Just by turning his neurostimulator off and on (starting around 2:00), watch the dramatic change in his body.


suprgeek 3 hours ago 0 replies      
If you have seen Fringe [1], you know where this is headed...

[1] http://en.wikipedia.org/wiki/Brain_implant#Television
etrautmann 4 hours ago 0 replies      
I'm involved with one of these programs and am very encouraged to see DARPA's focus as very much inline with the goals of the broader field of neuroscience.

In heading off some of the anticipated snark, this is very much a program with the goal of understanding basic science and developing tools to better understand how our brains work.

LeBlanc 2 hours ago 0 replies      
A lot of this research is going towards developing neuro-prosthetics. The primary application of neuro-prosthetics in humans is to give paraplegics either the ability to control a cursor with their mind (a huge improvement in standard of living) OR the ability to control a prosthetic arm or leg with their mind (the long term goal).

Currently, this is only available with invasive brain surgery that can often have complications. So money spent on better imaging and implant technologies will have a strong positive impact on the field.

Interestingly, the researchers I know in this area are confused about why such a big deal is being made about this "Brain Initiative" because the amount ($70M) is actually not a lot given how capital intensive this type of research is and how many labs it will be spread amongst. Still, any funding is better than no funding.

ultimoo 1 hour ago 1 reply      
To put this in perspective, DARPA's annual budget is a little more than 2.8 bn.

70 mn. over five years is (naively) 14 mn. per year -- which is 0.5% of DARPA's budget.

Wingman4l7 1 hour ago 0 replies      
Obligatory related Cory Doctorow scifi short story, 0wnz0red: http://www.salon.com/2002/08/28/0wnz0red/
tonyplee 54 minutes ago 1 reply      
Resistance is futile; you will be assimilated.

Google Glass is not a moonshot. A Google brain implant would be. Just think about the difference in SAT scores and all your college test scores between those with a brain implant and those without.

TrainedMonkey 2 hours ago 0 replies      
We need to monitor thoughts of people to prevent them from committing crime. We promise* not to use that information in any* bad* way.* For a very specific meanings of promise, any, and bad which are all unfortunately classified.

But seriously, this research can enable us to understand brain better and help a lot of people.

bmmayer1 3 hours ago 0 replies      
There are no foreseeable negative consequences to having a government agency put microchips in our brains.
narfquat 4 hours ago 2 replies      
Oh good, a military R&D lab aiming to develop brain implants...

Is there an equivalent of a tin foil hat that is available for subdermal implantation?

Kidding aside, military research has resulted in some of the most amazing stuff these days... like the entire space program and velcro.

atrilumen 4 hours ago 3 replies      
Darpa just wants to treat depression. (And not, you know, create supersoldiers or anything.)
zcarter 3 hours ago 0 replies      
On the inevitability of human-computer interfaces leading to some form of übermenschine: "We just have to hope that Dick Cheney isn't the first person they plug in."
missserenity 3 hours ago 2 replies      
Why are they funding breast implants?
MS-DOS Viruses in Action wired.com
29 points by adamnemecek  4 hours ago   6 comments top 6
danjayh 0 minutes ago 0 replies      
The difference between viruses then (most did little real harm, left your computer in a perfectly working state, and were meant to show off the skills of the writer) and now (network zombies used to make people money, or to replace web content with ads) is kind of depressing.
yeukhon 5 minutes ago 0 replies      
The awkward moment when I clicked on a web page with MS-DOS viruses in action, and while browsing the awesome gifs, the site decided to suddenly play an ad like this (http://i.imgur.com/6I3WTUX.png)...
kjackson2012 1 hour ago 0 replies      
No reference to the stoned virus? I remember that being extremely widespread, despite not having the Internet to easily propagate it.
voltagex_ 1 hour ago 0 replies      
I wonder how many of these run in DOSBOX
blueblob 1 hour ago 0 replies      
Awesome how much creativity these hackers had. The graphics are surprisingly pretty amazing for some of this stuff.
gesman 2 hours ago 0 replies      
INT 13 (CD 13) - feel the power to format the hard drive in less than 8 bytes of code. Good old days :)
What's new in WordPress 3.7, "Basie" poststat.us
38 points by krogsgard  5 hours ago   15 comments top 3
DigitalSea 4 hours ago 2 replies      
The automatic background updates feature might be the most important feature of all in 3.7. Hopefully this goes a long way toward plugging some of the rampant security issues that plague the CMS because of failures to update to later versions. The new date querying features are also a welcome addition, as querying advanced date values in WordPress has been notoriously hacky in prior versions. Finally, search has been given a little love and is no longer so horrible that it requires a third-party plugin to fill the void. Best version of WordPress to date by far; I wonder how they can top this list in 3.8.
ChrisNorstrom 2 hours ago 1 reply      
To me WordPress is one of the most important things to happen to the internet. Not Facebook, not Twitter. WordPress has democratized creating a site and helped millions of people create an online business, community, or blog, helping bring in massive amounts of revenue for small business owners.

I've got 4 sites running wordpress: my portfolio, my online store, a design database for items under $50, and a magazine cutout marketplace. (See my profile for links) Those last two I mentioned took less than a week to tweak and hack together thanks to the speed and ease of setting up wordpress sites.

It's sad that Matt Mullenweg never got the same recognition that Jack Dorsey or Mark Zuckerberg got. He definitely deserves it. We've got to stop only celebrating and worshiping people who make money. I think Matt empowered people just as much if not more.

d_espi 5 hours ago 0 replies      
Great write up! Tons of useful features in the new release.
Exact numeric nth derivatives jliszka.github.io
122 points by jliszka  10 hours ago   44 comments top 14
backprojection 9 hours ago 6 replies      
I think it's worth noting that the problem with numerical differentiation, fundamentally, is that differentiation is an unbounded operator. In finite differences (the more obvious approach), you assume that your data are samples of some general function.

The problem, then, is that general functions have no (essential) bandlimit [1]. Remember that differentiation acts as multiplication by a monomial in the frequency domain [2]. Non-constant polynomials always eventually blow up away from 0, so in differentiating, you're multiplying a function by something that blows up in the frequency domain. This means that, in the result, higher frequencies are going to dominate over lower frequencies, at a polynomial rate.

Let me be clear, the problem with numerical differentiation is not just that rounding errors accumulate, it's that differentiation is fundamentally unstable, and not something you want to apply to real-world data.

It depends very much on what your application is, however, I think generally a better approach to AD is to redefine your differentiation, by composing it with a low-pass filter. If designed properly, your low-pass filter will 'go to zero' faster (in the frequency-domain) than any polynomial, thus making this new operator bounded, and hence numerically more stable. It's not a panacea, but it begins to address the fundamental problem.

One example of such a filter is Gamma(n+1, n x^2)/Factorial[n], where Gamma is the incomplete gamma function [3].

In Python:

scipy.special.gammaincc(n+1, n*x**2) or mpmath.gammainc(n+1, n*x**2, regularized=True)

To see why this is a nice choice, notice item 2 in [4]. This filter is simply the product of exp(-x^2) (the Gaussian) and the first n terms of the Taylor series of exp(+x^2) (1/the Gaussian). Since this series converges unconditionally everywhere, the filter converges to 1 for a fixed x as n -> +infinity; however, since it's still a Gaussian times a polynomial, it always converges to 0 as you increase x but fix n.

This is my area of research, so if anyone's interested I can give more details.

[1] https://en.wikipedia.org/wiki/Band-limit

[2] https://en.wikipedia.org/wiki/Fourier_transform#Analysis_of_...

[3] https://en.wikipedia.org/wiki/Incomplete_gamma_function

[4] https://en.wikipedia.org/wiki/Incomplete_gamma_function#Spec...
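To make the filter described above concrete, here is a small numerical sketch (standard library only; `damped_filter` is a hypothetical name). For integer order, the regularized upper incomplete gamma function has a standard closed form that exhibits exactly the "Gaussian times truncated Taylor series" structure:

```python
import math

def damped_filter(n, x):
    """Q(n+1, n*x**2): the regularized upper incomplete gamma function,
    computed via the closed form valid for integer first argument:
        Q(n+1, y) = exp(-y) * sum_{k=0}^{n} y**k / k!
    i.e. a Gaussian in y times the first n+1 Taylor terms of exp(+y)."""
    y = n * x * x
    truncated_exp = sum(y**k / math.factorial(k) for k in range(n + 1))
    return math.exp(-y) * truncated_exp

# Toward 1 as n grows (fixed frequency x = 0.5)...
print([round(damped_filter(n, 0.5), 6) for n in (1, 5, 25, 100)])
# ...and toward 0 as the frequency x grows (fixed n = 5).
print([round(damped_filter(5, x), 6) for x in (0.5, 1.0, 2.0, 4.0)])
```

Numerically, for a fixed frequency below the cutoff the value creeps toward 1 as n grows, while for fixed n it decays faster than any polynomial in x, which is the bounded low-pass behavior the comment argues for.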

jordigh 9 hours ago 4 replies      
Am I missing something or is this begging the question?

For any function that is not a combination of polynomials, you need to have its Taylor expansion up to the desired order of derivatives, so you can't just take an "arbitrary" function and use this method to compute its derivative in exact arithmetic.

So for anything other than polynomials, you just reword the problem of finding exact derivatives to finding exact Taylor series, and in order to find Taylor series in most cases, you have to differentiate or express your function in terms of the Taylor series of known functions.

Edit: Indeed, take the only non-polynomial example here, a rational function (division by a polynomial). In order to make this work, you have to know the geometric series expansion of 1/(1-x). For each function that you want to differentiate this way, you have to keep adding more such pre-computed Taylor expansions.

crntaylor 9 hours ago 2 replies      
Very neat. Presumably there is a more efficient method for implementing Nth order automatic differentiation than encoding the dual numbers as NxN matrices, though? To multiply the matrices takes O(N^3) time, whereas by exploiting their known structure I think you should be able to do it in O(N^2) time. Am I wrong?
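For what it's worth, the structure to exploit is that order-N dual numbers behave like truncated Taylor series (polynomials modulo x^N), so their product is a truncated convolution, which is O(N^2). A sketch (the function name is mine):

```python
def jet_mul(a, b):
    """Multiply two truncated Taylor series given as length-N coefficient
    lists (constant term first), keeping only the first N terms.  This is
    an O(N^2) truncated convolution, versus O(N^3) for multiplying the
    equivalent N x N triangular Toeplitz matrices."""
    n = len(a)
    out = [0.0] * n
    for i, ai in enumerate(a):
        for j in range(n - i):
            out[i + j] += ai * b[j]
    return out

# f(x) = x^2 at x = 3, seeded with the jet 3 + 1*eps:
jet = jet_mul([3.0, 1.0, 0.0], [3.0, 1.0, 0.0])
print(jet)  # [9.0, 6.0, 1.0]: value 9, derivative 6, (1/2!) * second derivative 1
```

(The O(N log N) route mentioned elsewhere in the thread would replace the double loop with an FFT-based polynomial multiply.)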
ot 9 hours ago 2 replies      
> they are almost never used in any real automatic differentiation system

They're efficient enough for first-order derivatives. For example, they are used in Ceres, Google's library for non-linear least-squares optimization.


Pitarou 1 hour ago 0 replies      
How does this technique compare to a computer implementation of the kinds of techniques we learnt in high school? Is it easier to implement? More efficient? Are there some situations where it isn't appropriate?
BoppreH 5 hours ago 0 replies      
I couldn't believe it would work, so I made a toy implementation in Python using simple operator overloading: https://github.com/boppreh/derivative

All values tried so far agree with Wolfram Alpha, so color me surprised and happy for learning something new.
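For readers curious what such an operator-overloading implementation looks like, here is a minimal first-order sketch in the same spirit; it is illustrative code (not taken from the linked repo) and handles only addition and multiplication:

```python
class Dual:
    """First-order dual number a + b*eps with eps^2 = 0; the eps
    coefficient carries the derivative along automatically."""
    def __init__(self, a, b=0.0):
        self.a, self.b = a, b
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.a + o.a, self.b + o.b)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # product rule falls out of eps^2 = 0
        return Dual(self.a * o.a, self.a * o.b + self.b * o.a)
    __rmul__ = __mul__

def derivative(f, x):
    return f(Dual(x, 1.0)).b   # seed dx/dx = 1, read off the eps part

print(derivative(lambda x: 3 * x * x + 2 * x, 5.0))  # 6*5 + 2 = 32.0
```

Extending it to division or to functions like sin just means adding the corresponding (value, derivative) rule for each operation.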

dhammack 7 hours ago 1 reply      
There's an interesting python library [1] which implements AD as well as has some neat features like automatic compilation to optimized C. It's developed by the AI lab at the University of Montreal, and is pretty popular in deep learning circles. I've found it to be a huge time saver to not worry whether you screwed up your gradient calculations when doing exploratory research!

[1] http://deeplearning.net/software/theano/

tel 10 hours ago 0 replies      
See also the ad package [1] for Haskell, which has a number of interesting features in this vein.

[1] http://hackage.haskell.org/package/ad

tomrod 1 hour ago 0 replies      
This read nicely until I got to the code block. Does anyone else see this as yellow and gray (with syntax highlighting) coloration--such that it's virtually impossible to read?
svantana 5 hours ago 1 reply      
Is it just me or is this article pretty naive? The headline's use of the word "exact" would imply integer arithmetic only, but the computations are done with floating point. So basically (s)he is trading one rounding error for another, which seems to be small-ish in some particular cases. What about discontinuities? And why forward derivatives only? I hope no one will use this for any application that actually relies on exact derivatives.
mrcactu5 7 hours ago 0 replies      
Congratulations, you have implemented the Zariski tangent space using nilpotent matrices. Welcome to the beautiful theory of algebraic geometry and schemes.


This really does fall in the realm of algebraic geometry, since this method only works for rational functions - as he implemented it.

To numerically compute sin(x + ε) you need the Taylor series.

gpsarakis 9 hours ago 2 replies      
Nice analysis. Hope you don't mind me adding that by omitting terms of the Taylor series you do have some loss of precision, however small. Also, solving linear equation systems may even introduce instability, since the following property must be preserved: http://en.wikipedia.org/wiki/Diagonally_dominant_matrix
fdej 9 hours ago 0 replies      
This seems to be essentially the same thing as "power series arithmetic" (first-order "dual arithmetic" is equivalent to arithmetic in the ring of formal power series modulo x^2, but you can make that x^n).

Encoding power series as matrices is sometimes convenient for theoretical analysis (or, as here, educational purposes), but it's not very efficient. The space and time complexities with matrices are O(n^2) and O(n^3), versus O(n) and O(n^2) (or even O(n log n) using FFT) with the straightforward polynomial representation (in which working with hundreds of thousands of derivatives is feasible). In fact, some of my current research focuses on doing this efficiently with very high-precision numbers, and with transcendental functions involved.

Bill_Dimm 9 hours ago 1 reply      
There seems to be a typo at the beginning of the "Implementing dual numbers" section. It says:

  The number a+bd can be encoded as...
Should be:

  The number a+b*epsilon can be encoded as...
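For context, the encoding under discussion maps a + b*epsilon to the 2x2 upper-triangular matrix [[a, b], [0, a]]; epsilon itself is [[0, 1], [0, 0]], which squares to the zero matrix, reproducing epsilon^2 = 0. A quick stdlib check (helper names are mine):

```python
def dmat(a, b):
    """Encode a + b*eps as the 2x2 matrix [[a, b], [0, a]]."""
    return [[a, b], [0.0, a]]

def matmul(m, n):
    """Plain 2x2 matrix product."""
    return [[sum(m[i][k] * n[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps; read the result off the top row:
print(matmul(dmat(2.0, 3.0), dmat(4.0, 5.0)))  # [[8.0, 22.0], [0.0, 8.0]]
```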

Low-background Steel wikipedia.org
16 points by spking  3 hours ago   9 comments top 4
cmsmith 27 minutes ago 0 replies      
The facts in this article are somewhat incongruous:

Radiation levels due to nuclear testing were elevated 7% over normal

The major source of radioactivity in steel is cobalt-60, which has a half-life of 5.27 years

In which case one could just wait a year and the radioactivity of your steel would drop by 7%, making up for the effects of nuclear testing contamination. Put another way, steel from 1944 has been around for some 10 half-lives of cobalt-60, meaning it has 1/2^10th as much 60Co radiation as when it was made. Why would it matter if the radioactivity was 1/2^10th or 1.07/2^10th as much as the background radiation?

I'm sure there are other isotopes which make this more of a problem, but the facts as presented in this article don't make much sense.
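Spelling out the parent comment's arithmetic as a quick sketch (note that "some 10 half-lives" is the commenter's rounding; 69 years divided by 5.27 years is closer to 13, which only strengthens the point):

```python
half_life_years = 5.27                            # cobalt-60, per the article
n_half_lives = (2013 - 1944) / half_life_years    # ~13 for steel cast in 1944
baseline = 0.5 ** 10       # fraction of 60Co left after the comment's 10 half-lives
elevated = 1.07 * baseline # same decay starting from the 7%-elevated level
print(n_half_lives, baseline, elevated)
```

Both residual fractions are tiny and differ by the same 7%, which is the commenter's point.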

EvanKelly 2 hours ago 2 replies      
Could someone clarify how radiation dosing and dose rates are measured?

The article says background radiation levels peaked at .15 mSv in 1963. Looking at the Wikipedia page on sieverts, I am trying to compare this to other radiation examples, but I'm not sure how to draw a comparison.

Would a human standing outside be receiving .15 mSv per hour? year? total?

mattparlane 1 hour ago 1 reply      
There are four "[citation needed]"s in the first paragraph. Surely there must be some substance out there? I did some quick Googling and couldn't find much beyond what the article already has.
endgame 2 hours ago 2 replies      
That's interesting and all, but why do we get a mildly-interesting wikipedia article every couple of days without any context or commentary?

I've half a mind to write a bot that submits a random article every 48 hours.

The best patent troll-killing bill yet eff.org
221 points by beauzero  15 hours ago   28 comments top 6
davidw 13 hours ago 5 replies      
Oh, cool - DeFazio (one of the sponsors) is my representative. I guess that saves me from having to call him. Anyone know if they tally calls for people calling to say thanks or good job or whatever?
larrik 13 hours ago 1 reply      
I still think the idea that customers of an infringing product can be liable is completely bonkers.
ColinWright 13 hours ago 0 replies      
shmerl 8 hours ago 0 replies      
By the way, what happened to another important bill to repeal the wicked DMCA 1201 (by Reps. Zoe Lofgren and others)?


snarfy 13 hours ago 3 replies      
I'm having a hard time pulling my jaw from the floor after reading Lamar Smith's name in the list of sponsors.
tallbrian 2 hours ago 0 replies      
Regarding "fee shifting": if the loser can be made to pay the winner's fees, couldn't that be an even scarier proposition for a troll target?
So You'd Like to Make a Map using Python sensitivecities.com
221 points by urschrei  15 hours ago   40 comments top 12
Demiurge 14 hours ago 3 replies      
Cool article; it explains how you can do anything using Python, although it doesn't mention Mapnik. However, for most people these days, I would recommend trying TileMill (https://www.mapbox.com/tilemill/) to make a map. CartoCSS lets you style anything based on attributes, and it also lets you add and style raster data.
polskibus 13 hours ago 4 replies      
My biggest problem with maps these days is the data license for commercial use. I don't need a very detailed map, usually administrative level 2, but it's hard to find accurate sources that don't make you pay thousands of dollars per small userbase. We create our own app and distribute it, therefore we cannot exactly estimate our userbase. Does anybody know of a decent source with good, fairly detailed world maps and a liberal license? Doesn't have to be free.
jofer 15 hours ago 1 reply      
Surprised to see that they're using basemap instead of cartopy. There's nothing wrong with using basemap, but it can be a bit clunky, i.m.o.

Then again, cartopy is only a year or two old, so it doesn't have the traction that basemap does. It's gained a fairly large following very quickly, though.

pacofvf 15 hours ago 0 replies      
In our company we use Python to make maps, but we go with the traditional GIS approach. Dependencies? PostGIS and Mapnik. The first two examples would be solved by a single PostGIS query; the last one would maybe require some extra work. But nice work anyway, bookmarked.
jaegerpicker 14 hours ago 0 replies      
Very cool article. I've always loved maps and mapping, and Python is my preferred language. The only thing I would mention is that it would be nice to have a pic of the results earlier in the article; that's just from the "let's look at this article, seems cool, but what exactly is he teaching me" angle. I'm more likely to try the code if I can see the results up front. Otherwise it was a really cool example.
spiritplumber 10 hours ago 0 replies      
http://www.robots-everywhere.com/re_wiki/index.php?title=Geh... I wrote a sort of google earth API wrapper thing in python if anyone wants it. Windows only though.
gjreda 11 hours ago 1 reply      
There's also Vincent[1], which has some mapping capabilities and is built on top of Vega (a "visualization grammar" for d3js).

[1]: https://github.com/wrobstory/vincent

dannypgh 14 hours ago 4 replies      
Cartography? Hasn't everything already sort of been discovered, though, by, like, Magellan and Cortés?
pagekicker 14 hours ago 3 replies      
What are blue plaques?
cwal37 9 hours ago 0 replies      
Very interesting, it had never occurred to me that there were probably python libraries for mapping. My ArcMap license expires in less than two weeks, perhaps I will give this a shot before I re-up.
zmjones 12 hours ago 1 reply      
This is exactly why I prefer R for static maps. Would have taken like a quarter of the time, if that.
namelezz 11 hours ago 0 replies      
Cool! I have been looking for an article like this. Thank you for sharing.
Debian Packaging and Distribution at Ninja Blocks ninjablocks.com
3 points by thatguydan  28 minutes ago   discuss
An Interview with Kevin Novak, Data Scientist at Uber chartio.com
65 points by thingsilearned  9 hours ago   13 comments top 3
jorgem 7 hours ago 2 replies      
Creepy: "at Uber, we've got every GPS point for every trip ever taken at Uber, going back to the Trip #1"
ztnewman 1 hour ago 0 replies      
I wish he actually discussed his work as a data scientist, not just the field in general
calcsam 8 hours ago 0 replies      
Kevin's great. I got to know him after he came to present at HackerDojo and did a heck of a job -- extremely helpful guy!
The Man Who Would Teach Machines to Think theatlantic.com
152 points by jonbaer  14 hours ago   71 comments top 16
cs702 12 hours ago 5 replies      
"Gödel, Escher, Bach" is one of my favorite books, and I have a tremendous amount of respect and admiration for Hofstadter... so I'm really disappointed and saddened to read that he (quoting from the article) "hasn't been to an artificial-intelligence conference in 30 years. 'There's no communication between me and these people,' he says of his AI peers. 'None. Zero. I don't want to talk to colleagues that I find very, very intransigent and hard to convince of anything. You know, I call them colleagues, but they're almost not colleagues -- we can't speak to each other.'"

Hofstadter should be COLLABORATING with all those other researchers who are working with statistical methods, emulating biology, and/or pursuing other approaches! He should be looking at approaches like Geoff Hinton's deep belief networks and brain-inspired systems like Jeff Hawkins's NuPIC, and comparing and contrasting them with his own theories and findings! The converse is true too: all those other researchers should be finding ways to collaborate with Hofstadter. It could very well be that a NEW SYNTHESIS of all these different approaches will be necessary for us to understand how complex, multi-layered models consisting of a very large number of 'mindless' components ultimately produce what we call "intelligence."

All these different approaches to research are -- or at least should be -- complementary.

stiff 11 hours ago 5 replies      
So Good Old Fashioned AI[1] is the new hot underdog AI thing now? I seriously don't understand the praise of Hostadter in the article and in the comments here, and the criticism of the mainstream AI research, especially it is very hard to find any precise details of what he does and what are the outcomes.

There have been attempts to understand intelligence with intelligence (logic, symbols, reasoning, etc.) for 30 years, to not much effect; now AI and machine learning are advancing quite steadily, so why the snark? All evidence suggests that the way the brain itself learns things is statistical and probabilistic in nature. There are also new disciplines now, like probabilistic graphical models, which are free of some of the traditional downsides of purely statistical methods, in that they can be interpreted and human-understandable knowledge can be extracted from them. This is something that really seems promising, and to some extent it is a union of the old and new approaches, despite the claims of a big division; but it is hard to see much promise in purely symbolic methods invented merely by some guy somewhere thinking very hard.

I for one am very happy that people seek inspiration in the way the human brain works; that's what science is. If you just come up with things without consulting the real world, it's not science, it's philosophy, the one discipline that has yet to produce a single result.

[1] http://en.wikipedia.org/wiki/GOFAI

ssivark 11 hours ago 1 reply      
Norvig and co. are like drunk men searching for their lost key under a streetlight. It might not be where it lies, but that's the only place where they think they could find something, or at least make some tangible progress. Hofstadter doesn't mind taking the long shot... feeling his way about in the dark, in the hope of inching forward and making progress towards artificial intelligence.

This comparison between complementary approaches is an apt analogy for most fields, where the focus shifts every once in a while, when one of the approaches largely hits a wall and most people switch to the other one. A while later, the trends will almost inevitably reverse and draw inspiration from other approaches. The unfortunate thing is that there's no dialogue between the two camps, which makes it that much harder to port good ideas from one context to the other.

I could provide examples from physics research, or for that matter, trends in static-vs-dynamic blogs :P Also, the more "applied" the field, the shorter these cycles are.

Ref: https://en.wikipedia.org/wiki/Streetlight_effect

stiff 12 hours ago 3 replies      
This is a bad article, especially for a technical audience. It romanticizes things a lot, as journalists have to, to keep up the readership rates, but it doesn't make for a very balanced judgement. This kind of debate is going on and on, you can read a much more reasonable account here:


I find the analogy to Einstein at the end of article especially funny. I think it's much more likely that people will look upon current defenders of "good old fashioned AI" like they now do upon people who still looked for ether after Einstein's discoveries.

drcode 13 hours ago 4 replies      
Douglas Hofstadter is important because most AI work right now focuses either on (1) big-data-style statistical analysis or (2) emulating brain anatomy.

DH is the most well known guy of a small, stubborn group of AI developers who still believe that "human thought" can be reasoned about and can be understood in isolation, and that we can build intelligence without simply reducing it to statistics or to brain anatomy.

I applaud his efforts, and find some of the programs he's written both creative and refreshing.

aethertap 13 hours ago 1 reply      
I've been enjoying this series from MIT OCW on Gödel, Escher, Bach:


MichaelMoser123 2 hours ago 0 replies      
Also of interest:

Hofstadter's lecture about analogy on YouTube http://www.youtube.com/watch?v=n8m7lFQ3njk

Also some earlier work on the subject


I have also written a review of this very interesting book, "Surfaces and Essences: Analogy as the Fuel and Fire of Thinking"


dnautics 3 hours ago 0 replies      
When you read Fluid Concepts and Creative Analogies: Computer Models of the Fundamental Mechanisms of Thought, which describes in detail this architecture and the logic and mechanics of the programs that use it, you wonder whether maybe Hofstadter got famous for the wrong book.

I cannot recommend "Creative Analogies" more. I have purchased no less than four copies (two for myself; two for others, including K. Barry Sharpless, who once made a remark about AI that was reminiscent of some of the ideas in CA) over the years. It's even better than "Surfaces".

cjbprime 7 hours ago 0 replies      
This is the article referenced as pending publication in http://www.aeonmagazine.com/living-together/james-somers-web..., which is incidentally my favorite article about startups.
nathansnyder 12 hours ago 1 reply      
Loved this quote: "...the trillion-dollar question: Will the approach undergirding AI today, an approach that borrows little from the mind, that's grounded instead in big data and big engineering, get us to where we want to go? How do you make a search engine that understands if you don't know how you understand? ... AI has become too much like the man who tries to get to the moon by climbing a tree: 'One can report steady progress, all the way to the top of the tree.'"
jmilloy 6 hours ago 1 reply      
I'm not convinced that Hofstadter is pursuing computers that think like humans, so much as computers that appear to think like humans. He abstracts certain observable behaviors of the human mind (e.g. analogy), but there's no guarantee that what a brain can observe about itself is what a brain is actually doing. Does it make sense to ignore the underlying behavior of human brains, and instead try to directly emulate a particular abstraction? We can't let our romantic notions of what brains "do" get in the way.
sinkasapa 10 hours ago 0 replies      
I really enjoyed this talk of his on analogy in human language.


ArbitraryLimits 10 hours ago 0 replies      
It's interesting to see this article portray Hofstadter as the last of the dying breed of GOFAI researchers.

When I was in college (and GOFAI was still alive) GOFAI researchers themselves portrayed him as very much an outsider.

atlanticus 11 hours ago 0 replies      
I think a big part of the problem with AI is that you are trying to map a digital model onto an analog system. There was a story on HN last year, which I can't seem to find, that used a genetic algorithm on analog circuits to evolve optimal pattern matching for certain images. The results were good, but when they went to build another one it didn't work right because of unmeasured EM feedback and subtle differences between individual circuits, meaning every circuit would have to run its own evolution, negating most of the usefulness of the project. Maybe an analog model would be more appropriate.
duwease 11 hours ago 2 replies      
Considering the large and growing bank of research that highlights areas where the brain's output is flawed or plain wrong when compared to the consensus "optimal solution", I think I'm with the "AI establishment" as it's painted by this article. It doesn't seem self-evident to me that the inner workings of the human mind are the only or even optimal implementation of intelligence for every task.

If anything, the human mind seems to me to be a particular algorithm that is flexible, but trades that flexibility for capability in certain problem areas. Using a transportation metaphor, it's like walking versus air travel. Walking is incredibly flexible when it comes to where you can go, but air travel is by far the optimal route to get from coast-to-coast, although you are limited to travelling between airstrips. I feel like focusing on the human brain as the "true" intelligence is like claiming that walking is the only true transportation, instead of focusing on optimal routes for each problem.

9wymanm 7 hours ago 1 reply      
The father of psychology, William James

I was under the impression that Wilhelm Wundt was the father of psychology.

Twitter to offer 70M shares priced at $17-$20 to raise up to $1.4B in IPO techcrunch.com
44 points by coloneltcb  7 hours ago   28 comments top 10
cperciva 6 hours ago 1 reply      
Several of the company's directors and executives are selling stock. Chairman Jack Dorsey is selling roughly .6 percent of the company and will own 4.3 percent after the offering, making his share worth around $434 million. Co-founder Evan Williams is selling 1.6 percent of the company and will own 10.4 percent after the offering. Williams will be worth just over $1 billion if the shares go at a median price of $18.50. CEO Dick Costolo will own 1.4 percent of the company after the IPO.

I don't see any sign that anyone is selling stock... these numbers are all consistent with 13% dilution due to Twitter issuing new stock.

c2 6 hours ago 5 replies      
It seems like the trend lately is to float a tiny number of shares to the public. From my perspective this creates an artificial supply constraint for the stock and makes higher valuations easier, as you need less institutional buy-in to maintain the price, and a few good quarters can result in disproportionate gains in the market.

Can anyone comment on that or shed some light? As a potential investor, those factors make me shy away from these investments, as it makes the stock more volatile and puts the fate of the stock in a few large holders' hands.

rpedela 5 hours ago 1 reply      
Unless I am reading the prospectus wrong, it looks like they haven't made a profit yet and the losses are much larger this year than last. I find that troubling for a software company wanting to IPO.
unabridged 3 hours ago 2 replies      
$11B seems like a low valuation. Groupon is still worth $6B and I don't think it's anywhere near half as valuable as Twitter. Twitter does have a problem monetizing the core business; the only way seems to be to charge celebrities and businesses to send out tweets (advertisements), but that will probably hurt them a lot.

But with all that data and all those users and celebrities, there is so much possibility and so many directions it can go. If I owned them, the front page would be all entertainment news: hire bloggers to scan through tweets and write stories about the latest gossip. Why let TMZ and other entertainment news outlets get all the ad revenue by writing about the info coming directly out of your site? Twitter should be the #1 webpage for celebrity news/gossip. The kind of people who really use Twitter are obsessed with this garbage; it seems only natural.

sliverstorm 5 hours ago 2 replies      
What is Twitter going to do with $1.4B? Do they have some incredibly expensive investments planned?

Companies offer stock as an investment opportunity to raise capital. Right?

codex_irl 4 hours ago 2 replies      
This is probably a silly question - but can regular people buy Twitter stock as soon as trading begins?

I am thinking of buying a very small number of shares and "gifting" them to my sister, a super user of their service... more for novelty than as part of any serious investment strategy.

yeukhon 48 minutes ago 1 reply      
Average consumers like us can't buy stock, right?
knodi 1 hour ago 0 replies      
Was expecting more like $8-$12 not this high.
bluedevil2k 5 hours ago 1 reply      
It's interesting that Biz Stone, the face of Twitter for a few years, isn't listed on the prospectus as a large shareholder.
6thSigma 4 hours ago 0 replies      
Anyone planning on buying Twitter for around the IPO price?
Tesla Snags Apple VP Of Mac Hardware To Lead New Vehicle Development techcrunch.com
56 points by ot  8 hours ago   25 comments top 3
codex 7 hours ago 1 reply      
Before Apple, Field was VP of Design & Engineering and CTO at Segway.
c2 6 hours ago 1 reply      
Interesting move. While I can see some overlap in experience leading large technical projects where industrial design and battery life are of paramount importance, I wonder how much of that experience will translate into actual car development.

From the outside looking in, I'd rather fill that role with someone with car-industry experience who has brought actual cars to market, because battery life and industrial design are somewhat fungible; but if Tesla is late in bringing car models to market, that has a serious effect on their timelines.

loceng 7 hours ago 1 reply      
Build Business Logic in Minutes, Not Weeks copperthoughts.com
41 points by ihodes  7 hours ago   18 comments top 10
ebiester 3 hours ago 2 replies      
I get it, in principle, but in every project I've been in, actually nailing down the business logic itself is the tricky part, more than coding it up. It usually goes like this.

"I want it to work like A."

Code it up; find twenty potential edge cases.

"Well, 3, 5, 6, and 7 can be solved like this, let's guard to make sure 1, 3, 8, 9 never happen, 10-13 can never happen, and let me get back to you on the rest."

Code it up; notice a few more edge cases.

Lather, rinse, repeat.

Along the way, one of the edge cases will show me that what they really want is B, and time and budget let me know if we start down that path.

toddmorey 45 minutes ago 1 reply      
As an extended state machine, it's certainly an interesting concept. I also love that you describe the components of your workflow in JSON, but the logical pieces are, I think, where this approach struggles.

I mean I get what's going on here, but the syntax clutter drives me nuts.

[["if", "eq", "decline"], "Pending Response"]

ihodes 7 hours ago 0 replies      
This has been my project for the past few months, in one form or another. I'm planning on making a video to show just how easy and extensible this system is, as well as launching to a few small companies in the coming weeks. I'd love to answer any questions/take any criticism. Crocker's Rules (but that's a given around here :)
memracom 1 hour ago 1 reply      
No, building business logic takes years, and even then it is not done because the business needs demand that it be changed yet again. Business logic is never correct. At best it is just good enough for now. This is why established businesses place so much emphasis on change management. They know that change is constant and that they need changes to be carefully managed in order to adapt fast enough, yet not break anything.
tarr11 7 hours ago 1 reply      
Is this a JSON oriented workflow engine?

My experience with workflow engines (as a developer) has been ... not great. They are hard to debug, and difficult to code once the logic gets complicated.

gfodor 7 hours ago 1 reply      
Amazon has a killer offering for this problem in the form of SWF. It's fairly under the radar as they have done a terrible job marketing it, but it's great.
krmmalik 6 hours ago 1 reply      
I'd love to chat with you more about this. I don't know if we could benefit since we're not building a business app even though it has a fair amount of business logic planned in the spec.

I've signed up to your mailing list anyhow. (Khuram Malik)

Look forward to chatting to you.

elwell 3 hours ago 0 replies      
"in Minutes"... no.
ericHosick 7 hours ago 1 reply      
What external APIs are working in Copper? Stripe? Twilio? etc.?
ppadron 4 hours ago 0 replies      
Also, NoFlo: http://noflojs.org/.
Level Up Your Shell Game viget.com
59 points by dce  9 hours ago   29 comments top 13
agscala 9 hours ago 3 replies      
I didn't see Ctrl-r, but I've found that to be the most insanely useful shortcut.

Ctrl-r = reverse history search. Type a partial command after Ctrl-r and it'll find the most recent executed command with that substring in it.

Press Ctrl-r again to jump to the next-oldest command containing your substring. Accidentally pressed Ctrl-r one too many times? Press backspace to move forward in history.

bstpierre 6 hours ago 1 reply      
IMO the aliases for git should be in ~/.gitconfig instead. I have a bunch of these, like:

    [alias]    br = branch    co = checkout    ci = commit -v    sci = svn dcommit --interactive    cp = cherry-pick    l = log --pretty=oneline --abbrev-commit    l3 = log --pretty=oneline --abbrev-commit -n3    lm = log --pretty=oneline --abbrev-commit master..    rc = rebase --continue    st = status -sb    squash = !git rebase -i --autosquash $(git merge-base master HEAD)
Also, I prefer to set aliases in my ~/.functions instead of in ~/.bash_profile or ~/.bashrc. I find that this makes it easier to move the .functions file from one machine to another, especially on a lab/test machine with a shared account where I shouldn't be modifying things in the shared ~/.bashrc. To make this work, you can add this to your ~/.bashrc or ~/.bash_profile:

    if [ -f ~/.functions ]; then . ~/.functions; fi
This will source your .functions file if it exists when your .bashrc is run.

A tweak to the "editbash" suggested alias will make it so that you don't have to reopen your terminal. My equivalent alias is "vif", for "vi .functions":

    alias vif='vi ~/.functions; . ~/.functions'
Note that the second command (after the semicolon) sources the modified .functions file.

Lastly: brevity is king. I love 'alias psgrep="ps aux | grep"', since I use it several times a day, but to "level up your shell game", keep it short. My alias for this command is "psg". The other alias that I use all the time is "d" -- "alias d='ls -lFh --color --group-directories -v'".

patrickmay 8 hours ago 1 reply      
The command line navigation commands are just what any Emacs user would expect. vi users can set their $EDITOR to 'vi' to get those commands.

What do you mean you've never used Emacs? mumble whippersnappers mumble

adamnemecek 4 hours ago 1 reply      
> Note: you'll need to open a new Terminal window for changes in ~/.bash_profile to take place.

Alternatively, you can just do '. ~/.bash_profile' or 'source ~/.bash_profile'.

molecule 8 hours ago 1 reply      
aliasing git shortcuts seems more appropriate for git config:


...after aliasing git to 'g' in your shell config, of course :)

    alias g='git'

deckiedan 6 hours ago 1 reply      
Interestingly and relatedly, a lot of the Emacs-style keybindings (Ctrl-a, Ctrl-e, Ctrl-k, etc.) are system-wide in OS X. I prefer vim as my daily editor, but they are often useful. You can also make the bindings even more Emacsy if you want.


dbbolton 5 hours ago 1 reply      
They should really mention which shell and terminal emulator they are using. Not everyone is using the same "Unix command line".
rwl4 8 hours ago 1 reply      
My favorite shell trick (not in the link) is this: ~-

Tilde-hyphen expands to the previous directory you were in, and of course "cd -" returns you to your previous directory, so I put them together all the time.

Here's an example workflow (with a fake PS1):

  mac:/Users/me/Projects/my_new_app$ cd ~/.pow  mac:/Users/me/.pow$ ln -s ~- .  mac:/Users/me/.pow$ cd -  mac:/Users/me/Projects/my_new_app$
Now I can continue working on my app.


That's bit of a contrived example above. Here's a more realistic way to do a symlink for pow:

  mac:/Users/me/Projects/my_new_app$ ln -s `pwd` ~/.pow/

robrenaud 7 hours ago 0 replies      
Here is a rule for how to level up your shell/editing game. If you ever touch your arrow keys, you are doing something wrong.

> ctrl + left arrow: Moves the cursor to the left by one word
> ctrl + right arrow: Moves the cursor to the right by one word

Alt + b and Alt + f are also aliases for the same action.

yonaguska 7 hours ago 1 reply      
It's probably worth mentioning that all the delete functions he points out (all of them, for that matter: ctrl-k, etc.) are actually cuts, so you can paste the text back as well. I find ctrl-u especially useful when I'm halfway through a command and realize I wanted to do something else before executing it, so I cut it, then paste it back when I need it.

* Ctrl + y to paste anything back
bglazer 8 hours ago 0 replies      
I like the commands, especially the ssh one, which I didn't know before and will certainly use in the future.

I also enjoyed the format of the article. A whole dev team each contributing their own piece to a blog post provides a lot of different voices and styles in a concise way.

enahs 3 hours ago 0 replies      
I knew about the sudo !! but the sudo !$ will come in handy! thanks!
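For anyone unfamiliar with the history designators: `!!` expands to the entire previous command and `!$` to its last word. A quick interactive transcript (the paths are made up for illustration; history expansion only fires in interactive shells):

```
$ mkdir /usr/local/widgets
mkdir: /usr/local/widgets: Permission denied
$ sudo !!
sudo mkdir /usr/local/widgets
$ sudo chown me !$
sudo chown me /usr/local/widgets
```

The shell echoes each expanded command before running it, which is a handy sanity check.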
Show HN: Nunjucks - Node templates with inheritance, asynchronous control jlongster.github.io
36 points by jlongster  7 hours ago   9 comments top 4
coolsunglasses 2 hours ago 1 reply      
The Clojure alternative (simpler): https://github.com/yogthos/Selmer/
darrhiggs 6 hours ago 2 replies      
Personally, I settled on swig[1] for a recent project. My reasoning: Nunjucks may contain more features, but I prefer a constrained set of features that forces me to think twice before I implement and use logic in a view.

It's a great project anyway and it did take 2-3 weeks to decide between the two. Good luck with the project!

[1] http://paularmstrong.github.io/swig/

gkoberger 6 hours ago 1 reply      
Very cool, James!
elwell 3 hours ago 0 replies      
:O the background sparks are different every pageload.
A Look at Ads on Instagram instagram.com
50 points by yeukhon  8 hours ago   23 comments top 10
wmeredith 8 hours ago 2 replies      
This is a really good looking ad. Do you know what the problem with ads like this is? They don't work. Ads that don't stand out, that blend into a seamless UX that allows a person to navigate a content company's media platform quickly in a goal-oriented manner, don't fucking work. They get ignored.

So the marketing company or the sales dept turns up the juice a little bit. They allow them to stand out a little more. They give them special bells and whistles the user created content doesn't have. They let them animate, float over content or auto play video or sound.

Why? Because it makes the product a little better for the customers: the advertisers. Instagram, just like its Facebook overlords, is now in the business of making its product as good as it can be for the advertisers while keeping the users just happy enough not to leave.

I really dislike this cycle in the startup world. I don't know a good way around it. I'd like to see someone disrupt that.

kelnos 8 hours ago 4 replies      
I would gladly pay some yearly fee to be able to use Instagram (and most of the services I use often, really) ad-free. I really wish this was an option.

I assume either a) it's prohibitively expensive to implement such an ability, based on expected revenue vs. lost ad revenue vs. lower user engagement from ads degrading the experience, or b) people are less willing to pay to put ads on your service at a given price if there is a segment of your user base that is guaranteed not to see them.

kyro 7 hours ago 0 replies      
It'd be interesting to see them monetize on user content. With all the pictures of cups of java, bowls of salads, plates of sushi and other food items that are geotagged, Instagram could start including ads for promotions at these restaurants or similar highly rated restaurants near me. I'd find that sort of advertising useful and enjoyable.
GBKS 7 hours ago 0 replies      
Interesting how both Instagram and Pinterest are starting this transition at pretty much the same time with a very similar approach (http://blog.pinterest.com/post/61688351103/planning-for-the-...). Tumblr is already down the exact same path of inserting native-looking content with a small "sponsored" label.

I'm sure big, sexy brands will have no problem creating some nice-looking visuals that will fit right in. The question for me is whether this will ever work for the long tail of advertisers who are currently tweaking their keywords on AdWords. Anybody have any insight or experience with this?

pwhython 7 hours ago 2 replies      
I just invented something (probably not). Voluntary Ad Photo Submissions. Someone takes a photo of their friend drinking Pepsi. Posts it, tags the advertiser: @pepsi. Use a hashtag, whatever. Pepsi looks through their submissions, selects a photo they like and uses it for their Instagram ad campaign. BAM. Now it's a game. The "winner" is popular, they get the likes. Pepsi gets tons of free advertising from everyone posting Pepsi photos.
loceng 8 hours ago 0 replies      
The ads that get results won't be as subtle as this, and without that subtlety you'll start to hurt positive metrics. It's the game where users expect 100% of the value to flow toward them, and you must play it to win; otherwise they are mobile and will simply use a new service.
aram 8 hours ago 1 reply      
It was expected, especially after those articles showing how Instagram has been used as a selling platform in the Middle East[0].

[0] Just one of them: http://www.digitalks.me/social-media-marketing/instagram/how...

kepano 7 hours ago 0 replies      
So the bait-and-switch begins.

There are a number of startups, including Nitrogram, Olapic and Pixlee, that seem to be doing well selling analytics, user-generated content, customer service tools and branded contests based on Instagram. Those business models seem like a much better fit for the platform.

Sad to see Instagram take the lazy route.

ewolfe 3 hours ago 0 replies      
I don't understand why they don't just charge accounts with more than X followers. High profile brands are (currently) getting a massive amount of free advertising. I'm sure they would be willing to pay to keep up their profile.

See Mailchimp, free up to 2k subscribers, paid after that. It's so simple.

slashCJ 4 hours ago 0 replies      
Is it just me, or does it seem like every time Twitter announces something about the IPO, Instagram writes a blog post about ads?
       cached 25 October 2013 04:02:01 GMT