Hacker News with inline top comments: Ask (14 Aug 2014)
Ask HN: What's Paul Graham been up to?
17 points by peapod91  19 minutes ago   5 comments top 4
kazinator 6 minutes ago 1 reply      
He's designing an even better Lisp dialect in which lambda is spelled f (down a full 50% from fn!) and let bindings do not require any parentheses at all:

    (let a 1 b 2 c 3 (+ a b c)) -> 6.

tomhoward 9 minutes ago 0 replies      
He's quite active on Twitter:


Other than that he's probably just enjoying spending quality time with his young family after a relentlessly busy few years.

Robby2023 4 minutes ago 0 replies      
It's actually been 137 days since he made his last comment on HN:


However, he is very active on Twitter.

Ask HN: Why don't more apps use peer to peer networking?
32 points by api  4 hours ago   27 comments top 14
drewcrawford 1 hour ago 2 replies      
Let me give you a comprehensive answer on this that goes back to first principles. I've worked on apps like SnapChat, so I am probably pretty close to an authority on why apps like that don't use p2p.

The first problem is that mobile devices are pretty much inherently asynchronous. There are apps that you would use at the same time as another person (like real-time games) but especially on cellular, lag is an issue. This pushes people into designing products that can tolerate lag measured in seconds (because that isn't shockingly bad performance on cellular networks for apps that use your standard off-the-shelf tools like REST/HTTPS/AWS for example). This produces a lot of asynchronous, or semi-asynchronous applications.

Now partly due to those product designs, and partly due to people having lives, they use these apps asynchronously. You pull out SnapChat, fire off a message, and go back to reading Reddit or whatever. Snapchat is off. There's no way to reach you.

Okay, so why don't we run SnapChat in the background? Well there are layers of reasons. The first layer is that it costs energy, and the mobile revolution is in large part possible because software and hardware developers got very aggressive about energy management. If we ran things in the background like you do on your laptop it would need to be as big as your laptop and plugged in regularly like your laptop. There are also practical problems, like announcing every time your network address changes, or even figuring out when your network address changes, which is hard to do passively. I'm glossing over some networking details but there's a deep principle here that within the design of existing TCP/IP/cellular stack you can't have reliability, energy-efficiency, and decentralization. You must pick 2.

Apple, very presciently IMHO, has decided to legislate a lot of rules about background processes that I don't have time to go into here but basically they try to regulate the types of work that apps can do in the background to only the energy-efficient kind. The rules are actually pretty well-designed but they're still rules and they restrict what you can do. Android doesn't have this limitation but unless your product is Android-only you're going to comply with the iOS rules when you design a feature that communicates across platforms.

Okay, so we can't run Snapchat in the background. But what if two users happen to have it open? We can use p2p then right?

Well sure. But the user may be on a network that blocks p2p traffic. That is their network's fault, but they still e-mail you to complain, and leave bad reviews for your product, because as far as they can see "the Internet works" so it's your app's fault.

So what you do is you design a scheme that uses p2p by default and client-server as a fallback. There are actually apps that work like this. Problem here is, instead of getting support tickets about it not working, now you get support tickets about it being slow.

And there are ways to solve this, like permanently giving up on p2p after a certain number of failures, for example. But the first experience is still pretty bad, which is what counts in mobile. And I remind you, this p2p feature is already a scheme that only works in the 0.3% of cases that users actually have the app open at the same time, and now you want to add code that disables p2p in even more cases than it's disabled already. This process continues until basically zero actual customers will ever use the feature.
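The "p2p by default, client-server as fallback, give up on p2p permanently after repeated failures" scheme described above can be sketched as a small piece of client logic. This is a minimal illustration, not any real app's implementation; the class name, threshold, and callbacks are all hypothetical.

```python
class TransportSelector:
    """Try P2P first, fall back to client-server, and permanently
    disable P2P for this install after too many failures."""

    def __init__(self, max_p2p_failures=3):
        self.max_p2p_failures = max_p2p_failures
        self.p2p_failures = 0
        self.p2p_disabled = False

    def send(self, message, try_p2p_send, server_send):
        # try_p2p_send returns True on success, False on failure;
        # server_send is the client-server path, assumed to always work.
        if not self.p2p_disabled:
            if try_p2p_send(message):
                return "p2p"
            self.p2p_failures += 1
            if self.p2p_failures >= self.max_p2p_failures:
                # The first experience was already bad; stop trying.
                self.p2p_disabled = True
        server_send(message)
        return "server"
```

Note how the failure counter only ever shrinks the set of users who exercise the P2P path, which is the "process continues until basically zero customers use the feature" dynamic the comment describes.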

And we haven't even gotten to cases like "Why didn't this message get delivered to all my devices?" because there is just zero chance that any customer, anywhere, will have all his devices turned on at the right time to receive incoming p2p connections.

Now non-messaging products like Spotify or Netflix are more plausible, but you still have to ask who wins here. The customer experience is worse, both because of connectivity problems and because of the bandwidth bills and energy-efficiency losses that come with rebroadcasting content to other users. Developers are worse off because they probably have to build both client-server and p2p architectures, because p2p isn't reliable enough on its own. Support is worse off because almost any issue is potentially a p2p-related issue: have you tried disabling p2p and seeing if the issue persists?

There's really no reason, certainly no compelling business case, to inflict that much pain on any mobile product I can think of. I mean, there's probably a place where p2p makes sense--we live in a big world--but in general it makes things much worse for everybody.

ownedthx 2 hours ago 0 replies      
I have worked on multiple applications that use P2P networking, and the fundamental problem is this:

Not all networks support the ability to P2P network, or, if they do, they require intervention by the user.

So you have two issues: some users will never get to use your application, and for those that could potentially, they will likely need customer support to help them configure their network correctly.

Corporate networks are the worst for this. They aren't going to change their rules for your application (yes, they might, but don't assume that starting out).

A much greater percentage of home networks can support P2P networking, but your application probably needs to support STUN as well as UPNP.

Some number of home routers won't work ever, or can if you configure them correctly. And that's where it gets messy. Is it enough to tell a customer to 'go figure it out' when it pertains to router configuration? You might get away with it with PC gamers, but I'd argue any other segment of people will have no idea how to do it and will need some help. So now you have to try to figure out the end user's home router configuration as best you can remotely. That's a huge drain on customer support resources, which in a small startup usually means the developers.

So why go P2P with all these headaches? Unless you have a really strong reason to use P2P, like keeping latency down between peers, you don't bother.

sliken 19 minutes ago 0 replies      
1) Users don't care.
2) It's complex, so it costs quality developer time. Even then there are no guarantees.
3) The old-school approach is cheap/easy/fast, additionally enabled by fast/quality clouds that allow scaling relatively easily.
4) An increasing number of clients do not accept incoming connections, because of IPv4/NAT or because of cellular/mobile/WAN connections that don't accept incoming.
5) It's a competitive market for apps, web services, and related. Any increase in latency or increase in support calls is prohibitive. If even one in 10 people needs support for port forwarding, that's a deal killer.
6) In an increasingly mobile world, incoming connections and anything that hogs battery (bandwidth, or even just being awake) is a disadvantage.
7) There's no easy money in P2P. No monthly data plans, centralized towers, centralized servers, etc. Sure, a mesh network of smartphones with millions of clients could do cool things. But where do AT&T/Verizon make money? Without AT&T/Verizon, is Samsung going to make a p2p/mesh network phone if they need to sell millions to break even?

As an example, Skype years ago, with mostly desktop/laptop clients, was largely p2p (just login was centralized). With increasing numbers of tablets, WAN connections, and smartphones they switched to central servers.

So sure, you might be able to spend a man-year and get an awesome, robust, and performant solution. But your competition will have spent that time actually making users happier, and will steal your market.

max0563 38 minutes ago 0 replies      
It really comes down to the fact that it's hard. Creating a secure P2P network takes a lot of time, a lot of smart people, and a lot of money, and especially on mobile the design is not reasonable. However, a semi-P2P network could be an option. For example, a snapchat is sent and the app attempts to send the message directly to the user. If the app is open there is no problem, but if it isn't, the message gets sent to a Snapchat server and saved for later. This is a design I used for a P2P email project I worked on, and it worked.
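The semi-P2P design described above (deliver directly if the recipient's app is open, otherwise park the message on a server for later pickup) reduces to a few lines of routing logic. This is a hedged sketch; the data structures standing in for "online peers" and "server queue" are illustrative only.

```python
def deliver(message, recipient, online_peers, server_queue):
    """Direct P2P delivery when the peer is reachable,
    store-and-forward via a central server otherwise."""
    if recipient in online_peers:
        online_peers[recipient].append(message)  # direct P2P path
        return "direct"
    # Fallback: queue on the server for delivery when the app next opens.
    server_queue.setdefault(recipient, []).append(message)
    return "queued"
```

The hard parts in practice are everything this sketch hides: knowing reliably whether the peer is online, and traversing NAT to reach it, which is why the server path tends to win.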
jdietrich 15 minutes ago 0 replies      
Servers are dirt cheap, developers and customer service reps are very expensive. P2P solutions have the potential to impose high reputational costs if users start getting angry about data or battery usage, or experience connection problems. In a very few edge cases (very high bandwidth use, very low customer value) it might make sense to use P2P, but in those cases you probably don't have a real business proposition.
supercoder 3 hours ago 0 replies      
Because there's never a guarantee the app is open, especially on mobile.

If we take your Snapchat example , if one user is trying to send a message to another and their app is closed or their device is off, where does the message go ?

On the desktop you have the luxury of mostly running in the background to pick up the ping, but it's almost never the case on mobile.

ColinCera 4 hours ago 1 reply      
My guess is it's a combination of development complexity and (probably more importantly) firewall/router issues.

It's very hard (impossible, really) to deploy P2P technologies on a mass scale without thousands of users encountering problems with their routers & firewalls.

For mass market products, you can't get away with asking people to whitelist your app, make sure port 28777 is open for UDP, etc.

Many P2P systems have freeloader issues, which can usually be resolved/avoided/ignored on desktop systems, but when you add mobile into the equation with its paltry bandwidth limits and sky-high overage charges the potential for it to become a problem is much greater.

aikah 1 hour ago 0 replies      
1 - complexity

P2P is just hard, and not that pertinent for most products. And of course there is the network issue with special ports and UDP that makes these solutions sometimes impossible to deploy in the enterprise world.

Thiz 14 minutes ago 1 reply      
I'd like a p2p chess app to play with my brother using our mobile phones and nothing else.

Is that even possible?

csmdev 3 hours ago 0 replies      
It's because startups rarely have time and resources to create a solid infrastructure from the beginning. Easy and simple are preferred. The product needs to be launched and validated as fast as possible.

So by the time a product becomes successful, its core is already using inefficient solutions. And very few companies upgrade them, because the effort is not really worth it business-wise.

If we were driven by real solutions instead of money, then P2P would probably be king. Direct communication everywhere.

wmf 4 hours ago 1 reply      
P2P is definitely complex to develop. And on phones P2P sucks battery life due to keepalives between peers; this is one of the reasons Skype is removing P2P. P2P push notifications could help, but they'll probably never exist.
bigmickey 1 hour ago 0 replies      
Check out maidsafe.net - those guys think that the internet should have been designed as peer-to-peer in the first place!
mikeash 3 hours ago 0 replies      
Developer time is one of the most important things to optimize. Bandwidth is really cheap by comparison.

P2P isn't completely reliable. There are many cases where you can talk to a server but you can't talk to a peer, ranging from evil firewalls to excessive layers of NAT to simple things like the target device being offline. Thus, you must code a fallback that talks to a server if you want reliability. This server fallback will work for all situations, so it's necessarily easier to just use it for everything and not bother to code the P2P bit at all.

P2P is also really hard to do well. It's pretty easy to do poorly: have one device tell the other device what its IP address is and a port to connect to, then connect to it. In practice, this fails about 99% of the time because approximately all consumer internet users are behind NAT these days. So then you enter the wonderful world of NAT traversal meaning you have to deal with horrifying things like UPNP, NAT-PMP, and STUN. And this is when both sides keep the same IP address throughout the connection! Now consider when your smartphone user goes from Starbucks, where he has WiFi, to the bus, where he only has LTE, to home, where he has WiFi again.
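The NAT-traversal ordeal the paragraph above describes is usually tamed by gathering several candidate addresses (local, UPnP-mapped, STUN-discovered, relayed) and trying them cheapest-first, roughly what ICE standardizes. The sketch below shows only that prioritization logic; no real networking is performed, and the candidate kinds and connectivity check are hypothetical stand-ins.

```python
# Rough preference order: a direct LAN address beats a UPnP mapping,
# which beats a STUN-discovered reflexive address, which beats a relay.
PREFERENCE = {"host": 0, "upnp": 1, "stun_reflexive": 2, "relay": 3}

def pick_connection(candidates, can_connect):
    """candidates: list of (kind, address) pairs.
    can_connect: callable simulating a connectivity check.
    Returns the first candidate that works, cheapest first."""
    for kind, addr in sorted(candidates, key=lambda c: PREFERENCE[c[0]]):
        if can_connect(addr):
            return kind, addr
    return None  # total failure: fall back to client-server
```

When the phone hops from WiFi to LTE and back, the whole candidate set is invalidated and this dance starts over, which is the point the comment is making.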

Bandwidth is cheap. Let's say it would take Snapchat one developer-month to implement this, or about $10,000. (I'd wager this would be a strong underestimate, both in terms of time required and the cost of that time.) Amazon S3, to take a random example, charges 12 cents per GB of outgoing transfer to the internet at lower use levels. You can buy 83TB with that $10,000. If your typical Snapchat image is 1MB (they're low resolution, right?) then you'd have to P2P 83 million images before you broke even on the investment. Factor in a more realistic timeframe, a more realistic cost, and the opportunity costs of not having that developer work on something more useful, and the break-even point goes up by an order of magnitude or more.
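The back-of-the-envelope math above checks out (using the comment's 2014-era figures and its decimal 1 GB = 1000 MB approximation):

```python
dev_cost = 10_000   # one developer-month, in dollars (comment's figure)
s3_per_gb = 0.12    # S3 outbound transfer, $/GB at low tiers, circa 2014
image_mb = 1        # assumed size of one Snapchat image

gb_bought = dev_cost / s3_per_gb        # ~83,333 GB, i.e. ~83 TB
images = gb_bought * 1000 / image_mb    # ~83 million 1MB images
print(round(gb_bought), round(images))
```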

P2P does get used where it pays off well. That's either high-bandwidth stuff or low-latency stuff. WebRTC does P2P whenever it can. Apple's FaceTime does P2P when they're not disabling that functionality to placate patent trolls. Skype does (or at least did, I seem to recall some changes) P2P for audio and video. And of course nothing beats BitTorrent for sending massive amounts of data to large numbers of people. But it just doesn't pay off unless you're really sending a lot of stuff.

bluedino 1 hour ago 0 replies      
Why would they?
Whatever Happened to BeOS?
10 points by forca  6 hours ago   10 comments top 6
dsr_ 6 hours ago 1 reply      
When BeOS was becoming interesting, it was competing against MacOS 8 and Linux (assuming you had already ruled out Windows and you couldn't/wouldn't pay for a UNIX workstation).

MacOS had commercial apps and a good reputation in certain fields. The hardware was expensive by PC standards but high quality.

Linux was growing like a weed: adding just a little more RAM to the kind of desktop a college student could afford let you do everything that the Sun workstations in the computer lab could do, and you never had to share it.

If BeOS had been free, compatible with less-blessed hardware, and not offended Apple so much, it might have had a chance.

CyberFonic 2 hours ago 0 replies      
Like most companies of that era, they built their own hardware (BeBox) which was very expensive (compared to the PCs of the time). Later they moved to Apple hardware and Apple did try to buy them but they held out for more money.

Linux took hold because you could pull any old PC out of the junk pile and load it up and go. BSD, at the time, was fracturing into multiple versions and having legal troubles.

Haiku alpha 4 is awesome. But there are too many other projects out there so there aren't enough resources to make faster progress.

The Wikipedia page http://en.wikipedia.org/wiki/BeOS has a lot more good information.

morganvachon 5 hours ago 0 replies      
I was a heavy BeOS user during its last days, and from what I remember, Microsoft put some serious pressure on certain OEMs to not sell BeOS on their computers. Compaq was one of those; they were told in no uncertain terms that if Compaq sold even one computer with BeOS on it, they'd never sell another computer with Windows, period.

That's certainly not the only reason the company went under, but it was a major factor.

wmf 5 hours ago 0 replies      
Partly they just didn't have enough money to build an OS, so every time they pivoted it put them further behind. And they pivoted a lot of times.

I think they made a lot of design decisions that were overfitted for 1995, so if they had survived they would have quite a bit of legacy cruft by now. And the brittleness of C++ may have made the cruft worse.

In some cases their idealism seems to have held back practical adoption. Their "pervasive multithreading" made it very difficult to port Java and Mozilla. (The idea of "only native apps, no ports" made perfect sense in the siloed PC market of 1993. By 1997, not so much.) Treating all developers equally meant that professional developers may have gotten shortchanged.

0x0 6 hours ago 1 reply      
My bet: a lack of a killer app, plus no compatibility with killer apps on other platforms.

OS/2 was slightly more compatible with existing apps and therefore had slightly more success for a while.

api 4 hours ago 0 replies      
I think the problem was the same as that faced by languages like D competing with C++11:

It wasn't "better enough" than the alternatives.

It was better than NeXT (what would become OSX), but not enough to be revolutionary. It was better than Windows, Mac Classic, and Linux, but also not enough to be revolutionary.

The existing closed incumbents won out by market share and inertia, and Linux won out by being free, open, and by basically going viral.

If BeOS had been open and free I think it would have given Linux a run for its money, but it wasn't.

Ask HN: Do you have a dream job?
23 points by neilsharma  10 hours ago   21 comments top 11
dccoolgai 7 hours ago 0 replies      
I'm a web architect / software engineer at a language learning company.

I got the job by a referral of an acquaintance from a local software Meetup.com group that I attend regularly. When I got frustrated with my last job, I approached the leaders of the Meetup group (who I sort of knew as acquaintances by that point) and said "Hey guys, I'm sort of looking around now. Let me know if you hear anything." A few weeks later, one of those guys made an email introduction to me of another guy who works at the company I'm currently at. He brought me in for an interview... at first, I was just kind of going to the interview as a courtesy - I was pretty sure I didn't want to work here - but when I met the guys and the boss I would be working with, I was sold pretty quickly. They seemed like really fun, intelligent guys when we met and they are. If there was a point in the interview that sold it to me, it was probably the point where I asked them if we should start the whiteboard coding portion of the interview and they waived it off, saying "That's just for people who we think are bullshitting us. We can tell you know what you're talking about and what you're doing."

I just got back from the beach for a week, and while it was nice to be away for a little bit, I was genuinely excited to get back in the office on Monday. I don't remember ever feeling that way at any other job I've had, so I guess I have to say this is my "Dream Job". That doesn't mean I want to do it forever - I'm a firm believer that even the "Dreamiest" job doesn't beat working for yourself, but I have to say I really enjoy it. Unless someone offered me something on the order of 2-3x what I make here, I wouldn't consider leaving (and I make a pretty decent amount for my region/experience).

What makes it that way?

1. Boss. Best/most competent guy I've ever worked for. He somehow has the "magic touch" of keeping the team focused on things we can deliver, calling bullshit on all the paper-pushers and meeting-mongrels that try to sap our time ("No one on my team has a company phone. No one in this company has the right to interrupt my devs while they're working."), and rolling up his sleeves and coding when we have to stay super-late to get something done (only happened once). He also judges on "body of work" more than individual incidents ("You went out at 2PM and got drunk with your co-workers yesterday? No problem, you usually get your shit done in good order and on time. You showboated about staying until 1AM last night? Bullshit - on a regular basis, you don't get shit done. Try harder.")

2. Open-ness of the team. "You have been messing around with a new library/framework at home for the last couple of weeks and you really like it? Come in and show it to the rest of the team for a couple hours tomorrow. Team discusses it, weighs benefits/drawback/long-term-maintainability...it's in production a week later." "You met an awesome guy at a meetup last week? Bring him in here and interview him. Two weeks later, he's hired."

3. "Goldilocks" company atmosphere. Not "douchy-SV-startup concrete walls and beanbag-chairs", but not "corporate cubicle farm" either. It's a great mix of people from different parts of the world, different age ranges / genders / etc. We have free sodas/coffee, but not free beer. People leave at 6. There is usually a happy hour every couple of weeks where we get together at a crappy bar and have a great time, but no one's social circle consists entirely of coworkers. If you're married/have kids, people tell you what happened at the happy hour the next day so no one feels like they're "left out".

jkarneges 7 hours ago 0 replies      
I had my dream job awhile back, where I was paid to work full time on my own open source project. Essentially my employer was offering the software to customers, and they wanted to make sure it was maintained. It was easygoing, I set my own roadmaps, and I had the freedom to contribute to struggling third-party dependencies as needed (I felt like an open source superman, swooping in). This job lasted for several years until I quit to pursue startup life.

I'm currently the founder of my own funded company, which in a way sounds like a dream job too, but because there are a lot of non-coding distractions and also huge stress levels, I don't think it's really the same. The cushy open source gig was the dream job. :) Maybe after I'm "successful" I'll go back to that.

collyw 8 hours ago 1 reply      
No, I have a shit job. Started great, went downhill after a year.

I loved it the first year, getting requirements, designing and building the database, and web front end for a sequencing centre. Learning a new language, web framework, some JavaScript, using my existing database and coding skills and pushing them further.

Three and a half years on, all I seem to do is fix the same Excel upload errors (it was a temporary quick-fix solution that has always been bottom priority for replacement), go to dumb meetings where they spend half an hour discussing the acronyms used in the menus, and change the colour of items. I never get a chance to focus on anything that takes more than half a day's coding, so basically all the interesting work is now replaced by trivial fixes, usually where the users seem incapable of reading the error message (though with Excel it's impossible to guess what the problem will be).

I was promised a promotion last year, but that hasn't happened yet (I work in Spain, and government cuts mean it's not allowed until we merge with another institute). I have realised recently that it is making me frustrated and deeply unhappy, so I am willing to take a pay hit (on my future wage) to do something that I actually enjoy again.

Ideally I want to go freelance, but there isn't such a big market here in Spain, and the common advice is to build up your portfolio (tricky when your work is in house).

Anyway, as I realised recently, do something you enjoy, and don't get sucked into the management-style thing if that isn't what you want to do. If management are not listening to your suggestions, it is time to get out. The promise of more money has kept me hanging on far too long.

a3camero 8 hours ago 1 reply      
I'm a technology lawyer in Toronto, Canada.

I got my job by starting my own law practice after working at a large tech company + a large law firm. Most lawyers work at firms for several years (or forever) instead of doing their own thing. I did contract programming work for many years before starting my practice so I knew what I was getting into by starting my own business. Without my background in programming I wouldn't have most of my clients. I also get a lot of inbound leads because my website does well on Google (re: former web developer).

My job is awesome because I get to help entrepreneurs start their businesses/keep them running smoothly. I'm independent so I can offer reasonably-priced, fast and flexible service. Another benefit of being independent is that I get to work for businesses that other people might not touch such as Bitcoin-related companies. It's more interesting than being a programmer because I get a higher-level view and work with a wide variety of people. I still keep one foot in the world of programming by building my own services on the side.

mkautzm 10 hours ago 0 replies      
I work as a System admin and a client support tech for a small-ish MSP.

I really thought that this is what I wanted to do. I could solve interesting and new problems every day and be given a chance to explore what is effectively the infinite depths of computing at a very technical level.

I was really happy initially, but as time has gone on, I've become more and more aware that customer-facing IT has a lot less to do with solving technical problems, and a lot more to do with solving social problems: talking to people, setting expectations, and bridging a gap between what should actually be done and what the customer thinks should be done... and I'm not good at that. I want to solve difficult, technical problems. I don't want my job description to be 'I primarily deal with people and technical competence is secondary'.

So I'm not there yet, but I feel this experience has helped really carve out what I do want for a career, so I'm glad I've had the experience. If there is one thing it helped define, it's the quality of people I work with. I get to work with awesome people, and if I could find these kinds of people doing a more technically-focused job, that might be the dream.

shimshim 7 hours ago 0 replies      
- I work for a mental health nonprofit, largest one in the country at the moment.

- I got into IT after being a warehouse worker and "being good with computers" there. I did air guitar to land my first IT job.

- My current job is awesome because as a nonprofit everyone is very huggy-feely, lots of psychiatrists leading the way and whatnot. It is also awesome because I am given free rein to come up with solutions, design new systems and generally tinker with a home-built lab using old equipment I was loaned by this non-profit.

- We get several different models of charity licensing from various vendors in addition to having some of the best negotiators around. A non-profit running all EMC storage, backup and replication? Pretty awesome.

I love the work in general and would do it for peanuts, almost.

normloman 8 hours ago 1 reply      
I teach private music lessons. It's only a second job... I don't think it will ever support me full time. But it's as close to dream job as you can get. I get to share something I enjoy, and help people have fun. And it's only a 10 minute drive from my apartment. I can have a crappy day at my normal job and leave my teaching job energized.
JSeymourATL 9 hours ago 1 reply      
> If some people can identify with your goals and interests, this may help them find a job they love.

There's a solid, recent book on this very subject by Robert Kaplan. Here's his presentation from Talks @ Google, https://www.youtube.com/watch?v=8sY-qwEYjs0

bennyp101 9 hours ago 2 replies      
I've been coding commercially for 12 years now, and in 2008 I landed what I would class as a dream job. It was a contractor position up near Reading in the UK, but became permanent. I spent 4 years having a great time, interesting work, great people to work with, and an awesome culture. Then we got bought out by Oracle. I lasted 8 months and moved back to Kent.

Now I work for an ISP (I worked with one of the directors about 10 years ago) 10mins from home, working on everything that needs doing, and I would say that this is a dream job.

I guess it depends on your circumstances, and what you need at the time. Dream job is such a loose phrase. They have both had their downsides, but in happiness levels they have both excelled!

Edit. I should say that I worked at various places before, and none really compare, but they did give a good grounding and helped me figure out what it was I wanted from a job. (Hint: the money is nice, but after a while job satisfaction and quality of life take precedence.)

paulornothing 8 hours ago 0 replies      
Program Manager overseeing data collection

Washington, DC


My job isn't awesome but I certainly don't hate it. I would much rather be working for a law enforcement agency. But I'm at a solid point in my current career, and I was going to work for the FBI but I would have to take a pay cut and it would take 2 years to get back to the pay I currently make. So add that in with a house and family and I just don't think I'm going to make a career move. So I just try and volunteer places to do criminal analysis and may pursue my PhD.

dspig 8 hours ago 0 replies      
It might not be anyone else's dream job, but I find myself working on things I would anyway be doing for fun (like a specialized bytecode interpreter, audio effect algorithms, various little research projects which happen to improve my Python and Javascript skills...). The only problem is it makes my non-work time feel a bit less interesting by comparison!
Have you been to an online bootcamp or coding school?
4 points by eduluvr  5 hours ago   discuss
9 months ago I created a similar app like RememberWin for Android
6 points by Manapp  4 hours ago   2 comments top
lobotryas 2 hours ago 1 reply      
Looks cool, but are you sure you want to include a screenshot of your personal goal titled "NoFap"?
Ask HN: Hacker News like websites in other languages?
3 points by kiyoto  2 hours ago   1 comment top
humpt 2 hours ago 0 replies      
I don't think techies mind English, and it sure is the only way to get information faster to more people.
Ask HN: What to do when the .com is too expensive?
5 points by hartator  6 hours ago   8 comments top 4
byoung2 5 hours ago 1 reply      
> We have found a new name for our company

Since you are rebranding, you might be better off finding a name where the .com is available just to avoid the hassle of being shaken down if you get big later. Companies that just ponied up the money when they got big are dropbox (formerly getdropbox.com) and facebook (formerly thefacebook.com). It seems like recently you don't even need the .com (e.g. famo.us and socket.io), so you could go that route if your name is short and easy to remember.

notduncansmith 5 hours ago 0 replies      
You're best off prefixing (to get a .com) or going with an alternative TLD. Don't get a .net or .org though, because they just don't have the same appeal. Maybe try one of the new TLDs (.academy, .guru, .sexy, etc)? Also, don't worry about the effect your domain may have on SEO. Focus on solid content marketing and the SEO will come - use paid traffic and a good autoresponder series if you have to spend money on marketing.
cylinder 5 hours ago 1 reply      
Is there any way you can find out who the real owner is?

The broker doesn't want to take your offer because he won't make much commission. He'd rather hold out for a bigger offer later on.

Circumvent the broker. Owner may be more likely to take your offer.

csmdev 5 hours ago 1 reply      
Find another one.

Don't get fixated on things. A business must evolve and adapt in order to succeed. If you are stubborn about a simple name, the web domain is the least of your worries.

Ask HN: What info do you web scrape for?
121 points by cblock811  2 days ago   106 comments top 42
lumpypua 2 days ago 6 replies      
I've had three primary uses of web scraping. The hard part for me has never been speed. Getting the results structured is somewhere between easy and hideously complicated.

1. Reformatting and content archival (lag times of hours to days are no prob).

As an example, I put together http://yareallyarchive.com to archive comments of a ridiculously prolific commenter on a site I follow. I needed the content of his comments, as well as the tree structure to shake out all the irrelevant comments leaving only the necessary context. Real time isn't an issue. Up until recently it ran on a weekly cron job. Now it's daily.

2. Aggregating and structuring data from disparate sources (real time can make you money).

I work in commercial real estate. Leasing websites are shitty, and the information companies are expensive and also kinda shitty. Where possible we scrape the websites for building availability, but a lot of the time that data is buried in PDFs. For a lot of business domains, being able to scrape data in a structured way from PDFs would be killer if you could do it! I guarantee the industries chollida1 mentioned want the hell out of this too. We enter the PDFs manually. :(

Updates go in monthly cycles, timeliness isn't a huge issue. Lag times of ~3-5 business days are just fine especially for the things that need to be manually entered.

This is exactly the sort of scraping that Priceonomics is doing [1]. They charge $2k/site/month. Hopefully y'all are making that much.

3. Bespoke, one shot versions of #2.

One shot data imports, typically to initially populate a database. I've done a ton of these and I hate them. An example is a farmer's market project I worked on. We got our hands on a shitty national database of farmers markets, I ended up writing a custom parser that worked in ~85% of cases and we manually cleaned up the rest. The thing that sucks about one shot scrape jobs from bad sources is that it almost always means manual cleanup. It's just not worth it to write code that works 100% when it will only be used once.

Make any part of structuring scraped data easier and you guys are awesome!

[1] http://priceonomics.com/data-services/

chollida1 2 days ago 5 replies      
There is a cottage industry for scraping any sort of data that can move markets: Fed, crop, weather, employment, etc.

Anything that is released at a certain time on a fixed calendar, you can bet that multiple parties are trying to scrape it as fast as possible.

If you can scrape this data (the easy part), put it in a structured format (somewhat hard), and deliver it in under a few seconds (this is where you get paid), then you can almost name your price.

It's an interesting niche that hasn't been computerized yet.

If you can't get the speed then the first 2 steps can still be useful to the large number of funds that are springing up using "deep learning" techniques to build a portfolio over timelines of weeks to months.

To answer the question of: > Wouldn't this require a huge network of various proxy IPs to constantly fetch new data from the site without being flagged and blacklisted?

This is why I gave the caveat of only looking at data that comes out at certain times. That way you only have to hit the server once, when the data comes out, or at least a few hundred times in the seconds leading up to the data's release. :)
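A minimal sketch of that timing trick, i.e. sleeping until just before a scheduled release and then polling hard, might look like this (the release time, lead window, and `fetch` callable are all placeholders; a real version would point `fetch` at the release URL):

```python
import time
from datetime import datetime

def seconds_until(release: datetime, now: datetime) -> float:
    """Seconds remaining until the scheduled release (0 if already past)."""
    return max((release - now).total_seconds(), 0.0)

def poll_for_release(fetch, release: datetime, lead=2.0, interval=0.05, timeout=10.0):
    """Sleep until `lead` seconds before `release`, then call `fetch()`
    repeatedly until it returns data or `timeout` seconds elapse.
    `fetch` should return None until the data appears."""
    wait = seconds_until(release, datetime.utcnow()) - lead
    if wait > 0:
        time.sleep(wait)
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        data = fetch()          # e.g. an HTTP GET against the release page
        if data is not None:
            return data
        time.sleep(interval)    # hammer (politely) only in the lead-up window
    return None
```

The structuring and low-latency delivery parts chollida1 mentions are where the real work is; this only covers the "hit it at the right moment" piece.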

dennybritz 2 days ago 3 replies      
I'm working on a startup that has web scraping at its core. The vision is a bit larger and includes fusing data from various sources in a probabilistic way (e.g. the same people, products, or companies found on different sites with ambiguous names and information; this is based on the research I've done at uni). However, I found that there are no web crawling frameworks out there that allow for large-scale and continuous crawling of changing data. So the first step has become to actually write such a system myself, and perhaps even open source it.

In terms of use cases, here are some I've come across:

- Product pricing data: Many companies collect pricing data from e-commerce sites. Latency and temporal trends are important here. Believe it or not, there are still profitable companies out there that hire people to manually scrape websites and input data into a database.

- Various analyses based on job listing data: Similar to what you do by looking at which websites contain certain widgets, you can start understanding job listings (using NLP) to find out which technologies are used by which companies. Several startups are doing this. Great data for bizdev and sales. You can also use job data to understand technology hiring trends, understand the long-term strategies of competitors, or use them as a signal for the health of a company.

- News data + NLP: Crawling news data and understanding facts mentioned in news (using Natural Language Processing) in real-time is used in many industries. Finance, M&A, etc.

- People data: Crawl public LinkedIn and Twitter profiles to understand when people are switching jobs/careers, etc.

- Real-estate data: Understand pricing trends and merge information from similar listings found on various real estate listing websites.

- Merging signals and information from different sources: For example, crawl company websites, Crunchbase, news articles related to the company, and the LinkedIn profiles of employees, and combine all the information found in the various sources to arrive at a meaningful structured representation. Not limited to companies; you can probably think of other use cases.

In general, I think there is a lot of untapped potential and useful data in combining the capabilities of large-scale web scraping, Natural Language Processing, and information fusion / entity resolution.
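The "merging" step above is essentially entity resolution. The string-matching piece of it can be sketched with nothing but the standard library; the suffix list and threshold here are invented, and a real system would weigh many more signals (address, domain, phone) than name similarity alone:

```python
from difflib import SequenceMatcher

def same_entity(a: str, b: str, threshold: float = 0.85) -> bool:
    """Crude check that two scraped names refer to the same entity.
    String similarity alone is only a starting point for real
    entity resolution."""
    def norm(s: str) -> str:
        # Lowercase, drop commas/periods, collapse whitespace
        return " ".join(s.lower().replace(",", " ").replace(".", " ").split())
    a, b = norm(a), norm(b)
    # Strip common corporate suffixes before comparing
    for suffix in (" inc", " llc", " ltd", " corp", " co"):
        a = a[: -len(suffix)] if a.endswith(suffix) else a
        b = b[: -len(suffix)] if b.endswith(suffix) else b
    return SequenceMatcher(None, a, b).ratio() >= threshold
```

In practice this is where probabilistic models earn their keep; a hard threshold like this one mostly shows where the ambiguity lives.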

Getting changing data with low latency (and exposing it as a stream) is still very difficult, and there are lots of interesting use cases as well.

Hope this helps. Also, feel free to send me an email (in my profile) if you want to have a chat or exchange more ideas. Seems like we're working on similar things.

fnbr 2 days ago 1 reply      
I work as a research analyst for a Canadian provincial opposition party. Most government data is in terrible HTML tables, often dynamically generated, and almost none of it is in an easily machine readable format. I spend a lot of time downloading PDF files of data and converting them to JSON formats.

I have two main recurring scrapes:

- political donations. Every donation to a political party in my province above ~$300 is posted publicly on a gov't website (in a PDF). I use the data to run machine learning algorithms to predict who is most likely to want to donate to my party.

- public service expenses. My province has a "sunshine list" which publishes the salaries and contracts for all senior government officials. We grab it weekly (as once someone quits the gov't, their data disappears).

One tool that you could consider building is an easily accessible expense website, where people can enter the name of a public official and see all their expenses, including a summary of the total amount spent. There have been a number of massive expense scandals here in Canada related to this [1, 2].

[1] http://news.nationalpost.com/tag/alison-redford/[2] http://en.wikipedia.org/wiki/Canadian_Senate_expenses_scanda...

michaelt 2 days ago 1 reply      
A lot of services with online billing refuse to send bills by e-mail, instead requiring users to log into their websites.

No doubt the companies would justify this by saying e-mail isn't secure enough. The side-effect that it'll stop many users bothering to look at their bill isn't why they do it at all, no sir.

I've been considering making a web scraper that goes to the phone company, electricity company, gas company, broadband company, electronic payslips, bank, stockbroker, AWS and so on; logs in with my credentials; downloads the PDF (or html) statements; and sends them by e-mail.

Of course, such a web scraper would need my online banking credentials, so I'm not in the market for a software-as-a-service offering.

allegory 2 days ago 1 reply      
I scrape Gumtree and eBay hourly using a python script for certain things I want under a certain price. The script sends me an email with the link in it and I get on top of it sharpish.

Managed to bag a lot of stuff over the last couple of years for not much money.

If someone bags this up as a service I'd pay for it.
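A stripped-down version of that kind of price-watch script might look like the following; the listing markup is invented for illustration (real Gumtree/eBay HTML differs and changes), and actually sending the alert is one smtplib call away:

```python
import re

# Matches a simplified listing snippet: a link followed by a price span.
LISTING = re.compile(
    r'<a href="(?P<url>[^"]+)"[^>]*>(?P<title>[^<]+)</a>\s*'
    r'<span class="price">(?P<price>[\d,]+)</span>')

def bargains(html, max_price):
    """Return (title, url, price) tuples for listings at or under max_price."""
    hits = []
    for m in LISTING.finditer(html):
        price = int(m.group("price").replace(",", ""))
        if price <= max_price:
            hits.append((m.group("title").strip(), m.group("url"), price))
    return hits

def alert_body(hits):
    """Plain-text email body, one bargain per line."""
    return "\n".join(f"{title} ({price}): {url}" for title, url, price in hits)
```

Run it from an hourly cron job, diff against the links you have already mailed yourself, and you have the core of the service allegory describes.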

Smerity 2 days ago 1 reply      
Partial plug, but very related to the topic: if you're doing large-scale analysis of the web and you don't want to actually run a large-scale crawl, use the Common Crawl dataset[1]! Common Crawl is a non-profit organization that wants to allow anyone to use big web data.

I'm one of the team behind the crawl itself. Last month (July) we downloaded 4 billion web pages. Thanks to Amazon Public Datasets, all of that data is freely distributed via Amazon S3, under a very permissive license (i.e. good for academics, start-ups, businesses, and hobbyists). If your hardware lives on EC2, you can process the entire thing quickly for free. If you have your own cluster and many many terabytes of storage, you can download it too!

People have used the dataset to generate hyperlink graphs[2], web table content[2], microdata[2], n-gram and language model data (ala Google N-grams)[3], NLP research on word vectors[4], and so on, so there's a lot that can be done!

[1]: http://commoncrawl.org/[2]: http://webdatacommons.org/[3]: http://statmt.org/ngrams[4]: http://nlp.stanford.edu/projects/glove/
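Processing the crawl means iterating WARC records. A real pipeline would stream gzipped WARC files from S3 with a library such as warcio, but the record format itself is simple enough that a minimal header parser for a single in-memory record fits in a few lines (the sample record below is invented):

```python
def parse_warc_headers(record: bytes):
    """Parse the header block of a single WARC record (everything up to
    the first blank line) into a version string and a header dict."""
    head, _, _body = record.partition(b"\r\n\r\n")
    lines = head.decode("utf-8").split("\r\n")
    version = lines[0]                      # e.g. "WARC/1.0"
    headers = {}
    for line in lines[1:]:
        key, _, value = line.partition(":")
        headers[key.strip()] = value.strip()
    return version, headers
```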

jpetersonmn 2 days ago 1 reply      
I do a lot of scraping for my day job. We have a business intelligence team that will build us reports that we need from the data that we have. However I find that this process is so incredibly slow and sometimes we only need to compile the data for a one-off project. I used to use vb.net for this as that's what I started learning programming with. Now I use python/requests/bs4 for all my scraping scripts.

I've started working on a new website that will use data scraped from several vbulletin forums. I've found that even 2 vbulletin forums running the same version may have completely different html to work with. I'm assuming that it's the templates they are using that changes it so much.

I'm setting up the process so that the web scraping happens from different locations than the server where the site is hosted. The scraping scripts upload to the webserver via an API I've built for this. Mostly I did this because for now I'm just using a free PythonAnywhere account and their firewall would block all of this without a paid account. And then also none of these sites would see the scraping traffic coming from my website, etc...

richardbrevig 1 day ago 0 replies      
My first scraping project was well over 10 years ago in college. I was a member of the education club, and we wanted to get funding, so I convinced the college of education to allow us to charge $10 automatically to students in their school. But then the administration dragged their feet on giving us a list to submit to the accounting office for billing. One of the many professors submitted a list to me via Outlook that they had copied off the site, so I was able to look at the HTML structure of their list. The university used basic security (htaccess) and didn't verify that you had permission for a task once you were in. I had access because I worked for the dean of men. So I scraped all the faculties' student lists and then used another system behind the htaccess point to get all the relevant information on each student. I compiled a list of 300 students and submitted it, getting the club $3,000 in funding. The college of ed office staff were freaked out because they had no clue how I came up with the student roster (no one in their office gave it to me), but nothing came of it.

Been scraping a lot lately but mostly:

- government website for license holders

- creating lists of businesses for different segments (market research/analysis)

- using those lists to scrape individual sites and make analysis (how many use facebook/youtube/etc)

Cyranix 2 days ago 1 reply      
When I worked at MyEdu, I didn't actually sign on with the dev team originally; I worked on "the scraper team". We scraped college and university websites to get class schedule information: which classes were being taught, broken down by department and course number; by which professors; at which times on which days. If you're ever looking for an interesting challenge, I would encourage you to try getting this data.

Well-formed HTML is the exception rather than the rule and page navigation is often "interesting". Sometimes the school's system will use software from companies like Sungard or PeopleSoft, but there's customization within that... and of course, there's no incentive for the schools to aggregate this information in a common format (hence MyEdu's initiative), so there are plenty of homegrown systems. In short, there's no one-size-fits-all solution.

* NOTE: If you do attempt this, I insist that you teach throttling techniques from the very start. Some schools will IP block you if you hit them too hard; other schools have crummy infrastructure and will be crushed by your traffic. Scrape responsibly!
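That throttling advice is easy to honor with a tiny per-host rate limiter; this sketch injects the clock and sleep functions so it can be tested, and the 2-second delay in the example is illustrative, not a recommendation for any particular site:

```python
import time
from urllib.parse import urlparse

class Throttle:
    """Enforce a minimum delay between requests to the same host."""
    def __init__(self, delay: float, clock=time.monotonic, sleep=time.sleep):
        self.delay = delay
        self.last = {}                           # host -> time of last request
        self.clock, self.sleep = clock, sleep    # injectable for testing

    def wait(self, url: str) -> None:
        """Block (if needed) so this host isn't hit more than once per delay."""
        host = urlparse(url).netloc
        elapsed = self.clock() - self.last.get(host, float("-inf"))
        if elapsed < self.delay:
            self.sleep(self.delay - elapsed)
        self.last[host] = self.clock()
```

Call `throttle.wait(url)` immediately before each fetch; different hosts don't block each other, so a crawl across many schools still makes progress.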

jawns 2 days ago 2 replies      
I scrape about 60 blogs and news sites that deal with a niche topic and examine all the hyperlinks. If more than one of them links to the same page, I assume that it's a page that's generating some buzz, so I send it out in an email. It's proved to be a generally reliable assumption.
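The cross-referencing jawns describes reduces to counting which URLs appear on more than one source. A rough stdlib sketch (the site names and threshold are invented):

```python
from collections import Counter

def buzzing(links_by_site: dict, min_sites: int = 2):
    """Given {site: iterable_of_linked_urls}, return URLs linked from at
    least `min_sites` distinct sites, most-linked first."""
    counts = Counter(url
                     for links in links_by_site.values()
                     for url in set(links))   # dedupe within each site
    return [url for url, n in counts.most_common() if n >= min_sites]
```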
jasallen 2 days ago 1 reply      
I am currently scraping for brand product and nutrition data. Having to build custom scrapers per brand is hell.

I have a dream to use something closer to OCR against a rendered page, rather than parsing DOM. That way it would be less custom, and I could say, for instance, "find 'protein', the thing to the right of that is the protein grams".

I, personally, don't know how to do this, but I'd be willing to pay for a more generic way to scrape nutrition data (email in profile :) )
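Short of actual OCR on a rendered page, the "find the label, take the thing next to it" idea can be approximated on the page's extracted text; this is a text-proximity heuristic, not the pixel-based approach jasallen imagines, and the 20-character window is an arbitrary assumption:

```python
import re

def value_right_of(text: str, label: str):
    """Find `label` in extracted page text and return the first number
    within a short distance after it (a crude stand-in for 'the thing
    to the right of it' on the rendered page)."""
    m = re.search(re.escape(label) + r"\D{0,20}?([\d.]+)", text, re.IGNORECASE)
    return float(m.group(1)) if m else None
```

The appeal is that it is brand-agnostic: you don't need a per-site DOM parser, just the visible text and a label vocabulary.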

pbowyer 1 day ago 0 replies      
In terms of ideas, how to scrape Javascript-heavy sites. This one has broken me and the Import.io helpdesk: http://www.usrentacar.co.uk/. I'm now trying in CasperJS/PhantomJS but no joy there either.

I'm looking to buy a house, and not all local estate agents post to Rightmove (or some post with a 24-hour delay). Trying to submit the search form on the agent's own hideous website, parse the results and get a standard structure between them all is hideous - I gave up in the end.

Once I have the data, the challenge is then analysing it (geolocation, how long the commute times are, distance to amenities, etc.), which is its own separate challenge.

andy_ppp 2 days ago 0 replies      
YQL is surprisingly quite brilliant:


mushfiq 1 day ago 0 replies      
I scraped hundreds of major US universities for one of my clients. He used the data to build a mobile platform for students that integrated different services. I mainly grabbed courses, class schedules, bus routes, and the email addresses of both professors and students. I still have around one and a half million academic email addresses.

I also did some e-commerce information scraping.

One of the most interesting ones was for a data-selling company. They asked me to collect geo, disaster, finance, and tweet data, etc. We applied ML and statistics to make forecasts from the historic data.

hatethis 1 day ago 0 replies      
Frankly I'm annoyed to see this topic here. Most people who have taken to scraping are low-life scum. They see content that others have spent months or years producing, simply set up a site that aggregates all of that information, and then sit back and collect revenue from the ads or reseller links they paste everywhere.

People who put in a few hours of work to take advantage of other people's hard work piss me off. :/

kohanz 2 days ago 0 replies      
I have a side-project which scrapes play-by-play data from NBA games to gain more insights into these games.

Here is an example of the (un-finished) side-project: http://recappd.com/games/2014/02/07

I'm far from the only person scraping this data. Look at sites like http://vorped.com and http://nbawowy.com for even better examples.

finkin1 2 days ago 0 replies      
We do a lot of live web scraping of product information from retail sites for http://agora.sh. We basically scrape all of the essential product info and offer it to the user in an optimized view (we call it the 'product portal') that can be accessed without having to load a new page. This reduces tab sprawl and provides a general improvement to a lot of shopping workflows, such as just wanting to see larger/more images of the product (likely clothing) and being able to do so with an overlay on the existing page.
sayangel 2 days ago 0 replies      
What about https://www.kimonolabs.com/ ? Makes it pretty easy to collect data and presents in a structured format (JSON).
aruggirello 2 days ago 1 reply      
Scraping really is quite a complex process, and not everybody does it right. Do you employ a (distributed?) crawler pool? What if a scraped page goes offline (404/410)? And how do you handle network errors and 403s / getting caught (and possibly blocked), if at all? Do you conceal the scraping by employing a fake user agent? Do you (sometimes?) request permission to scrape from the relevant webmasters? These are the things that can make it or break it IMHO.
jerhinesmith 2 days ago 3 replies      
A while ago, I had the idea of creating a travel site that catered to the group of people that enjoy traveling but aren't bound by time (i.e. I want to go to X, but I don't care when -- just show me the cheapest weekend for the next 3 months).

Anyway... it turns out that flight APIs are ridiculously non-existent. I ended up scraping two different airline sites, but since it was against their terms, I never took the site any further.

oz 2 days ago 1 reply      
I once wrote a scraper for a Yellow Pages site in Python. It pulled down the business category, name, telephone and email for every entry, and returned a nicely formatted spreadsheet. The hours I spent learning the ElementTree API and XPath expressions have paid for themselves several times over, now that I have a nicely segmented spreadsheet of business categories and email addresses, which I target via email marketing.
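Once pages are parsed into a tree, the extraction oz describes is a couple of XPath queries. This sketch uses the limited XPath subset that the standard library's ElementTree supports (lxml offers the full language); the directory markup is invented for illustration:

```python
import xml.etree.ElementTree as ET

SAMPLE = """
<directory>
  <listing category="plumbing">
    <name>Pipe Dreams Ltd</name><email>info@pipedreams.example</email>
  </listing>
  <listing category="bakery">
    <name>Crumb &amp; Co</name><email>hello@crumb.example</email>
  </listing>
</directory>
"""

def by_category(xml_text: str, category: str):
    """Return (name, email) pairs for one business category.
    Assumes `category` is a trusted value (it is spliced into the path)."""
    root = ET.fromstring(xml_text)
    return [(listing.findtext("name"), listing.findtext("email"))
            for listing in root.findall(f".//listing[@category='{category}']")]
```

Real Yellow Pages HTML would first need tidying into well-formed markup (e.g. with an HTML parser) before a tree query like this works.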
jaequery 2 days ago 0 replies      
I'm currently scraping data such as "tweets, comments, likes" a website gets each day so I can graph them over time.

One thing I'm having a hard time with is scraping backlinks to websites. Currently I'm using Bing, but it's paid after around 5,000 queries. I really wonder how other companies like seomoz do this daily against millions of websites.

murukesh_s 2 days ago 0 replies      
I used to scrape the web for a daily-deals search engine I wrote for a client in 2010. But we scraped in real time, as the number of sites was really low (in the tens).

Pre-crawled copies with a distributed processing platform could be cool. You could come up with a better search engine with programmable rules that are edited collaboratively (like Wikipedia).

pknerd 2 days ago 0 replies      
I love scraping and even made a subreddit for the purpose, where I showcase a few of my public projects. Any scraping lover can join in.


hpagey 2 days ago 0 replies      
I currently scrape my Lending Club account to automatically trade loan notes on the secondary market. This way I can buy/sell notes that satisfy my criteria. If anyone is interested in this, I can send you the scripts.
jhonovich 2 days ago 0 replies      
We do so to determine new pages on websites within our industry. Often the new information here is not formally announced or done so only weakly. We regularly uncover valuable new info about company developments and changes.
amitagarwal 2 days ago 0 replies      
I scrape Google to save search results.


keviv 2 days ago 0 replies      
I scrape the Google Play Store for app data, top-ranked apps, etc. Unlike iTunes, they don't have a public API/feed. So, in order to get the data I require, I scrape the Play Store on a pretty regular basis.
rtcoms 2 days ago 1 reply      
I am looking for data listing all universities and their associated colleges. I haven't found anything related to it anywhere.
skanga 2 days ago 0 replies      
I scrape craigslist for side by side comparisons of stuff I want to buy from there. Eg: Cars, motorcycles, etc. Maybe even real estate would be a good target.
bussiere 2 days ago 2 replies      
I scrape Reddit and Twitter accounts for trends.

Twitter is so vast, though, that you may want to categorize accounts.

But Reddit is a good source for a lot of info.

kurrent 2 days ago 1 reply      
Sports scores, statistics, etc are always high in demand for scraping and great to get people interested when learning scraping techniques.
cperciva 2 days ago 0 replies      
I scrape the Hacker News website to get links and numbers of points. These allow me to produce my "top 10" lists.
Mikeb85 2 days ago 0 replies      
I regularly scrape financial data - historical prices, live quotes, company information, quarterly reports, etc...
catshirt 2 days ago 0 replies      
I'm building a database of games and scores. Web scraping has been very helpful.

Maybe instead of trying to change up the content, try to change up the method, i.e. do a talk on running crawlers/scrapers to seed your database at an interval (instead of just "scraping").

dochtman 2 days ago 1 reply      
I used to scrape a bunch of webcomics to turn them into RSS feeds. I still have one or two running, actually.
yutah 2 days ago 2 replies      
If you could publish a price list for items sold at major grocery chains, I am sure that many people could use it (bonus if it includes aisle numbers).
Daviey 2 days ago 0 replies      
Online personal banking
contingencies 2 days ago 0 replies      
1. Monitoring competitors. By monitoring product/service offerings close to my own operations, I can get bizdev people on the phone and speak to partners when I see indications in the public marketplace that someone has a better sourcing deal than I do. Haven't done this in five years or so.

2. Gathering basic data that should be freely available anyway (like currency exchange rates, global weather, etc.). Always this is done carefully and with a light touch, with maximum respect for load inferred on targeted systems. Again, haven't bothered in about five years.

3. Automating content acquisition. For search engines, media libraries, etc. This is more like ten years ago. These days there's so little call for it... maybe if I ran a boutique hotel chain in a copyright-isn't-respected jurisdiction and wanted to provide a fat library of in-room entertainment...

viggity 2 days ago 1 reply      
I just started getting into scraping (mostly using import.io), mostly because it is a complement to what I really care about: data visualization. I've gotten a ton of interest in my side project, and despite not having opened the beta, I'm still worried that it won't be as lucrative as creating some niche reporting services for various verticals (real estate, auto, etc.). Essentially data that is very tabular, not hierarchical or qualitative. You can think of my work as pivot charts on crack. If someone already pre-compiled this data, I'd much rather pay for it than do it myself. My value add is the analysis/viz done on top of the data. If you want to chat, feel free to email me; contact info is in my profile.
thinkcomp 2 days ago 0 replies      
PlainSite (http://www.plainsite.org) uses about 20 different scrapers/parsers to download and standardize legal materials.
Saw an app on HN, lost its name
6 points by momoterraw  9 hours ago   8 comments top 5
hansy 5 hours ago 1 reply      
FlyingLawnmower 7 hours ago 0 replies      
Perhaps you're looking for https://www.pennywhale.com/?
crazypyro 7 hours ago 0 replies      
soneca 9 hours ago 1 reply      
try looking for it here: http://hn.algolia.com/
adityar 7 hours ago 1 reply      
Ask HN: Why doesn't your team ship?
43 points by mijustin  1 day ago   48 comments top 23
chollida1 1 day ago 2 replies      
Here is a very popular Quora question that details a lot of the reasons why software estimates are wrong:


When I look back on my career the patterns that lead to good development were:

1) Small team sizes. I'm a big believer that if version 1 of your product is developed by more than 4 people, it's in trouble. :) Small, super-focused, and highly talented teams make the best version 1 products in my opinion, which is probably related to why some startups can be more agile.

2) Everyone on the team has domain experience, i.e. it's not the first time such a product has been written. This is slightly at odds with "second system syndrome". At my current company, when we wrote our algo platform, each member had done this before, so we knew from a data, networking, machine learning, and trader's perspective what we wanted to get done. There was very little flailing around trying to learn the domain (i.e. no learning which ML techniques to use, how to connect to exchanges via FIX, or what a pairs trade was, etc.).

3) One, and only one, person in charge of the vision. This might be obvious, but debates, even when they are well intended, seem to slow things down. Having one person dictate what the next version will have seems to make things much easier. This is especially obvious in my current field of finance. It's very easy to spot the products developed by engineers for traders vs. the products developed by traders for traders: the former have lots of features that no one wants but look pretty, and the latter look ugly but make money. :)

META NOTE TO ANYONE DEVELOPING A TRADING SYSTEM: No one cares what it looks like. I'll say that again: no one cares what it looks like. The Bloomberg terminal is the ugliest thing on the planet and they mint money. Function over fashion, always. I'd go as far as to state that a designer on a small team developing a trading system is viewed in the same light as an MBA on a small team: that person might add value, but they'll need to justify why they're there instead of another engineer.

I think a lot of not shipping can be tied to these three things: too large a team; not knowing what the final product will do, and hence a lot of experimentation, wrong turns, and competing visions; or a lack of vision for what you are building.

cognivore 1 day ago 0 replies      
Changing requirements.

Us: "Well, we just got the roof on and all the walls sheetrocked!"

Them: "We want all the walls moved now."

Lately it's also been the strangely recurring request to do the impossible. I've actually been asked to use cross site scripting ("like the hackers do - why can't you do it") to implement features a customer just had to have.

msoad 1 day ago 1 reply      
Culture of bullshit. In some companies, the better you produce bullshit and fluff, the better you get treated. Nobody cares who is doing the actual work. Whoever speaks about that work will get rewarded. In such a culture, people who can do stuff will escape and people who assemble bullshit all day will stay. Then people wonder why nothing is being shipped...
justinweiss 1 day ago 0 replies      
Two main things:

* Bad estimates. When people are too aggressive with their estimates and miss them, it derails everyone that was depending on that team's stuff being done at a particular time. It causes blockages that cost way more than an over-estimate would have reserved as buffer time.

* Getting distracted. Dev should work really closely with design and PDM to get things done, but if Design / PDM has already moved on to the next project, it's distracting for everyone when they have to get pulled back in. Then, you have multitasking, and blockages, and stuff doesn't get done as smoothly as it should.

cik 1 day ago 1 reply      
I realize that I'm going to sound like a grandfather here, but I think it comes down to a lack of history. Teams with that young/old balance, people who have done it before ship more frequently, from what I see.

Years ago, that was called Build Management, now, we call part of it DevOps. But the reality is that most teams have forgotten about professional engineering.

The benefits of pausing to think about the 'how', rather than just hammering it out and living with the consequences, are enormous. The fact that I rarely see separation of concerns anymore, let alone a focus on reducing build times or test run times, is massive. Ultimately, that means tradeoffs: the software triangle comes into play.

It's especially interesting to me, since I'm (today) working with a team I last worked with 12 years ago. There's been massive flux - only three of twenty originals are here. And yet, with multiple geographies, multiple age groups, multiple operating systems, and three distinct cloud providers (let alone the internal wannabe cloud) they push out a valuable release weekly, and micro-push to production ~3x daily. That's a testament to the engineer who manages the team, the same engineer I worked with on build workflows over a decade ago.

davybrion 1 day ago 1 reply      
Corporate release cycles... In large corporations, you rarely get access to the production environment and everything is managed by different departments. At my current client, we can only go to production a few times a year (except for hotfixes, but they need to be approved by a special board).

Luckily, our deployments are automated and we deploy to our testing environment multiple times a week, sometimes multiple times per day. That at least enables us to get feedback from the test team, which results in tickets either being moved to 'really done' or moving them back to 'in progress'. It's not the same as shipping, but in such an environment it at least reminds you of the important fact that things are indeed still moving and getting done.

MalcolmDiggs 1 day ago 0 replies      
In my experience it usually boils down to delusions of grandeur. Stakeholders often think THE WORLD is watching this upcoming release, so they can't possibly allow the team to ship without features X Y and Z ready.

In reality, for the vast majority of launches very few people are watching, and even fewer care about your features or lack thereof (that's just not how early adopters think about products, in my opinion).

If founders realized how little traffic / how few downloads they were going to get out of the gate, they'd ship much earlier; unfortunately everybody thinks their project is going to be a TechCrunch headline, and that's just not the case.

jbob2000 1 day ago 1 reply      
Not enough planning and general apathy.

We'll start a project and then half the devs will go on vacation. They return, and then the other half go on vacation. I guess that's summer for ya, but why not just close the office for a month and let everyone know that they should take their vacations around that time?

Devs will raise complaints about the work environment or a poorly written (but critical) module, but nothing gets done about it.

If a feature doesn't have strong executive support, nothing really gets done on it, and 3 months later someone will ask "hey, what happened to that feature?"

gatehouse 1 day ago 1 reply      
Essentially the exact same issues everyone else faces, which are covered exhaustively here: http://www.stevemcconnell.com/rd.htm

I'm going to single one out, though: tweaking, i.e. making changes without adequately anticipating the effects, or without the theoretical backing to expect that they will be correct. When you have a well-structured high-level understanding, you can make changes knowing approximately what the effect will be and converge on the solution. Without that, you end up thrashing around and making changes at random. If you take a random walk, you're probably not going anywhere fast.

When you are dealing with a well-designed and well-executed system, tweaking actually seems productive, because you begin from a good place in the solution space: when you explore the "local neighbourhood", the modifications still produce a functional piece of software, and it might actually be better in some way that you care about.

When you are making something new, tweaking gets you nowhere.

EDIT: Here is a foolproof process to get me to automatically disregard all your future ideas:

1. Find some parameters that were chosen with theoretical justifications and a real analysis of historical data.

2. Modify one of the parameters based on some flimsy rationale.

3. Run some quick tests that are obviously designed specifically to confirm your expectations, declare victory and act like you've solved something.

pan69 1 day ago 1 reply      
The No. 1 problem for me always seems to be:

Not having a clear and concise plan. For software development this obviously means a detailed, worked-out feature set.

It amazes me how often a development team is set out on the journey of building a system without clarity about what it's trying to accomplish. Sure, the high-level functionality is there, but as developers you need to know the low-level stuff. Usually this is due to a lack of leadership. A word to the "CEOs" out there:

A vision that's not formalised in a document and shared with the rest of the team is not a vision, it's fantasy.

Sure, I get it. For you, the CEO, it's easier to make stuff up as you go along than it is to write stuff down and to commit to it.

As a developer, when you try to get functionality formalized, those meetings to discuss the functionality turn into "design" meetings where everyone has to come up with new "awesome" ideas, which means that at some point the team stops having meetings to discuss things.

It doesn't matter how much experience I have as a developer; I simply can't cram the full-time role of project/product manager and at least 40 hours of writing quality software into a single week. Something has to give.

canterburry 1 day ago 1 reply      
Well, some of this is also exaggerated. I have spoken with some of the people who work at these "ship multiple times a day" companies, and they only do this for their less critical systems. Their core money-making stuff is updated less frequently.

It also depends on what domain your company is in. If downtime doesn't lead to anyone's death, send you to prison, or cost you millions in lawsuits from enterprise customers... sure, ship daily.

winslow 1 day ago 0 replies      
Red tape and the general overhead of a massive corporation. We can't get anything done. They move way too slow for us (our team was acquired ~3 years ago). We haven't delivered anything remotely useful for two years. I'm currently looking to switch companies. Oh, and also the mandatory 45-minute training we had on how to use a new internal HR website, which was a complete waste of time.
misterparker 1 day ago 1 reply      
As a remote team, communication may be what slows our team down. But by the same token, in-office communication can slow things down too.
Htsthbjig 1 day ago 0 replies      
Bad estimates: You don't know what you don't know. You don't know how much time it will take to do something you have never done.

Problems are damn hard: What we do is really hard, so we basically have to trim our projects down to the basics instead of doing what we want to do, which is way more. Creative people always want to start way more projects than they could ever finish, so we need discipline here.

Complexity and lack of documentation: Another thing that needs a lot of discipline. People believe that what they know after thinking about it for a long time is evident to everybody else. It is not. Not only that, people forget what they know today if they do not document it, so if you do not document, a year from now you will be repeating most of your work, on the company's budget.

aytekin 1 day ago 0 replies      
Scope: The larger the project, the harder it gets to release.

Tools: The longer it takes to release, the less it will be done.

Culture: The harsher the response to failure, the less risk shall be taken.

scotty79 1 day ago 0 replies      
Requirements mostly undefined, delivered one semi-random brushstroke at a time. Part of requirements communicated so far changed as we go. Two rewrites of the front-end, including one with whole front-end stack changed from desktop client to webapp. Both rewrites used to change requirements for the front-end and back-end parts.

Only unchanging thing is the deadline.

vishalchandra 1 day ago 0 replies      
Not having a detailed and thorough product definition delays the end result. The development is fast, but if you keep recoding the same functionality with different colors, different fonts, and different form fields, you are constantly shifting the goal posts. The solution is to ship the ugly (unfinished) version, get feedback, and then ship again.
JoeAltmaier 1 day ago 0 replies      
Once we got to a stable functional release, the bar got raised for releasing a subsequent client. Change means instability; instability means it won't be shipped. So changes accumulate, and the code never sits still long enough to shake out all the instabilities.
paulgambill 1 day ago 1 reply      
Donuts and other crappy food. Typical office food treats like donuts, pizza, and beer will all have negative chemical effects on team members. They will be more sluggish, less creative, and less communicative in general.
chrismcb 1 day ago 0 replies      
The industry is obsessed with shipping the same way humans are obsessed with breathing
gjvc 1 day ago 0 replies      
"frictions" is the best way to describe this.
trhway 1 day ago 0 replies      
some things better be left un-shipped.
jasonlotito 1 day ago 2 replies      
Looking beyond what other people have mentioned, these are things that plague teams:

1. Fear of failure. Too many people are afraid to ship too soon. They fear a bad product out the gate, rather than getting out the gate in the first place.

2. Lack of focus. You'll spend too much time focusing on things that don't really matter, such as building the perfect messaging system rather than reusing something that exists and works now.

3. Busywork. You'll spend your time doing things that really don't matter, such as optimizing your icons to use fonts instead of a PNG, and then dealing with trying to make it work in older versions of IE.

Ask HN: Credit Card reader similar to Square but with API
4 points by obaid  12 hours ago   5 comments top 2
aquark 6 hours ago 1 reply      
vishalzone2002 10 hours ago 0 replies      
Amazon just released one today!
Ask HN: How much do you pay for mobile Internet connection?
3 points by DanBC  8 hours ago   5 comments top 5
DanBC 8 hours ago 0 replies      
£12 per month on GiffGaff in the UK. PAYG. Domestic. This is soon going to rise to £20 per month for unlimited data.

"Unlimited", which used to be unlimited but has some kind of traffic control recently introduced.

No tethering, and this is strictly enforced. (there are separate deals for tethering and for dongles and tablets.)

3G only. They've only just got proper Apple iPhone support.

rahimnathwani 3 hours ago 0 replies      
    Country  Speed  Operator      Billing  Local price / tax  USD price / tax  Minutes    MB         Tethering
    China    3G     China Unicom  Monthly  286 / 0            46 / 0           900        1100       Yes
    USA      2G     T-Mobile      Daily    2 / 0              2 / 0            Unlimited  Unlimited  No

Someone1234 7 hours ago 0 replies      
You might want to ask people if that price includes tax or not. For example, in the EU (and UK) it is typical to include all tax in the list price (e.g. £12 is "really" £12).

In the US almost all prices don't include the tax. Cellular services are particularly bad about this due to the way the US tax system works, they fund things like 911 operations from telephone and cellular bills.

So in the US a $20 list price might be closer to $35 in real terms (i.e. the amount they debit from your account every month).

oz 7 hours ago 0 replies      
20 USD/month.

Jamaica, on Digicel's network, HSPA+, pay as you go, tethering allowed, 2GB.

dennybritz 8 hours ago 0 replies      
I have three.

US,4G, $60/month, unlimited, no tethering, month-by-month contract

Thailand,4G, $15/month, 3GB/month, tethering, pay as you go

Germany,4G, $25/month, 3GB/month, no tethering, month-by-month contract

Gravit Free Design Tool Release
4 points by quasado  14 hours ago   3 comments top 3
hartator 5 hours ago 0 replies      
Wahoo, I was looking for a replacement for Fireworks (my early love). Weird that you guys haven't got much more traction on HN.

You have my upvote though! :)

quasado 13 hours ago 0 replies      
Just wanted to note that the whole application is built on pure HTML5 without any proprietary stuff :)
Seattle AWS Architects and Engineers MeetUp 8/18
2 points by DanjaMouse  10 hours ago   discuss
Cloud-based secure device-to-device network service sharing
3 points by jayaraj  10 hours ago   discuss
Ask HN: How can I gain experience as a Sys Admin?
2 points by yulaow  14 hours ago   2 comments top
B5geek 13 hours ago 1 reply      
I am a Sysadmin. The only advice I can think of is this:

Just build stuff.

Make use of virtualization and just start building systems. There is more than one 'role' a sysadmin will play, and in some areas each specialty could be its own job.

i.e. A Windows/Active Directory admin vs an Exchange Admin.

Most good admins will know how to do a bunch of various duties: set up a Samba server, build a ZFS array, install a printer driver, configure a company wiki, and do all the maintenance needed to keep these systems running.

And learn how to automate 90% of your tasks.

It's your choice if you want to be picky. If you only want to work on Linux systems, or to be more anal, only on Debian servers, don't be surprised if it's harder to get a job. Spread yourself out and learn to be a jack-of-all-trades. I prefer working with Linux, but I jump up and resolve Windows headaches for my co-workers, because that annoying 10% of the job makes me more attractive to my boss than the neckbeard who is a distro snob and refuses to touch anything except OpenBSD.

Build systems to do one job. One DHCP server, one DNS server, one file server. Then start combining them and optimizing them. Break them and then fix them. There are a million gotchas that only show up when you start working with the systems and that you never see in YouTube videos.
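As one concrete starting point for the "one DHCP server, one DNS server" exercise, a tool like dnsmasq covers both jobs in a handful of lines. This is only a sketch for a throwaway lab network; the interface name, addresses, and domain below are placeholders, not recommendations:

```ini
# /etc/dnsmasq.conf -- minimal lab DHCP + DNS box
interface=eth0                               # only serve the lab interface
domain=lab.example                           # local domain for the sandbox
dhcp-range=192.168.50.10,192.168.50.100,12h  # address pool and lease time
dhcp-option=option:router,192.168.50.1       # gateway handed out to clients
server=8.8.8.8                               # upstream resolver for everything else
```

Break it on purpose (wrong subnet in the range, dead upstream), watch what clients do, then fix it; that is where the gotchas live.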

As far as finding work; either 'exaggerate' about some of your experience while contracting, or start working help-desk and get your foot in the door.

Your resume will get you an interview, your real skills will get you the job.

Ask HN: How do you sell things that you don't have?
4 points by justplay  10 hours ago   2 comments top 2
ISeemToBeAVerb 10 hours ago 0 replies      
Your methodology will depend on the product or service you plan to offer. If it's software you could build an interactive mockup. If it's a physical product you could hire a 3D artist to create some photo-realistic renders based on your product specs. The rest really comes down to having a strong knowledge of your market and your offer. Like Elyrly said, you could go the crowdfunding route. You could also set up your own landing page and accept email addresses or pre-orders. The key is to make sure you have a strategy and system in place to actually fulfill your promise. That might mean lining up manufacturers or other vendors ahead of time, creating a budget, and working out a timeline for fulfillment.
elyrly 10 hours ago 0 replies      
I have seen products created on crowdfunding websites that actually showed how many customers had paid for the item. This in turn allows you time to manufacture the product, knowing each item is not a sunk cost.
Startup Interview with SaaS CEO about customer onboarding
4 points by sandeep45  10 hours ago   2 comments top
jayfk 8 hours ago 1 reply      
Great watch, but why is this on ask HN?
Ask HN: Did HN just change font?
8 points by ColinWright  13 hours ago   3 comments top 2
sp332 12 hours ago 1 reply      
I think it has always used Verdana for text and Courier for input & code. http://web.archive.org/web/20070405032412/http://news.ycombi... What does your browser render it as? (Right-click -> Inspect Element, click "Computed" at the top of the right-hand pane.)

Edit: oh, now it's "Verdana, Geneva, sans-serif" where it used to be "Verdana, sans-serif". So if you have Geneva but not Verdana, and Geneva was not already your sans-serif font, it will look different.
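For reference, the one-token difference described above looks like this in the stylesheet (the `body` selector is illustrative; the actual rule may target other elements):

```css
/* Old HN font stack */
body { font-family: Verdana, sans-serif; }

/* New stack: Geneva inserted before the generic fallback */
body { font-family: Verdana, Geneva, sans-serif; }
```

Browsers walk the list left to right and use the first installed font, which is why the change is only visible on machines that have Geneva but not Verdana.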

thekevan 9 hours ago 0 replies      
Yep, I came to the "Ask" section to see if anyone else noticed it.
Ask HN: How do I fund an open source project management app?
2 points by jitnut  10 hours ago   1 comment top
hkarthik 8 hours ago 0 replies      
One option which you may want to explore is Assembly. I haven't used it myself but I'm interested in potentially launching something on it in the future.


Looks like I have traction, but no connections. Where to go?
7 points by r_z_  14 hours ago   16 comments top 6
erichocean 13 hours ago 1 reply      
Apply to an incubator. That's what they're for: providing connections.
JSeymourATL 12 hours ago 0 replies      
Build in time for strategic networking. Create a top 10-20 list of social web players (LinkedIn is a good search tool). Important: actually reach out to these folks for a live conversation (first by phone). They need to hear your voice and vice versa. If there's good tonality, schedule an in-person meetup for more dialog.
hpagey 11 hours ago 0 replies      
For consumer apps, you would need millions of page views / users / daily interactions to declare product/market fit. I would wait some more time. Get more users and interactions, and then decide on applying to incubators or moving to SV. You will be in a stronger position then.
eroo 14 hours ago 1 reply      
It would be really helpful to have some more context.

(0) Can you link to the product?

(1) When did you start and what has the growth trajectory been (e.g., weekly growth rate)?

(2) Is there some marginal cost other than server space that is limiting growth?

dennybritz 12 hours ago 1 reply      
Perhaps start with Angellist. Angels are pretty approachable, and many are happy to make introductions to VC funds if appropriate.
fsk 13 hours ago 2 replies      
Ask your customers to pay $5/month and bootstrap. Then, with revenue, look for investors if that's the path you want.
We got our first Delaware tax bill: $74,018.74
117 points by ccvannorman  1 day ago   82 comments top 17
chao- 1 day ago 4 replies      
Quite the opposite. In Texas, until a certain point in revenue, they really don't want to waste any time and resources on you. They ask "Did you make at least $1,000,000 last year? No? Well, please move along peasant, we have real businesses to deal with who might actually be worth auditing."

Makes sense, somewhat. Even in the case of minor fraud, they would likely spend more in manpower than they could possibly reclaim from such a low-earning outfit. You still have to file, of course, but it's properly simple if you didn't earn a significant amount.

EDIT: corrected from $200,000 to one million.

awad 1 day ago 5 replies      
At a previous company incorporated in New York (oh, to be young and naive) with just 3 technical co-founders, we were slapped with a ~$30k fine for not having workers' compensation. When we tried to explain the situation to the state (that the only 3 employees were the owners, who typed in front of a computer all day, but that we were still interested in resolving the issue), we were nonetheless threatened by the official on the other end that we had no leverage and that they were going to go after our personal assets. Quite a terrifying time for a 21-year-old, to be honest.
miiiiiike 1 day ago 1 reply      
It's a good idea to run your numbers through the Franchise Tax Calculator: http://corp.delaware.gov/taxcalc.shtml Mine was $180k on the statement, ~$1.7k in reality.

Still, even though I knew exactly how much I owed, when I saw $180k on the statement my first call was to my lawyer, for a hug.
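For anyone curious how the statement and the calculator can disagree so wildly: the scary figure comes from the authorized-shares method, while the calculator also applies the assumed-par-value-capital method and bills the lower of the two. Here is a rough sketch of the latter; the $350-per-million rate and the $350 minimum are taken from this thread and may be out of date, so treat every constant as an assumption and defer to the official calculator:

```javascript
// Sketch of Delaware's "assumed par value capital" franchise tax method.
// Rates here ($350 per started $1M of assumed capital, $350 minimum)
// are assumptions from the thread, not authoritative figures.
function assumedParValueTax(grossAssets, issuedShares, authorizedShares) {
  const assumedPar = grossAssets / issuedShares;          // assets spread over issued shares
  const assumedCapital = assumedPar * authorizedShares;   // scaled up to ALL authorized shares
  const millions = Math.ceil(assumedCapital / 1_000_000); // every started million counts
  return Math.max(350, millions * 350);
}

// Example: $1M in gross assets, 100k shares issued of 1M authorized:
// assumed par $10, assumed capital $10M, so 10 * $350 = $3,500.
```

The takeaway from the thread stands either way: recalculate before panicking, since the mailed statement only shows the authorized-shares number.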

alasdair_ 1 day ago 1 reply      
I really wish that there was a good document that outlined all the paperwork required to run a "standard" Delaware-based C-corp for an online business located in Silicon Valley.

Things like workmans comp, sales tax, board meetings and minutes requirements, insurance requirements and the like. No need for lots of explanations, just a list of all the major things you need to do to run a C-Corp legally.

Does anyone know of such a list?

mattzito 1 day ago 2 replies      
We got hit with a sales tax audit in NYC here, which apparently is pretty common. Turned into a multi-week process of providing documentation, justifying why some of our deals were non-taxed, justifying some of the work we did, and a little lawyer time.

I think in the end we owed ~$3k. What a waste of everyone's time.

serkanh 1 day ago 0 replies      
I once got a bill from the Florida Department of Revenue that was close to $40,000 for a company I started which had been registered for a while but had no real revenue. Turns out the FDR applies the exact same methodology, calculating the dues by simply averaging or guessing revenues based on the industry you are in.
btown 1 day ago 0 replies      
Are you me? Had almost exactly the same experience, though we're still in the prototyping stage and don't have paying customers yet. When I filled out the DE franchise tax form online, it recalculated to the much smaller amount based on the $350 minimum. It's worth noting that you should call your agent to see whether their required forms are just them offering to fill out a form you could file yourself with state governments, often at an added fee just to fill out a few lines for you. In my experience, they'll give a straight answer if you ask.

Also, for any companies doing business in New Jersey, there's a $500 minimum income tax even if you make no revenue - and even if you put the business on hold but still keep some assets on the corporation's books at any time that year - so be aware of that. It's somewhat ironic that for companies operating at a loss, states often refuse to waive any minimum tax - would they not wish the companies to be able to put that money towards their success, and to have a higher probability of generating both jobs and much higher income taxes in their states in future years? Though I suppose New Jersey is not the state best known for forward-thinking political practices...

moron4hire 1 day ago 2 replies      
To anyone going out on their own, be it with a company or as a freelancer: if you don't have an accountant to help you figure out what your taxes will be, you're A) nuts, and B) losing money.

Scratch that, that goes out to everyone. Get an accountant. Stop guessing at your taxes. You'll save more than you spend. That's their business.

phantom784 1 day ago 1 reply      
I couldn't help but notice Delaware's "Division of Corporations" is using a PO Box in Binghamton, NY. Any idea why?
ccvannorman 1 day ago 0 replies      
[OP] The e-mail reply from our lawyer: "In theory, the default way of calculating eventually is lower than the alternative method when a company becomes large.

Every February we have frantic calls from our clients when they receive their tax bills."

LOL.. just founder things

frankus 1 day ago 0 replies      
I once got a bill from the California Franchise Tax Board. I had been working remotely for a California company and barely even set foot in the state, but they decided I owed them state income tax.

Ended up being easier to pay up than dispute it (~$250).

mrfusion 1 day ago 1 reply      
How much tax do LLCs pay in Delaware? Would I still have to pay my home state's property taxes?
emcrazyone 1 day ago 2 replies      
I would have guessed it to be a scam. Why does the State of Delaware use a Binghamton, NY address?
27182818284 1 day ago 0 replies      
Same thing happened to us. We hired a proper CPA and it was fixed.
justin66 1 day ago 0 replies      
What did you actually owe?
cody3222 1 day ago 0 replies      
This happened to me!
korzun 1 day ago 1 reply      
> Any other startups here gotten a nice "tax surprise"? Would love to hear about it!

You registered in Delaware without figuring out how much actual tax you would have to pay. If you had, you would have known right away that the invoice was bogus.

Not sure why this is on the front page.

Ask HN: What's it like working at Apple?
5 points by Flameancer  1 day ago   2 comments top
msoad 1 day ago 1 reply      
I have a couple of friends working there. It's a pretty dry corporate culture. Expect to be overworked sometimes. But on the bright side, they pay really well and the RSUs will be a lot of money. At the end of the day it all depends on who you are working with, not the company itself. I hope you get lucky and get into a good team.
Ask HN: SCSS/Sass vs. LESS
7 points by jrub  14 hours ago   15 comments top 10
tannerj 7 hours ago 0 replies      
I started with LESS (Bootstrap) and couldn't make the column gutters collapse. I did some research, found that I could do that in Zurb Foundation, so I started using Foundation. It was that simple difference that caused me to go with SASS over LESS. I actually wasn't doing anything custom in either framework at first, but once I wanted to start customizing the framework I started learning SASS and really like it. I've also been learning Ruby/Rails, and it's nice that SASS is written in the same language. I've now dropped the idea of a huge framework and have been learning Compass and Susy with SASS, and I really like that solution as opposed to an entire framework. I think both LESS and SASS are great and valid solutions to the problem. At this point it's just preference.

edited for clarity

hajile 13 hours ago 1 reply      
You will probably need to use Node.js anyway for other JS build tools, but you won't need Ruby unless you use Rails. Reduce your build dependencies and use LESS.

That said, I recommend you look into Stylus. It is better than either LESS or SASS in my opinion, as it gets rid of a lot of unnecessary syntax.


ceejayoz 13 hours ago 0 replies      
FYI, Bootstrap has an official SCSS version as well as the LESS one. https://github.com/twbs/bootstrap-sass
poseid 14 hours ago 0 replies      
I like LESS recently, since it is NodeJS-based (= fast and easy to set up)
akvlad 13 hours ago 0 replies      
I think the big reason people prefer SASS is because of Compass - http://compass-style.org/
Spoom 14 hours ago 1 reply      
I prefer LESS since its syntax is essentially a superset of CSS.

I'd still be interested in hearing the arguments in favor of SASS.
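For anyone weighing the two, the same variables-plus-nesting snippet in each dialect makes the "superset of CSS" point concrete (a minimal sketch; both compile to identical CSS):

```less
// LESS: variables use @, and any valid CSS file is already valid LESS
@brand: #ff6600;
.nav {
  a { color: @brand; }
}
```

```scss
// SCSS: same block structure, but variables use $ (and you get
// extras like @mixin/@extend, plus Compass on top)
$brand: #ff6600;
.nav {
  a { color: $brand; }
}
```

(The older indentation-based Sass syntax drops the braces entirely; SCSS is the CSS-like flavor.)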

hcarvalhoalves 12 hours ago 1 reply      
Myself, I prefer LESS since it's closer to CSS.

I believe most people using SASS do so because of Rails.

mmset 13 hours ago 2 replies      
I use LESS because it's very easy to compile with Node.js. Not to disregard the fact that Bootstrap 3 is LESS-bound.
sehr 13 hours ago 0 replies      
LESS. I don't enjoy logic in stylesheets, and having to mess with Ruby mucks up most JS-based build systems, IMO.
drakmail 14 hours ago 0 replies      
I use SASS, because it's the default for Rails.
Ask HN: What tools do you use to test/review a website visually?
3 points by jtfairbank  16 hours ago   1 comment top
courseeplus 15 hours ago 0 replies      
Alexa rankings and Google Analytics
Is it possible to post a project both on Kickstarter and Indiegogo?
5 points by narayanb  18 hours ago   4 comments top 3
csmdev 14 hours ago 1 reply      
Don't half-ass two things. Whole-ass one thing.

When your attention is split on two different and identical campaigns, both of them will fail. Choose one and make it count.

andy_felsil 17 hours ago 0 replies      
At first glance I'm not convinced by the idea; I'd assume that it's better to keep the community in one place. Reaching critical mass sounds easier with just one URL to share, too.

I might like it if the fundraisers were complementary and addressed to the same target groups, but actually different.

josephschmoe 11 hours ago 0 replies      
The page you use for crowdfunding is just a front page. It doesn't make sense to have two front pages - it's confusing to the user.

Also Kickstarter requires you meet your goal in order to get any money. It would be a shame if you got half the money on IndieGoGo and your Kickstarter failed because of it.

Ask HN: Coolest way to implement infinite scroll for mobile devices?
2 points by parinck  15 hours ago   2 comments top 2
uptown 7 hours ago 0 replies      
uptown 14 hours ago 0 replies      
You may want to take a look at this post from Airbnb. They tackled this problem, and have open-sourced their solution:
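If you roll your own instead, the core of most infinite-scroll implementations is a small distance check plus a guard against duplicate requests. A minimal sketch (function names and the 200px threshold are my own choices, not from any particular library):

```javascript
// Pure check: have we scrolled to within `threshold` px of the bottom?
// Kept DOM-free so it is easy to test.
function shouldLoadMore(scrollTop, viewportHeight, contentHeight, threshold = 200) {
  return contentHeight - (scrollTop + viewportHeight) <= threshold;
}

// Browser wiring: the isLoading flag stops a burst of scroll events
// from firing several overlapping page requests.
function attachInfiniteScroll(container, loadNextPage) {
  let isLoading = false;
  container.addEventListener("scroll", async () => {
    if (isLoading) return;
    if (shouldLoadMore(container.scrollTop, container.clientHeight, container.scrollHeight)) {
      isLoading = true;
      try {
        await loadNextPage();
      } finally {
        isLoading = false;
      }
    }
  });
}
```

On mobile you'd typically also throttle the handler and recycle off-screen rows, which is the part the Airbnb library addresses.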


Ask HN: Is a Chromebook suitable for web development?
6 points by welly  19 hours ago   13 comments top 5
andrewbells 19 hours ago 1 reply      
Here is a recent blog post by codestarter, where they describe the process and list some resources for making a dev machine out of a Chromebook: http://blog.codestarter.org/post/93985346780/how-we-turn-199... . It was featured on HN not long ago, some comments might be useful: https://news.ycombinator.com/item?id=8143844 .
skuunk1 17 hours ago 1 reply      
I am a Ruby on Rails developer, and I have been developing on my Chromebook (and any place with a browser) using Nitrous.io (https://www.nitrous.io/). There are other comparable services as well. I mainly use it on days when I am working from home. It gives me terminal access in the browser (but not quite full sudo), as well as a decent IDE.

I have yet to use Crouton on my Chromebook because I have another laptop running Ubuntu and I just haven't found the need yet (though to be honest I haven't picked up my old laptop since I got the Chromebook).

Browserstack lets me check out what the web pages look like in other browsers.

If you do use Crouton, then you should basically have a Ubuntu box. Most Chromebooks have an SD Card slot and USB slots as well if you need extra storage.

CyberFonic 16 hours ago 2 replies      
tldr; The CB is cheap, but you get what you pay for.

Basically you can do anything you can do with Linux if you install Crouton, i.e. it runs a Debian-like distro in a chroot jail.

I used a Samsung Chromebook for about six weeks and then went out and bought a 13" MacBook Air (used a MacBook Pro before the Chromebook adventure).

The MBA has a bigger screen, better keyboard, way better touch pad. I still throw the CB into the bag when I travel. But most of the time the MBA is the preferred system.

vishalchandra 17 hours ago 1 reply      
Try out https://github.com/dnschneid/crouton and with that you can get Linux support on your Chromebook. The rest will fall into place.
patd 16 hours ago 0 replies      
I use an Acer C720P as a backup dev machine for a Django project.

With Crouton you have access to traditional Unix tools, so if most of what you need is command-line tools and a text editor, you can manage. It's not amazingly fast, and the screen resolution is a bit limiting. But it's still a great machine: cheap, light, and powerful enough for web dev. I don't regret buying one.

Ask HN: What's that quote about X window system?
4 points by acron0  19 hours ago   7 comments top 4
lutusp 18 hours ago 1 reply      
> I recall reading a parabolic quote ...

ITYM a hyperbolic quote. A parabolic quote would describe an arc like a thrown ball, which eventually falls back to earth. A hyperbolic quote would escape toward infinity, which is the intended sense of the expression derived from the shapes of conic sections.

DanBC 18 hours ago 0 replies      
I can find a lot about Zawinski's law, or Greenspun's tenth rule.

Letts' Law: All programs evolve until they can send email.

Zawinski's Law: Every program attempts to expand until it can read mail. Those programs which cannot so expand are replaced by ones which can.

Greenspun's tenth rule: Any sufficiently complicated C or Fortran program contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp.

RTM provided a "corollary which clarifies the set of "sufficiently complicated" programs to which the rule applies: including Common Lisp." http://en.wikipedia.org/wiki/Greenspun%27s_tenth_rule

ksherlock 14 hours ago 0 replies      
The X server has to be the biggest program I've ever seen that doesn't do anything for you. -K Thompson (plan9 fortune file)
DanBC 13 hours ago 0 replies      
"If the designers of X Windows built cars, there would be no fewer than five steering wheels hidden about the cockpit, none of which followed the same principles . . . but you'd be able to shift gears with your stereo.

Useful feature, that." -- Marcus J. Ranum, Digital Equipment Corp.

       cached 14 August 2014 04:05:01 GMT