hacker news with inline top comments - 24 Jun 2017 - Ask
Ask HN: Declining the job offer after accepting
16 points by gandutraveler  8 hours ago   27 comments top 14
ChuckMcM 7 hours ago 2 replies      
Is there a right way? Not really. When you decline post-acceptance, the company will create a record of this in their applicant tracking system (it's a database) and they will never offer you a job again (because they aren't sure you'd take it even if they did). The recruiter(s) you worked with at that company are probably contractors. They will move around to other companies over their careers, so if you're ever in the unlikely situation where you're interviewing at a company and they suddenly decide not to proceed, that could be why. Generally recruiters work with thousands of people, but only a small number of them actually accept an offer and then later just back out. If you are fortunate enough to have a fairly common name it might not be a problem, but if there are very few engineers with the same name it can be a bigger problem.

If you are exceptionally unlucky, you decide not to work at this major tech company, and then your friend who pitched the startup can't come through. Now you are doubly in trouble (you can't go back and you don't have a job), not to mention you're left with a partially transferred visa.

The smart thing to do is to take the job at the major tech corp, spend a couple of years saving some money and learning what you can, and then join your friend at the startup if it still looks like a viable opportunity. That strategy maximizes your future value: you have additional work experience, your friend knows that you stick to your word even when you might not want to, and you will have a bit more insight into whether your friend can actually put a startup together.

sokoloff 4 hours ago 0 replies      
I was working on a startup and we were out of money. I applied for and found another job and let the founder know. On the Sunday before I was to start at the new place, we secured another round of funding, so the first thing I did on Monday was report to my new job and the second thing I did was resign from that job.

New boss was pissed, but it was the right call for me (and I couldn't have done it any earlier).

In your case, I'd give careful thought to the visa issue; I wouldn't sweat it on behalf of the major tech company at all. They've no doubt had worse behavior from candidates and they'll survive the loss. You probably won't be blackballed at that company, but do expect a question about it if you do decide to apply later.

elmerfud 7 hours ago 2 replies      
Don't worry about it. You won't be the first nor the last person to change their mind on a job change, and I promise they have had it happen before. To retract your acceptance you just need to be terse but professional. The general format would be:

- A sentence indicating you're withdrawing your acceptance of the offer.

- One or two sentences explaining you have an unforeseen unique opportunity. No need for great detail.

- One or two sentences thanking them for their consideration, and hope for future opportunities together.

That's really all there is to it. A professional company that values its people will understand that these things happen in life and will not hold anything against you.

wallflower 1 hour ago 0 replies      
I don't know you, but I strongly feel that you are making the wrong decision.

Let's go over the points...

Your friend pitched a startup idea. Based on your wording, it sounds like it is more an idea than an actual company. Either way, unless your friend's startup has the legal resources to sponsor an H1B visa and the financial capability to pay you a prevailing wage, you run the risk of not being able to transfer your visa. The government agency can also reject your visa transfer.

By declining the offer, you are putting yourself in jeopardy of losing your ability to work in the country.

I assume you are not married, but for the sake of this response, let's assume you are. What would your wife think about this? Don't be selfish; imagine how others would be affected.

You are assuming that you will be able to work at the startup legally (H1B transfer). I don't believe you can assume this. Does your friend even know what is involved in sponsoring an H1B? Major tech companies have dedicated departments for managing the H1B process for their employees.

Good luck!

late2part 24 minutes ago 0 replies      
Renege as you see fit. The employment offer was at-will, right? I accepted and then turned down employee #5 at a multi-billion dollar company and never regretted it and it has never been an issue.
pkaye 6 hours ago 1 reply      
As a hiring manager, I never take it personally if a candidate declines the offer before the start date. But there are a few who never mention it until the start date; at that point I feel they were just hanging on for a better offer and wasted my time, so personally I would never hire them again.
smt88 7 hours ago 1 reply      
Do it ASAP. Details are mostly unimportant, but be honest.

The more time and money they spend on you before you tell them, the angrier they'll be. They may already have grounds to sue you, although I've never heard of a company doing that in such a situation.

bsvalley 7 hours ago 1 reply      
How can you join a startup (at the idea stage) on an H1B? Make sure you can actually work legally for that startup before doing anything. Technically, you can't unless the startup is already established.
cvaidya1986 2 hours ago 0 replies      
I would take the big co whilst moonlighting, and only jump ship after funding comes in, especially considering the visa.
probinso 4 hours ago 0 replies      
Do it quickly. rip off the Band-Aid
dudul 4 hours ago 0 replies      
If you need a visa to work, it's probably safer to go with the bigger company. Does your friend realize they'll have to apply for your visa?
pm24601 5 hours ago 0 replies      
If you have visa issues... how is your friend's cool startup going to help?

Visa issues even pre-trump were hard.

Does your friend have funding?

Does your friend have market validation?

What will happen to you personally if the startup fails - like so many do?

CodeWriter23 6 hours ago 0 replies      
No judgement here. I suggest you check yourself out to make sure this isn't about fear, cold feet, etc.
fancyfredbot 6 hours ago 1 reply      
Have you signed a contract?
Ask HN: Are dating websites allowed to let search engines index the profiles?
4 points by fuckokcupid  4 hours ago   6 comments top 2
savethefuture 4 hours ago 1 reply      
As soon as you entered your personal information into their site, you gave it to them so they could display it on your profile. You can't stop Google from indexing their site. It's public, period. Don't give away personal information if you don't want it to be public.
WestCoastJustin 4 hours ago 1 reply      
Ask HN: What insightful predictions made in past HN discussions came true?
15 points by randomsearch  11 hours ago   1 comment top
haburka 6 hours ago 0 replies      
If any, they were most likely due to sheer number of responses and theories rather than any wisdom.
Ask HN: Who's running a profitable newsletter?
174 points by cronjobma  1 day ago   85 comments top 29
duck 1 day ago 6 replies      
Hacker Newsletter (http://hackernewsletter.com) makes a few thousand a month from sponsors. It could easily make more if I were more of a salesperson, but I'm happy with it as is. Just to be clear though, it took some time to get to that. I'm actually about to hit the seven year mark and issue #357. Speaking of... I'm working on tomorrow's issue as I type this!
DanLivesHere 1 day ago 3 replies      
I am, at http://nowiknow.com

It's a general trivia email. Every day I send a fun fact and the true story behind it.

Been writing for seven years. Not going to disclose how much I make a month, but I describe it as not quite full time job money but a lot more than beer money.

decryption 1 day ago 3 replies      
I run The Sizzle (https://thesizzle.com.au), which is a daily roundup of tech news with an Aussie slant. Any Australians who enjoy HN should check it out!

I've got 445 subscribers that pay $5/m or $50/yr for it. No ads, no tracking, but I do insert affiliate links - primarily eBay.

Costs me about ~$1000/yr to run (Mailchimp, web hosting for a Wordpress blog & Discourse forum and Zapier costs mainly).

This financial year (July 1st 2016 to June 30th 2017), revenue sits at about AUD$18,000. I expect around ~$30,000 next year if there's 0% paid subscriber growth and affiliate link revenue stays the same.

shortformblog 22 hours ago 2 replies      
I run Tedium (http://tedium.co), a twice-weekly newsletter that covers obscure topics: half the time tech, half the time not. I've been doing it for two and a half years.

I make money from a variety of sources on it, including affiliate links, sponsorships (I recently had some success with http://upstart.me on this front), donations from readers (both Patreon and PayPal, because folks want options), banner ads, and fees for syndicating the content to outlets like Vice's Motherboard.

The pieces are written more like stories than link roundups, giving them an evergreen appeal. This week I wrote about the history of the 911 system; last week I wrote about CGA graphics and Windex. It actually has a smaller profile than my last project, ShortFormBlog, but it's more sustainable from a financial and work-life balance perspective.

Last month, I did a T-shirt sale with the help of a vendor (Vacord Screen Printing, http://vacord.com) and made a few hundred dollars through that.

All of this together is not enough to stop me from working a day job, but the mixture of sources and the fact that I syndicate content helps build exposure and ensures that if one source is weaker than another on a different month, the whole machine doesn't fall apart.

If you want to run a profitable newsletter, be willing to rely on more than one revenue stream.

tittietime 23 hours ago 1 reply      

(though the landing page is SFW, the newsletter is not)

I use the newsletter to market tshirts. I sell through about 100 of each design over the course of 2 months. It pays for hosting and funds the next shirt.

List growth has been slow and steady, and I'm looking to increase my shirt order on the next design.

refrigerator 1 day ago 1 reply      
I don't run these but here's a couple that I know of:

- Stratechery (Ben Thompson) - $100k/month (conservative estimate) via subscriptions (https://www.stratechery.com)

- WTF Just Happened Today (Matt Kiser) - $8k/month via Patreon (https://www.patreon.com/wtfjht)

stanislavb 1 day ago 0 replies      
I'm running and moderating all the newsletters of the LibHunt Network - https://www.libhunt.com. There are 18 "Awesome" newsletters with around 5,000 subscribers altogether. The newsletters are making some money and there is definitely interest in them. The sales part is the most difficult one, though.

My most popular newsletters are about Python (https://python.libhunt.com/newsletter/58), Go (https://go.libhunt.com/newsletter/58) & Ruby (https://ruby.libhunt.com/newsletter/58)

P.S. it took me around 13 months to reach the 5k subs...

asanwal 1 day ago 2 replies      
CB Insights (www.cbinsights.com/newsletter)

We have 289,000 on the newsletter which represents a large group of VC, tech M&A, corporate strategy and startup folks interested in data-driven discussion of technology trends. It's the primary way we sell subscriptions to our SaaS platform.

We messed around with ads in the newsletter but they don't monetize nearly as well as the "house ads" to our data/product or to our events.

It's our company's golden goose.

jkmcf 1 day ago 2 replies      
I don't know if they are profitable, but Peter Cooper has a whole bunch of good ones at https://cooperpress.com/
sampl 1 day ago 2 replies      
According to Indie Hackers, Scotts Cheap Flights (discount airfare newsletter) does $320k/mo

Interview here: https://www.indiehackers.com/businesses/scotts-cheap-flights

davidverhasselt 21 hours ago 1 reply      
HNDigest (https://hndigest.com) is profitable some months, others not so much. Turns out it's quite expensive to send a ton of emails if you don't have sponsors.

I should probably spend more time finding sponsorship.

stevesearer 1 day ago 2 replies      
My site https://officesnapshots.com has a weekly newsletter which is basically an email update of the previous week's new content. It is profitable, though just an extension of the advertising options on the rest of the site.

I have a sign up form at the bottom of my site and use double opt-in to make sure people really want to subscribe. I also periodically trim the list down by removing people who don't open or click anything. I figure 1000 subscribers and a 50% open rate is better than 5000 and a 10% open rate as the list is more highly engaged. Plus on MailChimp you're wasting $$ sending to people who don't engage.
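The engagement math above is worth checking with quick arithmetic: at a 50% open rate, 1,000 subscribers reach the same 500 readers as 5,000 subscribers at 10%, but on per-subscriber pricing you pay five times as much for the larger list. A toy sketch (the flat one-cent-per-subscriber price is hypothetical, not real MailChimp pricing):

```python
def engaged_readers(subscribers, open_rate):
    """Subscribers who actually open a given issue."""
    return round(subscribers * open_rate)

def cost_per_engaged_reader(subscribers, open_rate, cents_per_sub=1):
    """Monthly spend divided by engaged readers (hypothetical flat pricing)."""
    return (subscribers * cents_per_sub) / engaged_readers(subscribers, open_rate)

# 1,000 subscribers at a 50% open rate reach the same 500 readers as
# 5,000 subscribers at 10%, but you pay to mail 4,000 extra dead addresses.
print(engaged_readers(1000, 0.50))          # 500
print(engaged_readers(5000, 0.10))          # 500
print(cost_per_engaged_reader(1000, 0.50))  # 2.0 cents per engaged reader
print(cost_per_engaged_reader(5000, 0.10))  # 10.0 cents per engaged reader
```

So the two lists deliver identical reach; pruning non-openers only changes what you pay for it.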

Mojah 22 hours ago 1 reply      
Hi there!

I write cron.weekly [1], a weekly newsletter on Linux & open source. When I started around 2y ago there wasn't much competition.

I'm at 6k subscribers now and making roughly 1k (eur) a month via sponsorships. Members don't pay, it's entirely sponsor driven.

It took a little over a year of hard work for free before the first sponsor landed, right now it's pretty good value for the time I put in. As the subscribers grow, that ratio should only get better.

[1] https://www.cronweekly.com

leandot 8 hours ago 0 replies      
Hacker News Books http://hackernewsbooks.com is profitable from Amazon referrals, $200-300/month. I am planning to look for some sponsorship but have not had the time. ~1500 tech-savvy readers in the newsletter + several thousand on the web.
justinavery 18 hours ago 0 replies      
RWD Weekly (http://responsivedesignweekly.com) has two advertising spots in each newsletter. The primary spot runs for $450-500 and promoted link runs at $130 a week.

The newsletter fills probably 3/4 of the placements and then I use the other placements to promote conferences that I love. This is usually in exchange for a ticket (which I give away if I can't attend) and media sponsorship which gives the newsletter some extra exposure.

I only introduced advertising after the subscribers reached over 5000 and the Mailchimp costs became a little too much, now it's sitting just over 29k subscribers. It's a great side project that I'd love to invest more time towards but at the moment it makes enough to cover mailchimp, servers, cloudflare, speedcurve and allows me to patron a couple of other newsletters that I love.

ilamont 1 day ago 0 replies      
Timmerman Report (biotech) is doing well, judging by some comments I've seen posted by its founder, a former Xconomy journalist who started the newsletter full time about two years ago. Not sure if it's profitable, though.

In the indie book publishing world, I subscribe to a paid newsletter called The Hot Sheet (http://hotsheetpub.com/). I pay $60/year or thereabouts, which I think is reasonable considering the insights, analysis, and tips I get every 2 weeks. I doubt it's a full-time income for the two writers, but on the other hand I don't think it's a full-time writing gig, either.

It's worth noting that in many of these niches there is very little in the way of established trade pubs as print magazines covering the industry have folded or become shadows of their former selves when they moved online. It doesn't surprise me that some of the more talented or insightful writers have decided to launch their own brands and build their own audiences.

davepoon 22 hours ago 0 replies      
http://uxcurator.com (UX newsletter). Posted my site on Hacker News and Reddit for the first launch; it got a few hundred subscribers and has been gradually growing over the last 2 years. It makes roughly AUD $1k a month via sponsorships.
lathiat 1 day ago 0 replies      
Presumably not profitable at this stage, but inside.com was founded by (I believe...) Jason Calacanis to try out lots of different e-mail newsletters. Currently he's funding it; they have some advertisers, and I think they are experimenting with user-pays?

Another one I know of that is successfully getting a good following is www.thesizzle.com.au

eibrahim 9 hours ago 0 replies      
I run http://frontendweekly.co/ but haven't monetized it yet. It has ~2k subscribers (plus ~3500 on medium at medium.com/front-end-hacking)
milesaus 21 hours ago 0 replies      
I run growth.email (http://growth.email) which is a weekly growth marketing email, listing 10 articles in each issue. I started it at the beginning of this year, have around 1,700 subscribers and make $40 a week from one single ad in each issue.

At this stage, my pitiful MRR covers costs, etc but not all the time I put in. Soon as I get to 10k+, at $25/cpm it starts adding up to $1k a month, which is a little less painful to look at. :)

sippndipp 1 day ago 0 replies      
We're running http://androidweekly.net and http://swiftweekly.com.

It was roundabout 5 years of just editing and making no money. Now it's making money, but just a side income.

1. Content is king!

2. Start to build a community (if you link someone in your newsletter, just ping them on Twitter).

3. Go to community events.

Doing ads on Facebook and Twitter actually didn't work that well.

eli 1 day ago 1 reply      
Newsletter ads are very effective (and therefore can be extremely profitable) if you have an audience that people who control marketing budgets want to reach.

That said, email is just another medium. There's no one way to make money just like there's no one way to make money with an app.

I'm guessing it's not the kind of newsletter you meant, but we make millions selling ads in B2B email newsletters.

nnn1234 1 day ago 0 replies      
Stratechery? Ben's monetisation is the newsletter
GoatOfAplomb 20 hours ago 0 replies      
I worked at SmartBrief (smartbrief.com), a company that creates newsletters on behalf of trade associations. The advertising would often yield impressive rates because of valuable niche audience, e.g. heavy equipment financing executives.
khuknows 22 hours ago 0 replies      
https://uimovement.com - ~16k subs, $500-1000+ a month from sponsors. I hardly ever do sales and spend ~1 hour a week on this - I'm sure it could be more profitable with more effort.
blairanderson 1 day ago 0 replies      
FYI I don't know if the offer is still out but I remember seeing this a little while back https://twitter.com/peterc/status/717393310080507904
foundersgrid 20 hours ago 0 replies      
I run https://FoundersGrid.com and we sell our weekly sponsorships ($500 each) most weeks. I'm currently editing edition #358 right now :)
DanBC 21 hours ago 0 replies      
Off topic, but I'd pay (a small amount) for something like Need to Know.


NtK pioneered some things that got taken up elsewhere: dohgifs highlighted terrible algorithmic placement of online ads next to news stories. Private Eye now does this as Malgorithms.

A newsletter like this would fare better now we have things like Patreon.

fowkswe 23 hours ago 0 replies      
The listings project is not mine, but I've always admired Stefanie and her list:


It lists artist studios, coworking and apartment sublets, mostly in NYC, and probably does about $500k in revenue a year.

Weekly newsletters have been averaging about 300-325 listings @ $30 per week. There are also sponsored emails that I'm sure are in the thousands of dollars per email.

Ask HN: Deep learning algorithms to aggregate technology topics from the web
6 points by larryfreeman  9 hours ago   2 comments top 2
visarga 7 minutes ago 0 replies      
Take a look here as well: https://hackernoon.com/the-unreasonable-ineffectiveness-of-d...

Exactly about news classification with DL.

visarga 9 hours ago 0 replies      
I did something similar for another language. I crawled millions of articles first and built word2vec on the plain text. To compute the embedding of a topic, I summed the vectors of its main keywords; 3-4 well-chosen words are enough. The embedding of an article was obtained by summing (or averaging) the vectors of its words. I skipped the stop words (and also tried tf-idf) to reduce the noise. The final step was to compute the similarity score of an article relative to a topic. This is extremely easy and fast: a dot product between the vectors. Scores over 0.3 (or 0.5) indicate similarity. The main advantage of this method is that it only requires a topic vector, not a whole dataset of training examples. But if you have such a dataset, you can average the most central keywords per topic and get topic vectors.

If you have hundreds of classes and a training dataset with about 500+ examples per class, you can also try fastText, Vowpal Wabbit or even Naive Bayes. If you want to use neural nets, there are some 1D CNNs floating around on GitHub, but they don't work all that well compared to simpler classifiers or simple dot product between vectors. Hundreds of classes usually make classifiers sluggish and accuracy is not so great compared to the binary case (spam/not spam). I wouldn't try to do that to predict the best subreddit for an article for example, because there are too many subreddits, but with vectors it's still OK.
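The recipe in this comment is easy to sketch. Below is a toy illustration with hand-made 3-dimensional "embeddings" standing in for a real word2vec model (in practice you'd load pretrained vectors, e.g. with gensim); the words and vectors are invented for the example:

```python
import math

# Hand-made stand-ins for word2vec vectors; a real model would be trained
# on millions of crawled articles, as the comment describes.
EMB = {
    "neural":   (0.9, 0.1, 0.0),
    "network":  (0.8, 0.2, 0.1),
    "training": (0.7, 0.3, 0.0),
    "recipe":   (0.0, 0.1, 0.9),
    "cooking":  (0.1, 0.0, 0.8),
}
STOP_WORDS = {"the", "a", "of", "and"}

def vec_sum(vectors):
    # Element-wise sum of same-length vectors.
    return tuple(sum(xs) for xs in zip(*vectors))

def topic_vector(keywords):
    """Sum the vectors of 3-4 well-chosen keywords."""
    return vec_sum(EMB[w] for w in keywords)

def article_vector(words):
    """Average the vectors of an article's words, skipping stop words."""
    vecs = [EMB[w] for w in words if w in EMB and w not in STOP_WORDS]
    return tuple(x / len(vecs) for x in vec_sum(vecs))

def norm(v):
    return math.sqrt(sum(x * x for x in v))

def similarity(a, b):
    """Cosine similarity: a normalized dot product."""
    return sum(x * y for x, y in zip(a, b)) / (norm(a) * norm(b))

ml = topic_vector(["neural", "network", "training"])
art = article_vector(["the", "neural", "network", "and", "training"])
print(similarity(ml, art))                                   # close to 1
print(similarity(topic_vector(["recipe", "cooking"]), art))  # much lower
```

As the comment says, the appeal is that classifying a new article against a topic is just one dot product, with no labeled training set required.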

Ask HN: What has happened to YC's residential development research project?
93 points by baybal2  1 day ago   20 comments top 6
RubenSandwich 1 day ago 2 replies      
They are still there, slowly plugging away at projects: https://harc.ycr.org/.

Edit: You haven't heard from them because they are aiming very high so it will take years before any of their work hits the general public.

Edit 2: From my understanding, they are still working on their Universal Basic Income Research as well and have chosen Oakland as the testbed: http://basicincome.org/news/2017/04/httpswww-youtube-comwatc....

Entangled 1 day ago 7 replies      
Ok, here is a cheap shot of a dream. Future cities like mega malls with a thousand shops facing inside and a thousand homes facing outside, one to one. Roads would be marbled floors and cars would be electric scooters with a basket enough to buy groceries around.

For those who like the outdoors, just get your off road vehicle and face the indomitable and untouched nature. No paved roads, no concrete, nothing outside these habitable malls interconnected by hyperloops. Of course there will be supply roads for trucks but they will be just like highways interconnecting mega farms to mega malls.

Nah, scratch that, there is nothing like a house in the suburbs with a huge yard and a barbecue.

simonebrunozzi 1 day ago 1 reply      
Sam Altman spoke about it recently: https://medium.com/the-naked-founder/sam-altman-on-yc-univer...

AFAIK, Ben Huh is still in charge of the project.

raphman 1 day ago 0 replies      
Two weeks ago, Jonathan Edwards [1] announced on Twitter that he left/leaves HARC [2] but didn't elaborate on the reasons.

[1] http://www.subtext-lang.org/AboutMe.htm

[2] https://twitter.com/jonathoda/status/871784998113882118

Kinnard 1 day ago 0 replies      
Did you mean the New Cities Project?
erikj 1 day ago 1 reply      
What do you expect from it?
Ask HN: Best Self-Hosted, Server-Side Image Optimization Tools?
57 points by DivineTraube  1 day ago   31 comments top 20
tyingq 36 minutes ago 0 replies      
https://pngquant.org - lossy optimization of PNG images.
ikennachifo 4 hours ago 0 replies      
With this recent addition (https://webspeedtest.cloudinary.com) to the tools you can use, Cloudinary (http://cloudinary.com) is hands down the best for me. In all the time I've been using it, it has usually had a way of meeting all my needs. I'm obsessed with page speed and optimization (I have sleepless nights optimizing), but since using Cloudinary I can sleep well some nights now. And no, I don't work there, but bless the people that do.
wsxiaoys 1 day ago 0 replies      
I believe the most complete solution is a full Unix environment with shell pipes over the image stream, so I made https://bash.rocks, a web frontend backed by a Unix environment.


1. Resizing with imagemagick: https://bash.rocks/Gxlg31/3

2. Resizing and convert to webp: https://bash.rocks/7J1jgB/1

After creating the snippet, you can either use GET https://bash.rocks/0Be95B (query parameters become environment variables) or POST https://bash.rocks/jJggWJ (the request body becomes stdin).

It's not hard to roll your own backend like this for private use (simply exec from Node). I'm also working on an open source release.

matrix 1 day ago 0 replies      
For Java (or other JVM languages such as Kotlin), TwelveMonkeys is powerful and does not have external dependencies:


tobltobs 1 day ago 2 replies      
In my experience, the optimization isn't the hard part; it's the scaling down you usually have to do first. Doing that with ImageMagick, Pillow, or whatever will result in possible OOMs or gigabyte-sized, not-so-temporary files filling your /tmp dir for large source images.

The only tool I ever found which does this job reliable even for huge images is http://www.vips.ecs.soton.ac.uk .

cbr 1 day ago 0 replies      
I used to work on mod_pagespeed / ngx_pagespeed, and I'm very proud of our image optimization: https://modpagespeed.com/doc/filter-image-optimize

It compresses and optimizes PNG, GIF, and JPEG, creates WebP for browsers that support it, inlines small images into your HTML, long-caches images, and even creates srcsets.

vladdanilov 1 day ago 0 replies      
I'm working on Optimage [1] for both lossless and visually lossless (lossy) optimizations. Right now it's available for Mac only. But I have a Linux version working internally as part of the upcoming Server plan.

[1] http://getoptimage.com

eeeps 1 day ago 1 reply      
Disclaimer: I work for Cloudinary. But all of the services that you mention have an awful lot to offer over roll-your-own solutions. Reliability and scalability, sure, but also, right now, just flat-out better performance and output. From big flashy features like automatic optimization using perceptual metrics and on-the-fly responsive resizing with Client Hints, all the way down to nitty-gritty stuff that doesn't get press releases, like dialed-in custom resizing algorithms. In 2017, hosted, paid services can do a lot more, a lot better, than anything you can set up yourself using free tools.

Images are complicated and important enough that I don't see that changing any time soon.

logicuce 1 day ago 0 replies      
I think Thumbor fits the bill very well. In fact, using something like APSL's thumbor docker image [1], you get the complete setup including the optimizers, object detection, etc. ready to go.

It works really well for UGC as an on-demand optimizer, but you can easily make some URL calls to include it at build time as well.

[1] https://github.com/APSL/docker-thumbor

rawrmaan 1 day ago 1 reply      
Sharp for node.js has proven to be powerful, flexible and fast for my needs: https://github.com/lovell/sharp/
r1ch 1 day ago 0 replies      
I use a combination of jpegoptim, optipng, advpng and zopflipng.

Be especially careful with these utilities when running them on UGC. PNG / JPEG bombs can easily cause OOM or CPU DoS conditions etc.
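One way to wire that combination together is a small dispatcher that picks the tool chain per file type, with a crude size cap as a first line of defense against the bombs mentioned above. The flags and the 20 MB limit below are illustrative choices, not recommendations:

```python
import os

MAX_BYTES = 20_000_000  # arbitrary cap; tune for your UGC

def optimizer_plan(path, size_bytes):
    """Return the optimizer commands to run for a file, or raise.

    The caller supplies size_bytes (e.g. os.path.getsize(path)) so
    oversized user uploads are rejected before any decoder touches them.
    """
    if size_bytes > MAX_BYTES:
        raise ValueError(f"{path}: {size_bytes} bytes exceeds cap")
    ext = os.path.splitext(path)[1].lower()
    if ext in (".jpg", ".jpeg"):
        return [["jpegoptim", "--strip-all", path]]
    if ext == ".png":
        # Run the PNG tools in sequence; each squeezes out a bit more.
        return [
            ["optipng", "-o2", path],
            ["advpng", "-z", "-3", path],
            ["zopflipng", "-y", path, path],
        ]
    raise ValueError(f"{path}: unsupported type {ext!r}")

for cmd in optimizer_plan("upload.png", 120_000):
    print(" ".join(cmd))
```

In production you would feed each command to subprocess.run with a timeout and a memory limit, which addresses the CPU-DoS side of the warning above.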

v3ss0n 1 day ago 1 reply      
pilbox is very powerful : https://pypi.python.org/pypi/pilbox
nielsole 1 day ago 0 replies      
Haven't tried it out, but it looked quite promising: https://github.com/thephpleague/glide/ - does everything basic that I would look for in an image API.
Mojah 1 day ago 0 replies      
I automated my workflow server side with OptiPNG. https://ma.ttias.be/optimize-size-png-images-automatically-w...
NicoJuicy 1 day ago 0 replies      
I use ImageResizer 4.3.2 for ASP.NET MVC (it's free); new versions are less free though... Best thing is, if you want to resize, you can just do it through the URL, e.g. /Assets/img/logo.png?Width=200
silasb 1 day ago 0 replies      
https://github.com/h2non/imaginary appears to support quality/compression settings.
anilshanbhag 1 day ago 0 replies      
optipng and jpegoptim are pretty good. Have a feeling most of these tools use these utilities inside them.
Theodores 1 day ago 0 replies      
Google Pagespeed for Nginx and Apache is another way to go. The benefit of this approach is that you don't have to bulk out your code.

As for metadata, today I decided to add it back in.


For ecommerce it will eventually help to have product data, e.g. brand, product name, etc., embedded in the image.

My other tip: if you go the ImageMagick/PageSpeed route, you can use 4:2:2 colour space and ditch half the bits used for chroma.

fweespeech 1 day ago 0 replies      
> What is the most complete solution you are aware of that compresses & optimizes png, jpeg, and webp and can be operated on a server? It should not only be able to optimize as part of the build process but also in response to user-generated content.

Tbh the UGC side is just triggering the "build process side" as the upload occurs.

As far as best,


I'd suggest you look there for some decent examples of how to go about it. They may be defunct, but I use a similar approach (slightly different knob tweaks with the same binaries) and it works fine. May not be 100% optimal, but it's good enough imo.

Ask HN: What are some interesting use cases for a 1060GTX?
3 points by damassive  11 hours ago   7 comments top 5
cjhanks 3 hours ago 0 replies      
Machine learning libraries practically assume some sort of GP-GPU will be available these days. And most software developers are unwilling to go back to being tethered to a desktop for work.

A lot of developers are using what is essentially a "portable desktop", this card fits in with that ethos.

mars4rp 8 hours ago 0 replies      
Go to http://course.fast.ai/, watch the first 2 videos, run the notebooks on your GPU, and submit to Kaggle! It may get you interested in AI!
jmstfv 6 hours ago 0 replies      
Have you ever considered mining cryptocurrencies?
debacle 10 hours ago 1 reply      
You could make dusty toast.
Ask HN: How was your experience with AWS Lambda in production?
204 points by chetanmelkani  2 days ago   147 comments top 62
callumlocke 2 days ago 6 replies      
I made an image hosting tool on Lambda and S3, for internal corporate use. Staff can upload images to S3 via an SPA. The front end contacts the Lambda service to request a pre-signed S3 upload URL, so the browser can upload directly to S3. It works really well. Observations:

1. Took too long to get something working. The common use case of hooking up a Lambda function to an HTTP endpoint is surprisingly fiddly and manual.

2. Very painful logging/monitoring.

3. The Node.js version of Lambda has a weird and ugly API that feels like it was designed by a committee with little knowledge of Node.js idioms.

4. The Serverless framework produces a huge bundle unless you spend a lot of effort optimising it. It's also very slow to deploy incremental changes. Edit: this is not only due to the large bundle size, but also to having to re-up the whole generated CloudFormation stack for most updates.

5. It was worth it in the end for making a useful little service that will exist forever with ultra-low running costs, but the developer experience could have been miles better, and I wouldn't want to have to work on that codebase again.


Edit: here's the code: https://github.com/Financial-Times/ig-images-backend

To address point 3 above, I wrote a wrapper function (in src/index.js) so I could write each HTTP Lambda endpoint as a straight async function that receives a single argument (the request event) and asynchronously returns the complete HTTP response. This wouldn't be good if you were returning a large response, though; you'd probably be better off streaming it.
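The commenter's wrapper lives in src/index.js, but since Lambda also supports Python, the same idea can be sketched there too: a decorator that turns a plain event-to-result function into the {statusCode, headers, body} shape API Gateway expects. Everything below (names, the error shape) is a hypothetical illustration, not the repo's actual code:

```python
import json
import traceback

def http_endpoint(fn):
    """Wrap a plain handler (event -> JSON-serializable result) into an
    API-Gateway-style response, turning uncaught exceptions into 500s."""
    def handler(event, context=None):
        try:
            result = fn(event)
            return {
                "statusCode": 200,
                "headers": {"Content-Type": "application/json"},
                "body": json.dumps(result),
            }
        except Exception:
            traceback.print_exc()  # lands in CloudWatch logs
            return {"statusCode": 500, "body": json.dumps({"error": "internal"})}
    return handler

@http_endpoint
def hello(event):
    # Endpoint code stays a simple function of the request event.
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {"greeting": "hello " + name}

@http_endpoint
def broken(event):
    raise RuntimeError("boom")  # becomes a 500, not an unhandled crash

print(hello({"queryStringParameters": {"name": "HN"}})["statusCode"])  # 200
print(broken({})["statusCode"])                                        # 500
```

The same caveat applies as in the Node version: buffering the whole response in memory like this is fine for small JSON payloads but wrong for large bodies.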

munns 2 days ago 1 reply      
Hey all, my name is Chris Munns and I am currently the lead Developer Advocate for Serverless at AWS (I am part of the Lambda PM team). We really appreciate this feedback and are always looking for ways to hear about these pain points. You can email me directly at munns@amazon.com if you ever get stuck.

Thanks,- Chris

scrollaway 2 days ago 2 replies      
We use AWS Lambda to process Hearthstone replay files.

My #1 concern with it went away a while back when Amazon finally added support for Python 3 (3.6).

It behaved as advertised: Allowed us to scale without worrying about scaling. After a year of using it however I'm really not a big fan of the technology.

It's opaque. Pulling logs, crashes and metrics out of it is like pulling teeth. There are a lot of bells and whistles that are just missing. And the weirdest thing to me is how people keep using it to create "serverless websites" when that is really not its strength -- its strength is in distributed processing; in other words, long-running CPU-bound apps.

The dev experience is poor. We had to build our own system to deploy our builds to Lambda. Build our own canary/rollback system, etc. With Zappa it's better nowadays although for the longest time it didn't really support non-website-like Lambda apps.

It's expensive. You pay for invocations, you pay for running speed, and all of this is super hard to read on the bill (which function costs me the most and when? Gotta do your own advanced bill graphing for that). And if you want more CPU, you have to also increase memory; so right now our apps are paying for hundreds of MBs of memory we're not using just because it makes sense to pay for the extra CPU. (2x your CPU to 2x your speed is a net-neutral cost, if you're CPU-bound).
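The "net-neutral" claim above follows directly from Lambda's GB-second billing: for a CPU-bound function, CPU share scales with memory, so doubling memory roughly halves duration. A back-of-the-envelope check (the sizes and durations are invented):

```python
# Lambda bills duration x allocated memory (GB-seconds). For a CPU-bound
# function, CPU share scales with memory, so 2x memory ~= 0.5x duration.
def gb_seconds(memory_mb, duration_s):
    return (memory_mb / 1024.0) * duration_s


# Same CPU-bound job at two sizes:
baseline = gb_seconds(memory_mb=512, duration_s=4.0)   # 2.0 GB-seconds
doubled = gb_seconds(memory_mb=1024, duration_s=2.0)   # 2.0 GB-seconds
# Equal cost, but the doubled config finishes twice as fast.
```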

But the kicker in all this is that the entire system is proprietary and it's really hard to reproduce a test environment for it. The LambCI people have done it, but even so, it's a hell of a system to mock and has a pretty strong lock-in.

We're currently moving some S3-bound queue stuff into SQS, and dropping Lambda at the same time could make sense.

I certainly recommend trying Lambda as a tech project, but I would not recommend going out of your way to use it just so you can be "serverless". Consider your use case carefully.

cjhanks 3 hours ago 0 replies      
Developing on Lambda is an absolutely terrible experience. Tightening the integration between CloudFormation, API Gateway, and Lambda would really improve the situation. For example, a built-in way to map requests/responses between API Gateway and Lambda that didn't involve a janky parsing DSL would be pretty nice.

The strategy Lambda seems to suggest you implement for testing/development is pretty laborious. There's no real clear way for you to mock operations on your local system and that's a real bummer.

A lot of things you run into in Python Lambda functions are also fairly unclear. Python packages often compile C extensions... I could never figure out whether there was really a stable ABI or what I could do to pre-compile things for Lambda.

All of those complaints aside - once you deploy your app, it will probably keep running until the day you die. So that's a huge upside. Once you rake through the muck of terrible developer experience (which, I admit, could be unique to me), the service simply works.

So, if you have a relatively trivial application which does not need to be upgraded often and needs very good uptime, it's a very nice service.

chickenbane 2 days ago 1 reply      
I worked on a project where the architect wanted to use Lambdas for the entire solution. This was a bad choice.

Lambdas have a lot of benefits - for occasional tasks they are essentially free, the simple programming model makes them easy to understand in teams, you get Amazon's scaling and there's decent integration with caching and logging.

However, especially since I had to use them for the whole solution, I ran into a ton of limitations. Since they are so simple, you have to pull in a lot of dependencies, which negates a lot of the ease of understanding I mentioned before. The dependencies are things like Amazon's API Gateway, AWS Step Functions, and the AWS CLI itself, which is pretty low-level. So the application logic is pretty easy, but you are dealing with a lot of integration devops. API Gateway is pretty clunky and surprisingly slow. Lambdas shut themselves down, and restarting is slow. The Step Functions have a relatively small payload limit that needs to be worked around. Etc. So use them sparingly!

lanestp 2 days ago 0 replies      
We use Lambda for 100% of our APIs, some of which get over 100,000 calls per day. The system is fantastic for microservices and web apps. One caveat: you must use a framework like Serverless or Zappa. Simply setting up API Gateway right is a hideous task, and giving your function the right access level isn't any fun either. Since the frameworks do all that for you, it really makes life easier.

One thing to note. API Gateway is super picky about your response. When you first get started you may have a Lambda that runs your test just fine but fails on deployment. Make sure you troubleshoot your response rather than diving into your code.
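The "picky about your response" point usually comes down to the Lambda proxy integration envelope: API Gateway wants exactly an integer `statusCode` and a *string* `body`, or it returns a 502 instead of your response. A minimal sketch (the handler contents are invented):

```python
import json


def respond(status, payload):
    # API Gateway's Lambda proxy integration rejects anything that isn't
    # this exact shape: an integer statusCode and a string body.
    return {
        "statusCode": status,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(payload),  # a raw dict here is a common deploy-time failure
    }


def handler(event, context):
    return respond(200, {"ok": True})
```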

I saw some people complaining about using an archaic version of Node. This is no longer true. Lambdas support Node V6 which, while not bang up to date, is an excellent version.

Anyway, I can attest it is production ready and, at least in our usage, an order of magnitude cheaper.

thom_nic 2 days ago 1 reply      
I deployed a couple of AWS Lambda endpoints for very low-volume tasks using claudia.js. Claudia greatly reduces the setup overhead for sane REST endpoints: it creates the correct IAM permissions, gateway API and mappings.

Claudia.js also has an API layer that makes it look very similar to express.js versus the weird API that Amazon provides. I would not use lambda + JS without claudia.

For usage scenarios, one endpoint is used for a "contact us" form on a static website, another we use to transform requests to fetch and store artifacts on S3. I can't speak toward latency or high volume but since I've set them up I've been able to pretty much forget about them and they work as intended.

CSDude 2 days ago 1 reply      
- Monitoring & debugging is a little hard

- CPU power also scales with memory; you might need to increase it to get better response times

- Ability to attach many streams (Kinesis, Dynamo) is very helpful, and it scales easily without explicitly managing servers

- There can be an overhead: your function gets paused (if no data is incoming) or can be killed nondeterministically (even if it runs constantly or hourly), causing a cold start, and cold starts are very bad for Java

- You need to keep your JARs small (50MB limit); you cannot just embed anything you like without careful consideration

dblooman 2 days ago 0 replies      
We have a number of different use cases at FundApps, from the obvious, like automated tasks, automatic DNS, and cleaning up AMIs, to the more focused importing and parsing of data from data sources. This is generally a several-times-a-day operation, so Lambda was the right choice for us. We also use API Gateway with Lambdas; it's a small API, about 2 requests per second on average but very peaky during business hours, and its response times and uptime have been excellent.

Development can be tricky. There are a lot of all-in-one solutions like the Serverless framework; we use the Apex CLI tool for deploying and Terraform for infra. These tools offer a nice workflow for most developers.

Logging is annoying (it's all CloudWatch), but we use a Lambda to send all our CloudWatch logs to Sumo Logic. We use CloudWatch for metrics; however, we have a Grafana dashboard for actually looking at those metrics. For exceptions we use Sentry.

Resources have bitten us the most: suddenly not having enough memory because of the payload from a download. I wish Lambda allowed scaling up on a second attempt so that you could bump its resources; this is something to consider carefully.

Encryption of environment variables is still not a solved issue: if everyone has access to the AWS console, everyone can view your env vars. So if you want to store a DB password somewhere, it will have to be KMS, which is not a bad thing; it's usually pretty quick, but it does add overhead to the execution time.

petarb 2 days ago 0 replies      
It gets the job done but the developer experience around it is awful.

Terrible deploy process, especially if your package is over 50MB (then you need to get S3 involved). Debugging and local testing are a nightmare. CloudWatch Logs aren't that bad (you can easily search for terms).

We have been using Lambdas in production for about a year and a half now, for 5 or so tasks, ranging from indexing items in Elasticsearch to small cron cleanup jobs.

One big gripe around Lambdas and integration with API Gateway is that they totally changed the way it works. It used to be really simple to hook up a Lambda to a public-facing URL so you could trigger it with a REST call. Now you have to do this extra dance of configuring API Gateway per HTTP resource, complicating the Lambda code side of things. Sure, with more customization comes more complexity, but the barrier to entry was significantly increased.

beefsack 2 days ago 1 reply      
I'm running Rust on Lambda at the moment for a PBE board gaming service I run. I can't say it runs at huge scale though, but using Lambda has provided me with some really good architectural benefits:

* Games are developed as command line tools which use JSON for input and output. They're pure so the game state is passed in as part of the request. An example is my implementation of Lost Cities[1]

* Games are automatically bundled up with a NodeJS runner[2] and deployed to Lambda using Travis CI[3]

* I use API Gateway to point to the Lambda function, one endpoint per game, and I version the endpoints if the game data structures ever change.

* I have a central API server[4] which I run on Elastic Beanstalk and RDS. Games are registered inside the database and whenever players make plays, Lambda functions are called to process the play.

I'm also planning to run bots as Lambda functions similar to how games are implemented, but am yet to get it fully operational.

Apart from stumbling a lot setting it up, I'm really happy with how it's all working together. If I ever get more traction it'll be interesting to see how it scales up.

[1]: https://github.com/brdgme/lost-cities

[2]: https://github.com/brdgme/lost-cities/blob/master/.travis.ym...

[3]: https://github.com/brdgme/lambda/blob/master/index.js

[4]: https://github.com/brdgme/api

cameronmaske 2 days ago 2 replies      
I've been using AWS Lambda on a side project (octodocs.com) that is powered by Django and uses Zappa to manage deployments.

I was initially attracted to it as a low-cost tool to run a database (RDS) powered service side project.

Some thoughts:

- Zappa is a great tool. They added async task support [1] which replaced the need for celery or rq. Setting up https with Let's Encrypt takes less than 15 minutes. They added Python 3 support quickly after it was announced. Setting up a test environment is pretty trivial; I set up a separate staging site, which helps to debug a bunch of the orchestration settings. I also built a small CLI [2] to help set environment variables (Heroku-esque) via S3, which works well. Overall, the tooling feels solid. I can't imagine using raw Lambda without a tool like Zappa.

- While Lambda itself is not too expensive, AWS can sneak in some additional costs. For example, allowing Lambda to reach out to other services in the VPC (RDS), or to the Internet, requires a bunch of route tables, subnets and a NAT gateway. For this side project, these currently cost way more than running and invoking Lambda itself.

- Debugging can be a pain. Things like Sentry [3] make it better for runtime issues, but orchestration issues are still very trial and error.

- There can be overhead if your function goes "cold" (i.e. infrequent usage). Zappa lets you keep sites warm (additional cost), but a cold start adds a couple of seconds to the first-page load for that user. This applies more to low volume traffic sites.

Overall: it's definitely overkill for a side project like this, but I could see the economics of scale kicking in for multiple or high-volume apps.

[1]: https://blog.zappa.io/posts/zappa-introduces-seamless-asynch...

[2]: https://github.com/cameronmaske/s3env

[3]: https://getsentry.com/

kehers 2 days ago 4 replies      
I've been using it for heavy background jobs for http://thefeed.press and overall, I think it's pretty OK (I use Node.js). That said, here are a few things:

- No straight way to prevent retries. (Retries can crazily increase your bill if something goes wrong)

- API gateway to Lambda can be better. (For one, Multipart form-data support for API gateway is a mess)

- (For Node.js) I don't see why the node_modules folder should be uploaded. (Google Cloud Functions installs the modules from package.json.)

alexcasalboni 2 days ago 0 replies      
I'd recommend using a framework such as the Serverless Framework[1], Chalice[2], Dawson[3], or Zappa[4]. As with any other (web) development project, using a framework will alleviate a big part of the pain involved with a new technology.

That said, I'd recommend first learning the tools without a framework. You can find two coding sessions I published on YouTube[5][6].

[1]: https://serverless.com/

[2]: https://github.com/awslabs/chalice

[3]: https://dawson.sh/

[4]: https://github.com/Miserlou/Zappa

[5]: https://www.youtube.com/watch?v=NhGEik26324

[6]: https://www.youtube.com/watch?v=NlZjTn9SaWg

tracker1 2 days ago 1 reply      
Best to keep your workloads as small as possible; cold starts can be very bad, depending on the type of project. I've been using mostly Node myself, and it's worked out well.

One thing to be careful of: if you're targeting input into DynamoDB table(s), it's really easy to flood your writes. Same goes for SQS writes. You might be better off with a data pipeline and slower progress. It really just depends on your use case and needs. You may also want to look at running tasks on ECS; depending on your needs that may go better.

For some jobs the 5-minute limit is the bottleneck, for others it's the 1.5GB memory cap. It just depends on exactly what you're trying to do. If your jobs fit in Lambda's constraints, and the cold start time isn't too bad for your needs, go for it.

dcosson 2 days ago 1 reply      

- works as advertised, we haven't had any reliability issues with it

- responding to Cloudwatch Events including cron-like schedules and other resource lifecycle hooks in your AWS account (and also DynamoDB/Kinesis streams, though I haven't used these) is awesome.


- 5-minute timeout. There have been a couple of times when I thought this would be fine, but then I hit it and it was a huge pain. If the task is interruptible you can have the Lambda function re-trigger itself, which I've done and which actually works pretty well once you set up the right IAM policy, but it's extra complexity you really don't want to have to worry about in every script.

- The logging permissions are annoying; it's easy for it to silently fail logging to CloudWatch Logs if you haven't set up the IAM permissions right. I like that it follows the usual IAM framework, but AWS should really expose these errors somewhere.

- haven't found a good development/release flow for it. There's no built-in way to re-use helper scripts or anything. There are a bunch of serverless app frameworks, but they don't feel like they quite fit because I don't have an "app" in Lambda I just have a bunch of miscellaneous triggers and glue tasks that mostly don't have any relation to each other. It's very possible I should be using one of them anyway and it would change how I feel about this point.
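The re-trigger pattern mentioned above, for interruptible tasks that outlive the 5-minute cap, can be sketched roughly like this. The payload shape, safety margin, and `process` stub are all invented; the re-invoke needs an IAM policy allowing `lambda:InvokeFunction` on the function itself:

```python
import json

SAFETY_MARGIN_MS = 30_000  # leave enough budget for the re-invoke itself


def out_of_time(context):
    # The Lambda context object reports how much of the time budget is left.
    return context.get_remaining_time_in_millis() < SAFETY_MARGIN_MS


def handler(event, context):
    items = list(event.get("remaining", []))
    while items and not out_of_time(context):
        process(items.pop(0))
    if items:
        # Hand the leftover work to a fresh invocation of ourselves.
        import boto3
        boto3.client("lambda").invoke(
            FunctionName=context.function_name,
            InvocationType="Event",  # async fire-and-forget
            Payload=json.dumps({"remaining": items}),
        )


def process(item):
    # Placeholder for the actual interruptible unit of work.
    pass
```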

We use Terraform for most AWS resources, but it's particularly bad for Lambda because there's a compile step of creating a zip archive that Terraform doesn't have a great way to do in-band.

Overall Lambda is great as a super-simple shim if you only need to do one simple, predictable thing in response to an event. For example, the kind of things that AWS really could add as a small feature but hasn't, like sending an SNS notification to a Slack channel, or tagging an EC2 instance with certain parameters when it launches into an autoscaling group.

For many kinds of background processing tasks in your app, or moderately complex glue scripts, it will be the wrong tool for the job.

Techbrunch 2 days ago 0 replies      
You might want to have a look at Serverless, a framework to build web, mobile and IoT applications with serverless architectures using AWS Lambda, and even Azure Functions, Google Cloud Functions & more. Debugging, maintaining & deploying multiple functions gets easier.

Serverless: https://github.com/serverless/serverless

mgkimsal 2 days ago 2 replies      
I'm reading here a lot of people jumping through massive amounts of hoops to deal with a system that locks you in to a single vendor, and makes it hard to read logs or even your own bill.

A few years back, the mantra was "hardware is cheap, developer time isn't". When did this prevailing wisdom change? Why would people spend hours/days/weeks wrestling with a system to save money, when it may take weeks, months or even years to see an ROI?

falcolas 2 days ago 1 reply      
Most of my experience mirrors that found in other comments, so here's a few unique quirks I've personally had to work around:

- You can't trigger Lambda off SQS. The best you can do is set up a scheduled Lambda and check the queue when it's kicked off.

- Only one Lambda invocation can occur per Kinesis shard. This makes efficiency and performance of that lambda function very important.

- The triggering of Lambda off Kinesis can sometimes lag behind the actual kinesis pipeline. This is just something that happens, and the best you can do is contact Amazon.

- Python - if you use a package that is namespaced, you'll need to do some magic with the 'site' module to get that package imported.

- Short execution timeouts mean you have to go to some ridiculous lengths to process long-running tasks. Step Functions are a hack, not a feature, IMO.

- It's already been said, but the API Gateway is shit. Worth repeating.
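The scheduled-poll workaround from the first bullet looks roughly like this; the queue URL is a placeholder and `on_message` stands in for whatever work the queue feeds:

```python
import json

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/example-queue"  # placeholder


def drain(sqs, queue_url, on_message):
    """Receive, process, and delete messages until the queue is empty."""
    while True:
        resp = sqs.receive_message(
            QueueUrl=queue_url, MaxNumberOfMessages=10, WaitTimeSeconds=5
        )
        messages = resp.get("Messages", [])
        if not messages:
            return
        for msg in messages:
            on_message(json.loads(msg["Body"]))
            # Only delete after successful processing, or the message reappears.
            sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])


def handler(event, context):
    # Wired to a CloudWatch Events schedule, e.g. rate(1 minute).
    import boto3
    drain(boto3.client("sqs"), QUEUE_URL, on_message=print)
```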

Long story short, my own personal preference is to simply set up a number of processes running in a group of containers (ECS tasks/services, as one example). You get more control and visibility, at the cost of managing your own VMs and the setup complexity associated with that.

cntlzw 2 days ago 0 replies      
Pretty good actually. We started using AWS Lambda as a tool for a cron job.

Then we implemented a RESTful API with API Gateway and Lambda. The Lambdas are straightforward to implement. API Gateway unfortunately does not have a great user experience; it feels very clunky to use, and some things are hard to find and understand. (Hint: request body passthrough and transformations.)

Some pitfalls we encountered:

With Java you need to consider the warmup time and memory needed for the JVM. Don't allocate less than 512MB.

Latency can be hard to predict. A cold start can take seconds, but if you call your Lambda often enough ("often" looks like minutes) things run smoothly.

Failure handling is not convenient. For example, if your Lambda is triggered from a scheduled event and fails for some reason, it gets triggered again and again, up to three times.

So at the moment we have around 30 Lambdas doing their job. Would say it is an 8/10 experience.

xer 2 days ago 0 replies      
At Annsec we are all in on serverless infrastructure and use Lambdas and Step Functions in two development teams on a single backlog. The extensibility of a well-written Lambda is phenomenal. For instance, we have higher-abstraction Lambdas for moving data. We make them handle several input events and keep them as pure as possible. Composing these Lambdas later in Step Functions is true developer joy. We unit test them locally, and for E2E tests we have a full clone of our environment. In total we build and manage around 40 Lambdas and 10 Step Functions. Monitoring for failure is conducted using CloudWatch alarms, OpsGenie and Slack bots. Never been an issue. In our setup we are aiming for an infrastructure that is immutable and cryptographically verifiable. It turned out to be a bit of a challenge. :)
rowell 1 day ago 0 replies      
I've used it in production and we're building our platform entirely in Serverless/AWS Lambda.

Here are my recommendations:

1) Use Serverless Framework to manage Functions, API-Gateway config, and other AWS Resources

2) CloudWatch Logs are terrible. Auto-stream CloudWatch Logs to the Elasticsearch Service and use Kibana for log management

3) If using Java or other JVM languages, cold starts can be an issue. Implement a health check that is triggered on schedule to keep functions used in real-time APIs warm
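The warm-up trick in point 3 works by having the scheduled ping identify itself so the real logic is skipped. These functions are JVM-based, but the shape is the same in any runtime; shown here as a Python sketch with invented names:

```python
def is_warmup_ping(event):
    # CloudWatch scheduled events carry "source": "aws.events"; real
    # API Gateway requests don't, so we can return before doing any work.
    return event.get("source") == "aws.events"


def handler(event, context):
    if is_warmup_ping(event):
        return {"warmed": True}  # the container (and JVM) stays hot
    return do_real_work(event)


def do_real_work(event):
    # Placeholder for the actual endpoint logic.
    return {"statusCode": 200, "body": "hello"}
```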

Here's a sample build project I use: https://github.com/bytekast/serverless-demo

For more information, tips & tricks: https://www.rowellbelen.com/microservices-with-aws-lambda-an...

gaius 1 day ago 0 replies      
I've only played with it as opposed to deploying to prod, but give Azure Functions a try too: https://azure.microsoft.com/en-us/services/functions/
Jdam 2 days ago 3 replies      
For running Java in Lambda, I had to optimize specifically for it. To decrease processing time (and, in the end, the bill), I got rid of all reflection, for example, and thought twice about when to initialize what and what to make static. The Java cold start is also an issue. I fixed this by creating a CloudWatch trigger that executes the Lambda function every minute to keep it hot; otherwise, after some minutes of no one calling the function, it takes 10+ seconds to respond. If you use Python, for example, you don't run into this issue. I built complete backends on top of Lambda/API Gateway/DynamoDB, and having "NoOps" that also runs very cheap is a killer argument for me.
mfrye0 2 days ago 0 replies      
I started off doing manual build/deploy for a project and it was a total pain in the ass: from packaging the code, to versioning, rollbacks, and deployment. And that doesn't even include setting up API Gateway if you want an endpoint for the function.

Since then I've been using Serverless for all my projects and it's the best thing I've tried thus far. It's not perfect, but now I'm able to abstract everything away, since you configure pretty much everything from a .yml file.

With that said, there are still some rough spots with Lambda:

1) Working with env vars. The default is to store them in plain text in the Lambda config. Fine for basic stuff, but I didn't want that for DB creds. You can store them encrypted, but then you have to set up logic to decrypt in the function. Kind of a pain.

2) Working within a subnet to access private resources incurs an extra delay. There is already a cold start time for Lambda functions, but accessing the subnet adds more time... Apparently AWS is aware and is exploring a fix.

3) Monitoring could be better. Cloudwatch is not the most user friendly tool for trying to find something specific.

With that said, as a whole Lambda is pretty awesome. We don't have to worry about setting up ec2 instances, load balancing, auto scaling, etc for a new api. We can just focus on the logic and we're able to roll out new stuff so much faster. Then our costs are pretty much nothing.

eknkc 2 days ago 1 reply      
We use Node.js Lambda functions for real-time image thumbnail generation and scraping needs, as well as mirroring our S3 buckets to another blob storage provider and a couple of periodic background jobs. It works beautifully. It's a little hard to debug at first, but once it's set up, both pricing and reliability are really good for our use cases.

I think a lot of people try to use the "serverless" stuff for unsuitable workloads and get frustrated. We are running a kubernetes cluster for the main stuff but have been looking for areas suitable for lambda and try to move those.

viw 2 days ago 0 replies      
I can only talk about the Node.js runtime with native add-ons. We use it for various automation tasks with fewer than 100 invocations a day, where it is the most convenient solution out there, for peanuts. We also use it for parsing Swagger/API Blueprint files; there we're talking 200k+ invocations a day, and it works great once we figured out logging/monitoring/error handling and the limited output size (6MB). We don't use any framework because most are not flexible enough, except Apex (http://apex.run/), which serves us well. We've hit some limits a couple of times, but since it's one invocation per request, only some calls failed and the whole service was unaffected; I see that isolation as a big benefit. One thing that sucks is that if it fails (and it's not your code), you often have no idea why or whether anything can be done. We use it together with AWS API Gateway, and the Gateway part is subpar: it doesn't support correct HTTP (a 204 always returns a body, for instance), and god forbid you want something other than application/json. To sum up, Lambda is great with some minor warts, and API Gateway is OK but I can easily imagine it being much better.
davidvanleeuwen 2 days ago 1 reply      
After multiple side projects with Lambda (e.g. image processing services), we finally implemented it on a larger scale. Initially we started out without any framework or tool to help, because they were pretty much non-existent at that time. We created our own tool, and used Swagger a lot for working with API Gateway (because it is really bad to work with). Over time everything smoothened out and really worked nicely (except for API Gateway, though). Nowadays we have everything in Terraform and Serverless templates, which really makes your life easier if you're going to build your complete infrastructure on top of AWS Lambda and other AWS APIs. There are still a bunch of quirks you have to work with, but at the end of the line: it works, and you don't have to worry much about scaling.

I'm not allowed to give you any numbers; here's an old blog post about Sketch Cloud: https://awkward.co/blog/building-sketch-cloud-without-server... (however, this isn't accurate anymore). For this use case, concurrent executions for image uploads are a big deal (a regular Sketch document can easily consist of 100 images). But basically the complete API runs on Lambda.

Running other languages on Lambda can easily be done and can be pretty fast, because you simply use Node to spawn a process (Serverless has lots of examples of that).

Let me know if you have any specific questions :-)

Hope this helps.

meekins 1 day ago 0 replies      
We're doing both stream processing and small query APIs using Lambda.

A few pointers (from relatively short experience):

- The best use case for Lambda seems to be stream processing, where latency due to startup times is not an issue

- For user/application-facing logic the major issues seem to be startup times (esp. JVM startup times when doing Java, or when your API gets called very rarely) and API Gateway configuration management using infrastructure-as-code tools (I'd be interested in good hints about this, especially concerning interface changes)

- The programming model is very simple and nice, but it seems to make the most sense to split each API over multiple Lambdas to keep them as small as possible, or use some serverless framework to make managing the whole app easier

- This goes without saying, but be sure to use CI and do not deploy local builds (native binary deps)

lovehashbrowns 2 days ago 0 replies      
I've only been using it for one project right now. I made an API that I can use to push security-related events to a location that a hacker couldn't access, even if they get root on a local system. I use it in conjunction with sec (Simple Event Correlator). If sec detects something, e.g. a user login, or a package install, it'll send the event to the API in AWS Gateway + Lambda. The event then gets stored in a DynamoDB table, and I use a dashing.io dashboard to display the information. It works super well. I still need to convert my awful NodeJS code to Python, but that shouldn't take long.

I do remember logging being a confusing mess when I was trying to get this started. I feel better about the trouble I had now that I see it wasn't just me. But for a side project that's very simple to use, Lambdas have been a blessing. I get this functionality without having to manage any servers or create my own API with something like Python+Flask. Having IAM and authentication built in for me made the pain from the initial set-up so worth it.

zurn 1 day ago 1 reply      
For a serverless system that uses Lambda together with e.g. CloudFormation, DynamoDB, S3, Cognito, etc., it's pretty low-level and you spend a lot of time understanding, refining & debugging basic things. The end-to-end logging and instrumentation across the services used by your app weren't great.

It doesn't like big app binaries/JARs, and Amazon's API client libs are bloated - Clojure + Amazonica easily goes over the limit if you don't manually exclude some of Amazon's API SDKs from the package.

On the plus side, you can test all the APIs from your dev box using the cli or boto3 before doing it from the lambda.

Would probably look into third party things like Serverless next time.

marcfowler 2 days ago 1 reply      
We use it with node for a bunch of things like PDF generation, asynchronous calls to various HTTP services etc. I think it's excellent.

The worst part about it by far is CloudWatch, which is truly useless.

Check out https://github.com/motdotla/node-lambda for running it locally for testing btw - saved us hours!

ransom1538 2 days ago 0 replies      
I used it for converting images to BPG format and doing resizing. I really enjoyed it. Basically, with Docker/Lambda these days I feel like the future will be 'having code' and then 'running it' (no more ssh, puppet, kuberdummies, bash, vpc, drama). Once Lambda runs a Dockerfile it might take over Middle-earth. These were my issues with Lambda:

1. Installing your own Linux modifications isn't trivial (we had to install the BPG encoder). They use a strange version of the Linux AMI.

2. Lambda can listen to events from S3 (creation, deletion, ...) but can't seem to listen to SQS events. WTF? It seems like Amazon could fix this really easily.

3. Deployment is wonky. To add a new Lambda zip file you need to delete the current one. This can take up to 40 seconds (during which you have total downtime).

maephisto 2 days ago 1 reply      
A couple of months ago, I started using AWS Lambda for a side project. The actual functions were pretty easy to code using `nodejs` and deploy with `serverless`, but the boilerplate for exposing them via an HTTP API was the real bummer: IAMs, routing and all kinds of other little things standing in the way of actual productive work. Some time after that I tried to set up GCloud Functions, and to my surprise the boilerplate was minimal! Write your function and have it accessible with just a couple of commands. IMHO GCloud Functions is way more developer-friendly than AWS Lambda.
shakna 2 days ago 0 replies      
- Cheap, especially for low usage.

- Runs fast, unless your function was frozen due to insufficient usage or the like

- Easy to deploy and/or "misuse"

- Debugging doesn't really work

All in all, probably the least painful thing I've used on AWS. But that doesn't necessarily mean much.

aeikenberry 2 days ago 1 reply      
We run many microservices on Lambda and it has been a pleasant experience for us. We use Terraform for creating functions and managing environment variables, permissions, log groups, etc. We use CodeShip for testing, and for validating and applying Terraform across multiple accounts and environments.

For logging, we pipe all of our logs out of CloudWatch to LogEntries with a custom Lambda, although looking at CloudWatch logs works fine most of the time.

alexbilbie 2 days ago 0 replies      
If you need to store environment variables easily and securely, take a look at EC2 Parameter Store - you can fetch the relevant parameters on startup, and they are automatically encrypted and decrypted using KMS for you.
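A minimal sketch of that fetch-on-startup idea, assuming boto3's SSM `get_parameter` call with `WithDecryption=True` (which handles the KMS decryption). The client is injected so the sketch runs without AWS credentials; in a real function you would pass `boto3.client("ssm")`.

```python
# Sketch: fetch decrypted parameters once and cache them for the life of
# the container, so warm invocations don't re-call SSM.

_cache = {}


def get_param(name, ssm_client):
    """Return a (KMS-decrypted) parameter value, caching per container."""
    if name not in _cache:
        resp = ssm_client.get_parameter(Name=name, WithDecryption=True)
        _cache[name] = resp["Parameter"]["Value"]
    return _cache[name]
```

Caching matters here because Lambda containers are reused between invocations, so the SSM round-trip only happens on cold start.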
ajeet_dhaliwal 2 days ago 0 replies      
It's been great to use as 'glue' for small tasks - clean-ups in our case - or other short-lived minor tasks. I haven't used it for anything major though, only for minor tasks that are easier or more convenient to do with Lambda than another way. The real value comes from the integration with other AWS services; for example, for developers using DynamoDB, Lambda stream events make maintenance of records far easier.
rlv-dan 2 days ago 0 replies      
A session I remember that might be of interest:

Building reactive systems with AWS Lambda: https://vimeo.com/189519556

djhworld 2 days ago 0 replies      
Have used it in production for > 2 years, mainly for ETL/Data processing type jobs which seems to work well.

We also use it to perform scheduled tasks (e.g. every hour) which is good as it means you don't have to have an EC2 instance just to run cron like jobs.

The main downside is Cloudwatch Logs, if you have a Lambda that runs very frequently (i.e. 100,000+ invocations a day) the logs become painful to search through, you have to end up exporting them to S3 or ElasticSearch.

eloycoto 2 days ago 0 replies      
I've been using it for a year and a half, and I'm more than happy. The cost goes up when you have a lot of load, but I'm a happy user for these small applications that need to be always up.

I should say that you should use gordon <https://github.com/jorgebastida/gordon> to manage it; Gordon makes the process easier.


StreamBright 2 days ago 0 replies      
We have moved our database maintenance cron jobs to Lambda, as well as the image resize functionality. The general experience is very positive after we figured out how to use Lambda from Clojure and Java. For people worried about JVM startup times: Lambda will keep your JVM up and running for ~7 minutes after the initial request, so you can achieve low latency easily.
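The warm-container behaviour described above applies to any runtime: anything initialised outside the handler survives between invocations while the container stays alive, so only the first (cold) request pays the startup cost. A minimal sketch of the pattern (Python for brevity; the same idea is why the JVM stays warm):

```python
# Sketch of exploiting container reuse: module-level init runs once per
# container, and warm invocations reuse the result.

import time

INIT_COUNT = {"n": 0}


def _expensive_init():
    """Stand-in for JVM startup, DB connections, client construction, etc."""
    INIT_COUNT["n"] += 1
    return {"cold_started_at": time.time()}

STATE = _expensive_init()  # executed once, at cold start


def handler(event, context):
    # Warm invocations reuse STATE instead of re-initialising.
    return {"init_count": INIT_COUNT["n"],
            "cold_started_at": STATE["cold_started_at"]}
```

Calling the handler repeatedly in the same container leaves `init_count` at 1, which is the low-latency behaviour described above.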
dlanger 2 days ago 1 reply      
Hey everyone, I'm Daniel Langer and I help build lambda monitoring products over at Datadog. I see lots of you are unhappy with the current monitoring solutions available to you. If anyone has thoughts on what they'd like in a Lambda monitoring service feel free to email me at daniel.langer@datadoghq.com
rmccue 2 days ago 0 replies      
Pretty great, we're using it for resizing and serving images for our clients (large media companies, banks, etc): https://hmn.md/2017/04/27/scaling-wordpress-images-tachyon/

API Gateway is a little rougher, but slowly getting there.

jakozaur 2 days ago 0 replies      
So far I've used it only for toy/infrequent use cases, and it works well there - e.g. Slack commands, integration between different systems, and cron-style jobs.
Dawny33 2 days ago 1 reply      
- We use Lambda along with Ansible to execute huge, distributed ML workloads which are (completely) serverless. Saves a lot of bucks, as ML needs huge boxes.

- For serverless APIs for querying the S3 which is a result of the above workload

Difficulties faced with Lambda(till now):

1. No way to do CD for Lambda functions. [Not yet using SAM]

2. Lambda launches in its own VPC. Is there a way to make AWS launch my lambda in my own VPC? [Not sure.]

jonathanbull 2 days ago 0 replies      
We use Lambda extensively at https://emailoctopus.com. The develop-debug cycle takes a while, but once you're up and running, the stability is hard to beat. Just wish they'd raise that 5 minute execution limit so we can migrate a few more scripts.
adrianpike 2 days ago 0 replies      
It's good. We're using it for a ton of automation of various developer tasks that normally would get run once in a while (think acceptance environment spinup, staging database load, etc.).

It fails once in a while and the experience is bad, but that's mostly due to our tooling around failure states instead of the platform itself.

tommy5dollar 2 days ago 0 replies      
Been using it for about 6 months with Serverless for Node API endpoints and it's great so far!

The only negatives are:

- cold start is slow, especially from within a VPC

- debugging/logging can be a pain

- giving a function more memory (~1GB) always seems to be better (I'm guessing because of the extra CPU)

erikcw 2 days ago 1 reply      
Lots of great comments here. I'd like to add that being limited to 512mb of working disk space at /tmp has been a stumbling block for us.

Would be really great to have this configurable along with CPU/memory.

Additionally, being able to mount an EFS volume would be very useful!

ahmednasir91 2 days ago 0 replies      
- When used with API Gateway, the API response time is more than 2-3 seconds for a NodeJS Lambda; for Java it will be more.

- Good use cases, for example:

-- cron - can be triggered using CloudWatch events.

-- Slack command bot (API Gateway + Lambda) - the only problem is the timeout.
betimd 2 days ago 0 replies      
I'm running around 4 million Lambda executions per month, mostly for data processing, and overall I'm happy with the performance and ease of deployment. Debugging is hard, and frameworks are still not very mature. I use the AWS SDK and C# and I'm having a quite good experience.
akhatri_aus 2 days ago 3 replies      
- There is a surprisingly high amount of API gateway latency

- The CPU power available seems to be really weak. Simple loops running in NodeJS run way slower on Lambda than on a 1.1 GHz MacBook, by a significant margin. This is despite scaling the memory up to near 512 MB.

- Certain elements, such as DNS lookups, take a very long time.

- The CloudWatch logging is a bit frustrating. If you have a cron job, it will sometimes lump several time periods into a single log file and other times keep them separate. If you run a lot of them it's hard to manage.

- It's impossible to terminate a running script.

- The 5 minute timeout is 'hard'; if you process cron jobs, there isn't flexibility for, say, 6 minutes. It feels like 5 minutes is arbitrarily short. For comparison, Google Cloud Functions lets you run for 9 minutes, which is more flexible.

- The environment variable encryption/decryption is a bit clunky, they don't manage it for you, you have to actually decrypt it yourself.

- There is a 'cold' start where once in a while your Lambda functions will take a significant amount of time to start up, about 2 seconds or so, which ends up being passed to a user.

- Versions of the environment are updated very slowly. Only last month (May) did AWS add support for Node v6.10, after having a very buggy version of Node v4 (a lot of TLS bugs were in the implementation)

- There is a version of Node that can run on AWS CloudFront as a CDN tool. I have been waiting quite literally 3 weeks for AWS to get back to me on enabling it for my account. They have kept me up to date and passed it on to the relevant team, and so forth. It just seems an overly long time to get access to something advertised as working.

- If you don't pass an error result in the callback, the function will run multiple times; it won't just display the error in the logs. But there is no clarity on how many times or when it will re-run.

- There aren't ways to run Lambda functions such that it's easy to manage parallel tasks, i.e. to see whether two Lambda functions are doing the same thing when they execute at the exact same time.

- You can create cron jobs using an AWS CloudWatch rule, which is a bit of an odd implementation: CloudWatch can create timing triggers to run Lambda functions despite being a logging tool. Overall there are many ways to trigger a Lambda function, which is quite appealing.

The big issue is speed & latency. Basically it feels like Amazon is falling right into what they're incentivised to do - make it slower (since it's charged per 100 ms).

PS: If anyone has a good model/provider for 'serverless SQL databases', kindly let me know. The RDS design is quite pricey, since you have constantly running DBs (at least in terms of the way you pay for them).
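On the re-run point in the list above, a common mitigation is to make the handler idempotent by de-duplicating on a stable event id, so an automatic retry becomes a no-op. A minimal sketch (Python for brevity; the `id` field and the in-memory set are illustrative assumptions - a real version would persist seen ids with something like DynamoDB conditional writes, since in-memory state only survives per warm container):

```python
# Sketch: tolerate Lambda's retry-on-error behaviour by refusing to
# repeat side effects for an event id we've already processed.

_seen_ids = set()


def handler(event, context):
    event_id = event["id"]  # assumes the event carries a stable identifier
    if event_id in _seen_ids:
        return {"status": "duplicate", "id": event_id}
    _seen_ids.add(event_id)
    # ... do the actual (side-effecting) work exactly once ...
    return {"status": "processed", "id": event_id}
```

Since you can't control how many times or when a failed invocation re-runs, making the side effects safe to repeat is usually more robust than trying to suppress the retries.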

forgottenacc57 2 days ago 0 replies      
In the end it's only the web server that is serverless; you still need other servers depending on your use case, and hey, web servers aren't that hard to run anyway.
unkoman 2 days ago 1 reply      
- Decouple lambdas with queues and events, SQS, SNS and S3 events are your friends here

- Use environment variables

- Use Step Functions to create state machines

- Deploy using CloudFormation templates and the Serverless framework

synthc 2 days ago 0 replies      
We had a bad experience: we accidentally made an error in one function that got called a lot, which blocked other functions from running. Yay for isolation!
lunch 2 days ago 2 replies      
Can anyone speak to continuous deployments with Lambda, where downtime is not an option? Is it possible to run blue green deployments?
obrit 2 days ago 0 replies      
I'd like to use Lambda@Edge to add headers to my CloudFront responses. Does anybody have any idea when this might be released from preview?
david90 2 days ago 1 reply      
I tried using Lambda, but needed to set up the API Gateway before using it as well. Painful logging and parameter forwarding.
forgottenacc57 2 days ago 0 replies      
Lots of people are saying the API gateway is hard.

You don't need to use the API gateway.

Just talk direct to Lambda.

caseymarquis 2 days ago 0 replies      
Any comparisons to GCE and Azure's offerings for those who have used both?
goldenkey 2 days ago 1 reply      
Works terribly. It's basically a thin wrapper around a shoddy jar framework. All the languages supported are basically shit-farmed from the original Java one. The Java one is the only one that works half decently.
Ask HN: Any of you took a loss while trading? How did you handle it?
9 points by rootsudo  1 day ago   12 comments top 8
bsvalley 7 hours ago 0 replies      
#1 - Money/risk management. You need a stop loss and a price target. Before you take a position in the market, know exactly how much loss you're willing to take. Know exactly how much gain you're targeting. When you reach either of these two numbers, exit. Always stick to this strategy and you'll end up positive. Yes, you won't win the lottery, but this is how the pros do it. For a given position, on average people go for +5% gain and -3% loss. In a 50/50 scenario you're always positive over the long term.
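The +5%/-3% figures in rule #1 work out to a positive expected value on a 50/50 outcome, which is easy to check:

```python
# Expected fractional return per trade for a fixed target/stop strategy,
# using the figures above: +5% target, -3% stop, 50/50 odds.

def expectancy(win_rate, gain, loss):
    """Average return per trade: p*gain - (1-p)*loss."""
    return win_rate * gain - (1 - win_rate) * loss

ev = expectancy(0.5, 0.05, 0.03)  # 0.5*0.05 - 0.5*0.03 = +1% per trade
```

Over many trades, that +1% average per trade is what keeps the strategy positive even though each individual trade is a coin flip.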

#2 - Gain maximizer. Sell only half of your position on a winning trade when you reach your target point. You're covered in case it goes down (you still need a stop loss), and if it goes up it's all good :) Make sure to set up a second price target to exit. Always stick to rule #1. If you hang on to your stocks with no target price - like your average Joe - you'll go negative.

Welcome to the market.

thiagooffm 19 hours ago 2 replies      
I suggest you don't trade and instead focus yourself on growing your assets by working and doing value investing.

Trading is a zero-sum game. It's the same as playing in a casino. The only ones who make money consistently with trading are:

. your broker

. people holding illegal privileged info not yet released in the media

. big funds, tricking people like you

In the long term, the value of stocks follows the profits that companies make. So learn how to do value investing, slowly and calmly, never really selling the companies you commit yourself to, while focusing on your work (which usually pays way more than what you can earn in the stock market).

I think generally people are addicted to losing, to having the thrill of winning/losing something. But actually, if you want to get rich and consistently become wealthier, there's no thrill and it's a quite boring road. It's up to you which one you take. The internet is full of forums with crazy people who pretend to be traders and winners but are actually addicted to the thrill and to losing. Just check how crazy people are about failed businesses and penny stocks. I wouldn't bet my money, much less something more important - the peace of my mind - on this crap.

I want you to answer me a question: how anxious are you? I think a lot. You even came to HN to talk about your loss. I think you are one of the people who will never win on this game, but can actually win in life. You just need to learn how to control it.

I've been managing to grow my assets slowly for years and have been in the market long enough. If you want to know more, shoot me an e-mail. I won't tell you what stock to buy, but I'll guide you so you can make your own choices. I could already live a few years without working, but I'm so satisfied with my life that I don't have to. I travel every year and so on... and I don't even make that crazy software dev salary some people in the US can.

Also... practice Sports. The more you live, the more money you can make.

I used to be very sick over trading and so on. Now I'm healthier, wealthier, less anxious and everything.

nicholas73 5 hours ago 0 replies      
I don't handle it, as it's always hard, but it's easier to take a loss if you go into a trade knowing what could invalidate your thesis. This is either from further research or the price action. Once you can admit the jig is up, and you are convinced the price can fall further, then it's easier to cut the loss. Then, it feels like you've made the right decision and you can move on. A kind of moral victory.
tedmiston 10 hours ago 0 replies      
I only think about the percentage loss (and invest money I can afford to lose), and then buy and hold more often than not.

From my "fun" portfolio personally, which is mostly tech stocks, there are many small losers but a few massive winners that outweigh the losses. When I was trading on a daily / weekly basis I never saw this, but after holding for quarters it's become my new approach. Instead of checking daily, now I check the numbers every few weeks.

If it's any consolation, I got started with investing in the 3 months before the housing crisis in 2008 and watched 50% of what I invested in relatively stable funds disappear basically instantly. That was depressing, but in time all of them came back. For the long term, you've really got to trust dollar-cost averaging and just wait out the bumps. After surviving that rough introduction, everything else feels minor.

andriesm 1 day ago 1 reply      
I have dabbled many times over the last 20 years in trading and have even done it rather professionally. There is no easy way to handle a loss. Take comfort in the story of Ray Dalio, who lost everything to the point of borrowing 10k from his dad to get back on his feet, and who now manages 10s of billions.

Trading is a really hard way to make money.

The best way to bounce back from a loss is to make sure you will never ever place yourself in the position to make the kind of loss from which you can't recover.

You did this, now never ever do it again. There is no sense in trading unless you have a provable long term edge.

Most people would say good luck at this point. I would instead say, re-evaluate if trading makes any sense. Why do it?

Are you willing to spend 20 years "mastering it", only to accept that one almost never truly masters the strangeness of financial markets, but perhaps develops a bit of character and picks up some sound principles and some unique chance insights for which there are no shortcuts?

SirLJ 3 hours ago 0 replies      
Before you start trading, do extensive back testing, especially on the risk management... Also please note, you'll always have losses, but if you control them and have a system with a slightly better winning percentage, the compounding will make you rich in the long run...
swah 11 hours ago 0 replies      
Bought treasury bonds instead (Brazil).
peller 13 hours ago 0 replies      
From a trader's perspective, losses are the cost of doing business. When you're starting out, your losses are your tuition. An engineer spends at least $100K+ and 4+ years going to school to learn their craft; why should it be any different for traders? (Aside from the fact that there's no school and no curriculum; there's only the library, the market, and your wits.) If you read the Market Wizards series, you'll find a lot of pros saying things that amount to "on average, out of 10 trades 6 or 7 of them will be losers - but I keep my losses small and let my winners run."

Here's how it works for a casino: the outcome of any given roll of the dice (say) is random. But over a thousand rolls of the dice, the odds of the game are not random. For a casino, obviously the games are set up so the odds favor the house, and "in the end the house always wins." For a trader, your "odds" against the market are determined by your skill. Every individual trade is still essentially random, but if you're good at what you do - if your market analysis, entry timing, risk/money management, and adherence to your trading rules is solid - over a large enough sample size the outcome will be net positive. It's a strange duality for some people to grasp.

A good first read: High Probability Trading (Marcel Link)

The Bible: Reminiscences of a Stock Operator (Edwin Lefevre)

The basics: Technical Analysis of the Financial Markets (John Murphy), Japanese Candlestick Analysis (Steve Nison - skip the first chapter), Trading and Exchanges (Larry Harris)

Good follow ups: All the Market Wizards books (Jack Schwager)

Psychology: Trading in the Zone (Mark Douglas), The Nature of Risk (Justin Mamis)

Less theory, more practical: Mastering the Trade (John Carter)

Putting it all together: How to Buy (out of print) and When to Sell (both Justin and Robert Mamis; slightly outdated but 100% worth reading.)

Some of these books are "expensive." But even if you only learn one thing from a book, it'll have paid for itself 10 times over. Take copious notes. Find the commonalities. Filter out the bullshit. Watch the markets; see what "clicks" for you. The trading exercise towards the end of Trading in the Zone is a great way to test yourself without risking a whole lot. (On that note, a theme you'll find: if losing 2 or 3 thousand is going to be an issue for your current well-being, it's perhaps better to wait to put money up until such a cost is somewhat closer to pocket change...)

If that all sounds like too much effort, then perhaps the more typical "buy index funds and hold until you retire" answer you'll often get around here is more your speed. No shame in it.

Ask HN: Assistive Devices for recently quadriplegic dad
157 points by throwawaysci  2 days ago   35 comments top 27
Voltbishop 2 days ago 0 replies      
I'd recommend these two products:

1. http://www.quha.com/products-2/accessories/quha-pufo/

It's a Bluetooth device that allows controlling the mouse cursor with body movement (head or finger, etc.), and it's cheaper. Coupled with free dwell-clicking software, it should work!

2. Eye tracker - there are a lot of options; visit reddit.com/r/eyetracking and reddit.com/r/ALS and ask them for advice. These devices let you control a PC with your eyes and are especially designed for people who have ALS. The ones that work really well cost money, but most insurance companies cover them in full. Avoid Tobii; they are not reliable and are more marketing than anything. MyGaze, LC Technologies, EyeTech Digital, SMI Vision - these are all companies you can trust. All should offer free trial periods and should have a rep who can come and visit your dad to do an evaluation. If they don't offer at minimum a 2-week trial, they're not a trusted company. Secondly, you can contact your local city's AT clinic; they have donated equipment for situations like this.

I hope this helps!

Mz 2 days ago 0 replies      
I was passingly acquainted with two quadriplegics from my corporate job. They were both employees of the company.

Quadriplegic just means all four limbs are impaired. The degree of impairment can vary substantially. One of these men had use of his arms, but did not have full use of his hands. He drove himself to work, had a full time job, wife and kid. He broke his neck in a pool accident in his teens. He used a manual wheelchair. He was able to use a manual wheelchair because he had use of his arms. He chose it over an automatic wheelchair to get in regular exercise.

The other was substantially more impaired. He broke his neck in a riding accident later in life. He had been a brilliant surgeon. He used an automated wheelchair. I think he had partial use of one arm and maybe a couple of fingers, which allowed him to navigate a smartphone with that hand. He came in once a week for a few hours to review surgical reports for the company. When ordinary claims processors (like I was) could not figure out if the surgery was covered and their boss with more training couldn't either, we printed off the entire file and hand delivered the paper version to this man on Friday afternoon. I had one claim go to him and hand walked my papers to the meeting.

I also attended an educational talk given by the two of them. This is how I know how they each broke their neck and other details.

Since your father was a consultant, he may be able to return to doing consulting work at some point in the future. The specialized knowledge in his head does not stop being valuable just because of his physical limitations. I am mentioning this because new quadriplegics are often suicidal. They feel that life is simply over. It's not. Your father was a professor and consultant; like this former surgeon's, his knowledge and expertise still have value. Even though the former surgeon could no longer work as a surgeon, his knowledge of surgery was valuable and he had a unique, very part-time job at a world-class company.

Depending on the exact details of your father's limitations, he may also benefit from the use of ordinary things like smart phones with apps. There are also a lot of non-tech assistive devices, like chairs to help them shower and spoons that can be strapped to their hand so they can feed themselves if they have arm movement but limited hand control.


aerovistae 2 days ago 1 reply      
I created a Chrome extension called Hands Free for Chrome which allows near-total control of the browser with your voice.

It occasionally has to be reset by hand if the voice recognition locks up, which is the only barrier. But I'm fairly certain it's the best option available for people in your father's situation.


apostacy_steak 2 days ago 0 replies      
My father contracted Guillain-Barre - a fast oncoming neurological disorder that left him quadriplegic. I did a bunch of research a couple of years ago to help him out.

First, if your dad can still move his head, you can use Apple's assistive tech to "tab" through the items on the screen by turning one way, and "click" on an item by turning his face the other.

Second, MS Windows' voice control is actually really decent. You can browse, search, send emails, etc. all with your voice. It takes some training (both for the user and the machine) but my dad has gotten pretty quick with his.

Lastly, there's a bunch of eye trackers out there now, and you can use them for a lot of things. I setup CameraMouse (http://www.cameramouse.org/) for when voice wasn't quite cutting it (or my dad got tired of talking.)

Unfortunately, there's no perfect solution, and all require time to adjust.

j_s 2 days ago 0 replies      
Jouse3 $1500 mouth mouse/joystick seems tried & true. http://www.compusult.net/assistive-technology/our-at-product...

Source: https://www.twitch.tv/nohandsken quadriplegic streamer who plays Diablo/Path of Exile, Heroes of the Storm, World of Warcraft, etc. (I encourage Amazon Prime subscribers to give him ~$2.50 every 30 days via their free Twitch sub! https://help.twitch.tv/customer/portal/articles/2574674-how-... )

slightly related/helpful discussion: https://github.com/melling/ErgonomicNotes

gtsteve 2 days ago 0 replies      
I'm sorry to hear that. That's very bad luck indeed.

I remember hearing about this project some time ago: https://github.com/OptiKey/OptiKey

It might be helpful as it's an open-source project and if extra features are needed you might be able to add them yourself if you are a programmer.

I mentally bookmarked it because I felt it would be a good "make the world a better place" type project to contribute to if I ever had some spare time.

WheelsAtLarge 2 days ago 0 replies      
It's not a laptop but an Amazon Echo is excellent for Music, Audible books and Podcasts. Hard to beat how well the immediate voice recognition works. Home automation works but it's a try and see if it works for you type of situation. It's worth investigating for your needs. Also it's relatively inexpensive. I should add that home automation needs additional hardware and costs. But you can buy an on/off plug for a bit under $30.00.
j_s 2 days ago 0 replies      
Please follow-up to report what works best for your father! So many of these Ask HN's have a great list of options, but no follow-up to report what worked best for your particular situation.

Thanks for bringing this request here to allow the community the chance to contribute!

johnnyg 2 days ago 0 replies      
I'm sorry I don't, but my thoughts are with you, and I think it's great you are advocating for and supporting your father.
house9-2 2 days ago 0 replies      
If he still has good movement of his neck:

- Smartnav (if Mac you need to buy via 3rd party, but includes the software)

Fairly expensive; there are other variants that cost less or more, and some gaming devices like TrackIR might work as well. It's possible that health insurance would pay for these types of devices.

I personally use Smartnav about 50% of the time I am programming, along with Dragon/Voicecode due to RSI issues.

Smartnav + Dragon might be enough for using laptop/desktop, not so much for mobile devices. If he actually programs I would recommend voicecode.

All of these technologies have a massive learning curve.

You might want to check out the VoiceCode forum and Slack channel; I know there are some quadriplegic programmers in that community who would have better insight than I do.

- http://www.naturalpoint.com/smartnav/

- http://voicecode.io/

- http://discuss.voicecode.io/

- https://voicecode.slack.com/

jonnycowboy 2 days ago 0 replies      
I work for a company that makes robot arms for assistive purposes (Kinova) so I have some insight to share.

First, a voice setup with Alexa or similar can really help.

With regards to phone use, some of our users have an attachment to put the phone close to their head and use their nose to "click/select" (they can move their head).

Eye-tracking technology is really impressive these days (it can be as fast as using a mouse). I recently demoed a system with a Tobii sensor (https://www.tobii.com/) hooked up to a laptop; very impressive when combined with appropriate software (it handles scrolling, keyboard shortcuts, etc. in a custom interface). I'm not sure how well they integrate with phone/tablet use.

Ping me on Linkedin if you'd like to talk more.

jeremyt 2 days ago 0 replies      
Dragon NaturallySpeaking + a Sennheiser ME-3 microphone should be at the top of your list.
m0ngr31 2 days ago 0 replies      
A little while back, I had a quadriplegic man reach out to me and thank me for a skill I wrote for Alexa to control a Kodi box. It allowed him to watch what movies and shows he wanted without constantly asking for help.


I'm truly sorry about your dad. That's a scary situation for him to be thrust into.

mindcrime 2 days ago 0 replies      
A company I used to work for did a project once, deploying a lot of home automation for someone with mobility issues. It's been a few years ago, and the tech has probably changed a lot since then, but they used OpenRemote[1] heavily as part of the project. It might be worth looking into.

[1]: http://www.openremote.com/

jiiqo 1 day ago 0 replies      
I work with assistive technology, and many of the clients are quadriplegic. Because your dad can control his head accurately, the best computer access device for him is definitely a head mouse. Eye-tracking solutions are tiring to use and inaccurate.

I have tried most of the commercial solutions available, and I think the best head mouse for your dad would be the Zono mouse: http://www.quha.com/products-2/zono/. It is very easy to use and is as accurate as a normal table-top mouse.

11thEarlOfMar 2 days ago 0 replies      
You might take a look at Neuroswitch from Control Bionics:


base2john 2 days ago 0 replies      
Sorry to hear about your family's situation, but hopefully he's able to get back to work soon.

Tecla is great; you should give it a try. Depending on his comfort and ability, a head-tracking mouse from Orin is pricey but works really well with a laptop/desktop setup. Dragon Naturally Speaking is useful too.

Also, he should make an appointment with a local assistive technology practitioner soon to get a rundown of all the options, both low and high tech. You can find these ATP folks at most rehab hospitals.

Good luck

onlydnaq 2 days ago 1 reply      
You could take a look at tobii [1]. They make some interesting products where they combine eye tracking, speech analysis and other stuff to create user interfaces for people with specific needs.

[1] https://www.tobiidynavox.com/en-US/tobii-dynavox-pceye-plus/

mattbgates 2 days ago 0 replies      
This company develops software for the handicapped: https://getaccessibleapps.com/

I think they've created software that can bypass captchas and will work with you to develop software that can help your dad.

jbannick 2 days ago 0 replies      
Contact Barrie Ellis, world expert on motion-impairment assistive technology: Barrie.ellis@oneswitch.org.uk
nikki-9696 2 days ago 0 replies      
Sorry to hear. You might find a lot more useful advice on Reddit, if you haven't already posted there.
kinova 2 days ago 0 replies      
There is a robotic arm that helps with gaining autonomy. There are options that allow the user to use his tongue, head or breath to control the arm. It interfaces easily with the wheelchair controls.


xrange 2 days ago 0 replies      
Dasher might be something worth looking into:


rroriz 2 days ago 0 replies      
There is a startup in Brazil that promote DIY Assistive Projects: https://www.meviro.org
mudil 2 days ago 0 replies      
Medgadget has lots of ideas for rehab and assisting devices out there. http://medgadget.com
tinus_hn 2 days ago 1 reply      
There was a talk by a C4 paraplegic at WWDC this year that shows his setup.
htatbr 2 days ago 0 replies      
I worked with quads as a college grad student.I also worked with HIV patients in US Peace Corps.You really need to watch for sepsis.Watch out for catheter infections.https://www.indiegogo.com/projects/the-connected-catheter-by...I would not allow any quad to use a catheter more than a short time. Spinal singularity will fail by increasing infections.A new DNA tech may change this.https://nanoporetech.com/products/smidgionhttps://www.youtube.com/watch?v=OK5nNSwt3MAhttps://www.cs.columbia.edu/2016/dna-sequencing-in-classroom...

Sepsis now dominates the hospital ICU. It is what kills most AIDS patients too. Antibiotic resistance is driving costs.The ICU is now 40% of US hospital budgets.This is bankrupting state and federal budgets. This is why medicare, medicaid, obamacare are bankrupting US Govt (Fed and State) budgets.In 2013 health dominated state budgets.

State spending on health care now exceeds education spending. Look at NM's past budgets.http://www.usgovernmentspending.com/compare_state_spending_2...

Today 1/4 of US VA and Indian Health patients are diabetic. US Defense Dept. funding must now compete with Medicare. Today 40% of hospital costs are for growing ICUs and chronic disease. Half of US Medicare cost is chronic disease from diabetes.

NM ICUs are dominated by chronic disease. http://www.amazon.com/Where-Night-Is-Day-Politics/dp/0801451...

40% of US hospital budgets now pay ICU/chronic disease costs. This cost is going up annually. http://money.cnn.com/video/technology/2013/07/24/fortune-tra...

Can MinION help pre-ICU patients better control diet and sepsis infection? http://www.bloomberg.com/news/articles/2015-06-03/deadly-inf... A complete bacterial genome with MinION: http://www.nature.com/nmeth/journal/v12/n8/abs/nmeth.3444.ht...

MinION can find septic bacteria fast. https://genomemedicine.biomedcentral.com/articles/10.1186/s1...


Ask HN: What have the s/w interviews evolved to :(
24 points by throwaway1zz  1 day ago   16 comments top 12
brad0 19 hours ago 1 reply      
Ask yourself this: if you're losing sleep over the interview process because of anxiety/stress, maybe these companies aren't a great fit right now.

A friend of mine studied really hard for three months while working a full time job to get into one of the big four.

He got the job purely because he studied coding interview questions. So he moved to the other side of the world with his gf to work there.

A year later he was put on a PIP, got out of it, and then had so much anxiety he couldn't concentrate on his work. Now he has to move back home with little to show for it.

I know this is hard to hear, but you will most likely keep failing until you change your mentality. You're spending your time being very hard on yourself rather than constructively asking why you want this and what the next step you can take is.

agitator 1 day ago 1 reply      
You are interviewing at companies that so many people in the field from around the world apply to, so it's really damn challenging to even get an interview lined up. It's up to you: often you can have more fun, and play a bigger role in product design, architecture, and planning, at smaller companies. But if you care about the status, pay, and comforts of working at one of the big ones, then go for it. It really depends on what you value.

Also, based on my experience with interviewing and being interviewed, the questions are difficult on purpose. If everyone got the solution, you couldn't differentiate between applicants, but if they are very tough, you can get a sense of a person's problem-solving skills, how determined they are, what types of questions they ask to proceed through the problem, etc. It shows a lot about a person's thought process and how likely they are to be successful when a tough problem is thrown at them. Often, getting to the solution isn't the point of the question. Seeing how someone solves something they haven't encountered before is much more informative (especially for companies and startups working on things that are game changers).

imauld 1 day ago 0 replies      
This made the rounds here a little while ago and should help you prepare for CS heavy type interviews:


It's pretty in depth and extremely popular (44k stars). Hope it helps.

sjg007 13 hours ago 0 replies      
Focus on the fundamentals and problem solving with data structures. Try to map the solution into a data structure and algorithmic technique. Read and study the solutions. Start as simply as possible. The first task is to understand the problem being asked. Sometimes we figure that out after being shown the solution.

I wouldn't worry about dynamic programming or network flow per se. Everyone finds those hard and looks up the algorithms.

These are at the harder end. Recursion is maybe somewhere in the middle.

That said the competition level could be very high at these companies and ask yourself if you really want to work in such an environment. Also ask if you want to work on what they work on.

Interview questions are the same as the word problems we see in math class. You have to map the problem into CS terms, then apply what you know. Forget finding something optimal at first. Once you have the toolbox of algorithms/data structures you will find success.
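
As a toy illustration of that mapping (an editorial sketch, not from the thread): the classic word problem "find two numbers in a list that add up to a target" maps naturally onto a hash map, turning a brute-force O(n²) scan into a single pass.

```python
def two_sum(nums, target):
    # "Find two numbers that add to a target" maps onto a hash map:
    # for each value, check whether its complement was already seen.
    seen = {}  # value -> index where it was seen
    for i, v in enumerate(nums):
        if target - v in seen:
            return seen[target - v], i
        seen[v] = i
    return None  # no pair sums to target

# two_sum([2, 7, 11, 15], 9) -> (0, 1)
```

Recognizing "lookup by complement" as a hash-map problem is exactly the kind of translation step the word-problem analogy is about.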

sotojuan 1 day ago 0 replies      
This doesn't answer your question, but have you tried interviewing at companies that don't get a thousand awesome resumes a day and thus don't have a very tough interview process? When I interviewed last year I had none of these problems. Perhaps the stress, anxiety, and lost job performance aren't worth a job at a "respected" company.
gt2 1 day ago 0 replies      
It is overwhelming to try to prepare thoroughly, even for a passionate CS expert. But a passionate CS expert can still get an offer at these places, even without preparing significantly. This is because (many) interviews are designed to test a breadth of fundamental knowledge, up to pushing the candidate to their limits, and even the limits of the state of the art. As an analogy, just because a Corvette doesn't perform like a Ferrari doesn't mean it's a bad car. It could still be chosen, perhaps based on some other soft qualities, or because the Ferrari is too expensive.

Don't worry so much. Keep preparing by focusing on small, bite-size chunks you can master. Then attempt some interviews.

bsvalley 15 hours ago 0 replies      
"Super human"? I went to college 10 years ago to study computer science. Questions were sometimes way harder than that. Part of it is intelligence but it's mostly a memory exercise. Memorize 500 soluions then you'll crack any SWE interview. The problem is, as soon as you start working you forget about all this BS and nobody cares.

Which is why the interview process is broken. We already go through that painful exercise in college. Companies should stop being lazy and look for a better solution to double check that you really went to college. They could focus on a lot of other things in 45 minutes, projects, behavior, culture fit, etc.

romanovcode 15 hours ago 0 replies      
Ehm, what makes you think that "Facebook, Google, Airbnb et al." should hire people with "a few years of exp"? Also, programming is not only about those huge megacorps. Just join a startup and get some experience.
ben_jones 1 day ago 0 replies      
The interview process at many software companies is weighted towards fresh grads who display a "genius factor", i.e. are quick to answer typical engineering problems with the best possible academic solution. Simple as that.
Jemaclus 12 hours ago 0 replies      
Consider using something like interviewing.io to practice interviews with real people who can give you real-time feedback on how you're doing. Some other people in this thread have offered other choices, like HackerRank and Code Kata and so on. All good choices for practicing algorithms. Becoming good at interviewing for technical positions is a TOTALLY different skill set from being a good programmer, so just keep practicing and eventually you'll get better.

Once you're more confident, consider using something like Triplebyte or interviewing.io to do your tech challenges and hopefully skip past some of the earlier tech challenges.

For what it's worth, as a hiring manager I would also say that, generally speaking, I'm not interested in whether you got the "correct" answer in 45 minutes, and I'm certainly not interested in perfection in 45 minutes. Don't worry about being perfect. Just worry about being competent. I'm far more interested in the bigger picture. Things like:

1) Can you write code in the first place? The basics should be easy for you.

2) Are you familiar with the language you're writing? You shouldn't have to look up how to sort an array or how to declare a function, for example, and your code should be clean and readable.

3) Can you properly assess the problem and begin working on a solution? Take a moment to think about it. Ask follow-up questions if necessary. I always try to repeat the challenge back in my own words, just to make sure I understand what's expected.

4) Is your solution heading in the right direction? If not, you either don't know what you're doing or I didn't explain the challenge well enough.

5) Can you identify and fix edge cases? Usually the problem I give you will have some reasonably obvious edge-cases, like the popular FizzBuzz test has.
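
A minimal sketch of that FizzBuzz edge case (editor's illustration, not Jemaclus's code; assumes the standard 1-to-n formulation): the "reasonably obvious" trap is numbers divisible by both 3 and 5, which must be handled before either check alone.

```python
def fizzbuzz(n):
    # Classic FizzBuzz: the edge case is multiples of 15, which must
    # be checked before the bare multiples of 3 or 5.
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

# fizzbuzz(5) -> ["1", "2", "Fizz", "4", "Buzz"]
```

A candidate who tests 3, 5, and 15 before declaring victory is demonstrating exactly the edge-case habit point 5 describes.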

Trust me. As an interviewer, I know you're nervous. I don't expect perfection. I expect thoughtfulness, progress, and adaptability. I expect you to think through a problem, make progress toward a solution, and be able to make changes as necessary to fix edge cases and unexpected problems. That's all. (ha.)

IMO, if your interviewer is fixated on whether it's "perfect" or whether you got the exact right answer, they're not good interviewers and perhaps you shouldn't want to work with them in the first place. In fact, an ideal programming challenge should have multiple solutions (e.g., there are many ways to sort a list). The challenge should be more about figuring out how you think than whether you can add 2+2.

Also, keep in mind that the interview works both ways: you should be interviewing them just as much as they are interviewing you. Every question you're being asked is a question you can ask them. If they don't think you're doing a good job, they won't hire you, and if you don't think they're doing a good job, you don't have to work there. Keep in mind that the bare minimum for a programming job is... programming. If that's all they focus on during an interview, they're missing out on everything else you can bring to the table. Imagine if the only question they asked truck drivers was "Can you drive?" or the only question they ask a journalist is "Can you write?"

Good luck!

gonyea 22 hours ago 0 replies      
The trick isn't to interview well. It helps! But the trick is to have a great referral into the companies.
vthallam 1 day ago 0 replies      
CareerCup is good, but it has so much content with no structure that it's no surprise you feel overwhelmed.

Try LeetCode if you haven't already. Start with topic-wise easy questions, then medium, then difficult, and participate in the message board discussions about problems. You need time and discipline, that's all.

Ask HN: What was it like to quit your job and focus on your successful startup?
160 points by mattbgates  3 days ago   108 comments top 33
welanes 2 days ago 3 replies      
> What was it like to quit your job and focus on your successful startup?

1. Time became my most valuable asset. Everything was filtered through the lens of "does this save me time?" and so I optimized everything: The gym (worked out at home), shopping (got delivered), dating (used fleshl...joking! :).

In the words of Joel Spolsky, "Every day that we spent not improving our products was a wasted day".

2. I worked harder than ever before. My job was tough but output ebbed and flowed with meetings, management, plus the usual office time wasters. The startup workday is more straightforward: wake up, coffee, write code, listen to users, coffee, learn how to add value to the market, coffee.

3. Every two months or so I look back and shake my head at how lame the product was, how little I knew, and how inefficient my workflow was. Which is to say, I continue to learn at an incredible clip yet realize I still don't know a thing. I expect this trend to continue - if it doesn't, I'm not growing.

So, yeah, overall it's been An Incredible Journey. My only regret is that I didn't start sooner.

It's actually a gift how easy it is to go from idea to product to business. To paraphrase Murakami, 'If you're young and talented (or can code), it's like you have wings'.

We're living in the best of times.

antigirl 2 days ago 4 replies      
This might be slightly offtopic, but I've been thinking of working on my own ideas for a decade now. I tried working on projects in the evenings and on weekends, but there isn't enough time. Not only does it affect your social life, it also affects your motivation and drive in general. After looking at code on a screen for 7 hours, the last thing you want to do is go home and look at more code [I guess this depends on your age; I'm 32 now, not a night owl any more].

Anyway, I started contracting last year for this exact reason, at around a 300-400 day rate, and now I've saved enough to quit and follow my 'dream'; my last day is July 7th. I have enough savings to last me 2 years while sustaining my current social life. No frugality required.

jasonkester 2 days ago 1 reply      
The biggest change was how much more free time I had.

When you're building a Nights and Weekends side project, you get used to stealing whatever free hours you can to work on the product. But you also necessarily build things so that they don't take up much of your time every day. If they did, it would interfere with your day job and that just wouldn't work.

So when you remove the day job, you find that suddenly you have this successful business that runs itself in the background and you can do pretty much whatever you want with your day.

Most people in this situation will immediately fill that time up with work on the product, and I did to some extent. But I also made sure to take a bunch of that time just to enjoy with my family. I eventually settled on 2-3 days a week where I was "at work", with the rest devoted to other pursuits. My wife and I are both rock climbers, which is an "other pursuit" that will happily expand to fill the time available. We're also parents, so ditto there.

I also make a point of downing tools for a while from time to time. Again, because I can.

I took the kids out of school and dragged them off backpacking around Southeast Asia for a few months the first year. We did a couple more medium-sized trips this year, and I took the entire fall and spring off because those are the best times for bouldering in the forest here. Again, work is happy to ramp up or down to accommodate because I never shifted it out of that ability to run on nights and weekends.

So now, I burst for a few weeks at a time on work stuff (with possibly a more relaxed definition of Full Time than most would use), then slow down and relax for a bit.

It's actually not so bad.

Grustaf 2 days ago 1 reply      
For me, and probably for most, it's not a matter of quitting your job in order to work on your "successful" startup - if it were already successful then there'd be no issue at all. Rather it's about quitting while the startup is just beginning to look reasonably good, _in order_ to make it a success.

Since I've been working as a mobile developer, and also management consultant (my other career), it's always been extremely easy for me to find a new job whenever I needed, so there has been very little risk involved.

Still, it did require some savings, since our startup is very research intensive and will take several years before we see any revenue. We secured some basic funding now though, and things are looking good for the next stage too so I will only have needed 6 months or so of buffer.

In summary, my view is that if you're a reasonably skilled engineer or have some other attractive occupation, there is nothing to fear. The worst that can happen is really that your startup doesn't work, you'll go back to what you did before with a few months of missed income but with plenty of useful experience.

I don't think there are many situations or cultures where a failed technology startup attempt on your résumé would count against you in any way; in most places, quite the opposite.

pedrohidalgo 3 days ago 2 replies      
I did it twice. Both times I failed, but I don't regret it. It felt very good while I was doing it. I lost savings, but it was a good life experience.

The second time, I created this plugin: http://plugins.netbeans.org/plugin/61050/pleasure-play-frame... I tried to make a living off it, but I only sold 5 licenses at 25 dollars per year, and developing the plugin took me two and a half months of hard work.

Didn't have a problem getting a new job both times.

jaequery 3 days ago 2 replies      
Well, I ran a solo startup, so my experience might be different from others', but for me it was like living a dream. While working full-time somewhere, I started this SaaS business, and soon after, the monthly income kept doubling to the point where I was able to quit my job knowing my living expenses were covered. When I quit, it was like wow, the feeling of liberation just hits you so hard. I had worked for 7 years prior to doing that, and now I was free. The feeling of being able to go anywhere at any time of the day, go shopping, eat at a nice restaurant, or just relax at the beach: things you just couldn't do when you were working full-time somewhere. It's like getting out of jail for the first time. That's what it was like for me to quit my job and focus on my startup.
williamstein 3 days ago 0 replies      
I gave the following talk exactly one year ago right when I decided to quit my academic job to focus on my "startup dream" (now https://cocalc.com): http://wstein.org/talks/2016-06-sage-bp/. It's a year later, and the quick summary is that I feel basically really, really happy to be working on this company. It's exciting to be successfully hiring and building a team of people who work very well together and can focus 100% on a specific problem, unlike how things worked for me in my part of academia (pure math research). Though stressful at times, doing what I'm doing now is much less painful than trying to build a company while doing another fulltime job (namely teaching at the University fulltime) -- that really sucked. I'm extremely glad that customers are paying us and investors funded us enough that I can do this.

Having the luxury to focus on one thing, rather than juggling several, is much like having an office that is neat, tidy, and uncluttered. It feels good in the same way. At least by quitting a job and focusing on a startup, you have the option to focus 100% on it. Actually focusing 100% on one thing is a difficult skill in itself, even with the right circumstances; however, it's completely impossible (at least for me) with two fulltime jobs at once, especially jobs like teaching (which involve lots of public speaking at scheduled times) or running a website with paying customers (which demands, e.g., responding to DDOS attacks).

foxhop 3 days ago 2 replies      
Only asking for opinions from people who were successful is literally asking for survivorship-biased perspectives.

I dream of working for myself but I've never taken the plunge. My income from side projects is about 1/3 of the way to my minimum number to quit and go full time.

I do a lot of thinking about this, my number is the same as my financial independence / early retirement number.

One of the biggest things that holds me back is medical insurance for a family of 5. Having an employer offsets this cost a LOT.

dwviel 2 days ago 0 replies      
I agree with some of the other comments that time is the most valuable asset. Be productive! Also:

1) Most of the advice that you find online is not really applicable. I like the stuff that Sam Altman (yes, Y Combinator) puts out because it most mirrors what I've experienced.

2) You need to do it yourself. Whatever it is. Just do it.

3) Funding is a game. Here on the east coast, investors pretty much only fund the growth stage of a company, after you have real traction and are virtually guaranteed some kind of success. You will need to fund your startup out of your own pocket until then! This is what they call "friends and family", because you will be hitting them up for money, favors, meals, places to stay, etc.

4) The tech is the easy part, as that is what you know. Marketing and selling is the hard part, as that is what you likely don't know. It is the part you should try to get help with if you can, but finding someone who knows what they're doing is tough, so you can be sidetracked quite a bit.

5) The "do things that don't scale" advice is completely true. Reaching out one-on-one has been the most successful way to engage potential customers, even though it is very time and labor intensive.
markatkinson 2 days ago 1 reply      
I left my permanent position and was lucky enough to continue working on a contract basis to continue covering some of my bills.

I reached a point after about 4 months where I realised the journey to make the business profitable would most likely be a five year slog, and while the opportunity was there it wasn't a cause I felt I could devote 5 years of my life to.

So gave the software for free to the people that were helping with beta testing and went back to my job. I found the most positive thing was how it helped propel my career at my current employer as I got a better role, they seem to have more respect for me afterwards and I operate more independently now.

So I suppose if you can build some sort of safety net before quitting that helps.

jonathanbull 2 days ago 1 reply      
Last year I quit a great full time role to focus on my side project, EmailOctopus (https://emailoctopus.com). Haven't looked back since!

I didn't take the plunge until quite late on, waiting until it was making enough money to comfortably cover my personal expenses. No regrets there - growth was slow in the early days and if I hadn't had the luxury of a monthly pay packet, I probably would have given up before I got the chance to properly validate the business.

Transitioning to full time brought more stress than I expected, but the experience is priceless. In the past few months alone I've learnt more than I did in 3 years of employment.

Realistically, what's the worst case scenario? I'm a reasonably skilled dev in a strong market so there's not much to lose. If it all goes wrong I'll get another job with a load of experience (and stories!) under my belt.

Disruptive_Dave 2 days ago 0 replies      
I can tell you what it was like to quit my job and focus on a very-early-holy-shit-what-are-we-doing-startup. Was at a marketing agency for 10 years when I left to pursue my first co-founded startup. Made the decision after we got a little organic press and signed up 5k+ users in 24 hours. I had been looking for an excuse to leave and do something new anyway.

It was scary as hell (no revenue coming from the startup), fun as hell, challenging as hell. Had my savings all planned out to help support the adventure, but still had that daily stress of knowing every dollar I spent was not coming back anytime soon. That part wasn't fun. But I didn't have kids or a mortgage and knew this was my chance to do something of the sort.

10/10 would do again in a similar situation, though knowing what I know now, I might have launched a business instead of chased a cool idea.

jondubois 3 days ago 0 replies      
I never returned to my old job... Why would you go back? I did return to working as a contractor for a while. A lot of failed startup engineers become contractors.

I'm back working for a startup again now so I guess I'm just going back and forth.

I've worked for a few startups and none of them has had an exit yet but one of those I have shares in is doing relatively well.

Doing contracting work is a smarter decision in general. You can actually plan to make a sizeable amount of money and then watch it happen without taking any risks - It's all within your control. With startups, you might often feel that it's outside of your control, especially if you're not a co-founder.

RHSman2 2 days ago 0 replies      
Am 3-4 months in with the concept. Have an anchor client and a consultancy partnership. The biggest and wildest thing is moving from someone with an idea to a visionary leader in a space. It's fundamentally awesome to take said idea, turn it into an offering, and go out on the line with it. I have had nothing but wonderful experiences with others helping me out. I am 40, have 2 kids, a dog and a wife, and a pretty good network which I have had to really 'work' (authentically), but the assistance from others, trust, and good feelings have made me believe in the human race a bit more! In corporate it's all protectionism and petty arguments, AND a visionary thinker is not what they want!

I am lucky to have found an excellent co-founder who is a great counterpart.

Definitely taking/balancing risks, going fast, and working really hard (doesn't feel like work building your own thing, though), but the finances have to be calculated. I have taken the consulting/product route, which seems to be good, but converting a service-driven model to a recurring-revenue model is the goal AND the challenge.
zghst 3 days ago 1 reply      
I am 2 months into doing this, and it feels a lot like when I moved to SF: lots of things up in the air, but very promising. I had a 6-month financial plan, but it was hard to adapt to the lack of income and to seeing my savings decrease month by month, so I'm doing contracting and TaskRabbit. I'm not really a ramen noodles person, so I make do with what I can get at the grocer.

I have a previous coworker who'd love to help me, but I don't want to babysit his work, and I feel he's not valuable enough to the business. I would like another cofounder, but it doesn't bother me that I'm doing it all on my own; I have spent the last 10 years getting ready for this, so I'm more than ready. I am doing more than okay on my business alone, but I wish I had some expertise for a second opinion. I am really, really thinking about going into an accelerator program or seeking angel investment, but I'm apprehensive about taking cash at this (or any) stage. My biggest fear is actually having to get a real job again; I will do anything to prevent that from happening, since that would mean my startup is dead.

erikrothoff 2 days ago 2 replies      
I'm on my last day at my current job before taking the plunge to focus on my half-successful startup! My pay is dropping to about half, and the company still loses money every month by employing me. Our runway is ~6 months before our savings run out. Basically we have to double the monthly revenue that took 5 slow years to build. Exciting and scary times!!!
cjr 2 days ago 0 replies      
I started my screenshot SaaS product, Urlbox (https://urlbox.io), as a side project back in 2013. There were times I thought about shutting it down, as the amount of time spent on it initially compared to the revenue growth was just too depressing! A whole 3.5 years later it had grown just enough that I could begin to take it more seriously and actually support myself full-time working on it.

As with everything there are pros and cons. The pros are obviously that you get to spend your time doing something you enjoy (hopefully), and can work whenever and wherever you feel like (this can also be a con!). The cons are that you will always be worrying about stuff like churn, whether servers will go down whilst you're away on holiday, how you're going to grow enough to support a family etc etc.

The long, slow SaaS ramp of death really is a thing, and there are no silver bullets in terms of growth/marketing, just many small things that all contribute. I also always used to think 'if only I could just get to $x MRR then everything would be so much better and I'd be much more comfortable and relaxed', but when you do eventually break through that barrier you realise you're just more worried about how you are going to achieve the next one, so it's kind of never-ending!

I also agree with other posts here that if you're already a decent developer in a good market, then what is the worst that can happen really? Try doing some fearsetting. I'm sure you could always find another job if your thing doesn't work out, but you do need to give these things time. I also failed a bunch of times with other startup ideas, one of which was also YC backed.

daliwali 2 days ago 2 replies      
I started doing this recently by quitting my job, and I currently have no income.

There is this assumption that one must build a minimum viable product that has to be released as quickly as possible, so much that it's become startup mantra. It's no surprise that a lot of these products seem to be technically shallow, everyone is reaching for low hanging fruit.

I feel rather alone trying to do something that I think hasn't been done before, or if it had, wasn't executed well. I don't think I could possibly commit to it without having strong motivation, which I struggled with while having a full-time job.

The biggest technical/social challenge I have is to make something that a non-technical user could easily get right away and make something with it. I think the automation of web dev is an inevitability, and frameworks were just a historical blip on this path. The same thing is happening to web design. http://hypereum.com

gwbas1c 3 days ago 0 replies      
Quitting my job was great!

Let's just say that my mistake was that I was too afraid to hurt my co-founder's feelings. If we parted ways when we should have, I might have actually gotten somewhere. (Then again, I might have gotten nowhere either!)

greglindahl 3 days ago 1 reply      
I had my financial ducks in a row before I did my first startup -- I knew how much "fun money" I could expend getting the startup to work. Most of the horror stories I hear revolve around people who didn't do this.
fergie 2 days ago 4 replies      
I quit my job to start my own company last January. It's been fairly successful so far.

For somebody like me, and probably a lot of HN readers, it's _actually_ a fairly low-risk proposition because qualified, experienced software engineers are so sought after. Whatever you are doing, you will always be able to pick up a $1000-$1500-a-day gig when you need to bootstrap your actual project.

My old boss has contacted me a few times to see if I want to come back- definitely do not want to.

You talk about "fear", and you talk about a "successful" startup. Here's the thing: You never know if a startup will be successful, and you just have to give it a go for the love of it, rather than any expectation of success. Don't be afraid- there are plenty of worse things in this world than a failed company.

Have learned a lot about bookkeeping.

wonderwonder 2 days ago 0 replies      
I landed a contract to develop a SaaS system before I really knew how to program for the web; I had some prior systems programming experience but had not coded for about 5 years. I quit my job to develop it and delivered, but was never able to land another customer.

I still have the original client 3 years later and the company grosses about $3,500 per month and I net $1,250. It pretty much runs itself, requires maybe 2 hours of work every 2 - 3 months. I spent a little over a year trying to grow it from the initial customer with no luck.

Landed a job as a full stack engineer afterwards and I really like it. I am actively looking to start a new project but I will keep my main job while doing it. I had the benefit of a wife who makes a good salary to support me during that prior adventure (Still do :) )

andrewchambers 3 days ago 1 reply      
Just a reminder that life is pretty short, and worst case, you can probably just get another job and only be 6 months to 1 year behind your friends in terms of savings.
12s12m 2 days ago 0 replies      
I actually have a different take on this. Twice in the past I built side projects to the point where there was some buzz around them, and I didn't have a job while working on them. However, as the products were nearing completion for V1, I got very good offers for contract work. I also had debts, so I took the jobs. This led to the products failing to get any traction.

So, if you are planning to leave a job and have a good product which is getting you even half of the money you need. Leaving your job will only increase the chances of success. However, hanging on to the job while working on a product is going to be much harder.

wolfer 2 days ago 0 replies      
TLDR - Sort out your finances first. Startups are extremely stress-inducing, and even now, with our startup at the point of break-even, 1m ARR and 100% growth predicted for the next financial year, I still earn less than I need to live - but with the promise of a strong exit in the next 3-4 years if we hit our targets.

Previously I contracted as a full stack developer bringing in other developers on projects as and when the project timescales wouldn't have been achievable with just me. Running a software consultancy alone, dealing with all of the usual rigmarole of a business and performing proper client outreach was stressful, but financially and personally very rewarding (especially when you close a big deal completely on your own).

In order to get involved in my current startup, which at the outset consisted of a designer, a biz dev (CEO) and myself as CTO, I had to cut ties with my previous clients and dedicate all of my available time to the new startup. I had leveraged myself quite a bit running the previous consultancy as billings were growing year on year, so my VAT/Corporation tax accounts were generally paid out of job fees towards the end of the year rather than set aside throughout the year, leaving me in a negative cash flow position when stopping work for existing clients. Luckily there were some ongoing payments that didn't require development resource, so a small amount of admin time to invoice and chase up was all that was needed, and this enabled me to set up payment arrangements with HMRC to settle these liabilities over a period of time, out of this cash flow. Setting up these arrangements was very stressful, and I would strongly advise anyone coming into a startup to fully evaluate their financial situation before committing, even if the opportunity seems huge.

Initial salaries in the new start-up were minimal (approx 1000 p/m), and it took a solid three years, extremely long working days, almost unmanageable personal stress and around 0.5m of funding before we got to an above-average salary, 1m ARR, a team of 15 and strong growth projected for the coming year.

Success is a subjective term, and occasionally I have to refocus to see the light at the end of the tunnel, but with enough grit, luck and determination, it's possible to tip the balance to a point where success is now more likely than not.

hesdeadjim 2 days ago 0 replies      
I've had two very different experiences starting a company.

The first time I was two years out of school with $12k in the bank, had a partner with a ton of experience, and a decent idea. We crunched for six months, launched, failed, and then tried to pivot. I ran out of cash a few months before the iPhone launched; had I had a longer runway, we could have ported our app to the iPhone and potentially seen success.

A year ago, nine years after that attempt, I started a small video game company with another friend (justintimegame.com). Despite my life situation being more complicated and expensive to maintain, the prior nine years' success combined with my wife's income basically lets me try and fail until I get sick of it, instead of until the money runs out. Obviously I'm aiming for success, but the massively reduced stress from barely worrying about money lets me be much more open to experimentation while also being resilient to failure.

I don't regret starting and failing my first company however. It set me up for having a higher risk threshold and an interest in startups that ended up working out quite well for me.

geekme 2 days ago 0 replies      
I started up twice. The first was an e-commerce marketplace and the second was in mobile app development. Both failed. I ran them for around 4 years. Now I am back to a full-time job. I never regretted my failures since I learnt a lot from them, and now I am working on my 3rd side project hoping to succeed.
verelo 2 days ago 0 replies      
So for me I left a job and worked on my startup that ended up unsuccessful. Fortunately during that time I met my current business partners and we have since made and sold a company that by most people's definitions has been a very good run.

Expecting to get it right is the mistake we all make at some point (even when we say out loud "this might not work out", we still somehow expect it to work). Expecting failure to lead to something positive is the long game I'd urge you to wait for. It's hard to remain in a good mental state at times while you're working hard and feeling underappreciated, but that is sadly just what it's like.

mezod 2 days ago 0 replies      
The way I did it was to start freelancing so that I could cover expenses with fewer hours and then use the rest of the time for side projects. Finally, one of my side projects (https://everydaycheck.com) is showing interest and traction, so it's easy for me to just do fewer freelance hours and put more time into it; hopefully I can get to a point where I can work on it 100%.

I guess the "quit your job" problem only exists if you have major responsibilities, like a family, or paying debt back. Otherwise, it makes no real sense to consider it, the opportunity is too big.

cylinder 3 days ago 0 replies      
Self employment is incredible when the money's rolling in, and terrible when it's not. I can't speak for a full fledged startup with funding though because then you just work for investors.
smdz 2 days ago 0 replies      
1. One day you are excited by the possible opportunities, then you are overwhelmed, and then you are depressed. If you leave without a plan, it takes a few months just to get your mind straight. If you do have a plan - it falls apart and you still end up spending a few months that feel unproductive. I am not discouraging having a plan - you must have one. But also note that you made that plan with an "employee mindset".

2. 6 month financial backup is usually not enough. I have heard many stories where people try going independent for 6 months, run out of money and start looking for a job. What happens is - entrepreneurship gets into you in that time and if one goes back to a job, I can bet they feel even more frustrated. You need 1.5 years of backup or 2-3 years of "frugal living backup". I struck positive cashflows in just about 5 months, but it wasn't good enough. I distinctly remember thinking "Maybe, I should have done this part time". Then I struck a mini-gold-mine at 8 months. Having a good backup will help you persist longer. I did not have a growth strategy that worked. But I focused on working and doing the right thing. Keep it rolling.

3. The biggest worry I had when starting was about providing "enough" for my family and any emergencies for next 1.5-3 years at any point in time. Unlike many stories, I promised myself not to wait until I go bankrupt or in a lot of debt - Nearing that is a huge red flag, where I would typically exit and take a regular job. However, taking a job is the last thing I want to do. That thing kept me money-oriented for a while and made me work on stuff that generated positive cashflow.

4. Would it have been possible to return to your old job? - Maybe, but I would not want to. I waited too long to jump ship. In fact, my experience with multiple "good" jobs is what is keeping me away from them. Once you taste entrepreneurship, it's hard to go back.

5. I do not consider myself successful. Maybe semi-successful - some people see it as success. But I have come a long way from fearing failures. Success may or may not last long. I enjoy the process and the tremendous personal growth it results in. I ensure my financial backup now gives me 5-6 years minimum to start afresh - if I have to. Do not undervalue the role of money - it definitely makes things easier.

6. This is my favorite quote about Karma. I heard it many years back (and thought it was impractical). Especially useful when I feel I did everything right but nothing works: "Karm karo, fal ki chinta mat karo" (Do your duty without thinking about results)

P.S.: I don't know about others, but I have restricted myself to writing fewer HN comments because it takes quite a bit of time/energy. This one is an act of impulse. How do other entrepreneurs feel about this?

apatters 2 days ago 0 replies      
If you're single, childless, and have a few thousand dollars in savings, quitting your job to focus on a side project or startup is very easy. You can achieve your dream in the next 24 hours. Here's the roadmap.

1. Give your employer your 2 weeks/1 month notice (depending on locale). Taking this step immediately is critical because the urgency and shock of the change will force you into being fast and practical about all the subsequent steps.

2. Create a monthly budget for yourself which assumes no income that you are not 100% sure about. So if you have interest from investments or a freelance contract that's an absolute guarantee, you can include it. For most people the income side of this budget is going to be low or nothing. Your goal with this budget is to stretch your funds out for 6-12 months. The good news is that in 2017, the principle of geoarbitrage allows you to live on virtually any budget. If you live in the Bay Area your next step is going to be to move somewhere cheaper. On the cheapest end of the spectrum, I'll use Thailand as an example because I live here: you can get a basic apartment in the suburbs of Chiang Mai or Bangkok for $100-$500/mo, your initial arrival can be visa-free, and you'll live on delicious Thai food from a restaurant down the road for a few dollars a day. Network heavily with people in your intended destination before you even arrive because it'll make everything 100 times easier.
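The budgeting rule in this step boils down to a simple runway calculation. A minimal sketch (the figures below are illustrative, not from the comment):

```python
def runway_months(savings, guaranteed_monthly_income, monthly_expenses):
    """Months the savings last, counting only income you are 100% sure about."""
    burn = monthly_expenses - guaranteed_monthly_income
    if burn <= 0:
        return float("inf")  # guaranteed income covers expenses: indefinite runway
    return savings / burn

# e.g. $8,000 saved, no guaranteed income, ~$700/mo in suburban Chiang Mai
print(runway_months(8000, 0, 700))  # ≈ 11.4 months, inside the 6-12 month goal
```

The point of counting only guaranteed income is that any surprise (a freelance gig, interest) can only extend the runway, never shorten it.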

3. Now create a business plan for your new entity. The business plan should include a description of the product or service which you're going to market, how you're going to market it, what you're going to charge (start high), and any and all costs of development and operation including your own time. It should include monthly profit/loss projections (you're not allowed to use these projections in your budget, they are goals, not guarantees). The most important thing about your business isn't what product or service you initially offer. Once you have assets and control you can try anything you want. Until then the goal of your business is to make enough income that your assets are growing, no matter what that entails.

If you're leaving the country as a part of this process I would advise forming an LLC and opening a bank account before you go, as these things can be difficult from overseas. You'll be very busy trying to make money and living your dream so you don't want to have to deal with paperwork.

Prepare yourself mentally to work very hard for at least the next 6 months and do whatever you need to do to make enough cash. You will become practical and decisive, and you'll learn many realities about business, such as cash flow is king, very quickly. I got my start being nickel-and-dimed by agencies in India over Elance. It sucked and it was hard and it was 100% worth it.

There are many objections to this strategy which typically stem from risk aversion, or a desire to not worry about money. I would submit that if one objects to the risk, this plan is a personal growth opportunity: it will teach them how to handle stress, plan for contingencies, and so on. If the objection is that they don't want to worry about money, I would point out that money is just a way for people to quantify your value to them, and since no man is an island, there are great personal and financial rewards to be reaped from confronting this objection and discovering what other people truly value about you.

Doing step 1 first and now is the key. If your path brings you through Bangkok let me know and we'll grab a beer! I've seen many people succeed at this and a few fail. Your odds are better than you think.

jhylau 2 days ago 0 replies      
one word: liberating
How do you Design a cryptocurrency like ethereum/BTC to have stable value?
8 points by noloblo  1 day ago   10 comments top 8
blakdawg 1 day ago 0 replies      
The value swings represent changing ideas in the minds of human beings about the value of the underlying currency.

If you want the value of a currency to be stable, you need to be in a position to economically support the stable price - if lots of people want to sell, the stabilizer must be prepared to buy a lot of the currency at the stable price. If a lot of people want to buy, the stabilizer must be prepared to sell a lot of the currency at the stable price. This is potentially a very, very expensive - and perhaps impossible - undertaking.
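The stabilizer described above is essentially a market maker with finite reserves, which is why the undertaking gets expensive. A toy sketch (the class, the fixed peg, and the reserve figures are all hypothetical, for illustration only):

```python
class Stabilizer:
    """Toy peg-maintainer: buys below the peg, sells above it,
    limited by its reserves of cash and coin."""

    def __init__(self, peg, cash_reserve, coin_reserve):
        self.peg = peg
        self.cash = cash_reserve
        self.coin = coin_reserve

    def absorb_sell_pressure(self, coins_offered):
        """Buy coins at the peg until cash runs out; return coins absorbed."""
        affordable = self.cash / self.peg
        bought = min(coins_offered, affordable)
        self.cash -= bought * self.peg
        self.coin += bought
        return bought

    def absorb_buy_pressure(self, coins_demanded):
        """Sell coins at the peg until the coin reserve runs out."""
        sold = min(coins_demanded, self.coin)
        self.coin -= sold
        self.cash += sold * self.peg
        return sold

# With $1M of cash backing a $100 peg, the peg breaks after 10,000 coins
# of sell pressure -- the stabilizer simply cannot buy the 10,001st coin.
s = Stabilizer(peg=100.0, cash_reserve=1_000_000.0, coin_reserve=0.0)
print(s.absorb_sell_pressure(20_000))  # absorbs only 10000.0 coins
```

The sketch makes the cost concrete: the defender's reserves, not its intentions, set how much selling (or buying) the peg can survive.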

See https://en.wikipedia.org/wiki/Monetary_policy for more.

Ultimately, value isn't something that's designed, it's the overall effect of a lot of individuals' preferences and guesses about the future.

rthomas6 8 hours ago 1 reply      
The first step is to not make it deflationary, which causes people to hoard it instead of spend it. Unfortunately BTC gained such popularity in part because of this property.

One cool way to do this that I've been thinking about is to tie the mining reward rate to the exchange rate somehow. This has the effect of more coins being created when prices are low, and fewer created when prices are high. In theory this should stabilize the price. The problem with this method is that you need a way to measure the "price" in a decentralized way, separate from any other currency.

One way to do THAT is to aim for a certain velocity of money (https://en.wikipedia.org/wiki/Velocity_of_money). Theoretically it should correlate to inflation: if people are hoarding, the *coin will gradually start to decrease in value until more people are spending. If people are spending like crazy, the value will gradually increase. Not sure if that is a good solution, just one that I thought up.
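One reading of the reward-rate idea above is to keep the *value* of each block reward roughly constant, so more coins are minted when the price is low and fewer when it is high, exactly as described. A sketch under that assumption (the target value, bounds, and the very existence of a trustworthy on-chain price are hypothetical; measuring the price in a decentralized way is the unsolved part the comment itself points out):

```python
def block_reward(target_value, observed_price, min_reward=0.0, max_reward=None):
    """Issuance chosen so that reward * price ≈ target_value:
    low observed price -> more coins minted, high price -> fewer."""
    reward = target_value / observed_price
    if max_reward is not None:
        reward = min(reward, max_reward)  # cap runaway issuance at tiny prices
    return max(reward, min_reward)       # keep some floor so mining stays paid

# Targeting $50 of value per block:
print(block_reward(50.0, observed_price=2.0))   # 25.0 coins when price is low
print(block_reward(50.0, observed_price=10.0))  # 5.0 coins when price is high
```

This is a pure feedback rule; whether the "observed_price" input comes from a velocity-of-money proxy, as suggested above, or from some other oracle is exactly where the hard decentralization problem lives.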

WheelsAtLarge 1 day ago 0 replies      
It's not possible to have a 100% stable currency. At their core currencies are human opinions of how much to exchange for a product or service. But human opinion changes on a whim.

I think a currency could be stabilized by automatically adding coins to the total when people want to hoard them (such as during economic panics) and reducing the supply when people spend freely (such as when the economy is booming).

It has to be automatic, but finding a way to do that is the big question. What and how? An index, maybe, but it has to be impartial and give a true view of how people feel about the economy at the time.

For the US dollar the Federal Reserve is responsible for increasing and decreasing the money supply but they have truckloads of people telling them how the economy is doing.

A cryptocurrency has potential since in theory you can figure out exactly how people are using them.

Additionally, the supply has to increase as more people use it to create economic goods and services. If you don't, there won't be enough currency to go around and it will limit the potential products/services that people can create and consume.

Finnucane 1 day ago 0 replies      
As others have said, basically it's a problem of control. Normal currencies, in relatively normal times, are semi-managed by central banks. They use tools like reserve requirements and interest rates to influence the rate at which new currency flows into the general economy. It's possible for one bank to trade in the market in a way to peg one currency against another, as China did for some years. And it was costly, because they were doing it while trying to maintain a large trade surplus, so large numbers of dollars were coming in that needed to be neutralized somewhere. It's why they hold so much US debt. Real currencies get in trouble--severe deflation or inflation--when circumstances (the gold standard, external debts financed in another country's currency) make it hard to respond to economic conditions.

This becomes the question: can the system be designed to respond dynamically to demand in a way that mimics the actions of a central bank, without there actually being a central bank? If the goal was to make it so that it was reasonably constant in terms of purchasing power, there's probably a number of metrics one could choose to measure that, but how to achieve it, that's above my pay grade.

notahacker 15 hours ago 0 replies      
option (i) have a credible centralised monetary authority that promises to redeem it for its reserves of dollars or gold or whatever you want it to be stable against. Though arguably for the crypto-currency to actually be credible it needs that monetary authority to hold (or have the facility to borrow at short notice) the asset it's backing it with

option (ii) replicate the fiat system: i.e. the spendable currency is mostly credit with a future repayment obligation and thus actual future demand to hold the currency, and imbalances between rates of borrowing and repayment which drive the currency value up and down are kept in check by algorithmic adjustments to "interest rates" creditors face when borrowing to meet margin calls and "taxes" which make debtors have to buy even more coin to meet payment obligations. (This still isn't going to work unless credit creators are regulated though...)

whether it's still a cryptocurrency after all that is another question....

SirLJ 3 hours ago 0 replies      
Everything is about speculation, so you cannot prevent it, unless you have a large quantity of reserve tokens and act as a central bank...
Suncho 1 day ago 1 reply      
You can't design a cryptocurrency with stable value. At least not without a central monetary authority anyway. A key selling point of cryptocurrency is that it (theoretically) exists outside of the influence of a central authority. But this very feature is the thing that will prevent it from ever having a stable value.

Because there's a limited amount of it, it's deflationary in nature. If a currency becomes deflationary, then it's not going to be a currency for very long. People will hoard it instead of using it in their everyday transactions.

Cryptocurrency gets most of its value from speculation. In the case of Bitcoin, at least, that speculation is spurred by some people's belief that it has importance as a potential future currency. It doesn't matter that they're wrong. Their beliefs are still enough to boost the price of the speculative asset. Then you have this cycle where its price will crash every so often. If Bitcoin's price ever stabilizes, it will be because most people have given up on it as a currency.

Once the hype dies down, the question is whether it can do what gold did and become a valuable asset with a relatively steady price. Will there be fringe Bitcoin bugs like there are gold bugs who create enough demand for it to keep the price from going to zero? I guess we'll see.

At its core, Bitcoin is gold. And if you want an answer to why we'll never be using Bitcoin as our currency, ask yourself why gold is never again going to be our currency. It's the same answer. Without a central authority, you can't keep the price stable.

Now we could certainly construct a credit structure on top of Bitcoin. So instead of trading Bitcoin directly, we trade IOUs for Bitcoin. That's what we did with the gold standard. If they're "just as good" as Bitcoin, then it effectively increases the circulation of Bitcoin-denominated assets and can help keep prices stable.

You'd of course need some kind of trusted authority to tweak various macroeconomic variables to ensure that the value of Bitcoin (or Bitcoin IOUs) remains stable. But it's possible at least for a while. You just have to give up one of the core principles of cryptocurrency.

Eventually you're going to have so much money circulating and so little actual Bitcoin to back it up that the system is going to become too brittle and you'll have to leave the Bitcoin standard.

Hey look. You just bootstrapped another fiat currency.

Bitcoin is gold. It can be manipulated by governments just as much as gold can. And it has just as much potential to be a real currency as gold does. For a long time there, we were lucky and we were mining new gold at roughly the rate required to keep prices stable. No such luck with the cryptocurrencies. And even if you manage to hit on the right rate of currency minting today, the future looks different.

Ask HN: What Podcasts are you listening right now and why?
71 points by dirtylowprofile  1 day ago   72 comments top 57
georgespencer 20 hours ago 1 reply      
Two which I really enjoy:

- In Our Time. Legendary BBC Radio 4 show in which four experts discuss a topic (e.g. 'enzymes', 'The Egyptian Book of the Dead', 'The Paleocene-Eocene Thermal Maximum') in terms a layperson can understand for about an hour, guided by a host who asks all the dumb questions for the listener.

- Norm Macdonald Live. Former SNL cast member spends a couple of hours interviewing guests (e.g. Billy Bob Thornton, Adam Sandler). One of the most consistently funny and off-key shows I've ever heard.

rubayeet 19 hours ago 0 replies      
[0] Tell Me Something I Don't Know - Trivia show with a twist, hosted by Steven Dubner of Freakonomics fame.

[1] How I built this - Interviews successful entrepreneurs on their background, motivations, challenges etc. in building their businesses.

[2] Revolutions - Podcast on some of the biggest political revolutions in history. I am going through season 2 (American Revolution against the British Empire).

[3] War Stories - "Traces the evolution of warfare through the eyes who lived it". Season 1 focused on armoured warfare (a.k.a. Tanks). Waiting on Season 2.

[4] Science Vs - Researches the fads/opinions (organic food, meditation, ghosts etc.) to figure out if they are based on science.

[0] http://tmsidk.com/
[1] http://www.npr.org/podcasts/510313/how-i-built-this
[2] http://www.revolutionspodcast.com/
[3] https://angrystaffofficer.com/war-stories-podcast/
[4] https://gimletmedia.com/science-vs/

doe88 20 hours ago 1 reply      
New season of Revisionist History by Malcolm Gladwell has just started http://revisionisthistory.com/

Worth it.

FlorianOver 21 hours ago 1 reply      
teekert 20 hours ago 0 replies      
Linux Unplugged (Jupiter Broadcasting) [0]: Informal chat with some cool people from the Linux world, often guests like Martin Wimpress (Ubuntu Mate, Raspberry Pi enthusiast), Frank Karlitschek/Jos Poortvliet (Nextcloud), Ikey Doherty (Solus/Coreboot), etc.

Linux Action News (Jupiter Broadcasting) [1]: 30 min overview of news from the Linux world.

No Agenda: For a healthy news diet [2].

TWIT: Loving the over-friendliness and forced extravertedness less and less and missing Dvorak, but still, a nice Tech overview.

Story Grid: (From time to time) In depth analysis of books from the perspective of a writer and editor. Very insightful.

[0] http://linuxactionnews.com/
[1] http://www.jupiterbroadcasting.com/115911/halls-of-endless-l...
[2] http://www.noagendashow.com/

scarface74 5 hours ago 0 replies      
Knowledge @ Wharton - sometimes interesting, I end up skipping about half of the episodes.

Exponent -- Ben Thompson of Stratechery -- very insightful commentary on business and technology.

NPR Planet Money -- Economics is a second love of mine.

Startup -- by Gimlet Media -- Stories about the startup culture

Science vs -- Researches fad and compares them to the actual science.

Acquired -- discusses technology acquisitions

Internet History Podcast -- just what it says it is.

Freakonomics -- Because it's Freakonomics, should be required listening for anyone who wants to talk about economics.

Political Gabfest -- definitely liberal leaning political commentary.

Career Tools/Manager Tools -- I suggest these two podcasts to anyone who is working. Binge on them from the beginning and skip the ones that aren't relevant to you.

The Talk Show w/John Gruber -- required listening for Apple nerds.

Accidental Tech Podcast -- same as above.

Slate Money -- Did I mention I'm an economics nerd?

Pandabob 19 hours ago 0 replies      
Econtalk for its usually insightful discussions. http://www.econtalk.org/

Conversations with Tyler for the same reason as Econtalk.https://itunes.apple.com/us/podcast/conversations-with-tyler...?

Bodega Boyz because nothing makes me laugh like Desus and Mero. https://soundcloud.com/bodega-sushi

akurilin 21 hours ago 0 replies      
To Be Continuous (https://www.heavybit.com/library/podcasts/to-be-continuous/) has been my go-to.

It's great if you're interested in continuous delivery, startups, fundraising, product development, best practices etc. from two founders who have been and continue to be successful at their roles.

smkellat 17 hours ago 0 replies      
I mostly listen to things from the BBC due to the World Service no longer broadcasting to North America as of 2001. There are some CBC programs since the loss of Radio Canada International. Mostly comedies plus The Archers Omnibus.

These are for listening pleasure. CBC & BBC both have comedy of the week podcasts. Because News on CBC is hilarious. Drama of the Week on BBC is good though sometimes off the wall.

I could listen to Larry Kudlow for business reasons but lately I cling tight to my comedies. I need the escape.

eddyg 16 hours ago 0 replies      
I listen to the SANS Internet StormCast ("daily 5-10 minute information security threat updates") every morning: https://isc.sans.edu/podcast.html

Many of my other favorites have already been mentioned, but I also listen to:

Twenty Thousand Hertz ("stories behind the world's most recognizable and interesting sounds"): https://www.20k.org

and have started listening to this new NPR podcast:

Wow in the World ("a new way for families to connect, look up and discover the wonders in the world around them. Every episode, hosts Mindy and Guy guide curious kids and their grown-ups away from their screens and on a journey. Through a combination of careful scientific research and fun, we'll go inside our brains, out into space, and deep into the coolest new stories in science and technology"): http://www.npr.org/podcasts/510321/wow-in-the-world

akras14 22 hours ago 1 reply      
Hardcore History
kageneko 16 hours ago 1 reply      
I listen to a lot of horror fiction podcasts like Darkest Night, the No Sleep Podcast, some other fiction stuff like Rabbits and Alice Isn't Dead, and then some other stuff.

Planet Money -- my favorite

99% Invisible

Marketplace with Kai Ryssdal

History Honeys

Six Feats Under

My wife is more into Sunday School Dropouts than I am, but I listen to it occasionally. She also listens to some other history podcasts but I don't recall what they are.

unicornporn 21 hours ago 0 replies      
Zero Books: Advancing Conversations:


Team Human:


Internet History Podcast:


Featured Voices:


cjCamel 19 hours ago 0 replies      
Not technical, but tech related:

http://exponent.fm/ Exponent by Ben Thompson (of Stratechery) and James Allworth is great for analysis of big tech issues and news.

https://trackchanges.postlight.com/ Track Changes by Paul Ford and Rich Ziade can be quite light, but they have some interesting guests and have lived on the web since it started.

Funny stuff - if you like Football (Soccer) then The Football Ramble is essential. http://www.thefootballramble.com

For British nonsense humour, two of them have just started a spin-off. Humour is subjective though, so YMMV and don't judge me! http://stakhanovindustries.com/lukeandpeteshow

(edited all of my beautifully crafted markdown links because I forget HN can't do that)

jogundas 20 hours ago 0 replies      
I am surprised that the omega tau ( http://omegataupodcast.net ) has not been mentioned yet. Many episodes on aviation, but also quite a few on hardware, software, and science. The style is incredibly nerdy, which I guess is an advantage for the HN crowd.
Dowwie 18 hours ago 0 replies      
About startups:

1. "How I Built This" with Guy Raz https://www.npr.org/rss/podcast.php?id=510313

2. "Startup" by Gimlet media http://feeds.gimletmedia.com/hearstartup

3. Stanford's DFJ ETL: https://web.stanford.edu/group/edcorner/uploads/podcast/Educ...

4. "This week in startups" http://feeds.feedburner.com/twist-audio

dmoreno 20 hours ago 0 replies      
Security Now -- great weekly review of security and IT news

Stuff to Blow Your Mind -- has some great in-depth analysis about science and more

Techstuff -- loved the series covering the whole story of Sony, Nintendo, Samsung...

The Bike Shed -- two very technical guys, very funny

The changelog -- great interviews

Software Engineering Daily -- a guest with a technical topic every day.

torbjorn 20 hours ago 1 reply      
The History of Rome - An excellent 191 episode series covering, well, the history of Rome by Mike Duncan.
ghostwreck 20 hours ago 1 reply      
Masters of Scale - just started and really great, Reid Hoffman talks about startups and growing them.
flarg 20 hours ago 0 replies      
Not seen Floss Weekly mentioned thus far. Features some really amazing projects and the presenter somehow makes complex topics understandable. This and HN are the main sources of my tech knowledge.
amerkhalid 12 hours ago 0 replies      
Side Hustle School - https://sidehustleschool.com/

It is short and quick, and it's very interesting to see the creativity people apply to generating some side income.

VelNZ 20 hours ago 0 replies      
I'm a big fan of Skeptoid (http://www.skeptoid.com). Brian does a great job of telling the story of many popular pseudoscientific/conspiratorial/unexplained things and then addresses them with evidence and scientific skepticism but in an insightful way without mocking or being judgemental.
apstyx 22 hours ago 0 replies      
Hardcore History

The Economist (Paid for but worth every cent, 8 hours of news)

The Economist asks

Tim Ferris

No such thing as a fish


All songs considered

mvindahl 19 hours ago 0 replies      

- Reply All

- Planet Money

- Hanselminutes

Reply All is about the internet and planet money is about money, but in both cases it's as much about people and the interesting things that we do.

Hanselminutes is Scott Hanselman interviewing interesting guests about aspects of software development. It has a laid back and friendly pace. Scott is always well prepared and a very nice host.

gaastonsr 23 hours ago 0 replies      
How I Built This with Guy Raz
ajdlinux 20 hours ago 0 replies      
My podcasts aren't hugely funny, or startup centric, but everyone else is sharing their list... Not an American, but with a strong interest in what's going on across the Pacific.

Risky Business, Pod Save America, Lawfare Podcast, Chat 10 Looks 3, The Dollop, Bombshell (War on the Rocks), FiveThirtyEight Politics

SirLJ 1 hour ago 0 replies      
On the Wind Sailing
nunez 6 hours ago 0 replies      
None because I've been in a music kind of mood lately.
arcticfox 21 hours ago 3 replies      
In addition to what I consider the usual suspects (Startup [Gimlet] / Radiolab / This American Life), I just added two a little more off the beaten path:

The Pitch - Shark Tank on a podcast essentially. Somewhat deeper. The more recent episodes are way better than the first ones so just skip to the end.

Waking Up - so refreshing to hear someone as thoughtful as Sam Harris on a regular basis. I love that he is so calmly rational that he can have productive conversations with everyone from left to right, atheist to Muslim.

2m 21 hours ago 0 replies      
My list https://lists.pocketcasts.com/d4460e3f-2edc-4d23-b6da-8c62d0...

If I would recommend one from the list it would be Ted Radio Hour

divan 18 hours ago 0 replies      
NPR's podcasts:

- How I Built This

- Planet Money

- Ted Radio Hour

- AltLatino

- 99% invisible - Hanselminutes

- GoTime

NPR's podcasts (and How I Built This especially) are of incredible quality - they even write music for each episode.

epinifrim 20 hours ago 0 replies      
The Google Cloud Platform Podcast is a really good one, especially if you are interested in Google technologies. https://www.gcppodcast.com
phillc73 20 hours ago 0 replies      
The Final Furlong Pod: http://www.attheraces.com/finalfurlongpodcast

Why? Because I love horse racing and it is funny.

raleighm 20 hours ago 0 replies      
I'm only a few episodes in but engaging thus far: http://argumentninja.com/podcast-episodes/
aatchb 20 hours ago 2 replies      
Some that haven't been mentioned:

Security Now - Steve Gibson basically reviewing the week in software and hardware security.

Rationally Speaking - Intellectual stuff

No Such Thing as a Fish - fun trivia from the people behind the QI TV show.

jansho 19 hours ago 0 replies      
Not related to startups specifically (unless edtech?) but I highly recommend Metalearn for personal development. It's interview-based but that's what I like about it!
wqweto 17 hours ago 0 replies      
http://www.nerds2nerds.com/ -- just to troll demigod :-))
aquilax 20 hours ago 0 replies      
Artifexian http://www.artifexian.com/ - two guys discussing world building
ropeladder 22 hours ago 0 replies      
Reply All is funny and about the internet. The Dig is not funny or about startups, but I just discovered it and wanted to share. Smart policy discussion with a lefty viewpoint.
dirtylowprofile 15 hours ago 0 replies      
Damn! Thanks for the suggestions! Right now I'm listening to Masters of Scale for starters.
fpgaminer 21 hours ago 0 replies      
The IndieHackers podcast has been really great so far.
tmccrmck 21 hours ago 0 replies      
I highly recommend Embedded.fm if you're interested in anything in the embedded software world.
crispytx 23 hours ago 0 replies      
Revisionist History with Malcolm Gladwell
dejawu 20 hours ago 1 reply      
- Accidental Tech Podcast (Marco Arment's podcast about Apple tech)

- Harmontown

- Obsessed with Joseph Scrimshaw

aaronbrethorst 20 hours ago 1 reply      
The Daily - An NPR-like ~20 minute podcast from the NYT. https://www.nytimes.com/podcasts/the-daily

Indie Hackers - Insightful 1:1 interviews with founders of smaller 'lifestyle' businesses. https://www.indiehackers.com/podcast

In Our Time - Wonderful history podcast from the BBC http://www.bbc.co.uk/programmes/b006qykl

Lovett or Leave It - Insightful weekly political podcast from Jon Lovett, a former speechwriter for Barack Obama who was once called "the funniest man in Washington." https://getcrookedmedia.com/lovett-or-leave-it-6077c7aca95c

The Perceptive Photographer - 10-15 minute podcast released every Monday from my favorite photography teacher. Insightful and brief snippets about a variety of topics of interest to fine art photographers. https://www.danieljgregory.com/perceptivephotographerpodcast...

Pod Save America - Twice-weekly podcast from four guys who used to be in the Obama White House. Super-insightful political commentary. Lots of coarse language. https://getcrookedmedia.com/here-have-a-podcast-78ee56b5a323

Pod Save the People - DeRay McKesson's weekly podcast on social justice and activism. Even if you don't know DeRay's name, you'd probably recognize him based on his blue Patagonia vest. https://getcrookedmedia.com/pod-save-the-people-56bc42af53d

S-Town - A co-production from Serial and This American Life. It starts off as a murder mystery and then goes off into left field. A beautiful, sort of American Gothic look at our country. The ending left me feeling a bit...empty maybe? Still, an incredibly worthwhile way to spend seven hours. https://stownpodcast.org

skyisblue 20 hours ago 0 replies      
React native radio. Currently trying to build an app in React Native.
tzhenghao 20 hours ago 0 replies      
1. a16z

2. Acquired - Podcast about Tech Acquisitions + IPOs

3. Recode Decode by Kara Swisher

forkLding 22 hours ago 0 replies      
Software Engineering Daily, has some focus on startups
mike128 20 hours ago 0 replies      
The Tim Ferriss Show
evanb 21 hours ago 0 replies      
What Trump Can Teach Us About Con Law

The West Wing Weekly

Planet Money

The Adventure Zone

This American Life

In Our Time with Melvyn Bragg

99% Invisible

The Tobolowsky Files

Coffee Break German

rthille 21 hours ago 0 replies      
The Political Gabfest - Slate
The Ezra Klein Show - Vox
The Weeds - Vox
Waking Up - Sam Harris
adomanico 11 hours ago 0 replies      
Sam Harris - Waking Up Podcast

Bill Burr - Monday Morning Podcast

Joe Rogan - PowerfulJRE (not every episode)

Duckton 19 hours ago 0 replies      
Waking Up with Sam Harris. Very informative, interesting guests. Especially lately, with everything going on in US politics.
jaymenon 20 hours ago 0 replies      

Why Oh Why

Planet Money

All Songs Considered

The Dinner Party Download

Hidden Brain

Radiolab

The Splendid Table

You Are Not So Smart

boltzmannbrain 20 hours ago 0 replies      
Coding: Talk Python to Me, Software Engineering Daily

AI: Talking Machines

Misc: Waking Up with Sam Harris, a16z

Ask HN: What books are you reading?
26 points by curiousgal  2 days ago   27 comments top 22
prthkms 19 hours ago 0 replies      
Drive: The Surprising Truth About What Motivates Us (https://www.amazon.com/Drive-Surprising-Truth-About-Motivate...)

Offers a fresh perspective on what motivates people.

jasonkester 1 day ago 0 replies      
Seul sur Mars (The Martian, in French)

It is really boosting my understanding of the French language, and giving me more confidence to speak it.

It's a simple story that's easy to follow, especially having read the book in English and seen the film a couple of times. And really, how lost can you get? If you can't follow a paragraph or two, chances are he'll still be stuck on Mars for a while and you won't have missed much.

It's written in an informal, conversational style, using language that real people might use. I find myself reading a phrase that translates back to a saying I've used in English. Ah, looks like they use that in French too. I'll add it to the repertoire.

I can pick it up after a while off and quickly get back into it without explanation. Hmm... this looks like the part where the guy is stuck on Mars...

And as a bonus, it's kinda hard work to read in a foreign language, so if I pick it up in bed it's guaranteed to put me to sleep inside of half an hour.

Highly recommended.

Jtsummers 2 days ago 0 replies      
Specifying Systems, http://lamport.azurewebsites.net/tla/book.html

Engineering a Safer World, https://mitpress.mit.edu/books/engineering-safer-world

Software Specification Methods, https://www.amazon.com/Software-Specification-Methods-Henri-... (also available through Safari Books Online, at least at my office)

Read most of the third one this week, a useful comparison of the various approaches. My objective is to understand how to better produce formal (or more formal) specifications. Either for whole systems or just for significant or critical portions of them.

comsci-bro 2 days ago 0 replies      
The PhD Grind: http://pgbovine.net/PhD-memoir/pguo-PhD-grind.pdf

It is a wonderfully written memoir that perfectly details the grad school experience and also includes some helpful notes from the author. I'll be graduating next year (bachelor's in CS), and my dad asked me if I wanted to enter grad school. The book sure did add some fuel to the fire.

mars4rp 23 hours ago 1 reply      
The Black Swan - it was way better than I expected! I especially love the part about silent evidence. If you are following HN and are not a billionaire, read it!
thakobyan 2 days ago 0 replies      
Currently I'm listening to "Personal MBA" audiobook and loving it so far. I'm not the biggest fan of business books but decided to give this one a try to learn a bit more about marketing and sales.

Here are the books I've read and want to read: https://booknshelf.com/@tigran/shelves

JSeymourATL 1 day ago 0 replies      
Raising the Bar: Integrity and Passion in Life and Business: The Story of Clif Bar & Co.

Just started this book last night. The story begins as the founder of Clif Bar walks away from selling his company and a $40M personal pay-out. Big idea so far: your business is an ultimate form of self-expression. > https://www.goodreads.com/book/show/29691.Raising_the_Bar

pedrodelfino 1 day ago 0 replies      
I am reading "Mindset: The New Psychology of Success" by Stanford professor Carol Dweck. I really wish I had read it 10 years ago... The book wasn't published back then, but some of the science behind it was already available. I am almost finished, and it is probably the best of the 12 books I have read so far this year.
bcbrown 2 days ago 1 reply      
Mind And Nature - A Necessary Unity, by Gregory Bateson, and I Am A Strange Loop, by Douglas Hofstadter. They're a great combination, as they're both attempts to define the concept of "mind" through patterns. Bateson is one of the early thinkers in the field of Cybernetics, which I've been meaning to learn more about.

Here's my (unfinished) reviews of the books I've read so far this year: https://github.com/bcbrown/bookreviews/tree/master/2017. At the end of the year I'll flesh them out a little more.

b_emery 1 day ago 0 replies      
The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution, by Walter Isaacson

It's a history of where all this - startup culture, Silicon Valley, computers, the internet, hackers - came from. It should be essential reading for anyone working in IT.

jetti 13 hours ago 0 replies      
WiX Cookbook and WiX 3.6: A Developer's Guide to Windows Installer XML
wu-ikkyu 1 day ago 0 replies      
The Manufacture of Madness: A Comparative Study of the Inquisition and the Mental Health Movement.

Highly recommended for anyone interested in an "outside the box" perspective on mental health and society at large.

astrodev 2 days ago 0 replies      
Peter Frankopan, The Silk Roads - world history from an Asiacentric perspective.

Harold Coyle, Team Yankee - WW3 in Europe in the 1980s from the perspective of a tank company commander. Poorly written, in my opinion, but the accurate (or so I hope) descriptions of the military tactics and equipment almost make up for it.

James Gleick, The Information: A History, A Theory, A Flood - excellent book about the history of information.

gubsz 1 day ago 0 replies      
Into Thin Air by Jon Krakauer.

It goes into detail about the Mount Everest disaster in the 90s.

delgadillojuanm 2 days ago 0 replies      
I'm reading the Deep Learning book written by Ian Goodfellow, Yoshua Bengio, and Aaron Courville. Also "Intercom on Starting Up".
miguelrochefort 2 days ago 1 reply      
"Getting Things Done" by David Allen
house9-2 2 days ago 0 replies      
Stephen King

The Dark Tower II: The Drawing of the Three

Wanted to read the first one before the movie came out, now I am hooked...

alltakendamned 2 days ago 0 replies      
Windows Internals 7th Ed.

Seveneves, Neal Stephenson

Astrophysics for People in a Hurry, Neil deGrasse Tyson

SirLJ 2 days ago 0 replies      
Right now I am reading again "More Money Than God"


elyrly 2 days ago 0 replies      
Just finished The Cartel today; next up is Algorithms to Live By.
stevekemp 2 days ago 0 replies      
The Chronicles of Amber, again.
sidcool 2 days ago 0 replies      
Essential AngularJS
Ask HN: How do I go about validating an idea?
10 points by yingw787  1 day ago   5 comments top 5
nicholas73 1 day ago 0 replies      
The landing page tells me nothing about what you do, and your FAQ gives only a hint of it. And honestly, it does nothing to convince me that you actually provide value. It's not even clear that you have an 'algorithm' versus simply having parsed some financial statements.

First and foremost, an investor wants to know whether your algorithm works. Have you backtested it with price history, and then what are the performance metrics?

And if it works, then the obvious question is: why do you need to sell your service?

The other thing that I'd worry about as an investor is that oftentimes value stocks are cheap for a reason. It's one of the last places I'd want to algo trade.

adolfoabegg 1 day ago 0 replies      
* Remove all the boilerplate text from your landing page. Focus on what your algo does: why is it different from current existing solutions?

* Use metrics to demonstrate capability.

* Hit real investors and get real feedback.

mtmail 14 hours ago 0 replies      
Who is "Wilburn Preston -- Index Fund Manager" from the testimonial? Wouldn't that person know others in the industry?
SirLJ 3 hours ago 0 replies      
You'll need to publish extensive backtesting, maybe 20 years' worth, to prove to investors that the system actually works. You'll also need to live-trade it for a few years and have audited real results... and check with your financial regulators about what exactly you can advertise, to avoid getting sued by the SEC and the government...
kull 1 day ago 0 replies      
Are you an investor yourself?
Ask HN: Framework for resolving disputes and technical disagreements?
4 points by fosco  1 day ago   1 comment top
mahesh_gkumar 1 day ago 0 replies      
If it's difficult to prove one way or the other, why do you care which option is chosen? Just pick one! If it doesn't work once released into production, pick something else! Remove emotions from the equation. No one should feel happy or sad that their option got chosen, nor should one feel happy or sad when that option works or fails. In the end, the only thing that should matter is the business value the solution provides. Do you think your customers care whose solution got picked? If people get 'hurt' easily when their solution is not chosen, then they need to grow up.
What would you have done to prevent the Ethereum flash crash on Coinbase yesterday?
6 points by noloblo  1 day ago   11 comments top 5
chollida1 1 day ago 2 replies      
Unfortunately it looks like you had a perfect storm of

1) Large order sent to market

2) Exchanges with a serious lack of liquidity

3) stop loss orders making things worse.

Everyone has their personal pet peeves; mine is stop loss orders. It's one of the 3 things that amateurs tend to use without any understanding of markets. The other two being use of margin, which probably doesn't need any explanation, and trading currencies/currency pairs.

In today's markets, stop loss orders are like market orders... 99.99% of the time, only people who don't know what they are doing use them.

spcelzrd 1 day ago 0 replies      
Other markets like NYSE and NASDAQ have circuit breakers. If an index declines by a certain amount, all trading is halted so people can think and sort out their positions. This doesn't stop a single stock from flash crashing, but transactions during a flash crash are frequently reversed.
unstatusthequo 1 day ago 0 replies      
Prevent? I wish I had the foresight to have had buy orders in at $0.11 on the rise!
meric 1 day ago 0 replies      
When a large market order comes in, if it looks like it will move the price by more than a certain percentage, warn the seller that there isn't enough depth to support the order.

And circuit breakers.
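
A toy sketch of the depth check being described: walk the bid book, simulate filling the market sell, and warn if the fill would move the price past some threshold. The (price, size) book contents and the 10% threshold are made-up illustrative values, not anything from a real exchange:

```python
def price_impact(bids, qty):
    """Fractional price move if a market sell of `qty` walks the bid book.

    bids: list of (price, size) levels, best bid first.
    """
    best = bids[0][0]
    last = best
    remaining = qty
    for price, size in bids:
        take = min(remaining, size)  # fill as much as this level allows
        remaining -= take
        last = price                 # worst price touched so far
        if remaining <= 0:
            break
    if remaining > 0:
        raise ValueError("order exceeds total book depth")
    return (best - last) / best

# A thin, gappy book like GDAX's that day: a big sell blows through it.
book = [(300.0, 10), (290.0, 20), (100.0, 50), (0.10, 1000)]
impact = price_impact(book, 60)
if impact > 0.10:  # warn before executing if the move exceeds 10%
    print(f"warning: this order would move the price {impact:.0%}")
```

Real matching engines would do this check against the live book before accepting the order; the point is only that the warning is cheap to compute.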

Finnucane 1 day ago 1 reply      
Design a cryptocurrency to have stable value, not speculative-asset value.
Ask HN: What projects are 20 years ahead of their time?
62 points by miguelrochefort  2 days ago   103 comments top 39
est 2 days ago 1 reply      
I'd say IE5.5.

All that dHTML shit, like DirectX, MIDI, .htc (JavaScript components), <img dynsrc> for videos, a sane CSS box model, CSS and inter-page transitions, VML for vector graphics, VRML for VR (buzzword), XML data islands (which influenced E4X and later JSX), the original XHR object, native crypto APIs, etc.

You could even program JScript on the server side with ASP, or execute it standalone with ActiveScript, and even control native GUI, like customizing your folders; the browser could be morphed into the file explorer. You could make apps with a few KB of JScript, unlike a 55MB Electron install bundle.

The Windows help files (CHM) are like a thousand years better than the macOS counterparts and Linux man files. CHM was the de facto ebook format back then, and it worked really well, with features like indexable topics and full text search. We now have to use devdocs.io or Dash.

Yes, it had its quirks and warts, but it was way ahead of its time.

bsenftner 2 days ago 1 reply      
Back in 2002, I started working on "automated digital doubles of real people" with an eye on automated actor replacement. I'd been working in VFX, with a history in 3D development going back to the early 80's. Long story short: I started with a stellar team, and I wrote and was awarded a global patent on automated actor replacement in filmed media. The company, patent, and seed funding all came together just as the financial crisis at the end of Bush's term climaxed. The team scattered, and I went it alone, building a tiny team of freelancers and pivoting to a game character service. I was never able to achieve more than a halfway decent game character creation web API, which no one would pay to use. All that is left now is 1) the twitter account https://twitter.com/3davatarstore?lang=en, and of course I still have all the tech. I use it for facial recognition now and no longer make avatars. It's even better now, but even when contacted by interested parties, no one wants to pay for it, at all.
TACIXAT 2 days ago 5 replies      
FPV Drone Racing - While automated flight is making great progress, there are some amazing pilots out there. Something about a person standing in a field with goggles and a controller feels so future to me.

3D Printing - This is going to be the main way to manufacture things in the future. The lab that is 3D printing houses with concrete. That makes me terrified for home values going forward. It will likely shift all the value into the land. The house will just become something you tear down and reprint every 10 years.

CRISPR - s/shitty gene sequence/perfect gene sequence/g That's insane. It's like an anti-virus product for the body (irony intended). We're going to live a very long time and be practically disease free pretty soon. I'm planning on living until 150 (27 now). It's placing a big bet on medical science, but I feel like we're on the edge of some huge things.

tylerruby 2 days ago 2 replies      
Lilium - The worlds first electric vertical take-off and landing jet. (https://lilium.com/)

Neuralink - Develops high bandwidth and safe brain-machine interfaces. (https://neuralink.com/)

Magic Leap - Mixed Reality (https://magicleap.com)

Crispr-Cas9 - A unique technology that enables geneticists and medical researchers to edit parts of the genome by removing, adding or altering sections of the DNA sequence. (https://en.wikipedia.org/wiki/CRISPR#Cas9)

This is a great question. The acceleration of technology has made it more important than ever for entrepreneurs to look far ahead when deciding where they want to make their impact in the world. Tomorrow's successful leaders in business will be the ones that peered into the most obscure places of the future to find its problems and its solutions.

AndrewDucker 2 days ago 5 replies      
Look back 20 years to 1997. Before Windows 2000 brought together the home and server codebases. Before the internet was available on mobile phones. Heck, back then the internet was used by less than 2% of the global population.

What was 20 years ahead of its time then? What would you have looked at and thought "That'll be massive in 20 years"?

About the only thing I can think of is VR. Which Sega tried to launch in the late 90s, and only now is selling over a million units.

tpeo 2 days ago 2 replies      
headcanon 2 days ago 1 reply      
I'd say most of the Ethereum-based startups, like Swarm City or Golem. Right now they're undergoing a Cambrian explosion of different business models, many of which require everyday people to be using cryptocurrencies on a regular basis. Current prices are due to speculation on this eventuality, and personally I'm bullish on crypto, but this feels too much like the 90s did for the internet. Not that that's a bad thing; it's just the natural cycle of innovation.
fiftyacorn 2 days ago 0 replies      
Driverless cars and a lot of AI - to me these technologies are beginning, and Im expecting to see a lot of bad AI in the coming years before it settles down
DonbunEf7 2 days ago 1 reply      
The various object-capability projects alive right now, like Tahoe-LAFS, Monte, Cap'n Proto, Sandstorm, and Genode, are all very much future technology. Imagine:

* No accounts, no passwords, just secret keycaps

* Instead of messy and complex role-based tables, capabilities always know exactly what they are capable of doing

* No more confused deputies

* Fine-grained trust

jonrgrover 10 hours ago 0 replies      
Any project that works with information inside a computer program, as well as data, has a chance to be 20 years ahead of its time. Current software development supports data very well but information very poorly. Right now the most we can do to work with information is to extract it after the fact using data analytics. If we could work with information inside a program, that would be a huge leap from where we are now.
candiodari 2 days ago 0 replies      
Homomorphic encryption: software that is secure based purely on the software itself, and can remain secure even on compromised hardware. This will enable real cryptocurrencies and zero-trust state-like entities.
richardthered 2 days ago 1 reply      
The Long Now Foundation and their 10,000-year clock. http://longnow.org/clock/

It's a clock. A physical clock. Designed and built to run, accurately, for 10,000 years without human intervention.

spodek 2 days ago 0 replies      
Mine: to motivate people to choose to lower their pollution and greenhouse gas emissions significantly (not just raising awareness, or things like using electric cars instead of driving significantly less).

People can do it, but they prefer living the way they do, which is what is causing the problems: knowing in principle that they should change their behavior, but not actually doing so.

Miami flooding more and more is not enough of a burning platform yet. Nature will provide it if we don't choose to change ourselves.

otterley 2 days ago 0 replies      
Not quite 20 years too soon, but Six Degrees pioneered social networking in 1997: https://en.wikipedia.org/wiki/SixDegrees.com
qubex 2 days ago 1 reply      
Hopefully my grandmother's funeral arrangements.
Raphmedia 2 days ago 1 reply      
Anything VR / AR. It existed, but the world and the tech weren't ready. Hell, both still aren't.
otterley 2 days ago 0 replies      
erik998 2 days ago 0 replies      
James Orlin Grabbe: https://en.wikipedia.org/wiki/James_Orlin_Grabbe

His Digital Monetary Trusts: https://en.wikipedia.org/wiki/Digital_Monetary_Trust

The End of Ordinary Money: https://www.memresearch.org/grabbe/money1.htm

Cyc, an artificial intelligence project that attempts to assemble a comprehensive ontology and knowledge base of everyday common sense knowledge, with the goal of enabling AI applications to perform human-like reasoning.

The project was started in 1984 by Douglas Lenat at MCC and is developed by the Cycorp company. Parts of the project are released as OpenCyc, which provides an API, RDF endpoint, and data dump under an open source license.

Prolog, backward chaining, forward chaining, opportunistic reasoning.

d--b 2 days ago 1 reply      
What does that really mean? Do you mean projects that are being worked on right now and will be delivered in 20 years? Or projects that are finished now and will only be understood in 20 years? Or projects that people thought wouldn't be possible for another 20 years?

1: VR, self-driving vehicles, nuclear fusion, artificial photosynthesis, quantum computers, robots that can manipulate things like humans do, wave energy harvesting, colonize Mars, cure cancer, cure Alzheimer's disease.

2: no idea!

3: drones, deepmind, blue led, electric sports cars, flyboards, voice activated assistants, smart wearables...

owebmaster 2 days ago 0 replies      
IndieWebCamp: https://indieweb.org/
SideburnsOfDoom 2 days ago 0 replies      
Some current things are either going to be huge, normal, mainstream, and just work in a boring way in 20 years' time, or they are fads which will pass.

Cryptocurrencies. 3d printers.

js8 2 days ago 0 replies      
Anthropogenic Global Warming.

I am being sarcastic. But it's very hard to see, today, any technology that could make my life significantly better (at least not more than fixing climate change would).

halis 2 days ago 0 replies      
Uh, if you look at what is happening today, OBVIOUSLY JavaScript was 20 years ahead of its time. You're welcome.
boramalper 2 days ago 1 reply      
Netscape Enterprise Server and Server-Side JavaScript (SSJS)


tim333 2 days ago 2 replies      
Nuclear fusion? I can see this thing in 20 years http://news.mit.edu/2015/small-modular-efficient-fusion-plan...
skdotdan 1 day ago 0 replies      
SpaceX. Of course they still need time, work, and funding, but they have a working product, ahead of their competitors, that clearly shows a path to the future of space exploration and colonization.
DocSavage 2 days ago 0 replies      
Making a significant impact on psychiatry via brain simulations. https://www.humanbrainproject.eu/en/medicine/

I could see this happening within 20 years, but not in the confines of the current project.

auganov 2 days ago 0 replies      
Datomic, when it first came out, probably. Meaning it's still over 10 years ahead of its time.
srinivasang87 2 days ago 0 replies      
Breakthrough Starshot / Nano spacecrafts - much more than 20 years ahead
PeterStuer 2 days ago 0 replies      
I was doing autonomous mobile robotics in the 1987-1996 era, does that count?
sunstone 2 days ago 0 replies      
SpaceX self landing rockets. "I'm on a barge!"
srinivasang87 2 days ago 0 replies      
1. Breakthrough Starshot / Nano spacecraft - much ahead of its time

2. Wireless power transmission of electricity / space based solar farms
TurboHaskal 2 days ago 1 reply      
APL comes to my mind.
conception 2 days ago 1 reply      
Cryptocurrencies, probably. I could see it taking a generation for them to be mainstream and understood by the public the way the internet is today.
bikamonki 2 days ago 0 replies      
Whatever Elon is working on...
baybal2 2 days ago 0 replies      
ghuntley 2 days ago 1 reply      
carsongross 2 days ago 0 replies      
Well, intercooler.js is 20 years behind its time:


Does that count?

Ask HN: Does anyone know what's happening with Magicleap?
16 points by yalogin  3 days ago   4 comments top 2
mrep 3 days ago 0 replies      
They raised [1] $793.5M 16 months ago, so they probably still have a decent amount of cash to ride out on.

[1]: https://www.crunchbase.com/organization/magic-leap/funding-r...

valuearb 3 days ago 1 reply      
They took the wrong tack. GPUs' rapidly increasing processing power means that portable devices can drive AR. You don't need super heavy, super expensive custom hardware. You just need great software.
Ask HN: What if WannaCry had tried issuing fake Bitcoins?
7 points by tarikozket  2 days ago   5 comments top 4
Lan 1 day ago 0 replies      
Not a whole lot. The fastest CPUs will net you less than 100 MH/s [0]. The fastest single-card GPU configurations will net you around 1,000 MH/s [0]. The current total hash rate is around 5,000,000,000,000 MH/s [1]. WannaCry affected around 300,000 PCs [2]. So if every WannaCry miner was operating off CPU and getting 100 MH/s, it would only be 30,000,000 MH/s. Or 0.0006% of the current total hash rate. If they all had high-end GPUs getting them 1,000 MH/s then that would bring it up to 300,000,000 MH/s. Or 0.006% of the current total hash rate. So WannaCry wouldn't even be a drop in the bucket. If you're wondering how the hash rate is so high, it's because mining has switched to using ASICs, the fastest of which run around 14,000,000 MH/s [3].

[0] https://en.bitcoin.it/wiki/Non-specialized_hardware_comparis...

[1] https://blockchain.info/charts/hash-rate

[2] https://en.wikipedia.org/wiki/WannaCry_ransomware_attack

[3] https://en.bitcoin.it/wiki/Mining_hardware_comparison
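
The back-of-the-envelope arithmetic above can be checked with a few lines of Python; the constants are just the round figures quoted in this comment, not fresh measurements:

```python
# Figures from the comment above (MH/s = megahashes per second).
NETWORK_HASH_RATE = 5_000_000_000_000  # total Bitcoin network, MH/s [1]
INFECTED_PCS = 300_000                 # machines hit by WannaCry [2]

def botnet_share(per_machine_mhs):
    """Botnet's share of total network hash rate, as a percentage."""
    return INFECTED_PCS * per_machine_mhs / NETWORK_HASH_RATE * 100

print(f"All CPUs (100 MH/s each): {botnet_share(100):.4f}%")      # 0.0006%
print(f"All GPUs (1,000 MH/s each): {botnet_share(1_000):.4f}%")  # 0.0060%
```

Either way, far too small a slice of total hash power to threaten consensus.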

axonic 1 day ago 1 reply      
If I understand the OP correctly, he means: could the blockchain be attacked by a vast number of infected hosts, causing a malware-induced change in the consensus of nodes and allowing BTC to be illicitly acquired, spent, or produced?

I believe controlling a massive number of nodes in the network via infection techniques like WannaCry used would open the door for many actual and hypothetical attacks. Please see the Bitcoin Wiki page titled Weaknesses [1] for more details about attacks involving the control of network resources.

More realistically, a simpler attack would be to go for control of the wallets if you have that kind of access to the infected hosts. However, if an actor had an interest in devaluing Bitcoin, to buy after a crash and sell after recovery perhaps, or just destabilize users' trust and destroy it (states?) then there could be a lot of profit in it I believe. Bitcoin has many competitors and enemies, is this something we should worry about?

[1] https://en.bitcoin.it/wiki/Weaknesses

download13 2 days ago 0 replies      
Can you clarify?
miguelrochefort 1 day ago 0 replies      
There's no such thing as fake Bitcoins.
Do all YC startups use Clerky and have the same style of terms in offer letters?
4 points by mrezak  1 day ago   5 comments top 4
arjunvpaul 10 hours ago 0 replies      
There might be a way to meet both your needs. If you are satisfied with everything else, there is no harm in letting the hiring process go ahead. Ask the employer to send you the offer letter but don't sign anything yet.

Once you have the offer letter in hand, CONSULT A LAWYER on your own dime to prepare the following. (Even if the employer's lawyers prepared the agreement and modified clerky agreements, you would have to get your own lawyer to review any agreement before you sign them)

Step 1: Write down your verbal agreement in an email. For example:

"Thank you for the offer letter. I would like to take this opportunity to document in writing what we discussed over the phone. Specifically:

-- As part of my employment agreement, you will provide me with a blue pony within 2 business days of my landing in SFO.

-- Within 90 days, you will also provide me with a visa and bear all costs for the same.

If this is also your understanding of our discussion, please send me an email back specifically stating that."

Step 2:

-- Strike out any clause in the standard agreement that says that there shall be no other verbal or written agreements (your lawyer can help identify these sneaky clauses). Initial all pages. Sign it.

-- Attach it and send it back to him/her.

-- Ask him to make 2 copies of that set, initial all pages AND any struck-out clauses, and send one copy back to you.

This way he/she gets to use Clerky, and you get good written documentation of any agreement.

If he refuses to do that, don't take the job.

NOTE OF CAUTION: It's not up to your employer to give you a green card or a visa; the US Government has to give you one. Your employer will have to pay "thousands of dollars to hire a lawyer" for a visa or green card anyway.

smt88 1 day ago 0 replies      
I'm sorry to say this, but this employer sounds like a scumbag. If this is how he treats you (by lying and trying to exploit you) at this stage, how is he going to treat you once you're stuck with him?

> he has to pay thousands of dollars to hire a lawyer to include the terms

This is utter bullshit. Call 5-10 local lawyers, tell them what is needed, and find out how much it would cost. Present the employer with the names of those lawyers and their prices.

> Is my request unreasonable?

ABSOLUTELY NOT!!! Always get everything in writing, no matter how much you trust someone. For all you know, the person who made you the promises will be at a different company or dead in a few years.

It's completely normal and reasonable to get something in writing, and it's either naive or malicious that someone is refusing.

nagrom42 10 hours ago 1 reply      
If you absolutely need to take this offer, draft up a written agreement outlining all the terms you two have verbally agreed to. Have you and the other party sign the form and email yourself and the company a digital copy of the signed form. Also, ask your boss to reply to the email and confirm the agreement to be valid.
smt88 1 day ago 0 replies      
(Also, assuming your HN username can be traced to your real name and therefore your future employer, I'd suggest deleting this and re-posting it under a throwaway username that can't be traced to you.)
Ask HN: As a lazy but concerned user, how do you run your own email server?
16 points by Cilvic  3 days ago   11 comments top 6
danieltillett 3 days ago 2 replies      
The problem is not setting up your own email server (this is relatively easy), it is getting all your mail into other people's inboxes. Basically the big players these days (I am looking at you Microsoft) just treat any mail coming from a private server as spam. Even more frustratingly they don't do it consistently, just frequently enough that you can't rely on anyone getting your email.

After running my own email server for 15 years I gave up a couple of years ago and paid for someone else to solve the nightmare of dealing with the big email gatekeepers.

type0 17 hours ago 0 replies      
You should fix the link for cloudron, it's https://cloudron.io

The problem there is that as they moved from beta, you need to pay them $8/mo to get catch-all email and updates.

DamonHD 3 days ago 2 replies      
I run my own email server and have done for ~25 years, but what do you mean by 'secure'?

SMTP isn't a secure transport.

Having your email stored on someone else's computers (ie the cloud) is not necessarily 'secure'.

Having a well-constructed and well-managed host somewhere you physically control seems to me the most 'secure' arrangement, which is what I have always had. Currently for the cost of a Raspberry Pi and occasional 'apt-get update' etc.

thiagooffm 2 days ago 0 replies      
I pay for protonmail. Works like a charm, got even an app. 50 bucks a year, totally worth it.
Lan 2 days ago 0 replies      
You could try an all-in-one solution like iRedMail[0] or Mail-in-a-Box[1]. Those supposedly do most of the leg work for you and set up a commonly used stack (Postfix, Dovecot, SpamAssassin, Roundcube, etc.). I've never used either of them since I just install everything piecemeal, but I imagine there is an ease-of-use tradeoff compared to setting the same stack up yourself. In other words, it'll be easier to set up initially, but the downside is that you won't learn the ins and outs of the individual components. So if something breaks or you need to make an adjustment you're going to have a more difficult time at that point.

That said, there are some things you should be aware of when running a mail server:

1. You need to make sure that the IP address and domain name that SMTP is bound to is not on a blacklist[2]. You also need to consider the trustworthiness of your host because you could very well get caught in the cross-fire if one of their other customers gets them range banned. Certain cloud providers that make it very easy to change IP will more than likely have all of their addresses on some blacklist or another.

2. You also need to make sure you have matching forward (A record) and reverse (PTR record) DNS records for that IP address. This is called Forward-confirmed reverse DNS, aka FCrDNS. Many mail servers will reject email from servers that do not have or have mismatching records for FCrDNS.

3. You must set up SPF and DKIM. Many mail servers will either reject mail from servers without these, or at least weight heavily against it.

4. You probably want to make sure TLS is set up properly, otherwise your mail is going to travel the internet in plaintext.

5. The IP address you're sending from is going to start off with no reputation. The volume, type of mail, and how many people mark your mail as spam is going to decide whether other mail servers start filtering you or not. You may have no problems here. If you're unlucky, you will need to try to reach out to whichever major mail provider is filtering your mail. Many of them have a ticketing system for this, but you'll be at the mercy of whomever is working that ticket. There are also various whitelists that might be worth trying your server on. They're usually very selective and will probably reject your request.

6. You really, really need to make sure you've got your policies set up correctly because you do not want to accidentally set up an open relay[3] that will be used to spam other people.

7. Greylisting is a very, very effective means of spam filtering. The downside is that mail from new servers won't be delivered instantaneously and will instead be delivered whenever their mail server tries to deliver it again. Other than that, most spam is malformed in some way so some basic DNS checks will filter a ton of it. There are also free RBL and DNSBL lists that will pick up the slack.

[0] http://www.iredmail.org/

[1] https://mailinabox.email/

[2] https://mxtoolbox.com/blacklists.aspx

[3] https://en.wikipedia.org/wiki/Open_mail_relay
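Points 2 and 3 above boil down to a handful of DNS records. A hypothetical zone sketch (the domain, IP address, DKIM selector, and truncated key are all placeholders; the PTR record actually lives in the reverse zone your provider delegates to you):

```
; FCrDNS (point 2): the A and PTR records must agree
mail.example.com.               IN A    203.0.113.25
25.113.0.203.in-addr.arpa.      IN PTR  mail.example.com.

; SPF (point 3): only the listed host may send mail for the domain
example.com.                    IN TXT  "v=spf1 a:mail.example.com -all"

; DKIM (point 3): public key published under <selector>._domainkey
default._domainkey.example.com. IN TXT  "v=DKIM1; k=rsa; p=MIGfMA0GCSq..."
```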

feborges 2 days ago 0 replies      
you don't.
Ask HN: Has Duckduckgo gotten worse recently?
42 points by pmoriarty  2 days ago   43 comments top 19
bsstoner 2 days ago 3 replies      
Disclaimer: I work for DDG.

I'd be interested in looking into any examples of searches where the results aren't good enough or where it seems to have gotten worse recently.

As far as I know there haven't been any changes over the past few weeks that would have made things worse.

watertorock 2 days ago 2 replies      
Unrelated but I wish they would rebrand already. Call it Duck, call it Go, call it something simple.

It's not an easy or memorable name at the moment, and branding matters.

tblyler 2 days ago 0 replies      
I noticed about a week or two ago that I was always leaning toward !g as well. I have been testing out my own instance of searx. I have yet to specify using google with it, since it already uses google results.


proaralyst 2 days ago 1 reply      
I've actually noticed the opposite; I've been resorting to !g much less.
babyrainbow 2 days ago 0 replies      
Ha! I had the exact opposite.

I used DDG for a while when it was introduced, but went back to Google since the results were not as good.

But recently I felt Google's results had gotten a lot worse and gave DDG another try.

Big difference! Like Google vs Yahoo back in the days.

Now DDG is my default search.

cdevs 2 days ago 0 replies      
Duckduckgo is my default at work and on my phone. I love it for quick programming questions and usually everything else is fine, but if I'm doing any design and need images I usually for some reason head to Google. For everything else I'd rather not have Google turn my query into my next YouTube ad 2 seconds later.
grizzles 1 day ago 0 replies      
I just switched over to DDG because I was getting Google's human detector ("classify these images") on nearly every search.

I guess my estimated worth to Google must be fairly low because I don't click on many ads and I often use a work VPN.

stuaxo 2 days ago 0 replies      
Weird, I've been feeling like the same thing has been happening, just noticed over the last couple of weeks that it doesn't seem to find what I want.

I'm in the UK and noticed I often seem to be getting US centric results and have to try using Google more often.

Edit: ddg has been my default for 2 years.

ravenstine 2 days ago 0 replies      
I've noticed its results have gotten much better. Now if there were a version of the Personal Blocklist chrome plugin for it (so I don't see crap from W3Schools, WebMD, Livestrong, etc.), I would never use Google again.
chatnati 2 days ago 0 replies      
I too feel the results have been poor of late. I think Google probably "knows" the type of questions you look for (i.e. usually on Stack Overflow for programming questions, or whatever most popular links people are clicking on around that time). If I'm looking for something very esoteric, I can tell straight away that DDG doesn't understand my query from its results, and I go over to Google and find the answer on the first, second or third page.
tmaly 2 days ago 3 replies      
I have been using !g much more for programming.

I was hoping to build an extension for DDG a few months back, but things seemed to have changed in the forum.

This could explain why we are seeing changes.

snoitavla 2 days ago 1 reply      
I recently wrote yet another DDG API wrapper for Python: https://github.com/alvations/rubberduck. I'm loving how I can browse the web in a Jupyter notebook. Oh the irony of having Jupyter in a browser and calling an API to get search results.
faulker 2 days ago 0 replies      
I've personally had the opposite, I've been getting better results over the last year and I'm using !g very rarely now.
Polyisoprene 2 days ago 0 replies      
For searches containing multiple words, code or error messages I resort to Google as DDG doesn't find the relevant pages. Other than that it's a lot better than before.
dodgedcactii 2 days ago 0 replies      
i've noticed this too and its fucking with my mind, since i end up using !g in a private window and not having the history (the whole point of being not tracked)
ramayac 2 days ago 0 replies      
I did the switch 2 weeks ago, I'm actually enjoying it!
amelius 2 days ago 0 replies      
Perhaps they should let the user choose the search algorithm. Make it an option somewhere.
lhuser123 1 day ago 0 replies      
I hope it keeps getting better.
everdayimhustln 2 days ago 0 replies      
I had a problem with a chrome extension hanging DDG from displaying results, but otherwise seems awesome as usual on mobile and desktop.
Ask HN: How do you organize your files
68 points by locococo  2 days ago   39 comments top 31
phireal 2 days ago 0 replies      
Home directory is served over NFS (at work). Layout is as follows:

 phireal@pc ~$ ls -1
 Box/       - work nextcloud
 Cloud/     - personal nextcloud
 Code/      - source code I'm working on
 Data@      - data sources (I'm a scientist)
 Desktop/   - ...
 Documents/ - anything I've written (presentations, papers, reports)
 Local@     - symlink to my internal spinning hard drive and SSD
 Maildir/   - mutt Mail directory
 Models/    - I do hydrodynamic modelling, so this is where all that lives
 Remote/    - sshfs mounts, mostly
 Scratch/   - space for stuff I don't need to keep
 Software/  - installed software (models, utilities etc.)
At home, my main storage looks like:

 phireal@server store$ ls -1
 archive    - archived backups of old machines
 audiobooks - audio books
 bin        - scripts, binaries, programs I've written/used
 books      - eBooks
 docs       - docs (personal, mostly)
 films      - films
 kids       - kids films
 misc       - mostly old images I keep but for no particular reason
 music      - music
 pictures   - photos, organised YYYY/MM-$month/YYYY-MM-DD
 radio      - podcasts and BBC radio episodes
 src        - source code for things I use
 tmp        - stuff that can be deleted and probably should
 tv_shows   - TV episodes, organised show/series #
 urbackup   - UrBackup storage directory
 web        - backups of websites
 work       - stuff related to work (software, data, outputs etc.)

kusmi 2 days ago 1 reply      
I made an automatic document tagger and categorizer. It collects any docs or HTML pages saved to Dropbox, dropped into a Telegram channel, saved with zotero, Slack, Mattermost, private webdav, etc, cleans the docs, pulls the text, performs topic modeling, along with a bunch of other NLP stuff, then renames all docs into something meaningful, sorts docs into a custom directory structure where folder names match the topics discovered, tags docs with relevant keywords, and visually maps the documents as an interactive graph. Full text search for each doc via solr. HTML docs are converted to clean text PDFs after ads are removed. This 'knowledge base' is contained in a single ECMS, external accounts for data input are configured from a single yaml file. There's also a web scraper that takes crawl templates as json files and uploads data into the CMS as files to be parsed with the rest of the docs. The idea is to be able to save whatever you are reading right now with one click whether you are on your mobile or desktop, or if you are collaborating in a group, and have a single repository where all the organizing is done actively 24/7 with ML.

Currently reconstructing the entire thing to production spec, as an AWS AMI, perhaps later polished into a personal knowledge base SaaS where the cleaned and sorted content is publicly accessible with a REST/CMIS API.

This project has single handedly eaten almost a third of my life.

jolmg 2 days ago 0 replies      
My home directory:

 - bin :: quick place to put simple scripts and have available everywhere
 - build :: download projects for inspection and building, not for actively working on them
 - work-for :: where to put all projects; all project folders are available to me in zsh like ~proj-1/ so getting to them is quick despite depth.
   - me :: private projects for my use only
     - proj-1
   - all :: open source
     - proj-2
   - client :: for clients
     - client-1
       - proj-3
 - org :: org mode files
   - diary :: notes relating to the day
     - 2017-06-21.org :: navigated with binding `C-c d` defaulting to today
   - work-for :: notes for project with directory structure reflecting that of ~/work-for
     - client
       - client-1
         - proj-3.org
 - know :: things to learn from: txt's, books, papers, and other interesting documents
 - mail :: maildirs for each account
   - addr-1
 - downloads :: random downloads from the internet
 - media :: entertainment
   - music
   - vids
   - pics
     - wallpaper
 - t :: for random ad-hoc tests requiring directories/files; e.g. trying things with git
 - repo :: where to put bare git repositories for private projects (i.e. ~work-for/me/)
 - .password-store :: (for `pass` password manager)
   - type-1 :: ssh, web, mail (for smtp and imap), etc.
     - host-1 :: news.ycombinator.com, etc.
       - account-1 :: jol, jolmg, etc.
Not all folders are available on all machines, like ~/repo is on a private server, but they follow the same structure.

amingilani 2 days ago 0 replies      
My home folder is it.

 .
 Desktop
 Downloads
 Google Drive // My defacto Documents folder
   legal
   library    // ebooks and anything else I read
   ...
 Downloads
 Sandbox      // all my repositories or software projects go here
 Porn         // useful when I was a teen, now just contains a text file with lyrics to "Never Gonna Give You Up"
I backup my home folder via Time Machine. I haven't used Windows in years but when I did, I used to do something similar. Always kept a separate partition for games, and software because those could be reinstalled easily, personal data was always kept in my User folder.

ashark 2 days ago 0 replies      
- ebooks: I don't love Calibre, but it's the only game in town.

- music: Musicbrainz Picard to get the metadata right. I've been favoring RPis running mpd as a front-end to my music lately.

- movies/TV: MediaElch + Kodi

I don't have a good solution for managing pictures and personal videos that doesn't involve handing all of it to some awful, spying "cloud" service. Frankly most of this stuff is sitting in Dropbox (last few years worth) or, for older files, in a bunch of scattered "files/old_desktop_hd_3_backup/desktop/photos"-type directories waiting for my wife and I to go through them and do something with them. Which is increasingly less likely to happen; sometimes I think the natural limitations of physical media were a kind of blessing, since one was liberated from the possibility of recording and retaining so much. Without some kind of automatic facial recognition and tagging, and saving of the results in some future-proof way, ideally in the photos/videos themselves, this project is likely doomed.

My primary unresolved problem is finding some sort of way to preserve integrity and provide multi-site backup that doesn't waste a ton of my time+money on set-up and maintenance. When private networks finally land in IPFS I might look at that, though I think I'll have to add a lot of tooling on top to make things automatic and allow additions/modifications without constant manual intervention, especially to collections (adding one thing at a time, all separately, comes with its own problems, like having to enumerate all of those hashes when you want something to access a category of things, like, say, all your pictures). Probably I'll have to add an out-of-band indexing system of some sort, likely over HTTP for simplicity/accessibility. For now I'm just embedding a hash (CRC32 for length reasons and because I mostly need to protect against bit-rot, not deliberate tampering) at the end of filenames, which is, shockingly, still the best cross-platform way to assert a content's identity, and synchronizing backups with rsync. ZFS is great and all but doesn't preserve useful hash info if a copy of a file is on a non-ZFS filesystem, plus I need basically zero of its features aside from periodically checking file integrity.

mcaruso 2 days ago 2 replies      
One thing I do that I've found to be pretty helpful is to prefix files/directories with a number or date, for sorting. Some things are naturally ordered by date, for example events. So I might have a directory "my-company/archive", where each item is named "20170621_some-event".

Other things are better sorted by category or topic. For tools or programming languages I'm researching I might have a directory with items "01_some-language", "02_setup", "10_type-system", "20_ecosystem", etc.
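The date-prefix idea can be sketched in a few lines of Python (a hypothetical helper; mcaruso's actual tooling isn't described, and the YYYYMMDD stamp mirrors the "20170621_some-event" example):

```python
from datetime import date
from pathlib import Path

def date_prefix(path: Path, stamp: date) -> Path:
    """Rename path to <YYYYMMDD>_<original name> so a plain sort is chronological."""
    prefix = stamp.strftime("%Y%m%d")
    target = path.with_name(f"{prefix}_{path.name}")
    path.rename(target)
    return target
```

For example, `date_prefix(Path("some-event"), date(2017, 6, 21))` renames the file to `20170621_some-event`.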

lkurusa 2 days ago 0 replies      
Roughly this scheme:

~/dev for any personal project work

~/$COMPANY for any professional work I do for $COMPANY

~/teaching for teaching stuff

~/research for academic research (it's a big mess unfortunately)

~/icl for school related projects (where "icl" is Imperial College London)

For my PDFs I use Mendeley to organize them and have them available everywhere along with my annotations.

I store my books in iBooks and on Google Drive in a scheme roughly like: /books/$topic/$subtopic

Organizing your files is usually just commitment; move files off ~/Downloads as soon as you can :-)

bballer 2 days ago 1 reply      
I try not to over think it, just:

 ~/$MAJOR_TOPIC
 |
 |--- ./$MORE_SPECIFIC
 |    |--- ./$MORE_SPECIFIC
 |    |    |--- ./general-file.type
 |    |
 |    ./general-file.type
 |
 |--- ./$MORE_SPECIFIC
      |--- ./general-file.type

As you find yourself collecting more general files under a directory that can be logically grouped, create a new directory and move them to it.

Also keep all your directories in the same naming convention (idk maybe I'm just OCD)

Animats 2 days ago 0 replies      

with each project under Git. Layouts for Go, Rust, ROS, and KiCAD are forced by the tools. Python isn't as picky.

Web sites are

 sitename/
   info - login data for site, domains, etc.
   site - what gets pushed to the server
   work - other stuff not pushed to server
with each site under version control.

two2two 2 days ago 0 replies      
One external RAID (mirrored) that holds information only necessary for when I'm working at my desk. Within that drive I have an archive folder with past files that are rarely, if ever, needed. The folder structure is labeled broadly, such as "documents" and "media", with more specific folders within. At the file level I usually put a date at the beginning of the name going from largest to smallest (2017-6-21_filename). For sensitive documents, I put them in encrypted DMG files using the same organization structure.

As for all "working" documents, they're local to my machine under a documents or project folder. The documents folder is synced to all my devices and looks the same everywhere with a similar organization structure as my external drive. My projects folder is only local to my machine, which is a portable, and contains all the documents needed for that project.

TL;DR Shallow folder structure with dates at the beginning of files essentially.

sriku 2 days ago 0 replies      
If you're particularly asking about reference material that you take notes about and would like to search and retrieve and produce reports on, Zotero might work for you. I have many years of research notes on it - it's a hyper-bookmarking tool that can keep snapshots of web pages, keep PDFs and other "attachments" within saved articles, lets you tag and organize them along with search capabilities.

Outside of that scope, my files reside randomly somewhere in the ~/Documents folder (I use a mac) and I rely on spotlight to find the item I need. It's not super great but is workable often enough.

It's not a silly question!

edit: I've been trying to find a multi-disk solution and haven't had much success with an easy enough to use tool. I use git-annex for this and it helps to some extent. I've also tried Camlistore, which is promising, but has a long way to go.

xymaxim 2 days ago 0 replies      
Another option is to have a look at a tag-based filesystem instead of hierarchical ones to organize everything semantically. I'm using Tagsistant (there're other options) for a couple of months now and I'm almost happy. More satisfied with the idea itself and the potentiality.
majewsky 2 days ago 0 replies      
My file layout is quite uninteresting. The most noteworthy thing is that I have an additional toplevel directory /x/ where I keep all the stuff that would otherwise be in $HOME, but which I don't want to put in $HOME because it doesn't need to be backed up.

- /x/src contains all Git repos that are pushed somewhere. Structure is the same as wanted by Go (i.e., GOPATH=/x/). I have a helper script and accompanying shell function `cg` (cd to git repo) where I give a Git repo URL and it puts me in the repo directory below /x/src, possibly cloning the repo from that URL if I don't have it locally yet.

 $ pwd
 /home/username
 $ cg gh:foo/bar  # understands Git URL aliases, too
 $ pwd
 /x/src/github.com/foo/bar
As I said, that's not in the backup, but my helper script maintains an index of checked-out repos in my home directory, so that I can quickly restore all checkouts if I ever have to reinstall.

- /x/bin is $GOBIN, i.e. where `go install` puts things, and thus also in my PATH. Similar role to /usr/local/bin, but user-writable.

- /x/steam has my Steam library.

- /x/build is a location where CMake can put build artifacts when it does an out-of-source build. It mimics the structure of the filesystem, but with /x/build prefixed. For example, if I have a source tree that uses CMake checked out at /home/username/foo/bar, then the build directory will be at /x/build/home/username/foo/bar. I have a `cd` hook that sets $B to the build directory for $PWD, and $S to the source directory for $PWD whenever I change directories, so I can flip between source and build directory with `cd $B` and `cd $S`.

- /x/scratch contains random junk that programs expect to be in my $HOME, but which I don't want to backup. For example, many programs use ~/.cache, but I don't want to backup that, so ~/.cache is a symlink to the directory /x/scratch/.cache here.
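The `cg` helper described above could look roughly like this in Python (a guess; only its behavior, not its implementation, is given in the comment, and the alias table and /x/src root are taken from the example):

```python
import subprocess
from pathlib import Path

# Alias table is an assumption, mirroring the "gh:" shorthand in the example.
ALIASES = {"gh:": "https://github.com/"}

def repo_dir(url: str, root: Path = Path("/x/src")) -> Path:
    """Map a Git URL to its Go-style checkout directory under root."""
    for short, full in ALIASES.items():
        if url.startswith(short):
            url = full + url[len(short):]
            break
    path = url.split("://", 1)[-1]
    if path.endswith(".git"):
        path = path[:-4]
    return root / path

def cg(url: str, root: Path = Path("/x/src")) -> Path:
    """Clone the repo if it isn't checked out yet, then return its directory."""
    target = repo_dir(url, root)
    if not target.is_dir():
        subprocess.run(["git", "clone", url, str(target)], check=True)
    return target  # a shell wrapper would `cd` here
```

A shell function would wrap this and `cd "$(...)"` into the returned path, since a child process can't change the parent shell's directory.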

richardknop 2 days ago 0 replies      
I mostly work with Golang so usually all work related stuff will be in my GOPATH in ~/code/go/src/github.com/company-name/.

Non Golang code will go to ~/code, sometimes ~/code/company-name but I also have couple of ad hoc codebases spread around in different places on my filesystem.

So it is a bit disorganized. However last few years I have rarely ever needed to cd outside of ~/code/go.

Some legacy codebases I worked on (and still need to contribute to from time to time) can be in most random places as it took some effort and time to configure local environment of some of these beasts to be working properly (and they depend on stuff like Apache vhosts) so I am too afraid to move those to ~/code as I might break my local environment.

_mjk 2 days ago 0 replies      
I use `mess` [1]. Short description: new stuff that is not filed away instantly goes into a folder "current" linked to the youngest folder in a tree (mess_root > year > week). If needed at a later time: file it accordingly; otherwise old folders are purged if disk space is low. Taking it a step further: syncing everything across work and personal machines using `syncthing`.

[1] http://chneukirchen.org/blog/archive/2006/01/keeping-your-ho...
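The week-folder scheme (mess_root > year > week, with a "current" link to the youngest folder) can be sketched in Python; this is an assumption about the layout, not the actual `mess` implementation:

```python
import datetime
from pathlib import Path

def ensure_current(root: Path, today: datetime.date = None) -> Path:
    """Create root/<ISO year>/<ISO week> and point root/current at it."""
    today = today or datetime.date.today()
    year, week, _ = today.isocalendar()
    target = root / str(year) / f"{week:02d}"
    target.mkdir(parents=True, exist_ok=True)
    link = root / "current"
    if link.is_symlink():
        link.unlink()  # retarget the link each week
    link.symlink_to(target)
    return target
```

Run from a login script or cron, this keeps "current" always pointing at this week's dump folder while old weeks age out naturally.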

ktopaz 2 days ago 0 replies      
I have my files pseudo-organized, meaning I kind of try to keep them where they should be logically, but since this varies a lot, they're not really organized. The thing is, I use "Everything", a free instant file search tool from voidtools. It is blazingly fast: just start typing and it finds files while you type. It uses the NTFS file system's existing index (Windows only, sorry everyone else) to perform instant searches. It is hands down the fastest file search tool I have ever encountered; files are literally found while you type their names, without waiting for even a millisecond.

So, no organization (the OCD part of me hates this) but I always find my files in an instant, no matter where I left them.

romdev 2 days ago 0 replies      

Filename preserved, ordered by date or grouped in arbitrary functional folders

Primary Artist

 YYYY.AlbumName (Keeps albums in date order)
   AlbumName Track# Title.mp3 (truncates sensibly on a car stereo)

YYYY-MM-DD.Event Description (DD is optional)

scripts - reusable across clients

 source code
 documents

Utils (single-executable files that don't require an install)

I use Beyond Compare as my primary file manager at home and work. Folder comparison is the easiest way to know if a file copy fully completed. Multi-threaded move/copy is nice too.

oelmekki 2 days ago 1 reply      
Beside the usual `Images`, `Videos`, `code` directory, the single most important directory on my system is `~/flash` (as in : flash memory). This is where my browser downloads files and where I create "daily" files, which I quickly remove.

This is a directory that can be emptied at any moment without the fear of losing anything important, and which helps me keep the rest of my fs clean. Basically `/tmp` for the user.

frik 2 days ago 0 replies      
For ebooks I created folders for main-categories and some sub-categories (inspired by Amazon.com or some other ebook shop structure).

For photos folders per device/year/month.

For Office documents pre-pending date using the ISO date format (2017-06-21 or 170621) works great. (for sharing with others over various channels like mail/chat/fileserver/cloud/etc)

xmpir 2 days ago 0 replies      
Most of my files stay in the download folder. If I think I will need them at a later stage, I upload them to my Google Drive. Google is quite good at searching stuff; for me that also works for personal files. I have probably 100 e-books that are on my reading list and will never get read by me...
codemac 2 days ago 0 replies      
recoll has worked great for a document index.


I also recommend calibre for e-books, but I never got to the "document store" stage that I think some people have.

mayneack 2 days ago 0 replies      
symlinks for ~/Downloads and ~/Documents into ~/Dropbox is my only interesting upgrade. Across the varying different devices I have different things selectively synced. Large media files are the only things that don't live in dropbox in some way or another. It's pretty convenient for mobile access (everything accessible from web/mobile). I've done some worrying about sensitive documents and such, but most of it is also present in my email, so I think I lost that battle already. It also means there's very little downside to wiping my HD entirely if I want to try a different OS (which I used to do frequently, but ended up settling on vanilla ubuntu).
raintrees 2 days ago 0 replies      
-clients - For client specific work

-devel - For development/research

 -Language/technology -specific research case
And I built my own bookmarking tool for references/citations.

joshstrange 2 days ago 0 replies      
Calibre may be a little rough looking but it's very powerful and it's what I use.

Edit: Also you might want to make a small title edit s/files/ebooks unless you are inquiring about other types of files as well.

house9-2 2 days ago 0 replies      

When reading for pleasure I typically read paper, try to limit the screen time if possible.

rajadigopula 2 days ago 0 replies      
If it's for e-books only, you can try Adobe Digital Editions or Calibre. You can tag and create collections with search functionality on most formats.
gagabity 2 days ago 0 replies      
Dump everything on desktop or downloads folder then use Void Tools Everything to find what I need.
cristaloleg 2 days ago 0 replies      
~/work - everything related to job

~/github - just cloned repos

~/fork - everything forked

~/pdf - all science papers

eternalnovice 1 day ago 0 replies      
Organizing my files has been an obsession of mine for many years, so I've evolved what I think is a very effective system that combines the advantages of hierarchical organization and tagging. I use 3-character tags as part of every file's name. A prefix of tags provides a label that conveys the file's place in the hierarchy of all my files. To illustrate, here's the name of a text file that archives text-based communications I've had regarding a software project called 'Do, Too':

- pjt>sfw>doToo>cmm

'pjt' is my tag for projects

'sfw' is my tag for software and computer science

'doToo' is the name of this software project

'cmm' is my tag for interpersonal communications

Projects (tagged with 'pjt') is one of my five broad categories of files, with the others being Personal ('prs'), Recreation ('rcn'), Study ('sdg'), and Work ('wrk'). All files fall into one of these categories, and thus all file names begin with one of the five tags mentioned. After that tag, I use the '>' symbol to indicate that the following tag(s) is/are subcategories.

Any tags other than those for the main categories might follow, as 'sfw' did in the example above. This same tag 'sfw' is also used for files in the Personal category, for files related to software that I use personally--for example:

- prs>sfw>nameMangler@nts

Here, NameMangler is the name of the Mac application I use to batch-modify file names when I'm applying tags to new files. '@nts' is my tag for files containing notes. I also have many files whose names begin with 'sdg>sfw' and these are computer science or programming-related materials that I'm studying or I studied previously and wanted to archive.

A weakness of hierarchical organization is that it makes it difficult to handle files that could be reasonably placed in two or more positions in the hierarchy. I handle this scenario through the use of tag suffixes. These are just '|'-delimited lists of tags that do not appear in the prefix identifier, but that are still necessary to convey the content of the file adequately. So for example, say I have a PDF of George Orwell's essay "Politics and the English Language":

- sdg>lng>politicsAndTheEnglishLanguage_orwell9=wrt|wrk|tfl|georgeOrwell

The suffix of tags begins with '=' to separate it from the rest of the file name. A couple of other features are shown in this file name. I use '_' to separate the prefix tags from the original name of the file ('orwell9' in this case) if it came from an outside source. I'm an English teacher and use this essay in class, and that's why the tags 'wrk' for Work and 'tfl' for 'Teaching English as a Foreign Language' appear. 'wrt' is my tag for 'writing', since Orwell's essay is also about writing. The tag 'georgeOrwell' is not strictly necessary since searching for "George Orwell" will pick up the name in the text content of the PDF, but I still like to add a tag to signal that the file is related to a person or subject that I'm particularly interested in. Adding a camel-cased tag like this also has the advantage that I can specifically search for the tag while excluding files that happen to contain the words 'George' and 'Orwell' without being particularly about or by him.

That last file name example also illustrates what I find to be a big advantage of this system: it reduces some of the mental overhead of classifying the file. I could have called the file 'wrk>tfl>politicsAndTheEnglishLanguage=sdg|wrt|lng|georgeOrwell', but instead of having to think about whether it should go in the "English teaching work-related stuff" slot or the "stuff about language that I can learn about" slot, I can just choose one more or less arbitrarily, and then add the tags that would have made up the tag prefix that I didn't choose as a suffix.

There's actually a lot more to the system, but those are the basics. Hope you find it helpful in some way.

graycat 2 days ago 0 replies      
From a recent backup, there are

417,361 files

in my main collection of files for my startup, computing, applied math, etc.

All those files are well enough organized.

Here's how I do it and how I do related work more generally (I've used the techniques for years, and they are all well tested).

(1) Principle 1: For the relevant file names, information, indices, pointers, abstracts, keywords, etc., to the greatest extent possible, stay with the old 8-bit ASCII character set in simple text files easy to read by both humans and simple software.

(2) Principle 2: Generally use the hierarchy of the hierarchical file system, e.g., Microsoft's Windows HPFS (high performance file system), as the basis (framework) for a taxonomic hierarchy of the topics, subjects, etc. of the contents of the files.

(3) To the greatest extent possible, I do all reading and writing of the files using just my favorite programmable text editor KEdit, a PC version of the editor XEDIT written by an IBM guy in Paris for the IBM VM/CMS system. The macro language is Rexx from Mike Cowlishaw of IBM in England. Rexx is an especially well designed language for string manipulation as needed in scripting and editing.

(4) For more, at times make crucial use of Open Object Rexx, especially its function to generate a list of directory names, with standard details on each directory, of all the names in one directory subtree.

(5) For each directory x, have in that directory a file x.DOC that has whatever notes are appropriate for good descriptions of the files, e.g., abstracts and keywords of the content, the source of the file, e.g., a URL, etc. Here the file type of an x.DOC file is just simple ASCII text and is not a Microsoft Word document.

There are some obvious, minor exceptions, that is, directories with no file named x.DOC from me. E.g., directories created just for the files used by a Web page when downloading a Web page are exceptions and have no x.DOC file.

(6) Use Open Object Rexx for scripts for more on the contents of the file system. E.g., I have a script that, for a current directory x, displays a list of the (immediate) subdirectories of x and the size of all the files in the subtree rooted at that subdirectory. So, for all the space used by the subtree rooted at x, I get a list of where that space is used by the immediate subdirectories of x.
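The same per-subdirectory space report is easy to sketch in Python (the author uses Open Object Rexx; this stdlib version is my approximation, not his script):

```python
import os

def subtree_sizes(root: str) -> dict:
    """For each immediate subdirectory of `root`, total the bytes of all
    files in the subtree rooted at that subdirectory."""
    sizes = {}
    for entry in os.scandir(root):
        if entry.is_dir(follow_symlinks=False):
            total = 0
            for dirpath, _dirnames, filenames in os.walk(entry.path):
                for name in filenames:
                    try:
                        total += os.path.getsize(os.path.join(dirpath, name))
                    except OSError:
                        pass  # skip files that vanish or are unreadable
            sizes[entry.name] = total
    return sizes
```

Printing the result sorted by size descending gives exactly the "where did my space go" view described above.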

(7) For file copying, I use Rexx scripts that call the Windows commands COPY or XCOPY, called with carefully selected options. E.g., I do full and incremental backups of my work using scripts based on XCOPY.

For backup or restore of the files on a bootable partition, I use the Windows program NTBACKUP, which can back up a bootable partition while it is running.

(8) When looking at or manipulating the files in a directory, I make heavy use of the DIR (directory) command of KEdit. The resulting list is terrific, and common operations on such files can be done with commands to KEdit (e.g., sort the list), select lines from the list (say, all files x.HTM), delete lines from the list, copy lines from the list to another file, use short macros written in Kexx (the KEdit version of Rexx), often from just a single keystroke to KEdit, to do other common tasks, e.g., run Adobe's Acrobat on an x.PDF file, have Firefox display an x.HTM file.

More generally, with one keystroke, have Firefox display a Web page where the URL is the current line in KEdit, etc.

I wrote my own e-mail client software. Then given the date header line of an e-mail message, one keystroke displays the e-mail message (or warns that the date line is not unique, but it always has been).

So, I get to use e-mail message date lines as 'links' in other files. So, if some file T1 has some notes about some subject and some e-mail message is relevant, then, sure, in file T1 just have the date line as a link.

This little system worked great until I converted to Microsoft's Outlook 2003. If I could find the format of the files Outlook writes, I'd implement the feature again.

(9) For writing software, I type only into KEdit.

Once I tried Microsoft's Visual Studio, and for a first project, before I'd typed anything particular to the project, I got 50 MB or so of files, nearly none of which I understood. That meant that whenever anything went wrong, for a solution I'd have to do mud wrestling with at least 50 MB of files I didn't understand; moreover, understanding the files would likely have been a long side project. No thanks.

E.g., my startup needs some software, and I designed and wrote that software. Since I wrote the software in Microsoft's Visual Basic .NET, the software is in just simple ASCII files with file type VB.

There are 24,000 programming language statements.

So, there are about 76,000 lines of comments for documentation, which is IMPORTANT.

So, all the typing was done into KEdit, and there are several KEdit macros that help with the typing.

In particular, for documentation of the software I'm using -- VB.NET, ASP.NET, ADO.NET, SQL Server, IIS, etc. -- I have 5000+ Web pages of documentation, from Microsoft's MSDN, my own notes, and elsewhere.

So, at some point in the code where some documentation is needed for clarity for the code, I have links to my documentation collection, each link with the title of the documentation. Then one keystroke in KEdit will display the link, typically have Firefox open the file of the MSDN HTML documentation.

Works great.

The documentation is in four directories, one for each of VB, ASP, SQL, and Windows. Each directory has a file that describes each of the files of documentation in that directory. Each description has the title of the documentation, the URL of the source (if from the Internet, which is the usual case), the tree name of the documentation in my file system, an abstract of the documentation, relevant keywords, and sometimes some notes of mine. KEdit keyword searches on this file (one for each of the four directories) are quite effective.

(10) Environment Variables

I use Windows environment variables and the Windows system clipboard to make a lot of common tasks easier.

E.g., the collection of my files of documentation of Visual Basic is in my directory


Okay, on the command line of a console window, I can type


and then have that directory current.

Here 'G' abbreviates 'go to'!

So, to command G, argument 'VB' acts like a short nickname for directory


Actually that means that I have -- established when the system boots -- a Windows environment variable MARK.VB with value


I have about 40 such MARK.x environment variables.

So, sure, I could use the usual Windows tree walking commands to navigate to directory


but typing


is a lot faster. So, such nicknames are justified for frequently used directories fairly deep in the directory tree.
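A rough Python equivalent of this nickname scheme (the MARK.x variable naming follows the description above; the helper itself is my sketch, not the author's G script):

```python
import os

def goto(nickname: str) -> str:
    """Change to the directory named by the MARK.<nickname> environment
    variable and return its path, mimicking the 'G VB' style of use."""
    target = os.environ.get("MARK." + nickname)
    if target is None:
        raise KeyError("no MARK.%s environment variable set" % nickname)
    os.chdir(target)
    return target
```

Note that a child process cannot change its parent shell's current directory, which is presumably why the author's G is a script run by the shell itself rather than a standalone program.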

Environment variables



are used by some other programs, especially my scripts that call COPY and XCOPY.

So, to copy from directory A to directory B, I navigate to directory A and type


which sets environment variable


to the directory tree name of directory A. Similarly for directory B.

Then my script


takes as argument the file name and does the copy.

My script


takes two arguments, the file name of the source and the file name to be used for the copy.
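The mark-then-copy workflow can be sketched like this (the author's actual script names didn't survive in the page, so the names here are hypothetical):

```python
import os
import shutil

def mark(var: str) -> None:
    """Record the current directory in an environment variable
    (hypothetical stand-in for the author's mark script)."""
    os.environ[var] = os.getcwd()

def copy_to(filename: str, var: str = "COPYDIR.B") -> str:
    """Copy `filename` from the current directory to the directory
    previously recorded in `var`, returning the destination path."""
    dest = os.path.join(os.environ[var], filename)
    shutil.copy2(filename, dest)  # copy2 preserves timestamps, like XCOPY
    return dest
```

The point of the indirection is that the destination is named once, then reused for any number of copies without retyping a deep tree name.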

I have about 200 KEdit macros and about 200 Rexx scripts. They are crucial tools for me.

(11) FACTS

About 12 years ago I started a file FACTS.DAT. The file now has 74,317 lines, is

2,268,607

bytes long, and has 4,017 facts.

Each such fact is just a short note, sure, on average

2,268,607 / 4,017 = 565

bytes long and

74,317 / 4,017 = 18.5

lines long.

And that is about

12 * 365 / 4,017 = 1.09

that is, an average of right at one new fact a day.

Each new fact has its time and date, a list of keywords, and is entered at the end of the file.

The file is easily used via KEdit and a few simple macros.

I have a little Rexx script to run KEdit on the file FACTS.DAT. If KEdit is already running on that file, then the script notices that and just brings to the top of the Z-order that existing instance of KEdit editing the file -- this way I get single-threaded access to the file.

So, such facts include phone numbers, mailing addresses, e-mail addresses, user IDs, passwords, details for multi-factor authentication, TODO list items, and other little facts about whatever I want help remembering.
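An append-only fact file like this is simple to maintain from any language. A hedged Python sketch (the record layout below is my guess; the author specifies only a timestamp, keywords, and append-at-end):

```python
import datetime

def add_fact(path: str, keywords: list, text: str) -> None:
    """Append a fact record: timestamp line, keyword line, body, blank separator."""
    stamp = datetime.datetime.now().isoformat(sep=" ", timespec="seconds")
    with open(path, "a", encoding="ascii") as f:
        f.write(stamp + "\n")
        f.write("keywords: " + " ".join(keywords) + "\n")
        f.write(text + "\n\n")

def find_facts(path: str, keyword: str) -> list:
    """Return the records whose keyword line contains `keyword`."""
    with open(path, encoding="ascii") as f:
        records = [r for r in f.read().split("\n\n") if r.strip()]
    return [r for r in records
            if keyword in r.splitlines()[1].removeprefix("keywords: ").split()]
```

Appending at the end keeps writes cheap and conflict-free, and a keyword line per record is enough for fast editor or script searches.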

No, I don't need special software to help me manage user IDs and passwords.

Well, there is a problem with the taxonomic hierarchy: For some files, it might be ambiguous which directory they should be in. Yes, some hierarchical file systems permit a file to be listed in more than one directory, but AFAIK the Microsoft HPFS file system does not.

So, when it appears that there is some ambiguity in which directory a new file should go, I use the x.DOC files for those directories to enter relevant notes.

Also, my file FACTS.DAT may have such notes.

Well, (1)-(11) is how I do it!

guilhas 2 days ago 0 replies      
Zim wiki
Ask HN: Want to study SSL, HTTPS, and the works. Where to start?
18 points by surds  3 days ago   9 comments top 6
tialaramex 1 day ago 0 replies      
I recommend beginning at the fundamentals. For example, here's a video that walks through Diffie-Hellman so that anybody can follow it. You can probably sprint through it, but by taking it slow they avoid accidentally forgetting anything important.


Grasping the fundamentals means that when it comes to policy decisions (e.g. in the management of certificates) you can see what the consequences of a particular decision are, rather than just hoping that whoever proposed that policy knew what they were doing.

For example, I think a lot of people today use Certificate Signing Request (CSR) files without understanding them at all. But once you have a grounding in the underlying elements you can see at once what the CSR does, and why it's necessary without needing to have that spelled out separately.

Or another example, understanding what was and was not risky as a result of the known weakness of SHA-1. I saw a lot of scare-mongering by security people who saw the SHA-1 weakness as somehow meaning impossible things were now likely, but it only affected an important but quite narrow type of usage, people who understood that could make better, more careful decisions without putting anybody at risk.
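The Diffie-Hellman exchange itself fits in a few lines, which makes it a good first exercise. A toy sketch with textbook-sized numbers (real deployments use 2048-bit-plus groups or elliptic curves; never use parameters this small):

```python
import secrets

p, g = 23, 5                      # tiny textbook group, for illustration only

a = secrets.randbelow(p - 2) + 1  # Alice's private exponent
b = secrets.randbelow(p - 2) + 1  # Bob's private exponent

A = pow(g, a, p)                  # Alice sends A over the open channel
B = pow(g, b, p)                  # Bob sends B over the open channel

# Each side combines its own secret with the other's public value;
# both arrive at the same shared secret, which is never transmitted.
assert pow(B, a, p) == pow(A, b, p)
```

Seeing that the eavesdropper holds only p, g, A, and B makes the policy questions around certificates and key sizes much easier to reason about.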

ZoFreX 3 days ago 1 reply      
I'm more of a learning by doing person. Here's three exercises that you'll learn a lot doing:

1) https://www.ssllabs.com/ssltest/ - try to get an A+. It's not important in most cases in practice, but you'll learn a lot getting there. Their rating guide is also handy: https://github.com/ssllabs/research/wiki/SSL-Server-Rating-G...

2) MITM yourself. I've done this using Charles; you can do it with any HTTP proxy that lets you rewrite requests on the fly - I hear Fiddler is popular. MITM yourself and try changing the page for an HTTP site. Then try doing it on a website that is part HTTP, part HTTPS (e.g. HTTPS for the login page) and "steal your password". Try again on a website that redirects from HTTP to HTTPS using a 301 but does not have HSTS. Finally try on a site with HSTS (nb: you won't manage this one). Congratulations, you now truly understand why HSTS is important and what it does better than most people!

3) Set up HTTPS on a website. You've probably already done this. In which case maybe do it with LetsEncrypt for an extra challenge?

AaronSmith 2 days ago 0 replies      
To study SSL, HTTPS, and CAs, including installation and management of SSL certificates, you can consider the following references:




indescions_2017 3 days ago 1 reply      
Check out High Performance Browser Networking. Ilya Grigorik is a very smart cookie and will take you right up to the present day state-of-the-art:


moondev 3 days ago 0 replies      
I learn best by example, and I have learned so much just by evaluating and implementing hashicorp vault: https://www.vaultproject.io/docs/secrets/pki/index.html

It doesn't hold your hand at all, but it gives you a nice "task" to accomplish. Reading up on all the terminology and exactly how and why it works was really fun.

schoen 3 days ago 1 reply      
I hear good things about Bulletproof SSL/TLS by Ivan Ristić:


There was also a nice web page presenting all kinds of PKI concepts that I came across a few years ago but haven't been able to find since then. :-(

Ask HN: Any startups working on clojure and Bitcoin/Ethereum?
9 points by pankajdoharey  1 day ago   6 comments top 2
akrisanov 1 day ago 1 reply      
Found a very interesting project/company, https://github.com/status-im, which is doing almost everything in Clojure and ClojureScript plus Go.

I track open positions from time to time on this website: https://blockchain.works-hub.com/.

rpod 1 day ago 1 reply      
Why narrow it down to Clojure? Feels a tad artificial to me.

AFAIK, any Dapp using Ethereum has to make use of their JSON-RPC interface, which is language-agnostic. So it's perfectly fine to build a Clojure application on Ethereum, although there is a wrapper library for the JSON-RPC interface available in Javascript. No idea how many startups, if any, are using Clojure and for what reason.

Ask HN: What's the best way to optimize images in S3?
5 points by hartator  1 day ago   3 comments top 3
cjhanks 3 hours ago 0 replies      
Personally, I prefer to perform such optimizations on 'load' rather than on 'store' (as recommended by dgelks). This is particularly useful if you may want to change your optimized image format.

You would have a canonical lossless image stored in S3. When a user makes a request to your CDN, it calls an origin server (assuming a cache miss) that transforms the canonical images into an optimized form.

Any basic WSGI, FCGI, CGI application behind NGINX will probably be sufficient.
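A minimal origin server along those lines might look like this sketch (everything here is a stub of my own: the S3 fetch and the transform would be real code, e.g. boto3 and an image library, in practice):

```python
def fetch_canonical(key: str) -> bytes:
    """Stub for fetching the canonical lossless image from S3."""
    return b"canonical-bytes-for-" + key.encode()

def optimize(image_bytes: bytes) -> bytes:
    """Stub for the real transform (re-encode, resize, strip metadata...);
    here it just passes the bytes through."""
    return image_bytes

def app(environ, start_response):
    """WSGI origin: the CDN calls this on a cache miss; the long
    Cache-Control header lets the CDN keep the optimized copy."""
    key = environ.get("PATH_INFO", "/").lstrip("/")
    body = optimize(fetch_canonical(key))
    start_response("200 OK", [
        ("Content-Type", "application/octet-stream"),
        ("Cache-Control", "public, max-age=31536000"),
        ("Content-Length", str(len(body))),
    ])
    return [body]
```

For development you can serve this with `wsgiref.simple_server.make_server("", 8000, app)`; in production it would sit behind NGINX via any WSGI server, as the comment suggests.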

dgelks 1 day ago 0 replies      
For this sort of work I found using a Lambda function with a trigger on S3 upload works very well; aws-lambda-image seems like a popular project to use instead of writing your own code: https://github.com/ysugimoto/aws-lambda-image
savethefuture 1 day ago 0 replies      
I have a microservice setup to process my s3 images, on the fly and when they're uploaded.