hacker news with inline top comments    22 Jun 2017
Ask HN: Best Self-Hosted, Server-Side Image Optimization Tools?
46 points by DivineTraube  5 hours ago   26 comments top 17
wsxiaoys 2 hours ago 0 replies      
I believe the most complete solution is a full unix environment with shell piping the image stream, so I made https://bash.rocks, a web frontend backed by a unix environment.


1. Resizing with imagemagick: https://bash.rocks/Gxlg31/3

2. Resizing and convert to webp: https://bash.rocks/7J1jgB/1

After creating the snippet, you could either use GET https://bash.rocks/0Be95B (query parameters become environment variables) or POST https://bash.rocks/jJggWJ (the request body becomes stdin).

It's not hard to roll your own backend like this for private usage (simply exec from node). I'm also working on an open source release.
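The roll-your-own version of this is a few lines in any language. Here is a rough Python sketch (a hypothetical helper, not bash.rocks code) of the scheme described above: query parameters become environment variables, the request body becomes stdin.

```python
import subprocess

def run_snippet(snippet: str, params: dict, body: bytes) -> bytes:
    """Run a shell snippet: query params as env vars, request body as stdin."""
    env = {k: str(v) for k, v in params.items()}
    env["PATH"] = "/usr/bin:/bin"  # minimal PATH so the snippet's tools resolve
    result = subprocess.run(
        ["sh", "-c", snippet],
        input=body,
        env=env,
        capture_output=True,
        check=True,
    )
    return result.stdout
```

A real image snippet would be an ImageMagick pipeline such as `convert - -resize "${WIDTH}x" -`, assuming ImageMagick is installed on the box.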

eeeps 1 hour ago 1 reply      
Disclaimer: I work for Cloudinary. But: all of the services that you mention have an awful lot to offer over roll-your-own solutions. Reliability and scalability, sure, but also, right now, just flat-out better performance and output. From big flashy features like automatic optimization using perceptual metrics and on-the-fly responsive resizing with Client Hints ... all the way down to nitty-gritty stuff that doesn't get press releases, like dialed-in custom resizing algorithms... in 2017, hosted, paid services can do a lot more, a lot better, than anything you can set up yourself using free tools.

Images are complicated and important enough that I don't see that changing any time soon.

nielsole 1 hour ago 0 replies      
Haven't tried it out, but it looked quite promising: https://github.com/thephpleague/glide/
Does all the basics I would look for in an image API.
matrix 2 hours ago 0 replies      
For Java (or other JVM languages such as Kotlin), TwelveMonkeys is powerful and does not have external dependencies:


cbr 2 hours ago 0 replies      
I used to work on mod_pagespeed / ngx_pagespeed, and I'm very proud of our image optimization: https://modpagespeed.com/doc/filter-image-optimize

It compresses and optimizes png, gif, and jpeg, creates webp for browsers that support it, inlines small images into your html, long-caches images, and even creates srcsets.

vladdanilov 4 hours ago 0 replies      
I'm working on Optimage [1] for both lossless and visually lossless (lossy) optimizations. Right now it's available for Mac only. But I have a Linux version working internally as part of the upcoming Server plan.

[1] http://getoptimage.com

rawrmaan 4 hours ago 1 reply      
Sharp for node.js has proven to be powerful, flexible and fast for my needs: https://github.com/lovell/sharp/
tobltobs 3 hours ago 1 reply      
In my experience the optimization is not the hard part; it's the scaling down you usually have to do first. Doing that with ImageMagick, Pillow or whatever can, for large source images, result in OOMs or in gigabyte-sized, not-so-temporary files filling your /tmp dir.

The only tool I ever found which does this job reliably, even for huge images, is http://www.vips.ecs.soton.ac.uk .

r1ch 4 hours ago 0 replies      
I use a combination of jpegoptim, optipng, advpng and zopflipng.

Be especially careful with these utilities when running them on UGC. PNG / JPEG bombs can easily cause OOM or CPU DoS conditions etc.
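One defensive pattern for the UGC case (a sketch, not tied to any particular optimizer) is to run these utilities in a child process with hard CPU and memory rlimits, so a decompression bomb gets killed instead of taking down the box:

```python
import resource
import subprocess

def run_limited(cmd, input_bytes=b"", cpu_seconds=10, mem_bytes=512 * 1024 * 1024):
    """Run cmd under RLIMIT_CPU/RLIMIT_AS so a PNG/JPEG bomb can't
    exhaust the host's CPU or memory."""
    def set_limits():
        resource.setrlimit(resource.RLIMIT_CPU, (cpu_seconds, cpu_seconds))
        resource.setrlimit(resource.RLIMIT_AS, (mem_bytes, mem_bytes))

    return subprocess.run(
        cmd,
        input=input_bytes,
        capture_output=True,
        preexec_fn=set_limits,  # POSIX only
        timeout=cpu_seconds * 2,  # wall-clock backstop on top of the CPU limit
    )
```

E.g. `run_limited(["jpegoptim", "--stdin", "--stdout"], input_bytes=jpeg)`, assuming jpegoptim is installed; a bombed input then fails with a non-zero return code rather than an OOM'd server.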

v3ss0n 4 hours ago 1 reply      
pilbox is very powerful : https://pypi.python.org/pypi/pilbox
Mojah 3 hours ago 0 replies      
I automated my workflow server side with OptiPNG. https://ma.ttias.be/optimize-size-png-images-automatically-w...
NicoJuicy 4 hours ago 0 replies      
I use ImageResizer 4.3.2 for ASP.NET MVC (it's free); newer versions are less free though... Best thing is, if you want to resize, you can just do it through the URL. E.g. /Assets/img/logo.png?Width=200
silasb 4 hours ago 0 replies      
https://github.com/h2non/imaginary appears to support quality/compression settings.
Theodores 2 hours ago 0 replies      
Google Pagespeed for Nginx and Apache is another way to go. The benefit of this approach is that you don't have to bulk out your code.

As for metadata, today I decided to add it back in.


For ecommerce it will eventually help to have product data, e.g. brand, product name etc., embedded in the image.

My other tip: if you go the ImageMagick/PageSpeed route, you can use 4:2:2 colour subsampling and ditch half the bits used for chroma.
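For reference, a minimal ngx_pagespeed fragment for image rewriting looks roughly like this (directive and filter names as in the PageSpeed docs; the cache path is a placeholder):

```nginx
pagespeed on;
pagespeed FileCachePath /var/cache/ngx_pagespeed;

# rewrite_images bundles recompression, resizing and inlining of images;
# convert_jpeg_to_webp serves WebP to browsers that advertise support.
pagespeed EnableFilters rewrite_images,convert_jpeg_to_webp;
```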

anilshanbhag 4 hours ago 0 replies      
optipng and jpegoptim are pretty good. Have a feeling most of these tools use these utilities inside them.
fweespeech 4 hours ago 0 replies      
> What is the most complete solution you are aware of that compresses & optimizes png, jpeg, and webp and can be operated on a server? It should not only be able to optimize as part of the build process but also in response to user-generated content.

Tbh the UGC side is just triggering the "build process side" as the upload occurs.

As far as best,


I'd suggest you look there for some decent examples of how to go about it. They may be defunct, but I use a similar approach (slightly different knob tweaks with the same binaries) and it works fine. May not be 100% optimal, but it's good enough imo.

Ask HN: What has happened to YC's residential development research project?
82 points by baybal2  11 hours ago   15 comments top 6
RubenSandwich 6 hours ago 2 replies      
They are still there, slowly plugging away at projects: https://harc.ycr.org/.

Edit: You haven't heard from them because they are aiming very high so it will take years before any of their work hits the general public.

Edit 2: From my understanding, they are still working on their Universal Basic Income Research as well and have chosen Oakland as the testbed: http://basicincome.org/news/2017/04/httpswww-youtube-comwatc....

Entangled 5 hours ago 5 replies      
Ok, here is a cheap shot of a dream. Future cities like mega malls with a thousand shops facing inside and a thousand homes facing outside, one to one. Roads would be marbled floors and cars would be electric scooters with a basket big enough for buying groceries around.

For those who like the outdoors, just get your off road vehicle and face the indomitable and untouched nature. No paved roads, no concrete, nothing outside these habitable malls interconnected by hyperloops. Of course there will be supply roads for trucks but they will be just like highways interconnecting mega farms to mega malls.

Nah, scratch that, there is nothing like a house in the suburbs with a huge yard and a barbecue.

simonebrunozzi 5 hours ago 0 replies      
Sam Altman spoke about it recently: https://medium.com/the-naked-founder/sam-altman-on-yc-univer...

AFAIK, Ben Huh is still in charge of the project.

raphman 4 hours ago 0 replies      
Two weeks ago, Jonathan Edwards [1] announced on Twitter that he left/leaves HARC [2] but didn't elaborate on the reasons.

[1] http://www.subtext-lang.org/AboutMe.htm
[2] https://twitter.com/jonathoda/status/871784998113882118

Kinnard 6 hours ago 0 replies      
Did you mean the New Cities Project?
erikj 6 hours ago 1 reply      
What do you expect from it?
What would you have done to prevent the Ethereum flash crash on Coinbase yday?
3 points by noloblo  3 hours ago   9 comments top 3
chollida1 2 hours ago 2 replies      
Unfortunately it looks like you had a perfect storm of

1) Large order sent to market

2) Exchanges with a serious lack of liquidity

3) stop loss orders making things worse.

Everyone has their personal pet peeves; mine is stop loss orders. It's one of the 3 things that amateurs tend to use without any understanding of markets. The other two being use of margin, which probably doesn't need any explanation, and trading currencies/currency pairs.

In today's markets, stop loss orders are like market orders... 99.99% of the time only people who don't know what they are doing use them.




Finnucane 3 hours ago 1 reply      
Design a cryptocurrency to have stable value, not speculative-asset value.
unstatusthequo 2 hours ago 0 replies      
Prevent? I wish I had the foresight to have had buy orders in at $0.11 on the rise!
Ask HN: How was your experience with AWS Lambda in production?
194 points by chetanmelkani  1 day ago   136 comments top 61
callumlocke 1 day ago 6 replies      
I made an image hosting tool on Lambda and S3, for internal corporate use. Staff can upload images to S3 via an SPA. The front end contacts the Lambda service to request a pre-signed S3 upload URL, so the browser can upload directly to S3. It works really well. Observations:

1. Took too long to get something working. The common use case of hooking up a Lambda function to an HTTP endpoint is surprisingly fiddly and manual.

2. Very painful logging/monitoring.

3. The Node.js version of Lambda has a weird and ugly API that feels like it was designed by a committee with little knowledge of Node.js idioms.

4. The Serverless framework produces a huge bundle unless you spend a lot of effort optimising it. It's also very slow to deploy incremental changes. Edit: this is not only due to the large bundle size but also due to having to re-up the whole generated CloudFormation stack for most updates.

5. It was worth it in the end for making a useful little service that will exist forever with ultra-low running costs, but the developer experience could have been miles better, and I wouldn't want to have to work on that codebase again.


Edit: here's the code: https://github.com/Financial-Times/ig-images-backend

To address point 3 above, I wrote a wrapper function (in src/index.js) so I could write each HTTP Lambda endpoint as a straight async function that simply receives a single argument (the request event) and asynchronously returns the complete HTTP response. This wouldn't be good if you were returning a large response though; you'd probably be better off streaming it.
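In Python terms (a sketch of the pattern, not the actual src/index.js code), such a wrapper might look like this: the decorated function just returns a value, and the wrapper turns it into a well-formed API Gateway proxy response, including on error:

```python
import json

def http_handler(fn):
    """Wrap a plain function(event) -> dict into a Lambda proxy-integration
    handler that always returns a complete HTTP response object."""
    def handler(event, context=None):
        try:
            body = fn(event)
            return {
                "statusCode": 200,
                "headers": {"Content-Type": "application/json"},
                "body": json.dumps(body),
            }
        except Exception as exc:
            return {"statusCode": 500, "body": json.dumps({"error": str(exc)})}
    return handler
```

Each endpooint's logic then stays a straight function of the request event, with the response plumbing handled in one place.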

munns 1 day ago 1 reply      
Hey all, my name is Chris Munns and I am currently the lead Developer Advocate for Serverless at AWS (I am part of the Lambda PM team). We really appreciate this feedback and are always looking for ways to hear about these pain points. You can email me directly at munns@amazon.com if you ever get stuck.

Thanks, - Chris

scrollaway 1 day ago 2 replies      
We use AWS Lambda to process Hearthstone replay files.

My #1 concern with it went away a while back when Amazon finally added support for Python 3 (3.6).

It behaved as advertised: it allowed us to scale without worrying about scaling. After a year of using it, however, I'm really not a big fan of the technology.

It's opaque. Pulling logs, crashes and metrics out of it is like pulling teeth. There's a lot of bells and whistles which are just missing. And the weirdest thing to me is how people keep using it to create "serverless websites" when that is really not its strength -- its strength is in distributed processing; in other words, long-running CPU-bound apps.

The dev experience is poor. We had to build our own system to deploy our builds to Lambda. Build our own canary/rollback system, etc. With Zappa it's better nowadays although for the longest time it didn't really support non-website-like Lambda apps.

It's expensive. You pay for invocations, you pay for running speed, and all of this is super hard to read on the bill (which function costs me the most and when? Gotta do your own advanced bill graphing for that). And if you want more CPU, you have to also increase memory; so right now our apps are paying for hundreds of MBs of memory we're not using just because it makes sense to pay for the extra CPU. (2x your CPU to 2x your speed is a net-neutral cost, if you're CPU-bound).
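The net-neutral claim is just arithmetic on Lambda's GB-second billing. A quick sanity check (the rate below is the 2017-era public per-GB-second price; invocation fees are ignored):

```python
def lambda_compute_cost(memory_mb, duration_s, invocations, rate_per_gb_s=0.00001667):
    """Approximate Lambda compute cost in dollars: billed GB-seconds times rate."""
    return (memory_mb / 1024) * duration_s * invocations * rate_per_gb_s

# CPU-bound task: doubling memory doubles CPU, so duration halves.
base = lambda_compute_cost(512, 2.0, 1_000_000)
doubled = lambda_compute_cost(1024, 1.0, 1_000_000)
assert abs(base - doubled) < 1e-9  # same bill, half the latency
```

So for CPU-bound work, paying for unused memory to buy CPU is cost-neutral per request, which is exactly the trade-off described above.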

But the kicker in all this is that the entire system is proprietary and it's really hard to reproduce a test environment for it. The LambCI people have done it, but even so, it's a hell of a system to mock and has a pretty strong lock-in.

We're currently moving some S3-bound queue stuff into SQS, and dropping Lambda at the same time could make sense.

I certainly recommend trying Lambda as a tech project, but I would not recommend going out of your way to use it just so you can be "serverless". Consider your use case carefully.

chickenbane 1 day ago 1 reply      
I worked on a project where the architect wanted to use Lambdas for the entire solution. This was a bad choice.

Lambdas have a lot of benefits - for occasional tasks they are essentially free, the simple programming model makes them easy to understand in teams, you get Amazon's scaling and there's decent integration with caching and logging.

However, especially since I had to use them for the whole solution, I ran into a ton of limitations. Since they are so simple, you have to pull in a lot of dependencies, which negates a lot of the ease of understanding I mentioned before. The dependencies are things like Amazon's API Gateway, AWS Step Functions, and the AWS CLI itself, which is pretty low-level. So now the application logic is pretty easy, but you are dealing with a lot of integration devops. API Gateway is pretty clunky and surprisingly slow. Lambdas shut themselves down, and restarting is slow. The Step Functions have a relatively small payload limit that needs to be worked around. Etc. So use them sparingly!

lanestp 1 day ago 0 replies      
We use Lambda for 100% of our APIs, some of which get over 100,000 calls per day. The system is fantastic for microservices and web apps. One caveat: you must use a framework like Serverless or Zappa. Simply setting up API Gateway right is a hideous task, and giving your function the right access level isn't any fun either. Since the frameworks do all that for you, it really makes life easier.

One thing to note. API Gateway is super picky about your response. When you first get started you may have a Lambda that runs your test just fine but fails on deployment. Make sure you troubleshoot your response rather than diving into your code.

I saw some people complaining about using an archaic version of Node. This is no longer true. Lambdas support Node V6 which, while not bang up to date, is an excellent version.

Anyway, I can attest it is production ready and at least in our usage an order of magnitude cheaper.

CSDude 1 day ago 1 reply      
- Monitoring & debugging is a little hard

- CPU power also scales with Memory, you might need to increase it to get better responses

- Ability to attach many streams (Kinesis, Dynamo) is very helpful, and it scales easily without explicitly managing servers

- There can be overhead: your function gets paused (if no data is incoming) or can be killed nondeterministically (even if it runs all the time or every hour), causing a cold start, and cold starts are very bad for Java

- You need to keep your JARs small (under the 50MB limit); you cannot just embed anything you like without careful consideration

thom_nic 1 day ago 1 reply      
I deployed a couple AWS lambda endpoints for very low-volume tasks using claudia.js - Claudia greatly reduces the setup overhead for sane REST endpoints. It creates the correct IAM permissions, gateway API and mappings.

Claudia.js also has an API layer that makes it look very similar to express.js versus the weird API that Amazon provides. I would not use lambda + JS without claudia.

For usage scenarios, one endpoint is used for a "contact us" form on a static website, another we use to transform requests to fetch and store artifacts on S3. I can't speak toward latency or high volume but since I've set them up I've been able to pretty much forget about them and they work as intended.

dblooman 1 day ago 0 replies      
We have a number of different use cases at FundApps, from the obvious ones like automated tasks, automatic DNS, cleaning up AMIs etc., to the more focused importing and parsing of data from data sources. This is generally a several-times-a-day operation, so Lambda was the right choice for us. We also use API Gateway with Lambdas. It's a small API, about 2 requests per second on average but very peaky during business hours; its response times and uptime have been excellent.

Development can be tricky. There are a lot of all-in-one solutions like the Serverless framework; we use the Apex CLI tool for deploying and Terraform for infra. These tools offer a nice workflow for most developers.

Logging is annoying; it's all CloudWatch, but we use a Lambda to send all our CloudWatch logs to Sumo Logic. We use CloudWatch for metrics, though we have a Grafana dashboard for actually looking at those metrics. For exceptions we use Sentry.

Resources have bitten us the most: suddenly not enough memory because of the payload from a download. I wish Lambda allowed for scaling up resources on a second attempt; this is something to consider carefully.

Encryption of environment variables is still not a solved issue. If everyone has access to the AWS console, everyone can view your env vars, so if you want to store a DB password somewhere, it will have to be KMS, which is not a bad thing. Decryption is usually pretty quick, but it does add overhead to the execution time.

petarb 1 day ago 0 replies      
It gets the job done but the developer experience around it is awful.

Terrible deploy process, especially if your package is over 50MB (then you need to get S3 involved). Debugging and local testing is a nightmare. CloudWatch Logs aren't that bad (you can easily search for terms).

We have been using Lambdas in production for about a year and a half now, to do 5 or so tasks, ranging from indexing items in Elasticsearch to small cron cleanup jobs.

One big gripe around Lambdas and integration with API Gateway is that they totally changed the way it works. It used to be really simple to hook up a Lambda to a public-facing URL so you could trigger it with a REST call. Now you have to do this extra dance of configuring API Gateway per HTTP resource, therefore complicating the Lambda code side of things. Sure, with more customization you have more complexity associated with it, but the barrier to entry was significantly increased.

beefsack 1 day ago 1 reply      
I'm running Rust on Lambda at the moment for a PBE board gaming service I run. I can't say it runs at huge scale though, but using Lambda has provided me with some really good architectural benefits:

* Games are developed as command line tools which use JSON for input and output. They're pure so the game state is passed in as part of the request. An example is my implementation of Lost Cities[1]

* Games are automatically bundled up with a NodeJS runner[2] and deployed to Lambda using Travis CI[3]

* I use API Gateway to point to the Lambda function, one endpoint per game, and I version the endpoints if the game data structures ever change.

* I have a central API server[4] which I run on Elastic Beanstalk and RDS. Games are registered inside the database and whenever players make plays, Lambda functions are called to process the play.

I'm also planning to run bots as Lambda functions similar to how games are implemented, but am yet to get it fully operational.

Apart from stumbling a lot setting it up, I'm really happy with how it's all working together. If I ever get more traction it'll be interesting to see how it scales up.

[1]: https://github.com/brdgme/lost-cities

[2]: https://github.com/brdgme/lost-cities/blob/master/.travis.ym...

[3]: https://github.com/brdgme/lambda/blob/master/index.js

[4]: https://github.com/brdgme/api

gaius 4 hours ago 0 replies      
I've only played with it as opposed to deploying to prod, but give Azure Functions a try too: https://azure.microsoft.com/en-us/services/functions/
cameronmaske 1 day ago 2 replies      
I've been using AWS Lambda on a side project (octodocs.com) that is powered by Django and uses Zappa to manage deployments.

I was initially attracted to it as a low-cost tool to run a database (RDS) powered service side project.

Some thoughts:

- Zappa is a great tool. They added async task support [1], which replaced the need for Celery or RQ. Setting up HTTPS with Let's Encrypt takes less than 15 minutes. They added Python 3 support quickly after it was announced. Setting up a test environment is pretty trivial; I set up a separate staging site, which helps to debug a bunch of the orchestration settings. I also built a small CLI [2] to help set environment variables (Heroku-esque) via S3, which works well. Overall, the tooling feels solid. I can't imagine using raw Lambda without a tool like Zappa.

- While Lambda itself is not too expensive, AWS can sneak in some additional costs. For example, allowing Lambda to reach out to other services in the VPC (RDS) or to the Internet, requires a bunch of route tables, subnets and a nat gateway. For this side project, this currently costs way more running and invoking Lambda.

- Debugging can be a pain. Things like Sentry [3] make it better for runtime issues, but orchestration issues are still very trial and error.

- There can be overhead if your function goes "cold" (i.e. infrequent usage). Zappa lets you keep sites warm (additional cost), but a cold start adds a couple of seconds to the first-page load for that user. This applies more to low-traffic sites.

Overall: It's definitely overkill for a side project like this, but I could see the economies of scale kicking in for multiple or high-volume apps.

[1]: https://blog.zappa.io/posts/zappa-introduces-seamless-asynch...

[2]: https://github.com/cameronmaske/s3env

[3]: https://getsentry.com/

kehers 1 day ago 4 replies      
I've been using it for heavy background jobs for http://thefeed.press and overall I think it's pretty OK (I use Node.js). That said, here are a few things:

- No straight way to prevent retries. (Retries can crazily increase your bill if something goes wrong)

- API Gateway to Lambda could be better. (For one, multipart form-data support in API Gateway is a mess)

- (For NodeJs) I don't see why the node_modules folder should be uploaded. (Google Cloud Functions downloads the modules from the package.json)
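Since retries can't be disabled, the usual workaround is to make the handler idempotent. A sketch, with an in-memory set standing in for a durable store such as DynamoDB conditional writes (the `id` field is an assumed producer-supplied key):

```python
_seen = set()  # stand-in for a durable store (DynamoDB, Redis, ...)

def idempotent_handler(event, context=None):
    """Process each logical event at most once, so Lambda's automatic
    retries don't repeat side effects (or inflate the bill)."""
    key = event["id"]  # assumes the producer attaches a stable id
    if key in _seen:
        return {"status": "duplicate", "id": key}
    _seen.add(key)
    # ... do the real (side-effecting) work here ...
    return {"status": "processed", "id": key}
```

With a conditional write the "check and mark" step becomes atomic, which matters once retries can race with the original invocation.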

alexcasalboni 1 day ago 0 replies      
I'd recommend using a framework such as the Serverless Framework[1], Chalice[2], Dawson[3], or Zappa[4]. As any other (web) development project, using a framework will alleviate a big part of the pain involved with a new technology.

Anyways, I'd recommend starting by learning the tools without a framework first. You can find two coding sessions I published on Youtube[5][6].

[1]: https://serverless.com/

[2]: https://github.com/awslabs/chalice

[3]: https://dawson.sh/

[4]: https://github.com/Miserlou/Zappa

[5]: https://www.youtube.com/watch?v=NhGEik26324

[6]: https://www.youtube.com/watch?v=NlZjTn9SaWg

dcosson 1 day ago 1 reply      

- works as advertised, we haven't had any reliability issues with it

- responding to Cloudwatch Events including cron-like schedules and other resource lifecycle hooks in your AWS account (and also DynamoDB/Kinesis streams, though I haven't used these) is awesome.


- 5 minute timeout. There have been a couple times when I thought this would be fine, but then I hit it and it was a huge pain. If the task is interruptible you can have the lambda function re-trigger itself, which I've done and actually works pretty well once you set up the right IAM policy, but it's extra complexity you really don't want to have to worry about in every script.

- The logging permissions are annoying. It's easy for it to silently fail logging to CloudWatch Logs if you haven't set up the IAM permissions right. I like that it follows the usual IAM framework, but AWS should really expose these errors somewhere.

- haven't found a good development/release flow for it. There's no built-in way to re-use helper scripts or anything. There are a bunch of serverless app frameworks, but they don't feel like they quite fit, because I don't have an "app" in Lambda, just a bunch of miscellaneous triggers and glue tasks that mostly don't have any relation to each other. It's very possible I should be using one of them anyway and it would change how I feel about this point.

We use Terraform for most AWS resources, but it's particularly bad for Lambda because there's a compile step of creating a zip archive that terraform doesn't have a great way to do in-band.

Overall Lambda is great as a super-simple shim if you only need to do one simple, predictable thing in response to an event. For example, the kind of things that AWS really could add as a small feature but hasn't, like sending an SNS notification to a Slack channel, or tagging an EC2 instance with certain parameters when it launches into an autoscaling group.

For many kinds of background processing tasks in your app, or moderately complex glue scripts, it will be the wrong tool for the job.
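The re-trigger workaround for the 5-minute limit mentioned above can be sketched like this; `invoke_next` and `process_item` are injected callables (in real use, `invoke_next` would be a boto3 Lambda `invoke` call), which keeps the logic testable without AWS:

```python
def chunked_handler(event, context, invoke_next, process_item, safety_ms=30_000):
    """Process items until the timeout looms, then re-invoke ourselves
    with the remaining work as the new event payload."""
    items = list(event["items"])
    done = 0
    while items:
        if context.get_remaining_time_in_millis() < safety_ms:
            invoke_next({"items": items})  # hand the tail to a fresh invocation
            return {"processed": done, "rescheduled": len(items)}
        process_item(items.pop(0))
        done += 1
    return {"processed": done, "rescheduled": 0}
```

`get_remaining_time_in_millis()` is the real Lambda context method; the safety margin has to cover the longest single item plus the re-invoke call.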

rowell 4 hours ago 0 replies      
I've used it in production and we're building our platform entirely in Serverless/AWS Lambda.

Here are my recommendations:

1) Use Serverless Framework to manage Functions, API-Gateway config, and other AWS Resources

2) CloudWatch Logs are terrible. Auto-stream CloudWatch Logs to Elastic Search Service and Use Kibana for Log Management

3) If using Java or other JVM languages, cold starts can be an issue. Implement a health check that is triggered on schedule to keep functions used in real-time APIs warm

Here's a sample build project I use: https://github.com/bytekast/serverless-demo

For more information, tips & tricks: https://www.rowellbelen.com/microservices-with-aws-lambda-an...

tracker1 1 day ago 1 reply      
Best to keep your workloads as small as possible; cold starts can be very bad, depending on the type of project. Been using mostly node myself, and it's worked out well.

One thing to be careful of: if you're targeting input into DynamoDB table(s), it's really easy to flood your writes. Same goes for SQS writes. You might be better off with a data pipeline and slower progress. It really just depends on your use case and needs. You may also want to look at running tasks on ECS; depending on your needs that may go better.

For some jobs the 5-minute limit is a bottleneck, for others it's the 1.5GB memory cap. It just depends on exactly what you're trying to do. If your jobs fit in Lambda's constraints, and your cold start time isn't too bad for your needs, go for it.

mgkimsal 1 day ago 2 replies      
I'm reading here about a lot of people jumping through massive amounts of hoops to deal with a system that locks you down to a single vendor, and makes it hard to read logs or even read your own bill.

A few years back, the mantra was "hardware is cheap, developer time isn't". When did this prevailing wisdom change? Why would people spend hours/days/weeks wrestling with a system to save money, when the ROI may take weeks, months or even years to materialize?

falcolas 1 day ago 1 reply      
Most of my experience mirrors that found in other comments, so here are a few unique quirks I've personally had to work around:

- You can't trigger Lambda off SQS. The best you can do is set up a scheduled lambda and check the queue when kicked off.

- Only one Lambda invocation can occur per Kinesis shard. This makes efficiency and performance of that lambda function very important.

- The triggering of Lambda off Kinesis can sometimes lag behind the actual kinesis pipeline. This is just something that happens, and the best you can do is contact Amazon.

- Python - if you use a package that is namespaced, you'll need to do some magic with the 'site' module to get that package imported.

- Short execution timeouts mean you have to go to some ridiculous ends to process long-running tasks. Step Functions are a hack, not a feature IMO.

- It's already been said, but the API Gateway is shit. Worth repeating.

Long story short, my own personal preference is to simply set up a number of processes running in a group of containers (ECS tasks/services, as one example). You get more control and visibility, at the cost of managing your own VMs and the setup complexity associated with that.

meekins 13 hours ago 0 replies      
We're doing both stream processing and small query APIs using Lambda.

A few pointers (from relatively short experience):

- The best use case for Lambda seems to be stream processing, where latency due to start-up times is not an issue

- For user/application-facing logic the major issue seems to be start-up-times (esp. JVM startup times when doing Java or your API gets called very rarely) and API Gateway configuration management using infrastructure as code tools (I'd be interested in good hints about this, especially concerning interface changes)

- The programming model is very simple and nice, but it seems to make most sense to split each API over multiple Lambdas to keep them as small as possible, or to use some serverless framework to make managing the whole app easier

- This goes without saying, but be sure to use CI and do not deploy local builds (native binary deps)

Techbrunch 1 day ago 0 replies      
You might want to have a look at Serverless, a framework to build web, mobile and IoT applications with serverless architectures using AWS Lambda and even Azure Functions, Google CloudFunctions & more. Debugging, maintaining & deploying multiple functions gets easier.

Serverless: https://github.com/serverless/serverless

cntlzw 1 day ago 0 replies      
Pretty good actually. We started using AWS Lambda as a tool for a cron job.

Then we implemented a RESTful API with API Gateway and Lambda. The Lambdas are straightforward to implement. API Gateway unfortunately does not have a great user experience. It feels very clunky to use, and some things are hard to find and understand. (Hint: request body passthrough and transformations.)

Some pitfalls we encountered:

With Java you need to consider the warmup time and memory needed for the JVM. Don't allocate less than 512MB.

Latency can be hard to predict. A cold start can take seconds, but if you call your Lambda often enough (the threshold looks like minutes) things run smoothly.

Failure handling is not convenient. For example, if your Lambda is triggered from a Scheduled Event and fails for some reason, it gets triggered again and again, up to three times.

So at the moment we have around 30 Lambdas doing their job. Would say it is an 8/10 experience.

zurn 15 hours ago 0 replies      
For a serverless system that uses Lambda together with e.g. CloudFormation, Dynamo, S3, Cognito etc., it's pretty low level, and you spend a lot of time understanding, refining & debugging basic things. The end-to-end logging and instrumentation throughout the services used by your app weren't great.

It doesn't like big app binaries/JARs, and Amazon's API client libs are bloated - Clojure + Amazonica easily goes over the limit if you don't manually exclude some of Amazon's API SDKs from the package.

On the plus side, you can test all the APIs from your dev box using the cli or boto3 before doing it from the lambda.

Would probably look into third party things like Serverless next time.

xer 1 day ago 0 replies      
At Annsec we are all-in on serverless infrastructure and use Lambdas and Step Functions in two development teams on a single backlog. Extensibility of a well-written lambda is phenomenal. For instance we have higher-abstraction lambdas for moving data. We make them handle several input events and keep them as pure as possible. Composing these lambdas later in Step Functions is true developer joy. We unit test them locally, and for E2E tests we have a full clone of our environment. In total we build and manage around 40 lambdas and 10 step functions. Monitoring for failure is conducted using CloudWatch alarms, OpsGenie and Slack bots. Never been an issue. In our setup we are aiming for an infrastructure that is immutable and cryptographically verifiable. It turned out to be a bit of a challenge. :)
mfrye0 1 day ago 0 replies      
I started off doing manual build/deploy for a project and it was a total pain in the ass: packaging the code, versioning, rollbacks, deploys. And that doesn't even include setting up API Gateway if you want an endpoint for the function.

Since then I've been using Serverless for all my projects and it's the best thing I've tried thus far. It's not perfect, but now I'm able to abstract everything away, since you configure pretty much everything from a .yml file.

With that said, there are still some rough spots with Lambda:

1) Working with env vars. The default is to store them in plain text in the Lambda config. Fine for basic stuff, but I didn't want that for DB creds. You can store them encrypted, but then you have to set up logic to decrypt them in the function. Kind of a pain.

2) Working within a subnet to access private resources incurs an extra delay. There is already a cold start time for Lambda functions, but to access the subnet adds more time... Apparently AWS is aware and is exploring a fix.

3) Monitoring could be better. Cloudwatch is not the most user friendly tool for trying to find something specific.

With that said, as a whole Lambda is pretty awesome. We don't have to worry about setting up ec2 instances, load balancing, auto scaling, etc for a new api. We can just focus on the logic and we're able to roll out new stuff so much faster. Then our costs are pretty much nothing.
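The encrypted-env-var workflow in point 1 above usually ends up as a small caching helper so KMS is only hit once per warm container. Here is a sketch with the decryptor injected so it runs without AWS credentials; in a real function `decrypt` would wrap `boto3.client("kms").decrypt`, and all names here are hypothetical:

```python
import base64

_cache = {}  # survives across invocations within one warm container

def get_secret(name, env, decrypt):
    """Return the plaintext for a base64-encoded, encrypted env var.

    `env` is a mapping like os.environ; `decrypt` takes ciphertext bytes
    and returns plaintext bytes. The result is cached so the slow,
    billable decrypt call happens once per container, not per invocation.
    """
    if name not in _cache:
        ciphertext = base64.b64decode(env[name])
        _cache[name] = decrypt(ciphertext).decode("utf-8")
    return _cache[name]
```

The module-level dict is the key trick: Lambda reuses the process between invocations, so anything cached at module scope amortizes the decrypt cost.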

Jdam 1 day ago 3 replies      
For running Java in Lambda, I had to optimize it for Lambda. To decrease processing time (and in the end the bill), I got rid of all reflection, for example, and thought twice about when to initialize what and what to make static. Java cold start is also an issue. I fixed this by creating a CloudWatch trigger that executes the Lambda function every minute to keep it hot. Otherwise, after some minutes of no one calling the function, it takes 10+ seconds to respond. If you use Python, for example, you don't run into this issue. I built complete backends on top of Lambda/API Gateway/Dynamo, and having "NoOps" that also runs very cheap is a killer argument for me.
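The keep-warm trick described above (a scheduled trigger pinging the function every minute) usually pairs with a guard in the handler so the ping skips the real work. A sketch in Python (Jdam's function is Java; this assumes the ping comes from a CloudWatch scheduled rule, whose events carry `"source": "aws.events"`, and `do_work` is an invented stand-in):

```python
def do_work(event):
    """Stand-in for the real business logic."""
    return event.get("payload", "").upper()

def handler(event, context):
    # Scheduled keep-warm pings arrive as CloudWatch events; return
    # immediately so the container stays hot without doing real work.
    if event.get("source") == "aws.events":
        return {"warmed": True}
    return {"result": do_work(event)}
```

The early return also keeps the billed duration of each ping near the minimum.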
viw 1 day ago 0 replies      
I can only talk about the Node.js runtime with native add-ons. We use it for various automation tasks, fewer than 100 invocations a day, where it is the most convenient solution out there for peanuts. We also use it for parsing Swagger/API Blueprint files; there we're talking 200k+ invocations a day, and it works great once we figured out logging/monitoring/error handling and the limited output (6MB). We don't use any framework because they mostly aren't flexible enough, except apex (http://apex.run/), which serves us well. We've hit some limits a couple of times, but as it's one invocation per request, only some calls failed and the whole service was unaffected. I see the isolation as a big benefit. One thing that sucks is that if it fails (and it's not your code), you often have no idea why or whether anything can be done. We use it together with AWS API Gateway, and the Gateway part is subpar: it doesn't handle HTTP correctly (e.g. a 204 always returns a body), and god forbid you want something other than application/json. To sum it up, Lambda is great with some minor warts, and API Gateway is OK but could easily be much better.
eknkc 1 day ago 1 reply      
We use Node.js lambda functions for real-time image thumbnail generation and scraping needs, as well as mirroring our S3 buckets to another blob storage provider and a couple of periodic background jobs. It works beautifully. It's a little hard to debug at first, but once it's set up, both pricing and reliability are really good for our use cases.

I think a lot of people try to use the "serverless" stuff for unsuitable workloads and get frustrated. We are running a kubernetes cluster for the main stuff but have been looking for areas suitable for lambda and try to move those.
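For thumbnail generation like eknkc describes, the resampling itself is handed to a library (Pillow, sharp, ImageMagick), but the target-size math is the part that's easy to get wrong. A small sketch of just that sizing step (the function name and default are made up):

```python
def thumbnail_size(width, height, max_side=256):
    """Dimensions that fit within a max_side x max_side box, preserving
    aspect ratio and never upscaling small images."""
    scale = min(max_side / width, max_side / height, 1.0)
    return (max(1, round(width * scale)), max(1, round(height * scale)))
```

In a Lambda wired to S3 events, this would run per uploaded object before the library call that actually resamples the pixels.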

davidvanleeuwen 1 day ago 1 reply      
After multiple side projects with Lambda (e.g. image processing services), we finally implemented it at larger scale. Initially we started out without any framework or tool to help, because they were pretty much non-existent at that time. We created our own tool, and used Swagger a lot for working with API Gateway (because it is really bad to work with). Over time everything smoothed out and really worked nicely (except for API Gateway). Nowadays we have everything in Terraform and Serverless templates, which really makes your life easier if you're going to build your complete infrastructure on top of AWS Lambda and other AWS APIs. There are still a bunch of quirks you have to work with, but at the end of the line: it works and you don't have to worry much about scaling.

I'm not allowed to give you any numbers; here's an old blog post about Sketch Cloud: https://awkward.co/blog/building-sketch-cloud-without-server... (however, this isn't accurate anymore). For this use case, concurrent executions for image uploads are a big deal (a regular Sketch document can easily consist of 100 images). But basically the complete API runs on Lambda.

Running other languages on Lambda can be easily done and can be pretty fast, because you simply use node to spawn a process (Serverless has lots of examples of that).

Let me know if you have any specific questions :-)

Hope this helps.

lovehashbrowns 1 day ago 0 replies      
I've only been using it for one project right now. I made an API that I can use to push security-related events to a location that a hacker couldn't access, even if they get root on a local system. I use it in conjunction with sec (Simple Event Correlator). If sec detects something, e.g. a user login, or a package install, it'll send the event to the API in AWS Gateway + Lambda. The event then gets stored in a DynamoDB table, and I use a dashing.io dashboard to display the information. It works super well. I still need to convert my awful NodeJS code to Python, but that shouldn't take long.
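The pipeline described here (API Gateway + Lambda + DynamoDB) boils down to a handler that writes one item per security event. A sketch with the table injected so it can be exercised without AWS; in production `table` would be a boto3 DynamoDB Table resource, and the field names are invented:

```python
import time

def store_event(event, table):
    """Persist one security event. `table` is anything exposing
    put_item(Item=...): boto3's Table resource in AWS, a stub in tests."""
    item = {
        "host": event["host"],
        "ts": int(time.time()),
        "kind": event.get("kind", "unknown"),
        "detail": event.get("detail", ""),
    }
    table.put_item(Item=item)
    return {"statusCode": 201}
```

Because the Lambda runs with its own IAM role, a compromised host that can POST events still has no credentials to read or delete them, which is the isolation property the comment relies on.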

I do remember logging being a confusing mess when I was trying to get this started. I feel better about the trouble I had now that I see it wasn't just me. But for a side project that's very simple to use, Lambdas have been a blessing. I get this functionality without having to manage any servers or create my own API with something like Python+Flask. Having IAM and authentication built in for me made the pain from the initial set-up so worth it.

marcfowler 1 day ago 1 reply      
We use it with node for a bunch of things like PDF generation, asynchronous calls to various HTTP services etc. I think it's excellent.

The worst part about it by far is CloudWatch, which is truly useless.

Check out https://github.com/motdotla/node-lambda for running it locally for testing btw - saved us hours!

ransom1538 1 day ago 0 replies      
I used it for converting images to BPG format and doing resizing. I really enjoyed it. Basically, with Docker/Lambda these days I feel like the future will be 'having code' and then 'running it' (no more ssh, puppet, kuberdummies, bash, vpc, drama). Once Lambda runs a Dockerfile it might take over Middle-earth. These were my issues with Lambda:

1. Installing your own linux modifications isn't trivial (we had to install the bpg encoder). They use a strange version of the linux ami.

2. Lambda can listen to events from S3 (creation,deletion,..) but can't seem to listen to SQS events WTF? It seems like amazon could fix this really easily.

3. Deployment is wonky. To add a new lambda zip file you need to delete the current one. This can take up to 40 seconds (during which you have total downtime).

aeikenberry 21 hours ago 0 replies      
We run many microservices on Lambda and it has been a pleasant experience for us. We use Terraform for creating, managing environment variables, and permissions/log groups/etc. We use CodeShip for testing, and validating and applying Terraform across multiple accounts and environments.

For logging, we pipe all of our logs out of CloudWatch to LogEntries with a custom Lambda, although looking at CloudWatch logs works fine most of the time.

maephisto 1 day ago 0 replies      
A couple of months ago I started using AWS Lambda for a side project. The actual functions were pretty easy to code using `nodejs` and deploy with `serverless`, but the boilerplate to expose them via an HTTP API was the real bummer: IAMs, routing, and all kinds of other little things standing in the way of actual productive work. Some time after that I tried to set up GCloud Functions, and to my surprise the boilerplate was minimal! Write your function and have it accessible with just a couple of commands. IMHO GCloud Functions is way more developer friendly than AWS Lambda.
shakna 1 day ago 0 replies      
- Cheap, especially for low usage.

- Runs fast, unless your function was frozen due to low usage or the like

- Easy to deploy and/or "misuse"

- Debugging doesn't really work

All in all, probably the least painful thing I've used on AWS. But that doesn't necessarily mean much.

alexbilbie 1 day ago 0 replies      
If you need to store environment variables easily and securely take a look at EC2 Parameter Store - you can fetch the relevant parameters on startup and they are automatically encrypted and decrypted using KMS for you
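The startup-fetch pattern suggested here is only a few lines with boto3's SSM client. Sketched with the client injected so it's testable without credentials; in AWS, `ssm` would be `boto3.client("ssm")`, and the parameter names are placeholders:

```python
def load_params(names, ssm):
    """Fetch decrypted parameters once at cold start. `ssm` is a client
    exposing get_parameter(Name=..., WithDecryption=...): the boto3 SSM
    client in AWS, a stub in tests."""
    return {
        name: ssm.get_parameter(Name=name,
                                WithDecryption=True)["Parameter"]["Value"]
        for name in names
    }
```

Calling this at module scope means the KMS decryption happens once per container rather than once per invocation, and the secrets never sit in plain-text Lambda config.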
ajeet_dhaliwal 1 day ago 0 replies      
It's been great for using as 'glue' to do small tasks like clean ups in our case or other short lived minor tasks. I haven't used it for anything major though, only for minor tasks that are easier or more convenient to do with Lambda rather than a different way. The real value comes from the integration with other AWS services, for example, for developers using DynamoDb Lambdas make a lot of maintenance of records far easier with streams events.
rlv-dan 1 day ago 0 replies      
A session I remember that might be of interest:

Building reactive systems with AWS Lambda: https://vimeo.com/189519556

StreamBright 1 day ago 0 replies      
We have moved our database maintenance cron jobs to Lambda, as well as the image resize functionality. The general experience is very positive after we figured out how to use Lambda from Clojure and Java. For people worried about JVM startup times: Lambda will keep your JVM up and running for ~7 minutes after the initial request, and you can achieve low latency easily.
djhworld 1 day ago 0 replies      
Have used it in production for > 2 years, mainly for ETL/Data processing type jobs which seems to work well.

We also use it to perform scheduled tasks (e.g. every hour) which is good as it means you don't have to have an EC2 instance just to run cron like jobs.

The main downside is Cloudwatch Logs, if you have a Lambda that runs very frequently (i.e. 100,000+ invocations a day) the logs become painful to search through, you have to end up exporting them to S3 or ElasticSearch.

jonathanbull 1 day ago 0 replies      
We use Lambda extensively at https://emailoctopus.com. The develop-debug cycle takes a while, but once you're up and running, the stability is hard to beat. Just wish they'd raise that 5 minute execution limit so we can migrate a few more scripts.
adrianpike 1 day ago 0 replies      
It's good. We're using it for a ton of automation of various developer tasks that normally would get run once in a while (think acceptance environment spinup, staging database load, etc.).

It fails once in a while and the experience is bad, but that's mostly due to our tooling around failure states instead of the platform itself.

eloycoto 1 day ago 0 replies      
I've been using it for a year and a half, and I'm more than happy. The cost goes up when you have a lot of load, but I'm a happy user for these small applications that need to be always up.

I should say that you should use gordon (https://github.com/jorgebastida/gordon) to manage it; Gordon makes the process easier.


rmccue 1 day ago 0 replies      
Pretty great, we're using it for resizing and serving images for our clients (large media companies, banks, etc): https://hmn.md/2017/04/27/scaling-wordpress-images-tachyon/

API Gateway is a little rougher, but slowly getting there.

Dawny33 1 day ago 1 reply      
- We use Lambda along with Ansible to execute huge, distributed ML workloads which are (completely) serverless. Saves a lot of bucks, as ML needs huge boxes.

- For serverless APIs for querying the S3 which is a result of the above workload

Difficulties faced with Lambda(till now):

1. No way to do CD for Lambda functions. [Not yet using SAM]

2. Lambda launches in its own VPC. Is there a way to make AWS launch my lambda in my own VPC? [Not sure.]

jakozaur 1 day ago 0 replies      
So far used only for toy/infrequent use cases and it works there well. E.g. Slack command, integration with different systems, cron style job.
erikcw 1 day ago 1 reply      
Lots of great comments here. I'd like to add that being limited to 512mb of working disk space at /tmp has been a stumbling block for us.

Would be really great to have this configurable along with CPU/memory.

Additionally, being able to mount an EFS volume would be very useful!

tommy5dollar 1 day ago 0 replies      
Been using it for about 6 months with Serverless for Node API endpoints and it's great so far!

The only negatives are:

- cold start is slow, especially from within a VPC

- debugging/logging can be a pain

- giving a function more memory (~1GB) always seems to be better (I'm guessing because of the extra CPU)

ahmednasir91 1 day ago 0 replies      
- When used with API Gateway, the API response time is more than 2-3 seconds for a NodeJS lambda; for Java it will be more.

- Good for use cases like:

-- cron: can be triggered using CloudWatch events.

-- a Slack command bot (API Gateway + Lambda); the only problem is the timeout.
betimd 1 day ago 0 replies      
I'm running around 4 million lambda executions per month, mostly for data processing, and I'm happy overall with the performance and ease of deployment. Debugging is hard and frameworks are still very immature. I use the AWS SDK and C# and I'm having quite a good experience.
dlanger 23 hours ago 0 replies      
Hey everyone, I'm Daniel Langer and I help build lambda monitoring products over at Datadog. I see lots of you are unhappy with the current monitoring solutions available to you. If anyone has thoughts on what they'd like in a Lambda monitoring service feel free to email me at daniel.langer@datadoghq.com
forgottenacc57 1 day ago 0 replies      
In the end it's only the web server that is serverless, you still need other servers depending on your use case, and hey, web servers aren't that hard to run anyway.
synthc 1 day ago 0 replies      
We had a bad experience: we accidentally made an error in one function that got called a lot, which blocked other functions from running. Yay for isolation!
akhatri_aus 1 day ago 3 replies      
- There is a surprisingly high amount of API gateway latency

- The CPU power available seems to be really weak. Simple loops in NodeJS run significantly slower on Lambda than on a 1.1 GHz MacBook. This is despite scaling the memory up to near 512 MB.

- Certain elements, such as DNS lookups, take a very long time.

- The CloudWatch logging is a bit frustrating. If you have a cron job it will lump some time periods into a single log file; other times they're separate. If you run a lot of them it's hard to manage.

- It's impossible to terminate a running script.

- The 5 minute timeout is 'hard'; if you process cron jobs, there isn't flexibility for, say, 6 minutes. It feels like 5 minutes is arbitrarily short. For comparison, Google Cloud Functions lets you run for 9 minutes, which is more flexible.

- The environment variable encryption/decryption is a bit clunky: they don't manage it for you; you have to actually decrypt it yourself.

- There is a 'cold' start where once in a while your Lambda functions will take a significant amount of time to start up, about 2 seconds or so, which ends up being passed to a user.

- Versions of the environment are updated very slowly. Only last month (May) did AWS add support for Node v6.10, after having a very buggy version of Node v4 (a lot of TLS bugs were in the implementation)

- There is a version of Node that can run on AWS Cloudfront as a CDN tool. I have been waiting quite literally 3 weeks for AWS to get back to me on enabling it for my account. They have kept up to date with me and passed it on to the relevant team in further contact and so forth. It just seems an overly long time to get access to something advertised as working.

- If the function returns an error result in the callback, it will run multiple times; it won't just display the error in the logs. But there is no clarity on how many times or when it will re-run.

- There's no easy way to manage parallel tasks across Lambda functions, i.e. to see whether two Lambda functions are doing the same thing when executed at the exact same time.

- You can create cron jobs using an AWS CloudWatch rule, which is a bit of an odd implementation: CloudWatch can create timing triggers to run Lambda functions despite being a logging tool. Overall there are many ways to trigger a lambda function, which is quite appealing.

The big issue is speed and latency. Basically it feels like Amazon is falling right into what they're incentivised to do: make it slower (since it's charged per 100ms).

PS: If anyone has a good model/provider for 'serverless SQL databases', kindly let me know. The RDS design is quite pricey for constantly running DBs (at least in terms of the way you pay for them).
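The retry behavior in the error-callback bullet above is worth coding around explicitly: for async event sources, an error result (an unhandled exception in Python) makes Lambda re-run the event. A hedged sketch of catching terminal failures so they aren't retried; the validation logic and field names are invented:

```python
def process(event):
    """Stand-in business logic that rejects malformed events."""
    if "payload" not in event:
        raise ValueError("missing payload")
    return event["payload"]

def handler(event, context):
    try:
        result = process(event)
    except ValueError as exc:
        # Bad input will never succeed on retry: log it and return
        # normally so Lambda does NOT re-run the event.
        return {"ok": False, "error": str(exc)}
    return {"ok": True, "result": result}
```

The distinction to preserve is transient vs. terminal: let transient failures (e.g. a throttled downstream call) raise so the retry helps, but swallow terminal ones.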

lunch 1 day ago 2 replies      
Can anyone speak to continuous deployments with Lambda, where downtime is not an option? Is it possible to run blue green deployments?
unkoman 1 day ago 1 reply      
- Decouple lambdas with queues and events, SQS, SNS and S3 events are your friends here

- Use environment variables

- Use step functions to create to create state machines

- Deploy using cloudformation templates and serverless framework
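The last bullet (CloudFormation templates plus the Serverless framework) often collapses into a single serverless.yml that also covers the event-decoupling and env-var tips. A hypothetical fragment for the era's Node runtime; the service, bucket, and handler names are all made up:

```yaml
service: image-pipeline

provider:
  name: aws
  runtime: nodejs6.10          # current Node runtime as of mid-2017
  environment:
    OUTPUT_BUCKET: my-thumbnails

functions:
  resize:
    handler: handler.resize
    events:
      - s3: my-uploads         # fires on object creation in this bucket
  cleanup:
    handler: handler.cleanup
    events:
      - schedule: rate(1 hour) # cron-style trigger, no EC2 box needed
```

A single `serverless deploy` then renders this into a CloudFormation stack, so the functions, triggers, and environment variables are versioned together.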

obrit 1 day ago 0 replies      
I'd like to use Lambda@Edge to add headers to my CloudFront responses. Does anybody have any idea when this might be released from preview?
david90 1 day ago 1 reply      
I tried using Lambda, but you need to set up API Gateway before using it as well. Painful logging and parameter forwarding.
forgottenacc57 1 day ago 0 replies      
Lots of people are saying the API gateway is hard.

You don't need to use the API gateway.

Just talk direct to Lambda.

caseymarquis 1 day ago 0 replies      
Any comparisons to GCE and Azure's offerings for those who have used both?
goldenkey 1 day ago 1 reply      
Works terribly. It's basically a thin wrapper around a shoddy jar framework. All the languages supported are basically shit-farmed from the original Java one. The Java one is the only one that works half decently.
Ask HN: Assistive Devices for recently quadriplegic dad
155 points by throwawaysci  1 day ago   33 comments top 27
Voltbishop 1 day ago 0 replies      
I'd recommend the two products:

1. http://www.quha.com/products-2/accessories/quha-pufo/

It's a bluetooth device that allows controlling the mouse cursor with body movement (head or finger, etc.). It's cheaper. Coupled with free dwell-clicking software, it should work!

2. Eye tracker - there are a lot of options; visit reddit.com/r/eyetracking and reddit.com/r/ALS and ask them for advice. These devices let you control a PC with your eyes and are especially designed for people who have ALS. The ones that work really well cost money, but most insurance companies cover them in full. Avoid Tobii; they are not reliable and are more marketing than anything. MyGaze, LC Technologies, EyeTech Digital, SMI Vision: these are all companies you can trust. All should offer free trial periods and should have a rep who can come and visit your dad to do an evaluation. If they don't offer at minimum a 2-week trial, they're not a trusted company. Secondly, you can contact your local city's AT clinic; they have donated equipment for situations like this.

I hope this helps!

Mz 1 day ago 0 replies      
I was passingly acquainted with two quadriplegics from my corporate job. They were both employees of the company.

Quadriplegic just means all four limbs are impaired. The degree of impairment can vary substantially. One of these men had use of his arms, but did not have full use of his hands. He drove himself to work, had a full time job, wife and kid. He broke his neck in a pool accident in his teens. He used a manual wheelchair. He was able to use a manual wheelchair because he had use of his arms. He chose it over an automatic wheelchair to get in regular exercise.

The other was substantially more impaired. He broke his neck in a riding accident later in life. He had been a brilliant surgeon. He used an automated wheelchair. I think he had partial use of one arm and maybe a couple of fingers, which allowed him to navigate a smartphone with that hand. He came in once a week for a few hours to review surgical reports for the company. When ordinary claims processors (like I was) could not figure out if the surgery was covered and their boss with more training couldn't either, we printed off the entire file and hand delivered the paper version to this man on Friday afternoon. I had one claim go to him and hand walked my papers to the meeting.

I also attended an educational talk given by the two of them. This is how I know how they each broke their neck and other details.

Since your father was a consultant, he may be able to return to doing consulting work at some point in the future. The specialized knowledge in his head does not stop being valuable just because of his physical limitations. I am mentioning this because new quadriplegics are often suicidal. They feel that life is simply over. It's not. He was a professor and consultant, like this former surgeon, his knowledge and expertise still has value. Even though the former surgeon could no longer work as a surgeon, his knowledge of surgery was valuable and he had a unique very part time job at a world class company.

Depending on the exact details of your father's limitations, he may also benefit from the use of ordinary things like smart phones with apps. There are also a lot of non-tech assistive devices, like chairs to help them shower and spoons that can be strapped to their hand so they can feed themselves if they have arm movement but limited hand control.


aerovistae 1 day ago 1 reply      
I created a Chrome extension called Hands Free for Chrome which allows near-total control of the browser with your voice.

It occasionally has to be reset by hand if the voice recognition locks up, which is the only barrier. But I'm fairly certain it's the best option available for people in your father's situation.


apostacy_steak 1 day ago 0 replies      
My father contracted Guillain-Barré, a fast-onset neurological disorder that left him quadriplegic. I did a bunch of research a couple of years ago to help him out.

First, if your dad can still move his head you can use Apple's assistive tech to "tab" through the items on the screen with a turn one way, and "clicking" on an item by turning his face the other.

Second, MS Windows' voice control is actually really decent. You can browse, search, send emails, etc. all with your voice. It takes some training (both for the user and the machine) but my dad has gotten pretty quick with his.

Lastly, there's a bunch of eye trackers out there now, and you can use them for a lot of things. I setup CameraMouse (http://www.cameramouse.org/) for when voice wasn't quite cutting it (or my dad got tired of talking.)

Unfortunately, there's no perfect solution, and all require time to adjust.

j_s 1 day ago 0 replies      
Jouse3 $1500 mouth mouse/joystick seems tried & true. http://www.compusult.net/assistive-technology/our-at-product...

Source: https://www.twitch.tv/nohandsken quadriplegic streamer who plays Diablo/Path of Exile, Heroes of the Storm, World of Warcraft, etc. (I encourage Amazon Prime subscribers to give him ~$2.50 every 30 days via their free Twitch sub! https://help.twitch.tv/customer/portal/articles/2574674-how-... )

slightly related/helpful discussion: https://github.com/melling/ErgonomicNotes

gtsteve 1 day ago 0 replies      
I'm sorry to hear that. That's very bad luck indeed.

I remember hearing about this project some time ago: https://github.com/OptiKey/OptiKey

It might be helpful as it's an open-source project and if extra features are needed you might be able to add them yourself if you are a programmer.

I mentally bookmarked it because I felt it would be a good "make the world a better place" type project to contribute to if I ever had some spare time.

j_s 1 day ago 0 replies      
Please follow-up to report what works best for your father! So many of these Ask HN's have a great list of options, but no follow-up to report what worked best for your particular situation.

Thanks for bringing this request here to allow the community the chance to contribute!

WheelsAtLarge 1 day ago 0 replies      
It's not a laptop but an Amazon Echo is excellent for Music, Audible books and Podcasts. Hard to beat how well the immediate voice recognition works. Home automation works but it's a try and see if it works for you type of situation. It's worth investigating for your needs. Also it's relatively inexpensive. I should add that home automation needs additional hardware and costs. But you can buy an on/off plug for a bit under $30.00.
johnnyg 1 day ago 0 replies      
I'm sorry I don't, but my thoughts are with you and I think it's great you are advocating for and supporting your father.
house9-2 1 day ago 0 replies      
If he still has good movement of his neck:

- Smartnav (if Mac you need to buy via 3rd party, but includes the software)

Fairly expensive; there are other variants that cost less/more, and some gaming devices like TrackIR might work as well. It's possible that health insurance would pay for these types of devices.

I personally use Smartnav about 50% of the time I am programming, along with Dragon/Voicecode due to RSI issues.

Smartnav + Dragon might be enough for using laptop/desktop, not so much for mobile devices. If he actually programs I would recommend voicecode.

All of these technologies have a massive learning curve.

You might want to checkout the voicecode forum and slack channel, I know there are some quadriplegic programmers in that community who would have better insight than I.

- http://www.naturalpoint.com/smartnav/

- http://voicecode.io/

- http://discuss.voicecode.io/

- https://voicecode.slack.com/

jonnycowboy 23 hours ago 0 replies      
I work for a company that makes robot arms for assistive purposes (Kinova) so I have some insight to share.

1st a voice setup with Alexa or similar can really help.

With regards to phone use, some of our users have an attachment to put the phone close to their head and use their nose to "click/select" (they can move their head).

Eye tracking technology is really impressive these days (can be as fast as using a mouse). I've recently demoed a system with a Tobii sensor (https://www.tobii.com/) that was hooked up to a laptop, very impressive when combined with appropriate software (it handles scrolling, keyboard shortcuts, etc in a custom interface). I'm not sure with regards to phone/tablet use how well they integrate.

Ping me on Linkedin if you'd like to talk more.

jeremyt 1 day ago 0 replies      
Dragon naturally speaking + Sennheiser ME-3 microphone should be at the top of your list
m0ngr31 1 day ago 0 replies      
A little while back, I had a quadriplegic man reach out to me and thank me for a skill I wrote for Alexa to control a Kodi box. It allowed him to watch what movies and shows he wanted without constantly asking for help.


I'm truly sorry about your dad. That's a scary situation for him to be thrust into.

mindcrime 1 day ago 0 replies      
A company I used to work for did a project once, deploying a lot of home automation for someone with mobility issues. It's been a few years ago, and the tech has probably changed a lot since then, but they used OpenRemote[1] heavily as part of the project. It might be worth looking into.

[1]: http://www.openremote.com/

jiiqo 10 hours ago 0 replies      
I work with assistive technology and many of the clients are quadriplegic. Because your dad can control his head accurately, the best computer access device for him is definitely a head mouse. Eye tracking solutions are tiring to use and inaccurate.

I have tried most of the commercial solutions available and I think the best head mouse for your dad would be the Zono mouse: http://www.quha.com/products-2/zono/. It is very easy to use and is as accurate as a normal table-top mouse.

mattbgates 17 hours ago 0 replies      
This company develops software for the handicapped: https://getaccessibleapps.com/

I think they've created software that can bypass captchas and will work with you to develop software that can help your dad.

11thEarlOfMar 1 day ago 0 replies      
You might take a look at Neuroswitch from Control Bionics:


base2john 1 day ago 0 replies      
Sorry to hear about your family's situation, but hopefully he's able to get back to work soon.

Tecla is great; you should give it a try. Depending on his comfort and ability, a head-tracking mouse from Orin is pricey but works really well with a laptop/desktop setup. Dragon NaturallySpeaking is useful too.

Also he should make an appointment with a local assistive technology practitioner soon to get a run down of all the options, both low and high tech. You can find these ATP folks at most all rehab hospitals.

Good luck

onlydnaq 1 day ago 1 reply      
You could take a look at tobii [1]. They make some interesting products where they combine eye tracking, speech analysis and other stuff to create user interfaces for people with specific needs.

[1] https://www.tobiidynavox.com/en-US/tobii-dynavox-pceye-plus/

jbannick 23 hours ago 0 replies      
Contact Barrie Ellis, world expert on motion-impairment assistive technology: barrie.ellis@oneswitch.org.uk
nikki-9696 1 day ago 0 replies      
Sorry to hear. You might find a lot more useful advice on Reddit, if you haven't already posted there.
rroriz 22 hours ago 0 replies      
There is a startup in Brazil that promote DIY Assistive Projects: https://www.meviro.org
kinova 1 day ago 0 replies      
There is this robotics arm that helps gaining autonomy. There are options that allow the user to use his tongue, head or breath to control the arm. It interfaces easily to the wheelchair controls.


xrange 23 hours ago 0 replies      
Dasher might be something worth looking into:


tinus_hn 1 day ago 1 reply      
There was a talk by a C4 paraplegic at WWDC this year that shows his setup.
mudil 1 day ago 0 replies      
Medgadget has lots of ideas for rehab and assisting devices out there. http://medgadget.com
htatbr 23 hours ago 0 replies      
I worked with quads as a college grad student. I also worked with HIV patients in the US Peace Corps. You really need to watch for sepsis, and watch out for catheter infections:

https://www.indiegogo.com/projects/the-connected-catheter-by...

I would not allow any quad to use a catheter more than a short time. Spinal Singularity will fail by increasing infections. A new DNA tech may change this:

https://nanoporetech.com/products/smidgion

https://www.youtube.com/watch?v=OK5nNSwt3MA

https://www.cs.columbia.edu/2016/dna-sequencing-in-classroom...

Sepsis now dominates the hospital ICU. It is what kills most AIDS patients too. Antibiotic resistance is driving costs. The ICU is now 40% of US hospital budgets. This is bankrupting state and federal budgets; it is why Medicare, Medicaid, and Obamacare are bankrupting US government (federal and state) budgets. In 2013 health dominated state budgets.

State spending on health care now exceeds education spending. Look at NM's past budgets:

http://www.usgovernmentspending.com/compare_state_spending_2...

Today 1/4 of US VA and Indian Health patients are diabetic. US Defense Dept. funding must now compete with Medicare. Today 40% of hospital costs are for growing ICUs and chronic disease. Half of US Medicare cost = chronic disease from diabetes.

NM ICUs are dominated by chronic disease: http://www.amazon.com/Where-Night-Is-Day-Politics/dp/0801451...

40% of US hospital budgets now pay ICU/chronic disease costs, and this cost is going up annually: http://money.cnn.com/video/technology/2013/07/24/fortune-tra...

Can MinION help pre-ICU patients better control diet and sepsis infection?

http://www.bloomberg.com/news/articles/2015-06-03/deadly-inf...

A complete bacterial genome with MinION:

http://www.nature.com/nmeth/journal/v12/n8/abs/nmeth.3444.ht...

MinION can find septic bacteria fast:

https://genomemedicine.biomedcentral.com/articles/10.1186/s1...


Ask HN: What was it like to quit your job and focus on your successful startup?
155 points by mattbgates  1 day ago   105 comments top 33
welanes 1 day ago 3 replies      
> What was it like to quit your job and focus on your successful startup?

1. Time became my most valuable asset. Everything was filtered through the lens of "does this save me time?" and so I optimized everything: the gym (worked out at home), shopping (had it delivered), dating (used fleshl...joking! :).

In the words of Joel Spolsky, "Every day that we spent not improving our products was a wasted day".

2. I worked harder than ever before. My job was tough but output ebbed and flowed with meetings, management, plus the usual office time wasters. The startup workday is more straightforward: wake up, coffee, write code, listen to users, coffee, learn how to add value to the market, coffee.

3. Every two months or so I look back and shake my head at how lame the product was, how little I knew, and how inefficient my workflow was. Which is to say, I continue to learn at an incredible clip yet realize I still don't know a thing. I expect this trend to continue - if it doesn't, I'm not growing.

So, yeah, overall it's been An Incredible Journey. My only regret is that I didn't start sooner.

It's actually a gift how easy it is to go from idea to product to business. To paraphrase Murakami, 'If you're young and talented (or can code), it's like you have wings'.

We're living in the best of times.

antigirl 1 day ago 4 replies      
This might be slightly offtopic, but I've been thinking of working on my own ideas for a decade now. I tried working on projects in the evenings and on weekends, but there isn't enough time. Not only does it affect your social life, it also affects your motivation and drive in general. After looking at code on a screen for 7 hours, the last thing you want to do is go home and look at more code. [I guess this depends on your age; I'm 32 now, not a night owl any more.]

Anyway, I started contracting last year for this exact reason, at around a 300-400 day rate, and now I've saved enough to quit and follow my 'dream'; my last day is July 7th. I have enough savings to last me 2 years while sustaining my current social life. No frugality required.

jasonkester 1 day ago 1 reply      
The biggest change was how much more free time I had.

When you're building a Nights and Weekends side project, you get used to stealing whatever free hours you can to work on the product. But you also necessarily build things so that they don't take up much of your time every day. If they did, it would interfere with your day job and that just wouldn't work.

So when you remove the day job, you find that suddenly you have this successful business that runs itself in the background and you can do pretty much whatever you want with your day.

Most people in this situation will immediately fill that time up with work on the product, and I did to some extent. But I also made sure to take a bunch of that time just to enjoy with my family. I eventually settled on 2-3 days a week where I was "at work", with the rest devoted to other pursuits. My wife and I are both rock climbers, which is an "other pursuit" that will happily expand to fill the time available. We're also parents, so ditto there.

I also make a point of downing tools for a while from time to time. Again, because I can.

I took the kids out of school and dragged them off backpacking around SouthEast Asia for a few months the first year. We did a couple more medium sized trips this year, and I took the entire Fall and Spring off because those are the best times for Bouldering in the forest here. Again, work is happy to ramp up or down to accommodate because I never shifted it out of that ability to run on nights and weekends.

So now, I burst for a few weeks at a time on work stuff (with possibly a more relaxed definition of Full Time than most would use), then slow down and relax for a bit.

It's actually not so bad.

Grustaf 1 day ago 1 reply      
For me, and probably for most, it's not a matter of quitting your job in order to work on your "successful" startup - if it were already successful then there'd be no issue at all. Rather it's about quitting while the startup is just beginning to look reasonably good, _in order_ to make it a success.

Since I've been working as a mobile developer, and also management consultant (my other career), it's always been extremely easy for me to find a new job whenever I needed, so there has been very little risk involved.

Still, it did require some savings, since our startup is very research intensive and will take several years before we see any revenue. We secured some basic funding now though, and things are looking good for the next stage too so I will only have needed 6 months or so of buffer.

In summary, my view is that if you're a reasonably skilled engineer or have some other attractive occupation, there is nothing to fear. The worst that can happen is really that your startup doesn't work; you'll go back to what you did before with a few months of missed income but plenty of useful experience.

I don't think there are many situations or cultures where a failed technology startup attempt on your résumé would count against you in any way; in most places, quite the opposite.

pedrohidalgo 1 day ago 2 replies      
I did it twice. Both times I failed, but I don't regret it. I felt very good while doing it. I lost savings, but it was a good life experience.

The second time, I created this plugin: http://plugins.netbeans.org/plugin/61050/pleasure-play-frame... I tried to make a living off it, but I only sold 5 licenses at 25 dollars per year, and developing the plugin took me two and a half months of hard work.

Didn't have a problem getting a new job both times.

jaequery 1 day ago 2 replies      
Well, I ran a solo startup, so my experience might be different from others', but for me it was like living a dream. While working full-time somewhere, I started this SaaS business, and soon after, the monthly income kept doubling to the point where I was able to quit my job knowing my living expenses were covered. When I quit, the feeling of liberation just hit me so hard. I had worked for 7 years prior to doing that, and now I was free. The feeling of being able to go outside anywhere at any time of the day, go shopping, eat at a nice restaurant, or just relax at the beach: things you just couldn't do when working full-time somewhere. It's like getting out of jail for the first time. That's what it was like for me to quit my job and focus on my startup.
williamstein 1 day ago 0 replies      
I gave the following talk exactly one year ago right when I decided to quit my academic job to focus on my "startup dream" (now https://cocalc.com): http://wstein.org/talks/2016-06-sage-bp/. It's a year later, and the quick summary is that I feel basically really, really happy to be working on this company. It's exciting to be successfully hiring and building a team of people who work very well together and can focus 100% on a specific problem, unlike how things worked for me in my part of academia (pure math research). Though stressful at times, doing what I'm doing now is much less painful than trying to build a company while doing another fulltime job (namely teaching at the University fulltime) -- that really sucked. I'm extremely glad that customers are paying us and investors funded us enough that I can do this.

Having the luxury to focus on one thing, rather than juggling several, is much like having an office that is neat, tidy, and uncluttered. It feels good in the same way. At least by quitting a job and focusing on a startup, you have the option to focus 100% on it. Actually focusing 100% on one thing is a difficult skill in itself, even with the right circumstances; however, it's completely impossible (at least for me) with two fulltime jobs at once, especially jobs like teaching (which involve lots of public speaking at scheduled times) or running a website with paying customers (which demands, e.g., responding to DDOS attacks).

foxhop 1 day ago 2 replies      
Only asking for opinions from people who were successful is literally asking for survivor bias perspectives.

I dream of working for myself but I've never taken the plunge. My income from side projects is about 1/3 of the way to my minimum number to quit and go full time.

I do a lot of thinking about this, my number is the same as my financial independence / early retirement number.

One of the biggest things that holds me back is medical insurance for a family of 5. Having an employer offsets this cost a LOT.

dwviel 1 day ago 0 replies      
I agree with some of the other comments that time is the most valuable asset. Be productive! Also:

1) Most of the advice you find online is not really applicable. I like the stuff Sam Altman (yes, Y Combinator) puts out because it most closely mirrors what I've experienced.

2) You need to do it yourself. Whatever it is. Just do it.

3) Funding is a game. Here on the east coast, investors pretty much only fund the growth stage of a company, after you have real traction and are virtually guaranteed some kind of success. You will need to fund your startup out of your own pocket until then! This is what they call "friends and family", because you will be hitting them up for money, favors, meals, places to stay, etc.

4) The tech is the easy part, as that is what you know. Marketing and selling is the hard part, as that is what you likely don't know. It is the part you should try to get help with if you can, but finding someone who knows what they're doing is tough, so you can get sidetracked quite a bit.

5) "Do things that don't scale" is completely true. Reaching out one-on-one has been the most successful way to engage potential customers, even though it is very time- and labor-intensive.
jonathanbull 1 day ago 1 reply      
Last year I quit a great full time role to focus on my side project, EmailOctopus (https://emailoctopus.com). Haven't looked back since!

I didn't take the plunge until quite late on, waiting until it was making enough money to comfortably cover my personal expenses. No regrets there - growth was slow in the early days and if I hadn't had the luxury of a monthly pay packet, I probably would have given up before I got the chance to properly validate the business.

Transitioning to full time brought more stress than I expected, but the experience is priceless. In the past few months alone I've learnt more than I did in 3 years of employment.

Realistically, what's the worst case scenario? I'm a reasonably skilled dev in a strong market so there's not much to lose. If it all goes wrong I'll get another job with a load of experience (and stories!) under my belt.

markatkinson 1 day ago 1 reply      
I left my permanent position and was lucky enough to continue working on a contract basis to continue covering some of my bills.

I reached a point after about 4 months where I realised the journey to make the business profitable would most likely be a five year slog, and while the opportunity was there it wasn't a cause I felt I could devote 5 years of my life to.

So I gave the software away for free to the people who were helping with beta testing and went back to my job. The most positive thing was how the experience propelled my career at my current employer: I got a better role, they seem to have more respect for me, and I operate more independently now.

So I suppose if you can build some sort of safety net before quitting that helps.

Disruptive_Dave 1 day ago 0 replies      
I can tell you what it was like to quit my job and focus on a very-early-holy-shit-what-are-we-doing-startup. Was at a marketing agency for 10 years when I left to pursue my first co-founded startup. Made the decision after we got a little organic press and signed up 5k+ users in 24 hours. I had been looking for an excuse to leave and do something new anyway.

It was scary as hell (no revenue coming from the startup), fun as hell, challenging as hell. Had my savings all planned out to help support the adventure, but still had that daily stress of knowing every dollar I spent was not coming back anytime soon. That part wasn't fun. But I didn't have kids or a mortgage and knew this was my chance to do something of the sort.

10/10 would do again in a similar situation, though knowing what I know now, I might have launched a business instead of chased a cool idea.

jondubois 1 day ago 0 replies      
I never returned to my old job... Why would you go back? I did return to working as a contractor for a while. A lot of failed startup engineers become contractors.

I'm back working for a startup again now so I guess I'm just going back and forth.

I've worked for a few startups and none of them has had an exit yet but one of those I have shares in is doing relatively well.

Doing contracting work is a smarter decision in general. You can actually plan to make a sizeable amount of money and then watch it happen without taking any risks - It's all within your control. With startups, you might often feel that it's outside of your control, especially if you're not a co-founder.

zghst 1 day ago 1 reply      
I am 2 months into doing this, and it feels a lot like when I moved to SF: lots of things up in the air but very promising. I had a 6-month financial plan, but it was hard to adapt to the lack of income and to seeing my savings decrease month by month, so I'm doing contracting and TaskRabbit. I'm not really a ramen-noodles person, so I make do with what I can get at the grocer.

I have a previous coworker who'd love to help me, but I don't want to babysit his work, and I don't feel he's valuable enough to the business. I would like another cofounder, but it doesn't bother me that I'm doing it all on my own; I have spent the last 10 years getting ready for this, so I'm more than ready. I am doing more than okay on the business alone, but I wish I had some expertise around for a second opinion. I am seriously thinking about going into an accelerator program or seeking angel investment, but I'm apprehensive about taking cash at this (or any) stage. My biggest fear is actually having to get a real job again; I will do anything to prevent that from happening, since it would mean my startup is dead.

RHSman2 1 day ago 0 replies      
Am 3-4 months in with the concept. Have an anchor client and a consultancy partnership. The biggest and wildest thing is moving from someone with an idea to a visionary leader in a space. It's fundamentally awesome to take said idea, turn it into an offering, and go out on the line with it.

I have had nothing but wonderful experiences with others helping me out. I am 40, have 2 kids, a dog, a wife, and a pretty good network which I have had to really 'work' (authentically), but the assistance from others, the trust, and the good feelings have made me believe in the human race a bit more! In corporate it's all protectionism and petty arguments, AND a visionary thinker is not what they want!

I am lucky to have found an excellent co-founder who is a great counterpart. Definitely taking/balancing risks, going fast, and working really hard (doesn't feel like work building your own thing, though). However, the finances have to be calculated. I have taken the consulting/product route, which seems to be good, but converting a service-driven model to a recurring-revenue model is both the goal AND the challenge.
erikrothoff 1 day ago 2 replies      
I'm on my last day at my current job before taking the plunge to focus on my half-successful startup! I'm going down to about half my previous pay, and the company still loses money every month by employing me. Our runway is ~6 months before our savings run out. Basically, we have to double the monthly revenue that took 5 slow years to build up. Exciting and scary times!!!
cjr 1 day ago 0 replies      
I started my screenshot SaaS product, Urlbox (https://urlbox.io), as a side project back in 2013. There were times I thought about shutting it down, as the amount of time spent on it compared to the revenue growth was just too depressing! A whole 3.5 years later, it had grown just enough that I could take it more seriously and actually support myself working on it full-time.

As with everything there are pros and cons. The pros are obviously that you get to spend your time doing something you enjoy (hopefully), and can work whenever and wherever you feel like (this can also be a con!). The cons are that you will always be worrying about stuff like churn, whether servers will go down whilst you're away on holiday, how you're going to grow enough to support a family etc etc.

The long, slow SaaS ramp of death really is a thing, and there are no silver bullets in terms of growth/marketing, just many small things that all contribute. I also always used to think 'if only I could just get to $x MRR then everything would be so much better and I'd be much more comfortable and relaxed', but when you do eventually break through that barrier you realise you're just more worried about how you are going to achieve the next one, so it's kinda never ending!

I also agree with other posts here that if you're already a decent developer in a good market, then what is the worst that can really happen? Try doing some fear-setting. I'm sure you could always find another job if your thing doesn't work out, but you do need to give these things time. I also failed a bunch of times with other startup ideas, one of which was also YC backed.

daliwali 1 day ago 2 replies      
I started doing this recently by quitting my job, and I currently have no income.

There is this assumption that one must build a minimum viable product and release it as quickly as possible, so much so that it's become a startup mantra. It's no surprise that a lot of these products seem technically shallow; everyone is reaching for low-hanging fruit.

I feel rather alone trying to do something that I think hasn't been done before, or if it had, wasn't executed well. I don't think I could possibly commit to it without having strong motivation, which I struggled with while having a full-time job.

The biggest technical/social challenge I have is to make something that a non-technical user could grasp right away and build something with. I think the automation of web dev is inevitable, and frameworks were just a historical blip on this path. The same thing is happening to web design. http://hypereum.com

wonderwonder 1 day ago 0 replies      
I landed a contract to develop a SaaS system before I really knew how to program for the web; I had some prior systems programming experience but had not coded for about 5 years. I quit my job to develop it and delivered, but was never able to land another customer.

I still have the original client 3 years later and the company grosses about $3,500 per month and I net $1,250. It pretty much runs itself, requires maybe 2 hours of work every 2 - 3 months. I spent a little over a year trying to grow it from the initial customer with no luck.

Landed a job as a full stack engineer afterwards and I really like it. I am actively looking to start a new project but I will keep my main job while doing it. I had the benefit of a wife who makes a good salary to support me during that prior adventure (Still do :) )

gwbas1c 1 day ago 0 replies      
Quitting my job was great!

Let's just say that my mistake was being too afraid to hurt my co-founder's feelings. If we had parted ways when we should have, I might have actually gotten somewhere. (Then again, I might have gotten nowhere either way!)

greglindahl 1 day ago 1 reply      
I had my financial ducks in a row before I did my first startup -- I knew how much "fun money" I could expend getting the startup to work. Most of the horror stories I hear revolve around people who didn't do this.
wolfer 1 day ago 0 replies      
TLDR: Sort out your finances first. Startups are extremely stress-inducing, and even now, with our startup at break-even, 1m ARR, and 100% growth predicted for the next financial year, I still earn less than I need to live, though with the promise of a strong exit in the next 3-4 years if we hit our targets.

Previously I contracted as a full stack developer bringing in other developers on projects as and when the project timescales wouldn't have been achievable with just me. Running a software consultancy alone, dealing with all of the usual rigmarole of a business and performing proper client outreach was stressful, but financially and personally very rewarding (especially when you close a big deal completely on your own).

In order to get involved in my current startup, which at the outset was comprised of a designer, biz dev (CEO) and myself as CTO I had to cut off ties with my previous clients and dedicate all of my available time to the new startup. I had leveraged myself quite a bit running the previous consultancy as billings were growing year on year, so my VAT/Corporation tax accounts were generally paid out of job fees towards the end of the year rather than set aside throughout the year, leaving me in a negative cash flow position when stopping work for existing clients. Luckily there were some ongoing payments that didn't require development resource, so the small admin time required to invoice and chase up was all that was required, and enabled me to setup payment arrangements with HMRC to settle these liabilities over a period of time, out of this cashflow. Setting up these arrangements was very stressful, and I would strongly advise anyone coming into a startup to fully evaluate their financial situation before committing even if the opportunity seems huge.

Initial salaries in the new startup were minimal (1000 p/m approx), and it took a solid three years, extremely long working days, almost unmanageable personal stress, and around 0.5m of funding before we got to an above-average salary, 1m ARR, a team of 15, and strong growth projected for the coming year.

Success is a subjective term, and occasionally I have to refocus to see the light at the end of the tunnel, but with enough grit, luck, and determination, it's possible to tip the balance to a point where success is more likely than not.

andrewchambers 1 day ago 1 reply      
Just a reminder that life is pretty short, and worst case, you can probably just get another job and only be 6 months to 1 year behind your friends in terms of savings.
12s12m 1 day ago 0 replies      
I actually have a different take on this. Twice in the past, I built a side project to the point where there was some buzz around it, and I didn't have a job while working on these. However, as the products were nearing their V1 release, I got very good offers for contract work. I also had debts, so I took the jobs. However, this led to the products failing to get any traction.

So, if you are planning to leave a job and have a good product that is earning even half of the money you need, leaving your job will only increase your chances of success. Hanging on to the job while working on a product is going to be much harder.

hesdeadjim 1 day ago 0 replies      
I've had two very different experiences starting a company.

The first time I was two years out of school with $12k in the bank, had a partner with a ton of experience, and a decent idea. We crunched for six months, launched, failed, and then tried to pivot. I ran out of cash a few months before the iPhone launched and had I had a longer runway we could have ported our app to the iPhone and potentially seen success.

A year ago, nine years after that attempt, I started a small video game company with another friend (justintimegame.com). Despite my life situation being more complicated and expensive to maintain, the prior nine years' success combined with my wife's income basically lets me try and fail until I get sick of it, instead of until the money runs out. Obviously I'm aiming for success, but the massively reduced stress from barely worrying about money lets me be much more open to experimentation while also being resilient to failure.

I don't regret starting and failing my first company however. It set me up for having a higher risk threshold and an interest in startups that ended up working out quite well for me.

fergie 1 day ago 4 replies      
I quit my job to start my own company last January. It's been fairly successful so far.

For somebody like me, and probably a lot of HN readers, it's _actually_ a fairly low-risk proposition because qualified, experienced software engineers are so sought after. Whatever you are doing, you will always be able to pick up a $1000-$1500-a-day gig when you need to bootstrap your actual project.

My old boss has contacted me a few times to see if I want to come back- definitely do not want to.

You talk about "fear", and you talk about a "successful" startup. Here's the thing: You never know if a startup will be successful, and you just have to give it a go for the love of it, rather than any expectation of success. Don't be afraid- there are plenty of worse things in this world than a failed company.

Have learned a lot about bookkeeping.

verelo 1 day ago 0 replies      
So for me I left a job and worked on my startup that ended up unsuccessful. Fortunately during that time I met my current business partners and we have since made and sold a company that by most people's definitions has been a very good run.

Expecting to get it right is the mistake we all make at some point (even when we say out loud "this might not work out", we still somehow expect it to work). Expecting failure to lead to something positive is the long game I'd urge you to play; it's hard to remain in a good mental state at times while you're working hard and feeling underappreciated, but that is sadly just what it's like.

geekme 1 day ago 0 replies      
I started up twice. The first was an e-commerce marketplace and the second was in mobile app development. Both failed. I ran them for around 4 years. Now I am back at a full-time job. I never regretted my failures, since I learnt a lot from them, and now I am working on my 3rd side project hoping to succeed.
mezod 1 day ago 0 replies      
The way I did it was to start freelancing so that I could pay the expenses with fewer hours and then use the rest of the time for side projects. Finally, one of my side projects (https://everydaycheck.com) is showing interest and traction, so it's easy for me to do fewer freelance hours and put more time into it; hopefully I can get to a point where I can work 100% on it.

I guess the "quit your job" problem only exists if you have major responsibilities, like a family, or paying debt back. Otherwise, it makes no real sense to consider it, the opportunity is too big.

cylinder 1 day ago 0 replies      
Self employment is incredible when the money's rolling in, and terrible when it's not. I can't speak for a full fledged startup with funding though because then you just work for investors.
smdz 1 day ago 0 replies      
1. One day you are excited by the possible opportunities, then you are overwhelmed, and then you are depressed. If you leave without a plan, it takes a few months just to get your mind straight. If you do have a plan, it falls apart and you still end up spending a few months that feel unproductive. I am not discouraging having a plan; you must have one. But also note that you made that plan with an "employee mindset".

2. A 6-month financial backup is usually not enough. I have heard many stories where people try going independent for 6 months, run out of money, and start looking for a job. What happens is that entrepreneurship gets into you in that time, and if one goes back to a job, I can bet they feel even more frustrated. You need 1.5 years of backup, or 2-3 years of "frugal living" backup. I struck positive cash flow in just about 5 months, but it wasn't good enough. I distinctly remember thinking "Maybe I should have done this part time". Then I struck a mini gold mine at 8 months. Having a good backup will help you persist longer. I did not have a growth strategy that worked, but I focused on working and doing the right thing. Keep it rolling.

3. The biggest worry I had when starting was about providing "enough" for my family and any emergencies for the next 1.5-3 years at any point in time. Unlike many stories, I promised myself not to wait until I went bankrupt or deep into debt; nearing that is a huge red flag, at which point I would exit and take a regular job. However, taking a job is the last thing I want to do. That kept me money-oriented for a while and made me work on stuff that generated positive cash flow.

4. Would it have been possible to return to my old job? Maybe, but I would not want to. I waited too long to jump ship. In fact, my experience across multiple "good" jobs is what keeps me away from them. Once you taste entrepreneurship, it's hard to go back.

5. I do not consider myself successful. Maybe semi-successful; some people see it as success. But I have come a long way from fearing failure. Success may or may not last long. I enjoy the process and the tremendous personal growth it results in. I make sure my financial backup now gives me 5-6 years minimum to start afresh if I have to. Do not undervalue the role of money; it definitely makes things easier.

6. This is my favorite quote about Karma. I heard it many years back (and thought it was impractical). Especially useful when I feel I did everything right but nothing works:"Karm karo, fal ki chinta mat karo" (Do your duty without thinking about results)

P.S.: I don't know about others, but I have restricted myself to writing fewer HN comments because each takes quite a bit of time/energy. This one is an act of impulse. How do other entrepreneurs feel about this?

apatters 1 day ago 0 replies      
If you're single, childless, and have a few thousand dollars in savings, quitting your job to focus on a side project or startup is very easy. You can achieve your dream in the next 24 hours. Here's the roadmap.

1. Give your employer your 2 weeks/1 month notice (depending on locale). Taking this step immediately is critical because the urgency and shock of the change will force you into being fast and practical about all the subsequent steps.

2. Create a monthly budget for yourself which assumes no income that you are not 100% sure about. So if you have interest from investments, or a freelance contract that's an absolute guarantee, you can include it. For most people, the income side of this budget is going to be low or nothing. Your goal with this budget is to stretch your funds out for 6-12 months. The good news is that in 2017, the principle of geoarbitrage allows you to live on virtually any budget. If you live in the Bay Area, your next step is going to be to move somewhere cheaper. On the cheapest end of the spectrum (I'll use Thailand as an example because I live here), you can get a basic apartment in the suburbs of Chiang Mai or Bangkok for $100-$500/mo, your initial arrival can be visa-free, and you'll live on delicious Thai food from a restaurant down the road for a few dollars a day. Network heavily with people in your intended destination before you even arrive, because it'll make everything 100 times easier.
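The runway math behind step 2 is simple division; here is a minimal sketch. All the figures in the example are hypothetical, not taken from the comment:

```python
def runway_months(savings, monthly_expenses, guaranteed_monthly_income=0.0):
    """Months until savings run out, counting only income you're 100% sure about."""
    burn = monthly_expenses - guaranteed_monthly_income
    if burn <= 0:
        # Guaranteed income covers expenses: runway is effectively unlimited.
        return float("inf")
    return savings / burn

# Hypothetical numbers: $12,000 saved, $900/mo expenses after moving somewhere
# cheap, $200/mo of guaranteed freelance income.
print(round(runway_months(12_000, 900, 200), 1))  # 12000 / 700 months
```

The point of the `burn <= 0` branch is the commenter's rule: projections and hoped-for revenue do not count, only income that is certain.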

3. Now create a business plan for your new entity. The business plan should include a description of the product or service which you're going to market, how you're going to market it, what you're going to charge (start high), and any and all costs of development and operation including your own time. It should include monthly profit/loss projections (you're not allowed to use these projections in your budget; they are goals, not guarantees). The most important thing about your business isn't what product or service you initially offer. Once you have assets and control you can try anything you want. Until then the goal of your business is to make enough income that your assets are growing, no matter what that entails.

If you're leaving the country as a part of this process I would advise forming an LLC and opening a bank account before you go, as these things can be difficult from overseas. You'll be very busy trying to make money and living your dream so you don't want to have to deal with paperwork.

Prepare yourself mentally to work very hard for at least the next 6 months and do whatever you need to do to make enough cash. You will become practical and decisive, and you'll learn many realities about business, such as cash flow is king, very quickly. I got my start being nickel-and-dimed by agencies in India over Elance. It sucked and it was hard and it was 100% worth it.

There are many objections to this strategy which typically stem from risk aversion, or a desire to not worry about money. I would submit that if one objects to the risk, this plan is a personal growth opportunity: it will teach them how to handle stress, plan for contingencies, and so on. If the objection is that they don't want to worry about money, I would point out that money is just a way for people to quantify your value to them, and since no man is an island, there are great personal and financial rewards to be reaped from confronting this objection and discovering what other people truly value about you.

Doing step 1 first and now is the key. If your path brings you through Bangkok let me know and we'll grab a beer! I've seen many people succeed at this and a few fail. Your odds are better than you think.

jhylau 1 day ago 0 replies      
one word: liberating
Ask HN: What's the best way to optimize images in S3?
3 points by hartator  4 hours ago   2 comments top 2
dgelks 4 hours ago 0 replies      
For this sort of work I found using a lambda function with a trigger on s3 upload works very well, aws-lambda-image seems like a popular project to use instead of writing your own code https://github.com/ysugimoto/aws-lambda-image
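Not tested against the project above, but the shape of such an S3-triggered resize Lambda is roughly this sketch (the `fit_within` helper and handler wiring are hypothetical; real code would call S3 and an image library such as Pillow where the comments indicate):

```python
def fit_within(width, height, max_w, max_h):
    """Scale (width, height) to fit inside max_w x max_h, keeping aspect ratio."""
    scale = min(max_w / width, max_h / height, 1.0)  # never upscale
    return max(1, round(width * scale)), max(1, round(height * scale))

def handler(event, context=None):
    # An S3 put event carries the bucket and key of the uploaded object.
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]
    # A real handler would s3.get_object() here, resize with Pillow using
    # fit_within(), and s3.put_object() the result under another prefix.
    return {"bucket": bucket, "key": key,
            "target": fit_within(3000, 2000, 1024, 1024)}
```

The aws-lambda-image project linked above packages this pattern up behind a config file so you don't write the handler yourself.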
savethefuture 4 hours ago 0 replies      
I have a microservice set up to process my S3 images, both on the fly and when they're uploaded.
Ask HN: What if WannaCry would have tried issuing fake Bitcoins?
7 points by tarikozket  16 hours ago   5 comments top 4
Lan 6 hours ago 0 replies      
Not a whole lot. The fastest CPUs will net you less than 100 MH/s [0]. The fastest single-card GPU configurations will net you around 1,000 MH/s [0]. The current total hash rate is around 5,000,000,000,000 MH/s [1]. WannaCry affected around 300,000 PCs [2]. So if every WannaCry miner was operating off CPU and getting 100 MH/s, it would only be 30,000,000 MH/s. Or 0.0006% of the current total hash rate. If they all had high-end GPUs getting them 1,000 MH/s then that would bring it up to 300,000,000 MH/s. Or 0.006% of the current total hash rate. So WannaCry wouldn't even be a drop in the bucket. If you're wondering how the hash rate is so high, it's because mining has switched to using ASICs, the fastest of which run around 14,000,000 MH/s [3].

[0] https://en.bitcoin.it/wiki/Non-specialized_hardware_comparis...

[1] https://blockchain.info/charts/hash-rate

[2] https://en.wikipedia.org/wiki/WannaCry_ransomware_attack

[3] https://en.bitcoin.it/wiki/Mining_hardware_comparison
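The arithmetic in the comment above checks out; a few lines reproduce it (all figures are the commenter's 2017 estimates, rates in MH/s):

```python
INFECTED = 300_000                   # WannaCry-affected PCs [2]
CPU_RATE = 100                       # fast CPU, ~100 MH/s [0]
GPU_RATE = 1_000                     # high-end single GPU, ~1,000 MH/s [0]
NETWORK = 5_000_000_000_000          # total network hash rate, ~5e12 MH/s [1]

cpu_share = INFECTED * CPU_RATE / NETWORK
gpu_share = INFECTED * GPU_RATE / NETWORK
print(f"all-CPU botnet: {cpu_share:.4%} of the network")  # 0.0006%
print(f"all-GPU botnet: {gpu_share:.4%} of the network")  # 0.0060%
```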

axonic 8 hours ago 1 reply      
If I understand the OP correctly, he means could the blockchain be attacked by a vast number of infected hosts, causing a malware-induced change in the consensus of nodes, allowing BTC to be illicitly acquired, spent, or produced.

I believe controlling a massive number of nodes in the network via infection techniques like WannaCry used would open the door for many actual and hypothetical attacks. Please see the Bitcoin Wiki page titled Weaknesses [1] for more details about attacks involving the control of network resources.

More realistically, a simpler attack would be to go for control of the wallets, if you have that kind of access to the infected hosts. However, if an actor had an interest in devaluing Bitcoin - to buy after a crash and sell after recovery, perhaps, or just to destabilize users' trust and destroy it (states?) - then there could be a lot of profit in it, I believe. Bitcoin has many competitors and enemies; is this something we should worry about?

[1] https://en.bitcoin.it/wiki/Weaknesses

download13 16 hours ago 0 replies      
Can you clarify?
miguelrochefort 10 hours ago 0 replies      
There's no such thing as fake Bitcoins.
Ask HN: What happens to blockchain transactions in case of network partition?
55 points by raghuvamz  2 days ago   22 comments top 7
EthanHeilman 2 days ago 1 reply      
Some suggested reading on the subject.

I wrote a paper, "Eclipse Attacks on Bitcoin's Peer-to-Peer Network" [0], about maliciously partitioning the Bitcoin network. Much of the paper focuses on how to partition the network, but Section 1.1, "Implications of eclipse attacks", should give a good sense of how Bitcoin's security properties depend on the network not being partitioned.

"Hijacking Bitcoin: Routing Attacks on Cryptocurrencies" [1] also discusses network partitions and Bitcoin. As with Eclipse Attacks it focuses on both the how and the effects.

Interestingly, blockchains built on Algorand [2] would not fork under a network partition; they would just cease to create new blocks until the network is whole again.

[0]: "Eclipse Attacks on Bitcoin's Peer-to-Peer Network" https://www.usenix.org/node/190891

[1]: "Hijacking Bitcoin: Routing Attacks on Cryptocurrencies" https://arxiv.org/abs/1605.07524

[2]: "Algorand: Scaling Byzantine Agreements for Cryptocurrencies" https://people.csail.mit.edu/nickolai/papers/gilad-algorand-...

londons_explore 2 days ago 2 replies      
In a network partition of any kind (either due to a physical network partition, or a "virtual" partition caused by incompatible client software), the entire network gets duplicated.

Someone who had 27 bitcoins before the split gets 27 of each type of coin after the split.

Every transaction will be incorporated into one or both of the copies. Some transactions will depend on other transactions, and therefore as time passes, even a small difference in the sets of transactions applied to each copy will snowball into the majority of transactions ending up in only one tree.

There is a vulnerability in the Bitcoin design here: transactions from one partition can be replayed on the other tree at any time, now or in the future. If someone sends you coins that only exist on one partition, but they later receive coins at the same address on the other partition, you can steal them by replaying the transaction.
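A toy model makes the replay hazard concrete (a hypothetical, drastically simplified UTXO set, nothing like Bitcoin's real data structures): because both forks share pre-split history, a signed transaction broadcast on one fork validates on the other unchanged.

```python
def apply_tx(utxos, tx):
    """Spend tx['inputs'] and create tx['outputs']; reject double spends."""
    if not all(i in utxos for i in tx["inputs"]):
        return False
    for i in tx["inputs"]:
        utxos.remove(i)
    utxos.update(tx["outputs"])
    return True

fork_a = {"coin1"}          # identical pre-split history...
fork_b = {"coin1"}          # ...on both sides of the partition
tx = {"inputs": ["coin1"], "outputs": ["coin1->you"]}
apply_tx(fork_a, tx)        # broadcast on fork A: accepted
apply_tx(fork_b, tx)        # replayed verbatim on fork B: also accepted
```

Nothing in the transaction itself ties it to one fork, which is why later hard forks (e.g. Bitcoin Cash) added explicit replay protection.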

throwaway2016a 2 days ago 2 replies      
TLDR: The blockchain has the tools to resolve this situation but it would be a huge mess.


The answers there are mostly right. If left to its own devices, the fork will be resolved when the country gains access to the network again.

The way it would typically be resolved is that the chain that has done the most work (in a Proof of Work coin) will "win"... in practice this means the one with the longest chain and most transactions.

When this happens, the transactions in the blocks that roll back are likely to be added back to the mempool (in memory list of unconfirmed transactions) in which case they will probably still be added to a block. So for most legitimate transactions they might not notice.
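A toy sketch of the two rules described above (the chain with the most accumulated work wins, and transactions confirmed only on the losing fork go back to the mempool). Field names are illustrative, not Bitcoin Core's data model:

```python
def chain_work(chain):
    """Total proof-of-work accumulated across a chain's blocks."""
    return sum(block["work"] for block in chain)

def resolve_fork(chain_a, chain_b):
    """Return the chain with more accumulated work (ties favor chain_a)."""
    return chain_a if chain_work(chain_a) >= chain_work(chain_b) else chain_b

def reorg_mempool(losing, winning, mempool):
    """Transactions confirmed only on the losing fork return to the mempool."""
    confirmed = {tx for block in winning for tx in block["txs"]}
    mempool.update(tx for block in losing for tx in block["txs"]
                   if tx not in confirmed)
    return mempool
```

In practice "longest chain" is shorthand for "most work": a shorter chain of harder blocks beats a longer chain of easier ones.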

However, there is a problem here. Adding hundreds of thousands of transactions to the mempool on many coins will cause huge problems.

Another problem is if the same output is spent on both forks, known as a double spend. In coins... each transaction has one or more inputs and one or more outputs. Outputs can then be used as inputs to other transactions. Each output can only be used as an input once.

If that happens, the transaction that was on the fork that lost will itself be lost since the network will reject it for trying to spend an already spent output.

Furthermore, if anyone travels from that country and connects to a network outside of it, they will eventually roll back and join the fork on that side of the partition, as that partition will inevitably end up with more "work done" than the one in the partition they left.

Now, if the country never gains internet access again, you effectively have two different coins. But you risk chaos as described above. One possible solution in that scenario is to "hard fork" and have everyone on one side of the partition install a new blockchain client. Then it's official: they are two separate coins.

patio11 2 days ago 0 replies      
As a practical matter, every Bitcoin-accepting business with a competent ops team will stop accepting transactions until Bitcoin's central authorities declare All Clear.

(This is phrased in a fashion which Bitcoiners will not appreciate but it is not incorrect. For precedent, see the hardfork around the 0.8 release.)

stale2002 2 days ago 0 replies      
There are some weird assumptions in your question.

Assumption 1: a government is able to shut down its entire internet, and block off all electronic communications.

This assumption is fine. There are multiple historical examples of governments doing this.

When a government does this, though, there is no network split. A network split is when you have 2 networks that are cut off from one another. The government "shutting off the internet" does not create 2 networks; it makes the population of the country have zero access to ANY network. Which means no split.

Which leads us to:

Assumption 2: A government is able to cut off access to the OUTSIDE internet, while also maintaining an INTERNAL network that can talk to each other, but not talk to the outside world.

This is basically impossible. There are no examples of governments being able to do this in any significant capacity.

Sure, there is some attempted internet censorship in countries like China, but the great firewall is extremely leaky. And even if it were 99% effective, 99% effective isn't good enough.

In order to partition the bitcoin network, you do not need to make it impossible for 99% of the population to get access to the outside world. You need to stop 100%, with no margin for error. This is because as soon as a SINGLE node is able to get access to the outside world, it can rebroadcast the information to all internal nodes.

shuntress 2 days ago 0 replies      
Something I think is very cool about Bitcoin is the fact that you can compare it directly to physical currency in a way that makes it easy to understand at a high level.

The block chain is essentially the same as a physical ledger that everyone (collectively) uses to confirm and record all transactions.

If everyone suddenly split (partitioned) into 2 separate groups with 2 separate ledgers, each 'everyone' (now that there are 2) would continue to use the ledger of their group.

I'm not sure if bitcoin makes any arrangements for 'merging' ledgers. My understanding is that among divergent chains the longest chain always 'wins' and any others are considered fraudulent.

So, once the two partitions are re-combined, when individuals reach out to 'everyone' and say "give me the latest version of the ledger" they would find the 2 competing ledgers and should choose to trust the one that is longer.

Ask HN: What do you want to see in Debian 10 (buster)?
347 points by lamby  4 days ago   305 comments top 83
pwdisswordfish 3 days ago 11 replies      
HEADLINE: Easier way to create local packages

DESCRIPTION: My first distro was Debian. Then, for a while, I used Arch. But it kept irritating me with its total disregard for backwards-compatibility (symlinking /usr/bin/python to python3), coarse-grained packages (want to install QEMU PPC without pulling in every other architecture as well? too bad!), lack of debug packages (good luck rebuilding WebKit just to get stack traces after a SIGSEGV), and package versioning ignoring ABI incompatibilities (I once managed to disable the package manager by upgrading it without also upgrading its dependencies... and later cut off WiFi in a similar manner). So, when I finally trashed my root partition a few weeks ago, I decided to use the opportunity to return to Debian.

One thing I miss from Arch, though, is having an easy way to create a package. It's simply a matter of reading one manpage, writing a shellscript with package metadata in variables and two-to-four functions (to patch up the unpacked source, check the version, build it, and finally create a tarball), and then running `makepkg`. And it will just download the source code, check signatures, patch it, and build it in one step; it even supports downloading and building packages straight from the development repository. I took advantage of it to create locally-patched versions of some software I use, while keeping it up to date and still under the package manager's control.
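For readers who haven't seen one, a minimal PKGBUILD of the kind described looks roughly like this (package name and URL are made up):

```sh
# Hypothetical minimal PKGBUILD: metadata in variables, build steps in functions.
pkgname=hello-local
pkgver=1.0
pkgrel=1
arch=('x86_64')
source=("https://example.com/hello-${pkgver}.tar.gz")
sha256sums=('SKIP')

build() {
  cd "hello-${pkgver}"
  make
}

package() {
  cd "hello-${pkgver}"
  make DESTDIR="${pkgdir}" install
}
```

Running `makepkg` in the same directory downloads the source, builds it, and produces a package installable with `pacman -U`.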

Contrast that with creating a .deb, where doing the equivalent seems to require invoking several different utilities (uscan, dch, debuild; though ) and keeping track of separate files like debian/control, debian/changelog, debian/rules and whatever else. All the tooling around building packages seems oriented towards distro maintainers rather than users. I'd love something that would relieve me of at least some of the burden of creating a local package from scratch.

DISTRIBUTION: unstable, I guess

ATsch 3 days ago 4 replies      
- HEADLINE: Simplify contributing to Debian.

- DESCRIPTION: TL;DR: Debian's web pages are hard to navigate and use, and it's very hard to see what's happening.

I contribute to FOSS projects whenever I have time and have been wanting to contribute to Debian, but the difficulty is off-putting. I'm used to searching for the program name and arriving at a portal page from which I can easily browse the source, see the current problems and instantly start interacting with the community. Unfortunately, contributing to Debian seems to require in-depth knowledge of many systems and arcane email commands. As a would-be contributor, this completely alienates me.

One reason is that Debian has many independent services: lintian, mailing lists, manpages (which btw are fantastic and give me hope), wiki, CI, alioth, the package listing, BTS, etc. To contribute, you need to learn most of them. For example, searching a package name gives me a page at packages.debian.org, but it's very hard to navigate or even discover the other services from there. I can't easily see if there are any lintian issues, critical bugs or current discussions. Additionally, I find most of the systems very hard to use (I still can't figure out the mailing list archives). Ideally, these services would be more tightly integrated.

Another big reason Debian is very hard to contribute to is that the main discussion takes place via mailing lists. I understand that many people enjoy working with them, but for light usage they are a big pain. Submitting and history live in completely different programs, there seems to be no real threading, volume is often high, and reading large amounts of email is a chore to me. A solution here would be an improved mailing list archive with options for replying integrated directly into the site.

- DISTRIBUTION: unstable


gub09 3 days ago 3 replies      
HEADLINE: Consolidation of Documentation; Removal of Outdated Documentation

DESCRIPTION: Any time you do a web search for anything regarding Debian, the search results include a huge amount of official but outdated information. Normally for Linux-related questions I refer to the amazing Arch wiki, but there are topics that are Debian-specific, and then sifting through all the detritus is a huge waste of time. There's a wiki, a kernel handbook, a manual, random xyz.debian.org pages, mailing lists, user forums, the Debian Administrator's Handbook...

Granted, it's a huge effort to clean all of that up, but perhaps there's a way to incorporate user feedback, so that pages can be marked as "outdated" by users, or updated by users (wait, there's a log-in page - does this mean I can edit wiki pages? Did not know that... :( ), or otherwise made more systematic.

In particular, it would be great to have more complete information on the installation process: which images to use (RC, ..., or weekly image?), how to put them on a USB stick (why does my 32GB stick now say it has 128GB?; you mean I can just copy the files to a FAT32-formatted drive?), what the options are (for hostname, is any name, a FQDN necessary?), etc. For every single clarification, there will be a hundred, thousand, ten thousand people who are helped; that seems like a worthwhile investment. Everyone is a beginner at the beginning, regardless of knowledge outside this specific domain, so why not make it easier.

All that said, have been using Stretch/testing for a few years, love it, love the Free/Libre Software ethos, love what you guys do, keep it up, thank you!

hsivonen 3 days ago 6 replies      
HEADLINE: Repurpose testing as a rolling release positioned for not-just-testing usage


There are users who'd like to use a non-corporate community distro but who don't need or want software to be as old as software in Debian stable. The standard answer is "use testing" (e.g. http://ral-arturo.org/2017/05/11/debian-myths.html), but 1) security support for testing is documented to be slower than for stable and unstable (https://www.debian.org/doc/manuals/securing-debian-howto/ch1...) and 2) the name is suggestive of it being for testing only.

Please 1) provide timely security support for testing and 2) rename testing to something with a positive connotation that doesn't suggest it's for testing only. I suggest "fresh" to use the LibreOffice channel naming.


ROLE: Upstream browser developer. (Not speaking on behalf of affiliation.)

Dunedan 4 days ago 3 replies      

Python 3 as default


Just to quote from the packaging manual:

> Debian currently supports two Python stacks, one for Python 3 and one for Python 2. The long term goal for Debian is to reduce this to one stack, dropping the Python 2 stack at some time.

The first step for that would be of course Python 3 as default Python version and I'd like to see that for buster, as Python 3 nowadays offers way more features than Python 2 and should be the choice for new Python projects.

JoshTriplett 4 days ago 1 reply      
HEADLINE: Switch to persistent journald by default

DESCRIPTION: Right now, Debian's default install includes rsyslog, and every message gets logged twice. Once in rsyslog on disk, and once in journald in memory. Let's turn on the persistent journal by default, and demote rsyslog to optional. (People who want syslog-based logging can still trivially install it, such as people in an environment that wants network-based syslogging. But that's not the common case.) This will make it easier to get urgent messages displayed in desktop environments as well.
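For reference, the change this comment proposes amounts to one stock journald setting (or, equivalently, creating /var/log/journal, which the default Storage=auto honors):

```
# /etc/systemd/journald.conf
[Journal]
Storage=persistent
```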

kebolio 3 days ago 3 replies      
HEADLINE: easier, simpler package creation and building

DESCRIPTION: on distros like arch, to a lesser extent void and even gentoo, writing package definition files (PKGBUILDs, ebuilds, templates) is relatively straightforward; in contrast, i don't even know where to start with finding, editing and building debian packages. i think they're built from source packages but beyond that i have no clue. i think visibility of documentation could help here, if not more radical changes to be more similar to the arch/gentoo workflow.

JoshTriplett 4 days ago 2 replies      
HEADLINE: Full audit of what's in "standard" and "important"

DESCRIPTION: There have been numerous detailed analyses posted to debian-devel that go through every package in standard and important and list out which ones shouldn't be. However, actual changes have only ever been made here on a point-by-point basis. (I've managed to get a dozen or so packages downgraded to "optional" and out of the default install by filing bugs and convincing the maintainer.) I'd really like to see a systematic review that results in a large number of packages moved to "optional".

This would include downgrading all the libraries that are only there because things depending on them are (no longer something enforced by policy). And among other things, this may also require developing support in the default desktop environment for displaying notifications for urgent log messages, the way the console does for kernel messages. (And the console should do so for urgent non-kernel messages, too.)

DISTRIBUTION: Start with unstable early in the development cycle, so that people can test it out with a d-i install or debootstrap install of unstable.

heftysync 3 days ago 4 replies      
HEADLINE: First class ZFS install support on Live CD.

DESCRIPTION: The license conflict between the open-source ZFS and the open-source Linux kernel means ZFS needs to be in contrib. Unlike a lot of other packages in contrib, ZFS doesn't rely on any non-free software. It just can't be in Debian main because of the conflict of licenses.

However, it would be nice if there was a way to have a more official path to ZFS on root for Debian. The current instructions require a fairly high number of steps in the ZFS On Linux wiki.

The ZFS On Linux wiki also lists a new initramfs file that has to be included so ZFS is supported. It seems odd that Debian couldn't include that as part of initramfs. I realize Debian doesn't want to necessarily promote non-free software, but this is free software that just conflicts with the GPL. It doesn't seem like it should be a second class citizen where you have to manually include files that should already be part of the package.

By the nature of the license conflict, it will be a second class citizen in that it can't be part of the normal installation package and you'll have to compile on the fly. However, it would be nice if there was a mode in the Live CD that could handle ZFS installation rather than doing it all manually.

DISTRIBUTION: currently mixture of testing/unstable but I'd like to use day(s) old sid (see other post).

miclill 3 days ago 2 replies      
- HEADLINE: Not start services by default

- DESCRIPTION: If I installed e.g. postgresql, I would prefer it not to start automatically by default. I would rather see a message: "If you want x to start on boot, type 'update-rc.d x enable'"

- DISTRIBUTION: (Optional) [stable]

- ROLE/AFFILIATION: (software dev, mostly web)

jancsika 3 days ago 3 replies      
- HEADLINE: transactional upgrades and package installs

- DESCRIPTION: This is a feature of the guix package manager. From their website:

"Each invocation is actually a transaction: either the specified operation succeeds, or nothing happens. Thus, if the guix package process is terminated during the transaction, or if a power outage occurs during the transaction, then the user's profile remains in its previous state, and remains usable."

They also do transactional rollbacks, but I'm not sure how realistic that is for the apt package system.

avar 3 days ago 2 replies      
HEADLINE: Better support for non-free firmware during installation

DESCRIPTION: Long-time Debian user here and free software supporter. One aspect where I don't have any practical choice for free software is my non-free iwlwifi firmware.

It's a huge PITA to install Debian like that when you don't have the fallback of a wired network. You provide "non-free" firmware packages, but these don't have the actual firmware! Rather they're dummy *.deb packages that expect to be able to download the firmware from the installer, which is of course a chicken & egg problem for WiFi firmware.

I end up having to "apt install" the relevant package on another Debian system, copy the firmware from /lib manually, copy it to a USB drive, then manually copy it over in the installer.

I understand that the Debian project doesn't want to distribute non-free firmware by default, but it would be great to be able to run a supported official shellscript to create an ISO image that's like the Stretch installer but with selected non-free firmware available on the image.

DISTRIBUTION: Stable on my server, testing on my laptop.

captainmuon 3 days ago 3 replies      
HEADLINE: Keep selected packages up-to-date on stable

DESCRIPTION: If you are using Debian, especially stable, you have to put up with outdated packages. This is especially a problem with browsers, although you do include security updates and track Firefox ESR, if I understand correctly. But things like WebKitGTK do not receive updates, and fall behind feature- and security-wise after a while.

I think keeping up-to-date versions and having a stable distribution is not per se a conflict. Stable means to me no breaking changes, no need for reconfiguration when I update. It shouldn't mean frozen in time.

It would be great if certain packages received frequent updates even in stable:

- packages that are not dependencies, have a good track record of backwards compatibility, and are unlikely to break

- packages that have to be updated because of security issues (which I think is already addressed now)

- or because of a fast moving ecosystem - even if it was safe, it is frustrating to use a very outdated browser component. I think many networked packages could fit in this category, e.g. Bittorrent or Tor clients, if there are protocol changes.

I think the situation has improved a lot (https://blogs.gnome.org/mcatanzaro/2017/06/15/debian-stretch...), and it would be great to have a stable basis in future and still have up-to-date applications on top as far as possible.

DISTRIBUTION: stable (but also others)

esjeon 3 days ago 1 reply      
- HEADLINE: Easier DEB repository creation.

- DESCRIPTION: Creating a custom remote/local/CD/DVD repo or a partial mirror is simply a nightmare, mainly because the package management internals are poorly documented. There are many tools developed to solve just this problem, but most of them aren't actively maintained. Aptly seems like the best right now, but is much too complicated and inflexible.

rahiel 3 days ago 4 replies      
HEADLINE: enable AppArmor by default

DESCRIPTION: AppArmor improves security by limiting the capabilities of programs. Ubuntu did this years ago [1]. I'd like to see profiles for web browsers enabled by default.

I think AppArmor is the right choice of default Mandatory Access Control for Debian because Ubuntu and security focused Debian derivatives like Tails [2] and SubgraphOS [3] have already committed to it.

[1]: https://wiki.ubuntu.com/SecurityTeam/KnowledgeBase/AppArmorP...

[2]: https://tails.boum.org/contribute/design/application_isolati...

[3]: https://subgraph.com/

sandGorgon 3 days ago 4 replies      
HEADLINE: a merger of Flatpak and snap

DESCRIPTION: a consensus on the next generation of package management. Please. We have had decades of fragmentation (not to mention duplicated innovation) around the RPM vs DEB ecosystem. Which is why it is still hard for beginners to want to use Linux - try explaining to anyone who comes from a Mac about rpm vs deb vs whatever else. Which is why they would pay for the Mac rather than use Linux ("it's too hard to install software").

It's not just my opinion - PackageKit (https://www.freedesktop.org/software/PackageKit/pk-intro.htm...) was invented for this reason, so you could have a GNOME Software Manager that works the same on every flavor of Linux. It's time to build this the right way.

You have an opportunity now - but again the camps are getting fragmented. We now have snap (Ubuntu/deb) vs Flatpak (Red Hat) all over again. And pretty strongly divided camps are beginning to form around them. It seems that the new rhetoric is snap for servers and Flatpak for desktops... which absolutely doesn't make sense.

Debian is the place to make this stand - systemd was adopted from Fedora despite Ubuntu making a strong push for something else. Debian made Ubuntu adopt systemd. I don't think anyone has anything but respect for that process. Debian 10 must take a stand on this.

CaliforniaKarl 4 days ago 3 replies      

Remove openssl1.0


Stretch made OpenSSL 1.1 the default openssl package. Unfortunately, OpenSSL 1.0 was kept around, since so many things depended on it.

There should now be enough time that a firm stance can be taken toward not allowing OpenSSL 1.0 in Debian Buster.

Once TLS 1.3 is finalized, OpenSSL 1.2 will be released with TLS 1.3 support. Not supporting TLS 1.3 in buster would (in my opinion) diminish Debian in other people's eyes. That means supporting OpenSSL 1.2, and having three OpenSSL packages (1.0, 1.1, and 1.2) is too much for one distribution.



rahimnathwani 3 days ago 2 replies      
HEADLINE: Easy way to use multiple screens with different DPI

DESCRIPTION: Many laptops (e.g. MacBook Pro) come with retina screens, but most of us use 'regular' monitors. Even after setting org.gnome.desktop.interface scaling-factor and playing with xrandr, it can be difficult or impossible to get a single external non-retina display set up in the right position and without one screen containing tiny text (or huge text).

Being able to make it work at all, and persist after a reboot, would be great. Having per-monitor scaling in the Display settings panel (or in 'Arrange Combined Displays') would be amazing.

DISTRIBUTION: I've experienced this with jessie. I haven't tried with stretch.

hsivonen 3 days ago 0 replies      
HEADLINE: Provide rolling compiler packages in stable


There are users who simultaneously want to get their infrastructural packages like compilers from their distro and want to build fresh upstream application releases from source.

This leads to pressure for Linux apps and libraries to be buildable using whatever compiler version(s) that shipped in Debian stable, which amounts to Debian stable inflicting a negative externality on the ecosystem by holding apps and libraries back in terms of what language features they feel they can use.

To avoid this negative externality, please provide the latest release (latest at any point in time, not just at the time of the Debian stable release) of gcc, clang, rustc+cargo, etc. as rolling packages in Debian stable, alongside the frozen version used for building Debian-shipped packages, so that Linux apps and libraries aren't pressured to refrain from adopting new language features as upstream compilers add support.

(Arguably, the users in question should either get their apps from Debian stable or get their compilers from outside Debian stable, too, but the above still seems a relevant concern in practice.)


ROLE: Upstream browser developer. (Not speaking on behalf of affiliation.)

Dunedan 4 days ago 2 replies      

100% reproducible packages


While having over 90% of packages reproducible already is awesome, 100% would be even better. The stretch release announcement describes best why:

> Thanks to the Reproducible Builds project, over 90% of the source packages included in Debian 9 will build bit-for-bit identical binary packages. This is an important verification feature which protects users from malicious attempts to tamper with compilers and build networks.

heftysync 3 days ago 1 reply      
HEADLINE: Better package discovery

DESCRIPTION: There are a ton of packages in Debian. I sometimes browse through all of the packages looking for some gem that I didn't know about before. It's a time intensive process and I don't have any input into my decision other than reading the description. Sometimes I'll install it immediately. Other times I'll check out the website to see if it's still maintained (or if there's a better alternative). It's all a very manual process.

popcon doesn't fill this void. Popcon tells me what packages are popular across all users. I'm more interested in what a subset of users with similar interests or preferences would install. Or maybe I want to see what it's like to live in someone else's shoes. For instance, maybe I'm learning a new programming language and I want to setup my environment similar to an experienced user so I have all of the popular libraries already installed.

It would be nice if there was a better way to discover packages that are relevant to you. Perhaps you could add this feature as a way of getting people to install popcon? For example, you could say if you install popcon, then it will upload your set of installed packages and make recommendations for you.

If people are able to add metadata about themselves (e.g. I'm an expert Emacs user and I'm a golang developer), then you could use that plus their package list to make recommendations. I could say "show me what packages golang developers tend to install". Or you could say "for someone with a package list similar to mine, find out what packages are popular that I'm missing".

beefhash 4 days ago 3 replies      

First-class init that is not systemd


It's no secret that systemd is highly controversial, even spinning off a fork called Devuan. It might help reunite the community to include one alternative init system that is, fundamentally, a first-class citizen in the Debian ecosystem.

"First-class" implies that the user is given a choice on new installations in a specified prompt. The default should be the option "systemd (recommended)".


buster+1 given the expected effort


Individual and hobbyist system administrator

jopsen 3 days ago 6 replies      
HEADLINE: Lower barrier for contributors

DESCRIPTION: Have a git repo for each package with a simple issue tracker (like GitHub/GitLab), a flow for accepting pull requests, and automated CI. Also move away from message boards and IRC to more user-friendly tools.

Currently, it's too hard to report bugs, inspect Debian source packages, propose fixes, etc. The overhead of making a simple contribution is too high. Note: this isn't a Debian-specific issue; many open source projects have old infrastructure.

vacri 3 days ago 0 replies      
HEADLINE: Reconcile and refresh the Debian Wiki

DESCRIPTION: The wiki is frequently stale or incomplete. A lot of people get information much more readily out of a wiki than mailing lists. Like me, for example :) Mailing lists have a very high latency (often infinite) and can be difficult to search.

For example, say you want to host your own apt repo to hold a custom package; this page is not very clear: https://wiki.debian.org/DebianRepository/Setup - how do you choose which of all the listed software options to use? It's a reasonable software overview, but not great for helping people actually get a repo set up.

Arch has a fantastic wiki that's clear and concise. It's also more readable (MediaWiki) than Debian's format, though I understand Debian aims to work as bare HTML for greater device compatibility.

DISTRIBUTION: Primarily Stable, with later sections for non-stable if needed.

ROLE: sysadmin

brian_herman 4 days ago 1 reply      

Secure Boot in Stable


UEFI Secure Boot Support in Debian.

Debian does not run on systems with Secure Boot enabled.




I work at an insurance company and all of our development computers and most of our servers run debian jessie.

We will probably upgrade to Debian 9 very soon! Thanks for all the hard work on debian Iamby!

EDIT: grammar and formatting

hd4 3 days ago 0 replies      
- HEADLINE: An install-time option to set up a barebones WM

- DESCRIPTION: The installer should offer an option to install a simple WM, like i3 or awesomewm, in the way that there is an option in the minimal installer to install a DE like Xfce or GNOME. Bonus points if you make it aesthetically pleasing to some extent.

- HEADLINE: Kernels in repo which do more than the mainline/default kernel

- DESCRIPTION: I'm thinking specifically of the patches by Con Kolivas, but any other useful pre-compiled kernels in the repo would be great too. It would save me having to figure it out by myself, and I'm sure there are many who would welcome the availability of pre-patched kernels, better I/O schedulers, etc.

- HEADLINE: Look into more optimisation (like Solus)

- DESCRIPTION: Solus (www.solus-project.com) does some optimisation on their distro that would be good to have in any other distro.

- ROLE/AFFILIATION: Infrastructure programmer for multinational corp

ghostly_s 3 days ago 1 reply      
HEADLINE: More user-friendly install process

DESCRIPTION: I recently had to reinstall my Debian system for the first time in a while, and was struck by how user-unfriendly the installer still is compared to many of the alternatives. I don't think it's necessarily a problem that it's ncurses, but it could use some more explicit hand-holding. I remember one point where I needed to select some options from a list and there was no indication of what keystroke was required for selection (I think I needed to hit '+'?). I'm pretty familiar with command lines and curses-type UIs, and this was unintuitive for me; I can only imagine how frustrating it might be for a more desktop-oriented user.

I also recall a very confusing UI paradigm where the installer steps are a modal view and there's a hidden 'overview/master menu' you can back out into at any time, and it's not clear to me how those two modes are related and what state it leaves your installation in if you back out and then jump into the installation at a different step.

Generally the explanatory text is quite good at telling you what decision needs to be made, and providing necessary info to research that decision if necessary, but how you make those decisions I think could still be improved.


keithpeter 3 days ago 1 reply      
- HEADLINE: Continue to provide a mechanism for offline installation of a large selection of the repository.

- DESCRIPTION: Debian is the only distribution that I know of that provides .iso images from which you can install the operating system and subsequently install a wide range of (libre) software. In addition, Debian provides update .isos. These affordances make installing and maintaining a desktop computer without an Internet connection, or with a slow and expensive connection, viable. I hope that Debian will continue to provide this affordance as we transition from optical disks over the next few releases.

- DISTRIBUTION: All Debian distributions.

- ROLE/AFFILIATION: End user (desktop)
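For readers unfamiliar with the offline workflow described above, a minimal sketch (the image name and mount point are illustrative; apt-cdrom also works with a physical drive):

```shell
# Make a DVD image available to apt without any network connection.
sudo mount -o loop debian-9.0.0-amd64-DVD-1.iso /media/cdrom
sudo apt-cdrom -m -d /media/cdrom add   # -m: already mounted; records the disc in sources.list
sudo apt-get update
sudo apt-get install some-package       # resolved from the disc, no network needed
```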

Dunedan 4 days ago 2 replies      

Wayland as default display server


X11 is aging, so it's time to switch to Wayland. It'd be cool if buster would ship with Wayland as default display server.

aleden 3 days ago 2 replies      
- HEADLINE: port pledge(2) from OpenBSD

- DESCRIPTION: Debian has been a great source of innovation and leadership within the OSS world. Make the next big move by adopting pledge(2) from OpenBSD, becoming the first major Linux distribution with such a mandatory security feature. There is little hassle in making programs use it, and the LOC in the kernel is tiny compared to, say, SELinux. See [1] for more details.

[1] http://www.openbsd.org/papers/hackfest2015-pledge/mgp00001.h...

- DISTRIBUTION: Any and all!

- ROLE/AFFILIATION: CS program analysis researcher with MIT/CSAIL.

jl6 4 days ago 4 replies      
HEADLINE: Out of the box support for being run in VirtualBox.

DESCRIPTION: I tested the stretch release candidates in VirtualBox, and while I did eventually get them working, I had to follow the instructions in several bug reports from across both the Debian and VirtualBox project websites.

I don't mind following instructions, so if there is a reason why this can't be achieved seamlessly with zero configuration, then I would at least like to see some official instructions prominent on the Debian website.

COMMENT: Debian is awesome, thanks for everyone's hard work!

TekMol 3 days ago 1 reply      
HEADLINE: Support for more wifi hardware.

DESCRIPTION: The #1 reason why I don't use Debian on the desktop is missing wifi support during installation. I wish Debian could write and include free wifi drivers for all recent laptops.

DISTRIBUTION: Debian 8 on the server. Mint Mate on the Desktop.

ROLE/AFFILIATION: Founder and CEO of a tech startup.

HateInAPuddle 1 day ago 0 replies      
HEADLINE: A way to bootstrap something slimmer than "minbase" for building images.

DESCRIPTION: When building images (especially container images), there should be a way to only install the bare minimum to make apt work. No init system, no bash, no filesystem utilities, nothing. Even `debootstrap --variant=minbase` is overkill in that regard.

One way would be to create an option for debootstrap that accepts a list of desired packages (similar to pacstrap from Arch) instead of using "--variant".
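For comparison, roughly what is possible today versus the proposed interface (the debstrap command and its flag are hypothetical, not an existing tool):

```shell
# Today (hedged): minbase, then trim further by hand after the fact.
sudo debootstrap --variant=minbase --include=apt \
    stretch /srv/chroot/min http://deb.debian.org/debian

# Proposed, pacstrap-style (hypothetical syntax):
#   debstrap --packages=dpkg,apt stretch /srv/chroot/min
```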

kpcyrd 3 days ago 0 replies      
HEADLINE: Decent rust support

DESCRIPTION: On rolling release distros there's currently a vim version that ships rust syntax highlighting, rustc and cargo. This is pretty much all you need to get started with rust development. Debian stable currently ships rustc, but lacks cargo, which is rather essential if you actually want to compile your project on a debian server. The vim-syntax thing would be nice to have. :)
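As an interim workaround, the upstream toolchain can be installed per-user with rustup (a sketch; assumes curl and network access, and bypasses the Debian archive entirely):

```shell
# Install rustc + cargo from upstream into ~/.cargo (not apt-managed).
curl https://sh.rustup.rs -sSf | sh
source "$HOME/.cargo/env"
cargo --version   # confirms cargo is now on the PATH
```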

DISTRIBUTION: stable/stable-backports

miclill 3 days ago 0 replies      
- HEADLINE: Continue being an awesome distribution

- DESCRIPTION: Continue with the values that make Debian great. E.g.:
https://www.debian.org/code_of_conduct
https://www.debian.org/social_contract
https://www.debian.org/intro/free

- DISTRIBUTION: (Optional) [stable, testing, unstable, or even a Debian derivative]

- ROLE/AFFILIATION: (software dev, mostly web)

markvdb 3 days ago 1 reply      
HEADLINE: nginx-rtmp support in Debian

DESCRIPTION: At https://fosdem.org we use the nginx RTMP module intensively. It seems to be becoming a de facto standard when an in-house streaming server is preferred over an external streaming platform. It combines excellently with ffmpeg, the recently packaged voctomix, and several components of the GStreamer framework to create an excellent FOSS video streaming stack. Some DebConf video people seem interested too, and there has been positive interest from the Debian nginx packagers. Unfortunately, there is no clear way forward yet.

Hopefully, Buster opening up might create some opportunities to get things going again!

SEE ALSO: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=843777#23

DISTRIBUTION: Debian 10 stable & Debian 9 backports.

ROLE/AFFILIATION: http://fosdem.org staff (= all year round volunteer), responsible for video streaming & recording since FOSDEM 2017

yaantc 3 days ago 1 reply      
HEADLINE: Smarter handling of icons files during updates

DESCRIPTION: This is a nitpick/wishlist item, really. I started using Stretch while it was in testing, and noticed that most updates would download rather large sets of icons (a few MB). They look like archive files of icons, and I guess that if any change happens the whole set is downloaded again. This wasn't the case in Jessie.

When on a slow Internet link, it can definitely slow down upgrades. It would only be noticeable for Testing/Unstable, as otherwise these sets of icons would not change much. But when regularly updating testing, often these icons sets were a significant part of the downloaded data.

It could be nice to make updating those icons optional, for people behind slow links. Alternatively, handling them as a versioned list (text, easy to diff efficiently) + independent files could make their update more efficient than compressed archive files.

Again, just a nitpick/wishlist item. It's just that I haven't chased down where this comes from (I guess it's for GUI package management like Synaptic? TBC) and don't know where this could be reported. You just gave me the opportunity ;)

DISTRIBUTION: Testing/Unstable (any version with frequent changes)

vinlinux 3 days ago 0 replies      
HEADLINE: Regression fixes from upstream

DESCRIPTION: GCC 6.4 will be released soon (July). I wish Debian would pick up all the regression fixes this update brings (under the new numbering convention, version 6.4 means no new features and no breaking changes, only fixes). Same for CUDA 8.0.61 (already available for ~5 months), which is a maintenance update after version 8.0.44, the one available in Stretch.

I'm saying this because Jessie never got the latest bug-fix release (4.9.4) for the 4.9 series of GCC, not even in the backports (it still offers 4.9.2 instead). I wish there were a policy that allowed regression fixes from upstream to be ported with the same priority as security fixes. GCC and CUDA are only examples; the same scheme would be applicable to any other package as well.

In my view, this would foster Debian adoption on desktops at a higher level. If this can't be done for the current Debian Stable, I hope my (and other people's similar) concerns will be taken into account in the future. As a developer, I care about this level of support. We all love Debian, we'd just like to make it better. Thanks.


rkv 4 days ago 1 reply      
HEADLINE: Stabilize dpkg C library (libdpkg)

DESCRIPTION: Any plans to go ahead and stabilize the dpkg library for buster? Having access to a stable package-management library is essential in our software, i.e. being able to verify package signatures and query the database for files, both of which are not supported today.


apenwarr 3 days ago 1 reply      
Eliminate all the scripts that go into a package, moving them to runtime. This is the only way to eliminate instability caused by buggy scripts that then prevent upgrades.

Also get rid of all interactivity during install and upgrade. It's deadly for managing big fleets.
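For fleets, the usual mitigation today is to force every prompt to its default (a sketch; the dpkg options shown keep existing conffiles on conflict):

```shell
# Fully non-interactive upgrade: debconf prompts take their defaults,
# and dpkg keeps the locally installed conffile on any conflict.
export DEBIAN_FRONTEND=noninteractive
apt-get -y \
    -o Dpkg::Options::="--force-confdef" \
    -o Dpkg::Options::="--force-confold" \
    dist-upgrade
```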

miclill 3 days ago 1 reply      
- HEADLINE: Make it easier to use newer software.

- DESCRIPTION: I think there are lots of ways. Things like Flatpak look promising, but also Docker. It would be nice if there were fewer papercuts when using those things. I also dream about a command named "playground [name]" which instantly gives me a shell where I can try stuff without interfering with anything else. When finished, I can just "playground remove [name]". I know it's possible today, but it's a bit of a hassle.

- DISTRIBUTION: (Optional) [stable]

- ROLE/AFFILIATION: (software developer, mostly fullstack webdev)

coma_ 3 days ago 0 replies      
HEADLINE: better keychain integration/mechanism for handling PGP/SSH

DESCRIPTION: It would be great to have a central keychain where keys (SSH, PGP) could be unlocked on a per-session basis. Think of a merge between gpg-agent (which wouldn't scream about being hijacked every other day) and ssh-agent (which wouldn't be shell-specific, and would handle multiple keys without having to manually run):

> eval $(ssh-agent -s)
> ssh-add /path/to/key1
> ssh-add /path/to/key2
> ...


As a desktop user, what I would like is, on a session basis, when I first provide the passphrase for a given key (when I ssh into a server from the CLI or decrypt a PGP encrypted email from Thunderbird [with enigmail] for instance) have a keychain securely unlock these keys for the duration of the session (that is, until I lock the screen, close the lid or log out).
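Part of this exists already: gpg-agent can stand in for ssh-agent. A sketch (GnuPG 2.1+; the cache TTL value is just an example):

```shell
# ~/.gnupg/gpg-agent.conf
#   enable-ssh-support
#   default-cache-ttl-ssh 10800

# In the shell profile: point SSH clients at gpg-agent's socket.
export SSH_AUTH_SOCK="$(gpgconf --list-dirs agent-ssh-socket)"
ssh-add /path/to/key1   # key stays unlocked for the session
```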

mtgx 4 days ago 0 replies      
Headline: Stronger security enabled by default

Description: More kernel self-protection (KSP) features enabled by default, perhaps even Firejail pre-installed, Wayland as the default along with flatpaks, etc.

justin_vanw 3 days ago 0 replies      
- HEADLINE: Proper support for installations on GPT partitions

- DESCRIPTION: I have tried installing Debian many times on various machines and have had huge trouble getting the install USB stick to boot properly (or, in the end, getting the bootloader to install). Ubuntu installs flawlessly on the same machines.

- DISTRIBUTION: (Optional) [stable, testing, unstable, or even a Debian derivative]

- ROLE/AFFILIATION: (Optional, your job role and affiliation)

warp 3 days ago 1 reply      
HEADLINE: faster release cycle

DESCRIPTION: In the past I've often run into stuff in Debian just being too old for my needs. I don't need the bleeding edge, but two years is a really long time. I switched to Ubuntu a few years ago, but not being a fan of Canonical, it would be nice if I could come back to Debian.


ROLE: full stack web developer

anguis_fragilis 1 day ago 0 replies      
- HEADLINE: Aggressive Optimization of PNG Files

- DESCRIPTION: PNG image files use too much space in Debian's source tree, in users' installed systems, and on Debian's website.

All metadata that does not affect display should be removed, and each file should receive a complete lossless recompression run with an optimizing tool.

Just try: find / -name "*.png" 2>/dev/null | xargs -d '\n' optipng -preserve -o7 -zm1-9 -strip "all"

A byte here, a byte there, and suddenly your system is several MB smaller and actually runs faster.

Upstream should be made aware of this.

Thank you.

heftysync 3 days ago 0 replies      
- HEADLINE: Bring back the standard (console) Live CD

- DESCRIPTION: Jessie had a standard Live CD. While the HTML still refers to this flavor, it is not found on any mirror that I checked for Stretch.

I have to use the live CD to install ZFS on Root. I would prefer to not bother downloading or booting a desktop environment when I don't need one.

I don't know why it was removed, but the name was always strange to me. Name it "textonly" or "expert" or something so people don't choose it by mistake; "standard" sounds like it is the recommended image.


feikname 4 days ago 1 reply      

WiFi-direct GUI


Using WiFi Direct on most Debian-based distros is a hassle, requiring a lot of manual terminal work. A GUI for WiFi Direct in the network settings would make connections easier and faster.

shmerl 3 days ago 0 replies      
- HEADLINE: improve transitions and out of the box usability of rolling Debian variants.

- DESCRIPTION: Since Debian testing/unstable are often advertised as targeted at desktop usage, they could benefit from more focus on preventing breakage. I know it's somewhat counterintuitive to expect stability from an "unstable" or "testing" variant, but at the same time Debian would benefit from shedding the stigma of being a server-only distro. Having a robust out-of-the-box desktop experience (which is not falling behind) is the goal here.

In the period between Jessie and Stretch, testing had a number of breakages in my KDE desktop. Packages fell out of sync (e.g. KDE Frameworks and Plasma packages weren't properly aligned, because some packages were stuck in unstable after failing to build on some archs), causing all kinds of instability issues. It has lately become a bit better, but I think desktop stability could get some more love, especially for the most popular DEs like KDE.

And if neither testing nor unstable fits that role, maybe another branch should be created for it?

- DISTRIBUTION: Debian testing / unstable.

- ROLE/AFFILIATION: Programmer, Debian user.

chungy 2 days ago 0 replies      
HEADLINE: Promote x32 to official status and recommend it as the default install.

DESCRIPTION: The Linux x32 ABI, for the most part, combines the best of both x86 worlds: the lower memory footprint of 32-bit software (and likewise, the 4 GiB per-process limit that goes with it) by keeping pointer sizes and data types the same as i386, while still allowing applications to take advantage of the expanded registers, instructions, and features of x86_64 CPUs. For most systems that aren't database servers, this can yield a large reduction in memory footprint and greater performance as a result. Debian has had an unofficial x32 port for years, but it is presently difficult to install and get running.


qwerty987 3 days ago 0 replies      
HEADLINE: kindly support Secure Boot as soon as possible (for all of us who are dual-booting Debian and MS Windows)
tmaly 1 day ago 0 replies      
HEADLINE: better documentation and cookbook for systemd

DESCRIPTION: I know systemd is very controversial, but if we are going to be stuck with it, I would like to see more documentation and examples.

allan_wind 3 days ago 0 replies      
- HEADLINE: thunderbolt, amt firmware loader

- DESCRIPTION: The last laptop that I bought from Lenovo had a Thunderbolt port, and I had to use that port to get 3 x 4k monitors to work. The hardware shipped with non-functional firmware, and the only way to upgrade the firmware was by booting Windows. I was not sure whether other devices also had old firmware, so I spent hours waiting for a full OS upgrade. Dell was working on a Thunderbolt firmware loader at the time; I'm not sure if they have released it by now.

Similar situation with the AMT firmware security issue (CVE-2017-5689). The only way to upgrade (afaik) is by running a particular Windows installer.

It seems really dumb having to buy a throwaway drive just to be able to boot Windows to upgrade firmware. Obviously, I lay this at the feet of the hardware vendor. I was going to suggest pre-installed Debian, but Lenovo would ruin that with pre-installed crapware.


- ROLE/AFFILIATION: entrepreneur

sherr 3 days ago 0 replies      
HEADLINE: Finalise LXD support

DESCRIPTION: It would be great if Debian finished its LXD (container hypervisor) packaging and got it up to a decently complete level (on par with Ubuntu).


hultner 3 days ago 1 reply      
- HEADLINE: Run testing or unstable containers with ease on stable

- DESCRIPTION: I would absolutely love a well-supported container system for running testing/unstable in a container. I feel that Docker requires a lot of upfront work with mixed results.

We often develop software using packages from the next Debian version (such as Python 3.6), and these packages aren't always available in backports or otherwise outside of testing; in those cases it would be really nice to easily boot up this software in a container.
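One approach that works on stable today, using only archive packages (a sketch; paths and suite names are illustrative):

```shell
# Build a throwaway sid root and enter it as a container.
sudo apt-get install systemd-container debootstrap
sudo debootstrap unstable /var/lib/machines/sid http://deb.debian.org/debian
sudo systemd-nspawn -D /var/lib/machines/sid   # drops you into a sid shell
```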


- ROLE/AFFILIATION: Lead Product Developer at Cetrez

heftysync 3 days ago 0 replies      
HEADLINE: More fine grained meta packages as community recommendations.

DESCRIPTION: There are a few Debian metapackages, but they are really broad. Example: it would be great if a few developer-leaning packages were grouped into one metapackage.

For instance, I always install etckeeper, apt-listchanges, and apt-listbugs. I think anyone following testing or unstable would want to install those, and I'm not aware of any real alternatives to them. I can't imagine using unstable without apt-listbugs to warn you when there are high-priority bugs in packages that have already been uploaded.
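Until such metapackages exist, one can roll a personal one with equivs (a sketch; the package name and generated filename are illustrative):

```shell
sudo apt-get install equivs
equivs-control testing-essentials.ctl
# Edit testing-essentials.ctl:
#   Package: testing-essentials
#   Depends: etckeeper, apt-listchanges, apt-listbugs
equivs-build testing-essentials.ctl
sudo dpkg -i testing-essentials_1.0_all.deb   # exact filename may vary
```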

DISTRIBUTION: mixture of testing/unstable.

Ianp5a 3 days ago 1 reply      
HEADLINE: Installer easy option to separate OS / and /home partitions.

DESCRIPTION: It is often recommended to separate the OS partition from the user data partition containing /home. This should be available as an easy option for non-IT users. If one partition exists, a recommended split size (in MB) is offered as the default. If two partitions exist, they are checked for OS files and home files, so the user sees which one will be overwritten. This is convenient and a safety net for most users, and a lifeline for non-IT people who may not know the recommendation or how to proceed.
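The installer's preseed mechanism can already express such a split; a heavily hedged sketch of a partman-auto recipe (syntax reproduced from memory, sizes in MB as min/priority/max; verify against the installation guide before use):

```shell
d-i partman-auto/method string regular
d-i partman-auto/expert_recipe string          \
  root-home ::                                 \
    20480 25600 30720 ext4                     \
      $primary{ } $bootable{ }                 \
      method{ format } format{ }               \
      use_filesystem{ } filesystem{ ext4 }     \
      mountpoint{ / } .                        \
    10240 51200 -1 ext4                        \
      method{ format } format{ }               \
      use_filesystem{ } filesystem{ ext4 }     \
      mountpoint{ /home } .
```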

pksadiq 3 days ago 1 reply      
- HEADLINE: Try to include gtk4 in Debian 10 GNU/Linux

- DESCRIPTION: It would be nice if the Debian testing freeze were delayed until a sufficiently stable version of gtk4 is included in testing (and thus eventually in the next stable).

heftysync 3 days ago 0 replies      
HEADLINE: Latest elixir in Debian

DESCRIPTION: Debian unstable still has elixir 1.3.3. It looks like the "official" path forward is to add Erlang Solutions as another apt repository and install packages from there. However, this feels wrong to me as a user. I want to get packages from Debian.

I can't remember which distribution it is, but IIRC one of the other ones has developers upload builds from their personal machines and they are signed with GPG. I don't like this because it is opening yourself up to problems. Perhaps someone uploads a malicious binary build. Or perhaps their developer machine is compromised and someone else uploads it for them or infects their upload.

All of this would go away with 100% reproducible builds in Debian and when it builds on Debian infrastructure. That's not the case when Erlang Solutions is setup as the provider.

I realize this is a minor point as few people will install it, but I was surprised that other distributions include the latest Elixir and Debian does not. The latest is 1.4.4, and I couldn't find anything related to 1.4.x in the upload queue or bug reports. It seems like the package maintenance has been outsourced to Erlang Solutions.

brimstedt 2 days ago 1 reply      
Headline: better install options

Description: in the Debian installer you can choose a few standard setups. The default options are a bit crazy to me, and I also miss a lot of packages by default.

A bit of cleanup would be nice (iirc you can select "database server", for example; that'll give you my/maria).

It would be nice if you could specify a code like "xxx/yyy" that would resolve to a public repo of predefined templates, in which you could also define your own.

I, for one, would define a server, workstation, and laptop setup. The server setup would include sshd, screen, etc.

amorphid 3 days ago 3 replies      
HEADLINE: list apt package dependencies

DESCRIPTION: something like 'apt-get deps <package>' returning a list of all dependencies for a package. This would be super handy when trying to install a standalone package file on a system where the dependencies aren't already present.
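Part of this already exists (hedged): apt-cache prints a package's direct dependencies, and the separate apt-rdepends package walks the tree recursively.

```shell
apt-cache depends curl    # direct Depends/Recommends of the curl package
apt-rdepends curl         # transitive dependency closure (separate package)
```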

johntadd 3 days ago 0 replies      
- HEADLINE: Kernel and Desktop

- DESCRIPTION: This request might not be considered in the short term, or ever, but personally I hope it can be done.

For the desktop, I wish there were a Debian-defined environment or set of interfaces that integrates transparently with desktop components like the power manager. Then, when switching between desktop environments or window managers, I wouldn't need to tune specific settings (particularly in non-Debian ways) to get things working.

For Kernel, I would like to see integration with seL4.

- ROLE/AFFILIATION: Software Engineer

brimstedt 1 day ago 0 replies      
HEADLINE: Disable pcspkr by default

DESCRIPTION: Please disable pcspkr by default :-)

coma_ 3 days ago 1 reply      
HEADLINE: improve website/doc to ease install process

DESCRIPTION: installing Debian should be a straightforward process for average Joes and Janes; that's not currently the case. The process of acquiring the proper ISO and getting it onto a bootable USB stick/SD card is overly complicated (because the information is hidden, missing, or incomplete).

As an average Joe, when you visit debian.org there is no obvious place to click to get the latest stable ISO. The default (in a tiny box in the upper right corner of the homepage) is a net-install ISO. Net-install ISOs are sub-optimal for users who require special firmware for their network card (dvd-1 amd64 should be the default).

You should consider that the default install process for most desktop users will consist of installing Debian from a USB stick on an amd64 system. Once the right ISO is properly put forward, you should provide clear and detailed info on how to transfer the ISO to the USB stick and make it bootable.

Etcher is a free/libre, cross-platform, user-friendly, straightforward GUI (over "dd", iirc) that takes care of making a bootable drive. It should be promoted and made part of the install docs.

Same goes for SD-card installs: many single-board computer enthusiasts (who are not necessarily tech-savvy) give up on making a bootable SD card themselves and simply buy a pre-installed one, because the information isn't provided in a straightforward fashion on the Debian website and they are not offered a relatively simple process.

No, using "dd" from the CLI isn't simple: as a Joe you must deal with many un-obvious concepts (wait, what does it mean that "the volume is mounted"? How do I unmount it? How do I identify the proper volume? Fuck, I unmounted the drive and now it won't auto-mount anymore! File system? What are you talking about? MBR? DOS compatibility...)
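For completeness, this is roughly the dd incantation the docs would need to spell out, with all the caveats above (the device name is an example; writing to the wrong device destroys data):

```shell
lsblk                                     # identify the stick, e.g. /dev/sdb
sudo umount /dev/sdb1                     # unmount any auto-mounted partition
sudo dd if=debian-9.0.0-amd64-DVD-1.iso of=/dev/sdb bs=4M status=progress
sync                                      # flush caches before removing the stick
```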

ROLE/AFFILIATION: electronics engineer, based in Europe, involved in local initiatives to promote free software (LuGs, crypto parties, hacker spaces,...)

Thank you for your awesome work, I wouldn't be involved in promoting free/libre operating systems if it wasn't for Debian (a great community project that cares for users rights/freedoms and provides an overall simple desktop experience).

bodhammer 3 days ago 1 reply      
- HEADLINE: Support for LXD (And LXC 2.0)

- DESCRIPTION: LXD isn't a rewrite of LXC; in fact, it builds on top of LXC to provide a new, better user experience. Under the hood, LXD uses LXC through liblxc and its Go binding to create and manage containers. It's basically an alternative to LXC's tools and distribution template system, with the added features that come from being controllable over the network.


- ROLE/AFFILIATION: Enthusiast and wanna be developer

CiPHPerCoder 3 days ago 0 replies      
I'd like to see packages be handled differently to where nothing is forked.

Instead of pinning to, say, PHP 7.1.5, pin to 7.1 and stop backporting fixes. It's okay to have 7.1.6.

acd 3 days ago 1 reply      
Headline: Light Debian desktop theme

Description: Debian should make it easy to set the desktop theme to a light color theme. Right now it is quite difficult for users to change the desktop look and feel. Please also do usability testing of changing desktop settings. The current dark color scheme does not suit all users; offering both a dark and a light theme would cover more users.

Many thanks to all the Debian developers for creating a great distribution!

pette 2 days ago 0 replies      
Too late. Switched everything to Devuan.

No systemd (and pulseaudio if desktop) for me.

amorphid 3 days ago 1 reply      
HEADLINE: Create local copy of remote repos

DESCRIPTION: Personally I'd like something like 'apt-get update --local' which pulled down a full copy of every remote repo. That'd be super handy for something like a build machine, and it'd reduce the need to install & maintain an Aptly repo.
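Existing tooling approximates this (hedged; the mirror.list fragment is illustrative): apt-mirror syncs configured suites to disk, with debmirror as a common alternative.

```shell
sudo apt-get install apt-mirror
# /etc/apt/mirror.list (fragment):
#   deb http://deb.debian.org/debian stretch main contrib
sudo apt-mirror    # syncs into /var/spool/apt-mirror by default
```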

heftysync 3 days ago 1 reply      
HEADLINE: Day(s) old sid as a rolling distro

DESCRIPTION: I think I represent a number of users. We want to use unstable as a rolling distribution, but we don't want to run into every edge case. Testing doesn't update fast enough and doesn't get security fixes as quickly. There's no middle ground between the absolute bleeding edge and the too-conservative testing.

I used to use unstable, but there's that annoying race condition where I could upgrade at exactly the wrong time, when brand-new (broken) package versions had been uploaded and not enough time had passed for even the first round of bug reports. I'd like a day's safety buffer so apt-listbugs has a chance to warn me about catastrophic bugs.

Setting up a true rolling distribution may be too much work for Debian. Actual Debian developers will be running unstable. It would be nice if there was a middle ground for non-Debian developers who want a rolling distribution but don't want to get hit by every edge case in sid.

I think a nice compromise would be to cache the sid packages for a day (or two) and set that up as another branch. A full day of possible bug reports from people on bleeding edge sid would give us a chance at missing the catastrophic edge cases while still being very current.

I think this could encourage more Debian developers. If I wanted to join Debian as a DD, I would need to have an unstable installation somewhere. It wouldn't be my daily driver because I don't want to run into those breaking edge cases. If my daily driver was day old sid, I could have another machine / VM that runs sid and would almost be identical to what my daily driver is running. It's not like testing where packages could be entirely different due to the delay in migrating.

Unlike testing, day old sid would migrate all packages even if there are release critical bugs. There would be no waiting period beyond the strict day limit. If there is a catastrophic edge case, people already on day old sid using apt-listbugs would be able to avoid it. New installations would hit it but you could warn users (see below).

If you make apt-listchanges and apt-listbugs required packages for day old sid, then people could be informed about what broke the previous day.

It would be nice to integrate apt-listbugs into an installer for day old sid and fetch the latest critical or high priority bugs before the installation. A new user could then decide if that's a good day to install. Or you could have a simple website that says here's the day old sid installer and these packages currently have critical or high priority bugs. If you would install those packages, maybe wait another day or two for it to settle down.

Maybe day old sid is too close. Perhaps 2 day sid or 3 day old sid? I don't feel that testing fills this role already because testing waits for 2-10 days and won't update if there are release critical bugs. I'm fine with something closer to bleeding edge sid, but I'd really like to allow a few days for the bleeding edge users to report bugs so I can decide whether to upgrade. I don't have an expectation that day(s) old sid is more stable than testing or less unstable than sid. All it provides is a buffer so I can get bug reports and make my decision about whether to upgrade.

DISTRIBUTION: day old sid.
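For what it's worth, something close to day(s)-old sid can be approximated today with snapshot.debian.org's dated archives; a sketch (the timestamp is illustrative, and `check-valid-until=no` is needed because snapshot's Release files go stale):

```
# /etc/apt/sources.list.d/delayed-sid.list  (sketch; date is illustrative)
deb [check-valid-until=no] https://snapshot.debian.org/archive/debian/20170620T000000Z/ sid main
```

You'd still have to bump the timestamp yourself (e.g. from cron), and you don't get the curated branch or the apt-listbugs integration described above, so this is an approximation rather than the proposal itself.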

srikwit 3 days ago 0 replies      
- HEADLINE: Sysmon equivalent for Debian

- DESCRIPTION: Tool to log process spawns, kills, network connection start/stop, file modifications etc. onto event logs for review.


- ROLE: Security Analyst

omginternets 4 days ago 3 replies      

SELinux installed by default


Not sure what else to say...

kurgan 3 days ago 1 reply      
HEADLINE: No Systemd

DESCRIPTION: Systemd is creating far more issues than benefits. Everyone knows it except for its author, L. P. Still, Debian has chosen to go down this road, and the result is that people had to fork and move to Devuan. Go back to a sane, simple, stable init system. This is especially true for a server-oriented distribution.

ROLE: Fabio Muzzi, freelance linux sysadmin since 1995, loyal Debian user up to Debian 7, now Devuan user and supporter.

qwerty987 3 days ago 0 replies      
Please support "Secure Boot" ASAP (for all of us who are dual-booting Debian with MS Windows).
Immortalin 2 days ago 0 replies      
Headline: Better way to switch between 2.4GHz and 5GHz wifi modes
asbesto 2 days ago 0 replies      
I wish to see systemd eradicated from debian. Back to SysVInit!
vasili111 3 days ago 1 reply      
What about OS itself? Is it fully reproducible build?
buster 3 days ago 0 replies      
My picture as avatar! :)
JTechno 3 days ago 0 replies      
systemd out
inopinatus 3 days ago 0 replies      
Couldn't be more Debian to ask for our feedback in a key:value structure. The hilarious part is that some of them are marked optional.
cyberpunk5000 2 days ago 0 replies      
HEADLINE: separate user "apps" from system "packages"

DESCRIPTION: In my use cases, which I think are common, I want a stable base operating system and user interface, but for the applications I work with every day (browser, compiler, office suite, etc.) to be cutting edge.

My dream is to separate packages into two tiers with different update policies, similar to the Android and Apple app stores, and for that matter BSD ports. Platform software like the kernel, system libc, X11, and desktop environments release and update like stable. "Apps" like Firefox and LibreOffice are easily installed and updated on a rolling basis.

I know that I can achieve this now with a custom backports and apt pinning config, but that's more of a low-level project than I'm envisioning. My request is for something that's more of a newbie-friendly point-and-click sort of thing.
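For anyone who does want the low-level version today, the pinning config alluded to above looks roughly like this (package names and priorities are illustrative):

```
# /etc/apt/preferences.d/apps  (sketch; names and priorities are illustrative)
# Keep the base system on stable...
Package: *
Pin: release a=stretch
Pin-Priority: 900

# ...but prefer backports for selected "apps" (apt accepts glob patterns here).
Package: firefox* libreoffice*
Pin: release a=stretch-backports
Pin-Priority: 990
```

with both stretch and stretch-backports enabled in sources.list. Which rather proves the point: this is exactly the kind of thing that wants a newbie-friendly point-and-click front end.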

cyberpunk5000 2 days ago 1 reply      
HEADLINE: concrete release timelines

DESCRIPTION: For many years I've been fond of Debian and have used it for side hobby projects. But I've had to use Ubuntu and Fedora for real work because I need a modicum of certainty about the intervals between releases.

I acknowledge that Ubuntu's rigid release-every-6-months, LTS-every-24 is impractical for a volunteer project with high standards. But without any firm timeline it's impossible for me to plan and use Debian in production.

For example, a commitment that releases will always be spaced somewhere between 6 and 24 months would go a long way.

Ask HN: What projects are 20 years ahead of their time?
58 points by miguelrochefort  1 day ago   98 comments top 38
est 1 day ago 1 reply      
I'd say IE5.5.

All that dHTML shit, like DirectX, MIDI, .htc (JavaScript components), <img dynsrc> for videos, a sane CSS box model, CSS and inter-page transitions, VML for vector graphics, VRML for VR (buzzword), XML data islands (influenced E4X and later JSX), the original XHR object, native crypto APIs, etc.

You could even program JScript on the server side with ASP, or execute it standalone with Active Scripting, even control native GUI like customizing your folders; the browser could be morphed into the file explorer. You could make apps with a few KB of JScript, unlike a 55MB Electron install bundle.

The Windows help files (CHM) are like a thousand years better than the macOS counterparts and Linux man pages. CHM was the de facto ebook format back then, and it worked really well, with features like indexable topics and full-text search. We now have to use devdocs.io or Dash.

Yes, it had its quirks and warts, but it was way ahead of its time.

bsenftner 1 day ago 1 reply      
Back in 2002, I started working on "automated digital doubles of real people" with an eye on automated actor replacement. I'd been working in VFX, with a history in 3D development going back to the early 80's. Long story short: I started with a stellar team, and I wrote and was awarded a global patent on automated actor replacement in filmed media. The company, patent and seed funding all came together just as the financial crisis at the end of Bush's term climaxed. The team scattered, and I went it alone, building a tiny freelancer team. We pivoted to a game character service. I was never able to achieve more than a halfway decent game character creation web API, which no one would pay to use. All that is left now is 1) the twitter account https://twitter.com/3davatarstore?lang=en, and of course I still have all the tech. I use it for facial recognition now, and no longer make avatars. It's even better now, but even when contacted by interested parties, no one wants to pay for it, at all.
TACIXAT 1 day ago 5 replies      
FPV Drone Racing - While automated flight is making great progress, there are some amazing pilots out there. Something about a person standing in a field with goggles and a controller feels so future to me.

3D Printing - This is going to be the main way to manufacture things in the future. The lab that is 3D printing houses with concrete. That makes me terrified for home values going forward. It will likely shift all the value into the land. The house will just become something you tear down and reprint every 10 years.

CRISPR - s/shitty gene sequence/perfect gene sequence/g That's insane. It's like an anti-virus product for the body (irony intended). We're going to live a very long time and be practically disease free pretty soon. I'm planning on living until 150 (27 now). It's placing a big bet on medical science, but I feel like we're on the edge of some huge things.

tylerruby 1 day ago 1 reply      
Lilium - The world's first electric vertical take-off and landing jet. (https://lilium.com/)

Neuralink - Develops high bandwidth and safe brain-machine interfaces. (https://neuralink.com/)

Magic Leap - Mixed Reality (https://magicleap.com)

Crispr-Cas9 - A unique technology that enables geneticists and medical researchers to edit parts of the genome by removing, adding or altering sections of the DNA sequence. (https://en.wikipedia.org/wiki/CRISPR#Cas9)

This is a great question. The acceleration of technology has made it important for entrepreneurs to look further ahead than ever when deciding where they want to make their impact in the world. Tomorrow's successful leaders in business will be the ones that peered into the most obscure places of the future to find its problems and its solutions.

AndrewDucker 1 day ago 5 replies      
Look back 20 years to 1997. Before Windows 2000 brought together the home and server codebases. Before the internet was available on mobile phones. Heck, back then the internet was used by less than 2% of the global population.

What was 20 years ahead of its time then? What would you have looked at and thought "That'll be massive in 20 years"?

About the only thing I can think of is VR. Which Sega tried to launch in the late 90s, and only now is selling over a million units.

tpeo 1 day ago 2 replies      
headcanon 1 day ago 1 reply      
I'd say most of the ethereum-based startups, like swarm city or golem. Right now they're going through a Cambrian explosion of different business models, many of which require everyday people to be using cryptocurrencies on a regular basis. Current prices are due to speculation about this eventuality, and personally I'm bullish on crypto, but this feels too much like the 90s were for the internet. Not that that's a bad thing, it's just the natural cycle of innovation.
fiftyacorn 1 day ago 0 replies      
Driverless cars and a lot of AI - to me these technologies are just beginning, and I'm expecting to see a lot of bad AI in the coming years before it settles down
candiodari 1 day ago 0 replies      
Homomorphic encryption. Software that is secure, based purely on the software itself, and can still be secure even on compromised hardware. This will enable real cryptocurrencies and zero-trust state-like entities.
richardthered 1 day ago 1 reply      
The Long Now Foundation and their 10,000-year clock. http://longnow.org/clock/

It's a clock. A physical clock. Designed and built to run, accurately, for 10,000 years without human intervention.

spodek 1 day ago 0 replies      
Mine: to motivate people to choose to lower their pollution and greenhouse gas emissions significantly (not just raising awareness, or marginal things like using electric cars instead of driving significantly less).

People can do it but they prefer living the way they do, which is what is causing the problems, knowing in principle that they should change their behavior but not actually doing so.

Miami flooding more and more is not enough of a burning platform yet. Nature will provide it if we don't choose to change ourselves.

otterley 1 day ago 0 replies      
Not quite 20 years too soon, but Six Degrees pioneered social networking in 1997: https://en.wikipedia.org/wiki/SixDegrees.com
Raphmedia 1 day ago 1 reply      
Anything VR / AR. It did exist, but the world and the tech weren't ready. Hell, both still aren't.
qubex 1 day ago 1 reply      
Hopefully my grandmother's funeral arrangements.
erik998 16 hours ago 0 replies      
James Orlin Grabbe https://en.wikipedia.org/wiki/James_Orlin_Grabbe

His Digital Monetary Trusts: https://en.wikipedia.org/wiki/Digital_Monetary_Trust

The End of Ordinary Money: https://www.memresearch.org/grabbe/money1.htm

Cyc, an artificial intelligence project that attempts to assemble a comprehensive ontology and knowledge base of everyday common sense knowledge, with the goal of enabling AI applications to perform human-like reasoning.

The project was started in 1984 by Douglas Lenat at MCC and is developed by the Cycorp company. Parts of the project are released as OpenCyc, which provides an API, RDF endpoint, and data dump under an open source license.


Prolog, backward chaining, forward chaining, opportunistic reasoning.

d--b 1 day ago 1 reply      
What does that really mean? Do you mean projects that are being worked on right now and that will be delivered in 20 years? Or projects that are finished now and that will be understood in 20 years? Or projects that people thought wouldn't be possible before 20 years?

1: VR, self-driving vehicles, nuclear fusion, artificial photosynthesis, quantum computers, robots that can manipulate things the way humans do, wave energy harvesting, colonizing Mars, curing cancer, curing Alzheimer's disease.

2: no idea!

3: drones, deepmind, blue led, electric sports cars, flyboards, voice activated assistants, smart wearables...

owebmaster 1 day ago 0 replies      
IndieWebCamp: https://indieweb.org/
SideburnsOfDoom 1 day ago 0 replies      
Some current things are either going to be huge, normal, mainstream and just work in a boring way in 20 years time, or they are current fads which will pass.

Cryptocurrencies. 3d printers.

js8 1 day ago 0 replies      
Anthropogenic Global Warming.

I am being sarcastic. But it's very hard to see, today, any technology that could make my life significantly better (at least more than fixing climate change would).

skdotdan 9 hours ago 0 replies      
SpaceX. Of course they still need time, work and funding. But they have a working product ahead of their competitors that clearly shows a path to the future of space exploration and colonization.
halis 1 day ago 0 replies      
Uh if you look at what is happening today, OBVIOUSLY Javascript was 20 years ahead of its time. You're welcome.
boramalper 1 day ago 1 reply      
Netscape Enterprise Server and Server-Side JavaScript (SSJS)


tim333 1 day ago 2 replies      
Nuclear fusion? I can see this thing in 20 years http://news.mit.edu/2015/small-modular-efficient-fusion-plan...
DocSavage 1 day ago 0 replies      
Making significant impact on psychiatry via brain simulations.https://www.humanbrainproject.eu/en/medicine/

I could see this happening within 20 years, but not in the confines of the current project.

srinivasang87 1 day ago 0 replies      
Breakthrough Starshot / Nano spacecrafts - much more than 20 years ahead
auganov 1 day ago 0 replies      
Datomic when it first came out probably. Meaning still over 10 years ahead of its time.
PeterStuer 1 day ago 0 replies      
I was doing autonomous mobile robotics in the 1987-1996 era, does that count?
sunstone 1 day ago 0 replies      
SpaceX self landing rockets. "I'm on a barge!"
DonbunEf7 1 day ago 1 reply      
The various object-capability projects alive right now, like Tahoe-LAFS, Monte, Cap'n Proto, Sandstorm, and Genode, are all very much future technology. Imagine:

* No accounts, no passwords, just secret keycaps

* Instead of messy and complex role-based tables, capabilities always know exactly what they are capable of doing

* No more confused deputies

* Fine-grained trust

srinivasang87 1 day ago 0 replies      
1. Breakthrough Starshot / Nano spacecraft - much ahead of its time

2. Wireless power transmission of electricity / space-based solar farms
TurboHaskal 1 day ago 1 reply      
APL comes to my mind.
conception 1 day ago 1 reply      
Crypto currencies probably. I could see taking a generation for them to be mainstream and understood by the public the way the internet is today.
bikamonki 1 day ago 0 replies      
Whatever Elon is working on...
baybal2 23 hours ago 0 replies      
ghuntley 1 day ago 1 reply      
carsongross 1 day ago 0 replies      
Well, intercooler.js is 20 years behind its time:


Does that count?

Ask HN: Any startups working on clojure and Bitcoin/Ethereum?
4 points by pankajdoharey  13 hours ago   5 comments top 2
rpod 8 hours ago 1 reply      
Why narrow it down to Clojure? Feels a tad artificial to me.

AFAIK, any Dapp using Ethereum has to make use of its JSON-RPC interface, which is language-agnostic. So it's perfectly fine to build a Clojure application on Ethereum, although the wrapper library available for the JSON-RPC interface is in JavaScript. No idea how many startups, if any, are using Clojure, or for what reason.
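To illustrate how thin the language dependence is: the JSON-RPC body is just JSON over HTTP, so any language that can produce those bytes can talk to a node. A minimal sketch in Python (the request envelope and `eth_blockNumber` are standard Ethereum JSON-RPC; translating this to Clojure is a few lines with an HTTP client):

```python
import json

def jsonrpc_request(method, params=None, req_id=1):
    """Build an Ethereum JSON-RPC 2.0 request body (the wire format is language-agnostic)."""
    return json.dumps({
        "jsonrpc": "2.0",
        "method": method,
        "params": params or [],
        "id": req_id,
    })

# POST this body to a node (geth's default endpoint is http://localhost:8545);
# the exact same bytes could be produced from Clojure, Go, or anything else.
body = jsonrpc_request("eth_blockNumber")
print(body)
```

The node's reply is likewise plain JSON, so the "wrapper library" question is really just about convenience, not capability.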

akrisanov 10 hours ago 1 reply      
Found a very interesting project/company, https://github.com/status-im, which is doing almost everything in Clojure and ClojureScript, plus Go.

I track open positions from time to time on this website: https://blockchain.works-hub.com/.

Who is hiring remote workers?
46 points by kaymaylove  2 days ago   27 comments top 13
smcnally 2 days ago 0 replies      
This outfit does a good job on Twitter:


There are a few others.

Angel.co has Remote OK as a job search parameter

znq 2 days ago 1 reply      
We are always looking for talented people at Mobile Jazz[1] and Bugfender[2]. Best is to apply via our Jobs API[3]

[1] https://mobilejazz.com

[2] http://bugfender.com

[3] http://mobilejazz.com/jobs

niico 1 day ago 0 replies      
If anybody is looking for a freelance or full-time remote UI Designer. I will be happy to help you! Portfolio: http://ngb.agency/
VirtruTyler 2 days ago 2 replies      
We are hiring remote workers. We're not looking for PHP developers, but you can find a list of our openings here (scroll to the bottom): https://www.virtru.com/careers/

I'm one of the engineers, and can honestly say it's a great company to work for.

stephenr 2 days ago 0 replies      
I got my start remote working with X-Team (https://x-team.com) and their sister company X-Five (then called XHTMLized) (https://www.xfive.co)

I don't know if they have any openings currently but definitely worth checking them out in my opinion.

dirtylowprofile 1 day ago 0 replies      
Are the links provided in the comments applicable to other countries? I am in Asia and just wondering if you are willing to hire developers on the other side of the globe.
scalesolved 2 days ago 1 reply      
Got a link to your resume? Happy to give you some pointers. (3 years working remotely so far and have been on a lot of hiring teams for remote positions).
wolco 2 days ago 1 reply      
What country / timezone are you located in?

In my personal experience, local jobs turn into remote positions more easily than trying to apply against 1000s of others.

ThomPete 1 day ago 0 replies      
I am, send me a mail.
smt88 2 days ago 0 replies      
Email me at smt88hn@gmail.com
Ask HN: Has Duckduckgo gotten worse recently?
39 points by pmoriarty  23 hours ago   40 comments top 19
bsstoner 22 hours ago 3 replies      
Disclaimer: I work for DDG.

I'd be interested in looking into any examples of searches where the results aren't good enough or where it seems to have gotten worse recently.

As far as I know there haven't been any changes over the past few weeks that would have made things worse.

watertorock 23 hours ago 2 replies      
Unrelated but I wish they would rebrand already. Call it Duck, call it Go, call it something simple.

It's not an easy or memorable name at the moment, and branding matters.

tblyler 23 hours ago 0 replies      
I noticed about a week or two ago that I was always leaning toward !g as well. I have been testing out my own instance of searx. I have yet to need to explicitly specify google with it, since it already uses google results.


proaralyst 23 hours ago 1 reply      
I've actually noticed the opposite; I've been resorting to !g much less.
babyrainbow 20 hours ago 0 replies      
Ha! I had the exact opposite.

I used DDG for a while when it was introduced. But I returned to Google since the results were not as good.

But recently I felt Google's results had gotten a lot worse and gave DDG another try.

Big difference! Like Google vs Yahoo back in the day.

Now DDG is my default search.

grizzles 10 hours ago 0 replies      
I just switched over to DDG because I was getting Google's human detector + "classify these images" challenge on nearly every search.

I guess my estimated worth to Google must be fairly low because I don't click on many ads and I often use a work VPN.

cdevs 20 hours ago 0 replies      
Duckduckgo is my default at work and on my phone. I love it for quick programming questions, and usually everything else is fine, but if I'm doing any design and need images I usually, for some reason, head to google. For everything else, I'd rather not have google turn my query into my next YouTube ad 2 seconds later.
stuaxo 19 hours ago 0 replies      
Weird, I've been feeling like the same thing has been happening, just noticed over the last couple of weeks that it doesn't seem to find what I want.

I'm in the UK and noticed I often seem to be getting US centric results and have to try using Google more often.

Edit: ddg has been my default for 2 years.

ravenstine 20 hours ago 0 replies      
I've noticed its results have gotten much better. Now if there were a version of the Personal Blocklist chrome plugin for it (so I don't see crap from W3Schools, WebMD, Livestrong, etc.), I would never use Google again.
tmaly 23 hours ago 2 replies      
I have been using !g much more for programming.

I was hoping to build an extension for DDG a few months back, but things seemed to have changed in the forum.

This could explain why we are seeing changes.

snoitavla 19 hours ago 1 reply      
I recently wrote yet another DDG API wrapper for Python: https://github.com/alvations/rubberduck. I'm loving how I can browse the web in a Jupyter notebook. Oh, the irony of having Jupyter in a browser and calling an API to get search results.
faulker 22 hours ago 0 replies      
I've personally had the opposite, I've been getting better results over the last year and I'm using !g very rarely now.
chatnati 17 hours ago 0 replies      
I too feel the results have been poor of late. I think Google probably "knows" the type of questions you look for (i.e. usually Stack Overflow for programming questions, or whatever the most popular links people are clicking on around that time). If I'm looking for something very esoteric, I can tell straight away from its results that DDG doesn't understand my query, and I go over to Google and find the answer on the first, second or third page.
Polyisoprene 22 hours ago 0 replies      
For searches containing multiple words, code or error messages, I resort to google, as ddg doesn't find the relevant pages. Other than that it's a lot better than before.
dodgedcactii 23 hours ago 0 replies      
i've noticed this too and it's fucking with my mind, since i end up using !g in a private window and not having the history (the whole point of not being tracked)
ramayac 23 hours ago 0 replies      
I did the switch 2 weeks ago, I'm actually enjoying it!
amelius 22 hours ago 0 replies      
Perhaps they should let the user choose the search algorithm. Make it an option somewhere.
lhuser123 14 hours ago 0 replies      
I hope it keeps getting better.
everdayimhustln 16 hours ago 0 replies      
I had a problem with a chrome extension preventing DDG from displaying results, but otherwise it seems awesome as usual on mobile and desktop.
Ask HN: You are named CEO of Uber, what do you do in 180 days of change?
33 points by thechut  6 hours ago   45 comments top 21
nailer 5 hours ago 2 replies      
- I'd purge: disclose everything that needs to come out fully and immediately

- Throw away self driving cars for now. The tech will become commoditised. Almost everyone at YC in November was doing self driving motorcycles (I have no idea why either).

- I'd closely align Uber with consumers and environmental groups rather than falling in with taxi industry corruption, lobbying, etc. Make cities change laws to benefit their citizens: let ride sharing exist so people can get picked up in rough neighborhoods (PS, abandon tipping, it breaks this), allow Uber cars in public transport lanes (because they are public transport), make sure ride sharing has dedicated space at the airport. Be tough on local governments when you need to be tough, but better yet, have consumers be tough for you. Expose the risks that cities like Austin have put consumers in by replacing Uber with Facebook groups of strangers. Expose cities like London, where the normal black cabs frequently illegally refuse to pick up passengers and the mayor wants to 'protect' them because they're 'historical'. Ride sharing is for everyone.

Finnucane 4 hours ago 1 reply      
Since I have no interest in seeing them survive, I'd take the Bob Nardelli/Marissa Mayer approach of guaranteeing a nice golden parachute for myself, then drive the company over a cliff and collect the payout on the way out the door.

Or, give up on the model of turning drivers into serfs for the benefit of privileged hipsters and focus on more mass-transit solutions.

thrill 5 hours ago 0 replies      
1. I'd have a Come To Jesus meeting with my Board and ask them why they were so accepting of the poisonous environment. Not everyone would survive that Saving Private Ryan opening scene.

2. If I survived the previous activity, then having already once been assigned the position of CEO in a company destroyed by the previous management, I'd be as transparent as I could to all interested parties (that would include more than the investors) about the challenges and opportunities.

3. If I was still kicking after that, I'd implement (many) of the two dozen pages of notes I took from being an Uber driver for a year to see what it was about. I like talking with (a variety of) people, and a "taxi driver" is like a bartender in the natural sharing of thoughts for many passengers (who are poor to rich, small to large business people, ranging from unknown to runway-model-famous people).

4. Passengers would not be feeling like they were guessing about a) the fare, or b) the quality of the car, or c) the quality of the driver.

5. Drivers would not be treated as third-class citizens.

mabbo 5 hours ago 0 replies      
Step 1: Begin with culture.

Get every single employee involved. Have a very big summit with the single goal: creating the culture that Uber should aspire to be, and coming up with a distinct plan for how they're going to get there. Engineers like to solve problems, and it's clear there's a big one here that has been identified.

Once we have our goal of who we want to be as a company, there will need to be continual work to make sure we're still aligned on that goal, aligned on that culture. There will be people who need to leave, by their own choice or not, based on whether they want to and can be part of that change.

Probably worth hiring Fowler to be part of it, if she's willing.

Step 2: Cut some losses.

This Uber/Waymo thing? It's time to settle. It's clear that even if somehow Uber is innocent in all this, we're not going to win the case in court. Come to Google and say "We're sorry we let this happen. We want to be better than that. Waymo and Uber have the same goals in mind, so let's work together." It'll be expensive, but it'll be cheaper than never having self-driving cars.

Step 3: Plan.

Come up with 1, 3, 5, and 10 year plans. Where does Uber intend to be at each of those milestones? How do they relate to each other? On what day is Uber profitable? How does Uber stop the bleeding? And are these milestones achievable while still meeting the cultural goals from step 1? If not, come up with better goals. If I can't find a way to profitability without meeting the cultural goals, I step down and let a better leader step up.

Uber board, let me know when you're ready for my bold and inspiring leadership.

a2tech 5 hours ago 1 reply      
1) Support the self-driving thing for another round of funding, then stuff it. I agree with someone else in the thread: it's too blue-sky now to be wasting money on. Let other companies chase it into the ground, then license it when it's commercially viable

2) Refocus on the core product. Drop the tipping, convert more drivers through incentives or slightly bumping fare payouts on the backend. They can pull some of the cash they'll save from dumping self-driving tech on this. People won't use competitors if they don't have drivers and the fares are higher

3) They should really consider having a few real employees in hot markets that screen first-time drivers and their cars. I've had a few drivers show up in vehicles that technically met Uber's standards, but were really shitbuckets. That's not the image Uber wants to present.

frgtpsswrdlame 6 hours ago 3 replies      
Keep focus on self-driving through the next funding cycle (for the hype) then drop it like a hot potato. It is way too long term for Uber to be burning money on it, they have immediate short-term problems.

I'd probably try to rebrand as the new, legal Uber and start spending money on lobbying local politicians. The "taxi-app" market has almost no switching costs and so Uber has almost no pricing power. They'll need to fix this to ever run a profit. Effectively Uber exists because it broke the laws that created barriers to entry. If it wants to continue to exist it needs to erect new barriers that protect it and keep out competition.

dbot 5 hours ago 0 replies      
Uber needs to focus on customer loyalty - having a base of loyal customers means they can engage in licensing deals with the true automakers.

How to build loyalty? Create a rewards/milage program similar to airlines. Most business travelers commit to one airline because of the status and perks they receive. Some Uber perk ideas include:

1. priority response during high demand

2. free "upgrades" from UberX to higher class vehicles when demand/wait time allows

3. partnership with airline lounges to get access when traveling

temp-dude-87844 2 hours ago 0 replies      
1. Admit 'we can do better' and purge management ranks. Bring in some reliable, non-rockstar talent from the industry, and start a corporate responsibility blog.

2. Cut costs. Reduce headcount, relocate (out of SF/SV) or offshore dev work. Spin off or sell off expensive ventures, like self-driving research -- there are too many companies working on self-driving, it's not smart to compete with them in-house. Partner with one instead.

3. Increase revenue and maximize rider capture.

- Reduce fare subsidies in metros with lower competition, but offer a loyalty program for riders.

- Expand high-margin lines of service like UberRUSH, the courier service. Forge contracts with suburban/exurban governments to provide transportation services; try to absorb government subsidies.

- Monetize data collected during normal operations: partner with market research firms, expand incidental mapping operations to reduce reliance on external maps.

- Deepen partnerships with automakers, and make preferred partnerships with sites that are frequent origins and destinations.

- Think about use cases: not just cab-hailing in one's hometown, but offering safe passage to high-profile sites in foreign metros while one is travelling. Make people choose Uber for the same reasons they discretionarily choose another brand: consistency of experience, trust, and perks; i.e. don't compete solely on price. Make partnerships that support these use cases.

Companies to be wary of: Google, Amazon, and automakers with whom they have no pre-existing relationship.

Companies to court: HERE Maps (owned by Volkswagen, BMW, and Daimler); hyperlocal providers like FourSquare, Snapchat; car-sharing companies like Zipcar; arch-rivals of their competition like Facebook, Walmart (!); AirBnb and hotel chains.

stinkytaco 3 hours ago 0 replies      
Have a series of summits to discuss culture. Have focus group meetings with everyone from execs to drivers in the room. Hire a new HR director or put someone into place who can overhaul the reporting system. Do sensitivity trainings, retreats, summits, whatever. This directly addresses the harassment issue. I don't know much about how harassment reporting works, but there has to be some sort of standard that HR professionals are trained in.

Then I would raise fares and remove tipping. Improve the customer experience.

Finally I would hire more local employees to monitor on-the-ground operations like drivers, cars, and service. Set up a mentoring program to help drivers get started and stay on. No one is fooling anyone by treating them like contractors. For customers, drivers are the face of the company. I'm brought to mind of Disney's "cast member" concept. The lowest employees on the totem pole are the ones your customers see the most, invest in them.

bsvalley 5 hours ago 2 replies      
1. Hire Chris Lattner who just left Tesla to take over self-driving initiative.

2. Ask Chris Sacca to come out of retirement and to get more involved in Uber's strategy.

3. Find a COO like yesterday.

4. Shut down Uber eats and Uber rush and double down on self-driving initiative.

5. Make the team leaner and more agile, and eliminate one layer of management (engineering managers, etc.). Dev teams should be 100% autonomous, driven (not managed) by product managers.

6. Eliminate Tips.

7. Pay Uber drivers way more and have them sign a special contract (they can't sign up with competitors).

8. Increase passenger engagement during a ride by providing location-based deals, events, etc. A twitter-like app to use while riding an Uber.

etc... this is just a short list.

atemerev 5 hours ago 0 replies      
Nice try, Thuan. :)

I would have run a management buyout (this is why the smear campaign started, right? 70 billion is way too expensive, and the valuation needed to be brought down fast while creating plausible reasons for it. Congratulations, it worked).

Then, I'd have fired half of those 14k employees (really? for a startup?), and finally pivoted by rolling out their geo-matching API for everybody to run their own businesses and services on (they should have had it for like 2 years already). The first one to provide something like AWS for the sharing economy wins!

maxxxxx 5 hours ago 0 replies      
I don't think there is a path for them to be a self-sustaining business on the scale their valuation demands. With the investment money they have taken and the resulting growth expectations, I don't think they can do much without disappointing their investors massively. If their valuation goes down, a lot of employees will leave. I wouldn't be too surprised if they got bought by someone in the future.
maxfurman 5 hours ago 0 replies      
Raise fares and cut costs. Uber has a lot of revenue, but they are wasting it on the self-driving moonshot among other things. Unfortunately this will mean firing a big chunk of those 14k employees but if the company goes bankrupt they will all be unemployed.

Acknowledge the reality that Uber will not conquer the world and is not worth seventy billion dollars by taking a down round (assuming they need to raise money).

skywhopper 5 hours ago 2 replies      
Michael Dell may have put it best: "What would I do? I'd shut it down and give the money back to the shareholders."
YCode 5 hours ago 1 reply      
First spend a few weeks going to the employees in key areas, one-on-one or in small groups, asking them what needs to change.

It's been my experience that the people on the ground know what's wrong with the company, they just don't have the authority or vision to do anything about it.

Then look for the most common threads and how to tackle those.

Naive, I know. But it's what I would do.

kpil 5 hours ago 1 reply      
Reorg like crazy. At the end of the turmoil, announce that my work as a godlike management doctor is done and everything is now on track for the less gifted but down-to-earth management that is now necessary, exit, laugh all the way to the bank, and find some other suckers to trick.
emilsedgh 5 hours ago 1 reply      
I don't know why everybody is so against the driverless-car project. I think it's a nice R&D project that could pay off really, really big. I don't think it costs a lot of money compared to what they are burning nowadays.

I think they should:

* Make sure HR problems are actually resolved internally. It's not a big problem to solve, I think.

* Try to grow in more towns and countries aggressively.

* Use the economics of scale to bring down the costs for drivers so they can keep the cost down without paying from VC money. For example:

- Buy insurance in bulk for their drivers

- Contract repair shops for their drivers

- Even buy cars in bulk and lease them out to drivers

greedo 4 hours ago 0 replies      
Sell. Find a company/sucker willing to buy it and return the money to shareholders. Uber is dead. It just doesn't know it yet.
sngz 2 hours ago 0 replies      
Sell the company to Verizon, then resign and get paid millions.
tossapp9 4 hours ago 0 replies      
1. Fix their terribly redesigned Android app and revert to the pre-facelift version.
hluska 3 hours ago 0 replies      
I assume that the CEO of Uber has three major goals. These goals are to stop the steady flow of ugly PR, slow the company's burn rate, and prepare the company for a liquidity event. Because of recent turmoil, I don't think that Uber has any shot of an IPO in the next two years.

Since 180 days really isn't enough time to onboard a COO or CFO, I would only focus on damage control. I would need those executives to help curb the burn rate.

Step 1:

Before I accepted the job, I would accept that there is a very high probability that this will end badly and that my reputation will never recover. I would wonder whether the VCs who were instrumental in bringing me in would stand by me when this goes to hell, or whether I would take the lion's share of the blame.

So, I would make sure that my compensation from Uber was properly invested so that I could survive the rest of my life if I could never get another job. And, I would work with a very competent money manager to optimize every single dollar that I would get from Uber.

Step 2:

If Uber is going to reach a liquidity event, it needs some highly competent executives who can take leadership of their respective areas. At this point, Uber badly needs a CEO/COO/CFO troika that can work together. Therefore, my first real step as CEO would be to help the board recruit solid AAA+ players for the two remaining positions.

Step 3:

While recruitment was ongoing, I would undertake two major initiatives in tandem. First, I would conduct my own investigation into what happened and learn as much about the previous culture as I could. There is a very high probability that this investigation would lead to a new round of firings. Therefore, the second initiative would be to be completely transparent and wholly public about what is going on. I have to assume that every single thing I did would be heavily scrutinized by the media, investors and stakeholders. Therefore, I would get in front of it and send weekly emails to employees/investors that were cross-posted on a blog. If journalists want to see fire, they can see the same fire that I see. In a news vacuum, it gets more tempting to print dubious sources who may or may not actually be telling the truth.

Step 4:

Once I had a solid grasp on what happened (and once I knew that the bad apples were all promoted to Uber customers, or maybe drivers), I would start fixing the culture. Hopefully, by this point, we would have at least a COO on board. With her help, I would make sure that HR was fully independent and powerful. Simply put, the new Uber would have a strong 'no asshole' rule.

Step 5:

SDC be damned... Uber drivers need to be promoted to first-class citizens within the Uber ecosystem. Once I was fairly confident that the culture now marginalized assholes, I would work closely with drivers. I would argue that Uber drivers have an incredible understanding of all the efficiencies and inefficiencies within Uber's market. Therefore, we need to empower drivers and encourage them to come forward with any suggestions to make their jobs better. Uber is closely tied to its drivers: as drivers succeed and make money, Uber should succeed and make even more.

In 180 days, sadly, I don't think I would have the chance to put a true focus on rider demands. In fact, I'm not even sure that I would have time to engage the drivers. But, these five steps are my perfect case.

Ask HN: Does anyone know what's happening with Magicleap?
16 points by yalogin  2 days ago   4 comments top 2
mrep 1 day ago 0 replies      
They raised [1] $793.5M 16 months ago, so they probably still have a decent amount of cash to ride this out.

[1]: https://www.crunchbase.com/organization/magic-leap/funding-r...

valuearb 1 day ago 1 reply      
They took the wrong tack: GPUs' rapidly increasing processing power means that portable devices can drive AR. You don't need super heavy, super expensive custom hardware. You just need great software.
Ask HN: How do you organize your files
59 points by locococo  1 day ago   39 comments top 31
phireal 1 day ago 0 replies      
Home directory is served over NFS (at work). Layout is as follows:

  phireal@pc ~$ ls -1
  Box/       - work nextcloud
  Cloud/     - personal nextcloud
  Code/      - source code I'm working on
  Data@      - data sources (I'm a scientist)
  Desktop/   - ...
  Documents/ - anything I've written (presentations, papers, reports)
  Local@     - symlink to my internal spinning hard drive and SSD
  Maildir/   - mutt Mail directory
  Models/    - I do hydrodynamic modelling, so this is where all that lives
  Remote/    - sshfs mounts, mostly
  Scratch/   - space for stuff I don't need to keep
  Software/  - installed software (models, utilities etc.)
At home, my main storage looks like:

  phireal@server store$ ls -1
  archive    - archived backups of old machines
  audiobooks - audio books
  bin        - scripts, binaries, programs I've written/used
  books      - eBooks
  docs       - docs (personal, mostly)
  films      - films
  kids       - kids films
  misc       - mostly old images I keep but for no particular reason
  music      - music
  pictures   - photos, organised YYYY/MM-$month/YYYY-MM-DD
  radio      - podcasts and BBC radio episodes
  src        - source code for things I use
  tmp        - stuff that can be deleted and probably should
  tv_shows   - TV episodes, organised show/series #
  urbackup   - UrBackup storage directory
  web        - backups of websites
  work       - stuff related to work (software, data, outputs etc.)
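A pictures layout like YYYY/MM-$month/YYYY-MM-DD can be maintained with a short script. This is a hypothetical sketch, not phireal's tooling: the function name is mine, it files by mtime rather than EXIF data, and it assumes GNU date (`date -r FILE` prints the file's modification time).

```shell
# Sort photos into <root>/YYYY/MM-MonthName/YYYY-MM-DD/ by modification time.
# Assumes GNU date; LC_ALL=C keeps month names in English regardless of locale.
sort_photos() {
    incoming=$1; root=$2
    for f in "$incoming"/*; do
        [ -f "$f" ] || continue
        dir=$root/$(LC_ALL=C date -r "$f" +'%Y/%m-%B/%Y-%m-%d')
        mkdir -p "$dir"
        mv "$f" "$dir/"
    done
}
```

Run it as e.g. `sort_photos ~/camera-import ~/pictures`; a tool that reads EXIF dates would be more faithful for photos, but mtime needs no extra dependencies.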

kusmi 1 day ago 1 reply      
I made an automatic document tagger and categorizer. It collects any docs or HTML pages saved to Dropbox, dropped into a Telegram channel, saved with Zotero, Slack, Mattermost, private webdav, etc.; cleans the docs, pulls the text, and performs topic modeling along with a bunch of other NLP stuff; then renames all docs into something meaningful, sorts them into a custom directory structure where folder names match the topics discovered, tags them with relevant keywords, and visually maps the documents as an interactive graph. Full-text search for each doc via Solr. HTML docs are converted to clean text PDFs after ads are removed. This 'knowledge base' is contained in a single ECMS; external accounts for data input are configured from a single yaml file. There's also a web scraper that takes crawl templates as json files and uploads data into the CMS as files to be parsed with the rest of the docs. The idea is to be able to save whatever you are reading right now with one click, whether you are on your mobile or desktop or collaborating in a group, and have a single repository where all the organizing is done actively 24/7 with ML.

Currently reconstructing the entire thing to production spec, as an AWS AMI, perhaps later polished into a personal knowledge-base SaaS where the cleaned and sorted content is publicly accessible via a REST/CMIS API.

This project has single handedly eaten almost a third of my life.
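A full topic-modeling pipeline like the one described is beyond a snippet, but the rename-to-something-meaningful step can be crudely approximated in plain shell: name a file after its most frequent substantive words. This is only a toy sketch (the function name and the length>5 word filter are my inventions, standing in for real NLP), not the commenter's stack.

```shell
# Toy keyword-based renaming: prefix a file's name with its three most
# frequent words longer than five characters (a stand-in for topic modeling).
keyword_name() {
    file=$1
    tags=$(tr -cs '[:alpha:]' '\n' < "$file" \
        | tr '[:upper:]' '[:lower:]' \
        | awk 'length > 5' \
        | sort | uniq -c | sort -rn \
        | awk 'NR <= 3 { print $2 }' \
        | paste -sd- -)
    echo "${tags}_$(basename "$file")"
}
```

Real topic modeling (LDA and friends) groups documents by co-occurring vocabulary rather than raw frequency, but the plumbing — extract text, derive labels, rename — is the same shape.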

amingilani 1 day ago 0 replies      
My home folder is it.

  .
  Desktop
  Downloads
  Google Drive   // My de facto Documents folder
    legal
    library      // ebooks and anything else I read
    ...
    Downloads
  Sandbox        // all my repositories or software projects go here
  Porn           // useful when I was a teen, now just contains a text file
                 // with lyrics to "Never Gonna Give You Up"
I back up my home folder via Time Machine. I haven't used Windows in years, but when I did, I used to do something similar: always kept a separate partition for games and software, because those could be reinstalled easily; personal data was always kept in my User folder.

ashark 1 day ago 0 replies      
- ebooks: I don't love Calibre, but it's the only game in town.

- music: Musicbrainz Picard to get the metadata right. I've been favoring RPis running mpd as a front-end to my music lately.

- movies/TV: MediaElch + Kodi

I don't have a good solution for managing pictures and personal videos that doesn't involve handing all of it to some awful, spying "cloud" service. Frankly most of this stuff is sitting in Dropbox (last few years' worth) or, for older files, in a bunch of scattered "files/old_desktop_hd_3_backup/desktop/photos"-type directories waiting for my wife and I to go through them and do something with them. Which is increasingly less likely to happen; sometimes I think the natural limitations of physical media were a kind of blessing, since one was liberated from the possibility of recording and retaining so much. Without some kind of automatic facial recognition and tagging, and saving of the results in some future-proof way (ideally in the photos/videos themselves), this project is likely doomed.

My primary unresolved problem is finding some sort of way to preserve integrity and provide multi-site backup that doesn't waste a ton of my time+money on set-up and maintenance. When private networks finally land in IPFS I might look at that, though I think I'll have to add a lot of tooling on top to make things automatic and allow additions/modifications without constant manual intervention, especially to collections (adding one thing at a time, all separately, comes with its own problems, like having to enumerate all of those hashes when you want something to access a category of things, like, say, all your pictures). Probably I'll have to add an out-of-band indexing system of some sort, likely over HTTP for simplicity/accessibility. For now I'm just embedding a hash (CRC32 for length reasons and because I mostly need to protect against bit-rot, not deliberate tampering) at the end of filenames, which is, shockingly, still the best cross-platform way to assert a content's identity, and synchronizing backups with rsync. ZFS is great and all but doesn't preserve useful hash info if a copy of a file is on a non-ZFS filesystem, plus I need basically zero of its features aside from periodically checking file integrity.
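The embed-a-hash-in-the-filename approach can be sketched with POSIX `cksum` — note this is a different CRC variant than the CRC32 the commenter uses, chosen here only because it ships with every POSIX system; the function names are hypothetical.

```shell
# Embed a checksum in a filename (file.ext -> file.<sum>.ext) and verify
# it later to detect bit-rot. Uses POSIX cksum, not the commenter's CRC32.
stamp() {
    f=$1
    sum=$(cksum < "$f" | cut -d' ' -f1)
    mv "$f" "${f%.*}.$sum.${f##*.}"
}
verify() {
    f=$1
    want=$(basename "$f" | awk -F. '{ print $(NF-1) }')
    got=$(cksum < "$f" | cut -d' ' -f1)
    [ "$want" = "$got" ]
}
```

`verify` exits non-zero on a mismatch, so a periodic `find ... -exec` sweep can flag rotted files without any filesystem-level support.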

jolmg 1 day ago 0 replies      
My home directory:

  - bin :: quick place to put simple scripts and have available everywhere
  - build :: download projects for inspection and building, not for actively working on them
  - work-for :: where to put all projects; all project folders are available to me in zsh like ~proj-1/ so getting to them is quick despite depth.
    - me :: private projects for my use only
      - proj-1
    - all :: open source
      - proj-2
    - client :: for clients
      - client-1
        - proj-3
  - org :: org mode files
    - diary :: notes relating to the day
      - 2017-06-21.org :: navigated with binding `C-c d` defaulting to today
    - work-for :: notes for project with directory structure reflecting that of ~/work-for
      - client
        - client-1
          - proj-3.org
    - know :: things to learn from: txt's, books, papers, and other interesting documents
  - mail :: maildirs for each account
    - addr-1
  - downloads :: random downloads from the internet
  - media :: entertainment
    - music
    - vids
    - pics
    - wallpaper
  - t :: for random ad-hoc tests requiring directories/files; e.g. trying things with git
  - repo :: where to put bare git repositories for private projects (i.e. ~work-for/me/)
  - .password-store :: (for `pass` password manager)
    - type-1 :: ssh, web, mail (for smtp and imap), etc.
      - host-1 :: news.ycombinator.com, etc.
        - account-1 :: jol, jolmg, etc.
Not all folders are available on all machines, like ~/repo is on a private server, but they follow the same structure.

mcaruso 1 day ago 2 replies      
One thing I do that I've found to be pretty helpful is to prefix files/directories with a number or date, for sorting. Some things are naturally ordered by date, for example events. So I might have a directory "my-company/archive", where each item is named "20170621_some-event".

Other things are better sorted by category or topic. For tools or programming languages I'm researching I might have a directory with items "01_some-language", "02_setup", "10_type-system", "20_ecosystem", etc.
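The date-prefix convention is easy to automate; a minimal sketch (the function name is hypothetical):

```shell
# Prefix a file's name with today's date (YYYYMMDD_) so that a plain
# alphabetical listing is also chronological.
date_prefix() {
    f=$1
    mv "$f" "$(dirname "$f")/$(date +%Y%m%d)_$(basename "$f")"
}
```

Usage: `date_prefix my-company/archive/some-event` renames the file in place. The zero-padded YYYYMMDD form is what makes lexical sort equal date sort.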

lkurusa 1 day ago 0 replies      
Roughly this scheme:

~/dev for any personal project work

~/$COMPANY for any professional work I do for $COMPANY

~/teaching for teaching stuff

~/research for academic research (it's a big mess unfortunately)

~/icl for school related projects (where "icl" is Imperial College London)

For my PDFs I use Mendeley to organize them and have them available everywhere along with my annotations.

I store my books in iBooks and on Google Drive in a scheme roughly like: /books/$topic/$subtopic

Organizing your files is usually just commitment; move files off ~/Downloads as soon as you can :-)

Animats 1 day ago 0 replies      

with each project under Git. Layouts for Go, Rust, ROS, and KiCAD are forced by the tools. Python isn't as picky.

Web sites are

  sitename/
    info - login data for site, domains, etc.
    site - what gets pushed to the server
    work - other stuff not pushed to server
with each site under version control.

two2two 1 day ago 0 replies      
One external RAID (mirrored) that holds information only necessary for when I'm working at my desk. Within that drive I have an archive folder with past files that are rarely if ever needed. The folder structure is labeled broadly, such as "documents" and "media", with more specific folders within. At the file level I usually put a date at the beginning of the name, going from largest to smallest (2017-6-21_filename). Sensitive documents I put in encrypted DMG files, using the same organization structure.

As for all "working" documents, they're local to my machine under a documents or project folder. The documents folder is synced to all my devices and looks the same everywhere with a similar organization structure as my external drive. My projects folder is only local to my machine, which is a portable, and contains all the documents needed for that project.

TL;DR Shallow folder structure with dates at the beginning of files essentially.

_mjk 21 hours ago 0 replies      
I use `mess` [1]. Short description: new stuff that is not filed away instantly goes into a folder "current", linked to the youngest folder in a tree (mess_root > year > week). If needed at a later time, it gets filed accordingly; otherwise old folders are purged when disk space is low. Taking it a step further: syncing everything across work and personal machines using `syncthing`.

[1] http://chneukirchen.org/blog/archive/2006/01/keeping-your-ho...
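The core of the `mess` idea fits in a few lines. This is my approximation of the linked script, not a copy of it: it keys folders by ISO week number and assumes `ln -sfn` behaves as in GNU coreutils.

```shell
# Point <root>/current at a per-week folder under <root>/<year>/<week>.
# Old weeks can later be purged wholesale when disk space runs low.
mess() {
    root=$1
    dir=$root/$(date +%Y/%V)   # %V = ISO week number
    mkdir -p "$dir"
    ln -sfn "$dir" "$root/current"
}
```

Run it from a login script so "current" silently rolls over each week while the old weeks linger, untouched, until you need the space.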

majewsky 23 hours ago 0 replies      
My file layout is quite uninteresting. The most noteworthy thing is that I have an additional toplevel directory /x/ where I keep all the stuff that would otherwise be in $HOME, but which I don't want to put in $HOME because it doesn't need to be backed up.

- /x/src contains all Git repos that are pushed somewhere. Structure is the same as wanted by Go (i.e., GOPATH=/x/). I have a helper script and accompanying shell function `cg` (cd to git repo) where I give a Git repo URL and it puts me in the repo directory below /x/src, possibly cloning the repo from that URL if I don't have it locally yet.

  $ pwd
  /home/username
  $ cg gh:foo/bar  # understands Git URL aliases, too
  $ pwd
  /x/src/github.com/foo/bar
As I said, that's not in the backup, but my helper script maintains an index of checked-out repos in my home directory, so that I can quickly restore all checkouts if I ever have to reinstall.
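A `cg` helper along these lines could be sketched as follows. This is my reconstruction from the description; the real script also expands aliases like `gh:` and maintains a checkout index, both omitted here.

```shell
# Map a clone URL to a GOPATH-style checkout under /x/src, cloning on
# first use, then cd into it.
repo_path() {
    url=$1
    p=${url#*://}   # strip the scheme: https://github.com/foo/bar.git -> github.com/foo/bar.git
    p=${p%.git}     # strip a trailing .git
    echo "/x/src/$p"
}
cg() {
    dir=$(repo_path "$1")
    [ -d "$dir" ] || git clone "$1" "$dir"
    cd "$dir"
}
```

`cg` must be a shell function (not a script) so that the final `cd` affects the interactive shell.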

- /x/bin is $GOBIN, i.e. where `go install` puts things, and thus also in my PATH. Similar role to /usr/local/bin, but user-writable.

- /x/steam has my Steam library.

- /x/build is a location where CMake can put build artifacts when it does an out-of-source build. It mimics the structure of the filesystem, but with /x/build prefixed. For example, if I have a source tree that uses CMake checked out at /home/username/foo/bar, then the build directory will be at /x/build/home/username/foo/bar. I have a `cd` hook that sets $B to the build directory for $PWD, and $S to the source directory for $PWD whenever I change directories, so I can flip between source and build directory with `cd $B` and `cd $S`.

- /x/scratch contains random junk that programs expect to be in my $HOME, but which I don't want to backup. For example, many programs use ~/.cache, but I don't want to backup that, so ~/.cache is a symlink to the directory /x/scratch/.cache here.
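The $B/$S flip described above might look roughly like this — a sketch under my assumptions, not majewsky's actual hook:

```shell
# Derive the out-of-source build directory by prefixing /x/build to the
# current path, and recover the source directory by stripping it again.
set_build_vars() {
    case $PWD in
        /x/build/*) S=${PWD#/x/build}; B=$PWD ;;
        *)          S=$PWD; B=/x/build$PWD ;;
    esac
    export S B
}
# In bash, run it on every directory change, e.g.:
#   cd() { builtin cd "$@" && set_build_vars; }
```

With that in place, `cd $B` and `cd $S` flip between the two trees from anywhere inside either one.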

sriku 1 day ago 0 replies      
If you're particularly asking about reference material that you take notes on and would like to search, retrieve, and produce reports from, Zotero might work for you. I have many years of research notes in it. It's a hyper-bookmarking tool that can keep snapshots of web pages, keep PDFs and other "attachments" within saved articles, and lets you tag and organize them, with search capabilities.

Outside of that scope, my files reside randomly somewhere in the ~/Documents folder (I use a mac) and I rely on spotlight to find the item I need. It's not super great but is workable often enough.

It's not a silly question!

edit: I've been trying to find a multi-disk solution and haven't had much success with an easy enough to use tool. I use git-annex for this and it helps to some extent. I've also tried Camlistore, which is promising, but has a long way to go.

xymaxim 1 day ago 0 replies      
Another option is to have a look at a tag-based filesystem instead of hierarchical ones, to organize everything semantically. I've been using Tagsistant (there are other options) for a couple of months now and I'm almost happy; I'm more satisfied with the idea itself and its potential.
richardknop 1 day ago 0 replies      
I mostly work with Golang so usually all work related stuff will be in my GOPATH in ~/code/go/src/github.com/company-name/.

Non-Golang code goes to ~/code, sometimes ~/code/company-name, but I also have a couple of ad hoc codebases spread around in different places on my filesystem.

So it is a bit disorganized. However last few years I have rarely ever needed to cd outside of ~/code/go.

Some legacy codebases I worked on (and still need to contribute to from time to time) can be in most random places as it took some effort and time to configure local environment of some of these beasts to be working properly (and they depend on stuff like Apache vhosts) so I am too afraid to move those to ~/code as I might break my local environment.

romdev 19 hours ago 0 replies      

Filename preserved, ordered by date or grouped in arbitrary functional folders






Primary Artist

  YYYY.AlbumName  (Keeps albums in date order)
    AlbumName Track# Title.mp3  (truncates sensibly on a car stereo)

YYYY-MM-DD.Event Description (DD is optional)


scripts - reusable across clients




  source code
  documents
Utils (single-executable files that don't require an install)

I use Beyond Compare as my primary file manager at home and work. Folder comparison is the easiest way to know if a file copy fully completed. Multi-threaded move/copy is nice too.

ktopaz 1 day ago 0 replies      
I have my files pseudo-organized, meaning I kind of try to keep them where they should be logically, but since this varies a lot, they're not really organized. The thing is, I use "Everything", a free instant file search tool from voidtools. It is blazingly fast: just start typing and it finds files as you type. It uses the NTFS file system's existing index (Windows only, sorry everyone else) to perform instant searches. It is hands down the fastest file search tool I have ever encountered: files are literally found while you type their names, without waiting even a millisecond.

So, no organization (the OCD part of me hates this), but I always find my files in an instant, no matter where I left them.

bballer 1 day ago 1 reply      
I try not to over think it, just:

  ~/$MAJOR_TOPIC
  |
  |--- ./$MORE_SPECIFIC
  |    |--- ./$MORE_SPECIFIC
  |    |    |--- ./general-file.type
  |    |
  |    ./general-file.type
  |
  |--- ./$MORE_SPECIFIC
  |--- ./general-file.type

As you find yourself collecting more general files under a directory that can be logically grouped, create a new directory and move them to it.

Also keep all your directories in the same naming convention (idk maybe I'm just OCD)

oelmekki 1 day ago 1 reply      
Beside the usual `Images`, `Videos`, and `code` directories, the single most important directory on my system is `~/flash` (as in: flash memory). This is where my browser downloads files and where I create "daily" files, which I quickly remove.

This is a directory that can be emptied at any moment without fear of losing anything important, and which helps me keep the rest of my fs clean. Basically `/tmp` for the user.
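Such a user-level `/tmp` stays trustworthy only if it actually gets emptied now and then; a cron-able sketch (the function name and the 14-day threshold are my assumptions):

```shell
# Delete anything in a ~/flash-style scratch directory untouched for
# 14 days. -mindepth 1 protects the directory itself.
clean_flash() {
    find "$1" -mindepth 1 -mtime +14 -delete
}
```

A crontab line like `0 4 * * 0 find "$HOME/flash" -mindepth 1 -mtime +14 -delete` would run the same sweep weekly.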

xmpir 1 day ago 0 replies      
Most of my files stay in the download folder. If I think I will need them at a later stage, I upload them to my Google Drive. Google is quite good at searching stuff; for me that also works for personal files. I have probably 100 e-books that are on my reading list and will never get read by me...
codemac 1 day ago 0 replies      
recoll has worked great for a document index.


I also recommend calibre for e-books, but I never got to the "document store" stage that I think some people have.

mayneack 1 day ago 0 replies      
Symlinking ~/Downloads and ~/Documents into ~/Dropbox is my only interesting upgrade. Across my various devices I have different things selectively synced. Large media files are the only things that don't live in Dropbox in some way or another. It's pretty convenient for mobile access (everything accessible from web/mobile). I've done some worrying about sensitive documents and such, but most of it is also present in my email, so I think I lost that battle already. It also means there's very little downside to wiping my HD entirely if I want to try a different OS (which I used to do frequently, but I ended up settling on vanilla Ubuntu).
raintrees 1 day ago 0 replies      
-clients - For client specific work

-devel - For development/research

  -Language/technology
    -Specific research case
And I built my own bookmarking tool for references/citations.

joshstrange 1 day ago 0 replies      
Calibre may be a little rough looking but it's very powerful and it's what I use.

Edit: Also you might want to make a small title edit s/files/ebooks unless you are inquiring about other types of files as well.

house9-2 1 day ago 0 replies      



When reading for pleasure I typically read paper; I try to limit screen time if possible.

rajadigopula 1 day ago 0 replies      
If it's for e-books only, you can try Adobe Digital Editions or Calibre. You can tag and create collections, with search functionality on most formats.
gagabity 1 day ago 0 replies      
Dump everything on desktop or downloads folder then use Void Tools Everything to find what I need.
cristaloleg 1 day ago 0 replies      
~/work - everything related to job

~/github - just cloned repos

~/fork - everything forked

~/pdf - all science papers

eternalnovice 8 hours ago 0 replies      
Organizing my files has been an obsession of mine for many years, so I've evolved what I think is a very effective system that combines the advantages of hierarchical organization and tagging. I use 3-character tags as part of every file's name. A prefix of tags provides a label that conveys the file's place in the hierarchy of all my files. To illustrate, here's the name of a text file that archives text-based communications I've had regarding a software project called 'Do, Too':

- pjt>sfw>doToo>cmm

'pjt' is my tag for projects

'sfw' is my tag for software and computer science

'doToo' is the name of this software project

'cmm' is my tag for interpersonal communications

Projects (tagged with 'pjt') is one of my five broad categories of files, with the others being Personal ('prs'), Recreation ('rcn'), Study ('sdg'), and Work ('wrk'). All files fall into one of these categories, and thus all file names begin with one of the five tags mentioned. After that tag, I use the '>' symbol to indicate the following tag(s) is/are subcategories.

Any tags other than those for the main categories might follow, as 'sfw' did in the example above. This same tag 'sfw' is also used for files in the Personal category, for files related to software that I use personally--for example:

- prs>sfw>nameMangler@nts

Here, NameMangler is the name of the Mac application I use to batch-modify file names when I'm applying tags to new files. '@nts' is my tag for files containing notes. I also have many files whose names begin with 'sdg>sfw', and these are computer science or programming-related materials that I'm studying or studied previously and wanted to archive.

A weakness of hierarchical organization is that it makes it difficult to handle files that could be reasonably placed in two or more positions in the hierarchy. I handle this scenario through the use of tag suffixes. These are just '|'-delimited lists of tags that do not appear in the prefix identifier, but that are still necessary to convey the content of the file adequately. So for example, say I have a PDF of George Orwell's essay "Politics and the English Language":

- sdg>lng>politicsAndTheEnglishLanguage_orwell9=wrt|wrk|tfl|georgeOrwell

The suffix of tags begins with '=' to separate it from the rest of the file name. A couple of other features are shown in this file name. I use '_' to separate the prefix tags from the original name of the file ('orwell9' in this case) if it came from an outside source. I'm an English teacher and use this essay in class, and that's why the tags 'wrk' for Work and 'tfl' for 'Teaching English as a Foreign Language' appear. 'wrt' is my tag for 'writing', since Orwell's essay is also about writing. The tag 'georgeOrwell' is not strictly necessary since searching for "George Orwell" will pick up the name in the text content of the PDF, but I still like to add a tag to signal that the file is related to a person or subject that I'm particularly interested in. Adding a camel-cased tag like this also has the advantage that I can specifically search for the tag while excluding files that happen to contain the words 'George' and 'Orwell' without being particularly about or by him.

That last file name example also illustrates what I find to be a big advantage of this system: it reduces some of the mental overhead of classifying the file. I could have called the file 'wrk>tfl>politicsAndTheEnglishLanguage=sdg|wrt|lng|georgeOrwell', but instead of having to think about whether it should go in the "English teaching work-related stuff" slot or the "stuff about language that I can learn about" slot, I can just choose one more or less arbitrarily, and then add the tags that would have made up the tag prefix that I didn't choose as a suffix.

There's actually a lot more to the system, but those are the basics. Hope you find it helpful in some way.
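Filename-borne tags like these are searchable with ordinary tools. A sketch of a tag lookup that only matches a tag when it is delimited by the scheme's separators (the function name and the exact delimiter set are my guesses from the examples above):

```shell
# Find files carrying a tag, whether it appears in the '>'-separated
# prefix or the '='/'|'-separated suffix, without matching substrings
# of longer tags or ordinary words.
find_by_tag() {
    root=$1; tag=$2
    find "$root" -type f | grep -E "(^|[>=|/])$tag([>=|_.]|$)"
}
```

Delimiting the match is what makes a search for 'wrt' skip a file that merely contains those letters inside a word.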

graycat 1 day ago 0 replies      
From a recent backup, there are

417,361 files

in my main collection of files for my startup, computing, applied math, etc.

All those files are well enough organized.

Here's how I do it and how I do related work more generally (I've used the techniques for years, and they are all well tested).

(1) Principle 1: For the relevant file names, information, indices, pointers, abstracts, keywords, etc., to the greatest extent possible, stay with the old 8-bit ASCII character set in simple text files easy to read by both humans and simple software.

(2) Principle 2: Generally use the hierarchy of the hierarchical file system, e.g., Microsoft's Windows HPFS (high performance file system), as the basis (framework) for a taxonomic hierarchy of the topics, subjects, etc. of the contents of the files.

(3) To the greatest extent possible, I do all reading and writing of the files using just my favorite programmable text editor KEdit, a PC version of the editor XEDIT written by an IBM guy in Paris for the IBM VM/CMS system. The macro language is Rexx, from Mike Cowlishaw of IBM in England. Rexx is an especially well designed language for string manipulation as needed in scripting and editing.

(4) For more, at times make crucial use ofOpen Object Rexx, especially its functionto generate a list of directory names,with standard details on each directory,of all the names in one directory subtree.

(5) For each directory x, have in that directory a file x.DOC that has whatever notes are appropriate for good descriptions of the files, e.g., abstracts and keywords of the content, the source of the file, e.g., a URL, etc. Here the file type of an x.DOC file is just simple ASCII text and is not a Microsoft Word document.

There are some obvious, minor exceptions, that is, directories with no file named x.DOC from me. E.g., directories created just for the files used by a Web page when downloading a Web page are exceptions and have no x.DOC file.

(6) Use Open Object Rexx for scripts for more on the contents of the file system. E.g., I have a script that for a current directory x displays a list of the (immediate) subdirectories of x and the size of all the files in the subtree rooted at that subdirectory. So, for all the space used by the subtree rooted at x, I get a list of where that space is used by the immediate subdirectories of x.
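
The per-subdirectory space report described in (6) is straightforward to sketch; graycat's real script is Open Object Rexx, but the idea in Python terms looks like this:

```python
import os

# Sketch of the report from (6): for each immediate subdirectory of a
# directory, total the bytes of all files in the subtree rooted there.
# (An illustration only; the original is an Open Object Rexx script.)

def subtree_sizes(root):
    sizes = {}
    for entry in os.scandir(root):
        if entry.is_dir(follow_symlinks=False):
            total = 0
            for dirpath, _dirs, files in os.walk(entry.path):
                for name in files:
                    try:
                        total += os.path.getsize(os.path.join(dirpath, name))
                    except OSError:
                        pass          # skip files that vanish or are unreadable
            sizes[entry.name] = total
    return sizes
```

Printing the returned dict sorted by size descending gives the "where did the space go" view the comment describes.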

(7) For file copying, I use Rexx scripts that call the Windows commands COPY or XCOPY, called with carefully selected options. E.g., I do full and incremental backups of my work using scripts based on XCOPY.

For backup or restore of the files on a bootable partition, I use the Windows program NTBACKUP, which can back up a bootable partition while it is running.

(8) When looking at or manipulating the files in a directory, I make heavy use of the DIR (directory) command of KEdit. The resulting list is terrific, and common operations on such files can be done with commands to KEdit (e.g., sort the list), select lines from the list (say, all files x.HTM), delete lines from the list, copy lines from the list to another file, use short macros written in Kexx (the KEdit version of Rexx), often from just a single keystroke to KEdit, to do other common tasks, e.g., run Adobe's Acrobat on an x.PDF file, have Firefox display an x.HTM file.

More generally, with one keystroke, have Firefox display a Web page where the URL is the current line in KEdit, etc.

I wrote my own e-mail client software. Then given the date header line of an e-mail message, one keystroke displays the e-mail message (or warns that the date line is not unique, but it always has been).

So, I get to use e-mail message date lines as 'links' in other files. So, if some file T1 has some notes about some subject and some e-mail message is relevant, then, sure, in file T1 just have the date line as a link.

This little system worked great until I converted to Microsoft's Outlook 2003. If I could find the format of the files Outlook writes, I'd implement the feature again.

(9) For writing software, I type only into KEdit.

Once I tried Microsoft's Visual Studio and for a first project, before I'd typed anything particular to the project, I got 50 MB or so of files nearly none of which I understood. That meant that whenever anything went wrong, for a solution I'd have to do mud wrestling with at least 50 MB of files I didn't understand; moreover, understanding the files would likely have been a long side project. No thanks.

E.g., my startup needs some software, and I designed and wrote that software. Since I wrote the software in Microsoft's Visual Basic .NET, the software is in just simple ASCII files with file type VB.

There are 24,000 programming language statements.

So, there are about 76,000 lines of comments for documentation, which is IMPORTANT.

So, all the typing was done into KEdit, and there are several KEdit macros that help with the typing.

In particular, for documentation of the software I'm using -- VB.NET, ASP.NET, ADO.NET, SQL Server, IIS, etc. -- I have 5000+ Web pages of documentation, from Microsoft's MSDN, my own notes, and elsewhere.

So, at some point in the code where some documentation is needed for clarity for the code, I have links to my documentation collection, each link with the title of the documentation. Then one keystroke in KEdit will display the link, typically have Firefox open the file of the MSDN HTML documentation.

Works great.

The documentation is in four directories, one for each of VB, ASP, SQL, and Windows. Each directory has a file that describes each of the files of documentation in that directory. Each description has the title of the documentation, the URL of the source (if from the Internet, which is the usual case), the tree name of the documentation in my file system, an abstract of the documentation, relevant keywords, and sometimes some notes of mine. KEdit keyword searches on this file (one for each of the four directories) are quite effective.

(10) Environment Variables

I use Windows environment variables and the Windows system clipboard to make a lot of common tasks easier.

E.g., the collection of my files of documentation of Visual Basic is in my directory


Okay, on the command line of a console window, I can type


and then have that directory current.

Here 'G' abbreviates 'go to'!

So, to command G, argument 'VB' acts like a short nickname for directory


Actually that means that I have -- established when the system boots -- a Windows environment variable MARK.VB with value


I have about 40 such MARK.x environment variables.

So, sure, I could use the usual Windows tree walking commands to navigate to directory


but typing


is a lot faster. So, such nicknames are justified for frequently used directories fairly deep in the directory tree.
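
As a sketch of the lookup that such a 'G' command does (the MARK.x variable naming follows the comment; the Python helper itself is my own, and the real command is a Rexx script because a child process cannot change its parent shell's current directory):

```python
import os

# Resolve a directory nickname via a MARK.<nick> environment variable,
# mimicking the 'G' command described above. Variable naming per the
# comment; the helper function itself is a sketch of my own.

def resolve_nickname(nick):
    """Map a short nickname like 'VB' to the directory stored in MARK.VB."""
    try:
        return os.environ['MARK.' + nick.upper()]
    except KeyError:
        raise KeyError('no MARK.%s environment variable set' % nick.upper())
```

A real 'G' would then change to that directory; doing the `chdir` in the invoking shell is exactly why the original lives as an editor macro/script rather than a standalone program.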

Environment variables



are used by some other programs, especially my scripts that call COPY and XCOPY.

So, to copy from directory A to directory B, I navigate to directory A and type


which sets environment variable


to the directory tree name of directory A. Similarly for directory B.

Then my script


takes as argument the file name and does the copy.

My script


takes two arguments, the file name of the source and the file name to be used for the copy.
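
The actual script and environment-variable names are elided in the comment, so FROMDIR/TODIR and the function names below are stand-ins of my own; the marked-directory copy idea in Python terms:

```python
import os
import shutil

# Sketch of the marked-directory copy scripts described above. The
# real names are elided in the comment, so FROMDIR/TODIR and these
# function names are hypothetical stand-ins.

def copy_marked(filename):
    """Copy filename from the FROMDIR directory to TODIR, same name."""
    shutil.copy2(os.path.join(os.environ['FROMDIR'], filename),
                 os.path.join(os.environ['TODIR'], filename))

def copy_marked_as(src_name, dst_name):
    """The two-argument variant: give the copy a different name."""
    shutil.copy2(os.path.join(os.environ['FROMDIR'], src_name),
                 os.path.join(os.environ['TODIR'], dst_name))
```

`shutil.copy2` preserves timestamps, roughly in the spirit of XCOPY with carefully chosen options.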

I have about 200 KEdit macros and about 200 Rexx scripts. They are crucial tools for me.

(11) FACTS

About 12 years ago I started a file FACTS.DAT. The file now has 74,317 lines, is


bytes long, and has 4,017 facts.

Each such fact is just a short note, sure, on average

2,268,607 / 4,017 = 565

bytes long and

74,317 / 4,017 = 18.5

lines long.

And that is about

12 * 365 / 4,017 = 1.09

that is, an average of right at one new fact a day.

Each new fact has its time and date, a list of keywords, and is entered at the end of the file.

The file is easily used via KEdit and a few simple macros.
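
A minimal Python sketch of such a facts file (the exact on-disk layout isn't given in the comment, so the record format here -- a timestamp line, a keyword line, the note, and a blank separator line -- is my own guess):

```python
import time

# Sketch of a FACTS.DAT-style facts file: each fact carries its time
# and date plus keywords and is appended at the end. The record layout
# (stamp line, keyword line, note, blank separator) is an assumption.

def add_fact(path, keywords, note):
    stamp = time.strftime('%Y-%m-%d %H:%M:%S')
    with open(path, 'a', encoding='ascii') as f:
        f.write('%s\n%s\n%s\n\n' % (stamp, ' '.join(keywords), note))

def find_facts(path, keyword):
    """Return the blank-line-separated records containing keyword."""
    with open(path, encoding='ascii') as f:
        records = f.read().split('\n\n')
    return [r for r in records if keyword in r]
```

Plain ASCII and append-only keeps the file easy to search and hard to corrupt, which matches Principle 1 above.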

I have a little Rexx script to run KEdit on the file FACTS.DAT. If KEdit is already running on that file, then the script notices that and just brings to the top of the Z-order that existing instance of KEdit editing the file -- this way I get single-threaded access to the file.

So, such facts include phone numbers, mailing addresses, e-mail addresses, user IDs, passwords, details for multi-factor authentication, TODO list items, and other little facts about whatever I want help remembering.

No, I don't need special software to help me manage user IDs and passwords.

Well, there is a problem with the taxonomic hierarchy: for some files, it might be ambiguous which directory they should be in. Yes, some hierarchical file systems permit a file to be listed in more than one directory, but AFAIK the Microsoft HPFS file system does not.

So, when it appears that there is some ambiguity in what directory a new file should go, I use the x.DOC files for those directories to enter relevant notes.

Also my file FACTS.DAT may have such notes.

Well, (1)-(11) is how I do it!

frik 1 day ago 0 replies      
For ebooks I created folders for main-categories and some sub-categories (inspired by Amazon.com or some other ebook shop structure).

For photos folders per device/year/month.

For Office documents, prepending the date using the ISO format (2017-06-21 or 170621) works great (for sharing with others over various channels like mail/chat/fileserver/cloud/etc.).

guilhas 1 day ago 0 replies      
Zim wiki
Ask HN: What books are you reading?
13 points by curiousgal  21 hours ago   19 comments top 16
jasonkester 5 hours ago 0 replies      
Seul sur Mars (The Martian, in French)

It is really boosting my understanding of the French language, and giving me more confidence to speak it.

It's a simple story that's easy to follow, especially having read the book in English and seen the film a couple of times. And really, how lost can you get? If you can't follow a paragraph or two, chances are he'll still be stuck on Mars for a while and you won't have missed much.

It's written in an informal, conversational style, using language that real people might use. I find myself reading a phrase that translates back to a saying I've used in English. Ah, looks like they use that in French too. I'll add it to the repertoire.

I can pick it up after a while off and quickly get back into it without explanation. Hmm... this looks like the part where the guy is stuck on Mars...

And as a bonus, it's kinda hard work to read in a foreign language, so if I pick it up in bed it's guaranteed to put me to sleep inside of half an hour.

Highly recommended.

JSeymourATL 6 hours ago 0 replies      
Raising the Bar: Integrity and Passion in Life and Business: The Story of Clif Bar & Co.

Just started this book last night. The story begins as the founder of Clif Bar walks away from selling his company and a $40M personal pay-out. Big idea so far: your business is an ultimate form of self-expression. > https://www.goodreads.com/book/show/29691.Raising_the_Bar

comsci-bro 17 hours ago 0 replies      
The PhD Grind: http://pgbovine.net/PhD-memoir/pguo-PhD-grind.pdf

It is a wonderfully written memoir that perfectly details the grad school experience and also includes some helpful notes from the author. I'll be graduating next year (bachelor's in CS), and my dad asked me if I wanted to enter grad school. The book sure did add some fuel to the fire.

Jtsummers 17 hours ago 0 replies      
Specifying Systems, http://lamport.azurewebsites.net/tla/book.html

Engineering a Safer World, https://mitpress.mit.edu/books/engineering-safer-world

Software Specification Methods, https://www.amazon.com/Software-Specification-Methods-Henri-... (also available through Safari Books Online, at least at my office)

Read most of the third one this week, a useful comparison of the various approaches. My objective is to understand how to better produce formal (or more formal) specifications. Either for whole systems or just for significant or critical portions of them.

thakobyan 19 hours ago 0 replies      
Currently I'm listening to "Personal MBA" audiobook and loving it so far. I'm not the biggest fan of business books but decided to give this one a try to learn a bit more about marketing and sales.

Here are the books I've read and want to read: https://booknshelf.com/@tigran/shelves

bcbrown 21 hours ago 1 reply      
Mind And Nature - A Necessary Unity, by Gregory Bateson, and I Am A Strange Loop, by Douglas Hofstadter. They're a great combination, as they're both attempts to define the concept of "mind" through patterns. Bateson is one of the early thinkers in the field of Cybernetics, which I've been meaning to learn more about.

Here's my (unfinished) reviews of the books I've read so far this year: https://github.com/bcbrown/bookreviews/tree/master/2017. At the end of the year I'll flesh them out a little more.

astrodev 19 hours ago 0 replies      
Peter Frankopan, The Silk Roads - world history from an Asiacentric perspective.

Harold Coyle, Team Yankee - WW3 in Europe in the 1980s from the perspective of a tank company commander. Poorly written, in my opinion, but the accurate (or so I hope) descriptions of the military tactics and equipment almost make up for it.

James Gleick, The Information: A History, A Theory, A Flood - excellent book about the history of information.

gubsz 13 hours ago 0 replies      
Into Thin Air by Jon Krakauer.

It goes into detail about the Mount Everest disaster in the 90s.

delgadillojuanm 21 hours ago 0 replies      
I'm reading the Deep Learning book written by Ian Goodfellow, Yoshua Bengio, and Aaron Courville. Also "Intercom on Starting Up".
house9-2 19 hours ago 0 replies      
Stephen King

The Dark Tower II: The Drawing of the Three

Wanted to read the first one before the movie came out, now I am hooked...

alltakendamned 21 hours ago 0 replies      
Windows Internals 7th Ed.

Seveneves, Neal Stephenson

Astrophysics for people in a hurry, Neil De Grasse Tyson

elyrly 21 hours ago 0 replies      
Just finished The cartel today, next Algorithms to live by
stevekemp 16 hours ago 0 replies      
The Chronicles of Amber, again.
miguelrochefort 18 hours ago 1 reply      
"Getting Things Done" by David Allen
sidcool 16 hours ago 0 replies      
Essential AngularJS
SirLJ 19 hours ago 0 replies      
Right now I am reading again "More Money Than God"


Ask HN: Want to study SSL, HTTPS, and the works. Where to start?
14 points by surds  1 day ago   9 comments top 6
tialaramex 3 hours ago 0 replies      
I recommend beginning at the fundamentals. For example, here's a video that walks through Diffie-Hellman so that anybody can follow. You can probably sprint through it, but by taking it slow they avoid accidentally skipping anything important.


Grasping the fundamentals means that when it comes to policy decisions (e.g. in the management of certificates) you can see what the consequences of a particular decision are, rather than just hoping that whoever proposed that policy knew what they were doing.
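
The arithmetic such a Diffie-Hellman walkthrough covers fits in a few lines. Here is a toy exchange with deliberately tiny numbers, purely to show the mechanics; real deployments use large primes (or elliptic curves), and these parameters are hopelessly insecure:

```python
# Toy Diffie-Hellman key exchange with tiny, insecure parameters,
# purely to illustrate the arithmetic from the video mentioned above.

p, g = 23, 5                  # public: prime modulus and generator

a, b = 6, 15                  # Alice's and Bob's private exponents

A = pow(g, a, p)              # Alice publishes g^a mod p -> 8
B = pow(g, b, p)              # Bob publishes g^b mod p   -> 19

shared_alice = pow(B, a, p)   # Alice computes (g^b)^a mod p
shared_bob   = pow(A, b, p)   # Bob computes (g^a)^b mod p

assert shared_alice == shared_bob == 2   # same secret, never transmitted
```

The point the fundamentals make clear: only A and B cross the wire, and recovering the shared secret from them requires solving a discrete logarithm, which is infeasible at real-world sizes.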

For example, I think a lot of people today use Certificate Signing Request (CSR) files without understanding them at all. But once you have a grounding in the underlying elements you can see at once what the CSR does, and why it's necessary without needing to have that spelled out separately.

Or another example, understanding what was and was not risky as a result of the known weakness of SHA-1. I saw a lot of scare-mongering by security people who saw the SHA-1 weakness as somehow meaning impossible things were now likely, but it only affected an important but quite narrow type of usage, people who understood that could make better, more careful decisions without putting anybody at risk.

ZoFreX 1 day ago 1 reply      
I'm more of a learning-by-doing person. Here are three exercises that will teach you a lot:

1) https://www.ssllabs.com/ssltest/ - try to get an A+. It's not important in most cases in practice, but you'll learn a lot getting there. Their rating guide is also handy: https://github.com/ssllabs/research/wiki/SSL-Server-Rating-G...

2) MITM yourself. I've done this using Charles, but you can do it with any HTTP proxy that lets you rewrite requests on the fly - I hear Fiddler is popular. MITM yourself and try changing the page for an HTTP site. Then try doing it on a website that is part HTTP, part HTTPS (e.g. HTTPS for the login page) and "steal your password". Try again on a website that redirects from HTTP to HTTPS using a 301 but does not have HSTS. Finally try on a site with HSTS (nb: you won't manage this one). Congratulations, you now truly understand why HSTS is important and what it does better than most people!

3) Set up HTTPS on a website. You've probably already done this. In which case maybe do it with LetsEncrypt for an extra challenge?
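
As a companion to these exercises, here is a small Python sketch (standard library only) that fetches a host's Strict-Transport-Security header and parses its max-age; the parsing helper is deliberately minimal:

```python
import http.client

# Fetch a host's HTTPS response headers and pull out the HSTS header,
# useful for checking your own setup in the exercises above.

def hsts_header(host):
    conn = http.client.HTTPSConnection(host, timeout=10)
    try:
        conn.request('HEAD', '/')
        return conn.getresponse().getheader('Strict-Transport-Security')
    finally:
        conn.close()

def hsts_max_age(value):
    """Extract max-age from a value like 'max-age=31536000; includeSubDomains'."""
    for part in value.split(';'):
        key, _, val = part.strip().partition('=')
        if key.lower() == 'max-age':
            return int(val)
    return None
```

`hsts_header` returns None when the site sends no HSTS at all, which is exactly the case the MITM exercise exploits.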

AaronSmith 1 day ago 0 replies      
To study SSL, HTTPS, and CAs, including installation and management of SSL certificates, you can consider the following references:




moondev 1 day ago 0 replies      
I learn best by example, and I have learned so much just by evaluating and implementing hashicorp vault: https://www.vaultproject.io/docs/secrets/pki/index.html

It doesn't hold your hand at all, but it gives you a nice "task" to accomplish. Reading up on all the terminology and exactly how and why it works was really fun.

schoen 1 day ago 1 reply      
I hear good things about Bulletproof SSL and TLS by Ivan Ristić:


There was also a nice web page presenting all kinds of PKI concepts that I came across a few years ago but haven't been able to find since then. :-(

indescions_2017 1 day ago 1 reply      
Check out High Performance Browser Networking. Ilya Grigorik is a very smart cookie and will take you right up to the present day state-of-the-art:


Ask HN: As a lazy but concerned user, how do you run your own email server?
13 points by Cilvic  1 day ago   10 comments top 5
danieltillett 1 day ago 2 replies      
The problem is not setting up your own email server (this is relatively easy), it is getting all your mail into other people's inboxes. Basically the big players these days (I am looking at you Microsoft) just treat any mail coming from a private server as spam. Even more frustratingly they don't do it consistently, just frequently enough that you can't rely on anyone getting your email.

After running my own email server for 15 years I gave up a couple of years ago and paid for someone else to solve the nightmare of dealing with the big email gatekeepers.

DamonHD 1 day ago 2 replies      
I run my own email server and have done for ~25 years, but what do you mean by 'secure'?

SMTP isn't a secure transport.

Having your email stored on someone else's computers (ie the cloud) is not necessarily 'secure'.

Having a well-constructed and well-managed host somewhere you physically control seems to me the most 'secure' arrangement, which is what I have always had. Currently for the cost of a Raspberry Pi and occasional 'apt-get update' etc.

thiagooffm 1 day ago 0 replies      
I pay for ProtonMail. Works like a charm; they even have an app. 50 bucks a year, totally worth it.
Lan 1 day ago 0 replies      
You could try an all-in-one solution like iRedMail[0] or Mail-in-a-Box[1]. Those supposedly do most of the leg work for you and set up a commonly used stack (Postfix, Dovecot, SpamAssassin, Roundcube, etc). I've never used either of them since I just install everything piecemeal, but I imagine there is an ease-of-use tradeoff compared to setting the same stack up yourself. In other words, it'll be easier to set up initially, but the downside is that you won't learn the ins and outs of the individual components. So if something breaks or you need to make an adjustment you're going to have a more difficult time at that point.

That said, there are some things you should be aware of when running a mail server:

1. You need to make sure that the IP address and domain name that SMTP is bound to is not on a blacklist[2]. You also need to consider the trustworthiness of your host because you could very well get caught in the cross-fire if one of their other customers gets them range banned. Certain cloud providers that make it very easy to change IP will more than likely have all of their addresses on some blacklist or another.

2. You also need to make sure you have matching forward (A record) and reverse (PTR record) DNS records for that IP address. This is called Forward-confirmed reverse DNS, aka FCrDNS. Many mail servers will reject email from servers that do not have or have mismatching records for FCrDNS.

3. You must set up SPF and DKIM. Many mail servers will either reject mail from servers without these, or at least weight heavily against it.

4. You probably want to make sure TLS is set up properly, otherwise your mail is going to travel the internet in plaintext.

5. The IP address you're sending from is going to start off with no reputation. The volume, type of mail, and how many people mark your mail as spam is going to decide whether other mail servers start filtering you or not. You may have no problems here. If you're unlucky, you will need to try to reach out to whichever major mail provider is filtering your mail. Many of them have a ticketing system for this, but you'll be at the mercy of whomever is working that ticket. There are also various whitelists that might be worth trying your server on. They're usually very selective and will probably reject your request.

6. You really, really need to make sure you've got your policies set up correctly because you do not want to accidentally set up an open relay[3] that will be used to spam other people.

7. Greylisting is a very, very effective means of spam filtering. The downside is that mail from new servers won't be delivered instantaneously and will instead be delivered whenever their mail server tries to deliver it again. Other than that, most spam is malformed in some way, so some basic DNS checks will filter a ton of it. There are also free RBL and DNSBL lists that will pick up the slack.

[0] http://www.iredmail.org/
[1] https://mailinabox.email/
[2] https://mxtoolbox.com/blacklists.aspx
[3] https://en.wikipedia.org/wiki/Open_mail_relay
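
The FCrDNS check from point 2 above can be sketched with the standard library (a rough illustration only: IPv4, single PTR, no timeouts; real MTAs are more careful):

```python
import socket

# Rough sketch of the Forward-confirmed reverse DNS (FCrDNS) check
# from point 2: PTR-resolve the IP to a name, then resolve that name
# forward and require the original IP among the answers.

def fcrdns_ok(ip):
    try:
        name = socket.gethostbyaddr(ip)[0]              # reverse (PTR) lookup
        forward_ips = socket.gethostbyname_ex(name)[2]  # forward (A) lookup
    except OSError:
        return False          # no PTR record, or the name doesn't resolve back
    return ip in forward_ips
```

Running this against your own server's public IP is a quick sanity check before you start worrying about reputation and blacklists.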

feborges 1 day ago 0 replies      
you don't.
Ask HN: Help Coinbase ate my $5,000.00
53 points by redm  3 days ago   20 comments top 10
1ris 2 days ago 1 reply      
Send them an invoice for USD 5,000 for the service you hoped to get but didn't receive. Include a deadline. Send it via registered mail. Probably nothing will happen.

Then go to small claims court; it will probably cost you no more than USD 200.

Most others suggest asking nicely. You already did; nothing happened, and from the history it doesn't seem like it ever will.

jamesjguthrie 3 days ago 1 reply      
Coinbase is horrible. Their customer service is non-existent. I waited 6 weeks to get my bank account verified so that I could make withdrawals to my bank. They say to send at least 6 EUR to their Estonia account. I sent 10 EUR and it never got verified. I e-mailed numerous times, tweeted their support account, and tweeted the CEO. No response from an actual person at all.

Moved my ETH and LTC elsewhere and sold to a private buyer.

sharemywin 3 days ago 1 reply      
Phone: 415-843-1515. Address: 548 Market Street #23008, San Francisco, CA 94104, United States.

Read more at CB Insights: https://www.cbinsights.com/company/coinbase

DanBC 2 days ago 0 replies      
In England you'd write a letter before action. This would set out what happened, why it's wrong, and what you want them to do to put it right. You'd give them a time limit - maybe 14 days. You'd send it by mail, by signed-for delivery.

Then if they haven't put things right in the time limit you'd go to court, which can be done online now.

I think a similar process exists where you are, and it would focus Coinbase on fixing the problem.

djb_hackernews 3 days ago 1 reply      
Email the CEO: brian@coinbase.com
mattbgates 2 days ago 0 replies      
This reminds me of a South Park episode.. https://www.youtube.com/watch?v=FJNRVptyb9Y
mdotk 2 days ago 1 reply      
At what point do you consider it theft? 6+ weeks had to be up there...
foobarbazetc 22 hours ago 0 replies      
Ask your bank to reverse the wire.
kilimchoi 2 days ago 0 replies      
This happened to my friend too. It seems like a regular thing
Zekio 3 days ago 1 reply      
could contact your bank and try to get them to reverse the transfer
Ask HN: How to contact sponsors for Open Source?
4 points by franciscop  1 day ago   2 comments top
mtmail 1 day ago 1 reply      
Have a look at https://opencollective.com/ as well.

I wonder how you will deal with taxes. You would be offering a service (a number of hours) against payment while registered as an Ltd (for-profit).

Ask HN: What level of math is required to complete the teachyourselfcs subjects?
17 points by pilatesfordogs  3 days ago   2 comments top
thephyber 2 days ago 1 reply      
I had to look up teachyourselfcs[1]. I haven't tried the resources, but the subject matter appears to cover standard undergrad BS in CS concepts.

The simplest answer is just to mirror the math courses of a Berkeley / MIT / Stanford CS degree, although that will likely be a little overkill, especially if you intend to limit yourself to a strict subset of TYCS. For example, databases and networking generally require very different math prereqs than computer graphics or machine learning.

You will need a high school level of math (grammar school math, algebra, trigonometry, basic stats) to be able to program most things.

Discrete math is used heavily in many parts of CS (it is integral to understanding how to accurately negate programming expressions).

You should probably understand calculus at a high level, although my experience with actual calculus usage in my career is zero.

Probabilities are used heavily in concepts like caching / performance, which will touch OS, arch, data structures, and likely others. For this, you should find a "statistics for engineers" type of course / book for undergrads, which may or may not make use of calculus to prove some of the statistical concepts.

Linear algebra is used heavily wherever graphics cards are used, so graphics, video, machine learning, etc. Linear algebra will likely have calculus as a prerequisite.

Modulo math is used heavily for cryptography and some data structures (hash tables). An undergrad will get a few days or weeks of this, and probably not an entire course.
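
The hash-table use of modulo is small enough to show inline (a generic sketch of my own, not tied to any particular TYCS resource): reduce a key's hash mod the bucket count to pick a slot.

```python
# The modulo step at the heart of a hash table: a key's hash reduced
# mod the number of buckets selects its slot.

def bucket_index(key, num_buckets):
    return hash(key) % num_buckets

# Minimal chained hash table built on it:
table = [[] for _ in range(8)]
for word in ['set', 'graph', 'modulo']:
    table[bucket_index(word, 8)].append(word)

def lookup(word):
    return word in table[bucket_index(word, 8)]
```

Cryptography leans on the same modular arithmetic at much larger scale (modular exponentiation over big primes), which is why the undergrad exposure is usually a few days inside other courses rather than a course of its own.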

Set theory and graph theory are used sporadically. Networking, distributed systems, etc will make use of them.

Hope this helps.

[1] https://teachyourselfcs.com/

Ask HN: Teaching Code to Someone That Asked Me for a Job
63 points by sirrele  1 day ago   60 comments top 26
grardb 1 day ago 5 replies      
>However, I made a deal with him where I would give him a little more money than what he asked for and he wouldn't need to pay me if he committed to learning how to code (with my assistance).

I would highly recommend against this.

If you lend money to someone, expect to never get it back ever, regardless of the deal you made. I know this from personal experience.

What I also know from personal experience is not to expect someone to learn how to code because you want them to. I gave my old laptop (which was still working well) to my cousin under the condition that he completed a single Udacity course on programming. I will tell you right now that he did not even come close to finishing the course.

I don't know you or your friend of course, but if I had to put my money on it, your friend is not going to learn to code, and you're not going to get your money back for a long time, if ever.

PaulStatezny 1 day ago 0 replies      
I can speak from experience.

TL/DR: Lessons from teaching a friend to code.

I have a friend who was looking for a career change. I've spent somewhere between 75 and 150 hours helping him learn to code. (Web development.) Here's what I learned in the process:

1. I highly overestimated how quickly one could learn web development with no prior programming experience. I was too optimistic, and I told him if he put serious time in, he could have the skills to build a simple web app in 6 months. He put in a more realistic amount of time than I'd suggested, balancing other areas of life. It took him closer to 18 months, including enrolling in a coding camp, which he's now about to complete.

2. Charging money for a service helps people take it seriously. At first I didn't charge him, but then I took the advice of a friend who has that philosophy. This isn't definitive evidence, but I think charging for the training helped both him and myself to take it seriously and put effort into it. He's now about to graduate from a code camp, and I'm not sure if he would have done it if not for establishing that mindset that this training is valuable. (I recognize the value of code camps is debatable.)

3. Motivation is an important (and tricky) thing. There were times where he was spending more time on video games than programming. But I remember when I was learning, and programming felt very hard and mysterious for years before I began to feel comfortable making an entire project on my own. The difficulty level was demotivating at times.

danso 1 day ago 2 replies      
I've tried to teach code to coworkers and friends for free. It doesn't really work out. Teaching programming is significantly more difficult than doing it (IMO). You'll definitely learn how hard programming is to learn, and that can be helpful. But unless someone puts real skin in the game, and/or have a real need to learn programming, they probably won't be able to muster the focus and commitment needed to reach a satisfactory skill level.
kazanz 1 day ago 1 reply      
I started a little full-stack bootcamp to give back to the community, I had about a 50% success rate teaching people from all types of backgrounds, economic and educational levels etc.

I typically charged a couple hundred bucks so that they would have "skin in the game". 100% of the people whose fee I or someone else funded failed to finish the course. Having skin in the game is absolutely critical to their success.

In this case, it seems they have nothing to lose, and I suspect if they are willing to beg you for cash, they would have no problem going elsewhere.

I suggest some sort of deal where he has to put something in other than "time and effort". Perhaps have him "pay" you in other ways, such as chores around your house. Mowing the lawn, etc.

aerovistae 1 day ago 0 replies      
I have helped many people learn to code.

I was teaching JS basics to a friend the other day who was interested, and he said something that I thought was particularly well-worded. When he stumbled across the idea of classes (as in OOP, that is), I said he should avoid them for now because it's too advanced and it would just make things confusing. I encouraged him to focus on basic functions and control flow.

He demurred, insisting that we do something "actually interesting" and had me teach him how to create a class. He likened the motivation to learning Brazilian jiu-jitsu:

> I don't want to spend hours practicing passing the guard, I want to learn to rip someone's fucking arm off.

And having learned some BJJ myself, and having experienced that exact same desire and irritation, I couldn't help but sympathize. Passing the guard is a crucial part of BJJ, but it feels quite basic and uninteresting in the beginning, much like if statements and for loops.

The point is, make sure your learner is working on something he finds interesting, like trying to put together a basic calculator app or anything concrete that he can relate to. Empty isolated exercises that aren't leading to building anything are detrimental to interest and motivation.

jrochkind1 1 day ago 1 reply      
If you don't loan money to friends, this sounds like the worst loan conditions ever, don't do it. It's like loaning money to friends but worse. "A commitment to learning how to code" is a pretty subjective loan repayment/forgiveness condition, now you're in the position of judging whether he _really_ demonstrated that commitment. This is even _more_ complicated to your friendship than a straight loan.

Who knows if he really has motivation to learn to code at all, or is just doing it to get your money. Without his own internal motivation, learning is not gonna work. But even if he has internal motivation, this is still like loaning money to friends but worse.

stevedt 1 day ago 1 reply      
Just my anecdotal experience: the people who should be coding will find a way to start coding.

The barrier to entry is so low -- talking about learning to code, not necessarily finding employment. I have tried with coworkers and family members who wanted to do it for the money and their heart just wasn't in it.

tbirrell 1 day ago 3 replies      
I assume the job is about coding, otherwise this would be a very interesting situation and will likely backfire as your friend will have no motivation to actually learn.

That said, here are some tips. Note that I have never taught someone to code, but I am familiar with mentoring someone through a skill set I already have. Consider all languages as placeholder terms for whatever stack you are going to teach, they are what I started my career in, so I'm using them here.

- Beware the curse of knowledge. Yeah, I know, XYZ is the most obvious thing in the world, but if you think back to the dawn of time, you'll remember when you didn't understand how a function worked.

- Start slow. This builds off the last one. Start with the basics. I personally learned to code in the following order. HTML > CSS > JS > jQuery > PHP > MySQL > PHP. Start easy and lay a solid foundation, then build on that.

- Teach the language before the framework. Okay, this is based off my learning experience rather than my teaching experience. However, if you want your friend to fully grasp and be able to keep going, teach them JS before jQuery and PHP before Laravel. Show them how the magic works. It will make it so much easier for them (and you) down the road.

- Have fun. I know that's a cliche ending to every list ever. But remember to make the process enjoyable. Presumably, you are a programmer, and if you are anything like me, you love what you do. Try to instill that in your teaching. It'll make your friend more likely to stay and learn without fighting the process.

Good luck!

gorpomon 1 day ago 0 replies      
Here's my two cents as someone who's taught coding and also taught literacy for adults.

My gut feeling is that it will be hard to make this work. Learning to code takes a long time (a bootcamp is 11-17 weeks at 60 hrs a week, so 660-1020 hrs). However, keep in mind that's entry-level proficiency.

I think the best outcome would be that this person learns enough to get into a bootcamp. You'd be shocked how many people apply who just aren't ready to even start. They could learn enough with you to find out whether they like it or not, and if so, from there they can take out a private loan to attend one. Keep in mind, I'm not sure how predatory (or not) the companies giving private loans to bootcamp grads are, but it is an option, and at ~$17k in cost, it's steep but not life-derailing if it doesn't work out (my guess is it's about the cost of a broken bone if you're not insured, just a guess there).

If you set the expectation that they learn to code to get a job, it probably won't happen. If you level-set that they learn enough to get into a program and OWN IT, then perhaps you'll have some success.

Just my two cents, hope this helps!

RankingMember 1 day ago 0 replies      
I think wanting to teach him to code is admirable, but I hope he actually WANTS to learn, given the difficulty and need for perseverance.

I do question the lending money thing but maybe there's some aspect of his character that makes you trust him enough to risk the relationship that we're not aware of.

quaunaut 1 day ago 1 reply      
I sorta did something like this.

A friend of mine[1] pushed and pushed and pushed me to learn Python, and at one point even paid me a small amount to make two plugins for Anki (a flashcard desktop app), which was really difficult, but they are still on my GitHub today[2].

That was enough of a push that then, I pushed as hard as I could into Django, got my first gig, and a year later switched to Ruby/Rails, and have been growing ever since.

I highly recommend this course of action. It absolutely changed my life, and brought me from barely being able to scrounge up a minimum-wage job after crossing the country to making more money than my father and being able to live effectively wherever I want, all in just 5 years. It's been incredible, and it's all thanks to his kindness toward me.

1. https://twitter.com/kfdm

2a. https://github.com/bravely/Anki-Reset-Leech

2b. https://github.com/bravely/Anki-Priority-Switcher

franciscop 1 day ago 0 replies      
I have been using codementor.io and I love it. It is similar, except that you get paid to teach instead of having to pay to teach. That one is for you, in case you are interested in continuing to grow (and earning some money).

For your friend, I think your arrangement is great, since you've avoided any kind of conflict of interest for him.

Grustaf 1 day ago 1 reply      
Really great initiative, I hope it works out well! I haven't taught anyone like this, but in other contexts.

I would say that since money is involved, and he doesn't necessarily have the strong motivation that self-learners usually do, it's important to lay down clear rules, i.e. what it means to "learn to code." Ideally this should be output-based rather than input-oriented, like building a certain simple app. That's also extremely rewarding.

I always tell people to follow the Stanford course on Swift (from iTunes U); it's absolutely brilliant. Depending on his background something else might be more suitable, but in any case you could act as the TA checking his homework, and also as a classmate/teacher who can answer questions. The bulk of the presentation of new material you can safely leave to a MOOC, I think.

Good luck!

hackermailman 1 day ago 0 replies      
The best intro I've found is the edX course HtDP (How to Design Programs), which is now part of a MicroMasters track: https://www.edx.org/micromasters/software-development. It's free to audit instead of paying for the verified track (which gets you TA help). He could conceivably do the first two courses in that track and then learn whatever you're doing, or finish the whole track by next year.
avdicius 1 day ago 0 replies      
A personal account from someone who is perhaps an old-timer already. Entirely based on personal experience, so probably less than representative.

People are either able to code or not. Teaching does not work. Those who are able to code pick up almost all the skills by themselves. If a "natural born" coder gets into some formal environment, such as a university, such a person surpasses the level of all their peers, and the direct instructor as well, within two months.

At university I was trained in automatics. But I quickly learned that coding took me no effort at all, as opposed to, say, understanding electronics. After reading Wirth, Kernighan & Ritchie, and Stroustrup, I often found myself giving students from the programming department hints on how to do their assignments, as they scratched their heads and I was just passing by.

This has nothing to do with intelligence. I'm perhaps not very smart. When I stared at some schematic I had no idea whether it was an amplifier or something else, or what the role of one resistor or another was. At the same time, mates from my group read it as if it were written in plain English (err, in plain Russian, to be precise). But those very same people were totally unable to code. It's very strange. For me coding is trivial and takes no intelligence. This is why I do it for a living. The path of least resistance. I'm kind of puzzled why people smarter than me cannot code.

Anyway, after reading some foundational books the only thing that helps is reading other people's good code. For me it was reading pieces of the old (around 90's) BSD and GNU code.

I never met a person I'd credit with directly handing me any useful coding skill. YMMV.

nyxtom 1 day ago 0 replies      
Make sure you set project-oriented goals; this might help eliminate the issue of him asking you to fix things for him all the time, and also build some foundational skills. Also, there are some wonderful classes online that go from basic logic to coding skills all the way up the stack. Check out http://www.nand2tetris.org/
kreeWall 1 day ago 0 replies      
I was hired a year ago without a computer science background into a position as a data analyst (I know data), but needed to learn how to code. I've been using a lot of online resources on my journey of coding. Feel free to reach out if you'd like some of these, I can help you figure out what would work best for the languages and concepts you'll need help with!
agjacobson 1 day ago 0 replies      
Coding is a very complex skill. The success probability in going from zero coding experience to being a self-starting coder is a bet I wouldn't take except out of pure friendship or charity.

There are two problems. The first: if you substitute "gardening" for "coding" above, my statement is still true. The second is the nature of learning exercises. We all know that a real problem to solve generates motivation that is much more valuable than assigned exercises, and that can self-generate syntax and algorithm knowledge.

With all the free tools, free courses, and free pdfs around, I wouldn't try to train someone who is not already brimming with questions generated by real frustration.

gedrap 1 day ago 0 replies      
This is a noble effort, but your post is missing the answer to one key question: is this person motivated enough to do it? Or does he just need some money right now, and maybe for the next month or two?

Coding is hard, and takes a lot of hours (like, a thousand or two) to get to basic proficiency. So yeah, you need a lot of willpower and motivation to get it done, especially outside a formal setting like university, where you are pretty much forced to do it.

I know plenty of smart people who have uninspiring jobs with low salaries, and they keep talking about how they will learn to code, but they don't manage to get past hello world.

hluska 1 day ago 0 replies      
What if you flipped it around a little? You aren't so much lending him money and teaching him to code as you are hiring him to help you with a side project.

If you judiciously assign him tasks to work on, help him get started but encourage him to use solid Google/Stack Overflow skills to solve his own problems, you may end up truly helping him.

The alternative, which scares me a little, is that he will start working on tutorials, get bored, start to doubt himself and then completely disappear from your life because he failed and can't pay you back.

meerab 1 day ago 0 replies      
One common mistake is to assume that "the only way to solve his job problem is to teach him to program." If your goal is to help the man stand on his own feet, there is more than one way to do so. High schoolers have career counselors, whose job is to match a student's interests with available careers. You are skipping that career-counseling step.

You are saying 'Learn to code, that career path worked for me, it will work for you'.

oblib 1 day ago 0 replies      
I did something similar with my daughter's fiancé, but I didn't really teach him how to code. I taught him how to learn how to code, and he picked it up fast and was very productive.

Unfortunately my daughter broke off the engagement so we didn't get a chance to work on much together but he went on to a career in coding so it worked out pretty good just the same.

eof 1 day ago 2 replies      
I have tried to teach people to code. More often than not it does not work out. It appears to me there is something intrinsic or developed at a very young age which sets a mind up to be both able and willing to think like a programmer.

Most people simply don't think that way, end of story. Some do though, I hope you picked one of them.

I wouldn't worry too much about teaching him to "code" as there are a lot of ways to be valuable in the industry in general, and helping your friend is a very good thing to do.

In any case, I think you will probably get a lot out of this. Just don't try to force a square peg into a round hole.

brador 1 day ago 0 replies      
Lend only money you can lose and ask them to learn programming. But don't sweat if they don't and walk away if it starts risking the friendship.
zappo2938 1 day ago 0 replies      
1. Learn Microsoft Excel. 2. Take test at temp agency. It's all they care about. 3. Profit. Quickest way to earn a living with a computer.
obstinate 1 day ago 0 replies      
It would have to be a very close friend indeed for me to offer to do a bunch of free labor (in the form of teaching) in addition to giving money, and I'd have to be satisfied that the circumstances were not reasonably within the person's control. It's not that I think people don't deserve a hand up when they screw up things that were within their control. It's that I don't want to "throw good money after bad," and from my experience and what I've heard from others, that's the majority outcome in situations like this.
Ask HN: What technology of the last 30 years has had the largest social impact?
7 points by cmod  1 day ago   4 comments top 4
zer00eyz 1 day ago 0 replies      
The biggest change encompasses all of the things that you're talking about.

Pace, and not the pace of progress, the literal pace of people.

20 years ago, when I moved to the Bay Area from the East Coast, there was still a cadence that was much slower than the "New York minute" I was used to.

Today everyone is rushing from one thing to the next, we all seem compelled to know and respond instantly. Almost everything you listed can be answered with some form of "faster" (smaller elapsed time) than we could have done it previously, and I think that is a big deal.

onion2k 1 day ago 0 replies      
I'm split between "the internet" and "mobile phones". 30 years ago is a couple of years prior to the invention of the World Wide Web and browsers, so that's clearly a huge change that's had a massive impact on a lot of people. Equally though, access to data and comms has been changed immensely by mobile phones for literally everyone on Earth. It's hard to say which has had a bigger impact.
owebmaster 19 hours ago 0 replies      
None. The fall of the Berlin Wall had as much or more social impact than any of your points, or several of them combined. I'd point to the rise of China, but that is political rather than tech-driven.
baybal2 1 day ago 0 replies      
CMOS lithography, green revolution
Ask HN: What simple tools or products are you most proud of making?
18 points by christiangenco  3 days ago   20 comments top 14
jetti 4 hours ago 0 replies      
Plsm (https://github.com/jhartwell/Plsm). It was my first Elixir project and I have almost 10 times more stars on the project than my next highest project. I was also complimented on how clean the code was, and this was only a month into learning Elixir.
krrishd 3 days ago 1 reply      
I'm proud of having made http://write.itskrish.co -- it's a stream-of-consciousness journalling/writing tool built in React.

The "stream-of-consciousness" bit is enabled by the two key features: you choose a finite duration within which to write, and if you stop typing more than a few seconds, your writing is deleted. This essentially forces you to continuously type for the session, and at least for me and the users I've spoken to, this forces out thoughts/ideas/feelings that otherwise wouldn't have made it to the keyboard.

I've personally been using it routinely for months as a therapeutic journal, and at this point I've practically been Pavlov'd into opening it up whenever I'm under cognitive/emotional duress.

it's open source (http://github.com/krrishd/write), and I appreciate feedback!
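The app's core mechanic (pause too long and your writing is deleted) boils down to an inactivity watchdog. Here is a minimal, language-agnostic sketch of that rule; the class name, the 5-second timeout, and the injectable clock are illustrative assumptions, not the app's actual implementation:

```python
import time

class InactivityWatchdog:
    """Clears the buffer when the gap between keystrokes exceeds
    `timeout` seconds, mimicking the rule that pausing too long
    deletes your writing."""
    def __init__(self, timeout=5.0, clock=time.monotonic):
        self.timeout = timeout
        self.clock = clock          # injectable for testing
        self.buffer = []
        self.last_key = clock()

    def keystroke(self, ch):
        now = self.clock()
        if now - self.last_key > self.timeout:
            self.buffer.clear()     # paused too long: text is gone
        self.buffer.append(ch)
        self.last_key = now
```

In the real app the check would presumably run on a timer in React (so text disappears even without a next keystroke), but the state machine is the same.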

pigpen34 1 day ago 0 replies      
CronAlarm https://www.cronalarm.com - A cron / scheduled job monitoring service.

My main job requires a ridiculous amount of file and data transfers that are mostly scheduled to run during off-peak hours. I needed a way to centralize the results of these jobs in order to keep tabs on things. I built this as an in-house tool and then discovered a few services already existed for this. I thought my solution offered some things these others didn't, and if somebody was paying these other services I might have some success as well. It's been a lot of fun, and if anyone has any suggestions I'd love to hear them.

assafmo 2 days ago 0 replies      
SQLiteProxy - https://github.com/assafmo/SQLiteProxy - which has proven itself big time in production the last couple of weeks.

A Telegram bot that sends me NBA-related tweets from the ESPN Stats & Info Twitter account - https://t.me/nbaespnstats - https://github.com/assafmo/nba-espn-stats-and-info-telegram-... - which was amazing during the 2017 playoffs and made the whole watching experience awesome for me. The channel also has around 20 followers right now, so I guess others like it too. :-)

A script that downloads all my shows every day - assafmo/DownloadMyEpisodes

mattbgates 3 days ago 0 replies      
I created MyPost ( https://mypost.io ) as a way to get a page up and running on the Internet in seconds. It was originally made for me... but I really wanted to share it with the world and see if people would find a use for it. I wanted a place where I didn't have to register for any account and could use HTML and CSS (Javascript works to an extent too!) to design web pages. Really, to make or share quick notes when I was working with my clients. It was also a way for them to easily make changes to the post so long as they had a password.

But it can be used for so much more ( https://mypost.io/post/what-can-i-do-with-mypost ).

It is completely free to use. I don't have any plans to charge for it, and have not even added advertising or anything to it yet, but it still receives maintenance and updates, though no more major feature implementations are planned. It was my first web app and taught me a lot, from the basics of database programming to building a friendly UI that could be understood by everyone. My sister, who is not very tech or computer savvy, was the beta tester. Whenever she questioned something or got stuck on something, I redesigned that feature to make it even easier. Whether it was functionality or the wording: if she questioned it, it was redone.

It boosted my confidence into the web app world. Right now, I've got about 8 more web apps in the works, 3 of which are in the stages of beta testing, and though there is a free version, they will actually be paid subscription to access additional features. So I am proud to boast about this project, as it was the start to my empire.

minhajuddin 1 day ago 0 replies      
I am proud of making LiveForm (https://liveformhq.com/) and GetSimpleForm (https://getsimpleform.com/). Both are simple products which allow users to integrate contact forms seamlessly in their websites. However, their use by others gives me great pleasure. A very recent user of LiveForm is using it to do translations! A user submits a scanned document with some extra information using LiveForm and my customer translates the text for a fee :)
jventura 2 days ago 0 replies      
I built http://mockrest.com/ last week and even did a Show HN at https://news.ycombinator.com/item?id=14537247.

I've built many things before, so why am I proud of this one specifically? Basically because I built it with no expectations whatsoever about whether it would ever be needed by anyone but me. Also, I built it fast (in less than a week), polished it a bit, and released it as soon as it was working ok-ish.

And why am I proud of building it even though it is not complete? Because I dealt with perfectionism for so long that I had to force myself to release anything at all. In fact, it used to be very hard for me to even start doing anything for myself, as I would get analysis paralysis. For quite some time I had to force myself to think about "when good is good enough," read a lot about that subject, read other people's opinions on these things, etc. After fighting with my own perfectionism, it seems that I can finally do things with lower expectations. That's why I'm proud.

nstart 2 days ago 0 replies      
I recently polished a little and released a chrome extension to auto skip YouTube ads after they've played enough to ensure the content creator gets paid. Pretty darn happy with it. https://chrome.google.com/webstore/detail/youtube-auto-ad-sk...

Waiting for Firefox to approve the add-on now.

palerdot 2 days ago 0 replies      
I've built http://hotcoldtyping.com, an interactive way to learn touch typing, with instant feedback via key glows and accuracy graphs.

I've built http://remindoro.com, a chrome extension to get repeat reminders.

http://palerdot.in/moon-phase-visualizer/ - A simple web demo to understand moon's phases and eclipses.

All of these are open source (my GitHub: https://github.com/palerdot) and I'm proud of these tools

rpeden 2 days ago 1 reply      
StoryGrabber - http://stories.rpeden.com/

Just a little tech news aggregator I put together using React and Node. Pulls the top 10 stories from HN and a bunch of subreddits, and pushes updates to the browser every 15 minutes via socket.io.

I've still got plenty of improvements to make to it, but I'm trying to break the habit of working on side projects that I don't ship. So I've shipped this one, even though I won't consider it 'done' for quite a while yet. :)
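For anyone curious, the HN half of an aggregator like this can be sketched against the public HN Firebase API (the endpoints below are the real ones; the selection helper and the lack of error handling are simplifications, and the real site presumably does this server-side in Node):

```python
import json
import urllib.request

HN_API = "https://hacker-news.firebaseio.com/v0"

def top_stories(items, n=10):
    """Pick the n highest-scoring stories from already-fetched
    item dicts (pure function, reusable for subreddit data too)."""
    return sorted(items, key=lambda s: s.get("score", 0), reverse=True)[:n]

def fetch_hn_top(n=10):
    """Fetch HN's current top story ids and their items.
    Network access required; no retries or caching here."""
    with urllib.request.urlopen(f"{HN_API}/topstories.json") as r:
        ids = json.load(r)[:n]
    items = []
    for i in ids:
        with urllib.request.urlopen(f"{HN_API}/item/{i}.json") as r:
            items.append(json.load(r))
    return top_stories(items, n)
```

A real deployment would run `fetch_hn_top` on a 15-minute schedule and push the result to connected clients, as the comment describes.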

ademcan 2 days ago 0 replies      
I have been working (in my spare time) on canSnippet for the last few years. I made a first web-based version, https://github.com/ademcan/canSnippet, and I recently released a desktop version for macOS (Windows and Linux are coming...): https://www.cansnippet.com/. I've gotten very good and positive feedback so far :)
siquick 1 day ago 1 reply      
http://mp.soundshelter.net/ - Get a playlist of your most listened to tracks on Spotify

Not because it was technically difficult, but because it solves a problem that I, and seemingly hundreds of other people who signed up, are having.

ianleeclark 3 days ago 1 reply      
I'm a fan of https://freethepodcast.com/

I found a naive yet effective way of adblocking podcasts that is easily scalable. Although it's not yet released, early access is close, and I'm hoping it takes off. I'm really proud of it because it's incredibly cross-disciplinary (i.e., marketing, programming, &c.) and because a podcast adblocker is a non-trivial problem to solve.

Artlav 2 days ago 0 replies      
Many small things, from compilers, an IDE, and a mail client to a Bitcoin node, a magnet simulator, and a video-to-3D-model thingy.

None of it is public, however, for obvious reasons.

Ask HN: How secure is the encryption offered by OS X's Disk Utility?
98 points by whitepoplar  4 days ago   40 comments top 6
jackjeff 3 days ago 5 replies      
It's state-of-the-art block-level encryption; however, file-system-level encryption, such as what the upcoming APFS will offer, is fundamentally better.

In block-level encryption each sector is encrypted below the file system. Doing the naive thing of encrypting each sector directly with the encryption key is fundamentally insecure. This is called the ECB mode of operation. There's a nice picture of a penguin on Wikipedia, encrypted with ECB, which demonstrates this:


Secure modes of operation generally propagate the result of previously encrypted blocks into the next ones. But this approach is not really suitable for mass storage devices: you cannot re-encrypt all the sectors behind the one you just changed. That's just impractical, since writing to sector #0 would amount to rewriting the entire disk.

So in practice schemes like AES-XTS are used. They work by "tweaking" the encryption so that it is different for each block (avoiding the pitfalls of ECB), but in a way that allows random access to sectors (i.e. in a way that is predictable). AES-XTS is a tradeoff for this special use case, but it is not as robust as the more classical modes of operation that would typically be used in an encrypted filesystem.

Details about AES-XTS issues: https://sockpuppet.org/blog/2014/04/30/you-dont-want-xts/
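The ECB pitfall described above is easy to demonstrate with a toy sketch. The "block cipher" below is just a keyed hash standing in for AES (it is not real cryptography and isn't even invertible; it only shares AES's determinism, which is all the ECB weakness needs), but it shows why identical sectors leak under ECB and how a per-block tweak, the idea behind XTS, hides the repetition:

```python
import hashlib

BLOCK = 16  # bytes per "cipher block"

def toy_block(key: bytes, block: bytes) -> bytes:
    # Deterministic keyed transform standing in for AES.
    return hashlib.sha256(key + block).digest()[:BLOCK]

def ecb(key: bytes, data: bytes) -> bytes:
    # ECB: each block encrypted independently with the same key,
    # so equal plaintext blocks give equal ciphertext blocks.
    return b"".join(toy_block(key, data[i:i + BLOCK])
                    for i in range(0, len(data), BLOCK))

def tweaked(key: bytes, data: bytes) -> bytes:
    # XTS-like idea: fold the block's position (a "tweak") into
    # the encryption, so identical blocks encrypt differently
    # while any single block can still be processed in isolation.
    out = []
    for idx, off in enumerate(range(0, len(data), BLOCK)):
        out.append(toy_block(key + idx.to_bytes(8, "big"),
                             data[off:off + BLOCK]))
    return b"".join(out)

data = b"A" * (2 * BLOCK)        # two identical "sectors"
e, t = ecb(b"k", data), tweaked(b"k", data)
assert e[:BLOCK] == e[BLOCK:]    # ECB leaks the repetition
assert t[:BLOCK] != t[BLOCK:]    # tweaked mode does not
```

Real XTS mixes the tweak into the cipher rather than the key, but the pattern-hiding property it buys is the same.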

purple-dragon 3 days ago 1 reply      
It's as secure/strong as the standards-based cryptographic method used, i.e., AES-128 or AES-256. If you're curious about the strength of FileVault, some academics published a paper detailing their analysis (spoiler: they thought it was pretty good): http://eprint.iacr.org/2012/374.pdf
mherrmann 3 days ago 2 replies      
I never cared about encrypting local disks until my MacBook was stolen three days ago. I have backups but the thought of someone having all my data is very scary.
netheril96 3 days ago 1 reply      
If you are using disk utility to create an encrypted container file (as opposed to an on-disk encrypted volume), you may want to check my open source project https://github.com/netheril96/securefs. It encrypts at file level with AES-GCM, rather than AES-XTS at block level, and the size is not fixed at creation.
Heliosmaster 3 days ago 2 replies      
Another (related) question: how much slower is MacOS with FDE enabled?
coolio2657 3 days ago 2 replies      
It is standard security, nothing out of the ordinary for default functionality included in an OS, meaning it is of solid average quality, which in the world of security unfortunately means it is probably not up to par or worth relying on.

The encryption standards it uses are pretty good, but that is not where blanket whole-disk encryption (which I assume you're talking about) fails. For example, attackers could analyze the preboot environment of an encrypted Mac and sniff out the password using a variety of methods. Simply put, whole-disk encryption is too complicated and bug-prone a process to really trust to closed-source software.

As for single-file encryption, which is relatively neat and simple, Disk Utility would probably do a pretty good job.

Ask HN: Anyone got any project ideas for fun office tools?
3 points by jukedill  1 day ago   11 comments top 6
Jemaclus 22 hours ago 1 reply      
I'll tell you something I've always wanted. I want a giant lever that's attached to a "The Price Is Right"-style fixture with tons of tiny lightbulbs. When I pull the lever, the light bulbs start turning on from the bottom to the top. What's happening during this time is my unit tests are running, then it's connecting to the server(s) to begin deployment. Once the deployment begins, the top section starts flashing, and when deployment is complete, the top of the fixture starts flashing "YOU WIN!!" or something, similar to a carnival game.

Basically, I want deploying to be super boring under the hood, but SUPER AWESOME in the office.

Another idea I had was one of those TNT detonator devices, with the handle that you press downward, and it lights up a bunch of lights and then has a little LED animated explosion on the wall. Or the giant hammer thing where you slam a hammer into a thingy on the ground, and the weight goes flying up, and it has to hit the top in order for the deployment to begin.

Ya know what, let's just take all carnie games and turn them into deployment mechanisms. HOW AWESOME WOULD THAT BE??



superqwert 1 day ago 1 reply      
This sounds like an awful distraction
zer00eyz 1 day ago 0 replies      
I don't have a particular project in mind but rather a broad suggestion.

Software is fun, but hardware is now "easy" -- there are plenty of hardware starter kits from places like Adafruit and Seeed Studio that you could drop into your office and let people have at it.

A few hundred bucks (and let's face it, that isn't a lot if you're doing software) can get you a lot of toys for people to play with and explore.

mtmail 1 day ago 0 replies      
A traffic light that goes red when the office is too loud.


(discussed here https://news.ycombinator.com/item?id=14582187)
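The core of such a light is just a loudness check on incoming audio frames. A minimal sketch follows; microphone capture (which needs a third-party library) is omitted, and the dBFS thresholds are made-up placeholders you would tune per office:

```python
import math

def rms_dbfs(samples):
    """Root-mean-square level of a frame of audio samples
    (floats in [-1.0, 1.0]), in dB relative to full scale."""
    if not samples:
        return float("-inf")
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms) if rms > 0 else float("-inf")

def light_color(samples, red_above=-20.0, amber_above=-30.0):
    """Map one frame's loudness to a traffic-light color.
    Threshold defaults are illustrative, not calibrated."""
    level = rms_dbfs(samples)
    if level > red_above:
        return "red"
    if level > amber_above:
        return "amber"
    return "green"
```

In practice you would average over a few seconds of frames before switching colors, so one dropped mug doesn't turn the office red.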

8draco8 1 day ago 1 reply      
Have you heard about "Is the toilet free?" ?


tmaly 1 day ago 0 replies      
a "Thinking" sign that resembles the On Air signs in a radio studio.
Ask HN: What personal finance advice would you give your 25yo self ?
85 points by zabana  2 days ago   189 comments top 74
baccredited 2 days ago 6 replies      
- credit card debt is an emergency and optimizing any other money stuff while you carry it is like rearranging your sock drawer while your house is on fire

- save 25x your annual spending and never work again

- start by saving at least something (even 1%) and save 50% of all future raises

- long commutes are for fools. So are new cars--buy used.

- spend on things that you value. I've given myself a tech budget for years because good tools matter to me

- host a dinner party instead of eating out (most of the time)

- If you have a gambler's mindset toward investing, carve out a small portion (10%?) of your money and use it for risky investing. I call mine the 'casino fund'. Track your returns.

- read voraciously about finance and early retirement. You only need about 20 books or so to gain a background that is easily more valuable than your college degree. This is a good start: https://www.reddit.com/r/financialindependence/wiki/books
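The "25x your annual spending" rule above can be turned into a quick sketch of how long the journey takes. The 5% real return and the income numbers below are illustrative assumptions, not advice:

```python
def years_to_fi(annual_income, annual_spending,
                real_return=0.05, target_multiple=25):
    """Years of saving until the portfolio reaches
    target_multiple x annual spending (the '25x' rule).
    Assumes a constant real return and year-end contributions."""
    savings = annual_income - annual_spending
    if savings <= 0:
        return float("inf")  # never, without savings
    target = target_multiple * annual_spending
    balance, years = 0.0, 0
    while balance < target:
        balance = balance * (1 + real_return) + savings
        years += 1
    return years

# Saving half of a 100k income at a 5% real return:
years_to_fi(100_000, 50_000)  # -> 17
```

The striking property is that the answer depends only on the savings rate, not the income: spend less and you both save more and need a smaller target.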

truewords 2 days ago 3 replies      
Marry well. What I mean is this:

Your spouse should have a career, or should plan on having a career of his/her own. It's not about having a lot of money; it's about eventually having someone as a financial backup in case things go wrong. Works for both partners.

I love my wife, but financially I am in trouble. I make enough money, but she has no career aspirations. Her family is quite poor, and I had to get a mortgage on a house for her parents. In the future I will also need to worry about their health expenses.

This effectively means I can never get out of the rat race.

damagednoob 2 days ago 2 replies      
Don't leave disposable income hanging around as savings in a no/low interest bank account. Invest it in an index fund with low fees (Vanguard, etc.) or better yet, figure out an asset allocation and invest in a few accumulating funds.

A good piece of advice I picked up from Ramit Sethi (when his blog was still worth reading) was to think about how much of your free time per month you spend on various activities (Facebook, gaming, etc.). How much of that free time is devoted to thinking about your personal finances? If it's less than you think it should be, schedule it in.

Also, start reading https://www.reddit.com/r/personalfinance/ regularly.
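The "low fees" point compounds more than intuition suggests. A rough sketch follows; the 7% return, the 0.1% vs 1% expense ratios, and the $500/month contribution are illustrative assumptions, and subtracting the fee from the return is a common approximation rather than how funds actually accrue fees:

```python
def final_balance(monthly_contribution, years,
                  annual_return=0.07, annual_fee=0.001):
    """Future value of steady monthly contributions, with the
    fund's expense ratio subtracted from the annual return."""
    r = (annual_return - annual_fee) / 12  # net monthly rate
    balance = 0.0
    for _ in range(years * 12):
        balance = balance * (1 + r) + monthly_contribution
    return balance

low_fee = final_balance(500, 30, annual_fee=0.001)   # index fund
high_fee = final_balance(500, 30, annual_fee=0.010)  # pricey fund
# Over 30 years the 0.9% fee gap costs well over $90,000
# on $180,000 of total contributions.
```

That gap is why the advice singles out low-fee passive funds rather than just "invest."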

BIackSwan 2 days ago 3 replies      
Travel as much as you can. It will get harder to find opportunities to do so as the years go by and responsibilities increase.
saryant 2 days ago 0 replies      
Have enough in savings to walk away whenever you want. I've been stuck in jobs far longer than I wanted because I'd neglected my emergency fund and didn't have the cash on hand to walk away.

Now I make sure I have a year's living expenses available outside of my investments. If I get tired of my job, I can just quit knowing that I have the cash to float myself for a while.

A year may be more than you need, but at least six months is a good minimum. You'll have the cash to cover a job loss, car troubles, most medical expenses, etc. on hand without going into debt. And an emergency fund should be liquid and safe, not invested and at risk. You may only get 1% in a savings account, but view the low return as the cost of insurance, since that's effectively what an emergency fund is: self-insurance.

dtnewman 2 days ago 0 replies      
I'm only 30, so it's not like 25 was particularly long ago for me. But since then, I've gotten married, which is a major life change (overall for the better for me!).

But if I could go back to age 25, before I was married, I'd have told myself to travel to more far-flung places. Being married, I have to:

A) Agree with my wife on where we want to travel

B) Have time to travel that works for both of our schedules (which is difficult to find... plus we have to spend at least some of our time off going to visit our respective families, and now I have two families to visit instead of one)

C) Have the money to travel. In our case, we have two incomes, but still, it was much cheaper when I'd travel with friends and cram four people into a cheap hotel room.

I'm not complaining here. I'm fortunate to have spare income that lets me travel quite a bit with my wife, and it's really a fantastic experience to travel with your partner. But there are trade-offs that I simply didn't have as a 25-year-old. So those places that are far away and hard to get to? See them while you're young.

Murkin 2 days ago 1 reply      

Meet and keep in touch with as many people as possible. Switch jobs, travel the world, volunteer and always _always_ make new connections.

The best financial (and personal) gains you will make in life will come from the right connections.

damassively 2 days ago 2 replies      
I'm not 25 yet but this is my (current) course:

1. Reduce all bills/belongings to bare essentials to live minimally.

2. Pay off all debt while maintaining $1,500 emergency fund.

3. Save 6 months living expenses.

4. Invest in yourself with excellent groceries, gym membership/local park visits, medical/hygiene care and other healthy habits.

5. Invest in Vanguard's Total US Stock, Total International Stock and Total Bond ETFs (% as age) and don't touch it.

6. Invest in building your own business - tech or otherwise.

sametmax 2 days ago 0 replies      
I travelled a lot and figured out very early that I should never contract debt or take on a strong commitment (house, children, etc.) too soon. I'm still amazed so many people regret doing that; being young doesn't mean being dumb, so why did you do it?

Anyway, my advice would be:

Start meditation sooner.

I would give myself plenty of other advice about risk, people and self-acceptance, but I would not have been able to listen to it at the time.

That's the problem with advice: you must be in a place in your life where you can actually use it.

But I would be able to meditate and figure it out, since that's how it happened.

Replace that with any tool that helped you develop yourself.

If you don't have such a tool, quickly find one that suits you.

Oh, and yes, travelling helps, so do it. But you'll reach a limit in what it brings to the table; you need to find a better tool in the long run. Just like money helps, but beyond a certain amount it won't make you happier.

m-i-l 2 days ago 1 reply      
I reckon I got most things right, but as this is specifically "what would you have done differently, what lessons have you learned?" the big one for me is: don't invest in individual stocks, invest in broad (ideally low fee, e.g. passively managed) funds instead. Growing up in an era of privatisation, reading the financial press with all their information on individual companies, seeing all the stock market pundits with their various stock picks, and so on, buying individual stocks just seemed to be "the thing to do". But at the end of the day, unless you spend an enormous amount of time on it, you are very unlikely to be able to out-perform the market. In my case there were many bad investments I could arguably have avoided, e.g. all the dot com stocks I bought on 15 March 2000 and banking stocks I bought on 25 April 2008. But I did buy a lot of BP shares on 15 April 2010 (5 days before the Deepwater Horizon explosion), and I don't think any expert could have foreseen that. Individual stocks are far more of a gamble than funds.
logfromblammo 2 days ago 1 reply      
First, don't get married.

If you can manage to get that one done, you can actually act on the rest of the financial advice in this thread. If not, you will have to be in unanimous agreement to do anything wise with money (i.e. keep emergency fund, plus six months living expenses in liquid savings), whereas foolishness may be undertaken unilaterally.

bluedino 2 days ago 1 reply      
Do not go out and buy <sports car> and proceed to spend <stupid amount> of money on it trying to make it cool/fast when you are 22. And then immediately after that do it again with another car. You could almost buy a house.
jaclaz 2 days ago 0 replies      
In Italy there is (was, as it is not much used anymore) a proverb: "Fare i conti spesso, moderar le voglie, spender men di quel che si raccoglie".

It cannot be translated easily, but more or less it amounts to:

Do the (financial) math often, limit your cravings, spend less than you can gather.

switch007 2 days ago 0 replies      
Eat out less (the value for money is extremely poor in the UK)

Contribute more to an index fund.

Save harder for a deposit. High rent/shared housing is horrible.

Don't try to keep up with the Joneses. There'll always be someone richer, with a nicer car; you can't win that game. You weren't born into money, so don't even attempt to act like it. Live below your means.

You need to treat yourself far less often than marketing companies would have you believe.

bsvalley 2 days ago 1 reply      
Take as many risks as you can now, otherwise you'll be maxed out in terms of income for the rest of your life. You're young and can live cheaply. Startups, businesses, etc. Don't get trapped in the W2 lifestyle and big corporations.
holtalanm 2 days ago 4 replies      
"You dont need to spend $50 on that video game. Or that one. Or that one. Actually, just give me your wallet, you can have it back in 10 years."

Also, start mining bitcoin.

matt_s 2 days ago 1 reply      
I think 25 was about when I was on the cusp of a series of job moves/promotions over the next 5 years. I imagine a lot of 25-year-olds might be in the same spot (or not - whatever, this is my anecdote).

Keep your cost of living the same when you see large pay bumps or raises. This means the big things like car, house/rental, etc. Don't just go get a new car and increase your spending or move to a "nicer" apartment or buy a house because you have the money available. Keep the car, stay in the apartment and save the extra money.

People will say that owning a home is an investment - maybe in some areas it is - but not in all. If home values in the area are relatively flat or grow very slowly then it is a losing proposition. You will be paying property taxes, school taxes and all the other "taxes" of owning a home: maintenance, repairs, accumulating "stuff" to fill it, etc. If growth in the area is slow then that is all money down the drain - you won't get it back when you sell.

pgm8705 2 days ago 0 replies      
Stop going out eating and drinking so much. Don't work for pennies on the dollar in exchange for equity.
muninn_ 2 days ago 0 replies      
Uh, well it's the same advice I'd give somebody now:

1. Max out your employer's 401k match.

2. Build an emergency fund of 3-4 months' expenses.

3. Max out your Roth IRA.

4. Pay off low-interest loans (if you have high-interest loans, which I don't/haven't, then this becomes #1: pay them off first).
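
A sketch of this priority order as code; the caps and amounts here are illustrative placeholders, not real IRS contribution limits:

```python
def allocate_savings(monthly_surplus, match_cap, efund_gap, ira_room):
    """Walk a monthly surplus through the priority list above.

    All caps are illustrative placeholders, not actual IRS limits.
    Returns how much goes to each bucket this month.
    """
    buckets = {}
    for name, cap in [("401k_match", match_cap),
                      ("emergency_fund", efund_gap),
                      ("roth_ira", ira_room)]:
        amount = min(monthly_surplus, cap)  # fill this bucket first
        buckets[name] = amount
        monthly_surplus -= amount
    buckets["loans_or_taxable"] = monthly_surplus  # whatever remains
    return buckets

# e.g. a $1,000 surplus with $300 of unmatched 401k room left this month:
print(allocate_savings(1000, match_cap=300, efund_gap=500, ira_room=450))
```

Each dollar fills the highest-priority bucket with room left before spilling into the next, which is the whole point of a priority list like this.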

jansho 2 days ago 0 replies      
Slightly older but I'm learning a hard lesson now: make sure to have at least 6 months worth of savings before going fully self-employed!
_Codemonkeyism 2 days ago 1 reply      
Don't spend money on stupid things. You don't need them.
msl09 2 days ago 1 reply      
Scratches his head, mumbling: "What kind of financial advice can I give to someone who has no job...?"

Oh yeah, you don't need to save that internship money, I'm good now. Besides, I make like 5-6 times what you are making.

Talyen42 2 days ago 1 reply      
$10,000 saved in your twenties is $200,000-300,000 in your pocket at age 55
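
The arithmetic behind a claim like this is plain annual compounding. Note that turning $10,000 into $200,000-300,000 over 30 years requires roughly 10-12% nominal annual returns, which is on the optimistic side of long-run US stock averages before inflation:

```python
def future_value(principal, annual_rate, years):
    """Compound a lump sum once per year at a fixed rate."""
    return principal * (1 + annual_rate) ** years

# $10,000 at age 25, held 30 years to age 55, at assumed return rates:
for rate in (0.07, 0.10, 0.12):
    print(f"{rate:.0%}: ${future_value(10_000, rate, 30):,.0f}")
```

At an assumed 7% the same $10,000 grows to roughly $76,000, so the headline range depends heavily on the return assumption.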
dvko 2 days ago 1 reply      
Read up on finance, learn about assets vs liabilities.

Understand buying vs renting before doing either.

Keep monthly (recurring) expenses low.

If you absolutely need a car, keep your ego in check and look at mileage & reliability.

Think long term.

randomguy101 2 days ago 0 replies      
Every time you treat yourself, with a fine bottle of wine, or a nice sofa, or a great suit, you develop your appetite for "nice" things.

The more money you make, the more nice things you acquire, and the harder it is to imagine living without them. At the furthest reach, it's a private jet--the crack cocaine of travel.

Develop these appetites with great caution.

gremlinsinc 23 hours ago 0 replies      
There will be this thing called bitcoin -- you do everything you possibly can to buy up as much as you possibly can, then you hold it till 2017... if you don't hold it or you lose the wallet you'll want to commit suicide, so don't do that.
woudsma 2 days ago 3 replies      
I'm 25, keeping an eye on this thread..

Best financial decision i've made was to buy some ETH (Ethereum) last year.

herghost 2 days ago 0 replies      
Don't get into consumer credit debt. Not huge figures, but I'd convinced myself that I would always have debt so I might as well not care about it. This led to my buying things I didn't really need on credit, whilst just paying for it monthly, forever.
dlwdlw 1 day ago 0 replies      
Index investing is great, but don't be afraid of taking risks for things you believe in. Actions where you put something at risk are those that truly show who you are. Constantly examine your emotions of fear and greed, so as to avoid falling into the trap of gambling or the trap of mindlessly following without the courage to think for yourself. Position yourself so that you either gain or learn from your risks. The only games that matter are those with your skin in the game.
AndrewKemendo 2 days ago 1 reply      
You're asking the wrong question.

What are your personal goals in the next 5, 10, 30 years? What do you plan on doing that requires money? How much money does that require?

Without knowing anything about your goals, any advice you get (as demonstrated in this thread) will steer you toward structuring your life around saving money and getting safe but modest returns. Is that what you're asking for?

Here is a question you should ask yourself probably every 6 months:

"If I had infinite resources (money/whatever) what would I do?"

Take that answer and then figure out how to accomplish that without infinite resources.

markivraknatap 21 hours ago 0 replies      
Don't buy an expensive car. After a while all cars become a means that takes you from place A to place B. Save that money and invest it wisely somewhere else.
nunez 2 days ago 1 reply      
Make a spreadsheet comparing your earnings to your debts and billables, record ALL of your debts, and STICK TO THE FUCKING SHEET
howeyc 2 days ago 0 replies      
Don't buy so much stuff, it's mostly junk that'll sit in a box somewhere. When contemplating a purchase, honestly ask yourself how much use it will get, and for how long it will be useful.

Rent. Home ownership only starts making sense on a 5+ year time frame, in some markets 10+ years. Having the ability to move for a better job will reap huge financial benefits, and moving for a short commute will allow you to have so much more free time.
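
That 5+ year figure can be sanity-checked with a deliberately crude break-even model. Every percentage below is an assumption, and appreciation, mortgage interest and opportunity cost are all ignored, so treat it as a sketch only:

```python
def buy_vs_rent_years(monthly_rent, price, buy_sell_cost_pct=0.08,
                      annual_carry_pct=0.02, max_years=15):
    """Rough break-even: first year when cumulative ownership cost
    (one-time buy/sell transaction costs plus annual taxes/maintenance)
    drops below cumulative rent. Deliberately ignores appreciation,
    mortgage interest, and the opportunity cost of the down payment.
    """
    transaction = price * buy_sell_cost_pct  # paid once, buy + sell
    for year in range(1, max_years + 1):
        own_cost = transaction + price * annual_carry_pct * year
        rent_cost = monthly_rent * 12 * year
        if own_cost < rent_cost:
            return year
    return None  # renting stays cheaper over the whole horizon

# e.g. $1,000/mo rent vs a $300k house in an expensive market:
print(buy_vs_rent_years(1000, 300_000))
```

The dominant term is the one-time transaction cost, which is why short holding periods favor renting: it takes years of rent savings just to earn that back.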

Save vigorously, but not so much that you have a dreadful life now, pining for the future when things will get better once you have "retirement money."

k2xl 2 days ago 0 replies      
A subreddit I highly recommend (depending on your goals) http://www.reddit.com/r/financialindependence
sailfast 2 days ago 0 replies      
Max out your annual IRA contributions you idiot. Don't break the glass unless it is to travel or be with family. Set up YNAB so you aren't kidding yourself with rotating card spending. Cap your bar budget.
LVB 2 days ago 0 replies      
1. Understand the basics of asset allocation, taxes and account types. Read Bernstein for primers on these.

2. Be highly skeptical of most of the financial services industry, especially those selling load funds, insurance, annuities, and who want to manage your money.

3. Enjoy simple cars, or no car if you can manage. The amount of money I've seen friends and family dump into vehicles over 25 years is staggering. I don't even see cars at this point. I don't care what others drive, and I don't care what I drive so long as it's reasonably comfortable, safe, economical.

ozzmotik 2 days ago 0 replies      
stop spending so much money on mind altering substances
AdamN 2 days ago 0 replies      
Have fun! Often doing things that are actually fun are the cheapest - like buying an old wind-up record player and some 78s from the Salvation Army and having a picnic with your friends (or a date!)
a5seo 2 days ago 0 replies      
If you are market timing, remember you have to be right twice, once when you buy and second when you sell.

Ideally, you'll come to realize that trading is a waste of your time, and you should set and forget a regular investment flow into the Wilshire 5000 or something equally diverse.

Lastly, don't let FOMO lure you into investing in the new hotness of your age. For me, it was Internet stocks in '98-99. By the time you're hearing about it and it's productized in a way consumers can get involved, it's too late.
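
A "set and forget" flow is just the future value of an annuity. As a sketch (the $500/month contribution and 7% return are assumptions, not promises):

```python
def fv_monthly(contribution, annual_rate, years):
    """Future value of fixed monthly contributions, compounded monthly."""
    r = annual_rate / 12   # monthly rate
    n = years * 12         # number of contributions
    return contribution * ((1 + r) ** n - 1) / r

# $500/month into a broad index fund for 30 years at an assumed 7%:
print(f"${fv_monthly(500, 0.07, 30):,.0f}")
```

Under those assumptions the balance ends up around $610,000 on $180,000 of contributions, which is the case for starting the flow early and leaving it alone.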

gao8a 2 days ago 0 replies      
- If a decision comes down to either not having enough time or not having enough money, rest assured there will never be a sweet spot where you have both.

- Always take into consideration mental health cost. Your commute, your work, the people you choose to surround yourself with. Debt in this area is unpredictable and therefore dangerous in the long run.

- No one has it figured out. Youth will always be wasted by the youth.

Good luck.

bungie4 2 days ago 1 reply      
(From a 57 yr old talking to his 25 yr old self)

If I had all the money lost from 'stock market corrections' on my investments, I could retire comfortably today.

Stop being a little fishy swimming with big fishies.

The interest paid today is a pittance compared to the risk. Save your after-tax money in something with near-zero risk until interest rates rebound to make the reward worth the risk.

theandrewbailey 2 days ago 0 replies      
Good job on hitting the tuition free college jackpot. Keep paying off the rest of those loans, but don't stress out about it. Those will be paid soon enough, but get a credit card before they are. Keep being careful about what you're spending.

Move closer to your office. Even if the rent is a little more, the price is worth it if you don't have to use your car all the time.

zrail 2 days ago 1 reply      
Max out your 401k, even if the options look crappy.
miguelrochefort 2 days ago 0 replies      
Time in the market beats timing the market.
dennisgorelik 1 day ago 0 replies      
By the time I was 25 I had already followed all the good advice listed in the comments here.

The only advice missing is the kind of prediction that would require actually observing the future (e.g. buy GOOG/AAPL/AMZN).

sotojuan 2 days ago 0 replies      
This might be dumb and I'm only 23 but if you live in an expensive city/area, get a roommate/SO/whatever for a year or so. Funnel all saved living expenses to loans/savings account. It's not as nice as being alone but man, it's a lot of money saved.
marsrover 2 days ago 1 reply      
Don't move into an apartment that costs 50% of your salary, no matter how nice it is.
NumberCruncher 2 days ago 0 replies      
Save up as much as you can and put it into risk-free investments. Wait until there is blood in the streets, then put everything into the DAX/DOW/whatever index or an economy too big to fail. Sell out at 100% profit. Repeat.
yotamoron 2 days ago 0 replies      
Forget the money, so you can make enough of it doing what you love: https://www.youtube.com/watch?v=khOaAHK7efc
kapauldo 2 days ago 0 replies      
Put the IRS limit into an IRA invested in the S&P 500 and forget about it.
DanBC 2 days ago 0 replies      

Also, put some money aside for savings.

EliRivers 1 day ago 0 replies      
Don't waste your time with cash savings. All of it into a handful of low cost index funds with dividends reinvested.
bluedino 2 days ago 0 replies      
Save for 6 months and buy that $2799 PowerBook with cash instead of paying the minimum payment on that 22% MacMall card for the next 5 years.
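
For scale, here is roughly what that financing costs, assuming a standard fully amortizing 60-month loan at 22% APR (a real minimum-payment card schedule stretches even longer, since the payment shrinks with the balance):

```python
def monthly_payment(principal, apr, months):
    """Fixed payment for a fully amortizing loan (standard annuity formula)."""
    r = apr / 12  # monthly interest rate
    return principal * r / (1 - (1 + r) ** -months)

price, apr, months = 2799, 0.22, 60
pay = monthly_payment(price, apr, months)
print(f"${pay:.2f}/mo for {months} months, "
      f"${pay * months - price:,.2f} in interest on a ${price} laptop")
```

Under those assumptions the interest alone comes to well over $1,800, i.e. roughly two thirds of the PowerBook's sticker price again.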
brimstedt 2 days ago 0 replies      
- save in stocks, not savings accounts or funds

- dont waste money on tv subscriptions

- play less videogames

- take greater care of your friends and relations

floorlamp 2 days ago 0 replies      
For me this was last year, so just this: Convert all Bitcoin to Ether and then buy more Ether with USD.
_e 2 days ago 0 replies      
Cash is king.
bencoder 2 days ago 1 reply      
Don't sell those 1000 bitcoin
shove 2 days ago 0 replies      
"that girl is poison"
iamgopal 2 days ago 1 reply      
I should have invested myself instead of giving others advice to invest in it. Really.
pdnell 2 days ago 0 replies      
Invest that $10,000 in bitcoin.
kasey_junk 2 days ago 0 replies      
Don't buy that condo.
smarptlaya87 2 days ago 0 replies      
Well, I'm not really saving, I'm investing in my own company.

I'm building my startup, ejgiftcards.com. It's generating revenue with about 20-30% margins. Current revenue is about $50-60k per month.

taway_1212 2 days ago 0 replies      
Consider jobs outside your (my) country ASAP; they do pay better.
amingilani 2 days ago 0 replies      
Turned 25 last month; your advice is very helpful.
unstatusthequo 2 days ago 0 replies      
Put $150 a month into a RothIRA. Every month.
jshelly 2 days ago 0 replies      
Want vs. Need
petraeus 2 days ago 0 replies      
Start playing the stock market
dec0dedab0de 2 days ago 2 replies      
Do not buy a house.
kamaal 1 day ago 0 replies      
I'm a little late to this but this might help you. I'm an Indian, but I assume you are in the US. Having spent some time in the US, this is what I would advise:

- Go get yourself a savings account and a checking account. Fill only enough amount in your checking account that you need to get by the month. Remaining goes into savings.

- Buy a home as quickly as you can, in an affordable place on the outskirts. By the time your kids arrive the necessary infrastructure will be in place. Also, rent is just another form of tax. And having your own home means having a place to rest, without financial implications, when you are old.

- Take the 401K plan seriously.

- Max out other instruments such as the IRA and Roth IRA.

- Buy a durable, long lasting car. And stick with that as long as it lasts.

- Healthy lifestyle. Nothing pays as well as good health. Buy a bicycle or play a sport. Ensure your heart is healthy and you are not obese. There are other parts to this, like learning to cook healthy food. Remember that bad health will also eat a big chunk of your earnings in a place like the US.

- Be frugal. Frugality means making decisions that pay off in the long run. $5 may buy you a burger combo at McD, but trust me, it will cost you in the long run. You don't want that kind of frugality, which is why learning how to cook makes even more sense.

- Be productive, at all ages. Keep free time to network and develop new skills. Never be afraid to start from the beginning or to learn and do something new.

- Lastly save. Save a lot.

SirLJ 19 hours ago 0 replies      
The Lottery numbers for the big jackpot...
Psyonic 2 days ago 0 replies      
Go all in on ethereum.
taf2 2 days ago 1 reply      
Buy redhat stocks it's at like $4. Oh buy apple, google and amazon too.
       cached 22 June 2017 20:05:01 GMT