hacker news with inline top comments    .. more ..    1 Jan 2017 Best
Lessons from 3,000 technical interviews interviewing.io
682 points by leeny  3 days ago   311 comments top 46
forrestbrazeal 3 days ago 7 replies      
The author draws a hard distinction between Udacity/Coursera MOOCs (good) and traditional master's degrees (bad). I'll interject that with Georgia Tech's Online Master's in Computer Science program [0], which is delivered via Udacity and insanely cheap [1], you can get the best of both! (Their "Computability, Complexity and Algorithms" class is one of the top Udacity courses cited in the article.)

Keep in mind that a traditional degree program does have a huge advantage over a strict MOOC: accountability. It sounds good to say that anybody can go push themselves through one of these courses. Try pushing yourself through ten, and actually writing all the papers and implementing all the code, while working full time and having a family. That grade looming at the end of the semester really does wonders for your motivation. Plus you can get help from live professors and TAs, and the Piazza forums for OMSCS are full of smart, curious students who love talking about the subject at hand. There's a richness to the degree experience that I don't think you get with scattered classes.

(Obvious disclaimer: I'm a current OMSCS student)

[0] http://omscs.gatech.edu
[1] https://www.omscs.gatech.edu/program-info/cost-payment-sched...

graffic 3 days ago 1 reply      
"Whether passing an algorithmic technical phone screen means youre a great engineer is another matter entirely and hopefully the subject of a future post."

This sentence, plus the inverse correlation between experience and "interview performance" shown there, suggests these interviews are biased toward the platform's own style rather than reflecting real technical interviews.

From the data, it looks like the questions asked on that service are the ones you might learn in university; after many years of not using that knowledge, it fades away.

This is reinforced by MOOCs being the 101 of the subject they're dealing with. It would be interesting to see if there are trivia questions from 101 courses.

The most obvious bias is in the clickbait title. Those 3K interviews are in a specific platform, meaning they're done in a specific way.

So after checking their results, it seems that interviews done through that service benefit people with fresh university or 101-level knowledge.

What worries me more is the lack of improvement and perhaps the moral superiority of ending the article with "these findings have done nothing to change interviewing.io's core mission". It feels like the entire statistics exercise shown there was to feed back what they already knew.

fecak 3 days ago 4 replies      
Thanks for writing this Aline. As a recruiter for almost 20 years, I wish I had access to all my data and then the time to compile it, and anecdotally I'd expect the finding about MOOCs would be similar.

The most selective of my hiring clients over the years tended to stress intellectual curiosity as a leading criterion and factor in their hiring decisions, as they felt that trait had led to better outcomes (good hires) over the years. MOOCs are still a relatively recent development and new option for the intellectually curious, but it's not much different than asking someone about the books on their reading list.

Unfortunately, demonstrating intellectual curiosity often takes up personal time, so someone with heavy personal time obligations and a non-challenging day job is at a significant disadvantage. One could assume that those who have the time to take MOOCs also have time to study the types of interview questions likely favored by the types of companies represented in this study.

Thanks for continuing to share your data for the benefit of others.

blazespin 3 days ago 9 replies      
I am perplexed why anyone would think that interview performance has any interesting statistical relevance. Much more interesting would be how successful the candidate was after receiving a job at the company.
closed 3 days ago 1 reply      
Interesting article! Some minor statistical pet peeves:

1. Setting non-significant bars to 0 seems fishy. Leaving them and putting confidence intervals on everything would let them speak for themselves.

2. Calling something effect size is ambiguous. That's like saying you measured distance in units (and the wiki article on effect size linked makes clear there are a billion measures of effect size).

I'm guessing their measure of effect size was the beta coefficients in a multiple regression?

kcbanner 6 hours ago 0 replies      
Interesting points, but needs better statistical analysis.

For instance:

* There is high effect (which I assume is correlation?) between performance and being at a top company, with less effect from top school. How far out of school is the interviewee? How far out of the top company is the interviewee? The time impact is probably a confounder.

* Years of experience has no effect. This may be due to survivorship bias, where the top potential performers don't need to do interview practice on their site.

* Speaking of having no effect, there is no such thing as "not achieving significance"... I'd rather see the estimated effect size with error bars. Is the "founded a startup" listed at zero because it is at 0.05 +/- 0.10 effect, or 0.40 +/- 0.50 effect?

* There is mention of Coursera/Udacity having a huge effect, but not when coupled with top company. There is some speculation as to why, but it leaves out some other possibilities that can be easily tested. For example: are the people who don't take Coursera courses and are not from a top school significantly worse than everyone else?
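The error-bar presentation these comments ask for can be sketched with ordinary least squares on toy data. Everything here is invented for illustration (attribute names, effect sizes, sample size); it is not the article's data or method, just the CI-instead-of-zeroing idea:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two binary candidate attributes (e.g. "top school", "took a MOOC")
# and a noisy interview score with one strong and one weak true effect.
n = 500
X = rng.integers(0, 2, size=(n, 2)).astype(float)
score = 1.0 + 0.8 * X[:, 0] + 0.05 * X[:, 1] + rng.normal(0, 1, n)

# OLS via least squares, with an intercept column.
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, score, rcond=None)

# Standard errors of the coefficients from the residual variance.
resid = score - A @ beta
dof = n - A.shape[1]
sigma2 = resid @ resid / dof
se = np.sqrt(np.diag(sigma2 * np.linalg.inv(A.T @ A)))

# Report ~95% confidence intervals instead of zeroing out coefficients
# that miss a significance threshold: a wide interval straddling zero
# and a tight one near zero tell very different stories.
for name, b, s in zip(["intercept", "attr_1", "attr_2"], beta, se):
    print(f"{name}: {b:+.2f} +/- {1.96 * s:.2f}")
```

With intervals shown, the "founded a startup" question above answers itself: 0.05 +/- 0.10 and 0.40 +/- 0.50 are both "non-significant" but mean different things.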

k2xl 3 days ago 1 reply      
Interviewing is a funny thing.

I remember when I graduated from a "Top School" and interviewed at "hot startups" from the valley. I aced a lot of the interviews - why? Because I had just taken classes on LinkedLists, Binary Trees, HashMaps, etc... So when they asked me to whiteboard a "shortest path algorithm" it was just rehashing what I did in school.

Years later, looking back, I fail to see the relevance in most of the technical questions. In fact, if I had to do those questions over again today I would probably fail miserably. Yet, I have been in the industry for a while now and have worked with countless more technologies and have accomplished far more than my younger self.

Just because someone performs well in a technical interview doesn't mean they will do a good job. That is the data that really matters. I've interviewed hundreds of candidates as a hiring manager for some big startups, and from my experience technical interviews are not a great indicator of success.

I'm saying this coming from someone who has gone to a "Top School" and done multiple Coursera/Udacity/etc classes.

Yes, someone might be able to whiteboard a random forest or write a merge sort, but do they know how to engineer a system? Can the candidate:

> Communicate well with others in a group?

> Solve unique technical problems?

> Research and learn new technologies effectively?

> Understand how to push back to product owners if there's scope creep?


These are all things that are not really analyzed in many technical interviews.

As I'm reading this analysis all I can think of is that it is pretty useless - if not dangerous for the industry.

What I've found is that it is critically important that someone knows how to code at some basic level. But their ability to code and explain algorithms on the fly, while probably relevant in academia/research, is such a minor part of the day-to-day of a programmer - At least from my experience.

geebee 3 days ago 5 replies      
Interesting bit on the MS degree. I followed the link, and I'm not quite as surprised that the correlation is poor, or even negative, given the way the data was collected and analyzed.

Absolutely agree that some MS degrees have become less-rigorous cash cows by now, allowing students to skip fundamentals such as data structures, operating systems, and compilers.

However, many CS MS degrees actually do require this as a background, to the point where some programs have emerged to prepare non-CS majors for MS degrees, kind of like those post-bac premed programs. It's hard to believe that those MS degrees, which require a decent GPA in those core courses, along with high GRE scores (sorry, but we are talking about interviewing skill, which may be more related to exam taking ability than job performance), wouldn't result in a similar profile to people with CS degrees from top schools.

This is fully acknowledged in the text of the article referenced in a link, but unless people follow it, I do think the message may be a bit misleading.

That's an aside, though. The value may very well be in the prep for these degrees (ie., the post-bac CS coursework required for admissions to a reputable MS program). If you can get that through online courses (udacity or coursera) through genuinely rigorous self-study? Yeah, that might do it, for far less money. I've audited a few of them, and they're the real deal, that's the real coursework there.

Futurebot 3 days ago 3 replies      
The takeaway from this is that those who do best are those with:

- the wealthiest/most financially supportive parents/relatives

- upbringings that are conducive to academic success

- the most free time

as those are the ones who, by a large margin, attend top schools, work at top companies, and have time to spend on self-learning. Another data point of confirmation of a well-studied idea.

Assortative mating: http://www.economist.com/news/briefing/21640316-children-ric...

Few poor at rich schools even all these years later: https://www.nytimes.com/2014/08/26/education/despite-promise...

Why people care about elite schools: https://medium.com/@spencer_th0mas/why-america-cares-about-e...

Apocryphon 3 days ago 3 replies      
Not to harp on the "technical interviews are disconnected from actual work!" angle too much, but I'm reminded of a comment from a thread about the creator of Homebrew failing a Google interview. Someone pointed out that it goes to show that it's possible to create widely-used software without an intimate knowledge of CS. I wonder if that's a disconcerting fact for some employers to grapple with.
ma2rten 3 days ago 3 replies      
Until recently I worked at a startup as a Machine Learning Engineer/Data Scientist. There I got some experience interviewing people and looking at their resumes. In my experience, which is very limited compared to this post's, people who put a MOOC on their resume are usually less qualified than people who don't.

There is nothing wrong with MOOCs, but they are almost always beginner-level. If you put them on your resume, it kind of implies you don't have a lot of experience beyond that. Putting the Coursera Machine Learning course on your resume would be the equivalent of a software engineer putting Java 101 on theirs.

I would recommend putting projects on your CV instead. Even if you don't have a lot of work experience, just put side projects and school projects on there.

collyw 3 days ago 5 replies      
Interesting and surprising, especially the experience thing. I think I am a significantly better engineer than earlier in my career, so I assumed experience would count for a fair bit. Then again I have inherited projects from experienced guys who make crap high level architecture decisions and the code is way more difficult to work with than it ought to be.

But then this article seems to be measuring interview performance, not actual ability on the job. So is any of it actually relevant at all?

bhntr3 3 days ago 0 replies      
I wonder if "took courses" could be a stand in for "prepared heavily". It seems like people with all the other attributes might think they didn't need to study. People without them might think they did and took courses to "catch up". In my experience, preparation is the key driver of performance in these types of interviews.

It seems reasonable that a person who took a MOOC might have prepared in other ways as well while people who didn't probably didn't prepare much at all (since watching a few Algo lectures seems the most accessible refresher.)

serge2k 3 days ago 0 replies      
That graph is about the best evidence I've seen that interviews are garbage. Years of experience doesn't matter at all? Coursera courses are the best thing ever?
chewyshine 3 days ago 1 reply      
Top school is probably serving as a proxy for intelligence in this analysis...a well known predictor of both interview and actual job performance.
memracom 3 days ago 0 replies      
I think you are seeing the effect of people who have decided for themselves to pursue lifelong learning. The Udacity/Coursera thing just clusters these people in a way that you notice them in the stats. But remember that statistics do lie. You need to dig into the reality behind the numbers, and question whether you are measuring all the right indicators.

My experience comes from several decades developing software and from time to time, hiring people. The people that worked out best, either as colleagues or hires, always seemed to be learning new things and were ahead of the curve trying out new techniques or tools before they became popular.

If you understand how a tool/technique becomes popular as the mass of software developers wrestle with new problems and finally find a way to master them, then it makes sense that constant learning makes some people stand out of the crowd. They happen to be the first ones to learn the new tool/technique and if they do not introduce it to their development team, then when management does make the decision to introduce it, the folks who know how to drive it have a chance to excel and appear to be rocket scientists.

grogenaut 3 days ago 0 replies      
Unless I missed it in the article, the data is all about passing the interview, not actually seeing whether any of these things correlate with the employees working out over 1-, 3-, or 5-year time spans.

With this data you're just biasing towards people who interview well, which I don't think you actually care about.

Well, I mean, I guess you do if you're a recruiter (if you're a moral recruiter you care about both), but not really if you're an employer.

sundvor 3 days ago 0 replies      
Searched the article and the comments here for "Pluralsight", with zero hits. So what makes Udacity/Coursera preferable? TL;DR: I'm asking this as Pluralsight was a significant contributor to my landing my latest role after redundancies.

The long version: I recently landed a role after some time off, having changed from mainly back end Php/Coldfusion to C# in the last year. I was able to make the switch in my last role. For me, moving to C# was a big transition; as well as guidance from a (fantastic) mentor, I used Pluralsight to learn C#, asp.net and DDD - e.g. from Jon Skeet, Scott Allen and Julie Lerman, to mention but a few.

Being completely burnt-out on the old stacks, I was set on making my next role a C# one. I've come to love what Microsoft are doing with Core, open sourcing etc, as well as the strictly typed C# language and ability to use NCrunch with live unit tests. So I signed up for a year after relinquishing my corp subscription, kept doing their courses, and found the training material highly accessible with great quality content. Each interview was a learning process, when I didn't know something from a test, I'd go and study it so that I'd be better prepared for the next role. One of these was the study of data structures and basic computer algorithms, where I was lacking. I might not have had years of experience, but the experience I had was mostly best practice.

During my search, I typically got great feedback on the fact that I was doing Pluralsight courses, and it was a significant factor in being hired for the new role - it showed cultural fit, in addition to passing their tech tests (which happened to involve structures). My company had interviewed a lot of candidates, struggling to find the right talent. Just possessing technical skills is one thing, having the right attitude towards learning is another.

At any rate, I'll keep using Pluralsight to raise my proficiency in my new stack - even as an old timer, I am having a newfound level of enthusiasm towards my whole profession which I haven't felt since I coded in assembly on the good old Amigas. I would be interested in knowing why Coursera / Udacity might be better or more accepted in the marketplace though.

faitswulff 3 days ago 7 replies      
It's rather shocking how much effect Udacity/Coursera had on interview performance - more than graduating from a top school or being employed at a top company:

"...only 3 attributes emerged as statistically significant: top school, top company, and classes on Udacity/Coursera."

ordinaryperson 3 days ago 1 reply      
The master's in CS can be useful if:

1. You have an undergrad degree in liberal arts
2. You pay as little tuition as possible
3. You take no time off and continue to work FT

These apply to me -- my undergrad was in English, I paid 6k total (27% of the 21k total cost) and went to school at night over 4 years while my career continued to progress.

Most of the people in my program couldn't write a FOR loop if their life depended on it; they viewed it (incorrectly) as a jobs program, while the school needed the $$ to keep the dept afloat, so I'm not surprised they fared poorly in technical interviews.

But that doesn't mean the degree isn't useful. If you're already a programmer, it helps get your foot in the door at many places. HR managers/recruiters feel more confident forwarding on your résumé; they can't parse your GitHub repos.

The degree is icing on the cake, it's not going to magically turn you into the Cinderella of Programming if you have no real-world experience. I got my master's with a QA and a paralegal and today? They're still a QA and a paralegal.

That being said, timed technical interviews are almost universally asinine, IMHO. When in real life do you have 10 minutes to figure out a problem? Or are prevented from Googling the answer? The measure of successful programmers is how efficient and professional they are in problem solving, not how much useless information they can keep in their head.

Things I've never had to do in 'real' life:
- Never had to split a linked list given a pivot value
- Never had to reverse a string or a red/black tree
- Never written my own implementation of Breadth First Search

etc etc

Personally I'd rather see take-home assignments that roughly approximate the type of work you'd do, which in my career has been churning out new features or applications. Does knowing the time-complexity of radix sort vs heap sort really have a material impact on your effectiveness as a programmer? No.

Bahamut 3 days ago 0 replies      
It should be noted that these technical interviews are biased to a particular style, so the data only really is of relevance for these types of interviews.
maverick_iceman 3 days ago 0 replies      
This is a very poorly done analysis. At a minimum she needs to define top school/top company. Also I'd like to see the confidence intervals around the effect sizes. In addition, looking up MOOC information from LinkedIn may result in a lot of false negatives. (She doesn't mention if MOOC courses in non-CS subjects count.) Did all the interviewees have CS degrees? What about the Masters degrees, is she including non-CS ones? Is the sample of interviewees representative or there's any selection bias that we should be aware of?

A study which doesn't answer so many basic methodological questions is garbage.

acjohnson55 3 days ago 0 replies      
On the master's front, I went down a slightly unusual path. I enrolled in a master's program in music technology at NYU [1]. I already had a master's in engineering from Princeton [2], but after time away from the software world, I wanted to retool for a return to engineering, but with a focus on applications that actually mattered to me.

It turned out to be a very expensive, but very fulfilling decision, and it paved a route for a very successful past four years.

Compared to my first master's, it was less theoretical and much more project-based. In that sense, it was fantastic preparation for career work, because every semester, I had to conceptualize and ship 4-5 different projects in all sorts of subject areas. The value of that shouldn't be underestimated. It also directly led me to cofounding a startup that had a brief lifetime, but effectively converted me to a full-stack engineer.

Today, I don't use much of the subject matter I learned in my day-to-day, but I draw on the creativity, problem-solving skills, and work patterns every day.

My Princeton program was great too, but I thought I'd share about the NYU program, as that was the more outside-the-box choice. There's something special to be said for a master's degree when it's interdisciplinary and lets you focus on the intersection of engineering skills and subject matter expertise.

[1] http://steinhardt.nyu.edu/music/technology

[2] http://ee.princeton.edu/graduate/meng-program

kenoyer130 3 days ago 0 replies      
We really need a further correlation between people who pass the interviews and job performance a year later. I do a lot of interviewing at my current job and we have found no strong correlation at all between CS skills and actual ability to "get things done".

We toned down the CS type questions since they tend to take too long. We still ask a few basic tree and string manipulation questions to weed out the people who have no idea how to program and get insight into how the person thinks.

I still feel at the end of the day we could flip a coin on accepting an interview candidate once they have shown basic competency and have the same results.

I have been telling candidates that a public GitHub repo with a nice commit history carries much more weight with me than a CS degree, since we have been burned so many times before.

sytelus 3 days ago 0 replies      
yes, this is absolutely startling:

For people who attended top schools, completing Udacity or Coursera courses didn't appear to matter. (...) Moreover, interviewees who attended top schools performed significantly worse than interviewees who had not attended top schools but HAD taken a Udacity or Coursera course.

A possible explanation might be that people going through a regular degree typically spread themselves thin over many subjects (digital electronics, compiler design, OS theory, networking, etc.) while MOOC folks focus sharply on exactly the things needed for interviews (i.e. popular algorithms). It's like interval training for one specific purpose vs. a long regime for fully rounded health. The problem here is not the academic system but how we measure performance in interviews. I highly doubt the results would be the same if interviewers started asking questions from all these different subjects instead of just cute algorithm puzzles.

AlexCoventry 3 days ago 0 replies      

 If you know me, or even if you've read some of my writing, you know that, in the past, I've been quite loudly opposed to the concept of pedigree as a useful hiring signal. With that in mind, I feel like I owe it to readers to clearly acknowledge, up front, that what we found this time runs counter to my stance.
Did the interviewers have access to the applicant's resume? If so, to what extent do these results simply reflect the interviewers' bias for top schools and famous companies?

lgleason 3 days ago 0 replies      
While I do think that interviewing is broken, I would love to see the raw data behind this. For example, did Udacity courses correlate with other traits, e.g. did those candidates also have a certain number of years of experience, a degree, etc.? 3,000 is a small sample size and I'm wondering if there is some sampling bias here.
pklausler 2 days ago 0 replies      
I conduct lots of tech interviews for SWE positions, and as everybody's boning up on algorithmic trivia, I've learned that I can get a stronger hiring signal by asking simpler questions that people with an aptitude for programming will succeed on and people with an aptitude for memorizing the implementations of algorithms will not.

(Simple example: given two closed intervals [a..b] and [c..d], how do you compare the four values to determine whether or not the intervals overlap? You may laugh, but it defeats about 50% of candidates in the first minute of an interview because they just don't understand simple relationships and Boolean expressions.)
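The answer the commenter is fishing for reduces to two inequalities; a minimal sketch (function name is illustrative, and it assumes a <= b and c <= d as stated):

```python
def intervals_overlap(a, b, c, d):
    """True iff closed intervals [a, b] and [c, d] share at least one point.

    The insight being tested: two intervals overlap exactly when each
    one starts no later than the other ends. No case analysis needed.
    """
    return a <= d and c <= b
```

For example, `intervals_overlap(1, 5, 4, 9)` is True, `intervals_overlap(1, 2, 3, 4)` is False, and touching endpoints such as `intervals_overlap(1, 5, 5, 9)` count as overlap because the intervals are closed.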

Mister_Y 1 day ago 0 replies      
One of the reasons I got hired by Airbnb is that I took MOOCs, but I also believe that most of my knowledge comes from reading books, and that's a thing I didn't put on my CV. So, even if showing interest in learning opens up a huge number of opportunities for you, I think you actually have to go deeper than just enrolling in a couple of MOOCs.
analog31 3 days ago 0 replies      
Something I wonder is how the participants in these interviews were selected from the general population of job candidates. Painting with a broad brush, the best workers might not even be candidates, because they've already been hired. And the best candidates might be the least likely to seek coding interview practice.
lintiness 3 days ago 1 reply      
"Im excited to see that what mattered way more than pedigree was the actions people took to better themselves."

so a degree from a top school is not earned (nor are admissions i guess), but rather conferred at birth? i beg to differ.

the commentary on the "disutility" of master's degrees is even worse.

chvid 3 days ago 1 reply      
I have been through an interview process many times; I have never been asked a technical question or asked to do problem solving on a blackboard or computer.

I guess that that form of interviewing is simply not common in my neck of the woods (Denmark).

I am curious as to what sort of questions / tasks are actually given to the interviewee.

And are they in any way biased towards more textbook/academic ones? (I.e. "implement bubble sort" rather than "create a blue button").

henrik_w 3 days ago 3 replies      
I thought the most interesting finding was that completing Udacity or Coursera courses on programming/algorithms (for non-top-school graduates) was highly predictive of strong interview performance.
ggggtez 3 days ago 0 replies      
Interestingly, they suggest that if you attend a top school, the effect of Udacity is negligible. I'd argue that Udacity is thus filling in the gaps of a poor education.
clark-kent 2 days ago 0 replies      
Basically, a "bad" programmer who can't write maintainable code but prepares for technical interviews by brushing up on algorithms and whiteboard-style questions will do better than a very good programmer with many years of experience.
allThumbs 3 days ago 7 replies      
I feel like things are operating according to the following pattern:

 1. Go to college:
    a. spend many semesters in lectures, all of which tangentially brush upon the final exam based on the whims of the lecturer.
    b. cram for the final exam in a last-minute panic to crunch memory, according to advice on content which was brushed upon during lectures.
 2. Interview for job:
    a. cram for the interview by going to Coursera to crunch memory according to interview memes, based on the whims of the interviewer.
    b. spend the rest of the term of employment exercising skills which tend to be tangentially brushed upon during both interview and schooling, while the majority of actual tasks are often googled and stack-overflowed into place based on arbitrary design decisions and politicized stack choices.
 3. Results:
    a. good interviewees have learned appropriate memes to reassure interviewers.
    b. good students have learned obligatory cruft to reassure professors.
    c. actual necessities are tangential to many or most entry barriers.
How accurate is this?

sgt101 3 days ago 0 replies      
When you are interviewing for a specialist post (and most posts are specialist to some degree) you are looking for evidence that the candidate can do that particular job. Therefore a course that indicates that they have the particular skills required is highly desirable!
shanwang 3 days ago 0 replies      
I'm not surprised that MOOCs are a big factor; people like me who left school years ago have forgotten how to write a BFS, and we need something to brush up on that knowledge.
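For reference, the BFS in question is a short refresher; a minimal sketch on an adjacency-list dict (names and graph shape are illustrative):

```python
from collections import deque

def bfs_distances(graph, start):
    """Shortest hop count from start to every reachable node
    in an unweighted graph given as {node: [neighbors]}."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in dist:          # first visit is the shortest path
                dist[nxt] = dist[node] + 1
                queue.append(nxt)
    return dist
```

For example, with `graph = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}`, `bfs_distances(graph, "a")` returns `{"a": 0, "b": 1, "c": 1, "d": 2}`.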

If you run statistics against using sites like careercup, you may find that being the top factor.

jventura 3 days ago 1 reply      
It was a very good read, but I wonder how interviewing performance relates to ("real") job performance?
Eridrus 3 days ago 0 replies      
Huh, I hadn't bothered to list MOOCs on my resume since I didn't think employers would be interested, maybe data like this will make employers more interested in the courses, which would probably get more people to shell out for the certificates.
bootload 3 days ago 0 replies      
"We got this data from looking at interviewees LinkedIn profiles."

Verification of completion and award ID? There are a lot of individuals who will add a degree regardless of attendance, completion, or award. Who validates the assertions?

ditonal 3 days ago 4 replies      
Very unsurprising for me. You are measuring your ability to solve algorithm puzzles. Most engineers don't actually do many algorithm puzzles in day-to-day work, especially the types of algorithms that interviews tend to focus on like sorting and dynamic programming. So "years of experience" is not measuring experience in what you're actually being tested on. On the other hand, you do exactly those types of things in many CS classes, and in Coursera classes, algorithms are exactly what you practice. So it makes sense it correlates.

Top company is a predictor for the obvious reason - it's selection bias for people who already passed those interviews at the company. You're not good at the interview because you worked at the company, you work for the company because you're good at the interview.

Master's degrees seem like largely for international students needing visas, career switchers, etc so not surprised they are not a strong predictor. And if anything the course material moves past the intro data structures stuff the whiteboard interviews tend to test.

The only huge surprise for me here is that Coursera is a stronger predictor than top company and top school. I would have predicted top company > top school > Coursera.

The post that I would be much more interested in is correlating performance reviews to interview performance. That gets suggested as a possible future post.

boha 3 days ago 0 replies      
Sad to see so much detail paid to the data, and so little to the setup of the experiment itself.

It shouldn't be surprising that an online technical screen favors candidates who've participated in a MOOC, but is blind, say, to years of experience. A screen like this is timed-performance-at-a-distance, which resembles MOOC participation. The full spectrum of qualities that comprise a Good Hire might incorporate the other signals from the post, but this type of interview won't test them.

(I'll be the first to admit I'm biased against performative coding in engineering interviews. Tech screens like this are often necessary, though, so they have their place.)

pmiller2 3 days ago 0 replies      
Was there any kind of statistical correction applied when the data were partitioned into MOOC + top school vs MOOC + no top school?
conqrr 3 days ago 0 replies      
Slightly off-topic, but does anyone have an invite for this platform? I have been trying to get one for ages.
marsian 3 days ago 0 replies      
Is this guy a paid shill for academic friends trying to boost enrollments and overcome the disillusionment of the younger people who realize too much emphasis is placed on academics and not enough on practical application?

The world needs more vocational schools and trade schools and technical schools than it does colleges and universities.

China announces ban on ivory trade by end of 2017 bbc.com
453 points by adamnemecek  1 day ago   167 comments top 24
chanderson0 1 day ago 2 replies      
For data on elephant populations and the effect of ivory bans or increased demand, check out: https://elephant-atlas.org/ - including an API. The counts come from an awesome program called the Great Elephant Census (http://www.greatelephantcensus.com/).

Disclaimer: I work for The OCR

DoodleBuggy 1 day ago 4 replies      
They need to go further and ban any and all exotic animal trade for the quackery medicine and delicacy markets.
adamnemecek 1 day ago 2 replies      
You should also check out https://reddit.com/r/babyelephantgifs. We just finished a fundraiser for David Sheldrick Wildlife Trust (an elephant orphanage in Kenya) in collaboration with the UK branch of the organization. But you can still donate!


DSWT even made a special video for this fundraiser https://www.youtube.com/embed/ogJprDLQFl8

Just check out for example this little orphan https://www.instagram.com/p/BOi9KfJDSKz/

The situation is really bad: if the next 10 years are as bad as the last 10, elephants will be basically extinct in the wild. This will have wide-reaching consequences, as elephants are a keystone species, which means they are extremely important to their environment. If they go away, ecosystems will collapse, which will cause further unrest in the general region.


Furthermore, money poured into the black market tends to end up in the wrong hands (read terrorist groups) http://www.nationalgeographic.com/tracking-ivory/article.htm...

blondie9x 1 day ago 2 replies      
This is really great news! But we can go even further: let's continue by banning rhino horn markets (would we call them keratin markets rather than ivory?). Keratin is the primary component here, the same protein that makes up human fingernails also composing rhino horns.

In Vietnam and parts of China there are some who cling to a belief that eating rhino horn will cure or prevent cancer, or increase libido. Those who believe this probably aren't aware that eating their own fingernails would have the same effect.

thinkloop 1 day ago 4 replies      
There have been quite a few downvotes for suggesting that illegalization may not be the best remedy.

What are people's thoughts on something like: https://youtu.be/YUA8i5S0YMU


Controlled trophy hunting of big game is, sadly, one of the most effective ways of protecting the hunted animals.

finid 1 day ago 2 replies      
Good move, but that gives those traders about 12 months to "kick up the volume" of ivory traded.

After that, it becomes a typical black market affair, kind of like the trade in stolen artworks.

Pica_soO 1 day ago 5 replies      
I always wondered: why is the ivory market not flooded with fakes? Everything is faked in China, and this is a valuable good. Is there no way to synthesize dentin and flood the poachers out of the business?
gigatexal 1 day ago 3 replies      
Woot! Thank you China! Elephants are my favorite animal and it's sad to see such majestic creatures slaughtered for such nominal things. Or any animal, for that matter. Shark fin soup? Seriously, what a waste of a needed predator.
SpikeDad 20 hours ago 0 replies      
Coincidentally I watched a new documentary last night on Netflix named "The Ivory Game" (https://www.netflix.com/browse?jbv=80117533&jbp=0&jbr=6)

Heartbreaking documentary on the massive killing of elephants for ivory, and on how futile the efforts the African nations are making to stop poaching and killing by local farmers have proven.

Investigative segments include a Chinese journalist undercover with WildLeak talking to the Chinese criminals involved in the massive illegal ivory trade.

I hope this was instrumental in getting the Chinese government to actually set a ban date for ivory. Currently their regulations are so lax and so corrupt that it's easy for a "legal" ivory dealer to launder illegal ivory in order to sell millions of dollars' worth every day.

Did you know that over 1,000 Kenyan and other African game rangers have been killed by poachers while protecting elephants? Terrible.

c3534l 1 day ago 1 reply      
Am I the only one shocked to learn that ivory was still legal in China? I know China has a reputation for being a bit dystopian, but I thought this was one of those things that everyone agreed on.
mrcsparker 6 hours ago 0 replies      
This was a long time coming, and hopefully they'll do a lot against the black market as well.
hashkb 1 day ago 3 replies      
OK... how about rhino horn? Compared to rhinos, elephants are in zero danger. Not defending poachers, but this is the wrong thing to prioritize. (Also cheetahs, and probably a lot of other species)
seanmcdirmid 1 day ago 1 reply      
Again? They have already banned the ivory trade a few times, do they actually mean it this time?
Pica_soO 21 hours ago 0 replies      
We banned the hunt for dodos, T-rex and mammoths in Germany last year. Wasn't easy, but government heroics, they are possible.

Esotericism kills.

dschulz 1 day ago 2 replies      
"by the end of 2017"?

This must be a joke. Why not earlier? Why do they have to wait a whole year to enforce a ban like this?

It's almost like "China announces ban on human trafficking by end of 2017". Yeah, let's give criminals some time to find new career opportunities!

aryehof 1 day ago 0 replies      
Way to go China. Thank you.
Havoc 20 hours ago 0 replies      
Why not end of 2016?
aaron695 1 day ago 0 replies      
Sorry but I find most comments here @#!@ing inane.

We kill millions of pigs every year (intelligent animals) and farm millions of cows (driving many, many animals extinct through environmental destruction via the use of large amounts of land), but we expect China to care about the second-hand effects of the ivory trade.

Whereas they still have millions living in poverty, we are rich (speaking as a westerner who eats meat, not for all of HN).

We love to be racist don't we?

I guess if we convince the Chinese the Rape of Nanking was cows and not elephants they can become like us.

alphacome 1 day ago 0 replies      
how about rhinoceros horn trade?
adamnemecek 1 day ago 2 replies      
> I think people on HN perceive hunting or ivory trading as something bad such as human trafficking. It isn't.

They both involve the exploitation and suffering of intelligent, sentient beings for selfish reasons.

> My great-grandfather was an Ivory trader

Your great-grandfather was scum, and no amount of mental gymnastics will change that.

> With increasing government regulation and heavy handed attitude of regulators the entire industry got pushed underground.

Illegal markets tend to be a lot smaller than legal markets (in terms of amount of goods purchased at least).

> As far as Ivory trade in Kenya is considered the people who take money from the western world to conserve are often the leaders of the cartels who smuggle ivory.

Welcome to corrupt governments? Does the fact that laws cannot be enforced 100% mean that we don't need them?

> The result of the ban would be further exploitation of the poachers and hunters

You should start a charity to support these poor souls /s.

> and yes they are going to poach elephants at faster rate.

Can you explain to me why the poaching crisis wasn't this bad until 2008, when CITES lifted restrictions on ivory sales?

This whole discussion reminds me of this Onion video


truth_sentinell 1 day ago 4 replies      
beeman 1 day ago 2 replies      
tn13 1 day ago 2 replies      
Only if bans worked!
fma 1 day ago 2 replies      
What will happen to the people who hunt or trade ivory? Specifically, those who live in villages and depend on it as a source of income. Will there be assistance to wean them off elephant hunting?

I hope they get new skills and do not poach other animals.

Rust is mostly safety graydon2.dreamwidth.org
483 points by awalGarg  3 days ago   445 comments top 40
agentgt 3 days ago 3 replies      
I'm a lowly ancient Java programmer and I think Rust is far far more than safety.

In my opinion Rust is about doing things right. It may have been about safety at first but I think it is more than that given the work of the community.

Yes, I know there is a right tool for the right job and it is impossible to cover all use cases, but IMO Rust is striving for iPhone-like adoption.

I have never seen a more disciplined and balanced community approach to creating a PL. Everything seems to be carefully thought out and iterated on. There is a lot to be said for this (although ironically I suppose one could call that safe)!

A PL is more than the language. It is the work, the community, and the mindshare.

If Rust were concerned only with safety, I don't think so much work would have been done on making it so approachable for everyone, with continuous improvements to compiler error messages, easier syntax, and improved documentation.

Rust is one of the first languages in a long time that makes you think different.

If it is just safety... safety is one overloaded word.

rcthompson 2 days ago 3 replies      
I think Rust is mostly about safety in the same way that skydiving is mostly about safety. Having safety features that you know you can rely on allows you to take risks that you normally wouldn't in order to accomplish some really awesome things.

(I guess in this analogy C is a parachute that you have to open manually, while Rust is a parachute that always opens at exactly the right altitude, but isn't any heavier than a normal parachute.)

steveklabnik 3 days ago 4 replies      
I'll probably be writing a slightly longer response post to this later, but for now... EDIT: here it is: http://words.steveklabnik.com/fire-mario-not-fire-flowers

I think the core of it is this:

> Safety in the systems space is Rust's raison d'être. Especially safe concurrency (or as Aaron put it, fearless concurrency). I do not know how else to put it.

But you just did! That is, I think "fearless concurrency" is a better pitch for Rust than "memory safety." The former is "Hey, you know that thing that's really hard for you? Rust makes it easy." The latter is, as Dave[1] says, "eat your vegetables."

I'm not advocating that Rust lose its focus on safety from an implementation perspective. What I am saying is that the abstract notion of "safety" isn't compelling to a lot of people. So, if we want to make the industry more safe by bringing Rust to them, we have to find a way to make Rust compelling to those people.

1: https://thefeedbackloop.xyz/safety-is-rusts-fireflower/

yjftsjthsd-h 3 days ago 4 replies      
I was surprised to see Ada in the list of unsafe languages, since it always was sold to me as being designed for safety. A bit of searching leads me to believe that Ada is better about memory even though it mostly uses types for safety, and better enforcement of bounds on array access should solve overflow issues regardless. Am I missing something?
ocschwar 2 days ago 2 replies      
Rust is about letting the compiler slap you for your mistakes in the privacy of your own Xterm, instead of letting Jenkins do it 10 minutes later, in front of all your co-workers.
alkonaut 2 days ago 2 replies      
It's the combination of safe and performant that attracts me.

If you look at Rust from C, then the point is safety; but if you look at it from the other direction, e.g. from F#, then what attracts you is that you get the same safety guarantees (and perhaps a few more) but without the GC and heap overhead.

koja86 3 days ago 1 reply      
In case you missed it, there's a big disillusioned C++ crowd out there.

Just hear the pain: https://news.ycombinator.com/item?id=13276351

And some of them are watching you with great interest.

zpallin 2 days ago 0 replies      
I agree with the premise of the article.

However, I feel that Steve Klabnik is trying to dispel the myth that Rust is about nothing but safety, and to shape how other Rust developers talk about Rust, not to deny that Rust's central purpose is being a safe language.

This is because there is a lot of miscommunication about Rust. A lot of people who aren't immediately sold on the language walk away thinking it's slow (it's not), it's complicated (not really), and not production ready (it actually is). And that's because Rust developers don't know how to talk about Rust. I am guilty, for one.

Since Steve is such a huge part of RustLang development, it's his duty to direct the conscious effort to promote the language.

No reason to get into a debate over click-baity titles. :)

bassislife 3 days ago 8 replies      
The issue with safety is that nothing is really safe. Once you have some level of safety in your programming language, you realize that there are still a lot of other sources of hazard (hardware errors, programming logic errors, etc.).

So I guess, it would be better to say that Rust is about decreasing unsafetyness or whatever the correct word for that is.

edit: since I see posts about Go, this is evidently another approach toward decreasing unsafetyness by providing fewer and easier to understand primitives so that the programming logic is harder to write wrong. It might come at a moderate cost for some applications.

paulddraper 2 days ago 2 replies      
If you're a C++ programmer, Rust is mostly about memory safety.

If you're a Java programmer, Rust is mostly about tighter resource usage.

If you're a Python programmer, Rust is mostly about type safety and speed.

stcredzero 2 days ago 1 reply      
I do not mean to pick on C++: the same problems plague C, Ada, Alef, Pascal, Mesa, PL/I, Algol, Forth, Fortran ... show me a language with manual memory management and threading, and I will show you an engineering tragedy waiting to happen.

I think if programming is to make progress as a field, then we need to develop a methodology for figuring out how to quantify the cost-benefit trade-offs around "engineering tragedies waiting to happen." The fact that we have all of these endless debates that resemble arguments about religion shows that we are missing some key processes and pieces of knowledge as a field. Instead of developing those, we still get enamored of nifty ideas. That's because we can't gather data and have productive discussions around costs.

There are significant emergent costs encountered when "programming in the large." A lot of these seem to be anti-synergistic with powerful language features and "nifty ideas." How do we quantify this? There are significant institutional risks encountered when maintaining applications over time spans longer than several years. There are hard to quantify costs associated with frequent short delays and lags in tools. There are difficult to quantify costs associated with the fragility of development environment setups. In my experience most of the cost of software development is embodied in these myriad "nickel and dime" packets, and that much of the religious-war arguing about programming languages is actually about those costs.

(For the record, I think Rust has a bunch of nifty ideas. I think they're going down the right track.)

geodel 3 days ago 8 replies      
The original Rust author makes great points about safety. I think this new thrust toward marketing emerges from the Rust 2017 Roadmap, which puts Rust usage in industry as one of the major goals. Rust is currently about Go's age but nowhere close in usage. As the roadmap says, "Production use measures our design success; it's the ultimate reality check." I agree with that.
jMyles 3 days ago 6 replies      
> countless lives lost

I have no doubt that people have had their lives ruined, or even died, as the result of flaws in system programming, but is anyone actually tracking this? Is it "countless?"

pjmlp 3 days ago 2 replies      
> Modula-3, Eiffel, Sather

Nice to see these languages on the Rust team's radar, especially Sather.

Just shows how you guys have researched prior work, congratulations.

progman 2 days ago 0 replies      
The author states: "A few valiant attempts at bringing GC into systems programming -- Modula-3, Eiffel, Sather, D, Go -- have typically cut themselves off from too many tasks due to tracing GC overhead and runtime-system incompatibility, and still failed to provide a safe concurrency model."

Nim follows a different approach.

Details: http://nim-lang.org/docs/manual.html#threads

Benchmark (with Rust): https://github.com/costajob/app-servers

childintime 2 days ago 1 reply      
I think Rust is not about safety, but about reusability. Do you like taking a dependency on someone's code when it is in C? The answer is: roll your own code. Rust means the end of that.

Rust means software that can be written once and used "forever". Thus it enables true open source. In comparison C/C++ pay a mere lip-service, by also giving you, along with the code, lots of reasons to worry.

This is the real innovation behind Rust.

Ar-Curunir 3 days ago 0 replies      
I think it's a bit funny that in an industry that (supposedly) prides itself on "meritocracy", there are many people that refuse to use (or learn) performant memory-safe languages, when memory-safe code is always better than memory-unsafe code (in terms of resource usage, reduction of bugs, etc, etc.).
luckydude 1 day ago 0 replies      
"Our engineering discipline has this dirty secret, but it is not so secret anymore: every day the world stumbles forward on creaky, malfunctioning, vulnerable, error-prone systems software and every day the toll in human misery increases. Billions of dollars, countless lives lost."

Billions of dollars and countless lives lost? I'm not saying that buffer overruns aren't a thing but this seems like marketing claims without substance. Yes, I read through the examples below, still think he's overstating it.

dbcurtis 2 days ago 2 replies      
What about bare-metal options? Is there any development effort in that direction?

Most of the C that I do these days is Arm Cortex-Mx work. Realtime cooperative multi-tasking using an RTOS on the bare metal. It seems like Rust would be a great option for that kind of work if the low-level ecosystem were complete enough.

willtim 3 days ago 0 replies      
It's a shame there is no mention of ATS, which also attempts safe systems programming using an advanced type system.
pron 3 days ago 4 replies      
I completely agree. This is what I wrote on Reddit in response to Klabnik's post:

Rust can make such an important contribution to such an important slice of the software world, that I really fear that trying to make a better pitch and get as many adopters as quickly as possible might create a community that would pull Rust in directions that would make it less useful, not more.

Current C/C++ developers really do need more safety. They don't need a more pleasant language. Non C/C++ developers don't really need a language with no GC. Now, by "don't need" I absolutely don't mean "won't benefit from". But one of the things we can learn from James Gosling about language design is, don't focus on features that are useful; don't even focus on features that are very useful; focus on features that are absolutely indispensable... and compromise on all the rest. The people behind Java were mostly Lispers, but they came to the conclusion that what the industry really, really needs, is garbage collection and good dynamic linking and that those have a bigger impact than clever language design, so they put all that in the VM and wrapped it in a language that they made as familiar and as non-threatening as possible, which even meant adopting features from C/C++ that they knew were wrong (fall-through in switch/case, automatic numeric widening), all so they could lower the language adoption cost, and sell people the really revolutionary stuff in the VM. Gosling said, "we sold them a wolf in sheep's clothing". I would recommend watching the first ~25 minutes of this talk[1] to anyone who's interested in marketing and maintaining a programming language.

If Rust would only win over 10% of C/C++ programmers who today understand the need for safety, say, in the next 5-10 years, that would make it the highest-impact, most important language of the past two decades. In that area of the software world change is very, very slow, and you must be patient, but that's where Rust could make the biggest difference because that's where its safety is indispensable. A few articles on Rust in some ancient trade journals that you thought nobody reads because those who do aren't on Twitter and aren't in your circle may do you more good than a vigorous discussion on Reddit or the front page of HN. Even the organizational structure in organizations that need Rust looks very different from the one in companies that are better represented on Reddit/HN, so you may need to market to a different kind of people. So please, be patient and focus your marketing on those that really need Rust, not on those outside that group you think you can win over most quickly because they move at a faster pace.

[1]: https://www.youtube.com/watch?v=Dq2WQuWVrgQ

orblivion 2 days ago 0 replies      
I haven't used Rust, but generally speaking wouldn't you say that safety in some sense includes the other nice features? For instance, if it were safe but not fast (like, say, a GC language) it wouldn't be useful. So it has to be safe and fast (which it sounds to me like it is). Okay, so what if it were safe, fast, but a real hassle to use? Well that's not very useful either. So it has to be safe, fast, and usable. Just like Moxie's approach to security: focus on usability, so people actually use the damn thing. And it sounds like all the other nice features make Rust more usable.
djsumdog 2 days ago 1 reply      
This is a little off topic, but when I looked at his post I thought, "Wait... is that LiveJournal?" And yes, it is, apparently. Or at least a fork of it called Dreamwidth.

Interesting to see forks of older OSS Perl web apps still in use today.

maxpert 2 days ago 1 reply      
I have been anxious to use Rust as a webserver, but so far there is no mature framework that I think I can use. I had a look at a few, like mio and the Iron framework. There is no mature WebSocket implementation, nor an HTTP package mature enough to be used in production. I am looking forward to making an ultra-efficient PubSub server that supports HTTP polling and WebSockets. Hope my dream comes true :)
kebolio 3 days ago 2 replies      
> [Go] failed to provide a safe concurrency model.

What did he mean by this?
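For context, a sketch of the distinction the author is likely drawing: Go does not prevent data races on shared memory at compile time, whereas Rust's type system (the Send/Sync traits plus borrowing) forces shared mutable state through synchronized types such as `Arc<Mutex<_>>`. A minimal, illustrative example of the compile-time-enforced pattern:

```rust
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    // Rust rejects unsynchronized sharing of `&mut` data across threads;
    // `Arc<Mutex<_>>` is the sanctioned, data-race-free alternative.
    let counter = Arc::new(Mutex::new(0));
    let mut handles = Vec::new();

    for _ in 0..4 {
        let counter = Arc::clone(&counter);
        handles.push(thread::spawn(move || {
            // Each increment happens under the lock.
            *counter.lock().unwrap() += 1;
        }));
    }
    for h in handles {
        h.join().unwrap();
    }
    assert_eq!(*counter.lock().unwrap(), 4);
    println!("{}", *counter.lock().unwrap()); // prints 4
}
```

Removing the `Mutex` (or trying to share a plain `&mut i32`) makes this a compile error rather than a latent race.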

agumonkey 2 days ago 0 replies      
I think that safety is often about doing small things clearly. When you read about thread-safe computing, you end up with many rules whose violations FP makes impossible. So even if it's mostly safety, it encompasses a larger area in disguise.
keldaris 2 days ago 2 replies      
As someone looking at this influx of discussion from the point of view of a curious bystander, I can't help but be annoyed by two persistent misconceptions that keep being perpetuated in many statements of this kind.

1) Memory safety is or should be a top priority for all software everywhere. The OP goes so far as to state: "When someone says they "don't have safety problems" in C++, I am astonished: a statement that must be made in ignorance, if not outright negligence."

This is borderline offensive nonsense. There are plenty of areas in software design where memory safety is either a peripheral concern or wholly irrelevant - numerical simulations (where crashes are preferable to recoverable errors and performance is the chief concern), games and other examples abound. It's perfectly true that memory safety issues have plagued security software, low level system utilities and other software, it's true that Rust offers a promising approach to tackle many of these issues at compile time and that this is an important and likely underappreciated advantage for many use cases. There's no need to resort to blatant hyperbole and accusations of negligence against those who find C++ and other languages perfectly adequate for their needs and don't see memory safety as the overriding priority everywhere. Resorting to such tactics isn't just a bad PR move, it actively prevents people from noticing the very real and interesting technical properties that Rust has that have little to do with memory safety.

2) Rust is just as fast or faster than C++.

Rust is certainly much closer to C++ in performance than to most higher level interpreted languages for most use cases and is often (perhaps even usually) fast enough. Leave it at that. From the point of view of high performance programming, Rust isn't anywhere close to C++ for CPU-bound numerical work. For instance, it does not do tail call optimizations, has no support for explicit vectorization (I understand that's forthcoming), no equivalent to -ffast-math (thereby limiting automatic vectorization, use of FMA instructions in all but the most trivial cases, etc.), no support for custom allocators and so on. I'm also not sure if it's possible to do the equivalent of an OpenMP parallel-for on an array without extra runtime overhead (compared to C/C++) without resorting to unsafe code; perhaps someone can correct me if it's doable.

Over the past week or so, motivated largely by a number of more insightful comments here on HN from the Rust userbase, I've tried out Rust for the first time, and found it to be quite an interesting language. The traits system facilitates simple, modular design and makes it easy to do static dispatch without resorting to CRTP-like syntactic drudgery. The algebraic/variant types open up design patterns I hadn't seriously considered before in the context of performance-sensitive code (variant types feature in other languages, but are usually expensive or limited in other ways). The tooling is genuinely excellent (albeit very opinionated) and easily comparable to the best alternatives in other languages. I'm not yet sure if I have an immediate use for Rust in my own projects (due to the performance issues listed above and easier, higher level alternatives in cases where performance is irrelevant), but I will be closely following the development of Rust and it's definitely on my shortlist of languages to return to in the future.

However, I would have never discovered any of this had I not objected to the usual "memory/thread safety" story in a previous HN discussion and received a number of insightful comments in return. I think focusing on the safety rationale alone and reiterating the two hyperbolized misconceptions I listed above does a real disservice to the growth of a very promising language. I think Steve Klabnik's blog post to which the OP responds is a real step in the right direction and I hope the community takes it seriously. Personally, I know a few programmers who've entirely ignored Rust due to the existing perception ("it's about memory safety and nothing else") and in the future I'll suggest Rust as worthy of a serious look as an interesting alternative to the prevailing C++-style designs. I'm certainly glad I tried it.
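To make the traits point above concrete, here is a minimal sketch (the types and names are illustrative, not from the article) of static dispatch via a generic bound: the compiler monomorphizes `describe` for each concrete type, so the call is resolved at compile time with no vtable indirection, unlike a `&dyn Shape` call.

```rust
// A trait used for static dispatch.
trait Shape {
    fn area(&self) -> f64;
}

struct Square { side: f64 }
struct Circle { radius: f64 }

impl Shape for Square {
    fn area(&self) -> f64 { self.side * self.side }
}
impl Shape for Circle {
    fn area(&self) -> f64 { std::f64::consts::PI * self.radius * self.radius }
}

// Monomorphized per concrete type: statically dispatched,
// with none of C++'s CRTP boilerplate.
fn describe<S: Shape>(s: &S) -> f64 {
    s.area()
}

fn main() {
    assert_eq!(describe(&Square { side: 3.0 }), 9.0);
    println!("{}", describe(&Square { side: 3.0 })); // prints 9
}
```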

mtgx 3 days ago 1 reply      
Even if Rust adds increasingly more "unsafe" features in order to appeal to new developer groups, I agree that it should remain a "100% safe by default language", and they should continuously try to improve the performance of the safe code, rather than get lazy and say developers can just use the unsafe syntax if they want 3x the performance. This would only lead more and more developers to increase the usage of unsafe code. It would be even worse if Rust would allow unsafe code by default for any future feature.
w8rbt 2 days ago 1 reply      
How is Rust better than and different from D?

Does anyone concerned about security use D?

amelius 3 days ago 1 reply      
Does Rust have template metaprogramming? And does it look more clean and organized than boost's C++ implementation?
raverbashing 3 days ago 1 reply      
Rust is great; however, the safety aspect gets in the way sometimes.

The right granularity for error handling is important, as well as making it easy to handle (abort? providing a default value? doing something else?)

It's not that it is unimportant, but code usability is important as well, lest it go the way of C++ hell (though I don't think it can get that bad; there are some warts, like "methods" and traits).

krakensden 3 days ago 2 replies      
What about stack overflows? I heard that Rust no longer protects against those for benchmark reasons.
sh_tinh_hair 2 days ago 3 replies      
Safety stopped here: "curl https://sh.rustup.rs -sSf | sh". So did my interest.
sh_tinh_hair 2 days ago 0 replies      
io::stdin().read_line(&mut guess).expect("failed to read line");

My eyes have seen the glory of RUST, it's really javascript, right?

zerofan 2 days ago 2 replies      
zerofan 2 days ago 1 reply      
Annatar 3 days ago 1 reply      
Thaxll 2 days ago 1 reply      
The daily post about Rust and Go is getting tiresome... every single day we've got one.
StevePerkins 2 days ago 2 replies      
> "Safety in the systems space is Rust's raison d'être."

I think this quote points to the REAL underlying issue here.

Rust is a language primarily built for systems programming. It has many strengths to celebrate, and brings curated best practices as well as its own novel features to systems programming.

However, most programmers in 2016 aren't "systems programmers" anymore. At the very least, most programmers who actively talk-up new technologies on web forums are not systems programmers. The majority (or at least the majority of the vocal and socially engaged) are web developers, mobile developers, CRUD apps and microservices, etc.

As interesting as Rust may be in the systems space, it doesn't bring much compelling new hype to the table for web stuff.

You have yet-another-concurrency-approach? That's great, but most web developers rely on an app server or low-level library for that, and seldom have to think about concurrency up at the level of their own code.

You have an approach for memory safety without a garbage collector? That's great, but most web developers have never even had to think much about garbage collection. Java, Go, etc... the garbage collection performance of all these languages is on a level that makes this a moot point 99.999% of the time.

You have a seamless FFI for integrating with C code? That's great, but after 20 years of web development I can count on one hand the number of times I've seen a project do this. And those examples were Perl-based CGI apps way back in the day.

Rust people seem almost dumbfounded that everyone hasn't jumped all over their language yet. And from a systems programmer perspective, memory safety without garbage collection does sound amazing. But you guys really need to understand that Hacker News and Reddit hype is driven by web developers, and that community isn't even sure whether or not type safety is a worthwhile feature! So really, it's amazing that you've managed to draw as much hype as you have. It's not about the mainstream popularity of your language, it's about the mainstream popularity of your field.

jinmingjian 2 days ago 1 reply      
The memory safety of Rust is oversold.

1. Several small languages have also introduced type systems that try to solve the memory safety problem, but all of them are less famous. There are many reasons, beyond safety, that one language out of tons gets massively adopted.

2. In many cases, it is not hard to do manual memory management, and much great software has been built with it. I admit the quest for better memory management is always worthwhile for systems work. But see #3.

3. A linear/affine type system[1] is not a panacea. "Used exactly once" is just one small case, and forcing everything into this pattern produces a lot of boilerplate. Constraints on and verification of a system can be done at many levels and in many aspects; is it truly valuable to push all of them into the type system?

4. Rust's memory safety comes at a price: it has added a lot of complexity and burden to the language itself. Who likes reading the following function declaration? (borrowed just as an example):

fn foo<'a, 'b>(x: &'a str, y: &'b str) -> &'a str

5. So, finally, the question arises: does the current form of Rust's memory safety deserve to be the hope for the next industry language? I'm afraid...

[1] https://en.wikipedia.org/wiki/Substructural_type_system

Facebook Doesn't Tell Users Everything It Really Knows About Them propublica.org
422 points by colinprince  2 days ago   261 comments top 30
ForrestN 2 days ago 14 replies      
There must be many HNers who work at Facebook. Anyone willing to make a throwaway account and tell us how it feels from the inside for Facebook to be on the wrong side of so many ethical issues? In so many dimensions they've been caught saying wrong things or appearing to outright lie, and I'm curious how developers who work for them think about aiding a company that seems so compromised at the moment. Now that it's fairly clear that the service doesn't serve any unique, unambiguously positive purpose, what world-changing mission can you possibly believe Facebook is achieving these days?
quadrangle 2 days ago 1 reply      
I know this is too late to get noticed much, but here's the truth:

This is a race-to-the-bottom. Everyone in this whole area has to compete with whoever is the scummiest exploiter unless they really go out of their way to sell their service with privacy and ethics as the top feature. So, some ethical niche services can exist, but meanwhile, everyone else is screwed, and network effects make any niche thing stay pretty irrelevant.

The only way to avoid races-to-the-bottom in a competitive market is with real, enforceable regulation that outlaws the worst shit and requires truly effective disclosures otherwise. That's not easy, sometimes it's impossible, and it often has major negative side-effects and problems. But whether or not we determine that regulation is worth it, we know that races-to-the-bottom are a real thing, so we can give each company some leeway: they aren't necessarily actively trying to be malicious, they're just competing in a race-to-the-bottom situation (and we can reject the dogmatic free-market people who deny that this and all sorts of other natural market failures exist).

soared 2 days ago 7 replies      
You can replace "Facebook" with thousands of other companies. Everyone is doing this because the cost is low, it's easy, and the return is massive. The sole service my roommate's company provides is matching your customers with data about them from countless other sources.

If you want a peek into a small section of this type of data, go build a Facebook ad. You can see all the targeting options. You can upload a list of emails and build a "lookalike" audience of people who are similar to your customers.

A company called cartalytics will let a brand purchase lists of people who have bought a specific product in the past 6 months and show them ads. Ex: if you've bought a Big Mac (with a credit or debit card) in the last month, I can show you McDonald's ads... but they are super expensive.

caconym_ 2 days ago 4 replies      
> Facebook Doesn't Tell Users Everything It Really Knows About Them

I've been saying this for years. It's pretty clear that if Facebook told regular users just how much it knew, those users would be seriously creeped out (though, these days, probably not creeped out enough to do anything about it). I expect another example of this is the capability of their facial recognition system and the breadth of the database behind it.

Users are Facebook's product, and they should expect to be treated as such. The Facebook site and associated services are just infrastructure designed to a) collect information on users and b) give advertisers optimal access to those users.

edit: also, obviously, Facebook is not the only company engaged in this sort of thing. It's all around us.

mattbee 2 days ago 1 reply      
Facebook have to respond to Data Subject Access Requests in the UK, which oblige them to send you every piece of personally-linked information - for a maximum £10 fee.

I did this with my bank a few years back and got back a box file full of credit scores, lending decisions and other stuff they'd never normally expose. Facebook's data for a busy user is going to be enormous by comparison - has anyone done this lately (and published / summarised the results?)

kminehart 2 days ago 6 replies      
That might explain a pretty creepy thing Facebook did the other day to me.

I just created a new Facebook account after maybe 4 years of radio silence. Two years ago, I had a job doing IT contracting; often I would go to businesses and repair laptops or run cable to a COM room. We had very very few residential clients since they weren't worth our time; the few that we did have were really just a courtesy for doing business for so long. I went to one resident's home a SINGLE time, hardly interacted with the man, and he definitely did not know my last name.

Guess who pops up on my "Suggested friends", with no mutual friends or place of work or any similar "liked" pages? Yeah, that one client.

Similarly, we worked in a small office in a cold storage facility, and Facebook also suggested that I add their accountant as my friend.

It's really creepy, but if Facebook was able to know that I worked at that employer then it's possible that it was able to make the connection.

emptybits 2 days ago 2 replies      
> "He said users can visit a page in Facebook's help center, which provides links to the opt-outs for six data brokers that sell personal data to Facebook."

The link provided is: https://m.facebook.com/help/494750870625830?helpref=uf_perma....

LOL. The amount of personal information requested at those "opt-out" links is suspicious and/or ironic.

Examples of information requested to "opt-out" of the USA partners' reach include: Social Security Number (!), date of birth, "all variations" of full name, all recent mailing addresses, ... (!!)

bogomipz 2 days ago 2 replies      
>"For instance, opting out of Oracle's Datalogix, which provides about 350 types of data to Facebook according to our analysis, requires sending a written request, along with a copy of government-issued identification in postal mail to Oracle's chief privacy officer."

This is outrageous. Why is the onus on a user who never gave permission to a data broker in the first place? They deal in the digital domain when it comes to selling your data, but when it comes to consumers' rights and concerns they operate exclusively via snail mail?

Don't expect this to change any time soon. These brokers have the US Electorate in their pocket. Bought and paid for.

owly 2 days ago 1 reply      
1. Of course!

2. It's not too late to delete your account. Go for it!

3. Block it all! https://github.com/jmdugan/blocklists/blob/master/corporatio...
throw2016 2 days ago 0 replies      
I think this forum has to recognize that a lot of the work being done in the Valley, especially at Google and Facebook, is ethically questionable, and that seeking to brush it under the carpet or 'normalize' it perpetuates a dissonance. For starters, the whole mythology of liberal, freedom-loving nerds sits in stark contrast to the reality of actively developing and enabling authoritarian technologies.

The curious consequence of willful ignorance of one's own actions is the continued posturing and stark dissonance of expecting ethical behavior from other segments of society. If you can't behave ethically, you can't expect it from others.

That level of dissonance is untenable and ultimately every intelligent person has to realize not recognizing and confronting unethical behavior is a race to the bottom and will reflect in every aspect of life around you.

j2bax 2 days ago 4 replies      
Is there anyone out there making a paid, zero advertising/data collecting social network? What if this service allowed you to buy access for 50 of your closest friends and family? I would think if it was executed properly and you provided a standard "I'm deleting Facebook and here is why, apply to join my paid for network group" post people would consider making the jump. I know there's a lot to Facebook and I wouldn't expect some new company to stack up feature for feature. Just give me chat, text/image posts and the wall and I will be happy that I can keep up with my close friends and family. I wouldn't be entirely surprised or disappointed if Apple attempted something like this on their Messages platform but I would just hope they'd make it accessible to all phone/computer/tablet users.
lalos 2 days ago 1 reply      
Sort of related: have people noticed, or have they officially announced, that they are tagging photos in the alt HTML field with a description of the actual photo? It's pretty accurate, with texts like "two people smiling, with baby".
creepydata 2 days ago 2 replies      
It's creepy how much companies know about you.

When I got married, my husband pretty much immediately showed up as my spouse on my TransUnion credit report. How did they know that? Our names are different. At the time we didn't have any loans together. We lived together, but so do siblings and roommates. We didn't register for any wedding registries or send out any announcements. Our wedding consisted of signing some paperwork at City Hall. They also marked me as "Active Duty Military or Dependant" (hubby is in the army so I became a "dependant" when we got married). So the only logical explanation is TransUnion can access DEERS, but I would hope the DoD doesn't allow random private companies access to DEERS... They DO have a website where you can look up whether someone is covered under the SCRA, but dependants aren't covered under the SCRA and don't show up when queried (I tried).

Again, this is my credit report. I didn't report a change in my marital status to any of my financial institutions. Not banks, not credit cards. And we already had a joint account for two years before we were married.

mungoid 2 days ago 2 replies      
I can't speak for other countries, but why do Americans seem to trust companies more than they do the government? It is completely known that companies are here to make money, and publicly traded companies are here to please their investors, so they will do whatever it takes to do that. They study us, classify us, categorize us, manipulate us. They spend billions on research so they can make that 'perfectly tailored' ad to get us to buy their product. They are constantly buying and selling our data JUST to make their investors happy, and we seem to always just shrug it off.


I am honestly more ok with the government having this data to keep tabs on me than these hundreds of other companies treating my personal info like it's a trading card.

chriswwweb 2 days ago 1 reply      
It's funny that a newspaper criticizes Facebook's data mining practices ... but when I opened the article on their website, my Privacy Badger addon told me that 16 scripts had been blocked (Facebook!, Twitter, Google Analytics, Chartbeat, Outbrain, Pardot, ...). Then I read through the article, and halfway down they threw a huge banner in my way telling me to like their page on Facebook :/ So basically they preach one thing and do another; they are really a bunch of hypocrites!
alanh 2 days ago 0 replies      
Ironically, I cannot read this article as I am immediately redirected to https://www.facebook.com/plugins/share_button.php?app_id=229...

(likely due to a script having a bad reaction with one of the browser extensions granting me a small illusion of privacy)

tripzilch 1 day ago 0 replies      
> One Facebook broker, Acxiom, requires people to send the last four digits of their social security number to obtain their data.

This is just one of the many WTFs that Facebook apparently actively supports.

In what world, under what possible explanation, was this ever a good idea? Or even a reasonable idea? Either the US SSN is like a password (it's not), in which case how did Acxiom get their hands on it, or it isn't (correct), in which case it doesn't serve the purpose of identification.

Letting this sort of crap run wild also affects what is considered "normal" or common privacy in other parts of the world, like the EU; it slides the window. The boundaries get pushed continuously while people watch helplessly as layer upon layer of the foundations of surveillance are built. Authorities don't do much until adoption is way beyond the curve of the network effect, or they act weirdly. And by then people think it's normal or acceptable.

Already now, on countless popular sites, advertising transgresses heavily on not only guidelines but also law. Medical claims, product placement, child advertising, you name it.

What can we do to not make the lowest common denominator decide what's normal?

pwnna 2 days ago 3 replies      
Speaking of which, perhaps someone can shed some light on the suggested friends feature. Many people suspected it uses GPS/Wifi to perform location based friend suggestions, as well as contact book uploading. However, it doesn't really explain my own case:

I recently encountered a friend suggestion for someone that I only know online (IRC and later, Google Hangout). I don't really know who they are other than a name (as exposed by GHangout). I've never met them, as they are in a completely different country. I don't have the Facebook app, and the Messenger app is forbidden to read my contacts as per CyanogenMod's Privacy Guard. I fail to understand how FB can suggest this. The only possible explanation I can think of is that they searched my name on Facebook. How else could they do it?

linkregister 2 days ago 1 reply      
When I read this article, I was expecting to see a description of what they collect from users. But the real controversial and creepy part is what's available from the data brokers.

The fact Facebook is aggregating all this to make for better advertising options is discomforting, to be sure.

The most concerning aspect of the article is that these data brokers are able to correlate my purchases. It seems inevitable that insurance companies will take all of these individual data points into account: "We're sorry Mr. Register, because you buy McDonald's every week we'll have to raise your life insurance rates."

fritzw 2 days ago 1 reply      
This became painfully obvious when LinkedIn's algorithm started making extremely circuitous connections that freaked people out. People, relatively speaking, are painfully stupid; algorithms are ridiculously capable. The result is a freak-out. Facebook, being psychologically aware, protected its users from the truth before it could be known. It was long ago that Google's Eric Schmidt said "we are on the verge of predicting our users' thoughts"; Google is just as slick as Facebook.
pschastain 2 days ago 0 replies      
I haven't had a Facebook account for years, and my phone number was never associated with it. Yesterday I visited the site on my laptop to look up the page for a tavern that's re-opening. 1/2 hr later I got a text from 32665 with a Facebook confirmation code. WTF; creeped the hell outa me. I replied with "stop" and received verification that "Texts from Facebook are now turned off." I visited their site again to request whatever data they have on me, but even though I checked the "I don't have a Facebook account" button for the request they insist that I log in to finish the process. Not sure where to go from here with it.

[edit] grammar

bigmofo 2 days ago 1 reply      
How do the data brokers know whether one shops at dollar stores? Who is leaking our information to the brokers? Is the store or the credit card company releasing information to a third party? The store gets the customer name from the credit card. The credit card company knows that a transaction took place at the dollar store. Any other possibilities?
NumberCruncher 2 days ago 1 reply      
It is easy to forget that FB is a media company, and as such it makes money not only by selling ads but also by manipulating the masses. They may focus today on serving ads, making $3.6 billion annually. Tomorrow they may focus on something else, for example serving fake news to manipulate elections and making 10x more. The data they collect is only a means to an end, and I am afraid I won't like the end when it arrives.
MarkMc 2 days ago 1 reply      
> Of the 92 brokers she identified that accepted opt-outs, 65 of them required her to submit a form of identification such as a driver's license. In the end, she could not remove her data from the majority of providers.

So what exactly was the problem? She doesn't have a driver's licence?

Karunamon 2 days ago 0 replies      
Why is it obligated to?

No, really. Why? Why on earth is it a problem what a company does internally with its own collected data? Why is this only being directed at Facebook?

discordianfish 2 days ago 0 replies      
I understand the general privacy concern, but what exactly is the criticism here? That Facebook acquires data from external sources? Aren't those external sources the real problem?
intrasight 1 day ago 0 replies      
If everyone (like me) installed an FB ad blocker, then they'd not get a return on their investment for buying that 3rd party data.
throwaway4897 2 days ago 0 replies      
I'm operating on the assumption that some day soon there will be a market for personal services.

I know it sounds crazy now, but people also thought no regular person would ever need a personal computer. IMO the next computing revolution is in personal appliances with an OS (black-box, plug-n-play to regular folks) that serve up usable voice recognition and other SaaS stacks to replace the "free" data black holes currently in use.

antoniobg 2 days ago 0 replies      
Who thought they did?
optionalparens 2 days ago 0 replies      
Some day I'll need to reclassify all my cyberpunk books as non-fiction it seems. We are nearly at the point of having real-life equivalents of things like information brokers and a Central Intelligence Corporation. The funny thing is the real companies I've seen are more creepy than the over-the-top portrayals of your typical dystopian corporate future. Worse yet, they do a better job of automating it all vs. the typical human intelligence or hacking missions in those types of books.
Technical report on DNC hack [pdf] us-cert.gov
448 points by jbegley  2 days ago   442 comments top 46
codedokode 2 days ago 9 replies      
I have looked through the report. The only useful information was a brief description of the attack methods; everything else looks like a list of general recommendations one can find on the OWASP website.

As I understand from report the main methods used were:

- sending emails with executable files that victims for some reason executed

- phishing

So, they used script-kiddie-level tools anyone could use (and they are cheap; you don't have to buy expensive zero-day exploits on a black market). But of course this could be done intentionally so that it looks amateurish.

These attacks could be easily mitigated. First, the OS and applications should not run unknown files from the Internet (because some people got used to double-clicking on everything they get in email); second, we should start using physical cryptographic keys instead of passwords. Common people cannot handle passwords: they either make easily guessed passwords or enter them everywhere without thinking. I hate passwords too because they are hard to remember (and please don't suggest that I should download some software and upload my passwords to a "cloud" in an NSA-controlled country).

By the way, iOS is the only popular operating system I know of that doesn't allow executing files downloaded from the web or from email. Apple did it the right way.

The report also contains a pretty useless firewall rule named "PAS TOOL PHP WEB KIT FOUND" that can be used to search for malware in PHP files. It is interesting that they replaced the digits in the 'base64_decode' function name with a regexp, as if there were any other similar function names.

snowwrestler 2 days ago 5 replies      
Folks, the point of this report is not to justify the punitive actions taken today. It is to provide information that companies can use to protect themselves against similar attacks in the future.

So if you judge it by whether it "makes the case" against Russia, it will be lacking. We don't need 100 comments pointing that out.

downandout 2 days ago 4 replies      
It seems unlikely that email hacking will stop in the future. If the leaked emails actually influenced the elections, it was because of their content. I've heard exactly zero credible claims that the leaked emails were falsified in any way. Perhaps if political candidates/party executives are going to do unethical/illegal things, they shouldn't discuss them over email.

Edit: changed "zero claims" to "zero credible claims"

altendo 2 days ago 4 replies      
As an aside, for those looking to understand YARA rules, [1] provides a brief introduction and [2] introduces how to write them. I needed to look it up myself, but seems relatively straightforward if you have a programming background.

tl;dr: YARA rules are a method of categorizing malware based on their characteristics. So the PDF here released a YARA rule to determine a specific piece of malware used in the hack (it's not clear to me what it identifies, other than a PHP script).

For convenience, here's the YARA rule presented in the PDF formatted to be more readable:


  rule PAS_TOOL_PHP_WEB_KIT
  {
      meta:
          description = "PAS TOOL PHP WEB KIT FOUND"
      strings:
          $php = "<?php"
          $base64decode = /\='base'\.\(\d+\*\d+\)\.'_de'\.'code'/
          $strreplace = "(str_replace("
          $md5 = ".substr(md5(strrev("
          $gzinflate = "gzinflate"
          $cookie = "_COOKIE"
          $isset = "isset"
      condition:
          (filesize > 20KB and filesize < 22KB) and
          #cookie == 2 and
          #isset == 3 and
          all of them
  }

[1] https://securityintelligence.com/signature-based-detection-w...

[2] http://yara.readthedocs.io/en/v3.5.0/writingrules.html

EDIT: formatting
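For readers unfamiliar with YARA semantics, here is a rough Python stand-in for that rule's condition (illustrative only - `#cookie == 2` means the string must occur exactly twice, and in practice you would evaluate the real rule with the yara-python bindings rather than hand-roll it):

```python
import re

# Strings lifted from the report's rule
BASE64_RE = re.compile(r"\='base'\.\(\d+\*\d+\)\.'_de'\.'code'")
LITERALS = ["<?php", "(str_replace(", ".substr(md5(strrev(", "gzinflate"]

def matches_pas_tool(data: bytes) -> bool:
    """Approximate the YARA rule: size window, exact counts, all strings present."""
    text = data.decode("latin-1")
    size_ok = 20 * 1024 < len(data) < 22 * 1024   # filesize > 20KB and filesize < 22KB
    counts_ok = text.count("_COOKIE") == 2 and text.count("isset") == 3
    return (size_ok and counts_ok
            and all(s in text for s in LITERALS)
            and BASE64_RE.search(text) is not None)
```

So a file only fires the rule if it sits in a narrow ~20-22KB window, contains the obfuscated base64_decode construction, and has exactly two `_COOKIE` and three `isset` occurrences: generic-looking pieces, but a very specific combination.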

coldcode 2 days ago 4 replies      
Jeez people, read the report; it isn't any kind of justification of anything, it's just a fairly generic "don't do this" like I see 100 times a week at work. The real details were likely shown to Congress and the Senate (or at least a portion of them). Those are the only people who can say if the actual attack was real or imagined. Do you think the British and Americans were going to publish stories about Enigma in the Times during WW2? There were only a handful of people in the world who knew the details.

While we technical folks would love to see all the details that's not how intelligence works. Some things have to be secret even though these days everything becomes a conspiracy and a political controversy and a tweet storm.

That said, I doubt anyone in either party committee had any idea how security works. Even worse, much of the US government is (and will be) led by political benefactors with an axe to grind, not people with a real clue about modern security, so expect nothing much different in the future until someone hacks the nuclear "football".

sschueller 2 days ago 8 replies      
The Sony hack had more evidence than this...

Someone explain to me why this is such an issue?

There have been many proven hacks from many states that are far worse (the Chinese fighter plane that looks almost identical to the F-35 comes to mind) than exposing the DNC's dirty laundry. No one is denying that the emails are real. This seems like some sort of distraction.

khrakhen 2 days ago 1 reply      
Great. Password expiration.

Cue everyone recycling a set of 10 unique passwords among devices and/or writing passwords on Post-It notes on work computers at the office.

Only bit of truth in here was the phishing campaign. That could be anyone, however. This is barely more advanced than the Nigerian bank scam e-mails.

Yet the POTUS says it's a sign of the highest levels of Russian government.

We've already been fed lies about e-mails being altered (DKIM signatures disprove this) and now this PDF ignores the insider element in the DNC/Wasserman-Schultz leaks.

What a joke. NIST, NSA, FBI, CIA et alia should be discredited almost entirely at this point.

__jal 2 days ago 3 replies      
This is a magic report.

Over the next 24 hours, it will transform a huge number of people into experts on intelligence reporting requirements, hacking, sources and methods, and diplomacy.

wongarsu 2 days ago 1 reply      
Page 5 lists a YARA signature named "PAS_TOOL_PHP_WEB_KIT" that is supposed to match some kind of payload from the attack. It looks generic but is surprisingly specific.

A quick search reveals that it happens to exactly match [1] (if you fix a few obvious bugs where the GitHub code uses $COOKIE instead of $_COOKIE, or produces base64decode instead of base64_decode; the attackers probably fixed that in production). Apart from the exact combination of three `isset` and two `_COOKIE`, that code starts with the unusual sequence `<?php $l___l_='base'.(32*2).'de'.'code';`, which happens to be matched by the (also very unusual) regex from the report. It also ticks all the other boxes in the provided signature.

I just found that within five minutes by searching GitHub. It seems like an encrypted payload that can be executed by visiting the PHP page with the password in a POST parameter or in a cookie.

I'm not an expert, but the encryption looks very simple. Maybe somebody feels up to the challenge to try some statistical analysis or similar on it?

[1] https://github.com/Nu11ers3t/Null/wiki
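The underscore bug mentioned above is easy to check against the report's regex. A quick sketch using Python's re module (the pattern is copied verbatim from the report's rule; note it only fires once the missing underscore is fixed, consistent with the attackers having fixed it in production):

```python
import re

# Regex string from the report's YARA rule
pattern = re.compile(r"\='base'\.\(\d+\*\d+\)\.'_de'\.'code'")

fixed = "<?php $l___l_='base'.(32*2).'_de'.'code';"  # deobfuscates to base64_decode
buggy = "<?php $l___l_='base'.(32*2).'de'.'code';"   # GitHub variant: base64decode

print(bool(pattern.search(fixed)))  # True  - the signature matches
print(bool(pattern.search(buggy)))  # False - the missing underscore breaks it
```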

ryanlol 2 days ago 4 replies      

  ~ grep IPV4 JAR-16-20296.csv | awk -F ',' '{print $1}' | sed 's/[][]//g' | sort -u | grep -f exits -c
  191
  ~ grep IPV4 JAR-16-20296.csv | awk -F ',' '{print $1}' | sed 's/[][]//g' | sort -u | wc -l
  876
At least 191 of the IOC IPs are (probably random) Tor exit nodes :) The actual number may very well be higher; I just grabbed the current exit node list from https://check.torproject.org/exit-addresses

Here's the PHP backdoor the YARA rule is for http://sprunge.us/ReFg I'll probably put up the rest of the samples in a sec.

Edit: Here, I uploaded most of the samples listed in the csv http://www.filedropper.com/samples_5

Edit 2: The obfuscation used in the russian PHP shells looked awfully familiar, I think the shell they're using could very well be this one http://profexer.name/pas/download.php originally shared on a .ru hacker forum.
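That shell pipeline can be mirrored in Python if you want to slice the IOC list further (the file layout here is an assumption inferred from the commands above, not verified against the actual JAR CSV):

```python
import csv

def ioc_ipv4s(csv_path: str) -> set:
    """Collect unique IPv4 indicators from the CSV, stripping the [brackets]."""
    ips = set()
    with open(csv_path, newline="") as f:
        for row in csv.reader(f):
            # Rows tagged IPV4 carry the address in the first column, e.g. "[1.2.3.4]"
            if row and any("IPV4" in cell for cell in row):
                ips.add(row[0].strip("[]"))
    return ips

def tor_overlap(iocs: set, exit_nodes: set) -> set:
    """IOC addresses that also appear in a Tor exit-node list."""
    return iocs & exit_nodes
```

Loading the exit-address list from check.torproject.org into a set and intersecting should reproduce the 191-of-876 overlap reported above.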

AlexCoventry 2 days ago 0 replies      
There doesn't seem to be much new information there. A bunch of IP addresses, file hashes to look for, and general network security advice, in addition to a history of the attacks which was already public, and an explicit attribution to the Russians.

They mention a phishing attack which took place after the election, but don't give any further details.

Dolores12 2 days ago 4 replies      
This report is a joke. I didn't find any reasoning about attribution.

Here is the only valuable part:

  rule PAS_TOOL_PHP_WEB_KIT
  {
      meta:
          description = "PAS TOOL PHP WEB KIT FOUND"
      strings:
          $php = "<?php"
          $base64decode = /\='base'\.\(\d+\*\d+\)\.'_de'\.'code'/
          $strreplace = "(str_replace("
          $md5 = ".substr(md5(strrev("
          $gzinflate = "gzinflate"
          $cookie = "_COOKIE"
          $isset = "isset"
      condition:
          (filesize > 20KB and filesize < 22KB) and
          #cookie == 2 and
          #isset == 3 and
          all of them
  }

droithomme 2 days ago 1 reply      
Is this more or less reputable than the clear and unambiguous claims of Craig Murray regarding the DNC leak, which he has stated were the result of him personally traveling to DC, acquiring the data dump face to face from a non-Russian DNC insider, and then returning to the UK to give it to Assange himself? If the us-cert.gov report is to be believed, then both Assange and Murray are liars. Both cannot be true. Who is more credible? Perhaps we can compare each party's history of reliability in telling the truth? Would that be a reasonable approach to ascertain who is lying here and who is telling the truth?
maxlybbert 2 days ago 0 replies      
At least the report is short. As others have stated, it doesn't really lay out any new evidence to believe the Russian government was behind the hack. It lays out information that almost looks like evidence, such as a list of usernames, but doesn't discuss how the information is relevant to anything. There is an assertion that three teams were involved, and that two teams communicated with each other, but no discussion of where this information comes from or why anyone should care how many teams there were. I get the feeling that there's a message for someone, but I'm certainly not the intended recipient.

The advice on avoiding similar hacks in the future is a grab bag. Near the end it encourages using /etc/shadow on POSIX systems. I installed Linux on my personal computer in 1999. Since then, I've installed several Linux distributions, FreeBSD, OpenBSD, Plan 9, Inferno, etc. I can't remember any installation offering to store password hashes in /etc/passwd. Some of the advice is better, but not all of it. I'm honestly disappointed. Perhaps this is a wake-up call to somebody, but I would hope Sony's hack had already served that purpose.

crb002 2 days ago 0 replies      
I've done security remediation for the U.S. Govt. About the same vulns you would expect on 2003 PHP apps that haven't been updated since (OS or otherwise). Congress doesn't budget for server/app maintenance, simple as that.
varjag 2 days ago 0 replies      
We'll probably never know the name of the poor intern tasked to slap this together overnight.

Hope the main report due in 3 weeks has some substance.

jordache 2 days ago 0 replies      
Wtf? The majority of the report is copy-and-pasted security risk descriptions.
dominotw 2 days ago 3 replies      
>In spring 2016, APT28 compromised the same political party, again via targeted spearphishing.

I think I might have missed it, but how did they conclude that it was 'APT28'?

> APT28 is known for leveraging domains that closely mimic those of targeted organizations and tricking potential victims into entering legitimate credentials. APT28 actors relied heavily on shortened URLs in their spearphishing email campaigns.

Aren't these standard Phishing 101 techniques? What makes them specific to 'APT28'? This 'report' looks like someone googled 'phishing 101' and 'web security 101' and copy-pasted a bunch of stuff from Wikipedia.

myf01d 2 days ago 2 replies      
Even if Wikileaks had never published anything, she would have still lost. She had the greatest help, money, and collusion from the government, the media, international community elites, and her own party, and still lost against the most unpopular and unfit candidate of all time, who got more than 300 electoral votes. That's how much of a loser, and how corrupt, she is. Just get over it.
rplst8 2 days ago 1 reply      
Interestingly it says only one political party was hacked.
Abishek_Muthian 2 days ago 1 reply      
"At least one targeted individual activated links to malware hosted on operational infrastructure or opened attachments containing malware" - I pity that individual; I'm sure he's getting blamed in the party, like "Hey, aren't you the piece of work who clicked a link?"
PaulHoule 2 days ago 0 replies      
That big list of code names is a hoot, it seems they mixed the names of soldiers from metal gear solid 5 with a list of names they got off a small IRC server as well as some codenames out of James Bond novels Ian Fleming never wrote plus some fragments of mime headings salted with just a little bit of line noise.
bjourne 2 days ago 0 replies      
The report released by the US govt only contains a bird's-eye view of the hacking incident and not many technical details. But they do reference APT28 and APT29, which are described in reports from FireEye in 2014 and 2015:

 http://www2.fireeye.com/rs/fireye/images/rpt-apt28.pdf
 https://www2.fireeye.com/rs/848-DID-242/images/rpt-apt29-hammertoss.pdf
The evidence is circumstantial, but there is so much of it that I think you can confidently say that Russia is behind it. For example, compile times pointing towards office workers in Moscow, Russian language settings and so on.

Read the reports and make up your own minds!

cekvenich3 2 days ago 2 replies      
'The U.S. Government assesses that information was leaked to the press and publicly disclosed.'

Who in US Government?

What information was leaked?

showmeevidence 1 day ago 0 replies      
Throwaway because I work in a related field.

Folks, now is the time that we need to make it clear that posturing and PR statements do not constitute valid, independently verifiable evidence. As a citizen of the United States, I am beyond terrified that our government has made public statements, buttressed by newspaper articles supported by nothing but anonymous sources[1], vilifying Russia for a nation-state-level cyberattack. The support for such claims, as presented, is the "sophistication" of the attack, which is not evidenced here (phishing is not a particularly sophisticated means of entry). At best, this is a mistake, and at worst, it reeks of anti-Russia propaganda that will only serve to escalate tensions between the two countries. Every single person who absorbs a report like this without seeking supporting evidence (note that this report immediately starts by claiming Russia's involvement, and never provides support) is, to some extent, culpable in a hypothetical reality where the US Government is blatantly wrong about this one.

There's only one thing we can do at this point: File Freedom of Information requests. The fine folks at Muckrock[2] make this absurdly easy. Send requests to the CIA and FBI -- hold them accountable to their statements, which have to date been unsupported, that Russia as a nation-state entity was behind anything.

1: http://www.nytimes.com/2016/12/09/us/obama-russia-election-h...
2: https://www.muckrock.com

jayess 2 days ago 1 reply      
First step to prevent "Russian hackers": Don't make your password "p@ssw0rd"


ejcx 2 days ago 3 replies      
tl;dr - "ultra advanced cyber persistent cyber threat cyber actors" are sending phishing emails and people are still clicking on them.
jbeckham 2 days ago 4 replies      
Nothing about this supports a Russian attribution.
w8rbt 2 days ago 0 replies      
They ought to encourage the use of prepared statements to defend against SQL injections. It's the only way to handle that threat, yet the report does not mention it:

"5. Input Validation - Input validation is a method of sanitizing untrusted user input provided by users of a web application, and may prevent many types of web application security flaws, such as SQLi, XSS, and command injection."

 https://en.wikipedia.org/wiki/Prepared_statement
 https://www.owasp.org/index.php/SQL_Injection_Prevention_Cheat_Sheet
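To make the distinction concrete, a minimal sketch in Python's sqlite3 (the table and input are invented for illustration) of why a prepared statement stops the injection while string concatenation doesn't:

```python
import sqlite3

# Throwaway in-memory database with a single user row.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

attacker_input = "' OR '1'='1"

# Vulnerable: concatenation lets the input rewrite the query itself.
vulnerable = conn.execute(
    "SELECT secret FROM users WHERE name = '" + attacker_input + "'"
).fetchall()

# Safe: the prepared statement treats the input as a literal value.
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (attacker_input,)
).fetchall()

print(len(vulnerable), len(safe))  # 1 0 -- the injection dumps the row
```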

bobzibub 2 days ago 0 replies      
I think there is a second assumption being overlooked. Hypothetically, let us assume that the Russians did break in and steal emails, etc. Governments do so all the time, so it could well be true. Now the question to me is: why would they release all the emails to Wikileaks? The emails seem relatively benign and not very damning of HRC. Why not keep the information in your back pocket until it can be researched and leveraged? Releasing it diminishes its value to an intelligence agency. And why not release selected emails from HRC herself? Surely the Russians could have gotten those if they tried. Assuming she's not squeaky clean, they could have released selected individual emails anonymously and ensured a Trump win, plus kept other assets for later. Would a better hypothesis be that US intelligence services saw break-ins and so released the information they knew foreign governments could use as leverage against a likely future president? This way they immunize against the information's use, plus blame the Russians; the US would dearly like to punish Russia for their victory in Syria anyway. This makes more sense to me, but I'm interested in why this hypothesis is wrong or less likely.
nukka 2 days ago 1 reply      

Here is a technical analysis of one of the malware samples used:
http://researchcenter.paloaltonetworks.com/2015/07/unit-42-t...

Why does everyone think that only Russians can write such malware, and in Python at that!

Also, does it dawn on anyone that anyone can actually take this malware, reverse engineer it, and repurpose it? All it takes is changing one JSON blob embedded in the code to point to your own servers for CnC and using your own AES IV/key.

Also, I find it funny that they use embedded timestamps and resource locales as proof of anything. Didn't anyone ever use Resource Hacker or the 'strings' command? Is it really that hard to scrub or falsify timestamps in a DLL/EXE?

The most damning proof would have been an SSL certificate reused on a known compromised CnC server. I heard rumors about it, but nowhere in the analysis was this highlighted or discussed.

Jabbles 2 days ago 0 replies      
Interesting date in footnote:

http://msdn.microsoft.com/en-us/library/ff648653.aspx. Web site last accessed April 11, 2016.

Have they just copied this report from some general security advice and prepended the latest attack at the top?

(Also, no https?)

peache 1 day ago 0 replies      
Nonsense. I'm doing cybersecurity analysis for a Navy program this very day. To the person that says "The attackers did use stealthy persistence techniques often called 'rootkits'" -- you know exactly nothing about what you're talking about.

A rootkit is the means to obtain "root" permissions which is an exclusive feature of UNIX/Linux operating systems. Powershell is a Windows product... these systems are Windows based. No rootkit. Period.

geoffreyhale 2 days ago 0 replies      
Page 7: "Firewalls can be configured to block data from certain locations (IP whitelisting)"
EvanAnderson 2 days ago 0 replies      
At least some of the DNC users who had VPN access (which, presumably terminated "behind the firewall") had local Administrator rights on the PCs they used [1]. Getting one of those people to load malware and piggybacking on their VPN connection (letting them enter 2FA if there even was any) was likely a cinch.

There's nothing that I've read anywhere that makes me think the DNC was any kind of difficult target to compromise. Likely their information security posture was on par with industry norms for small office networks -- absolutely terrible.

[1] https://wikileaks.org/dnc-emails/emailid/8763

emmelaich 2 days ago 0 replies      
I remember two spates of gmail phishing, one in early 2015 that Google responded to:


I think it's quite possible that this sort of warning actually may have increased phishing attempts because it made malware authors aware of increased possibilities.

From the report:

> In summer 2015, an APT29 spearphishing campaign directed emails [..]

PS. Please don't use seasons instead of dates in reports; specify the quarter of the year.

zanethomas 2 days ago 0 replies      
Calling it a "report on the DNC hack" is a fine example of fake news.
peter_retief 2 days ago 0 replies      
Multiple failures in network security and silly users fall victim to unsophisticated hackers, is this news or an apology;)
rogerthis 2 days ago 0 replies      
What about the disclaimer at the top of the document?
monochromatic 2 days ago 0 replies      
They may have evidence it was the Russians, but it sure isn't in this report.
fraytormenta 2 days ago 0 replies      
ok i think i know what happened: Obama forced the FBI to produce the report, but they had nothing, so they filled it mostly with irrelevant, slightly-too-complicated mumbo jumbo to slide it under his scrutiny, and then he pushed it out to the public without first consulting an actual security professional.
empath75 2 days ago 1 reply      
catawbasam 2 days ago 1 reply      
crabstraggler 2 days ago 1 reply      
mastermike 2 days ago 0 replies      
It's clear that this is being done to validate their lies about the Russians' hacking. The US-CERT report came out today on this. I understand all this content and it is very limited in scope. It does not provide any validation that Russia was involved in any kind of hacking against the US. They described what is probably the most common form of spear-phish hacking, put Russia's name on it, and listed a bunch of other hacking tools made by hackers who actually claim to be part of ISIS (probably CIA assets; it looks to me like they are trying to false-flag this) https://en.wikipedia.org/w/index.php?title=Fancy_Bear&oldid=...
LargeCompanies 2 days ago 2 replies      
Ummm, what piece of information in the leak caused Clinton to lose?

Reality check: nothing, because there were no bombshells found -- unlike James Comey re-opening the FBI's investigation against that woman. A woman who nationally, especially compared to Obama, is highly unlikeable, with a horrible public image. Though we're stuck with that crazy man... a losing game either way!

Enter your address and find out everyone who represents you in U.S. government whoaremyrepresentatives.org
409 points by tonyztan  2 days ago   158 comments top 38
DrScump 2 days ago 3 replies      
Actually, this title doesn't do the service justice -- it yields results clear down to local offices and gives a '+' link for each to get details like contact information.

For example, this is what is returned for a given, random Sunnyvale, CA address; the lone change I would suggest is to have the county and then city offices listed last to maintain a sequence of decreasing granularity. Note that Sunnyvale is an example of at-large city council representation, so all are listed. Very nicely done!

Barack Obama President of the United States

Joseph R. Biden Vice-President of the United States

Michael M. Honda United States House of Representatives CA-17

Dianne Feinstein United States Senate

Barbara Boxer United States Senate

Edmund G. Brown Jr. Governor

Gavin Newsom Lieutenant Governor

Laurie Smith Sheriff

Lawrence E. Stone Assessor

Jeffrey Rosen District Attorney

Gustav Larsson City Council Member Seat 1

Glenn Hendricks City Council Member Seat 2

David Whittum City Council Member Seat 4

Pat Meyering City Council Member Seat 5

Jim Davis City Council Member Seat 6

Tara Martin-Milius City Council Member Seat 7

Jim Griffith Mayor and City Council Member Seat 3

John Chiang State Treasurer

Kamala D. Harris Attorney General

Betty T. Yee State Controller

Alex Padilla Secretary of State

Dave Jones Insurance Commissioner

Tom Torlakson State Superintendent of Public Instruction

Larrikin 1 day ago 1 reply      
The feature I've been looking for but can't seem to find is a calendar view of when your elected officials are up for election.

Virginia for example holds their major state elections the year after Presidential elections. Local elections come up at seemingly random times. I vote absentee and remember coming home for a visit and my parents asking me to vote in some small election that was being held in the middle of summer.

Being able to add all of the offices to my calendar, ideally with important deadlines like when you can apply for absentee, vote early, and when you have to have your ballot in by would be amazing.

epoch_100 2 days ago 9 replies      
To everyone posting here:

I am a creator of this service. If you'd like to get in contact with us, email secure@politiwatch.org!

We're extremely encouraged by all the positive feedback here, and we're glad to provide a service that you all found helpful.

padobson 2 days ago 2 replies      
I got my list and was pleased to see things like Auditor and Coroner, but why no state senator or state representative?

There's also no judicial branch to be found, which may not matter much for the Federal Supreme Court, because they're appointed, but just about every jurisdiction I fall under, State Supreme, State Appeals, Local Criminal, Local family, has an elected judge.

Judges tend to be a major source of ballot fatigue, because nobody knows who they are. You could argue that's a good thing, because then only informed voters are selecting them, but you could also argue that it's a bad thing, because only the self-interested are voting for them.

saurik 2 days ago 1 reply      
This website believes the Auditor-Controller for Santa Barbara County is Robert Geis, but he retired last March and was replaced by Theodore A. Fallati. I also wouldn't say it's fair to call this "everyone": it is missing all of the local special districts (such as the Goleta Water District and the Isla Vista Recreation and Park District) that I would argue are much more important to my life than the person who is currently the "Treasurer-Tax Collector-Public Admin." (someone I believe I have never actually met, despite having been extremely active in local politics for years, having run to be a County Supervisor, and even now being elected to the board of a new district which will come into existence in March 2017).
Tau_Zero 2 days ago 2 replies      
Sounds like an awesome service. Unfortunately, I'm seeing:

Error 403 Daily Limit Exceeded. The quota will be reset at midnight Pacific Time (PT). You may monitor your quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/civicinfo/quo...

11thEarlOfMar 2 days ago 2 replies      
I'd like to see this expanded to show the full text, the representative's summary, and constituent comments for the bills, codes, policies, docket, whatever they are actually going to be voting on.

Then, show how they voted.

joelcollinsdc 2 days ago 1 reply      
Looks like this API does a lot of the heavy lifting for this: https://developers.google.com/civic-information/docs/v2/
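For anyone curious, a hedged sketch of what a request to that API looks like (the address and "API_KEY" are placeholders; the response fields mentioned in the comment are from the v2 docs as I recall them):

```python
from urllib.parse import urlencode

# Build a request URL for the Google Civic Information v2 "representatives"
# endpoint. "address" accepts a street address or ZIP; "API_KEY" is a placeholder.
BASE = "https://www.googleapis.com/civicinfo/v2/representatives"
params = {"address": "1600 Pennsylvania Ave NW, Washington DC", "key": "API_KEY"}
url = BASE + "?" + urlencode(params)
print(url.split("?")[0])  # https://www.googleapis.com/civicinfo/v2/representatives

# In practice: resp = requests.get(url).json() -- the response pairs an
# "offices" list with an "officials" list of names and contact details.
```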
jonknee 2 days ago 2 replies      
Really useful, love the idea. There may be some data troubles though, I looked up myself and found that the Twitter link for Senator Maria Cantwell goes to a porn account, not her actual profile. Yikes!
jedbrown 1 day ago 0 replies      
Cool service. Beware that the Wikipedia links may direct to different people with the same name, particularly for local offices. For example, my Assessor links to a British wartime codebreaker and my Surveyor links to a Kiwi rugby star.
coreyp_1 2 days ago 8 replies      
Why does it ask for your address? Just entering your zip code is sufficient to obtain a list. Are they harvesting the address data for some reason?
2sk21 2 days ago 0 replies      
Potentially very useful but it had incorrect names for the county level officials in my county. There does not seem to be any way to send in corrections.
dsalzman 2 days ago 1 reply      
Feature Request: Add a picture of the representative when you use + to drill down. It's nice to attach a face to a name.
protomyth 2 days ago 0 replies      
It would be nice if sites like this started taking into account Tribal governments. Otherwise, nice and simple website.
mdc2161 2 days ago 0 replies      
If you like whoaremyrepresentatives, we would love feedback on [Act On This](https://www.actonthis.org/) as well.

We don't go down to as local of a level yet, but are more focused on giving information about specific actions you can take related to issues you care about.

While the current list of issues comes from us, we're on-boarding a couple of non-profits so they can use the tool to help organize volunteers at a state and local level.

WhitneyLand 1 day ago 1 reply      
Fantastic service. Should it then link to their position all the top issues, like for, against, or refuses to commit?

By the way, my address has 30 people, all the same color.

tunesmith 1 day ago 0 replies      
If you guys have all this information, I would love to see a breakdown by subject, for instance, all the elected positions that have something to do with managing elections, and their next election date (so I know who to donate money to if I want to maximize health of elections nationwide).
throwaway2016a 2 days ago 0 replies      
Impressive. It even had my Registry of Deeds. Although it did list some people from other districts. I'm from New Hampshire and it showed all our Executive Council even the ones not in my district.[1]

[1] In New Hampshire, if the Governor is the chairman of the board, the Executive Council is like the directors on the board.

cpeterso 1 day ago 0 replies      
The site could use geolocation (server and/or client side) to prepopulate users' location for a zero-click user experience.

The site could also specify a numeric input type for the zip code field so mobile browsers will display the numeric keypad instead of the alphabetic keyboard.

jzwinck 2 days ago 3 replies      
I'm one of the millions of Americans living overseas. Who represents us?
ryao 1 day ago 0 replies      
I wanted to find out who my representatives were to discuss the state of internet connectivity in my area. This made the research easy. :)
creepydata 2 days ago 0 replies      
Not everyone, doesn't show my mayor or city council.
wishinghand 2 days ago 2 replies      
What's the source of the data once an address is submitted? I tried to do a similar project during a hackathon and all of the APIs I found were either dead or insufficient.
mushmouth 2 days ago 0 replies      
This would be really useful for other countries, since some of the info is hard to find.
andyfleming 2 days ago 0 replies      
It would be nice to be able to expand the list view of this to print to a PDF.
jelder 2 days ago 3 replies      
Severely inaccurate results for me. Gave me reps for CT but I live in MA.

Edit: typo

Pigo 1 day ago 0 replies      
Brought to you by the left-wing preachy makers of BibleOrQuran
ada1981 1 day ago 0 replies      
this should be an official .gov website
jamisteven 2 days ago 1 reply      
Site is down.
ch 2 days ago 0 replies      
Funny. The results keep coming back empty for me.
tonetheman 2 days ago 0 replies      
Very cool.
chanandler_bong 2 days ago 5 replies      
As a US citizen living abroad, I have no representatives. This, as I am getting ready to file my US income taxes for which I receive no benefits or representation.
cperciva 2 days ago 1 reply      
I'm guessing the title was intended to have s/and out/and find out/ ?
rocky1138 2 days ago 3 replies      
It's important to note that this is US only.
Proven 2 days ago 0 replies      
LOL, yeah right they "represent" you. You wouldn't need that site if they did.
briankwest 2 days ago 0 replies      
Data is stale, My uncle Ronnie was killed three years ago, and they still have him listed.


source99 2 days ago 3 replies      
I really don't understand this.

Is the point so that I can tweet or mail letters to my elected officials?

Is tweeting at officials supposed to help my station in life? Just seems ridiculous to me that tweeting would be taken seriously. I suppose it could be taken seriously but that would actually scare me more.

I always believed that if you want to make a difference then vote with your wallet, and I don't mean donating money to politicians. I mean making purchases from companies you respect.

fnj 2 days ago 1 reply      
A technical tour de force, but the premise is flawed. "I" have no representatives. "We" have representatives as a group. The mob who rules by force of numbers; all strictly democratically acting.

Right off the top, the president, VP, both senators, federal and state representative, governor, and lieutenant governor; every one of them is 100% useless to me personally, because not a single one of them shares even one tiny insignificant view that is important to me. Sure, hey, that's the breaks, but let's not pretend they represent me.

UDP vs TCP gafferongames.com
450 points by colinprince  3 days ago   189 comments top 24
mjevans 3 days ago 12 replies      
The main difference between TCP and UDP, as this programmer discovered, relates to quality of realtime service.

Times to use UDP over TCP

 * When you need the lowest latency.
 * When LATE data is worse than GAPS (loss of data).
 * When you want to implement your own form of error correction to handle late/missing/mangled data.
TCP is best when

 * You need all of the data to arrive, period.
 * You want to automatically make a rough best-estimate use of the available connection for /rate/ of transfer.
For a videogame, having a general control channel to the co-ordination server in TCP is fine. Having interactive asset downloads (level setup) over TCP is fine. Interactive player movements /probably/ should be UDP. Very likely with a mix of forward error correction and major snapshot syncs for critical data (moving entity absolute location, etc).
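As a loopback sketch of the fire-and-forget UDP style above (Python sockets; the message format is invented for illustration): no handshake, no delivery guarantee, and each recvfrom hands back exactly one whole datagram:

```python
import socket

# One socket plays "server", one plays "client", both on loopback.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))        # port 0: let the OS pick a free port
addr = server.getsockname()

client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.sendto(b"player_pos 10.5 3.2", addr)   # fire and forget

data, peer = server.recvfrom(1024)   # one call == one whole datagram
print(data)                          # b'player_pos 10.5 3.2'

client.close()
server.close()
```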

peterburkimsher 3 days ago 3 replies      
Do you want to hear a joke about UDP?

You might not get it, but I don't really care.

mikepurvis 3 days ago 3 replies      
The recommendation to avoid TCP altogether is surprising to me. Having encountered a number of video-conferencing systems which are in a similar space, it seems pretty standard to have separate real-time and control sockets on UDP and TCP respectively. I skimmed the linked paper and didn't find it conclusive; can someone summarize how it is that having a TCP socket can affect UDP traffic on the same interface?

All that said, I certainly see the argument for an all-UDP protocol in terms of defining your own retransmission approach, or attempting to avoid it altogether with forward error correction or whatever.

rstuart4133 3 days ago 4 replies      
The entire article makes me shudder in disbelief.

It's aimed at people who don't know the difference between UDP and TCP (and possibly wet string). Yet he recommends they implement their own reliable protocol over UDP, and that they avoid TCP because it's better to implement your own QoS?

Why not add obtaining a PhD in quantum mechanics just to round it out? It wouldn't much alter the odds of pulling it off.

techman9 3 days ago 0 replies      
I appreciate the first user comment on this article:

"I didnt care for this article 7 months ago. I regret it. My whole game is useless now. I should have go with udp."

geokon 3 days ago 1 reply      
What are some alternative protocols?

It's been a while since my networking class, but if I remember correctly with UDP you have some serious issues where you can end up clobbering your network, filling up buffers in the middle and dropping tons of packets. The lack of congestion control is a huge no-no.

For instance, in the example he gives, sure you can tolerate dropped packets for player-position data, but how do you know if you can tolerate sending at 10Hz, 100Hz, or 1000Hz? Even with TCP you can't (I think...) programmatically adapt to the size of your pipe. That's kind of abstracted away for you, so that you just say "send file A to B" and it does it for you.

Are you supposed to write your own congestion control in userland???? Seems like this should be a solved problem
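Mostly, yes, if you stay on raw UDP: game networking libraries tend to bring their own throttling. As a rough illustration of the simplest userland pacing, a token bucket (this is a sketch, not real congestion control, which also reacts to loss and RTT feedback):

```python
import time

class TokenBucket:
    """Crude application-level send throttle (a pacing sketch only)."""
    def __init__(self, rate, burst):
        self.rate = rate              # tokens replenished per second
        self.burst = burst            # maximum budget
        self.tokens = burst
        self.last = time.monotonic()

    def allow(self, cost=1):
        now = time.monotonic()
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= cost:
            self.tokens -= cost
            return True
        return False                  # caller drops or delays this packet

bucket = TokenBucket(rate=30, burst=5)    # ~30 packets/s, bursts of up to 5
sent = sum(bucket.allow() for _ in range(100))
print(sent)   # the burst size: refill is negligible in a tight loop
```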

bsaul 3 days ago 1 reply      
Anyone here know what protocol should be used for mobile multiplayer games, provided they are real-time and not turn-based? (Think people playing on a decent 3G connection.)

I think you can't send a UDP packet to a phone because the carrier will block it, but I'm not sure.

milansuk 3 days ago 5 replies      
> they may arrive out of order, be duplicated, or not arrive at all!

This is the first time I've read that a datagram can be duplicated. Is it true? Is it duplicated by the network, or does it mean the peer sends it again?
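Both can happen: IP makes no duplicate-suppression guarantee, so the network itself can occasionally deliver the same datagram twice (e.g. lower-layer retransmission or routing anomalies), on top of any application-level resends. The usual fix is a sequence number; a sketch:

```python
# Datagrams tagged with a sequence number; repeats are dropped on receipt.
seen = set()

def accept(packet):
    """packet = (seq, payload); return the payload once, None for repeats."""
    seq, payload = packet
    if seq in seen:
        return None        # duplicate (network- or sender-induced): ignore
    seen.add(seq)
    return payload

stream = [(1, b"a"), (2, b"b"), (2, b"b"), (3, b"c")]   # seq 2 arrives twice
received = [p for p in map(accept, stream) if p is not None]
print(received)  # [b'a', b'b', b'c']
```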

zubspace 3 days ago 0 replies      
I tried to build a UDP library once in C# with different methods (BeginReceiveFrom, ReceiveFromAsync, ReceiveFrom). You learn a ton and it's quite interesting. My goal was to recreate something similar to .NET Remoting based on UDP.

But be aware that it's a daunting task, because there are so many things you need to handle all together: lost packets, reordered packets, duplicate packets, connection handshakes, session handling, reliable/unreliable channels, packet resends, random disconnects, reconnects, network congestion, spoofing, protocol hacking attempts, dos-attacks, banning, encryption, etc...

If you're writing an UDP library, you also need to think of performance, object pooling, connection buffers, threading/async issues and on top of that you also want to provide a nice API to the outside world for the client and server... Well, it gets messy...

If you're into this kind of thing, I can advise you to look at Haxe libraries. I learned a lot from them. There are very simple, idiomatic server/client-side implementations which are easy to follow, even if you don't know Haxe [1][2].

[1]: https://github.com/svn2github/haxe/blob/master/std/neko/net/...

[2]: https://github.com/svn2github/haxe/blob/master/std/neko/net/...

sydd 3 days ago 1 reply      
Ah, I love Glenn's articles. I've learned so much from them: game physics, integrators, game loops, game networking, ...
josephg 3 days ago 1 reply      
Does anyone know if SCTP is suitable for / in use by any games? It supports streams to work around the head-of-line blocking problem TCP runs into, and it also supports opt-in unreliable delivery for game data. On the surface it seems ideal for games, though I don't know if it's getting much actual use.
ozy 3 days ago 1 reply      
"TCP has an option you can set that fixes this behavior called TCP_NODELAY"

That fixes nothing. Now you are sending too many small packets using too many syscalls. Just like UDP, buffer in user space, send in one go. If you do that, TCP_NODELAY makes no difference. (The exception is user input, if you want to send those as they happen, use TCP_NODELAY, but think about the why ... it has little to do with what this article is talking about.)

Games likely send data only around 25 times per second, and ping is likely < 50ms. Waiting on a dropped packet and the delay it causes is unnoticeable. Added that clients will need some kind of latency compensation and prediction, independent of the TCP/UDP choice. Delays and then bursts of 100ms or such are doable.

The problem starts when the connection stalls for more than 100ms, especially in high bandwidth games. During the stall both behave the same. After the stall, TCP will be playing catchup and wasting more time receiving outdated data, and handing it to user space in order. UDP just passes on what is received, with a lot less catching up, and maybe some dropping of packets.

But gameplay has been degraded in both cases. UDP just has a higher chance of masking and shortening degradation more.

Anything more than that is basically cargo-culting, like this article.
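A Python sketch of the buffering the parent suggests: set TCP_NODELAY and coalesce a frame's worth of messages into a single length-prefixed buffer so the whole frame goes out in one send (the message framing here is invented for illustration):

```python
import socket

# Disable Nagle's algorithm, then batch messages instead of sending them
# one syscall at a time.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)

messages = [b"pos 1 2", b"pos 3 4", b"fire"]      # hypothetical game messages
frame = b"".join(len(m).to_bytes(2, "big") + m for m in messages)

# In a real client: sock.connect(host_port); sock.sendall(frame)  # one syscall
print(len(frame))   # 24: three payloads plus a 2-byte length prefix each

sock.close()
```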

j_s 3 days ago 1 reply      
This feels like a nice introduction.

Anyone here have any experience using QUIC in any application of their own?

The custom congestion control makes me wonder if it only works alongside TCP traffic -- once everything goes QUIC, then what happens? I looked for a while for the ancient-history story of some blazing-fast server OS TCP implementation that broke the rules, and so fell over when more than one such server was on the network, but couldn't find it.

dpeck 3 days ago 0 replies      
reminds me of this old article we had as assigned reading during a networking class in school.

The Internet Sucks: Or, What I Learned Coding X-Wing vs. TIE Fighter (http://www.gamasutra.com/view/feature/131781/the_internet_su...).

jokoon 3 days ago 0 replies      
In short, TCP will work hard to deliver 100% of the packets. So when a packet is lost, TCP asks to re-send the packet. This is fine to display a webpage or send a file, but it can't be tolerated in games where time continuity matters. I think it's the same issue in VOIP and video conferencing too.
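That stall is head-of-line blocking, and a toy sketch shows why it hurts games: an in-order receiver (like TCP) must hold packet 3 until the retransmitted packet 2 arrives, while a UDP-based protocol could consume packet 3 immediately:

```python
# A toy in-order receiver: packets are (sequence_number, payload).
def deliver_in_order(arrivals):
    expected, held, delivered = 1, {}, []
    for seq, payload in arrivals:
        held[seq] = payload
        while expected in held:              # release the longest ready run
            delivered.append(held.pop(expected))
            expected += 1
    return delivered

# Packet 2 is lost and retransmitted; packet 3 arrives before it does.
arrivals = [(1, "t=0"), (3, "t=2"), (2, "t=1 retransmit")]
print(deliver_in_order(arrivals))  # ['t=0', 't=1 retransmit', 't=2']
```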
johnbellone 3 days ago 0 replies      
If you want to get familiar with TCP/UDP and are not gun-shy with C I would suggest the Pocket Socket Guide (now Practical Guide to TCP sockets)[1]. I have a really old edition, but it ages extremely well, and its one of the books I always use to refresh myself on network programming basics.

[1]: https://www.amazon.com/gp/product/0123745403/ref=ox_sc_act_t...

cpncrunch 3 days ago 4 replies      
The reality is that these days there generally isn't any packet loss, so UDP vs TCP isn't such an issue as it might have been in the past. In fact TCP has a number of advantages these days such as easier firewall traversal, WebSockets, etc.
fivesigma 3 days ago 0 replies      
A huge problem with TCP wrt gaming is the default ACK frequency on Windows which is set to 2. This effectively almost doubles the latency of game connections (sending/receiving a lot of time-sensitive small packets).

It can be changed with a registry setting (TcpAckFrequency) but you can't expect even a significant fraction of your users to do that. Why this isn't a per-connection option sort of like TCP_NODELAY is beyond me.
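A sketch of the change (the per-interface key location is as Microsoft documents it; the interface GUID is machine-specific, and the value name is an assumption you should verify for your Windows version):

```
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters\Interfaces\{interface-GUID}
    "TcpAckFrequency" = dword:00000001   ; ACK every segment instead of every 2nd
```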

dozzie 3 days ago 1 reply      
I thought this was somebody who had recently discovered the existence of the UDP protocol and was bragging about it to the world, but from skimming the article, it actually has some non-trivial remarks about UDP and TCP.

BTW, the article's angle is multiplayer game programming.

the_arun 3 days ago 0 replies      
There was a similar discussion in Hackernews earlier - https://news.ycombinator.com/item?id=7507377
JohnStrange 3 days ago 0 replies      
What about UDT? I never tried it but always wondered about its benefits/disadvantages.
aakilfernandes 3 days ago 4 replies      
Another important difference: UDP can't be used in the browser =(
dogismycopilot 3 days ago 2 replies      
That's a pretty obnoxious 'donate' button. They can't make it even larger?
z3t4 3 days ago 4 replies      
I think TCP has an unfair reputation. Our networks are better now than 30 years ago... Worst-case latency for TCP is like 3 seconds, compared to the packet never arriving. The trick is to hide the lag with animations. I think Google, Facebook, and World of Warcraft use TCP for their real-time apps!?
Creative Tim: Growing a side project into a $17k/month business indiehackers.com
436 points by csallen  2 days ago   138 comments top 9
morgante 2 days ago 20 replies      
I really love these IndieHackers interviews but surprisingly find them somewhat demotivating.

Even the more successful interviewees make less revenue per month than I can make through straight consulting. And they're the success stories; most people (including myself) make far less per month from their products.

How do people keep motivated to work on side projects when consulting is so much more profitable?

rfrey 2 days ago 7 replies      
I devour indiehacker's interviews and I love the transparency - thanks to both sides of the table.

According to the numbers here, 42% of the 17,000 monthly revenues are spent on 6 people. That's $1190 per person per month if everyone is paid equally.

Is everyone in Romania, and is that a living wage there? If not, how are your employees making ends meet?

Edit: that last sentence sounds accusatory - I don't mean it to. I just mean: do you have a strategy of part-timers who have other income streams, do you take side contracts, etc.

axelut 2 days ago 5 replies      
Hi everybody, here is Alex, the co-founder of Creative Tim. Hope the information from this interview will help you achieve more with your current business or give you the courage to start your own business.

If you have any suggestions or feedback I would be glad to talk with you.


ensiferum 2 days ago 0 replies      
With all these "success" stories flying about I'd absolutely love to know how many losers there are for each "Creative Tim". 10x, 100x, 1000x ?
gigatexal 2 days ago 3 replies      
17k a month is only 204k a year, which is awesome, don't get me wrong, but I'm not sure that supports 6 employees.
vyoming 2 days ago 0 replies      
They make great templates. We have used their chart templates in our BI tool - https://drilldb.com

Their pricing is also very affordable compared to many alternatives. You don't have to think twice before making a purchase.

nkkollaw 1 day ago 0 replies      
I read some of the comments, many saying that $17,000/mo. isn't a lot of money, but don't forget that Creative Tim is based in Romania.

Salaries in Romania aren't as good as in the States.

I'm currently based in Italy, and a programmer can make as little as €1,200/mo. (salaries in Italy are much higher than in Romania, possibly 100-200% more). I make more than that freelancing, but I don't think $17,000/mo. is bad, both in general and for a company based in Romania.

gfosco 2 days ago 0 replies      
This worked very well on me, as I just bought one of their developer licenses for a pro package... Beautiful stuff.
wheelerwj 2 days ago 2 replies      
here's a link to the other HN thread with no comments or anything:


Working remotely, coworking spaces, and mental health bitquabit.com
381 points by jrheard  1 day ago   114 comments top 37
AYBABTME 1 day ago 9 replies      
For the last 2.5 years I've been working at DigitalOcean as a remote employee. DO has more than 50% of its staff remote. I think it's important that a large chunk of a company and team be remote for the arrangement to succeed, so that there's a forcing function to use asynchronous communication. It's been really life-changing for me.

We have a bunch of styles of remote work: work from home, work from a coffee shop, work from coworking spaces, and work from a new place every day.

I've tried all of these styles, starting with working from home, then getting super depressed from loneliness and getting a coworking space (DO pays for it), then realizing I didn't use it and instead working from a mix of home, coffee shops, and random visits I pay to my friends. And now I've been switching to mostly working from the craziest settings I can think of. I've worked from camping spots, from a sailboat, in a national park, on a beach in Asia, and it all works out once you're used to "travelling from anywhere".

I'm having the best time of my life experimenting with what it really means when your ability to feed yourself is decoupled from your physical location. I feel like I'm living in a future that maybe more people will have the chance to live soon, and that it's my duty to find a "Theory of Working In The Future". My first theorem is "Don't stay home every day, else you shall go crazy".

Also, think about the implications of OneWeb and the constellation that SpaceX has been working on; I'm thinking "what if I could get low latency/high bandwidth internet from the middle of any ocean"? The future looks bright.

[edit]: just realized I'm kind of praising my employer a lot here. My comment isn't meant as recruiting spam, though I do think DO is great to remote folks. Also, we're building massive distributed systems every day and it's fun. So uh... check this out? http://grnh.se/wv3fgo

iguanayou 1 day ago 4 replies      
As a remote worker, I would really enjoy a coworking space at least 1-2 times per week, but the hour commute and the expense just are not worth it for me. I've been working remotely for two years and absolutely go stir crazy, and even into fits of depression, when I'm not really pro-active about getting out.

I've been volunteering with a community theatre this year, which gets me out of the house after work most week days. My mood goes up about 10x when I do this. During the month or so downtime between plays though, things start going bad again.

I'm also involved with some other meetups/clubs and do piano lessons. Putting together a deliberate schedule of "outside activities", at least for me, is absolutely necessary to make it work.

And I would still never go back to working in an office!

shanemhansen 1 day ago 3 replies      
Overall a well written article with some valid criticisms. After 2 years working remotely I observed roughly the same facts, but have a slightly different spin on the whole thing.

Like everything in life, working remotely has tradeoffs. One person's pro is someone else's con.

Pro: I potentially gained hours of my life back every day. I know many people who work for similar companies who spend more than an hour commuting every day. They take less-desirable jobs and leave behind their coworkers just so they can get an hour of their life back and reduce their commute from 2+ hrs to 30 minutes.

Pro: I can disassociate my COL from the company's choice of office location.

Pro: I can cook food in the crock pot on a regular basis without worrying about my house burning down.

Pro: I can walk my dogs during my lunch break (or pick up food from the grocery store or run some other errand).

Pro: (subjective) My coworkers' competence is higher than what I typically observe at companies that limit their hiring pool to people who live within a few minutes (or hours) of one office building.

Pro: I can and have worked from a hammock, a camper van in a state park, a car on a road trip, and a cafe in Paris.

Pro: No requirement to waste literally hours of my day in bullshit pre-lunch planning, post-lunch coffee, etc. When I'm onsite I'm happy to spend lots of time on watercooler talk, but I'm not obligated to do it every work day.

Pro: I can go hiking on my lunch break.

Con: I don't see anyone but my spouse. I have to go to additional meetups in order to make up for this.

Con: Not as much face time with execs. This can matter politically and for your career.

onetwotree 1 day ago 0 replies      
Working remotely killed my mental health. Even with Slack, Hangouts, and all the rest, I became lonely, had difficulty focusing on work, and generally became significantly less happy and productive. I tried a coworking space, and while I made some great friends there, it still wasn't doing it for me.

I've been back at an office job for about 3 months now and it's been a huge improvement. I love being in an office with a team of people all working on the same thing, solving problems together, and socializing.

Obviously, different things work for different people, but I wish I hadn't bought into the remote work idea as wholeheartedly as I did. It's important to be aware of what you get out of onsite work in addition to the drawbacks.

scottlegrand2 1 day ago 1 reply      
As someone who spent the better part of the last decade working remotely, and having read tens of rants about open office floor plans with which I agree 100%, I think that the problem here is that there is no Silver Bullet. Those of us who prefer a results only work environment will never thrive in an open office, and those of us who need human contact will not thrive in a results only work environment.

More likely if you're reading this, you're somewhere between those two extremes. I'm an introvert when I need to get things done, but I'm an extrovert everywhere else. I have seen just as many people crash and burn trying to motivate themselves while working remotely as I have seen people go quietly nuts in an open office.

I am going to be working from a co-working space in the near future, but I suspect that I will still need to spend significant amounts of time on my own in my home office if I want to stay productive. I don't expect that solution to work for anyone else, but after nearly two decades in the software industry, I know what works for me.

socialist_coder 1 day ago 3 replies      
I wouldn't consider myself an extreme introvert but I am 100% satisfied working from home (been doing it for 4.5 years now).

The author mentions socialization only in real life. What about online socialization? I still talk to 2 of my best friends in a chat room (used to be IRC, now we use Slack). I keep up with old friends on Facebook. I have discussions and arguments on HN and Reddit.

And probably most importantly, my remote company has a VOIP chat that everyone is on and we routinely have "water cooler" type convos, in addition to serious stuff.

So yeah, I think you can solve this problem without needing real life interactions. Embrace your digital life to the extreme! And, companies hiring remote workers need to support them better, with VOIP and text chat rooms that they can be in (with other employees) and feel like they're part of the team and not just a worker.

mattlondon 1 day ago 0 replies      
It is not just remote work either - I've noticed a trend in my work where although I work in an open plan office surrounded by people, my teams are increasingly "global" which means that usually there is 1 team member on their own in each office.

Although I am surrounded by people, since you're not working with these other people, and/or there are desk moves every 3 months or so as teams are growing, you only ever end up with very superficial "friendships"/social interactions. "Hello" "How was your weekend" "Which team are you on?" "I am on this team" etc etc. You're just doing it out of politeness really, then in a month or two they'll move on to another team/office or there will be another desk move and you're back to square one, surrounded by strangers.

It is not unusual for me to go a whole day in an office surrounded by hundreds of coworkers without physically saying anything to anyone apart from "thanks" for holding open the door.

It is extremely isolating.

greenspot 1 day ago 1 reply      
Coworking spaces are a mixed bag. I went to one for a few months but stopped again.

The good part is that you encounter more serendipity than when working from home. But really, it is not that much more. After only a few weeks, the novelty wore off and I got bored and saw more of the downsides: super small tables, no dual-monitor setup, always too cold, the commute, less free fruit, and the people. Some are quite nice, and you realize that you need random social encounters, but there are also the typical odd people to whom you cannot relate at all (like everywhere). Those people don't hurt, but I remember one who reserved the best flex desk the night before by leaving tons of her post-its and other papers there. No big deal, but nobody who makes you happy either.

I knew most people I met there before. Bonding with new people without having a common mission was not easy; it just didn't feel natural (and I am rather the extroverted sales type of guy). So you can still feel 'alone' in a coworking space.

I think a coworking space makes more sense if you need a space as a team and want or need to see each other f2f on a regular basis.

For business meetings or doing interviews, I prefer the lobbies of top hotels; they are even more presentable than the best coworking spaces and, at the end of the month, also cheaper, with full service included and no extra fee for booking a meeting room. And for two hours of working away from home, I am a fan of Starbucks or any coffee shop with good wifi.

tmm 1 day ago 1 reply      
> If youre thinking of working remote, then think about what kind of working environment youre happiest with before you take the job, and make sure youll have that environment available to you.

Seems to me that the best part of remote working is the ability to figure out what the best environment is for you. Don't be a theorist, be an experimentalist: try a bunch of different situations and see what you like best. It sounds like the author started down this path, but stopped too soon (at first).

> Are you sad when a lot of your office is out sick, or are you relieved?

Usually relieved, then I wonder why I bothered with an hour of driving to sit in the office by myself, when I could have done that from home.

> Do you get uncomfortable when youre in quiet environments for too long, or do you revel in them?

Love quiet! My office at work has no windows (not even internal ones); being able to close the door and cut off the outside world is the best!

> Do you feel weirdly lonely when youre in a noisy coffee shop, or do you feel energized?

Annoyed by the noise mostly. Coffee shops are for getting coffee and getting out. Libraries are way better for actual work, IMO.

ENTP 1 day ago 2 replies      
This echoes my current gig to a tee: worked at home for a couple of months, went stir crazy, found an office. For me, the first office space was a hipster cafe type place that was just too freaking noisy. I moved to a Regus office which was ok but Regus were awful so a few of us clubbed together and got a truly shared office with both closed and open spaces for different type of work. I can highly recommend this setup as some days you just need to hole yourself up in a private enclosed office to do brain work. However the open space promotes social interaction and feeds the soul.
minipci1321 1 day ago 1 reply      
5 years remote. Personal bottom line:

-- negative: lost interest in having friends (and generally the patience needed to talk to people), no social life, no career, work-life balance completely broken

-- positive: sleeping 8 hours a night! (and more if I need to), making walks in the park/training at noon, never really sick, comfortable home office, can be efficient again (only my job doesn't require that). Started having ideas again and thinking about side projects.

Before going remote, for 15 years I was commuting 2+ hours in the morning (and 2 hours again in the evening), sleeping 4 hours a night by the end of the week, dozing off the entire weekend, and generally feeling extremely exhausted, mentally and physically, easily catching the flu, etc.

Would I move back to office employment? I really hope I won't have to.

vxxzy 1 day ago 0 replies      
I've been working remote now for 5 years, all from home (sometimes from a car or a cabin). It takes discipline and limits - in my case, limiting my urges to finish out a project or get a bit further at 2am. It can be a blessing and a curse. I'm getting ready to 'venture out' - spend time at a coffee shop or coworking space. For the record, I've been 'remote' in many roles, employee and consultant. Overall, for me, I find routine an absolute necessity to get work done. By routine, I mean 'get dressed as if going into the office' - it's about mindset.
xiphias 1 day ago 1 reply      
Getting a girlfriend who also works at home may be hard, but it can fix most of this problem. I liked working while my girlfriend was working at the same time...with some fun interruptions.
codingdave 1 day ago 1 reply      
My company is 100% remote. And I recall very clearly from the interview process one of the most important traits for succeeding: be self-aware.

Everyone is different. Every new remote worker figures out what they need to do to make it work for themselves. And we do not all do the same things. But we all know ourselves well enough to try things, see how it works for ourselves, and figure out what changes we need to make it work. Of course, we also talk to each other, give suggestions, etc. But ultimately, to succeed on your own, you have to proactively care for your own mental health. And self-awareness is vital to doing so.

dkarapetyan 1 day ago 0 replies      
Great post. The spectrum stuff is very important and it's important to be honest to yourself and during the interview process about what you expect. I now tell every place I interview being on-site all week is not gonna work for me. I need to be remote at least 2 days out of the week to recharge. Otherwise I will burn out in less than 6 months. Most places seem to be ok with that and make concessions to letting me do that.
gnfisher 1 day ago 0 replies      
Living abroad, I've gone the remote road since 2009. I went through a lot of this same stuff but never associated it with being remote/working from home, though looking back now I can see it most likely was related. My health and mood improved once I started forcing myself out to socialize more and once we had our children - the household is always busy and full of noise and life now, versus before, when my wife went off to work and I sat alone in a quiet apartment all day long. Since the birth of our second, I've been doing my first sprint in a nearby coffee shop each morning, and that has improved my mood and productivity even further. I've been toying with the idea of a coworking space, and this article has convinced me to give it a go.

We are social creatures, to varying degrees, and if we limit our interaction with others too severely, I think it makes it too easy to look exclusively and excessively inward. I'm all about self-analysis and looking inward, but there comes a point when you go too far and it's no longer reflection but a feedback loop of anxiety/fear/self-doubt... at least that has been the case in my experience! Also, regular exercise (running, lifting) has always helped me out of these emotional funks.

AndyNemmity 1 day ago 1 reply      
I have a similar issue at the moment. I have tried co-working spaces, but I work with people globally and my day tends to start at 5am. So to get to a co-working space, I'd have to leave at 4:30am, and that's provided they were open (they aren't.)

I've spoken to one about potentially giving me a key to the space, and perhaps that would work, but it's often easier just to roll out of bed, throw on coffee, and start my meetings.

I tend to fall into the same pattern and stop going outside much at all, just staying indoors. That then perpetuates my desire not to go outdoors.

It's solvable, though; it takes effort on my part to keep experimenting and trying new things. It only becomes an issue when I just keep repeating the same situation. Definition of insanity: repeating the same things, expecting different results.

eeeeeeeeeeeee 1 day ago 1 reply      
I joined a co-working space this summer and it was... meh. The people were nice and the facility was great, but I don't really see the point unless your employer is going to pick up the tab and/or you have a small apartment with no office or an insufficient one.

I found myself missing my widescreen monitor, standing desk, and chair. The amount of money I've sunk into my home office felt wasted when I used the co-working space.

And the co-working space would swing between eerily quiet or way too much noise. It seemed weird to go (and pay) for a co-working space where everyone is primarily staring at their laptops.

On the other hand, I would maybe consider going from a two bedroom to a one bedroom apartment if I had a full-time 24/7 use of a co-working space and then the cost would more than even out.

opticalflow 22 hours ago 0 replies      
For my job(s) the last 8 years or so, I've had a mix of on-site (10%), travel (40%) and work-from-home (the remainder). There have been very long stretches when I'm neither in the office, nor travelling to meet with customers, however -- sometimes, months. From my standpoint, I can commiserate with the author here. I live out in the hinterlands with my wife and 6 kids, so there's no shortage of social interaction -- however if I've been stuck here for 6 weeks, I begin to get a little stir crazy. I "recharge" by going to trade shows/events/meetups in NYC (which is about an hour and a half away) -- the energy of the city is refreshing, but I wouldn't want to put up with it every day. Just once in a while...
amyfransz 1 day ago 0 replies      
As a remote worker myself, I can definitely resonate with the dark sides of working remotely as described in this article. The lifestyle is mostly portrayed as living the dream, however the lack of social interaction and finding a proper work-life balance where you also set time aside for friends, exercise, or meditation for example, is pretty difficult. That's actually what gave me the idea to bring together a community of remote workers to work, live, and travel the world together where the hassle of accommodation, flights, work spaces, gym passes, and social activities are taken care of so remote workers can enjoy the remote lifestyle to the fullest. If anyone's interested you can find more info here http://www.theremotetrip.com
mccolin 1 day ago 1 reply      
Excellent personal story on finding the right workspace in a remote situation. Home office vs. coworking space is a conversation I have with other remote workers quite a bit, and the answer is different for everyone.
abalashov 1 day ago 2 replies      
I relate closely to this post, especially the position that working remotely is for a certain kind of introvert only.

To such an extent do I relate to it that I wrote a blog post about it a year ago:


This post struck a similar note:

Working from home might genuinely be the ideal environment for those closest to the introvert end of the spectrum, and I think those are the people who form angelic choirs of blog posts asking if you have met their lord and savior, the Fortress of Infinite Solitude, Home Office Edition. For them, the quiet work environment makes their jobs dramatically more enjoyable. But for me, it was the opposite: Id gone from management (high social interaction) to software development (lower social interaction), and from working in an office (hundreds of people) to working from home (two cats), and expected that this would all be fine.

Zelmor 1 day ago 0 replies      
Working remotely for 7 months now. Three things I do not miss are:

- neon lights, which used to trigger my migraine on a regular basis
- 45-60 minutes of public transport commute in the heat/cold
- the need to look busy when the work is already done

While I did experience a bit of a breakdown at one point, it is something to overcome. I like the "lazy days" of regular work in coffee shops, libraries and sometimes even pubs (no alcohol during work hours though, that's bad on so many levels). When there is heavy need of cognitive abilities, I tend to stay in, start the day with a cold shower, breakfast and coffee, then work at my standing desk.

I find standing desks really something all offices should support for their workforce. It keeps you active during the day and allows for greater focus. Start small, go for what fits your physique. Use a rubber mat. I would suppose it also helps with what the author calls "off-days", since I do not encounter them. There is always something to improve upon. If no hard work is available, I just work on documentation and on learning new skills that advance my work/life/career. This allows me a good night's sleep.

alkonaut 1 day ago 0 replies      
I have tried coworking in both co-rented offices and "office hotels", as well as working from home. For a long while I had a (pretty expensive) seat at a coworking space that I didn't use, but just knowing that I could leave my isolation at home and go there made me feel less isolated.

I wish I could work in cafes, at friends' places, etc., but I just cannot bring myself to work without a proper big screen and keyboard, which means most nomadic coffee-shop setups are off limits. One day of working off my laptop and my neck, eyes, and back hurt. I need a proper desk, which makes it a lot harder to move around.

jschwartzi 1 day ago 0 replies      
This mirrors my experience going from product support to software engineering. I moved roles in a company that was primarily a sales organization, so I still had a lot of responsibilities to other people that required social interaction. When I switched companies to join a product development team all the social interaction went away. In fact, earlier this year I went for several months without any real interaction with my co-workers, and I work in a cubicle farm. This had a severe impact on my mental health, to the extent that I'm in therapy now. I've since started seeking more interaction with co-workers during the day because I actually need it to work effectively.

People often write about software development as if it's a solitary activity, but I can only do my best work for other people. Having personal relationships with my co-workers makes me a stronger developer, and I can't do it remotely.

coldtea 1 day ago 0 replies      
>But for me, it was the opposite: Id gone from management (high social interaction) to software development (lower social interaction), and from working in an office (hundreds of people) to working from home (two cats), and expected that this would all be fine.

Could it be the change from bossing people around (being a manager) to being bossed around (being a dev)? Because all the rest (interaction with friends, walks, going to the gym, etc.) one could still have while working remotely -- like the author says they did for the first months anyway.

agumonkey 1 day ago 0 replies      
How many found new project ideas and partners in coworking spaces?
intrasight 1 day ago 0 replies      
What I like about working from home with regards to socializing is that I get to choose when, where and with whom I socialize. I didn't have that control in an office setting. While most of my clients and colleagues are in other cities, I make sure that I have several local if for no other reason than to have some professional socialization opportunities. And of course there are a dozen groups that I could choose to participate in - actually more than I'd ever have time to do.
Mister_Y 23 hours ago 0 replies      
Feeling isolated is not a good thing for most people; that's why libraries, coworking spaces, and meetups exist. After all, the human being is a social animal. But what we can do to prevent this is plan where to work from and where to go, and analyze whether working remotely is good for us or not. I found this piece really helpful, as I'm tired of reading about how nice it is to be a digital nomad. Well... it is if you're an outgoing person, or if you're the type who likes to feel pushed to always go the extra mile. But we have to keep people from just going to the middle of nowhere expecting amazing things to happen, because you have to be the one who moves first. Loved this!
d13 1 day ago 1 reply      
What's an "off day"?
Kiro 1 day ago 0 replies      
I feel like an alien for my lack of social needs.
techdebtland 1 day ago 0 replies      
innocentoldguy 1 day ago 0 replies      
I love working at home. Not going out of my house for weeks on end doesn't bother me at all, and I'm much more productive when I'm working on my own. I think I could be quite happy as a shut-in. I can see how the lack of a social life would bother some people though. I guess the success of working remotely depends a lot on your emotional needs and personality.

On another note, I do have to disagree with the author with regards to making the most money in either New York or the Bay Area. Perhaps the salary looks bigger on its own, but when you consider housing costs, food, gas, taxes, and other costs of living, you actually end up making a lot less than you do in other locations. I've received multiple offers from the Bay Area, and one or two from New York, but they just can't compete, all things considered. Plus, I don't have any desire to cram my family into a 1,000 square-foot cubbyhole, when we can enjoy seven times the space elsewhere for half the price.

WhatIsThisIm12 1 day ago 0 replies      
I like my coworking spaces like I like my haircuts; effective with nobody talking to me.
Zigurd 1 day ago 0 replies      
Does anyone know of a valid study of how workgroup size, the ratio of meetings to individual working time, etc. compared to well-organized remote work? By well organized, I mean designed to mitigate the problems cited in this article and otherwise.
edblarney 1 day ago 0 replies      
It's great to see a lot of agreement on this.

I agree with most of it.

Short commutes + some privacy at work would probably entice most people in.

Having 1-2 days a week at the office would be grand.

Being able to focus for long periods without interruption is quite important, frankly, I have no idea how software gets written in those cramped open workspaces with all the noise etc..

Granted, different types of software for different types of things.

I can imagine a deeply technical problem requiring more thought than say a lot of dev-ops, scripting, code reviews, bug fixing, etc. etc..

flamedoge 1 day ago 1 reply      
site is down
A Guide to Deep Learning yerevann.com
410 points by adamnemecek  3 days ago   33 comments top 9
AndrewKemendo 3 days ago 2 replies      
The primary thing missing from all of these guides is that you need to have two things for ML:

1. A purpose for utilizing it

2. A data set to train/act on

Without that, all you get are a bunch of shovels and picks, but no idea of what kind of wood/bricks you need or a plan for the house.

nsxwolf 3 days ago 3 replies      
Everyone's into deep learning, but what would I actually do with it? With some other field, like computer graphics, one can fairly quickly get a 3D cube spinning on their screen and know it has some relation to the special effects in the Star Wars movie they just saw. No one makes it obvious what the hobbyist can expect to do with deep learning or how it relates to the broader world.
kowdermeister 3 days ago 1 reply      
If links are broken for you, then turn off your adblocker, because he is measuring clicks with Google Analytics and his JS is broken thanks to the missing ga function.

#issue reported

miguelrochefort 3 days ago 2 replies      
I'll bite.

We see these being posted every week. Why?

hnarayanan 3 days ago 2 replies      
This looks really good, and an interesting way to describe the landscape. Is it just my phone, or are the links completely broken on phones?
md2be 2 days ago 1 reply      
I didn't see k-means or nearest neighbor in the list. Also, wouldn't mathematical statistics be a prerequisite?
blueyes 3 days ago 0 replies      
This is by far the most visually attractive list of deep learning resources I've seen, and they hit a lot of the main points.
roye 3 days ago 1 reply      
I saw stars indicate difficulty; do colors also have some meaning?
bamura 3 days ago 0 replies      
Good consolidation @adamnemecek...
Nintendo releases original Zelda design docs nintendo.co.uk
333 points by jlturner  3 days ago   44 comments top 7
buzzybee 3 days ago 1 reply      
Don't be deceived: the reason why these plans look nice is because of the team structure of the Zelda project. The design team spent a lot of time drawing up polished graphics and layouts and then "threw it over the fence" for implementation, in waterfall fashion. This is still a common practice within Japanese teams, as evidenced by, for example, Mighty No. 9's documentary, where you can witness an entire level constructed in Microsoft Excel. [0] The separation of roles is not a definite downside if the game design is already well-understood, and American teams have flirted with big up-front design on occasion, but tend to lean towards making sure everyone stays hands-on and can test and iterate independently.

One of the stories about Zelda that appears in interviews is that the second quest is the result of a miscommunication about how much space was available: They could have had a single quest that was twice as big.

The main direct advantage of drawing everything out is that you can quickly explore different types of setups (relative scales, positioning, iconography, etc.) and do a few passes of testing on it before committing it to code, for the same reasons that one might do wireframing and mockups for application UI.

[0] https://youtu.be/Ri4bV3Z186Q?t=249

gallerdude 3 days ago 4 replies      
Even as a programmer, I appreciate having the vision of the game first and figuring out how to program it second.
ldjb 3 days ago 0 replies      
Do also check out, if you haven't already, this Iwata Asks interview from 2009, which contains further insight into the development of the original Zelda, and more design documents:


(Be sure to click the Next button at the bottom of that page for the second part.)

tantalor 3 days ago 0 replies      
Very disappointing. Uploading a few sketches to a blog is hardly what I'd call "releasing original design doc". Where's the rest of it? What about the story? Boss mechanics? Weapons? Enemies? Dungeons?
georgeecollins 2 days ago 1 reply      
I once took an English class where we read screenplays that were printed in a book. One of them was of the great movie "Chinatown."

The professor, who had never worked in film, said: "See, the screenwriter really comes up with everything. See how he thinks of every scene, every bit of dialog, that the actors and the directors follow."

Later I read a book about the making of Chinatown and learned that the script was some huge thing that Robert Towne and Roman Polanski rewrote every day while they were making the movie. The version in the textbook was the corrected script, in the order of the final cut.

Anyone who takes a class in video game design or development should be wary of those who haven't really done it.

elihu 3 days ago 0 replies      
It seems kind of obvious that Nintendo would have Official Game Asset Graph Paper, but it's pretty cool to actually see it.
CaptSpify 3 days ago 1 reply      
Can we have the url point to https://www.nintendo.co.uk/News/2016/December/Take-a-look-be... (the original) instead? I didn't see gamasutra adding anything worthwhile
Machine Learning Crash Course: Part 2 berkeley.edu
370 points by llazzaro  3 days ago   35 comments top 4
wiradikusuma 3 days ago 9 replies      
Now that there are a bunch of AI/ML-related links on the front page, this is probably the best time to ask:

As I learn deep learning, from a practical point of view, I've found that the idea is simply to feed some "black box" labeled data so that next time it can give you the correct label for unlabeled data. In essence, it's pattern recognition. What do you think?

And then, as I try to find use cases for ML (you know, finding a problem for the solution), I found that many problems that can be solved with ML can actually be solved with rules. For example, detecting transaction fraud: you just need to find the right rules/formula. Forget ML; if you can't hardcode if-else, just use a rules engine. What do you think?

So, I'm starting to think that ML is good for solving problems where (1) we're too lazy to formulate the rules, or (2) the data is too complex/big to analyze with rules (as in understanding images or voice). What do you think?
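A rules-engine fraud check of the kind described above might be sketched like this; every field name and threshold here is invented for illustration:

```python
# Sketch of a rule-based fraud check, as the comment above describes.
# All field names and thresholds are hypothetical.

RULES = [
    ("amount_over_limit", lambda tx: tx["amount"] > 10_000),
    ("foreign_and_new_account", lambda tx: tx["country"] != tx["home_country"]
                                           and tx["account_age_days"] < 30),
    ("rapid_fire", lambda tx: tx["tx_in_last_hour"] > 20),
]

def flag_transaction(tx):
    """Return the names of every rule the transaction trips."""
    return [name for name, rule in RULES if rule(tx)]

tx = {"amount": 15_000, "country": "LT", "home_country": "US",
      "account_age_days": 400, "tx_in_last_hour": 2}
print(flag_transaction(tx))  # ['amount_over_limit']
```

The appeal is exactly what the commenter notes: each rule is legible and auditable. The trade-off is that someone has to keep inventing and tuning them.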

supremus_58 3 days ago 3 replies      
Does anyone have a recommendation for a text that is mathematically heavy but also looks at modern approaches/applications?
Keyframe 3 days ago 2 replies      
I also have a question regarding ML. Are there resources where I can see how I could treat video sequences (series of images with spatial/temporal continuity) as inputs? I'm trying to find a starting point for learning and for a use case I have in mind.
Kalzumeus Software Year in Review 2016 kalzumeus.com
373 points by mkeeter  2 days ago   232 comments top 20
dsacco 1 day ago 4 replies      
I owe a great deal of my own personal success to Patrick McKenzie and Thomas Ptacek, both of whom have been steadfast, consistent and generous advisors (both in public comments and in "hey can I bounce this off of you" emails).

After following Patrick's writings and stories for a number of years now, I can confidently say that his relentless transparency has been one of the greatest gifts I received in the industry. His advice may not strictly work for everyone in the literal sense, but I believe that diligently attempting to use his suggestions as a template is, itself, a highly productive exercise in programming and business.

There is one particular note I want to make about patio11's success: Patrick is a phenomenal marketer with remarkable business savvy who happens to be a programmer. He is not primarily a programmer, which is evidenced by his recent work at Stripe and the work he is best known for on HN (essentially, writing about shipping software, not the software itself).

This is not to say he is not a good programmer - I simply can't comment on that, though I have reason to believe he is after seeing Starfighter's game. Rather, he leverages that skill set as a means to an end, not an end in itself.

I think this is a really important point to make because I see many people who try to pursue significant career success by e.g. ranking up on TopCoder, or open sourcing impressive software. While those things can lead to success, there is a vast, long tail of people who are very capable programmers with no recognition doing those things. Healthy self-promotion and efficient improvement/maintenance of one's technical skills has a much higher probability of success than attempting to become Fabrice Bellard.

This is demonstrative - in my opinion, the sum of all of patio11's advice can be summarized as follows: Don't be a programmer, be a $SOMETHING who happens to program, and program well.

idlewords 1 day ago 7 replies      
I read these reports from Patrick with interest, but feel like he inhabits a different universe than I do.

Pinboard made $256K last year, so I operate in at least the same financial ballpark. But I do my taxes on TurboTax and have never spoken to an accountant or lawyer. My business is a sole proprietorship.

From my perspective, Patrick overcomplicates everything he undertakes with business processes and overhead. From his perspective, I'm probably an irresponsible slacker.

The upshot is that there are as many ways to run an online business as there are people, and how you do it depends as much on your personality as on objective factors. Big props to him for writing about his experience so openly, and in a way that so many people clearly find helpful.

whichdan 1 day ago 3 replies      
Speaking of salary negotiation, Patrick's article[0] on it is one of the single most valuable things I've read in my career. I can probably attribute about 25% of my salary to it.

[0] http://www.kalzumeus.com/2012/01/23/salary-negotiation/

spitfire 1 day ago 4 replies      
Patrick is such a tease. I want to know what that two week project at Stripe that /every/ company should have is now.
grow91 1 day ago 1 reply      
If anyone is interested in how much Appointment Reminder sold for: https://feinternational.com/buy-a-website/6192-b2b-saas-prof...
patryn20 22 hours ago 0 replies      
Reading these is always interesting to me. Looking at these sorts of sites and the articles and buzz they gather on Hacker News and other sites, I always assume they must be doing high six figures in income. Yet the ones that are transparent are generally in the $300k-a-year-or-less revenue range.

I've always considered my side projects and businesses failures, but judging by these numbers I've been more successful than I realized. So I feel good about that, but I think I really need to re-evaluate my goals and how I pursue them. Because I've been extremely negative about what are apparently successes.

Perhaps I also need to be more open to hiring a broker next time I sell a project.

charlieirish 1 day ago 3 replies      
Patrick never ceases to deliver value to this community and many others. Even though this is a story about how his latest startup didn't go as planned (he's now full time at Stripe), he drops knowledge bombs for us all to learn from:

- knowing when to move on to something new (Appointment Reminder -> Starfighter) when he didn't have the 'fire in his belly' any longer

- financial planning using a simple spreadsheet: the retirement fund!

- when to borrow money and how to calculate risk

- when to join a company (rather than start something new)

- the value of personal leverage (personal and professional development)

- when and how to sell your startup

- the trials of shipping (six weeks became three years)

- setting goals for the future

- the value of family

I've added Patrick's year in review to a list of a few other bootstrapper and solopreneur reviews that I have felt are helpful, instructive or inspiring:


stevoski 1 day ago 0 replies      
From the article:

"Doing [meat-and-potatoes work on marketing and sales that the business needed] was a real joy back when I was running Bingo Card Creator, but repeating it felt a lot like repeating high school."

Oh boy, that's exactly how I feel when I'm in the mood to make a new product. Patrick nailed it.

dennisgorelik 1 day ago 3 replies      
It is hard for me to reconcile:

1) "Tokyo is a wonderful city and I love it a lot enough to not regret the rent"

2) "the balance in my Excel file went progressively negative. This was expected, but caused me a bit of stress."

Ogaki is a city where Patrick successfully launched his first startup - Bingo Card Creator.

Patrick met his wife in Ogaki.

Then the McKenzie family moved to Tokyo, which "helped" Patrick:

- to fail at another startup, Starfighter, and converted him from entrepreneur to employee;

- "not felt as effective as a husband / father";

- to accumulate "health debt".

What is so "wonderful" about Tokyo and what is there "to love"?

emiliobumachar 1 day ago 2 replies      
At least here in Brazil, credit card debt is notorious for having the highest interest rates of any form of debt, by a wide margin.

If you don't mind saying, did you ever consider other forms of debt, e.g. applying for a loan at the bank? If yes, what tanked it?

mutagen 1 day ago 0 replies      
I'd love a series of interviews with the people you've dealt with over the years at your Ogaki bank and their perceptions and misconceptions about your account and the transactions that go in and out of it.
pavlov 1 day ago 2 replies      
I'm a bit surprised that he was running Appointment Reminder as a sole proprietorship. My business ventures have mostly sucked, but I've always found the separation of personal and business concerns afforded by proper incorporation to be invaluable, and not expensive at all in the big picture. (This was in Finland; I imagine Japan could be very different.)

On the other hand, I guess that makes AR a fine example of a company that would have benefited from Stripe Atlas if it had existed back then? :)

tabbott 1 day ago 2 replies      
Why were you unable to keep StockFighter running?
__derek__ 1 day ago 1 reply      
@patio11, there's a broken link to your post about salary negotiation. It's currently relative and needs a leading slash.

 <a href="2012/01/23/salary-negotiation/">previous writing on the subject</a>

grok2 1 day ago 0 replies      
patio11, why does the ToS page on Appointment Reminder still list your name as the one the agreement is with?


Epenthesis 1 day ago 1 reply      
(I realize this is rather tactless, but I ask in the interest of fairly weighing the risk/rewards of startups/consulting vs. big tech)

Is it fair to read this to mean that patio11's net worth is < 250 k$? (Based on the fact that his liquid assets were negative prior to the sale of AR, and said sale netted less than the price of a house in Tokyo, median price: ~250 kUSD.)

jonaldomo 1 day ago 1 reply      
Great write-up. Does anybody have more information on the attack near the end: "...defrauded by Lithuanian hacker gang which figured out how to use our application to proxy a telephone call through Twilio's phone number verification feature to a phone sex line in the Caribbean..."
Ayraa 1 day ago 1 reply      
Maybe the time and effort demands of both wouldn't have allowed it, but given how Patrick's personal debt grew, it seems it would have helped to do at least a little consulting while he was working on Appointment Reminder.

Doing freelance/consulting work on the side while bootstrapping your product to profitability seems increasingly common now, though it's definitely more sustainable in locales with lower costs of living, and when you don't have to provide for a whole family.

mythrowaway99 1 day ago 3 replies      
No offense, but how do you get $120,000 into credit card debt with the claimed $30,000-per-week consulting income?
tbrooks 1 day ago 1 reply      
Why host Jekyll on AWS Lightsail and not S3?

It's simpler, faster, and cheaper.

Plus you can use Cloudfront and use AWS Certificate Manager for SSL.

Machine Learning Crash Course: Part 1 berkeley.edu
327 points by rafaelc  2 days ago   16 comments top 4
aub3bhat 2 days ago 0 replies      
As a counter-argument: linear regression is to ML what the "goto statement" is to programming.

Linear regression looks great on paper since you can derive residuals and slopes, compare the individual "effects", etc. But that's unnecessary, and in some cases wrong, when the goal is mere prediction and not explanation. The big difference between ML and statistics is that the latter selects a "correct" linear model and then assumes a distribution for the "errors" due to pesky reality. The effects are used for explanations (538/Nate Silver-style wonkery/punditry). Machine learning, on the other hand, tries to predict as close to the observations as possible without imposing a model or caring about an explanation.

The simplest introductory machine learning approach should not be linear regression but rather a 1-nearest-neighbor model.

E.g. rather than giving data about house prices and square footage, the question should be "How do you predict the price of a house in a given location?", "What are relevant features?" (location, location, location, school district, number of rooms, sq ft, etc.), "How would you collect labels/data?" (Zillow; exclude prices older than 2-3 years).

The simplest answer would be that the price is the same as that of the neighboring house (closest lat/long) with similar square footage sold recently. This can then be implemented as a weighted distance metric and tested using leave-one-out cross-validation (I know, not the best metric). But consider how nearest neighbors allows us to incorporate location information in a natural manner. That is very important and cannot be incorporated elegantly into a linear regression model.

A big part of ML is applying a different set of methods across several domains. Thus, for beginners, teaching ML should not be about teaching linear models or gradient descent but rather about how to start thinking from an ML perspective.
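The 1-NN predictor sketched in this comment fits in a few lines; the toy listings, the distance weighting, and the sqft scale factor below are all made up for illustration:

```python
import math

# (lat, long, sqft, price) -- toy recent sales, purely illustrative
SALES = [
    (40.01, -75.00, 1500, 300_000),
    (40.02, -75.01, 2200, 450_000),
    (40.50, -75.40, 1600, 210_000),
]

def distance(a, b, sqft_weight=0.0001):
    # Weighted distance: location dominates, square footage contributes a little.
    # The weight is an arbitrary choice you would tune on held-out data.
    return math.hypot(a[0] - b[0], a[1] - b[1]) + sqft_weight * abs(a[2] - b[2])

def predict_price(lat, lon, sqft):
    """1-NN: the price of the most similar recent sale."""
    nearest = min(SALES, key=lambda s: distance((lat, lon, sqft), s))
    return nearest[3]

def loo_error():
    """Leave-one-out cross-validation: predict each sale from all the others."""
    errs = []
    for i, (lat, lon, sqft, price) in enumerate(SALES):
        others = SALES[:i] + SALES[i + 1:]
        nearest = min(others, key=lambda s: distance((lat, lon, sqft), s))
        errs.append(abs(nearest[3] - price))
    return sum(errs) / len(errs)

print(predict_price(40.015, -75.005, 1800))  # 300000
```

Note how location drops straight into the distance function, which is exactly the "natural incorporation" the comment argues a linear model can't match.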

minimaxir 2 days ago 2 replies      
The whole "machine learning is just fancy statistics" discussion that happens on Hacker News endlessly is often pedantic semantics. However, in the case of linear regression, this is basic statistics that is an analysis life skill and has many practical applications outside of the hardcore TensorFlow blog posts. (case in point, I first learned linear regression during my undergrad in a "Statistics for Business" class)
CN7R 2 days ago 3 replies      
What's the best way for college freshmen to learn about ML? -- A.I. and ML aren't really topics talked about until upper-divs, which means a year or two out for me.
rrggrr 2 days ago 0 replies      
Seriously... thank you for posting. This was the intro I've been looking for but never found.
Microsoft Word for Windows Version 1.1a Source Code (2014) computerhistory.org
279 points by rmason  3 days ago   133 comments top 24
coldcode 3 days ago 1 reply      
The article is missing a bit, since Word for Mac (1985) was more the model for the eventual Windows 1.X version, but isn't even mentioned in the article. There were also versions for other "windowed" OS's during the same 5-year period before Windows was sufficiently viable to make it work. People often forget that new apps appeared first on the Mac until around 1990 or so, when Windows 3.0 shipped (Word appeared a bit before); basically, MacOS was much more advanced than Windows up until that point. After Windows 3, the platform apps appeared on first flipped completely to Windows. I shipped my first MacOS app in 1987.
eb0la 3 days ago 4 replies      
Is this the first source code with Hungarian notation released to the public? (https://en.m.wikipedia.org/wiki/Hungarian_notation)
pcunite 3 days ago 1 reply      
A file "filewin.c" has the following comment:

FUTURE: MacWord does a DoJump(&venvMainLoop) !! (which,according to DavidLu is "guaranteed to crash soon there after"). we should really figure out a better way...

Who is DavidLu and did this ever get fixed?


davidw 3 days ago 3 replies      

 Morristown, NJ 07960
That's Bell Labs, isn't it?

The 'obscure language' used is troff: https://en.wikipedia.org/wiki/Troff

dsp1234 3 days ago 1 reply      
sp2tab.bat which calls tabify.sed to replace spaces with tabs. Nice to know that some things never change.

Edit: Also a 16bit windows executable version of GREP, just tested on a 32bit version of Windows and it still works (but not on 64bit)
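For the curious, the leading-space-to-tab conversion such a tabify script performs can be sketched as follows (assuming, as is the usual convention, that it only touches leading indentation):

```python
def tabify(line, tabstop=4):
    # Convert runs of `tabstop` leading spaces into tabs, leaving the rest
    # of the line untouched -- roughly what a tabify.sed would do. The
    # tab stop of 4 is an assumption, not taken from the Word source.
    indent = len(line) - len(line.lstrip(" "))
    tabs, spaces = divmod(indent, tabstop)
    return "\t" * tabs + " " * spaces + line[indent:]

print(repr(tabify("        x = 1;  /* inner  spaces kept */")))
```

Interior spacing (string literals, aligned comments) survives, which is why such scripts restrict themselves to the start of the line.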

xurukefi 3 days ago 1 reply      
Profanity check:

  $ grep -i fuck * -r
  Opus/asm/wordgrep.asm:; BP is used as always, the other registers are free to fuck with.
  Opus/asm/wordgrep.asm:        je another_fucking_out_of_range_jump
  Opus/asm/wordgrep.asm:another_fucking_out_of_range_jump:

lewisjoe 3 days ago 1 reply      
Naive question: Is it possible to build/run this source in a simulator, VM or something?

Is there any way we can run Windows 1.1 and get this up and running today?

mizzao 3 days ago 0 replies      
> We are grateful to Roy Levin, Managing Director of Microsoft Research, Silicon Valley, for working hard to find the source code and getting permission for us to release it.

Ironic as MSR-SVC was shut down a few months after this article was written.

ungzd 3 days ago 0 replies      
It's still the dark ages of word processing today, but the "cryptic commands" are in Markdown and HTML.
olav 3 days ago 1 reply      
I remember the joy of writing a 200-page book in FrameMaker on a smallish SPARCstation in 1994. Even in 1998 it was easy to convince my boss to license FrameMaker for Windows as our writing software. Word was still too buggy to write anything exceeding a few pages, or anything with embedded images. Sadly, Adobe never marketed FrameMaker to a mass market.
the_mitsuhiko 3 days ago 0 replies      
I wonder how much of this code still lives on in current generation Word.
mark-r 3 days ago 0 replies      
Word 1.1 is the catalyst that got me to try Windows. I was familiar with Word for DOS at the time, and the potential of proportional fonts and OS-level printer drivers caught my imagination - it was the future, I was sure. It didn't hurt that I got a promotional copy for cheap with Windows bundled. I still have the floppies around somewhere.

Many years later, and I'm still writing Windows programs. Excellent strategy on Microsoft's part.

Nice to see this hugely transformative software available in an archive.

iconjack 3 days ago 0 replies      
Comes on 33 disks, 28 of which are printer drivers.
faragon 3 days ago 0 replies      
Opus/sort.c implements Merge Sort, limited to 16383 records.
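For reference, the textbook algorithm that comment names; sketched here in Python (the original is C, and whether sort.c is top-down or bottom-up isn't stated), minus the 16,383-record cap:

```python
def merge_sort(records):
    """Classic top-down merge sort: split, sort halves, merge."""
    if len(records) <= 1:
        return records
    mid = len(records) // 2
    left, right = merge_sort(records[:mid]), merge_sort(records[mid:])
    # Merge the two sorted halves, preserving order of equal keys (stable).
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 5]))  # [1, 2, 5, 5, 9]
```

The 16383 limit (2^14 - 1) looks like an artifact of 16-bit index arithmetic, though that's a guess rather than something the source comments confirm.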
cja 2 days ago 0 replies      
What was "PC Word" and how did it relate to "Microsoft Word"? Was it also for Windows?

(References to both found in http://antitrust.slated.org/www.iowaconsumercase.org/011607/...)

peterburkimsher 3 days ago 0 replies      
"Will Office be open source in future?"

"Yes. I give you my Word."

eggoa 3 days ago 6 replies      
The 1984 BYTE magazine review said it was "clever, put together well, and performs some extraordinary feats" but "extremely frustrating to learn and operate efficiently."

This still describes Word, 32 years later.

x0 3 days ago 0 replies      
There's a whole bunch of sed files in the source code!
sytelus 3 days ago 2 replies      
It seems that Microsoft Word unseated the far more established WordPerfect even before OS dominance. Any insights on how this happened?
chris_wot 3 days ago 0 replies      
This actually might be useful in rendering old documents for The Document Foundation's file format project.
TAForObvReasons 3 days ago 2 replies      
> To access this material you must agree to the terms of the license displayed here, which permits only non-commercial use and does not give you the right to license it to third parties by posting copies elsewhere on the web.

Does this conflict with the terms of the Open Specification Promise? https://en.wikipedia.org/wiki/Microsoft_Open_Specification_P... If so, does that taint anyone working on a project involving DOC word documents?

albertgao 3 days ago 0 replies      
Reading the comments gave me the illusion that only the smartest people deserve to use Word. Seems I could get a Mensa certification for using every edition of Word well without too much learning... God, I need to test my IQ; it should be 9999.
pritambarhate 3 days ago 0 replies      
We use GSuite (Google Apps and email) at work. Though I have MS Word installed on my machine, I find that I increasingly use Google Docs for most of my document needs. Only when I need to do something that just can't be done in Google Docs do I open MS Word. This is becoming increasingly rare.

I mainly use MS Word to open documents shared by Clients and to check the resumes of candidates which are generally in MS Word format in India.

Google Sheets has also improved a lot over the past year or so.

Russia Hysteria Infects WashPost: False Story About Hacking U.S. Electric Grid theintercept.com
298 points by platinumrad  20 hours ago   249 comments top 35
mmaunder 17 hours ago 4 replies      
We reverse-engineered the IOCs included in Thursday's report from the FBI, which released malware data supposedly associated with the 'Russian' election hack. Turns out it's from a hacking group in Ukraine: anyone can get it for free (but if you're nice you'll donate to their BTC account), and the DHS and FBI sample was several versions behind.


The trouble is that the report was released at the same time as the expulsion of 35 Russian diplomats and the whole context around it, including some of the language used in the report, implies it's proof of a Russian election hack.

We also analyzed the IPs they shared, and they're just a mish-mash of known attack IPs from around the world - probably hacked hosts being used as an attack platform by everyone. ISPs include Linode and Digital Ocean.

I'm having serious Colin Powell UN flashbacks here: IC releases questionable data as justification for military policy decisions.

I've done two interviews this morning about this story and I'm told by one very well known journalist that publications both on the left and right think this whole thing stinks. Here's RS's take.


a3n 18 hours ago 6 replies      
> What's the problem here? It did not happen.

> There was no penetration of the U.S. electricity grid. The truth was undramatic and banal. Burlington Electric, after receiving a Homeland Security notice sent to all U.S. utility companies about the malware code found in the DNC system, searched all their computers and found the code in a single laptop that was not connected to the electric grid.

"and found the code in a single laptop that was not connected to the electric grid."

So, the first step in penetrating a system was accomplished, getting the code onto a device that could potentially (or so they attacker may have hoped) be connected to the target network.

Until I hear that the code was put on the laptop by its owners intentionally and for legitimate reasons, this sounds like an attack. The headlines and responses are arguably alarmist and not fully informed, but it's still an attack. The dismissal of alarmism seems intended to obscure the likelihood that there was, in fact, the start of an attack.

If a spear phishing attack fails, was it not still an attack? That it was an attack in the direction of the power grid is, by definition, alarming. [EDIT: The first sentence in this paragraph confuses my point, and can profitably be ignored.]

The Intercept's article could have been less sensationalist itself, and I wonder what the motivation for the overdramatization of the Post's failure would be. Competition? Schadenfreude? Sensationalist link baiting?

Regardless, I had hoped for a more sober and professional style from The Intercept since its early days, and I've long ago stopped reading it, modulo the odd HN post.

woodruffw 18 hours ago 5 replies      
From the referenced article:

> Editor's Note: An earlier version of this story incorrectly said that Russian hackers had penetrated the U.S. electric grid. Authorities say there is no indication of that so far. The computer at Burlington Electric that was hacked was not attached to the grid.

This is journalistic ethics in action. WaPo has publicly admitted a mistake and revised their article as a result. Greenwald can (and deserves to) give himself a pat on the back.

That being said, I am disappointed in his bad faith equivocation of the (occasionally sloppy and partisan) news media with "news" that is patently false and engineered to maximize advertising revenue. Calling this "fake news" just gives the GOP more (dishonest) ammunition in its 40 year war with the Post.

disordinary 18 hours ago 3 replies      
Fake news isn't a new phenomenon: in the 1870s, a satirical/comedic article in a New Zealand newspaper about an impending Russian invasion led to such widespread hysteria that the colonial government almost bankrupted itself. To sate the public, it had to invest heavily in naval vessels and build 17 forts to fight off the (non-existent) Russian menace.

It's a wee bit hypocritical for the US to get so upset about these things, though, considering all the elections the CIA has been involved in, not to mention the stuff that Snowden revealed (like tapping the German Chancellor's phone). Everyone knows that whatever espionage Russia is doing to the US, the US is doing back in kind. All the powers will be hacking each other.

ChuckMcM 18 hours ago 1 reply      
These events have a reaction time, a response time, and a validation time. It is critical that legitimate news outlets keep their validation time small so that they can accurately report events.

The danger is pretty clear, if response time is shorter than validation time, people or systems will respond, perhaps irreversibly, before validation can be achieved.

That is how you do real damage in a system. Hopefully a very public critical response to the Washington Post here will help extend their response time again past the validation time.

Mikeb85 19 hours ago 3 replies      
WaPo, Nytimes, CNN, all part of the US propaganda machine.

The anti-Russia hysteria is getting ridiculous, and the more the media drum it up, the less people believe it.


striking 19 hours ago 3 replies      
It's funny how much flak Facebook (and almost no one else) got for "fake news" when the vast majority of today's journalism is so saccharine.

Is there a way contemporary journalism can be fixed?

Spooky23 18 hours ago 0 replies      
Sorry, Glenn, that's not fake news.

The Intercept has plenty of "all hat" articles where the picture painted by the headlines doesn't necessarily match the content.

astaroth360 11 hours ago 1 reply      

"Hope everyone remembers just before Trump took office @ggreenwald was praising Breitbart & @jeremyscahill was joking about working for Putin"

^ This

betrothed 16 hours ago 0 replies      
I'm inclined to say, however small, or peripheral, or target cognizant, this is still an infrastructure hack.

It's not a SCADA attack on systems that deliver services, but surely an attack that lands close enough to "The Electric Grid" to pay attention.

Pay attention now, not later.

Jerry2 16 hours ago 0 replies      
WaPo is now a blatant propaganda outlet. They don't seem to care about the truth anymore and they've become what they accuse others of: a fake news source.
disposablezero 17 hours ago 0 replies      
This is what happens when the majority of journalists both have a profit motive and cozy up to the establishment: they'll say anything and a low/no-information populace gobbles it up without a grain of salt.

The Intercept, Democracy Now, Thom Hartmann, TYT, et al. are in a precarious position because they often speak the truth, which is inconvenient to those in power. Whether they can mostly survive and measurably supplant establishment media by demographics isn't certain. Whether Trump will target investigative journalists and net neutrality (likely), Erdogan-style, is anyone's guess.

PS: Another interesting CIA operation which taints media and fuels conspiracy theories https://en.wikipedia.org/wiki/Operation_Mockingbird

MasterScrat 6 hours ago 0 replies      
Thank god for people like Glenn Greenwald and Matt Taibbi. My friends and family look at me like I've decided to have dog food for dinner when I say the whole Russia story is made up.
esalman 17 hours ago 0 replies      
WashPost now has an Editor's Note acknowledging its key claim was false (source: https://twitter.com/ggreenwald/status/815291333942411264)
sergiotapia 18 hours ago 0 replies      
At this point there is -zero- evidence Russia hacked anything. Anyone saying that is cringeworthy.
divbit 19 hours ago 3 replies      
What is the boundary between what we consider "fake news" and news with a tiny kernel of truth somewhere in it (in this story, it sounds like a semi-related laptop was infected with some malware) that is sensationalized to claim something much broader? There are some pieces of news (e.g. meme-news that people post on social media sites, similar to what one might read in a tabloid) that get rejected by my BS filter a lot more easily than something like the piece mentioned in the article, which was posted by a respectable journal.
canjobear 15 hours ago 0 replies      
On a related note, what was the outcome of the claim that North Korea hacked Sony? I was never convinced by the evidence presented but it seems to be something people generally believe was true.
JudasGoat 16 hours ago 0 replies      
God. Once politics is involved, objectivity and truth are the first casualties.
astaroth360 13 hours ago 0 replies      
I'd like to see better evidence of the US election being hacked, but I understand they wouldn't want to release anything that could cut off their ways into Russian systems. I don't know how anyone expects to get real proof of it without deciding to give away knowledge of strategically important gaps in Russian infosec.
mavdi 18 hours ago 0 replies      
I find this article as misleading as the original WP article. Truth is out the window, get used to it. If one side can come up with utter nonsense, then why can't the other?
zzzcpan 18 hours ago 0 replies      
Isn't malware on some worker's laptop a common way of penetrating disconnected networks? Not that it matters, as it serves the agenda equally well being either a "false" or a "true" story. Seems like calling Russia out on covert operations was too scary for them, so they chose hacking as a more acceptable thing.
ehaskins 18 hours ago 1 reply      
For your reference the fourth paragraph from WashPost story:

>Burlington Electric said in a statement that the company detected a malware code used in the Grizzly Steppe operation in a laptop that was not connected to the organization's grid systems. The firm said it took immediate action to isolate the laptop and alert federal authorities

czep 18 hours ago 0 replies      
There's more than enough real news to be upset about. With all the focus on email hacking, why no furor over the stolen $100mn?

In the meantime we hear sob stories about the consulate chef being deported. Poor guy! Hard to feel bad when he's got 9 digits stashed in a Swiss account.

fixxer 18 hours ago 0 replies      
I once thought fake news meant Breitbart and Alex Jones...
mtgx 18 hours ago 0 replies      
This is why, every time there was a post about "banning fake news" on HN, I specifically gave WashPost as an example (knowing they've written pure propaganda/false stories in the past) and questioned whether a site like WashPost would have its fake news articles blocked on Facebook too when it is caught manufacturing stories (which it arguably did here).

Because if such articles from the big media companies wouldn't be blocked, then the system would be biased and unworkable, and Facebook or Google will just find a lot of backlash against them over it.

mSparks 17 hours ago 1 reply      
almost comedy.

is it the good AIDs or bad AIDs (Mary Whitehouse experience reference)

It was clearly the good computer virus designed to penetrate state infrastructure. because Glenn Greenwald said so.

IBM 18 hours ago 5 replies      
When it comes to Glenn Greenwald, @noahpinion said it best [1]:

Is there a catch-all term for middle-aged white lefty dudes who are pro-Russia because their political outlook was defined by the Iraq War?

[1] https://twitter.com/Noahpinion/status/815104514046902273

dimino 18 hours ago 3 replies      
Glenn Greenwald is being really weird about this Russia hacking thing.


tmuir 18 hours ago 0 replies      
The Washington Post is a national media institution that has had its press credentials revoked by Trump. The organization is therefore, by definition, at a disadvantage when it comes to gaining CONTEXT about the subject of its reporting. Thus, The Washington Post is mired in CONTROVERSY. This is only natural.

People misinterpret each other's text messages and internet comments, often with CONTROVERSIAL outcomes, because the initiator of the message has failed to provide sufficient CONTEXT. This is only natural.

The entire aviation industry vilified Captain Sullenberger, even though he had just saved 155 people's lives, because everyone investigating the incident lacked sufficient CONTEXT to explain to themselves, and each other, how Sully was able to accomplish something that had never happened in the history of aviation. Captain Sullenberger did, in fact, possess sufficient CONTEXT, which he gained over a long career of landing other failing airplanes. This CONTEXT possessed by Sullenberger, at least as portrayed in the movie, is written all over Tom Hanks' face in the form of a stiff upper lip. That guy was as cool as the other side of the pillow the whole time, before, during, and after his water landing. Once sufficient CONTEXT was provided, the CONTROVERSY immediately subsided. This is only natural.

The United States of America is at a fairly CONTROVERSIAL point in its history. I wonder whether, if Americans sought out the true CONTEXT of the people they find the most CONTROVERSIAL, their political opponents, said CONTROVERSY would naturally subside.

Find someone you disagree with, and see how long you can keep talking to them.



draw_down 19 hours ago 1 reply      
The really shameful part of this is the xenophobic garbage spewed by Democrats who are upset their candidate lost the election. They can't handle the thought that they simply lost, so now anyone who disagrees with them is an agent planted by Putin.
zxcvvcxz 17 hours ago 0 replies      
The most ironic part of the whole "fake news" debacle is that those pushing it are just as guilty.

It's OK though, fear not, because Facebook will tell you what's real and what's not.

vonnik 17 hours ago 0 replies      
The Wikileaks-Greenwald-Russia axis stands firm.
exabrial 17 hours ago 0 replies      
Remember news agencies only make money if people are panicking...
Chrome browser for businesses enterprise.google.com
315 points by vikiomega9  3 days ago   270 comments top 27
jpochtar 3 days ago 8 replies      
Is anyone else concerned that this means IT can choose to hold back the version of Chrome in their organizations? Auto-updating Chrome has been, low-key, one of the best solutions to the pain of backwards compatibility with older browsers. In the past we not only had to worry about compatibility between browsers, we had to worry about compatibility between browser versions. Further, auto-updating Chrome has dramatically reduced the time from new web feature implementation to widespread deployment and thus usability. I take it that turning off auto-updating will not be widespread, but I'd rather not risk it.
m-p-3 3 days ago 3 replies      
This isn't a new thing at all, but more awareness to the sysadmins shouldn't hurt.

Also, let me point you to the Legacy Browser Support extension.


With proper GPOs you can force a domain/subdomain to open in IE directly from any links.

twblalock 3 days ago 5 replies      
One notable thing about the dominance of Chrome (and the decline of Firefox) is that WebKit is now a de facto standard. The desktop and mobile versions of Chrome and Safari are based on it, and according to Microsoft, "any Edge-WebKit differences are bugs that we're interested in fixing." (https://en.wikipedia.org/wiki/Microsoft_Edge). That covers every major desktop and smartphone browser except Firefox.

At some point, developers are going to target their stylesheets for WebKit only, because Firefox rendering differences are going to be seen as nuisances that aren't worth overcoming in order to reach a tiny minority of users. Firefox will have to work toward WebKit compatibility as Microsoft Edge does.

WebKit is doing pretty well for something that was originally part of KDE.

reitanqild 3 days ago 4 replies      
For me Chrome is already the new IE: good enough to saturate the market to the point where devs stop caring about standards compliance.

Personally I have tried to like it, multiple times but I always get annoyed and go back to FF. But then again I prefer Linux over Mac and Netbeans over IntelliJ so maybe it's just me.

krzyk 3 days ago 16 replies      
Is this about replacing one monopoly with another?

It is slowly starting to look like the early 00's: most developers are starting to support only Chrome, with no testing in Firefox or the latest IE.

On Android it is even worse, no one is testing with Firefox Mobile (Fennec) - this is WinXP-IE6 all over again.

vc4 3 days ago 0 replies      
It will be so much better for developers if there is another browser installed apart from IE in all enterprises.

I still encounter companies downgrading to older versions of IE to support their legacy applications.

DannyBee 3 days ago 3 replies      
I thought this had been around forever? (I mean some form of installer and deployment kit/admin tool)

I see other pages talking about it going back to 2010/2011

newscracker 3 days ago 0 replies      
It's good to see that Chrome's auto-updates can be turned off and updates pushed manually to users at a time of the IT administration team's choosing (I had to dig down into a few links to find this information). But it seems like there's nothing equivalent to Firefox ESR [1] in place, and Chrome would continue to update with the same frequency as the general public release (of the consumer focused version). Does anyone know if a longer term security-only-updates model is available with this like Firefox ESR (just for the sake of curiosity)? When I searched online I found a two year old reddit thread that indicated there wasn't one.

[1]: https://www.mozilla.org/en-US/firefox/organizations/

therealmarv 3 days ago 2 replies      
Oh cool, a version for corporate rule fetishism admins who love controlling their MSI files. Happy I don't work in such an environment.
vayarajesh 3 days ago 0 replies      
So much better for the web application world. Most support tickets/issues are related to the browser being used. This is one step closer to removing IE.
valarauca1 3 days ago 0 replies      
This is a big negative for security. The best thing a sys-admin can do is blindly click okay on every update.

Out of date web browser bugs are pretty much THE mainstream hacking route. It is the route of least resistance.

kozak 3 days ago 1 reply      
Chrome is the new IE. Enterprise apps are being developed with the assumption that they will only ever be run on Chrome, and become fragile because of that assumption. A recent example: someone set up a race condition of timers that only worked (i.e. resolved in the specific order needed for the app to work) in WebKit-based browsers. No one cared to fix that, because it did work in Chrome, and that's all that's needed.
eb0la 3 days ago 0 replies      
I guess this is more oriented to devices like meeting room hardware (can it be plugged into a VoIP server / Avaya switch?) that predates the Polycom and Cisco markets... using Hangouts.

I guess the TCO for Digital Signage devices is cheaper than the incumbents' (cost of Android signage app/web app vs Windows app/web app + remote management control).

lee 3 days ago 1 reply      
I don't see a Linux version on there. Does anyone know when/if Google will provide one?
cylinder 2 days ago 0 replies      
This is a silly off-topic request, but any chance someone knows of a theme (ideally Wordpress) that looks like this page? I actually really like it and could use it for my next project.
ComodoHacker 3 days ago 0 replies      
Are there any real differences besides support?
nunez 3 days ago 0 replies      
I thought this was a thing for years...?
lhaussknecht 3 days ago 1 reply      
We already deployed Chrome in the enterprise to run our Chrome packaged app. Too bad Google EOLed packaged apps...
damiien 3 days ago 0 replies      
The hidden passwords sharing would be a killer feature, especially for business.
mmanfrin 3 days ago 1 reply      
Are the benefits of these things to enterprise customers just the protection against dumb employees?
iampherocity 3 days ago 1 reply      
My apologies.
purity_resigns 3 days ago 0 replies      
Chrome is but IE writ large.
eriknstr 3 days ago 0 replies      
So this is what Google Ultron looks like. Neat.
chipsz 3 days ago 0 replies      
CSS Bug: Navbar disappears, but hovering menu links still works.

Chrome 55.0.2883.95

A_Crazy_Idea 3 days ago 0 replies      
For all those businesses without intellectual property or databases.
B1FF_PSUVM 3 days ago 0 replies      
> automatically sign you into all your favorite Google services

I'm dying to do that. "Give me convenience or give me death", as Jello Biafra so well put it.

(Now, if they could make apps not automatically sniff out each other's use of a Google account ... like, keep blogger and maps out of gmail's knickers ...)

ktta 3 days ago 8 replies      
I think the most interesting part of all this is that they're offering support for it[1]. That, plus it now being part of G Suite, makes it an 'official' service/product.

And since Chromium browser is an open source browser which receives many contributions from many developers[2], this will add Google to the list of companies which take contributions from OSS and make money off it.

Please correct me if I'm wrong, but I'm not aware of any other OSS product which contributes to their revenue in such a direct way. The only things that come close are Kubernetes and then TensorFlow, but neither is at the same 'level' that Chrome is now.


Some DNS lookups causing 5xx errors due to leap second bug cloudflarestatus.com
246 points by nomadicactivist  11 hours ago   106 comments top 21
nullc 11 hours ago 6 replies      
My CDMA phone dropped service for a few minutes after the leap second.

It's absurd that we continue to keep subjecting ourselves to these disruptions and the considerable amount of work that goes into handling leap seconds for the systems that aren't disrupted by them.

Leap seconds serve no useful purpose. Applications that care about solar time usually care about the local solar time, while UT1 is a 'mean solar time' that doesn't really have much physical meaning (it's not a quantity that can be observed anywhere, but a model parameter).

It would take on the order of 4000 years for time to slip even one hour. If we found that we cared about this thousands of years from now: we could simply adopt timezones one hour over after 2000 years, existing systems already handle devices in a mix of timezones.

[And a fun aside: it appears likely that in less than 4000 years we would need more than two leapseconds per year, sooner if warming melts the icecaps. So even the things that correctly handle leapseconds now will eventually fail. Having to deal with the changing rotation speed of the earth eventually can't be avoided but we can avoid suffering over and over again now.]

There are so many hard problems that can't just easily be solved that we should be spending our efforts on. Leapseconds are a folly purely made by man which we can choose to stop at any time. Discontinuing leapseconds is completely backwards compatible with virtually every existing system. The very few specialized systems (astronomy) that actually want mean solar time should already be using UT1 directly to avoid the 0.9 second error between UTC and UT1. For everything else, all that is required is that we choose to stop issuing them (a decision of the ITU), or that we stop listening to them (a decision of various technology industries to move from using UTC to TAI+offset).

The recent leap smear moves are an example of the latter course but a half-hearted one that adds a lot of complexity and additional failure modes.

(In fact for the astronomy applications that leap seconds theoretically help they _still_ add additional complication because it is harder to apply corrections from UTC to an astronomical time base due to UTC having discontinuities in it.)

ChuckMcM 9 hours ago 6 replies      
Once again we're screwed by different people wanting "time" to mean different things. There is no hope for humanity once we start traveling anywhere close to light speed into and out of the solar system.

I propose a new "non-time" time system. It has exactly two real values which range from 0 to tau, plus an integer: the first real number is radians of earth rotation, and the second is radians of the rotation around the Sun. The integer reflects the number of complete cycles. So lunch time in Greenwich is 'pi'.

It has the benefit that its "source" is actually the planet, so we can use a telescope at Greenwich to pick a certain alignment of stars as the "zero", "zero" point and then each time it realigns to that exact point, you can increment the "year" count.

I believe we can build a robust system to support this out of stone. We'll need to create a circle of stones, but using a small hole drilled through a stone and a marker on the ground we can always identify (0.0, 0.0), (0.0, pi/2), (0.0, pi), and (0.0, 3*pi/2).

tyingq 15 minutes ago 0 replies      
They apparently run their own DNS proxy called "RRDNS", written in golang. https://blog.cloudflare.com/tag/rrdns/
gamegoblin 11 hours ago 2 replies      
I guessed most big services would be using something akin to time smearing [1] since the first big leap-second outages years ago. Is there any reason why Cloudflare would be unable to use this technique?

[1] https://developers.google.com/time/smear
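As described in [1], the smear spreads the extra second linearly over a 24-hour window around the leap instead of inserting a discontinuity. A rough sketch of the arithmetic (illustrative only, not Google's actual implementation):

```python
SMEAR_WINDOW = 86400.0  # seconds; noon-to-noon around the leap

def smeared_offset(seconds_into_window):
    """Fraction of the leap second applied so far: 0.0 at the start
    of the window, a full 1.0 second by the end."""
    return seconds_into_window / SMEAR_WINDOW

print(smeared_offset(0))      # 0.0
print(smeared_offset(43200))  # 0.5 -- half the leap second absorbed
print(smeared_offset(86400))  # 1.0
```

Every clock reading during the window is slowed by the same tiny factor, so no process ever sees :60 or a repeated second.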

karlhughes 11 hours ago 0 replies      
This was shared a while ago, but it's relevant again: http://www.madore.org/~david/computers/unix-leap-seconds.htm...
userbinator 11 hours ago 2 replies      
I'm curious what if anything would be problematic if everything just effectively "ignored" leap seconds (i.e. would this outage not have occurred?) --- one minute is always 60 seconds, an hour is always 60 minutes, and a day always 24h. I mean, if you consider the fact that human society has managed to function perfectly well with almost everyone not knowing nor caring what a leap second is, and yet apparently some software does --- leading to problems like this --- something doesn't feel right.
ComputerGuru 10 hours ago 0 replies      
I was at a relative's and tried to load two different web sites.. my first thought was that their wifi sucked. My second was "will we finally learn a lesson today about the disturbing trend towards constant re-centralization of all our online services?"
jlgaddis 11 hours ago 1 reply      
I'm guessing CloudFlare runs their own custom DNS server software?
justinholmes 3 hours ago 0 replies      
Funny that they wrote about it in 2014 https://blog.cloudflare.com/its-go-time-on-linux/
brongondwana 4 hours ago 0 replies      
Was glad things have improved since 4 years ago!


This time I didn't get paged for anything on leap second day :)

dmd 3 hours ago 0 replies      
Half the people posting here need to read https://qntm.org/calendar
thisrod 9 hours ago 0 replies      
There is a higher order issue here. DNS time stamps have been stable for decades. Why has anyone written new code to format them since the last leap second?
zitterbewegung 9 hours ago 0 replies      
So you can basically see if a tool or company hasn't experienced a leap second if their system goes down because of it.
tscs37 6 hours ago 0 replies      
I knew it.

I knew that something is going to break somehow because for some reason people continue to falsely believe that 1 minute always has 60 seconds.

zkms 9 hours ago 2 replies      
What causes real-world problems with leap seconds is actually unrelated to the nasty interactions of metrology and solar time -- it's a specific and avoidable problem with how NTP (and many OSes/languages) represent time -- it's a types issue.

The right way for computers to represent time is with a number that represents the number of constant-rate ticks that have elapsed past some agreed-upon epoch. If you know what the epoch is and how long each tick is (lots of people use 1 / 9.192 GHz), it is easy to know how many ticks are between any two time values, and you can convert a time value with one epoch to one with a different epoch and tick rate -- you can do everything people expect to do with time. There are no numbers that represent an invalid time value, and for each moment, there is a unique time value that represents it. There's a one-to-one mapping with no nasty edge cases.

Leap seconds are a step function that is added to a constant-rate timescale (whose name is "TAI") in order to generate a discontinuous timescale (whose name is "UTC") that never is too different from solar time. There is nothing fundamentally abhorrent about leap seconds -- there are just good and bad ways to represent, disseminate, and compute with timescales that involve leap seconds.

The right way to handle leap seconds can be seen with many GNSSes and PTP (very high precision hardware-assisted time synchronization over Ethernet). GPS, BeiDou, Galileo, and PTP all involve dissemination and computation on time values -- and with dire consequences for failure/downtime/inaccuracy.

The designers of those systems all somehow converged on the choice to separate out the nice, predictable, constant-rate and discontinuity-free part of UTC from the nasty step function (the leap second offset). Times in all those systems are represented as the tuple (TAI time at t, leap offset at t). This means that the entire system can calculate and work with (discontinuity-free and constant-rate) TAI times but also truck around the leap offsets so when time values need to be presented to a user (or anything that requires a UTC time), the leap offset can be added then. Crucially, all the maths that are done on time values are done on TAI values, so calculating a time difference or a frequency is easy and the result is always correct, regardless of the leap second state of affairs. Representing UTC time as a tuple makes the semantics of that data type easy to reason about -- the "time" bit is in the first element and is completely harmless -- all the edge cases live in the second half of the tuple.
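A toy sketch of that tuple representation (the numbers are illustrative, not real TAI values or a real leap table):

```python
from collections import namedtuple

# (continuous TAI tick count, leap offset at that moment)
UtcTime = namedtuple('UtcTime', ['tai_seconds', 'leap_offset'])

def elapsed(a, b):
    # All arithmetic happens on the continuous part, so differences
    # are correct even across a leap second.
    return b.tai_seconds - a.tai_seconds

def to_utc_seconds(t):
    # The discontinuous offset is folded in only for display.
    return t.tai_seconds - t.leap_offset

before = UtcTime(tai_seconds=1000, leap_offset=36)
after = UtcTime(tai_seconds=1002, leap_offset=37)  # a leap second happened

print(elapsed(before, after))  # 2 -- the true elapsed time
print(to_utc_seconds(after) - to_utc_seconds(before))  # 1 -- what naive UTC math claims
```

Baking the offset directly into a single integer, as NTP and Unix timestamps do, is exactly what makes the second calculation the only one available.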

NTP and Unix (and everything descending and affected by those) have made the mistake of representing and transmitting time as a single integer, TAI(t) + leap offset(t). This is not a data representation that has sensical semantics and it is very hard to reason about it. First of all, the leap second offset is nondeterministic and also unknown -- there is no way to get it from NTP and there is no good way to know the time of the next leap event. Second of all, there are repeated time values for different moments in time (and when a negative leap second will happen, there will be time values that represent no moments in time). Predictably, introducing nondeterministic discontinuities doesn't work so well in the real world. There are a bunch of bugs in NTP software and OS kernels and applications that make themselves shown every time there is a leap second. It's not even just NTP clients that struggle -- 40% of public Stratum-1 NTP servers had erroneous behavior [0] related to the 2015 leap second! Given that level of repeated and widespread failure, the right solution is not to blame programmers -- it should be to blame the standard. The UTC standard and how NTP disseminates UTC are fundamentally not fit for computer timekeeping.

GNSS receivers and PTP hardware get used in mission-critical applications (synchronizing power grids and multi-axis industrial processes, timestamping data from test flights and particle accelerators) all the time -- and even worse, there's no way to conveniently schedule downtime/maintenance windows during leap second events! "Leap smear" isn't an acceptable solution for those applications, either -- you can't lie about how long a second is to the Large Hadron Collider. GNSS and PTP systems handle leap second timescales without a hitch by representing UTC time with the right data type -- a tuple that properly separates two values that have the same unit (seconds) but have vastly different semantics. The NTP and unix timestamp approach of directly baking the discontinuities into the time values reliably causes problems and outages. The leap second debacle is not about solar time vs atomic time; it's about the need for data types that accurately represent the semantics of what they describe.

[0]: http://crin.eng.uts.edu.au/~darryl/Publications/LeapSecond_c...

mikehollinger 9 hours ago 0 replies      
I'll just leave this here:

Have a look at an excellent video that explains why time algorithms are hard to sort out: https://m.youtube.com/watch?v=-5wpm-gesOY

Happy New Year from Austin!

web007 7 hours ago 1 reply      
Are there any public "skewing" NTP pools that distribute the leap seconds as lag / gain over 24 or 48 hours as some of the large providers do? That seems to be the generally accepted answer to leap-second chaos, and certainly seems simpler than all of the hidden bugs in systems all over the place trying to deal with :60 on a clock.
aburan28 11 hours ago 0 replies      
I was wondering who this leap second was going to affect!
iopq 7 hours ago 0 replies      
Is that why Google Maps was down?
known 8 hours ago 0 replies      
I just did sudo rdate -s time-a.nist.gov
homero 9 hours ago 0 replies      
I didn't see an outage at all
Be Careful with Python's New-Style String Format pocoo.org
307 points by BerislavLopac  21 hours ago   140 comments top 28
tedunangst 20 hours ago 1 reply      
Maybe a more fully worked example is needed. You're making a blog hosting service as a service service. Bloggers have different ideas about what page titles should be.

  Post Title
  Blog Name: Post Title
  Blog Name - Post Title
  Post Title - Blog Name
  Blog Name — Post Title
  ~~~ xXx Post Title xXx ~~~
It's a little overwhelming to put every possibility in a dropdown, so you allow the user to specify a format string.

  title = userformats.title.format(post=post)
This doesn't look so very dangerous. And then the user can say

 "{post.title} - {post.blog.title}" "{post.title}: Another fine post by "{post.author}" "~~~ xXx {post.blog.__init__.dbconnection.__keys__.password} xXx ~~~"
And then oops.
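To make the "oops" concrete, here's a minimal runnable sketch (Post, Blog, and SECRET_KEY are hypothetical stand-ins):

```python
# str.format allows attribute and item access, and every method's
# __globals__ attribute exposes its module's namespace.
SECRET_KEY = 'hunter2'  # pretend this is your app config

class Blog:
    def __init__(self):
        self.title = 'My Blog'

class Post:
    def __init__(self):
        self.title = 'Hello'
        self.blog = Blog()

# The template a well-behaved user writes:
print('{post.title} - {post.blog.title}'.format(post=Post()))

# The template an attacker writes -- walks from the object up to
# module globals via the bound __init__ method:
leaked = '{post.__init__.__globals__[SECRET_KEY]}'.format(post=Post())
print(leaked)  # hunter2
```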

anderskaseorg 19 hours ago 2 replies      
The proposed idea of relying on undocumented internals and blacklisting attribute names to securely sandbox formatting strings is _really_ _dangerous_. Never do that in production code! Language expansions could render your sandbox unsafe at any time.

You can write your own safe formatting engine in much less code.

  import re

  def safe_format(fmt, **args):
      # Missing keys become '' rather than crashing re.sub with None.
      return re.sub(r'\{([^{}]*)\}', lambda m: args.get(m.group(1), ''), fmt)

  safe_format('{foo} {bar}', foo='Hello', bar='world')
  # Hello world
Add bells and whistles as desired.

coldtea 20 hours ago 4 replies      
Err, why would you allow for the user to enter arbitrary format strings in the first place?

Might as well write "be careful about eval of arbitrary user provided strings".

kazinator 19 hours ago 1 reply      
Sane Lisp approach: provide easily analyzable target syntax.

  This is the TXR Lisp interactive listener of TXR 163.
  Use the :quit command or type Ctrl-D on empty line to exit.
  1> (defvar foo 42)
  foo
  2> `@(list foo) ... @foo`
  "42 ... 42"
What is that backticked literal? Let's quote it:

  3> '`@(list foo) ... @foo`
  `@(list foo) ... @foo`
Hmm, prints back in same form. Probably syntactic sugar for a list; what is in the car? cdr?

  4> (car '`@(list foo) ... @foo`)
  sys:quasi
  5> (cdr '`@(list foo) ... @foo`)
  ((list foo) " ... " @foo)
What if we type in this syntax ourselves:

  6> '(sys:quasi (list foo) " ... " (sys:var foo))
  `@(list foo) ... @foo`
Prints as the notation!

To sandbox this, we just have to walk a list and enforce some rule. For instance, the rule might be that all elements after sys:quasi must be string object or else (sys:var sym) forms, where sym is a symbol on some allowed list. Thus (list foo) would be banned.

A custom interpreter which calculates the output string while enforcing the check is trivial to write as a one-liner.

Of course if your program just eval such an untrusted quasiliteral, it has access to the dynamic/global evironment:

 `Mouhaha! @(file-get-string "/etc/passwd")`
Very convenient for some attacker. :)

nhumrich 19 hours ago 2 replies      
This article makes you think the new f-strings will be discussed. What is really being discussed is 'string'.format(), which isn't new in any way.
angusp 1 hour ago 0 replies      
It's a well known C pattern that you should never trust a user supplied format string, E.g. printf(arg) vs printf("%s", arg). The same applies here
Animats 20 hours ago 4 replies      
No, Rust does not have the ability to access any variable in the program via a format string. Rust has this:

 format!("{argument}", argument = "test"); // => "test"
That's just named arguments to the format. Also, that's a macro; it's expanded at compile time.

Python's approach is lame. It should have used something with a limited list of named arguments, or maybe a dict.
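For illustration, that "limited list of named arguments" behavior can be had in stock Python by subclassing string.Formatter -- a sketch, not hardened production code:

```python
import string

class SafeFormatter(string.Formatter):
    """Allow only bare named fields: no attribute or index access."""
    def get_field(self, field_name, args, kwargs):
        # '{title}' is fine; '{post.__init__}' or '{a[b]}' is not an
        # identifier and gets rejected before any traversal happens.
        if not field_name.isidentifier():
            raise ValueError('only plain names allowed: %r' % field_name)
        return kwargs[field_name], field_name

fmt = SafeFormatter()
print(fmt.format('{title} - {blog}', title='Post', blog='Blog'))
# Post - Blog

try:
    fmt.format('{post.__init__.__globals__}', post=object())
except ValueError as e:
    print('blocked:', e)
```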

agentgt 20 hours ago 0 replies      
This is actually a problem with a lot of languages that allow runtime template like String interpolation.

For example Groovy on the JVM has GStrings which one can do fairly nasty things.

As well it actually is fairly hard to lock down most of the template languages on the JVM for user templates. (If you are going to allow user templates I recommend one of the Java Mustache-like implementations).

FeepingCreature 2 hours ago 0 replies      
If you are making a new language, the easy way to fix this is to make formatting a property of your strings, not a runtime function. I.e. instead of having "foo {bar}".format(bar=bar), have "foo {bar}" be equivalent to "foo " + bar.

This sidesteps the problem because only literal strings are formatted.
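Python 3.6's f-strings work exactly this way: interpolation is part of the literal syntax, so a string that only arrives at runtime is never formatted.

```python
bar = 'world'
greeting = f'foo {bar}'   # formatted here, at the literal itself
print(greeting)           # foo world

user_input = 'foo {bar}'  # runtime data: the braces stay inert
print(user_input)         # foo {bar}
```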

tzury 9 hours ago 0 replies      
If sensitive data is available within the context that parses and executes user input, then what does this have to do with the new-style string format? I mean,

it should not be available to any function call, whatsoever!

ryanmccullagh 19 hours ago 1 reply      
This is the same principle for not passing user input to the first argument of `printf(3)` in C. Coming from C, I would never allow the user control of the format string if I were writing code.
peterwaller 19 hours ago 0 replies      
Reminds me of the Turing-complete interpreter lurking in libc's printf.


unscaled 15 hours ago 0 replies      
I was always puzzled by Python's insistence on forgoing string interpolation until the latest version.

Runtime string formatting, even if done safely (e.g. .NET's String.Format() which doesn't have property access AFAIK), can still cause unexpected exceptions at the very least, and suffers from inferior performance.

Manishearth 17 hours ago 0 replies      
Yeah, I never liked Python's "new-style" format because of this. It didn't occur to me that you could use field access to access globals (never done enough Python metaprogramming to mess with the reflection stuff), but I was afraid of arbitrary getters being invoked.

In general I'm very wary of runtime string formatting. Strings tend to be untrusted input with a large degree of freedom. format strings are almost always known at compile time (and more trustworthy). If your interpolation system is more than simply mapping keys to values or positions, you should probably restrict it to compile time. Feel free to expose a harder-to-use runtime API. Rust has compile-time format strings, for example. They're not as powerful as `str.format`, but they could be without there being security issues. JS has a different syntax for format literals. Regular strings cannot be "formatted", you must specify a string literal with backticks and that gets converted to a string value when the interpreter gets there. These literals can execute arbitrary code, but since it's just literals there's no way for an untrusted string to get in there.

One main use case for runtime string formatting is i18n. But that really should use a different solution. Most string formatting APIs are geared for programmer convenience -- the programmer is writing the code and the string. The scales shift for translators, who are only writing the strings. They don't need things like field access and stuff.

Besides, most string formatting APIs are inadequate for i18n. Not if you want to handle stuff like pluralization (http://mlocati.github.io/cldr-to-gettext-plural-rules/).

Another use case is template engines and stuff like that. In that case, field access is useful, but you probably should exert more control on these things (which is exactly what jinja2 seems to be doing here)

At one point I toyed with an idea for a super-type-safe template engine in Rust. It would validate the templates at compile time, and additionally ensure that the right types are in the right places. For example, it could ensure that strings that get interpolated with the HTML are either "trusted", escaped, or otherwise XSS-sanitized (using the type system to mark such types). Similarly, url attributes (href, etc) can only have URLs that have been checked for `javascript:`. Never got around to writing it, sadly.

yladiz 17 hours ago 1 reply      
So is the issue that it's a problem to let the format string be controlled arbitrarily? It's good to warn users around it because some may not know the dangers, but in general you don't trust user input, so I don't really see this as something you need to build something custom around, rather just be careful and follow good practices. It should be understood not to pass user data directly to internal code without sanitizing it. Proposing some custom code that uses undocumented internal features is overkill and also dangerous since things that aren't documented/internal can change suddenly.
babyrainbow 12 hours ago 1 reply      
Discussion of this PEP when it was accepted, with people handwaving and downvoting arguments against it.


Cyph0n 20 hours ago 1 reply      
Scala's solution to string formatting is the best out there imo. You can put basically any code into the format placeholder.


Pxtl 14 hours ago 0 replies      
Odd they reference c# since c# does not have this feature in its traditional format strings - they don't allow you to access arbitrary members.

C# recently added string interpolation, which does allow arbitrary code, but string interpolation itself is compiled C# code and can't be stored like a format string.

Personally I use mustache when i need format-string-like-behavior from semi-trusted users.

libeclipse 18 hours ago 2 replies      
I tend to use the old style ("%s" % "lel"). Just wondering, does this affect that too?
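As far as the attribute-walking attack goes, it doesn't: the `%` mini-language supports only conversion types and, for mappings, plain key lookup; there is no syntax for attribute access. A quick illustrative check:

```python
# mapping-key lookup is the most % formatting can do:
assert "%(name)s" % {"name": "lel"} == "lel"

# str.format, by contrast, can traverse attributes:
assert "{0.real}".format(1) == "1"

# in % formatting, "a.b" is just a literal dict key, not a traversal
try:
    "%(a.b)s" % {"a": object()}
except KeyError:
    print("no attribute traversal in %-formatting")
```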
foota 18 hours ago 0 replies      
Wow, I had no idea you could access attributes from within format strings.
agumonkey 20 hours ago 0 replies      
Once again, strings and I/O formatting are making fun of us.
innocentoldguy 20 hours ago 1 reply      
While I agree this is a possible attack vector, I think it is extremely unlikely, at least in the localization realm, for several reasons:

1. The localization company should never know what programming language you're using.

2. You shouldn't give localization companies direct access to your internal strings. More than hacking your code, they're almost guaranteed to screw the formatting up.

3. Typically, translators are hired for their native language abilities, and not for their technical prowess. I've met precious few who knew how to open a text editor, let alone hack your product via its strings.

I worked with Python and I18N/L10N for about 15 years. The way I always handled localization was to parse all our strings into a PostgreSQL database, and then provide a web interface for translators to do their work. This interface provided translators with the full context of the strings they were translating (which internal strings often don't), prevented the inclusion of certain characters and keywords, and kept the translators from screwing up the formatting. By doing it this way, we got much better translations, and our internal strings were never out of our control.

bbcbasic 12 hours ago 0 replies      
Pure functions ftw
atabdiwa 20 hours ago 1 reply      
... Uncontrolled format string bugs? In 2016? Really? Someone would fall for that? ..
dlbucci 20 hours ago 4 replies      
I do love writing python, but it's pretty shocking when I find out you can write something like `event.__init__.__globals__[CONFIG][SECRET_KEY]`. That language just does not care about privacy or information hiding at all, I guess.
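The chain quoted above is exactly what str.format will follow when handed a hostile format string. A self-contained demo (the names are hypothetical stand-ins for real app config):

```python
SECRET_KEY = "hunter2"   # stand-in for a module-level config value

class Event:
    def __init__(self):
        pass

# imagine fmt arrives from a user instead of being hard-coded here
fmt = "{0.__init__.__globals__[SECRET_KEY]}"
print(fmt.format(Event()))   # prints: hunter2
```

`Event.__init__.__globals__` is the namespace of the module the class was defined in, so any module-level secret is reachable from any instance a format string is allowed to touch.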
ak217 20 hours ago 3 replies      
Until now I thought the new features were confined to `f""` format strings.

Edit: As correctly pointed out, this feature has been around since the introduction of str.format(). So this warning applies to all Python versions.

stevebmark 19 hours ago 1 reply      
Doesn't this apply to any language that has string interpolation - Ruby, Python, JavaScript, Perl, etc.? And doesn't it not really matter, because it's not realistic to use a dynamic string template in a program?
chrisdone 14 hours ago 0 replies      
> So what do you do if you do need to let someone else provide format strings? You can use the somewhat undocumented internals to change the behavior.

Well, that's just sloppy and shame on you for exposing programming internals to a user in a service. What did you expect? Write your own format string parser and stop being lazy.

EveryPolitician: Open dataset on politicians everypolitician.org
257 points by tfgg  1 day ago   49 comments top 15
shakethemonkey 1 day ago 4 replies      
Political Graveyard[1] is the world's richest open dataset on American politicians, and its size dwarfs everypolitician.org

[1] http://www.politicalgraveyard.com/

[Disclaimer: I have no affiliation with the site, not even sure who runs it]

thomasfoster96 1 day ago 2 replies      
233 countries? It would be much better to organise the data in some sort of hierarchy, given that having the UK, Wales, and Scotland all on the list is somewhat confusing (and it leaves out US and Australian state legislatures).

Also, I'm wondering how the data was collected - the party affiliation information for the Australian parliament is very strange. Not entirely wrong, but probably misleading.

JumpCrisscross 1 day ago 0 replies      
> mySociety Limited is a project of UK Citizens Online Democracy, a registered charity in England and Wales

Contributing data [1] on powerful people carries risks. These risks depend on whose information you are sharing, how you got it and your country's strength of rule of law.

Britain has very broad dragnet surveillance laws on its books [2]. If you are going to contribute, please consider the INFOSEC and OPSEC ramifications of those laws.

[1] http://docs.everypolitician.org/contribute.html

[2] https://www.theguardian.com/uk-news/2016/mar/01/snoopers-cha...

IanCal 1 day ago 0 replies      
I have tried reading through the documentation and looked at the github repo, and cannot find a license.

Could you please add one? I currently can't work out what I'd be able to use this for, so far I'd be concerned I cannot reuse it in any way at all.

eriknstr 1 day ago 2 replies      
I was happy to see that they include politicians outside of the US. Unfortunately there is no data about any of the politicians in my country yet.
donmatito 1 day ago 1 reply      
That's an awesome project. Some civic tech initiatives promise to bring transparency to representatives' activity/laziness, vote records, or transparency/corruption. This promises to unify datasets in a consistent, comparable manner. Very interesting.
nailer 1 day ago 1 reply      
From the site's UI:

> Find representatives from your country:

Expected to use the box to find representatives. Instead it's a place to enter your country.

You should already know my country. Let me type in a representative's name.

Goldenromeo 1 day ago 1 reply      
Have you considered allowing email signup for the gender balance game feature?
pani00 20 hours ago 0 replies      
> If you know where to find more data for this country, please let us know.


DasIch 1 day ago 1 reply      
For many countries they don't even have current data on who is in parliament, and when they do, that data is essentially worthless. I really don't see the point of this.

If you want to bother at all, you should have data on the level of http://abgeordetenwatch.de (for Germany only, but surely similar projects exist in other countries). So: how they voted, which committees they are part of, which jobs they have besides being a politician. If you can get it, even which lobbyists they've met with (http://ec.europa.eu/transparencyregister/).

foxfired 1 day ago 1 reply      
233 countries, 0 from my country.
devoply 1 day ago 1 reply      
politicians are mere servants. need a better open data set on the people and corps they serve.
nodesocket 1 day ago 0 replies      
I quickly scanned the title and thought it would include each politician's net worth.

Strange use of "richest", I would have personally gone with "largest" but...

Fifer82 1 day ago 0 replies      
I can't believe my tax pays for so many useless cretins and the UK is still on its knees.
id122015 1 day ago 2 replies      
Thank you for such a useful project, it's a good start. I'll happily contribute data sources, and I can also translate the website into other languages if requested.

And most importantly, for those who live in countries with huge tax rates: next time, when I protect my hard-earned money that they try to steal as tax and inflation, I'll use the feature to donate it to this website.

What Does Any of This Have to Do with Physics? nautil.us
356 points by dnetesn  3 days ago   67 comments top 20
volkk 2 days ago 2 replies      
A great read.

Something that stuck with me throughout the article was that the concept of "you can do anything" was almost masked by the fact that he had placed all of his "apples" into one basket--Rajeev. Of course this was a different time, and I think it highlights just how important the internet and technology have become in our professional success.

Had this been present day, Henderson could have tried to make use of others through collaboration, just as Rajeev himself pointed out towards the end. Somewhere in the article he mentioned his doubts about his teacher, and that's something that I think most people need to realize. Teachers are just people with their own faults. Nowhere is it stated that your teacher is going to know the answer to your success. If you continually find yourself lost and doubtful, you should extend your reach and try to seek help from other minds as well.

He was on a journey with thousands of forks along thousands of roads, and simply locking himself in a room for 15 hours a day, essentially brute-forcing different paths, isn't a healthy way of going about research, or anything in life.

kurthr 2 days ago 1 reply      
Wow, I see how this is relevant to startups, because it's one of the best essays about grad school and PhD research that I've seen. The people who attempt it are capable and driven, but a good advisor is often critical. There are a lot of hills to climb, and the most important thing to learn is how to guide yourself when the way isn't clear! We want to change the world...

When I saw these lines I thought maybe his advisor wasn't doing such a good job:

"A year or so of research with Rajeev, and I found myself frustrated and in a fog, sinking deeper into the quicksand but not knowing why. Was it my lack of mathematical background? My grandiose goals? Was I just not intelligent enough? Or maybe it was the type of research Rajeev had me doing."

Then he moved on to a thesis and graduated, which shows that Rajeev was doing his job as a Boss and Professor. Advice about using your strengths, working with others, focusing on success and minimizing mistakes... it really does translate to most quests.

I'm sad the writer doesn't remember being happy since he started his PhD. Choosing to make your dream your job is a dangerous thing, especially if you can't still enjoy the path. He's good at writing, so I hope he enjoys that now.

smaddali 3 days ago 1 reply      
A quote from this long article:

"Now you know what makes theoretical physics so hard," he said. "It's not that the problems are hard, although they are. It's that knowing which problems to try and solve is hard. That, in fact, is the hardest part."

As with startups: all startups are hard, but knowing which one to pursue and give life to is very hard.
BeetleB 2 days ago 3 replies      
"Shut up and calculate" was indeed not coined by Feynman. It was, in fact, coined by David Mermin in an essay he had written once.

The amusing thing is that Mermin himself had forgotten that he had coined it and claimed Feynman to be the source. Eventually, he looked into it and found the earliest reference to the phrase was his own essay! (with no reference to Feynman)

His book, Boojums All the Way Through, is one of the most entertaining books about his adventures as a physicist.

(For those who do not know him, he co-wrote the standard textbook on solid state physics.)

pducks32 2 days ago 1 reply      
As a physicist, I see people all the time wondering what to do with it and looking for a justification for all the hard work. But physics is a hobby subject. Its rim is so vastly complicated that you can push and push at the boundary your whole life and get nowhere. You have to do it because you love it, and you have to accept the abstract nonsense of it all. I also studied math and art history, so I was down to do things I thought were abstract awesomeness without wondering about a job. I was lucky in that I'm a software dev, so I had a job anywhere, but my point is that I really feel for people who follow what they love and then become disillusioned. It really sucks.
nappy-doo 3 days ago 1 reply      
This is possibly the best article I've ever read about grad school. If you haven't read it yet, do. You'll like it.
d_burfoot 2 days ago 0 replies      
> You can do whatever you want

This is very dangerous advice to give a young person. But the author should have done a better job at interpreting the message. If your father tells you that you can do whatever you want, do you conclude that you can get good enough at tennis to win the US Open? No, of course not, that's absurd. But winning the US Open is MUCH EASIER than discovering the Holy Grail of physics. Properly understood in this context, the father's message meant "if you want, you can become a physicist" - and it was probably correct. The author's downfall was that he overinterpreted the promise of the message and was also too ambitious to accept the lesser reward of "merely" becoming an average professional physicist.

hnarayanan 2 days ago 0 replies      
This piece moved me to tears. It feels like watching my own life retold by someone who's so much more articulate than me.

It's strange when the source of your intellectual self-worth is also the source of your depression.

anigbrowl 2 days ago 2 replies      
"What we'd created is called a toy model: an exact solution to an approximate version of an actual problem. This, I learned, is what becomes of a colossal conundrum like quantum gravity after 70-plus years of failed attempts to solve it. All the frontal attacks and obvious ideas have been tried. Every imaginable path has hit a dead end."

Isn't that a clue that one of the premises is fundamentally wrong? I'm no scientist but I rely on the scientific method, and questioning my assumptions when I'm stuck almost invariably proves more productive than refining my hypothesis. OK, my problems are very shallow, but nature's complexity generally seems to be the result of simple processes, elaborated and iterated. The author's description reminds me very much of the experience of painstakingly 'solving' one side of a Rubik's cube before realizing the more general iterative approach.

placebo 2 days ago 1 reply      
Very good read, and resonated with me because I had read the same new agey books at the time, went to study physics with the same "I'll find the grail" philosophy and had felt the painful blow of disillusionment, together with other blows that convinced me to leave the path much earlier than the author of the article.

Many years later, I feel that "the grail" is still the driving force behind most of my thoughts, but frankly, I doubt it is reachable by thought. Suppose someone will solve quantum gravity. I'd be very excited and curious - it would be wonderful and fascinating, but I believe any claim that "Physics is solved" that might be stated after that would be as misguided as Lord Kelvin's claim at the time. Any solution would eventually just set the stage for the next grail chase, with more food for the mind to chew on from an infinite supply, and no answer will really make a dent in the armour surrounding the question of what is the essence of this food supply or its relation to the thoughts that contemplate it. I can't prove any of this, of course...

Nomentatus 1 day ago 0 replies      
I went into university (not physics) with the same stars, but I'd read far more history of science. I knew that almost everyone who tries to make something more than an incremental discovery fails miserably; I just thought it was very honorable to make the attempt, if you thought you might have what it takes.

I left because, when I looked around after many years, it was very clear that (where I was) absolutely none of the professors around me had any intention of solving the problems they were paid to discuss, nor any interest in doing so. They were quite capable of becoming angry at any sign that others did. So even if I did want to solve the problems (which I still did), hanging around them would be more hindrance than help. They wanted the prestige, they wanted to cash the people's checks - just so long as they didn't have to do the job, because it might pose some small risk to that reputation, and affect the size of their wine cellar.

Rajeev's eventual answer had more to do with reputation than with big problem-solving; he may have been a functionary when push came to shove, as well.

j7ake 2 days ago 0 replies      
One thing I got from this article is that the art of doing science takes years to develop. Developing a taste for what is good research and a direction for what is a good path only comes from an apprenticeship model where you copy and learn from your mentor. It really shows how important taste, guidance, and perseverance are in order to avoid getting lost or distracted.

Wonderful article.

NumberSix 2 days ago 2 replies      
"Shut up and calculate" hasn't produced much in the way of concrete or practical results compared to the heyday of fundamental physics in the first half of the 20th century that produced quantum mechanics, special and general relativity, the atomic bomb, etc. It has produced extremely complex mathematical systems like string theory that seem to have led nowhere.

Quantum mechanics is probably "incomplete" as Einstein argued. Hence attempts to unify general relativity and the current quantum theory are likely to fail, as they appear to, since a revised quantum theory is needed.

If the data -- angular velocity distributions of stars, etc. -- used to support "dark matter," "dark energy" and other patches to the prevailing theory of the Big Bang and cosmology is in fact evidence that Newtonian gravity does not apply at galactic scales and above, then general relativity is not correct at galactic scales and above. Again this would make unifying the established quantum theory and the established general relativity theory incapable of matching observed reality.

The ubiquitous lack of secure longer term jobs like Einstein's civil service job at the patent office -- he was not a post-doc -- make deeper conceptual analysis of the outstanding problems in physics today difficult, probably impossible.

potbelly83 2 days ago 1 reply      
Great essay, really hits the nail on the head about how hard it is to do fundamental research. I especially liked the comment about controlling your emotions.
hkon 2 days ago 0 replies      
Wow, I rarely read pieces like this all the way through. But this was simply too good to just scroll through. (Realized this after scrolling halfway through).

So, scrollers beware.

WhitneyLand 2 days ago 0 replies      
Would recommend reading for everyone. There's so much more here than getting a PhD in physics.

There are a few different life lessons here to learn from or to think about.

daxfohl 1 day ago 0 replies      
It seems like academia needs a "20% time" thing like Google. You can get a grant for doing cyclotomic fiber bundles in a single dimension, because it's mathy and publishable and not too far from the mainstream, even if the likelihood of this being The Grail (or real in any way) is low.

You can't get funding to look at something completely off the cuff. Even 100 years ago Einstein couldn't have gotten funding to investigate some idea that distorts distance and time. I think 20% time to investigate whatever crazy idea you want would be beneficial to making more substantial progress in the real fundamental problems.

justinpombrio 2 days ago 1 reply      
Grad school is tough.
pas 2 days ago 1 reply      
tl;dr: success (in many walks of life, as in science, especially in abstract branches like theoretical physics and mathematics) is simply not quitting, and has almost nothing to do with winning big (Nobel, Fields, Abel). And those who stay long enough gain tenure.

It's not glamorous, it's shitty. Long hours, low pay. But you do science. And no one ever can take that away from you, which is nice.

jjangsangy 2 days ago 0 replies      
This story resonated with me deeply, I enjoyed every word.
Top algorithms in interview questions geeksforgeeks.org
264 points by g2183889  14 hours ago   56 comments top 15
hal9000xp 7 hours ago 4 replies      
This is a nice list, and the ability to implement these algorithms certainly won't hurt.

But I have to say that knowing these algorithms alone won't help you much during job interview with smart employer like Google.

The reason is simple but often overlooked by many people: the most important thing is not these algorithms themselves but the ability to recognize them in problems.

You may learn pretty quickly how these algorithms work and are implemented, but it may take years of practice to gain the ability to recognize them.

Google won't ask you directly to implement Dijkstra's algorithm. They may give you a problem which on the surface has nothing to do with graphs. It may take a while before you actually have a light-bulb/aha moment when you realize it's a graph problem.
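For reference, Dijkstra's algorithm itself is only a dozen lines; as this comment says, the hard part is recognizing that a problem reduces to it. A standard heap-based sketch:

```python
import heapq

def dijkstra(graph, source):
    """graph: {node: [(neighbor, weight), ...]} -> shortest distances from source."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry, skip
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

g = {"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}
print(dijkstra(g, "a"))  # {'a': 0, 'b': 1, 'c': 3}
```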

In practical non-interview problems, ability to recognize algorithms is much more important than knowing their implementation. You can always find their implementation on the Internet after all.

This is why I'm trying hard to improve my problem solving skills by solving competitive programming problems almost everyday.

chadcmulligan 9 hours ago 3 replies      
Would a better way to interview be questions like:

1) I have an array of 1000 integers, which of these would be the best way to sort them a) quicksort b) bubblesort...

2) what sort of structure would you use to store a list of numbers and strings 1) dictionary/hash 2) two arrays ...

and so on. These would give you the certainty that they know which is which, you could ask give reasons to see if they mention O() and so on.

IRL it's very seldom that you write one of these structures, but you use them all the time and need to know which one to use when.

madmax108 9 hours ago 1 reply      
This site is EXTREMELY popular in India, and used a lot by students AND interviewers (I know many who simply ask questions from the front page of G4G on a given day). It's the inverted tree equivalent in India.

I'm someone who was actually interviewed by GeeksForGeeks because a junior from college connected them to me. (They do interviews with people who have gotten placed in "dream" companies... not my terminology.)

In the interview, which was done over email, I actually mentioned that resources like G4G are bad resources for studying because they over-simplify algorithms and reduce them to silly proportions, and also encourage rote learning. To my surprise, they directly published the same ON THEIR SITE. Speaks volumes of their editorial team (?). This article too has little basis in reality; it's more one guy's list.

I strongly suggest you use much better resources for learning algorithms, rather than this site, which is (by and large) the W3Schools of algorithms/data structures.

shmerl 8 hours ago 0 replies      
It's pretty pointless to ask for something like a linked-list solution, etc. Either you know it or not, and if not, coming up with a solution that took others many years to develop is like being asked to invent something on the spot - i.e. it's practically impossible. So it's a ridiculous kind of question and doesn't show anything useful about the candidate.
andrewvijay 11 hours ago 7 replies      
I genuinely want to know where people use these algorithms in their code. I'm a non-CS dev to begin with, so maybe I don't know where to use them since I didn't get a formal CS education. This way of interviewing is not what I prefer. I have been told I write better code than my CS grad peers, but I have no clue about these algorithms and data structures. What do you guys think about this form of interview?
axiom92 9 hours ago 1 reply      
In my opinion, it's a settled question now. If you are looking for a job, you'd better cram these lists or you are dead meat. The screening tests and the interviews basically boil down to this set of questions for most companies.
invaliduser 8 hours ago 1 reply      
This list is very funny, straight out of the 90s, because we are in 2017 and most developers just spend their working days basically writing forms and storing/fetching data over the network/in a database.
bezzi 9 hours ago 0 replies      
Isn't the idea of preparing for software development interviews ridiculous? Instead of improving my algorithm skills to become a better developer, I find myself memorizing a ton of problems just so I can answer similar ones during interviews. It feels like I'm preparing for the SATs again.
bryanrasmussen 6 hours ago 0 replies      
I think I've only ever had one of these algorithms come up in a hiring situation - unless I have brought them up myself in the normal flow of conversation.

I had to solve a problem for which a binary search was the correct solution (as well as some caching of results and stuff [although that part was fancy show-off stuff and not really necessary to solve the problem satisfactorily]), but I did the caching first and then sort of froze when it came to binary search, because I was thinking 'uhm, I should describe my thinking here first', and then the developer who was in charge of the exercise took my hesitation as not knowing the solution, so he finished it and that was that. (The test was also in Python, a language I don't know that well; the theory was that if you could figure your way through in Python despite not knowing it, you would be able to handle new situations with aplomb. So I guess I failed the aplomb part.)
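For what it's worth, the routine in question is only a few lines of Python (a generic sketch, not the interview's actual exercise; in real code the standard library's `bisect` module already does this):

```python
def binary_search(a, target):
    """Return an index of target in sorted list a, or -1 if absent."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            lo = mid + 1   # target is in the right half
        else:
            hi = mid - 1   # target is in the left half
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))   # 3
print(binary_search([1, 3, 5, 7, 9], 4))   # -1
```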

tzs 11 hours ago 0 replies      
Convex hull is in the number theory category?
KirinDave 7 hours ago 0 replies      
But these questions should be a signal to you as an interviewee. Unless they are extremely salient to your proposed job (e.g., you are applying for a position teaching algorithms) they're a sign the recruiting effort at this workplace is not very healthy.

What that means is that talent and skill will be erratically dispersed throughout that organization, that requests for new staff will take a long time and may not fill needs, and that oftentimes specific managers strongly influence who can get hired where, for a variety of reasons.

Personally, I play along with these questions but make a game of pointing out how incredibly synthetic and unrealistic the conditions people put around them are. The goal of the game is to basically force the interviewer to come out and say exactly what algorithm they want, by way of how many other aspects of real-world software and systems they want to exclude from the conversation.

dvt 10 hours ago 2 replies      
What a typical, uninspired, and pretentious list. I've recently started to opt out of interviewing people because I'm often teamed with someone that will Google one of these, think they're some sort of genius, and proceed to make some poor twenty-year-old feel like a doofus for not knowing the algorithm for a convex hull.

Speaking of which -- seriously? The only time I even had to LOOK at that algorithm was when I read a very old game programming book -- before I even went to college, mind you -- and generating pixel-perfect collisions for arbitrary polygons was one of the chapters (the game example was one of those meteor blaster clones).

I have strong feelings about this and I think this article is a complete waste of time, not to mention lazy (it looks auto-generated, anyway) because it perpetuates the idea(l) of making the software engineer interview process as arcane and difficult as possible.

known 11 hours ago 0 replies      
Good list; please add STL containers to the list: http://www.cplusplus.com/reference/stl/
phkahler 9 hours ago 1 reply      
Nobody needs to be able to code these in an interview. Ever. For certain domains you should be aware of them and be able to look up decent implementations. But to think that level of knowledge is important in an interview is bogus. I could just as easily ask similar questions and weed out most CS grads that get into Google or Facebook with these:

Please implement a first-order low-pass IIR filter.
Tell me how the butterfly pattern in an FFT gets you from N^2 to N*logN. Oh, and implement an FFT.
Write a basic PID controller implementation.
Tell me how you'd handle a Field Oriented Control system that needs to run in voltage limit most of the time - what stability issues may occur?
Write a fixed-point implementation of the sin(x) function.
Implement a 2-pole 2-zero transfer function. For bonus points, do it in fixed point without rollover or saturation problems.
Assuming you have a matrix library available, give me the boilerplate code for a Kalman filter.
What kind of ODE solver should you use for long-term stability when simulating planetary systems?
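To give a flavor of that list: the first item, a first-order low-pass IIR filter, is the recurrence y[n] = a*x[n] + (1-a)*y[n-1], i.e. an exponential moving average. A minimal sketch (the coefficient name and zero initial state are illustrative choices):

```python
def lowpass(samples, alpha):
    """First-order IIR low-pass: y[n] = alpha*x[n] + (1-alpha)*y[n-1]."""
    y, out = 0.0, []          # filter state starts at rest
    for x in samples:
        y = alpha * x + (1 - alpha) * y
        out.append(y)
    return out

# a unit step slewing toward 1.0 instead of jumping:
print(lowpass([1.0, 1.0, 1.0, 1.0], 0.5))  # [0.5, 0.75, 0.875, 0.9375]
```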

These are similar difficulty questions from a different domain, but many of them are likely to be used far more often in that domain than any of the interview questions in TFA are likely to be used in their domain.

The goal of an interview is to ascertain whether the candidate is capable of doing stuff and learning stuff, and if that's likely to carry over into the stuff you need done. It's not to see whether they can produce an answer to some specific problem on the spot. How you do that I'm not telling - it's hard enough without helping you find the people I need ;-)

mathogre 8 hours ago 1 reply      
JFC, that's such bullshit. You want the mundane? Go for it. You actually want someone who can take a bunch of real operational data and solve the problem when the Person With The Money says, "We need to know what is actually happening with ____. And we've promised it in two days. You're it."

Who gives a flying fuck about writing the best sorting algorithm? "sort" works just fine, unless you're Google and microseconds matter. And then, you're really at the edge of R&D. You need to be able to manipulate data with aplomb. You need to be able to write an algorithm that works, and then refine it to make it go a hundred (or more) times faster once you understand why it is so slow.

US government to withdraw longstanding warnings about cholesterol (2015) washingtonpost.com
254 points by winteriscoming  2 days ago   246 comments top 25
petilon 2 days ago 5 replies      
This doesn't go far enough. That there is a link between high cholesterol and heart disease is only a hypothesis, not a scientifically proven fact. Lowering cholesterol does not necessarily lower heart disease. Read more here: http://www.nytimes.com/2008/01/27/opinion/27taubes.html

Because the link between excessive LDL cholesterol and cardiovascular disease has been so widely accepted, the Food and Drug Administration generally has not required drug companies to prove that cholesterol medicines actually reduce heart attacks before approval. See: http://www.nytimes.com/2008/01/17/business/17drug.html

Meanwhile, drug companies are making billions selling cholesterol-lowering medicines called statins. But lowering cholesterol using drugs is not useful. Read more about that here: https://www.bloomberg.com/news/articles/2008-04-15/heart-dis...

Not only that, these statins are dangerous drugs. Their side effects include loss of short-term memory, for example. Read more about that here: https://www.scientificamerican.com/article/its-not-dementia-...

clarkmoody 2 days ago 12 replies      
Wouldn't it be nice if there were more government warnings and less restrictions / regulations?

Imagine if the FDA, instead of blocking new drugs for 10 years and $1B, simply withheld its endorsement until satisfied by the clinical trials. Consumers could then take the government's recommendations into consideration when making a decision, and drugs could get to market much faster.

The history of medical reversals -- and in this case, nutrition reversal -- shows that the government isn't magic.

A whole raft of restrictions could be converted to warnings and recommendations, freeing up industry to innovate and consumers to take a little more responsibility for themselves.

Imagine the history of the past few decades if the state had outlawed any foods with more than X% cholesterol. Or trans-fats. Or any of the other food fads over that time. It would have been terrible, especially now that the recommendation is reversed. The whole time, consumers were allowed to factor government warnings into their decisions, but food producers weren't breaking the law by selling foods with (X+1)% cholesterol.

d0mdo0ss 2 days ago 0 replies      
The sugar industry has a history of lobbying to steer public attention away from sugar and towards cholesterol. It has even enlisted places like the Harvard School of Public Health in this task:



pdq 2 days ago 4 replies      
Let me know when the government wakes up and inverts the food pyramid, so that breads and grains and carbs are scarce, rather than fats.
mtw 2 days ago 3 replies      
For those who are curious, the 2015-2020 Dietary Guidelines is available here https://health.gov/dietaryguidelines/2015/resources/2015-202... (PDF)

Unfortunately it does not highlight the danger of sugar & processed food. Someone can follow these dietary guidelines and still have a high intake of sugar, HFCS, processed food, and all sorts of food chemicals (emulsifiers etc.), all prevalent in the American food industry, and get cardiovascular disease.

We are not going to reverse the obesity epidemic anytime soon!

empath75 2 days ago 8 replies      
There is so much that 'everybody knows' about nutrition that is pure bullshit peddled by farmers, agribusiness and processed food companies.

Just one example-- that fat causes heart disease. Pure nonsense pushed by the soda and sugar industry.


Just eat a balanced diet, don't overeat, and don't eat too much processed food and red meat, and you'll probably be fine.

rcavezza 2 days ago 5 replies      
> The new view on cholesterol in food does not reverse warnings about high levels of bad cholesterol in the blood, which have been linked to heart disease. Moreover, some experts warned that people with particular health problems, such as diabetes, should continue to avoid cholesterol-rich diets.

> The greater danger in this regard, these experts believe, lies not in products such as eggs, shrimp or lobster, which are high in cholesterol, but in too many servings of foods heavy with saturated fats, such as fatty meats, whole milk, and butter.

Here is my paraphrased takeaway:

Cholesterol you see in your blood results is still bad. Whole milk, butter, and fatty meats are still bad. Foods like eggs, shrimp, and lobster might be good.

I don't think this changes any of my mental models. The foods that I always thought of as "probably not great" are still classified as such, according to this article.

conistonwater 2 days ago 0 replies      
The full dietary guidelines being discussed appear to be here: https://health.gov/dietaryguidelines/2015/guidelines/
test_pilot 2 days ago 0 replies      
Youtube is where most people get their dietary advice now. The most influential diet advice is coming from young attractive healthy looking people. Whatever they're eating seems to be working. Obviously most of these people won the genetic lottery, but they've also nurtured their body correctly with food and exercise.

This seems like a much better approach in convincing people what to eat anyway. Look at the results and imitate healthy people if you want to look and stay healthy.

kazinator 2 days ago 0 replies      
> The greater danger in this regard, these experts believe, lies not in products such as eggs, shrimp or lobster, which are high in cholesterol, but in too many servings of foods heavy with saturated fats, such as fatty meats, whole milk, and butter.

Translation: the lobbyists for various polyunsaturated "edible plastics" are currently in the lead.

mescalito 2 days ago 0 replies      
Interesting, for those who enjoy reading (which I guess are probably quite a lot) I'd recommend the big fat surprise[1], a very interesting book about this, which goes way beyond this article.

[1]: http://thebigfatsurprise.com/

blatherard 2 days ago 2 replies      
This needs a (2015) added to it.
LargeCompanies 1 day ago 0 replies      
Hmmm this was just on the homepage then quickly removed.. top of the page then boom gone? Any reason why this is?

Also, I've been on statins since 37, after feeling my heart race and not being able to catch my breath while with my girlfriend at the time. That was some scary stuff, and after the statin and change of diet I no longer have those types of bouts anymore. So maybe it's my change in diet (cut out 75% of fried food; sugar intake is 50% less) and the statin combined, or maybe the statin is just a placebo and my change in diet eliminated those heart-racing attacks?

I had a lot of those attacks from 37 to about 39... I'm now 41 and haven't dealt with any such attacks unless I eat at, say, Five Guys or In-N-Out.

galfarragem 2 days ago 1 reply      
I used to look around in a naive way and draw my own conclusions:

- Grandfather, 89, ate everything moderately during his life, mostly organic. He avoids fat and bread and walks every day. Low-energy person.

- Grandmother, passed at 82, cancer. Same diet as my grandfather. Much less exercise though.

- Grandmother, 92, ate a lot until her 50s, mostly organic, obese. Since then barely eats, mostly vegetables and yogurt. Right weight and good health. No exercise.

- Grandfather, passed at 71. Ate more fat than anyone I know. Death by heart attack or stroke. Very energetic person with a lot of exercise.

Can you see a pattern? I can't.

thomasvarney723 2 days ago 0 replies      
I recommend Peter Attia's (currently unfinished) series, "The straight dope on cholesterol". He's a surgeon interested in health and fitness. http://eatingacademy.com/?s=cholesterol
kelukelugames 2 days ago 1 reply      
My surgeon friend told me medical studies are like the bible. You can find a paper that says something is bad for you and another that says it is good for you. So just believe the one that makes you happier.
fulldecent 2 days ago 0 replies      
This is a link to a non-authoritative source. Could someone please provide a link to the authoritative source so that I may investigate by myself?
overgard 2 days ago 0 replies      
The USG is frankly incompetent when it comes to this matter. I guess it's happy news that they're not creating more chaos, but... yay? Hollow victory.
pombrand 1 day ago 0 replies      
For those of you who still believe it's OK or even good to eat a lot of saturated fats, if you look at the studies it's not much of a controversy: "Whether saturated fat is a risk factor for cardiovascular disease (CVD) is a question with numerous controversial views.[1] Although most in the mainstream heart-health, government, and medical communities hold that saturated fat is a risk factor for CVD, some hold contrary beliefs." https://en.wikipedia.org/wiki/Saturated_fat_and_cardiovascul...

Anecdotally, I've read 50% of people who go on keto see a huge 2-3x increase in triglycerides and LDL-P (particle count as measured by NMR) - the #1 risk factor for cardiovascular disease. More LDL particles bouncing around in your arteries = bad. People with a condition that makes them break down more LDL have much less atherosclerosis: http://www.nejm.org/doi/pdf/10.1056/NEJMoa054013 People with a condition that makes them have more LDL particles get more atherosclerosis: https://en.wikipedia.org/wiki/Familial_hypercholesterolemia Read Peter Attia's exposition that goes into detail (eating cholesterol is fine, though): http://eatingacademy.com/nutrition/the-straight-dope-on-chol... Keto could work, but it's a hyper-pro-level, high-risk diet requiring frequent blood work while still avoiding saturated fats, keeping them primarily monounsaturated, which makes it very hard to follow. Plus it sucks for weightlifting.

Surprisingly, very low fat diets might be great for you; it might not be the absolute macronutrient composition that matters, but rather the specifics of the nutrients (GI, fiber, other nutrients etc.) and your genetic makeup: https://deniseminger.com/2015/10/06/in-defense-of-low-fat-a-... Scary correlations about saturated fats and neurodegenerative disease therein, too.

Personally I'm sticking with a "balanced" ~10/45/45 protein / low-GI carb / monounsaturated fat vegan diet - not wanting to risk the side effects of any extreme (although I'm having to ensure adequate calcium, K2, D, B12, DHA, and EPA intake - if I didn't find it unethical to eat fish, a pescetarian variety would likely be easier/healthier/less gassy). Tip for you vegetarians/vegans: look up low-FODMAP foods - foods low in the carbs that are indigestible in the small intestine and tend to produce gas.

jordache 2 days ago 0 replies      
oh so now butter is bad again? WTF. Can't believe anything the nutritionists put out.
mistermann 2 days ago 0 replies      
Interesting, why now I wonder?
jljljl 2 days ago 1 reply      
"Eat food, not too much, mostly plants."

- Michael Pollan

pcurve 2 days ago 1 reply      
Just eat in moderation and try to maintain healthy weight.

The last time I checked, average woman in the U.S. is 5'4" but weighs 165lbs; average man clocks in at nearly 200lbs. We can do better.

The U.S. Will Surpass China as the No. 1 Country for Manufacturing by 2020 fortune.com
242 points by jayjay71  3 days ago   221 comments top 21
throwaway2016a 3 days ago 18 replies      
I haven't been myself but my wife is a process engineer and had to go to China to install and train people on a new machine so this is only second hand...

But one of the reasons China is so good at manufacturing is because everything is in one place (or at least clusters)... need 10,000 of an Integrated Circuit? The company that makes it is literally down the street. Need some raw materials? That's down the other street.

I can't think of a place in the US that is like that. If we need parts we have to wait for them to ship. (often from China)

It's not just a labor problem. It is my understanding to be competitive in manufacturing you need to be vertically integrated. And I think that is harder to do in the US.

Although I hope it is true. I live in the US, but I don't believe in inherent United States exceptionalism; I do feel that shipping goods by container ship is bad for the environment and bad for consumers. I'd much rather see things made locally.

jayjay71 3 days ago 1 reply      
A link to the full study pdf (1) and a link to a more interactive version of the data (2)

(1) https://www2.deloitte.com/content/dam/Deloitte/us/Documents/...

(2) https://www2.deloitte.com/global/en/pages/manufacturing/arti...

smaddali 3 days ago 3 replies      
While manufacturing may come back to the US, employment in the manufacturing sector may not increase as much, due to automation.
z2 3 days ago 0 replies      
Couldn't one country's CEO be unreasonably bullish or bearish on their country's capabilities relative to others? Do they really know how far along foreign manufacturing schemes are evolving? Nobody is sitting on their hands here; "Made in China 2025" and Germany's "Industrie 4.0" are pretty strong desires to push into advanced production using IoT, smarter automation, and all that jazz. Anecdotally, in a factory visit near Shenzhen, the manager claimed that moving to an automated production line has been pretty easy. Some areas are kept manual only because the worker is cheaper, for now.

Outcomes from competition are hard to predict. It's interesting that Deloitte predicts Germany "holds strong and steady at the number three position" in 2020 when their own survey has them jumping between 2nd and 8th within a couple of years.

bandrami 3 days ago 5 replies      
I'm pretty sure our manufacturing output exceeded China's for most of the aughts, also. The myth that "we don't make things here anymore" is, well, a myth. We just don't employ people to make things anymore.
rahimnathwani 3 days ago 4 replies      
I read the article and skimmed the full study and was left scratching my head:

- What exactly do they mean by 'Competitiveness'?

- Why so much focus on labour costs, when manufacturing cost is increasingly driven by large capital investments (especially in China, where low interest rates and government encouragement have expanded capital investments for years)

- Why all the talk about R&D expenditure. Is that really a driver of manufacturing competitiveness (whatever that means) or manufacturing output?

- How do they expect (on page 46) China's consumption to rise to 46% of GDP by 2025, if GDP will rise at 6.5% per year during the period? Such a shift would require consumption to go up by ~10% per year.

herbst 5 hours ago 0 replies      
I live in Europe. I'd assume 80% of the things I own are made in China. Maybe 19% are made within the EU and 1% everywhere else. I can't recall a single product I own that was produced or even manufactured in the U.S., therefore I have a hard time believing that title.
ageofwant 3 days ago 2 replies      
Jobs that are vulnerable to automation are generally not worth keeping. That's no comfort to those losing them, though.

Science fiction from the '60s painted a world where people would lounge around in their airships, hopping between beaches and mountains and parties, while robots took care of everything and the lord scientists and engineers who gifted the world with such plenty smiled benevolently down on the citizens of the Age of Plenty. Yeah, that did not happen; nobody thought about who would own those automatons, and lo, it was not us.

didibus 1 day ago 0 replies      
We could just all stop buying made in China. Seriously, all these things are laws of economics. You can't bypass them, as they are natural laws. Economics is the study of human motivation. Motivation is what dictates what we do and don't, it rules our actions.

If you wanted to have an impact, be conscious about your actions, pay that extra cost and buy local if you care so much, and don't buy at all if there's no local options.

LargeCompanies 2 days ago 3 replies      
Making America Great Again...
jaekwon 3 days ago 2 replies      
We're not even using the right metric system, so our export abilities are crippled, right? I remember trying to find parts for my 3d printer in the US.
norswap 2 days ago 0 replies      
But is it good news?
johncole 3 days ago 0 replies      
Worst graphic ever.
throwaway2017ab 2 days ago 0 replies      
What a joke ! Why is this even on hackernews ? Does anyone even know the economics of manufacturing and why China is on top of this ?
cynosurelabs 2 days ago 0 replies      
I totally agree with that!
squozzer 3 days ago 2 replies      
Upon what data and assumptions is this prediction based? How will this help Joe Six-pack?
yc-kraln 3 days ago 0 replies      
Who cares? China has more middle-class citizens than the US has citizens. Manufacturing does not move the needle on people's quality of life.
KON_Air 2 days ago 1 reply      
Anglo-Saxons think they can retreat to their shell and prosper again. Great news. Because it never ends badly.
newswriter99 3 days ago 0 replies      
This was posted back in March, right?

Is it a "year-end round-up" of stories or something?

Because if so, this story is kind of dry. Aside from the tiny hyperlink to the Deloitte study, there's NO data in this at all.

What's the deal?

Once mocked, Facebooks $1B acquisition of Instagram was a good move bgr.com
246 points by BishopD  2 days ago   185 comments top 25
mullingitover 1 day ago 7 replies      
Buying instagram was a brilliant move. It's a far more enjoyable social network to use. Lately I've gotten into a scene that's basically a parallel universe where facebook doesn't exist. Everyone uses instagram. It's far more entertaining and creative, and far less saturated with anger and activism. I've dialed back my facebook activity heavily in the past year, to the point that I've deleted the facebook app. I have messenger and the facebook events app, but the losing the news feed has been no loss at all.

I can see a future, not even very far away, where facebook is essentially the AOL of our generation and having an account there is a punchline.

drum 1 day ago 2 replies      
I admire Mark Zuckerberg's confidence to pull the trigger on an acquisition like this, especially doing it without consulting his board. My initial thought was that he has a visceral feel for the rate of growth that makes a social network successful, having gone through it himself. He could probably tell just from their publicity of growing to a million users within 2-3 months that they were going to be huge.
iloveluce 1 day ago 4 replies      
Hacker news thread when Facebook acquired Instagram https://news.ycombinator.com/item?id=3817840
pcurve 1 day ago 7 replies      
The real genius is, he left it alone and gave it autonomy. That in contrast to Google, Microsoft, or Yahoo that can't seem to resist the allure of 'synergy'.

I bet most people don't even know Instagram is owned by facebook.

rudolf0 1 day ago 13 replies      
Dumb question, but what is it about Instagram that made it so huge?

Snapchat I actually get, since it's presenting a new kind of communication model. Facebook I get. But Instagram is just like any old image gallery, with a very rudimentary comment system and almost no features unrelated to image uploading.

tsunamifury 1 day ago 2 replies      
The purchase of Instagram was a pre-IPO move to prop up Facebook's offering as a mobile company before their core product had actually transitioned. Investors' major question at the time was whether FB could transition from desktop to mobile, and Mark needed something to back that up. Mark had been quite vocal about Facebook's guesses around mobile, HTML5, and apps, and how he needed to reorganize the product teams, but couldn't get the story together in time. Thus Instagram.

The logic in both the article and most of the comments seems to utterly forget the context of which FB was in at the time.

exodust 1 day ago 0 replies      
I hate the way Instagram allows people to sign up and use the service without verifying the email address they used to sign up with. I'm assuming this is the case after someone used my email to create a profile on Instagram.

I started receiving notification emails from Instagram a few weeks ago, which I ignored at first thinking they were fake. On closer inspection they actually were from Instagram, so I clicked "forgot password" on the Instagram website, reset their password, logged in and deleted all their content and permanently deleted their account. By the looks of it, the profile belonged to some kid - a few family photos and so on.

It's quite slack of Instagram that this is possible. They should not be allowing people to create profiles and use the service before first verifying the email address used to sign up with. I guess basic verification is trumped by the need for "active users" to motivate these "stroke of genius" articles.

symlinkk 1 day ago 3 replies      
I still don't understand why Instagram took off, and I'm a millennial. You could already post pictures to Facebook when it came out. What advantage did Instagram offer over Facebook? Filters? I just don't understand my peers and I don't understand this industry sometimes.
jplahn 1 day ago 1 reply      
A lot of people ask why Instagram took off and get a lot of different answers; I think that's the beauty of it.

I wasn't much of an Instagram user for a long time, until I started getting into photography. I'm still a total photography noob, but now my feed of pictures is a combination of friends and amazing photographers who serve as inspiration. I deleted my Facebook account a couple months ago and haven't looked back. Whereas I spent time on Facebook scrolling through vitriol, my time on Instagram is a constant stream of friends' lives and beautiful pictures.

I've realized a shallow social network is all the social network I need.

whack 1 day ago 0 replies      
I was in San Francisco when this deal was announced, and I still remember all my supposedly "tech startup savvy" friends mocking Zuckerberg for buying a company with no revenue for a billion dollars. And yes, this group even included a Harvard MBA grad pursuing a career in entrepreneurship. It's no coincidence that this was the same group of people who also mocked Facebook's IPO valuation of ~$70B as proof that we were in a bubble. If there's one thing I learned from that experience, it's that people have no idea how to appraise high-growth high-potential ventures.
terda12 22 hours ago 0 replies      
I'm doing art as a career after my comp sci degree and instagram is just perfect for me. The fact that instagram is pics only works very well for visual artists, and I can easily share my work in a relaxed sort of way. There's also tons of other artists on instagram and the way it's set up I can easily see their pieces of artwork much more easily than twitter/facebook.

Also Facebook is a dead zone now. It's literally pointless political junk and rehashed memes shared by "friends". The only usage I have for it is to message my old friends.

Snapchat is also starting to die out ever since Instagram implemented their own "snapchat" feature. IMO Instagram just does it better than snapchat. Snapchat is just too bloated for me. See with instagram I can follow someone like Kanye West, see his life in cool pics and his "snaps" in a convenient way.

Ericson2314 1 day ago 0 replies      
I think the lesson is that social media is a form of fashion, and Facebook may have to continuously buy new upstarts to stay trendy.
vit05 1 day ago 1 reply      
Live video will be bigger on Instagram than it would be on Facebook, Snapchat, or Periscope. It is way easier, faster, and better looking than any other app that does live video, IMO.

They only need to make a better discovery option than the "Follow someone".

aznpwnzor 19 hours ago 0 replies      
Instagram is what will take down Snapchat or at least defend Facebook against Snapchat's offensive. Snapchat refuses, for good reason to its product, to enact discoverability which Instagram (thanks to Facebook's expertise) handles perfectly.

Instagram has also added to Facebook's main product through the autoplay videos, a tech that FB engineers could not get to work until the Instagram acquisition.

notfreeyet 1 day ago 1 reply      
I never mocked this deal. I always thought this was a small price to pay for insurance that Instagram wouldn't subsume Facebook. It was only 1% of Facebook.

But I still think $19 billion was about $17 billion too much for WhatsApp. It's a messaging product that doesn't directly threaten Facebook the way Instagram does. They could have created 10x $1 billion teams to compete and easily done better than what they have with WhatsApp. It seems like a cowardly use of $19 billion. Oculus cost them $2 billion and there are many other breakthroughs that are equally underpriced.

pfarnsworth 1 day ago 1 reply      
Yep. Kudos to them, I remember thinking this was one of the dumbest acquisitions I had heard of, until Whatsapp. But Instagram has been a huge success.

The thing is that the Instagram and Whatsapp acquisitions are responsible for fueling this idea that all you need to do is create growth, and Facebook and/or Google will pay billions to acquire your company. Snapchat would never have gotten funding if there wasn't this dream that this could happen. We'll see if Snapchat is worth the $25B that is purported for their IPO, but I think those two acquisitions were the catalyst for all of this.

jakebasile 1 day ago 1 reply      
I remember thinking this was a silly move, but my greatest regret is that I did not reserve my preferred user name in time. This means I'll never really be able to use the service.
jonknee 1 day ago 0 replies      
It turned into one of the best tech acquisitions ever, even better than YouTube because it actually makes a profit.
fullshark 1 day ago 0 replies      
The article's position that being mocked by Jon Stewart is meaningful is silly.
RodgerTheGreat 1 day ago 1 reply      
Playing the lottery and winning doesn't retroactively make you a tactical genius. Facebook saw a rising competitor, threw cash at the problem- a page from the playbook of practically every major corporation in history- and in the fullness of time came out ahead. This is a story about nothing.
pasbesoin 1 day ago 2 replies      
People going to Instagram to escape their existing Facebook connections. Starting a new graph on Instagram that is restricted to their current and more private interests. What I've observed.
zappo2938 21 hours ago 0 replies      
I think the brilliance of Instagram is how they solved the problem of image ratio and rendering to different devices -- make everything a square. About the time of the purchase Facebook engineers and designers had been making talks about image layout. (I'll edit this if I find the bookmark to the link.) I wonder if the purchase was an extension of finding a working solution to working with images of different sizes, orientations, and aspect ratios.
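The square constraint does make layout trivial: every image is the same shape regardless of the source aspect ratio, so one grid fits every device. A minimal sketch of the cropping arithmetic (the function name is my own, not anything from Instagram's code):

```python
def center_square_crop(width, height):
    """Return (left, top, right, bottom) of the largest centered square
    inside a width x height image -- the box a square-only feed would keep."""
    side = min(width, height)
    left = (width - side) // 2
    top = (height - side) // 2
    return (left, top, left + side, top + side)

# A 4032x3024 landscape photo loses 504px from each side:
print(center_square_crop(4032, 3024))  # (504, 0, 3528, 3024)
```

The resulting box can be fed straight into any image library's crop call; after that, rendering is just scaling one square to another.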
draw_down 1 day ago 1 reply      
I think about this whenever a startup is acquired and everyone here balks at the price. Turns out $1B was a bargain for Instagram.
ianstallings 1 day ago 3 replies      
Let's be honest here. FB bought IG because they were the competition, and what was the end result? Those guys at IG got benched for years and haven't done much since. So they won, if money was the goal. But if the goal was winning hearts and minds and making an impact on the industry, they got sidelined. They lost. IG will never reach its full potential. So yes, a good purchase on FB's part. But for IG? Debatable.
DoodleBuggy 1 day ago 4 replies      
Hmmm... the headline is succinct and probably is satisfactory to summarize the entire article, but let's see if there is some deeper data or meaningful insight!

* Clicked link to read article

* Immediately overwhelmed by fullscreen overlay ad

* Closed fullscreen overlay ad to attempt to read article

* Immediately assaulted by blasting volume autoplay video ad

* Rather than read article, scroll around and attempt to quickly locate blasting volume autoplay video ad to mute or stop

* Unable to quickly find and mute the blasting autoplay nuisance ad, instead decide to close the browser tab without reading even a word of the article

* Remembered to enable adblock in Canary

What a great web experience!

ThinkPad X1 Carbon lenovo.com
286 points by fiji-flo  1 day ago   498 comments top 52
Waterluvian 1 day ago 28 replies      
I want to be excited by a good non-Apple laptop. But they're all just so terribly designed. They're always some sort of plastic, feel flimsy and bendy, have grills and screw holes and uneven surfaces and stickers everywhere.

I had a T530 for a few years at my first job and just hated it. It had fantastic specs but it felt awful and unreliable and like I had to babysit it. I got a 2015 MBP to replace it after IT damaged it (broke off a bunch of plastic bits from two grills) and while it was a migraine getting Windows and Ubuntu to dual boot, I don't think twice when closing the lid and slipping it into my backpack to go home. Tossing my backpack into my trunk, or in the overhead carry-on.

I would pay a fortune to have a solid chassis (not case) metal non-Apple laptop available.

rcarmo 1 day ago 5 replies      
I was assigned one of these (the 2015 model) as my office laptop, and I hate the thing for five reasons (two dealbreakers on top):

- It is huge and unwieldy, like a cafeteria tray (and comes with a sizable power brick)

- The touchscreen doesn't have an oleophobic coating, so it becomes a smudgy mess within hours (unlike my iPad, which looks pristine for weeks despite being even more heavily used)

- The screen seems dim or washed out no matter what I try (then again, I have a retina MacBook at home...)

- Battery life is poor(ish) on the i7 model. I can only partially blame all the corporate junk we have installed.

- Wi-Fi seemed flakey at times (and you need a proprietary dongle for Ethernet)

I've seen the 2016 models (which don't improve on the above) and they share the good bits:

- Very decent (if somewhat mushy) keyboard, with lots of travel (perhaps a bit too much)

- Relatively lightweight (considering the size)

- Pretty decent touchpad (not on a par with a MacBook, but good enough for three-finger gestures, although maddeningly imprecise at times)

- Despite the color/brightness issues, the HIDPI screen is sharp and readable under Windows 10

- The fingerprint sensor actually works (but doesn't hold a candle to TouchID)

- DisplayPort or HDMI output (depending on model, mine has both, and I use my Apple dongle for DP->VGA)

I disabled TrackPoint within a month due to RSI (I used original IBM laptops for years and it was a recurring issue for me).

TrackPoint is much more precise than the trackpad for accurate positioning, but I'd rather carry a Bluetooth mouse and retain the use of my fingers (whereas I'm perfectly fine with the Mac trackpad for drawing diagrams and pixel-level positioning).

Edit: Oh, and I run Linux on it through Hyper-V and Docker, since I need to run Windows 10. Had no trouble booting a couple of Ubuntu/Elementary Live USB drives for playing around, most of the hardware seemed to work.

joecool1029 1 day ago 1 reply      
A large chunk of the Thinkpad user community is pretty fed up with the ultrabook spec that Lenovo is shipping on its flagship models, namely the inability to effectively customize and modify to use. I keep an X201 alongside my late 2013 rMBP simply because I can use expresscard and a dock and ethernet without carrying around a stupid dongle. I understand they are trying to chase Apple's market, but expect them to fall flat with products like this. I can stand to have a few extra mm of thickness to actually have a usable product.

51nb (chinese forum) has been addressing the need for upgraded specs in the old chassis with things like their Thinkpad X62 https://imgur.com/a/As6On#uHTzOer (you can find the boards on ebay)

There's no products that fit the old ultraportable form factor. The T5xx series is great but can't be lugged around easily.

itaysk 1 day ago 1 reply      
I have recently upgraded from a 4-year-old X1 Carbon to the new X1 Yoga. Bottom line: I'd recommend the X1 Yoga over the Carbon.

Here is my original X1 Carbon review: http://blog.itaysk.com/2013/05/04/lenovo-x1-carbon-touch-rev... And here are my thoughts on the X1 Yoga: http://blog.itaysk.com/2016/12/30/from-lenovo-x1-carbon-to-l...

daxorid 1 day ago 2 replies      
354 comments for a constantly-refreshing 404 page ("Product 22TP2TXX15G does not exist")? Is anyone actually following the link or just commenting on the title?
sigil 1 day ago 1 reply      
My Thinkpad X1 Carbon 2nd gen is still going strong 3 years later, but this 5th gen is pretty tempting.

Main questions:

- Are these "up to 15.5 hours of battery life" numbers purely theoretical? Even brand new, my 2nd gen only got about 6 hours of normal usage. I run Ubuntu so maybe these power saving tweaks are the result of closer hardware integration with Windows?

- The mobile broadband option (Qualcomm Snapdragon X7) is intriguing, it would save me carrying around a separate hotspot. Has this worked for any Linux users? What carrier do you recommend?

- Any reports of the Wigig Dock working on Linux?

Pros that I can see:

- Real function keys! The adaptive function-key row on my 2nd gen is awkward and silly -- can't believe Apple followed their lead on this. Good riddance.

- That the 5th gen is even smaller and lighter boggles the mind. My 2nd gen is already ridiculously thin. After 5 years of hauling 5-8lb T4x models around NYC my back was killing me, and I was probably on the verge of permanent physical injury. The X1 was a godsend.

- Same old Trackpoint I know and love. Not for everyone, but a mouse on the home row + vim is ergonomic heaven for me. Never change, Thinkpads!

Cons that I can see:

- Looks like almost the same display as my 2nd gen, a 2560x1440 WQHD IPS. A small bump up in nits but that's it. Viewing angles are better than on older laptops, but I still can't read my screen in bright environments, and it gets gummed up with debris and smudges way too easily. Apple continues to dominate in this dimension.

- Matter of taste, but the Silver design feels like Lenovo is trying way too hard to look Apple-y.

piker 1 day ago 13 replies      
Has anyone installed Linux on this machine? If so, how has your experience been with respect to driver support, etc.? Thanks in advance.
dgl 1 day ago 5 replies      
They really should write that it has USB-C charging in bigger letters (see http://www3.lenovo.com/medias/ww-lenovo-laptop-thinkpad-x1-c... for the evidence).
reustle 1 day ago 6 replies      
As someone who is constantly on the road, having a built-in SIM card for connectivity is something I'm super excited about. This could be the 2016 MBP we wanted, but switching to Windows still sounds difficult.
arcaster 1 day ago 2 replies      
I absolutely adore my ThinkPad X201; if there were any way to purchase a pristine new-in-box X201, I would. However, I recently picked up a 3rd gen X1 Carbon and absolutely love it compared to my rMBP '15. Sure, it's not quite as fast and the screen isn't quite as good, but I can run a true tiling window manager (not the garbage that is KWM on OSX) and have an unfettered dev platform.

I run Fedora, I like it more than Ubuntu or Debian and like the fact that I don't have to worry about random underlying features breaking all the time when I update (which wasted a ton of my time when I was using Arch Linux).

KaoruAoiShiho 1 day ago 1 reply      
Why is the web design so amateurish? I really don't understand these companies that can't put proper priority on having webpages that look really good and luxurious. They spend millions on great industrial design but can't spend the thousands for a decent sales page.
tbrock 1 day ago 3 replies      
This looks like a very nice computer.

I really applaud dell for trying to make the XPS 13 somewhat Linux friendly but they really do feel cheap.

Lenovo thinkpads, while not having the quality they had during the IBM era, are still head and shoulders above Dell in this regard.

This is the Linux laptop to get in 2017.

faragon 1 day ago 3 replies      
It is almost perfect. I would love a real 4-core i7 with 8MB of L3 cache for laptops, instead of the 2-core (4-thread) parts with 4MB.

Edit: the 6th generation of mobile Intel CPUs has models with 4 cores and 8MB of L3 cache (i7-6970HQ), using 45W [1]. However, in the 7th generation there are only 2-core models [2].

[1] http://ark.intel.com/products/family/88392/6th-Generation-In...

[2] http://ark.intel.com/products/family/95544/7th-Generation-In...

jaymoorthi 1 day ago 1 reply      
All it shows me when I try to view models or customize is a message saying no models are available at this time.


I don't see a 5th-generation listing under Lenovo's product pages. Does anyone know what the release date will be?

Edit: closest hit I can find is a rumored launch at CES in a few weeks -- http://www.trustedreviews.com/news/leaked-new-lenovo-thinkpa...

karpodiem 1 day ago 0 replies      
It's not the sexiest laptop in the world, but my work Dell Latitude 7470 with an i7-6600U, 16GB DDR4 RAM, an NVMe drive, and a 1920x1080 screen is the best PC laptop I've used. Works great with the Dell dock, where I have two UltraSharp 21.5-inch monitors in 1080p, with the laptop screen flipped open; three usable screens.

I still (slightly) prefer my Mid-2014 13 inch Macbook Pro though, but it's very close.

schmichael 1 day ago 1 reply      
Ubuntu on a ThinkPad X1 Carbon is my workstation. As a Go developer I only need Chrome, a terminal, and Go itself (which I install from the official tarball binaries).

Now that I have Google Fi I'm kicking myself for not getting the cell radio builtin as Google will send you a data only SIM for free! I would never have to deal with another terrible airport or hotel captive portal again!

My coworkers are all Macbook Pro users and ask why I don't get one too. I just don't see the point. I think MBP's build quality is nicer but otherwise I get twice the machine for the same price and get to develop on the same OS as my project primarily targets (Linux).

nfoz 1 day ago 4 replies      
They didn't fully learn the lesson of the Superfish fiasco:

"X1 Carbon is available with Microsoft Windows 10 Pro Signature Edition. No more trialware or unwanted apps. No more distractions, and easy provisioning for IT pros."

So do that with every version, and stop feeding people "unwanted apps" altogether. You just admitted that nobody wants them, and this irresponsible attitude is exactly why you got bopped for pre-installing the Superfish MITM malware.

vxxzy 1 day ago 0 replies      
I have the 2016 model running Arch Linux. The machine is a great piece of engineering. Light as a feather, and feels durable. The carbon fiber body feels great in hand. Performance-wise, I have the i7 model with 16GB RAM and a 512GB NVMe SSD. The only slight negative would be battery life. I expected more, but it does last me a full 8-hour day of work with the screen dimmed. I'm looking forward to purchasing new in another year or so.
lobster_johnson 1 day ago 7 replies      
As someone who is considering moving from a MacBook Pro 2015 to a Linux laptop, the keyboard and, in particular, the trackpad on PC models are the biggest disappointments. Apple has spent so much effort on the tactility of their laptops, while PC manufacturers still seem to be stuck in an earlier decade. I was a little bit shocked to discover that Thinkpads still have that terrible little mouse nipple, which I hadn't seen in about 10 years.

The laptop with the most promising keyboard/trackpad combo, that I have found, is the Dell XPS (it has physical trackpad buttons, but at least they're located at the bottom), which is probably not accidental; it looks a lot like a MacBook, too.

LinuxFreedom 1 day ago 3 replies      
Please support the idea of building a better planet for our children and do not buy laptops where you cannot replace the battery. Managers at companies that build these kinds of products have to learn that they are acting against human interests and need to change their way of thinking.

Of course this applies to all laptops where you cannot change the battery.

Thanks and have a better 2017!

dchuk 1 day ago 1 reply      
I desperately want exactly this in a MacBook Pro: "The X1 Carbon delivers up to 15.5 hours of battery life. And if you're running low, the rapid charging feature provides 80% capacity in just an hour."

Specifically the rapid charging. Hell I'd be fine with current battery life (6-10 hrs depending on usage) if it could recharge fast.

Same goes for the phones.

hughes 1 day ago 0 replies      
Still running my 2012 x1 carbon... One thing that has kept it going is that it's very serviceable. I hope the new model is as friendly towards component replacement, as I've had to swap out the DC power harness and cooling assembly so far.
bluedino 1 day ago 1 reply      
I owned a first-generation X1 for a while. Eventually sold it because it only had 4GB of RAM, and there is not much of a source for the proprietary SSD. The battery was down to 3 hours, and I wasn't going to replace it if I couldn't put at least 8GB/512GB into it.

I really liked the Lenovo X1. From 2012 to now I've gone from the MacBook Pro to the MacBook Air and now the MacBook Pro Retina. With the Lenovo, the touchpad wasn't quite as good, the power adapter wasn't quite as compact, and the battery life wasn't quite as good (but all were good enough).

The screen and keyboard were very good. The trackpoint is a nice addition. The Mini DisplayPort worked with my 27" Apple Cinema Display without issues in both Windows and Linux (Ubuntu worked perfectly BTW). Build quality on the machine was great. Didn't run hot or anything, had all the ports that my Mac did.

Later on they sabotaged the function key row and ruined the touchpad. After a year of customer complaints they put it back, but the things were so expensive I just stuck with a Mac. If I needed a Windows/Linux machine however, they would be my first choice.

stuaxo 1 day ago 1 reply      
Now we just need to wait until Linux is installable on it.
c2h5oh 1 day ago 0 replies      
I wish HQ CPUs were an option (quad-core, not dual-core) - I work primarily with compiled languages.

I wish they ditched the SATA option for M.2 only and used the extra space for more battery (or cooling for an HQ CPU).

walrus01 1 day ago 0 replies      
Love the Thinkpad hardware. Hate Windows 10.

If something like the current generation Thinkpad hardware (T or X series) could run OSX, it would be what the Macbook Pro used to be... Remember the first generation Intel macbook pro in 2006 which had a full complement of ports? Everything relevant and needed except RS232.

garrettheaver 1 day ago 0 replies      
As someone looking to replace my 2013 13" MBP this is almost exactly what I want.

Enough has already been written about the new MBPs and why many of us will no longer consider them. I've already tried the Kaby Lake Razor Blade Stealth but returned it due to shocking quality and support issues. I considered the Asus Zenbook but it has too few ports to be a serious contender and the HP offerings all have screens with a lower resolution than I want.

I've always avoided Lenovo, partly because Apple were building machines I wanted and partly because my experience of Lenovo to date has been low-end, cheaper models which suck. I'm willing to give them a chance at the upper end with this, though. The sooner to market with this, the better.

therealdrag0 1 day ago 0 replies      
I've had the X1 Carbon 3rd gen (Refurbished) for a few years now as my personal laptop. I don't do much heavy lifting with it, mostly League of Legends, Counter-strike, and some side programming. But it's my favorite laptop I've used. The 14" size is perfect for me and I like the feel of the rest of it. Performs well and didn't break the bank.

(I've always owned windows PCs, though I've used MBP at work for 2 years now, and have had a MacBook Air that I resold because it was too small and didn't have a niche to fill after my ultrabook and my ipad.)

themihai 1 day ago 0 replies      
The real issue with non-Apple hw is the OS. Linux is great for development but sucks on pro media/audio support.
harry8 1 day ago 2 replies      
Asus ZenBook working well for me with Linux for the last 3 years. I've always had good Linux support from ThinkPads too. The Asus replaced an Apple MacBook Pro, the model where Apple shipped broken GPUs and didn't recall, so that one is a very expensive web browser that constantly panics and reboots, giving me the opportunity to write a sentence to Apple about how I feel about their miserable company that I'm sure nobody will ever read. "The Donald Trump of Computing Companies." Apple really are amazing, though. The top comment is an Apple fanboy desperate to continue to believe in Apple and affronted by the existence of other laptops, while other kool-aid drinkers upvote even though it's got stuff all to do with the ThinkPad X1 Carbon. Probably hasn't even got an Apple logo shaved into his head!
imafish 1 day ago 0 replies      
> Plus we managed to keep the 14" display in a 13" chassis. Now that's innovation.

Dell XPS put a 13" display in a 12" chassis and a 15" display in a 14" chassis. But a 14" display in a 13" chassis? Now that's innovation.

fiji-flo 1 day ago 0 replies      
Page is down :/

Here's an image of the page: http://imgur.com/szPrPuN

ElijahLynn 1 day ago 1 reply      
What happened to the Carbon touch? This would be my next computer except it doesn't have touch. As a developer I use Chrome dev tools emulator with my Thinkpad W510 to perform touch testing. Plus it is fantastic for annotating presentations (Ubuntu, Compiz Annotate plugin).
matt2000 1 day ago 1 reply      
Does Lenovo's history with malware give anyone pause when considering this laptop?

I'm just interested in general - I'm a Mac owner now, but used to be a happy thinkpad owner back in the day.

kylebenzle 1 day ago 0 replies      
I spent a year waiting for the Yoga X260, thinking I had found the perfect machine; after 11 months I am getting ready to send it in for the 3rd time for a "freezing" trackpad, remedied only by plugging in an external mouse. Also, the lack of Linux support is insane for a company selling a working man's machine like this. Going back to Dell after this :(
rzhikharevich 1 day ago 1 reply      
Am I correct that "Signature Edition" means no Linux?

Source: https://www.reddit.com/r/linux/comments/53ri0m/warning_micro...

ge96 1 day ago 0 replies      
These look nice, I'm only able to afford like the first generation but I look forward to buying that.

If only they were fanless.

rootme 1 day ago 0 replies      
I have a gen 4 X1 Carbon, fully specced: 16GB RAM, 1TB drive, all in. It's the best machine from last year.
smoyer 1 day ago 0 replies      
I have a second generation X1 Carbon and I absolutely love this computer!
gaspoweredcat 1 day ago 0 replies      
I'd upgrade to one of these in a heartbeat if I could afford to, but right now that's not an option. Thankfully my current gen2 X1 Carbon is still an excellent machine that can handle most everything I throw at it.
noobermin 1 day ago 1 reply      
What do these usually go for? Does anyone know?
n0us 1 day ago 0 replies      
Anyone know if this will have precision touchpad?
rch 1 day ago 1 reply      
> we managed to keep the 14" display ... in a 13" chassis. Now that's innovation.

Indeed. I'm not sure what they mean here.

coin 1 day ago 1 reply      
The trackpad is off center (to the left), what's up with that?
mrkoolaid 1 day ago 0 replies      
... But if you drop it won't it still rattle loose?
hbcondo714 1 day ago 2 replies      
> WQHD IPS (2560 x 1440) 300 nits

Is this their OLED screen?

wineisfine 1 day ago 0 replies      
Looks solid. I like the Snapdragon included.
desireco42 1 day ago 1 reply      
I think it is super important to have an honest discussion between developers, with as little flaming as possible, about the machines we are using.

I am using a 5-year-old MacBook Air for development, and it works well. I always imagined that I would use a ThinkPad with some Linux distribution in the future. But it is not trivial to use Linux on your laptop: while a lot of things will be faster, the browser most likely will be slower and there will not be as many tools. What I would mostly miss is photography tools; Lightroom is essential for processing photos, and I can't not do it.

Let me add one more thing to this fairly random post :). I think the X1 is better than the MacBooks and Airs at the moment; the docking station makes a lot of difference.

vacri 1 day ago 2 replies      
I have an X1 gen 3, and it has a dongle for the ethernet port. Given the thickness of it, I don't see how this new gen machine can hold a 'native rj45'. Perhaps it's one of those hinged ones that opens out?
joshuaNathaniel 1 day ago 1 reply      
ck__ 1 day ago 1 reply      
will it bend?
Roritharr 1 day ago 2 replies      
Only 16GB, and no quad-core option (I'd take an underclocked one at this point :( )... I need more threads and RAM on the go. Why is that not an option outside of the Alienware 13 monstrosity?
Why I close pull requests jeffgeerling.com
292 points by geerlingguy  3 days ago   179 comments top 15
yegle 3 days ago 3 replies      
At Google, if you want to implement new features (or large refactoring), you'll need to write a design doc, in which you should answer the questions your reviewers might ask (common ones like: why do you want to do this, what are the alternatives, how do components interact with each other before/after your change). This is something like Python's PEPs: you need a proposal to convince your reviewer that you have put thought into your change.

Real world examples of these design docs can be found at https://github.com/golang/proposal

makecheck 3 days ago 9 replies      
Also, don't send requests out of the blue. The original maintainer has to know that you're working on something. One reason is that your changes might collide spectacularly with other planned changes you weren't aware of. Another reason is that the maintainer might say no to the entire idea, much less the implementation, and save you time.

The mere creation of a fork isn't a sufficient signal, either; the project maintainer isn't going to treat that as a sign that you're actually working on something. (There seem to be an insane number of forks out there that are created and never changed again, apparently used to pad résumés by having important-sounding projects listed on user profiles.)

sytse 3 days ago 1 reply      
Wow, managing over 160 projects? I can imagine that he has to close quickly.

At GitLab we have a written down definition of done so people know what should be in their merge request, see https://gitlab.com/gitlab-org/gitlab-ce/blob/master/CONTRIBU...

And our merge request coaches try to get people over the finish line instead of closing. But those are full-time people on a single project. Maintaining 160 projects is a whole different ballgame.

fourthark 3 days ago 3 replies      
I know it's not best practice, but I leave them open. For years.

They may be fixable, they may be useful to someone. I've no need to reject them unless I really think they're a bad idea.

I'm sure this can be frustrating to users and contributors, but I also see it as a way of encouraging forks. "I haven't had a chance to review this, but you might try PR #NN..."

The most useful ones get replaced by better versions by other people, and eventually merged.

Sir_Cmpwn 3 days ago 1 reply      
I also maintain a great number of projects (perhaps more) and generally I give feedback on why a PR is unacceptable and leave it open until it's resolved. Sometimes I'll close it a year or two later.
dclowd9901 3 days ago 1 reply      
You know what would be cool? If I could create a fork of the project I was using, then write a feature I need into it. It becomes a PR, but the fork is also automatically (if possible) updated whenever the main branch is updated. If it can't be updated automatically, you are notified to update your fork against the upstream changes. This would have many benefits, including easy testing of PRs, forks that don't go stale, and overall helping close the loop on open PRs.
radarsat1 3 days ago 0 replies      
One thing that I find different about GitHub versus before (e.g. on SourceForge), when you had to sort of sign up to be part of the group to propose a change, is that people feel a lot more free to suggest, out of the blue, quite impactful but ultimately rather superficial changes to a project. They argue and argue to get these changes accepted, and then disappear.

On several projects I've been on, I get issues or pull requests proposing to change the entire build system of a project. As you know, for C/C++ projects, the build system can be non-trivial, and maybe many years have gone into getting it to work well. And as things change, we adapt it. But as soon as it's not the flavour of the week, you get github requests suggesting to change to a completely different one, to suit some or other system's needs.

A tweak here, a tweak there, or an entire overhaul being proposed from people who haven't contributed to the actual code base at all. These infrastructure "suggestions" from people who aren't invested in the project but love to play with scripts built up "around" the code get very annoying. I don't know what the difference is exactly but it didn't happen with such frequency when things were more oriented around mailing lists.

I've now got 3 projects that have at least two build systems each, because of random people's preferences. That is a lot of extra work to maintain that is orthogonal to the actual project source code. I've started closing PRs that make infrastructural changes that I don't want to be responsible for, unless I can get the submitter to promise he'll be around for a while to maintain it. I've also started forcing people to put such changes in subfolders so that it's clear which one is the "supported" system. And I haven't shied from "assigning" subsequent bugs back to the original PR submitter. But sometimes that doesn't even solicit a response.

People: if you are going to suggest switching a project to a completely different build system, and then disappear and not promise to maintain said system, please think twice about changing something just because it doesn't suit your preferences of the week.

digi_owl 3 days ago 1 reply      
Between the responses here and on the more recent Chrome for business posting, I find myself wondering if there is an ever-widening split between the "push to prod" web dev mentality and the "classic software" mentality.
dmuhs 2 days ago 1 reply      
This article gives a nice view for someone like myself who hasn't had much experience with OSS development. While I'm kinda familiar with CI systems and the concept of coverage, could someone explain what the author means by "happy path" coverage? Is that considered the most-used path in standard behaviour?
emmab 3 days ago 0 replies      
If a given PR would be acceptable except for its maintenance burden, and you do not expect the PR submitter to provide sufficient help with maintenance to compensate, you could request the difference from them as payment for acceptance of the PR.
j1vms 2 days ago 0 replies      
The article brings up a lot of great points. Though people do need to be mindful lest they be writing a follow-up, "Why my project got forked, and I am now sidelined."
Marazan 3 days ago 1 reply      
The fact that they have to state they won't accept pull requests that break the build astounds me.

Who is submitting pull requests that break the build? That is akin to trolling.

caconym_ 3 days ago 3 replies      
Just stopping by to point out that there is a typo in the first sentence of the article, I think: in "I maintain over many", the "over" should not be there.

Now I will read the article. :)

realstuff 3 days ago 1 reply      
saw-lau 3 days ago 0 replies      
Upvoted for introducing me to the term 'bus factor.'


PS4 hack: Fail0verflow demonstrate Linux and Steam running on Firmware 4.05 wololo.net
253 points by loppers92  17 hours ago   106 comments top 11
faragon 13 hours ago 1 reply      
Amazing work. Hector Martin (@marcan42) is an incredibly talented hacker. I still remember his post on enabling the hardware virtualization of the CPU in his laptop, an Acer Aspire 8930 [1], which had it disabled in the BIOS in a way that was not user-serviceable (I had a similar laptop with a smaller screen, so his post was useful for me). His hacks and comments in a Spanish forum [2] about the PS2, Nintendo consoles, and others were also full of insight. Then the PS3 hack. And now, getting Linux working on the PS4, even with 3D acceleration (without help, just with the few specs found on the web (!!!)). It is mind-blowing :-)

[1] https://marcan.st/2009/06/enabling-intel-vt-on-the-aspire-89...

[2] https://www.elotrolado.net

notyourwork 16 hours ago 10 replies      
Every time console hacking comes up, I start to wonder how well a manufacturer would do if their next console were open. Would they see a decrease in legitimate purchases if the console were open for hacking and exploitation? I would think it would be a lot like the PC game market, which as far as I can tell is still thriving today.

So assuming there is no economic impact, what is it that makes us want to lock down consoles (and similarly cell phones) when we do not do the same to the personal computers we hold so dear? It is a fascinating story that I suspect is due to timing and when devices hit markets, but I'm curious what others think about this.

0x45696e6172 16 hours ago 1 reply      
Flammy 16 hours ago 4 replies      
Anyone remember when PlayStation 3 could be used to install whatever OS you wanted without any jailbreaking? sigh


unicornporn 2 hours ago 1 reply      
PS4 is x86. Would it be possible to eventually run Windows on these machines, making it a cheap Steam gaming computer?
shasheene 11 hours ago 4 replies      
Marcan mentions FreeBSD is not a particularly secure OS.

My understanding is the BSDs have a reputation for being more secure than Linux. Is this not the case?

StavrosK 16 hours ago 1 reply      
What's the CCC33 event?
jokoon 6 hours ago 0 replies      
I wonder if one day there could be some kind of law forbidding manufacturers from restricting what software runs on the hardware they sell. It seems like an anti-competitive practice.
rasz_pl 1 hour ago 0 replies      
Seems PCIe is the next big thing in dumping firmware: first iPhones, now the PS4.
agumonkey 16 hours ago 1 reply      
First time I've wanted to own one.
shmerl 14 hours ago 0 replies      
What about running some demanding games like The Witcher 2?
The Original Postal Has Been Made Open Source runningwithscissors.com
237 points by davemo  2 days ago   76 comments top 18
1wd 2 days ago 1 reply      
"the CDude class, which is the main, player-controlled character in the game."

"CDoofus is the class for the enemy guys [...] Started this file from from CDude and modified it to do some enemy logic using the same assets as the sample 2D guy."

"COstrich is the object for the ostriches wandering about in the game. class COstrich : public CDoofus"

Hilarious. Reminds me of CBruce in Tony Hawk's Pro Skater: "Their code [...] originally written for Apocalypse [...] a Playstation game featuring Bruce Willis, which, we learned, is why in Tony Hawk the code for the classes of skaters is called CBruce."


"Project: Nostril (aka Postal)"

Any idea what RSPiX Blue, Cyan, Green and Orange layers are?

j_s 2 days ago 2 replies      
POSTAL Redux is currently $1.43 (84% off) on Steam; reminds me of Contra. Reviews say the only unexpected down side is load times; hope they can tighten that up.


Edit: I think the big "feature" is co-op.

joaodlf 2 days ago 1 reply      
This reminds me: I actually had some contact with the guys behind RWS many years ago. I was a young man, very excited about HTML/CSS and design in general, and I ended up designing their forum (it was running on IPB): http://tinyimg.io/i/5GTDNgx.jpg

Postal 2 is one of my favourite games ever. Had a blast in both single player and online. Me and my friends still talk about it after all these years!

qwertyuiop924 2 days ago 3 replies      
Wow. Now I can play it on my Linux system. Thanks, RWS.

This, as well as being a fantastic gesture, makes it seem as though RWS understands something that many of its contemporaries don't: Games, or rather, their engines, must be open-sourced for those games to continue to be playable and relevant. You can't update your games forever, and sooner or later, they will be rendered unplayable by the inexorable march of technology. If you open source your engine, that doesn't have to happen.

Take a look at Doom. New content for Doom 1 and Doom 2 is still being released by the community, long after competitors like Duke3D have stopped. Why? Because Doom has a passionate community, and many modern, open source engines that make running the game on new systems a piece of cake.

cr0sh 2 days ago 1 reply      
I was pleasantly surprised by the amount of useful comments in the code.

I remember when the source code for Descent was released; not only was the code somewhat opaque (unless you were experienced with portal-style engines), but there were hardly any kind of comments to help guide you along.

pyromine 2 days ago 13 replies      
This is completely off-topic, but in the vein of talking about old games. Does anyone have any suggestions for games that scratched the same itch as the old school RTS games like Age of Empires and Rise of Nations?

I've been searching for years but never found anything that eclipses the classics.

stygiansonic 1 day ago 1 reply      
There are some interesting notes about how the multiplayer was originally implemented:[0]

"Once the game is running, everything is peer-to-peer. The only information the peers send each other is the local players' input data, which is encoded as a 32-bit value (where various bits indicate whether the player is running or walking, which direction, whether the fire button was pressed, etc.). No position, velocity or accelleration data is transmitted. NOTHING else is transmitted."
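The quoted scheme can be sketched with a bit of bit-packing. The field layout below is hypothetical (the notes don't give Postal's actual bit assignments), but it shows how run/walk, fire, and a direction angle fit into a single 32-bit value small enough to send every frame.

```python
# Hypothetical field layout for a per-frame input word; Postal's real
# bit assignments aren't given in the notes, so these are illustrative.
RUN_BIT   = 1 << 0   # running vs. walking
FIRE_BIT  = 1 << 1   # fire button pressed
DIR_SHIFT = 2        # direction as a 9-bit angle (0-359) in bits 2..10
DIR_MASK  = 0x1FF

def pack_input(running, firing, direction_deg):
    """Encode one frame of local player input as a 32-bit word."""
    word = 0
    if running:
        word |= RUN_BIT
    if firing:
        word |= FIRE_BIT
    word |= (direction_deg % 360) << DIR_SHIFT
    return word & 0xFFFFFFFF

def unpack_input(word):
    """Decode a received input word back into its fields."""
    return (bool(word & RUN_BIT),
            bool(word & FIRE_BIT),
            (word >> DIR_SHIFT) & DIR_MASK)
```

Because each peer feeds the same stream of these words into an identical, deterministic simulation, no position, velocity, or acceleration data ever needs to go over the wire.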

In order for this to work, they had to make sure all memory was initialized to the same values, so that each client had the same known starting state. They also had to use the same PRNG, initialized to the same state, so that a known, deterministic pattern would be produced. But eventually they ran into a problem that couldn't be solved in software: The FPU of different CPUs would not return the same results for the same inputs:

"However, dispite our best efforts, there was still a serious flaw lurking behind the scenes that eventually caused serious problems that we couldn't work around. It seems that different Floating Point Units return slightly different results given the same values. This was first seen when pitting PC and Mac versions of the game against each other in multiplayer mode. Every once in a while, the two versions would go out of sync with one another. It was eventually tracked down to slightly different floating point results that accumulated over time until eventually they resulted in two different courses of action on each client. For instance, on the PC a character might get hit by a bullet, while on the Mac the same character would be just 1 pixel out of the way and the bullet would miss. Once something like that happens, the two clients would be hopelessly out-of-sync."
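The desync mechanism the notes describe is easy to reproduce in miniature. The toy below (not Postal's code) runs one deterministic update rule twice: once in full double precision, and once rounded to IEEE-754 single precision after every step, standing in for a second FPU that returns slightly different results. The accumulated positions drift apart over many frames.

```python
import struct

def to_f32(x):
    # Round a double to IEEE-754 single precision, standing in for a
    # second machine whose FPU keeps slightly different precision.
    return struct.unpack('f', struct.pack('f', x))[0]

def simulate(steps, f32=False):
    # One deterministic update rule, identical on both "clients";
    # only the per-step rounding differs.
    pos, vel = 0.0, 0.1
    for _ in range(steps):
        vel *= 1.0001
        pos += vel
        if f32:
            pos, vel = to_f32(pos), to_f32(vel)
    return pos

pc_client = simulate(10_000)             # full double precision
mac_client = simulate(10_000, f32=True)  # rounded after each step
drift = abs(pc_client - mac_client)      # grows with the frame count
```

The per-frame error is tiny, but it compounds; the moment the drift pushes a hit test onto a different pixel on one client, the peers disagree about whether a bullet connected, and since only inputs are exchanged, they can never reconcile.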

0. https://bitbucket.org/gopostal/postal-1-open-source/src/defa...

crack-the-code 2 days ago 0 replies      
Awesome! Now just give me Diablo 2, and I will be content for life.
mixedbit 2 days ago 1 reply      
Does the release include the original game assets, or do assets need to be purchased (like with Doom releases)?
ionised 2 days ago 0 replies      
Many a fun time I had with the Postal level editor, placing ostrich-dispenser-dispensers around the map in great number and watching the ensuing carnage after a few well-aimed napalm canisters.
foxhop 2 days ago 0 replies      
I still have my original copy from 19 years ago, when I was 12 years old! : )

picture: https://twitter.com/RussellBal/status/814612135150055424

Insanity 2 days ago 1 reply      
Reading this, as well as the comments in this thread makes me want to play the game again. Might fire it up over the weekend and I will surely go through the code for a while!

It was a nice late-christmas present ;-)

nercury 2 days ago 0 replies      
Reading the source code of this sure is sobering. All files and code lines are there for a single purpose: to ship the game.
BillinghamJ 2 days ago 0 replies      
Misread the title as saying "Portal"! Pleasantly surprised however - I never played Postal.
aaronsnoswell 2 days ago 1 reply      
I thought the title said 'Portal', not 'Postal'. Disappointed :P
Endy 2 days ago 0 replies      
Is there any word on a DOOM-style sourceport happening?
Circumnavigate 2 days ago 0 replies      
This game is one of my favorite classics.
grendelt 2 days ago 0 replies      
"I regret nothing."
From Secretary to Software Developer: The Hard Way medium.com
255 points by mathchick  3 days ago   143 comments top 18
ChuckMcM 3 days ago 6 replies      
The thing that struck me is "anyone who is fascinated by computers and spends all their free time playing with them can be a developer."

Back before developers were perceived as 'rich' and 'pampered' there were people who were fascinated by computers and spent all their time playing with them and were called 'nerds.' Then it became "cool" to be a developer or "you can get rich as a developer at a startup!" and then you get people who don't care at all about computers and really never have, working as developers.

My litmus test is often to ask someone, when they show me a solution, "what other solutions did you consider?" If they have wandered around looking at different ways to attack the problem, they are more typically 'nerd'-type developers; if their response is "none, this works so I went with it, moving on," they are often just working a day job. Watching the two types of people from the late 90's to today, the people in it for the money burn out much more frequently.

jakobegger 3 days ago 7 replies      
A bit of background: in Austria, many people do an "Apprenticeship" ("Lehre") instead of going to high school. You work at a company and visit a vocational school (about 20% of time).

This is great for practical people -- less theory, more real world experience. But there is a major downside: If you didn't go to high school, you are not allowed to go to university without first completing preparatory courses that can take years.

There is also an upside: If you've worked for at least 4 years, and are under 30 years old, you automatically qualify for "Selbsterhalterstipendium", which is around €700 per month to cover your cost of living while studying at university (you don't have to pay this back, and there also is no tuition)

hillz 3 days ago 2 replies      
It's kind of a bummer that she knew she wanted to work with computers the whole time but her parents thought it would be more responsible to be a secretary. Glad she got there.
mi100hael 3 days ago 6 replies      
Another good example of someone learning to program on their own because they wanted to and then leveraging that experience to get a job doing it professionally. The big secret to learning to program is that there is no big secret. It's basically a glorified trade job and everyone already has the tools in front of them.
nickpsecurity 3 days ago 1 reply      
That was a great read. Held back by her parents' preferences into a secretary position, starts experimenting for fun/laziness (many great works started that way), keeps improving, fights the fight in college, and is now at SAP working with serious tech. Congratulations on making it to the finish line, Denise!

So, you've worked from Excel to GUIs to databases to web stuff. Do you plan on trying a new paradigm of programming, or what's the next level?

Note: Also cool you did karate on the side. I got my start in DOS apps (QBASIC), doing Windows apps in VB6 in mundane, forced position, and karate on the side. Built new things in between assignments, including learning heavyweight stuff, because I was bored with VB or too lazy for some tedious task. The similarities in where we started probably added to my enjoyment of it. Also, I learned a new way to do a frown in text. I'm sure some tech project or new JS framework on HN will give me a use for it in near future. ;)

farhannyc 3 days ago 4 replies      
I don't think you can be a developer in 8 - 12 weeks, as mentioned in this article. Software Development is a skill as much as anything else, and there is no time frame. All you can use to assure yourself is if you have practice, and the confidence in yourself by that practice. For some people that confidence comes after a year, maybe even two years. But then again, that confidence can even come in 2 months.
pfarnsworth 3 days ago 1 reply      
That's a great transition, congrats to the author. My wife's mother also went from a secretary to the COO of a multi-billion dollar real estate company. Similarly, the current CEO of Xerox, Ursula Burns, was an executive assistant at Xerox. Although rare, it seems like things like that happened a lot more often before than now. I'm not sure if it means that as a society we have more opportunity or that we are more pigeonholed in our careers. Maybe it's the free-agent nature of our employment these days, but I don't picture execute assistants these days ever getting the opportunity of jumping into something completely different and rising to the rank of C-level.
49531 3 days ago 1 reply      
> There are also a lot of developer bootcamps: within 8–12 weeks you can become a developer. I think this is great if you want to become a developer within a small agency or working in house. Those fast tracks mainly teach you how to code, but not other important stuff like software engineering, algorithms and data structures, patterns, databases, theoretical stuff about computers and so on which you would need in bigger projects. Bigger companies mostly want you to have formal education. The same is true when you want to climb up the corporate ladder. Universities don't really teach you how to code, but they teach you timeless things! I never regretted my hard way, because I learned so many different things.

While it's true that a lot of organizations still put a lot of value on a traditional education, the idea that going to a bootcamp qualifies you for work "within a small agency or working in house" just seems condescending. I work for a fortune 500 company and we hire bootcamp grads all the time, many of them have gone from apprentices to junior to mid level engineers in just a couple years, they're fucking fantastic.

I strongly feel that getting relevant applicable skills is essential to starting a career in software engineering, and that more theoretical skills can then be acquired along the way. I've seen it several times.

xb95 3 days ago 1 reply      
Another Delphi person!! Yay! I spent so much of my life writing Delphi code. As a teenager. Basically from 12-18 I wrote Delphi/Pascal. Hundreds of stupid little Windows apps (and some stupid big ones).

I ran into the same sort of thing you did re: Delphi jobs. I was pretty sad, honestly, having started in the original Delphi days (version 1!) and going to 6 I had gotten pretty good at it...

cyberferret 3 days ago 1 reply      
Well done. Interesting that you couldn't get a job coding in Delphi after you won the competition. I remember back around the same time, here in Australia there used to be quite a few Delphi related jobs around. Perhaps it was different in Europe.

Did you consider writing a stand alone app in Delphi that you could package and sell?

fencepost 3 days ago 0 replies      
My mother worked with someone well above this woman's age who went from basically the department admin (admittedly, in IT with a programming group) to what was apparently a pretty solid Lotus Notes admin, though she did a bit of job hopping in the process before ending back at the same company where she'd started.

Many administrative jobs probably offer a variety of paths that could lead to this. The person working can be someone who does the job as presented to them, or they can be the person who finds out what's needed and figures out the way to do it, learning along the way. An awful lot of programs are written because someone with the skills wants to automate something they find boring.

jorblumesea 3 days ago 0 replies      
I really don't think the 8-12 week code camp means you are a truly competent developer. Sure you can hack around on x or y js framework of the month. But data structures, algos, big O...all of that comes into play at some point as a software engineer. You don't use it every day, or even every week. But it does happen.

And tbh, you can really tell the difference in quality between a code camp candidate and a 4-year-degree candidate. We hired a code camp candidate, just to see how it played out. It didn't work that well.

gravypod 3 days ago 1 reply      
Where was this "University of Applied Sciences", and how do I get in?
agumonkey 3 days ago 1 reply      
And you have to avoid Impostor Syndrome the hard way?
relics443 3 days ago 8 replies      
TL;DR there are a lot of incompetent developers out there.

This isn't a comment about the author, as much as it's about something she said.

"Today you can take a lot of programming and Computer Sciences courses online. Everyone can be developer! There are also a lot of developer bootcamps: within 812 weeks you can become a developer."

This is a very dangerous line of thinking. Some people have convinced themselves that they are competent developers because they went to a bootcamp. And they might have just enough domain knowledge to convince a company with poor hiring practices that they're worth hiring.

I inherited a situation like that (this dev was hired a few weeks before me). After a few weeks it was painfully obvious that this guy was a detriment to the company because of his lack of coding ability. For reasons above my paygrade, we couldn't fire him immediately, and eventually we took all responsibilities away from him. We paid someone to come in and not do work for us.

I've interviewed dozens of developers since then. The ones coming from a bootcamp (or similar situation) have no computer science skills. They also have no problem-solving skills; they're unable to break out of the box they were taught in. Most companies can't afford to hire a developer who knows one thing, and one thing only.

Now, we've had 4-year university graduates with experience in the field come in from top schools with degrees in CS. A (scarily) large percentage of them are incompetent as well, though not to the degree of the bootcampers. They're typically serviceable though.

smnplk 3 days ago 0 replies      
I hate it when people use the term "coding" instead of "programming".
xchaotic 3 days ago 0 replies      
Sorry if I misread that as a movie title.
muninn_ 3 days ago 1 reply      
Ok. So I started reading this and just couldn't. Too many dash-comments, and I just couldn't follow the article's flow. Kudos to this person for putting in the effort to do what they want. I just can't get over the writing style.
       cached 1 January 2017 16:11:01 GMT