hacker news with inline top comments    .. more ..    9 Sep 2016 News
Costa Rica has gone 76 straight days using 100% renewable electricity vox.com
272 points by denzil_correa  5 ago   143 comments top 10
eggy 4 ago 0 replies      
Costa Rica also sits on the Ring of Fire like I do here in East Java, so geothermal is accessible there.

Costa Rica did not have to maintain a standing army to protect its borders thanks in large part to relying on treaties and understandings with the US. This frees up some of their GDP for R&D into sustainable energy.

I wonder what the demand for electricity is in Costa Rica today compared to when I was last there in 1993. I was going to buy land then, but I was a bit wary of new laws on foreign-owned properties. It would have cost me a lot to bribe/pay my way to getting power distributed to my land. I was going to use a generator and solar panels anyway. In my travels there back then, many people kept guns to defend their property from bandits. I'm sure it has changed a lot. Selva Verde was my first 'eco-tourist' experience, and I remember talking to some of the workers there, and the owner's daughter. I am glad to see the business model took off. It's a beautiful country.

jfaucett 3 ago 18 replies      
Notice this is because 80% of the energy obtained was from hydroelectric power, which is a great energy source because it is reliable, unlike wind and solar, which in this case provided 7% and 0.01% respectively and are not reliable. It frustrates me that so many environmentalists are still against hydro because it alters ecosystems, when nature itself is inherently violent, dangerous and in a constant state of flux. I sometimes think they reject the idea that the whole point of existence is human flourishing and happiness. The biggest real problem with hydro in the context of human flourishing is ensuring that the dams are well-constructed and nature-proof, i.e. hold up against natural disasters, because if they don't, downstream settlements and human lives can be at risk.
JoeAltmaier 3 ago 3 replies      
" its per capita electricity consumption is about one-quarter of, say, France or Belgium."

It's in the bottom quartile worldwide, I believe.

Also the recommendations for the USA are way off. The "only way" to reach this goal is apparently to build lots of wind and solar. No mention whatsoever of nuclear, which is by far the more practical solution. And far less ecologically damaging.

jbrun 1 ago 3 replies      
Quebec has gone about 50 years.
mbloom1915 24 ago 0 replies      
As great as this is, and a large step for other countries to follow very soon, decarbonization can only be achieved through transport efforts. Transport makes up over 1/3 of emissions, so while residential/commercial energy efficiency is great, it is a tiny fraction of achieving the overarching goal.
m_mueller 31 ago 0 replies      
If anyone is wondering about the worldwide big picture of where electricity is coming from: I made this map / spreadsheet:


edpichler 4 ago 1 reply      
Very good, this is a great step, but one thing should be remembered: hydroelectric energy is renewable and better than coal, oil, etc., but it still kills a lot of animals and floods large areas, wiping out waterfalls and changing landscapes.
jefurii 1 ago 2 replies      
Feels like the article was written by a pro-renewable reporter and edited by an oil-company exec. "Yeah they did it but you should be discouraged instead of hopeful."
sickbeard 2 ago 1 reply      
Same story every month. Can't wait for the 100 day update
taneq 2 ago 2 replies      
Objection: Geothermal technically isn't renewable.
People from Mexico show stunning amount of genetic diversity sciencemag.org
51 points by oscarwao  2 ago   21 comments top 4
jpablo 45 ago 5 replies      
As a Mexican I find the comment section nitpicking every word of the article as racist very funny. Americans have really thin skin. Political correctness is indeed drowning the exchange of ideas!
betolink 24 ago 0 replies      
I remember a little while ago my dad went to an academic lunch with Dr. Craig Venter in Chiapas (MX), where he talked about sampling DNA from the Americas to register genes that are endemic to the region and could potentially help medical research.
sandworm101 29 ago 3 replies      
Why the surprise? Mexico is part of a land bridge between two continents. Both have been invaded multiple times by distant populations. It's next to the Caribbean, a host of islands which are known to accelerate genetic changes. With all these peoples coming and going I'd be shocked if it wasn't so diverse.
BurningFrog 27 ago 1 reply      
My understanding is that all Native American populations descend from a relatively small and homogeneous population that walked into Alaska 16,000 years ago.

Which means that they're much less genetically diverse than the rest of the world.

This is hard to reconcile with the claims in this article.

Google's DeepMind Achieves Speech-Generation Breakthrough bloomberg.com
28 points by jrcii  1 ago   16 comments top 6
e0m 5 ago 0 replies      
Here's the link to the paper: https://drive.google.com/file/d/0B3cxcnOkPx9AeWpLVXhkTDJINDQ...

And the WaveNet site with audio samples: https://deepmind.com/blog/wavenet-generative-model-raw-audio...

The comparison against state-of-the-art parametric and concatenative methods is pretty mind-blowing.

Particularly listen to the music samples. That's a generated piano piece that sounds quite musical.

They even include breaths and other auditory signals that really make for a convincing speech sample.

Xcelerate 14 ago 4 replies      
This raises some interesting questions. I've always thought that recording conversations would be sufficient "proof" of what someone said. It seems that soon audio alone will not be sufficient; video will be necessary as well.
sharemywin 1 ago 1 reply      
Always wondered if something like this could be done. Now just train it on a famous individual's wav data and you've got a free celebrity endorsement.
inputcoffee 15 ago 1 reply      
Really this is natural text-to-speech. The headline makes it sound like it's thinking the thoughts it speaks.

16,000 analyses per second. So, given Moore's law, around the iPhone 9?

Recreating Our Galaxy in a Supercomputer caltech.edu
110 points by chmaynard  5 ago   35 comments top 7
throwaway_yy2Di 4 ago 0 replies      
Apropos, the Gaia space telescope's first data release is next week. It's an extremely large dataset of the kinematics (3d position + velocity) of Milky Way stars.


hossbeast 1 ago 0 replies      
I'm always a bit uncomfortable when I read about findings from these simulations. To say they have "answered questions" about galactic evolution seems too strong a statement. How about "created a more accurate model"?
mikekij 1 ago 1 reply      
It'd be pretty cool for us to discover that there are tiny virtual humans in that simulation, complete with consciousness, contemplating their own existence.
nsxwolf 55 ago 0 replies      
I thought the Milky Way was a barred spiral.
fiatjaf 4 ago 3 replies      
Nice, you can't model a worm, but you can model a galaxy.
starmole 4 ago 4 replies      
Can somebody who knows about this tell me how much of the novelty in this is CS and how much physics? For example in graphics all the physics of light is well understood - it's just infeasible to compute - so all advances are in CS and how to sample better. Is this the case here? Or is it adding new models?
jessriedel 4 ago 0 replies      
For context, this question of whether vanilla dark matter models are in conflict with galactic observations has been raging for many years. (It includes both the dwarf galaxy problem mentioned in the article, and things like the "core-cusp" problem concerning the radial dark matter density distribution.) These numerical simulations are fiendishly complicated, and there's always a risk that you just keep adding new tweaks or effects until you get the results you're looking for. I'd be interested in reading a skeptical assessment from the folks on the other side of this question.
Radiobox.css Tiny set of CSS3 animations meant for your radio inputs 720kb.github.io
160 points by 720kb  7 ago   78 comments top 21
bertan 4 ago 2 replies      
I love these :D A little bit childish, yes, but why so serious? :)

One can use these, for example, as an easter egg or to add some emotion to the selected item. Sparingly, of course. Surprise people when they are bored of all those forms :)

I think you should keep a list of all the sites using these, so that people can get inspired.

blowski 5 ago 5 replies      
Why all the hate? It's not like you have to use these everywhere just because they exist. The fact that it's possible is good to know as one day it might be a good solution for a problem I'm facing, and then I'll be happy that it exists.

A couple of use cases I can think of:

* Attracting attention to options in a demo

* A game for kids, or otherwise 'wacky' pages

If we restricted web browsers to what developers think we should use, we'd still be using plain green text on black terminals.

pbhjpbhj 3 ago 3 replies      
Meta, question for 720kb.

I'm on mobile and wanted to zoom but you've used this in the demo page:

 meta name="viewport" content="initial-scale=1, maximum-scale=1, user-scalable=no, width=device-width"
I'm curious why? The hit targets were too small for me initially, hence the desire to zoom.

On topic: I like them, used sparingly or in the right context I think they'd be very good. Thanks for sharing.

echelon 23 ago 0 replies      
These are fantastic! I'm totally going to make use of this. Any chance you'll do the same for other input elements? :)
GrumpyNl 4 ago 2 replies      
Nice effects, thanks for sharing. To the haters, share some of your stuf.
jpfed 2 ago 0 replies      
I like Focus! The others are a little bit much for me, but maybe I've been doing government websites for too long...
720kb 5 ago 1 reply      
Actually, there is nothing to complain about: if you need it you use it; if you don't need it you don't use it. Please be constructive.
ThomPete 1 ago 0 replies      
These are great but to make them really useful I would look into creating a series of more subtle animations.
edent 6 ago 2 replies      
Those are technically very impressive. My only issue is that my mouse covers most of the animation - and there's nothing to indicate when a button is unchecked.
bigblind 5 ago 3 replies      
Animation should convey information. Otherwise, it's just noise. I have no idea what information these animations could carry.
duiker101 6 ago 1 reply      
Nice to see but please.... don't ever use this in an actual website.
rbobby 2 ago 0 replies      
It looks really weird when the page loads and the radio button is already checked.
budhajeewa 6 ago 2 replies      
But why?

What about the inconsistency factor? If my radio boxes get animations, all other form fields should get them too.

If not, no one should get them.

You know how it's with kids.

rekshaw 6 ago 2 replies      
Nice! I sent a pull request for a new animation called Ping. It is a simple (read: boring) animation for the more stuck up websites.
taneq 2 ago 0 replies      
This seems to be broken if you're running PrivacyBadger.
creshal 6 ago 2 replies      
So why would I use any of them? All (?) OSes animate them anyway, and in a much less obnoxious way.
froh42 6 ago 1 reply      
It's just <blink> all over again
grindsmygears 4 ago 1 reply      
is geocities back?
JonnieCache 6 ago 2 replies      
None of these work for me, firefox 50.0a2 2016-09-08
alexcasalboni 6 ago 0 replies      
Nice but pretty useless for 99% of web users.
ElijahLynn 4 ago 1 reply      
The one that looks classy and not too distracting is Focus. It's the one I would consider using someday on a normal website, though I would probably not pull in the library just for that.
DNA analysis reveals there are four distinct giraffe species, not one researchgate.net
64 points by Jerry2  5 ago   39 comments top 6
kristofferR 3 ago 1 reply      
Since this wasn't known before, am I right to assume that a lot of giraffes in zoos are hybrids of the different species?
dschiptsov 2 ago 3 replies      
The classic operational definition of distinct species, as far as I remember, is that cross-breeding between them is not possible.
thaumasiotes 5 ago 2 replies      
> The findings suggest that genetic exchange among the four species is rare, maybe even nonexistent. This genetic isolation defines them as distinct species, as opposed to subspecies.

A bold, up-front assertion that Native Americans were in fact a distinct species from Europeans.

dforrestwilson1 3 ago 0 replies      
So is Pluto a planet or not then?
bfuller 42 ago 0 replies      
Giraffes? Giraffes!
guessmyname 3 ago 1 reply      
Off topic How do I click the links at the bottom of this website?

EDIT: To clarify, as some people thought this was a stupid question... Every time I scroll down to click one of the links the website scrolls my browser back up and loads a new article below the one I was reading. I don't want to read more articles, I want to click the links at the bottom of the website!!! I was not asking if the links work, they probably do, I am asking how do I "click them" if the website keeps moving the webview back up when I scroll to the bottom?

Why Kubernetes is winning the container war infoworld.com
110 points by bdburns  2 ago   56 comments top 13
leetrout 35 ago 2 replies      
I know it's young still, but I think Nomad is going to get a share of this market with little effort.

I played with Mesos & k8s and I picked Nomad instead. Now, I'm not managing a huge fleet of servers I want to abstract away so much as I wanted a simple, consistent framework for allocating resources to containerized tasks across a small group of nodes. I don't think that use case is anything to sneeze at, and for a new user there just isn't anything out there as easy as Nomad IMO.


tzaman 1 ago 2 replies      
For a devOps fan like me, k8s has been a godsend, and what I like in particular is their 3-month release schedule. There are still some hiccups, like no good documentation (or a tutorial, really) on setting up shared writeable storage, and on how to handle databases or, more importantly, replication.

The k8s team is very responsive and I'm sure these will be ironed out in the near future so we can all adore each other's cattle :)

mr_luc 8 ago 0 replies      
In my experience, I haven't been coming to k8s because I particularly like the developer experience (despite their efforts to focus heavily on it), but because it cleanly supported some things that I need.

For instance, with k8s, out of the box every running container in a clustered system is discoverable and has its own IP. If you're writing distributed applications, and you're using containers principally as a tool to make your life easier (and not as part of an internal paas or for handling data pipelines or some other use case), having that sort of discovery available out of the box is great.

virtualnm 1 ago 8 replies      
I can bring up an app on Linux or Windows from bare metal in minutes by hand. But the way it's supposed to be done now is something like this, right:

  1) Use one of Chef/Puppet/Salt/Ansible to orchestrate
  2) One of those in item 1 will use Vagrant which
  3) Uses Docker or Kubernetes to
  4) Set up a container which will
  5) finally run the app

_asummers 1 ago 2 replies      
One thing I've found extremely difficult to handle is the Zookeeper cluster model of containers, where when a thing dies, a thing has to come back and be referred to as "zookeeper-1" forever. The way to do this currently is to use a service in front of a replication controller with one pod. This feels wrong all over. Supposedly they have a thing called Pet Sets [1] coming to solve this, but it's been in the works for an eternity. Also we've started to outgrow the load balancing simplicity that the k8s load balancer gives you, and I have not seen a nice migration path to something like HAProxy in Kubernetes. All that said, we like Kubernetes a lot.

[1] To distinguish from cattle. If you have a red blue and green goldfish, and your red goldfish dies, you can replace with another red fish and not really notice, but if it's purple, the others won't play with it.

batmansmk 1 ago 0 replies      
It pretends to compare Kubernetes, Apache Mesos and Docker Swarm, but this article only says Kubernetes has a lot of stars on GitHub (without comparing it to Docker or Mesos), and the same goes for Slack/Stack Overflow activity and the number of CVs mentioning the tech... I will pass on Infoworld opinions from now on.
DubiousPusher 12 ago 0 replies      
Seems like "Why Kubernetes is winning the orchestration war" is a more appropriate title for this?
spudfkc 57 ago 0 replies      
I haven't dived into Kubernetes yet, but I set up Rancher for our new application and it has been nothing short of amazing so far. I can't express how happy we've been with it.

I previously tried the Mesos/Marathon route (with Mesosphere and then again with Mantl) and that was nothing but a huge waste of time due to all the maintenance that was necessary for all the required servers. With Rancher, it's just spin up a container for each host with a single command and you're done.

0xCMP 1 ago 2 replies      
The setup to get k8s running isn't great, but once it's running and you understand its config files it makes things so much easier. We're getting ready to deploy k8s at work soon and will begin moving more there as we can.

From what I understand (and this is completely not in the article), Mesos is designed for a scale most start-ups (and even established companies) can't afford or justify. K8s is simpler but still robust. Better than just fleet or compose, and clearly still better than swarm (based on posts read here on HN).

pawadu 1 ago 1 reply      
From the subtitle:

It's all about knowing how to build an open source community -- plus experience running applications in Linux containers, which Google invented

paimpozhil 1 ago 0 replies      
I wish Kubernetes had more examples; for example, their vSphere volume driver has almost zero documentation/tutorials on how to set it up.


What does exist is, I believe, inadequate.

iagooar 1 ago 3 replies      
What is the best resource to learn Kubernetes like a boss? I like ebooks, but will take anything, as long as it's up-to-date and easy to follow without being a long-term sysadmin.
runako 1 ago 1 reply      
On Safari OS X, the article is blocked by a full-page ad that I can't dismiss.
Companies would benefit from helping introverts to thrive economist.com
167 points by tomaskazemekas  8 ago   122 comments top 14
vidarh 2 ago 1 reply      
I just got off a call discussing the problems of technical teams where introverts get thrown into the deep end by being promoted to team leads etc. often without any kind of support.

I suffered through being in that position myself early in my career, and people under me suffered as a result, and I had no follow up or help whatsoever in terms of obtaining the skills to deal with it. It took a lot of time to recognise the problem and "fix it".

It still saps me of energy to spend time actively reaching out to people, but I've learned strategies to work around it (e.g. setting appointments to talk to people so I can't get out of it without being rude prevents me from just indefinitely postponing it), and "compensate" by ensuring I allocate "quiet time" to recharge.

There were also a lot of little things I had to learn. E.g. I eventually learned that simply walking around the office now and again and asking people how they were doing got people to report far higher satisfaction with my level of engagement, even if I spent less time actually responding to issues.

My managers never engaged with me that way when I started leading teams (I once had a manager that didn't actually talk to me for about two years - I passed on status reports once a week and that was pretty much it), so I didn't either for a long time. It turned out to be a very "low touch" method of showing interest that didn't wear me down but gave very positive results.

A lot of teams struggle with bad to non-existent training of people who get promoted into management positions, and that problem gets far worse with people whose "default" is to not spend a lot of time talking to people, and it puts a strain both on the team and the person put into that position that could be reduced very quickly with some basic training and some coaching.

I actually occasionally take on contracts to do coaching for technology managers because I love helping people shortcut all the time I wasted on it when I first started managing teams.

jbb555 6 ago 4 replies      
Training courses seem to be a place that they've increasingly made terrible in this way.

I'm fine with useful training where they teach you some stuff, and let you have a go at it, and are available for questions.

But no... these days they have to make it all "interactive". Instead of teaching stuff they have to stand there picking on people asking "what do you know about X", and then after a few hours of "teaching" you are "rewarded" by being forced to join a group of people for a group activity. No, it's not a reward, it's a few hours of hell.

ugh, this is why I tend to avoid so called training these days. Why can't we just be given actual teaching and notes and not be forced to do awful group activities.

pluma 6 ago 4 replies      
Meanwhile in tech being an introvert is lumped together with "poor social skills" and frowned upon in the name of diversity.

I've actually heard conference speakers call out behaviours as toxic that are basically the defining character traits of introverts (not to mention people in the autism spectrum).

Note that I'm not even talking about the shouty-sweary-abrasive behaviour people like Torvalds are being derided for but simply preferring to work alone rather than in a team.

whamlastxmas 15 ago 0 replies      
I feel like this issue is not introversion. I am an extreme introvert. Socializing is extremely tiring for me. I used to be extremely anti-social and had social phobias. I was bad at socializing and frankly still not great at it.

The issue could be boiled down to a lack of self confidence. There are lots of other things that were part of it, but mostly explained by self confidence.

I worked (and am still working) through this and I am tremendously better than I was. My coworkers argued that I wasn't introverted when I described myself as such once.

I say all this to make the point: the real issue is that some people are poor leaders and have poor social awareness. Leadership is less natural for introverts but the nature of leadership isn't any different for them because of that introversion.

Introverts need to be able to take charge of their own lives and their own success as a leader. Employers shouldn't need to provide quiet spaces for them. They should make their own quiet space. If you're unable to do that at your job for whatever reason (it's as simple as sitting in your car for a 15 minute break) then you need to move to a different environment if you have the expectation of being a successful leader.

achow 6 ago 3 replies      
> ...the best way to encourage creativity is to knock down office walls and to hold incessant meetings.

Being part of the office layout team in a Fortune 10 company, I'm now aware that most of the time the motivation is cost. It is cheaper by many orders of magnitude to have an open-plan office. Other cost-related reasons: the space is easy to manage (it can be expanded and scaled down in a jiffy), cheaper HVAC installation and maintenance, etc.

elcct 3 ago 2 replies      
> Claude Mongeau, the former CEO of Canadian National Railway, for example, set himself the goal of acting like an extrovert five times a day. In any case, the majority of people are on a spectrum of introversion to extroversion.

Probably my comparison is too extreme, but as an introvert myself I can see this as telling gay person to be straight five times a day.

jondubois 6 ago 4 replies      
Very true. I'm an introvert but I have to force myself to act like an extrovert in order to move forward in my career... But I feel somewhat insincere doing that and it takes a lot of effort emotionally.

I think this behaviour is necessary because the corporate environment cultivates a culture of insincerity and sometimes downright hypocrisy.

We all want to feel good about ourselves - So we either:

- Lie to ourselves or;

- Accept the reality and only pretend to believe the lies (that's where the public façade comes in handy).

hellofunk 5 ago 1 reply      
I think there are a greater percentage of introverts in the software development world than in the general population, and attempting to put developers into the same environments as many other corporate jobs just doesn't work very well. I work with some talented guys remotely, all who work from home and I'd casually judge them to all be more introverts than extroverts. The key I believe is to respect someone's ability to get work done without a lot of social interaction and judge them on how they meet deadlines on their own, the quality of the work, and let it go at that. If they can do this, leave them alone and let them thrive. If they can't, then it is possibly not a good fit for your company to have them onboard.
realworldview 1 ago 0 replies      
I would re-title the article _Generalization, Generalization and More Generalizations_. This is a typical space-filler article for consumer publications, most visible in the Daily Mail, meant to provoke criticism, rage and agreement but not much else. People don't always fit in neat boxes, so don't waste time reading such common and regurgitated content.
jaggederest 6 ago 0 replies      
I feel like the focus on any particular personality trait is a little facile.

Really, companies should be gaining understanding and helping all of their employees to do their best work regardless of which specific traits you're talking about.

wundersoy 4 ago 2 replies      
But if your work helps you get past being so introverted (which ultimately can make you difficult for others to work with in a team), isn't that a good thing?
nicolapede 6 ago 2 replies      
Working in investment banking here. I sincerely hope that tech companies will continue to thrive -- the reasoning being that at some point IB will be so short of talent that it will have to adjust to working practices/policies more suitable for introverts.
timwaagh 5 ago 3 replies      
so yeah because of my supposedly flawed personality i will have 0 chances at promotion beyond just being a developer. and developers are not rewarded very well where i live. the logical conclusion is that i will have to become their competitor as soon as i can. which sadly benefits nobody as i could potentially do better with big-company resources behind me and the companies would do better without upstarts trying to take their profits.
bitwize 5 ago 5 replies      
V8 JavaScript Engine: V8 Release 5.4 v8project.blogspot.com
70 points by okket  3 ago   8 comments top 3
e0m 2 ago 2 replies      
Wow! The new version of V8 "reduces peak memory consumption of on-heap memory up to 40%" and reduces "off-heap peak memory usage by up to 20%". As the author of a large Electron app, I can say this has huge implications downstream for our users. Great job!
overcast 39 ago 0 replies      
Wonder how this will translate to resource usage in my node projects. Those memory reductions are significant.
elorant 45 ago 2 replies      
As a side note, is there a way to run this thing standalone as a headless browser?
Some bad Git situations and how I got myself out of them ohshitgit.com
522 points by emilong  10 ago   242 comments top 55
pmoriarty 2 ago 0 replies      
I've heard advice from #git on freenode not to use "git commit --amend", especially on shared repos.

I wish I still had a log of the conversation or remembered the exact problem that led up to it, but it involved a simple amend totally screwing up my repo, and I've avoided it since.

mhw 3 ago 2 replies      
One of the nice workflows that's already built in to the git command line tools is this one. When you're working on a branch and realise that a commit you made a few commits back has a mistake in it:

  # Make correcting change
  git commit --all --fixup=<ref of commit with mistake>
  # Continue working on branch, then at some point
  git rebase --interactive --autosquash
The --fixup option creates commits with subjects formatted like 'fixup! previous commit subject'. --autosquash uses these subjects when building the interactive rebase list.

Handy enough that I set rebase.autoSquash to true in my ~/.gitconfig so 'git rebase -i' always works like this.
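A runnable sketch of the workflow above, in a throwaway repo with made-up file names and commit messages. `GIT_SEQUENCE_EDITOR=true` simply accepts the generated todo list unchanged, so the interactive rebase can run inside a script:

```shell
#!/bin/sh
set -e
# throwaway repo so the demo can't touch anything real
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name Demo

echo one > a.txt
git add a.txt
git commit -qm "add a"
echo two > b.txt
git add b.txt
git commit -qm "add b"

# fix a mistake in "add a" with a fixup commit
echo one-fixed > a.txt
git commit -qa --fixup=HEAD^

# --autosquash reorders the 'fixup! add a' commit next to "add a"
# and squashes it in, keeping the original commit message
GIT_SEQUENCE_EDITOR=true git rebase -qi --autosquash --root
git log --format=%s
```

After the rebase, the history is just "add a" and "add b" again, with the fix folded into "add a".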

dep_b 1 ago 2 replies      
Somebody proposed to use a GUI. That doesn't solve the usability issues of Git. There's this triangle of what the user tries to do, what the commands and options are called and what they actually do. None of them really align, though with some careful use you can actually make Git do what you want - eventually.

I would like to understand what's the yearly damage of such an important tool being so difficult to use. People committing the wrong stuff, unmergeable messes, people not being able to correct their mistakes, there must be thousands of Git rookies fucking up their Git repo or throwing away their work just as I am writing this.

What would be the cost? Millions of Dollars? Perhaps even billions?

It's about as bad as 0 being a legit value for a regular pointer.

eyelidlessness 8 ago 16 replies      
I can't believe no one has responded yet with "use a GUI". After gaining a basic understanding of how branches and merges work, and I do mean basic, I've never been able to screw up a local repo with a GUI client enough that I haven't been able to recover with the same GUI tools.

I understand that people need to know how to use their tools, but for git most people can get away with the very basic usage that GUIs provide. If you've made some unrecoverable mistake with an important set of changes, you can always review the history in the same GUI and reimplement the important changes in a new branch.

mabbo 2 ago 1 reply      
One thing not covered very well was what to do if you push to origin. My favourite way to fix this: use git revert to create an exact opposite commit to your bad commit.

  git revert <bad commit>
  git push
It leaves a history of the mistake, for better or worse, but it does undo the mistake on origin.
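For anyone who hasn't tried it, a self-contained sketch of that recipe (scratch repo and file names invented for the demo; the push step is omitted since there is no remote here):

```shell
#!/bin/sh
set -e
# scratch repo for the demo
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name Demo

echo good > notes.txt
git add notes.txt
git commit -qm "good commit"
echo bad >> notes.txt
git commit -qam "bad commit"

# create an equal-and-opposite commit instead of rewriting history
git revert --no-edit HEAD

cat notes.txt        # back to just "good"
git log --format=%s  # the revert sits on top of the mistake
```

Because the revert is a new commit rather than a history rewrite, it is safe to push even when others have already pulled the bad commit.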

mathieuh 9 ago 2 replies      
I absolutely love git now.

I'm still at uni (at a highly ranked but actually crap university where we don't learn git properly) and this year was my 'year in industry' as we call it in the UK, and my first proper experience with git, aside from `git init` at the end of my project and pushing it to a repo.

I've become so much more confident with git. Seriously, with one caveat (i.e., you haven't pushed your changes to a branch which other developers are working on), it is almost impossible to break irrevocably. Even if you do accidentally break master/develop/whatever, it only causes a bit of hassle and grumbling.

Highly recommend that everyone take a bit of time to learn about "undoing" git commands, whether that's through soft resets, hard resets to origin, or the reflog.

Reflog is also useful for figuring out how someone else broke something and explaining what they did wrong, since you can see what branch they were on at what commit and what commands they ran.
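A small self-contained sketch of that kind of reflog rescue (repo and file names invented for the demo):

```shell
#!/bin/sh
set -e
# scratch repo for the demo
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name Demo

echo v1 > work.txt
git add work.txt
git commit -qm "first"
echo v2 > work.txt
git commit -qam "second"

# oops: this throws away the "second" commit entirely
git reset -q --hard HEAD^

# ...but the reflog still remembers where HEAD was before the reset,
# so HEAD@{1} points at the "lost" commit
git reset -q --hard 'HEAD@{1}'
cat work.txt  # v2 is back
```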

I think git's main problem is the somewhat arcane language it uses, and lack of understanding of what's actually happening behind those words like "rebase", "commit", "patch", "reset" etc.

wyclif 8 ago 2 replies      
If you're concerned about not knowing how to do certain things with git, and understanding at a deeper level how git works, I highly recommend reading Scott Chacon's "Pro Git" book:


froh42 7 ago 0 replies      
With all these recipes - one thing I do whenever I attempt some stunt in git: I assign temporary tags to every changeset that's important.

 git tag tmp
 git perform-stunt
This eases undoing the stunt without needing to find the "before" state from reflog. And if you use a graphical log viewer (I like SourceTree on Mac) you'll see the tagged state in the history view - which makes things a lot clearer.

And to be aware of what happens, there is one single explanation of git that helps a lot: http://eagain.net/articles/git-for-computer-scientists/

As soon as you start viewing git as a graph of nodes with branches/tags just being "marked" nodes a lot of things make sense, and whatever "git perform-stunt" you attempt it's easy to explain within that mental model.

chriswarbo 3 ago 0 replies      
Based on the article, and many of the comments here, I didn't realise how comfortable I have become using git!

For example, the last "bad situation" I had to get myself out of involved unreadable .git contents caused by filesystem corruption. If you can "rm -rf fucking-git-repo-dir" then it's not too bad; when that fails with an IO error is when things get interesting!

niuzeta 7 ago 2 replies      
The last addendum reminds me of this inexorably relevant xkcd entry: https://xkcd.com/1597/
Hello71 4 ago 2 replies      
lots of these are unnecessarily complicated:

> Oh shit, I accidentally committed something to master that should have been on a brand new branch!

 # disappear the last commit and all changes from it
 git reset --hard HEAD^
 # make a new branch using the last commit
 git checkout -b new-branch HEAD@{1}
> Oh shit, I accidentally committed to the wrong branch!

first, you don't need to git-add before and after stash, stash will save the working directory and the index (as documented in the DESCRIPTION of git-stash(1)). but for a more logical way:

 # disappear the last commit and all changes from it
 git reset --hard HEAD^
 # get onto the new branch
 git checkout new-branch
 # grab the stuff from what was on the old branch
 git cherry-pick old-branch@{1}
> Oh shit, I tried to run a diff but nothing happened?!

 git diff --cached
recommended reading for intermediate git users: the DESCRIPTIONs of all of these commands (git-reset(1), git-checkout(1), git-cherry-pick(1), git-diff(1)), and the entirety of gitrevisions(7).

atsaloli 50 ago 2 replies      
There is no substitute for understanding what's going on, especially using a power tool like Git.

It's a cute website, and useful, I really like it. This sentence,

 Bizarrely, git won't do a diff of files that have been add-ed to your staging area without this flag. File under ¯\_(ツ)_/¯"
just screams to me (a professional Git trainer), "I don't understand the Git staging area! I don't know my Git fundamentals! Train me!"

edem 5 ago 2 replies      
I really like this article but there is a problem with it: what happens if I use one of your techniques and I screw up? These steps you describe are a black box to someone who is not git-savvy yet. While these definitely help, they propagate the "git is scary, cross your fingers" mentality. What I mean by this is that the reader won't be any wiser after reading

> git reset HEAD~ --hard

What is ~ after HEAD? What is --hard? Is there a --deep option as well?

So I think that you could upgrade this with some annotations over the cryptic parts with a little explanation. What do you think?
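For anyone with the same questions: HEAD~ names the parent commit of HEAD, --hard tells reset to update the index and working tree as well as the branch pointer, and there is no --deep option. A quick sketch in a throwaway repo (names invented; `git reset --hard HEAD~` is the same command with the arguments in the conventional order):

```shell
# HEAD~ = parent of HEAD; --hard = move the branch AND reset index + files.
# (--soft moves only the branch; --mixed, the default, also resets the index.)
set -e
dir=$(mktemp -d) && cd "$dir"
git init -q
git config user.email you@example.com
git config user.name "You"
echo a > f && git add f && git commit -qm "first"
echo b > f && git add f && git commit -qm "second"
git reset --hard HEAD~   # discard "second" and all its changes
cat f                    # "a": the working tree was reset too
```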

mcbain 9 ago 1 reply      
'sudo rmdir'? I don't think that does what they think it does.
glandium 9 ago 2 replies      
> Oh shit, I accidentally committed to the wrong branch!

Other ways to do it (that don't require retyping the commit message):

- rebase onto the correct branch:

 git branch foo
 git reset --hard HEAD~
 git rebase --onto name-of-the-correct-branch HEAD foo
 git checkout name-of-the-correct-branch
 git merge foo
 git branch -D foo
- cherry-pick

 git reset --hard HEAD~
 git checkout name-of-the-correct-branch
 git cherry-pick name-of-the-branch-you-mistakenly-committed-to@{1}
 (or git cherry-pick HEAD@{2})
> Oh shit, I tried to run a diff but nothing happened?!

You probably want to know `git diff HEAD` too.

Edit: formatting.

wfunction 9 ago 2 replies      
Surprised there was nothing on messed-up merges or rebases. They're some of the worst to get out of when you're not totally comfortable with git yet.
Illniyar 9 ago 3 replies      
Actually the easiest thing is simply not to care about how your log looks.

If you don't, then there are really only two things you need to know how to do:

If you didn't push to origin, do an amend. If you did, revert soft and commit the previous code to revert it (you can also make a stash or patch to apply it back).

Which frankly is what the article does, basically.

mkj 4 ago 2 replies      
Another option is to use Mercurial with hg-git, pushing to GitHub for the network effects.

I've been doing that for a while for dropbear ssh, it does hit occasional problems but is overall more pleasant than straight git.

init0 25 ago 0 replies      
for the rest of it there is http://git.io/git-tips
jakelazaroff 1 ago 0 replies      
> Oh shit, I committed and immediately realized I need to make one small change!

If you don't want to change the commit message, in addition to --amend:

 git commit --amend --no-edit

jmiserez 6 ago 0 replies      
This doesn't even cover half of the bad situations I've gotten myself in over the last few years :D

Long term, it's best to thouroughly read the man pages, e.g. nicely formatted here: https://git-scm.com/docs

oskob 3 ago 1 reply      
Oh shit, I commited a binary file larger than 100 mb and now i can't push to github.com. Solution: https://rtyley.github.io/bfg-repo-cleaner/
pc86 1 ago 1 reply      
> This usually happens to me if I merge to master, then run tests/linters... and FML, I didn't put a space after the equals sign.

Am I the only one that runs my tests before committing, let alone merging to master?

lambdacomplete 5 ago 3 replies      
Getting to "Fuck this noise, I give up." is a very clear indication that you aren't competent enough and you should take a GOOD course about git as soon as humanly possible.

Shameless plug: http://engineering.hipolabs.com/how-to-work-in-a-team-versio...

dahart 2 ago 0 replies      
Great idea! We need more basic git workflows described in plain English.

I was expecting some actual "bad" situations based on the title, and to be fair these were bad to me once and are bad for people new to git, but I'd love to see the level 2,3,etc. version of this article.

uhtred 3 ago 1 reply      
I prefer stackoverflow for things like this as I can see from comments and up votes whether the command does what the poster claims, or whether it is going to make things worse for me.
iatanasov 1 ago 0 replies      
The post is missing the most important command: git --help
alistproducer2 4 ago 0 replies      
One of my favorite teachers in school was a dude-bro programmer who I'm pretty sure was younger than me. He'd spent a summer at Google and made us use git and gerrit. I'm honestly a much better programmer thanks to him. I'm still using the git cli to this day.

I also still say "new up" an object thanks to him. I'm not so proud of that one.

noufalibrahim 9 ago 9 replies      
I don't know if this post was intended as humour or a way to vent out some frustration but in my experience, this path of treating git as "spell X solves problem Y" will always break down.

Version control systems are an important part of the programmer's toolkit and it's worth investing a little time to get the fundamentals right.

Sure, git is not the friendliest of beasts, but what it lacks in interface it more than makes up for in internal consistency, and trying to learn it "inside out" is a better long term investment than having a list of ways to solve "git problems".

oneloop 5 ago 1 reply      
Git is complex and nuanced, and in the short term people think it's faster to memorize some commands instead of understanding the fundamentals.

I kept having problems with git, so I read a fucking book on it https://git-scm.com/book/en/v2

I'm not saying I never get into situations I can't get myself out of, but the examples on the oh shit website now look like obvious trivialities.

m_mueller 9 ago 0 replies      
One of my SO questions could almost fit in there, somewhere before "Fuck this noise...":


samoa4 1 ago 0 replies      
random567 10 ago 3 replies      
The "screwed up and committed to master" one should end with:

 git reset --hard origin/master
(assuming that the remote is called "origin"). With the example in the text, you have to know the number of commits you've made to master.

felixschl 9 ago 1 reply      
I managed to "rm -rf .git" at one point. Took me about a minute to realize, and - surprisingly - after <c-c>-ing I lost nothing (as far as I was aware). Git is freaking hard to break. Also always remember git-reflog, it saves lives.
0xmohit 7 ago 0 replies      
Nobody likes to read manuals or books, which is why one sees FAQs being posted on Q&A sites.

http://gitready.com/ contains a number of small articles categorized by beginner, intermediate and advanced that might be helpful.

Another resource for commonly used git tips and tricks: https://github.com/git-tips/tips

psyklic 9 ago 0 replies      
Also great for getting out of Git messes: http://justinhileman.info/article/git-pretty/
Zelmor 8 ago 1 reply      
Is the documentation really that bad? Would it benefit from a technical writer going over it? Is the project open for discussion on changes to the documentation?
abarrak 7 ago 1 reply      
It's probably a good time to check if you have some safety against `rm -fr *`.

Two days ago, I wanted to delete .git only, but as my fingers were accustomed to -fr, the command accidentally became `rm -fr * .git`. The Rails server was running, and for a moment there was some hope of recovering via `lsof | grep` .. unfortunately that didn't work for me!

Ironically, all the dot files stuck around, obviously :)

hacksonx 8 ago 0 replies      
I moved to git at the beginning of this year and I must say that I miss SVN. But everyone keeps telling me that git is better so I'm sticking to it.
OJFord 8 ago 0 replies      

 # create a new branch from the current state of master
 git checkout -b some-new-branch-name
 # remove the commit from the master branch
 git checkout master
Or just `git branch some-new-branch-name`...

 cd ..
 sudo rmdir nsfw-git-repo-dir
That will only remove it if it's empty? Which it never will be, because there's at least `.git/*`...

Still, amusing :)

bwghughes 8 ago 0 replies      
Fucking love it.

alias gitshit="open http://ohshitgit.com/"

swah 4 ago 0 replies      
I fear Git so much that I make zip packages of the repo before potentially destructive operations.
jimktrains2 4 ago 1 reply      
If the owner is here: with javascript disabled the code is unreadable.
EdiX 4 ago 0 replies      
All of those things and more are way easier to do with gitk and git gui.
kuahyeow 6 ago 0 replies      
Most of the time, stay calm. Do not `git push` hastily, and check `git status` if you can :)
Gonzih 8 ago 1 reply      
why do you run git add . before git stash?
vacri 9 ago 0 replies      
Not quite in the spirit of this article, but "I just want this /one/ file without the rest of the changes from branch foo" is something I use all the time

 git checkout otherbranch
 git checkout firstbranch -- fileIwant maybe1more
 git commit -m "brought over files"

lordnacho 7 ago 1 reply      
Surprised he finds git to be complicated. It probably is deep down, but for day-to-day use, compare it to SVN.

Until I switched, there was always a panic when branching or merging. With git, I can branch like a nutter and things seem to still work out in the end.

Not sure why, perhaps someone else has a perspective on it.

nicky0 4 ago 0 replies      
The last one is my usual tactic.
cyphar 9 ago 1 reply      
The last rmdir example should be rm -rf.
shklnrj 6 ago 0 replies      
It is not the fault of Git if to use it you have to know it!

However, I would appreciate a quick and dirty handbook.

partycoder 9 ago 1 reply      
Well, there are many more situations you can get into.

Like cherry picking, force pushing, merge --no-commit, rebasing... almost any operation can end up going wrong.

Just pay attention.

cyberferret 9 ago 0 replies      
LOL. Bookmarking along with my other Git reference sites...
throw2016 4 ago 0 replies      
git to me is a work of art. There is a lot of complexity underneath, but the end user sees something that is simple, fast and easy to use. It scales depending on user needs and it's easy to reason about.

This is a feat of engineering, to take something complex and make it easy for anyone to understand and use. It shows real expertise and deep understanding of the area.

In many ways it's a shining example against the 'culture of complexity' that we increasingly find ourselves in. Here, rather than simplifying, the objective seems to be to make things as complex as possible, usually in pursuit of extremely niche use cases, or because either the expertise or the interest to simplify is not there. If git had been designed in this culture it would be fragile, full of buzzwords, poorly documented, prone to failure, and something only a few self-appointed experts could reason about and use properly.

Annatar 8 ago 1 reply      
Yep, git sucks but it's all the rage now. Mercurial is 100x nicer to use and logical, but since it's written in Python it's slow as molasses, especially with large binary files.

Next on the list: Larry McVoy's Bitkeeper promises to be everything git and Mercurial aren't. (git is "inspired" (read: copycat) by Bitkeeper).

It's funny how Sun Microsystems influenced the industry in so many ways, isn't it?

Four billion messages an hour: benchmarking Deepstream throughput deepstream.io
83 points by wolframhempel  7 ago   37 comments top 10
jgrahamc 5 ago 4 replies      
So, roughly 1.1m messages per second of about 23 bytes.

We're handling 4m to 6m 1.5k log lines per second using Apache Kafka on a cluster of around 100 nodes.

matt_oriordan 3 ago 1 reply      
I'm interested to know what happens when there is a failure or a deployment? Having static servers handling load is only part of the problem in our experience. The true complexity and scalability of a system comes when you consider how it copes under load with unexpected failures (network, hardware), but more importantly expected maintenance such as regular deploys, scaling up and scaling down events. Do you have any metrics for that? Those are the problems we've been focussing most of our energy on at Ably (https://www.ably.io), not just message per second rates which is often not really the problem.
0xmohit 5 ago 0 replies      

 Deepstream relies on garbage collection to free up dereferenced memory. If a machine's CPU is overutilized above 100% for a consecutive time, garbage collection will be delayed and memory can add up. If this continues for a prolonged period, the server will run out of memory and eventually crash - so be generous enough when it comes to resource allocation to make sure that your processors get some breathing space every once in a while.
"Be generous when it comes to resource allocation".


> The costs of running a six-instance cluster for an hour on AWS are 36 cents (6 x t2.medium @ 0.052$/h + 1 x cache.t2.medium @ 0.068$/h)

AFAIK, pricing in AWS world depends upon the region. Bandwidth, hard disks and so on also contribute to the price.

I'm not sure what to make of such conclusions.

teh 4 ago 0 replies      
t2 instances have credits that are replenished at a constant rate, and used up when you use the CPU - is this sustainable for more than 1h?

That's ~25MB of data per second for an in-memory workload over 6 machines. I think they're missing a zero on the size of the messages?
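The arithmetic behind that estimate, using the four-billion-messages-per-hour headline and the ~23-byte payloads quoted upthread:

```python
# Back-of-envelope check on the quoted throughput figures.
msgs_per_hour = 4_000_000_000
msg_bytes = 23

msgs_per_sec = msgs_per_hour / 3600          # ~1.11 million msgs/s
mb_per_sec = msgs_per_sec * msg_bytes / 1e6  # ~25.6 MB/s across the cluster

print(f"{msgs_per_sec:,.0f} msgs/s, {mb_per_sec:.1f} MB/s")
```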

gtirloni 5 ago 1 reply      
What does this compare against? Having a little trouble navigating the "real time" apps landscape.
bsbechtel 5 ago 1 reply      
How does Deepstream compare to Meteor?
AdamMills 2 ago 0 replies      
How many concurrent connections can you get with the test nodes?
siscia 3 ago 1 reply      
How does deepstream compare to MQTT?

I see a lot of advantages on using a more standard protocol such as MQTT over deepstream

stemuk 2 ago 1 reply      
Is it possible to cluster deepstream in multiple AWS regions? It may be beneficial when it comes to latency...
lossolo 5 ago 3 replies      
If you only need one hour then indeed it's a great price, but if you need it 24/7 then it's $260 a month, which is really expensive compared to the alternatives (you can get a dedicated server capable of the same throughput easily for half the price).
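That figure checks out against the article's quoted 36 cents per hour for the six-instance cluster:

```python
# 24/7 cost at the article's quoted $0.36/hour cluster price.
hourly = 0.36
monthly = hourly * 24 * 30
print(f"${monthly:.2f}/month")  # roughly the $260/month mentioned above
```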
The Art of Insight in Science and Engineering: Mastering Complexity (2014) [pdf] mit.edu
229 points by kercker  13 ago   11 comments top 5
nrubin 9 ago 1 reply      
I took a version of this class (called The Art of Approximation) with Dr. Mahajan when he was a visiting professor at Olin College. The course helped me become fearless about tackling a broad range of math problems I knew little about, and has helped me in countless real life situations from financial planning to major work decisions.

Also, it's a great intro on good ways to nail PM interviews, at least at Google (where I now work, thanks in no small part to the lessons from this class).

If you have a spare few hours, read this book or take his classes on edX. It'll absolutely change your intuition for numbers.

ivan_ah 12 ago 1 reply      
andrzejsz 3 ago 2 replies      
How come this book is available for download? I mean, what about copyright?
cJ0th 5 ago 0 replies      
What a great find! I didn't even dream of a book like this. Thank you for sharing!
geohump 4 ago 2 replies      
The irony of an article claiming to master complexity while using one of the worst choices for information dissemination. PDFs. Palpable.
Why I finally ditched Jira liminastudio.com
67 points by virgil_disgr4ce  2 ago   47 comments top 16
strictnein 11 ago 0 replies      
Biggest issue I have with Jira is how it's trying to be the everything tool for all processes and procedures: way too many non-devs and non-qa people looking to get data from it and cramming their stuff into it. Every month it seems like there's another couple of fields that I have to fill out so that some random group can track some metric, or the security team can pretend like we're doing a security audit on all of the code, or whatever. My favorite is the checkbox the QA team has to click to confirm that they've done security checks on new endpoints. They don't have the first clue on how to do that, but they all just check the box because they have to check the box to continue.

Whether or not this is Atlassian's fault is debatable of course. I feel like they were more opinionated back when I started using it ~10 years ago (with the Greenhopper plugin). It was a much, much more pleasant thing to use. Probably because it was only under the control of the teams who were actually doing dev work.

gotofritz 1 ago 0 replies      
I totally agree. Complete pile of crap. The search is quite useless (try searching for something as simple as all tickets to which you added a comment in the last week...no chance). Unless you get a plugin, but who writes these plugins? Can you trust them? Who's assessing them? And how much do they cost? And what if you haven't got admin rights? Keeping track of stuff should be the main functionality of the product, it should be slick out of the box. Not to mention all the other stuff, useless kanban boards, bulk operation, and all the rest of it. I find myself clicking for whole minutes to complete simple tasks.

I think Atlassian have too many pots on the fire and they can't keep up with it.

ma138 47 ago 1 reply      
As a previous Jira user I have come across all the problems in this article. But the biggest problem with Jira, and all similar applications, is that developers just don't want to use it - as a PM you either end up having to force a tool on the team or try and work from inaccurate data (which involves lots of asking for status updates rather than getting them async from the tool).

This is actually why we created ZenHub[1], to take this workflow to the makers rather than the managers.

We find this provides higher quality data and improves productivity by reducing context switching. I'm interested to hear everyone's thoughts about this integrated approach vs siloed tools like Jira. In my biased opinion the days of having a separate tool for this are numbered (see also the GitLab issue board release).

[1] https://www.zenhub.com/

ctrl-j 1 ago 3 replies      
Sounds like a combination of an old version and poorly configured setup. Possibly running on out of date hardware.

I manage an Atlassian setup (Jira, Confluence, Stash, and Bamboo), and we don't seem to have any of the problems described in this article...

Well, except for the configuration part. There's options for almost everything.

The one gripe I have is that, while most things are configurable, there are some that seem like they would be easy to make configurable that get marked off as [wontfix].

Jira support is also pretty amazing. They're also really transparent and dogfood their application and expose their issue tracking to their customers. Bugs filed against Jira are done in Jira.

Shank 1 ago 0 replies      
I definitely see the speed issues with JIRA. Self hosted was a better solution than running on the Atlassian Cloud, oddly enough, but I definitely can relate to 2-3 seconds for actions that take milliseconds using just Github Issues.
zbjornson 43 ago 0 replies      
Agreed. Instead we use github issues with color-coded and prefixed labels (e.g. "type:bug", "comp:auth",... with all type labels blue, all component labels green, ...) and waffle.io for a sortable kanban view. A few big FOSS projects do this (e.g. https://github.com/angular/angular.js/issues).

Pros: best possible github integration, fast, lightweight.

Cons: none for us. There's nothing in Jira that I miss, at least.

Yhippa 39 ago 0 replies      
I used to be a big Atlassian homer because I enjoyed using their products and they were easy to use. Over time I got into some administration and saw the other side and it wasn't quite as pretty. Did some migrations and those were actually okay but had their share of glitches.

I'm starting to see companies "become Agile" by enforcing Jira usage. It's a sad trainwreck to see. I personally think you get people to buy-in to Agile and then once they do you can introduce tooling that will support their needs, not install Jira with all the bells and whistles on Day 1.

mxuribe 31 ago 0 replies      
One of my vendors uses Jira, so we have to use Jira to submit tickets for bugs on their stack/apps... and because apparently there are some config options within Jira, it's often tough to tell if the suckiness is Jira itself or how an admin has set it up. Our vendor has had to re-config things twice (and we might have to ask them to do so yet again).

On the flip side, I have another vendor, and they moved to youtrack earlier this year...As a former developer youtrack just gels so well for me; everything clicks awesomely and makes sense. It may not be as pretty as Jira, but extremely powerful for search. My vendor is self-hosting a youtrack instance, and who knows there may be similar config. issues like jira, so YMMV...but have a look-see: https://www.jetbrains.com/youtrack/

bitwize 1 ago 4 replies      
O HAI. You think Jira is bad? Let me introduce you to VersionOne.
raverbashing 25 ago 0 replies      
Jira is good compared to Bugzilla and others (not to mention the complete piles of crap like ClearQuest, which couldn't even get basic search right, or a "nice" proprietary one that required 2FA for logging in to view bugs. Of course it was IE only.)

"Kids these days" don't know how long it took for Jira to appear

It is "bad", but it is bad compared to what Google puts out. Not a corporate tool.

virgil_disgr4ce 55 ago 0 replies      
Sorry all, my apparently terrible web host immediately imploded with the traffic. Trying to resolve now.
didsomeonesay 1 ago 1 reply      
Agreed on all points. Self-hosted alternatives?
amyjess 19 ago 0 replies      
JIRA has always felt bloated to me. Of all the ticketing systems I've used professionally, by far and above the best ones were Trac and Redmine (a clone of Trac), which were light, simple, and cut out all the bullshit.
sickbeard 1 ago 2 replies      
Like git, we need Linus to come up with something totally mindbogglingly simple. I can't tell you how many years we wasted working with stupid/bloated/useless source control systems until git came along.
berntb 1 ago 3 replies      
I used to hate JIRA. At the present work, it surprises me that I quite like it.

I have thought about this (is my algorithm for liking/hating stuff deficient?), but am not certain if it is because I use a later version of JIRA, or if it is me. Am I less sensitive to bad web UIs?

It is easy to communicate with users and other developers. It keeps the discussion on the tasks in a nice place. It is easy to add queries to other people to call them in to touch a task.

My main theory is that the use case is different; I have more to do with users for the present tasks and the more complex features are less overkill. And I see less of the horrors (more experienced admins).

(Jokes about that this just shows my mental deterioration with advanced age will not be appreciated. :-) )

Python 3.6 dict becomes compact and keywords become ordered python.org
157 points by Buetol  5 ago   74 comments top 10
jamiesonbecker 1 ago 1 reply      
This doesn't feel right.. it's not actually a nice side effect at all.

1. it's not in the spec

2. you shouldn't rely on it

3. python can't figure out if you're relying on it, so no error will be raised

4. subtle bugs are sure to be introduced by people who "know" this "feature" exists and use it.

Regardless of the cool implementation details, this post shouldn't advertise that "keywords become ordered" and "A nice "side effect" of compact dict is that the dictionary now preserves the insertion order".

Ergo... we built this awesome thing that you'd love to use but you can't. Don't use it or you'll be a bad programmer who doesn't read specs!

(Just use addict or OrderedDict.)
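A minimal sketch of the parent's last suggestion: with OrderedDict the ordering is part of the documented contract rather than a CPython 3.6 implementation detail.

```python
from collections import OrderedDict

# Relying on plain-dict insertion order here would be a CPython 3.6
# implementation detail; OrderedDict guarantees it on any version.
d = OrderedDict()
d['first'] = 1
d['second'] = 2
d['third'] = 3
print(list(d))  # ['first', 'second', 'third'], guaranteed by the spec
```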

gbin 3 ago 4 replies      
The optimization is awesome, but IMHO the key ordering "benefit" being in the implementation but not the spec is a so-so move: it can cause some future bugs in code assuming it is there. Some languages like Go added key ordering randomization to maps to be sure to avoid people counting on any specific key order.
andybak 3 ago 3 replies      
Does this mean all dicts are ordered in Python 3.6 onwards?

If so I can see scope for subtle bugs as code written and tested on Python 3.6 will potentially fail on earlier versions due to dict order.

But I guess that's why you test using tox...

imh 54 ago 0 replies      
What's the point of the compact dict? Does having all the keys together outweigh the extra indirection? That would surprise me. Can someone help explain why that would be the case?
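A toy model of the layout in question (not CPython's actual C code): the hash table becomes a small sparse array of indices into a dense, insertion-ordered entries array. The dense array shrinks memory and improves cache behavior on iteration, and insertion order falls out for free.

```python
# Toy model of the compact-dict layout after inserting timmy, barry, guido.
# The sparse part stores only small integers; the bucket positions shown
# here are made up, since real slots depend on the key hashes.
indices = [None, 1, None, None, 0, None, 2, None]   # one slot per hash bucket
entries = [                                         # (hash, key, value) triples
    (hash('timmy'), 'timmy', 'red'),
    (hash('barry'), 'barry', 'green'),
    (hash('guido'), 'guido', 'blue'),
]
# Iteration never touches the sparse table: it just walks `entries`.
print([key for _, key, _ in entries])  # ['timmy', 'barry', 'guido']
```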
IgorPartola 3 ago 3 replies      
Here is a fun little thing I didn't realize before running into it:

 OrderedDict(a=1, b=2).keys()
Is not guaranteed to return ['a', 'b'].

Of course this makes sense, but is annoying nonetheless.
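Worth noting that this caveat is version-dependent: PEP 468 (accepted for Python 3.6) makes **kwargs preserve call order, so on 3.6+ the keyword form becomes reliable. On older interpreters, pass the pairs explicitly:

```python
from collections import OrderedDict

# Always order-safe, on any Python version:
explicit = OrderedDict([('a', 1), ('b', 2)])
print(list(explicit.keys()))  # ['a', 'b']

# Only guaranteed once PEP 468 lands (Python 3.6+):
kw = OrderedDict(a=1, b=2)
print(list(kw.keys()))        # ['a', 'b'] on 3.6+
```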

Dowwie 4 ago 4 replies      
Given this, why wouldn't collections.OrderedDict be deprecated as of 3.6?
Pirate-of-SV 4 ago 3 replies      

 dict_keys(['c', 'd', 'e', 'b', 'a']) # random order
> random

Maybe I'm nitpicky but I find the misuse of that word annoying. My experience of CPython is that dicts are unordered but deterministic. Not random.

fucking_tragedy 3 ago 1 reply      
How will they reconcile this?

 In [10]: OrderedDict((('a', 1), ('b', 2))) == OrderedDict((('b', 2), ('a', 1)))
 Out[10]: False
 In [11]: dict((('a', 1), ('b', 2))) == dict((('b', 2), ('a', 1)))
 Out[11]: True

smegel 4 ago 2 replies      
> "Preserving the order of kwargs in a function"

"What are you trying to achieve?"

jnbiche 4 ago 1 reply      
If you've disabled the StartCom CA due to concerns about lack of transparency[0] and are therefore unable to view pages like this one, you can always click the "web" link above and then view the cached page on Google.

For convenience, that link is:


0. https://news.ycombinator.com/item?id=12411870

Edit: To be clear to the downvoters, this has nothing to do with Python, other than they're using the StartCom certs. Not a criticism of Python.

Getting started with Raspberry Pi Building a Digital Photo Frame paulstamatiou.com
176 points by jimmcslim  13 ago   50 comments top 21
smartbit 25 ago 0 replies      
Elderly people love, really enjoy, their Nixplay[0]. I call it "Facebook[1] for grannies". Putting the names of people below the pictures helps people with mild forms of dementia to repeat & remember the names of their loved ones.

The Nixplay device is quite good; regretfully the UX of their website is horrendous. And there I see a problem. If someone were to build a self-hosted version of "Facebook[1] for grannies" with a Raspberry Pi: open source projects are not known for their excellent UI/UX designs.

Time will come, and given the feedback I hear on the Nixplay, I expect that one day we'll have these devices all over the houses of the elderly, either connected to a special screen or connected to the TV set.

[0] nixplay.com

[1] replace with Instagram, Flikr or your favorite photos sharing app/site

Animats 9 ago 3 replies      
An E-ink digital photo frame you could reprogram and that ran on batteries with a long life would be nice. Vikaura announced one in 2015, collected money for a way oversubscribed Kickstarter, and then didn't ship. They're still taking pre-orders with a ship date of August 2016. They last updated their "news" on Feb 04, 2015, when the Kickstarter was funded. Looks like they took the money and ran.[2]

Their address was 566 Alpha Drive, Pittsburgh, Pennsylvania. At that address now is a new startup called PowerHarvester. Hm.

[1] http://www.vikaura.com/

[2] https://www.kickstarter.com/projects/1658373341/vikaura-scre...

shriphani 11 ago 4 replies      
Very cool! a couple of friends and I put this together for our living room: http://blog.shriphani.com/2016/08/03/a-frame-that-listens/

It is a pi + a condenser mic which generates visualizations in response to sound.

jaboutboul 39 ago 0 replies      
you don't even need to run the gui and do all the greasemonkey stuff, it's overkill. just set up fbida (https://www.kraxel.org/cgit/fbida/), it's available packaged in most distros, and you can set up the slideshow right from the command line using the framebuffer to show the images. I've built several picture frames this way.
spdustin 10 ago 1 reply      
I've been meaning to do this sort of thing for a while, and after reading the comments here, I've decided to embrace the glow from the display when I make my frame. Using a strip of addressable LEDs, you can emit a glow around the frame that extends the colors visible along the edges of the photograph beyond the frame, like the old Philips Ambilight.

Here's one Instructable I found with one idea for an Ambilight clone:


jsingleton 6 ago 2 replies      
Displaying a full screen web page on a Raspberry Pi is such a common use case, I'm surprised there isn't built-in support. Something really lightweight based on RISC OS (like NOOBS is) would be great.

I wrote up some of the options at https://unop.uk/adding-basic-authentication-to-screenly-ose but they all seem to run on a full Rasbian stack.

jablan 5 ago 0 replies      
Does it really have to be such high resolution display? I gather that, as the observer gets further from the display (and this is a photo frame, usually looked at from afar), the resolution gets less and less important.
monochromatic 8 ago 1 reply      
I hate that it's so hard to find a high-resolution display with a suitable aspect ratio for photos. 16:10 is way too wide.
Sn4p 8 ago 0 replies      
Fantastic project! I have wanted to do something like this for a while... think I will try to copy yours a bit :-) All we need now is a sensor to turn it off/on automatically depending on whether people are in the room or not.

Advanced bonus points:

1) (simpler version) Connect to a calendar service, and match photos to the calendar (e.g. you have a visit from your brother on Saturday, so show photos of him and your kids that day).

2) (more advanced) Use a camera to recognize people in the room, and base photos on that (I call this the perfect host mode, where a perfect host would find photos from the basement and put them on the mantelpiece before a friend comes to visit, making that person feel special).

So many things that could be cool here!

NhanH 11 ago 1 reply      
I've wanted to build something similar lately for an always-on... todo list (mostly because I'm trying to make a habit of looking at my todo list). I'm looking for a screen of similar size or a bit bigger than 10", with the caveat that it has NO backlight. Can anyone suggest a screen like that? E-ink would be even better.
Mao_Zedang 12 ago 3 replies      
This is a really nice site, something about the design.
shakeel_mohamed 11 ago 0 replies      
Interesting timing, I actually want to do this with my Pi (but wired). The only thing holding me back is the cost of the display - any suggestions for a cheaper non-touch display?
Tempest1981 7 ago 0 replies      
Any software suggestions for displaying photos from a local network share, or DLNA?
LAMike 12 ago 0 replies      
This project is awesome. The stand that came with the screen made it seem like a good desktop option too
SubiculumCode 11 ago 0 replies      
Nice write-up...gave me some ideas, although I am still new to my Raspberry Pi.
jonathankoren 11 ago 3 replies      
Not to knock this project, which seems well executed, but I'm always a bit ambivalent about digital frames. Sure it's kind of neat to be able to have changing images, or at least be able to easily push new images to the frame, but they're always so impractical.

You either have a wire you have to hide, or you have to remember to charge the frame regularly. All that costs energy. Then of course it's always backlit, which seems odd. Do I really want a glowing rectangle on my walls? Not really. It kind of kills the vibe. Perhaps eink would be better, but beating dyed paper is hard.

chidea 11 ago 0 replies      
How about it with GIF animation? https://github.com/chidea/FBpyGIF
yingnansong 6 ago 0 replies      
This is impressive! Good job!
stop1234 8 ago 0 replies      
Images broken on the site...

Edit: now working...

mschuster91 12 ago 1 reply      
Nice! I wonder whether it'd be possible to use the RPi's DSI connector instead of HDMI?
honzajde 8 ago 0 replies      
Inexpensive Pi, expensive display, expensive solution. I don't get it. At least it made you happy.
Dell.com make sure you read the terms pastebin.com
22 points by lsiunsuex  55 ago   4 comments top 4
kipdotcom 9 ago 0 replies      
Dell EqualLogic and EqualLogic-branded products, Dell|EMC, EMC and VCE-branded products, Dell Compellent and Compellent-branded products, Dell KACE and KACE-branded products, Dell Force10 and Force10-branded products, PowerVault ML6000 tape libraries, PowerVault DL and DR products, Dell SonicWALL and SonicWALL-branded products, Dell Wyse and Wyse-branded products, Dell Quest, Quest, ScriptLogic and VKernel branded products, Dell Software branded products, Dell Data Protection | Rapid Recovery and Dell Data Protection | Rapid Recovery branded products, Dell StatSoft and StatSoft-branded products, non-Dell-branded enterprise products, enterprise software, and customized products may not be returned at any time.


DanBlake 0 ago 0 replies      
Did you buy it with a credit card? Most of them have additional protections that let you fix situations like this. Not sure if it applies to returns, but for instance I know you get a guaranteed extra warranty past the manufacturer's with anything you buy with your Amex, etc.
celticninja 4 ago 0 replies      
This would be illegal in the EU. The US needs better consumer protection laws.
Silhouette 5 ago 0 replies      
I sympathise, and I appreciate someone raising awareness of the situation. Perhaps the negative attention will prompt Dell to review this policy.

That said, is it really normal to expect to be able to return a non-faulty product once it's shipped, or unreasonable of Dell to refuse such returns if they don't want to sell on that basis?

In my country (the UK) there are consumer protection laws that include various rights to return goods purchased remotely for a short period even if they aren't defective. In principle, these are supposed to allow for not being able to inspect the actual product the way you could in person if you bought from a store. However, for better or worse, business customers do not enjoy the same legal protections as private individuals, and a lot of potentially one-sided terms that would be unlikely to stand up in a B2C context because of statutory protections seem to be routine in B2B sales.

This Bubble's Got Legs bloomberg.com
95 points by petethomas  11 ago   66 comments top 12
cs702 31 ago 3 replies      
According to the OP, we have a global "central-bank-led cash bubble" powered by "an ever flowing money hose."

If you believe interest rates are being kept "artificially low" (whatever that means) by the "money printing" of central banks like the Federal Reserve and the Bank of Japan, then you will agree with the OP. In this view of the world, central banks are contributing to our current economic malaise: by keeping rates artificially low, central banks are causing asset prices to increase, making investment in productive endeavors less profitable for businesses and individuals.

If you believe interest rates are low primarily because businesses and individuals worldwide want to hoard cash (as a way to protect themselves against, say, potential deflation or insufficient aggregate demand), then you will disagree with the OP. In this view of the world, central banks are doing everything they can to motivate businesses and individuals to invest more in productive endeavors as opposed to hoarding cash. Yet even when rates turn negative, individuals and businesses still want to hoard cash, as evidenced by their demand for negative-interest instruments.[1]

[1] http://foreignpolicy.com/2016/09/07/the-weird-new-normal-of-...

habosa 1 ago 0 replies      
If you're interested in monetary policy and a fundamental analysis of the state of money and banking in the world today, I highly recommend "The End of Alchemy" by Mervyn King [0].

The book starts with a long history of why we have money at all, why it takes the forms it takes, and how banks evolved into their current role. It then goes on to describe why there is so much inherent risk in our banking system and what we could do to reduce or eliminate it.

As someone who follows financial news but has no formal education in finance, I found the book enlightening and extremely readable.

[0] - https://www.amazon.com/End-Alchemy-Banking-Future-Economy/dp...

holdenc 24 ago 0 replies      
I am reminded of when I worked on a financial analytics app and we constantly had to remove the Zimbabwe stocks. They'd all gone up 1000's of percent and ruined all the other data on the chart (appearance-wise). Why? As the Zimbabwe currency lost value and became worthless, the stocks denominated in that currency held their real value. I would expect to see some version of this play out in the stock market today.
Wildgoose 6 ago 3 replies      
The financial problems originated with excess debt, but debts so large that they cannot be paid back will not be paid back.

Inflating assets via QE doesn't really address the real underlying problem.

The only long-term solution I have seen is Debt for Equity swaps. It won't work everywhere, but it does help.

For example, when the banks were technically insolvent, the opportunity should have been taken to forcibly re-arrange their financial underpinnings, rather than shoring up the existing rickety structures by using taxpayer money to, in effect, bail out the rich.


vadym909 51 ago 3 replies      
This is scary, as I moved to all cash a couple of years ago. I still don't understand how there can be a bubble in money. All I see around me is house and stock prices going up incredibly. Does a money bubble popping result in high inflation, making my money in the bank worthless? Fuck. First I got burned in stocks, then in real estate, and now cash?
mangeletti 21 ago 0 replies      
Some background for those concerned with inflation:

The US dollar is backed first and foremost by its global reserve status, which it gained under the 1944 Bretton Woods system and kept even after gold convertibility ended in 1971. As long as nations need US dollars to purchase crude oil, the US dollar will continue to be globally valuable.

Secondly, the US has a powerful military, including hundreds of essentially forward operating bases and air fields, lying in wait, all over the world.

Third, and as a secondary property of our military and economic strength, many countries are owed many billions of US dollars. This is because, due to the strength of the US military, T-bills (essentially bets made on the US dollar) are purchased by many nations. When other countries bet on the US dollar, they're creating a vested interest in the US dollar not collapsing, which increases the US leverage (leading to things like money printing).

None of this is meant to make you too comfortable, because, despite all of the above, China, Russia, Libya, Iran, Brazil, South Africa, and, at times, India have been attempting to reduce US dollar hegemony by creating other means of trading oil.

For instance, Libya's late leader, Muammar Gaddafi, created a gold-backed currency for trading oil (this is the reason NATO invaded Libya and killed Gaddafi). Russia and Iran made agreements to trade food (from Russia) for oil (from Iran), which Russia would then add to its exports, acting as a trade proxy for Iran. This, combined with Iran's efforts to pipe natural gas and oil to China, is the reason for the supposed "Iranian nuclear threat". China has also recently created the Shanghai Gold Exchange, as a means to further commoditize fiat currency (China has very large gold reserves). BRICS has created a competitor to the Western-dominated IMF in the past 24 months, which also threatens the US reserve currency's backup plan, the SDR (Special Drawing Rights). The list goes on; not to mention the calls Xi Jinping has directly made to remove the US dollar as a reserve currency (IOW, this isn't exactly covert, as of about mid-2010).

As you can see, considering our posturing in the South China Sea, and considering what happened to Gaddafi, the US won't acquiesce to losing its reserve status. So, the US dollar is not going to "collapse", lest we find ourselves engaged in total war (with China, et al). So, as long as you don't plan to bail on the US in a total war scenario, you're, in my opinion, safe to invest in USD.

steeleduncan 8 ago 2 replies      
The graphs offered as evidence seem to have very carefully picked axes to fit the story. E.g., looking at the first MSCI graph with the x-axis extended back to the 60s[1], it is hard to draw the same conclusions.

[1] https://en.wikipedia.org/wiki/MSCI_World#/media/File:MSCI_Wo...

rtpg 8 ago 3 replies      
So my vague understanding is that QE and stuff is technically printing money, but the resulting money isn't really being used for anything.

Is a bubble a bubble if the extra bubbly-ness is from stuff that doesn't get used?

EDIT: actually, answered my own question. QE raises asset prices (even if the sellers aren't really doing anything with the money), and at some point people will be like "wait, none of this is worth the price it's at" and then panic (unless QE Infinity).

cardmagic 50 ago 0 replies      
The forest is dry, just waiting for a black swan. The problem with black swans is that they are, by definition, unknown unknowns... until they are not any more.
randomgyatwork 1 ago 3 replies      
Isn't it always the case before a bubble pops that everyone says the bubble won't pop?
lifeisstillgood 8 ago 4 replies      
So, the article says dotcoms caused the first bubble in 2000 and houses the one in 2008, but this one is driven by central banks printing money. And since there is no way they will stop printing, this is an infinite bubble.

There is sooo much wrong with that. As @rtpg says, the bubble stops not when the Fed stops printing money but when the market realises that the high asset prices "globally" have become obviously unsustainable.

The issue is that when that happens, it happens across the world simultaneously, which is why the article refers to the complacency index. (It is possible the article is more sarcastic than I could read into it.)

squozzer 2 ago 1 reply      
The conclusion makes sense only if every major economy participates. What about China and Russia? Granted, I doubt investors today trust either country without qualification, but if this goes on another 5 or 10 years, who knows?
Louisiana Judges Issue Harsher Sentences When the LSU Football Team Loses theatlantic.com
71 points by fraqed  3 ago   49 comments top 5
jdminhbg 2 ago 6 replies      
A lot of the discussion around this seems to focus on football, but that's just the easily-measurable proxy for a bad mood the authors found. Presumably an argument with a spouse, a poker game loss, a parking ticket, or a bad interaction with a coffee clerk would have the same effect. It's disheartening that justice is so capricious, but setting sentences through the legislature seems even more fraught. I don't have a good solution.
romanovcode 36 ago 0 replies      
It's also been shown that judges hand down harsher sentences just before lunchtime (because they're hungry).
kldaace 1 ago 1 reply      
I'm going to wait for more replication before I adjust my priors too much on this. It seems plausible prima facie, but I'm suspicious of their methodology. From the article (I don't have access to the working paper), it sounds like they might have been trying to test the significance of a lot of different variables, and this might just be the one that popped up.
anexprogrammer 1 ago 2 replies      
Given there's research showing that harshness increases until lunch, drops markedly, then increases again over the afternoon, this isn't terribly surprising.

We're not likely to eliminate these factors any time soon.

nxzero 40 ago 0 replies      
I once had a very surreal experience while in court, waiting to argue a motion objecting to a higher court's jurisdiction over a case; my motion was granted, and the case was sent to the lower court.

While waiting over an hour for the above motion to be heard, the person directly behind me was calling how the judge would respond to various matters that required the judge's opinion. Having nothing to do, since mobile devices weren't allowed, I listened to the predictions made and the outcomes. The predictions were wrong most of the time.

As I stated earlier, my motion to send the complaint back to the lower court was granted, and it turned out that the person behind me was the judge of the lower court.

Yes, the lower-court judge was unable to predict the higher court's judgements the majority of the time.

To me, if this holds true statistically across the legal system in a blind test of judges, it is a major issue; normally, judges don't want to publicly rule against another judge unless it's obvious the other judge was wrong.

The strange end to my experience was that I won in the lower court and the other party failed to appeal the judgement in a timely manner, meaning the higher court granted my motion to dismiss the appeal because the opposing party filed it one day late. Always wondered what would have happened if the higher court had heard the case and issued a judgement; lucky that never happened.

Elevated error rates on applications and deployments heroku.com
86 points by jorrizza  2 ago   60 comments top 15
stpe 2 ago 2 replies      
I'm currently at the Nordic.js conference http://nordicjs.com/ (which... is down right now). Suddenly half the audience started looking at mobile phones and unpacking laptops.

Poor timing for the current speaker...

aquilligan 2 ago 1 reply      
Down for almost an hour now. Haven't seen anything of this scale happen in 18 months or so of running our apps on heroku. Looking forward to the postmortem from Heroku (and hopefully my app being up again).
nateguchi 2 ago 1 reply      
It's more than some apps... all of ours (~70) are down, and Deliveroo looks like it's also having issues.
fphilipe 2 ago 1 reply      
It took them 15 minutes to update their status from the time of our last received request.
gryzzly 2 ago 7 replies      
What is a good alternative for static single-page apps? What are downsides of S3?
babgyy 2 ago 2 replies      
What options would you consider if Heroku went down permanently?
guy_c 2 ago 0 replies      
Uptime Robot is reporting my sites went down at 12:55 UTC, so they've been down for 35 minutes.
cocoflunchy 2 ago 0 replies      
It's been down for more than half an hour now... don't understand how it took them so long to update their status page.
danielstocks 2 ago 1 reply      
I got my first SMS alert notification from Pingdom about 55 minutes ago. One or two requests managed to get through to the application, per the logs, in the last 10 minutes, but it's still mostly down.
pluma 2 ago 4 replies      
I'm not sure why, but I was under the impression that Heroku had been shut down months ago.

I may have been thinking of Nodejitsu (which has been acquired by GoDaddy). Did Heroku change owners at some point or am I imagining things?

dbuxton 2 ago 0 replies      
Seems to be coming back up now, albeit a bit intermittent.
fredrivett 2 ago 1 reply      
And we're back.
nateguchi 1 ago 0 replies      
Back Online!
scrown 2 ago 0 replies      
Up now
benmmurphy 2 ago 10 replies      
This is kind of offtopic, but what is the attraction of using Heroku over using EC2 directly now? I remember back in the day, when AWS didn't have RDS or Elastic Beanstalk, Heroku was an attractive option because you could deploy and scale without needing to do any kind of system administration.

But now AWS offers managed databases through RDS, and Elastic Beanstalk gives you Git-style deployment similar to Heroku, so I don't see what Heroku is offering other than another point of failure and another set of security problems. It looks like Heroku uses Linux containers for isolation. So not only do you have to worry about someone attacking the underlying EC2 VMs Heroku uses, but you also have to worry about the tenants collocated on your Heroku VM attacking you through the Linux kernel.

NUMA-aware scheduler for Go docs.google.com
89 points by signa11  12 ago   19 comments top 4
scott_s 1 ago 0 replies      
The first listed risk is why I shy away from solutions that depend on pinning threads to logical processors:

Several processes can decide to schedule threads on the same NUMA node. If each process has only one runnable goroutine, the NUMA node will be over-subscribed, while other nodes will be idle. To partially alleviate the problem, we can randomize node numbering within each process. Then the starting NODE0 refers to different physical nodes across [Go] processes.

Basically, your particular runtime system is probably not going to be the only thing running on a host. And even if it is, the kernel itself may choose to run things on particular logical processors, and it may not take into account what pinning you have done. For that reason, I find these approaches brittle. If your users know exactly how they're going to deploy applications (not you, since you're implementing a runtime system for user code), they can squeeze out some more performance, but all it can take is one extra process running on that host to mess it all up.

That's the difficulty with implementing runtime systems, and not applications: your runtime system has to work for (usually) arbitrary user code on (usually) arbitrary systems. If you're writing a single application, and you know exactly how and where it will run, thread-pin away. But when implementing a runtime system, you don't have that kind of luxury. You often have to leave performance on the floor for a small number of cases so that you don't hose it for most cases.

In principle, I think this kind of scheduling should be handled by the operating system itself. If the kernel does not have enough information to do it properly, then we can identify what information it would need, and devise an API to inform it. But the kernel is the only entity that always has global knowledge of everything running, and controls all of the resources. I find that a much more promising direction.

As some minor support, consider the recent paper "The Linux Scheduler: a Decade of Wasted Cores", https://news.ycombinator.com/item?id=11501493. My intuition is that runtime systems which perform thread pinning like this will tend to make such problems worse, since it constrains the kernel scheduler even more.
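The affinity mechanism behind that kind of pinning is easy to see from user space. This small, Linux-only sketch uses Python's stdlib (os.sched_setaffinity, which wraps the same syscall a runtime system would use) to show both the pinning and why uncoordinated processes can collide on the same CPU:

```python
# Demonstration of the OS-level mechanism behind CPU pinning: an
# affinity mask restricting which logical CPUs the kernel may run a
# process (or thread) on. Nothing stops two independent processes
# from each pinning themselves to the same CPU, which is the
# over-subscription risk described above. Linux-only: the sched_*
# affinity calls are not available on all platforms.
import os

if hasattr(os, "sched_getaffinity"):
    allowed = os.sched_getaffinity(0)   # CPUs we may currently run on
    first = min(allowed)
    os.sched_setaffinity(0, {first})    # pin ourselves to a single CPU
    assert os.sched_getaffinity(0) == {first}
    os.sched_setaffinity(0, allowed)    # restore the original mask
```

Note that the mask only constrains this process; the kernel's view of the whole machine is exactly what a per-process runtime can't see.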

billhathaway 11 ago 1 reply      
There is a new small thread[0] on golang-dev about someone from Intel looking into this. It would be great to see the Go scheduler become more aware of NUMA characteristics.

[0] https://groups.google.com/d/msg/golang-dev/ARthO774J7s/7D9P0...

morecoffee 10 ago 3 replies      
One thing I have never understood about the Go scheduler is how P's are involved. The OS (assume Linux) works with threads, and schedules threads, not processors. How does it pin a P to a processor, or in this case a NUMA node?
iends 11 ago 1 reply      
This was proposed a few years ago but it never got any traction it seems.
How the Blind See the Stars nautil.us
31 points by dnetesn  5 ago   6 comments top 3
pluma 3 ago 0 replies      
Slightly off-topic but at first I was going to complain that someone who can see some light and shadow isn't "really" blind. But this is actually a good reminder that impairments aren't always binary. He was blind for all intents and purposes, even if he can see some brightness contrasts some of the time.

This is especially important to remember when dealing with accessibility: even if it may not be possible to offer 100% of the experience to everyone, making things a bit more accessible can still be a huge win already.

Not to mention that some forms of disability can be situational. There is little practical difference whether you can for example only use one hand because you only have one arm, because one of your arms is broken and bandaged, or because you're carrying your groceries.

frobware 2 ago 1 reply      
I'm blind (at night). I would dearly love to see the stars. In fact, this is pretty close to #1 on my bucket list.
amelius 1 ago 1 reply      
> His surgically enlarged iris allows the telescope to focus images directly onto his retina, sensitive to ultraviolet and infrared frequencies that normal lenses would filter out.

If a telescope can do that, then perhaps there are other devices that can do the same for images of different origin (?)

Rare Footage of Pallas's Cat Cubs in Mongolia's Zoolon Mountains nationalgeographic.com
69 points by pshaw  10 ago   10 comments top 6
haversine02 4 ago 2 replies      
Pallas cats are one of my favourite animals. I've had the luck of seeing a very vocal and grumpy one in person. They also have a very unique low meow, which also sounds like they're constantly pissed off:

http://www.bioacoustica.org/gallery/sounds/Felis_manul1.wav

https://youtu.be/DKmBt9sA9hk
dahjelle 1 ago 0 replies      
If you are ever near Fargo, ND, the Red River Zoo[1] has 3 that you can come see in person! :-)

[1] http://redriverzoo.org/animals/pallas-cat/

personlurking 8 ago 1 reply      
There's an April 2016 BuzzFeed video [1m30s] with facts about this cat. You might want to mute it (electronic music overlay).


rwmj 1 ago 0 replies      
Hunted for their pelts. Don't buy real fur coats.
xarope 7 ago 0 replies      
2 years ago I found out about sand cats (google them, the ears are amazing). But these Pallas cats are positively prehistoric looking!
homero 9 ago 1 reply      
Cats can see infrared
Turns websites into Markdown fuckyeahmarkdown.com
91 points by david90  6 ago   60 comments top 16
lsiunsuex 4 ago 14 replies      
I'm all for cursing in private conversation with friends and such when it's appropriate - but why do people continue to think it's acceptable in public examples of work? Will this go on someone's resume like this? What will an employer think?

I'm not questioning freedom of speech or self expression or anything like that. I'm simply saying, if you build a tool / live example / whatever and intend to show it to people - you probably should try to keep it professional.

As for the actual tool: cool? Markdown is great amongst programmers, but ask a random internet user what Markdown is and they'll look at you like you have 3 heads. I've never understood the use of this outside of programming, in things such as content editors.

laurent123456 4 ago 2 replies      
For his own website I'm only getting:

 [Source](http://fuckyeahmarkdown.com/ "Permalink to Fuck Yeah Markdown") # Fuck Yeah Markdown

lj3 1 ago 0 replies      
I tried this years ago (it's been around for quite some time) and found better results with his Marker[0] bookmarklet. It only converts whatever is selected on the page. That eliminates one of the two issues I had with the markdownifier, namely its inability to figure out what to keep and what not to keep on the page.

The second issue I have affects both. Neither locally downloads linked content. Not having the images come down with the article means if the article were to go away, your local copy would be worthless unless you manually saved those pictures and altered the links pointing to them.

[0]: http://brettterpstra.com/2013/12/22/marker-web-selections-to...

emillon 2 ago 0 replies      
You might want to filter local and unroutable URLs, for example entering renders a MAMP Pro page.
sandGorgon 3 ago 2 replies      
This is potentially a big idea. There are hundreds of jobs on Upwork like "convert my old WordPress site to markdown/jekyll" or "convert my old Drupal site to markdown/jekyll".

Even if you manage to convert a WordPress site to Jekyll (with all images, formatting, etc.)... you are onto something big.

I would pay $20-30 for a one-time conversion of a few dozen pages.

leeoniya 1 ago 0 replies      
I made this a few years ago:


bronlund 3 ago 1 reply      
Bah. It can't even do http://www.arngren.net/
_nalply 3 ago 1 reply      
It works on Quora articles even with math formulas. I'm going to use this tool to export my own Quora content. Thanks a lot for the pointer!
tammer 1 ago 0 replies      
I want something that's halfway between links/w3m and Readability/reader view: basically a version of Readability that's an actual browser.

Maybe one day lightweight design will be trendy.

pmlnr 3 ago 0 replies      
Pandoc[1] as a service?

[1]: http://pandoc.org/

fenollp 4 ago 2 replies      
What is the algorithm behind this? Does it attempt to find two similar pages of the same website and generate Markdown from the diff of their DOM trees?
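The site doesn't say. One plausible design is a readability-style content extractor followed by a plain HTML-to-Markdown pass. Here's a toy, stdlib-only Python sketch of that second pass (it covers only a handful of tags, and the sample input is made up; the hard part, extracting the main content from a page, is not shown):

```python
# A toy HTML-to-Markdown converter built on Python's stdlib html.parser.
# It emits ATX headings for <h1>-<h3>, blank lines for <p>, asterisks
# for <em>, and inline links for <a href=...>.
from html.parser import HTMLParser

class ToMarkdown(HTMLParser):
    def __init__(self):
        super().__init__()
        self.out = []
        self.href = None

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3"):
            self.out.append("\n" + "#" * int(tag[1]) + " ")
        elif tag == "p":
            self.out.append("\n\n")
        elif tag == "em":
            self.out.append("*")
        elif tag == "a":
            self.href = dict(attrs).get("href", "")
            self.out.append("[")

    def handle_endtag(self, tag):
        if tag == "em":
            self.out.append("*")
        elif tag == "a":
            self.out.append("](%s)" % self.href)

    def handle_data(self, data):
        self.out.append(data)

    def markdown(self):
        return "".join(self.out).strip()

conv = ToMarkdown()
conv.feed('<h1>Title</h1><p>See <a href="http://example.com">this</a>.</p>')
md = conv.markdown()
# md == '# Title\n\nSee [this](http://example.com).'
```

A real converter additionally has to deal with nesting, block-level whitespace, images, code blocks, and all the tags this sketch ignores.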
fiatjaf 2 ago 0 replies      
Output is not good at all for http://fiatjaf.on.flowi.es/, which is the simplest HTML I could get.
Horba 4 ago 0 replies      
I feel validated that my markup must be sane, considering a generic utility can parse the HTML into Markdown.
AndrewVos 3 ago 0 replies      
Cool, then you can put them directly into my new app: https://anmo.io :)
upofadown 2 ago 0 replies      
Is this better than pandoc?
xyzzy4 2 ago 1 reply      
Why do front-end engineers like to use the word "fuck" so much? Just something I've noticed over time.
GPL Time Bomb an interesting approach to FOSS licensing boyter.org
144 points by boyter  12 ago   100 comments top 26
michaelmrose 11 ago 4 replies      
This is obviously much better than proprietary, as people can potentially benefit from your work in the future, and I thank you for being so kind as to share it. Yet I would probably use something significantly crappier to have real open source, and several other things seem problematic.

-- Others are unlikely to contribute to the proprietary fork.

-- What do you do about contributions to the GPLv3 version? They would be copyrighted by their creators and couldn't be part of a closed-source version, even by the authors, without copyright assignment.

-- Do you backport fixes to the GPL version? It would be far simpler to say "use the new version, wherein issue foo is fixed." If you do, it's a lot more work; if you don't, the GPL version isn't just an older version, it's a known broken POS that will never be fixed.

-- What if people offer fixes to the GPL version? Will they be expected to assign copyright so you can use them in the proprietary version?

-- 3 years is an eternity. In 1 year someone could make their own, actually open-source version of whatever it is you are doing; if it was really worth having, that version would also have none of the above complications.

randoramax 49 ago 4 replies      
You have another option that keeps the GNU GPL as the license and avoids this time-shift complexity: sell the binary, and offer the source only to those who buy the binary; don't make the source code public on GitHub. It's not only possible, it's what sustained many GNU projects until they were acquired by others.

Nobody in the comments mentioned the most successful case: Cygnus Solutions developed the GNU Compiler Collection (GCC) for good money, all under the GNU GPL. Notably, they made a lot of cash from a deal with Sony to support the original PlayStation in GCC.

A good read about how they marketed Free Software on John Gilmore's pages: http://www.toad.com/gnu/cygnus/

tibbetts 11 ago 1 reply      
RapidMiner was doing something like this a few years ago. https://rapidminer.com/news-posts/rapidminerembracesitscommu... I haven't followed closely but I think they gave up on the complexity and started offering the same product as either AGPLv3 (a hostile license for the kind of customers who pay for software) or commercial license. A straight dual-license model reduces confusion about what is in which edition. If you are going to have an open and a closed product you really have two products, which should ideally have two separate reasons for existing. Just being the old version is not the right kind of difference.
anilgulecha 10 ago 6 replies      
Language matters.

The idea is neat, but "time-bomb" gives it a negative connotation. Instead maybe call it anti-abandonware licence, or public-benefit timer, or something else. (I'm sure HN can come up with a better phrase).

pwang 10 ago 3 replies      
I applaud the spirit of this, i.e. in trying to lay out a covenant with the community while still being entrepreneurial about building a commercial business to fund continued innovation and maintenance of the software.

The problem is that it falls into an outdated framing of the value of software. Namely, as more people around the world have gained significant expertise in coding, the skillset needed to produce software is commoditizing. That skilled labor used to be scarce, which meant that the fruit of that labor was scarce, and therefore the artifact itself was the thing to monetize.

Creative and clever licensing around software can "protect" software from certain kinds of things, but they cannot protect the software from being commoditized. And commoditization pressure is very high right now. Over the last 10 years, OSS has permeated into real enterprise adoption, and many of the deep reasons for enterprises to adopt open source software are tied precisely to those commoditizing factors.

So to build a sustainable business around software, you have to tie your price to the actual business value. In our modern times, when the skills to produce software are abundant, the thing that's valuable, then, is creating alignment on your API or framework or service, by earning the greatest mindshare among the collective labor pool that can maintain and innovate on software. And there's a huge virality to that, too - the game of developer mindshare is a winner-take-all game.

All this is to say: don't worry so much about defending the software, because that's not the valuable thing. The community around your software is what determines whether it is relevant 3 years from now. And having weird/new/different licensing terms can stifle the growth of that community.

perlgeek 18 ago 0 replies      
I for one welcome this experiment.

Making money from Open Source is hard, and while this isn't quite about making money from Open Source, it's still an interesting experiment in this direction.

TallGuyShort 1 ago 0 replies      
I like the intention, but a few thoughts:

- If you're dead, to whom have you legally clarified copyright should go? No one but that person can enforce the GPL, because the foundation of enforcement is that it's otherwise copyright violation.

- IIUC, new releases don't reset the date of old releases, so old releases become fair game after a few years. The reason that doesn't help people is that most of the value is in what you're doing right this very minute and your ability to support that today. That is why companies can make money from existing licenses. They're paying you for support, paying you to develop new features that matter to them, etc. But at the same time they can accept collaboration and contributions from even competitors right now. I think you're losing most of the benefits of open source for making money despite the time window.
majewsky 3 ago 0 replies      
There is something similar for Qt. Back when Qt was licensed more proprietarily (with a GPL option for pure FLOSS), the KDE folks were worried that Trolltech could just release new versions as proprietary-only. Therefore, an agreement was reached and codified that, should Trolltech fail to do an open-source release of Qt for more than 12 months, the last release shall fall under the BSD License to facilitate a community fork.
ubertaco 3 ago 0 replies      
My gut reaction to the title was to assume (based on the negative connotation of "time bomb") that you had some code that was permissively licensed (something user-friendly like MIT or BSD), and then when the bomb "goes off", it switches to GPL, rendering it unusable in most commercial software.
bitL 4 ago 1 reply      
If you really want to make money, it's probably not the best idea to put a fixed time into your time bomb, unless you are targeting a fast-moving platform like mobile OSes. At some point your previous version will be sufficient for 99% of people and any new features will be just nice-to-haves. Maybe you can consider per-feature time bombs? You'll still have time to build a sustainable business on top of it, and still allow people to study/modify your code, but with greater control over your cash flow.
tytso 1 ago 0 replies      
I made a very similar proposal back in 2003, which I called the Temporary Proprietary License, or TPL:


gioele 3 ago 1 reply      
I have been toying with a similar idea, but for born-FOSS source projects.

I like copyleft licenses but I find the legalese in them too complex, and I fear that they will cause problems down the road because of changes in the legal landscape. I am also worried about leaving code around that is useful but unusable by others for legal reasons. For all these reasons I would like to license my projects like this:

 * They are born AGPLv3/OHL.
 * All the published revisions older than 5 years are released into the public domain (CC0).
Simple and easy to state, incredibly complex to write down. Especially if you want to make it enforceable without me (or the other contributors) being alive. From what I understand, the concept of a "legally binding promise" is even fuzzier across jurisdictions than that of a "simple" copyleft license.
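Whatever the legal difficulties, the "older than 5 years becomes CC0" rule is at least mechanical to state. A minimal Python sketch of the computation (the tags and dates below are made up for illustration):

```python
from datetime import date, timedelta

def effective_license(release_date, today, window_years=5):
    """Revisions published more than window_years ago fall to CC0 (public
    domain); anything newer keeps its AGPLv3 license."""
    cutoff = today - timedelta(days=365 * window_years)
    return "CC0" if release_date < cutoff else "AGPLv3"

# Hypothetical (tag, release date) pairs for a project's published revisions.
releases = [
    ("v1.0", date(2010, 3, 1)),
    ("v2.0", date(2013, 6, 15)),
    ("v3.0", date(2016, 1, 10)),
]
today = date(2016, 9, 9)
licensed = {tag: effective_license(d, today) for tag, d in releases}
# licensed == {"v1.0": "CC0", "v2.0": "AGPLv3", "v3.0": "AGPLv3"}
```

The code is the easy part, of course; the hard part the comment describes is making that promise bind future copyright holders.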

Scaevolus 11 ago 1 reply      
Unfortunately, this doesn't support regular expressions, which is a very useful feature for a code search engine. See https://github.com/google/codesearch for one implementation of this.
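The trigram idea behind google/codesearch is compact enough to sketch. This toy Python version handles only literal-substring queries; the real engine additionally compiles an arbitrary regex into a boolean query over trigrams before scanning the surviving candidates:

```python
from collections import defaultdict

def trigrams(text):
    """Every overlapping 3-character substring of text."""
    return {text[i:i + 3] for i in range(len(text) - 2)}

class TrigramIndex:
    """Toy sketch of trigram indexing: map each trigram to the set of
    documents containing it, then answer a literal query by intersecting
    posting lists and confirming hits with a real scan."""

    def __init__(self):
        self.postings = defaultdict(set)
        self.docs = {}

    def add(self, doc_id, text):
        self.docs[doc_id] = text
        for gram in trigrams(text):
            self.postings[gram].add(doc_id)

    def search(self, literal):
        grams = trigrams(literal)
        if grams:
            # Only documents containing every trigram of the query can match.
            candidates = set.intersection(*(self.postings[g] for g in grams))
        else:
            candidates = set(self.docs)  # query shorter than 3 chars: no filtering
        # Verify candidates with an actual substring scan.
        return sorted(d for d in candidates if literal in self.docs[d])
```

The point of the trick is that the index eliminates almost all documents before any scanning happens, which is what makes regex search over huge corpora tractable.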
ktRolster 11 ago 1 reply      
I remember Stallman once advocated a license similar to this (for things like games, so people can make money on them, then release shortly after. His timeframe was more like months though iirc).
Arnt 8 ago 1 reply      
This was done twenty years ago with Ghostscript, google for AFPL to see the details. The author made enough to retire, so it can work for some people some of the time.
PieterH 9 ago 0 replies      
Proposed by MySQL's Monty some time ago under the moniker "Business Source".
IANAD 5 ago 2 replies      
I'm not sure if this will work as it is currently, and here's why:


Clearly shows that it is released under the Fair Source License, version 0.9 with no mention of license termination and restricted rights. It is NOT free.


then states:

'After the following date 27-August-2019 you may choose to use this software, version "1.2.3" or "1.2.4" under the GNU General Public License Version 3 with terms specified at https://www.gnu.org/licenses/gpl-3.0.txt'

This is so wrong. Not that it is GPL, but this both contradicts the license and states that you "may choose to use this software..." under the GNU GPLv3. Also, the missing comma after "1.2.4" is a mistake in grammar.

I'd say, as-is, you should consider this licensing unreliable. Yes, he posted in his blog stating his intention, but as it is now, it would appear that if a ruling had to be made on the actual license, it could very well be that the only valid license is the one in LICENSE.txt.

Since that is unclear, though, if I were a user and cared about the license, I'd get a good lawyer to figure this out before making any assumptions.

And, I would suggest that he get a lawyer to help craft the required changes to his license.

Illniyar 9 ago 1 reply      
I think that having a license that is both open source and that protects the creator's ability to earn money from his creation (thus paying for its maintenance, really) is a very good and important idea.

Personally, though, I wouldn't go with the TimeBomb idea. I think that if I ever get the chance, I'll decide on a single expected revenue stream (such as SaaS hosting) and provide an open source license for use except for that purpose.

martingxx 7 ago 2 replies      
I interpret this as "I don't want to collaborate with you people yet - I want this all for myself because I think I can make more money that way. But in case I fail, I'll change my mind and hey let's work together!"

No thanks.

GNU GPL is an excellent choice for a license for a commercial product if you do it right. It fosters collaboration and community from the start. Look around at the most successful companies whose major products are open source, especially the GPL ones. Note that none of them started off refusing reciprocal collaboration with potential contributors.

hsivonen 10 ago 1 reply      
IIRC, Ghostscript used a scheme like this prior to 2006.
mjevans 11 ago 1 reply      
This is an interesting take on licenses that turn 'abandon-ware' into public domain, or close to public domain, code when the maintainers are unresponsive.

I like how it mandates that there are 3 years for innovation to occur within. However I feel like the license before that needs provisions for things like the ability to make local changes as needed, and also to publish /security patches/ freely.

Edit (Beneath):

The impression I got was that he has ONE version, and a built in escape clause for anyone using it. I took the view of this licensing scheme ensuring he'd have to 'compete' against a 3 year old version of the product with compelling features; not that there'd be an intended 'free' version of the product.

burgerdev 11 ago 1 reply      
a) Site is down, google cache is here http://webcache.googleusercontent.com/search?q=cache:NBcmXVo...

b) The license file on GitHub does not mention GPL, a time frame or something even remotely related to the post - looks like some sort of shareware license.

c) If it's a good product today, will it still be a good product in 3 years? In today's version? I doubt it.

Still better than just selling binaries, but not by much.

kijin 9 ago 1 reply      
I'm planning to do something similar with some of the source-available proprietary software that I'm writing nowadays. If I die or otherwise fail to maintain the software for some time (much shorter than 3 years), the license will revert to GPL.

This kind of arrangement might be more difficult to pull off when there are multiple copyright holders or if some of them are corporations. But for projects maintained by individuals or a small group of like-minded people, I think it hits a nice balance between making a profit and giving back to the community.

m-ueberall 10 ago 2 replies      
For the record--this is basically the same as http://bit.ly/toxicBSL, http://bit.ly/bslMScl, therefore the same arguments apply.
kahrkunne 8 ago 0 replies      
I hate when people don't even try to make money off FOSS
bobajeff 11 ago 1 reply      
This almost but not quite sounds like the concept of public domain.
Hakaru Probabilistic Programming hakaru-dev.github.io
93 points by avindroth  14 ago   12 comments top 3
wcbeard10 1 ago 1 reply      
I occasionally use probabilistic programming systems, and find this project fascinating, but I've been wondering for a while what the vision is.

Is it meant mainly as a research project/proof of concept (seems I've seen elsewhere that it's funded by the DARPA PPAML project), or is it intended to become a commonly used piece of software with a community like pymc3 and stan?

eggy 10 ago 2 replies      
I was very interested to try it out, but it requires a licensed version of Maple.
calebm 10 ago 1 reply      
How does it compare to PyMC?
Cisco's Network Bugs Are Front and Center in Bankruptcy Fight bloomberg.com
88 points by sytse  13 ago   31 comments top 8
ericseppanen 11 ago 1 reply      
5-nines uptime means anticipating that the components (hardware and software) you bought from another company may fail in devious, correlated ways.

Failure to do so is simply incompetence.

Maybe the customer didn't read the fine print of their service guarantees, and that's on them. I would hope that doesn't happen often-- it would be very silly if service guarantees fell apart every time some piece of shoddy equipment (purchased and operated by the service provider) turns out to be at fault.

wbl 12 ago 1 reply      
I'm surprised Peak Web isn't responsible for bugs in its suppliers' stuff under the terms of the contract. After all, it picked them, and can switch to alternatives if they don't measure up, while the game company cannot.
imglorp 3 ago 0 replies      
> The entire network often has to go down in order to patch -- very disruptive in the best of times

This is why Erlang has hot loaded code, since around 1986. Those who xxx are doomed to repeat yyy....

PaulHoule 3 ago 0 replies      
My experience has been that Cisco hardware from the high end to the low end is crap.
walrus01 12 ago 1 reply      
Peak Web are idiots. With sufficient diversity and redundancy it is not that hard to run a five-nines ISP. A ten-hour outage is amateur hour. There are a great many terrible low-budget hosting companies and colocation operators in the market.
jobes 11 ago 2 replies      
You should be able to build a lot of redundancy into your network on $4 million a month, but that would eat into margins, wouldn't it?
X-Istence 11 ago 0 replies      
MachineZone shouldn't have put all of their eggs in a single basket. Multiple hosting providers, and write your server side so that failover can happen to other datacenters/locations almost instantly.
peterwwillis 9 ago 0 replies      
"Three people familiar with Peak Webs operations say the [10 hour] outage gave the company time to deduce that the troublesome command was reducing the switches available memory and causing them to crash."

Cisco Nexus devices log alerts for low memory/resource situations to syslog, the defaults being 85% minor, 90% severe, and 95% critical. Were they not reading their logs?
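Watching for exactly those alerts is straightforward to automate. A minimal sketch of that kind of log watching (note that the syslog message wording below is invented for illustration, not the actual Nexus format):

```python
import re

# Hypothetical syslog excerpt; the exact Nexus message wording varies by
# platform and is made up here for illustration.
SAMPLE_LOG = """\
2016-09-08T11:02:11 switch1 %PLATFORM-2-MEMORY_ALERT: Memory status: minor used 86%
2016-09-08T11:40:52 switch1 %PLATFORM-2-MEMORY_ALERT: Memory status: severe used 91%
2016-09-08T12:05:03 switch1 %PLATFORM-2-MEMORY_ALERT: Memory status: critical used 96%
"""

ALERT_RE = re.compile(r"MEMORY_ALERT: Memory status: (?P<level>\w+) used (?P<pct>\d+)%")

def memory_alerts(log_text):
    """Pull (severity, percent-used) pairs out of syslog text; paging on
    these is the monitoring the comment suggests was missing."""
    return [(m.group("level"), int(m.group("pct")))
            for m in ALERT_RE.finditer(log_text)]
```

A real deployment would of course tail the syslog stream and page an operator rather than scan a string, but the detection itself is this simple.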

Inside the fight to reveal the CIA's torture secrets theguardian.com
23 points by secfirstmd  2 ago   6 comments top 2
gragas 24 ago 0 replies      
The CIA is one of the US' greatest assets. It's not going to be "exposed" and go away. Sure, many of its atrocities might come to light and the public will be mildly upset. But the CIA isn't going anywhere.
joshuaheard 38 ago 1 reply      
"Enhanced interrogation" was not "torture". Before starting the program, the CIA asked the Justice Department to take the definition of torture from the Geneva Convention, and design guidelines to exclude any such behavior from the Enhanced Interrogation program. So, to call it "torture" in the headline, only exaggerates the description of the program and demonstrates a bias in the reporting.
FAA Urges Passengers to Not Use Samsung Galaxy Note 7 on Planes wsj.com
181 points by petethomas  12 ago   121 comments top 10
flashman 12 ago 3 replies      
Here's the actual release: https://www.faa.gov/news/press_releases/news_story.cfm?newsI...

Might as well just quote it in full:

> In light of recent incidents and concerns raised by Samsung about its Galaxy Note 7 devices, the Federal Aviation Administration strongly advises passengers not to turn on or charge these devices on board aircraft and not to stow them in any checked baggage.

kondro 11 ago 1 reply      
Australian airlines have just outright banned them.


bwang29 12 ago 5 replies      
If I understood this correctly, this is based on a higher (but unknown to consumers?) chance that the Galaxy Note 7's battery could explode during flight and catch fire. Then what would be considered a chance low enough that a laptop battery's explosion would not be a concern? Would it be safer to lock these phones into an explosion-proof box before boarding the flight?
sschueller 6 ago 2 replies      
Now I hope Samsung will bring back the removable battery. The only reason I can still use my Note 3 is that I was able to easily replace the battery with a new one.

In fact, all phones should have accessible batteries, as the battery is, in my opinion, a consumable which will need replacing at some point.

mrmondo 6 ago 0 replies      
So that's two models of phones that explode or set themselves on fire, a wall charger with illegal insulation imported to America, two models of phones AND tablets that destroy SD cards, likely due to faulty or cheap voltage regulators, and a washing machine that sets houses on fire. Not exactly a stellar record over the last 5-10 years...
josephcooney 10 ago 1 reply      
What happens after I send my Note 7 back, and get a new one with the problem corrected? Will I still be banned on Australian airlines?
peterwwillis 9 ago 0 replies      
So what's weird to me is how the FAA is issuing this notice, yet it didn't issue any notice for hoverboards; airlines took it upon themselves to ban those.
pcunite 12 ago 1 reply      
Hydraulix989 11 ago 4 replies      
prklmn 12 ago 1 reply      
"Urge."??? I think they can do better
Facebook deletes Norway PM's post as 'napalm girl' row escalates theguardian.com
254 points by mmariani  3 ago   224 comments top 35
RodgerTheGreat 2 ago 12 replies      
If you're a Facebook user and you are unhappy with the way the company strongarms, censors and manipulates its audience, the most effective way for you to express this dissatisfaction is to close your account, block social media bugs and encourage your friends and family to do the same.

Facebook doesn't care how you feel when you use their service; their bottom line simply depends on your contribution to the statistics they use to sell ads. Apathy, or even outrage, are perfectly acceptable provided you express it through channels they control and profit from.

As far as I'm concerned, as long as this conversation is couched in trying to appeal to Mark Zuckerberg's imagined sense of ethical responsibility it will lead nowhere.

mabbo 2 ago 9 replies      
I'm torn.

Freedom of speech and freedom of expression mean that the government can't put you in prison or punish you for saying or believing what you do. Facebook aren't the government, they're a private entity and don't have to host anything they don't like- including hosting photos that they don't like. It's a walled garden, and it's their walled garden, and if you don't like it you're welcome to leave.

And on the other hand: it's the only garden. If your friends are in that garden, they can't share with you, interact with you, etc, without you also being inside. Facebook's created a 'with us or not with us' distinction that has a very sharp boundary. And it's worked- they've won the social network wars. A billion people are on it.

The question is: as the social network champion, does Facebook have to keep the public's interests in mind, or just its own bottom-line profit margin? As a public company, the shareholders will fire the leadership if they don't choose the bottom line. As the major social network of the world, the public will denounce them for actions like this.

maxxxxx 2 ago 11 replies      
I always find it interesting that no level of violence is deemed inappropriate in the US but nudity has to be avoided at any cost.
dazhbog 2 ago 0 replies      
Why doesn't FB just blur the content that users find disturbing, with a notice like "Viewer discretion... flagged by our users"? Then you can click to view, or adjust the sensitivity in your account settings.
jokoon 2 ago 2 replies      
Any centralized social network is subject to moderation because, being centralized, it can be attacked, fined or shut down by a court. So Facebook can't escape that rule: it must decide what is acceptable or not, and has to anticipate any flak it can get.

In the end, moderation is a gruesome job and nobody really wants to do it, and it will be subject to how moderators anticipate public perception, so it's a PR race.

So of course you will have those situations where Facebook makes bad choices, but it doesn't only depend on their moderation team; it also depends on political correctness. That's why decentralized networks are better: nobody is really responsible, and they can hardly be attacked.

You can decide to either have a politically correct website and get investments, or disagree with political correctness and be like 4chan.

It's not great, I'm sure people realize that, and that the internet will go back to decentralized systems.

planetjones 2 ago 1 reply      
Was the Norwegian Prime Minister's post removed because she posted the image again in that post? This is a crucial question and is not clear from the article. If Facebook censored only the words, then this is a much larger issue. If they censored the whole post (including the photo), then, while debatable, this is Facebook's policy, i.e. a blanket ban on such imagery, irrespective of history.

Edit: I don't find it clear journalism, but the fact is there:

Solberg was one of a string of Norwegian politicians who shared the iconic image after Facebook deleted a post from Tom Egeland

So the post was removed because it had the image, not because she had dared to criticize FB.

thr0waway1239 1 ago 0 replies      
If you want something from FB: its reach, you need to play by its rules, however arbitrary they may be. If you wish to change the laws of physics, go and get yourself your own planet. It is much easier in this case: just choose a different forum.

Having said that, this incident should teach Norwegians (and the countrymen of any country) a thing or two about where they stand on the totem pole of power.

Facebook > Every other country on the planet

Facebook is a country because it acts as an independent sovereign state which is not answerable to anyone at this point. Apparently, it already makes up its own taxation laws[1]. I expect them to release their own flag, maybe a national anthem?

[1] http://www.forbes.com/sites/kellyphillipserb/2016/07/29/face...

But of the many truly troubling things I see with FB's policies - their alarming intrusiveness and ruthless exploitation of our need for being social, choosing its own censorship policy is not one of them, especially if it is consistent. I would rather see them made answerable to privacy violations.

TrevorJ 11 ago 0 replies      
It's particularly troubling because facebook is primarily about communicating with your own friends and acquaintances. Censoring public content is troubling, but removing content that is private and only available to people who took the step to friend you on Facebook is really really crappy.
jondubois 2 ago 0 replies      
I think censoring the PM's complaint is a bad move by Facebook. Regarding censorship of the photo, I think it should be left to Norwegians to decide whether it's appropriate or not - I think different people might have different views on this.
cannonpr 2 ago 0 replies      
I suspect a large part of this isn't so much an attempt by Facebook to impose US cultural norms on the rest of the world as an attempt to avoid financial burden by simply applying the ban stick as bluntly as possible. After all, being multicultural, providing good editing suitable for several countries' acceptable norms, while trying to advance/modify them... Well, that might be viewed as admirable work or cultural imperialism. The point is it's not work that they want to do, nor do I think it's work that they feel they can get paid for.
the_af 2 ago 0 replies      
This iconic picture was not only a Pulitzer Prize winner, but was also on the cover of the New York Times. Surely this will help the anonymous "Facebook spokeswoman" determine on which side it lies of the thin red line of "censor" / "do not censor"?
wonks 2 ago 2 replies      
I feel like this is a good argument for taking another look at projects like Diaspora and Friendica Red
angelofm 2 ago 0 replies      
The article makes a reference to an open letter from the editor-in-chief, Espen Egil Hansen; the link returns an internal error, but you can read the open letter on the web.archive.org site: https://web.archive.org/web/20160909061907/http://www.aftenp...
samfisher83 2 ago 2 replies      
Maybe the censorship team was too young to know the significance of the picture. I am guessing the average Facebook employee is under 30, probably younger than that. The Vietnam war ended over 40 years ago; most American students learned about it and knew that picture, but I don't know how much the Vietnam war is taught in other countries. It might have been a combination of age and where the person grew up that contributed to deleting the picture.
thomasfl 2 ago 0 replies      
Let's create a walled garden that embraces facebook's walled garden. A new social network that displays your facebook timeline and other items.

BTW. I'm Norwegian.

newscracker 1 ago 0 replies      
Facebook is very arbitrary in its censoring and account deactivation decisions. Many cases I have read about are instances where Facebook is in the wrong and does not provide users a way to get things resolved (perhaps these instances surface online more often or more prominently).

Every time I read about Facebook's decisions, I feel extremely frustrated and downright angry. Humans need an alternative to Facebook that's not as evil and can get better traction (no, this does not mean everyone closing their FB accounts and switching to email or text messaging). I'm waiting for that to happen.

brakmic 19 ago 0 replies      
"Making the World a better Place" is so 2009. Doing the same with our Past is way cooler, and profitable.
Raphmedia 1 ago 0 replies      
It is kind of scary to see how powerless countries are when it comes to Facebook. I know that this article and the whole discussion here are not about that, but I get an eerie feeling reading about it.
kajecounterhack 1 ago 0 replies      
At Google certain images are considered EDSA (Educational, Documentary, Scientific or Artistic). I wonder if this would have been considered EDSA vs Facebook's decision to say it's against ToS.

That said, it totally makes sense that they have a consistent policy. Whether you find their overall abuse ToS objectionable should be the main consideration here. It's OK to me that they seem to have decided that imagery containing nude children should be hard-banned. It's a decision couched in the desire to protect children, not some heavy-handed censorship.

EdSharkey 1 ago 1 reply      
It would be neat to have a decentralized social network simply to avoid the editorial demands of the walled garden. I think we'd have a lot more unsavory content making its way to people's eyes though. There'd need to be more sophisticated ways of filtering information than just "unfriend", I suppose. And people would need to have tougher skins for it to work.
codingmyway 1 ago 0 replies      
I can accept why they need to draw a line on naked child images and be done with it. Like most silicon valley companies they want everything automating with as little human customer service as possible.

However if they aren't going to do that job of editorial then they need to stop trying to be a news source while abdicating any responsibility that entails by saying they are a tech company.

return0 1 ago 0 replies      
The bigger problem is that the new media is US-controlled, and you're going to have some culture conflicts. Maybe legislative action could force Facebook to federate users' content.
cx1000 1 ago 0 replies      
Ironically, Facebook is not censoring the news articles covering this story. The unedited photo is now shown all over Facebook.
pi-rat 1 ago 1 reply      
The prime minister's original post: http://snpy.in/5Nv92c
roadman 1 ago 1 reply      
I'm not on fb. I read in the comment from the spokeswoman that the distinction cannot be made by their robotic rules. So I believe this illustrates a limitation of their AI. And they don't care so much about the people than their algorithms. Just an opinion.
dajohnson89 52 ago 0 replies      
What was the PM's motivation to make the post in the first place?
cmdrfred 55 ago 0 replies      
This is a place where I do not agree with Facebook's decision but I agree they have a right to decide who and what can be on their platform. Freedom of speech does not give me the right to come into your home and say whatever I like without being asked to leave. I'm free to do so in the public park across the street though. Your property rights trump my free speech.
tamana 34 ago 0 replies      
Guardian is pretty trashy for tossing that picture up twice in one article. The article isn't even about napalm or the war, the picture is being used as snuff shock. Show some respect for human dignity.
794CD01 2 ago 2 replies      
Just because that is theoretically more effective doesn't mean you should do it. Assassinating Mark Zuckerberg might even approach a whole 1%, it's still not something we should be encouraging.
JabavuAdams 2 ago 1 reply      
Why are we thinking of FB as some monolithic entity? Isn't the most likely explanation that some low-wage contractor in the Philippines saw a picture of a naked girl and flagged it? That contractor may not even know the historical significance of the picture.

You're in a low-wage job and have to look at horrifying shit all day, every day. Are you going to let the one image through that maybe will cost you the job that you really need?

andrewclunn 2 ago 0 replies      
Dear Norway,

The US government makes us legally complicit in child pornography if we don't have automated processes to take this stuff down. People keep blaming corporations for censorship of porn-like (but not porn) content, for song lyrics that get mistaken for terrorist threats, and for overly zealous takedowns of anything that might infringe on IP. Do you think we want our users to get angry at us over this shit? Look at the US child porn laws, the numerous governments spying under the banner of the war on terror, and laws like the DMCA. Our hands are tied and you are blaming the wrong people.

- Facebook

fil_a_del_fee_a 2 ago 2 replies      
I honestly think it was an algorithm that flagged it.
exodust 2 ago 1 reply      
She should simply publish it somewhere else, such as her own blog or some other website. When she signed up to Facebook.com she ticked a box agreeing to their terms.

I never signed up so couldn't care less, but aren't most people on Facebook talking about what they had for breakfast and how awesome stuff is? I'm not sure where Napalm girl fits in with that culture except maybe "awesome war photography - thumbs up!!".

kybernetyk 2 ago 1 reply      
Why should the PM be treated differently than anyone else? Just because she's the PM?

FB has every right to remove whatever they want from their private property.

upofadown 2 ago 2 replies      
Wait, Facebook censors pictures of naked children because they are afraid that some pedophile might get off on them?

That's kind of twisted, isn't it?

       cached 9 September 2016 16:02:01 GMT