hacker news with inline top comments    24 Mar 2017
3.5 Years, 500k Lines of Go npf.io
193 points by NateDad  2 hours ago   89 comments top 10
ben_pr 1 hour ago 2 replies      
I'm looking forward to my first project with Go. It appears to offer a lot with minimal complexity.

> Because Go has so little magic, I think this was easier than it would have been in other languages. You don't have the magic that other languages have that can make seemingly simple lines of code have unexpected functionality. You never have to ask "how does this work?", because it's just plain old Go code.

That lack of magic and his comparison to C# sounds like a really good mix.

zalmoxes 2 hours ago 2 replies      
Great point about time. At work we've adopted github.com/WatchBeam/clock and it's helped a lot.

Thanks for blogging about your work on juju! Despite Go already being five years old, many of the patterns around building large applications are only emerging now.

k__ 19 minutes ago 0 replies      
Are generics really that big of a thing if you got structural typing?

I mean, you don't have to implement all the interfaces explicitly, you just have to get your structure right and be done with it.

Am I missing something?
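Both halves of this trade-off can be shown in a short sketch (pre-generics Go as it existed in 2017, so `interface{}` for the container; all names are illustrative): structural typing means `User` satisfies `Named` with no declaration, but interfaces alone don't give you type-parametric containers.

```go
package main

import "fmt"

// Named is satisfied by any type with a Name() method.
// No "implements" declaration is needed anywhere.
type Named interface {
	Name() string
}

type User struct{}

func (User) Name() string { return "user" }

func Greet(n Named) string { return "hello, " + n.Name() }

// But interfaces don't replace generics for containers: this Stack
// erases the element type, so callers must type-assert on the way out.
type Stack struct{ items []interface{} }

func (s *Stack) Push(v interface{}) { s.items = append(s.items, v) }
func (s *Stack) Pop() interface{} {
	v := s.items[len(s.items)-1]
	s.items = s.items[:len(s.items)-1]
	return v
}

func main() {
	fmt.Println(Greet(User{})) // User satisfies Named structurally
	s := &Stack{}
	s.Push(42)
	n := s.Pop().(int) // runtime type assertion; generics would make this static
	fmt.Println(n)
}
```

So structural typing answers the "explicit implements" complaint, while the generics complaint is mostly about statically typed containers and algorithms.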

excepttheweasel 49 minutes ago 3 replies      
I think the Go language has taken a position along the lines of "reusability and abstraction are overrated". I certainly think there is some truth to that; I am really enjoying working with Go on smaller projects, and it is this philosophy that has made understanding the code much easier.

But I wonder how it really scales on a large code base like this? Some of the best projects I've worked on leverage reusability more effectively to create a sort of vocabulary. They're far more concise, there is rarely more than one source of truth, and they're far easier to change and improve. Does this hold true for 540,000 lines of Go code?

Gys 34 minutes ago 3 replies      
3542 files

540,000 lines of Go code

65,000 lines of comments

So on average only 170 lines per file, including 17 lines of comments.

Are these normal ratios?

therealmarv 2 hours ago 2 replies      
What do people think about the future of Juju?
rjammala 1 hour ago 2 replies      
Wonder how long it takes to build juju?
grabcocque 1 hour ago 10 replies      
This entire piece sounds like the Blub Paradox made real.


It's written with knocking down a very specific set of straw men in mind, but rather carefully avoids coming anywhere close to addressing the legitimate criticisms of Go as a language. One of the things that's most irritating about Go enthusiasts is the way they try to close ranks on legitimate critique and reframe their language's warts as "simplicity".

Also: 20 bonus blub points for pulling the old 'I don't need generics, therefore NOBODY does' gambit.

mylons 1 hour ago 1 reply      
And half are from dependencies? ;)
mynegation 1 hour ago 2 replies      
Off topic rant: I don't know much about the details of godeps hash file but I do wish that there were a better infrastructure for merging contents of various file formats. Built into git or shipped as a separate repository. I wasted too much time merging vcxproj.filters files just because in its XML representation one item of a sequence of folder assignments occupies 3 lines (opening tag, contents, closing tag) instead of having everything on one line. Similar problems with JSON files when new items added concurrently to the end of the array.
Amazon, the worlds most remarkable firm, is just getting started economist.com
89 points by gpresot  2 hours ago   77 comments top 11
mabbo 1 hour ago 6 replies      
I've said this before on Hacker News, but people never seem to grasp the scope of Amazon. They aren't an e-commerce website anymore; that's just the tip of the iceberg. Bias: I spent a good five years there.

In logistics alone: hundreds of warehouses in a dozen countries; hundreds of delivery stations, a fleet of drivers (independent contractors all) in the tens or hundreds of thousands; a fleet of airplanes (and an airport I hear?); and I read an article a while back saying they were buying a big ol' container ship because why not? Oh and the drones of course, whenever they get off the ground (eh? see what I did there?)

Then you add on AWS, Kindle, the prototyped no-cash-register stores, and whatever else they've got cooking that no one knows about. This company is my best bet for "Weyland-Yutani" from the "Alien" series.

toddmorey 56 minutes ago 2 replies      
My family has migrated more purchases to Amazon than I even realized. I got tipped off when I noticed that the recycle bin now fills up so much faster than the trash can.

Which brings up the main issue I have with to-the-door delivery: the packaging. The insane number of boxes inside of boxes really makes me feel guilty. There was a project at Amazon in 2008 to have consumer-friendly, "frustration free" [1] packaging, as most packages are optimized to stand out on store shelves and prevent theft. (Of course that makes them bulky and hard to open. With posed toys secured with wire and screws, it's even more maddening.)

Where has that effort gone? Did it fall out of favor with manufacturers and / or consumers?

I would love to have Amazon drop off my order in as little packaging as possible and even collect & reuse special durable bottles and such for frequently used items like laundry detergent.

This seems like an opportunity as they increasingly build local warehouses and take ownership of the supply chain and delivery logistics.

[1] http://www.adsavvy.org/amazoncoms-new-frustration-free-packa...

chiph 1 hour ago 5 replies      
Amazon needs to solve the counterfeit goods and scammer seller problems it has. I reported a scammer on Wednesday and in the span of time I was in the chat with them, 3 more scammers appeared for that same SKU.

I've been a customer since the late 90s and if they don't get this fixed, I'm going to start looking for another e-tailer to use.

gdulli 58 minutes ago 1 reply      
I've had an account since 1997 and I'm close to canceling, not getting started. I've weaned myself off, only making a few orders a year, which I still do because I get Amazon credit card rewards.

I was okay with missing out on fast free shipping unless I paid $100/year, because I could still get slow free shipping. But now that I don't get the best price on a given product without Prime, shopping there feels like a bad value. The competition competes well on price now without extracting an annual fee.

As of a few weeks ago my Amazon rewards card gives me lower rewards than if I paid for Prime, so I'm looking for a new card. I'd chosen a card without an annual fee for a reason.

For the first 15 years Amazon trained me to buy online from one place. But in the last 5 it trained me off of that behavior. I have the added motivation to reject Amazon because of its treatment of employees and its business practices, but as just a customer and not an activist it's become a bad deal anyway.

tabeth 1 hour ago 5 replies      
I know this problem isn't unique to Amazon, but what are people's thoughts on the potentially unsustainable levels of e-commerce? I know plenty of people who are in serious debt (primarily student and credit card) and still blow tons of money on Amazon.

Given that the general levels of debt are approaching 2008 levels [1], is this something to be worried about in general, or is it irrelevant? The main reason for asking is that if large percentages of Amazon's growth can be attributed to a rise in consumer debt, then wouldn't Amazon be disproportionately damaged by another recession and/or some sort of debt bubble burst?

[1] http://money.cnn.com/2017/02/16/pf/americans-more-debt-in-20...

vxxzy 43 minutes ago 1 reply      
What is the consensus here on HN? More and more I feel like Amazon is becoming the next Standard Oil. In Amazon's case, they are leveraging technology in all spaces and optimizing everything. Could they be the next Standard Oil?
gordon_freeman 1 hour ago 1 reply      
As per the article, their P/E ratio is above 172, and it already reflects their "would-be" profits after 2020. Isn't this too much risk to invest in AMZN?
tehabe 41 minutes ago 2 replies      
For me as a customer, Amazon is almost the best; sometimes I think it could be better. The UX for their video service is awful, but okay.

As for its employees, I hear only bad things, like Amazon refusing to talk to the union in Germany, or what The New York Times wrote a couple of years ago.

I'm so conflicted about this company. But it matters. Both sides matter.

sidcool 42 minutes ago 0 replies      
The growth trajectory of Amazon baffles me. In its early days, I heard plenty of prophecies of Amazon's failure from luminaries of the tech world. Their real power is behind the scenes, far away from their shopping apps. There's no doubt they will be at the forefront of many initiatives over the next decade.
npguy 58 minutes ago 0 replies      
Ethereum will be a solid threat to AWS. Overall.
brilliantcode 13 minutes ago 1 reply      
Amazon, the world's biggest net-unprofitable website, kept afloat by investors' expectation of it turning a net profit in some distant, unknown future.

It's like the laws of gravity don't apply to American tech firms, yet are ruthlessly enforced on everyone else.

It's only a matter of time before Walmart kills Amazon. Walmart has enough cash to copy Amazon and steal its market share. Amazon's unchallenged status will signal Walmart to enter its turf and steal away the competition.

Whether I buy a box of condoms from Amazon or Walmart needn't matter. Whoever gets it to me fastest and cheapest wins.

Short AMZN and long WMT

Chasing the First Arcade Easter Egg edfries.wordpress.com
92 points by anjalik  4 hours ago   9 comments top 4
rhaps0dy 2 hours ago 0 replies      
Ready Player One in real life? Hunting eggs.
sleepychu 1 hour ago 2 replies      
So was Bonus Time enabled for the shipped consoles?
dep_b 3 hours ago 1 reply      
Didn't I read the same story here yesterday?


nsxwolf 3 hours ago 2 replies      
"What follows, as detailed in his blog post, is one of the wilder retro-gaming goose chases in recent memory."

Didn't sound all that wild. Sounds like they figured it out pretty quickly just by looking at a hex dump.

Dell's 32-inch 8K UP3218K Display Now for Sale anandtech.com
83 points by DiabloD3  2 hours ago   101 comments top 23
Xcelerate 1 hour ago 2 replies      
Sometimes I feel like I'm the only person who is really excited about these resolution improvements. When the MacBook Pro Retina came out in 2012, the only reason I bought it was because of the display. I had never used a Mac before then.

Going from 4K to 8K for a 32" monitor may seem like a small improvement, but it is a subtle sensory improvement that just makes using a computer more pleasant. Until displays reach 1/60th of an arc minute from standard viewing distances (roughly the discerning power of the human eye under assumptions on contrast), I will always want higher resolution.

Other than resolution improvements, it would be nice if someone would attempt an HDR light field display. This would ultimately lead to a monitor that is indistinguishable from a window.
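The thresholds in this comment can be checked with a little trigonometry. The sketch below assumes a 24-inch viewing distance; the 1/60 arc-minute figure is the commenter's, and the more common "retina" rule of thumb is one arc minute per pixel:

```go
package main

import (
	"fmt"
	"math"
)

// ppiForAngle returns the pixel density needed so that one pixel subtends
// the given angle (in arc minutes) at the given viewing distance (inches).
func ppiForAngle(arcMinutes, distanceInches float64) float64 {
	radians := arcMinutes / 60 * math.Pi / 180
	return 1 / (distanceInches * math.Tan(radians))
}

func main() {
	// One arc minute at 24": the classic "retina" threshold, ~143 ppi.
	fmt.Printf("%.0f ppi\n", ppiForAngle(1, 24))
	// The comment's far stricter 1/60 arc-minute target, ~8600 ppi.
	fmt.Printf("%.0f ppi\n", ppiForAngle(1.0/60, 24))
}
```

By the one-arc-minute rule, even 4K at 32" is already near the limit at desk distance; the 1/60 arc-minute target would leave decades of headroom.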

mixmastamyk 33 minutes ago 0 replies      
16:9 is suboptimal for me. I usually like one extra display in portrait orientation for code and web pages.

My current 16:9 24" 4k display in portrait is too tall---have to move my head up and down a lot, and too skinny---hard to put two windows side by side comfortably. Have to fiddle with overlapping windows a lot. In landscape it would be too short, and I rarely watch movies on my desktop. One movie trailer a quarter approximately, so optimizing for that use case is absurd.

I would prefer a shorter 16:10 instead, and was happy with the 22" 1920x1200 that was replaced, though I love the increased resolution of the new monitor.

nickparker 1 hour ago 3 replies      
> It's worth noting that Raja Koduri, SVP of AMD's Radeon Technology Group, has stated that VR needs 16K per-eye at 144 Hz to emulate the human experience

This is tangential to this thread, but can anyone in here explain the state of the art in eye tracking? Actually rendering 16k quality for my peripheral vision seems insane to me, so I'm really interested in the barriers between today's tech and a good foveated headset.

bitL 58 minutes ago 1 reply      
I have 3x 4k monitors, one 31.5" DCI 4k, one 28" UHD 4k and a 55" UHD TV (that also reports itself as DCI capable).

Frankly, 8K at 31.5" is kinda pointless unless you have the eyes of an eagle or work glued to your monitor. As a tech demo from Dell, it's cool!

nwah1 5 minutes ago 0 replies      
8K doesn't excite me. I want a monitor with 4K, FreeSync, over 144hz, running on Displayport-over-USB type C... and ideally powered only by USB.

This seems like it will be possible one day.

Tepix 1 hour ago 2 replies      
Do you think that the market will sooner or later move to 8K monitors? I'm not so sure.

At a normal viewing distance a 100dpi monitor is already decent. A UHD monitor is just great. You have to get very close to see individual pixels. I'd say doing AA is no longer required even then.

8K seems overkill for most purposes. Sure, there is a niche that can take advantage of it, but I don't see advantages for the mass market.

Same as with SACD, CD tech is simply good enough for pretty much everyone so SACD never took off.

I write this despite being a high-dpi junkie: I bought a ViewSonic VP2290b (IBM T221 clone, 3840x2400 22") back in 2006 and dealt with the huge hassle of its 4 DVI inputs for years.

bhauer 23 minutes ago 2 replies      
Give me a 50+ inch 8K concave OLED desktop display with a matte surface and I will pay a huge premium.

My predictions aren't holding up [1]. I feel too many customers are satisfied with small form-factor and/or are tolerant of using multiple displays and suffering the inconvenience of bezels.

[1] http://tiamat.tsotech.com/ideal-desktop-displays

usaphp 1 hour ago 8 replies      
> "Linus from LinusTechTips should be happy, as they just invested in a pair of 8K video cameras. Time to submit my own acquisition request"

I have never understood the point of investing in such expensive, new tech for YouTube videos; most tech-channel videos become irrelevant really fast, so it's not like you're future-proofing. In two years very few people will have 8K screens, and the cameras will cost at least 50% less. Storing 8K video will increase storage costs and processing bandwidth too.

intrasight 28 minutes ago 1 reply      
The image on the screen in that article certainly looks better than the monitor I'm using ;)
redm 1 hour ago 5 replies      
I'm using multiple 32" 4K monitors, and while the additional definition might be "nice," I certainly can't work with any "smaller" text.

I wish Dell would produce 8K monitors in a much larger format, like 48".

Keyframe 1 hour ago 0 replies      
280 ppi! At around 800 ppi you don't need to do AA anymore. So, around 32k at same size.
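The 280 ppi figure checks out from the spec sheet numbers; pixel density is the diagonal pixel count divided by the diagonal size:

```go
package main

import (
	"fmt"
	"math"
)

// ppi computes pixels per inch from a resolution and a diagonal size in inches.
func ppi(w, h, diagonalInches float64) float64 {
	return math.Sqrt(w*w+h*h) / diagonalInches
}

func main() {
	// Dell UP3218K: 7680x4320 at 31.5 inches.
	fmt.Printf("%.0f ppi\n", ppi(7680, 4320, 31.5)) // prints "280 ppi"
}
```

By the same formula, reaching ~800 ppi at 31.5" would indeed need roughly triple the linear resolution.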
JohnTHaller 19 minutes ago 0 replies      
tl;dr: US$4999, 7680 × 4320 resolution, 1300:1 contrast, 31.5" IPS, 60Hz refresh, 2 DisplayPort 1.4 to handle the bandwidth
sarreph 1 hour ago 1 reply      
> "By then, 16K might exist, back at the $5000 price point. Maybe."

I appreciate the author saying 'maybe' because I can't, off the top of my head, understand why 16K would add any benefit over 8K... Aren't we at 'retina' with 4/8K anyway?

EDIT: at a 32" resolution...

HurrdurrHodor 29 minutes ago 1 reply      
That article says I can get 4K for 350. More interested in that than the 8K, anybody got a link in Europe?
ryanmarsh 1 hour ago 0 replies      
So can I drive this from a 15" touchbar MBP?
vinayan3 1 hour ago 1 reply      
Hopefully, they drop the price on the 32'' 4K monitor to under $1,000. That'd be a great deal.
TazeTSchnitzel 1 hour ago 1 reply      
Interesting that it uses two DisplayPort 1.4 inputs. I guess HDMI 2.1 isn't ready yet?
mpg33 1 hour ago 0 replies      
holy shit...look at the size of the taskbar
mark_l_watson 58 minutes ago 0 replies      
I might be tempted if it used a USB-C input and was compatible with my MacBook. I recently bought from Apple the LG ultra-hi-def monitor they recommend for the MacBook, and while the LG + MacBook combination is great, an even larger, higher-resolution screen would be welcome.
eveningcoffee 1 hour ago 0 replies      
Is it with matte finish?
eugenekolo2 1 hour ago 0 replies      
Personally I don't see the point. I was disappointed with 4K monitors when half of my applications didn't scale well for them. I imagine even fewer things will scale well for an even less popular resolution.
adamnemecek 1 hour ago 2 replies      
Can't wait for AR/VR to make monitors obsolete.
qntty 1 hour ago 5 replies      
DCI 4K is 4096 pixels across, so I would expect 8K to be 8192, but it's only 7680.
What's the Matter with Covert Action? filfre.net
37 points by doppp  3 hours ago   13 comments top 7
startupdiscuss 2 hours ago 3 replies      
The article is long and entertainingly written. It has lots of references to 80s computer games that people might enjoy (like Pirates!). But the key insight, I believe, is:

If Covert Action had believable, mimetic, tantalizing or at least interesting plots to foil, I submit that it could have been a tremendously compelling game, without changing anything else about it. Instead, though, it's got this painfully artificial box of whirling gears.

I suppose in the 80s, given the limitations of the box, there was a much larger emphasis on "narratology" in games, but this criticism might hold for most failed games.

Edit: Never seen this site before. Who is this guy? Now I am going down a rabbit hole...

Endy 50 minutes ago 0 replies      
Having just watched CirclMastr's Let's Play of Covert Action on YouTube, and playing it myself (and I think I came across this article too), I have to say that Covert Action really feels like what it ended up as: an unpolished rush job into which someone put a lot of thought that maybe didn't come through onscreen. There's a lot of good there, and further development of the ideas would have turned it into a real classic. There's a reason it's one of the long-lasting underdogs; it's not just Sid Meier's name.

I'm glad it exists, and for anyone who wants to teach kids about very basic crypto (i.e. letter replacement, removing spaces, etc.), I suggest the "training" for the Crypto skill; it's also good to go and brush up on your own skills. Similarly, the wiretapping is a genuinely interesting logic puzzle which can be used to teach the concept of physically representing 1/0 and logic gates (and how the result may not always be what you expect).

As far as "The Covert Action Rule", I guess the question is how you approach the game. If you look at it in terms of "older" games (i.e. Zork et al), having a notepad beside the computer to take notes seemed the most logical thing in the world to me. In many cases while playing games with complex storylines and puzzle-solving, it still does. I never found the tactical sections making me "lose track" of why I was in a building - either I was there to grab an Agent or I was there to get information to go and do so that I hadn't gotten by wiretapping or crypto.
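The letter-replacement-plus-space-removal exercise described above fits in a few lines. This is a toy sketch (a rotate-by-one key chosen purely for illustration, not the game's actual scheme):

```go
package main

import (
	"fmt"
	"strings"
)

// encode applies a fixed letter substitution and strips spaces --
// the two transformations mentioned above. Characters without a
// key entry pass through unchanged.
func encode(msg string, key map[rune]rune) string {
	var b strings.Builder
	for _, r := range strings.ToUpper(msg) {
		if r == ' ' {
			continue // removing spaces hides word boundaries
		}
		if sub, ok := key[r]; ok {
			b.WriteRune(sub)
		} else {
			b.WriteRune(r)
		}
	}
	return b.String()
}

func main() {
	// A toy key: rotate each letter by one (A->B, ..., Z->A).
	key := map[rune]rune{}
	for c := 'A'; c <= 'Z'; c++ {
		key[c] = 'A' + (c-'A'+1)%26
	}
	fmt.Println(encode("covert action", key)) // prints "DPWFSUBDUJPO"
}
```

Decoding with pencil and paper is exactly the kind of frequency-and-pattern puzzle the Crypto training drills.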

mcguire 54 minutes ago 1 reply      
"Because processing is, to use Crawford's words again, the very essence of what a computer does, the capability that in turn enables the interactivity that makes computer games unique as a medium, games that heavily emphasize processing are purer than those that rely more heavily on fixed data."

I'd buy that. Purer, but not better. Procedural generation, in theory, has a lot of advantages, but in practice, in many areas, it hasn't matched the old school approach.

tehwalrus 26 minutes ago 0 replies      
Sid's theory about games seems to suggest that Invisible, Inc. should be boring. I have to say I've found it anything but.

Has anyone played both Covert Action and Invisible, Inc for comparison?

waqf 1 hour ago 1 reply      
The author seems to think that ludology vs narratology is the same as procedural vs static content, which I don't think is true at all. You can easily make an open-world game with static content, you just need a lot more static content.
gipp 1 hour ago 0 replies      
I guess this guy is much more historically focused, but it seems odd to present the thesis he does and not even mention No Man's Sky as an obvious example of the kind of failure he's talking about.
JabavuAdams 1 hour ago 0 replies      
Great references on procedural story generation. I used to cringe a little bit every time my students would tout the procedural generation in their project that would result in limitless replayability.

Someone has to enjoy the first play, before they'll replay. You don't know how to make one good game yet, never mind a factory that makes good games.

TidalCycles A language for the Live Coding pattern tidalcycles.org
115 points by bojo  7 hours ago   38 comments top 15
shae 15 minutes ago 0 replies      
My favorite use of TidalCycles is in the Canute performances: http://canute.lurk.org/
eggy 4 hours ago 2 replies      
Tidal (the "Cycles" was added after the Tidal music streaming service appeared) is great fun. I was hoping to get Hylogen working with it.

Hylogen is an EDSL for livecoding shaders in Haskell [1]

My weapon of choice is Extempore by Andrew Sorensen [2]. Dual languages in one system, available for blistering speed in live-coded graphics and sound/music.

[1] https://github.com/sleexyz/hylogen

[2] http://extempore.moso.com.au/

bamdadd 3 hours ago 0 replies      
Check this out if you are interested in live coding; it's a great abstraction for live coding with Overtone: https://github.com/ctford/leipzig and http://ctford.github.io/klangmeister/composition
tudorw 6 hours ago 0 replies      
If you get to see someone perform live-coded music, as I did at EMF 2014 (Yaxu, a.k.a. Alex McLean), it's a blast: watch in real time as the code is edited, glitches out, then drops back in. If the coding is projected, you can predict where the music is going :) I could not find the EMF show I saw, but this gives a feel: https://www.youtube.com/watch?v=3HXcb5_RuNg
yaxu 1 hour ago 0 replies      
You can find out more about live coding in the performing arts here http://toplap.org
andybak 2 hours ago 1 reply      
Did the title get modded to remove the reference to music? It's a terribly unclear title as it stands.
mezod 6 hours ago 4 replies      
dates 2 hours ago 0 replies      
Tidal has a MIDI add on which makes playing with MIDI instruments quite fun! I get patterns of notes playing and then fiddle on the knobs of the synths. So much fun :-)
kindohm 3 hours ago 0 replies      
I produced this album with TidalCycles: http://shop.conditional.club/album/risc-chip. Controlled samples and hardware synths (via tidal-midi) from code.
filleokus 5 hours ago 0 replies      
Related question: Are there any nice videos demoing this kind of music artistry in a different genre? I'm thinking of stuff like https://www.youtube.com/watch?v=yY1FSsUV-8c or maybe something with vocal sampling?
beaconstudios 3 hours ago 0 replies      
this reminds me of Bret Victor's talk, "stop drawing dead fish", which takes a similar concept of live-performance programming and applies it to the world of animated storytelling:


vittore 3 hours ago 0 replies      
This is beautiful, and I think having a tool like this can improve generated music for lounges and also games.
krautsourced 5 hours ago 1 reply      
Slightly unfortunate naming choice maybe, considering the Tidal music streaming service?
ozpri 4 hours ago 1 reply      
omg. muh ahlbuum is vegan and open source
goldenkey 6 hours ago 4 replies      
Another one? So how is this one different from all the languages that do this (like ChucK or Overtone) and the libs for C++/all other languages that generate audio as well? The original THX sound was generated in C.

I would be inclined to use a language that had more signal processing built in, ie Mathematica:



German Space Center has constructed the world's largest artificial Sun newatlas.com
146 points by kamaal  4 hours ago   30 comments top 11
danielvf 2 hours ago 0 replies      
I worked on the light fixtures for the US Insurance Institute for Highway Safety, vehicle research center. http://www.iihs.org/iihs/about-us/vrc

These lights were 72,000 watts each, with a total of around three-quarters of a million watts of lights in the crash hall. (Almost as much power as the German sun installation.) High speed cameras need a lot of light.

I've stood in front of one of the lights in protective gear - but the heat still just goes through your body.

The first time the IIHS lights were turned on it blew a power substation 20 miles away.

Tepix 3 hours ago 0 replies      
I saw the large space simulator (LSS +) at the ESTEC test centre (ESA) a few weeks ago, where they can simulate up to eight times the solar intensity (the solar intensity is 10 times as high at Mercury) in their 15m by 10m vacuum chamber using xenon lamps. The surface temperature of the BepiColombo spacecraft due to launch next year will be around 500 °C. That was impressive already.

This one does not work inside a vacuum chamber but manages 10,000 times the normal solar radiation, with the temperature reaching 3,000 °C. Wow!

+ http://www.esa.int/Our_Activities/Space_Engineering_Technolo...

VMG 2 hours ago 3 replies      
Watched a TV clip about this. I was surprised that apparently the control system was programmed by one guy (Dmitrij Laaber). I found a Master Thesis explaining control software for a prototype of this: http://elib.dlr.de/108662/1/Deepak%20Chopra-Master%20Thesis.... (is Deepak Chopra a common name?)
morsch 3 hours ago 3 replies      
Synlight is a giant parabola made up of 149 7-kW xenon short-arc lamps... so a total power draw of about 1 MW? That's nuts. Hope they remember to turn it off when they leave the room.
evanb 2 hours ago 2 replies      
I moved to Jülich in November. There's all sorts of interesting industrial, technological, and scientific research going on here---it's not just the Forschungszentrum and the space agency. Cologne and Aachen aren't far. Worth a look if you're looking for an affordable place in Germany with good access to cities and lots of smart people to start a company.
zebrafish 2 hours ago 1 reply      
Pretty cheap installation for the amount of knowledge they could potentially get out of it. I wonder how much it costs to run for an hour.
cmarschner 1 hour ago 0 replies      
So, they put up lamps.
pinkskip 2 hours ago 0 replies      
myst 3 hours ago 3 replies      
Came for fusion reactor, left disappointed.
balabaster 1 hour ago 0 replies      
Ugh, here I was hoping for an article about nuclear fusion... so disappointing. Instead of making power, we're wasting it.
unwind 2 hours ago 3 replies      
"It refuses to work at night" ... yeah, that's exactly how the Sun works. Very annoying stupidification of the content, in my opinion.

Humans calling the Sun "finicky" is a bit like ants calling an aircraft carrier "slow and cramped", or something.

A second life for very old C programs tailordev.fr
14 points by couac  1 hour ago   1 comment top
splitdisk 21 minutes ago 0 replies      
This is very cool and something I have been curious about for a long time. How is the performance?
The PhD Octopus (1903) uky.edu
42 points by maverick_iceman  4 hours ago   3 comments top 3
master_yoda_1 1 minute ago 0 replies      
Same kind of shit is still going on in silicon valley.
arcanus 1 hour ago 0 replies      
In modern times, I've found much the same with jobs and the requirement for a "B.S. in CS"

(three magical letter holder here)

Beyond Silent Spring: An Alternate History of DDT chemheritage.org
10 points by Hooke  2 hours ago   1 comment top
bougiefever 12 minutes ago 0 replies      
My husband remembers seeing puddles with dead birds around them after DDT was sprayed in nearby fields. His father died from a serious lung disease, and his family attributes it to the chemicals he was exposed to while farming. My husband remembers him mixing up the chemicals with his bare arm immersed completely in them.
Why Information Matters [pdf] thenewatlantis.com
13 points by miobrien  2 hours ago   1 comment top
jamesrcole 6 minutes ago 0 replies      
FedEx is offering $5 off web orders to enable Flash fedex.com
9 points by donjh  1 hour ago   1 comment top
gigatexal 1 hour ago 0 replies      
Cheaper than re-engineering their enterprise web-facing app, I guess. Sad.
What Happens If You Break an Artwork? artsy.net
48 points by prismatic  5 hours ago   29 comments top 4
BugsJustFindMe 2 hours ago 4 replies      
What gets me is that in a world where artists often experiment with viewer interactivity, sometimes you just can't tell what in a museum you're supposed to touch and what you're not supposed to touch.

There was a huge retrospective exhibition of Yoko Ono's work at Musee d'Art Contemporain in Lyon, France in 2016. Included were a mixture of interactive/participatory elements (ladders you were meant to climb, a room filled with hammers and nails that you were supposed to contribute to by banging a nail into whatever you wanted) and other obviously not interactive elements (videos, paintings, things protected by glass).

And I remember, in the same room as the ladders you could climb and combat helmets suspended from the ceiling by ropes that were swinging around because people were walking through them, was a sheet of either canvas or paper mounted on the wall, with another piece of either canvas or paper suspended about an inch in front of it by some string, with some holes cut into the front sheet so that you could just see that there was something drawn or written on the hidden layer.

Well, in a room filled with people talking and laughing and climbing ladders and pushing helmets around, I went to peek behind the curtain and lifted up the front layer to take a look at the one behind.

A custodian ran at me and told me to step away. I still don't know if I damaged it or participated in it.

Pica_soO 10 minutes ago 0 replies      
My uncle works in art-work transportation, and some of the pieces are basically not transportable, but are travelling from museum to museum anyway.

He remembers a charcoaled doorframe that had to be transported although it could basically come apart any second once moved. They have vibration-reducing special boxes, with the same climate protection as humidors. Also there are titanic insurance fees at work to move art. And sometimes somebody in some state-run museum is forced to take the cheapest option available. One of those haulers venturing into art transported an artwork by a Chinese artist (I do not know the name): basically very long paper rolls with Chinese letters on them, to be hung from a hall's ceiling. Those rolls came in cardboard boxes sealed with tape, and the poor fellow took a cutter and systematically cut through the tape that sealed all the boxes. Each box an artwork, insured for 25.000 - sliced.

markbnj 21 minutes ago 1 reply      
I suppose if you're going to make an artwork look like a bench and not surround it with protection, then you'd best make it function as a bench too.
melling 2 hours ago 3 replies      
On a slight tangent, does flash photography damage paintings?

I always see people using flash in museums. I assume it's because they don't know how to turn it off.

Ostinato A Network Traffic Generator and Analyzer ostinato.org
65 points by rishabhd  7 hours ago   18 comments top 9
mino 3 hours ago 0 replies      
Ostinato has been around for a while. I think it is the best you can get as an open-source traffic generator if you don't have access to commercial hardware (Ixia, Spirent, etc.).

Also, there's this project based on the VPP dataplane technology, which could help you achieve higher bitrates. Personally I haven't managed to play with it yet: https://trex-tgn.cisco.com/

ausjke 2 hours ago 1 reply      
The Linux kernel has a built-in traffic generator (pktgen); you can rarely find documents on how to use it, but I heard Spirent etc. use it to test their generators in the making.
DoofusOfDeath 4 hours ago 1 reply      
Kind of off-topic, but I learned the musical term "ostinato" just a few days ago, here: https://www.youtube.com/watch?v=8IX1jSVmaAs

I just discovered that guy's (Rick Beato's) channel the other day. I normally find music theory very dry, but even my kids found that video interesting.

MichailP 5 hours ago 2 replies      
Can someone recommend good lectures/book for full stack network programming focused on concepts but with enough detail? Something similar to The Elements of Computing Systems by Nisan & Schocken, but for networks?
supahfly_remix 1 hour ago 0 replies      
How is the packet checking on this? What exactly does it check?
jaimex2 5 hours ago 0 replies      
It's got a pretty quirky UI, but it works. I've used it when load testing some packet-processing apps.

Why was this posted? It's a generic packet generator.

pmontra 5 hours ago 2 replies      
Ostinato == stubborn in Italian but I wonder if it is named after the ostinato music pattern [1], also of Italian origin but different context.

[1] https://en.m.wikipedia.org/wiki/Ostinato

ldzcoder 3 hours ago 0 replies      
There is also Scapy [0], fully operable from the Python shell.

[0] http://www.secdev.org/projects/scapy/
kkirsche 4 hours ago 0 replies      
I've usually used hping3 but ostinato is good
Tamil Bell wikipedia.org
308 points by Thevet  11 hours ago   82 comments top 17
msound 4 hours ago 5 replies      
The Tamils originally wrote on dried palm leaves with a sharp scribe. So, if you didn't want to tear the leaf, you had to avoid straight lines and dots. That's why there are so many curves in the script.

Also, one of the meanings of my first name "Mani" is literally "Bell".

Source: https://en.wikipedia.org/wiki/Tamil_script

Also: I'm Tamil.

elvinyung 9 hours ago 2 replies      
Out-of-place artifacts[1] are cool. A really interesting one is the Tecaxic-Calixtlahuaca head[2], a part of a terracotta figurine that was found in a pre-Columbian site in Mexico that is speculated to have Roman origin.

Learning more about this is actually kind of hard. Unfortunately, there seems to be a lot of pseudoarchaeology concerning pre-Columbian transoceanic contact.

[1] https://en.wikipedia.org/wiki/Out-of-place_artifact

[2] http://www.unm.edu/~rhristov/calixtlahuaca.html

danans 1 hour ago 2 replies      
The name of the ship's owner (mohayideen) is Arabic, transliterated into Tamil script. There is a well known history of the Arab trade network throughout Southeast Asia but it's fascinating that this artifact represents a fusion of Arab and Tamil culture.

It makes me ponder whether the owner of the ship was himself cross cultural.

The Arab presence in Southern India actually predates the advent of Islam, so it's even possible (though not likely) that this artifact hails from before that time.

EDIT: TFA notes that the script is datable to 500 years old so it is probably not pre-Islamic.

skrebbel 7 hours ago 7 replies      
> Translated, it says "Muhayideen Bakshs ships bell".

I like language, and this inscription makes me wonder. Why write on a bell that it's a bell? I'd be less surprised if the bell had said "Muhayideen Bakshs ship".

I mean, I get that school buildings say "School" because otherwise it's really just a building. But a bell? Isn't that a bit like writing "Headmaster's Office Door" on the headmaster's office door?

I wonder whether maybe it was just an artsy kind of joke. A bit like how in import stores you can buy forks that say "FORK" on the handle.

I now imagine the crew on that ship looking up to the bell every once in a while, grinning, thinking "that Baksh fella is a tough one but at least he has some sense of humor".

puranjay 9 hours ago 0 replies      
You might find this interesting:

"Australia experienced a wave of migration from India about 4,000 years ago, a genetic study suggests."


mataug 9 hours ago 3 replies      
There's ambiguous separation between the words in the picture.

It reads phonetically as "MugayatheenPak Udaya Kappal Mani". The translation is spot on.

Source: I'm a native speaker from Tamil Nadu.

cyberferret 9 hours ago 3 replies      
Interesting. There has to be more evidence of other nations discovering Australia and New Zealand before the ones mentioned in our history books.

Nearer to me, the Tiwi islands just off the coast of North Australia have unearthed jade figurines and artifacts that seem to originate from China or another Asian country, which date back to before the time of Captain Cook.

Update: Article on early Chinese explorers reaching Australia a long time before Dutch or English explorers - http://www.theage.com.au/articles/2002/11/24/1037697982893.h...

manojlds 7 hours ago 1 reply      
As a Tamil, I am surprised by so many such things that I come across about our ancestors. No wonder we take a lot of pride in our culture, sometimes to the extreme.
mclightning 5 hours ago 1 reply      
A lot of people express their surprise about "other nations discovering AU, NZ, US" etc.

You do understand homo sapiens did not evolve separately on these different continents/lands right?

The actual discovery was when the first Homo sapiens settled in the US, AU, NZ, etc. Of course some cultures managed to travel to these lands long before our history records tell us. Native Americans are not a separate species of Homo sapiens; they did not evolve there separately.

palerdot 9 hours ago 1 reply      
Fascinating. I'm from Tamil Nadu (in the southernmost part of India, where Tamil is spoken), and until now we had only heard of early explorations reaching as far as present-day Singapore, Malaysia, Cambodia, etc. This news is really interesting.
nhaliday 7 hours ago 1 reply      
here's some interesting speculation on this topic: https://westhunt.wordpress.com/2013/01/15/a-three-hour-tour/

Basically there's genetic (Y chromosome) and linguistic evidence suggesting an infusion of Indian immigration (~5000 years ago so maybe a distinct event from this). One plausible explanation is that some Dravidian seafarers crash-landed in Australia and got absorbed by the local population. From their perspective it probably would have felt like sinking into savagery.

jv22222 9 hours ago 0 replies      
While we're on the topic of interesting artifacts on Wikipedia, this is pretty cool:

Voynich manuscript Undeciphered book from the 15th century


LAMike 9 hours ago 0 replies      
Trincomalee would be a good base, second biggest harbor in the world. Cool to see something Sri Lanka related on HN
sriram_iyengar 4 hours ago 0 replies      
Proud moment for a Tamizhan on HN!
sathishmanohar 9 hours ago 2 replies      
Am I missing some context here? How is this related to HN?
Angular 4.0.0 Now Available angularjs.blogspot.com
246 points by theodorejb  15 hours ago   265 comments top 36
ssijak 4 hours ago 10 replies      
Hmm. I see that Angular is getting aaaaalot of hate here. I really tried to understand why but have not found really valid reasons, just preferences. I have used Angular 1.x a lot and have just tried Angular 2. It really enables me (somebody who comes from primarily strong backend dev experience) to work on frontend SPA apps productively and fast.

It does not 'feel' heavyweight or like it gets in my way too much; the contrary is true. Of course it has its quirks, as every larger lib does, but its pluses outweigh the minuses by far for me.

Preferences aside, does Angular make you less productive than other options? Do you feel that you fight the framework? Can you finish non trivial frontend apps, involving 5+ team members, 'better', with cleaner code and much faster with other options?

These are just honest questions. I wanted to start a pet project in Angular 2 soon, but I would listen to alternatives; maybe it is the right moment to try some of them.

Bahamut 10 hours ago 3 replies      
I have been leading a good-sized Angular app at work for the past 8 months, and upgrading from v2 to v4 has been painless, even with an AOT compilation build pipeline set up. The only modifications we had to make were switching OpaqueToken to InjectionToken, <template> to <ng-template>, and Renderer usage to Renderer2, and even these weren't required yet; we only made those changes to get ahead of the curve on the deprecation messages.

First off, the codegen size decreased dramatically for AOT-compiled builds: we went from ~600 KB vendor + app minified and gzipped (not counting 50 KB of polyfills) to ~400 KB. This is huge, and boot speed feels even faster on a user's first load of the page!

Thanks to the Angular team for this fantastic work!

robwormald 14 hours ago 7 replies      
Angular core team here, we're pretty excited about this release.

Main change, as noted, is the new View Engine. The design doc[0] is worth a read if you're interested in front-end at all.

Happy to answer any questions!

[0] https://docs.google.com/document/d/195L4WaDSoI_kkW094LlShH6g...

tomelders 12 hours ago 15 replies      
I'll be that guy.

I'm sure the Angular team are great people, and they're clearly talented devs... but stay away from Angular.

It doesn't help with the problems you will actually face. Typed JavaScript is a cargo cult. Angular is just plain confusing for no apparent benefit. Dependency injection is bizarre. The distinction between modules, components, and directives is unnecessary. The JavaScript community in general is moving away from OOP towards functional programming, but Angular has hitched its wagon to OOP. Its reliance on the decorator pattern is maddening. Its view-layer performance is substandard. It's opinionated in all the wrong ways.

And I say this as someone with a lot of experience with Angular 1, Angular 2.

Trust me, go for libraries over frameworks every time. Redux, React, Immutable, Sagas, and reSelect. That's the future of web app development.

georgefrick 13 hours ago 1 reply      
Community here has been pretty negative on this lately. So I'll just add I've been using Angular 2x and also Ionic 2 the last year or so with great success. One real benefit we saw was the ability to take a lot of Backbone code and just quickly port it (models to services, templates just back to html for a component, etc). This is helping us help our enterprise clients in conversions of both older front end code and older Java EE code (for example JSP + Struts => Restful + NG2X).
otto_ortega 12 hours ago 3 replies      
VueJS is the only of these JavaScript UI Libraries/Frameworks I can stand... The only one whose syntax doesn't make my eyes bleed!... I can't wait for Alibaba's Weex to be officially released so Vue can be used for developing mobile apps too. Reusing components across platforms and the web is the only reason why I would like to jump into the JS wagon!
crudbug 2 hours ago 0 replies      
I tried React initially. As a Java developer, I needed types, RxJS, and modules without the hassle of webpack configs.

Angular provides consistent structure to your app, and with angular-cli, a consistent interface to front-end development. Having ported an app from Ember, IMO the CLI interface should be standard across all front-end frameworks.

I see it as microkernel like framework for front-end development. In Java land, we have OSGi standard for modular software component development, angular has sort-of similar design.

The only confusing part is NgModule, the team should rename angular modules as NgBundle - which consists of native TS modules.

I read somewhere that the Angular team is working on a Material widget library; please make style-less components with theming support.

aprdm 14 hours ago 4 replies      
Even with SemVer, it looks really weird to see a new Angular release branded as 4.0.0.

Angular 1 to Angular 2 made me drop angular entirely, that happened less than 6 months ago I believe.

Just reading Angular 4 gives me the creeps.

Is 4 widely different from 2? How am I supposed to know, if you're using semver and already have a really bad history?

git-pull 14 hours ago 7 replies      
I'll probably try it.

But man, am I getting jaded about all this javascript framework stuff. Every 6 months stuff breaks, every 2 years there's a huge shift.

The problem I've been facing with JavaScript is value: time and effort don't always correlate with what I get out the other end. In fact, I can say that when building, going single-page is a time sink.

And it's almost always a mistake to go SPA first. Using a django or a rails lets you get the basics and data flow nailed down early on. Get into a framework too early and have a need to change something? Have fun explaining to your manager/client how costly it is to do a "simple" modification to a JS app when you have to throw the state you built it upon out the door.

What I want is a system tightly coupled into a server-side framework like a Django or a Rails that degrades gracefully and I only have to program the interactivity one time. Something that'd plop right into the asset pipeline/django compressor so I don't have to go outside of the framework to build.

Hundreds of hours of my life have been spent chasing this dream of sharing server side code with client side JS frameworks. That's what I need.

Meteor didn't do it for me. As for rendr, I've done stuff better with backbone/express in-house. As of 2017, I get my best bang for the buck using django and pjax. No joking, I went from full DRF + Backbone Marionette to plain old jquery and pjax and couldn't be happier.

All these new build tools (grunt, gulp, webpack... come on), ES versions (I was ok with ES5). None of these things are helping me ship stuff ahead of / on time and correctly. They're creating an even larger gap between the server side data, logic and templates and the JS interactivity.

If anyone is listening, I'd love to have a well-supported opinionated distribution of django or rails that just renders forms, tables, etc. with angular/react/etc. and degrades gracefully.

tannhaeuser 6 hours ago 1 reply      
First of all, congrats to the Angular team.

I'll say personally I've never been a fan of Angular, but I think if you want a Java/J2EE-ish all-encompassing component model and decorator-/annotation-based GUIs, it certainly is a very strong contender (though kind of the thermonuclear option and absurdly complex IMHO, at least if you have some prior web development experience). I think Google's track record wrt. long-term maintenance isn't half bad really (GWT and the closure tools have been around for a long time).

That said, I've recently talked to recruiters and was told Angular has already peaked as the go-to framework for enterprise MVC web apps and is being replaced by React and others (and I'm assuming Angular wasn't used all that much outside that demographic because of the heavy setup and onboarding/buy-in).

In the course of JavaScript generational cycles I'm expecting we're bound to re-discover "evergreen" web apps, those being characterized by lack of heavy build pipelines, simple browser-refresh driven development cycles, and straightforward use of web forms (+ maybe components).

An open question for me is what happens to TypeScript: since Angular has been a major driver/user of its type-heavy approach, will it suffer along with Angular?

elmigranto 14 hours ago 5 replies      
I'm so out of the loop. It feels like a week ago everyone was talking about how cool and performant v2 would be when it was finally out of beta and officially ready for production, and now there's a v4 release.

What's going on with versions?

awqrre 10 hours ago 1 reply      
Angular feels like it's there only to obfuscate the web's source code... (don't be shy if you think I need downvotes...)
AngeloAnolin 1 hour ago 0 replies      
At some point, I will take a look at what Angular 4.0 has to offer.

My criteria to adopting a framework:

* Makes me more productive

* Allow me to ship apps / solutions better and faster

* Minimal time doing some head scratching on how to implement stuff

* Integration with other frameworks

* Uncomplicated setup and build process

* Community support (hey, we can't possibly know every inch of a framework unless we are the author)

tribby 13 hours ago 0 replies      
increasingly, angular seems less like a tool I'd use and more like a platform that exists to sell typescript and generally influence the web.

in 2017 I'm curious why I'd pick angular over vue for a web-based project, if anyone has insight. not over "x js framework" -- over vue specifically. vue 2 is pretty much exactly what I wanted angular 2 to be.

voidmain0001 2 hours ago 0 replies      
I wonder what Rob Eisenberg, formerly of Angular and now with Aurelia.io, would have to say about v4?
kabes 8 hours ago 1 reply      
Just reading about 'renderModuleFactory' in this announcement makes it sound enough like Java EE bullshit that I want to stay far away from it.
dreamache 14 hours ago 1 reply      
Just a quick note to those who would like to play around with it..

Install the latest version of the CLI:

  npm install -g @angular/cli

And then run:

  ng new project-name --ng4

The "--ng4" flag is currently required, as it doesn't yet install ng4 by default.

ramigb 5 hours ago 0 replies      
I really hope the Angular team find their path again, because it seems like they lost it a long time ago! I remember learning Angular 1 five years ago; it was an awesome experience, but now... let's just say I wish them the best, for real.
crudbug 13 hours ago 1 reply      
Congratulations to the team. I have been using Angular 2 for our internal operations dashboard.

I see the platform-browser / platform-server namespaces. Can we expect platform-android, platform-ios, platform-jvm?

Looking at the Angular compiler pipeline, with the AST this should be possible?

caleblloyd 3 hours ago 0 replies      
What is one supposed to type into Google when trying to differentiate from Angular 1 and Angular >= 2? I used to type Angular2 but posts going forward may mention Angular4 now and not show up.
coding123 14 hours ago 3 replies      
I suspect there's going to be a bit of pushback to this acceleration in numbering.
wiradikusuma 9 hours ago 1 reply      
Can anyone from the Angular team please build an integration with Google Cloud Endpoints, or at least show how it's done? The official documentation is for AngularJS 1.x: https://cloud.google.com/solutions/angularjs-cloud-endpoints...
finchisko 2 hours ago 0 replies      
"These changes reduce the size of the generated code for your components by around 60% in most cases"

Sorry, Angular team, it just means you did a poor job at first.

jlebrech 6 hours ago 1 reply      
Why does it feel like I have to be a developer OF Angular in order to create an app? Too many of the design choices they made manifest themselves in MY code; a framework isn't supposed to do this.
adrianlmm 1 hour ago 0 replies      
Stay away from this abomination; it will make your project unnecessarily complex. There are better alternatives.
aedron 6 hours ago 0 replies      
Angular is so 8 months ago.
ianamartin 7 hours ago 1 reply      
Honest question,

How does Angular make my life easier or better in some way?

I currently use Python and Pyramid as a framework. Mako templates. And SQLAlchemy for database interactions.

How is any of this really better?

andrewclunn 4 hours ago 0 replies      
I just want to say thank you for angular 2. I still code in 1 mind you, but now that ecosystem is stable and mature, and I can just get my stuff done while ignoring all the hip trends.
serb348 3 hours ago 0 replies      
Can't wait for Angular 29!!!!!!!1
regeiger101 13 hours ago 1 reply      
Will I be able to build on this platform with pure JavaScript (rather than TypeScript)? What kept me from Angular 2 adoption was its emphasis on TS.
drawkbox 9 hours ago 0 replies      
The monolithic javascript framework era is over and has been.

Anyone choosing monolithic over microframeworks/libraries will regret it over time, that time may be as quick as 6 months, right around launch and switching to maintenance. Pour some out for the poor bastards that have to support legacy versions of these frameworks.

Nican 13 hours ago 1 reply      
Does anyone have an good example app using modern Angular?

Looks like cool features, but I have a hard time envisioning on how they all work together.

Touche 13 hours ago 1 reply      
Is Angular ever going to support web standards such as web components?
SFJulie 12 hours ago 0 replies      
As I said in another comment, the fun part is that fewer than 50% of the examples on madewithangular actually use Angular on the advertised pages, and even a true positive on one page does not imply that a significant part of the web domain is powered by Angular. Check for yourself. (My condition for tagging a page as made with Angular was: no 40X/50X/60X response and a match for m/(angular|ng-app)/ in the HTML. Feel free to correct me if I was wrong; I will make my mea culpa and publish revised figures. Domain checking was random, by hand. I never trust my software.)


So, assuming good faith on the part of the submitter, it means 50% of the historic users of Angular have dropped it since 2015. Most of them never fully adopted it anyway.

If a technology is being dropped after 2 years by its early, motivated adopters, maybe there is a smell?
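The tagging condition described in this comment can be sketched in Python; the regex and the status-code rule are the ones stated above, while the function and variable names are mine, and a real run would also need to fetch each page over HTTP:

```python
import re

# The commenter's heuristic: a page "uses Angular" if its HTML matches
# /(angular|ng-app)/ and the page itself returned a non-error status.
ANGULAR_RE = re.compile(r"(angular|ng-app)")

def looks_like_angular(html: str) -> bool:
    """Apply the regex heuristic to raw page source."""
    return ANGULAR_RE.search(html) is not None

def tag_pages(pages: dict) -> dict:
    """Map url -> (status, html) to url -> bool, skipping error responses.

    Pages with a 4xx/5xx status are excluded entirely, matching the
    'no 40X/50X' precondition rather than counting them as negatives.
    """
    return {
        url: looks_like_angular(html)
        for url, (status, html) in pages.items()
        if status < 400  # rule out the 40X/50X cases up front
    }
```

Note the regex is case-sensitive, as written in the comment, so it relies on the conventional lowercase `angular` in script paths and `ng-app` attributes.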

booh 14 hours ago 3 replies      
Is it relevant anymore? Most of the JavaScript developers I know are working with Ember.
sova 8 hours ago 1 reply      
Man, y'all are trippin'. Angular changed completely with Angular 2.0. If you say 2 and 1 are the same you are plain lyin' to yourself. I am not kidding. They are not the same thing, and should not share a name. Likewise, pretending that 1+2 = 4 is just fool's talk. I don't care anymore for Angular. Having your app state in sync with your code is great, but changing your whole paradigm every 6 months because the powers-that-be told you that your major version number _should_ increment for no apparent reason? You must be out of your got-damn mind. So fuck Angular, and fuck all the bullshit that comes with it. Downvote me all you want; you guys gotta get a grip on reality, or let go completely, because this honey-do-good-half-assed-bs is getting you absolutely nowhere.
From Ruby to Crystal: A Quick Look atomicobject.com
32 points by sdogruyol  58 minutes ago   1 comment top
nurettin 1 minute ago 0 replies      
Great writeup for the folks who are curious but lack the time to test Crystal.
No, I Dont Want to Subscribe to Your Newsletter python.sh
74 points by jacobbudin  3 hours ago   61 comments top 25
jasonkostempski 13 minutes ago 0 replies      
I made an add-on for myself to shun sites that behave in ways I don't like. It removes links to their sites from every page, effectively removing them from the internet. I think that's a far scarier scenario for a website owner than just getting their ads blocked. If the list were community-driven and widely used, I think site owners would start changing their behavior. I know most people just want the content and don't care if they block a few ads, but I'm sure that, like me, there's a good chunk of people who are fine with respecting the site owner's wishes and just never going to the site (by never seeing it mentioned ever again), since they're not going to "pay" in any form.

I did publish the Firefox add-on (desktop only) just so I could avoid having to use web-ext or temporarily install it every time I run FF. It's a complete hassle to set up and configure at the moment. https://addons.mozilla.org/en-US/firefox/addon/ssure/?src=se...

Edit: Source is here https://addons.mozilla.org/en-US/firefox/files/browse/601156... in case you want to try and not worry about it doing something shady.
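The core of the shun-list idea is simple link filtering. Here is a minimal sketch in Python for illustration (the real add-on works on the live DOM in the browser; the domain names, function names, and matching rule here are all mine):

```python
from urllib.parse import urlparse

# Hypothetical community-maintained shun list (bare registrable domains).
SHUNNED = {"example-bad-site.com", "another-offender.net"}

def is_shunned(href: str) -> bool:
    """True if a link points at a shunned domain, including its subdomains."""
    host = urlparse(href).hostname or ""
    return any(host == d or host.endswith("." + d) for d in SHUNNED)

def filter_links(hrefs):
    """Drop shunned links, as the add-on does for anchors on a page."""
    return [h for h in hrefs if not is_shunned(h)]
```

The subdomain check matters: matching only exact hostnames would let `www.example-bad-site.com` slip through, while a naive substring match would wrongly hit unrelated domains.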

gnicholas 1 minute ago 0 replies      
These are also bad for accessibility reasons: they generally have a light gray "x" on a white background, which you have to hunt for if you want to dismiss the modal. For normally-sighted individuals, this is a hassle. For people with low vision or motor impairments (e.g. Parkinson's or other tremors), this creates a much bigger problem.
makecheck 9 minutes ago 1 reply      
Publishers should approach everything from the point of view of a person on the street: if it wouldn't be acceptable in real life, WHY is it acceptable online!?

The equivalent of a pop-up newsletter modal is somebody on the street PULLING you aside, standing directly in front of you and preventing you from going any further until you answer their question. All without bothering to observe what you were doing beforehand. Your choice then is to step back the way you came to avoid the creepy sidewalk-blocking people. Ridiculous, creepy and unacceptable in real life but essentially exactly how web sites treat their visitors.

Lxr 1 hour ago 1 reply      
This is what happens when you mindlessly optimise for metrics like user signups - not surprisingly, you get more signups when you block the content with an obnoxious modal but after a few years, you wonder where all your users went. How much value do you really get from a user who was forced to sign up with a gun to their head?
bernardlunn 2 minutes ago 0 replies      
I run a newsletter biz and avoid all these gimmicks. Real attention is what matters not phony metrics - even if attention is hard to measure.
jo909 21 minutes ago 1 reply      
Yes, I want to subscribe to your newsletter!

Automatically with a browser plugin, with an address that will even accept your mails. Unfortunately no human will read them in the end, but I'm sure your metrics will be great. I might even accept a cookie so you know I'm already subscribed to your great newsletter.

Now if everybody were to do that...

pcmaffey 48 minutes ago 2 replies      
No, I don't want to chat with someone either.

So please stop beeping at me and popping up a chat modal.

digitalengineer 1 hour ago 5 replies      
Yes it's annoying but they work:

"sticking a big ole pop-up in their face can be one of the most effective ways to jolt their attention & grab their email for a return visit." Peep Laja. https://conversionxl.com/popup-defense/?hvid=2EcGFw

pabloarteel 1 hour ago 2 replies      
I think Google is fighting this by giving a lower rank to pages that do this[1].

[1]: https://webmasters.googleblog.com/2016/08/helping-users-easi...

PortableCode 54 minutes ago 0 replies      
I usually subscribe postmaster@(domain) to the newsletter, as RFC 822 requires that account to be active ;)
tlow 35 minutes ago 2 replies      
From the perspective of crowdfunding

1. The crowdfunding sites themselves maintain HUGE newsletter lists and use very advanced analytics to determine what to place in those newsletters.

2. For campaigners, the size and activity of your email list is a huge factor in determining your campaign's success. Just like with this web tool https://www.thunderclap.it/about, sending a direct email blast to a good list can mean the difference between a successful hard launch and campaign and a lackluster or failed one. The email lists of the sites themselves, which feature several campaigns, are hugely influential on campaign success, and in my experience have at least once led to 4x our total raise goal from a single platform newsletter feature of our campaign.

Sometimes people do want to be notified. Newsletters are something of a different issue, but the case above seems like a newsletter to me. Especially because we used our first campaign backers + second campaign + interest landing pages and social media gathering email campaigns to continually send emails about new campaigns and products.

Essentially, therefore, I'm arguing that the ability to gather a quality, targeted email list and run a recurring newsletter without a 10%+ attrition rate [1] is both difficult and valuable.

[1] CAN-SPAM compliance requires an unsubscribe link; my personal interpretation is that 1-click unsubscribe should be the rule, with no email-settings pages hidden behind login walls. Good design is honest. Crowdfunding that requires physical goods production in quantity is very difficult for the uninitiated. And then it remains difficult and time-consuming over time, and requires constant attention. This is essentially a scaling problem, but in the physical world. Many of the failed-to-deliver crowdfunded projects are not so much dishonest as naive, but also consider Jobs' thoughts on the subject

> great artists ship

though Dieter Rams (most famous living Industrial Designer) says

> designers are not fine artists who we are often confused for

stevesearer 1 hour ago 0 replies      
Placing the newsletter signup form at the bottom of the articles on my site is my preferred method. Only engaged readers get there, and those are the people I want subscribing. Adding double opt-in also weeds out mistaken subscribers.
srigi 42 minutes ago 1 reply      
How about an extension with some kind of machine learning? You know the pattern: all 4 borders and all 4 corners going dark without user interaction, while some kind of form input appears at the highest z-index.

The extension would delete that DOM subtree right away, plus do some kind of cloud harvesting from users reporting false positives.

sklivvz1971 7 minutes ago 0 replies      
Another good use case for spam blockers. They work wonderfully for this.
neogodless 1 hour ago 0 replies      
I'm not sure the headline succinctly conveys the prime message of the article, which is that modals have replaced pop-ups as a nuisance.

In general, I agree. My reaction to most modals is to simply close the tab. Often it's halfway through an article. I can't be bothered to finish it if I'm being interrupted rudely.

jasonkostempski 1 hour ago 1 reply      
NoScript unless they earn the privilege of executing code. I wish whatever spec would have covered the concern had stated script execution MUST prompt the user.
0x006A 1 hour ago 1 reply      
I started just closing the tab if a site thinks I want to sign up before reading the content. If sites are so desperate for the quick fix, I do not expect them to have good content below that modal dialog.
angvp 1 hour ago 0 replies      
I do agree with the article, but sites continue to ask me for stuff. What I do is give fake data; nothing is more painful than that.
morley 1 hour ago 1 reply      
The full-screen modal-with-windowshade newsletter prompts are super annoying. But I realize bloggers will always want to "expand their reach," to use the distasteful marketing term. I'd much prefer the prompt slid in from the side quickly, or, ideally, appeared in the site's sidebar. I'm guessing they're used everywhere because you can drop in a code snippet that does everything for you, and the vast majority of those snippets are the "in your face windowshade" variety.

I don't know what to do about this situation other than to write my own paste-in package for newsletter signups, which I don't really have time for. I guess the best thing I can do is announce: if your newsletter prompt doesn't cover the main content of the page, I'm much more likely to subscribe (~20%) than if it's a modal + windowshade (0%).

eXpl0it3r 22 minutes ago 0 replies      
I wish popup-/ad-blockers would start picking up these modal popups as well.
designium 48 minutes ago 0 replies      
You can suppress the modal pop-ups, at least the majority of them, by detecting whether they use Bootstrap or Foundation and adding CSS to override the modal classes from those libraries. Most sites use one of those two frameworks anyway.
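A sketch of what that CSS override might look like, as a user stylesheet. It assumes the stock class names that Bootstrap and Foundation ship with; sites that rename or re-skin their modals would need their own rules:

```css
/* Hide the library-default modal and its backdrop/overlay. */
.modal,
.modal-backdrop,     /* Bootstrap */
.reveal,
.reveal-overlay {    /* Foundation (Reveal) */
  display: none !important;
}

/* Both libraries also lock page scrolling while a modal is open;
   restore it so the page stays usable. */
body.modal-open,         /* Bootstrap */
html.is-reveal-open {    /* Foundation */
  overflow: auto !important;
}
```

The `!important` flags are needed because a user stylesheet otherwise loses to the site's own inline styles and specificity.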
Lio 35 minutes ago 0 replies      
...and no, I don't want to log into your walled garden Facebook, LinkedIn, etc.
rdiddly 11 minutes ago 0 replies      
To put a finer point on it, no, I'm not too dumb to find a regular ordinary signup link somewhere when I'm interested, and no, I didn't accidentally forget to click yours. Face it, website, I'm just not that into you!
davidgerard 52 minutes ago 0 replies      
This is why Google needs to start penalising lightboxes.
edw519 1 hour ago 1 reply      
There's no keyboard shortcut you can use to get rid of them.

Sure there is:

 1. Alt-F4
 2. Ctrl-Alt-Del

A Russian Poetry Scandal That Ended in a Duel atlasobscura.com
16 points by lermontov  5 hours ago   1 comment top
Tycho 1 hour ago 0 replies      
Does anyone know good poetry accounts to follow on Twitter? I follow quite a few accounts that just post quotations or snippets from famous writings. I think this would work well for poetry.
Intent to Deprecate and Remove: Trust in Existing Symantec-Issued Certificates groups.google.com
693 points by ehPReth  23 hours ago   290 comments top 33
stevecalifornia 22 hours ago 3 replies      
TLDR: Google has lost trust in Symantec's ability to properly validate certificates they issue. Chrome has a Root Certificate Policy that expects a CA to perform in a manner commensurate with the trust being placed in them and the Google team appears to see evidence that they are not living up to the standard laid out.

They propose a gradual distrust of existing certificates by reducing the 'maximum age' of the certificates with each release of Chrome.

EV certificates are proposed to have their EV indicators stripped immediately until Symantec, for one year, demonstrates sustained compliance.

jwarren 21 hours ago 2 replies      
If anyone here hasn't realised, Symantec bought VeriSign's certificate business back in 2010 - which owned many brand names, like GeoTrust, Equifax, Thawte etc. You can see a list of their root certs here: https://chromium.googlesource.com/chromium/src/+/master/net/...

In case you missed it at the bottom:

> From Mozilla Firefox's Telemetry, we know that Symantec issued certificates are responsible for 42% of certificate validations

NateyJay 23 hours ago 4 replies      
This is huge, Symantec owns about 15% of the SSL certificate market[1], and as stated in the article, has issued 30% of in-use certificates. No certificate authority of this size has ever been raked over the coals like this.

[1] https://w3techs.com/technologies/history_overview/ssl_certif...

tlrobinson 21 hours ago 4 replies      
I've often wondered: why is trust in CAs an all-or-nothing proposition (aside from EV certs), and why should my particular browser vendor have all the authority over who I should trust?

For the vast majority of users that's probably just fine, but I would have thought that there'd be a browser or extension or something that allows security-conscious power users more fine-grained control over this by now.

For example, I could subscribe to changes in CA trust levels from every major browser vendor, and if they don't agree my browser could show me a warning with an explanation.

Or I could subscribe to feeds from other entities I trust, like the EFF. Or my security-conscious friends.

Or if I decide I have lower trust in certificates issued by governmental CAs, or CAs in certain regions, I could mark them as lower trust.

Basically a web of trust for CAs.
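A toy sketch of that aggregation idea. All names, feed shapes, and trust values here are hypothetical; a real implementation would consume signed feeds from the vendors themselves:

```javascript
// Combine trust verdicts for a CA from several subscribed sources
// (browser vendors, the EFF, friends) and flag disagreement.
function evaluateCa(ca, feeds) {
  const verdicts = feeds
    .filter(f => ca in f.trust)
    .map(f => ({ source: f.source, trusted: f.trust[ca] }));
  const trusted = verdicts.filter(v => v.trusted).length;
  if (verdicts.length === 0) return { ca, status: 'unknown', verdicts };
  if (trusted === verdicts.length) return { ca, status: 'trusted', verdicts };
  if (trusted === 0) return { ca, status: 'distrusted', verdicts };
  return { ca, status: 'disputed', verdicts }; // sources disagree -> warn the user
}

// Invented example data:
const feeds = [
  { source: 'chrome',  trust: { symantec: false, letsencrypt: true } },
  { source: 'firefox', trust: { symantec: true,  letsencrypt: true } },
  { source: 'eff',     trust: { symantec: false } },
];
```

A browser extension could surface the `disputed` status as the warning-with-explanation described above.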

thenickdude 17 hours ago 0 replies      
It looks like the questions that Google/Mozilla asked Symantec and didn't like the answers to are posted here:


(Archive link: http://archive.is/Cq9VO )

Really interesting reading!

gigatexal 19 minutes ago 0 replies      
I'd rather trust Google to be a/the de facto CA than some has-been AV company.
fosco 35 minutes ago 0 replies      
I do not know how to tell whether this has been accepted; doesn't it require 3 others to approve and say 'looks good to me' for this proposal to continue?

sorry if I missed this...

zie 22 hours ago 1 reply      
I was curious if this would affect my Symantec issued certs... according to my date math:

Chrome 59 (Apr 13, 2017) +1023 days: 2020-01-31

Chrome 60 (May 25th, 2017) +837 days: 2019-09-09

Chrome 61 (Jul 20th, 2017) +651 days: 2019-05-02

Chrome 62 (Aug 31st, 2017) +465 days: 2018-12-09

Chrome 63 (Oct 12th, 2017) +279 days: 2018-07-18
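zie's date math checks out; here is a quick sketch reproducing it, taking each Chrome release date plus its proposed maximum cert age in days (dates and day counts from the comment above):

```javascript
// Release date + max cert age (days) -> last acceptable notBefore+age date.
function capDate(releaseIso, maxAgeDays) {
  const ms = Date.parse(releaseIso + 'T00:00:00Z') + maxAgeDays * 86400000;
  return new Date(ms).toISOString().slice(0, 10);
}

console.log(capDate('2017-04-13', 1023)); // Chrome 59 -> 2020-01-31
console.log(capDate('2017-05-25', 837));  // Chrome 60 -> 2019-09-09
console.log(capDate('2017-07-20', 651));  // Chrome 61 -> 2019-05-02
console.log(capDate('2017-08-31', 465));  // Chrome 62 -> 2018-12-09
console.log(capDate('2017-10-12', 279));  // Chrome 63 -> 2018-07-18
```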

sandGorgon 22 hours ago 8 replies      
>All Symantec issued certificates. GeoTrust and Thawte are CAs operated by Symantec, simply afforded different branding.

>While this list may need to be updated for some recently created roots, https://chromium.googlesource.com/chromium/src/+/master/net/... may accurately capture the state of impact

Damn. There goes my certificate (RapidSSL). Anybody know which trustworthy certificate issuers remain?

No we cannot use LetsEncrypt for convenience reasons (we bake our certificate pub key in many places)

musicnarcoman 23 hours ago 2 replies      
> Intent to Deprecate and Remove: Trust in Existing Symantec-Issued Certificates

When I read that something like this popped up in my head:

"Google is using the nuclear option on Symantec. Neat!"

lukegb 22 hours ago 1 reply      
Google's also been looking to limit the maximum validity lifetimes in general through the CA/B Forum[1] in a ballot that ended up not passing (with hints[2] that Chrome would end up enforcing something similar itself even if it wasn't part of the Baseline Requirements).

This seems indicative of the general direction that Chrome wants to head in anyway[3].

[1] https://cabforum.org/pipermail/public/2017-January/009373.ht...

[2] https://cabforum.org/pipermail/public/2017-February/009746.h... - there was a more explicit post elsewhere but I can't find it in the archives right now

[3] https://twitter.com/sleevi_/status/829804370900426752

TenOhms 20 hours ago 2 replies      
It amazes me how often Symantec is in the news about the same subject, yet they seem to be incapable of learning a lesson from it.
benchaney 19 hours ago 0 replies      
Good. The security of the certificate system depends on an incentive model where misbehavior is punished with revocation. That's important not just because we shouldn't keep trusting individual authorities after they have demonstrated themselves untrustworthy (although that is an important factor), but also as a punitive measure to disincentivize misbehavior.
dantiberian 19 hours ago 1 reply      
Not much focus in the comments has been put on the 9-month certificate validity, but it seems like that is going to be almost as big a punishment for Symantec as the deprecation. Because dealing with SSL is so painful for some large corporates, being told that you have to go from a 3 year renewal schedule to a 9 month one would be enough to cause many to go looking elsewhere.
unabridged 21 hours ago 1 reply      
SYMC is at an all-time high, and every one of their EV certificate customers is about to start talking to different security vendors. The Jan18 and Jan19 near-ATM puts are quite reasonably priced.
marina8888 21 hours ago 0 replies      
Related: not valid certs should show a red warning after accepting them: https://bugzilla.mozilla.org/show_bug.cgi?id=1349897

Is this for all certs or just Symantec certs?

praptak 22 hours ago 2 replies      
It's a bit scary how much power browser creators wield. Even if it's being used for good.
kartickv 12 hours ago 2 replies      
Why not immediately begin treating these connections as plain HTTP? Don't show the padlock or "Secure".

Don't fail the connection, so people will still be able to use the site, but don't present it as secure.

This would be a stronger action than treating EV certs as non-EV, which only a few geeks will notice. Or reducing the maximum age of certificates.

Ajedi32 21 hours ago 2 replies      
Anyone have any more information on the incidents that triggered this response? I was able to find this article on Google's Security Blog: https://security.googleblog.com/2015/10/sustaining-digital-c...

But that's almost 2 years old. Have there been any more recent incidents that I'm unaware of?

Animats 21 hours ago 3 replies      
What's Mozilla doing?
WestCoastJustin 19 hours ago 1 reply      
We need a service to check if your certs could be flagged as bad. Especially since this spans so many brands (Symantec, Equifax, VeriSign, GeoTrust, Thawte, etc). Something like, plug in the domain name, and it'll tell you if you need to update the cert.
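A rough sketch of such a checker. The brand list comes from this thread; `needsReplacement` and the substring matching are illustrative assumptions, and a real tool would match the chain against Chrome's published list of affected roots instead:

```javascript
// Brands operated by Symantec, per the discussion above.
const SYMANTEC_BRANDS = ['symantec', 'verisign', 'geotrust', 'thawte', 'equifax', 'rapidssl'];

// Given a cert's issuer string (e.g. from `openssl s_client`),
// flag it if the issuing brand is one that Symantec operates.
function needsReplacement(issuer) {
  const lower = issuer.toLowerCase();
  return SYMANTEC_BRANDS.some(brand => lower.includes(brand));
}

console.log(needsReplacement('CN=GeoTrust Global CA, O=GeoTrust Inc.')); // true
console.log(needsReplacement('CN=DigiCert SHA2 Secure Server CA'));      // false
```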
smartbit 13 hours ago 0 replies      
From Chris Palmer, Mar 23, 19:33 (9th answer):

> Combined with the gradual move to certificates with shorter lifespans anyway (as a way of coping with problems like this, and with the difficulty of certificate revocation generally), automation is a necessity going forward.

Interesting. What are the effects of automation? E.g. is it possible to automate the update of EV certs? Is this opinion fueled by the idea that websites should be hosted in the public cloud? What is OWASP's stance?

Where can I find more on pro & con automated cert renewal? Around the time when Letsencrypt was introduced, there must have been someone who wrote an informed article/blog on automated cert renewal.

justinclift 21 hours ago 1 reply      
Symantec being, well... Symantec, I'd expect them to lawyer up in order to delay, or outright block this.

They're big enough to afford the higher end law firms likely needed too. :(

marina8888 21 hours ago 3 replies      
But why is https://w3techs.com/technologies/history_overview/ssl_certif... saying Lets Encrypt has 0.1% when https://letsencrypt.org/stats/ says 32 million Fully-Qualified Domains Active?

If 32 million = 0.1%, then 100% = 32,000 million SSL certs?

? what?

voidlogic 18 hours ago 3 replies      
Unless Mozilla and IE go along with this, affected orgs could just inform users that Chrome is not a supported browser? Have we heard from the other browser vendors?
zie 22 hours ago 1 reply      
What is the Chrome release schedule? i.e. what is the timeline for the 59-64 releases to happen? A quick Google isn't getting me an answer (and I don't use Chrome, so I don't really pay attention).
hughw 18 hours ago 1 reply      
"...pour encourager les autres"
bandrami 16 hours ago 0 replies      
PKI is broken. The problem isn't the crypto, it's the wardens.
shkkmo 22 hours ago 1 reply      
Is there a list anywhere of which EV cert providers use Symantec as a CA?
mtgx 23 hours ago 1 reply      
I'd be very surprised if Symantec doesn't have some backroom deal with intelligence agencies, and not just in the U.S. either, especially since they've acquired BlueCoat - a "security company" known for selling surveillance tools to authoritarian regimes - and after they made the BlueCoat CEO the CEO of Symantec.
wasntme 22 hours ago 1 reply      
Are they just hoping to drive business to their new CA: Google Trust Services?
jwildeboer 21 hours ago 2 replies      
TL;DR Google prefers to override what the standards say about the validity of certificates instead of doing the logical thing: stop trusting Symantec root certs. A dangerous precedent.
nikanj 22 hours ago 1 reply      
TLDR: for the next few months, expect to tell your relatives "just click yes on the big red warning dialog". Not sure if this is good conditioning.
Drag Change the way you work inside Gmail dragapp.com
62 points by nicktimms  13 hours ago   37 comments top 13
chris9397 2 hours ago 2 replies      
NO, I'm not going to get 3 friends to sign up before I get a chance to try this out. They want us to help spread the word about an app we can't use yet, wtf.
bartj3 8 hours ago 4 replies      
I like the idea but https://inbox.google.com already solved this for me.
mosselman 7 hours ago 1 reply      
Here is a great article that argues pretty well for why e-mails aren't tasks:


One of the arguments that directly comes to mind when I look at dragapp's screenshots is that parsing messages to find out what the underlying task is just takes too much time. Look at the image and you'll see that there is no way to see what part of "Chrome extension for Gmail" is in to-do, progress or complete. You have to read each e-mail to find out.

Backup screenshot: http://imgur.com/a/VtyyO

arkadiyt 9 hours ago 3 replies      
You can't deploy https but you want me to trust you with my email? No thanks.
dack 2 hours ago 1 reply      
I like the idea, but imo we need to treat inbound email differently than a task list. I think inbound email should have a rich set of filtering/grouping/bulk archiving tools to sift through the junk quickly, and then a seamless way of converting the actionable items into to-dos. I don't think Gmail handles either of these parts very well.

When you think about it, gmail is just one input into your list of things to do for the day - so you could imagine other integrations like Github PRs that end up there. Anyway, I'm glad people are still working on this problem, I consider it very much unsolved.

nkristoffersen 8 hours ago 1 reply      
Agreed with the other comment. This crowd definitely requires HTTPS.

Perhaps move to a host that helps you with that if you are unsure how to deploy HTTPS?

jlebrech 5 hours ago 0 replies      
I just did: before:2017/01/01 in:inbox is:unread

and selected all to delete.

no more annoying count of unread emails from years ago.

franze 8 hours ago 1 reply      
A tasklist everybody can push tasks to, what can possibly go wrong?

Email != ToDo List

janwillemb 7 hours ago 0 replies      
No tool, in particular on top of email, will solve the lack of a proper process. Tools like these will maybe work for some time, but it will be a mess in no time.

GTD has been working for me for 10 years now, with simple lists. I think part of its strength is that the GTD framework includes a method to "start over again" after your system inevitably gets cluttered.

timewarp256 4 hours ago 0 replies      
In both Gmail and Inbox I miss emails when two come in at once in the same thread - wish they could solve that without turning off convos altogether.
gcatalfamo 5 hours ago 1 reply      
I would love to try it, but one question: how likely is it that I adjust my gmail workflow with "drag" and then it becomes paid-only?
ycombinete 3 hours ago 0 replies      
I thought Hackernews only used Fastmail?
dddw 8 hours ago 1 reply      
looks very much like sortd
Apple says recent Wikileaks CIA docs detail old, fixed iPhone and Mac exploits techcrunch.com
150 points by gerosan  13 hours ago   54 comments top 11
tptacek 12 hours ago 7 replies      
If you're not familiar with the iPhone platform and you're interested in just one technical detail to help navigate these stories, let it be this: the iPhone 3G platform bears very little resemblance to the modern, post-touch-ID phone. The platform security system at every level, from boot chain to hardware domains to OS security, evolved more in the last 10 years than any previous platform had in 20 years prior.

That doesn't make an iPhone 7 impregnable, but it should inform any analysis you do of stories about phones being tampered with "starting in 2008"; that's a little like talking about SMTP server security "starting in 1993".

sohkamyung 12 hours ago 0 replies      
Yes, it is an old exploit. This ArsTechnica article [1] has more on the timeline

[1] https://arstechnica.com/security/2017/03/new-wikileaks-dump-...

chillaxtian 12 hours ago 0 replies      
If you're interested in how iOS security works, Apple publishes white papers on the subject.


kyleblarson 44 minutes ago 0 replies      
Apple fixed those particular exploits, yes.
throwmesomeseo 4 hours ago 1 reply      
Keep in mind, not everyone in the world has the newest shiny iPhone 7. The HN crowd is probably not representative of the average iPhone user.
doggydogs94 10 hours ago 2 replies      
The CIA exploits are important because most people never update anything. It doesn't matter if you have fixed the OS for the exploit if the fix is never installed.
tyingq 12 hours ago 0 replies      
I wonder how old the leaked CIA docs are though. Are there any contextual clues that it's current?

Someone might have sat on a copy for years before leaking.

Edit: Quick scan shows there are some docs with dates in 2013, 2014, 2015. So at least some of it is fairly recent. No real way to tell, though, if it was all pulled at once, assembled over time, etc.

pfarnsworth 10 hours ago 1 reply      
CIA must have a bunch of embedded workers at Apple, Google, etc all adding subtle bugs that can later be used to hack the devices and services. I imagine other intelligence agencies must have them too. If they don't, then they're not doing their job.
kevindong 8 hours ago 0 replies      
> Based on our initial analysis, the alleged iPhone vulnerability affected iPhone 3G only and was fixed in 2009 when iPhone 3GS was released.

"fixed" probably isn't the right word.

freshyill 6 hours ago 2 replies      
If there were ever any doubt that Wikileaks is a bad actor, let this be the proof.

Regardless of the fact that this is a patched, nearly decade-old exploit, they're trying to make a scene rather than go through ethical channels.

UpDownLeftRight 8 hours ago 1 reply      
This is the same Apple that has maintained on their website that their OS is "secure by design" and no additional security steps are needed.

See http://cc.bingj.com/cache.aspx?q=%22secure+by+design%22+site...

Google Team Refines GPU Powered Neural Machine Translation nextplatform.com
84 points by Katydid  14 hours ago   18 comments top 4
visarga 11 hours ago 0 replies      
TL;DR - It's the code from "Massive Exploration of Neural Machine Translation Architectures", where they run extensive hyperparameter search using 250K hours of GPU time.

Direct GitHub link: https://github.com/google/seq2seq

Arxiv: https://arxiv.org/abs/1703.03906

rawoke083600 8 hours ago 3 replies      
I'm surprised they used K80s and not something like the 1080*. Surely they don't need the double-precision performance? Am I missing something?
braindead_in 10 hours ago 1 reply      
Why not use DGX-1? Surely Google can afford a bunch of these.
microcolonel 12 hours ago 4 replies      
Seems it mishandles the difference between Premiers and Prime Ministers.
Modern JavaScript for Ancient Web Developers postlight.com
571 points by rmason  1 day ago   273 comments top 49
Waterluvian 21 hours ago 11 replies      
The trick to being successful with JavaScript is to relax and allow yourself to slightly sink into your office chair as a gelatinous blob of developer.

When you feel yourself getting all rigid and tense in the muscles, say, because you read an article about how you're doing it wrong or that your favourite libraries are dead-ends, just take a deep breath and patiently allow yourself to return to your gelatinous form.

Now I know what you're thinking, "that's good and all, but I'll just slowly become an obsolete blob of goo in an over-priced, surprisingly uncomfortable, but good-looking office chair. I like money, but at my company they don't pay the non-performing goo-balls." Which is an understandable concern, but before we address it, notice how your butt no longer feels half sore, half numb when in goo form, and how nice that kind of is. Ever wonder what that third lever under your chair does? Now's a perfect time to find out!

As long as you accept that you're always going to be doing it wrong, that there's always a newer library, and that your code will never scale infinitely on the first try, you'll find that you can succeed and remain gelatinous. Pick a stack, then put on the blinders until it's time to refactor/rebuild for the next order of magnitude of scaling, or the next project.

tedunangst 23 hours ago 4 replies      
> If you don't have a human expert at hand, at the very least, check the date on that Medium article or tutorial or the last commit in that GitHub repository. If it's more than a year old, it's almost certainly not the way to go.

The logical conclusion then is to wait a year and ignore everything this article says. Saves a lot of time. :)

acemarke 23 hours ago 1 reply      
I can highly recommend Sacha Greif's article "A Study Plan to Cure Javascript Fatigue" ( https://medium.freecodecamp.com/a-study-plan-to-cure-javascr... ) as a great place to start. It gives an excellent series of steps for tackling modern Javascript concepts one piece at a time: Javascript, React, ES6, and state management.

For anyone interested in learning React, here's my standard advice:

You should start out by reading through the official React docs and tutorial at https://facebook.github.io/react/, and use the official Create-React-App tool ( https://github.com/facebookincubator/create-react-app ) for setting up projects. It creates a project with a solid build setup, with no configuration needed on your part.

Past that, I keep a big list of links to high-quality tutorials and articles on React, Redux, and related topics, at https://github.com/markerikson/react-redux-links . Specifically intended to be a great starting point for anyone trying to learn the ecosystem, as well as a solid source of good info on more advanced topics.

mwpmaybe 23 hours ago 5 replies      
Great post (and the title got a laugh out of this self-described "ancient" web developer), but interestingly, there's not much (any?) info in there about learning JavaScript/ECMAScript the language. It and the articles it links are mainly focused on the ecosystem and tooling.

I held JS at arm's length for way too long, trying to use it without actually learning it. I finally grew tired of being illiterate and read "You Don't Know JavaScript" by Kyle Simpson[0] and it was eye-opening. The preface[1] really spoke to me and I suspect it will speak to many others like me. I only wish I'd read it two years ago.

0. https://github.com/getify/You-Dont-Know-JS

1. https://github.com/getify/You-Dont-Know-JS/blob/master/prefa...

nathan_long 21 hours ago 2 replies      
> Things haven't settled down long enough for curriculums and guides to gel and mature, and for best practices to become authoritative for more than a few months... When you're working with an ancient language like PHP, you Google a question or problem, and almost 100% of the time you will find a 5-year-old Stack Overflow answer that solves it... Not so much with modern JavaScript

OK, so everything's changing all the time, everything you Google will be out of date, but it's worth it because... why, exactly?

Node doesn't block on IO. OK, cool. But the Erlang VM doesn't block on IO, either, AND it's not single threaded, so you can actually do parallel computation if you need to.

JS has a lot of raw speed these days. OK, cool. But raw speed is generally less important than good algorithms and parallelism, and if we really need raw speed in other languages, we can generally call out to C or something.

Rust has a unique value proposition: safety and speed. Erlang/Elixir have one: supervision and easy parallelism. Ruby has one: well-worn tools for web development.

I don't understand the value proposition for Javascript on the server, other than "hey, you already know Javascript, right?", to which the answer is "LOL I thought I did but apparently everything I know is wrong all the time".

I think I'm going to sit out the Javascript frenzy until it's all very stable and Googling gets me answers that aren't So Last Week. And maybe we'll be writing Rust for browsers via WebAssembly by then, anyway.

smdz 20 hours ago 5 replies      
One of the worst problems I see for the new JS developers is the tooling-overload. Being an experienced developer, I know why that tooling is required - but new devs are just overwhelmed with the information out there (and understandably so). And most of the todo-list projects tend to over-simplify the tooling. It takes time to learn the tooling and understand its importance

In a typical React app (+TypeScript), I have at least 40 dependencies in package.json (frontend-only). I know exactly what each dependency is required for. And each time I have a new app, I create that package.json one line at a time - just to make sure I don't add unnecessary dependencies.

But for a JS newbie, knowing which dependency is which and why is pretty overwhelming, tedious and uncomfortable. This is assuming they already spent time learning JS and the frameworks in the first place. It's easier to give up that important learning when you have to deliver "just a landing page in Angular" and answer to managers. After a few iterations, the result is a disaster that is difficult to maintain, buggy and much more difficult to add features to. Ultimately that adds to the development/maintenance cost.

I know why the JS tooling is complex, but sometimes I wish JS projects were as simple (and controlled) as creating C# Windows apps using VS - where the tooling is mostly hidden

mybrid 18 hours ago 1 reply      
As an Ancient Developer I see the problem with Javascript as not being the language, but the ecosystem.

1. Async programming. Async programming is hands down the aspect of programming that college students do the worst at. As a TA at Berkeley I can say it was one of the hardest aspects of programming for students to do well in, and yet Javascript starts with a totally async model.

2. No testing. Some of this has to do with Javascript starting life as web browser only. Some of this has to do with writing test code for async programming. How does one write async test code? Very, very carefully.

3. Not invented here. Javascript is not a young language. It has been around more than 10 years and yet Javascript has the same chaotic ecosystem as a first-year language. That's a permanent mindset, or ecosystem.

The natural conclusion of this kind of ecosystem is to eventually suggest developers just run in kernel mode and bypass user space altogether. Asynchronous code such as database pooling is hard to get correct. This is why an app server has financial value: you pay for the hard stuff. And, get this, the hard stuff can be abstracted away. Same with distributed clustering. Because these kinds of difficult async services can be black-box abstracted, we have a kernel, an OS and user space. Javascript challenges that notion, and all the experience and history of why we have kernels to begin with. Javascript is asking developers to in effect become kernel developers, without doing testing, and within an ever-changing 'not invented here' chaotic code base.

Does not bode well. Has nothing to do with language though.

ben_pr 21 hours ago 5 replies      
I see that JavaScript has its place in the browser, but the whole back-end thing scares me. The ugly code (callbacks, etc.), npm injecting only God knows what into your back-end servers, and the tons of workarounds for trying to make JS not so ugly are over the top.

The JS everywhere is so much like the "only tool you have is a hammer, so every problem looks like a nail" thing, it's amazing.

Creating a simple, secure, extensible middle-tier is a solved problem and is not in need of JS trying to solve it in a much more obtuse way. I've created many myself in everything from Delphi, PHP, C#, Groovy, to Java and I would never pick JS for that layer.

And a final thought, PHP used to get tons of bad press for being messy, etc, etc. But this JS stuff takes that mess to a whole new level. Perhaps PHP devs moved to node/js so they could make a mess and everyone would still think they are the cool kids?

bitwize 19 hours ago 1 reply      
An internet writes a cheerful, helpful guide for dealing with the fact that like a sort of digital ice-nine, JavaScript turns everything it touches into a sluggish, half-finished, poorly-integrated pile of shit much like itself. Most of Hackernews assures itself that this is the natural order of things. One Hackernews traces "new" and "exciting" JavaScript features as far back as SICP, the Torah of the Scheme cult; his voice, along with those of the other dissenters, is quickly drowned out.
zeteo 22 hours ago 3 replies      
> The sheer number of tools and plugins and packages and dependencies and editor setup and build configurations required to do it the right way is enough to stall you before you even get started.

At some point you have to ask yourself how much of this is good engineering (does it help the end user?) and how much it's just having fun and impressing fellow engineers. Is the complicated build process worthwhile just to obtain a nicer syntax (for the current definition of "nicer")?

narrator 21 hours ago 1 reply      
I think why there's so much churn in javascript frameworks is that in a dynamically typed system as soon as the framework objects get too big and complicated they become burdensome to remember and everyone starts chasing the next "bloat free" system. Then the "bloat free" system gains features and complexity and everyone gets annoyed and chases after the next thing. Ever wonder why Java codebases last forever and ever without being scrapped for the latest whiz bang every year? It's the static typing and the lack of cleverness in the language. The static typing helps manage the complexity and you can delete stuff and know you didn't break anything because the compiler will say so immediately. This is why Typescript is such a breath of fresh air. I actually get some help from the IDE like I'm used to with statically typed languages.
bborud 22 hours ago 2 replies      
The curse of having been around for a while is that after a while it looks like the world is on auto-repeat. The antidote for this is to take a deep breath, smile (without looking condescending), say "that's interesting" and then wait and see what can stand the test of time.
albb0920 23 hours ago 4 replies      
I was very frustrated when I tried to move to a gulp+sass+babel+uglifier toolchain. It took me an afternoon to set everything up before I could write any actual code.

The whole thing looks like it was designed for making big, long-term-maintained websites, not for helping you hack up a throwaway site quickly.

Stuff like gulp: even after I installed it globally, I still have to install it per project, which makes little sense to me.

It used to be just "compass init; compass watch", then start writing stuff, uglify the JS before deploy, done.

I still hate JS; I don't get why a language once hated by everyone is now sexy again. PHP, in contrast, has improved a lot since 5.3, and it's actually quite fun to write these days.

brandonmenc 22 hours ago 3 replies      

Rediscovering the horrors of setting up a Java web app in the early 2000s, except now you configure it with json instead of xml.
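To be fair, the JSON is at least terse. A typical Babel config of that era looked something like this (the preset and plugin names vary by project; this fragment is illustrative only):

```json
{
  "presets": ["es2015", "react"],
  "plugins": ["transform-object-rest-spread"]
}
```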

sparkling 22 hours ago 2 replies      
As an "ancient web developer": I will come back once you JS folks have agreed on a tool set and actually used it for more than 6 months.
skyisblue 18 hours ago 1 reply      
I'm an ancient web developer who would build entire web apps in Django. I've tried creating forms with React, but it took 10 times longer to build than it would have in Django. Maybe I'm just new to React, but I don't see how it would be faster even with more experience.

In Django I would just create a model, a form class and a template, and that would handle the rendering of the form, form validation, and saving the data into a DB. But with React you have to create the components and also a separate API to handle the retrieval and saving of data.

Are there any ancient developers out there building web apps faster with react/vuejs/angular than django/rails?

Sandman 5 hours ago 0 replies      
I've been long enough here on HN to remember how people used to laugh at Java in the comments, bemoaning the time it took to set everything up before you can start working with it (although it wasn't really that much trouble then, and it certainly isn't now) compared to how easy it was to start working with Javascript ("Hey, all you need is a browser and an editor"). How the tables have turned.
zlynx 22 hours ago 2 replies      
Really ancient web developers know that Javascript adds hardly anything to most web pages. Some HTML, CSS, a few iframes (or just frames!) and form POST, and you've got a web site that's faster than most of today's junk.

I think my reaction to modern Javascript is to just ... don't.

icc97 17 hours ago 0 replies      
I attacked the modern Javascript approach through first focusing on functional programming.

1. Python + functional programming in Python

Python is hardly a pure functional language, but it's lovely and simple and has all the core concepts including list comprehensions. This leads you on to...

2. Haskell

If you want to find a pure functional solution to a Python problem, first search for the Haskell one and translate it. Then read Learn You a Haskell [0] which was the funniest programming book I ever read and almost, almost taught me about monads (I had it for a second, then tried to explain it in Python and all was lost)

Now you can relax cause the hard bit is done.

3. Read Javascript the Good Parts and only pay attention to the functional programming bits. Suddenly mentions of currying aren't so scary.

4. Work your way through the funfunfunction [1] videos, especially the functional playlist [2] and for added bonus he has videos where he works through the first few chapters of Learn You a Haskell.

Then you've got map, reduce, filter all completely under control. Now immutability makes more sense, arrow functions don't look so strange, promises are just friendly monads really and we all love those.

Now you've got Immutable.js [3], lodash, underscore all reasonable to understand.

React's moaning about state and pure functions makes reasonable sense.

5. Babel really isn't that hard. Just following the Meteor + React tutorial [5] got that all working without me really noticing. Then, holy moly you're all reacted up, with JSX and pure sweet smelling functions.

6. Follow some of Dan Abramov's excellent blog posts such as about getting eslint up and working in Sublime Text [4].

Yeah that's as far as I've got, but adding in Redux to this mix doesn't seem so scary, at least I understand the language now. Angular will just have to wait.
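To make the map/filter/reduce and currying payoff above concrete, here's a minimal hypothetical sketch (names like `add` and `doubledEvens` are invented for illustration, not from any of the linked materials):

```javascript
// Currying: a function that takes its arguments one at a time,
// returning a new function at each step.
const add = a => b => a + b;
const increment = add(1); // partially applied

// map/filter/reduce replace most hand-written loops.
const numbers = [1, 2, 3, 4, 5];
const doubledEvens = numbers
  .filter(n => n % 2 === 0)        // keep even numbers: [2, 4]
  .map(n => n * 2)                 // double each: [4, 8]
  .reduce((sum, n) => sum + n, 0); // sum: 12

console.log(increment(41));  // 42
console.log(doubledEvens);   // 12
```

Once these click, lodash, underscore, and React's insistence on pure render functions are mostly a matter of vocabulary.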

 [0]: http://learnyouahaskell.com/
 [1]: https://www.youtube.com/channel/UCO1cgjhGzsSYb1rsB4bFe4Q/
 [2]: https://www.youtube.com/playlist?list=PL0zVEGEvSaeEd9hlmCXrk5yUyqUag-n84
 [3]: https://www.youtube.com/watch?v=I7IdS-PbEgI
 [4]: https://medium.com/@dan_abramov/lint-like-it-s-2015-6987d44c5b48
 [5]: https://www.meteor.com/tutorials/react/creating-an-app

mark242 22 hours ago 2 replies      
So let me get this straight. You need to go through all this code in 2017:


...in order to replicate dispatchEvent and addEventListener?

ilaksh 16 hours ago 2 replies      
Except she stopped before learning async/await, which is how you usually want to do async in Node.js now. It's even finally built into new versions without transpilation.
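For anyone who hasn't seen it, a minimal async/await sketch (a generic illustration; `delay` and `main` are invented names):

```javascript
// Wrap setTimeout in a Promise so it can be awaited.
function delay(ms, value) {
  return new Promise(resolve => setTimeout(() => resolve(value), ms));
}

// async/await lets callback-style flows read as straight-line code.
async function main() {
  const first = await delay(10, 'first');
  const second = await delay(10, 'second');
  return [first, second];
}

main().then(result => console.log(result)); // ['first', 'second']
```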

I'm not ancient, but I am 39 and have been doing web programming for many years, going back to the days when PHP was popular. I switched over to primarily JavaScript and Node.js around 6 years ago.

To all the haters: there are very practical reasons I started using Node on the backend years ago. Trying to build a collaborative real-time web application with Twisted and whatever was a huge pain. With something like Node.js it's much more practical, because you get async I/O and isomorphic code (same codebase on client and server).

Point being, JavaScript has been a big deal for quite a long time, and it hasn't taken _all_ older developers up until just now to catch on.

And now we are actually one or two trends removed from that initial wave of popularity. Plenty of the young guys who basically wrote the book on Node.js and created the most downloaded-ever npm modules moved on to Go or whatever years ago. And plenty of them decided to move on to Rust (or a few picked up Nim if they are smart).

wyldfire 22 hours ago 3 replies      
> Then JavaScript and its modern frameworks ate backend, frontend, and everything in between, and it was time to re-become a web developer in 2017who writes JavaScript.

Gee, I always thought that Javascript was a mediocre language, sufficient for its initial purpose. I assumed that node.js was just someone's idea of "wouldn't it be neat if I could have this on the backend too and use the same representation of data on backend + frontend?"

I have to admit it sure does sound popular these days. But I still am under the impression (delusion?) that a dedicated backend implementation in a better-suited-language is what I really need. That said, I have little experience with web dev.

bordercases 18 hours ago 0 replies      
My contribution would be to point out the Rule of Least Power to people again.


A lot of the time, when technology is created in a space where existing technologies already solve the problem, and even more so when these frameworks are an "extension" of such platforms, NIH syndrome stems from being unfamiliar with how the original technology solves the problem. Rather than RTFM and internalizing that way of thinking, the libraries get rebuilt. Even worse, some get adopted as standard by new developers who don't know better.

jQuery, motherfucker, I'm looking at you. HTML5 and CSS can go further than many expect, including myself.

beamatronic 22 hours ago 0 replies      
Outstanding. Sadly, your article simply confirms for me that I want to stay in my world which starts at the application server and ends with the database.
RikNieu 22 hours ago 2 replies      
I went from barely knowing html and CSS to slinging react/redux apps with es6, bundled with webpack, out for paying clients in less than a year.

If you just take your time, carefully go through tutorials and study a couple of open source projects you'll be fine. Especially if you already have years of development experience. You can do it!

bluepnume 23 hours ago 0 replies      
After burning a lot of time getting all the javascript tooling I wanted working, for the umpteenth open source front-end javascript library I wanted to publish, I threw together this:


Pretty opinionated about the tooling it uses, but good if you just want to say "Fuck it, I want to fork some boilerplate, write some code, and build/distribute it", then worry about the finer details and swap in the specific tooling you need later.

pmarreck 18 hours ago 0 replies      
I'm one of these old school backend developers.

I'm kind of still waiting for someone to build, like, an Erlang (or Elixir) for the frontend. Maybe compile actual Erlang to it via emscripten.

Then code up a way to interface with the DOM, as well as model it in tests, and I could be a frontend developer again! (except for CSS... hrm)

I'd be happy to never look at JS again, frankly.

I think Elm currently comes closest to what I'm looking for

leeoniya 23 hours ago 0 replies      
I also dislike forced tooling, which is why I wrote domvm [1], which can be used as a drop-in <script> with no tooling. It's a practical mix of virtual-dom, composable components, and OO (as needed).

Pardon the shameless plug, but the whole project started as an easy way to move from jQuery spaghetti to a fast data-driven virtual-dom view layer, so it seems fitting to post here.

[1] https://github.com/leeoniya/domvm

unit91 20 hours ago 0 replies      
This is exactly why I switched to ClojureScript for new projects, and have ported over a few from JS. One of the best decisions I ever made.
franciscop 20 hours ago 0 replies      
The main problem I've found (as a modern JS developer) is that in many situations "mainstream" (AKA C/C++) CS topics are not so relevant to JS. Javascript is async by default, function-based instead of OO (until recently), dynamically typed, and was made to mess with the DOM.

So I feel like some of the concepts taught in the most recommended books, such as Clean Code, are only tangentially relevant and sometimes just plain irrelevant. On the other hand, massive books/material written for pure JS are really low quality compared to these classics (except some blog posts, but with those it's difficult to get a rounded education).

So the most difficult part for me was finding out which concepts in high-quality books were relevant for JS and which were not, something that's not easy when you don't know the concepts yet. Then learn the things they don't mention that matter for a modern JS developer, such as NPM, exporting modules, minification+gzip+tree shaking, etc.

keithyjohnson 20 hours ago 0 replies      
Oh god here comes yet another JS framework - http://vanilla-js.com/
jksmith 23 hours ago 1 reply      
I'd like to take this time to show some love to ISAPI extensions and filters. Wrote some kickass stuff with that technology about 15 years ago (in Delphi of course).
matrix 21 hours ago 1 reply      
Great article. This quote in particular resonated for me:

"I had to let go of doing it The Right Way from the get-go, and allow myself to fumble through using suboptimal or just plain amateur setups just to get comfortable with individual tools"

Coming from languages that have good build and dependency management tools, I find JavaScript a highly unproductive developer experience. Yes, you figure it all out eventually, but I shudder to think how much developer productivity is wasted today on JavaScript ecosystem complexity and constant tooling and library churn.

Hopefully 5 years from now, we'll all be looking back in disbelief at what we used to put up with to build web apps.

Animats 21 hours ago 1 reply      
I had to let go of doing it The Right Way from the get-go, and allow myself to fumble through using suboptimal or just plain amateur setups just to get comfortable with individual tools.

Javascript started as a language for small programs. While it's possible to write well-structured large programs in Javascript, the language does not compel modularity. By default, everything is global. Modules and objects have to be built out of closures. There are many ways to do this, and if you have several libraries, each probably does it differently. Hence the pain level.

pweissbrod 21 hours ago 0 replies      
As someone who works heavily in the big data apache stack I will say that javascript's rate of change and deprecation wears me out!
nojvek 16 hours ago 0 replies      
Is it really that crazy complicated though?

Simple browser = script src=x
Simple node = index.js

What if you want some libraries included? Specify your libs in package.json and npm install.

What if you want to publish your libs? npm publish.

What if you want your dependency tree bundled for the browser as a single script? Webpack.

What if you want nice types, the latest features, and validation for larger projects? The TypeScript loader in webpack.

What if you want a nice state management library? Preact/React.

There's a ton of crap and different ways of doing things, but I bet the fundamental things, which solve a large % of use cases, are here to stay.

I would say things are getting simpler in some manner. You don't really need lodash and jquery nowadays. The standard js in browsers has improved quite a bit.

overcast 20 hours ago 0 replies      
Express/Node seems to work just fine for me at this point. I settled on Vue for the frontend, and that's about it. The bundling/minifying/whateverfying flavors of the week I've basically passed on. They always added so much more crap to a project, and in a lot of cases made everything bigger.
z3t4 17 hours ago 0 replies      
You don't have to use every damn framework and compile to JS language out there ... You will do just fine with vanilla JS!

The most important parts to learn about JavaScript: 1. Scope and closures, like the fact that your function can access variables from a parent scope, and that those variables live on after the parent scope returns. 2. Events and "async", like button.onclick = someFunction, where "someFunction" is called when a user clicks on the button. If you learn just those two concepts you will know more than 95% of the JavaScript programmers out there, and it will be more fun to write JavaScript! And all those frameworks are no longer needed.
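A minimal sketch of those two concepts (hypothetical code; the click dispatch is simulated so it runs outside a browser):

```javascript
// 1. Scope and closures: the returned function keeps access to `count`
// from its parent scope, long after makeCounter has returned.
function makeCounter() {
  let count = 0;
  return function () {
    count += 1;
    return count;
  };
}

const counter = makeCounter();
console.log(counter()); // 1
console.log(counter()); // 2

// 2. Events: register a function to be called later. In a browser this
// would be button.onclick = someFunction; here we fake the dispatcher.
const handlers = [];
function onClick(fn) { handlers.push(fn); }
function simulateClick() { handlers.forEach(fn => fn()); }

let clicked = false;
onClick(() => { clicked = true; }); // "someFunction"
simulateClick();
console.log(clicked); // true
```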

pjmlp 21 hours ago 1 reply      
I'd rather stay an Ancient Web Developer, using MVC tooling for Java and .NET, until WebAssembly gets mature enough.
Yokohiii 19 hours ago 0 replies      
"Modern" is the new "Nonsense".
diminish 21 hours ago 0 replies      
At some point someone will write an MS Access or File Maker Pro in JS. And finally a Visual Basic 6 client as well as an SQL database right in an single page JS app. At that point client side JS tools will stabilize. vb6.js will then last forever.
cs02rm0 23 hours ago 1 reply      
I know I'm struggling with modern JavaScript tooling.

I've been using React with JSX, and maybe as little as React Router on top. Trying to get the router added was so painful. NPM, Babel, webpack, grunt, gulp, etc., and then you find you've been given instructions for an old React version and you need different tooling for the new one. Or the instructions give you something that's really only suitable in dev, not production.

I actually gave up trying to add Flux and wrote my own implementation in vanilla JS which seems ludicrous to me.

rdiddly 20 hours ago 0 replies      
This is like one of those pharmaceutical ads that vividly tells you about all the hideous side-effects of the drug. The cognitive dissonance is so high that even if the benefits happened to outweigh the downsides, I wouldn't know or care because it's just too ridiculous to even think about. And I'm not suffering from any disease the drug claims to treat, either.
neves 13 hours ago 0 replies      
I liked the interactive workshops linked in the article. Do you have any other recommendation of good ones?
lorenzosnap 18 hours ago 1 reply      
mmm... I am surrounded by Java guys who worked with Swing (gone and forgotten), JSP and Struts (same story). They talk about the beauty of Java 8 and how they're waiting for Java 9. I wish I could tell them about this article, because out there plenty of people still treat JS as a toy.
SadWebDeveloper 20 hours ago 0 replies      
> New Problems, Not-Yet-Established Solutions

FTFY: Zero Problems, Not-Tested Solutions

stirner 17 hours ago 0 replies      
I think the PHP docs are very much "paralleled".
peterwwillis 22 hours ago 0 replies      
The only reason I want to learn about modern Javascript development is to learn how I can replace it all with a small shell script. I assume it's all basically a new, more complicated incarnation of PHP that can run in that horrifying bloated ubiquitous user interface from hell that everyone is addicted to. (Get off my lawn!)
brilliantcode 20 hours ago 1 reply      
The big objection to Javascript likely arises from what you saw 10-20 years ago. The fact is, the older you are, the more likely you are to reject it.

I'd say look hard at your own skepticism and actually test it out. While it doesn't make sense to write a blog using React/Redux, what one needs to get over is the fact that Javascript is no longer just "rendering documents" but the next evolution in software development, as everybody moves to a common codebase and components.

Why build complicated native applications when a thinly wrapped SPA in Electron will be indistinguishable to the customer? The only objection here is the engineer's ego and reversion to the familiar.

Enterprises will still run on Java, but weighing the ROI of managing legacy technical debt for a simple CRUD desktop application against utilizing current Javascript technologies will be a continued pattern, as Javascript wins more buy-in from corporate executives.

This is coming from the biggest Javascript skeptic. After learning that the problems Javascript was solving were already very difficult, I feel more comfortable speaking the most active language on the web. Nay, I feel anxious that I haven't fully caught up with Javascript technologies.

Sea Ice Extent Sinks to Record Lows at Both Poles nasa.gov
190 points by matteuan  16 hours ago   107 comments top 6
Clanan 2 hours ago 4 replies      
Already, the discussion here swings toward the climate change vs. "denialist" debate. Instead of politicizing it, maybe read the article? According to those interviewed, the trend until last year was to see more ice.

 > "It is tempting to say that the record low we are seeing this year is global warming finally catching up with Antarctica," Meier said. "However, this might just be an extreme case of pushing the envelope of year-to-year variability. We'll need to have several more years of data to be able to say there has been a significant change in the trend."
Ironically, all those accusing others of being "denialists" are themselves failing to look at this scientifically, and resorting to the rudely unscientific tactic of name-calling.

dzdt 14 hours ago 5 replies      
Look at this graph of global sea ice extent for every year since 1978. It's insane how much of an outlier the last 9 months are. Sesame Street level "one of these curves is not like the others" kind of outlier.


ideonexus 3 hours ago 4 replies      
In the DC area, we had two weeks of 70 degree weather in February. The cherry trees blossomed because of it, and then they wilted off the trees because temperatures dropped to normal.

I saw Fox News manage to put an anti-global-warming spin on it by reporting only that cold killed the cherry blossoms, leaving out the fact that they normally blossom in April.

bamboozled 5 hours ago 4 replies      
Ok, so this question has been asked before, but here goes!

What are we going to do about it?

I hear people say it's just too late to stop a catastrophe, but I find it hard to believe.

pastaziti 14 hours ago 4 replies      
batushka 2 hours ago 2 replies      
Comments about climate are so censored here. The bottom is full of flagged-out comments with replies. It's like reading a CIA doc with half the page blacked out.
OpenSSL Licensing Update openssl.org
176 points by JoshTriplett  15 hours ago   104 comments top 9
JoshTriplett 15 hours ago 4 replies      
This is a huge win. The incompatibility between OpenSSL and the GPL has been one of the most notable license incompatibilities regularly encountered in practice. With this change, OpenSSL will become compatible with GPLv3, which also makes it compatible with software licensed "version 2 or later". People will no longer need to choose and port to another crypto library for license reasons.
Animats 9 hours ago 1 reply      
The Rust replacement for OpenSSL is coming along well. [1] It's available under the Apache License version 2.0, the MIT license, or the ISC license. That should be the future direction.

[1] https://github.com/ctz/rustls

hedora 14 hours ago 3 replies      
Wow. Apache 2 has a patent license section that BSD-style licenses are missing. At first glance, it looks like this change is being done in bad faith.

Concretely: Contributing to apache 2 software potentially grants universal licenses for relevant patents you hold to anyone that uses the apache 2 software. Courts have not decided how viral this is. (What if I start with apache 2 code, make substantial changes, and apply it in areas the patent holder objects to, for example?)

I'm not a fan of software patents, but sloppy retroactive relicensing like this will create all sorts of legal ambiguity for users and contributors.

(Also, as the openbsd thread points out, apache 2 is license incompatible with existing openssl forks and other downstream software)

DannyBee 9 hours ago 1 reply      
So, the major question getting asked (and theo is trolling about elsewhere) is what happens if you don't respond to the emails.

Having helped out with many relicensing exercises at this point, i'm not sure why people are so worried.

The usual tack taken is: if you cannot affirmatively get consent, you remove the code and, if still needed, have someone not involved rewrite it.

I have yet to see anywhere in all of this that actually says they plan on doing anything different (i.e. Theo's trolling seems to be based not on reality). I looked at the press releases, details, and mailing lists. What am I missing?

tyingq 14 hours ago 0 replies      
I didn't understand the issue with the original license well, but found Wikipedia's overview: https://en.m.wikipedia.org/wiki/OpenSSL#Licensing
kstoneman 13 hours ago 1 reply      
Seems like OpenSSL should focus on fixing their code instead of risking the alienation of past contributors. I wonder if this is to satisfy some of the big donors who want to fix OpenSSL but won't until the license is changed. I know I'd be pissed if I had contributed and then got an email saying no response implies agreement.
Daviey 8 hours ago 0 replies      
Finally! This should mean Debian & Ubuntu can enable SSL in Squid!

EDIT: Dammit, Squid is GPLv2. It might make the situation worse due to the patent clause.

So, potentially now, GPLv2 programs can't use OpenSSL. Nice.

starseeker 11 hours ago 0 replies      
Although I personally think matching the OpenBSD libressl license would have been better, I still have to regard the move toward one of the mainstream, modern standard open source licenses for such a widely used and critical software component as a Good Thing. I do, however, also agree with those voicing skepticism about the "silence gives consent" bit - this is important enough to be worth doing without adding a (very) questionable practice like that to the mix. I'd suggest instead setting up some sort of "relicensing coverage" report, and using the yay/nay/no-answer status of various diffs to figure out a price tag for rewriting the bits that can't be relicensed.

I don't suppose the libressl folks could end up creating a 2-clause BSD implementation of the SSL/TLS stack that could divorce itself completely from its openssl origins? (Yes, I know that's almost certainly impractical for such a large, complex and thorny code base and problem set, but it's a nice dream... maybe some cryptography researchers/companies/etc. looking to make a name in the industry could target and re-implement specific pieces/algorithms/etc...)

jasonkostempski 15 hours ago 6 replies      
What happens if not everyone responds to the emails?
Thousands of underground gas bubbles poised to 'explode' in Arctic siberiantimes.com
539 points by xg15  22 hours ago   330 comments top 31
brogrammer2 8 hours ago 9 replies      
Honest question: why is the mainstream media not covering this like it should? Heck, this should be the breaking news every single day!

After all, this spells doomsday for the upcoming generations, so shouldn't it be covered on the front page almost every day?

People have the right to know that their children and grandchildren will suffer because of something going on right now. I guess the majority of people all over the world are blissfully unaware of this scenario because it doesn't get the kind of attention in the MSM that it should. All they get served is dirty politics and gossip entertainment news.

Maybe people will force policy changes if they learn that this is going to happen.

It seems that most people today think that Terrorism is the main threat to our society, when in fact, Global Warming seems to be the real deal.

Let's say it was found that fifty years from now, an Asteroid would hit Earth. Would the people of Earth react in the same way as they are doing now?

bithive123 19 hours ago 7 replies      
Here is a good video about the arctic methane emergency: https://www.youtube.com/watch?v=8F9ed5E54s4

Some summary points:

- Total amount of methane in the current atmosphere: ~5 gigatons

- Amount of carbon preserved as methane in the arctic shelf: estimated at 100s-1000s of gigatons

- Only 1% release would double the atmospheric burden of methane

- Not much effort is needed to destabilize this 1%

- The volume currently being released is estimated at 50 gigatons (it could be far more)

- 50 gigatons is 10x the methane content of the current atmosphere

- We are already at 2.5x pre-industrial level, there is a methane veil spreading southward from the arctic.

- Methane is 150x as powerful a greenhouse gas as CO2 when it is first released.

Here is a longer video for those who have the time: https://www.youtube.com/watch?v=FPdc75epOEw

LyndsySimon 21 hours ago 1 reply      
Well... I guess let's hope the "Clathrate gun hypothesis" is incorrect.


sqeaky 21 hours ago 1 reply      
At first I was thinking "stupid clickbait title", then I saw the pictures. Anything making those craters is properly an explosion. Anything heard 100 km away is properly an explosion.

If anything, this article plays it down with words like "eruption" and "venting". This seems super dangerous for people in the area. And dangerous in the climate-change sense for the rest of us.

adamtait 2 hours ago 0 replies      
There's a team of scientists in the Arctic with an unconventional but (possibly) effective idea to reduce/slow the release of permafrost methane. The Zimovs have a Kickstarter up now - https://www.kickstarter.com/projects/907484977/pleistocene-p...
british_india 20 hours ago 3 replies      
This is precisely why Dr. Guy McPherson has predicted human extinction on earth by 2030. Methane clathrates coming from the Arctic sea floor and from Canadian and Siberian permafrost. This is the most serious threat we face now.
agentultra 12 hours ago 1 reply      
It's strange reading about this and realizing that to be able to do something about it, I would have had to be born twenty-some-odd years earlier. That it's literally too late to do anything about it. I was raised to care about these issues, to save the rainforest, to cut down on pollution, to recycle -- to do something about it. And it would've taken a concerted effort from everyone to take the same care.

Scientists have been warning about this for decades. And we've done nothing. Not even to slow down!

I saw the best minds of my generation destroyed by madness...


Razengan 18 hours ago 3 replies      
If we can't fix, if we can't prevent, and if we cannot prepare either, then maybe we could at least preserve?

Digitize all human knowledge and as much art/literature as you can gather (books, music, movies, shows, games, even porn and random YouTube videos and discussions on online forums :) Most of that work has already been done.

Store it on the most resilient (and simple/repairable) storage media you can,

Bundle it with devices that can read that data,

Along with instructions for building/reinventing such devices, and instructions on how to interpret that data (i.e. JPEG and other file formats :)

Also include a guide for translating the instructions. Assume that a future reader may not understand any of our current languages, or even be human at all.

Put it all in a silo as physically strong as you can build.

Make copies of the silo and bury one on each continent and in each ocean. Maybe even on the Moon?

Distribute markers and maps to each silo (and instructions for opening them) all over the world.

Let fate take its course.

All of this could be done by a few individuals and most of it won't even require a lot of money.

xutopia 16 hours ago 2 replies      
The majority of the population does not care about this one bit. They're going to carry on driving their gas guzzlers. The only change we can make is if we, who understand what is at stake, invest in renewables to bring their price down.

When the backwards people see us pass them on the highway in our cheap-to-run, fast-to-accelerate, and cheap-to-maintain electrics, they'll feel dumb buying gallons of gas to keep their guzzlers going.

wiz21c 19 hours ago 5 replies      
I don't like what I'm reading here. Is there any expert here who could at least bring in non-panic arguments?
TeMPOraL 18 hours ago 2 replies      
Ok, so how do we unfuck this? What are the promising options, who's working on them, and how can one contribute?
gph 20 hours ago 3 replies      
I know it's stupid and crazy, but would carpet bombing the arctic work to burn up the methane trapped below the permafrost before it leaks out into the atmosphere and causes worse problems? Or maybe we should send out expedition crews to find and preemptively explode these methane bubbles.

Perhaps the way things are going they will become too plentiful to really do anything realistic about it.

mattmanser 3 hours ago 0 replies      
Honest question, if it does turn out that these have a bad effect, and they can detect the bulges, can't they start draining them before they pop?
perfunctory 5 hours ago 0 replies      
Suppose one has already cut his own consumption as much as possible and has some free money left. What's the most efficient way to spend it to combat climate change? Buy EV; donate to environmental lobby group; invest in a vegetarian restaurant around the corner; isolate your own house? Any ideas?
tabeth 19 hours ago 6 replies      
So, assuming we're screwed, where's the best place to move preemptively? I know Boston isn't that place.

- I assume the coasts are a nonstarter.

- Moving inland could be nice, but the ground would need to be pretty arable to restart civilization there. I'm guessing somewhere in Africa is a safe bet, given the lack of resource exploitation there, but then again, there's not much infrastructure at the moment.

buzzybee 11 hours ago 0 replies      
Every time I see stories about runaway climate catastrophe, I stop and consider the scenarios and how it might affect how I lead my life, and none of them really change how I plan to do things. I'm working and training in ways I believe will contribute, and that is sufficient to calm myself.

Basically, either we make all the turnarounds necessary, on every front - policy, technology, engineering, culture - to beat this stuff by a huge margin, or it's all over. I don't see a middle ground of "things kind of suck for a while and then it's okay" happening. And that guides a lot of my thoughts on other topics: I want survivability, and of a form which preserves key freedoms. I believe massive reforms to the economic and political system are needed to do it. I believe our existing social network structures and cultural norms are insufficient to address this. There's a lot of room to change in all directions.

dcgoss 13 hours ago 1 reply      
This is the first time I've heard about this issue. If this is as severe and terrifying as these comments are making it seem, why has this not been covered more?
CoffeeDregs 21 hours ago 0 replies      
Unfortunately, few of the photos have anything to indicate scale and those few seem to range from 100m (small trucks on the far side) to 2-3m (the guy stepping on one). Still, those are incredible photos...
emiliobumachar 6 hours ago 0 replies      
Serious question: would we be better off finding a safe way to burn all this methane before it leaks? Even detonating it in place and wasting the energy seems advantageous, since CO2 is much less greenhouse-y than unburned methane.
porker 7 hours ago 1 reply      
It's really hard to get the scale of the domes from the photos. Are the top photos 30cm across or 100m?
deanclatworthy 21 hours ago 1 reply      
If you find this interesting, VICE did a 15min segment on their show recently about this very issue.
sinkensabe 7 hours ago 0 replies      
So they can explode at any time from now on and possibly end us? Is that how I should wrap my head around it?
Keyframe 18 hours ago 1 reply      
Could one pierce a bubble and syphon/concentrate the gas into a tank with a pump? Might be too late, considering how many there are and how remote they are.

edit: nm, I saw that conference where they've shown methane releasing from the submerged Siberian plain. We're fucked.

cjensen 21 hours ago 2 replies      
A lot of the photos in the article sure looked like pingos[1], which are common in the Arctic and subarctic, rather than something exotic. Horses before Zebras and all that.

[1] https://en.wikipedia.org/wiki/Pingo

xg15 18 hours ago 1 reply      
Here is another article about the same event (in German) that has an aerial view supposedly showing the distribution of the craters over one of the islands:


(The craters filled with water, hence the dark color. The picture doesn't say anything about the craters' depth)

I don't know about the source though, so I guess it should be taken with a grain of salt.

partycoder 19 hours ago 0 replies      
Excellent time to defund the EPA. Natural selection at its best.
gragas 18 hours ago 0 replies      
It's pretty amazing that that website emails you your password in plain text when you register.
lolive 5 hours ago 0 replies      
God is nice. God won't let that happen.
mrlaserOO 6 hours ago 0 replies      
So here is a solution: send a drone overhead and just ignite the methane via laser. No more methane!
executive 16 hours ago 0 replies      
Burning off the bubbles is fun: https://youtu.be/15tPaV_j0CU
BrailleHunting 5 hours ago 1 reply      
Another, bigger problem, raised by VICE reporting, is tundra thawing, leading to the ground collapsing 10-30 meters into huge sinkholes. Apparently, one of the possible solutions is replacing tundra forests with grasslands through megafauna and macrofauna grazing, including, potentially, cloned mammoths, because grassland freezes harder in the winter and keeps the permafrost frozen during the summer months.
       cached 24 March 2017 16:02:02 GMT