hacker news with inline top comments (26 Jun 2017)
1
Sega releasing every console game for free with ads on mobile sega.com
170 points by yincrash  3 hours ago   53 comments top 14
1
apapli 1 hour ago 6 replies      
Having a 4-year-old son, I am the first person to put my hand up and say I absolutely HATE ads and general in-app purchases in games that he likes to play on my iPhone. In some cases the way these are implemented is almost akin to being in a casino, which is not a skill I really want my son picking up (watch this ad to see what prize you can WIN, etc.).

I welcome Sega's announcement and will be delighted to hand over $1.99 to disable all ads - I know their games are of known quality, and will come without surprise violence included.

By way of example I have one simple game he loves to play that randomly brings up images of a guy holding a girl in a headlock with a gun pointed at her head.... and the same ad comes up repeatedly. I can't even disable it via an in-app purchase (trust me, I tried).

As an aside, I'd welcome some suggestions for games he can play, and if anyone reading this is a game developer I'll be happy to provide input on something you are dreaming up.

EDITS: just for clarity of reading

2
tdiggity 2 hours ago 8 replies      
The Sonic game uses the most vile type of in-app ad. The ad requires you to play Game of War for some indeterminate amount of time, and you are made to play it after every stage. I hope Sega changes this; it basically makes their game unplayable.
3
codezero 2 hours ago 3 replies      
4
lpa22 26 minutes ago 0 replies      
I can't fathom ad-supported free games from Sega. I will gladly pay the handful of George Washingtons to keep my kids away from the onslaught of unregulated ads that this storied franchise is about to endure.
5
makecheck 2 hours ago 0 replies      
* With an in-app purchase of $1.99 to remove non-SEGA ads, at least in the ones released so far (like Altered Beast).

Also, at the current rate of release, it will take years to reach nearly all titles so I would take this with cautious optimism. Your favorite games may show up tomorrow or 3 years from now.

6
chenster 2 hours ago 3 replies      
It's not free. Every game has "in-app purchases".
7
yincrash 2 hours ago 0 replies      
I wonder if VMU minigames will be ported with the associated Dreamcast games.
8
INTPenis 37 minutes ago 0 replies      
I've seen so much malware spread through ads on Android that I'm more than willing to pay the $1.99 charge.
9
yincrash 2 hours ago 0 replies      
Details:

SEGA Forever is a free and growing collection of classic games, nearly every SEGA game ever released from every console era: Master System, Genesis/Mega Drive, Dreamcast, and more. Available on iOS and Android mobile devices.

-Play free

-Save your game progress

-Leaderboard -- compete with the world for high scores

-Controller support -- fully integrated wireless Bluetooth controller support

-Offline play

-Games released every month; download them all!

10
minimaxir 1 hour ago 1 reply      
Sega games with poorly-coded emulators/ads aside, the fully-native Sonic 1 port is still incredible from a gameplay standpoint, and contains a few surprises even for those very familiar with the game. It's worth paying for ad-free. (It also works on the new Apple TV, but I recommend having an MFi controller if you want to play it that way.)
11
imron 35 minutes ago 0 replies      
This is incredible. Please let me pay to remove ads though.
12
akhilcacharya 2 hours ago 0 replies      
The possibility of playing Shenmue on an iPhone is interesting.
13
nerflad 2 hours ago 0 replies      
This is amazing and unexpected. I hope this sets a precedent for other developers to follow with their old titles.
14
kakarot 2 hours ago 1 reply      
Serious kudos to Sega.

This is how old games should be handled throughout the industry when possible. The likelihood of someone not already familiar with a title or franchise playing it is a function of A) its cost and B) how dated it is. Once a game is seeing marginal returns, it's a very corporate mindset to try to suck it dry of every last penny, especially when you view games as a form of art.

I fear for so many incredible titles, especially as we possibly enter a real VR age.

Unless I force it on them (I probably will), my children may never give a second glance to the titles I grew up with and consider masterpieces, when they could sensually immerse themselves in a modern AAA or VR title.

So many great soundtracks, assets, feats of code, all deserving to be in a museum somewhere, lost in the ever-growing sea of content. Eventually only treasure-hunters like myself will seek to experience and appreciate them.

Not only that, Sega can much more accurately determine what franchises might see profitable continuations, given a large enough sample size.

Having not played any of these titles on mobile myself, I can only imagine that Sega has ruined this very noble idea with intrusive ads and a payment scheme for removing them.

3
Texas Is Too Windy and Sunny for Old Energy Companies to Make Money bloomberg.com
402 points by JumpCrisscross  14 hours ago   185 comments top 10
1
toomuchtodo 14 hours ago 3 replies      
It appears we're at the inflection point where even natural gas (in addition to coal) is getting pushed out of the generation mix. Good! About time.

This may require paying natural gas generators for their ability to quickly throttle output to back up renewables, but only as a temporary measure until utility-scale batteries fall in cost.

2
davidf18 14 hours ago 7 replies      
I wonder what the economics would be without the federal tax incentives for wind and solar. Does anyone know?

EDIT: Someone above provided a link that provides the figure I was looking for:

"Between 2010 and 2016, subsidies for solar were between 10 and 88 per kWh and subsidies for wind were between 1.3 and 5.7 per kWh. Subsidies for coal, natural gas and nuclear are all between 0.05 and 0.2 per kWh over all years." [1]

I wonder how much of a subsidy there is for LED lighting. A lot of energy goes for incandescent lighting.

Also, there should be a lot of subsidies to replace heaters in buildings burning #6 and #4 fuel oil, which are very, very dirty and polluting (NYC, where I live, banned #6 a few years ago, but #4 is allowed to persist until 2030, I think).

https://www.forbes.com/sites/jamesconca/2017/05/30/why-do-fe...

3
fencepost 13 hours ago 2 replies      
Very amusing that the article ends with

 "It's pretty slim pickings right now," Ferguson said. "God is not manufacturing more coastal property."
Part of the concern for renewable energy folks is that you could argue that God is arranging for more coastal property, though perhaps not in ways useful to wind farmers on the Texas coast.

4
SOLAR_FIELDS 8 hours ago 0 replies      
The last lines of the article elicited a bit of cynicism:

"Thats because the market is so oversupplied that its even difficult for the wind guys to make money at these electricity rates. And besides, its hard to acquire land by the water at reasonable prices."

"It's pretty slim pickings right now," Ferguson said. "God is not manufacturing more coastal property."

In 70 years it will be nice and cheap at this rate, which is ironically and sadly just the sort of problem cheap wind power would help alleviate.

5
jostmey 12 hours ago 2 replies      
I live in Dallas, Texas and buy renewable (wind or solar) electricity from the grid. It costs only marginally more.
6
anon3939 14 hours ago 1 reply      
7
api 7 hours ago 0 replies      
I think we're in the early stages of a genuine disruptive event.

"They were actually worried about an 'energy crisis' back then. Didn't they realize free energy falls from the sky all day long?"

8
london888 10 hours ago 0 replies      
This is such good news.
9
notadoc 9 hours ago 0 replies      
I'm sure the politicians will solve that problem.
10
sizzzzlerz 14 hours ago 4 replies      
4
KeePassXC 2.2.0 released with YubiKey and TOTP support keepassxc.org
146 points by weslly  4 hours ago   52 comments top 19
1
problems 1 hour ago 1 reply      
Still no KDBX 4 support though?

Please consider making it a priority - it looks like someone submitted a pull request for it but it stalled? The older format uses a custom AES-based KDF, and while I don't personally see any major issues with it, I'm much more comfortable with the modern, heavily reviewed Argon2 design used in KDBX 4.

https://github.com/keepassxreboot/keepassxc/issues/148

2
interfixus 4 hours ago 2 replies      
This is by far the best of the KeePassX* lot, incorporating loads of enhancements which, disappointingly after years of development, never made it into KeePassX 2.x proper, and some 1.x features which simply disappeared in the upgrade.

[Edit: missing word]

3
finchisko 1 hour ago 0 replies      
Really good timing for me to self-promote. :-) I'm working on an improved KeePassXC browser extension. Communication between the browser and KeePassXC is via NativeClient. You need varjolintu's fork of KeePassXC, but eventually it will support the KeePassHTTP protocol too.

My current goals are: internationalization, a nicer UI, and a clean and extensible code base. I already did the options page with material-ui and React. Currently working on replacing the jQuery popup implementation with hyperapp, which appeared here on HN yesterday.

If interested, I can send instructions on how to build the extension. I would like to see this become an official part of KeePassXC and am willing to donate it for free. What do you guys think?

You can try options UI on https://mauron85.github.io/keepassxc-browser/preview/

4
sillysaurus3 4 hours ago 3 replies      
So there's KeePass, KeePassX, and now KeePassXC? (And two different variants of KeePass that have nothing to do with each other.)

Not that there's anything wrong with that. I'm just curious if KeePassXC is yet another fork, or if it's from the same people who did KeePassX. KeePassX has an excellent security reputation, so it'd suck if an unrelated fork ruined that.

5
desdiv 3 hours ago 1 reply      
You guys rock! Thank you for your hard work.

BTW the Windows portable version link on your download page is 404ing:

Incorrect URL: https://github.com/keepassxreboot/keepassxc/releases/downloa...

Correct URL: https://github.com/keepassxreboot/keepassxc/releases/downloa...

6
ktta 1 hour ago 1 reply      
Notable features:

1. Unlock using Yubikey

2. TOTP 2FA

3. Diceware password generator

4. ASLR for in-memory security (didn't expect this!)

5. Portable and Single instance mode (I'll have to check this one in detail)

Thanks for your work team!

7
shmerl 4 hours ago 0 replies      
I'm still waiting for it to arrive in Debian: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=855173
8
dfabulich 46 minutes ago 0 replies      
Is there a KeePass fork with support for the new MacBook touch bar?
9
sowbug 2 hours ago 0 replies      
This is really cool. Here are more feature requests (and for all I know they're already there):

* Optionally display a secret as a QR code

* Generate and validate BIP39-compatible seeds (like Diceware but with a checksum; many Bitcoin wallets these days accept them)

* Get this into Tails

10
gregwebs 4 hours ago 1 reply      
This release looks great. Unfortunately I had to switch to LastPass to get sharing, real sync support rather than blob syncing, and easy usage on mobile.
11
cyphar 4 hours ago 0 replies      
It also now has CSV importing, so you can import your passwords from LastPass (though you'll have to manually recreate the folder structure).
12
atomlib 3 hours ago 3 replies      
I see it supports Linux, Windows, and macOS, but are there any Android and iOS apps to open and modify KeePassXC databases?
13
wst_ 2 hours ago 0 replies      
I'd love to be able to decide which characters to use when generating a password.
14
Slurpee99 3 hours ago 1 reply      
How well does it work for teams? Are there issues with syncing the database if two or more people are using it at the same time?
15
otachack 4 hours ago 0 replies      
Nice! Excited for future development of this. I'll have to get a Neo soon to try out these new features.
16
angelsl 2 hours ago 0 replies      
No KDBX 4 support unfortunately.
17
FT_intern 2 hours ago 2 replies      
Has anyone thought of a good redundancy scheme for yubikey?

Physical object authentication is great except physical objects are less durable than brain memory (or at least, if my brain memory is gone then I probably would have no use for the password anyway).

18
TokenDiversity 3 hours ago 2 replies      
As someone using KeePass2 (Mono on Linux), can someone knowledgeable briefly tell me (and probably others) why I should and should not switch to KeePassX?

I'm getting the feeling that this uses the older protocol?

I got a good answer to the above question from desdiv so I'm adding an edit:

Is there a reason to use this (and not use this) in place of KeePass2 on Windows?

19
TokenDiversity 2 hours ago 2 replies      
Features I like:

1) Download website favicon (no clue how, though; I tried entering a website but didn't see an option to download the favicon)

2) Command line interface; again, no clue how to use it.

5
Shared thoughts after 6 years in Pentesting 0x00sec.org
160 points by wolframio  11 hours ago   37 comments top 5
1
tptacek 10 hours ago 6 replies      
1. You definitely do not need to make security part of your "lifestyle", much less spend 80 hours a week working at it. The irony is that the author is a netpen person, which is sort of infamously the least demanding specialty in offensive security. If people writing browser drive-by exploits can stay on top of their game with a 40 hour work-week, I think the netpen people can too.

2. Don't get certificates. If you meet a prospective employer who seems intensely interested in them, that's a red flag about that job.

3. The idea that you should aspire to being able to do your whole job from a Linux terminal is pretty silly. Use what works for you.

Maybe it takes more than 6 years in offensive security to realize this, but the #1 bit of advice for this field is: learn to enjoy coding. The worst possible place to end up in security is as a captive to available tooling.

2
knieveltech 5 hours ago 0 replies      
Yeah...stopped reading at 80 hour weeks. I don't care how esteemed someone is in their industry, if they have to completely destroy their life to get there I question their judgement and don't want their advice.
3
maxxxxx 9 hours ago 1 reply      
We just had some consultants do pentesting on our medical device and its software components. I was pretty impressed by all the problems they found quickly. As a developer I find it pretty hard to stay up-to-date with all the possible ways hackers can get into your systems.

To me this was money well spent.

4
w8rbt 5 hours ago 0 replies      
I would say certs have value in security management, compliance and audit. In fact, if you want to take one of those paths, certs are mandatory. If you want to do technical security (which is totally different), then get a CS or EE degree and maybe a few SANS certs (optional unless you are in a regulated/compliance oriented industry). Finally, having a security clearance will help as well, especially if you or your employer want to do government contracting.

Edit: To expand on the cert topic... if you want to do computer forensics for law offices, police departments, etc., you'll need a technical cert (GCFA, etc.), and having a CS/EE/CE degree won't hurt either. You'll have to have a cert to do serious forensic work.

5
forgottenacc57 46 minutes ago 0 replies      
So many years pen testing.

Is blue better than black?

Do red pens last longer?

6
Marching neural network arogozhnikov.github.io
14 points by signa11  2 hours ago   3 comments top 3
1
jerad 10 minutes ago 0 replies      
This is pretty cool. It seems impossible to mouse away while maintaining the orientation. I want to be able to perturb the weights and watch from different angles. Very cool though, nonetheless.
2
hacker_9 1 hour ago 0 replies      
Is this meant to be useful or just look interesting? I can't really say I understand what is being shown.
3
gabrielgoh 44 minutes ago 0 replies      
for those wondering, what is on display are the level sets (3d contours) of a neural network.

this is a very cool use of shaders for raytracing, but it is not terribly informative.

7
Sci-Hub as Necessary, Effective Civil Disobedience brembs.net
219 points by dredmorbius  8 hours ago   36 comments top 9
1
tacomonstrous 4 hours ago 2 replies      
In my experience as a mathematician, most for profit journal publishing companies are essentially rentiers making money off a captive clientele, who don't even get much utility any more from them, except for access to some older articles. Moreover, the services they provide to authors are also usually worse than the non-profit journals: no copy editing, crappy editorial tools, just a complete disregard for the actual producers of the (free) content they make money off. It's a complete and unmitigated scam.
2
quickben 3 minutes ago 0 replies      
This round is going to be fun.

See, I'm old enough to remember this battle starting for music and movies. We know how that ended.

But now, it's for the very knowledge that drives our civilization.

"Stallman was right", oh how that statement is going to get tested.

3
nsaslideface 3 hours ago 1 reply      
It is rare that true heroism can be framed as a catchy story. Those working for campaign finance reform are another example. People who fight "David-like" against these Goliaths are doing extremely radical work, and they will be canonized in the coming centuries. They don't get the semi-celebrity of Snowden or Ellsberg in the present day because they lack the whistleblower story of one person versus the full power of the federal government - the only sorts of punishments they face are an unsexy sinking into obscurity, or a quiet snuffing-out such as with Aaron Swartz.
4
_lm_ 1 hour ago 1 reply      
I wonder if closed-access publishing is part of why academia seems so insulated from the "real world". People write for their audience, and if the general public can't read academic papers, then academics are going to write as if only other academics are reading.

Likewise, if research output is difficult to access, the feedback loop between ideas and implementation is broken; folks outside academia can't easily comment on the cutting edge work in a field, and academics only have to worry about what other academics think of their work.

5
nyolfen 5 hours ago 1 reply      
elbakyan surely belongs to the category of contemporary outlaw-/folk-heroes. at least she is in my personal pantheon.
6
mmmnn 3 hours ago 0 replies      
Does anyone know how Elbakyan was 'outed' as the creator of Sci-Hub? When I first started using it, it all seemed quite secretive as to the creator's identity. Was this something uncovered during the lawsuit?
7
pks016 1 hour ago 2 replies      
Related to this, I have one question.

Suppose one is going to write a paper and needs to add a reference to another paper obtained from Sci-Hub. Is there any check on the originality of the referenced paper?

(I don't know exactly how publishing works)

8
dredmorbius 5 hours ago 0 replies      
So long as we're looking at intellectual property, and arguments against it:

An online book by UCLA economics professors Michele Boldrin and Mark Levine, making the case against intellectual property -- patents, and copyright most especially.

The opening chapter leads off with the patent battles of James Watt, which are credited by some authors with setting back the start date of the Industrial Revolution from 1769, when the patent was issued, to 1800, when it (after parliamentary extension) finally expired.

http://levine.sscnet.ucla.edu/general/intellectual/againstne...

Joseph Stiglitz, "Knowledge as a Global Public Good," in Global Public Goods: International Cooperation in the 21st Century, Inge Kaul, Isabelle Grunberg, Marc A. Stern (eds.), United Nations Development Programme, New York: Oxford University Press, 1999, pp. 308-325.

http://s1.downloadmienphi.net/file/downloadfile6/151/1384343...

9
aaron-lebo 5 hours ago 2 replies      
This means it is our duty to the citizens to reduce our publishing expenses to no more than the current ~US$200m per year (and we would even increase the value of the literature by making it open to boot!). If we were to do that, we'd have US$9.8b every single year to buy all the different infrastructure solutions that already exist to support all our intellectual outputs, be that text, data or code.

Is it also our duty as the people to reduce all expenditures on software? Is piracy justifiable, especially within government institutions?

I guess my point is that "necessary" is a really strong claim, and you can justify a lot of crazy stuff with that. Scientific progress has continued on just fine despite these cartels. With no supporting evidence, I'd argue that today's scholars have hundreds of times the free information available than they did a century ago, and that ramps up the further you go back. It's easy to imagine that Elsevier's lockdown of a paper is the difference between an academic breakthrough or not, but in reality that's probably not the case, even if it's a noble cause.

8
Ubers Complicit Board mondaynote.com
10 points by robin_reala  42 minutes ago   1 comment top
1
noncoml 11 minutes ago 0 replies      
Uber is what it is today because the board and Kalanick have had no ethics and complete disregard for the law since the beginning.

They showed that from their very early days. If you supported the company until last year, then you supported their behavior as a whole. You cannot have your cake and eat it too.

9
SpaceX successfully launches and recovers second Falcon 9 in 48 hours techcrunch.com
332 points by janober  10 hours ago   122 comments top 16
1
schiffern 8 hours ago 2 replies      
Incredible time-lapse of the landing, from Elon Musk's instagram: https://www.instagram.com/p/BVxysOlA04j/

The yawing motion at the beginning of the video is because they moved the drone ship to avoid stormy seas, so the stage had to thrust sideways to retarget. In calm weather SpaceX positions the ship right along the ballistic path, so the stage only needs to pitch up and "flip."

You can also see the grid fins "pulling up" through the atmosphere to bleed off as much speed as possible. I described the optimization a while back. https://news.ycombinator.com/item?id=14288431

Fantastic job to everyone at SpaceX!

2
TheAlchemist 9 hours ago 5 replies      
This is becoming really astonishing. It fits the quote more and more: "They did not know it was impossible, so they did it."

I mean, the company was founded only 15 years ago, they started (with success) launching stuff into space only 10 years ago, and now it feels like they are able to launch rockets into space every week. Reusable rockets, we should add.

Musk very often sets impossible deadlines, but in this case, even if you take a step back, it's scary to make 10-year predictions based on this company's track record!

3
ChuckMcM 8 hours ago 1 reply      
Per Elon's tweet, those grid fins were much better behaved than previous versions. Nothing got hot enough to start showing up in the visible spectrum (good). And what was interesting for me was the lack of gunk landing on the camera (presumably because the covering of the fins wasn't burning off like it had in previous flights). What is particularly impressive for me is the slow and steady progress on the 'landed' F9s. The first one successfully landed looked really beat up, and the next couple marginally less so; Friday's went through a part of the flight regime that SpaceX had deemed "un-recoverable", and this one came through looking quite good. Still feels like science fiction to me...
4
Animats 7 hours ago 2 replies      
Nice. SpaceX is finally getting their launch rate up.

As a business, that's been SpaceX's biggest problem. Customers like the pricing but not the long delays. Finally, SpaceX seems to be getting past that.

Getting pad time at Canaveral is a bottleneck. SpaceX is still building their own launch site at Brownsville, TX, but that's going slowly.[1] All SpaceX has there right now is some fill that's settling (the location is on sand maybe 2m above sea level) and a dish antenna. Next to be built: the fire station. First launch is now supposed to be no earlier than 2018.

[1] http://www.krgv.com/story/35550679/3m-road-project-underway-...

5
zer00eyz 8 hours ago 3 replies      
Those fins are awesome:

https://twitter.com/elonmusk/status/878821062326198272

Cast and cut titanium. They are about 4x5 feet and some of the largest (if not the largest) titanium castings in the world.

Titanium is an amazing material that is super hard to work with (special furnaces), and has its own set of risks (titanium fires, anyone?). I would love to see what goes into making those things, because it simply has to be impressive.

6
Tade0 9 hours ago 2 replies      
I like how SpaceX has a "pricing" section on its webpage as if space flight was something mundane and pedestrian like an oil change in your car or something.
7
sidcool 1 hour ago 0 replies      
Some mornings when I am unable to get out of bed, news like this acts as an adrenaline shot for me. I kick myself out of the black hole and go ahead to launch my rockets (metaphorical).

If Elon can, I need to, as his protege (again metaphorical).

8
snovv_crash 9 hours ago 4 replies      
Did anyone else notice that the engines shut off while the rocket was still a small ways up, after which it fell quite heavily onto the landing legs?
9
vermontdevil 9 hours ago 2 replies      
And the second launch with the updated grid fins made out of titanium alloy. Elon Musk said it went well and they want to reuse them indefinitely.
10
app4soft 8 hours ago 1 reply      
Take the TLEs of all Iridium-NEXT satellites: http://celestrak.com/NORAD/elements/iridium-NEXT.txt

...and import them into Stellarium: http://www.stellarium.org/wiki/index.php/Satellites_plugin

P.S.: Stellarium 0.16.0 was released a few days ago! https://sourceforge.net/p/stellarium/news/2017/06/stellarium...

11
ttandon 1 hour ago 0 replies      
Does anyone know why a side-view of the actual touchdown (like they usually release after) hasn't been posted by SpaceX yet?
12
lostdog 8 hours ago 2 replies      
To deploy 10 satellites, does the rocket do a series of burns, or do the satellites have enough propulsion to deploy themselves to separate orbits?
13
deegles 7 hours ago 0 replies      
Looking forward to articles complaining about SpaceX launching too often. :)
14
gsmethells 5 hours ago 1 reply      
I'm reading his biography right now. Fascinating life of both the man and his companies if you are interested: https://www.amazon.com/Elon-Musk-SpaceX-Fantastic-Future/dp/...
15
agumonkey 9 hours ago 3 replies      
The frequency is quite staggering. Will Musks replace Hertz?
16
babyrainbow 5 hours ago 0 replies      
There seems to be a reason that more companies are not attempting to reuse rockets [1].

So considering that, SpaceX has not proved anything yet, because the impossible or hard part is not launching and landing rockets. The hard part is to do it:

1. With the same or better reliability than using completely new rockets.

2. With enough launch frequency to justify the reuse procedure.

So yeah, a couple of launches and reuses does not prove anything. It is a start, sure. But they have not yet proved wrong the others who didn't attempt this.

[1] https://news.ycombinator.com/item?id=14626183

10
Intel Skylake/Kaby Lake processors: broken hyper-threading debian.org
839 points by vbernat  16 hours ago   215 comments top 32
1
userbinator 15 hours ago 6 replies      
The problem description is short and scary:

Problem: Under complex micro-architectural conditions, short loops of less than 64 instructions that use AH, BH, CH or DH registers as well as their corresponding wider register (e.g. RAX, EAX or AX for AH) may cause unpredictable system behavior. This can only happen when both logical processors on the same physical processor are active.

I wonder how many users have experienced intermittent crashes etc. and just nonchalantly attributed it to something else like "buggy software" or even "cosmic ray", when it was actually a defect in the hardware. Or more importantly, how many engineers at Intel, working on these processors, saw this happen a few times and did the same.

More interestingly, I would love to read an actual detailed analysis of the problem. Was it a software-like bug in microcode e.g. neglecting some edge-case, or a hardware-level race condition related to marginal timing (that could be worked around by e.g. delaying one operation by a cycle or two)? It reminds me of bugs like https://news.ycombinator.com/item?id=11845770

This and the other rather scary post at http://danluu.com/cpu-bugs/ suggests to me that CPU manufacturers should do more regression testing, and far more of it. I would recommend demoscene productions, cracktros, and even certain malware, since they tend to exercise the hardware in ways that more "mainstream" software wouldn't come close to. ;-)

(To those wondering about ARM and other "simpler" SoCs in embedded systems etc.: They have just as much if not more hardware bugs than PCs. We don't hear about them often, since they are usually worked around in the software which is usually customised exactly for the application and doesn't change much.)

2
theGimp 15 hours ago 3 replies      

 The issue was being investigated by the OCaml community since 2017-01-06, with reports of malfunctions going at least as far back as Q2 2016. It was narrowed down to Skylake with hyper-threading, which is a strong indicative of a processor defect. Intel was contacted about it, but did not provide further feedback as far as we know.

 Fast-forward a few months, and Mark Shinwell noticed the mention of a possible fix for a microcode defect with unknown hit-ratio in the intel-microcode package changelog. He matched it to the issues the OCaml community were observing, verified that the microcode fix indeed solved the OCaml issue, and contacted the Debian maintainer about it. Apparently, Intel had indeed found the issue, *documented it* (see below) and *fixed it*. There was no direct feedback to the OCaml people, so they only found out about it later.
Inexcusable.

3
fotcorn 15 hours ago 2 replies      
The latest intel-microcode package from Ubuntu 16.04 does not fix the problem. I installed the same package from Ubuntu 17.10 [0] which fixes the problem. You can check your system with the script linked in the mailing list thread [1].

[0] https://packages.ubuntu.com/en/artful/amd64/intel-microcode/...

[1] https://lists.debian.org/debian-devel/2017/06/msg00309.html

4
pedrocr 14 hours ago 0 replies      
Here's how to fix it on a Thinkpad on Linux. I've got a T460s and checked with the script[1] that it was indeed affected. The Debian instructions said to update your BIOS before updating the microcode package so I went to the model support page[2] to the BIOS/UEFI section and downloaded the "BIOS Update (Bootable CD)" one. The changelog included microcode updates so it looked promising[3]. To get the ISO onto a usb drive I did the following:

 $ geteltorito n1cur14w.iso > eltorito-bios.iso  # geteltorito is provided by the genisoimage package on Ubuntu
 $ sudo dd if=eltorito-bios.iso of=/dev/sdXXX    # replace with your USB drive; take care not to write over your disk
I then had a bootable USB drive that I ran by rebooting the computer, pressing Enter and then F12 to get to the boot drive selection, and selecting the USB. From then on it's just following the options it gives you: basically pressing 2 to go into the update and then pressing Y and Enter a few times to tell it you really want to do it. After that, just let it reboot a few times and the update is done. After booting again, the same test script[1] now said I had an affected CPU but new enough microcode.

[1] https://lists.debian.org/debian-user/2017/06/msg01011.html

[2] http://pcsupport.lenovo.com/pt/en/products/laptops-and-netbo...

[3] https://download.lenovo.com/pccbbs/mobiles/n1cur14w.txt
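To confirm the flash took, you can check the microcode revision the kernel reports; a quick sketch (assuming an x86 Linux where /proc/cpuinfo carries a "microcode" field):

```shell
# Print the running microcode revision as reported by the kernel.
# On x86 Linux it shows up as a "microcode" field in /proc/cpuinfo;
# compare the value before and after the BIOS/microcode update.
rev=$(awk -F: '/^microcode/{gsub(/ /,"",$2); print $2; exit}' /proc/cpuinfo 2>/dev/null)
echo "microcode revision: ${rev:-unknown}"
```

If the revision didn't change, `dmesg | grep microcode` can also show whether the early loader picked anything up.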

5
Syzygies 14 hours ago 2 replies      
In my experience with parallel code written in Haskell, hyper-threading offers only a very mild speedup, perhaps 10%. It is essentially an illusion, a logical convenience. (How long does it take to complete a parallel task on a dedicated machine? Four cores with hyper-threading off have nearly the performance of eight virtual cores with hyper-threading on.)

Many people have neither the interest nor the hardware access to overclock, and these processors have less overclocking headroom than earlier designs. Nevertheless, the hyper-threading hardware itself generates heat, restricting the overclocking range for given cpu cooling hardware. In this case, turning off hyper-threading pays for itself, because one can then overclock further, overtaking any advantage to hyper-threading.

6
tyingq 15 hours ago 1 reply      
There's a perl script on the debian mailing list that digs a bit deeper and tells you if you're affected in the first place, if you're affected but patched already, affected but have HT disabled, etc.

https://lists.debian.org/debian-user/2017/06/msg01011.html

7
mjw1007 13 hours ago 0 replies      
It's painful to have to read text like "select Intel Pentium processor models".

If Intel used marketing names that were more closely related to technical reality, then when something like this happens they wouldn't have so many customers finding themselves in the "maybe I'm affected by this horrid bug" box.

8
ourcat 15 hours ago 4 replies      
So will this be affecting most Macbook Pros of the past few years?

If so, there's a way to disable hyper-threading, but you need Xcode (Instruments).

Open Instruments. Go to Preferences. Choose 'CPU'. Uncheck "Hardware Multi-Threading". Rebooting will reset it.

9
onli 15 hours ago 1 reply      
Rule of thumb: On a desktop, if you have an i5 you do not have Hyperthreading. All i3s and i7s do have Hyperthreading, as do new Kaby Lake Pentiums (G4560, 4600, 4620).

On laptops, some i5s are not real quad cores but dual cores with Hyperthreading.
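If you'd rather not memorize SKUs, you can just ask the machine; a sketch for Linux (assuming /proc/cpuinfo exposes the usual "siblings" and "cpu cores" fields):

```shell
# Compare physical cores to hardware threads on Linux x86: if the
# "siblings" count exceeds "cpu cores", Hyperthreading (SMT) is active.
cores=$(awk -F: '/cpu cores/{print $2+0; exit}' /proc/cpuinfo 2>/dev/null)
siblings=$(awk -F: '/siblings/{print $2+0; exit}' /proc/cpuinfo 2>/dev/null)
cores=${cores:-0}
siblings=${siblings:-0}
if [ "$siblings" -gt "$cores" ] && [ "$cores" -gt 0 ]; then
    ht_status="on ($siblings threads per package, $cores cores)"
else
    ht_status="off or not reported"
fi
echo "Hyperthreading: $ht_status"
```

If `siblings` is twice `cpu cores`, HT is on; on a dual-core-with-HT laptop i5 you'd see 4 threads per 2 cores.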

10
age_bronze 8 hours ago 0 replies      
I would've expected at least some example assembly code reproducing the bug. How was it not discovered before, but only with the OCaml compiler? They say "unexpected behavior"; does this mean that code compiled with this can give incorrect results? Can this have any security implications? How much code was compiled with similar patterns? Can the problem be reproduced with any JIT compiler? We need to know what can cause this; maybe compiled and working code already contains such patterns waiting to be abused...
11
luckydude 12 hours ago 0 replies      
That's a really nicely done announcement. Simple, to the point, no drama, all the info you could want, scripts to figure out your processor, etc.

Well done Debian folks!

12
zzalpha 13 hours ago 0 replies      
Charming. I picked up a 5th Gen X1 Carbon configured with a Kaby Lake processor, and apparently there's no way to disable hyperthreading in the BIOS, and according to Intel's errata, no fix available yet.

Oh well... so far the machine (running Windows 10) has been stable minus one or two random lockups in 2 months of heavy usage which could be attributed to this. Guess I wait...

13
spektom 1 hour ago 0 replies      
I wonder how this issue affects cloud providers?
14
ncrmro 14 hours ago 0 replies      
Just got the 2017 no-touchbar 13" macbook pro with the kaby lake i7. Should I be worried? Can I even disable HT on a Mac? And presumably the update will be provided so the whole laptop is still ok?

I've been using the thunderbolt 3 dock with two external monitors and occasionally get a little glitch; prolly a loose cable, I think.

I've downloaded the bitcoin blockchain, done quite a bit of work in pycharm + chrome, multiple projects, flow and webpack in the background, and haven't had any sort of crashes tho.

15
ComputerGuru 12 hours ago 2 replies      
So, serious question: If the microcode "fix" for this ends up disabling HT, how does one get a refund not just for the CPU but for the $3k laptop I spec'd around it? Without needing to sue?

This isn't a hypothetical; what did Intel do when the only fix for broken functionality was to disable TSX entirely?

16
rwmj 15 hours ago 3 replies      
So if I understand correctly, some affected processors can be fixed by a microcode update, but there are some which cannot be fixed at all?

Also the advisory seems to imply that the OCaml compiler uses gcc for code generation, which it does not -- it generates assembly directly, only using gcc as a front end to the linker.

17
joshschreuder 8 hours ago 1 reply      
How does one get new microcode on Windows? Is that what the Intel Chipset drivers are?

And is the microcode fix available for non-Linux systems yet?

18
elnik 12 hours ago 1 reply      
My CPU (6th generation i5) died last week. RIP.

I installed debian 9, installed virtualbox and vagrant, and set up a clean development machine for myself; everything took 4 hours to finish.

I rebooted the virtual machine, and boom, there was a kernel panic which I sadly don't remember exactly / didn't take a picture of. After I rebooted the machine and opened a terminal, the system froze. The cursor wouldn't move. Reboot again: the motherboard had a CPU fail/undetected light on. Couldn't get it to boot after that.

I am both sad and relieved: bad stuff exists, but it's being patched before it proliferates.

I sincerely hope I'll get a replacement from Intel.

19
bleair 11 hours ago 1 reply      
When intel had the floating point division hardware bug they recalled chips: https://en.wikipedia.org/wiki/Pentium_FDIV_bug

I wonder if intel will do something like that again, or if the industry as a whole is more tolerant of unreliable / buggy behavior and will just live with it. Think of Apple just telling people that the poor reception strength was their own fault, changing software to hide problems, etc.

20
riledhel 15 hours ago 4 replies      
Does Windows have a patch for this too? Or just disabling HT is the safest option?
21
wfunction 7 hours ago 0 replies      
A little off-topic, but does anybody know of any hacky ways to disable hyper-threading (on Haswell if it matters) if the firmware doesn't provide the option?
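One hack on Linux (not quite firmware-level HT off, and it doesn't survive a reboot, but it parks the sibling threads): take the extra hardware threads offline through CPU hotplug. A dry-run sketch that only prints the commands, assuming the usual sysfs topology layout:

```shell
# Dry run: for each CPU, keep the first thread of its core and print the
# hotplug command that would take the extra HT sibling offline.
# (Nothing is changed; run the printed commands as root to apply.)
cmds=$(
  for sib in /sys/devices/system/cpu/cpu*/topology/thread_siblings_list; do
    [ -r "$sib" ] || continue
    first=$(cut -d, -f1 "$sib" | cut -d- -f1)   # handles "0,4" and "0-1" formats
    cpu=${sib#/sys/devices/system/cpu/cpu}
    cpu=${cpu%%/*}
    [ "$cpu" = "$first" ] || echo "echo 0 > /sys/devices/system/cpu/cpu$cpu/online"
  done
)
printf '%s\n' "${cmds:-no HT siblings found (or sysfs unavailable)}"
```

Whether offlining siblings is enough to dodge a microcode erratum like this one is another question, of course.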
22
itsoggy 10 hours ago 0 replies      
When HT first started appearing on P4 chips I was looking after NetWare, 2K and XP boxes, they would freak out with HT enabled all kinds of oddities, I suspect most because of the OS's not fully supporting it.

To this day I disable it by reflex on everything!

23
octoploid 15 hours ago 3 replies      
Well, at least Intel acknowledges, documents and finally fixes these CPU bugs (via microcode updates).

AMD on the other hand doesn't even acknowledge an issue when multiple customers report problems. See this Ryzen bug: https://community.amd.com/thread/215773

24
cJ0th 10 hours ago 0 replies      
This is just great! Just yesterday I got a new laptop with a skylake processor. Now I wonder whether I experienced that bug today as I got kicked out of dosbox (on debian) for no apparent reason. In the config file I had to change the value of 'core' in the cpu-section from 'automatic' to 'normal'. Could be something entirely different but it is a funny timing.
25
isaac_is_goat 15 hours ago 2 replies      
Holy cow. Definitely feel like I dodged a bullet by building an AMD/Ryzen system this time around - which had its own set of issues (but they seem to be more or less ironed out now).
26
walterbell 15 hours ago 2 replies      
Does this affect execution of Ocaml runtime, or only the Ocaml compiler?
27
geogriffin 14 hours ago 0 replies      
has anyone affected by this bug tried using a kernel configured with hyperthreading support disabled? would that work?
28
asow92 14 hours ago 1 reply      
So what does this mean for the thousands of new MacBook Pro 2016/2017 owners out there?
29
angry_octet 15 hours ago 2 replies      
Am I hellbanned?
30
natehouk 15 hours ago 0 replies      
So if I understand correctly, this was known all along?
31
salex89 15 hours ago 2 replies      
Great, pay a premium for the top-of-the-line CPU to get anything more than 4 threads, then disable it...
32
boris 15 hours ago 0 replies      
Intel treats its customers like mushrooms: feeds them shit and keeps them in the dark.
11
Ask HN: Why are credit card chip readers so slow?
52 points by dv35z  3 hours ago   47 comments top 20
1
sofaofthedamned 27 minutes ago 0 replies      
I was a user in the Mondex card trial in 1995. This was like modern chip cards, but a stored wallet instead of online auth to an account:

https://en.wikipedia.org/wiki/Mondex

The banks outfitted buses, bars, pretty much everywhere with readers but even after inducements to use it such as half price beer(!) it still failed. Why? Because it was soooo slow. Waiting for ~45 seconds at the bar for a payment to go through got old really fast. It barely lasted a year.

I'd have thought the friction of the payment would have been a lesson learned, but here we are 22 years later and it's still a pain.

2
ca12et 2 hours ago 2 replies      
Additional question: why is it faster in other countries? The first time I used a chip card in the US I was astounded by how long it took. I had been using chip (and pin) cards in Canada for years and it was never as slow as it is in the states.
3
phlo 2 hours ago 0 replies      
As has already been pointed out, EMV transaction flows go through many steps. From what I understand, the protocol was designed with a focus on flexibility, and little attention was paid to low latency.

Until some years ago, most terminals would mirror that. Most prominently, they used to have separate "enter pin" and "verify transaction amount" steps, and included longer delays for displayed status codes. Recent devices have started combining these steps ("Amount: xy. Enter PIN to confirm") and status messages.

Newer use-cases like the contactless qVDSC application have been tuned for better performance, limiting the amount of communication between reader and card.

For more details, have a look at this guide from VISA: https://www.visa.com/chip/merchants/grow-your-business/payme...

4
asciimo 2 hours ago 1 reply      
There's an express Target in the San Francisco Financial District that gets around this by assigning cashiers to two registers. They start the chip payment transaction on one register, then slide over to the second register to start another customer's checkout. Then they slide back to hand the receipt to the first customer, etc. Absurd but effective.
5
exabrial 2 hours ago 5 replies      
And why on earth do you have to SIGN still? Seriously. I draw a picture of Shamu most of the time, to the delight of many cashiers.
6
Humphrey 1 hour ago 1 reply      
This post explains why I was so frustrated using my card in the USA the other month. I figured it was super-slow because I had an international card, and it was confused.

Back here in Australia, almost every retailer (including those on 3g eftpos machines) takes < 4s from when i tap my card, to when I can start walking away. So much quicker than cash :-)

7
bericjones 2 hours ago 4 replies      
Because with the swipe readers there is only one call to the payment processor.

However, with chip transactions there are multiple calls for different payment processing flows. For example, a transaction could require 5 round trip request responses from the chip to the payment process meaning 5x the time required.
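A toy model of that effect (the round-trip count and the 300 ms figure are illustrative assumptions, not measured EMV numbers):

```shell
# Toy latency model: total time grows linearly with the number of
# reader<->processor round trips. 300 ms per round trip is an assumption.
rtt_ms=300
swipe_roundtrips=1
chip_roundtrips=5
swipe_ms=$((swipe_roundtrips * rtt_ms))
chip_ms=$((chip_roundtrips * rtt_ms))
echo "swipe: ${swipe_ms} ms, chip: ${chip_ms} ms"   # swipe: 300 ms, chip: 1500 ms
```

With those numbers the chip flow takes 1500 ms to the swipe's 300 ms; the latency multiplies even though each individual exchange is small.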

8
_wmd 2 hours ago 0 replies      
Note this isn't true in all countries. My UK cards all follow some apparently-online process at any UK merchant; however, during a stint in Finland a few years back, I didn't find a single merchant whose reader didn't instantly approve my transaction as soon as I correctly entered my PIN.

I never received a conclusive answer explaining the difference (note: I know, we can all make guesses).

9
tomerbd 6 minutes ago 0 replies      
could it be on purpose? to have it more secure? like, wait before you can retry?
10
userbinator 2 hours ago 1 reply      
Smart cards (ISO 7816), used for credit cards and SIMs, among other things, communicate through a relatively low-speed serial protocol. The secure microcontroller they contain is also quite slow, especially if you consider the cryptographic operations they're required to perform. I suspect part of it is due to power constraints, and partly tamper resistance.
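For a sense of scale, the ISO 7816-3 default is 372 clock cycles per bit time (etu). A rough worked example, with the 4 MHz card clock and the 2 KB payload as assumptions (and ignoring per-byte framing overhead, which only makes it worse):

```shell
# ISO 7816-3 default: 372 clock cycles per etu (one bit time).
# At an assumed 4 MHz card clock that is ~10.7 kbit/s raw, so moving a
# 2 KB certificate takes well over a second before any crypto happens.
clock_hz=4000000
cycles_per_bit=372
bits_per_sec=$((clock_hz / cycles_per_bit))                 # 10752
cert_bytes=2048
transfer_ms=$((cert_bytes * 8 * 1000 / bits_per_sec))       # 1523
echo "${bits_per_sec} bit/s, ~${transfer_ms} ms for ${cert_bytes} bytes"
```

So you're looking at on the order of 10 kbit/s and over a second just to move a certificate, before any signing even starts.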
11
vasusen 1 hour ago 0 replies      
Here's a good blog post from a WePay engineer that explains some of the slowness - https://wecode.wepay.com/posts/supporting-chip-cards-at-wepa...
12
cdibona 1 hour ago 0 replies      
Do you mean the actual chip back-and-forth? The inherent problem is that the 7816-d standard is a mess. It requires extremely small data exchanges, taking on the order of seconds to get a cert out of the card.

This has been a mess since the mid 90s, when I first worked on these things.

Here's a cruddy, not-at-all-useful link to the standard:

http://www.cardwerk.com/smartcards/smartcard_standard_ISO781...

13
toast42 2 hours ago 0 replies      
Planet Money did a story on this last year.

http://www.npr.org/sections/money/2016/04/13/474135422/episo...

14
pbreit 1 hour ago 0 replies      
Previously the mag-stripe conveyed your card number to the merchant, and they could charge essentially whatever they wanted (but there were various reasons they would likely charge the amount you owed). With chip, they have to compute the final amount while your card is inserted and cannot deviate.
15
ronpeled 10 minutes ago 0 replies      
just another reason we'll move faster into blockchain and decentralized crypto currencies...
16
Mandatum 1 hour ago 1 reply      
Bad internet speeds, WiFi, or businesses skimping on internet. It's usually not the terminal; it's the connection to their payment provider, or their payment provider reseller's connection to THEIR payment provider.

It's very common for bars and restaurants to have a dedicated line for the terminal, but usually they'll skimp on tech (I've seen dial-up over POTS even in fibre-capable premises). Also very common to use 3G or 2.5G.

It'd take a tech all of 5 minutes to diagnose and suggest a fix for 98% of these slow terminals. It's strange seeing businesses not look to fix these issues. If I were a payment provider I'd probably run diagnostics against my customers' terminals every day and force poor-performing customers to have someone come in and fix it.

18
callumjones 2 hours ago 0 replies      
I believe Index was working on speeding up EMV transactions: http://www.index.com/payments-and-security/emv/

I thought it now depends on the firmware in the card readers, which it seems companies like Index control.

19
randomfool 1 hour ago 0 replies      
FWIW, Square's chip reader seems to be much faster than many others.
20
Figs 1 hour ago 0 replies      
I don't know what the reason actually is, but I assumed it was slow by design to make it harder to compromise, similar to bcrypt.
12
Elias Burstein, Pioneer in Semiconductors, Dies at 99 nytimes.com
25 points by fmendez  5 hours ago   1 comment top
1
brian_herman 2 hours ago 0 replies      
Can we get a black bar for this guy?
14
Microchips That Shook the World ieee.org
79 points by amelius  9 hours ago   44 comments top 9
1
kqr2 8 minutes ago 0 replies      
Although it wasn't earth shattering, my personal favorite is the Motorola MC14500B 1-bit microprocessor:

https://en.wikipedia.org/wiki/Motorola_MC14500B

2
vparikh 1 hour ago 0 replies      
As stated with the 68000: I often wondered where the industry would be if the 68000 had been picked instead of the x86 architecture by IBM. The 68000 had a far better design and the ISA had real legs. Instead we got stuck with the monstrosity that is the x86 architecture. It probably set us back a decade or so, I think.
3
gumby 6 hours ago 0 replies      
I like that list but am sorry that they didn't include the 7400. A far more influential part than Transmeta (which I say despite having many friends who were at Transmeta).
4
jhallenworld 3 hours ago 0 replies      
I'm happy to see the XC2018 FPGA on this list. I remember when they came out, but at the time I did not realize just how much could be done with simple logic (I was then interested in bit-slice and microcode), so ignored them. I did get to use them though since both the 2018 and 2064 were still available in the mid-90s, even though they were obsolete.

This list is missing RF chips, though I suppose success is from RF technology on chips, not any single killer chip.

5
stefanpie 8 hours ago 1 reply      
Would the Atmel AVR line (such as the ATmega328P or the ATtiny85) count as one set of microchips that shook the world? They played an important role in the rise of modern hobby electronics such as the Arduino, but I don't know how much they are used in practical applications.
6
flavio81 7 hours ago 1 reply      
I wish IEEE had a newspaper! A really great, comprehensive list. Really good that they have added also the CCD chips and the DLP micro mirrors as well!
7
Sytten 8 hours ago 0 replies      
Proof that those chips got us to where we are now: I used most of them during my Bachelor of Computer Eng. (recently graduated). Chips like the 555 are extremely cheap and reliable. You've got to start with the basics if you want to create more complex designs after.
8
deepnotderp 8 hours ago 3 replies      
This is super cool, maybe add Itanium for an example of what failed?
9
InclinedPlane 5 hours ago 0 replies      
This is a really excellent list. It includes a lot of things that are easy to forget due to how much we've progressed since then but were absolutely hugely impactful in their day. What I find fascinating is the degree to which we've come to pretty much master this stuff. A modern smartphone includes analogs of nearly every single chip in the list or depends on some core element of the functionality in some way.
15
Picking Winners: A Framework for Venture Capital Investment arxiv.org
68 points by blopeur  10 hours ago   24 comments top 7
1
chroem- 7 hours ago 7 replies      
I feel like venture capital is a major choke point in our current implementation of capitalism, where "the market will decide" is really a euphemism for "a small handful of extremely wealthy investors will decide." We're paying a massive opportunity cost on all of those ideas that don't ever reach the market because VC's don't think they're easy money.

A good way to resolve this would be to reform the accredited investor laws into something more meritocratic. Instead of needing to own one million dollars in assets, there ought to be some sort of knowledge-based competency exam so that regular people can invest in ideas they think are worthwhile.

2
dzink 1 hour ago 0 replies      
The model seems to correlate what worked previously to what will work next. However, in a field that is supposed to lead to new product funding - past industries may steer funding in the wrong direction from what will work next. Two issues:

1. The purple cow effect: the opportunities for highest growth may be in underserved segments which are best addressed by founders less represented in Silicon Valley. (The next big thing may be a farming startup in India, but the founder won't fit the "well connected or well advised by people with startup exits" model this framework uses, and thus will be missed by SV investors, which leads to problem 2.)

2. The money trumps product problem: whatever you can't beat with a solid product, you can always hammer with more money in the bank. Instead of hearing about a farming innovation in India, American farmers could be getting FarmVille ads on TV instead and tuning out. Since VCs invest locally, even if a startup starts picking up steam in Chicago, SF investors who don't have a toe in that pond pick the local fish that eats the same algae and fund it far better than the Chicago company, which might have a better product. In a land-grab industry that money may be enough to gain adoption for the SF pond-dweller, but returns for the entire market will be lower, due to lower product quality and the tendency of big firms here to pick only one company per industry.

So the framework, biased by past data, may skew future data away from results that would be optimal without a framework.

3
startupdiscuss 4 hours ago 0 replies      
This analysis is a little too "content free" for my tastes. (Edit: I don't mean there is nothing to it, I mean the analysis is largely formal as opposed to the content of the decision)

But this problem hasn't gone unnoticed and there are some ideas around how to solve the "pick the rare winners" problem:

1. Andreessen: Don't pick winners. Invest in the startup after it has already demonstrated itself to be a winner but before it goes public. This is the safe growing area.

2. McClure: Invest small amounts, early so that you can afford to spread the investment over many companies.

3. Thiel, Gurley: Be right

4. Graham: combination of #2 and #3

5. Doerr: Network like crazy to have a shot of being in the few good ones (this assumes you will recognize them)

#3 is not necessarily something we can reproduce

4
samfisher83 50 minutes ago 0 replies      
If you look at figure 4 on page 24, it basically says that the VCs' picking ability is about as good as random chance. They have some really smart people working for them and it's still hard to pick winners.
5
exogeny 8 hours ago 3 replies      
There's a few VC firms (Correlation Ventures leaps to mind, as noted in the article) that invest solely on the basis of a quantitative model that looks at the various features of the business (market, founders, etc.) and then does some kind of neural network/similarity scoring analysis on it. Of course, a big feature that I presume their models have which this paper does not is the understanding that an IPO is worth tens of small wins, if not more.

The real optimal setup here would be to pair that kind of mathematical rigor with the dealflow of an a16z or KP. I would suspect that both of those two would say that a similar model exists in the heads of their partners so far as pattern recognition, but..

6
coolswan 5 hours ago 0 replies      
Despite it being so obvious, I really like their analogy,

 pharma::drug
 studio::movie
 vc::startup
"Just" need one to work out!

7
hatmatrix 7 hours ago 0 replies      
I thought predicting winners in startups was searching for "Black Swan" events - the ones models often miss...
16
IPv4 route lookup on Linux bernat.im
74 points by luu  11 hours ago   11 comments top 3
1
xiconfjs 58 minutes ago 1 reply      
How does this 50ns compare to commercial hardware routers like Brocade's, which have dedicated memory (TCAM) for their routing table? That should be extra-fast memory, as e.g. Brocade claims.
2
betaby 8 hours ago 3 replies      
Unfortunately the IPv6 stack doesn't match the IPv4 one on features and optimizations. IPv6 is still using a regular radix tree and a route cache.
3
ladzoppelin 6 hours ago 2 replies      
Why is this image https://d1g3mdmxf8zbo9.cloudfront.net/images/linux/lpc-trie-... jumbled in Firefox beta?
17
Backtick Macro Processor for Forth rdrop-exit.github.io
34 points by eatonphil  9 hours ago   1 comment top
1
evincarofautumn 48 minutes ago 0 replies      
Nice to see more interest in Forths and concatenative languages lately on HN. IMO they're even more amenable to metaprogramming than Lisps, and there's a lot of power in a compositional, interactive style of programming, especially for a low-level language with no safety nets. I definitely recommend playing around with a Forth (I like Gforth) or Factor, and trying to absorb some of the philosophy; it's helped improve my program design in other languages.
18
Torus A secure, shared workspace for secrets torus.sh
61 points by sr2  10 hours ago   11 comments top 5
1
TheSwordsman 1 hour ago 0 replies      
Looks interesting. I'll definitely take a look at how this compares to Hashicorp Vault. Definitely getting closer and closer to the point where I can start to properly utilize a tool like this.

Reading the docs, though, it does seem like it's one legendary AWS outage away from being a huge problem:

>Torus's infrastructure has been designed from the ground up for resilience without any single point of failure. All of our services are autoscaled and run in multiple availability zones in the us-east-1 region.

2
zimbatm 7 hours ago 4 replies      
As soon as developers have access to the production credentials it's game over. Auditability is gone. Passwords end up being stored in various password stores and plain files because it's convenient. Developers connect to the production system and start doing live changes.

Hashicorp Vault is more difficult to put in place but it does the right thing. With its custom backends it can generate temporary tokens, for example to access the database. Those tokens are short-lived and part of the audit log.

3
wyqydsyq 4 hours ago 0 replies      
We've used Torus a fair bit in my team at work, however we're now using Docker's native secrets solution for our container runtimes as it avoids the knowledge and build-time overhead of an extra dependency.

I personally found the experience of using Torus to be great. Getting a quick working setup is easy and it doesn't take much effort to transition from there to locked down access control. Will likely continue using it outside of the Docker context.

4
tptacek 7 hours ago 0 replies      
This does not seem any less complicated than Vault or Keywhiz.
5
m_sahaf 7 hours ago 2 replies      
Looks interesting. I probably need to get my hands dirty to understand how it's different from Hashicorp's Vault. A quick glance says it's a cloud-based solution with simpler UX. I see only client binaries and sign-up instructions. There are no server setup instructions.
19
Show HN: A Minimal Code Example of Using Vulkan for Computations on the GPU github.com
50 points by erkaman  11 hours ago   7 comments top 3
1
erkaman 11 hours ago 1 reply      
OP here. This is a small code example of how to use Vulkan for compute operations, in only ~400 LOC. It's pretty short and has plenty of comments, so it should be useful for people learning Vulkan, I hope.
2
ilaksh 5 hours ago 0 replies      
Has someone created a wrapper or higher-level API that takes care of some of the details for common use-cases?

For example, say I don't need to optimize my command submittal, or I just want the main device. Or maybe my common use case is a compute shader just like the program submitted in this Hacker News post. I would want a class that just exposed a simple API like runComputeShader() or something.

3
sorenjan 9 hours ago 2 replies      
I haven't used GPGPU in a while. Is Vulkan expected to take over from OpenCL?
20
A goldmine of Radio Shack goodies is up for auction hackaday.com
48 points by cstuder  10 hours ago   19 comments top 8
1
dkresge 6 hours ago 1 reply      
I don't know if they were ahead of their time, or woefully, even negligently, unaware of their market. I remember going to "the mall" as a kid with my Mom and spending (what seemed like) hours staring, coveting, and sometimes purchasing the blue blister pack ICs, all while she waited patiently to do her shopping. I didn't have the income they were looking for, and was far from their target demographic.

Some time in 1979 my dad paid the local RS a visit with an intent to purchase a computer for the family. After spending an hour in an effort to get some answers from the salesman who, according to my father, was more interested in selling battery memberships, he walked out. I now recognize and appreciate that, at the time, a $2.5k purchase might have been seen as the exceedingly unlikely result of questions from "just another lookie loo". But if R.S. had taken even a moment to see the potential of their customer base, I likely wouldn't have awoken Christmas 1979 to an Apple ][+.
2
ChuckMcM 8 hours ago 2 replies      
I suppose it would not be as fun if the title was "A thrift store of Radio Shack goodies", but the truth is I've seen an example of just about everything they are auctioning off at thrift stores. Radio Shack moved a lot of product and it's out there still :-). I don't think it will hit peak nostalgia until 2020. See, that way people who were 10 in 1990 and had their lives changed by the neat stuff they could buy there will be 40, and will be trying to recapture that wonder that has been beaten out of them by 15 to 20 years of CRUD programming.

EDIT: Ok, not the pictures of the executives and the stuff they handed out for sales person of the quarter :-)

3
robmiller 6 hours ago 0 replies      
My memorabilia of RadioShack, https://imgur.com/a/BolEA

I even worked for them in Nashville before grad school. Sold a lot of cell phones, but my favorite was selling a karaoke machine to Harry Connick, Jr.

4
coupdejarnac 1 hour ago 0 replies      
I think the only thing I really miss about the Tandy/Radio Shack era is going ice skating at the Tandy Center in downtown Fort Worth when I was a small child. I got my start in engineering with their kits and Forest M. Mims books, but shopping there was a mixed experience. Talk about a poorly run company.
5
RcouF1uZ4gsC 1 hour ago 0 replies      
I remember an Archie comic that was distributed at my school that was basically an advertisement for Tandy computers. Archie, his dad, and the school principal all had Tandy computers that they did awesome stuff on.
6
innovate 2 hours ago 0 replies      
this is my favorite story about the impact radio shack had on a generation of early makers and startups: "HOW I FOUNDED A $2 BILLION COMPANY WITH A 95 CENT BOOK FROM RADIOSHACK": https://www.wired.com/2016/07/how-i-founded-a-2-billion-comp...

I also loved shopping there... the items seemed obscure, and growing up I always wondered who actually shopped there for these items to justify an entire store... but the demand was there; they just didn't figure out how to take advantage of their early-mover advantage

7
waspear 4 hours ago 0 replies      
What goodies, like the $50 USB cables?
8
ams6110 7 hours ago 0 replies      
Crap from a crappy company that treated their employees like crap.
21
Why is the function SHStripMneumonic misspelled? microsoft.com
36 points by uyoakaoma  10 hours ago   42 comments top 13
1
hota_mazi 8 hours ago 3 replies      
> so there would have to be some coordination among the teams so everybody fixed their spelling at the same time.

What a strange approach. Why not create a function with the correct name, put the code in it and redirect the misspelled function to it?

Mark the bad function deprecated, warn the teams and after a few release cycles, remove it.

2
xg15 7 hours ago 1 reply      
Raymond Chen's blog posts are always interesting and enlightening to read and we should be grateful that he takes the time and effort to write them, but his people skills apparently could use a good bit of improvement.

So he teases a significant detail of the answer, then notes that not only won't he explain but he will not allow anyone else to explain either - no reason given. Then, when people unsurprisingly do discuss it, he gets mad and deletes the comments.

That's in addition to a lot of aggressive or condescending answers I've read to what seem quite reasonable questions.

If commenters are such a problem, wouldn't it by now be better to simply close comments completely?

3
johan_larson 7 hours ago 1 reply      
See also: creat()

https://linux.die.net/man/3/creat

It goes way back to the murky beginnings of Unix.

Ken Thompson was once asked what he would do differently if he were redesigning the UNIX system. His reply: "I'd spell creat with an e."

4
dTal 6 hours ago 0 replies      
Okay so the ultimate cause is apparently "because Microsoft was legally compelled to publicly document poorly-designed internal functions as part of the DOJ anti-trust suit".

Go figure why Raymond Chen decided to do a blog post on this though, since apparently that is a verboten topic which leads to a weirdly sparse article that doesn't answer its title question, and a censored comment thread.

It's annoying, but given that it's nearly a decade old, it's probably not worth getting upset over.

5
cytzol 8 hours ago 0 replies      
For anyone wanting to see the linked IOCCC program (now-broken link in the third paragraph), here it is: http://ioccc.org/1987/hines/hines.c

"This program was designed to maximize the bother function for structured programmers. This program takes goto statements to their logical conclusion. The layout and choice of names are classic."

6
yuhong 8 hours ago 2 replies      
The order they are referring to is probably the DOJ settlement in 2002. There used to be a page called "Settlement Program Interfaces" in MSDN. This was also when winternl.h was invented.
7
Vanit 9 hours ago 1 reply      
Not sure why they couldn't rename it, make the previous function an alias of the new one, and deprecate it.
8
userbinator 6 hours ago 1 reply      
I pronounce "mnemonic" without any "u" sound, and I've almost never heard otherwise, so this particular misspelling is a little puzzling --- unless the original author was thinking of "pneumonic". I wonder if it is a UK vs. US difference.
9
raldi 8 hours ago 0 replies      
This story just teases that there will be an answer and then doesn't really give one. To paraphrase the ending: "We had to. To those writing in to explain that we didn't have to, trust me, we had to."
10
desdiv 7 hours ago 0 replies      
TIL that "referer" is a misspelling of "referrer" (from the top comment).
11
makecheck 4 hours ago 0 replies      
Deprecation is a really good solution for fixing things "eventually" but only if you pair it with a plan for making the old one go away forever (a plan that you stick to!).

If you aren't serious about actually removing a bad API at some point, don't change anything. Otherwise, you create two things that need to be tested/supported/kept binary-compatible/etc. instead of one, raising technical debt when you were supposed to lower it.

12
mrkgnao 5 hours ago 0 replies      
The correct link to the IOCCC entry is:

http://www0.us.ioccc.org/1987/hines/hines.c

13
mchahn 4 hours ago 0 replies      
I hate it when people misspell functions and commands to save one letter. umount comes to mind.
22
In Towns Already Hit by Factory Closings, a New Casualty: Retail Jobs nytimes.com
101 points by ctoth  14 hours ago   112 comments top 11
1
ChuckMcM 12 hours ago 5 replies      
As the article points out, the retail job crush is exactly because the factories closed; nothing more magical than less GDP means less work. Previously those factory workers bought things at the store, but now that the factory is closed they can't (or they cut back), and the stores don't have some magical source of 'outside of market' money.

They call this out in a couple of places but never quite connect the dots. There is "He renovated the first floor to attract customers from farther away, customers who might have more money to spend and more places to go than Johnstown." to pull money from towns farther away, or "But fewer people can afford his products now that the good jobs are long gone, and Mr. Apryle has had to make adjustments."

It's not Amazon, it's not 'big box' chains; it's that the city no longer has a production base, and so the fraction of GDP this space used to produce has gone away.

2
mc32 12 hours ago 3 replies      
It's a big economy and changes take too long trickling down to those forgotten rural towns once part of the American economic engine.

I'm pessimistic that the federal or even state governments have the will to do anything about it - in terms of policy to effect change.

Not that top-down change always works (Japan, for example, has made many, many half-hearted, ill-conceived attempts at restarting its growth engine), but smaller, nimbler economies have been able to manoeuvre around economic obstacles (like Taiwan, Singapore and S. Korea) whereas others have stumbled and fumbled (Malaysia, Mexico).

That said, we need to try something. Trump tapped into this angst but does not look like he (or the party) will deliver in the least. Nevertheless, the issue is not going away and will only become more pronounced. Someone will have to do _something_ about it. People will get upset and more radical elements will be elected till someone begins to take notice and does something substantial about the economic decline of the part of the middle class which got by on medium skills.

3
notadoc 9 hours ago 1 reply      
It's fairly obvious that less (or no) disposable income leads to less consumer spending, which leads to fewer jobs, which leads to less consumer spending, which leads to.... That is exactly the ongoing story of the vanishing middle class.

The lack of demand snowballs and branches out to impact many other things the exact same way that a strong increase in demand spreads out to impact other things.

With a predominantly consumer driven economy, eventually all of this will catch up and be a significant drag on GDP at best, though it'd be easy to argue we're already at exactly that point.

So, what happens next and what do we do about it? If only we had some sort of large team of people who were elected to solve this kind of problem, and if only they had some sort of historical periods of widespread prosperity to reference and model policy on....

4
oneplane 11 hours ago 2 replies      
This job-centric lifestyle seems to point to a problem of sustainability rather than economics. Jobs are never endless, moving somewhere or creating a town for one particular job or factory will always end in deserted towns and closed factories, and not adapting to what is happening around you always ends in tears. Nothing new here...

If you want to work, you should look for what people need rather than whatever you did for a job 20 years ago. For example, we'll always need energy and food, but we won't always need cowboys and coal miners. Instead of trying to be a coal miner, try to be an energy worker.

5
ctoth 14 hours ago 1 reply      
This is related to https://mobile.nytimes.com/2017/04/15/business/retail-indust... which I posted a few weeks ago but which got no traction. This is a critical part of the story that people talking about robots taking our jobs are missing. It's already happening; it's not just robots but also new scheduling algorithms, and frankly people just not giving two shits about retail workers.
6
withdavidli 11 hours ago 1 reply      
Worked in a mall for a decade. As money moves out, so do retail stores. Small mom-and-pop shops barely hang on, surviving only because they have little to no labor cost. Small businesses that do have to hire employees close down fast because labor costs become unsustainable in a low-margin business. Worse is when the big-box stores move out: they tend to be the attraction that drew people to the mall in the first place, and smaller stores got business from being "discovered" along the way in high-traffic locations.
7
ctoth 14 hours ago 0 replies      
One more piece if anybody stumbles across this: https://morecrows.wordpress.com/2016/05/10/unnecessariat/
8
bluedino 10 hours ago 0 replies      
Well, if there's a bright side, it's that the good retail jobs are already long gone. I had friends whose parents made decent livings selling suits or appliances at Sears and JCPenney. You used to make $40k/year as an assistant manager at Kmart. Now it's almost impossible to make $10/hr in retail. There are no benefits or even full-time hours. And the only stores hiring, like Dollar General, work you like a dog with two people per shift: you have to stock, clean up, and play cashier.
9
Mz 10 hours ago 0 replies      
Sort of a TLDR of the piece:

Rural counties and small metropolitan areas account for about 23 percent of traditional American retail employment, but they are home to just 13 percent of e-commerce positions.

Almost all customer fulfillment centers run by the online shopping behemoth Amazon are in metropolitan areas with more than 250,000 people, close to the bulk of its customers.

"I'm thinking about what's next," he said. "We're essentially thinking of Johnstown as an economic development laboratory."

10
Theodores 12 hours ago 3 replies      
The article blames ecommerce and Amazon for the latest decline. I don't live there; however, I believe this has nothing to do with ecommerce. It is simply the loss of manufacturing and what happens over time.

In the UK we had many communities thrown on the scrapheap with mining, steel and much other manufacturing gone. Initially people invest their redundancy money in things such as a new dog-grooming business, a cafe, a shop, perhaps a tattoo parlour, depending on where 'follow your passion' leads. These businesses go okay for a while; eventually the redundancy money runs out, or, in rare instances, the 'follow the passion' business actually meets a genuine need and a success story happens.

Around the time of this general decline my sister was trying to raise money to go somewhere fancy with her group of friends. Being skint she decided to raise some money by making things - things with beads, jewellery, that sort of stuff. These items sold 'well' but only to her friends whom she was going to be travelling with. There was no 'external market'. So rather than go on the big trip to the festival they missed out on that and spent what little money they had on beads etc. to make stuff to sell to each other. A lesson in economics was learned the hard way.

To some extent any town/city/country that does not have manufacturing and external markets will be a variant of my sister's schooldays model of capitalism. If manufacturing (or mining) jobs go and retail comes along to 'fill the gap', then it cannot last forever. Tourism can't come to the rescue either.

11
fiatjaf 12 hours ago 1 reply      
23
After 17 years of development, does the Boost library meet its original goal? cppdepend.com
47 points by virtualcpp  8 hours ago   21 comments top 7
1
ClassyJacket 4 hours ago 0 replies      
This website is unfortunately near unreadable thanks to hijacking my scrolling and constantly pushing me past where I want to look.
2
personjerry 4 hours ago 0 replies      
Also check out https://github.com/facebook/folly which "complements" Boost and std
3
vortico 1 hour ago 1 reply      
Is it still required to download the entire 300MB+ Boost library to use a small section of it, like odeint for example? This has always been the "joke" that prevents me from taking it seriously.
4
WalterBright 3 hours ago 1 reply      
An unexpected dividend of Boost is the Boost License, which is the most open source of the open source licenses. We use it pervasively in the D community.
5
vitaut 5 hours ago 1 reply      
I think Boost was very successful in encouraging the development of high-quality libraries that seamlessly work together (most of the time) and have consistent build infrastructure. It is not that important nowadays because the most important parts have already been integrated into the standard library, and we have a de facto standard build system (CMake) that makes it easy to work with small libraries that do one job well.

Where Boost is a bit lacking is modularity and somewhat varying library quality but this is kind of expected considering the number of subprojects.

6
noway421 4 hours ago 1 reply      
This website has terrible scroll.
7
adzm 5 hours ago 0 replies      
It gets rehashed on the mailing lists all the time but the biggest issue with boost now is backwards compatibility. While it is great to have libraries working on older platforms, it's a pain to use modern cpp features.

With increased modularity, I'm sure this situation will be improved.

24
Easiest Path to Riches on the Web? An Initial Coin Offering nytimes.com
87 points by robbiet480  7 hours ago   135 comments top 19
1
ashnyc 1 hour ago 0 replies      
ICOs are the best thing that ever happened. This is no different from how kickstarter.com started; people were saying the same things. People put up a lot of BS projects and they got funded; in time those kinds of projects will fail on their own. This is just the start. No government or bank would have supported such a system. For once, the small guy can participate in this market. I would have loved to invest in Amazon, Facebook, etc. when they were private, but you had to have the connections and access.. here the little guy in Colombia or Africa has the same access to such a deal
2
mathieuuu 6 hours ago 5 replies      
I am amazed that in 2017 there are still people calling Bitcoin or Ethereum Ponzi schemes. Cryptocurrencies have a lot of problems (technical, economic, philosophical; ICOs are insane), but saying they are Ponzi schemes is plain wrong. It just takes five minutes of reading about the signs of a Ponzi scheme and basic knowledge of how those currencies work to understand why.
3
jamespitts 6 hours ago 2 replies      
I would like to make an appeal to the YC community: please reach out and help guide those involved in this rapidly advancing phenomenon.

There is a huge amount of instantly-deployable capital in Ethereum and Bitcoin, held by individuals now seeking ways to put it to work. Many technologists will naturally make use of this energy, but so will BS artists and people under the influence of hubris.

I am making this appeal because it is irresponsible to stand by and watch so much financial energy be directed at projects without appropriate constraint. Many firms funded by ICOs (or soon to be) have qualified leaders and developers and are developing great ideas, but IMO many are also raising too much money for their own good.

What may be needed to guide these rockets to their full potential are maturity, mentorship, and the careful application of jurisprudence and principles of accounting.

4
ransom1538 5 hours ago 6 replies      
I have a fetish for watching complicated court cases play out. What you realize is that a judge in a courtroom has a lot of power: they can take your money, freedom or even kids. However, if I had a few hundred thousand in Ethereum, encrypted and placed in a few random anonymous GitHub repos, well, the government couldn't take it. It is truly yours. No one can take it. That must be a horrifying proposition for (some) governments: the inability to control people.
5
matt_wulfeck 6 hours ago 3 replies      
These crypto coin bubbles can't pop soon enough.
6
bluetwo 6 hours ago 2 replies      
200 years ago in the U.S. regional banks printed their own currency. The constitution does not create a common US currency, although we have been operating as if it does.

Will a non-nation state be able to generate enough credibility to run their own currency that lasts a lifetime?

I don't know, but I think it's interesting to watch.

And if they do, it may force central banks into a corner by limiting their ability to print money (i.e., lowering the value of your currency by printing money will cause people to move their money to the alternative currency, leaving you with inflation).

7
huckyaus 6 hours ago 0 replies      
If there's anyone qualified to write about the crypto bubble, it's surely a guy called Popper.
8
piratebroadcast 6 hours ago 2 replies      
Anyone have a resource on the technical aspect of this? Seems like these businesses create their own cryptocurrency? Or build it on top of Ether?
9
hendzen 6 hours ago 1 reply      
Someone should start WebVanCoin.
10
thedrake 5 hours ago 2 replies      
Wait until Amazon offers its own coin... that is when the mainstream will happen. If these much less known projects are able to pull off this kind of valuation, the legitimacy and hype will multiply many times over when Amazon does it. They have already tested the waters with https://www.amazon.com/gp/feature.html?docId=1001166401
11
surrey-fringe 4 hours ago 0 replies      
You all get so angry when someone mentions a blockchain. It's strange.
12
notadoc 6 hours ago 4 replies      
Does anyone think these cryptocoins will hold value during, in, and through the next recession?
13
the_cat_kittles 6 hours ago 1 reply      
These things strike me as greater-fool schemes: everyone hops on thinking they are probably ahead of the curve, and most of them are suckers. What a funny game; the skill is knowing how many other people will go for it.
14
kim0 5 hours ago 2 replies      
Can someone in the know shed some light on why every project needs its own currency? Why isn't a standard coin like BTC or ETH good enough?
15
sriram_sun 6 hours ago 1 reply      
Would it make sense for companies like Amazon or Alibaba to print their own currencies when they are in the business of selling everything under the sun?

Would they be saving on transaction costs?

I believe most of the money ending up in Bitcoin exchanges is laundered in the first place. Also, a whole lot more is being laundered every day than Bitcoin is being mined. That would mean Bitcoin will keep going up in value; gold should correspondingly fall.

16
thebiglebrewski 4 hours ago 1 reply      
How can one profit off of this?
17
devoply 6 hours ago 2 replies      
> made it easier than ever for entrepreneurs to raise large sums of money without dealing with the hassles of regulators, investor protections or accountants.

Umm, no easier than raising funds for any other venture (in fact probably more difficult), and I would think a lot of that money is coming from people who were already made rich by other coins like Bitcoin. Easy come, easy go, but enjoy the gravy train while you are riding it.

18
nodesocket 6 hours ago 6 replies      
I'm sorry but this is so absurd. They are doing the equivalent of printing their own monopoly money in the garage and then selling it for real currency... dollars.
19
partycoder 6 hours ago 1 reply      
Except that soon, the proliferation of cryptocurrencies will reduce the value of such ICOs.
25
Job applicants over 40 filtered out by employers uu.se
202 points by fraqed  10 hours ago   181 comments top 32
1
sandoze 4 hours ago 2 replies      
I might have 'lucked out': I'm 41. I underperformed in my 20s (it was the 90s, after all, and the bubble burst just as I was gaining momentum in the market) and received my college degree at 30. I assume most people who see my resume consider me 8-10 years younger than I really am. Doesn't hurt that I hit the gym and haven't gone gray yet. I went the mobile route (iOS) and haven't had a pay cut in 8 years (currently $200k+ living in the Midwest). Not without its ups and downs; I tend to not get hired at startups (maybe those 20-somethings smell something is up), but Fortune 100s are quick to give me an offer letter.

With age comes maturity. It's allowed me to get along better with my co-workers (so many upper 20s, lower 30s tend to be a bit... hot headed) and I'm not afraid to negotiate. The older I've gotten, the more comfortable in my skin and in my skill set I've become.

If there's ageism I've yet to experience it and I've worked with people well in their 50s doing mobile. It comes down to who you're working for, what your skill set is, timing and, in my opinion, health. You've got to stay healthy and look healthy! Also, I tend to prune my resume, no one wants to see your experience 8+ years ago.

Or maybe it's all luck, ask me in five years what I think when Objc/Swift goes the way of PHP, I might be singing a different tune.

2
zebraflask 5 hours ago 1 reply      
I just turned 40. There is an unfortunate grain of truth to this, especially among startups, but it would be a mistake to think the situation is that dire.

I am constantly pestered by recruiters and companies to interview. I think one of the things that helps is that I trimmed my resume to omit material older than 5 years, removed unnecessary dates, and I make a point of drawing attention to studying for new industry certifications. It probably doesn't hurt that I stay physically fit, either. As cruel as it may be, if you're out of shape and look "frumpy" or "run down," that will count against you far, far worse than your age.

The key is to make the age factor irrelevant by not drawing unnecessary attention to it or by projecting a stereotypical "middle aged" image. We can argue all day about whether that's fair or not (it's not), but you have to do what you have to do.

3
notadoc 6 hours ago 5 replies      
Harsh and an obvious problem for those over 40 and industry in general, but is this simply the natural result of companies optimizing for maximum productivity at the lowest possible cost to them?

Many 40+ year olds have families or other life obligations outside of work, and thus they may not be as willing to put in absurd hours that a young employee could be squeezed for. The older employee also might be more likely to take weekends off, and want to use vacation time.

Additionally, someone over 40 is supposed to be in or near their peak earning years, which lasts ten to twenty more years (or did historically anyway) so their expenditure is going to be significantly more than someone with a few years of experience.

I realize it's one of the lamest analogies possible, but for many companies, employees are quite literally a cog in a larger machine, and so they want the cheapest possible cog at a reasonable quality level that works the longest before breaking down (quitting, getting fired, burning out, etc).

To fight this, I'd bet those over 40 would have to aim to get into important management and executive positions, which are less likely to be swapped out for less experienced, less demanding, and cheaper labor.

4
ThomPete 5 hours ago 2 replies      
I am 43 I know this is true. Even for people in their thirties.

This is why I decided to start my own company again after 4 1/2 years at Square which was probably the last time I ever would be working for someone else.

This way age becomes an asset rather than a liability.

5
badsock 5 hours ago 3 replies      
I don't understand why everyone bought the idea that we don't need unions anymore.

This is exactly the sort of thing they make better.

6
xiaoma 38 minutes ago 0 replies      
For founders who are fundraising, it's even more extreme.

Job applications generally don't and possibly can't ask about age, but it's a required field on applications for YC or other incubators.

7
SOLAR_FIELDS 6 hours ago 3 replies      
Since it is not legal for employers to require you to give your age as part of an application, doesn't this simply encourage candidates to lie about their age? One way age is given away on resumes is the work history or the date of college graduation. Given the current situation, is it not advisable to simply omit the date of graduation and only provide the previous 5-10 years of work experience on the resume?

It isn't a perfect approach, but it makes it more difficult to discriminate since by the time you have a face-to-face interview with the employer you are already well along in the interview process.

8
justboxing 5 hours ago 2 replies      
> In the study, the researchers sent more than 6,000 fictitious job applications to employers who had posted job ads for administrators, chefs, cleaners, restaurant assistants, retail sales assistants, business sales agents and truck drivers to then compile the employers responses, such as invitations to job interviews.

Not trying to be in denial, but all these jobs appear to be blue collar jobs (assuming "administrators" is office admins and not Sys Admins / Network Admins :) .

Any data on whether this happens in our tech/IT industry, where every other month you read a story about a severe shortage of skilled tech workers everywhere?

9
Keyframe 6 hours ago 1 reply      
Time for some heavy stick and penalties, laws exist: https://en.wikipedia.org/wiki/Employment_discrimination#Lega...

Challenge is to prove it.

10
awacs 2 hours ago 2 replies      
I left my former industry at 40 (now just over 3 years ago), bootcamp'd and got into dev and it was the best thing I ever did. I've been gainfully employed since the move and love my current job. I can't say it wasn't a lot of work, and yes I feel the competition of the young folks, but it is what it is.
11
nether 6 hours ago 1 reply      
"You know what they do with engineers when they turn 40? They take them out back, and shoot them." - Primer
12
gexla 2 hours ago 0 replies      
> In the study, the researchers sent more than 6,000 fictitious job applications to employers

Might this be a clue?

Any job application which is in the reaches of an automated process must be a joke endpoint. How many other automated applications do they get selling employee skills, sex, penis enlargement pills, fast loans, malware and other trash.

Disclosing information should be the first "BS" smell for a job. I'm often well into getting work done before the person I'm working with figures out how old I am, where I live or other personal details. Granted, that's freelancing.

I live in the Philippines, where the age requirements are actually advertised, and these ages seem pulled out of a hat. I also get the sense that the people running the show at all levels couldn't tell their asses from a hole in the ground. It's a pleasant surprise to find someone who seems competent at convincing you that there's some purpose to the role they occupy, given the serious time commitment it takes out of their life.

Clearly the hiring process is just as broken as everything else. Why expect that hiring is going to be significantly more awesome than the rest of the system?

Don't interact with machines. Get to know real people. Show people what you can do. Preferably find people who tell you they could use your help rather than you telling them that you need a job. ;)

13
alanfranzoni 53 minutes ago 1 reply      
There's ONE thing that really amazes me. A lot of people around the world - not just in Europe or the US - keep speaking about the talent crunch, STEM shortage, etc., as an emergency... so, if this research is not flawed, is the real issue that such talent is just ignored at 40?
14
ErikVandeWater 5 hours ago 1 reply      
How did they actually test this? Without details, there is no new information. Also, do we know that there are no confounding variables in these findings?

As a hypothetical employer I might think these roles are lesser roles, so if you are older I would wonder why you had not been able to secure a greater economic situation.

Second, as consumer facing roles, attractiveness is a benefit, with those over 40 having less of it. It is not discrimination on the basis of age, but attractiveness that would be the cause.

15
krapp 6 hours ago 4 replies      
Welp. Guess it's freelancing and minimum wage shift-work until I die, then.
16
daxfohl 6 hours ago 2 replies      
I didn't have too much trouble last year at 41. Took a couple months to get any traction at all, but then suddenly got a few interviews and eventually some offers all in quick succession. Maybe I was lucky though. After reading this story I'm definitely more inclined to stick with the position I have rather than to try to go out on my own again.

Another talking point: the trend of waiting until late 30's to have kids greatly compounds this problem.

17
Animats 20 minutes ago 1 reply      
"Logan's Run". It's here.
18
noncoml 6 hours ago 0 replies      
So, age discrimination is not tech specific, but unlike other industries, tech workers change jobs much more often and that creates the illusion that the problem is more tech industry related.
19
geodel 4 hours ago 0 replies      
I am unable to reconcile the facts that life expectancy keeps rising while job opportunities keep falling. Governments across the world in low-, mid- and sometimes even high-income countries are already running huge budget deficits, so they are not going to provide cheap/free amenities to the jobless masses when even people with jobs are finding things expensive.
20
myro 3 hours ago 0 replies      
My father, a 60-year-old engineer and a retiree in a few months, asked me how to find reliable basic-level freelance jobs. The thing is, the country has just passed through a couple of recessions, and the retirement program the country will greet him with is around USD 100/mo, which is obviously not enough to cover expenses. Finding a local job at his age and in that economic situation is just not worth the time and effort. And there are literally millions like him. I personally will probably not receive any pension at all (currently 32 y.o.).
21
EternalData 3 hours ago 0 replies      
This is why an employment-based sense of purpose in life is so dangerous. I think it is nothing short of a tragedy that the way society is set up validates an utter hopelessness if you are not employed, despite the fact that there may be structural factors arrayed against you. I've seen it almost chew up my father, and I want to resolve that it'll never happen to me, but I know that so long as employment remains the default badge of worthiness in a capitalist society, it will come to pass for me as well.
22
prawn 5 hours ago 0 replies      
With general increases in the age to receive a pension (increasing six months every two years in Australia), but challenges in finding employment, what bridging options are there? What's the landscape going to look like for 50 year olds if re-skilling after a lay-off doesn't always help?
23
Entangled 4 hours ago 1 reply      
I am in my fifties with over 30 years programming experience and proficient in 20 languages yet one employer rejected my application because I didn't meet their requirements and they found a better fit.

I work fifteen hours a day, can't stop learning and trying new technologies to keep myself on top of the wave.

24
ruleabidinguser 5 hours ago 4 replies      
What's the logic here? Are applicants over 40 generally worse performers?
25
SQL2219 5 hours ago 0 replies      
If only there were some kind of physically demanding test for employment, I would then punish all those youngsters by going to the front of the line.
26
woogiewonka 5 hours ago 1 reply      
31 here. Couldn't find work for 6 mo despite being well qualified for positions I applied for. My guess: too many applicants for the same position. Ended up going the freelancer route, couldn't be happier now. Sorry, I know this doesn't relate to age (I don't think) but just wanted to say that.
27
lquist 5 hours ago 0 replies      
I don't speak the language, so I'm unable to read the underlying report, but I hope this quote is out of context in the article: "There should be no doubt that the employers discriminate on the basis of age," when the study itself shows discrimination for very specific roles: administrators, chefs, cleaners, restaurant assistants, retail sales assistants, business sales agents and truck drivers.
28
empressplay 2 hours ago 0 replies      
Have a side project in an en vogue "cutting edge" framework or language, and they will care much less about how old you are.
29
Banthum 6 hours ago 1 reply      
>A questionnaire study directed at a selection of employers shows that there are three characteristics that the employers consider to be important and are worried that employees over the age of 40 have begun to lose: the ability to learn new things, being adaptable and flexible and being driven and taking initiative.

Open question - is there any research on the degree to which these three worries are true?

30
TheBobinator 4 hours ago 2 replies      
A few observations.

First off, companies like Cisco, Microsoft, Oracle, and even Google are running entirely off the centralized education system in the US. These are companies that got their software into curricula and taught everyone their way of doing things, then shoved their software and a lot of labor into large organizations, making a gargantuan mess that was glossed over with lots of "free overtime". Compare Cisco CLI to Juniper, or Microsoft to Debian, or MS SQL to Oracle: who is going in whose direction?

Why?

If you invested exclusively in understanding those companies' products, you went along on the technological imperialism trip, and now that you're on the other side, if you never spent time understanding the theory or building critical thinking skills, you're washed up. Twenty years working on a massive Oracle mainframe or purely with Cisco R&S becomes a liability, the reason being that you never tried to find a better way to do things or to eliminate your job and replace it with something better.

There's an honesty in meritocracy: the market has always valued the independent-thinking, hard-working, incredibly knowledgeable IT staff with a tremendous depth of understanding of infrastructure, programming, politics, and equipment over what 95% of the IT market has become. 95% of the people I've worked with expect the solution to be in some arcane Google search result or in a book; they don't expect to go on the journey of finding the answer. What they never develop is real creativity, a real understanding of the systems they work with, or a real understanding of the architecture, why things are done, or the process of how to build on themselves; to set a path for themselves and others that eventually brings about a finished product.

The entire IT industry is maturing and getting older, and as it does, older staff who haven't done this are viewed as a liability. I'll agree, there are all kinds of ways to try to hire gullible people who don't know their own self-worth. Fact is, though, those kinds of companies are on a long-term death spiral of their own making. Every time a large corp outsources, I go look at the 10-K and I see a major cash flow problem of management's making. "The old cranky sysadmin way" is beginning to take hold at more and more companies, and that will trickle into academia as time goes on, as management begins to understand what technological imperialism means and what the results are; generally, a total mess.

It's very controversial to say these things, because it makes a lot of people who aren't that good, or who invested their time in the wrong things, feel like they are doomed. Fact is, there's no set career path in IT like there is in fields such as law, stockbroking, research science and academia.

The trick I've discovered is to put in no more than 40hrs a week at work, and if overtime is needed, come home and practice, do architecture work, learn algorithms, make good notes, read programming and architecture and project management books. 40hrs a week is for work, 10-20hrs a week is for self-betterment. Then you come into work, and find ways to eliminate your job. A new approach that saves butt loads of time. Get your assignments done early, then either come up with a new project to work on, move on, or study. Within a few years of doing this, you will be a top-tier programmer\architect\systems admin, whatever you want to do.

And while I do feel for people who feel they've fallen behind due to having a family, the fact is, from my perspective, the real issues with society are things like 21% of GDP being spent on a scummy healthcare industry, or the high incomes of the top 1%, or the lack of wage-parity tariffs on imports from China. The baby boomers have really messed things up for us. The fact that you can't go from a high-paying IT job to a factory job or retail management position and still have enough money to put your family in a decent home with 3 hots and a cot and to put your kids through school and college is a failure of society in general, not of the IT industry. Those issues need fixing and, frankly, contribute a heck of a lot to our messed-up society.

31
subru 5 hours ago 11 replies      
I will be 40 next year. Humans are expendable; we have to make our own worth. If we don't do that, we end up like I am right now: a year from 40, homeless, broke, in a lot of debt, out of work, failed startup, facing a felony on false accusation.

Life is challenging for me now, and this article directly applies to me. Oh, and there's that thing about being a white male and suicide. Now imagine being a kind of intimidating looking type in a very non white anti trump area.

Tribalism is real. Had I stuck with my tribe early on I'd be more secure. My demise is probable at this time.

I am solid in my desire to self terminate yet lack the ability to overcome fear of death. I stay alive but it's closer than ever. It's almost a humane thing to let me go. I wouldn't wish my brain on anyone.

Good luck all.

32
philovivero 6 hours ago 1 reply      
Cool. Zero comments. This is a bigger problem in tech than a lot of other industries. Welcome to the meritocracy!
26
Trapsleds marc.info
37 points by protomyth  9 hours ago   2 comments top
1
dsjoerg 7 hours ago 1 reply      
Delightfully context-free
27
Observational Learning by Reinforcement Learning arxiv.org
8 points by guiambros  4 hours ago   1 comment top
1
deepnet 1 hour ago 0 replies      
Copying behaviour without divining intent can lead to problems such as 'cargo-culting'.

Imagine observing a man shaking his leg, first one then the other, then his whole body convulses and twitches - is he dancing ?

Absent the knowledge that a wasp has flown up his trousers leg.

Copying without comprehension may lead to getting stung !

Inverse Reinforcement Learning [1] to reverse-engineer goals will be needed, especially for embodied AI in Partially Observed Environments, i.e. the real world (as opposed to simulations).

Berkeley's CS294-112 [2] Deep Reinforcement Learning for Robotics provides good coverage of methods of mirroring, DAGGer, Deep-Q, iLQR, and IRL.

[1] https://people.eecs.berkeley.edu/~pabbeel/cs287-fa12/slides/...

[2] https://www.youtube.com/playlist?list=PLkFD6_40KJIwTmSbCv9OV...
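The wasp example above can be sketched as a toy imitation-learning failure. This is my own illustration, not from the paper: the expert's action depends on a hidden variable the observer never records, so naive cloning copies the action without the reason for it.

```python
from collections import Counter, defaultdict

def expert(visible_state, wasp_in_trousers):
    """Ground-truth behaviour, driven partly by hidden context."""
    if wasp_in_trousers:
        return "shake_leg"
    return "stand_still"

# All demonstrations happen to come from a session where the wasp was
# present, but the observer only records the visible state.
demos = [("music_off", True)] * 3
counts = defaultdict(Counter)
for visible, wasp in demos:
    counts[visible][expert(visible, wasp)] += 1

# Cloned policy: most frequent observed action per visible state.
policy = {s: c.most_common(1)[0][0] for s, c in counts.items()}

print(policy["music_off"])          # clone shakes its leg regardless
print(expert("music_off", False))   # the expert, wasp gone, stands still
```

The clone reproduces "shake_leg" in every future "music_off" state, even though the behaviour only ever made sense because of the wasp; inferring the goal (IRL) rather than the action avoids this.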

28
The secret to a long and healthy life? Eat less bbc.com
273 points by DiabloD3  14 hours ago   143 comments top 29
1
cyrusshepard 11 hours ago 5 replies      
Good overview of the benefits of calorie restriction, but doesn't touch on two of the most important schools of thought of the past couple years.

1. Calorie restriction is almost impossible for most humans to follow long term

2. Many of the same benefits of calorie restriction may be achievable through intermittent fasting, which is much easier for people to follow. I've been experimenting with a daily 16 hour fast (all calories consumed within an 8-hour window) and have seen small improvements in my energy and general well-being, although it's decades too early to say how this will impact my longevity.

Relevant articles:
https://www.nature.com/articles/srep33739
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2622429/
https://www.scientificamerican.com/article/the-hunger-gains-...
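The 16:8 schedule described above amounts to a trivial clock check. A minimal sketch, where the specific window times are my own example, not the commenter's:

```python
from datetime import time

# 16:8 intermittent fasting: all calories inside an 8-hour window,
# fasting for the remaining 16 hours of the day.
EATING_START = time(11, 0)   # assumed first meal
EATING_END = time(19, 0)     # assumed last calories, 8 hours later

def in_eating_window(t: time) -> bool:
    """True if eating is 'allowed' at clock time t under the 16:8 plan."""
    return EATING_START <= t <= EATING_END

print(in_eating_window(time(12, 30)))  # True: lunch fits
print(in_eating_window(time(21, 0)))   # False: the 16-hour fast has begun
```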

2
adambmedia 52 minutes ago 0 replies      
In addition to differences in recommended calorie intake, is there a good, scientific, international comparison that puts dietary guidelines side-by-side: nutrient guidelines, balanced meals?

It's amazingly difficult to find clear information anywhere. For instance, the CDC nutrition guidelines?! They're an example of something that, in trying to be comprehensive, results in a mess of awkwardly qualified terms and difficult-to-digest (couldn't help it) high-level recommendations.

USDA Food Patterns, Healthy US-Style Eating Pattern. https://health.gov/dietaryguidelines/2015/guidelines/appendi...

Meat, poultry and eggs on the same line? They are completely different foods with wildly varying nutrients.

[UPDATED: good start] - Pictorial Nutrition Guidelines: http://www.fremont.k12.ca.us/cms/lib04/CA01000848/Centricity...

- Comparison of International Dietary Guidelines and Food Guides in Twelve Countries across Stages of the Nutrition Transition http://www.fasebj.org/content/29/1_Supplement/898.36.short

3
chiefalchemist 11 hours ago 5 replies      
A few thoughts.

It seems that "calorie restriction" is a misleading term. If these findings are correct, CR should be considered the norm/ideal. That means the majority, currently, over-consume.

That aside, I wonder if it's also related to modern food, and the production thereof. If inflammation is the root of most disease, and CR, I presume, reduces inflammation, then what is it about our foods that triggers so much inflammation?

Finally, when all is said and done, I predict this will be found to connect with gut bacteria in some way. That is, CR affects the gut, and that effect is ultimately a positive for the whole body.

4
stevewillows 11 hours ago 2 replies      
About five years ago I switched to one large meal per day (dinner) with a snack or two. Prior to this I lost nearly 40lbs by shifting my diet toward dominant protein and smaller portions. I also added in a good 30+ minutes of walking per day. Nothing fancy.

I also dropped almost all of my sugar intake. I drink about 3L of water per day and have one cup of coffee in the morning.

I've been able to maintain a healthy weight (6'4" tall, ~185lbs) and I rarely feel truly hungry. On the days where I have more than one large meal, I tend to feel tired and foggy.

It's completely anecdotal, but this pattern has worked out well for me.

The main issue with articles like this is that they propose lame diets that strip out enjoyment. If you want to eat eggs basted in butter, do it -- but balance that out with exercise or by cutting back somewhere else in your diet. No use suffering so you can live a few extra years at the tail end of your life.

5
bonniemuffin 12 hours ago 4 replies      
I checked out the recipes on the iDiet website recommended in the article, and I'm feeling sad and deprived just reading them. I'm willing to suffer for better health, but I'd hate to suffer through a life of egg whites and margarine and then discover in the end that the science was wrong.
6
louprado 12 hours ago 5 replies      
My father has had a low-calorie diet all of his life. At 78, he is in perfect physical and mental condition. His doctor had only one concern: he is slowly losing weight each year and needs to reverse that. The doctor wanted to prescribe an appetite-enhancing drug, but my father explained he cannot simply gain weight by eating more. It's as if his digestive system is already at a maximum threshold.

I wonder if habitual CR inevitably trains the body to stabilize its weight, and if that cannot be undone once practiced too long. That would then put you at increased mortality risk when you start losing body mass in old age. Caged-animal studies might miss this risk because a valued lab rat won't suffer sudden injuries and the resulting prolonged hospital visits.

7
ericjang 13 hours ago 4 replies      
Does that imply that lifting, building up one's body and consuming excess calories to bulk up is somehow stressful for the body's metabolic aging process?
8
TheRealmccoy 1 hour ago 0 replies      
While eating less and having CR and IF is important, equally important is the actual process of eating.

Most of the articles focus on results, but not on the actual process.

If one is eating sitting on the couch, watching television, completely oblivious of the activity, it is not going to help.

There is nothing absolute in nature, everything is connected.

How we eat is far more important than how much we eat.

One simple experiment you can do at dinner is to sit alone with your plate, without any distraction, and for every morsel you take in, chew until you count to 20, then swallow.

Notice what happens...

9
pascalxus 2 hours ago 1 reply      
Due to social conditioning, it's not commonly known, but you don't actually need to eat every day. You could lose 10 lbs in just a couple of weeks of not eating any calories, but I would recommend eating 1 lb of spinach and maybe a combination of other green, red and orange vegetables: that will keep your calorie count under 200 for the day. Drink lots of water.

I know the hunger will drive you crazy in the first few days, but your body gets used to it and it's not so bad, after 3 days.

The first time you do it, perhaps start off with some 16-hour fasts for a few months till you're used to that, then you can increase it.

10
dirkg 6 hours ago 0 replies      
It is pretty ironic and sad that following a healthy diet, including CR, is much harder in Western nations and esp in the US.

- Processed food costs less than fresh food.
- Multiple generations of people have been conditioned to eat processed and junk food.
- People believe in the myth that 'cooking takes too long'.
- People are lazy and don't want to spend any time on cooking/food even when living a life of luxury compared to the rest of the world.

All of which is the opposite of most developing countries. US and UK are very close in this respect. Even many European countries value shopping for and eating fresh food much more.

CR, and eating in general should be about buying nutrient dense but low caloric foods - which is a mostly plant based diet of fresh food. The exact kind of food which is artificially expensive and considered a 'fad'.

11
Vitaly 12 hours ago 5 replies      
I can't find the study right now, but from what I understand it is only really beneficial to animals the size of a mice, as soon as you move to bigger animals the effect is sharply reduced, and experiments with monkeys (or was it even primates? I don't remember) didn't find any real life prolonging effect. So you might be subjecting yourself to not enjoying food for no real benefit after all.
12
rubicon33 12 hours ago 2 replies      
Most articles on eating tend to focus on weight, drawing the conclusion that weight (fat) is the problem, not eating. This article provides a refreshing perspective:

> But the latest results suggested that significant health benefits can be garnered in an already healthy body: a person who isn't underweight or obese.

The take away here is that the problem with excessive eating isn't just the weight gain. There's something about eating itself which is stressful. Restricting calories is good for you even if you're already within a healthy BMI. This means that just because you can eat anything, doesn't mean you should.

13
pavement 5 hours ago 0 replies      
One detail that isn't explicitly stated, but seems like a bit of a blind spot, is whether there's an age-appropriate aspect to this behavior.

Do non-growing adults (30+ years old) enjoy greater benefits than small children who have faster metabolisms and growth spurts ahead of them?

My intuition says that even if the concept still fits growing children, their cycles are different, and closer together in frequency. Maybe they don't eat more calories, but eating several times a day might tie into cycles of blood serum nutrient levels based on weight and activity.

14
Mikeb85 13 hours ago 2 replies      
Doesn't really seem like calorie restriction. The monkeys who died younger would have definitely been considered unhealthy and gluttonous.

And it's no secret that being overweight causes premature aging, health problems and eventually premature death.

15
mrb 11 hours ago 3 replies      
I wonder if Americans' declining life expectancy is due to them eating more and more.

https://www.washingtonpost.com/national/health-science/us-li...

16
jksmith 12 hours ago 0 replies      
Drift: Maybe a longer life of great experiences is better than a shorter life of great experiences, but a longer life at the price of greater experiences? Nah, hope I die before I get old. Case in point: http://io9.gizmodo.com/men-can-live-20-years-longer-but-ther...
17
notadoc 9 hours ago 0 replies      
Does this surprise anyone? Humans did not spend millions of years eating 6000 calorie hyper refined lab perfected highly processed meals three times per day. We did evolve foraging, hunting, gathering, and eventually and much more recently farming, all of which would naturally include a ton of physical activity and occasional bouts of not eating if a meal was yet unavailable, which we now call "fasting" as if it is some harsh or deprivational activity.
18
Area12 8 hours ago 0 replies      
Investigation into calorie restriction isn't at all new. Roy Walford wrote five books about it, dating back to 1983.

https://en.wikipedia.org/wiki/Roy_Walford

19
jimjimjim 10 hours ago 0 replies      
anecdotal datum: A few years ago I lost about 40 pounds (230->190ish) in a little over a year by limiting the amount of food and not caring about the type of food.

steak, sandwiches, cheese, potatoes, all good. salad, bananas, cereal, all good.

I had to make sure to look at the nutritional info: if a food was small but energy dense, I had to eat less of it.
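The energy-density point can be made concrete. A small sketch, where the example foods and kcal-per-100g figures are rough illustrative values of my own, not the commenter's:

```python
# Approximate energy densities, kcal per 100 g (illustrative values).
foods_kcal_per_100g = {
    "spinach": 23,
    "banana": 89,
    "bread": 265,
    "cheese": 402,
}

budget_kcal = 200
for food, density in foods_kcal_per_100g.items():
    # Grams of this food that fit in the calorie budget.
    grams = 100 * budget_kcal / density
    print(f"{food}: ~{grams:.0f} g for {budget_kcal} kcal")
# spinach gives ~870 g for the budget, cheese only ~50 g: the denser
# the food, the smaller the portion for the same calories.
```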

20
m3kw9 5 hours ago 0 replies      
Maybe the more you eat, the more chance you eat crap that is adverse to your health, hence the variations.
21
tammer 10 hours ago 0 replies      
I eat a protein & fat heavy breakfast & dinner, & skip "lunch". I find it to be highly effective at keeping me productive through the day (both more time & more steady energy cycle).
22
lazyjones 12 hours ago 2 replies      
Perhaps eating is just one of the most common ways of getting dangerous toxins into our body?
23
Theodores 12 hours ago 0 replies      
Assume that you had a garden with acres of produce all perfectly grown and in season. If you wanted orange juice all you would need to do is walk out into this garden, pick some oranges, bring them back to the kitchen and manually squeeze them into a glass.

Now imagine if you wanted to have a bit of toast, maybe with some chocolate-hazelnut spread. All you would need to do is walk into the garden, grab some hazelnuts off the tree and some cocoa beans, perhaps with some cane sugar for good measure. Some blender in the kitchen would be able to make your hazelnut spread, just so long as you shelled the nuts first. Similarly with the bread, in this garden, free for the taking would be some strong wheat that you can put through some kitchen appliance, then after a few hours with the breadmaking machine the bread would be good to eat. Butter would be equally simple too, you just needed to milk a cow, put the milk in some glorified washing machine with a bit of salt, then wait a while to get the freshest butter ever tasted.

Would you be able to complete all of these tasks by breakfast? Would you really bother to shell all of those hazelnuts? Would you question why it is that you have juiced 14 oranges when actually just the one orange, non-juiced was satisfying enough and didn't require all that effort dicking about with juicing?

Of course people have allotments and smallholdings so this does happen, albeit with greenhouses instead of some magic 'always in season' aspect. But my office workmates of the obese variety have no idea of the effort needed to get their food, even if it is healthy food. The connection is not there.

One of my dreams is to rock up at the doctor's one day having eaten too much fruit and veg, and for the doctor to recommend I stay off the veggies and eat some sugary snacks instead. I am fairly sure that no amount of fruit and veg would age me, not in the way that sugary snacks, beef products and everything processed would.

24
tsao 10 hours ago 0 replies      
What if you're already eating less? It should say: eat the right amount.
25
doener 11 hours ago 0 replies      
If you haven't yet, watch this documentary on the topic: https://www.youtube.com/watch?v=Ihhj_VSKiTs
26
reasonattlm 12 hours ago 0 replies      
Some references for those who want to dig in.

Will calorie restriction work in humans?
http://dx.doi.org/10.18632/aging.100581

Caloric restriction improves health and survival of rhesus monkeys
https://dx.doi.org/10.1038/ncomms14063

In general, the consensus in the research community is that we shouldn't expect more than an additional ~5 years from the life-long practice of calorie restriction. The evolutionary argument is that the calorie restriction response evolved very early on as a way to enhance fitness given seasonal famines. A season is a long time for a mouse, not so long for a human, so only the mouse evolves a very plastic lifespan. The practical argument is that 5-10 years is about the largest effect that could exist and still be hard to pull out from existing demographic data of restricted calorie intake in a bulletproof, rigorous way. Obviously any much larger effect would have been discovered in antiquity and be very well known and characterized by now.

So that said about longevity, it is very clear that calorie restriction does better and more reliable things for health in ordinary humans in the short term of months and mid-term of few years than any presently available enhancement technology can replicate.

A good deal of research into aging is focused on trying to recreate the calorie restriction response. So far this has consumed billions with little of practical use to show for it beyond increased knowledge of some thin slices of cellular biochemistry relating to nutrient sensing and energy metabolism. It has proven to be very hard and very expensive to get anywhere here.

So calorie restriction itself is free and reliable in its effects. Everyone should give it a try. There are, however, far more important areas of aging research to direct funding to instead of trying to recreate this effect with pharmaceuticals. In an age in which meaningful rejuvenation is possible to create in the years ahead (see, for example, clearance of senescent cells, something that calorie restriction can only slightly achieve in a very tiny way, while drug candidates are managing 25-50% clearance) it seems just plain dumb to instead be chasing expensive, hard ways to only slightly slow down aging.

27
dom0 13 hours ago 0 replies      
A remarkably click-baity food "science" headline, especially for the BBC.

Edit: "Permanently [...] may turn out to have a profound effect on your future life, according to [...] scientific studies."

And that's a Bingo, folks.

28
caub 13 hours ago 2 replies      
29
pors 11 hours ago 4 replies      
There is lots of proof now that obesity is not caused by "more calories in than calories out"; it is caused by hormonal reactions to what you eat (esp. insulin). Just read the books by Gary Taubes for zillions of details and references to studies.

So the "eat less" claim in this article is again back to the old MD advice, "eat less and move more". Not going to work if you don't look at your macronutrients.

29
Tumblr Heads-up for AT&T customers zendesk.com
42 points by dhotson  3 hours ago   9 comments top 5
1
mintplant 3 minutes ago 0 replies      
Does anyone know if this will affect email access to the listed domains? My parents use an @bellsouth.net address as their primary email.
2
adambatkin 3 hours ago 2 replies      
Is this designed to be a test of (lack of) net neutrality or something?

Verizon now owns Tumblr. AT&T e-mail addresses will suddenly be blacklisted from logging in to Tumblr.

Maybe I am reading too much into this?

Edit: According to TechCrunch (https://techcrunch.com/2017/06/25/take-the-oath/) it's all cool. Except that the explanation still makes no sense. It sounds like what they are saying is that (for example) an att.net e-mail address won't be a Yahoo account anymore, which I assume means that Yahoo won't be hosting their e-mail or something like that. But why wouldn't those addresses become the "username" on Yahoo logins, in the same way that any e-mail address can sign up as a Google account, even though the e-mail itself is hosted elsewhere (i.e. my "username" for Google could be a non-Google-hosted e-mail address, but my e-mail itself has nothing to do with Google - i.e. it's not a GMail account).

3
dzonga 1 hour ago 0 replies      
Is that legal ?
4
rapidstuff 3 hours ago 0 replies      
Verizon flexing its muscles?
5
65827 2 hours ago 2 replies      
What the fuck happened to the internet?
30
Why did AirAsia fly a crippled jet away from a nearby airport yesterday? crikey.com.au
49 points by CPAhem  5 hours ago   40 comments top 5
1
sitharus 3 hours ago 3 replies      
At a guess, because the maximum landing weight of an A330 is 187,000 kg but the maximum takeoff weight is 242,000 kg. They'd have to burn off fuel before landing or risk crushing the landing gear.

The A330 doesn't have fuel dumping nozzles as standard equipment so that may not have been an option.

Edit:

From a slightly dodgy source http://www.pprune.org/tech-log/117765-a330-fuel-consumption.... they'd burn 6 tonnes on takeoff, another ~7 in the 1.5 hours before the engine failed, and approximately another 7 before landing, but I don't know what the consumption would be like for a single-engine return.

That gives 20 tonnes total fuel burn. At those figures the 5:40 flight would take 41.5 tonnes of fuel, plus a bit extra.
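Those figures can be sanity-checked in a few lines. This is only a back-of-envelope sketch using the numbers cited above; that the jet actually departed near maximum takeoff weight is an assumption, not a fact from the incident.

```python
# A330 limits as cited in the comment, in kg.
MAX_TAKEOFF_KG = 242_000
MAX_LANDING_KG = 187_000

# Cited burn estimates, in tonnes.
takeoff_burn = 6
cruise_burn_before_failure = 7   # ~1.5 h of cruise
burn_before_landing = 7

total_burn_t = takeoff_burn + cruise_burn_before_failure + burn_before_landing
print(total_burn_t)  # 20 tonnes, matching the comment's total

# If the jet left near max takeoff weight, its weight after that burn:
weight_after_burn = MAX_TAKEOFF_KG - total_burn_t * 1000
print(weight_after_burn)                    # 222000 kg
print(weight_after_burn > MAX_LANDING_KG)   # True: still over max landing weight
```

Under that assumption the aircraft would still be about 35 tonnes over its maximum landing weight after 20 tonnes of burn, which is consistent with needing to burn fuel before landing.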

2
uhnuhnuhn 2 hours ago 0 replies      
It's too early to tell what factors went into the decision to divert to Perth. Aviation is incredibly complex and armchair speculation by people outside the industry is almost always going to get it wrong.

As per usual, more well-founded technical discussion on this incident is ongoing over at AV Herald: http://avherald.com/h?comment=4aac9f14&opt=0

3
jw2k 3 hours ago 3 replies      
Al Jazeera has a great exposé on the issues of quality and supply/demand for pilots in Asia. https://www.youtube.com/watch?v=qSZ-R5HdPQU

It's truly frightening, and makes me think twice when any of these airlines come up when I'm searching for flight deals.

4
ithinkinstereo 3 hours ago 2 replies      
The captain apparently informed the passengers he was praying for their survival and asked that they pray too[1]. This tells you everything you need to know about the quality of AirAsia pilots.

Anytime I fly in that area of the world, I make it a point to avoid Malaysian and Indonesian airlines at all costs. Fly Cathay or SingAir and their affiliates if you can. The cost premium is well worth it.

1)http://www.smh.com.au/business/aviation/a-boom-in-midair--th...

5
anatari 3 hours ago 1 reply      
Why didn't they turn off the engine to stop the vibration? To a layman that would seem safer than having an engine violently shake the wing!
       cached 26 June 2017 07:02:01 GMT