hacker news with inline top comments    25 Jun 2017
1
Turn any link into a suspicious-looking one verylegit.link
195 points by defaultnamehere  7 hours ago   53 comments top 15
1
rjbrock 6 hours ago 0 replies      
A similar site has been around for a long time: http://www.shadyurl.com/

example: google.com -> http://www.5z8.info/dogs-being-eaten_x2r3rq_5waystokillwitha...

2
OJFord 1 hour ago 0 replies      
If I were trying to send someone to my nefarious website, I'd definitely now wrap the link in this, so that the savvy viewer would think it's a harmless verylegit.link...
3
nandhp 6 hours ago 3 replies      
Is there any way to get SSL error messages in Firefox?

https://irc.verylegit.link/0x8c*download()194mobiads(windows... is supposed to redirect to Facebook, and it does if you use HTTP. However, over HTTPS Firefox just gives me a very generic "Secure Connection Failed" message. (Chrome is rather more helpful, giving me "ERR_CONNECTION_CLOSED".)

4
BenjiWiebe 6 hours ago 1 reply      
The suffixes should be exe, com, js, hta, vbs, and so on, for extra evilness.
5
dingo_bat 3 hours ago 0 replies      
Unable to open in Edge: http://imgur.com/a/nBAne
6
alexdrans 1 hour ago 0 replies      
Hi, would you please consider parameterising the input in the URL so that I can use it with Chrome's Omnibar?
7
acbabis 6 hours ago 0 replies      
This is neat. I'll make sure to use it whenever I post something here or on Reddit. Great work
8
SimeVidas 6 hours ago 1 reply      
I get "Secure Connection Failed" in Firefox Nightly when clicking on the demo link.
9
qume 6 hours ago 0 replies      
Woz would love this, I hope he gets to see it
10
pmiller2 3 hours ago 0 replies      
Mods, thanks for changing the title. It was screwing with the layout on mobile.
11
Mayzie 6 hours ago 0 replies      
Doesn't work for any HTTPS site.
12
logicallee 4 hours ago 1 reply      
This:

 secure.verylegit.link/warez737speedupurpc.gif.pdf
(example from site) doesn't look dodgy to me at all.

I'd have no qualms clicking on it, because my browser and I can handle suspicious websites. (Especially ones ending in .pdf.)

Something that would give pause would be:

https://tinyurl.com/2ea2mu4?command=127.0.0.1/activate

I would think...wait a minute... I probably wouldn't click this example.

13
dredmorbius 6 hours ago 3 replies      
I've DNS blackholed the entire .link TLD, along with .science, .country, .click, and .rocks.

So, there's that.

(DNSMasq, router-based blocklist.)

14
Markoff 4 hours ago 1 reply      
what is the purpose? to confuse the average user so he'll click on genuine shady links?

seems dumb, on the same level as putting bloodied bodies next to the road to check if people stop

15
m0atz 4 hours ago 1 reply      
How is this top of hacker news???
2
Localizing Papers, Please dukope.tumblr.com
165 points by wglass  9 hours ago   88 comments top 10
1
teej 7 hours ago 4 replies      
For those who aren't aware: Papers, Please is a video game about paperwork. You play a border agent in a fictional eastern bloc country, checking passports, visas, and work permits. It's surprising and tense and incredibly good. It's currently on sale for $3.99 on Steam, I highly recommend it.
2
cbanek 4 hours ago 1 reply      
One other interesting problem with localization involves the use of printf. Even if you're looking up strings by ID in another file (which is a good pattern), sometimes you'll need to move things around based on language. For example, for a right-to-left language you might put the number before the string rather than after it, and the other way around for left-to-right languages (so "%d %s" vs. "%s %d").

The way we got around this was to add another level of indirection and treat the printf format strings themselves as localized data.
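
A minimal sketch of that indirection, in Python rather than C; the table contents and locale names are invented for illustration:

  # Format strings live in the translation data, so each locale can
  # reorder the arguments (the 'xx' locale is hypothetical).
  MESSAGES = {
      'en': {'ITEM_COUNT': '{count} {noun}'},
      'xx': {'ITEM_COUNT': '{noun} {count}'},  # number after the string
  }

  def render(locale, msg_id, **args):
      # Look up the localized format string by ID, then fill in arguments.
      return MESSAGES[locale][msg_id].format(**args)

  print(render('en', 'ITEM_COUNT', count=3, noun='coins'))  # 3 coins
  print(render('xx', 'ITEM_COUNT', count=3, noun='coins'))  # coins 3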

3
tschwimmer 6 hours ago 1 reply      
Awesome article. I'm always impressed by the lengths people will go to for their passion. Lucas talks about ultimately having to hand-draw Cyrillic versions of _each_ of the game's ten fonts. Very cool!
4
unsigner 3 hours ago 4 replies      
Don't ever use the original string as the key in the localization table. That will force you to translate "high" difficulty the same as "high" resolution, for example.
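
A tiny sketch of the keyed approach; the IDs and the German translations here are illustrative:

  # Distinct keys let the same English word translate differently.
  STRINGS = {
      'de': {
          'DIFFICULTY_HIGH': 'Schwer',  # "hard"
          'RESOLUTION_HIGH': 'Hoch',    # "high"
      },
  }

  def tr(lang, key):
      return STRINGS[lang][key]

  print(tr('de', 'DIFFICULTY_HIGH'), tr('de', 'RESOLUTION_HIGH'))
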
5
mproud 7 hours ago 0 replies      
This should be amended as (2014).
6
Animats 3 hours ago 0 replies      
If you haven't seen the trailer, it's worth watching.[1]

Glory to Arstotzka!

[1] https://www.youtube.com/watch?v=_QP5X6fcukM

7
surgi 3 hours ago 0 replies      
Loosely related to the title: why not create a complete modular version, not only localised but also tied to individual countries' flows and processes? It could serve as educational material. (mind:blown)
8
haikuginger 6 hours ago 0 replies      
This article makes me unreasonably glad to be working in a framework (Django) with good i18n tooling and few special needs re: textual images.
9
eropple 7 hours ago 4 replies      
I would recommend against rolling your own XML format and doubly against CSV or some homegrown delimited format. Instead, consider something like Excel 2003 XML (one of the easier ones), OpenDocument (also pretty easy in many languages), or Office Open XML (easy in .NET, a bit harder elsewhere) to store your translation data.

Potfiles are another option, but the tooling is pretty clunky and, in games in particular, people don't seem particularly attuned to their use. And they're not great for editing, though they might be for storage--when dealing with tabular stuff, it just makes a lot of sense to use tools that present a tabular interface. It makes life a lot easier.

10
rasmafazi 7 hours ago 4 replies      
Sometimes you just have to bite the bullet. For interesting subjects, which always have global reach, the virtual conversations are conducted in English. There is also a place for the vernacular -- it is part of people's cultural identity -- but not in a formal knowledge setting. English is a bit like Latin used to be: the language of knowledge, technology, and business. If the subject has global reach, you will miss out on the interesting bits of knowledge simply because you are trying to do it in the vernacular. Doing anything in the vernacular will just lock you up in a small and uninteresting national silo. Nothing of any interest is national. But yes, I use the vernacular. I also speak it with my kids, but I don't read it -- unless it is poetry or literature -- and I don't use it in software or in business.
3
AWS Security Primer cloudonaut.io
137 points by vinnyglennon  9 hours ago   12 comments top 6
1
briansteffens 5 hours ago 1 reply      
I wonder at what point the complexity of interacting security systems hits diminishing returns. Not trying to be negative; I'm a fan of how much power AWS gives you. But seeing how many systems have interacting security implications laid out in a graph like that makes me curious how far you can take it before it becomes difficult to reason about. Maybe the systems are sufficiently isolated and well defined that it's not even an issue.
2
TheAceOfHearts 44 minutes ago 0 replies      
I've always found AWS's security system pretty confusing, so I'm a big fan of this primer. There's so much stuff that it can be difficult to even know where to begin. From my limited experience, Google Cloud Platform seems to be much easier to set up.

What I'd really love to see is an end-to-end example of a non-trivial production-ready project, with all its nitty-gritty details. I'd expect that having a sensible baseline you could look to for general guidance would help improve security and reduce risk.

3
racl101 5 hours ago 2 replies      
AWS is so f-ing confusing sometimes.
4
bbayles 6 hours ago 0 replies      
Excellent! Pleased to see VPC Flow Logs included; they are underrated as a security tool and one big advantage AWS has over other providers.

At work I co-develop an open source Python library for reading VPC Flow Logs - it can be an easy way to get started analyzing them for security: https://github.com/obsrvbl/flowlogs-reader
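
A rough usage sketch: the log-group name is a placeholder, and the record attribute names shown are my assumption of how the library exposes the standard flow-log fields (see the project README for the exact API):

  from flowlogs_reader import FlowLogsReader

  # Iterate over VPC Flow Log records and flag rejected connections.
  # 'flowlog_group' is a placeholder CloudWatch Logs group name.
  for record in FlowLogsReader('flowlog_group'):
      if record.action == 'REJECT':
          print(record.srcaddr, '->', record.dstaddr, record.dstport)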

5
jdubs 7 hours ago 1 reply      
Great write-up. I knew there was a lot, but visualizing it really puts it into perspective, especially for the more niche services like Cognito and IoT.

My current job has about 14 different AWS accounts; a few are prod, some are lab, and others are meta accounts. I've been thinking about having a dedicated account just for security-related stuff. I see the value in collecting CloudTrail, Config, and other data, but I'm not 100% sure it's worth the effort to get it set up right now. Thoughts?

6
jajavisst 3 hours ago 2 replies      
Slightly off-topic: the mind map looks pretty cool. Does somebody know which technology he used for that?
4
Common Errors in Undergraduate Mathematics vanderbilt.edu
25 points by vinchuco  4 hours ago   1 comment top
1
okket 1 hour ago 0 replies      
Previous discussion: https://news.ycombinator.com/item?id=8181101 (3 years ago, 110 comments)
5
Pencil an open-source GUI prototyping tool for all platforms github.com
143 points by PleaseHelpMe  9 hours ago   12 comments top 6
1
v3ss0n 3 hours ago 0 replies      
I've been using it and thinking about contributing for a long time. The team did really ambitious work dumping XUL and rewriting it in JavaScript + HTML. I was one of the first users of the new version once they hosted it on GitLab; I was astonished that they finished the conversion from XUL in just around a month, and it has been working ever since.
2
cmrx64 3 hours ago 0 replies      
I've been an occasional user of Pencil for 5 years (almost to the month). Really glad to see it moving away from XUL.
3
dflock 5 hours ago 0 replies      
Been using this for a while in its various incarnations - this one is definitely the best so far.

Very useful tool for UI prototyping.

4
sametmax 1 hour ago 0 replies      
Very unstable on Linux, unfortunately.
5
zitterbewegung 6 hours ago 1 reply      
Wow, looks better than Balsamiq. I will try this out at work.
6
jasonkostempski 5 hours ago 3 replies      
http://www.fiftythree.com/pencil
https://www.apple.com/apple-pencil/

Kind of already an established name in the Apple world.

7
Who Americans spend their time with theatlas.com
10 points by hunglee2  2 hours ago   1 comment top
1
examancer 13 minutes ago 0 replies      
These are some of the worst-designed charts. I had to download the CSV to figure out that the x-axis is age.

Once I figured it out, the bottom-right chart was chilling.

8
Wikimedia Foundation v. NSA: What Now? wikimedia.org
106 points by madmax108  11 hours ago   39 comments top 4
1
jokoon 1 hour ago 1 reply      
I'm curious, can the government use surveillance on paper mail at the large mail operators?
2
dmurawsky 6 hours ago 1 reply      
Great article, thanks for doing what you do ;)
3
Markoff 4 hours ago 5 replies      
what is there to surveil about wikimedia? aren't all articles public, and even the IPs of non-registered users?

this seems like something to justify showing those donation requests in your face, despite Wikipedia drowning in money: https://www.washingtonpost.com/news/the-intersect/wp/2015/12...

4
asher 3 hours ago 8 replies      
Seems like it's trendy to hate the NSA. It gets conflated with an anti-authoritarian mindset. I wish smart people would gain some perspective - I got some by reading Bamford's books and a new one by Fred Kaplan, Dark Territory, about the NSA's painful move to cyber. Some key points:

* All the great powers have NSA equivalents, meaning they play offense and defense in crypto, RF, and cyber. We (USA) can impose restrictions on our NSA but not on anyone else's. Our exploit-riddled networks are a playground for American, Russian, and Chinese cyber warriors - and probably many others.

* In cyber, offense and defense become the same. Kaplan's book covers this. So a smart country seeks cyber-superiority. The more we hamper NSA, the more we empower foreign cyber-warriors.

* The focus has moved from RF to cyber. Giant antennas are far less important and giant datacenters are the new stars. Vacuuming up packets is less alarming when you understand we've been vacuuming up radio and telephone signals for decades. When comsats were important, NSA was vacuuming up their downlinks. When international telegrams were punched on paper tape, NSA's predecessors picked up the tape each day.

* The US has tried going "NSA-less". It happened in 1929 under the slogan "Gentlemen do not read each other's mail". That noble slogan led to the US operating at a disadvantage in the lead up to WWII. It doesn't pay to fly blind.

* Fear of an overreaching state is always justified; however we should focus that fear more on how NSA shares data than how it acquires it. For instance fusion centers: https://www.eff.org/deeplinks/2014/04/why-fusion-centers-mat...

9
Generate your own sounds with NSynth tensorflow.org
126 points by pkmital  13 hours ago   24 comments top 7
1
svantana 12 hours ago 4 replies      
I'm sorry, but is the Deep Learning hype strong enough to warp people's sensory perception? Every sample on this page sounds terrible IMHO, and is pretty much what you would get if you spent 10 minutes implementing the most naive spectrogram resynthesis you could think of. Granted, there is great promise in finding the "manifold of music", which seems to be the goal here, but what they show is just not anywhere near that promise.
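
For the curious, a naive spectrogram resynthesis of the kind described really is only a few lines: keep the STFT magnitudes, discard the phase, and invert with random phase. The input below is a stand-in signal, not NSynth audio:

  import numpy as np
  from scipy.signal import stft, istft

  fs = 16000
  x = np.random.randn(fs)  # one second of stand-in audio

  # Analyze, throw away the phase, resynthesize with random phase.
  _, _, Z = stft(x, fs=fs)
  phase = np.exp(2j * np.pi * np.random.rand(*Z.shape))
  _, x_hat = istft(np.abs(Z) * phase, fs=fs)
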
2
asher 8 hours ago 0 replies      
This story reminded me to clean up a very different synth and put it on github:

https://github.com/wildsparx/synthem80

Unlike NSynth, synthem80 is directed to a specific and humble goal - make early 80s-style arcade sounds. It uses a mini-language to control an engine similar to that in Pacman.

For instance, the sound when Pacman eats a ghost:

 ./synthem80 -o eat-monster.sw 'timer(t=0.5) wav(wf=1 f=2 a=45) add(a=< b=48) nmc(wf=5 f=<)'

3
6stringmerc 10 hours ago 1 reply      
Eh, I've seen this submitted before; I totally agree with the early criticism here because it's the same as last time.

Woo hoo you built a noise maker! Kazoos for everybody!

4
the_cat_kittles 10 hours ago 0 replies      
I feel stupid and don't get what this is all about. So there is something that synthesizes sounds by feeding it audio files? I don't get what is happening here. I tried semi-hard to understand, but I figure someone can give the big picture that I think I'm missing.
5
sowbug 12 hours ago 2 replies      
Could this approach be used for media compression? I've wondered how compressible a popular-music track could be if you had a sufficiently rich language to describe it. This seems like a method to answer that question.
6
sebringj 12 hours ago 0 replies      
I'm just starting to learn TensorFlow from a developer (non-data-scientist) view. This is great. From a layman's view, it seems it needs a training session for eliminating noise or static.
7
funkychicken 7 hours ago 0 replies      
Apple missed a golden Objective-C opportunity here.
10
Sans Bullshit Sans: leveraging the synergy of ligatures (2015) pixelambacht.nl
175 points by JoshTriplett  15 hours ago   17 comments top 7
1
tribby 12 hours ago 0 replies      
This is great. At the last Stupid Hackathon in SF I worked on a font for cats with over 7 million ligatures (every fourth letter, it turns what you just typed into "meow").[1] It was tricky to figure out how to push ranges and named lookups to the limit, and there doesn't seem to be any getting around the speed issue, but there's a lot of underlying font tech to have fun with :)

1. https://github.com/jpt/meow

2
grenoire 12 hours ago 3 replies      
Love it! Also, "Fun fact: the well known ampersand, &, was originally a ligature for 'et', meaning 'and' in Latin" completely amazed me. I never knew.
3
keithpeter 12 hours ago 1 reply      
Nice teaching framework: I've painlessly absorbed a little bit of information about how fonts work while going along with the joke.
4
jancsika 8 hours ago 0 replies      
Immediately looked to see whether the substitution language is Turing complete. Luckily, it appears not.

Seems like you could do some very quirky animation using ligatures. For example-- imagine that the ligature for "Im" tilts the "I" toward the "m". Then the ligature for "Imm" tilts the "I" further, and so on, until I type the complete word "Immediately" and the "I" has drifted down below the "y" character.

Repeat this drift for each subsequent letter typed, slowly increasing the amount of displacement for the letters in the ligatures. Then, by the time I've finished typing this comment, all the letters would end up piled up in the corner of this textarea. :)

5
Sephr 11 hours ago 0 replies      
I was experimenting with this concept in 2012, and I think this is still relevant today. I wanted to make a web font with ligatures that replaced popular company names with their actual company logos.

It ended up being way too much work, and it isn't monetizable due to the copyrighted nature of many logos.

6
grimgrin 8 hours ago 1 reply      
Here's a font with programming related ligatures:

https://github.com/tonsky/FiraCode/blob/master/README.md

7
chris_wot 6 hours ago 0 replies      
Behdad Esfahbod is, quite frankly, a font and script genius. When I was delving into understanding LibreOffice's font and layout mess (which is getting better, btw) I kept seeing his name and work in HarfBuzz.

The world, I feel, owes this man a great debt and I fear he is one of the great unsung open source heroes we seem to hear so little about!

11
The slow death of the electric guitar washingtonpost.com
70 points by paladin314159  11 hours ago   92 comments top 32
1
rwmj 1 minute ago 0 replies      
Put simply, guitars have a terrible user interface. I have "taught" Korg Kaossilator to a dozen people and they have all picked it up and started making not-terrible electronic dance music within an hour of first using it.
2
d--b 3 hours ago 2 replies      
I think it's a bit naive to believe that guitar heroes can bring interest in the guitar back. People look back at the guitar-hero era with a nostalgic fondness for a time that doesn't exist anymore. This was already visible in Wayne's World and became more than obvious in School of Rock. Guitar-heavy rock has become a geek genre. Pop has moved on.

There are obviously various reasons for this:

 1. Technologically, the instrument ran its course. While the tech was still innovative in the 90s with the new digital effects, it hasn't changed much in 20 years.

 2. Electronic music, however, has brought in a lot of new sounds that gave pop music a fresh start in the 2000s.

 3. People seem to be more interested in two things: dancing and lyrics. You used to be able to write a song with totally unintelligible lyrics and a good solo and have a hit. I don't think that's true anymore. Similarly, a solo kind of ruins the dancing.

I think die-hard rock fans need to get over it; guitar heroes are not coming back any time soon. I don't think it means the electric guitar is going to die. It still is an amazingly cool instrument. But it means that your average kid may not want to try to play that Ten Years After intro anymore.

3
cyberferret 3 hours ago 3 replies      
> "John Mayer?" he asks. "You don't see a bunch of kids emulating John Mayer and listening to him and wanting to pick up a guitar because of him."

Sorry, but my 17 year old son was so inspired by Mayer about 5 years ago that he invested a LOT of time learning how to play guitar and sing like him, and other artists with similar styles. [0]

He is now building quite a steady music career even while finishing high school (he was booked for 3 gigs just this weekend).

He is also interested in past guitar heroes such as Eddie Van Halen, Mark Knopfler, Angus Young, Andy Summers etc. and spends a lot of time going through 'older' stuff to learn more.

While he has a lot of natural ability, there is no arguing that it takes a LOT of hard work. He practices for a minimum of 2 hours a day - sometimes even up to 4 or 5 hours, not counting gigging time. We often have to call him away from his guitar to do school work or eat.

I envy the time he lives in though - I started playing when I was 15, back in the early 80's and it was really difficult to find decent gear, and the only way to learn anything new was to try and figure it out by ear or find someone else who knew to teach you. Nowadays, the proliferation of Youtube and other online learning resources, the huge selection of reasonably priced gear, and things like software and hardware modelling amps mean players can dial in ANY sound they want under any situation. Unheard of in my time.

It just needs kids who are interested enough to turn it into their passion.

[0] - https://www.youtube.com/channel/UCJK-R3HGG09uGBRDs7fhpZw

4
rhapsodic 4 hours ago 1 reply      
It's ironic that young people are losing interest in the guitar at a time when there is an amazingly enormous amount of resources freely available for learning it.

When I started to learn guitar several decades ago, I would learn guitar solos off of records by slowing them down to 16 rpm (old turntables could do that) and moving the needle back repeatedly to listen to tricky phrases over and over again. It was frustrating, time consuming, and hell on the records.

Today, for just about any popular and many obscure guitar-oriented songs, you can find a Youtube video where someone breaks it down note by note and chord by chord. There are all kinds of resources online for learning scales and theory and online communities where an aspiring guitarist can connect with thousands of other like-minded people.

I would like to see guitar-oriented rock and roll make a comeback. The heavy metal subculture is thriving without any mainstream radio airplay to speak of, but aside from that, there's just not that much going on.

If I see a local rock band play in a bar these days, about 80% of the time it will be all middle-aged men who have been playing for decades. Some of them are even retirement age.

5
davnicwil 1 hour ago 2 replies      
I think there are strong parallels with the tech/startup world here: the often-touted quote "The next Bill Gates won't build an operating system, the next Mark Zuckerberg won't build a social network" comes to mind.

The next Miles Davis won't play the horn, and the next Jimi Hendrix won't play guitar. There will always be jazz, there will always be rock n roll, but the level of interest in those styles, particularly amongst young musicians, will slide inevitably towards the niche as the next innovative style comes along.

The world keeps turning. This is a great thing for music.

6
agumonkey 28 minutes ago 0 replies      
Allan Holdsworth, one of the most innovative guitar players, passed away not long ago.

If you feel like discovering a new way around an instrument, and even music in general, and are not repelled by 70s/80s synth feel, enjoy youtubing his name.

https://www.youtube.com/watch?v=OXSd-WyrtfA

The man was an extraordinary among extraordinaries. The guitarist's guitarist, as they say.

Besides music, the notion of culture itself has changed; it's palpable. The previous era was inspired a lot by music; today the passion has shifted, at least as a mainstream thing. It's an industry in maintenance mode. Young people may not be thrilled to become guitar players, but in a way guitar heroes aren't that interesting anymore. The instrument's value in itself has not decreased.

7
kristofferR 45 minutes ago 0 replies      
The article didn't even attempt to answer the question it asked in its intro: why I should care about a certain thing fading in popularity past its glory days.

Times change, and that's a great thing, yet people will always complain. It has happened to things I really loved too. That's just life, I guess, but I've realized that being upset at cultural change is just a self-destructive thought pattern.

The piano is still around, just like the electric guitar will be in the future.

8
quadrangle 3 hours ago 0 replies      
When you peak as the most popular instrument in the world and get tons of obsessive people to collect/hoard instruments, there's only one way to go from that peak. Thinking the downturn is death is worthless hype.
9
elihu 2 hours ago 0 replies      
My opinion is that it's a good time for electric guitar buyers, because factory-made guitars are pretty good and relatively inexpensive. They also last a long time, which means there's a lot of great used guitars out there.

I don't think there will be any major technological advances that make factory-made guitars significantly cheaper and better than they are now. I think a more interesting direction is for guitars to become simpler and easier to build to the point where a non-expert can build one easily without a lot of exotic tools. (In a way, this has always been the case. Cigar box guitars are an old tradition; they're ridiculously easy to make, and can sound very good.)

If you think about it, a Fender Stratocaster is a very minimalistic design that was engineered to be easy to manufacture with 1950's woodshop tools (bandsaws, routers and jigs, etc).. Every Strat clone is a reproduction of a design made for that era of technology. When CNC machines came on the scene, a Strat shape isn't substantially easier to make than any other shape, but we keep using that shape because it works well and because of tradition.

A guitar design that's optimized to be easy for a non-professional to make with a CNC router and a laser cutter and some basic woodworking tools might look somewhat different. This could open the door to extreme customization -- one-of-a-kind harp guitars, unusual pickup arrangements, guitars with three strings and four frets, nine-string guitars designed for 31-tone equal temperament or just-intonation tunings, or whatever you like.

I expect most guitar buyers will continue to buy traditional Fenders and Gibsons and so on with 6 strings and from 21 to 24 frets and a scale length of 25 inches, plus or minus half an inch. However, for those that want something different, there will always be a minority of tinkerers who build their guitar just the way they like it. That's where I think the most interesting advances are going to happen.

10
johan_larson 2 hours ago 1 reply      
Gibson and Fender are like the companies that sell exercise equipment: lots of people buy one with the best of intentions, and three to six months later it's collecting dust in the basement.

It's not hard to see why. Playing music is hard. The ratio of effort to reward is just terrible. I totally understand why people quit.

11
Joeboy 3 hours ago 0 replies      
Not necessarily saying the electric guitar isn't dying, but the economic argument isn't very convincing. There are a lot of guitars in the world now and we keep making more, consequently people aren't prepared to spend as much money on them. The people who are willing and able to spend $1000 on a new Les Paul or USA Strat are an aging generation for sure, but maybe their grandchildren are just playing guitars that cost $200 instead? You can get a pretty decent guitar for peanuts these days - I bought a perfectly playable strat copy for like $40 (actually 30GBP) off Ebay a while ago, with a (small, tinny) amp thrown in.

I've been expecting the death of the electric guitar since the '80s, but it seems surprisingly resilient.

Edit: In addition, although I'm incredibly old and don't claim to have my finger on the pulse of the zeitgeist, it seems to me that in the post-White Stripes era playing shitty second-hand equipment is widely considered cooler than playing expensive new equipment.

12
0xbear 1 hour ago 0 replies      
There are some amazing players out there: Guthrie Govan, Nick Johnston, Plini... The old guard (Satriani, Vai, Gilbert) are cranking out great albums from time to time. Truth be told, people who really know how to play and write guitar music always were in short supply. They're just in slightly shorter supply now for (IMO) two reasons: 1. One no longer needs to know how to play an instrument to be considered a musician: people are perfectly willing to pay for "music" that's just computer drums and bass, and 2. Kids have the attention span of a house fly, and guitar requires daily practice. Even so, YouTube is full of amazingly talented kids.
13
badcede 3 hours ago 1 reply      
As long as we're talking slow death and electric guitars, https://www.youtube.com/watch?v=EL3pP29N-Wc.
14
crispytx 1 hour ago 0 replies      
I've been saying this for a couple of years now. None of the music I listen to anymore features an electric guitar; it's all made on laptop computers. The fun part about knowing how to play guitar is that you can learn how to play your favorite songs. But if none of your favorite songs feature an electric guitar, what fun is it anymore? I'm 30 years old and have been playing the guitar since I was 12, but it's just not as fun anymore... I'm thinking about learning how to make music on my macbook so I can stay young and hip :)
15
afpx 1 hour ago 0 replies      
Huh, and all this time I thought "rock and roll" and playing electric guitar was all about sex. At least, that's why we learned to play guitar when I was a kid...

Guitar playing lost its mojo when it was usurped by aficionados enamored by theory and technique and repulsed by swagger.

If I was 14 today, I'd grab a Maschine, load up some tracks in Launchpad, and start emulating Avicii until I could pack a small dance hall.

16
pizzicato7 1 day ago 2 replies      
I think this article misses a key point. It's not all just about a lack of relevant role models.

Today, there are so many things competing for a kids' time - social media and messaging, mobile apps, video games, Netflix - that kids are choosing other activities instead of solitary, frustrating hours practicing guitar technique.

To become a proficient amateur-level guitarist, it takes around 2,000 hours of practice. That's equivalent to an hour a day, EVERY SINGLE DAY, for 5-1/2 years.

90% of kids learning guitar quit in the first 2 months (according to Fender) - most before they can play their first song well. The first few weeks are particularly brutal - it sounds horrible, it's painful on your fingers, and it takes hours just to get your first chord down.

In one sentence: it's just too hard to learn for the vast majority of people - and it's always been this way. But the difference is that these days, most kids would rather play Pokemon Go or Snapchat - and for kids who are musically inclined, it's so much easier and faster to become a DJ or producer than an instrumentalist, thanks to GarageBand and VirtualDJ and other easy-to-use software apps.

So a lot of musical kids are choosing that route. Why spend thousands of hours alone in your bedroom when you can be DJ'ing your first party in a few weeks?

So, how do we solve the problem of getting more kids to learn instruments, particularly the guitar? Some people have put lights on the fretboard (Fretlight, Gtar, Poputar) but in 25 years, that hasn't proven to make it much easier to learn. Others have gamified the experience (Rocksmith, Yousician) - but the learning curve is still extremely steep.

My company, Magic Instruments, has a different approach. We make it fundamentally easier to learn. Instead of starting by learning traditional guitar chord fingerings, we enable people to start playing chords using just one finger. This gives beginners an instantly positive musical experience - you can start strumming and playing your favorite songs from day one, and start jamming with others in a band in your first week. We then transition people over to learning traditional chords at their own pace.

We've seen 9 year old kids form a band in a few hours. Our hope is that we can inspire these kids to have a passion for practicing music, which will enable them to persevere for the thousands of hours of practice it takes to build the muscle memory to become guitarists.

17
ThomPete 1 day ago 3 replies      
Software eating the world.

As a guitarist, professionally for some of those years, I can only say that this is mostly because you can emulate today a lot of the things that used to require different guitars to bring out special sounds, and because, well, most music today doesn't have guitar solos, and thus it's hard to imagine guitar heroes coming out of music that doesn't put the guitar in front.

It's not just guitars, though; it's most other instruments. The real heroes today are the composers and producers.

18
superplussed 1 hour ago 1 reply      
The person every kid wanted to be in the 70s was Jimi Hendrix. The person every kid wants to be in 2017 is Elon Musk or Mark Zuckerberg. The zeitgeist has moved on.
19
gciruelos 1 hour ago 0 replies      
In my opinion boutique competition, economic hardship, drop in quality, and an active used market are what's killing Gibson and other big companies. Not EDM.
20
uranian 30 minutes ago 0 replies      
I think the drop in revenue is because the market is really saturated. All those millions of guitars produced are not destroyed; it's not the kind of product that you can apply 'planned obsolescence' to. Instead, the vintage ones are often better sounding and therefore more popular. Also, if you find a good vintage guitar you are almost assured it will remain good, as the wood has proven to be stable.

If there is a slow death of the guitar, it is because it's almost impossible to earn any money with it, except for the lucky few. Learning an instrument like the guitar takes many, many years, which is quite a hobby. If you spent the same amount of time learning software development, you would be almost assured of a solid income.

21
akytt 1 day ago 0 replies      
Hands up if you have ever actually thrown away a guitar? It's rather big, it's bulky, and there are few things that can go badly wrong. You sell it or hand it down. I've just rescued a kickass guitar from a pawn shop and it'll be serving me probably as long as I play. Basically, there are enough guitars out there. The average lifetime of an instrument is going up, and that's a trend that is the opposite of what the rest of the consumer-goods world is seeing. No wonder business is bad. But don't confuse business with the actual instrument.
22
clavalle 1 day ago 0 replies      
Funny, I just got done playing for an hour. It's a great way to refresh.

If you want to see a resurgence of guitar playing you can't start with stadium guitar gods, you have to have the guitar house-party hero, and the local club hero, and the regional tour hero. And they're out there. Go out and see them play -- a lot of them are incredible.

I grew up in and around Austin so I'm biased and a little spoiled but there's nothing like a live local show.

There are more than a few parallels between putting together a band and becoming a success and putting together a company and becoming a success.

23
splicer 2 hours ago 0 replies      
It's time another instrument got a chance at the spotlight. I'm going to call it: the next big thing is Tuba heroes!
24
b1daly 3 hours ago 3 replies      
I think the trend will continue down for guitar playing, and for all musical instruments.

My theory is based on the fact that learning to play well is difficult and time consuming.

Popular music is by and large not the product of humans, moving muscles, energizing mass, thus generating sound waves. "Sounds" that we hear in pop music are in main generated digitally, usually with software instruments, in a computer.

Pop music still uses the more traditional aspects of music composition, but only as a component of an ever-expanding sonic palette.

Modern production is based increasingly on the ability to manipulate "sound" in a computer, and assemble it into listenable compositions.

The human voice remains one element that is still generated by biological processes. But even the voice is subjected to increasing amounts of digital manipulation.

Learning to produce music in the modern style is also difficult, though in a different way from learning to play an instrument. Specifically, it is not a realtime process.

Given the finite amount of time and resources available to individuals, especially young people, it is inevitable that learning more modern pop production will be at the expense of investing in the extensive training needed to perform music in realtime.

This is compounded by the fact that the economic weight of the music industry is in the world of pop music, meaning the various strains of digitally created music. This is where the money comes in. People that want to make of a living doing music will increasingly need to be proficient in modern music production.

This creates a "virtuous cycle" which directs more resources towards this aspect of music, and a "vicious cycle" towards the traditional aspects of musical performance.

There are actually two significant technological forces that enable this structural shift in music creation. The first was the advent of recording, and mass distribution of music. It broke music away from the need to have human performers, playing in realtime, to hear music. This dramatically lowered the marginal cost of experiencing music.

The twin forces of time shifting and mass replication were turbocharged by the advent of digital audio. This, combined with the ever-increasing use of digital manipulation in music generation, amounts to a "singularity" in humans' relationship with music. A line has been crossed that is permanent; it can never be uncrossed.

To be sure (ha ha), music will continue to be performed (and listened to) by live musicians, indefinitely. But it will be in the context of decreasing cultural influences.

The financial resources needed to support the creation of skilled musicians will continue to dwindle. This effect has been ongoing for decades in the world of orchestral music; now it has come for the world of all performed music.

One might think, what about live music? Won't there always be a demand for live, performed music? I don't think so. Or rather, it will continue the dramatic decline illustrated in the article by guitar sales.

Audiences seem to respond just as well to shows that use essentially pre-recorded music. As long as there is a show of some kind, most of the music-consuming population will not mind if the music heard at a show is "canned".

This makes me a bit sad, but ultimately the endeavors of human creativity will march on, inexorably charting new paths using the astonishing arsenal of software applications that are available these days at a very low cost.

25
watertorock 4 hours ago 1 reply      
Learning an instrument requires commitment and a lot of practice. And more practice. And more practice. And a lifetime of learning.

How many people have the patience for that? Particularly in Generation Internet Points Right Now?

26
estomagordo 2 hours ago 1 reply      
Is it just me, or are HN links increasingly often to material behind paywalls?
27
3327 4 hours ago 1 reply      
Rock n' roll will never die, as Neil Young put it. It's just taking a break and going to hell to reorganize. Rock is here to stay; it's out of fashion now, but the cycle will come back, and just like clothing fashion, 20-30 years from now a new wave will happen.
28
pmalynin 4 hours ago 0 replies      
Completely broken on Safari because of scroll jacking. Had to actually drag the scroll bar until the text starts.
29
justboxing 3 hours ago 2 replies      
30
molecule 4 hours ago 0 replies      
Previous discussion yesterday:

https://news.ycombinator.com/item?id=14621515

31
dschep 1 day ago 0 replies      
I clicked because I was curious what number of strings guitars were all secretly switching to. Why include such a superfluous detail in that headline? The cynic in me thinks this is just clickbait getting harder to identify.
32
paulsutter 4 hours ago 5 replies      
Slash was the last famous guitarist. There's your proof that guitarists used to be a big deal, but are no more.

http://www.rollingstone.com/music/lists/100-greatest-guitari...

12
Medieval Scholars Believed in the Possibility of Parallel Universes atlasobscura.com
76 points by Petiver  13 hours ago   32 comments top 7
1
danidiaz 38 minutes ago 0 replies      
The counterintuitive moral of the story seems to be that ideological censorship has the potential to spur philosophical and scientific progress, even if this is an unintended and not very frequent outcome.

It is strange, but for a good chunk of time religious thinkers were "more right" about subjects like the eternity of the universe than the prevailing Aristotelian philosophers, sometimes deploying surprisingly modern-sounding ideas in the process. I'm thinking of authors like Philoponus https://historyofphilosophy.net/philoponus and Crescas https://historyofphilosophy.net/crescas

The Stanford Encyclopedia of Philosophy has an entry on the condemnation of 1277: https://plato.stanford.edu/entries/condemnation/

2
gautamdivgi 8 hours ago 1 reply      
Not just medieval - Hindu philosophy has definitely had that concept for the few thousand years it's been in existence. The concept of "koti koti brahmanda" (literally, millions of universes) is pretty central.
3
fnovd 11 hours ago 3 replies      
Of course they did. It was heretical to imply that something conceivable by man would be impossible for God to create. If we can imagine it, he can do it. A rather roundabout way of arriving at a theory of parallel universes, but fascinating nonetheless.
4
shmerl 3 hours ago 0 replies      
That's not news really.
5
dredmorbius 5 hours ago 1 reply      
There are some broader lessons from this than who believed what and when (or what other philosophical or religious traditions had similar or "better" beliefs).

Belief based solely on faith has a pronounced tendency to go off into the weeds. That's the distinction which the scientific method makes, particularly in the tradition of the Bacons -- Roger Bacon (13th c.) and Francis Bacon (16th c.), no relation. Each emphasised the value of observation or experimentation.

Second: if you find your premises, or traditional authorities, at odds with observed reality, you might care to strongly favour dismissing your premises or authorities, rather than your observations -- so long as the latter seem to be independently verifiable. An interesting case of this developed most especially in the 19th century, within the field of geology, where the record of the stones was found to be in marked conflict with the record of the scriptures, particularly the biblical record. Noted geologists spent not inconsiderable time attempting to reconcile these records. There was no reconciliation possible, of course; one of those records was simply wrong.

https://en.m.wikipedia.org/wiki/James_Dwight_Dana#Publicatio...

Third: if you've found a persistent discord between observation and theory, then it's quite likely you've illuminated a lacuna in your knowledge or understanding. Again to geology: it was clear from the geologic record that the Earth was at least some hundreds of millions of years old, but no known force or energy could explain the observed temperature of the Earth's interior. The answer to this turned out to be a previously unknown form of energy potential and release: radioactivity. That also happened to provide the clock by which the Earth's age could be determined, as well as the mechanism on which geology is ultimately founded: plate tectonics. Not fully accepted, it turns out, until 1965, though it's now considered the fundamental organising principle of geology. Which gives us a fourth lesson:

Fourth: You can study a thing for a long, long, long time before you come to a proper understanding of it.

Fifth: Study of ancient authorities isn't wholly useless. I advise people to look to philosophy, especially, if not for truth then as a record in how the truth, and error, are arrived at, over time.

And finally: it's not enough to come to the correct answer, but to come to the correct answer through the right chain of reasoning. Science is structured knowledge. It's not a dry recitation of facts, but rather, the structure, through which, those facts become evident. If not self-evident, then building on observation and mechanism. Precursors of current understanding, absent the underlying structural foundation, are interesting, but are not science as it's properly considered.

6
lngnmn 5 hours ago 2 replies      
Nothing to see here. It is a naive, or rather primitive, concept based on the assumption that there is more than one "future" (while actually there is none).

The traditional example is that if I cross the street everything will be different. Parallel universes are related to this nonsensical multiple-parallel-futures assumption.

For those who are interested in seeing things as they are instead of piling up nonsense upon nonsense, consider the process of evolution. There are no multiple versions of the same species just because they might have turned out another way. Everything happens as it happens (everything is the way it is because it got that way). It is the unfolding of a single process.

There are no future(s). It is "organic" growth (like the growing of a tree). The future is a concept of the mind. A meme. So are parallel universes. There is nothing parallel to what is. At least outside of one's head.

7
meric 10 hours ago 2 replies      
There's a reason religious philosophy took precedence over "scientific" thought in those times. And it wasn't because people were dumb.
14
Good enough practices in scientific computing plos.org
59 points by okket  14 hours ago   7 comments top 3
1
ferdterguson 7 hours ago 2 replies      
One thing they mentioned in the 'left out' section is code review. I'm a researcher in a computation-heavy field, and I think that anything I write that is intended to be used in our research group or used by other people should be code reviewed. In my experience, so many codes written by former group members or by senior group members become black boxes that no one can read or maintain in five years.

Code review for anything more complicated than a script has helped the quality of what I write. It also ensures that there are other people who have at least seen the code that I wrote. Even if they don't fully understand it, they are at least empowered enough to wade through it if need be.

2
type0 11 hours ago 2 replies      
Oh, those citations:

> ... you have to explain the difference between plain text and word processing. And text editors. And Markdown/LaTeX compilers. And BiBTeX. And Git.... the barrier to collaborating on papers in this framework is simply too high to overcome. Good intentions aside, it always comes down to, "just give me a Word document with tracked changes"

> ... Google Docs excels at easy sharing, collaboration, simultaneous editing, commenting, and reply-to-commenting.

Using Google Docs is not a "good enough" practice for scientific computing. How will you embed parts of your CSV files in the report? How will you at the same time have it included in the version control system? Using plain-text toolchains, on the other hand, could solve all of that. Now, they mention Markdown and Pandoc, but there's no mention of reStructuredText, Org-mode files, or AsciiDoc, so it makes me wonder why they would recommend version control and Google Docs.

15
Getting compilers right: a reliable foundation for secure software microsoft.com
75 points by matt_d  15 hours ago   16 comments top 2
1
pcwalton 10 hours ago 6 replies      
For C and C++, the cost of proving a compiler correct seems hugely out of proportion to the actual benefit gained from doing so. Most critical security bugs in C++ code are found in code that the compiler had no obligations whatsoever to compile "properly", because they are the result of undefined behavior (use after free, buffer overflows, stray writes, etc.) A perfect, proven-correct C++ compiler would do nothing to protect against any of the famous vulnerabilities you've heard of. Even the famous "null check eliminated by a GCC optimization" Linux kernel bug would be unaffected, as that was a valid optimization per the language definition.

By contrast, I think JavaScript VMs are the target of miscompilation attacks many orders of magnitude more often than C++ compilers are. They actually have to compile untrusted and hostile code. A miscompilation can be disastrous, and in fact actual browser vulnerabilities have been traced back to incorrect JS compilation. So I feel this impressive research might have a more practical impact if applied to JS (or WebAssembly!)

2
nickpsecurity 12 hours ago 0 replies      
CompCert, CakeML, Simpl/C, VeLLVM, and SPARK Ada are the related reading for those interested in this sort of thing. The others naturally fit the topic with SPARK being a mature tool for verifying imperative algorithms.
16
Easy way to detect where the C++11/C++14 features are used in a C++ project cppdepend.com
30 points by virtualcpp  9 hours ago   9 comments top 3
1
a_twig 6 hours ago 2 replies      
Disable C++11/14 with a compiler flag (or use an old compiler version) and try to build the project :-P
2
acidburn1995 6 hours ago 0 replies      
An easier way: when you change -std=c++1y to -std=c++03, it won't compile.
3
wcr3 6 hours ago 0 replies      
thanks for the ad
17
3D Convolutional Neural Networks in TensorFlow github.com
13 points by irsina  5 hours ago   1 comment top
1
irsina 5 hours ago 0 replies      
We leveraged a 3D convolutional architecture for creating the speaker model, in order to simultaneously capture the speech-related and temporal information from the speakers' utterances.
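
A minimal sketch of a 3D convolution in TensorFlow 1.x; the tensor shapes here are assumptions for illustration, not the repository's actual architecture:

  import tensorflow as tf

  # Input: [batch, depth, height, width, channels], e.g. stacked
  # spectrogram frames over time (shapes are made up).
  x = tf.placeholder(tf.float32, [None, 20, 80, 40, 1])

  # A 3D kernel convolves across time and both spectral axes at once.
  w = tf.get_variable('w', [3, 5, 5, 1, 16])  # [d, h, w, in_ch, out_ch]
  y = tf.nn.conv3d(x, w, strides=[1, 1, 1, 1, 1], padding='SAME')
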
18
CS61A Structure and Interpretation of Computer Programs berkeley.edu
45 points by miobrien  14 hours ago   6 comments top 4
1
sdiq 11 minutes ago 0 replies      
I fear the current 'iterations' might not last long because of the video aspect. I have been following two other courses from Berkeley: data8.org and datastructur.es. However, because of a certain directive (and I don't fault the university for it), the video lectures are no longer public. Sad.
2
sudoscript 4 hours ago 1 reply      
I did this course a while back just when I was getting started on my self-taught journey. I'd highly recommend this over Harvard's CS50 or MIT's intro course. This has great coverage of how a computer program works, without being too pedantic.
3
jordonwii 1 hour ago 0 replies      
In case anyone's curious, this is a rather old iteration of the course (2011, right after it switched from scheme to Python).

The current iteration is at http://cs61a.org - taught by two undergrads for the summer.

The most recent two iterations taught by professors are http://fa16.cs61a.org and http://sp17.cs61a.org

4
ivan_ah 4 hours ago 0 replies      
Really nice course. Labs come with solutions and would make an excellent "challenge pack" for anyone learning Python and CS right now.
19
Hexagonal Grids redblobgames.com
30 points by guiambros  11 hours ago   2 comments top 2
1
Gaelan 3 hours ago 0 replies      
2
donatj 1 hour ago 0 replies      
I love this post. I found it years ago and used it to make a toy game I always wanted to try. It's awesome.
20
Terark (YC W17) is a profitable database compression company based in Beijing techinasia.com
20 points by rockeetterark  7 hours ago   7 comments top 4
1
continuations 2 hours ago 0 replies      
So there's TerarkDB: https://github.com/Terark/terarkdb

And there's TerichDB: https://github.com/Terark/terichdb

How are they related to each other?

Also TerichDB calls itself open source but then includes this: "TerichDB is open source but our core data structures and algorithms(dfadb) are not yet."

If the core algorithms of TerichDB is not open source then is TerichDB even usable? Are you going to open source the core algorithms?

All this is rather confusing.

2
rockeetterark 7 hours ago 1 reply      
Sorry, I know the article is kind of sensational but it also has some good information and we're here to discuss the real substance in the thread.

Terark has built a new storage engine for databases and data systems based on the Succinct Nested Trie data structure. Our technology enables direct search on highly compressed data without decompressing it. Thanks to that, we obtain >200X faster performance and more than 15X storage savings (better than Google's LevelDB or Facebook's RocksDB). We are a Y Combinator company (W17).

4
nerdwaller 2 hours ago 2 replies      
Maybe I'm alone in my privacy concerns, but I'm a bit scared to trust something behind the "Great Firewall".
21
Show HN: Simplexhc Haskell to LLVM compiler (design phase) pixel-druid.com
26 points by bollu  9 hours ago   4 comments top 4
1
mhh__ 5 hours ago 0 replies      
How badly does the function pointer technique affect interprocedural optimization? Since LLVM would know the exact control flow, it should be able to do the optimisations anyway?

That match(int) technique seems naive. How would you handle the function arguments/application? The size of the function would also, presumably, be large enough to be either very bad for one's icache or to require more optimisation, which, on the assumption that my previous paragraph is correct, would make this a wasted effort?

Has this been profiled at all?

2
runeks 3 hours ago 0 replies      
> The way GHC currently compiles is to first reduce Haskell to a Core language, which is a minimal subset of Haskell, in some sense.

GHC Core is not a subset of Haskell, it's just a simplification of it. The same thing but explained with fewer words (constructors).

That being said, we completely agree that Spineless Tagless G-Machine is full of badassery. Just imagine being able to reduce any Haskell app to, like, 8 different instructions. Something about that fascinates me, even though I'm not quite sure what it is.

3
mncharity 3 hours ago 0 replies      
Regarding interprocedural optimization, an alternative is to return to the trampoline less frequently, as with Cheney on the MTA.

I wonder what it would take to add a wiki to lambda-the-ultimate.

4
kccqzy 6 hours ago 0 replies      
Great work! Minor typo: the first code snippet has a missing right parenthesis.
22
Show HN: Insect a high-precision scientific calculator with physical units insect.sh
303 points by sharkdp  19 hours ago   172 comments top 56
1
boramalper 14 hours ago 3 replies      
Kudos!

I would love to see two more things:

1. Propagation of uncertainty.[1] I often yearned for a calculator that automatically propagated uncertainties for me while writing my (high-school) lab reports. I think it would be a life-saver for many students, at the very least.

2. True high-precision. I don't know how Insect works under the hood (so maybe you are already doing it this way) but instead of using high-precision results of the operations, store the operations themselves and calculate the final result at the end with the desired amount of precision.

I am aware that both requests require a tremendous amount of change, so you might as well think of them for the next major version! =)

[1]: https://en.wikipedia.org/wiki/Propagation_of_uncertainty

2
franciscop 16 hours ago 5 replies      
Sorry to be the party stopper for a really awesome tool, just wanted to let you know that you have a dead butterfly as a logo: http://emilydamstra.com/news/please-enough-dead-butterflies/
3
lol768 18 hours ago 4 replies      
This is pretty cool, it's one of the rare applications I've used where the things I've tried "just work". For example "10 kg to g", "c", "c to km s^(-1)", "c to km/s" all work intuitively. It's great it works at the command line too.

Something I wish I'd had when I was studying Physics.

4
sharkdp 18 hours ago 1 reply      
Hi HN! This has been posted a few months ago (by someone else), but it wasn't really finished back then. The feedback I got here last time has helped a lot to improve all sorts of things. Looking forward to your comments!

Old discussion: https://news.ycombinator.com/item?id=13909631

5
averagewall 7 hours ago 2 replies      
This is exactly what I've been looking for for years: a fast, easy calculator that's as convenient as a physical school calculator, but with familiar programming-style input, since computers don't have scientific-calculator keys. I can do sqrt(2) much more easily than in Windows' calculator, and it doesn't have the ridiculous attempt to copy a physical calculator's quirks, like pressing equals twice to re-apply the previous operation or having an "inverse"/"2nd function" button.

Units are a bonus, but really, just the calculator part is less frustrating than anything else I've ever seen. Google search is too unpredictable, morphing into a typical confusing 1970s-style keypad design after your first calculation. I tried a couple of desktop applications and they were no better. Come on, calculator app designers: stop trying to copy a physical calculator. Those already have a terrible design, and computers don't have the same constraints or freedoms as them.

Even better, it can act like a desktop Windows app simply by saving the page in Chrome! Beat that for latency, Google!

6
navane 52 minutes ago 0 replies      
I use Python with pint [0] for this. It integrates with numpy and has support for uncertainty. You can do all the calculations Python can, and Python math reads like ASCII math. It can also output to LaTeX.

This can easily run locally. If you prefer an online REPL, it's available on repl.it [1]. There you can keep your scripts in the cloud for later, with rudimentary versioning.

[0] https://github.com/hgrecco/pint
[1] https://repl.it
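A minimal sketch of that workflow, for anyone who hasn't seen pint (the quantities and conversions here are illustrative):

  # pip install pint
  import pint

  ureg = pint.UnitRegistry()

  speed = 20 * ureg.meter / ureg.second
  print(speed.to(ureg.kilometer / ureg.hour))  # 72.0 kilometer / hour

  # Dimensional errors are caught, much like in Insect:
  try:
      speed.to(ureg.kilogram)
  except pint.DimensionalityError as err:
      print(err)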

7
jclulow 13 hours ago 2 replies      
So "megabytes" appears to be the power-of-ten unit, which is generally not that helpful in practice.

 6 megabytes to bytes
 6 MB -> B = 6000000 B
Assuming you're sticking with the power-of-ten unit, that means you should really grow support for the (sigh) "mebibytes" family of units; i.e., what some folks are retro-actively calling the power-of-two byte unit.

 6 mebibytes to bytes
 6 mebibytes -> B
 Unknown identifier: mebibytes

8
btown 15 hours ago 2 replies      
> high precision

 e*1e15
 e 1000000000000000 = 2718280000000000
Needs a bit of work, I think.
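For comparison, a sketch of the same computation with real arbitrary precision, using the third-party mpmath package:

  # pip install mpmath
  from mpmath import mp

  mp.dps = 30  # 30 significant decimal digits
  print(mp.e * mp.mpf(10) ** 15)
  # -> 2718281828459045.2353602874713...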

9
raketenolli 16 hours ago 1 reply      
Looks nice, but is there a reason it interprets m^2 * Pa correctly

 1 Pa * 2 m^2 / N
 1 Pa (2 m^2 / N) = 2
but doesn't automatically convert a result to N?

 1 Pa * 2 m^2
 1 Pa 2 m^2 = 2 m²·Pa
Same with W, J, s and all their relations.

10
eps 13 hours ago 2 replies      
If it could deduce density from the material name, it'd be useful for volume-to-mass conversions, e.g.

 1 cup of butter -> g
Not sure if it's quite scientific though... :)

11
semi-extrinsic 17 hours ago 2 replies      
Looks really nice!

exp(2*kg/s)

 exp(2 (kg / s))
 Conversion error: Cannot convert quantity of unit kg/s to a scalar
Excellent!

Is there any way I can save a list of variables to file and then reload them?

I also would like to vote for supporting imaginary numbers (Issue #47).

12
rnhmjoj 16 hours ago 0 replies      
I use this tool also: http://pythonhosted.org/uncertainties. It comes in pretty handy for a quick calculation with error propagation.
13
db48x 17 hours ago 1 reply      
Looks pretty nice, but no light years? No attoparsecs? You don't even have hogsheads or fortnights!!!!!1!

Still, it's pretty good.

14
forgotpwtomain 17 hours ago 2 replies      
Hmm, I find the auto-associativity to be a bit weird for example:

1/12 c

 1 / (12 c) = 2.7797e-10 s/m

15
gvkv 11 hours ago 1 reply      
Well done!

Plain and easy to understand interface and excellent use of colour and space. Two suggestions:

1. While I doubt it'll be used very much, consider adding calories for completeness if nothing else:

 1000 kcal -> joules
or

 1000 Cal -> joules
You might also throw in cal just for fun[1]!

2. Change Variables to Constants. I think this is more in keeping with standard jargon.

[1]: cal is based on the gram while Cal or kcal is based on the kilogram.

16
eggy 9 hours ago 0 replies      
I have been using Frink for over 7 years. Aside from being a calculator with a gazillion units, Frink lets you program in a less verbose Java-like language for other tasks. Graphics are easy too. I write small apps with GUIs and input prompts for calculations I often need at work. What makes Insect different? Can you program apps in it?
17
scentoni 18 hours ago 2 replies      
I like this, but would really like a convert-to-base-units function.

 sqrt(1/(eps0 mu0)) -> m/s
 sqrt(1 / (eps0 mu0)) -> m/s = 299833000 m/s

 sqrt(mu0/eps0) -> ohm
 sqrt(mu0 / eps0) -> Ω = 377.031 Ω

18
te 4 hours ago 0 replies      
I'd love a calculator that can implicitly recognize hh:mm:ss notation and can answer queries like ...

3:20:36 / 26.2

7:35 * 26.2
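(These look like marathon pace math: a finishing time divided by 26.2 miles, and a per-mile pace multiplied back up.) A rough sketch of how such queries could be evaluated, treating [h:]mm:ss as a duration in seconds:

  def parse_hms(s):
      # "3:20:36" -> 12036 seconds; also accepts "7:35" as mm:ss
      seconds = 0
      for part in s.split(":"):
          seconds = seconds * 60 + int(part)
      return seconds

  total = parse_hms("3:20:36")   # a marathon finishing time
  pace = total / 26.2            # seconds per mile
  print("%d:%04.1f per mile" % (pace // 60, pace % 60))  # 7:39.4 per mile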

19
DINKDINK 13 hours ago 1 reply      
Needs more temperature scales than just kelvin.

US engineering units of energy would also help certain people: BTUs (British Thermal Units), etc.

20
F_r_k 12 hours ago 0 replies      
For those interested in a CLI counterpart, GNU units is more or less the same. The units definitions file is enormous.
21
rwallace 5 hours ago 0 replies      
Looks good! A lot of things seem to just work.

Can you stop the cursor blinking?

If you do something like pi 1e20, I think it should print out all the digits it has instead of printing zeros.

22
oskbor 18 hours ago 1 reply      
Looks nice! Are you relying on mathjs? I think it has some of the same functionality.

If not, is the conversion logic in an npm package?

23
raz32dust 7 hours ago 0 replies      
"9 min/mile -> km/h" fails since they are different dimensions. However in reality we do use both of these units to indicate speed. This might be a stretch but will be good to incorporate if it can be generalized.
24
gabipurcaru 17 hours ago 1 reply      
Shameless plug for my http://dedo.io/, which is similar to this, but with fewer features, more money-oriented and with support for any custom units, e.g. if bag_weight = 12 kg / bag, then 10 bags * bag_weight is 120 kg
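For what it's worth, the same custom-unit trick can be approximated in pint by defining "bag" as a new base unit (syntax per pint's definition files; the numbers mirror the example above):

  import pint

  ureg = pint.UnitRegistry()
  ureg.define("bag = [bag]")          # new base dimension

  bag_weight = 12 * ureg.kg / ureg.bag
  print(10 * ureg.bag * bag_weight)   # 120 kilogram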
25
isatty 18 hours ago 1 reply      
Hi Sharkdp, wonderful product. I think that unit and command autocompletion would be super useful!
26
wotamRobin 16 hours ago 2 replies      
I really like the UI and seeming wealth of different units. It doesn't look like it supports fluid ounces though.

I tried "4 tbsp to oz" and it interprets oz as mass instead of volume. Google correctly gives me 2 as the answer.

27
ourcat 17 hours ago 2 replies      
Nice. This is a lot like (in looks and some functionality) the macOS app 'Numi' (which I love): https://numi.io/
28
amelius 17 hours ago 2 replies      

 2**100+1-2**100
Equals 0?
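For reference, Python's integers are arbitrary precision while its floats are 53-bit doubles, which illustrates where the 0 likely comes from:

  print(2**100 + 1 - 2**100)                # 1   (exact integers)
  print(float(2**100) + 1 - float(2**100))  # 0.0 (the 1 is lost below the float's precision)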

29
Hello71 12 hours ago 5 replies      
I'm a little confused as to how this improves on GNU units, which seems to support far more:

 $ cat .units
 period(len) units=[m;s] 2pi*sqrt(len/gravity) ; (period/2pi)^2 * gravity

 $ units
 2980 units, 109 prefixes, 97 nonlinear units

 You have: period(20cm)
 You want:
         Definition: 0.89729351 s
 You have: period(20cm)
 You want: ms
         * 897.29351
         / 0.0011144625
 You have: period(2ft)
 You want: ms
         * 1566.5419
         / 0.00063834872
 You have: 5 GiB
 You want: bytes
         * 5.3687091e+09
         / 1.8626451e-10
 You have: 5 hundred million
 You want:
         Definition: 5e+08
 You have: tempF(100)
 You want: tempC
         37.777778
Is it supposed to be a competitor? Am I supposed to learn how to use a new language? I don't get it.

I like the idea of keeping units, but I'm not sure this makes things easier:

 V * A / J
 V (A / J) = 1 VA/J

 You have: V*A/J
 You want:
         Definition: 1 / s

30
avip 15 hours ago 1 reply      
round(pi * 100000000000000000) = 314159265358979324

round(pi * 1000000000000000000) = 3141590000000000000

(similar comment by @btown down there about e)

31
effie 15 hours ago 0 replies      
I tried to convert 1000 rpm (revolutions per minute) to Hertz (periods per second), but I get

 Unknown identifier: rpm

32
antrion 15 hours ago 1 reply      
Really great tool! Thanks for the hard work. It would also be really nice to specify custom units, such as

 U = 1e-6/60 * kat
 U = 1.66667e-8 kat
Because now it does this

 1 kat -> U
 1 kat -> U = 1 kat

33
BenjaminBini 14 hours ago 1 reply      
Found a bug: sin(pi) = 8.62803e-81
34
psuter 15 hours ago 1 reply      
Nice tool! Not too obvious from the landing page that there is a terminal version. Also, some units come with implicit assumptions:

 1 month -> days
 1 month -> d = 30.4167 d

 1 year -> days
 1 year -> d = 365 d

35
floatboth 14 hours ago 0 replies      
There's also a fork (of an older version of this) for binary instead of physical units: https://soupi.github.io/insect/
36
relyks 15 hours ago 0 replies      
This is way faster than Wolfram Alpha and much simpler to use for most use cases :)
37
Animats 16 hours ago 2 replies      
That's cute. Are there any phone apps which do that?

Remember Graphing Calculator? That could use such features.

Autodesk Inventor understands units in formulas, but mostly for length and angle. Everything becomes meters internally.

38
pdabbadabba 16 hours ago 1 reply      
Very nice! Any plans to add decibel/log-scale units? That would make this even handier.
39
lwlml 14 hours ago 0 replies      
 4 mi / min
 4 mi / min
 Unknown identifier: mi

 4 miles / min
 4 mi / min = 4 mi/min

Wut? Looks like some strangeness for the parser.

40
zython 14 hours ago 1 reply      
hmmm, I'm a little bit skeptical about this.

It'd be cool to have this as a Python script somewhere, but I am not quite sure whether I would visit this site whenever I need to calculate some physical units, especially since Google already covers most of my needs.

Don't get me wrong, I think otherwise it looks and feels great and is easy to use, but I don't know who will use this.

41
rikkus 12 hours ago 1 reply      

 1 mile / 2 km
 1 mi / 2 km = 0.804672
Great, but it would be nice to show the result's unit.

 1/2km
 1 / 2 km = 0.5 km⁻¹
It's been a long while since I did any real maths, so I'm slightly stumped as to why the -1 exponent is there.

42
timow1337 17 hours ago 0 replies      
Too bad it doesn't support complex numbers
43
snissn 17 hours ago 1 reply      
cool! Quick minor suggestion, would be cool to support words for numbers like five kilometers or five million seconds!
44
alok-g 16 hours ago 0 replies      
It's not working on my Android, Dolphin browser. I am unable to type anything on the prompt.
45
Quiark 18 hours ago 1 reply      
I tried to enter

 3 5
which was interpreted as 3 * 5. Seems a bit risky to me :) Otherwise looks neat.

46
jkh1 14 hours ago 1 reply      
First thing I tried:

 >> 2 min 35 s + 3 min 54 s
 = 13920 s
47
Pfhreak 15 hours ago 1 reply      
Great tool. I would love to see computing units -- gb/s, etc.
48
dredmorbius 5 hours ago 1 reply      
Nice, though I'm still highly partial to GNU Units. In large part as it allows specifying output units, and not just for conversions.

Note: if you're on MacOS and are using the supplied 'units' utility, that is BSD, not GNU units. You're going to want to install gunits from Homebrew.

https://www.gnu.org/software/units/

49
sonium 13 hours ago 0 replies      
Needs more energy units than joules
50
Zenst 10 hours ago 0 replies      
 pi
 pi = 3.14159

High-precision!

51
jk2323 13 hours ago 0 replies      
Nice. But no RPN? I pass!
52
toss1 17 hours ago 0 replies      
Very nice on first try -- added to my quick toolbar. Great to be able to intermix metric and SAE units, as there is (very unfortunately) constant use of both in my field of work.

Any hope of an Android app version soon?

53
ASipos 15 hours ago 0 replies      
I tried e ^ (i * pi)
54
anigbrowl 14 hours ago 1 reply      
Nice, but

 sin (30 rad)
 sin 30 rad
 Unknown identifier: sin
Wouldn't it be great if software evaluated things, and if things didn't make sense, considered what the possible alternatives were, a bit like semantic checkers for spelling and grammar in text?

55
linker3000 17 hours ago 1 reply      
Fails at the first fundamental test:

 Meaning of life
 Meaning of life
 Unknown identifier: Meaning

56
matt4077 17 hours ago 2 replies      
It unfortunately fails my go-to test for these calculators:

 7.8L/100km -> miles/gallon
 7.8 L / 100 km -> mi / gal
 Conversion error: Cannot convert unit L/km (base units: m²) to unit mi/gal (base units: m⁻²)

23
Show HN: Ecola touch screen tree editor with hierarchical zoom github.com
23 points by hcs  12 hours ago   9 comments top 3
1
based2 2 hours ago 1 reply      
Where is the license?
2
zubairq 4 hours ago 2 replies      
It doesn't let me edit text on iPhone 6. Also, can you show an example with a large number of nodes? Then I may use it.
3
crawfordcomeaux 3 hours ago 1 reply      
I love the zooming & want to use this for category theory diagrams.
24
Text to speech in Python pythonprogramminglanguage.com
77 points by lobdev  11 hours ago   15 comments top 7
1
BugsJustFindMe 2 hours ago 1 reply      
The more accurate title for this is "Python interface to Google's cloud TTS service".

Which means it won't work offline or when Google inevitably shuts down the API.
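For offline use, one option is the third-party pyttsx3 package (a Python 3 friendly fork of pyttsx), which drives the operating system's own speech engines locally; a minimal sketch:

  # pip install pyttsx3
  import pyttsx3

  engine = pyttsx3.init()
  engine.setProperty("rate", 150)  # words per minute
  engine.say("Hello from an offline engine")
  engine.runAndWait()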

2
d0mine 9 hours ago 0 replies      
In Pythonista for iOS:

 import speech
 speech.say('Hola mundo', 'es_ES')
In reverse, to record a sound for three seconds:

 import sound
 r = sound.Recorder('audio.m4a')
 r.record(3)  # seconds
To recognize it as text:

 text = speech.recognize('audio.m4a', 'en')[0][0] # sent to Apple servers

3
haikuginger 10 hours ago 3 replies      
sudo pip install? Really?

You almost never want to use the system Python. Use pyenv[0] to install the version of Python you want, and then create a new virtual environment for each project.

[0]https://github.com/pyenv/pyenv

4
acbart 8 hours ago 0 replies      
So what is the state of the art in turning text into speech files in Python? Especially in terms of free options? Everything I find on google seems to be a year or two out of date.
5
grw_ 10 hours ago 1 reply      
BadPrototypeError: Objective-C expects 1 arguments, Python argument has 3 arguments for <unbound selector setProperty of NSSpeechDriver at 0x10d9e4e00>
6
revicon 5 hours ago 0 replies      
Anyone have an opinion on the best option these days to do the reverse (speech to text)? Building an open source version of Alexa sounds like a fun project.
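One commonly suggested starting point is the third-party SpeechRecognition package; a sketch transcribing a WAV file (the filename is hypothetical; recognize_google sends audio to Google's free web API, while recognize_sphinx runs offline via CMU Sphinx):

  # pip install SpeechRecognition
  import speech_recognition as sr

  r = sr.Recognizer()
  with sr.AudioFile("audio.wav") as source:  # hypothetical input file
      audio = r.record(source)

  print(r.recognize_google(audio))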
7
jonbarker 9 hours ago 0 replies      
Try virtualenv instead of sudo pip install. Then you can keep track of your requirements.txt, which is helpful for teamwork on multiple machines.
25
Windows 10 Without the Cruft: Windows 10 LTSB (Long Term Servicing Branch) howtogeek.com
182 points by walterbell  17 hours ago   147 comments top 17
1
alkonaut 14 hours ago 8 replies      
I'm on LTSB for my business laptop and it stinks. I'd rather have random issues with a few upgrades than be stuck without new features and get almost no bug fixes. New features I get on my home machine are missing on my work laptop. New DPI handling improvements coming? Nope, not in LTSB.

It appears to get security fixes only - few bug fixes. For example, the LTSB start menu is completely broken - there is no search on it and it takes 3-4 seconds to show. Presumably related to the lack of Cortana, but who knows. In any case, it's been a widely reported bug in LTSB for a long time and it's a pretty fundamental feature in Windows.

I could see the point in being on the LTSB branch of windows 7 because that OS is done. But windows 10 isn't nearly finished yet and very rough around the edges. Being on LTSB of such an old Win10 release is like being on the release day version of a AAA game while everyone else runs the fixed version that came out 3 months later.

2
sjellis 20 minutes ago 0 replies      
I'm both amused and angry that Windows 10 adopted the same model as the big commercial Linux distributions: a fast-cycle OS that upgrades every 6 months (but might have bugs), and a slow-moving version that they support for a long time but won't suddenly change behavior or stop working with your hardware.

It is a proven model in the Linux world. The unpleasant thing is that Microsoft shunted millions of users on to the equivalent of Fedora or non-LTS Ubuntu without bothering to explain this.

3
Filter 12 hours ago 3 replies      
Two changes made a big difference for me in using Windows 10:

1. Turning off Cortana using these instructions: https://www.howtogeek.com/265027/how-to-disable-cortana-in-w...

2. Turning off most of the visual effects under "Adjust the appearance and performance of Windows." (I left "Smooth edges of screen fonts" and "Enable Peek" checked)

The combination of these two change feels like a whole new computer.

4
MadSax 15 hours ago 2 replies      
The only version of Windows 10 which prosumers want, they can't have.
5
Pica_soO 11 hours ago 0 replies      
There is a whole eco-system of tech-bloggers and tool-smiths whose whole working life consists of ripping out the "improvements" Microsoft has shoveled towards the users - riding binary rodeo with every update.

My assumption is that this eco-system will rather soon bring Microsoft's clumsy attempt to become Google - one stack-layer closer to the user - to a horrific end.

6
Animats 14 hours ago 0 replies      
Now that's useful. I have a Windows 7 machine for the few things that don't run on Linux. Windows 10 LTSB would be great for that.

I wonder if I can get Windows 10 LTSB preloaded by Central Computer, the retailer. They installed Windows 7 with no bloatware for me. (I asked for that, and the invoice actually reads "no bloatware")

7
thomasfedb 8 hours ago 0 replies      
I'm a full time Fedora user. I run LTSB in the VM that I run Office 365 in and it's perfect. No fuss, no bells, gets out of my way and just works.
8
andreasen 4 hours ago 1 reply      
This seems to install a pretty 'good' Win10 version: https://www.microsoft.com/en-us/software-download/windows10s...

I had a Lenovo S500 desktop that I just reinstalled. Non-SSD. I was getting a '100% disk usage' issue right off the bat. I installed 'windows10startfresh' and now it's pretty sweet - the PC is not slow, and such. The 'windows10startfresh' installs Windows without all the Lenovo crud, apparently.

9
itsoggy 13 hours ago 0 replies      
I have deployed LTSB in a few schools I look after, it's far easier because of the minimal feature set especially in smaller environments where full enterprise management tools like System Centre and WSUS are not installed.

Go old school and customise the default profile in the build for best results!

The best part is that if you have an education volume license you get both EDU and Ent LTSB editions!

10
unicornporn 13 hours ago 3 replies      
I was about to switch to this about a month ago but realized WSL does not work with LTSB. So no BASH for Windows. It will probably be more than a year too...
11
na85 14 hours ago 2 replies      
>Windows 10 LTSB omits a lot of the new stuff in Windows 10. It doesnt come with the Windows Store, Cortana, or Microsoft Edge browser. It also omits other Microsoft apps like Calendar, Camera, Clock, Mail, Money, Music, News, OneNote, Sports, and Weather.

>In fact, the default Start menu on Windows 10 LTSB doesnt even include a single tile. You wont find any of those new Windows 10 apps installed, aside from the Settings app.

That sounds fantastic. I use absolutely none of those "features" and would love to be able to remove them from my copy of windows 10.

12
Crontab 11 hours ago 1 reply      
Microsoft should make this available for everyone. It sounds perfect.
13
TazeTSchnitzel 14 hours ago 0 replies      
I noticed that when I had Windows 8.1 installed on my PC earlier this year, the version string mentioned LTSB. Does 8.1 use the kernel of 10's LTSB?
14
intopieces 14 hours ago 1 reply      
I would be okay with a pay-as-you-go Windows 10 Enterprise subscription for this LTSB. My use of Windows is very sporadic. If I could log in and pay $7 for the next 30 days of official access, then let it expire till I need it again, I would be giving MSFT more money than I do now and would consider it a fair exchange for ongoing security updates.
15
mschuster91 14 hours ago 6 replies      
> And Windows 10 Enterprise is only available to an organization with a volume licensing agreement, or through a new $7 per month subscription program

Given that it's basically free to register as a commercial entity (in Germany it costs €40-50, and iirc a British LLC can be formed for less), can one do so, and then apply for said subscription program?

edit: are offerings like https://www.lizengo.de/microsoft/windows-10-enterprise actually legit, and can these be used to activate a LTSB installation?

16
fsiefken 13 hours ago 2 replies      
I also want the Classic Theme back, Classic like in NT4 classic, with the old school minimize, maximize and close buttons.
17
mqatrombone 13 hours ago 1 reply      
LTSB = Embedded
26
Latency of Scaleway, the cloud provider offering multicore ARM servers philipmw.github.io
29 points by pmw  4 hours ago   21 comments top 9
1
deadlyllama 3 hours ago 1 reply      
To anyone outside the USA, this is old hat. Yes, latency matters for interactive traffic. Yes, the Atlantic (and Pacific) oceans are big and light ain't getting any faster.
2
posnet 3 hours ago 0 replies      
This has nothing to do with Scaleway, the latency results would be the same no matter the provider.
3
fierarul 53 minutes ago 0 replies      
It's interesting people are worried about this with Scaleway because I don't remember reading this with Slicehost or Amazon EC2 while they were ramping up.

So, is the bar so much higher or is it something else?

I used Slicehost and EC2 from Europe with total disregard for latency because I never had much users. For my (mostly internal) servers it was fast enough.

And even now, I have the cheapest Scaleway machine with a public-facing website - a small Angular4 + Java backend app - that seems to be running just fine.

I would also like to see a graph showing the latencies between all the AWS regions. Which I guess will show that AWS regions do have a logic and that having servers next to your users makes sense.

Still, why worry about this from the start when your monthly 'budget' is less than the price of your breakfast coffee and you get unmetered bandwidth?

4
trelliscoded 2 hours ago 1 reply      
I just use Cloudflare's free plan with my Scaleway servers, so Scaleway could be hosting from low earth orbit for all I care.
5
bowmessage 2 hours ago 0 replies      
Why not measure application response times on the instance instead of at home? It would have been more interesting to see how $my_web_app runs on some t2.micros vs some ARM boxes.
6
Ecco 1 hour ago 1 reply      
What's weird is that the distance from SF to Paris is 9000 km, which is 30 light-ms. So the theoretical minimal latency would be 30 ms.

How come we're 5 times above that?

Is the latency introduced by routers? If yes, then that doesn't quite make sense: I doubt there are any in the middle of the Atlantic.

Is the routing so inefficient that the data travels 50000 km?
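A back-of-envelope check (assuming a fibre refractive index of roughly 1.5, and remembering that a ping measures a round trip) closes most of the gap:

  C_KM_S = 299792.0     # speed of light in vacuum, km/s
  distance_km = 9000.0  # SF -> Paris, as above

  one_way_vacuum_ms = distance_km / C_KM_S * 1000
  rtt_fibre_ms = one_way_vacuum_ms * 1.5 * 2  # slower in glass, there and back
  print(round(one_way_vacuum_ms), round(rtt_fibre_ms))  # 30, 90

That puts the fibre round trip near 90 ms before any indirect cable paths or router queueing, so the measured figure is less mysterious than it first looks.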

7
jjeaff 1 hour ago 0 replies      
I use scaleway for some of my side projects. The latency is noticeable but with a little bit of cache adjustments and a free cloudflare plan in front of it, it is fine.

I took about 5 sites from a $50 a month shared cPanel plan that included a few WordPress blogs and some custom sites and put them on a $3 a month scaleway instance and haven't had a bit of trouble.

8
spullara 3 hours ago 1 reply      
This is really obvious. You should lose karma for up voting it.
9
pasbesoin 3 hours ago 1 reply      
So, where on the North American continent are people going for a small (ARM or otherwise), dedicated server that doesn't break the bank?

A few years ago, dedicated Raspberry Pi hosting was a bit of a thing for a while.

I looked a bit, a few months ago, but I didn't turn up what appeared to be a clear winner.

27
Google Will Stop Reading Your Emails for Gmail Ads bloomberg.com
858 points by ahiknsr  1 day ago   421 comments top 60
1
jikeo 1 day ago 3 replies      
I know I shouldn't be surprised, but it seems weird that neither the reporter nor any of the 140+ comments so far mention the recently published proposal for a new ePrivacy directive in the EU that will make it a lot harder for Google to scan e-mails in the first place.

https://www.reuters.com/article/us-eu-privacy-idUSKBN14U1FL
https://www.theguardian.com/technology/2017/jan/10/whatsapp-...
http://www.kemplittle.com/site/articles/kl_bytes/the-draft-e...

2
skrause 1 day ago 12 replies      
I never understood the argument that some automatic scanning for keywords is like "reading" your mail. By that same logic isn't Gmail's spam filter still "reading" your mail? It is classifying your mail based on content after all...
3
vaishaksuresh 1 day ago 11 replies      
I very recently switched to fastmail and couldn't be happier. For $90 a year, I don't have to deal with people snooping and tracking me around for ads. I know google is trying to give me value with all their facial recognition and recommendations, but I don't think it is going to end well. When it does end badly, it will be too late for the user because we would've given up all the data. I don't want Google to build models to track my toddler's face when he isn't even capable of consenting to such tracking.
4
myrandomcomment 1 day ago 1 reply      
So this never really bugged me. It is a damn good free service. I also love how it picks up on plane tickets, hotel reservations, etc and puts them in the calendar. Makes life simpler.

If ad companies fix something, please fix the "I searched for something and bought it, but I get ads for it for the next 4 weeks" problem. That bugs me.

5
kentosi 1 day ago 2 replies      
I don't understand why there isn't an option for me to pay for Google to remove ads from gmail.

I've already paid for Youtube Red and couldn't be happier.

6
newscracker 1 day ago 0 replies      
<rant>Slightly off topic: I was very annoyed that this article didn't provide any links to Google's official statement/declaration about this change and when it's coming. Even if Bloomberg interviewed Diane Greene for this article or asked questions and got official statements, it could've still provided an official link for the change.</rant>

I found the link [1] here on HN.

[1]: https://blog.google/products/gmail/g-suite-gains-traction-in...

7
jmull 1 day ago 11 replies      
I should probably get fitted for a tinfoil hat because my immediate reaction was, "Oh shit! They must've developed something now that tracks you better and is less obvious."
8
WisNorCan 1 day ago 2 replies      
This most likely is a pragmatic financial decision. Contextual advertising is generating a lower CPM than data/person based advertising.

Said differently, the relevance that can be extracted from your specific email is less than the cumulative knowledge that Google has about you from other sources.

9
drusepth 1 day ago 1 reply      
Will there still be a way to opt-in?

This seems like one of those decisions that is a net negative for functionality in favor of quelling some misguided privacy concerns. Hopefully this doesn't lessen the quality of ads by much.

10
neves 16 hours ago 0 replies      
It basically means that they already track so much information about you that they don't need to monitor the contents of your email anymore.

Now Google knows:

- all your searches (your interests)
- a great percentage of the pages you visit (ads and analytics)
- all your contacts and how frequently you connect with them (metadata in Gmail)
- the places you visit (geolocation in Android)
- Google logins (the sites that interest you the most)

Your email contents are completely unnecessary.

11
VMG 1 day ago 1 reply      
Hope this increases the chance of E2E encryption
12
robbart90 1 day ago 0 replies      
"The decision didnt come from Googles ad team, but from its cloud unit, which is angling to sign up more corporate customers."

Interesting timing with the story earlier this week about Wal-Mart telling vendors to stop using AWS

13
cm2012 1 day ago 4 replies      
That sucks for me as an advertiser. Gmail ads were great for B2B marketing.
14
sidcool 1 day ago 0 replies      
Google will still show ads in Gmail, just not based on email contents.
15
kolemcrae 1 day ago 1 reply      
Interesting - as an advertiser I generally recommend using the email address "sent from" as your main source of targeting but combining that with specific keywords within the gmail campaign can be super helpful at finding people at the exact right time.
16
Lambent 14 hours ago 0 replies      
To be clear, Google is not saying they'll stop reading your emails, only that your emails' contents won't be used to generate ad content.
17
hilyen 1 day ago 1 reply      
Good for them. Though it goes without saying email is insecure. If enterprise clients dislike an algorithm scanning their email contents, maybe they should also consider that email generally has unencrypted transit and storage.
18
TheChosen 14 hours ago 0 replies      
Quite obviously Google feels their profiling of users from other sources is good enough that they can afford to throw them a bone - not to mention save themselves a bit of effort since they will no longer need to maintain two systems for GMail.
19
aaln 1 day ago 0 replies      
Title should be: Google Will Stop Using Your Email Data for Gmail Ads.
20
Manager 14 hours ago 0 replies      
Well this is an interesting turn of events.

I wasn't expecting this from a company that makes most of its revenue through advertising. Sets a cautiously positive precedent.

21
geekme 1 day ago 1 reply      
I will never use an enterprise Google product unless they have customer support. The customer support team should have humans and not robots.
22
chenster 18 hours ago 0 replies      
It never really bothered me, because Apple Mail does not pull anything but the actual email message from Gmail, so I never see the ads. Still, I would like G to stop scanning anything personal. Period.
23
prirun 1 day ago 0 replies      
My guess is that with Google Drive, they are getting way more information on individuals and companies than they ever got via email.
24
davb 1 day ago 0 replies      
But will they stop using Hangouts IM content for customer segmentation? I probably say much more relevant things in Hangouts (from a marketing perspective) than in email.
25
ziikutv 19 hours ago 0 replies      
Some of my emails are "read" (as in shown as I have read them) when I have literally just received them. Is this referring to the same thing?
26
Overtonwindow 1 day ago 1 reply      
I think I've become jaded because I just don't believe it. It's like Google Home. A wonderful device but when I heard it may start listening to everything I say... I just figured yeah, that should be expected. So now I just expect Google to read, listen, and analyze everything I do with their products.
27
jshelly 1 day ago 1 reply      
And I am in the middle of migrating to icloud from google. Not going to stop at this point.
28
mozzarella 1 day ago 0 replies      
I wonder what the conversion rates on these within-client ads are. I know I've never opened any no matter how well-targeted or 'interesting' the ads were, because the immediate response is to just want to sweep the inbox clean.
29
m-p-3 1 day ago 0 replies      
But they'll most likely read them for another purpose, for example training their AI.
30
rzr 17 hours ago 0 replies      
Title is misleading, it should be G will keep reading your emails except for G Ads.
31
chaitime 1 day ago 0 replies      
What happens to the data that was already collected? Legally they can still use that data, right?
32
nerdiiee 1 day ago 0 replies      
How many of you have successfully prevented Google and Facebook from tracking your web habits? What steps do you take to prevent the tracking?
33
megamindbrian 1 day ago 0 replies      
Their AI knows what you are thinking anyways. The singularity is here!!
34
daveheq 1 day ago 0 replies      
Actually I'd prefer they did, so they can optimize their cash flow and my targeted ads, and I can continue using their free product and all its nice features without any compromise.
35
MarkMc 1 day ago 1 reply      
OK but will Google stop reading my email for other types of ads?

For example, if I write "I love coffee" in an email, am I more likely to see a Starbucks ad when I watch a YouTube video?

36
tkubacki 1 day ago 0 replies      
Is a free Outlook account scanning my email for building an ads profile, or for any other purpose?
37
RichLewis007 1 day ago 2 replies      
It's about time! I wonder if this is due to the fact that if email encryption becomes common, the content will be inaccessible to Google anyway.
38
LeoNatan25 20 hours ago 0 replies      
Almost as if they want to do no evil. Almost.
39
ForFreedom 1 day ago 1 reply      
So what method have they adopted to display adverts in emails?
40
nachtigall 1 day ago 0 replies      
Seriously, there's private, ad-free mail for $1 a month: https://posteo.de/en/ or https://mailbox.org/en/
41
hkmurakami 1 day ago 0 replies      
Won't people keep assuming gmail reads their emails at this point though?
42
mavhc 1 day ago 0 replies      
Wonder if it's also a move that will help encrypted email become supported
43
andrepd 1 day ago 7 replies      
How do you know? That's the problem with closed source software, and software that runs in someone else's computer. You have no idea what it does. You aren't in control. Someone else is deciding what code runs on your computer. That's a problem.
44
ChuckMcM 1 day ago 0 replies      
It was interesting (but I suppose a random bit) that after reading this article a new email showed up in my gmail inbox that was spam. I wonder if this isn't the first move in a plan to create a 'pay' gmail service for individuals.
45
rasz 1 day ago 0 replies      
Will they also stop tracking clicked links in gmail?
46
mrmondo 1 day ago 0 replies      
Too little, too late IMO
47
Markoff 19 hours ago 0 replies      
Thank you EU.
48
itiman 1 day ago 1 reply      
The fact of reading emails for ads is disputable even if it's used for a new purpose, YT.
49
Kluny 1 day ago 0 replies      
Gosh, thanks.
50
redthrowaway 1 day ago 0 replies      
I suspect that means they no longer need to, and have better ways of targeting ads at you.
51
nether 1 day ago 0 replies      
Switch to Protonmail.
52
fernyherrera 1 day ago 0 replies      
word
53
mtgx 1 day ago 2 replies      
So much for the argument "how else is Google going to make money if it isn't reading your emails?!"

Companies can make money without tracking you 24/7 and reading all of your private content. They just choose not to, because it's easier, and then spread the propaganda that those things are "needed" to stay in business.

54
rootsudo 1 day ago 0 replies      
How nice of Google.
55
flavor8 1 day ago 0 replies      
Fine. Can they please make Google Apps users first class users in the google ecosystem?
56
funnyfacts365 23 hours ago 0 replies      
Oh, the doublespeak... They will read your emails for everything else, like Google Now, just not to show you ads. ahahahahahahahah
57
ethanpil 1 day ago 2 replies      
Does that mean that Google now has no more use for Gmail, and soon millions of people will be scrambling to cover yet another product sunset?

That would certainly cause an enormous loss of goodwill, but.... imagine this scenario:

Google has some slow growth quarters, they need to keep the numbers up for shareholders. They start to examine what they can squeeze. Gmail costs them X (hundreds?) millions per year, but doesn't gain much from it...

Certainly it's unlikely, as it is also an SSO tool, etc. Still....

58
patkai 1 day ago 1 reply      
Google will stop reading my emails for Gmail ads because I will stop using Gmail. It's kind of a sad story, because they are the good guys, but once they built a huge company they started to focus on maintaining it, possibly at any cost. This is a cautionary tale: power corrupts. You, me, everyone. And yes, the web will produce dictators we never imagined possible, because the Internet is so powerful it will enable them.
59
siliconc0w 1 day ago 4 replies      
I think there is some interesting middle ground where you could use machine learning to go from 'show only relevant ads' to 'show only ads you might actually click on with greater than .001 probability'. I guess this is like 'extreme' outlier detection, but it'd be interesting to see what revenue the ads at the 'long tail' of likelihood generate anyway. My guess is the bulk of it is from the standard high-CPC stuff like mortgages and insurance. Google says it does this but I'm not too sure - I've never intentionally clicked an AdWords ad and yet they're still shown to me.

Anyway this may solve the tragedy of the commons situation we're in now and allow us to move away from the technology war of ad blockers, ad blocker blockers, ad blocker blocker blockers, etc.

edit: removed comment on clarity of Gsuite vs Free due to downvote brigade.

60
4684499 1 day ago 0 replies      
I'm fine with letting them scan my emails for the spam filter, yet they use it as an excuse to justify their data collection and other things they do with my data, which is unacceptable for me. How would I know if they are going to do things against my interests one day? Ads targeting is already against mine.

I used to see Google as a stalker. How naive I was. There are billions of people being stalked, exploited, and the whole process is automated. It comes to me that users are not victims of stalking, they are lab rats.

Now Google stopped reading emails for ads, but they'll still read for other purposes, which still makes me as a user feel insecure. I value my privacy, I have my dignity, I shouldn't be a lab rat that they can just observe however they want only because they provide free cheese.

Even if I start using a paid account to stop them from reading my emails (assume paid account with better privacy protection is possible), I couldn't stop them from reading others'. Stopping the data collection of one user won't change the situation, they still have other lab rats' data they could collect and analyze, which enables them to learn or predict other rats' behaviors.

The worst thing is, Google is not the only company doing this right now. Surveillance technologies are developing, it's like every data company has grown their teeth and become more thirsty for blood.

Edit: words.

28
New Seafloor Map Reveals How Strange the Gulf of Mexico Is nationalgeographic.com
27 points by kevitivity  5 hours ago   3 comments top 2
1
rhcom2 16 minutes ago 0 replies      
Should make a cool color 3d print.
2
webnrrd2k 4 hours ago 1 reply      
Is it just me, or does it seem kind of weird that there is a link to another article about how the blasts of compressed air used to make these maps kill a great many creatures?

http://news.nationalgeographic.com/2017/06/seismic-survey-ai...

29
Cooling the tube Engineering heat out of the Underground ianvisits.co.uk
326 points by mzehrer  1 day ago   143 comments top 30
1
kator 23 hours ago 1 reply      
Subways in NYC are not fun in the summer either. I always assumed it was because when they were designed they didn't consider a future state where air conditioners on the trains dump their heat into the tunnel.

I tried searching for a similar study for NYC but all I found was old articles from years back.

It doesn't look like the MTA shares any measurements of temperature in their data feeds: http://web.mta.info/developers/download.html

Does anyone have ideas on how we could get this sort of data for NYC subways?

2
Jyaif 22 hours ago 2 replies      
The Montreal subway system has a very clever way of somewhat saving energy (and emitting less heat): the section of the track at each station stop is higher than the rest of the track. This means that the train's kinetic energy is converted to/from potential energy whenever the train arrives at/leaves the station stops.
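A rough sketch of the physics, with illustrative hump heights: equating kinetic and potential energy, a rise of h metres absorbs a speed of sqrt(2*g*h), so a few metres of climb soak up a meaningful chunk of arrival speed:

  import math

  g = 9.81                    # m/s^2
  for h in (2.0, 4.0, 6.0):   # hypothetical hump heights, metres
      dv = math.sqrt(2 * g * h)
      print(h, "m ->", round(dv * 3.6, 1), "km/h absorbed")
  # 2.0 m -> 22.5 km/h, 4.0 m -> 31.9 km/h, 6.0 m -> 39.0 km/h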
3
Animats 16 hours ago 0 replies      
Long-term heat buildup was known in the design stage to be a problem for Eurotunnel.[1] Huge chilled water plants were built to prevent that from happening. It's a surprise, though, that it would be a problem for the London Tube, which has so many connections to the surface.

[1] http://www.nytimes.com/1991/05/01/business/business-technolo...

4
franciscop 18 hours ago 1 reply      
The Tokyo (and Japan in general) underground/subway is actually quite cool and pleasant in the hot summer. How do they do it? It might be interesting to learn from them - a good question for the London Underground engineers.
5
f_allwein 1 day ago 1 reply      
6
f_allwein 1 day ago 3 replies      
> offered a prize of 100,000 to anyone who could come up with fresh ideas

Too late now, but I wonder if they considered district cooling, where e.g. cold water from rivers is used as an alternative to air conditioning. Seems to be used successfully in my hometown of Munich: https://www.swm.de/english/m-fernwaerme/m-fernkaelte.html

7
raverbashing 1 day ago 2 replies      
How much would it cost to add regenerative braking to the cars?

And instead of ventilation shafts you would probably need active heat pumps

8
dredmorbius 22 minutes ago 0 replies      
The general strategy would be a) introduce less heat and b) extract more. The first might be accomplished through greater efficiencies, though that's limited.

If a major heat component is braking, then locating the additional cooling capacity where braking is heaviest (presumably on inbound station approaches) might offer advantages -- at the very least this reduces the total treated area for maximum effect.

Given the possiblity of ground-based thermal banking, and the long-term nature of the issue, if any amount of coolant could be circulated through the thermally-affected clay, and made available for seasonal heating needs elsewhere in the city, that might be a net win.

I'm familiar with geothermal energy projects elsewhere (borehole projects in Australia, the Habanero project) where the problem is actually inverted: thermal extraction cools the strata around a borehole, over the course of about 40-50 years, to the point that no further useful heat can be extracted.

The thought also occurs that the steel rails themselves are thermally conductive and might be made a part of the cooling system. Not a tremendous radiative surface, but a long conductive length. They are poorly placed for heat extraction, though - low within the tunnel rather than high.

9
tomohawk 18 hours ago 0 replies      
I wonder if using a different means of regenerative braking would work, such as a hydraulic hybrid.

https://en.wikipedia.org/wiki/Hydraulic_hybrid_vehicle

Much more of the power would be preserved, leading to less heat.

10
richardjennings 15 hours ago 1 reply      
Is it feasible to increase the distance into surrounding clay that heat can be conducted? Could metal be used to conduct heat from clay surrounding tunnels to clay that is presumably cooler, further away? I could imagine that if a material exists that insulates electricity but conducts heat very well, the track itself might be useful in transferring heat to cooler clay, making the track colder and cooling tunnels.
11
GoodAdmiral 14 hours ago 0 replies      
It's quite incredible that the clay soil is still absorbing heat from the introduction of the tunnels. I heard, possibly wrongly, that the clay dries out too and ends up insulating the tube lines over decades.

Regenerative braking sounds like the quickest and cheapest way to address the problems - not that any change would be 'quick' or 'cheap'.

12
marze 17 hours ago 1 reply      
The heat from the trains etc. heats the surrounding clay over a period of years. You can add less heat, or you can remove heat from the surrounding clay, or both.

Removing:

Run ventilation on high during winter and keep temps quite low in the tunnels.

Run ventilation on high on cooler nights.

Install cooling tubes in the surrounding clay and cool it directly. Either from above, or from the tunnel itself, use a ground-source heat pump (geothermal heat pump) to pull heat from the clay. These can be powered by the cheapest available power, likely solar on sunny days in the future.

Adding less:

Upgrade motors to highest efficiency available. This could halve the waste heat from the motors.

Regenerative braking: if it is too complex to put the power back on the grid, build large "electric kettles" and dump it into a vat of water with resistance heaters. The water vat could be part of a water main so it would be constantly refreshed, and result in slightly warmer water for water users.

Instead of ice in the cars, cool brakes and motors that exceed 100C with water, by boiling the water. This absorbs terrific amounts of heat per kg water.

13
Dwolb 16 hours ago 2 replies      
What about a mechanical sling shot system at each sub station?

As the train arrives it gets slowed by a spring or similar system, which is then used to propel the train forward once it needs to depart.

Also, what about just slower trains? The heat produced while accelerating or braking is probably not exactly linear with the speed of the train.

14
avar 22 hours ago 1 reply      
I don't understand why lack of space above ground is a hindrance to building new ventilation shafts. Surely these aren't going to be wider than a sidewalk, and in central London the distance between any two roads on a block is rarely more than 50-100 meters.

You'd end up with lots of ventilation grates on the sidewalks on the surface, but that seems like an easy and space efficient solution.

15
amoorthy 20 hours ago 3 replies      
Interesting article. However I can't recall tube stations being much warmer than outside temperatures in winter (on non-windy days). If that's right then how is the heat dissipated better in winter?

I don't live in London so anyone with regular riding experience please correct me if I'm wrong.

16
jpalomaki 15 hours ago 0 replies      
"Over 47 million litres water are pumped from the Tube each day" [1]

Use the heat from air in tube to warm the water that is anyways getting pumped up. (Too lazy to do the math to see if this would make any difference)

[1] http://www.telegraph.co.uk/travel/destinations/europe/united...
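Doing the deferred math with that figure (and assuming 1 litre of water is about 1 kg):

  litres_per_day = 47e6             # the quoted pumping figure
  kg_per_s = litres_per_day / 86400
  c_water = 4186                    # J/(kg*K)

  watts_per_kelvin = kg_per_s * c_water
  print(round(watts_per_kelvin / 1e6, 2), "MW per K of rise")  # 2.28

So every kelvin of temperature rise given to the pumped water would carry away roughly 2.3 MW, which suggests the idea is at least not obviously negligible.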

17
ricw 20 hours ago 2 replies      
I never got why the heat in the tube was not being used as a heat source. You could extract the heat and supply it to surrounding buildings at a cost, thereby cooling the tube. The tech is readily available. It would be a win-win situation. Plus it'd be very environmentally friendly.
18
em3rgent0rdr 13 hours ago 1 reply      
The surrounding earth acts as a temperature storage buffer, so without ventilation, it will be extremely difficult to remove heat. Instead of the more difficult problem of trying to remove heat, I would rather focus on simply reducing the total heat produced in the lower levels. That can be done by having the lower trains be only for express, so they aren't accelerating and decelerating as much (which is when the heat is produced), with fewer trains in general, and then save the tracks closer to the surface focus for the more heat-producing non-express trains and for the more frequent trains.

On a related subject, the temperature of any cave will remain almost constant at the location's average annual surface temperature. So another option would be to focus on not producing so much heat in the entire city in the first place, to lower the average temperature.

19
l5870uoo9y 22 hours ago 1 reply      
Is the temperature still rising or has it plateaued?
20
ziikutv 21 hours ago 1 reply      
Silly question: is the heat of such low grade that it cannot be used for something other than releasing it above ground?
21
memracom 18 hours ago 3 replies      
Here is an idea that I sent to the Underground in 2006 when they solicited suggestions from the public.

Add cool to the tunnels, rather than taking heat out.

Build liquid air plants above ground, 2 or 3 floors up in the air so that the heat of the pumps is released above street level and the noise can be kept away from the street. Feed the liquid air into the tube tunnels through insulated pipes, which take up far less volume than air vents. Let gravity bring the liquid air down the pipes. Release the liquid into the tunnels near platforms where the air pump effect of moving trains causes lots of air circulation. Also the car doors open on the platforms.

Since you are liquefying the air, not just the oxygen, it can be safely released anywhere in the tunnels. And if your air intakes are high up you will actually be improving the air quality in the tunnels as well, i.e. cleaner air flows in.

22
zkms 54 minutes ago 0 replies      
> Most of the tube tunnels have above ground sections, so a hybrid idea is to use air conditioning in the trains when above ground, and while above ground to cool a block of phase change media, or water to you and me, into an ice pack. When underground, the heat that would be dumped in the tunnels is absorbed by the ice-pack until it has returned to water.

> Whether this can be viable is still being looked at, bearing in mind that they already struggle to fit air conditioning units into tube trains, finding space for the ice blocks is going to be even more of a headache. And not to forget that the extra weight means more energy needed to drive the trains, driving up running costs.

This can be worked around: do the chilling on the wayside, not on the train! At each station, run chillers that can reject waste heat on the ground -- and chill a nontoxic liquid glycol/water mixture to -40 C. Commercial equipment exists to do this already. Have air/liquid heat exchangers on each EMU, along with glycol storage tanks, a pump, and sensors to keep track of the temperature/volume of the glycol in each tank. On the roof of the EMU, install large-diameter quick-mating liquid connectors, along with fiducial marks. At each station, wayside equipment uses computer vision to locate the fiducials on the EMU, mates with the connectors, does a pressure test to verify the integrity of the connection (squirting glycol is a no-no), pumps out all the warmed glycol (and replaces it with cold stuff), and disconnects. This can be done during the dwell time if the connectors and refill tubing are of large diameter.

Cooling loads are on the order of 50 kilowatts per EMU and inter-station times are on the order of 10 minutes, which means 30MJ per EMU. The ending temperature of the glycol will be on the order of 10C (you need a temperature differential to ensure heat flows from the glycol to the air), its initial temperature will be -40C -- a temperature difference of 50K. Glycol/water mixtures have a specific heat of around 3.2kJ/(kg * K), so we have:

30 MJ = 3.2 kJ/(kg·K) × mass × 50 K

leading to a mass on the order of 200 kg, which is quite tolerable for a rail vehicle. The tanks for the glycol can be spread around the car and can be arbitrarily shaped (as long as fluid can be circulated and offloaded) to deal with other constraints. There's no phase changes involved, which makes the heat exchange work non-annoying; there's just liquid glycol and air. EMUs don't need to haul around an air-cooled chiller, all the equipment on the EMU is reliable, does not consume much electricity, and is extremely tolerant of vibration and the harsh environmental conditions aboard a rail vehicle.

If you want to reduce mass further and are willing to accept some more complexity, it might be sensical to use a small chiller on each EMU that uses the glycol for heat-rejection. What does this give you? It means that you can still generate a constant chilled water temperature of 10C, but let the end temperature of the glycol go above 10C -- and more temperature range on the heat storage fluid means more heat energy can be dumped into it. When the glycol temperature gets above 10C, turn on the heat pump to create chilled water at 10C, and reject heat into the glycol (stop before it boils). Liquid/refrigerant heat exchangers are much smaller than air/refrigerant exchangers, so if your compressor isn't obscenely heavy, you can likely save some weight. If you can use the glycol from 10C (when heat won't passively flow from the glycol to the air) to 60C (a reasonable condenser temperature) -- that's another 50K worth of temperature difference, which means our 200kg load of glycol can be cut in half.
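A quick check of the arithmetic above, mass = E / (c * dT):

  energy_j = 30e6   # 50 kW for ~10 minutes
  c_glycol = 3200   # J/(kg*K), glycol/water mixture
  d_t = 50          # K, from -40 C to ~10 C

  print(energy_j / (c_glycol * d_t), "kg")  # 187.5, i.e. "on the order of 200 kg"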

23
mixedmath 20 hours ago 1 reply      
I wonder, how long would they need to shut down the tube before temperatures lowered?
24
sambull 15 hours ago 1 reply      
Long-term heat build-up because of human activities. It may not be because of climate change, but this is how humans change climates.
25
Tharkun 20 hours ago 2 replies      
How much heat are we talking about? I'm guessing it wouldn't be enough to use for district heating, the way some industrial waste heat is converted to hot water for homes?
26
sjg007 13 hours ago 1 reply      
I'd look at how ants or termites cool their nests for inspiration.
27
Roritharr 1 day ago 3 replies      
What is so "experimental" about the air coolers in the picture? They look like normal A/Cs
28
toyg 1 day ago 0 replies      
>The future of the cooling the tube project will be judged not so much by how they cool the hot tunnels, but by how they stop tunnels becoming hot in the first place.

That's a very good metaphor for our planet.

29
iamflimflam1 1 day ago 4 replies      
More information available from the article's source: https://www.ianvisits.co.uk/blog/2017/06/10/cooling-the-tube...
30
eecc 1 day ago 3 replies      
For one thing I'm surprised they're not using regenerative brakes. It will surely cost some of the profit, but refurbishing the trains with these would cut somewhat into that 80% of heat.
30
The CTO Journey at a Small Startup zapier.com
199 points by vinnyglennon  20 hours ago   38 comments top 10
1
dpeck 15 hours ago 3 replies      
A C-suite member still coding 80% of the time is extremely unlikely in my experience. You may still be able to steal away some time on nights and weekends to stay sharp, but day to day you're much more concerned with worrying about the future of the business, interacting with customers, and often acting as a service provider to other parts of the management team (vetting partnerships/acquisitions etc.) from a technology perspective, the same way in-house counsel and CFOs do for legal and financial matters.

What this seems to be calling CTO is more akin to a most senior engineer/fellow/hacker. I've seen it called Chief Engineer before. That's the person the CTO should be able to hold their own in a conversation with, but being that person would seem to be unlikely for an exec team member as the business grows.

*Titles are more or less meaningless unless there is internal conflict or you're interacting with someone external; ignore that bit and think in terms of roles

2
PaulRobinson 13 hours ago 1 reply      
I was a consultant for a bunch of startups for 4-5 years, selling my services as a "CTO for hire" to them, and was full-time CTO in multiple startups to boot.

I had a general line/rule of thumb. At < 6 engineers, you had to write code regularly as the team was so small it couldn't carry the weight of a member of the tech team who didn't commit regularly.

At 12+ engineers, you didn't have time to code if you wanted to do the other work (management, prioritisation, reports, strategic thinking, etc.) well enough.

At 6-12 engineers (where I spent most of my time), I didn't have time to write code, but had to in order to keep the company moving. Cue 60-100 hour weeks for 10 years. Yeah.

I went to quite a few CTO events, and in all honesty, it was a surprise to many of them that I both knew how to code and actually spent time doing it. I thought it was insane that there are CTOs - many of them - that aren't interested in the practice of creating technology at a hands-on level, but I could also understand how that happened: in my location (London), it's quite normal for non-tech CTOs to pick up from founding CTOs after a few years.

It's a weird situation to be in, and eventually a couple of years ago I decided to evaluate what I wanted and wrote a list of what I liked and didn't like about my job as a CTO.

I realised all the things I enjoyed were actually the responsibilities of a senior engineer, and all the things I didn't like were the management and board duties of being a CTO. Slept on it for a week, resigned, applied for senior roles, and generally am much happier (2 years on).

It's worth really thinking about what you want from the role. If you're a co-founder, you can shape it, but you have responsibilities to your investors, wider board, exec team, managers and developers. Most importantly, your have responsibilities to yourself.

Choose your own adventure when it comes to being a CTO, but choose wisely and carefully.

3
cynusx 12 hours ago 0 replies      
The primary goal of CTOs is to make sure that the technology serves the business. As C-suite, you are also involved in setting the company strategy and are responsible for building the technical organization to own it.

Outside of executing the core strategy you can take an opportunistic approach to:

1/ create more strategic options for the company

2/ cut costs by automating or re-engineering business processes

3/ deliver an unfair technical advantage over competitors

4/ improve reliability of service

5/ introduce more technology in the rest of the business (sales, marketing, operations, ..)

Startup CTO's tend to combine many different roles as there are more roles than people to fulfill them. Generally startup CTO's wind up also doing product management, engineering management (people, culture), recruitment, SCRUM master, IT, support, BI, architecting and programming.

What you actually wind up doing depends on the needs of the business and the talent available in the company to delegate these roles to.

4
jandrewrogers 13 hours ago 1 reply      
The split between VPE and CTO is that a VPE is operational and internal-facing, whereas a CTO is strategic and external-facing. Neither role involves coding once your company has a dozen engineers; senior individual contributors usually have a title like Chief Scientist or Distinguished Engineer.

A CTO role is essentially that of a Technical Product Manager. What distinguishes it from a traditional TPM is that, to do the role well, you also take on the aspect of being the technical "moral authority" for the company, setting the de facto engineering culture, and creating a compelling vision that goes beyond product management.

5
hota_mazi 6 hours ago 0 replies      
CTO as a startup cofounder and CTO of a large (100+) company are such different roles that they should really have different names.

If you're CTO because you co-founded the company or you joined at an early stage and you just happened to be the most senior engineer at the time, you really have no idea what being CTO of a large company is like.

6
truesy 16 hours ago 2 replies      
Seems like I read a different take on CTOs/VPEs every week or two. I've been a VPE several times, including as a cofounder. I personally think a company should never have a CTO until it can split the role into two: a CTO and a VPE working alongside each other. In that situation, the VPE manages the team, makes sure things get done, etc., while the CTO works with the other "C-staff" planning the longer (year+ out) roadmap of the company, with a focus on engineering.

I have struggled with these roles and names, similar to Bryan Helmig, the author of this post. I have consulted with some former bosses who now lead engineering teams at some of the larger companies here in the Bay Area, and have come to the conclusion that most CTOs are really VPEs just using the wrong title. But in the end, the title does not really matter, since it is extremely flexible in definition.

7
joshribakoff 13 hours ago 1 reply      
I view the CTO as the person who makes sure the 6 developers aren't using 6 different frameworks. I've worked at places where that was the case; the CEO knew nothing about coding and didn't offer much more advice than "you guys work it out". The problem is that you sometimes get a dev who wants to use "insert esoteric bleeding-edge tool here" and then leaves a huge mess when they quit the company 3 months later.
8
alexandercrohde 8 hours ago 0 replies      
I don't see value in this article. The author asks a bunch of other people in high-tech positions what they believe a CTO should be. Unfortunately, the plural of anecdote is not data.

For example "Don't innovate in the management structure." Sure if you're an average person, then don't. But some companies have (Valve, young Google) and shown outstanding results.

I'm very skeptical of reducing the CTO title to rules like this. This is garbage conversation fodder: it appeals to our weaker human side by creating a facade of self-improvement, but none of us are gonna remember this article in 2 months.

9
pouzy 8 hours ago 0 replies      
Hello,

I'm wrestling with the same questions, and I write about the challenges that CTOs of startups of different sizes face on a daily basis at http://cto.pizza

The concept is simple: we talk about your growth, team, and tech challenges over a pizza. Let me know if any of you might be interested in grabbing one.

I'm based in Paris, but we could figure something out over Skype if you have interesting stories to share!

10
edoceo 16 hours ago 0 replies      
Pretty smart article, good introspection; matches my experience and observations. Thanks for this.