hacker news with inline top comments - 8 Nov 2016 Best
1
Advanced Data Structures mit.edu
1261 points by ingve  4 days ago   137 comments top 24
1
stuxnet79 4 days ago 7 replies      
I never hear anybody mentioning him but Jeff Erickson's 'Algorithms' textbook [1] has some of the most lucid explanations I've come across. CLRS is often times impenetrable and for the times I didn't like its explanation of something I turned to Jeff Erickson's book and it hasn't failed me yet. I'd urge anybody trying to solidify algorithms and data structures to take a look at it.

[1] http://jeffe.cs.illinois.edu/teaching/algorithms/

2
0xmohit 4 days ago 3 replies      
3
amelius 4 days ago 2 replies      
This is very nice.

But right now, as a programmer, I am using data structures more on an as-needed basis. Sometimes it is difficult to find the right data structure for the job, so it would be nice to have a resource that presents the functionality of data structures as a black box. Learning these would make more sense than also learning all of the gory technical details upfront.

4
Koshkin 4 days ago 1 reply      
Often shied away from as too complicated, Knuth's The Art of Computer Programming deserves the obligatory mention. Although it doesn't use Python and is perhaps too analytical and detailed for the average programmer's taste, it is the single biggest classic treatise on algorithms and data structures.

At the other end of the spectrum (accessible and brief), I find Dasgupta et al.'s Algorithms a refreshingly engaging read.

5
mrleinad 4 days ago 1 reply      
List of videos/notes 2014: http://courses.csail.mit.edu/6.851/spring14/lectures/

(spent 2 min trying to find them)

6
40acres 3 days ago 0 replies      
I've been watching the lectures & recitations from 6.006 Introduction to Algorithms (Fall 2011) to brush up prior to an interview. Erik Demaine, Srini Devadas & Victor Costan (recitations) have been an amazing resource.

I've learned so much and am really impressed with their depth of knowledge and how they are able to convey complex ideas in a very easy-to-understand way. I can't wait to start the next courses.

7
fmardini 3 days ago 1 reply      
A very underrated book and one of my favorites is Udi Manber's Introduction to Algorithms. Highly recommended.
8
0x54MUR41 4 days ago 9 replies      
Thank you for sharing this.

Would anyone recommend resources for learning the fundamentals of data structures?

Books, videos, or courses are all welcome. I don't care which programming languages are used for the implementations; I am OK with C.

9
adamnemecek 3 days ago 1 reply      
Does anyone use any of these ideas day to day? That's not to knock it, I'm genuinely curious.
10
nahumfarchi 4 days ago 0 replies      
Anyone know which year has the best scribe notes?
11
lawless123 4 days ago 2 replies      
Why are these hand-drawn diagrams easier for me to understand and remember?
12
ohyoutravel 4 days ago 0 replies      
These are great; Erik is a really smart guy. His intro to algorithms class, which should be on MIT OpenCourseWare, is also fantastic.
13
nhatbui 3 days ago 0 replies      
> If you haven't taken 6.854, you must have a strong understanding of algorithms at the undergraduate level, such as receiving an A in 6.046, having had a relevant UROP, involvement in computer competitions, etc.

Quite the pre-reqs...

14
zvrba 4 days ago 0 replies      
My small contribution to the field: http://zvrba.net/downloads/fusion.pdf
15
mathnode 4 days ago 3 replies      
I take it solutions by students are mostly done in Python now?
16
burnbabyburn 4 days ago 0 replies      
Also very interesting is Erik Demaine's work on geometric folding; at the very least it's fun to watch the various structures he prints and plays with.
17
make3 2 days ago 0 replies      
They'd better start closed-captioning or they will have to take the videos down.
18
abbiya 4 days ago 3 replies      
erik is the prof.
19
zem 3 days ago 0 replies      
from the two-birds-with-one-stone-dept i've been looking for a good excuse to dive into pyret, and using it to do the exercises from this course might just be it. would anyone like to join me in a slow-paced workthrough?
20
MciprianM 4 days ago 1 reply      
Will there be a 2016 version?
21
interdrift 4 days ago 0 replies      
Thank you for this, I'm so excited to take it.
22
ausjke 4 days ago 0 replies      
is the site down?
23
albertTJames 4 days ago 0 replies      
That's a great teach
24
jasonjei 3 days ago 2 replies      
This is a great resource for anybody who isn't formally trained in computer science. A lot of programmers use an abstract data type like a dictionary or hash table, but many of the self-taught, and even some formally trained, treat it like a magical black box that stores key-value entries very efficiently. What gives a hash table/dictionary its near-O(1) properties is a good hash function for the key, and a good distribution of buckets for all the keys when collisions occur.

I think a lot of programmers have a good understanding of many data structures, but hashes and dictionaries are still taken for granted. What they really need is to think of a hash as many magical black boxes, with the hashing function directing each key to its magical bucket. :)
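
To make the bucket picture concrete, here is a minimal chained-bucket sketch in Python. All the names and structure are purely illustrative, not any real runtime's implementation:

    # Minimal chained-bucket hash table (illustrative only).
    class TinyHashTable:
        def __init__(self, n_buckets=8):
            # One list ("bucket") per slot; colliding keys share a bucket.
            self.buckets = [[] for _ in range(n_buckets)]

        def _bucket(self, key):
            # The hash function directs each key to a bucket. A good hash
            # spreads keys evenly, keeping buckets short and lookups ~O(1).
            return self.buckets[hash(key) % len(self.buckets)]

        def put(self, key, value):
            bucket = self._bucket(key)
            for i, (k, _) in enumerate(bucket):
                if k == key:
                    bucket[i] = (key, value)  # replace existing entry
                    return
            bucket.append((key, value))       # collision: chain it

        def get(self, key):
            for k, v in self._bucket(key):
                if k == key:
                    return v
            raise KeyError(key)

    t = TinyHashTable()
    t.put("speaker", "grill")
    print(t.get("speaker"))  # grill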

2
H.264 is Magic sidbala.com
1215 points by LASR  4 days ago   219 comments top 60
1
lostgame 4 days ago 7 replies      
Absolutely love this:

'Suppose you have some strange coin - you've tossed it 10 times, and every time it lands on heads. How would you describe this information to someone? You wouldn't say HHHHHHHHHH. You would just say "10 tosses, all heads" - bam! You've just compressed some data! Easy. I saved you hours of mindfuck lectures.'

This is a really great, simple way to explain what is otherwise a fairly complex concept to the average bear. Great work.
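
That "10 tosses, all heads" trick is run-length encoding in miniature. A tiny Python sketch of the idea (illustrative only, not how H.264 actually stores runs):

    from itertools import groupby

    def run_length_encode(tosses):
        # "HHHHHHHHHH" -> [('H', 10)]: store each symbol once, with its run length.
        return [(sym, len(list(run))) for sym, run in groupby(tosses)]

    print(run_length_encode("H" * 10))   # [('H', 10)] -- "10 tosses, all heads"
    print(run_length_encode("HHTTTH"))   # [('H', 2), ('T', 3), ('H', 1)]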

2
userbinator 4 days ago 2 replies      
The lossy transform is important, but I think what's actually most important in video compression is getting rid of redundancy --- H.264 actually has a lossless mode in which that transform is not used, and it still compresses rather well (especially for noiseless scenes like a screencast.) You can see the difference if you compare with something like MJPEG which is essentially every frame independently encoded as a JPEG.

The key idea is to encode differences; even in an I-frame, macroblocks can be encoded as differences from previous macroblocks, with various filterings applied: https://www.vcodex.com/h264avc-intra-precition/ This reduces the spatial redundancies within a frame, and motion compensation reduces the temporal redundancies between frames.

You can sometimes see this when seeking through video that doesn't contain many I-frames, as all the decoder can do is try to decode and apply differences to the last full frame; if that isn't the actual preceding frame, you will see the blocks move around and change in odd ways to create sometimes rather amusing effects, until it reaches the next I-frame. The first example I found on the Internet shows this clearly, likely resulting from jumping immediately into the middle of a file: http://i.imgur.com/G4tbmTo.png That frame contains only the differences from the previous one.

As someone who has written a JPEG decoder just for fun and learning purposes, I'm probably going to try a video decoder next; although I think starting from something simpler like H.261 and working upwards from there would be much easier than starting immediately with H.264. The principles are not all that different, but the number of modes/configurations the newer standards have --- essentially for the purpose of eliminating more redundancies from the output --- can be overwhelming. H.261 only supports two frame sizes, no B-frames, and no intra-prediction. It's certainly a fascinating area to explore if you're interested in video and compression in general.
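
For a feel of why difference coding pays off, here is a toy Python/numpy sketch of inter-frame residuals. It illustrates only the principle, not H.264's actual bitstream or prediction modes:

    import numpy as np

    # Previous and current frame of a mostly static scene (e.g. a
    # screencast); int16 so the residual can go negative.
    prev = np.zeros((8, 8), dtype=np.int16)
    curr = prev.copy()
    curr[3, 4] = 200                 # a single changed "pixel"

    residual = curr - prev           # this, not the raw frame, gets encoded
    print(np.count_nonzero(residual), "nonzero samples out of", residual.size)

    # Decoding adds the residual back onto the reference frame. Seek into
    # the middle of a stream with the wrong reference frame and you get
    # exactly the smeared, shifting blocks described above.
    assert ((prev + residual) == curr).all()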

3
szemet 4 days ago 4 replies      
I thought I'd learn something special about H.264, but all the information here is high-level and generic.

For example if you replace H.264 with a much older technology like mpeg-1 (from 1993) every sentence stays correct, except this:

"It is the result of 30+ years of work" :)

4
amluto 4 days ago 1 reply      
Nice article! The motion compensation bit could be improved, though:

> The only thing moving really is the ball. What if you could just have one static image of everything on the background, and then one moving image of just the ball. Wouldn't that save a lot of space? You see where I am going with this? Get it? See where I am going? Motion estimation?

Reusing the background isn't motion compensation -- you get that by encoding the differences between frames so unchanging parts are encoded very efficiently.

Motion compensation is when you have the camera follow the ball and the background moves. Rather than encoding the difference between frames itself, you figure out that most of the frame moved, and you encode the difference from one frame to a shifted version of the blocks from a previous frame.

Motion compensation won't work particularly well for a tennis ball because it's spinning rapidly (so the ball looks distinctly different in consecutive frames) but more importantly because the ball occupies a tiny fraction of the total space so it doesn't help that much.

Motion compensation should work much better for things like moving cars and moving people.
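
A crude Python/numpy sketch of the block-matching idea behind motion compensation; real encoders use far smarter searches and sub-pixel motion, so treat this purely as an illustration:

    import numpy as np

    def best_motion_vector(prev, block, top, left, search=4):
        # Exhaustive block matching: find the (dy, dx) shift into the
        # previous frame that best predicts `block`. The encoder then
        # stores this vector plus a (hopefully small) residual.
        h, w = block.shape
        best, best_err = (0, 0), float("inf")
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                y, x = top + dy, left + dx
                if y < 0 or x < 0 or y + h > prev.shape[0] or x + w > prev.shape[1]:
                    continue
                err = np.sum((prev[y:y+h, x:x+w] - block) ** 2)
                if err < best_err:
                    best_err, best = err, (dy, dx)
        return best

    rng = np.random.default_rng(0)
    prev = rng.integers(0, 255, (16, 16))
    curr = np.roll(prev, 2, axis=1)   # camera pans: the whole frame shifts
    # Expected: (0, -2) -- the block is found 2 pixels to the left in prev.
    print(best_motion_vector(prev, curr[4:8, 4:8], 4, 4))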

5
adilparvez 4 days ago 1 reply      
Related, how H.265 works: http://forum.doom9.org/showthread.php?t=167081

This is a great overview, and the techniques are similar to those of H.264.

I found it invaluable for getting up to speed when I had to do some work on the screen content coding extensions of HEVC in Argon Streams. They are a set of bit streams to verify HEVC and VP9 implementations; take a look, it is a very innovative technique:

http://www.argondesign.com/products/argon-streams-hevc/
http://www.argondesign.com/products/argon-streams-vp9/

6
woliveirajr 4 days ago 2 replies      
I love how you can edit photos of people to correct skin imperfections, without losing the sense that the image is real (and not that blurred, plastic look), when you decompose the image into wavelets and just edit some frequencies.

I don't know about Photoshop, but in GIMP there's a plugin called "wavelet decomposer" that does that.

7
mherrmann 3 days ago 1 reply      
I recently experienced this as follows: https://www.sublimetext.com has an animation which is drawn via JavaScript. In essence, it loads a huge .png [1] that contains all the image parts that change during the animation, then uses <canvas> to draw them.

I wanted to recreate this for the home page of my file manager [2]. The best I could come up with was [3]. This PNG is 900KB in size. The H.264 .mp4 I now have on the home page is only 200 KB in size (though admittedly in worse quality).

It's tough to beat a technology that has seen so much optimization!

1: http://www.sublimetext.com/anim/rename2_packed.png

2: https://fman.io

3: https://www.dropbox.com/s/89inzvt161uo1m8/out.png?dl=0

8
the8472 3 days ago 2 replies      
> Chroma Subsampling.

Sadly, this is what makes video encoders designed for photographic content unsuitable for transferring text or computer graphics. Fine edges, especially red-black contrasts, start to color-bleed due to subsampling.

While a 4:4:4 profile exists, a lot of codecs either don't implement it or the software using them does not expose that option. This is especially bad when used for screencasting.

Another issue is banding: since H.264's main and high profiles only use 8-bit precision, including for internal processing, the rounding errors accumulate, resulting in banding artifacts in shallow gradients. The High10 profile solves this, but again, support is lacking.
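
For reference, a minimal numpy sketch of what 4:2:0-style subsampling does to a chroma plane; a real codec filters more carefully, but the resolution loss is the same:

    import numpy as np

    def subsample_420(chroma):
        # 4:2:0-style chroma subsampling: average each 2x2 block, quartering
        # the color resolution. Luma stays full-resolution, which is why
        # brightness detail survives while sharp color edges bleed.
        h, w = chroma.shape
        return chroma.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

    # A hard chroma edge (values are illustrative, not real Cb/Cr):
    cb = np.array([[255, 255, 0, 0]] * 4, dtype=float)
    print(subsample_420(cb))
    # [[255.   0.]
    #  [255.   0.]] -- an edge aligned to the 2x2 grid survives...
    print(subsample_420(np.roll(cb, 1, axis=1)))
    # ...but shift it one pixel and every output sample smears to 127.5.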

9
dluan 3 days ago 2 replies      
By the way, this is an incredible example of scientific writing done well. There's a very tangible, jelly-like feeling that the author clearly has for the topic, and it's conveyed well to the readers. This whole thread is people excited about a video codec!
10
algesten 4 days ago 6 replies      
"See how the compressed one does not show the holes in the speaker grills in the MacBook Pro? If you don't zoom in, you would even notice the difference. "

Ehm, what?! The image on the right looks really bad, and the missing holes were the first thing I noticed. No zooming needed.

And that's exactly my problem with the majority of online video (iTunes store, Netflix, HBO etc). Even when it's called "HD", there are compression artefacts and gradient banding everywhere.

I understand there must be compromises due to bandwidth, but I don't agree on how much that compromise currently is.

11
eutectic 4 days ago 1 reply      
Anyone who likes this would probably also enjoy the Daala technology demos at https://xiph.org/daala/ for a little taste of some newer, and more experimental, techniques in video compression.
13
alexandrerond 4 days ago 2 replies      
Very well explained. But I could have understood it all without the bro-approach to the reader. You see where I am going with this? Get it? See where I am going? Ok!
14
spacehacker 4 days ago 0 replies      
The part about entropy encoding only seems to explain run-length encoding (RLE). Isn't the interesting aspect of making use of entropy in compression rather to represent rarer events with longer code strings?

The fair coin flip is also an example of a process that cannot be compressed well at all, because (1) the probability of the same event happening repeatedly is not as high as for unfair coins (RLE is minimally effective) and (2) the uniform distribution has maximal entropy, so there is no advantage in using different code lengths to represent the events. (Since the process has a binary outcome, there is also nothing to gain in terms of code lengths for unfair coins.)
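
The point about code lengths is Shannon's entropy formula, H = -sum(p * log2(p)). A quick Python check of fair vs. biased coins:

    from math import log2

    def coin_entropy(p):
        # Shannon entropy, in bits per toss, of a coin with P(heads) = p.
        return -sum(q * log2(q) for q in (p, 1 - p) if q > 0)

    print(coin_entropy(0.5))   # 1.0   -- fair coin: 1 bit/toss, incompressible
    print(coin_entropy(0.9))   # ~0.47 -- biased coin: long runs, compressible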

15
john111 4 days ago 6 replies      
Can someone explain how the frequency domain stuff works? I've never really understood that, and the article just waves it away with saying it's like converting from binary to hex.
16
4 days ago 2 replies      
17
amelius 4 days ago 3 replies      
> discard information which will contain the information with high frequency components. Now if you convert back to your regular x-y coordinates, you'll find that the resulting image looks similar to the original but has lost some of the fine details.

I would expect the edges in the image to also become more blurred, as edges correspond to high-frequency content. However, this only seems to be slightly the case in the example images.
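
A small numpy experiment along these lines (a synthetic image, not the article's): masking a low-frequency disc does blur the edge, but the edge survives as a softened, slightly ringing transition, which matches what the example images show:

    import numpy as np

    # A synthetic image with one sharp vertical edge.
    img = np.zeros((64, 64))
    img[:, 32:] = 1.0

    # Zero out everything outside a disc around DC (a crude low-pass),
    # then transform back; the blur/ringing near the edge is the lost
    # high-frequency detail.
    F = np.fft.fftshift(np.fft.fft2(img))
    yy, xx = np.mgrid[-32:32, -32:32]
    F[xx**2 + yy**2 > 10**2] = 0
    low = np.real(np.fft.ifft2(np.fft.ifftshift(F)))

    print(np.round(low[0, 28:37], 2))  # values over/undershoot 0 and 1 near the edge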

18
amelius 4 days ago 3 replies      
What are directions for the future? Could neural networks become practically useful for video compression? [1]

[1] http://cs.stanford.edu/people/eroberts/courses/soco/projects...

19
kakarot 4 days ago 1 reply      
Ya'll wanna get the most out of your H.264 animu rips? Check out Kawaii Codec Pack, it's based on MPC and completely changed my mind about frame interpolation. http://haruhichan.com/forum/showthread.php?7545-KCP-Kawaii-C...
20
Savageman 4 days ago 1 reply      
I wonder if, across a lot of videos, the frequency domain representations look similar, and if instead of masking in a circle we could mask with other (pre-determined) shapes to keep more information (this would require decoders to know them, of course). Or maybe this article is too high-level and it's not possible to "shape" the frequencies.
21
iplaw 4 days ago 3 replies      
H.265 gets you twice the resolution for the same bandwidth, or the same resolution for half the bandwidth.
22
nojvek 3 days ago 0 replies      
This is a really well-written article. Exactly why I love HN: sometimes you get these nice technical intros into fields you thought were black magic.
23
rimbombante 4 days ago 0 replies      
Articles like this are what makes HN great, and not all those repeated links to the visual studio 1.7.1.1.0.1.pre02-12323-beta3 changelog.
24
el0j 3 days ago 1 reply      
The PNG size seems to be misrepresented. The actual PNG is 637273 bytes when I download it, and 597850 if I recompress it to make sure we're not getting fooled by a bad PNG writer.

So instead of the reported 916KiB we're looking at 584KiB.

This doesn't change the overall point, but details matter.

    $ wget https://sidbala.com/content/images/2016/11/FramePNG.png
    --2016-11-04 22:08:08--  https://sidbala.com/content/images/2016/11/FramePNG.png
    Resolving sidbala.com (sidbala.com)... 104.25.17.18, 104.25.16.18, 2400:cb00:2048:1::6819:1112, ...
    Connecting to sidbala.com (sidbala.com)|104.25.17.18|:443... connected.
    HTTP request sent, awaiting response... 200 OK
    Length: unspecified [image/png]
    Saving to: FramePNG.png
    FramePNG.png [ <=> ] 622.34K --.-KB/s in 0.05s
    2016-11-04 22:08:08 (12.1 MB/s) - FramePNG.png saved [637273]

    $ pngout FramePNG.png
    In: 637273 bytes FramePNG.png /c2 /f5
    Out: 597850 bytes FramePNG.png /c2 /f5
    Chg: -39423 bytes ( 93% of original)

25
mtw 4 days ago 1 reply      
Even better: H.265, with 40-50% bit rate reduction compared with H.264 at the same visual quality!
26
notlisted 4 days ago 1 reply      
Well done. The only thing that could make this better is an interactive model/app for me to play around with. The frequency spectrum can probably be used while retouching images as well.

A video on youtube led me to Joofa Mac Photoshop FFT/Inverse FFT plugins [1] which was worth a try. I was unable to register it, as have others. Then I came across ImageJ [2], which is a really great tool (with FFT/IFFT).

Edit: if anyone checks out ImageJ, there's a bundled app called Fiji [3] that makes installation easier and has all the plugins.

If anyone has other apps/plugins to consider, please comment.

[1] http://www.djjoofa.com/download

[2] https://imagej.nih.gov/ij/download.html

[3] http://fiji.sc/

27
i336_ 3 days ago 0 replies      
I found this explanation of Xiph.org's Daala (2013) very interesting and enlightening in terms of understanding video encoding: https://xiph.org/daala/

Related:

BPG is an open source lossless format for images that uses HEVC under the hood, and is generally better than PNG across the board: http://bellard.org/bpg/

For a runner-up lossless image format unencumbered by H265 patents (completely libre), try http://flif.info/.

28
optimuspaul 3 days ago 0 replies      
I enjoyed this for the most part and even learned a little. But it started out in very simple terms, really appealing to the common folk, and then about halfway through the tone changed completely, which was a real turn-off to me. It's silly, but "If you paid attention in your information theory class" was the spark for me. I didn't take any information theory classes, so why would I have paid attention? I don't necessarily think it was condescending, but maybe; it's just that the consistency of the writing changed dramatically.

Anyway, super interesting subject.

29
afghanPower 4 days ago 1 reply      
A real fun read. I had an assignment a couple of weeks ago where we used the k most significant singular values of matrices (from a picture of Marilyn M.) to compress the image. H.264 is on a whole other level, though ;)
30
aaron695 3 days ago 0 replies      
H.265/HEVC vs H.264/AVC: 50% bit rate savings verified

http://www.bbc.co.uk/rd/blog/2016/01/h-dot-265-slash-hevc-vs...

31
el0j 3 days ago 0 replies      
If the author truly wants 'magic', how about we take a 64KiB demo that runs for 4 minutes? That's 64KiB containing 240 seconds of video, and your H.264 had to use 175 for only five seconds of video.

We can conclude that 64KiB demos are at least 48 times as magical as H.264.

32
problems 4 days ago 0 replies      
Really cool stuff, one thing though seems a little odd:

> Even at 2%, you don't notice the difference at this zoom level. 2%!

I'm not supposed to see that major streakiness? The 2% difference is extremely visible, and even 11% leaves a noticeably bad pattern on the keys (though I'd probably be okay with it in a moving video); only the 30% version looks acceptable in a still image.

33
dirtbox 3 days ago 0 replies      
I like this video explaining the difference between H.264 and H.265: https://www.youtube.com/watch?v=hRIesyNuxkg

Simplistic as it is, it touches on all the main differences. The only problem with H.265 is the higher requirements and time needed for encoding and decoding.

34
monochromatic 3 days ago 0 replies      
This is great as a high-level overview... except that it's way too high-level. These are all extremely well-known techniques. Is there any modern video compression scheme that doesn't employ them?

In other words, why is H.264 in particular magical?

35
ludwigvan 3 days ago 0 replies      
What is the latest in video compression technology after H264 and H265?

The article discusses lossy compression in broad terms, but have we reaped all the low hanging fruit? Can we expect some sort of saturation just like we have with Moore's law where it gets harder and harder to optimize videos?

36
markatkinson 4 days ago 1 reply      
Damn, lost me during the frequency part.
37
mozumder 4 days ago 1 reply      
So what's the final car weight? It looks like you stopped at the Chroma subsampling section..
38
imaginenore 4 days ago 0 replies      
> "If you don't zoom in, you would even notice the difference."

First of all, I think he meant "you would NOT even notice".

Second of all, that's the first thing I noticed. That PNG looks crystal clear. The video looks like overcompressed garbage.

39
umbs 3 days ago 2 replies      
"1080p @ 60 Hz = 1920x1080x60x3 => ~370 MB/sec of raw data."

I apologize if this is trivial. What does the 1920 in the above equation represent?
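
1920 is the horizontal pixel count of a 1080p frame (1920x1080). The arithmetic checks out, give or take the MB-vs-MiB quibble raised further down the thread:

    # width x height x frames/sec x 3 bytes per pixel (one each for R, G, B)
    raw = 1920 * 1080 * 60 * 3
    print(raw)                 # 373248000 bytes/sec
    print(round(raw / 2**20))  # 356 MiB/sec, i.e. the article's "~370 MB/sec"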

40
vcool07 4 days ago 1 reply      
This was a good and interesting read. Is H.264 an open standard?
41
syastrov 3 days ago 0 replies      
An enjoyable, short and to the point article with many examples and analogies. But my favorite part was this:

"Okay, but what the freq are freqX and freqY?"

42
neo2006 3 days ago 1 reply      
The comparison does not make any sense, and no, H.264 is not magic!!
- The guy is comparing a lossless format, PNG, to H.264, which is a lossy video format; that is not fair.
- He is generating a 5-frame video and comparing it to a 1-frame image; only the I-frame at the beginning of the video matters in that case, all the others are derived from it (P-frames).
- What is the point of that comparison? We already have image formats comparable in size to an H.264 I-frame and using the same science (entropy coding, frequency domain, intra-frame MB derivation...).
43
some1else 3 days ago 0 replies      
Try scrubbing backwards. H.264 seeking only works nicely if you're fast-forwarding the video. Actually, that is kind of magical.
44
andrewvijay 3 days ago 0 replies      
Well explained. I was thinking of reading about h264 and this is an amazing starter. Thanks Sid!
45
11thEarlOfMar 4 days ago 1 reply      
Do H.264 and WebRTC have different use cases? Or do they compete directly?
46
imperialdrive 3 days ago 0 replies      
Great Write-up, thank you for your time and effort!
47
xyproto 3 days ago 0 replies      
Copyrighted and patented magic.
48
mohda786921 1 day ago 0 replies      
I need hacker
49
mohdz4939 1 day ago 0 replies      
I need hacker
50
bjn 3 days ago 0 replies      
Well written article.
51
molind 4 days ago 0 replies      
Wow, now tell me how H.265 works!
52
andrey_utkin 3 days ago 0 replies      
Too trivial, too general, too pompous. I'd downvote.
53
wizkkidd 3 days ago 0 replies      
time to make move on: h.265
54
mentioned_edu 4 days ago 0 replies      
Nice
55
necessity 3 days ago 0 replies      
s/magic/lossy
56
wizkkidd 3 days ago 0 replies      
time to move on: h.265
57
wizkkidd 3 days ago 0 replies      
time to move on H.265
58
cogwheel 3 days ago 1 reply      
MB is 1024 * 1024 bytes, not 1000 * 1000 bytes. Unless you're a HDD/SSD manufacturer.
59
kutkloon7 4 days ago 1 reply      
"This concept of throwing away bits you don't need to save space is called lossy compression."

What a terrible introduction to lossy compression. This would mean that if I empty the trash bin on my desktop, it's lossy compression.

The concept of going through all the compression ideas that are used is pretty neat, though.

60
VikingCoder 4 days ago 2 replies      
Ugh. Comparing the file size difference between a lossless PNG and a LOSSY H.264 video of a STATIC PAGE is absurd. Calling it "300 times the amount of data," when it's a STATIC IMAGE is insulting in the extreme. It really doesn't matter if the rest of the article has insights, because you lost me already.
3
DeepMind and Blizzard to release StarCraft II as an AI research environment deepmind.com
911 points by madspindel  3 days ago   324 comments top 41
1
theptip 3 days ago 9 replies      
This is pretty interesting.

DeepMind's last triumph (beating the best human Go players with AlphaGo) is impressive, but Go is a great fit for neural networks as they stand today; it's stateless, so you can fully evaluate your position based on the state of the board at a given turn.

That's not a good fit for most real-world problems, where you have to remember something that happened in the past. E.g. the fog of war in a strategy game.

This is a big open problem in AI research right now. I saw a post around the time of AlphaGo challenging DeepMind to tackle StarCraft next, so it is very cool that they have gone in this direction.

When Google's AI can beat a human at StarCraft, it's time to be very afraid.

2
formula1 3 days ago 12 replies      
I suspect this will eventually lead to AI-as-a-service for games. Rather than build a terrible AI that delays a game by months, approaching a company that can build a decent AI initially, one which gets better over time, would probably be ideal and create better experiences.

I'm curious if a startup can be built from this.

3
rezashirazian 3 days ago 7 replies      
If DeepMind is planning on building an AI that can beat the best human SCII player, they have their work cut out for them.

I'm not sure how familiar people are with StarCraft II, but at the highest levels of the game, where players have mastered the mechanics, it's a mind game fueled by deceit, sophisticated and malleable planning, detecting subtle patterns (having a good gut feeling about what's going on) and, on the pro scene, knowledge of your opponent's style.

4
chrishare 3 days ago 2 replies      
Very interesting stuff.

Allowing researchers to build AIs that operate on either game state or visual data is a great step, IMO. Being able to limit actions-per-minute is also very wise. The excellent BWAPI library for Starcraft Broodwar that is referenced (https://github.com/bwapi/bwapi) provides game state - and was presumably used by Facebook to do their research earlier this year (http://arxiv.org/abs/1609.02993). For mine, the significant new challenges here not present in Go are the partial observability of the game and the limited time window in which decisions need to be made. Even at 24 frames per second, the AI would only have 40 milliseconds to decide what to do in response to that frame. This is more relevant to online learning architectures.
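
For reference, the 40 ms figure is just the frame period, 1000 ms / 24; a quick sanity check at common frame rates:

    # Per-frame decision budget for an agent reacting to rendered frames.
    for fps in (24, 30, 60):
        print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")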

The open questions here are how freely this will be available - and in what form. Will I need to buy a special version of the game? Clearly there will be some protection or AI detection - to ensure that competitive human play is not ruined either by human-level bots, if they can truly be developed, or by sub-par bots. Starcraft 2 (presumably the latest expansion, Legacy of the Void, will be used here) does not run on Linux, whereas most GPU-based deep learning toolkits do, so having a bridge between the game and an AI server may be necessary for some researchers.

Besides being great for AI researchers, this is probably good for Blizzard too, since it will bring more interest to the Starcraft series.

2017 can't come soon enough.

5
tdaltonc 3 days ago 1 reply      
I'd love to see a e-sport league where the teams are AI human hybrids (Centaur teams). We know that AI human hybrid teams are great at chess [1], and I'd love to see rts games played by 'Centaur' teams. In the same way the innovations made in F1 often trickle down to consumer cars, can you imagine the advances that could be made in human-machine interactions in the crucible of a real-time Centaur gaming league?

[1] http://bloomreach.com/2014/12/centaur-chess-brings-best-huma...

6
rkuykendall-com 3 days ago 0 replies      
This would be so much more fun with a turn-based game where speed isn't a variable, like Civilization. I'd love to play against several AIs that were better than me because of code, not because they get a bunch of extra in-game bonuses. With a nice AI API, you could have online competitions where AIs battled every month.
7
wodenokoto 3 days ago 0 replies      
This paper is a few years old, but it gives a good overview of the problems faced when building an AI for starcraft and the methods used.

Ontañón, Santiago, et al. "A survey of real-time strategy game AI research and competition in StarCraft." IEEE Transactions on Computational Intelligence and AI in Games 5.4 (2013): 293-311.

http://webdocs.cs.ualberta.ca/~cdavid/pdf/starcraft_survey.p...

8
Miner49er 3 days ago 10 replies      
Looks like they are going to limit the APM of the AI. I wonder how they are going to decide the limit? I've never played StarCraft, but from what I understand very high APM is needed to play the game at the highest levels.
9
cryptrash 3 days ago 0 replies      
I'm pretty excited about this. I think some kids out there will really enjoy an environment like this to mess with, and maybe learn a thing or two about machine learning along the way.

Starcraft is a really fun game, and I think it's enough to engage kids a little more than something like Minecraft where there's plenty of room for some cool ML hacking, but not enough stimulation from it. Instead of just seeing blocks here or there or whatever, starcraft has hard goals that will force them to use critical thinking skills, their knowledge of the game, their own personal strategic insights, and the ML skills they accrue.

So exciting! Love the feature layer idea also, well done!

10
deepnotderp 3 days ago 2 replies      
By the way, if anyone's interested, there was a deluge of deep learning papers today, and one of them basically used deep learning to make deep learning models, and it did better than humans.
11
Savageman 3 days ago 2 replies      
That's so cool! I wish they could start doing AI for team-based competitive games like League of Legends, where meta-play and team decision making is important. Is that too complicated to tackle yet?
12
brylie 3 days ago 0 replies      
It would be cool to see a project like this for an open source game, such as OpenTransportTycoonDeluxe (http://www.openttd.org/). The AI developed by interacting with the OpenTTD economy might even prove useful for urban planning of real geographic regions.
13
fitzwatermellow 3 days ago 0 replies      
Training data? After all, AlphaGo trained on a database of over 30M expert human moves. I suspect one championship round from Team EnVy is worth billions of iterations of random exploration ;)

Kudos to both Blizzard and DeepMind. Anticipating a lot of fun with this. StarCraft 2 could indeed become the AI pedagogy standard.

14
xg15 3 days ago 2 replies      
I applaud the idea, but I'm worried about how open the results of the research will be in the end.

I think the worst possible outcome for society would be if we ended up with capable AI but with the algorithms only accessible to a handful of "AI-as-a-service" companies.

The second concern is understandability of the algorithms: from what I've read, it's already hard to deduce "why" an ANN in a traditional problem behaved like it did. An AI that is capable of coping with dynamic, unpredictable situations like an SC game (with only pixels as input) is impressive, but it seems less useful to me if it's not understood how that is done.

15
simopaa 3 days ago 2 replies      
I would love to keep a 24/7 stream open where the AI is continuously playing against the game's own bots in order to improve its playstyle.
16
jasikpark 3 days ago 0 replies      
Comparing AI and humans in games is not useful unless all limitations are controlled for both parties. The artificial intelligence should only get the video output of the game, and should output to simple controls with a human-reflex-like lag on how long it takes for the controls to take effect. It just comes down to the scientific method.
17
partycoder 3 days ago 0 replies      
That's great. AIs on SC1 relied on many hacks. Initially I thought that DeepMind was going to create a bot for the original SC.

I hope some of the advances in SC2 AI can be integrated into the in-game AI, e.g. a trained neural network that plays better than the "hard" AI, but can run on a consumer box and not on a massive cluster.

18
bluebeard 3 days ago 0 replies      
This will be good for games moving forward due to the meta changing for players as the AI adapts to their tactics and vice versa. Lessons learned from this can then be applied to other areas. And as an added bonus it creates more interest in AI research.
19
seanwilson 3 days ago 1 reply      
Can a simulation of a complete StarCraft game be done quickly? And assuming it can't, doesn't that present problems for training an AI? For example, I'm guessing complete games of Go are orders of magnitude faster to simulate, which makes it more practical to do things like getting AlphaGo to play against itself to train.
20
KZeillmann 3 days ago 4 replies      
This is so exciting. I've always wanted to program bots to play online games -- mainly for learning purposes. (Can I make a bot that plays better than me?)

But I've never done it because of the risk of bans. I'm glad that Blizzard has opened it up for people to experiment with this. I wonder how it will interact with any sort of anti-cheat systems in place, etc.

21
luka-birsa 3 days ago 10 replies      
Anybody else see a problem with training the AI to move troops around a battlefield, with the purpose of exterminating the opponent?
22
simonebrunozzi 3 days ago 0 replies      
This is a wasted opportunity for other strategy games to become the most played game on the planet.

I sometimes play strategy games and I always find the AI disappointing. Any game with a great AI would be my favorite for years. Heck, I would even pay a few dozen cents/hour to be able to compete against a proper AI.

23
randyrand 3 days ago 0 replies      
I hope the AI will have a handicap on the speed of mouse movement. IRL you can't just teleport the mouse around the screen.
24
ChrisAntaki 3 days ago 0 replies      
> We're particularly pleased that the environment we've worked with Blizzard to construct will be open and available to all researchers next year.

This is awesome. I've only ever reached the Platinum league in StarCraft II (1v1), but I'd almost feel more driven to create bots to (hopefully) surpass that skill level than to actually play the game.

25
andrew3726 3 days ago 0 replies      
This is really good news! Let's hope DeepMind can improve even further on their differentiable neural computers (DNC), which seem like a requirement for this kind of AI to work (exploiting long-term dependencies). I also hope that other research/industry teams will join the competition to create competing AIs. Very exciting!
26
plg 3 days ago 4 replies      
I understand the challenge, and the importance of the proof-of-principle. But again? Having done Atari games, then Go, at what point exactly does Google DeepMind start attacking some real-world problems of consequence?

Or maybe the answer is never; other companies are supposed to do the hard work? We only play games?

27
tylerpachal 3 days ago 0 replies      
For anyone looking for more information about Starcraft 2, the world championship is on this weekend and the broadcast for today has just started (16:30EST)

https://www.twitch.tv/starcraft

28
lanius 3 days ago 1 reply      
I can't wait to see how far DeepMind can go in this area. I was initially skeptical that AlphaGo could defeat top human players, but then it happened. Who knows, perhaps one day AI can compete against progamers in GSL!
29
oblio 3 days ago 0 replies      
I wonder when we'll be at a point that a small, portable AI (such as the one included with games) is actually competitive with decent humans.
30
prawn 3 days ago 0 replies      
Makes me wonder if any game companies have seeded empty servers with bots, acting as humans, to give their games a sense of popularity.
31
kleigenfreude 3 days ago 0 replies      
First we teach it Atari games, and now strategy and war?

Why not give it lots of data to solve real problems? Training it on useless games will have no benefit.

32
pizza 3 days ago 0 replies      
Nice! I cracked a joke about how SC2 was nothing more than an AI testbed just last week, lol. Very glad to see it's becoming a real thing!
33
andr 3 days ago 0 replies      
I wonder if AI will take over eSports (Twitch, competitions, etc.), as well. It could be a variant of the Turing test.
34
Havoc 3 days ago 0 replies      
This is awesome. I know there is a happy AI community on the SC1 front, so I'm glad to see Blizzard doing anything on the SC2 front.
35
simpfai 3 days ago 2 replies      
Does anyone know of any resources for someone looking to learn more about Game AI for real time strategy games?
36
noiv 3 days ago 1 reply      
Hopefully there's a JS/Spidermonkey interface. I'd be happy to port Hannibal from 0AD.
37
komaromy 3 days ago 0 replies      
My mostly-uneducated o/u on the time to beat the best human players is 8 years.
39
felix_thursday 3 days ago 0 replies      
OpenAI vs DeepMind?
40
flamedoge 3 days ago 2 replies      
This is dangerous. Overmind will come to life.
41
ambar123 3 days ago 0 replies      
I will consider AI to be at human level when it can fully play GTA San Andreas.
4
Web fonts, boy, I don't know meowni.ca
689 points by guessmyname  4 days ago   323 comments top 46
1
mouzogu 4 days ago 13 replies      
I have had the exact same experience in the last month. Being on a slow connection, I've come to loathe web fonts. There is nothing wrong with them in general; it's just that they've come to symbolise over-indulgence and a myopia towards the user's actual task-focused needs as opposed to aesthetics.

Part of the issue is dealing with progressive enhancement as far as slow internet connections go. How do you solve that problem? There is no native browser API, to my knowledge, that does not depend on using JS, which isn't ideal imo.

Would love an attribute on script and link tags that could be conditional based on connection speeds.

P.S. I would also encourage those who have the choice to use system fonts (https://medium.design/system-shock-6b1dc6d6596f) instead of web fonts. It seems more in line with the spirit of the web, and these fonts are very well tested in general.

2
discreditable 4 days ago 3 replies      
My feeling on web fonts is that they can sometimes look nice but aren't worth the page weight. My approach is to pick two fonts I expect macOS and Windows to have, then fall back to generic. Other times, I'm lazier and just use the generic.

To put it more bluntly: https://bestmotherfucking.website/#remote-fonts

3
jaredcwhite 4 days ago 1 reply      
Yes, web fonts often suck on slow connections. Meanwhile, on fast-ish connections, I find the FOUC effect to be greatly annoying. That 1-2 seconds of seeing Times and Helvetica and then seeing Freight Text Pro and Proxima Nova drives me bonkers. I think there must be some kind of happy medium between FOUC for faster connections and rage-inducing FOIC for slow connections. Unfortunately, most devs (and sadly I claim to be one) don't test their sites frequently on a range of connection speeds so it's common to have badly-tuned font dynamics on sites.

Bottom line: as someone who cares about design and visual experiences, I don't think the answer is just accept and live with FOUC everywhere. Browsers and dev tooling both need to improve in order to find that happy medium.

4
metafunctor 4 days ago 1 reply      
The HN crowd frequently compliments Stripe on their good looking, functional, well designed web pages. The Stripe home page is 75% web fonts (432k).
5
tvon 4 days ago 2 replies      
I'm regularly annoyed by the flash when loading the Docker documentation, e.g. http://docs.docker.com/engine/reference/builder/

It's annoying enough to warrant using Stylish to override the web font.

6
WorldMaker 4 days ago 3 replies      
I've been slowly adding the contents of Google Fonts into my system fonts storage as I see them used on sites that I visit often. Things like Open Sans and Roboto for instance are quite common.

I think there would probably be a good case for an extension to automate that in some fashion for fonts with known open licenses (essentially anything from Google Fonts, for example) to go ahead and just install them into the system.

There was at least one HN commenter I've seen that has claimed to just go ahead and install all of Google Fonts locally, which might be a bit extreme, but even automating that isn't necessarily a bad idea.

7
mslev 4 days ago 4 replies      
For anyone missing the WW reference: https://www.youtube.com/watch?v=wvr1T1sFvEg
8
rndmize 4 days ago 5 replies      
For God's sake man, don't leave a huge, fast gif infinitely looping where I can see it while trying to read your blog post. Make it activate after I scroll down to a certain point, or on hover, or anything that doesn't make me feel like I'm trying to read through a strobe light.
9
drinchev 4 days ago 1 reply      
Badly engineered, multi-purpose WordPress themes are my favorite offenders on that matter. Not only do they make you download the entire set of assets (think ~2 icon fonts, half a megabyte of stylesheets, megabytes of JS, and on top of that a $5 DO box serving all the PHP `magic`), but they also include entrance animations, which take additional time before you can focus on the content.

Actually, this reminds me of the IoT DDoS attack. The low-tech users making a blog by clicking around and installing a $25 theme are actually unaware that their website drains mobile bandwidth.

So if you are developing one of these portfolio-with-landing-page-with-blog-section-woo-commerce-supported mega themes, please consider optimizing assets as your next priority.

10
elcapitan 4 days ago 9 replies      
Are there actually font blockers like ad blockers? That would solve the problem pretty easily for everybody who's annoyed. Although they probably won't be available for mobile browsers, where it's the most painful.
11
makecheck 4 days ago 2 replies      
Are web fonts smart enough to download only the symbols that you're using? With Unicode growing all the time, and fonts supporting significant parts of that total set, it would be insane to have to download hundreds of thousands of symbols when you're just trying to write a 3-word headline.

Another problem is aggressive filtering, oddly. I once visited China and found my entire web site wouldn't load; it turned out to be triggered by a font from a Google domain. How silly for content to be thrown out because an optional style element is unavailable. I have since stripped out all my Google dependencies.

12
niftylettuce 4 days ago 1 reply      
For web fonts in emails, look no further than my latest package `custom-fonts-in-emails` for Node.js.

https://github.com/crocodilejs/custom-fonts-in-emails#custom...

13
wickedlogic 4 days ago 1 reply      
I feel like all developers should be forced to load their site over Amtrak wifi....
14
Animats 4 days ago 3 replies      
Fonts ought to have Subresource Integrity checksums, and clients should cache them effectively, even cross-site. It's not like they change much.
15
initram 4 days ago 0 replies      
I find web fonts so annoying, along with images that don't load until you scroll to them. I usually read in Reader mode, and it takes the loaded page and re-renders it without web fonts in a normal style. But when the site doesn't load all the images, I end up missing half the article because I can't see what they're talking about. So annoying.
16
skizm 4 days ago 4 replies      
Why are GIFs in articles a thing? Can anyone focus on the words while they are looping in the corner of your eye? I always open the inspector and delete the elements, but it is annoying to do that every time.
17
frivoal 4 days ago 0 replies      
I blame the browsers. There's no deep reason you have to introduce a piece of javascript to hide stuff from the browser so that it can load the page normally, and then add back the fonts. If browsers cared about making the web usable for everybody more than they cared about making web pages look like slick native apps, they'd do it for you.

The way browser internals work is already designed for this: if the network is slow and you get only half the html, half the css, miss some images, or whatever, the browser will render as best it can with what it has so far, and re-render when it gets the rest. The fact that it doesn't do that with fonts sucks.

18
draw_down 4 days ago 1 reply      
In truth, if web design took seriously the constraint of bad network connections, the result would be much different to what we have today. This is but one example.

(And to be totally honest, if the author took that constraint seriously this page would be different as well.)

But in that world, things would be less pretty and less flashy, and that stuff means more to people than they would care to admit.

19
mhd 4 days ago 0 replies      
I despise icon fonts with a raging passion. It feels like we're back in 8 bit time where we're using sprites for things that Man Wasn't Meant To Do with them.

Web fonts as real world fonts are kinda-sorta okay. Although it's yet another area where "All the world's a VAX" can proliferate, when designers think everyone's got the Windows font engine or their MacBook/iPhone's Hippidippi display.

20
forrestthewoods 4 days ago 1 reply      
All web developers should be forced to spend a full week using their site on each of LTE, 3g, and 2g. If your website takes more than 10 seconds to load on an ad-block free LTE device you're fired.

Relevant: Page Weight Matters (2012) http://blog.chriszacharias.com/page-weight-matters

21
codedokode 4 days ago 2 replies      
I think the easiest solution would be just not to use web fonts at all. Do you need fancy fonts on a website with technical documentation? I think you don't, but sadly the PHP developers added a whole megabyte of fonts to the PHP manual. As a result, one has to wait until the fonts are loaded to see the text.

And web fonts are often poorly rendered on Windows.

I want an option in Chromium that would allow disabling web fonts by default and enabling them on a per-site basis (like the option it has for disabling JS). And what surprises me most is that mobile browsers (which are often used on slow and metered connections) do not have such options at all.

I remember the old Opera browser, now long gone, had a lot of options to make the user experience better on poor connections (and a built-in domain block list). Well, in those times internet users had a higher average IQ, and now the typical Internet user would not even understand what "disable JS" means.

22
kutkloon7 4 days ago 3 replies      
I absolutely do not understand the need for 'web fonts'. Just specify a normal font that most users already have installed in CSS, with some fallbacks.

'Progressive styling' is way more annoying (in my humble opinion) than waiting some time. Pages like The Verge can take minutes anyway when the wifi is not optimal. When you do things progressively, the page just keeps on changing and can't really be used. So if you have a long page of text, and the user is somewhere halfway down, the font loads and... BOOM! You are reading text that is located 8 paragraphs from where you were reading an instant ago.

This is far more annoying with images which don't specify their size, but I would still consider it bad practice to use an uncommon font for non-artistic pages.

23
mrec 4 days ago 0 replies      
I had a very strong impression that the quality of Web typography took a colossal nosedive when web fonts became a thing. My suspicion at the time was that this was largely a result of designers pushing pet fonts that looked fine on their Mac but like crap on other OSes (due to different rendering/hinting algos); can't prove that, though.

I did try disabling web fonts in Firefox, but then that semi-broke sites like GitHub who seemed to think that using them to display UI button images was a good idea. This is why we can't have nice things. And why designers are sometimes held in less esteem than they might like.

24
CoolGuySteve 4 days ago 0 replies      
I had the exact same problem in China last month, only worse because I was also going through a crowded VPN. The only page I could reliably load before losing patience was Hacker News.

The best workaround I found was to load Opera Mini and crank up the compression proxy. It's annoying though, if Opera Mini can automatically compress the page's imaging and formatting, why can't the web server do the same?

We can nag designers/developers all day long about web fonts and whatnot, but at the end of the day, only an automated solution is going to actually get put into regular use.

25
tribby 4 days ago 0 replies      
it's easy enough to edit web fonts to include only necessary characters, but most EULAs won't allow for this. any font licensed under the SIL open font license is fine, though you may have to change the font's name if it has a reserved font name (check the license file). this makes a huge difference (fonts under 10k), but no one ever bothers to do it. not sure if inlining them as base64 in your CSS and serving with gzip would help much more.
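
for instance, subsetting with the fontTools library looks roughly like this (a sketch from memory, so check the fontTools docs; the file names are placeholders, and as above, only do this where the license allows):

    from fontTools.ttLib import TTFont
    from fontTools.subset import Options, Subsetter

    font = TTFont("MyFont.ttf")              # placeholder input font
    subsetter = Subsetter(Options())
    # Keep only the glyphs needed to render this text.
    subsetter.populate(text="Only the characters this page actually uses")
    subsetter.subset(font)
    font.save("MyFont.subset.ttf")           # often dramatically smaller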

a lot of the stuff that we do in font editors is going to start happening in-browser. variable fonts that use two masters (the thinnest and heaviest font weights) to interpolate all the other weights in-browser are an emerging standard that will get font sizes down[0]. you're essentially being sent two fonts over the wire that can generate quite a few more. font files are bloated in a number of ways. eventually we'll have a standard where diacritics (acute, grave, umlaut, etc.) can automatically be added to letters without the need for an entirely separate glyph. (accented forms like à, á, â, ã are currently separate glyphs, which is very expensive).

google, adobe, and other big players are hard at work on this stuff, and it won't be long. there are pain points now, for sure, but to me this isn't an argument in favor of system fonts. most of web design is typography design, and anyone who says otherwise probably hasn't been designing for the web very long.

0. http://blog.typekit.com/2016/09/14/variable-fonts-a-new-kind...

26
vbezhenar 4 days ago 1 reply      
A web page should just disable web fonts if they are not loaded after 1 second, and re-enable them once they are loaded. It's not hard to do, and it greatly improves usability. Actually, it would be nice if browsers did this automatically: show the system font if the web font is not cached and is not going to be ready in the next few milliseconds. Anything is better than a white page.
27
hpaavola 4 days ago 3 replies      
What fonts are available in recent versions of all major operating systems? So what are the "web safe" fonts of 2016?
28
okonomiyaki3000 4 days ago 0 replies      
As she's in Taiwan, she might have mentioned that webfonts, despite any pros and cons they have for western language sites, will never (never say "never"...) be useful for Chinese or Japanese sites. A bit off the topic of the article but maybe adds some perspective. Or not.
29
niftich 4 days ago 1 reply      
The gripes are valid and some of the links about fixing it are good content.

I'm starting to sound old, but "back in my day" people would use the 'media' parameter in link tags to load mobile-specific stylesheets. Has this practice died out? If so, why?

30
williamsharkey 4 days ago 1 reply      
I wish that system fonts could be told the size of a web font's characters upfront, before the web font has loaded. The system font's glyphs should then smoothly fade/morph into the target web font once it is loaded.

To solve the problem generally would be difficult, or equivalent to the effort/data required to download the web font in the first place, but there may be a quick and dirty solution that covers 99% of text in the wild. The easiest case is monospaced fonts. Is there really a reason why the system font can't be told via CSS what spacing/height to use to match the target web font?

31
b1daly 4 days ago 1 reply      
I'm mystified by how many major sites have terrible usability issues on mobile devices that are widely used (older iDevices, cheap-to-mid-level Android phones).

Are the proprietors of these sites, which live and die on their traffic, actually unaware of how bad the experience is?

Or are the proprietors oblivious because they always have the latest iPhone or Galaxy? Are the developers oblivious too, or are they keeping their bosses in the dark?

Or are the business requirements for the site so absolutely needed that it is simply hard to make it work?

Anyone able to venture a perspective on this?

32
rocky1138 4 days ago 1 reply      
This is one of the reasons I browse with all web fonts disabled.
33
ivanhoe 4 days ago 0 replies      
The problem is that lazy loading will introduce a flicker on fast connections, or if the font is already in the browser's cache, since re-rendering the page takes some amount of time (the more content, the slower it is). And if you have multiple pages, this means it will affect even people on slow connections on every page except the first one. So it's not a win-win solution; one side will have a more or less degraded experience.
34
iamandoni 4 days ago 0 replies      
That invisible ink at the top is a good subtle touch.
35
innocenat 4 days ago 0 replies      
One thing this completely misses is non-Latin scripts. For example, the default Thai font on Windows is just plain ugly.
36
lacampbell 4 days ago 2 replies      
Off topic, but the first paragraph mentioned using 2G mobile internet in Taiwan... why!? 4G is cheap and plentiful there.

https://guidetotaipei.com/article/cell-phones-and-sim-cards

($100 TW = $3.18 US)

37
rcarmo 4 days ago 0 replies      
I just gave up on web fonts and go with Georgia, Arial and a sensible set of fallbacks (although I try Menlo before Courier, because Courier is ugly on retina displays).
38
daodedickinson 4 days ago 0 replies      
Ehhh. Slowness has its benefits. But you need to know the fallout that will result.
39
incompatible 4 days ago 0 replies      
I got a message from my provider to say they are closing down their 2G network next year since hardly anyone uses it and they want to reuse the bandwidth for something else. Maybe this protocol will die out soon.
40
Illniyar 4 days ago 0 replies      
When you hover over the links they flash like a 90's-era webpage horror. Why? Is this what the cool kids do now?
41
jacobmischka 4 days ago 0 replies      
Really love the title, nice article.
42
Crenchaw 4 days ago 0 replies      
Custom fonts and web fonts are fine... if they are stored on the same web server, so they load with the document.

Images are fine... if you specify the width and height so the page doesn't jump around while the page loads.

Menus and top bars that change when you scroll down are never acceptable. It's distracting and unnecessary.

Goofy background and image effects in general are very annoying if they are synced with page scrolling. It reminds me of a child's pop-up book.

43
tyingq 4 days ago 6 replies      
A 1.2 MB blog post complaining about bloat.
44
DocTomoe 4 days ago 2 replies      
> awesome free roaming 2G

So instead of fixing the real problem - which is obviously connectivity - we need to break the web and add another layer of Javascript complexity to something that works perfectly fine without.

It's 2016. 2G is 25 years old (started in 1991!). Stop being cheap. Stop working with 2G.

45
JepZ 4 days ago 0 replies      
If she cares so much about the experience, why does she use an iphone after all?!?
46
NumberCruncher 4 days ago 1 reply      
Tl;dr: if you are travelling in a foreign country and you are thinking about fonts of all things... girl, you are doing it totally wrong. Buy a city guide printed on good old-fashioned paper, even if it is so 1990, and get the fuck off-line! Or just talk to the natives! In person!
5
Time to Dump Time Zones nytimes.com
621 points by prostoalex  1 day ago   530 comments top 100
1
darawk 1 day ago 22 replies      
Hmm...this is an interesting idea. However, the core of the argument seems to be this:

> The economy thats all of us would receive a permanent harmonization dividend the efficiency benefits that come from a unified time zone.

But this editorial is pretty light on actually supporting that. The basic argument seems to be that it reduces 'translation costs'. But... does it? What about the benefits of being able to refer to times without having to localize them? If my friend on the other side of the country says "I woke up at 9 this morning", I have a pretty good idea of what that means. If we used this new system, I'd have to mentally translate.

In terms of scheduling things, it would get easier in some ways and likely harder in others. If, say, I want to schedule a conference call at 3, yes, 3 is the same time for everyone, but I'd still have to do some mental sanity checks to ensure that that time is reasonable for everyone who might be participating.

Overall, is there really an efficiency gain to be had here? I'm not taking the firm position that there isn't, btw. Just a bit skeptical and curious to hear a better argument in its favor if anyone has got one.

2
pfarnsworth 1 day ago 6 replies      
Getting rid of Daylight Savings makes complete sense, and it's something we should really pursue.

Getting rid of Time Zones is ridiculous. People know that 6am roughly is morning, and 6pm is roughly the evening. When you're dealing with someone internationally, you know not to call them at midnight their time because there's a high probability they may be sleeping. Having time roughly follow a standard around the world makes absolute sense because we're human.

3
edblarney 1 day ago 9 replies      
The article is upside down.

"Perhaps youre asking why the Greenwich meridian gets to define earth time. "

It doesn't. Everybody gets to have their own, proper time.

In comparing times, we +/- based on an arbitrary spot, and that's it.

Time zones are a great solution to a problem.

People want their time in local terms.

Everyone waking up and going to bed at different times is an utterly ridiculous concept.

FYI - if you want to use UTC - you're free to do that today.

To those suggesting we should use UTC: walk down the street and look at regular people. 'Other time zones' are irrelevant to them - utterly. There are very few people who need to deal with other time zones.

It might be remotely possible to put an entire nation on one time - i.e. put America on 'Mountain Time' - it might be possible to convince New Yorkers and Californians that they are getting up earlier/later etc.. But even that wouldn't be very useful in the end.

4
apaprocki 1 day ago 1 reply      
Maybe I'm the contrarian. I don't mind timezones at all. The thing that throws us for a loop is changing the timezone either temporarily due to some custom (Daylight Savings, Ramadan, elections, etc.) or permanently with little notice (populist tendencies in governments).

I maintain timezone infrastructure and what I'd rather see is an international treaty that all changes to country timezones require some standard period of notice. Even 90 days would be better than what we have now. Getting international politicians to care about timezones seems like a losing proposition before even starting, so I was imagining a "hack" to an existing treaty. The best one I could think of was an amendment to the International Maritime Organization (IMO). Perhaps if governments were made aware that screwing with, say, DST 3 days before it is about to go into effect would violate some treaty, they would shy away from that behavior. I know that's something the IATA could get behind.

5
powvans 1 day ago 3 replies      
I've spent the last several years developing applications that have calendaring at the core, where users collaborate across timezones, and where proper timezone handling is expected.

Just try to explain to everyday people how hard the problem is, the technical ins and outs, the practical miracle that it all works in almost all cases, and that all the error-prone complexity could be eliminated by acknowledging that it's the same time everywhere all the time. They will not understand and they will not care. They will most likely consider you crazy. Like Don Quixote tilting at windmills.

We implement technical solutions to hard problems in order to simplify daily life for the 99.9% of people who do not care about things like timezones. Notwithstanding the elimination of DST, this article is asking for the opposite. Furthermore, behind the scenes we already deal with time in the agreed-upon standard.

If your dates are not stored in UTC in your database, you are doing it wrong. If your API and client software do not deal in dates in UTC, you are doing it wrong. If your dates are not localized when displayed to end users, you probably have very unhappy users.
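
To make that last rule concrete, here is a minimal Python sketch of the store-in-UTC, localize-on-display pattern (the zone name is just an example, and zoneinfo needs Python 3.9+; older code would reach for pytz):

  # Store an unambiguous UTC instant; localize only at display time.
  from datetime import datetime, timezone
  from zoneinfo import ZoneInfo

  stored = datetime.now(timezone.utc)  # what goes in the database
  local = stored.astimezone(ZoneInfo("America/New_York"))  # what the user sees

  print(stored.isoformat())  # e.g. 2016-11-08T14:00:00+00:00
  print(local.isoformat())   # e.g. 2016-11-08T09:00:00-05:00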

6
jameshart 1 day ago 1 reply      
This proposal focuses on timezones as names for hours and completely glosses over what the relationship is between timezones and days. Does Monday November 7th 2016 run from 00:00 UTC - 23:59 UTC? Do people in Sydney start work at 21:00 UTC on Sunday and come home at 17:00 UTC on Monday? Or do different places around the world have different times when they flip the calendar over? (in which case you've just recreated timezones by another name).

You don't get to get rid of the international dateline, either, because when every country chooses which two daylight-periods a week to use as their local weekend, even if everybody aligns them with their closest neighbors to the east and west right the way round the world, someone's going to find themselves having to make a choice between matching their western neighbor or their eastern one, because they disagree by a day.

7
cyberferret 1 day ago 1 reply      
Living in a place that doesn't have the concept of daylight saving time, I positively hate the dance that we have to do twice a year with our interstate and overseas brethren to arrange meetings etc.

Having been a commercial pilot, I also appreciate the concept of 'Zulu time', where EVERYONE is on the same page as to when an aircraft will depart or reach a particular waypoint. No need to wonder if it is morning, noon or evening; if a crew member said they would be at a particular location at 0421, we all knew how many minutes ahead or behind we were, no matter where we were in the world. After all, everyone who cares about that reference is already awake and working at that time.

Currently, I work with a widely distributed remote team across the world. Yes, arranging meetings is hard in order to ensure that they fit with working schedules and awake times, but at the end of the day, I usually also clarify the meeting times in UTC, just to ensure that everyone can double-check in their local timezones.

This means using only one fixed datum for checking, rather than figuring out the remote timezone and its daylight saving rules and trying to work back to your local timezone. For instance, if I am in Australia and you are in the US, we each have to figure out the specific zone the other is in before working out the difference. I have no idea what zone Kalamazoo is in off the top of my head, and I bet you have no clue what timezone Darwin is in?!?

In our web apps, we always set our servers to the default UTC timezone, and try and use humanised time displays all over the app (i.e. "Updated 34 minutes ago") or use local browser timezones to display actual time. This way, it doesn't matter if someone in the Ukraine or in Alaska enters a record, it is always "x minutes from a fixed datum" no matter what your local time is, and it seems to make more sense to the users.
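
A minimal sketch of that humanised display in Python (the helper name and thresholds are made up for illustration; timestamps are assumed to be stored in UTC as described):

  from datetime import datetime, timezone

  def humanize(updated_at: datetime) -> str:
      # Same fixed datum for every viewer: the difference to UTC "now".
      minutes = int((datetime.now(timezone.utc) - updated_at).total_seconds() // 60)
      if minutes < 1:
          return "Updated just now"
      if minutes < 60:
          return f"Updated {minutes} minutes ago"
      return f"Updated {minutes // 60} hours ago"

This is why it reads the same whether the record was entered from Ukraine or Alaska: the subtraction happens against one datum, and no local zone is involved.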

8
kryptiskt 1 day ago 5 replies      
So you want to abolish time zones (https://qntm.org/abolish)
9
olliej 1 day ago 0 replies      
Someone wrote an article a while back about why this is a stupid idea -- mostly things like "if I want to call so and so in Australia I would now need to find out what time of day in Australia is the time I consider to be evening"

Computers already work in terms of utc (assuming correct code ;) ) -- this article is mostly "I know what time I'm in, why should I have to consider other people?"

If we were to get rid of anything it should be DST, as we aren't agrarian and there's no evidence that the modern "save power" claims are remotely accurate. But then again it's a politically cheap thing for politicians to change in order to "help the environment", and helping the environment is always good, right? :-/

10
glook 1 day ago 0 replies      
I remember https://en.wikipedia.org/wiki/Swatch_Internet_Time - I loved the concept. Maybe if we switched to beats we wouldn't have an association for 'noon.'
11
mjevans 1 day ago 2 replies      
As this version of the thread seems to have more points I'll comment here.

I completely agree with everyone using UTC for numeric time numbering.

I DISAGREE, with remapping 'noon', 'midnight', 'morning', etc. All of the relative descriptions for when in the local solar day a thing is should be approximate local references.

An example: 'lunch' and 'noon' would still be the time in the middle of the local solar day that people get a meal. (Somewhere around 11AM to 2PM in current local times)

12
nxc18 1 day ago 0 replies      
Scheduling a recurring meeting of busy people (think college students with busy schedules) across continents, cultures and countries is hard enough without timezones and arbitrary changes in daylight savings time.

For example, Dubai doesn't have DST - it never changes relative to the others. Kosovo and Croatia change a week before the U.S. changes. Then the U.S. changes but Dubai doesn't.

For about 6 weeks out of the year, scheduling is confusing chaos, and a workable schedule under one time configuration very likely doesn't work for the others, keeping in mind that a midday meeting in the U.S. is pushing on midnight in Dubai and family time in central Europe.

The whole situation is a disaster, and while manageable, certainly takes a lot of effort, planning, and luck to get right.

13
captainmuon 1 day ago 0 replies      
Another radical idea: we could do the opposite and use local astronomical time everywhere.

One reason unified time zones were introduced was train timetables. I haven't used one directly in quite a while; instead I've used websites, apps and public displays. They could all accommodate the fact that different train stations had slightly different times.

Unified time is also important for TV show airing times. But scheduled TV is getting less and less important compared to video on demand. (Also, if you really wanted to, future set-top boxes could modify the text of announced times on the fly. Text-to-speech has become pretty good :-))

I believe this would also encourage places to adjust their opening hours relative to sunlight, which would probably be healthier and better for the environment in the long run.

14
brongondwana 1 day ago 0 replies      
Come back to me when you've managed to convince the USA to adopt metric, which is significantly more of a no-brainer than throwing away timezones.
15
artpepper 1 day ago 0 replies      
This is basically a form of Utopian thinking, that we should adopt a convention because it's "logical" rather than one that meets human needs.

> People forget how recent is the development of our whole ungainly apparatus. A century and a half ago, time zones didnt exist.

That's true, but not because everyone was living on GMT. Time zones are somewhat arbitrary, but aligning everyone to a single time zone is even more arbitrary.

16
taeric 1 day ago 0 replies      
This should be required reading for anyone thinking time zones are a bad idea: https://qntm.org/abolish
17
upofadown 1 day ago 2 replies      
If you have UTC and solar time available, you don't need zone time anymore. If you need to coordinate with other people you use UTC. If you want to do things at a particular time of day you use solar time.

If you decide to go to work at, say, an hour after sunrise you get all the advantages of DST, anywhere in the world, without any of the disadvantages. The reason DST sucks is because of the way time zones force all time to be an even number of hours (yes, I know there are exceptions, but the principle still holds).
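
For what it's worth, mean solar time falls straight out of UTC and longitude, since the sun crosses 15 degrees per hour. A rough Python sketch (it ignores the equation of time, so it can be off by up to about 16 minutes):

  from datetime import datetime, timedelta, timezone

  def mean_solar_time(longitude_deg: float) -> datetime:
      # Positive longitudes are east of Greenwich, so solar time runs ahead.
      return datetime.now(timezone.utc) + timedelta(hours=longitude_deg / 15.0)

  print(mean_solar_time(130.8))  # e.g. Darwin, roughly 8h43m ahead of UTC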

18
barnacs 1 day ago 2 replies      
I'd go even further and divide every human settlement into multiple districts with schedules shifting gradually between them.

Like, in one "district" of the city most people would be sleeping, it would be quiet and somewhat dead, while at the same time a few "districts" over in the same city businesses would be open, including banks, offices and other stuff on extremely strict schedules currently. Yet in another district it would be leisure time, where most people would be doing whatever they do between sleep and work.

The point is, you could always find the appropriate district for whatever you want to do, be it business, leisure, sleep or anything.

Also, with digital timekeeping devices (watches, calendars, digital presentation of business hours, etc) schedules could all be dynamic. Instead of "2016-11-09 16:00" you could schedule things like "3 days and 2 hours from now" and all devices would dynamically keep track of how much time is left until the event.
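
A tiny Python sketch of that countdown idea (the formatting is purely illustrative; the event is still pinned to an absolute UTC instant underneath):

  from datetime import datetime, timedelta, timezone

  def countdown(target: datetime, now: datetime) -> str:
      days, rest = divmod((target - now).total_seconds(), 86400)
      return f"{int(days)} days and {int(rest // 3600)} hours from now"

  now = datetime.now(timezone.utc)
  event = now + timedelta(days=3, hours=2)
  print(countdown(event, now))  # "3 days and 2 hours from now"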

19
gdw2 1 day ago 1 reply      
I would imagine for many places, a normal 'day' would actually be split across two dates. That would be confusing!
20
henrikschroder 1 day ago 0 replies      
This is stupid. This is obligatory reading: https://qntm.org/abolish
21
norea-armozel 23 hours ago 0 replies      
This is definitely not an issue that I think can be solved by merely doing away with time zones. Time zones, at least in the US, were part of coordinating arrival and departure times for the railroad, which was a big problem when two towns that were mere miles apart were on two different times (sometimes by an hour). I think the real problem is that every governing body involved with time measures seems to think theirs is the best. UTC is awesome, but by itself it's not enough to synchronize with local habits, which are dictated by solar time (even when considering the most extreme ranges of time zones).

At the heart of the matter is: how do you fix time zones such that governments respect solar time over their own desired time frame for human activity? Honestly, I have no solution beyond a fierce beating of the average legislator/bureaucrat, because honestly timezones should be 15 degree slices around the Earth, since that approximates its rotation well. Mind you, it is going to be a mess for people living near or at the poles regardless of how you settle it, but for the majority of humans a simple 15 degree slice of the Earth gets the time zone problem as close to a solution as possible. But the odds of any major government (or even local government) adopting this are nil. So I just pray no one actually tries to reform the existing timezones, otherwise we might get something worse.

22
egypturnash 1 day ago 0 replies      
okay so

let's substitute one completely arbitrary time measurement (Greenwich Mean Time, which is basically "solar time outside an observatory in Greenwich, England") for an intricate set of time zones that is, admittedly, confusing, but also has some vague relationship to "solar time in the area covered by the time zone"

it'll be great

almost everyone on the entire planet will have to get used to workday hours being different numbers, and we'll still have to do timezone calculations when we want to try and make realtime contact with someone in what was formerly another time zone

it'll be super awesome

oh hell let's just all switch to Swatch Internet Time while we're at it, throw out all the analog clocks.

(I mean yeah, fuck daylight savings time anyway, but sure let's throw out the baby along with the bathwater and make life marginally easier for people who regularly schedule intercontinental phone calls. They're the only people whose opinion matters evidently.)

23
saretired 1 day ago 0 replies      
This editorial seems to assume that clocks were more or less synchronous prior to the introduction of time zones -- this is completely false. Time zones were introduced (at the behest of the railroads in the U.S.) because communities had their own local mean time, and often those communities were not distant from one another. In other words, there were an indeterminate number of time zones worldwide. The current time zone map, for all of its peculiarities, is hardly difficult for people to grasp: it gets light in the early hours, etc. China has a single time zone for the entire country -- perhaps the author expects a Maoist "harmonization dividend" worldwide, but speaking to Chinese friends over the years, I'm not convinced that such a happy dividend ever materialized there.
24
reflexive 1 day ago 1 reply      
The advantage of daylight savings is I can set my schedule to wake up at the same time each day, while maximizing my daylight hours and never getting up before dawn.

Without the shift for daylight savings, the sunrise time in San Francisco varies from 5:47a at the summer solstice, to 8:25a just after winter solstice. If I set my schedule to get up at 8:25a every day, I'd miss a significant amount of morning sunlight.

The beauty of daylight savings is it shifts the sunrise time to be earlier in winter, so instead of 8:25a the sun rises at 7:25a. Thus I can set my schedule to get up at 7:25a and get an extra hour of sunlight every day.

(Technically the latest sunrise time is just before shifting from daylight to standard, 7:39a on November 5 - the point is we reduce the range of variation from 2hrs 38min to 1hr 38min)

25
droithomme 1 day ago 0 replies      
UTC+8 has the largest population by far:

http://artscience.cyberclip.com/world-population-by-time-zon...

Therefore that is what everyone should use rather than Greenwich time as it would provide the least disruption.

26
laurieg 1 day ago 0 replies      
I think less reliance on clock time would be a good thing for the modern world, but I don't think getting rid of time zones would achieve that.

It's amazing how much of modern life is unnecessarily affected by the time on the clock. When daylight savings time rolls round in the UK you can make 60 million people change their schedule by an hour, and even miss an hour of sleep while doing so! Who would have thought the humble clock had such control over people?

In the modern world of plenty, why is it so incredibly rare to meet someone who goes to bed when they're tired and wakes up when they're not?

27
rcarmo 1 day ago 2 replies      
I'd be happy with doing away with Daylight Savings Time.
28
danso 1 day ago 1 reply      
Dumping time zones might be painful, but in the long run it'd probably be the right thing to do. First, it'd make learning about time as a programmer much easier, because you'd inherently know that it's a bit complicated, and why epoch time and UTC are a thing instead of storing values like "Sunday 4PM".

But more future facing...what are we going to do when we have colonies on the moon and Mars? Things are going to spiral fast if we don't stick to uniform time while we're still mono-planet.

As an example, NASA's Mars photos API provide an option to sort by Martian sol instead of earth date: https://api.nasa.gov/api.html#MarsPhotos
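
For instance, a quick sketch of querying that API from Python (the rover endpoint and field names follow NASA's public docs at api.nasa.gov, but treat the exact response shape as an assumption; requests is a third-party library):

  import requests

  resp = requests.get(
      "https://api.nasa.gov/mars-photos/api/v1/rovers/curiosity/photos",
      params={"sol": 1000, "api_key": "DEMO_KEY"},
  )
  for photo in resp.json().get("photos", [])[:3]:
      # Each photo carries both clocks: a Martian sol and an Earth date.
      print(photo["sol"], photo["earth_date"], photo["img_src"])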

29
mckoss 1 day ago 0 replies      
I don't think people could easily adjust to having the date and day of the week change every day at, say, 2pm (for those on PST).
30
facorreia 2 days ago 2 replies      
I think the article makes a valid point but it can't get past Americanisms like "noon" or "4 p.m."

If people were to adopt UTC, those would be 12:00 and 16:00 respectively.

And that just reinforces the main point, that the numbers are largely arbitrary -- but descriptions like "after mid-day" (p.m.) aren't.

31
wst_ 16 hours ago 2 replies      
Could anyone please explain to me why 12:00, the noon, is 12PM? Why PM? I come from Europe, so it may not be so obvious to me. My reasoning is: assuming a 12h clock, we have 12:00 twice a day, once in the morning and once at night. So it would be only natural to say 12PM at midnight, not at noon. Since we are already dividing the day into two halves, why not use exactly the same division for AM/PM? And yet, in the morning we (or actually you guys) say 11AM but 12PM.
32
ajmurmann 1 day ago 3 replies      
I have huge issues with DST and how we measure time in general (24 hours, 60 minutes, 60 seconds?! Who the f* came up with these crazy-ass numbers?!). But what bugs me much more is the insane calendar: 12 months with arbitrary numbers of days. That's just crazy. At the very least we could make it so that January to May have 31 days and the remaining months get 30. Ideally we would switch to the positivist calendar. That would make everything date-related trivial, easy to remember and calculate.
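
To see why the 13 x 28 idea makes date math trivial, a toy Python sketch (it ignores the intercalary "year day" a positivist calendar needs to reach 365):

  def positivist_date(day_of_year: int) -> tuple:
      # Every month is exactly four weeks, so month/day fall out of divmod.
      month, day = divmod(day_of_year - 1, 28)
      return month + 1, day + 1

  print(positivist_date(60))  # (3, 4): month 3, day 4
  print((60 - 1) % 7)         # weekday index: the same rule in every month
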
33
stretchwithme 1 day ago 0 replies      
I've been thinking this for years, as dealing with time has been a chore in software development.

But I think it would make life simpler for those with the resources to deal with time zones and harder for those with less resources.

You wouldn't want a change made in medicine that made life easier for doctors but harder for patients, would you? I mean, which of the two is trained to deal with complexity and well paid to do it? And which is often overwhelmed by what's happening to them?

Of course, software is going to make time easier and easier to deal with, so perhaps it doesn't matter whether we change how we deal with it or not. Not in the long run, anyway.

Software may get so good at dealing with time that we each can deal with it exactly how we want to and the software we used to share information will seamlessly deal with all the translation back and forth.

34
cushychicken 1 day ago 0 replies      
China has one time zone. People outside of Beijing and the eastern Chinese seaboard don't experience sunrise until as late as 10 AM in some places. I don't find that terribly sensible, frankly. It might make scheduling communication easier, but I agree with a lot of the other comments pointing out that 9 AM actually means something to people as far as time of day goes.
35
pipio21 1 day ago 0 replies      
"Time" has always been linked to sun or moon or the stars.

In English you say o'clock because it is the time of a mechanical clock; for a long time in history it was normal, and far more precise, to use the sun, so 12 was the time the sun was at its highest (zenith).

Astronomers used the moon and stars for calculating time with extreme precision, and they continue doing that.

This always gives you local time at the point of the observer. The man who wrote the article probably lives without contact with the environment, in a city, and goes to work in a building without sunlight; but for those of us who do not, knowing when the sun will be at its highest, and when it rises and sets, is a great thing.

With GPS-enabled clocks, like the Apple Watch and future smartwatches, every person could carry in their watch the real local time, the political local time zone's time, and UTC. No need for globalists trying to force us into doing that.

In America, and especially New York, it looks like the only important things in life are first money, then the economy. Instead of money and the economy being in service of the world, in those places the world has to serve the economy and money.

36
grzm 2 days ago 3 replies      
grr...

"A century and a half ago, time zones didnt exist."

Sure. How old are the concepts of noon? Midnight? The transition to railroad time and time zones wasn't from a single time zone to many, but from incredibly local time zones (what time is noon in this town?) to fewer.

Likely a lack of imagination or caffeination on my part, but it's hard for me to imagine what it would be like to have shops open at, say, 22:00. And we still need to take into account differences across the world when coordinating with people. It's not like we're going to change our diurnal habits just because it's, say, 14:00 across the whole world at the same time.

Wow. This is one of the most caustic comments I've left on HN. Someone back me off of the ledge. Off to read the Cato Institute commentary linked to in the article.

(And you kids! Get off my lawn!)

37
taf2 1 day ago 0 replies      
I wish this was about daylight saving and the random changing of the time zone. I can live with the different time zones, and I even think they are kind of nice, but daylight saving changing the current time is terrible. You could argue it has cost real money (think Azure going offline), and think of all the countless hours people have spent dealing with the change.
38
sschueller 1 day ago 0 replies      
Swatch tried to do that a 'very' long time ago: https://www.swatch.com/en/internet-time/
39
mvindahl 1 day ago 1 reply      
There should be a single, unified time across the globe. It would greatly simplify matters. I propose something like Friday around five-ish in the afternoon.
40
vidoc 1 day ago 1 reply      
Very interesting article. I've wondered about a universal time several times; there are tons of pros and cons, sure, but it's cool to think that people are thinking about such disruptive alternatives.

Reminds me of this cool episode of 99% Invisible [1] where they talk about calendar design; in particular, the Kodak company used a 28-day, 13-month calendar until the 70s, and it turns out it was beloved by teams who did a lot of forecasting.

1: http://99percentinvisible.org/episode/the-calendar/

When I see how tough it is to move to the metric system in the US, I'm not quite sure we'll ever see such a change any time soon though!

41
joquarky 1 day ago 0 replies      
I agree with getting rid of the DST switch. But getting everyone to use UTC is messy.

If you're going to overhaul the system, why not base time on the longitude where the sun is at the meridian (noon)?

Of course, you'd have to use degrees with decimals to avoid confusion over minutes and seconds. And even then, it might be better to use a term other than degrees to avoid confusion with the weather.

42
benjaminmhaley 1 day ago 0 replies      
Actually we need local and universal time.

Say you have a meeting with remote participants. You want to know that it's at 9am UTC. With only that everyone knows when to meet. You also want to know that it's at 4am local time. That way you know that you will be asleep.

Universal time is useful for communicating. Local time is good for understanding whether people are likely working, eating, or sleeping.

43
glandium 1 day ago 1 reply      
I agree with the sentiment in many of the threads that this is an awful idea. OTOH, there is some level of confusion with time zones in the US: I don't live in the US, but the few times I went there, it took me a while to figure out the whole "8|7c" thing for TV programs, where, AIUI, the east and west coasts get to have programs at the same local time (8), but obviously not the same absolute time, and states in the middle get to have programs at the same absolute time as the east coast, but earlier in their day (7).

Or did I get it wrong, and the whole thing is even more confusing?

44
contingencies 1 day ago 0 replies      
The author James Gleick wrote The Information, which was recently recommended to me, but I found it a bit slow-going and obtuse. It didn't really resonate with me coming from a comp-sci pragmatist background, though some here may enjoy it.

As far as TZs go, IMHO one of the biggest issues is the TZ database, which itself lacks support for i18n and many modern pragmatic concerns. I once made a proposal to modernize it, but it fell on deaf ears... it is, understandably, fairly conservatively maintained.

45
runeks 20 hours ago 0 replies      
We already have a universal time zone, UTC. Anyone is free to use it and some do, mostly in software. Why do we need to "dump" anything? Just use UTC for coordinating time globally, and local time zones for local coordination.
46
131hn 1 day ago 0 replies      
Having all the clocks of the world in sync would be a blessing, but there would remain a gap in time information, so people would add a location offset themselves (we mostly use time to describe life in our own timezone): "yes, it's 4 am, and I'm in Paris". That location offset would be nothing but the current timezones, with an inverted logic. I know we are all on different schedules and rhythms, but 7am is a good time for breakfast everywhere, 12:30 feels fine for lunch, kids go to bed by 8pm, and we should sleep at 2am. I guess timezones make life easier for 80% of the world (I don't think it's 99% anymore).
47
namank 1 day ago 0 replies      
So 2pm Earth Time for me in North America means afternoon but it means night for Asia?

This might make sense if Earth's communication with extraterrestrial localities significantly outweighed communication across Earth.

But it doesn't.

So, no.

But upvote to the OP for posting an interesting idea. :)

48
daveheq 1 day ago 0 replies      
This is idiotic; it would actually have the same problem but worse: translating time to where the sun is relative to Earth. It's called false simplification: it looks easier on the surface but is actually more difficult for people, just for some blog writer's own convenience.
49
foxylad 1 day ago 0 replies      
We run a global booking service. Fixed timezones are simple to accommodate, but abolishing daylight savings would save me hours of scribbling clock faces.

The issue is that when someone books a given time across a DST change, you need to adjust the unix timestamp in the right direction. And in my experience, your first guess at the direction is always wrong - even if you know your first guess is always wrong.

Still, I guess that making a few developers' lives easier is less important than saving several lives a year through reduced pedestrian deaths.
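
To make the direction problem concrete, the same wall-clock time maps to different UTC offsets on either side of a transition. A small Python sketch (zoneinfo needs Python 3.9+, and US Pacific fell back on 2016-11-06):

  from datetime import datetime
  from zoneinfo import ZoneInfo

  tz = ZoneInfo("America/Los_Angeles")
  before = datetime(2016, 11, 5, 9, 0, tzinfo=tz)  # PDT
  after = datetime(2016, 11, 7, 9, 0, tzinfo=tz)   # PST

  print(before.utcoffset().total_seconds() / 3600)  # -7.0
  print(after.utcoffset().total_seconds() / 3600)   # -8.0

So a recurring "9:00 local" booking shifts by an hour in UTC across the change, and the adjustment goes the other way in spring.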

50
paulvs 22 hours ago 0 replies      
> New York (with its longitudinal companions) will be the place where people breakfast at noon, where the sun reaches its zenith around 4 p.m., and where people start dinner close to midnight.

He must mean latitudinal companions...

51
endymi0n 1 day ago 0 replies      
Public Service Announcement:

https://qntm.org/abolish

(So you want to abolish Time Zones...)

52
mark_l_watson 1 day ago 0 replies      
The vote is in: I like the idea and my wife hates it :-)

I agree that using Earth Time would harmonize business and personal relations around the world. I work remotely, and time zone calculations for scheduling meetings, etc. are a small inconvenience, but why have any inconvenience at all?

53
ced 1 day ago 0 replies      
People on the West Coast would start their work week on Sunday around 11PM, and finish it on Monday at 8 AM. Then they would agree to a Wednesday morning meeting, and miss each other by a day.
54
bitwize 1 day ago 0 replies      
Time zones I'm okay with. But DST needs to GTFO.
55
leroy_masochist 1 day ago 0 replies      
Meh. The military has been trying to run everything on Zulu time for years and it hasn't gotten a lot of traction because everyone seems to hate it. (Speaking from experience on the ground side, I know it is a bit more commonplace and accepted in the aviation community).

At least as far as the US is concerned, if we're going to change a major standard of measurement, let's focus on going over to the metric system.

56
Spooky23 1 day ago 0 replies      
Interesting thought experiment, but what problem does it solve?
57
MistahKoala 23 hours ago 1 reply      
What if there were two sets of time? The 'local' timezone, as we know it, and a universal timezone - like Swatch attempted to create with Beats way back (in the last century?).
58
mcs_ 1 day ago 0 replies      
If the problem is the financial operations, let them have a time zone.

A consideration: if you plan something at 8 am, who/what makes that possible? Probably a clock! So:

1) reset the time
2) start having business-related time zones
3) start having speed-related time operations

My clock tells me when I have to speed up or slow down to accomplish a task.

There are a lot of applications that could contribute to a better life:

Speed Clock for Uber! Speed Clock for development! Speed Clock for ...

I mean, who cares about 2016 AD at this point? You can always have a Coca-Cola party with gifts at some moment in the "year" if you want.

(Note: this idea is absurd and will never work)

59
smokedoutraider 1 day ago 0 replies      
So time would only make sense if you're in England. Like most things coming from the NYTimes lately, this is another pathetic joke.

Now, dropping daylight saving: that does make a lot of sense.

60
drallison 1 day ago 1 reply      
Time zones and, worse, leap seconds interfere with and needlessly complicate recorded time: the ability to compare two times and compute the time interval between them. Anyone who has tried to work with geophysical data from diverse locations and organizations knows how difficult that is.

Now would be a good time to adopt this change, given the way standard units (mass, length, time, etc.) are being redefined.

61
remoteme 1 day ago 1 reply      
We should use local relative times based on sunrise.

"I start work at 3 past".

Meaning he or she starts work at 3 hours after sunrise.

62
Lazare 1 day ago 1 reply      
I remember when I thought the same thing. Then I spent a couple hours thinking about it, and realized what a terrible idea it was.

In my day to day life, I often need to talk to fellow tech people (engineers, support, etc.) in my own time zone, but also in places such as Sydney (2 hours behind me) or California (20 hours behind me). As a society, we've agreed that tech people like that are generally working 9-5, give or take, and we have a universally agreed mapping of locations to adjustments to local time (aka, time zones).

This means that I can look at a clock and realize "oh, it's only 10am, I shouldn't expect a reply to my email to that guy in Sydney yet". Even if I'm dealing with a place I'm not really familiar with, I can look up the timezone, or just google "time in Berlin", and I now can translate appropriately.

But what if we abolished that mapping (aka, timezones)? Presumably every location would just continue on about their life as before. Sydneysiders with desk jobs would stop working 9-5 AEST, and would start working 19-3 UTC; New Yorkers in similar roles would be working 4-12 UTC. Sounds odd, but it would quickly start to seem normal.

Problem: We just abolished the mapping tables that let me adjust for local time. It was trivial to work out that my Sydney colleague was probably out to lunch at 12:45 AEST, but how am I meant to know if he'll be out to lunch at 22:45 UTC?

The only way to make this work is to have or create some form of mapping that tells me the adjustments to make to bring Sydney schedules in line with my own schedule. Eg, "I go to lunch at 00:30 UTC, and Sydney is about 2 hours behind me, so they'll be at lunch now". Because without that table, I simply have no clue what (or rather when) people in Sydney are doing.

But that table is just another word for time zones, the thing we abolished. Even better: while there are workarounds, most of them are very difficult to computerize. For example, I could check the corporate website of the guy I'm trying to call, see if they post office hours (in UTC), and then try and work out when they'll be at lunch, but my calendar app wouldn't be able to do that. It needs a formal, agreed-upon table of adjustments. Which would be time zones, whatever we want to call them.

TL;DR: This is an idea that makes a lot of sense to people who only have to deal with people who are geographically close to them. If you're actually dealing with people around the world, time zones are not an inconvenience, they're critical.

63
chx 1 day ago 0 replies      
> No more wondering what time it is in Peoria or Petropavlovsk.

Sure, I know it's 10am in Petra and so what? Is my partner in the tourist business dead asleep or just stirring? If you detach these numbers from the life of people then you need to find some other way to remember their cycle.

64
syrrim 1 day ago 0 replies      
Let's review the arguments:

- Time hasn't always had zones, so why not get rid of those too?

- Nazi Germany, communist China, and North Korea all do it, so clearly this is a policy befitting the modern world. Presumably alongside mass censorship and death camps.

65
guilt 1 day ago 0 replies      
I think the first thing to dump is the daylight savings time concept.

Then rebase the timezones based on UTC.

It won't help when you have a polar winter or a midnight summer - but at least it will remove unnecessary conversions.

66
jwfxpr 1 day ago 0 replies      
The United States still uses Imperial measurements and Fahrenheit, and the NYT publishes an article arguing for an unnecessary, human-unfriendly re-standardisation of time zone(s)...?

Mmhmmm. The whole world's gonna take THAT idea seriously.

67
amelius 1 day ago 0 replies      
The article is a little short-sighted. What time will we use when we start colonizing other planets? Those planets may not even have a 24-hour day. And to make matters worse, relativity tells us that a global clock doesn't even exist (look up simultaneity).
68
frik 1 day ago 0 replies      
Time to dump summer vs. normal time. Switching twice a year, and every other country chooses a different date or doesn't switch at all... It's time to stop switching between summer and normal time.
69
ChicagoDave 1 day ago 0 replies      
Having worked on computers for 30 years and seeing a recent uptick in the globalization of system usage, getting rid of time zones would be a fucking awesome fix for localization nightmares with data.

DO IT!!!

70
OOPMan 1 day ago 0 replies      
I don't mind timezones, but DST is the devil. And I don't even live in a country that has it.
71
btbuildem 1 day ago 0 replies      
Nix the DST, but keep the timezones. Maybe clean them up a bit (standard 24 or smth).
72
ericzawo 1 day ago 0 replies      
A crazy thing my brother and I noticed this morning is that the last clock we have to manually change ourselves is our watches. Yes, even the oven has a built-in DST time change.
73
nealmydataorg 1 day ago 0 replies      
It will be difficult for all the countries to agree to dump time zones. If they do agree, then adjusting computer systems will require an effort similar to Y2K.
74
tancybertitan 1 day ago 0 replies      
Stop upvoting stupid posts. Time zones are a perfectly sound way of calculating the current time anywhere on the planet. We should just try to improve them by removing skewed concepts like daylight saving and non-uniform time zones. Saying that we need to get rid of the concept of time zones is like saying we need to get rid of the metric system.
76
nullc 1 day ago 0 replies      
We can't even manage to agree to get rid of the nearly pointless and _highly_ disruptive leap second...
77
mememachine 1 day ago 0 replies      
Time zones make cross-geographical communication far easier. This is important in the age of the internet.
78
neves 22 hours ago 0 replies      
Will it be easier than making America use the metric system?
79
mixedCase 1 day ago 0 replies      
Archived version: https://archive.fo/oZVQ2
80
m1sta_ 1 day ago 0 replies      
A single, open, calendar interface for all people would solve this. I cannot see that happening though.
81
brianwawok 1 day ago 0 replies      
I vote for infinite time zones. Each town picks their own timezone based on solar noon.

We have tech to map for us when traveling.

Bringing 1900 back.

82
bbcbasic 1 day ago 1 reply      
Sounds good to me. If the farmers want daylight savings then simply go to work at a different time, UTC.
83
shmerl 1 day ago 1 reply      
I don't mind ditching DST. It's a mess. Not sure why it's still in use.
84
matjaz2k 1 day ago 0 replies      
Wait until we go interplanetary. :)

Then also "let's talk in 5 mins" won't work anymore.

85
return0 1 day ago 0 replies      
This will be useful when we've colonized a few planets. Until then, hell no.
86
taylodl 1 day ago 0 replies      
How about we start with ending daylight saving time and see how that goes, hmmm?
87
Mz 1 day ago 0 replies      
Prior to railroads, local time was determined by making it Noon when the sun was directly overhead. Time zones became necessary when railways spread in order to be able to catch your train. Airplanes use Zulu Time because they can fly fast enough to make it a confusing mess to do anything else. This is the same problem the railways had, but the next order of magnitude up.

We already have and use Zulu Time for airplanes. There is no compelling argument here for making that universal. Local time zones still make sense in terms of setting schedules for local services, jobs, etc. The fact that I can talk to people all over the world isn't a compelling reason to move to a singular Earth Time.

88
ryanbertrand 1 day ago 0 replies      
It would reduce a lot of code in Date libraries :)
89
paulddraper 1 day ago 0 replies      
Agreed. Baby steps: get rid of daylight saving time.
90
petre 1 day ago 1 reply      
First get rid of leap seconds.
91
vacri 1 day ago 0 replies      
Given how difficult it's been to get the US and the UK to adopt something as simple as a sane measurement for distance, it's weird that so many people think that it's even vaguely possible to get them to change measurements for something as complex as timekeeping...
92
InclinedPlane 1 day ago 0 replies      
No, just no. You still need to keep track of local times relative to the Sun because that's important. And time zones are the best way to do that. We could improve our usage of time zones, but abandoning time zones is just a silly idea.
93
peterwwillis 1 day ago 0 replies      
Has anyone else noticed that dialing a phone number to call or text someone is like using their IP address to send them an e-mail?
94
grzm 1 day ago 0 replies      
95
abcd_f 1 day ago 0 replies      
Looks paywalled to me.
97
grzm 1 day ago 1 reply      
mod: Can you please combine the three threads?
98
Longhanks 1 day ago 0 replies      
> Don't have an account? Sign up here!

No thanks.

99
adamnemecek 1 day ago 0 replies      
What will Jon Skeet do now?
100
gwbas1c 1 day ago 4 replies      
Ok, pretty much every comment here is about why time zones are important. I agree!

But one thing to ponder: Just like telling time in 24-hour mode is useful, perhaps it will be useful to have some clocks display UTC? Perhaps it'll be useful to talk about international events in UTC?

Perhaps instead of abolishing time zones, we just use UTC as a convention when talking about events that happen across time zones?

6
The People's Code code.gov
567 points by rmason  4 days ago   191 comments top 29
1
joshdotsmith 4 days ago 2 replies      
I really like the work of folks at 18F, USDS, CFPB, et al.

Bearing in mind that I'm ecstatic to see this collection of projects and someone clearly put a lot of time and effort into this already, I do have couple points of hopefully constructive feedback here:

- Please list all of your repositories and feature ones that you think deserve special recognition. Seeing all your repos is a 4-step process otherwise: click on the project, click on the link, click on the acknowledgement that I clicked a link, click on the organization name on GitHub. [1]

- Some information on contribution policies, pulling in your README.md, etc would be helpful on this site. [2]

- When I click on the link to go to your repo, I know I'm clicking a link that leaves your site. You even say so. "But you probably knew that already." Don't hijack it, please. [3]

- The activity feed doesn't provide much value in its current form. Most of these GitHub events are completely without context. [4]

EDIT:

I've since opened the following issues on https://github.com/presidential-innovation-fellows/code-gov-.... Anyone who has feedback, definitely let them know (kindly!).

[1] https://github.com/presidential-innovation-fellows/code-gov-...

[2] https://github.com/presidential-innovation-fellows/code-gov-...

[3] https://github.com/presidential-innovation-fellows/code-gov-...

[4] https://github.com/presidential-innovation-fellows/code-gov-...

2
former_govt 4 days ago 2 replies      
I admire the work that 18F is doing, and I live in the DC area; I even contracted for the government early in my career -- but I can never go back. I just fundamentally disagree with the security clearance apparatus that is in place and never want to subject myself to that again. Especially when purely private companies pay more and don't care what you do in your personal time.

Working for the government felt like being a child again.

3
OliverJones 4 days ago 4 replies      
How about open source code for some makes and models of voting machine?

The US Veterans' Administration health records software system is in the public domain. https://en.wikipedia.org/wiki/VistA#Licensing_and_disseminat... But it's not listed here. (It's also kind of complex. "wget; tar x; ./configure; make" probably won't get you a running instance. Still.)

4
CiPHPerCoder 4 days ago 1 reply      
https://github.com/samilliken/openDCIM/search?utf8=%E2%9C%93...

https://github.com/USEPA/E-Enterprise-Portal/search?utf8=%E2...

Looks like the US Government is oblivious to the risk of PHP Object Injection.

https://paragonie.com/blog/2016/04/securely-implementing-de-...

It looks like they're publishing code that, in particular, is vulnerable to CVE-2015-2171.

https://github.com/samilliken/openDCIM/blob/d3e137294179e392...

5
jknoepfler 4 days ago 1 reply      
This is a great idea. I feel like the big picture isn't open source code for government, but open source APIs for interacting with the government. In the medium-term future, I think we should push for a system where all government documents that ought to be a matter of public record are managed in a publicly readable, versioned source repository, like a federal git server.

Other things that I'd like to see in publicly visible git repositories:

- federal laws and regulations (particularly regulations, standards, etc.).

- procedures (rules for federal processes)

- all documents for making requests of the federal government, with mechanisms for getting those requests to the right place.

6
pdkl95 3 days ago 2 replies      
Why the subsidy to Google? Sending tracking data to Google isn't appropriate for a government site.

  <script>
    /* i='GTM-M9L9Q5' */
    /* ... */
    src='https://www.googletagmanager.com/gtm.js?id='+i
    /* ... */
  </script>
  <!-- ... -->
  <noscript>
    <iframe src="https://www.googletagmanager.com/ns.html?id=GTM-M9L9Q5&gtm_auth=GTM-M9L9Q5"
            height="0" width="0" style="display:none;visibility:hidden">
    </iframe>
  </noscript>
Also, requiring Javascript by building this as a single-page application is a terrible design for this kind of site. Almost all of it could be static pages or a traditional web framework. Requiring Javascript made the download much larger than necessary, slowed down the page load time a lot, and forced the page to reflow multiple times as the data arrived. A wide variety of web frameworks could have rendered and cached static pages instead of massively over-engineering the site as an "app".

7
kensai 4 days ago 4 replies      
"The Federal Source Code Policy is designed to support reuse and public access to custom-developed Federal source code. It requires new custom-developed source code developed specifically by or for the Federal Government to be made available for sharing and re-use across all Federal agencies. It also includes an Open Source Pilot Program that requires agencies to release at least 20% of new custom-developed Federal source code to the public."

That's really nice, but I think the 20% is too little. It should be at least 50%. They could still keep a critical low percentage secret.

8
weka 4 days ago 2 replies      
Took a look at some of the code.

I see one for the Dept. of Commerce:

The Commerce.gov API is under active, but not public, development. As such, API code is not currently made available publically. This Github repository will be used to collect and respond to feedback regarding the API and engage with developers interested in using the API.

---

I don't believe that's the correct usage of GitHub. What's the point if you're not going to showcase the code?

9
neom 4 days ago 0 replies      
This is awesome! At work we're trying to help with this at the municipal level; it's hard, so it's really great to see it coming from the Fed. Shout out to Becky Sweger [1] from 18F, who has been instrumental in helping move things forward and talking about open data and open code.

[1] https://twitter.com/bendystraw

10
sandGorgon 4 days ago 0 replies      
The one project that I would love to see open sourced is the FBI Sentinel project.

A lot of countries have broken infrastructure for law enforcement. Just like USAID, it would probably be more efficient to give software that can be adopted.

The Sentinel project captures millions of man-hours of wasted... and ultimately successful product development, all focused on law enforcement collaboration. It would be good to have that.

11
tommynicholas 4 days ago 1 reply      
If there's one thing that deserves a massive federal budget, it's this. If laws and regulations could be understood easily by software, we would remove so many problems the government creates. Complexity would be easier to manage, and transparency would inherently breed simplicity.
12
qwertyuiop924 4 days ago 2 replies      
Most of the code looks fairly dull and single-purpose. Except NASA's code, as you may expect. Trick and OpenMCT both look at least interesting, and OpenMCT in particular could be used as the basis for many an interesting web-based project.
13
amluto 4 days ago 0 replies      
In case any code.gov website people are looking: there's an unfortunate bug: when the site is first loading on a slow network, you can see all the agency names and click them, but they all show text like "No repositories found." This made me think that this was a brand-new project that hadn't done anything yet.

Also, the little department logo images shouldn't have alt-text. That alt text of the department name overlays the actual text of the department name and just makes it even less accessible when the image isn't there.

(I suppose the lesson for web development in general is to test on a slow connection.)

14
idlewords 4 days ago 0 replies      
Why does this simple landing page have to be over 600K and require Javascript?
15
matthjensen 4 days ago 1 reply      
I wonder how they will determine which open source projects are included at code.gov. For instance, I contribute to a few projects that are used within government for determining the economic impacts of fiscal policy, and the code is in the public domain. I'd love for them to be included at code.gov, but I'm not sure whether they meet the criteria.
16
liveoneggs 4 days ago 0 replies      
is this page 508 compliant? I look to .gov to show some examples of accessibility
17
breakingcups 4 days ago 0 replies      
This is a step in the right direction, a thank you to the people who helped make this possible.
18
skiplist1 3 days ago 0 replies      
Just an honest question: can I get some estimate of the capabilities of the CS people working for the government just by reading all this code? Will I find a glimpse of code beautifully designed and tailored to solve some really deep problem that requires real intelligence? Can I use this code to learn how to write a big project?
19
initram 4 days ago 1 reply      
They already have some Swift projects [0], which I think is kind of cool. Not sure I would want to use code named "Bro" or "Conman" though. lol.

[0] https://code.gov/#/explore-code/agencies/DOL

[1] https://code.gov/#/explore-code/agencies/DOE

20
yellowapple 3 days ago 0 replies      
Apparently Github supports displaying STL models: https://github.com/nasa/NASA-3D-Resources/blob/master/3D%20M...
21
dnprock 4 days ago 0 replies      
I suggest having a search feature across all repositories.
22
thinkcomp 4 days ago 1 reply      
How about open PACER?
23
rayiner 4 days ago 1 reply      
Interesting that someone approved the communist reference (or that people didn't perceive that as a communist reference). It's not the 1980s anymore I guess.
24
mars4rp 4 days ago 1 reply      
Where is the IRS code?
25
_audakel 3 days ago 0 replies      
I was kinda hoping to see some projects from the NSA. Noooope.
26
manish7 4 days ago 0 replies      
This is an awesome initiative by the US Gov.
27
ayh 4 days ago 0 replies      
Great so we all have access to useless software code.
28
steanne 4 days ago 1 reply      
Looks like You Dont Have Javascript Enabled

Code.gov is built on the latest web technology using Javascript. To use Code.gov, please enable Javascript so we can share all of the wonderful open source projects the Federal Government has to offer.

29
masterleep 4 days ago 3 replies      
Horrible name - the Federal Government of the United States is not "the People".
7
Docker in Production: A History of Failure thehftguy.wordpress.com
565 points by user5994461  4 days ago   284 comments top 61
1
user5994461 3 days ago 6 replies      
Author here.

A bit of context missing from the article: we are a small shop with a few hundred servers. At core, we're running a financial system moving around multi-million dollars per day (or billions per year).

It's fair to say that we have higher expectations than average and we take production issues rather (too?) seriously.

I updated and fixed a few points mentioned in the comments. (Notably: CoreOS).

Overall, it's "normal" that you didn't experience all of these issues if you're not using Docker at scale in production and/or if you haven't used it for long.

I'd like to point out that these are issues and workarounds happening over a period of [more than] a year, summarized all together in a 10-minute read. That does amplify the dramatic and painful aspect.

Anyway, the issues from the past are already in the past. The most important section is the Roadmap. That's what you need to know to run Docker (or use auto scaling groups instead).

2
amouat 4 days ago 11 replies      
Well, this person seems to have had a hard few months. I have a lot of problems with the post though:

- Docker does not own CoreOS and did not create the distro

- Docker did not write OverlayFS

- There are issues with storage drivers. Don't use AUFS for the reasons outlined. I've had no issues with Overlay2, but note there is also devicemapper (which Red Hat have poured a lot of time into) and BTRFS. And ZFS now IIRC.

- It's not correct to say Docker was designed "not" to do persistent data. You do need to be aware of what you're doing, and I suggest not running DBs in containers until you have solved the FS issues and have experience with orchestration.

- Kubernetes has been around for a while now and is fairly stable in my view. Swarm isn't rubbish, but the new version is very young and you may run into issues in the short term.

- With microservices, when you are running lots of small containers, you have to expect them to crash now and again. It's just the law of averages. You need to design your application to accept this and deal with it when it happens.

If you hit this many problems with any given tech, I would suggest you should be looking for outside help from someone that has experience in the area.

3
jconley 3 days ago 4 replies      
This is why DevOps became a thing. It's complicated managing systems at scale. If you were using these tools and aren't at scale, well, that sucks.

Unplanned downtime is the main drawback to both hosting your own OSes and using leading-edge tooling for your critical systems. It doesn't matter what the underlying system is; stuff like this happens. This stuff is complicated. You will find major warts in every new system. Everything will break, often, leaving your users stranded. It takes a very long time for software to mature. [0]

That's why you see engineers with 15+ years of experience patiently waiting for new technologies to mature before putting them in place in our critical infrastructure. Sure, we'll play with them, but putting new tech in production that could easily make your entire system unavailable is too risky. Yes, three year old tech is new.

[0] http://www.joelonsoftware.com/articles/fog0000000017.html

4
davexunit 4 days ago 2 replies      
Docker's reliance on overlay filesystems is one of the biggest problems I have with Docker. Stacking opaque disk images on top of each other just isn't a great design, and it makes for a cache strategy that is all-too-often invalidated (because a Dockerfile is linear, there is no dependency graph). The file system is just the wrong layer of abstraction for solving such an issue.

If you change paradigms from the imperative sequence of mutations in a Dockerfile to a declarative specification that produces immutable results and a package dependency graph structure, you get a much better cache, no need for disk image layering, a real GC, etc. For example, the GNU Guix project (purely functional package manager and GNU/Linux distro) has a container implementation in the works named call-with-container. Rather than using overlay file systems, it can just bind mount packages, files, etc. inside the container file system as read-only from the host. Not only is this a much simpler design, but it allows for trivial deduplication of dependencies. Multiple containers running software with overlapping dependency graphs will find that the software is on disk exactly once. Since the results of builds are "pure" and immutable, the software running inside the container cannot damage those shared components. It's nice how some problems can simply disappear when an alternative programming paradigm is used.

https://www.gnu.org/software/guix/news/container-provisionin...

5
tzaman 4 days ago 4 replies      
That's odd; we've been using Docker for about a year in development and half a year in production (on Google Container Engine / Kubernetes) and haven't experienced any of the panics or crashes yet (at least not any we could not attribute to a failure on our end).

Clearly, Docker itself has a few annoyances, like no garbage collection and a poor tagging process, but we're using it on a daily basis, and nobody complains about it (nothing a few bash scripts can't fix).

Finally, the biggest benefits of not just docker but containers in general are consistency and immutability, which brings me to the final point where I agree with the author: Your database shouldn't be containerized. It's not your pet (from pets vs cattle), it's your BFF.

6
zwischenzug 4 days ago 2 replies      
I work for a large enterprise (100K+ employees) using Docker in production [1]

It's certainly had its challenges (different versions of Docker clients causing havoc with different manifest versions, race conditions in the engine etc etc), but this article takes things a little far.

If you take pretty much any new technology like this and make it a base part of your platform, you'll want support from a big vendor. To think otherwise would be naive.

There is definitely a debate to be had about Docker Inc's approach to production/enterprise rollouts though, but as a technology I'd say it's developing pretty much as I'd expect.

I've also seen Docker succeed in production despite the technical challenges. If you don't like the heat...

[1] See my blog: https://medium.com/@zwischenzugs, especially: https://medium.com/zwischenzugs/a-checklist-for-docker-in-th... and https://medium.com/@zwischenzugs/docker-in-the-enterprise-ec...

7
awinder 4 days ago 2 replies      
"The only way to clean space is to run this hack, preferably in cron every day"

https://github.com/spotify/docker-gc
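
The cron hack really is tiny, for the record. A sketch, assuming docker-gc is installed at /usr/sbin/docker-gc and honors the GRACE_PERIOD_SECONDS knob from its README:

  #!/bin/sh
  # /etc/cron.daily/docker-gc: removes containers that exited more than
  # an hour ago, then any image no remaining container references
  GRACE_PERIOD_SECONDS=3600 exec /usr/sbin/docker-gc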

"Many attempts can be found on the internet, none of which works well. There is no API to list images with dates"

http://portainer.io/ seems to be able to do it, docker images lists it... I mean, I don't want to call bullshit without knowing the full story, but...

"So, the docker guys wrote a new filesystem, called overlay"

https://github.com/torvalds/linux/commit/e9be9d5e76e34872f0c... written in 2014 by the Linux kernel team / Linus, but OK, cool story

"It affects ALL systems on the planet configured with the docker repository"

OK, this is why people use AWS' container repo, or they use the open source software and maintain their own repo... this happens with any of the public repo services, and it was 7 hours.

"The registry just grows forever"

S3 is a supported backend; I would highly advise anyone running their own repo to use it (or have a similar expanding-space story). There's also a config flag for the repo to allow removing images. Obviously you wouldn't want to use this if you're hosting a public repo, but internally, go for it; it's off by default, seems sane enough.
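
A sketch of both settings together, using the registry's environment-variable overrides (bucket and region are placeholders; credentials are assumed to come from an instance IAM role):

  # private registry backed by S3, with image deletion switched on
  docker run -d -p 5000:5000 --restart=always --name registry \
    -e REGISTRY_STORAGE=s3 \
    -e REGISTRY_STORAGE_S3_BUCKET=my-registry-bucket \
    -e REGISTRY_STORAGE_S3_REGION=us-east-1 \
    -e REGISTRY_STORAGE_DELETE_ENABLED=true \
    registry:2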

"Erlang applications and containers dont go along"

I'm certain that people are running erlang containers successfully.

"Docker is a dangerous liability that could put millions at risk. It is banned from all core systems."

Daily FUD allowance exhausted.

I guess the tone of this article really bugged me because there's obviously a point to be made from this experience that running Docker is more difficult & error prone than it should be. And maybe it wasn't a great fit for this company culture. But this article crossed the "we didn't have a good experience, therefore, it's BS software" line, and frankly, that attitude may very well have been to blame for the lack of success just as much as docker's shortcomings were...

8
jsz0 3 days ago 0 replies      
> Docker Issue: Can't clean old images

The worst part of this problem is when you do run out of disk space docker's image/container database often gets corrupted resulting in the great cryptic 'could not find container entity id' error. The only fix I'm aware of is rebuilding all your images or mucking around with the docker sql database. The lack of basic features to manage the problem creates a worse and harder to troubleshoot problem.

It's an easy enough problem to avoid in the first place, so it's really not that big of a deal. It hurts docker as a company more than the users in the long run because it makes them look bad. There are a lot of other small things like this with the docker cli tools especially. They aren't deal breakers but they are definitely eyebrow raisers. For example, if you mess up your docker run command it usually just fails in strange ways instead of telling you what's wrong with your command.

I'm actually pretty happy with docker overall but they really need to fix some of these basic usability problems.

9
atemerev 3 days ago 0 replies      
I also work in HFT, and I also attempted to use Docker in production and encountered all sorts of problems (different from those described in this article, though).

However, as it turned out, these problems had nothing to do with Docker. Containerization is a great idea, and Docker's approach is sane (one container per service is fine).

The offender was docker-swarm. Like many others, I chose it as the default container management approach (it is written by the same team, so it should work the best, right? Wrong.) Docker-swarm is indeed buggy and not ready for production, or at least it wasn't 10 months ago. And if you use the "one container per service" approach, orchestrating large groups of containers is a necessity.

Then I discovered Kubernetes, and became significantly happier. It's not without its own set of quirks, but it's an orders-of-magnitude smoother experience compared to docker-swarm. And it works with AWS just fine. (I didn't get to use DCOS, but I heard some nice things about it, too.)

Tl;dr the source of all "Docker is not ready for production" rants seems to be docker-swarm, at least from my experience. Use Kubernetes or DCOS and everything will be so much better.

10
apeace 3 days ago 2 replies      
Here is the smoking gun, showing that this organization shot itself in the foot:

> Docker first came in through a web application. At the time, it was an easy way for the developers to package and deploy it. They tried it and adopted it quickly.

Unless you've got a lot of spare time on your hands, it's never a good idea to adopt unfamiliar tools where they're not needed. Stick to your traditional deployment techniques unless you're equipped, and have the time, to take a roller coaster ride.

Docker is young software.

That said, it seems the author did experience many legitimate breaking changes and abandoned projects. It would be great to hear a perspective from someone on the Docker team about how breaking changes are managed, how regressions are tracked, how OS packaging is handled (e.g. the dependencies the author mentioned wrt filesystems, as well as the bad GPG signature), etc.

11
audleman 3 days ago 2 replies      
I haven't seen anybody speak to one of his points and I'm curious about what HN thinks: if you're already using a cloud provider that provisions instances of the exact size you need, what benefit do containers bring?

I'm excited by containers and all, but this point has always stopped me from going forward. If I were a huge company running bare metal, then yes I'd want to squeeze every last ounce out of my hardware. But if I'm running inside a hypervisor and somebody else is paying the cost, what are the benefits?

As for the article: it comes across as a blend of frustration and real technical challenges. Some good points, though the hostile tone weakens the argument.

I don't think Docker should worry too much about backwards compatibility right now, just power forward until it's got a solid formula. In the meantime, caveat emptor!

12
markbnj 4 days ago 0 replies      
Like the author I've also been running Docker in production since early 2015. Given the 'HFTGuy' part of his email address I'll guess I don't do anything nearly as edgy as he has to do on a daily basis, so it would be tiresome to just say that I haven't had the problems he's experienced. A lot of his issues probably do stem from his organization pushing the limit and wanting to run very specific configurations for performance reasons.

The one thing I do want to respond to is the notion that Docker cannot run stateful apps. Among the stateful apps I have run on Docker in production: elasticsearch, redis, postgresql, mysql, and mongodb. Containers and application state are orthogonal topics. The contained process is reading and writing a file system on persistent media just like all the other processes on the host and that file system doesn't mysteriously disappear when the contained process exits. Naturally stateless apps are simpler. They're simpler to run on VMs or bare metal too. When you need data in a container you do the same thing you do when you need it on a VM or on bare metal: you mount appropriate storage for it.

The main issue with most stateful applications isn't the existence of the state. The main issue is that if you have state then it's probably important, you probably want to make it durable and highly available, you probably want clustering and replication, and now you're into discovery and peering, and that can be an actual issue with Docker, especially for finicky beasts like redis.

13
pbecotte 3 days ago 0 replies      
The problem with this article is it is the kind of thing I could see myself writing after about three weeks of playing with the tool, when I did not yet understand the best ways of using it... but it is written from the POV of someone who does not know what they are talking about.

Half of the article is various complaints about AUFS, and then Overlay. There are at least 6 storage drivers off the top of my head, and I feel like there are one or two more I am forgetting about. What made you choose AUFS? What were your thoughts on Overlay? Instead of going with the default and complaining, maybe research your options? Even worse though, the filesystem driver is completely beside the point. If you want to store things on the filesystem from a Docker container, you don't use the copy-on-write storage system! You create a volume, which completely bypasses the storage driver. There are even a bunch of different volume drivers for different use cases: say, mounting EBS volumes automatically for your container to persist data to. His whole rant about data magically disappearing has nothing to do with the tool; it's that he didn't take the time to learn how to use it correctly.

Further, while I am sure he is telling the truth and has machines randomly suffering kernel panics and crashing multiple times per day, it seems hard to believe that it is docker causing this. He is running 31 containers spread across 31 aws hosts... I am running thousands, across hundreds of aws hosts, and have been for a long time, and cannot recall a single kernel panic caused by docker.

14
pjmlp 4 days ago 1 reply      
> Google invented containers long before it was a thing

I had to laugh at this one, then again I am an old dog.

15
marcosdumay 4 days ago 2 replies      
Well, I lost nearly half the article asking myself why somebody would ever run a single container per VM... They hit a bug that made container-running VMs unreliable, then ran a non-test using what they had available.

I'd be more interested on articles that actually tested it running how it was meant to. Or, if you can't, just stop there and say "there's a showstopper bug, it wouldn't work for us" - those are useful articles too, and require much less effort.

About databases, mostly anything you put between them and the hardware will hurt you. VMs are bad too. What I can't imagine is how you would create and destroy database instances at will - I might live in a different reality, but around here you'll always want as few database nodes as possible, and as a consequence, as powerful and reliable nodes as possible, which means as few moving parts as possible.

1 - What won't hurt is a disk abstraction layer designed for performance and as expressive as the original physical stuff. This is widely used with the "SAN" name, and I'm willing to bet you can mount one inside a container. But well, on AWS you really won't.

16
api 3 days ago 1 reply      
When I first saw Docker I saw something with all the characteristics of a developer fad, namely that it was a magic new thing that in reality was nothing but a barely-novel rearrangement of old things but promised to make a lot of very old very hard problems go away.

Even worse, it was a fad that adds a whole lot of complexity to the system. Anything that adds complexity always gets a jaded glance from me. Unless there's a very clear and good reason for it, increasing complexity is typically a sign that something is wrong. (I use this rule in my own code and designs too... if I'm building a tower of Babel then I'm not seeing something or making a mistake.)

"Let's take preconfigured Linux system images and treat them like giant statically linked binaries."

Don't get me wrong-- it is a somewhat interesting hack and does have some utility. (and it IS a hack) But the problems it solves are in fact a lot harder than they seem and stem from fundamental 30+ year old issues in OS design and the Unix way of doing things. Our OSes date from the days when a "box" was a very long lived stationary thing that was provisioned and maintained manually, and they're full of things that today are bad design decisions because of this... like sprawling filesystems full of mutable data with no clear mutable/immutable division.

But the hype on this thing was just insane. I've never quite seen anything like it except maybe Java in the 1990s and that had more substance (a popular Smalltalk-ish language). For a long time I've joked that devops conference conversation was sounding like:

"Docker docker Docker docker docker docker Docker docker docker?"

"Docker docker docker."

"Docker, docker?"

"Docker docker docker Docker docker."

"Ahh, Docker."

Luckily it seems to be dying down and a lot of people are realizing it's not a silver bullet. This will probably inoculate a generation of devs and admins against over-hyped fads, but never fear... a new generation of devs and admins are coming in now and are eager to rewrite everything again since this time they'll be able to make a few trivial rearrangements of things that are already there and this time they'll get it right.

17
edpichler 4 days ago 0 replies      
It seems I have seen a lot of posts on HN saying that Docker is not ready for production. I have been using Docker in production at my small company for one year and it's working pretty well (maybe because we are still small).
18
FuNe 4 days ago 3 replies      
Being caught in the containerization craze myself, I'd love to hear whether the story is exaggerated or painfully accurate.

So far I've been bitten by the inability to clean up images of a certain age.

UPDATE: Another really annoying thing is the inability to tag an image directly in a registry (AFAIK). You need to pull, tag and push back again. Given that images can be GBs in size, you end up with really heavy net traffic for a simple task.
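
The dance in question, sketched against a hypothetical private registry:

  # retagging forces a full round trip through the local machine even
  # though only the tag (a pointer) changes
  docker pull myorg/app:1.2.3
  docker tag myorg/app:1.2.3 registry.example.com/myorg/app:1.2.3
  docker push registry.example.com/myorg/app:1.2.3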

19
rzimmerman 3 days ago 3 replies      
A lot of the issues in this article seem to stem from running Docker on old kernel versions.

> We are using Debian stable with backports, in production. We started running on Debian Jessie 3.16.7-ckt20-1 (released November 2015). This one suffers from a major critical bug that crashes hosts erratically (every few hours in average).

If you're stuck on an older system that's poorly supported by Docker then it may be a bad choice for you.

20
scresswell 3 days ago 0 replies      
What a very strange article. TES were using docker in production in October 2014. It was certainly possible to run docker containers (docker run -d) in the background and to view them (docker ps) as early as October 2013 when we first evaluated it at the Daily Mail.
21
andmarios 4 days ago 1 reply      
The author's company should hire a proper devops person, and then Docker will, magically from the devs' POV, start working.

AUFS in production? Seriously? Create a BTRFS partition. Docker isn't a shortcut to learning a Linux-based OS; it is a useful tool in the hands of people who know what they are doing.

22
finid 4 days ago 0 replies      
The author made some really good points, but "CoreOS is made by Docker for Docker" is not one of them.

CoreOS is the company behind a competing container tech called rkt (rocket) and CoreOS Linux.

23
ubercore 3 days ago 1 reply      
I have to disagree on the pain of working with Kubernetes. There are definitely gaps in documentation, no doubt about that.

But it took me under a day to containerize an application I'm working on, and get it up and running flawlessly on Google's container engine. And this is coming from having zero Docker experience.

24
dominotw 4 days ago 0 replies      
>Making a new filesystem in 1 year is still an impossible mission. Docker just tried and failed. Yet they're trying again! We'll see how it turns out in a few years.

Is this just fear mongering, or are there any actual issues with overlay2? We have been trying out overlay2 on our less critical systems and haven't experienced any issues so far, but we haven't stressed it enough to know for sure.

25
0xmohit 3 days ago 2 replies      

 That was a 7 hours interplanetary outage because of Docker. All that's left from the outage is a few messages on a GitHub issue. There was no postmortem. It had little (none?) tech news or press coverage, in spite of the catastrophic failure.
This is incorrect. There were github issues, HN posts, and so on. Solomon Hykes posted on Github [0] and HN [1]:

 Hi everyone. I work at Docker. First, my apologies for the outage. I consider our package infrastructure as critical infrastructure, both for the free and commercial versions of Docker. It's true that we offer better support for the commercial version (it's one of its features), but that should not apply to fundamental things like being able to download your packages. The team is working on the issue and will continue to give updates here. We are taking this seriously. Some of you pointed out that the response time and use of communication channels seem inadequate, for example the @dockerstatus bot has not mentioned the issue when it was detected. I share the opinion but I don't know the full story yet; the post-mortem will tell us for sure what went wrong. At the moment the team is focusing on fixing the issue and I don't want to distract them from that. Once the post-mortem identifies what went wrong, we will take appropriate corrective action. I suspect part of it will be better coordination between core engineers and infrastructure engineers (2 distinct groups within Docker). Thanks and sorry again for the inconvenience.
[0] https://github.com/docker/docker/issues/23203#issuecomment-2...

[1] https://news.ycombinator.com/item?id=11823231

26
vogonogov 4 days ago 0 replies      
We've been using docker in production for two years now. Started with docker 1.4 and Rancher v0.15. Docker has had its problems with backwards compatibility, but we haven't had the mentioned filesystem problems.

Our containers are completely stateless. We use AWS's RDS and S3 to store state.

27
thaeli 4 days ago 1 reply      
Since OP is running on AWS anyway, why not just use Amazon ECS? The orchestration (including Elastic Beanstalk) is pretty good, and we haven't had any stability issues in production with it. Same thing for registries; just use AWS ECR instead.
28
julienchastang 4 days ago 0 replies      
I also have been steeped in Dockerland for the last couple of years. Here are my own Docker pros/cons. Caveat: I am not nearly the power user that thehftguy is, nor do I understand Linux kernel issues in any real depth. I'm a lowly high-level language programmer (elisp/Python/Clojure/Java/shell).

Pros:

When Docker works, it's great. The concept of being able to install software on the user's behalf in the container and have the user install the container really does make life easier. The unwieldy command lines are nicely abstracted with docker-compose(.yml). Software upgrades via containers become a nearly trivial process.

I work on scientific software that can be a huge pain to install and configuring the packages correctly often times requires the help of the software creator. The fact that you can codify this dark knowledge in a Dockerfile is of tremendous benefit. And generally speaking, I like the concept of the Dockerfile. Whatever container technology ends up winning, I hope there will always be something like the Dockerfile.

Cons:

- Docker version upgrades never go smoothly. Firing up docker after apt-get update && apt-get upgrade never works with respect to Docker. You get some obscure error and after spending 30 minutes in Google you end up having to rm -rf /var/lib/docker/aufs. My solution is sometimes to throwaway the VM and start over, but this is unacceptable.

- File ownership issues inside versus outside the container are (or at least used to be) a huge pain in the ass! I am referring to files mounted from the Docker host file system into the container. (And I am also referring to running docker as an unprivileged user that is part of the docker group.) The solution I settled on is to enter the container as root, run chmod/chown on the appropriate directories and drop down to a regular user with gosu. At that point, you proceed on your merry way by starting your service, process, etc. This workflow solves permission problems inside the container, but outside the container is a different story. At least on earlier versions of Docker, you could lose permissions of files created via the container because at that point they are owned by root as seen from outside the container! More recent versions of Docker seem to solve this problem, as I have gleaned empirically and from experimentation, but I have yet to see adequate documentation about any of this and I have spent a long time searching! With more recent versions of docker I have observed that the files in question are no longer owned by root (again as viewed outside the container), but by the user that started the container, the behavior I would expect. I'm also not crazy about the gosu solution, which seems like a bolted-on rather than baked-in solution.
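
For what it's worth, the chown-then-drop-privileges dance fits in a few lines. A sketch, assuming the image contains a gosu binary and an unprivileged app user, and that /data is the host-mounted directory:

  #!/bin/sh
  # entrypoint.sh: starts as root, fixes ownership of the mounted
  # volume, then re-execs the real command as the unprivileged user
  set -e
  chown -R app:app /data
  exec gosu app "$@"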

29
kordless 4 days ago 0 replies      
> My first encounter with docker goes back to early 2015. Docker was experimented with to find out whether it could benefit us. At the time it wasn't possible to run a container [in the background] and there wasn't any command to see what was running, debug or ssh into the container.

Having worked all of 2015 in the container space, I can state that this is false. While it may not have been possible for this person, the technology to see what processes are running in a container has existed for quite some time, debugging became possible with sysdig, and ssh'ing into a container has been a thing for YEARS.

30
edsouza 3 days ago 0 replies      
Excuse my ignorance on docker, but why did the OP not use a proxy cache for the registry? [0]

If they were running this on a single machine with backups they could restore to a time before the incident.

Also, querying and downloading from the public registry every time seems like a waste of bandwidth.

[0] https://blog.docker.com/2015/10/registry-proxy-cache-docker-...
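
A sketch of that setup, with placeholder host names:

  # run a local registry as a pull-through cache of Docker Hub, so a
  # Hub outage doesn't stop every host from pulling cached images
  docker run -d -p 5000:5000 --restart=always --name mirror \
    -e REGISTRY_PROXY_REMOTEURL=https://registry-1.docker.io \
    registry:2

  # then point each daemon at it, e.g. on Debian/Ubuntu:
  # DOCKER_OPTS="--registry-mirror=http://mirror-host:5000"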

31
carrja99 3 days ago 1 reply      
> My first encounter with docker goes back to early 2015. Docker was experimented with to find out whether it could benefit us. At the time it wasn't possible to run a container [in the background] and there wasn't any command to see what was running, debug or ssh into the container. The experiment was quick, Docker was useless and closer to an alpha prototype than a release.

Anyone who has used docker at all knows the above statement is 200% rubbish. FFS

32
pjwal 4 days ago 2 replies      
We're successfully running Dokku instances for various dev/qa environments, but I have not attempted containers in production as of yet.

While Kubernetes seems to be the orchestration choice du jour, I had been eyeing http://deis.io/ (which I believe to be at least as mature). Can anyone comment on their experience or thoughts on Deis vs Kubernetes? TIA.

33
iamleppert 3 days ago 0 replies      
Over the course of the years I've tried to like Docker. I've found the benefits to be not worth the stress and aggravation for the associated "magic" that it takes to provide the kind of user (developer?) experience they're after but have thus far failed to achieve. It's like the uncanny valley of sysadmin tools.

The concept of getting away from snowflake machines and making your infrastructure more code-like with deployment scripts is the way to go, but you certainly don't have to use docker for that. At the end of the day, it's more about forcing your team to be regimented in their processes and having a working CI system for your systems related stuff. You should be able to start from nothing and with a single command have a developer's environment up, all the way to a fully scaled prod system.

Docker is just a tool to try and get to that point of evolution (albeit a poor one).

34
ungzd 4 days ago 0 replies      
> Making a new filesystem in 1 year is an impossible mission

If this is "real" filesystem for disks, not proxy for other filesystems, mostly read-only.

35
Ideabile 3 days ago 0 replies      
After reading your post I think that you misunderstood the purpose of Docker.

Even if people try to sell it as a flexible silver bullet, made up for everything, like any other tool around it has to obey some sort of CAP theorem, because underneath we're just talking about how to move bits.

For instance, I use containers like AWS Lambda functions. And I use tar streams to pass the contents inside them (so no virtualised mounted filesystem is needed).

What I mean is that you can find different ways to use them. In short: the law of the instrument.

If you're wise (and according to your article you've got a ton of experience), you can find a good purpose for it.

Docker is just 'meant' to make your life 'easier'; like any other tool, it doesn't always satisfy the expectations.

On the other hand, you've gained experience with it; I'm sure you'll come back to containers for a purpose where they can perform better for you. There is nothing to worry about, this is just experience.

Thanks for sharing.

36
bergwolf 3 days ago 0 replies      
Interesting! A relevant topic ("Hyper is Docker Done the Right Way" by TheNewStack) came up on the HN front page as well.

Is it how Docker should have been designed to embrace production in the first place?

https://news.ycombinator.com/item?id=12873089

37
crad 3 days ago 0 replies      
I've been using Docker in production for over a year at this point. I can't say that it's been problem free, but nothing so dramatic as this post outlines. For example, I run Erlang apps in it, with heavy production use.

Heck, it is even possible to remove images from the docker repository without corrupting it, just not from a single API endpoint. FWIW I think it is a big failing of the v2 repository project to not have had that functionality from the get-go.

It's also worth noting that I've only run Docker in production using CoreOS and Amazon's ECS AMI. Both have their drawbacks, but nothing so dramatic as to keep me from recommending Docker in production for "cattle" style applications.

38
minitech 3 days ago 1 reply      
The section on not running databases with Docker isn't really fair. Of course you wouldn't use something that crashed all the time to run your database server, but Docker's not designed to crash. It has volumes and you can do regular crash recovery with it. Crashing hard and taking the host's data with it shouldn't be a likely scenario, and should be prepared for about the same as disk failure: have redundancy and make backups.
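
i.e., the data lives on the host, not in the container's writable layer. A sketch (paths and image tag are just examples):

  # the database files land on the host at /srv/pgdata and survive the
  # container being stopped, removed, or replaced
  docker run -d --name db \
    -v /srv/pgdata:/var/lib/postgresql/data \
    postgres:9.6
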
39
Qerub 3 days ago 0 replies      
> The impossible challenge with Docker is to come up with a working combination of kernel + distribution + docker version + filesystem. Right now. We don't know of ANY combination that is stable (Maybe there isn't any?).

You can do worse than use Red Hat Enterprise Linux 7 (RHEL7) that has a tested and supported blend of those components. Or CentOS if you just want to try it out and don't care about support.

40
ererekererkerk 3 days ago 0 replies      
Docker is unstable trash and does not belong anywhere near a production environment.

Unsurprisingly the linked article is fraught with glaring errors and misinformation. This has become typical of most things I read that is docker-related; it's a circus.

It saddens me to see the Linux community get dragged into and overwhelmed by this mess.

41
barnacs 3 days ago 0 replies      
It's funny how the article mentions AWS lock-in but fails to recognize the infrastructure lock-in aspect of relying on a whole container/orchestration/virtualization stack that is so fragile that (according to the article) the only way to reasonably use it in production is to outsource its maintenance.

Once you go down this path, it seems there is a very limited number of providers capable of providing the infrastructure you require (at least compared to vps/dedicated/datacenter providers), and they can just keep pushing you into using more and more fragile infrastructure components that you would never want to maintain yourself.

Eventually, you may very well find yourself paying so much for all the hardware/virtualization/os/container/orchestration maintenance and resource overhead, with a number of backup instances of everything due to the unreliability of all these components, that you wish you could just go back to placing a few pieces of metal in a few datacenters and call it a day.

42
nhumrich 3 days ago 0 replies      
I feel like almost all issues in this article stem from not following ops best practices. And similar issues would happen no matter what technology was being used, they just happen to blame their issues on docker because that's the tool they were using at the time.
43
okket 4 days ago 2 replies      
Clean docker images:

 docker rmi $(docker images -f dangling=true -q --no-trunc)
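
And the companion command for dead containers, which otherwise keep dangling images alive (anything still in use just errors out harmlessly):

  docker rm $(docker ps -a -f status=exited -q)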

44
rogersach 3 days ago 0 replies      
Although Docker may be complicated, it's the neatest and fastest solution so far in terms of spinning up production, and it has near-ubiquitous support among major companies.
45
corv 3 days ago 0 replies      
Meanwhile BSD jails & Illumos zones are humming along nicely on ZFS.
46
ex3ndr 4 days ago 0 replies      
Does anyone know if rkt is much better in terms of backward compatibility?
47
z3t4 3 days ago 0 replies      
I think Docker is meant for apps in a shared environment (with other apps). If you run dedicated app servers Docker is an unnecessary abstraction.
48
k2xl 4 days ago 1 reply      
I've been running docker in production for about 3 years.

Almost every time I've hit a problem, it has had nothing to do with docker (though everyone likes to blame it). 99% of the time the issues I've encountered are due to docker configuration (and yes the learning curve is steep) or the app itself crashing.

The one time I've encountered an issue with docker with databases (used elasticsearch) was a kernel issue (where the host kernel was out of date). That was the only time I had to care about the host OS in debugging a docker issue.

One thing that I recommend for anyone getting into docker - use docker-compose for everything. Don't use the docker CLI.

One of the major problems I have with docker is that (especially for beginniners) the docker command line tool can cause unintended consequences.

Every time you run docker run <image> you are technically creating a new container from an image. Once that finishes, the container still exists on disk. Using the --rm flag will cause the container to actually be removed after running it. I sort of wish docker would emphasize that flag, because often when developers are debugging or trying things, they will continue to run docker run <image> multiple times - not realizing that they are creating a ton of containers.
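
i.e., for throwaway debugging runs:

  # --rm deletes the container (not the image) as soon as the shell exits
  docker run --rm -it ubuntu:16.04 /bin/bash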

Yes, the containers are usually very light, but it caused a lot of confusion for me when I started out. And part of my superstitious self believes that the fewer dead containers you have, the less likely you are to have filesystem issues.

A second source of confusion for new devs I've noticed is that the concept of volumes are somewhat confusing. "Why would I ever do COPY in my docker file if I can just volume it?"

Once you've worked with Docker, like any tool, you become more adept at avoiding a lot of the mistakes mentioned in this article. Which leads me to a point I can't disagree with the OP about - upgrading docker is annoying as hell.

Almost every time I upgrade docker I end up having some obscure error. When I look on docker's github I notice I'm not the only one, and often the issues are just a few hours old.

But, what the OP didn't seem to realize, is that to avoid these issues you need to lock your docker version in production to one that you know works. Additionally, you need to not build your dockerfile in production. Build your docker containers with a build system and upload them as artifacts. You can either use an internal docker hub or do docker save to save them as a .tar.
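
The save/load flow sketched out, with a hypothetical $GIT_SHA tag:

  # in CI: build once, ship the image itself as the artifact
  docker build -t myapp:"$GIT_SHA" .
  docker save myapp:"$GIT_SHA" | gzip > "myapp-$GIT_SHA.tar.gz"

  # on the production host: load the prebuilt image, never build there
  gunzip -c "myapp-$GIT_SHA.tar.gz" | docker load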

Building while in production is a real no-no, even though it may seem more attractive than moving around 300-500MB images. You never know if that one dependency you apt-get in your dockerfile is offline for some reason. And you can't always depend on devs of dockerfiles properly versioning them anyway.

49
leandot 3 days ago 0 replies      
Is it just me or this article mostly shows serious incompetence from the author/company?
50
siegecraft 4 days ago 0 replies      
Ha, docker wishes that they had the staff to create all the things this article claims they create.
51
mamcx 4 days ago 1 reply      
If not Docker, then what?
52
SadWebDeveloper 3 days ago 0 replies      
Docker has been the go-to solution all my python-for-web-development friends have been using as a flagship to "kill php et al", and sharing this with every single one of them is making me smile again. Thank you OP.
53
discordianfish 3 days ago 1 reply      
That such a bad article ends up here shows the biggest problem with Docker: It's irrational hype. Whether it leads you to hate it or want to dockerize all the things, it's unhealthy.
54
X86BSD 4 days ago 0 replies      
I recommend he try a better solution such as tredly.com, or perhaps SmartOS. Some things actually work OP, don't give up hope because of docker.
55
Unbeliever69 4 days ago 0 replies      
Let's not forget to LOL about those 2 animated GIFs. Just about ejected my morning coffee through my nostril.
56
m3kw9 4 days ago 2 replies      
"Docker SHALL NOT run databases in production, by design."

You need to use external volumes

57
frenck 4 days ago 0 replies      
I'm not getting in to this discussion... nevertheless, this guy just made my day!

This was a fun read :)

58
Daviey 4 days ago 0 replies      
Automatically updating core infra from public repositories without CI'ing.....

No... losing a day of development is on their infra team.. not docker.

59
ben_jones 3 days ago 0 replies      
Can we please, please, please get a follow-up titled: "Bash in Production: A History of Failure"? With a comedic yet tragic theme of course.
60
moondev 3 days ago 1 reply      
Docker in Production: How we did it wrong

This is a typical case of blaming the tool. There are so many bad practices mentioned I don't know where to start. No orchestration... One container per host... Not using volumes correctly... Apt-get installing docker every time CI runs... Not using an optimized base OS for containers... It's amazing what people will blame their problems on when they don't do their due diligence with a new platform.

61
billyjobob 4 days ago 4 replies      
Is 'An History' in the title some sort of meme joke like 'An Hero' [1] or does the author not know basic grammar?

[1] http://knowyourmeme.com/memes/an-hero

8
Homebrew 1.1.0 brew.sh
581 points by sashk  1 day ago   201 comments top 36
1
AdamN 23 hours ago 2 replies      
I'm going to ditto the positive comments. Kudos to the maintainers - not just for the package itself but for growing it. They successfully went from a hacky page with some scripts, to a big giant repository, to multiple repos, to a fully fledged mature best-practices package manager for macOS with all the bells and whistles.

The sign of a great project isn't just that it does one thing well, but that it grows with the needs of the users and engages with them to constantly make something better without falling prey to bloat. And by that measure, Homebrew gets a gold star.

2
brunosutic 1 day ago 3 replies      
Homebrew maintainers are doing a fantastic job, made my life as a developer so much easier! I wanna use this opportunity to "publicly" thank them!
3
lisper 22 hours ago 7 replies      
I have very mixed feelings about homebrew. When it works, it is awesome, but when it doesn't it can be a serious nightmare. In particular, homebrew refuses to run under sudo. Instead, to install in /usr/local it wants me to recursively chown /usr/local to myself, which is a Really Bad Idea (tm). I have mostly stopped using it for this reason.

I understand that running a program that runs third-party scripts under sudo is a security nightmare, but there has to be a better solution than globally chowning /usr/local.

[UPDATE] Apparently, this is no longer the case. Good to know!

[UPDATE2] Alas, the problem seems to not have been fully addressed. You still need to chown /usr/local/bin :-(
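
For reference, the recursive chown I'm objecting to, roughly as the installer used to suggest it:

  # hands the whole prefix to one user; convenient, but any process
  # running as that user can then replace anything under /usr/local
  sudo chown -R $(whoami) /usr/local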

4
dancek 1 day ago 2 replies      
There are lots of minor changes, but nothing major IMHO. The reason for bumping the minor version is a couple of breaking changes:

 - Disable SHA-1 checksum support in formulae
 - Disable running Homebrew as the root user (e.g. sudo brew)
 - Bottles with _or_later tags no longer use _or_later in their filenames so the existing bottle can be reused

5
antirez 17 hours ago 0 replies      
Without Homebrew, macOS would be a lot less of a fit as a development machine for me. Thanks to the authors.
6
vr46 1 day ago 4 replies      
Always good news to see progress in Brewland. I personally do not like having it installed in /usr/local and have ensured it never is on my installation, but I've never had the chance to discuss with one of their team why they do it and what things may possibly explode by not having Brew installed to /usr/local.
7
ianoshorty 7 hours ago 0 replies      
Recently had to reinstall my mac from scratch. I was able to install 95% of my daily software through brew, by far a new record. Big thanks to the developers / maintainers of this project.
8
AdamN 23 hours ago 13 replies      
Just a quick poll. Is anybody still using MacPorts or anything else? If so, why?
9
kriro 9 hours ago 0 replies      
I have mixed feelings about Homebrew. Don't get me wrong, it's great; good job, everyone involved. It makes my OSX laptop a viable development option for me. Without Homebrew I wouldn't use it.

Alas that's also the problem. It makes OSX so familiar that this one OSX machine has now crept into my collection of Linux boxes and won't leave (thank you iOS development). I guess if your only complaint is that the software is too good...that's a pretty great thing :)

Sometimes I wonder how valuable Apple thinks Homebrew is for them. Any info on how much they contribute to the project?

10
erichurkman 12 hours ago 0 replies      
To echo everyone else: Apple should be paying you folks a boat load of money. If homebrew didn't exist, I don't think we'd bother with macOS as a developer environment.
11
raimue 22 hours ago 1 reply      
Disabling sha1 does not seem like a good decision. There are still enough upstream software projects that only publish their checksums as sha1 or even only md5. Keeping the original checksum allows a quick check that a Formula is meant to use the same tarballs that were published by upstream. The problems of algorithms such as sha1 or md5 can be remedied by an additional strong checksum in the Formula. Checksum calculation is so fast that doing it twice will not matter.
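
e.g. on macOS, checking a downloaded tarball against an upstream sha1 and a formula-recorded sha256 back to back (the file name is an example):

  shasum -a 1 foo-1.0.tar.gz     # compare with upstream's published sha1
  shasum -a 256 foo-1.0.tar.gz   # compare with the formula's sha256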
12
matt4077 18 hours ago 1 reply      
Of all package managers I use, homebrew is without a doubt the most satisfying. It tracks updates closely and has always had any package I was looking for, yet I don't remember the last time I ran into an error.

(I'm sure all three happen, but if I can't find them with close to 200 packages installed, they must be doing something right)

13
Raphmedia 23 hours ago 5 replies      
I have been very happy with Homebrew so far. What are some good Windows alternatives? Making the move from OSX to Windows in a week.
14
lanebrain 5 hours ago 0 replies      
Homebrew is one of the great things I love about doing dev on a Mac. /micdrop
15
truebosko 22 hours ago 0 replies      
I absolutely adore Homebrew and can't thank the package maintainers enough. I think of all the times I battled with compiling some package, or installing it via MacPorts only to struggle updating it.

Never with Homebrew. So much <3

16
no_wizard 15 hours ago 0 replies      
I love homebrew! I have it on all of my macOS machines. Best, easiest way to install common software I use between computers (where supported, anyway) and developer packages. Now if we could just get http://linuxbrew.sh/ up to the same caliber... :)
17
krisgenre 11 hours ago 0 replies      
As a developer, if it weren't for Homebrew, I'd ditch OS X.
18
aymenim 4 hours ago 0 replies      
First order of action when setting up a new Mac: install Chrome and Homebrew :). Fantastic tool, keep up the great work.
19
akashaggarwal7 7 hours ago 0 replies      
It is so easy to make your code public for everyone to use via Homebrew. Great job, maintainers!
20
eltoozero 17 hours ago 0 replies      
Homebrew: The macOS package manager Apple should have included from the beginning.

Thanks for stepping up.

21
eathrow 3 hours ago 0 replies      
The most fitting name for a piece of software I've ever seen!
22
serge2k 19 hours ago 1 reply      
I don't know if I can trust homebrew, I hear their implementation of reversing binary trees is weak.

On the other hand it's been pretty darn good for me.

23
accraze 22 hours ago 0 replies      
Homebrew is what keeps me working on a Mac, plus now that Docker for Mac is out....
24
xorange 18 hours ago 0 replies      
A Good Software. Thanks for the hard work, Homebrew peoples!
25
AtticusTheGreat 21 hours ago 0 replies      
There is definitely a lot to like about Homebrew (it's better than macports), but like others have said, when stuff goes wrong you can end up spending an entire day googling, only to end up going down a rabbit hole of archaic and often obsolete commands that may or may not work.

Once you get your environment set up it's fine, but try updating to the most recent Mac OS version with a ruby environment and you'll find pretty much all your gemsets broken in one way or another (if you have anything that needs native building).

I might just be bitter, though, as I spent most of a day working through this exact situation just last week.

26
harrygeez 13 hours ago 0 replies      
Anyone here used port? What made you choose brew over port? I'm on the fence here.
27
heavy_machinery 10 hours ago 0 replies      
Homebrew is awesome. Comes close to apt.
28
rocky1138 22 hours ago 2 replies      
I know they're different platforms, but how does this compare to apt? Which is better?
29
moondev 16 hours ago 0 replies      
Docker for Mac has replaced homebrew for me. Run anything under the sun in a temporary container and mount the current directory. Works great and is very nice for isolating everything.
30
dudul 23 hours ago 5 replies      
Probably stupid question but: how do you update your version?
31
omegote 17 hours ago 0 replies      
It's funny how every hipster dev uses a MacBook for development, yet you have to resort to this kind of hack from day one if you want to be productive. Anyway.
32
atarian 18 hours ago 1 reply      
How do I donate?
33
pikachu_is_cool 17 hours ago 0 replies      
Switch from Ruby to LuaJIT! I'd wager this is the best package manager out there if not for the slowness of Ruby's VM.
34
floatrock 22 hours ago 1 reply      
35
1 day ago   2 replies      
36
anon789na 22 hours ago 0 replies      
It's upsetting that Apple doesn't care about developers. Why is there no Apple maintained package management system? This results in having to turn to solutions like Homebrew, which -- while a valiant effort -- is completely insufficient for distributing state across multiple machines.
9
Show HN: Make an app by adding JSON to this app github.com
643 points by gliechtenstein  3 days ago   176 comments top 59
1
swframe 2 days ago 1 reply      
I've built similar platforms. Naturally, I think they are awesome. They are a core part of very valuable products like SAP, Salesforce, etc. You will get a lot of pushback from developers who prefer direct access to the native API or HTML. Please don't let them discourage you. My response to them is that they can build reusable components that are driven by your JSON layer (win-win). There are enough developers who will benefit. Try to get funding sooner rather than later. I waited too long. You can't keep building this by yourself.
2
gliechtenstein 3 days ago 15 replies      
Guys, the author here. This is my first open source project and I worked really hard on it for 5 months.

I think it's really cool, and I hope you guys like it. I would appreciate any kind of feedback!

3
mpweiher 3 days ago 6 replies      
Instead of JSON, you could also consider sending XML markup. Oh wait, or maybe reuse an existing SGML/XML standard called...HTML.

And then you could add scripting...

4
ajhit406 2 days ago 1 reply      
I found it interesting that the author has actually submitted the same idea 4 times to varying degrees of interest:

https://news.ycombinator.com/submitted?id=gliechtenstein

It's easy to forget that good ideas take a lot of time and iterations until they become magical. Great things do not happen overnight and without much sacrifice. The majority of people would give up the first time the community dismissed their prototype.

Big props for not giving up.

5
vvanders 2 days ago 1 reply      
Very cool.

For what it's worth, most medium-to-high-profile games are built on similar techniques, albeit at a larger scale. You're largely building tools and reusable components that let designers, artists and animators define behavior, look & feel.

It goes by a couple different names but the most common one I see is "Content Driven Development". As you've noticed it has all sorts of benefits like fast iteration time, updates on the fly and ability to allow non-coders to bring in their perspectives to build experiences.

Don't let the ivory tower haters get you down, it's a very powerful and valid approach in the appropriate domains.

6
hex13 3 days ago 1 reply      
Been there, done that. When I coded games, I made a simple JSON-based file format and almost the whole game was in the form of JSON. I even implemented conditionals and loops in JSON.

But it can easily lead to the inner-platform effect. Unneeded complexity. And in JSON you can't write JS code (needed for e.g. event handlers).

So I switched to JS. Still a data-driven approach (simple hierarchical objects with data), but with the power of JS (and if you use Babel, you could use JSX for your DSL).

So: I don't think JSON is the best format for what you're trying to achieve. It's just so limited.

Besides, what is the added value of Jasonette? When I look at Jasonette I have the feeling that it just reinvents ReactJS, but in JSON rather than JSX. I'm not sure there is much profit in the JSON format alone.

7
mentos 3 days ago 2 replies      
Really great stuff!

What is the most advanced application you have created with this? What functionality do you think is still lacking?

If you were to try to create a full featured app I imagine you'd find that working in Swift is the better option.

What this reveals to me is that the App Store submission and update process is so time consuming that you would rather write your logic in JSON than in native Swift/ObjC.

If Xcode let you instantaneously push app binary updates would this be as useful?

8
felipeccastro 3 days ago 2 replies      
This is very interesting, although JSON is not a very comfortable format to build manually - I think writing in YAML or CSON would be much cleaner to read/write, and easily converted to JSON. Also, any plans on building a UI for generating this JSON? If you have predefined components, you could make some sort of designer that builds the JSON for the user, that could turn into a commercial project I think.
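
One low-tech way to get the YAML authoring today is to write YAML and convert it, e.g. with the Ruby that ships on macOS (file names are hypothetical):

  # app.yaml -> app.json for the app to fetch
  ruby -ryaml -rjson -e 'puts JSON.pretty_generate(YAML.load(ARGF.read))' app.yaml > app.json
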
9
oliv__ 3 days ago 0 replies      
Wow this is really impressive.

If you had enough JSON-to-native "modules", basically anyone could write a native app in a few hours (since functionality in most apps is pretty much standard stuff)!

Hell, if you pushed this further you could create an app store for code, letting you buy additional modules to plug into your app like lego.

10
lewtds 1 day ago 0 replies      
Congratulations! I think you have "discovered" QML, my favorite environment for rapid GUI development; it's incredibly intuitive. You may want to look into that for inspiration.
11
libeclipse 3 days ago 2 replies      
Wow that is actually a brilliant idea. Would you ever consider making an android version?
12
macca321 1 day ago 0 replies      
So what this is, is an iOS hypermedia (AKA true REST) client, for consuming APIs serving a custom (as opposed to generic) hypermedia format.

The format itself is designed for a particular problem domain, namely outputting mobile-application specific controls and behaviours.

It would be an interesting exercise to extract those aspects into a custom vocab for JSON-LD + Hydra (or Siren, or (insert) other generic hypermedia formats).

http://www.markus-lanthaler.com/hydra/

It would also be interesting to have the hypermedia format defined as a schema somewhere.

13
walke 3 days ago 2 replies      
Very neat and well put together! Thank you for taking the time to prepare nice documentation to go along with it! Congrats on your release!
14
sintaxi 2 days ago 0 replies      
This is really cool.

Why not also support the web? With an Immutable store you could easily use react to display this app on the web without needing native install.

Great job. Look forward to seeing where this goes.

15
frostymarvelous 2 days ago 0 replies      
Is anyone working on an Android port? I'd definitely want to give it a go.

Haven't checked yet, but is there a spec for the json?

16
yellowapple 2 days ago 0 replies      
What would be the feasibility of an Android port? I like the concept, but hate that it's for a platform that I refuse to touch with a ten-foot pole.
17
Xeoncross 2 days ago 0 replies      
Students (especially younger ones) often need more help seeing the light at the end of the learning tunnel. Something like this is a great start to teaching development and spurring them onto an actual language.

I usually don't start with CSS, let alone SCSS, for websites; basic HTML shows them instant results for what they typed and gives them more endurance for the longer learning curve (but more power) of CSS.

18
hyperpallium 2 days ago 0 replies      
OK, this obviously looks like a web browser (native...) using json instead of (xml-like) html, though it's not good as a markup language.

What's interesting is having declarative templates on the client side, instead of server-side, for accessing JSON APIs. Of course you can do this in browsers (and it is done), but this is cleaner and simpler, being a fresh reinvention.

19
niklaslogren 2 days ago 0 replies      
This seems similar to what is used internally at AppSpotr[1] (disclaimer: I'm a former employee). They also use JSON to represent apps that can be rendered in iOS/Android clients. The difference is that apps must be built using a WYSIWYG editor rather than by editing the JSON directly.

From my experience, a JSON representation works very well for a big percentage of all the apps out there, and you can often do without custom code. Good job creating this open-source version, it looks very expressive and useful!

[1] https://www.appspotr.com/

20
OOPMan 2 days ago 0 replies      
Nice to see you getting so much positive feedback; here is some negative feedback to balance it out:

* How is this different from any of the other declarative app building systems that have, as yet, failed to take the world at large by storm?

* Why JSON? I get it's universal but it's also annoying to write. Why didn't you consider YAML or , even better, HOCON?

* Do you really think loop syntax in a declarative language is a good idea, because to me it always comes across as a nasty anti-pattern to get around the fact that declarative languages don't tend to handle this well?

Otherwise, nice effort. Better than anything I've ever shown to the world :-)

21
reubano 1 day ago 0 replies      
This is very cool! Do you have a spec to share? The first thing I thought of is that it would be awesome to write webapp and static-site-generator implementations of the jason spec.

I'm in the process of making my own static site, and I'm storing as much of the configuration and content as possible in JSON.

22
tomaskafka 2 days ago 0 replies      
One use case: you have a game or app, and want to publish rich promo screens for new features or in-game events, even to old versions of clients. We once built a tiny in-house framework for this, based on Lua (the game downloaded a package of Lua scripts and image resources and displayed the promo). If there had been something ready to use (and multiplatform - at least Android and iOS), we could have saved quite some development time.
23
emmanvazz 2 days ago 0 replies      
This is awesome! I actually had this same idea and built a platform for non-technical people to build native apps. We picked Xamarin.Forms as our mobile framework because that way we could target most of the mobile platforms and share as much code as possible between those projects. We also picked Drupal as a CMS to give non-techies an easy framework to build things out. I am going to keep an eye on this project for some different ideas, and maybe one day we can pick your brain with our idea/platform.
24
lpasselin 2 days ago 1 reply      
Small detail I enjoyed on your website (on mobile, might be different for desktop): header text positioned to the right of the screen (or content div).

Right-positioned text is something we don't see often. This shows it can be a great way to make a page easier to read.

25
mark_wong 22 hours ago 1 reply      
I tried building the Instagram clone but get:

Failed to create provisioning profile. The app ID "home.master.Instagram-UI-example.Jasonette.com.githubusercontent.raw" cannot be registered to your development team. Change your bundle identifier to a unique string to try again.

Anyone know what's up? I ran the setup, inputted 2, and supplied the URL from GitHub.

26
zupreme 2 days ago 0 replies      
I'm at a restaurant reading this, so I haven't had a chance to test it out yet, but having reviewed the website I already see a massive use for this: programmatic app building. This is an awesome idea. I can't wait to play with it.
27
robk 2 days ago 0 replies      
This plus Huginn will be awesome for dynamic apps
28
thecrazyone 2 days ago 1 reply      
Android and web dev here: how would you compare this to NativeScript and such? Instead of learning a new DSL, isn't web-to-native simpler? Unless you're developing a game, I've never found native to be necessary. I'm one of those devs who moved from native to hybrid for faster development speed.

[edit] I really like how you're participating in the comments and how you reply. I really hope you do well, you seem like a nice person :-)

29
tzm 2 days ago 1 reply      
Looks like a new name for jasonclient.org. Previously discussed 6 months ago: https://news.ycombinator.com/item?id=11736817
30
yla92 2 days ago 0 replies      
This is a really great one. Shameless plug: we've built a similar (but smaller) concept for the filter form of our apps[0] (cross-platform, natively for web, Android and iOS), mainly because we were tired of having to change the search filter form fields and logic in every app update. Now we just have to change the configuration and the filter form changes instantly.

[0]: web version can be found at - https://www.99.co/singapore/rent

31
amelius 3 days ago 1 reply      
Is this HTML reinvented? :)
32
anupshinde 2 days ago 1 reply      
This looks very cool, and tempting at first glance. As I read more of the documentation, I can't help comparing it with React Native for the rendering part. I remember somebody mentioning server-rendered mobile apps in one of the early React Native apps. This is pretty much achieving that.

However, the comparison ends there. This is great! It's just so simple to make an app, and when the app grows one could probably create a transpiler to generate this JSON.

33
desireco42 2 days ago 0 replies      
I know, it is so simple that people's heads just exploded.

Well, there seem to be a great number of uses for this. I think this is an excellent idea. Off to make an app... or two.

34
tomcam 2 days ago 1 reply      
Late to the game: where does the logic go? Looping? DB access? Would they be written in Objective-C as actions?
35
mcudich 1 day ago 0 replies      
Neat project! I've been working on something similar here: https://github.com/mcudich/TemplateKit
36
robk 1 day ago 0 replies      
This is quite useful with a Huginn server setup to dynamically feed JSON out https://github.com/cantino/huginn
37
srikz 1 day ago 0 replies      
This is nice. I always wondered why something like this isn't available. I built a nested tab-bar menu system for an e-commerce app recently which can be generated from a json file.
38
babuskov 2 days ago 0 replies      
I made something very similar in 2004. Instead of JSON, you would supply PHP arrays. That enabled mixing in some conditional code and made it more flexible for different usage patterns. We shipped a pretty big ERP application this way. I have since left the company, but the software still works on 200+ installations.
39
ARothfusz 2 days ago 1 reply      
How do app permissions work? Is there a way to make Jasonette request only the permissions needed by my JSON?
40
ldjb 3 days ago 0 replies      
This is a wonderful idea! Making app development faster and easier can only be a good thing.

Now if only it were possible to place a generic version of this app on the App Store and allow users to load in the JSON for whatever app they want. Sadly I very much doubt Apple would allow it.

41
unlogic 2 days ago 0 replies      
> "text": "{{$jason.find('#about h2').text().trim()}}"

Yeah, you can only get so far being declarative. Miracles don't happen.

42
heavy_machinery 2 days ago 0 replies      
This is awesome. Do you plan on releasing an Android version?
43
jwebb99 2 days ago 0 replies      
Is it possible to include some kind of rich text/HTML editor that can send data back to a server via JSON? Is it possible to send any user data back to a server with Jasonette?
44
markbnj 2 days ago 1 reply      
Well you do need to be a little bit of a programmer. You have loops in there, and evaluating properties of objects, etc. Interesting idea and it looks like you've done a nice job of it.
45
nhorob67 3 days ago 0 replies      
Native app newbie here. Can this solution be used to present data tables?
46
devcorn 3 days ago 0 replies      
Hey buddy! Great concept and smooth execution. Something for Android?
47
achairapart 2 days ago 0 replies      
What about Push Notifications? Are they supported somehow?

Great project, by the way!

48
ing33k 2 days ago 0 replies      
Congrats on the release!

This actually reminded me of a startup which lets you create a native app using API documentation (Swagger).

Edit: if I can find the bookmark, I will add the link to that startup here.

49
ruudk 3 days ago 4 replies      
Cool idea. But I'm wondering if this is allowed in the App Store?
50
jv22222 2 days ago 0 replies      
Great Job.

This is the next iteration of Titanium Appcelerator and Xamarin style apps (which basically do the same thing but internally).

51
djhworld 2 days ago 1 reply      
The more I scrolled down this page the more impressed I got.

What's the performance like? Does it handle switching views back and forth? Keep state?

52
jscholes 2 days ago 1 reply      
This looks really interesting. How can you localise an app's UI when using this?
53
ethaligan 2 days ago 0 replies      
Never thought you could do that much with just JSON, this looks very promising for students or for fast prototyping.
54
blairanderson 2 days ago 0 replies      
I would recommend adding documentation for shipping to the App Store.

Even if it's as simple as "just build and submit".

55
joshu 2 days ago 1 reply      
Is there support for loading more pieces of the app on demand (so the server can generate them?)
56
Kiro 2 days ago 0 replies      
Looks cool but what kind of apps are you supposed to build with this?
57
chaudhary27 2 days ago 0 replies      
This is great. I am new to iOS and this seems very promising.
58
jzelinskie 2 days ago 0 replies      
FWIW, the name is quite similar to https://github.com/google/jsonnet
59
ashish348 3 days ago 1 reply      
how to make an app in 15 min | How to develop a mobile app free https://www.youtube.com/watch?v=tnsYachOafk
10
Google Data Studio datastudio.google.com
432 points by wiradikusuma  12 hours ago   147 comments top 40
1
chipperyman573 12 hours ago 3 replies      
For anyone wondering what this is:

"Google Data Studio (beta) turns your data into informative dashboards and reports that are easy to read, easy to share, and fully customizable. Start telling great stories with your data and make better business decisions.

Create up to five free custom reports with unlimited editing and sharing with Data Studio."

(From https://www.google.com/analytics/data-studio/)

This also isn't anything new, it was released a few months ago: https://analytics.googleblog.com/2016/05/announcing-data-stu...

2
soared 10 hours ago 6 replies      
This is almost my favorite tool, barring a couple of problems. It's very easy to use, and makes very readable, pretty reports/dashboards. Sharing via a URL is nice, but I really like just exporting as a PDF and emailing them. Here is an example report that took <10 minutes to make [1]. The big problems though:

1. Beta version. You can't use this in production, send reports to clients, etc. when this tool is in beta. A client gets this report and expects all future reports to be this pretty, but Google might just kill it off or it could break.

2. Limited to 5 reports. The majority of agencies/brands can't afford the $100k+ a year for premium G Suite. 5 reports per account kind of sucks, and I've had to use old Google accounts to get past that limit.

3. Design. Most people simply aren't good at design, so you need a designer to create a good number of templates. Users cannot stray from the templates unless they know what they are doing. While it's easy to use, it's easy to make things look bad too.

4. You can pull from a Google Sheet, but updating that sheet doesn't play nicely with Data Studio.

Side UX notes: changing pages resets the date range, which is annoying, and choosing custom colors for some charts doesn't always work.

[1] http://i.imgur.com/1SCUNu1.jpg

3
laacz 6 hours ago 1 reply      
It's getting too common for new tools from large companies (Google, Facebook, Twitter, Apple) to be unavailable in most smaller markets. For example, Data Studio told me that "Data Studio is not available in your country." Yet I can play with pre-existing reports.
4
bsg75 36 minutes ago 1 reply      
It's surprising and disappointing that, for a Google product, you have to contact a sales rep for pricing.

For a company like Google, which is known to make contact with actual support people difficult, making pricing of all things a sales interaction is off-putting.

5
nevi-me 10 hours ago 3 replies      
Tried using it yesterday, but it says it's not available in my country (South Africa). I'm tired of silly restrictions like this; it immediately turned me off Data Studio.

I have a Tableau licence at work, which we feel is expensive and restrictive in terms of transferring licences around when no longer needed.

It seems like the no-disappointment option these days is to just invest in the d3 ecosystem as much as possible.

6
titel 8 hours ago 2 replies      
"Data Studio is not available in your country."

Why is this still a thing any more? What's more, this is a product of a single company, so no legacy licensing scheme (as with movies, for instance) can be blamed :(

7
ioda 6 hours ago 1 reply      
Shameless plug: we have been building http://www.reportdash.com with a similar target market. Data Studio appeared to be our startup killer in the beginning, but I guess we may narrowly escape owing to our usability advantage and the deep integrations we have for data sources like AdWords, FB ads, etc.

We are working hard to put up a fight. We are about to release a major update in the coming months which makes slicing and dicing data a breeze.

8
thomaspryor 11 hours ago 1 reply      
We're using this a lot at Khan Academy already. Nothing that other products can't do, but the ease of use, familiar interface, and Google account integration have made its use spread through the company very quickly.
9
sandGorgon 11 hours ago 2 replies      
MySQL support exists, but no Postgres?

I notice that Google has consistently built support for MySQL into all its products... but never Postgres.

I don't understand why - is there a massive amount of engineering involved?

10
jordache 42 minutes ago 0 replies      
The functionality is pretty limited... For a scorecard widget, it seems to only aggregate values; what about min/max/avg?
11
johnhenry 1 hour ago 0 replies      
I've been using silk.co for a while now. It seems quite similar in its use case... I wonder if anyone might be able to comment on the differences?
12
wgx 5 hours ago 0 replies      
We (at D4) have been offering something similar - QueryTree lets users connect a data source (MySQL, MS SQL, AWS or PostgreSQL) and then build powerful reports and visualisations with no technical knowledge required. We even suggest JOINs that make sense automatically. http://querytreeapp.com/

I'd be really interested in feedback from people who like/dislike the Google Data Studio product.

13
lpasselin 11 hours ago 0 replies      
We use this for a client with Google AdWords data. The user can go to a URL and see the data and plots we selected for them. Decent tool, easy to use, but a bit slow when I used it last month.
14
rpalsaxena 1 hour ago 0 replies      
What do you think - might companies with terabytes of data use these Google Analytics tools as their main analytics tool?

To what extent can they rely on these tools?

15
david90 11 hours ago 0 replies      
I used to sync Google Analytics and other external data sources to Google Spreadsheets with add-ons (https://chrome.google.com/webstore/detail/google-analytics/f...) and visualize the data there.

Just played around with Data Studio; customization seems easier.

16
aaronhoffman 2 hours ago 0 replies      
A site we've been working on for a while that does something similar, but with d3.js and dc.js to bring the charts to life: https://www.sizzleanalytics.com
17
gegtik 11 hours ago 1 reply      
Wonder how related this is to Fusion Tables

https://fusiontables.google.com/

18
djhworld 8 hours ago 1 reply      
This is nice, from the looks of it you can query datasources in the Google Cloud ecosystem like BigQuery etc, although it doesn't appear to support joining across datasources (could be wrong)

This would probably be very useful for 'ad-hoc' queries/reports. I don't believe AWS offers anything similar right now, outside of spinning up an EMR cluster and attaching a notebook or something - but that's a lot of setup

19
mikeflynn 11 hours ago 0 replies      
I've been staring at the MySQL connection settings for about five minutes, deciding whether or not I actually want to connect this.
20
sosha 6 hours ago 0 replies      
Data Studio is not available in your country.

Would you like to be notified when the service is available?

Sigh...

21
ljw1001 4 hours ago 0 replies      
Is there any information on building connectors to new data sources? I can't find any on the site.
22
gabrielrdz 11 hours ago 3 replies      
Why waste any time using it if a few years down the line they will discontinue it?
23
alexrbarlow 6 hours ago 0 replies      
Looks really nice; business-wide dashboards are often a pain, and it's nice for everyone to be able to see sign-ups etc. The MySQL data source is good, but I'm hoping they'll include things like Prometheus or others later. For now this would work with some Segment.io or similar batch jobs.
24
dgelks 3 hours ago 0 replies      
Can definitely see this becoming useful as a potential Metabase replacement since we are using MySQL - the lack of Postgres support is a weird choice though.
25
alexc05 3 hours ago 0 replies      
Looks like they need some QA. The sample AdWords report looks really buggy on the iPad Pro (Chrome).
26
jasonhoyt 9 hours ago 1 reply      
A nicer alternative for non-US folks, or even those in the US, would be DataHero (https://datahero.com/). Less feature rich than Tableau, but also extremely easy to use with practical graphs and data mashups out of the box.

I have no connection to the app, other than I've tried dozens like it, including Google's Data Studio.

27
1_listerine_pls 11 hours ago 1 reply      
I think tools like this are going to enable more government accountability. Facilitating visualization and sharing of what otherwise would be a bunch of numbers is key.

In Mexico there is only one electric utility company. The price per kWh is supposed to be a function of the median monthly temperature. However, many cities are misclassified on purpose. A map showing which cities are misclassified would be really easy to do and share if it supported maps.

28
JungleGymSam 10 hours ago 1 reply      
An alternative from Microsoft is Power BI. Much more mature. https://www.powerbi.com
29
hd4 6 hours ago 0 replies      
Just curious, are any people in this thread already using Eclipse BIRT, Jasper Reports or Pentaho? Would you mind sharing your experiences/use-cases?
30
awestroke 7 hours ago 1 reply      
No drill-down; you can only explore the time axis. Worthless for any real BI.
31
ben_jones 11 hours ago 0 replies      
I wonder if this will be able to extend the Google cloud console / analytics.

And it's nit-picky but some of the dashboards aren't responsive in chrome, which I would normally let go but if you're gonna force us into the cult of material design you might as well get that right.

32
rabboRubble 9 hours ago 0 replies      
Anybody have decent test data connections that can be used to play with this? Unfortunately my personal Google account is sparse and doesn't provide much meat on the bone.
33
tsumnia 10 hours ago 0 replies      
Has anyone gotten YouTube Analytics to work? My authorization did nothing, and sadly that's what I'd like to generate!

Edit: Nevermind, I left the page, came back and things worked(?)

34
mataug 9 hours ago 0 replies      
My guess is that this was some internal tool that got cleaned up and opened to the public as a beta.
35
vgt 10 hours ago 0 replies      
works on top of BigQuery!
36
andrewvijay 11 hours ago 0 replies      
Looks pretty complex. I wonder how long it took them to pull it off.
37
xchaotic 4 hours ago 0 replies      
And that is less than 2 years after they abandoned Google Refine.
38
BucketSort 11 hours ago 4 replies      
Someone please spoil my optimism about this. This seems like a game changer in reporting. What do you guys use that make this obsolete?
39
iTunzUout 11 hours ago 1 reply      
Does this make chart.io obsolete?
40
smegel 11 hours ago 0 replies      
Google Analytics 2.0
11
Hyper.sh Effortless Docker Hosting hyper.sh
530 points by rocky1138  1 day ago   199 comments top 44
1
e12e 17 hours ago 1 reply      
Congratulations on launching/going public! I remember seeing your hypervisor/container tech a while back, and it's nice to see a service based around it.

A couple of thoughts:

1) Your quickstart ends with a command to remove the test container, but leaves other resources intact, like the pulled image (billed at 10 cents per started GB). That's probably going to surprise some people who start playing with your free credits and then end up eating through them, or getting a (small) bill at some point, due to dangling images.

Might want to add a "hyper rmi nginx" at the end, along with commands to remove the shared volume?
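
For example, something like this at the end of the quickstart (assuming Hyper's CLI mirrors Docker's image and volume subcommands; the exact command names here are my guess, not verified against the docs):

  hyper rmi nginx
  # remove the shared volume from the quickstart; assumes a Docker-style "volume rm" subcommand
  hyper volume rm <shared-volume-name>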

2) The binary for Linux seems to work fine under "Bash/Windows Subsystem for Linux" on Windows 10.

3) Inbound bandwidth on the smallest instances is abysmal - I didn't test bigger ones, so I'm not sure if it's just those that are oversold/under-provisioned. I got 200-300 Kbps from Ubuntu mirrors and http://speed.hetzner.de/1GB.bin on a fresh Ubuntu container - while from my small VPS at Leaseweb[1] I got a solid 10 MB/s (basically 1 Gbps).

Granted the small VPS is almost 5 Euros a month - but that includes an IP - and drops with a longer term commitment (again apples to oranges, I know -- the whole point of containers on demand is that they are, well, on demand).

And Leasweb is pretty close to Hetzner - but still, at least breaking a solid 1MB/s should be an absolute minimum.

[1] https://www.leaseweb.com/cloud/public/virtual-server

2
gnepzhao 23 hours ago 8 replies      
Hey all, founder here. I'd like to thank you for your votes here. Really appreciate it!

Also, I just want to share our public roadmap: https://trello.com/b/7fEwaPRd/roadmap. Feel free to comment; it actually helps a lot for us to prioritize. Thanks!

3
agentgt 23 hours ago 5 replies      
Google Cloud is not far from this. Basically instead of "hyper" you are typing "gcloud".

Google Cloud is far more complicated but its tools so far are pretty good.

I couldn't find how you do custom networks with Hyper. Also, as a Java + Postgres shop, 16 gigs of memory (L3) is just not enough.

Per-second billing also seems overkill. Google Cloud has per-minute. It doesn't seem to make sense for "effortless": if you are that interested in saving money (i.e. margins), it seems you wouldn't be using a Heroku-like PaaS?

For me, easy deployment is a small part of the story for a compelling PaaS. What I want is really easy metrics, monitoring, notification, aggregated log searching, load balancing, status pages, elastic stuff, etc. Many cloud providers provide this stuff, but it is often disparate, costly add-ons/partners/integrations that are still not terribly easy to work with.

IMO it is actually harder to get all the diagnostic cloud stuff than the build + deployment pipeline.

EDIT:

As mentioned in another comment, my company tried to use Docker, but it would take too long to make Docker images, so we just prefer VMs. That is, it seems with something like Hyper you save on deployment times but your build times get worse (unless I'm missing some recent magic that you can do with Docker now).

EDIT again:

We didn't have Docker cache (because of some issues) so please ignore my slow docker build time comments. Apologies.

4
STRML 19 hours ago 1 reply      
Surprised not to see much comparison to Joyent Triton on here.

We evaluated Triton, and while we encountered a depressing number of show-stopping bugs doing really basic things in the first week (like any container that installs `curl` failing due to a UTF-8 character in the default CA set), it was pretty cool to use the native Docker CLI to provision nodes. Local == remote on Triton.

Triton runs on top of SmartOS inside Zones. To me, this is the only setup I'd actually trust for production. The security story is a whole lot of hand-waving on Linux. What does Hyper run on top of?

Unfortunately for Triton, it does take as long as a minute to provision and the cost is 2x Hyper's for equivalent hardware. I haven't done CPU benchmarks on Hyper yet but the CPUs were anemic on Triton. The I/O perf was unbelievable, though, due to local SSDs and no virtualization layer.

Will keep an eye on this at least for dev and CI. Good luck!

5
vsl 8 hours ago 1 reply      
They are spammers. Last week they spammed an email address of mine that has been unused for years (but is still harvestable) with an email advertising their product, under the pretense that I'll find it useful "as a build it user" (an OSS project I indeed used briefly many years ago).

Such poor judgement goes to show the company culture. I wouldn't even consider them for any service after this.

6
beneills 17 hours ago 1 reply      
I'm trying this out by deploying my website (static files generated from Jekyll source and served, all in a Docker image).

I've written the following instructions for updating the site (build new image, push to Docker Hub, pull into hyper.sh, stop previous container, run new one, attach floating IP). Does it seem reasonable?

  HYPER_IP=209.177.92.197
  LATEST_HASH=$(git log -1 --pretty=format:%h)
  IMAGE_NAME=beneills/website:$LATEST_HASH
  docker build -t $IMAGE_NAME .
  docker push $IMAGE_NAME
  hyper pull $IMAGE_NAME
  # stop and remove the currently running container before reusing the name
  EXISTING_CONTAINER=$(hyper ps --filter name=website --quiet)
  hyper stop $EXISTING_CONTAINER
  hyper rm $EXISTING_CONTAINER
  hyper run --size=s1 -d -p 80 --name website $IMAGE_NAME
  hyper fip attach $HYPER_IP website

7
ohstopitu 1 day ago 0 replies      
This has a chance to do for Docker hosting (quick, painless Docker containers) what DigitalOcean did for VM hosting (quick, painless VMs in the cloud).

This will definitely be my go-to hosting for personal side projects.

I wonder if the major cloud providers will have something similar (both Azure and AWS seem to spin up VMs on which they run the containers - but you do get charged for the VMs as well)

8
stu2010 1 day ago 2 replies      
"All our servers are built on powerful Octo-Core machines" pretty much guarantees they're using some cheaper than E5 Xeons to save money, I'm wondering if it's something in the Xeon-D line. Has anyone specifically characterized what they're using? Could be Xeon D-1540s or similar, or they could also be selling 4-core hyperthreaded E3 Xeons as "8 core".
9
btgeekboy 1 day ago 4 replies      
Building a cloud from the ground up is no small task, even more so when you build it on your own hardware, and your own virtualization technology.

Any idea who these people are / this company is? Seems to have come out of virtually nowhere.

10
octref 23 hours ago 3 replies      
Confusingly, Zeit also has a terminal named Hyper[0], despite not having the .sh TLD.

[0] https://hyper.is

11
cies 22 hours ago 1 reply      
Funny to see Deis is not mentioned here. It runs on top of Kubernetes and seems to deliver quite a similar experience to Hyper. Compared to Hyper, Deis seems a bit more towards the Heroku style of doing things, which is not a bad thing at all. And the Deis team came up with Helm, which is an amazing way to deploy whole sets of containers as if you are installing packages with a package manager.
12
JamiePrentice 23 hours ago 2 replies      
Really like the look of this, feels VERY Digital Ocean-esque from the UI (which is awesome). As a big fan of DO I'm looking forward to playing with it!

Edit:

One interesting thing I've noticed is that I was charged a dollar for an IP address that I released after 1 minute and 11 seconds. I'd have assumed that it would have been by the second as well. However:

fip 209.177.88.125 | allocated 2016/11/07 16:27:08 | released 2016/11/07 16:28:19 | 0.0197 hours | $1.0000

From the pricing page: "Billing begins when a new Floating IP is allocated, ends when it is released. Partial month is treated as a entire month."

13
mmastrac 1 day ago 2 replies      
The pricing on this is amazing. It's cheap enough to run a few services that I'd prefer to keep off my home machines.

How secure are these containers though? I always thought that Linux containers were not designed to be a guaranteed firewall between multiple tenants.

EDIT: They use their own technology to run docker on "bare metal hypervisors" - https://github.com/hyperhq/hyperd. That's actually pretty cool.

14
funkaster 21 hours ago 0 replies      
Interesting, will give it a look. For small container-based projects I've been using Bluemix[1] and it's very simple and CLI-friendly. The web dashboard could use some help, but it works. Definitely light years simpler than AWS or Kubernetes for simple projects.

[1]: https://www.ibm.com/cloud-computing/bluemix/

15
guptaneil 1 day ago 1 reply      
This is exactly the service I've been looking for: a fast, cheap, and easy docker deployment service for my personal projects. It'll be interesting to see this grow, and more importantly, see how they handle security and privacy.
16
zxv 23 hours ago 0 replies      
"Hyper is a set of Linux kernel, init process and management tools, able to virtualize containers to improve their isolation and management in case of multi-tenant applications, eliminating the need of guest OS and all the components it brings. Hyper provides safe and fast isolated environments (virtual machines), on which portable environments (containers) can be easily scheduled."[1]

The article[1] links to the hyper.sh site [2], as well as a github repo [3].

[1] https://wiki.xenproject.org/wiki/Hyper

[2] https://hyper.sh/

[3] https://github.com/hyperhq/hyperd

17
uniclaude 22 hours ago 1 reply      
Wow, very good experience you guys are delivering here! Congratulations on shipping!

Little question, you guys said you started in Beijing, and that your next plans are NYC and Europe... Why did you leave Beijing? Didn't the local growth attract you?

18
OJFord 20 hours ago 1 reply      
Unfortunate name collision with Hyper the terminal emulator, which has already changed its name once...

https://hyper.is

19
switz 1 day ago 1 reply      
This is really intriguing and seems like a strong fit for my use-cases. What are your plans to expand to other datacenters?

Personally, I'd like to see: NYC, Chicago, Germany, Middle East, and Australia

20
jayfk 22 hours ago 1 reply      
This looks cool, but ~$300 a month for 16 GB of RAM?

Where am I supposed to run my DB?

21
jrk 18 hours ago 0 replies      
I'm curious: why is the pricing linear in RAM for the S* and M* types, but double that for the L* types? L3 = 4xM3, but the pricing is 8xM3. The pricing is very appealing on the lower end, but much less appealing for the still-quite-small large end.
22
abhishivsaxena 23 hours ago 1 reply      
For anyone looking for easy-to-use Docker hosting, I would heartily recommend Docker Cloud (first node free, then $14/node/month), along with bare-metal providers like Packet.net or Scaleway.

I have an 8 GB/4-core Atom-based bare-metal server running on Packet.net for only $35. It's running 30+ moderately used containers without any trouble.

Got me off Heroku finally!

23
iamdave 1 day ago 0 replies      
%docker in production joke goes here%

No but really, this is a neat concept and the idea of micro instances is attractive for prototyping.

24
beneills 16 hours ago 0 replies      
Feature request: allow volumes to be mounted as read-only via, e.g.:

 hyper run --name mycontainer --volume myvolume:/mnt/point/:ro myimage

25
willcodeforfoo 20 hours ago 0 replies      
I tried this a few days ago after [someone here suggested](https://news.ycombinator.com/item?id=12876472) it as a Lambda + containers tool.

I signed up for the trial and was pretty impressed how easy it was to get up and running. I'm hesitant to use for production-level workloads. As it matures I think it will be a great platform (and then Amazon, Google, or Digital Ocean, etc. will acquire it and maybe roll it into their offerings or maybe kill it.)

26
andrewmunsell 20 hours ago 0 replies      
Maybe I'm just blind, but when I was looking at Hyper over the weekend I couldn't figure out whether it's possible to perform a rolling update for a service to update its environment variables or other settings that you can specify when you create the service.

https://docs.hyper.sh/Reference/CLI/service_rolling_update.h...

27
ch4s3 23 hours ago 3 replies      
I wonder if this is better for spinning up low-traffic static sites than DO.
28
beneills 16 hours ago 1 reply      
Feedback: there are empty pages in your docs, e.g. https://docs.hyper.sh/Reference/API/2016-04-04%20[Ver.%201.2...
29
aioprisan 1 day ago 5 replies      
Is there something like this on top of DigitalOcean/AWS instead? I'd rather rely on those providers for the hardware and uptime.
30
atmosx 19 hours ago 1 reply      
What is the best way to deploy a CRUD app? Use MySQL over TLS? Are there any sort of persistent volumes?
31
zdrummond 18 hours ago 0 replies      
Looks really nice.

Given that this is yet another place to run code, I would be very interested in a third-party security review. The result of that, and any other Certs or regulator reviews, would strongly define what sort of work loads can be run on it.

32
beneills 16 hours ago 1 reply      
Do you plan to host a container registry? My use case is wanting to build an image containing SSL server certificates, which I cannot push to, e.g. Docker Hub.

Being able to do `hyper push myimage` would streamline the process.

33
slig 1 day ago 1 reply      
Let me see if I got this straight:

1. Hyper.sh is a PaaS running their own open-source stack https://hypercontainer.io/ ?

2. That means that it's possible to host "your own hyper.sh" on AWS or DO?

34
merb 20 hours ago 0 replies      
> L2 PRIVATE NETWORK

I'm pretty sure they mean Layer 3; the effort of running an SDN network/VPLS and reconfiguring routes etc. for every client seems like too much to me.

35
elp 22 hours ago 0 replies      
This is really cool. Any idea when you will start offering IPv6 floating IPs?
36
luisrudge 23 hours ago 0 replies      
zeit also has super simple dockerfile support with `now`: https://zeit.co/blog/now-dockerfile
37
q3k 1 day ago 0 replies      
> Amazing Hardware: All our servers are built on powerful Octo-Core machines [...]

Doesn't sound so amazing to me (or is just very low density compared to what's typically done with the current Xeon lineup).

38
tapirl 23 hours ago 0 replies      
Looks like a competitor to Google App Engine at a much lower price (but GAE provides 28 free compute hours per day).
39
Untit1ed 14 hours ago 1 reply      
Are there any plans for autoscaling groups like in K8S?
40
peregrine 1 day ago 1 reply      
This is pretty cool. I would also love to see a "micro" instance with more cores but keep the lower memory.
41
fiatjaf 23 hours ago 1 reply      
Are you sure network is free? Because if I host IPFS it will consume all the network you have available.
42
tester8453 11 hours ago 2 replies      
OK, how the heck do I view how much credit I have? Why is this so difficult?
43
WordyMcWordface 1 day ago 2 replies      
Seems a bit expensive compared with the cheapest VPS/OpenVZ setups around.
44
jeisc 23 hours ago 1 reply      
what kind of payments are accepted?
12
Nvidia adds telemetry to latest drivers majorgeeks.com
406 points by schwarze_pest  2 days ago   230 comments top 26
1
ChuckMcM 1 day ago 3 replies      
First, let me say that I think what they did was wrong and it should only be opt-in and clearly stated.

That said, having managed fleets of machines that were nominally running the "same" software, getting telemetry from all of them is a really powerful debugging tool. Once you get above about 1,000 machines, comparing logs across all of them immediately surfaces software issues (happens on all machines), connectivity issues (machines in a certain area), bad machines (unique problem signature), and environmental issues (time-of-day correlation with other things like power/temp/humidity/etc.).

And that gives you a bit more courage to release things early because you'll see problems faster and can fix them.

So with a typical roll-out of 10% of the population, followed by an additional 15%, followed by the rest, you can catch a lot of errors while 75% of your population sees a really good experience (and in web services, where 66% of the population is the minimum requirement for delivering rated service, you can often get close to 100% uptime).

Does that justify their action? No. But since you really don't need everybody to participate to get the benefit, I could see a path where you opt in for early access to drivers, which requires the telemetry, and people who are OK waiting for the driver to be vetted by the 'canary' population get a driver without telemetry.
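
As a rough illustration of that staged roll-out idea, here is a minimal sketch of hash-based wave assignment, assuming stable machine IDs; the cumulative percentages mirror the 10% / +15% / rest split above, and all names are illustrative:

  import hashlib

  # Cumulative percent of the fleet enabled per wave: 10%, then 25%, then everyone.
  WAVES = [10, 25, 100]

  def rollout_wave(machine_id: str) -> int:
      """Map a machine to a stable bucket in [0, 100) and return its wave index."""
      bucket = int(hashlib.sha256(machine_id.encode()).hexdigest(), 16) % 100
      return next(i for i, cutoff in enumerate(WAVES) if bucket < cutoff)

  # A machine only receives the new driver once the roll-out reaches its wave.
  print(rollout_wave("machine-00042"))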

2
brink 2 days ago 12 replies      
On a related note: I can't even use Nvidia's "GeForce Experience" any more without logging in. Thanks for that, Nvidia. Just what I wanted: a driver tool that forces me to log in.
3
eswat 1 day ago 5 replies      
Maybe I haven't looked hard enough, but I'm surprised this and the forced login for GeForce Experience didn't make a bigger wave amongst gamers, who've historically been very vocal about questionable decisions that provide far more value to the business than to the gamers.
4
bhouston 2 days ago 4 replies      
Good. Telemetry should have been in these video drivers for crash reporting for a decade. It would have helped a ton with various video game crashes and the low quality of video drivers.
5
nanch 1 day ago 0 replies      
Send an email to info@nvidia.com to let them know that you'd like them to change their policies regarding opt-in vs opt-out default settings.

--

Here is my email:

Dear Nvidia,

 I have been a life-long supporter since I was in college (14 years). I have recommended your products to many friends and purchased more than 15 of your graphics cards for my own computers. I build servers and run a cloud storage business. My friends and family look to me for advice on their own purchases. I am your target market - a technology leader that makes recommendations to others. I have been extremely satisfied with your product for a long time and would like to be able to recommend your "issue-free" products to my friends, family, and associates. I'm a big fan of Nvidia.
--

 Unfortunately, you have recently enabled telemetry reports (https://news.ycombinator.com/item?id=12884762), and I will be less willing to recommend your product, opting for an AMD solution, or on-board solutions.
--

 To resolve this issue, please:
1: Please use opt-in defaults instead of opt-out defaults for privacy-sensitive reports

2: Make a blog post publishing your public policy on prioritizing user privacy over other priorities.

--

On a more general note, privacy issues will be an increasingly important consideration for technology leaders before making recommendations. Microsoft made a mistake integrating privacy-invasive telemetry into Windows 10. Please don't make the same mistake. Nvidia needs leaders that will prioritize user privacy over other market concerns.

Thank you,

David

6
shmerl 1 day ago 0 replies      
As a long-time Nvidia user, I grew tired of Nvidia not releasing open drivers. At the same time, amdgpu + radeonsi + radv are constantly improving, so my next GPU is going to be an AMD Vega.
7
bnmillar 2 days ago 1 reply      
In the words of Linus, "Nvidia, Fuck You"
8
Animats 1 day ago 1 reply      
The article tries to get you to download "autoruns.zip" from their site. That's suspicious. But it seems to be OK. The official Microsoft version is at [1] and the ZIP files compare equal.

[1] https://technet.microsoft.com/en-us/sysinternals/bb963902.as...

9
proactivesvcs 1 day ago 1 reply      
If I find any software defaulting to opt me in, without asking, it gets turned off and stays off. If it asks, I almost always permit it. When I have customers ask me about the same (when prompted by their software) they tend to agree with my thinking.

There is an exception though - since Windows 10's enforced telemetry, I have turned off all of the telemetry for all of their software across the board. Until they start to conduct themselves respectably again, they can do without my drop in their ocean.

10
nopcode 1 day ago 0 replies      
In NVIDIA's defense... this is all optional. You can still install the drivers with control panel without any telemetry or "GeForce Experience".
11
isaac_is_goat 1 day ago 0 replies      
I recently switched to AMD because I bought a FreeSync monitor and I'd been meaning to upgrade my card anyway. Looks like I did it just in the nick of time.
12
arcanus 1 day ago 1 reply      
Please, AMD, make competitive products to keep Intel and Nvidia honest!
13
cheiVia0 1 day ago 0 replies      
Ever since AMD put in the effort into open source Linux drivers, I've been only buying Radeon GPUs. This is a reminder of why I don't want to rely on proprietary drivers.
14
deadcast 1 day ago 0 replies      
I just switched to a geforce 8400gs using the nouveau drivers on GNU/Linux. I'm not big into graphically intense applications and I don't have to worry about my graphics card waking up and "phoning home." :)
15
awinter-py 1 day ago 0 replies      
Even if we outlaw phone-home for information-gathering, automatic updates have to upload information about your architecture in order to download binaries.

I don't think anybody is suggesting at this point we ditch automatic updates -- the consensus seems to be they fix more problems than they cause. So this is going to remain a problem.

16
deviate_X 1 day ago 0 replies      
What we actually need is good _independent_ firewall vendors.

It is not enough to focus on the telemetry of giant corporations like Nvidia or Microsoft while forgetting about all the P2P software being installed by game vendors and the "telemetry" of smaller software vendors.

On big computers/PCs, the default mode makes the user give up too much control _forever_ once the software has been installed. Most software only needs to be doing anything when you're actually using it.

What we need is not opt-in checkboxes from vendors; what we need is the operating-system-level software to be better, where our explicit permission is needed to allow some kind of activity, like transmitting over the network or detecting my location.

17
cturner 1 day ago 1 reply      
Imagine that five years from now, RISC-V or POWER8 have become established as free core-CPU platforms. Is there anything equivalent coming through in the GPU space?
18
robohamburger 1 day ago 0 replies      
It would be good to know if this is in GeForce Experience or the actual driver, and what exactly it's doing. The article seems to be light on details.
19
andybak 1 day ago 0 replies      
I'm not bothered about the privacy so much as the bloat (or rather - I trust that those more vigilant than me will warn me if the privacy issues are more than theoretical - lazy I know).

The bloat is endemic to hardware companies. Is there some law of nature that says if you primarily build peripherals then you write terrible software?

20
srj 1 day ago 0 replies      
I used to just use the Windows "Update driver" dialog to manually find nvlddmkm.sys and install it that way to avoid the bloat. I haven't used Windows in a while but it may still be possible.
21
elihu 1 day ago 1 reply      
Any idea if the Linux drivers do this as well?
22
frik 1 day ago 0 replies      
Can it be avoided by using the "advanced" option in the setup wizard and deselecting everything but the graphics driver (and the physics engine and the sound driver)?
23
JBiserkov 1 day ago 0 replies      
Great article: highlights a problem, shows a solution. Autoruns is a very useful program.
24
hobarrera 1 day ago 2 replies      
What's "GeForce Experience", "Wireless Controller" and "ShadowPlay services"?

Do recent drivers always include the latter? How do I check for them? Are they kernel modules?

In my case, all the nvidia drivers I see loaded are:

  $ lsmod | grep nv
  nvidia_drm             20480  1
  drm                   294912  3 nvidia_drm
  nvidia_uvm             704512  0
  nvidia_modeset         770048  3 nvidia_drm
  nvidia               11866112  42 nvidia_modeset,nvidia_uvm

25
eveningcoffee 1 day ago 0 replies      
This behaviour is not acceptable.
26
youdounderstand 1 day ago 4 replies      
I'm surprised by the backlash against telemetry on HN. How else are you supposed to improve the reliability of software used on tens of millions of devices with a near-infinite number of hardware and software permutations?
13
Regular meditation may be more beneficial than vacation harvard.edu
416 points by prostoalex  4 days ago   205 comments top 36
1
not_an_account 4 days ago 7 replies      
Mindfulness meditation is one of the pillars on which my life rests. If you have never done it before, realize that it's hard, especially if you fall anywhere on the ADHD spectrum, and it's just brutal for the first few weeks for anyone. As your brain becomes more still and your focus improves you'll notice your productivity improve. As the mental chatter drops you'll also notice the problems in your life, that you've ignored or been too distracted to see, impeding your ability to meditate. Only working to fix them will allow you to continue, but you'll be better off for it.

I rarely talk about meditation in professional circles because, for one, people think it's weird or have a preconceived notion of what it is, but more so it's so damn effective that it's seriously my secret weapon in life and part of my competitive edge. (As an aside, it has nothing at all to do with "strange religions", vegans, or yoga pants.)

Start slow, 10 minutes a day for a week, then add 5 for the next week, and so on, and don't rush. Take it seriously and don't disrespect it. I simply cannot recommend it enough.

2
skankhunt42 4 days ago 6 replies      
Now I can show this to my employees when they ask for vacation days. I won't allow anything but the best for them and meditation can be efficiently packed into the lunch break time frame.
3
origami777 4 days ago 11 replies      
I've been going through challenging times and have been confronted with some things I've never had to deal with before. There are a couple of things that trigger anxiety in me on a daily basis now. I wake up in the middle of the night with little anxiety episodes. I'm unfocused at work because of what feel like underlying emotional issues.

The only things that have helped (I am trying to fix this without medication) are the tools from HeartMath. Their "Transforming Stress" book has been a godsend. It's not a get-well-fast type thing, but I've been practicing the sessions daily for a month now and am in much better control of my stress and anxiety.

I have practiced Tai Chi, meditation, and visualization via NLP techniques in the past. I'm not a novice when it comes to the types of techniques that HeartMath teaches. However, the way theirs are structured is the most effective I've found for dealing with these issues.

If you're going through hard times I highly recommend using their tools.

4
didibus 4 days ago 4 replies      
Vacations vary widely. Normally I try to cram so much traveling, visiting, sightseeing, partying, and drinking/eating into my vacation that I'm more burned out when it's over than when I started. I often joke that I need a vacation from my vacation. I also have this habit of booking my flights as close as possible to when my job ends/starts, like taking a red-eye right after work and coming back the day I start work again, going straight from the airport to work.

Given that, I can absolutely believe meditation is better than vacation. But I have to wonder: if my vacation consisted of calmly driving to a cabin in the forest and pleasantly sitting by a lake for the whole vacation, reading or pondering or looking at the water by myself, would that still be worse than meditation? Or would that start to count as meditating?

5
trprog 4 days ago 0 replies      
>He went on to explain that other factors that often go hand in hand with meditation (for example, exercise, diet, even exposure to incense) could help explain these improvements. So that as well remains to be more fully resolved in future studies.

Sounds nice and vague, although that is understandable. It reminds me of studies suggesting that taking regular naps gives all sorts of health and longevity benefits. Is it the naps themselves, or that you have a lifestyle that permits the opportunity to take naps? If nothing else, the ability to decide to nap indicates the individual is in control of their time, which would seem to correlate with lower stress levels vs. someone who has their daily schedule dictated to them in such a way as to make lying down for 20 minutes impossible.

There are so many confounding factors the outcomes are always going to be really fuzzy.

6
microtherion 4 days ago 0 replies      
How long until the trend at unicorn startups is to give employees zero vacation days but unlimited meditation?
7
dschiptsov 4 days ago 1 reply      
And meditation is? Sitting in a public place in yoga pants holding fingers in a mudra? Vacation defined as? Does vacation include visiting Buddhist countries and Indian meditation retreats? How long a vacation should be?

Hint: meditation requires profound changes in one's assumptions about one's own nature and conditioning. That's why the teaching of the Buddha has been a philosophy, not a book of asanas. Meditation is the tool to realize the accuracy and correctness of Buddha's insights, to test and validate his hypotheses yourself.

Here is a good place to start: https://www.coursera.org/learn/science-of-meditation

8
kome 4 days ago 1 reply      
> You say vacation, I say meditation

Beneficial to what? The point of a vacation isn't to be more productive at work. Productivity is not the goal.

9
agumonkey 4 days ago 5 replies      
One thing that strikes me often with vacations of the average person is how much of a runaway it is.

- Fed up with work, I need holidays.

####

- Fed up with holidays, can't wait to get back to work.

10
raarts 4 days ago 0 replies      
> vacation has beneficial but very temporary effects, and that mindfulness therapies have sustained beneficial effects.

Proving the obvious.

But I would choose vacation over meditation any time. Seeing the world and other cultures does a lot for my mental health too.

11
soulnothing 3 days ago 0 replies      
Last year I took two months off between contracts. It was more of a staycation, but realistically it had been over 10 years since my last vacation, or even more than a day off.

That time period did so much to help rejuvenate me and make me better: working on personal projects and other items. Towards the tail end I took on part-time work and was still happier than I had been in years. The biggest thing was being able to meditate daily and pick what I wanted that day, whether it be a novel, research, or software.

I took an actual vacation this year. I was more stressed and frazzled going into it, and even worse coming out. I had just rented out my house, and the vacation was rife with calls from the police about trouble from my tenants, and missed payments. The travel was also fraught with layovers and delays.

I also meditate, did tai chi, etc. Gym time and meditation were crucial for my overall well-being. But as one of those mega-commuters (which is about to end), I didn't really have the time. Being crunched for time, it was easier to reach for a bottle of whiskey to ease my tension, which is by no means ideal. I had actually gone dry for three years, and broke that several months prior.

I've also found my meditation is largely about relieving stress from work, where I'm not fulfilled in the slightest. I then meditate to try to combat that feeling. It's a vicious cycle.

12
lqdc13 4 days ago 0 replies      
It is not clear from the article if the meditation group stopped meditating or not.

The headline conclusion only makes sense if nobody in the novice meditation group meditated after that week.

13
she11c0de 4 days ago 0 replies      
I'm also a big fan of meditation - I believe it changed the way I perceive everyday "threats" and allows me to see the roots of any problem I encounter instead of just allowing automatic subconscious defense mechanisms to kick in. That being said, in this study it seems that the experimental group really had a vacation AND meditation, so I'm not surprised they had better results.
14
peterwwillis 4 days ago 0 replies      
Meditation is about clearing your mind of distractions. A vacation is supposed to be a chance to see the world and do things you normally wouldn't.

You should try to be stress free in your daily life. Your vacation should be a little stressful. All my best travel stories feature stress (or some conflict, or challenge, or misadventure, ...)

Sitting on a beach is nice, but it's also safe and boring. Go get lost somewhere.

15
charris0 4 days ago 0 replies      
I would agree for the most part, as meditation (from my experience) can provide a consistent, harmonising, contented feeling that persists for longer than the joy that comes from the pure escapism or novelty of experience of a vacation.

I've been doing Transcendental Meditation I learnt through ZivaMIND for over a year now and I couldn't imagine being as on the ball emotionally/with reference to myself without it. I've found it so brilliant.

IMO, meditation, and more broadly speaking the uninterrupted time we allow for ourselves, should be taught in schools and by parents ubiquitously, as it's an obvious antidote to our constant connectedness eating away at our wellbeing, creativity, and compassion/gratitude.

16
lujim 3 days ago 0 replies      
There are mountains of good and bad info on meditation. Here is a primer on mindfulness as I understand it from Ronald Siegel (Assistant Clinical Professor of Psychology at Harvard Medical School, author etc)

The human brain has two default modes of operation and you have very little control over them. They exist because they offered an evolutionary advantage in much more dangerous times, but don't necessarily promote a sense of well-being/happiness in a modern, relatively safe world:

a) Pleasure seeking and pain avoidance

b) Self-referential thought ("Am I good enough, smart enough, and do people like me?")

Mindfulness offers a third option: relaxed, open awareness with general acceptance. It can be done by anchoring awareness on a physical sensation like the breath. You won't be able to hold your concentration there very long before you realize you're thinking of something else. You are being mindful when you realize your mind has drifted and you bring it back to your anchor.

It offers tons of well-studied and documented benefits, like increased activity in the left prefrontal cortex, which is good for reasons that Google can tell you.

A note from my experience. Expect the timeline of benefits to be similar to starting a cardio or weight training program. Your first few times may seem like a waste of time but the benefits start piling up after a few weeks.

17
Confusion 4 days ago 0 replies      
That headline is misleading. It was an experiment with 91 female volunteers and the 'vacation' was not something I would consider a vacation.
18
kirubakaran 4 days ago 0 replies      
I found these free guided meditations to be extremely helpful: http://marc.ucla.edu/body.cfm?id=22

They are from UCLA Mindful Awareness Research Center.

19
andyjohnson0 4 days ago 1 reply      
There was some useful discussion of meditation on HN a few years ago [1]. Reading it, I was surprised by how many people here use meditation and benefit from it - professionally as well as personally.

This [2] post by twotimesposter is a particularly good answer to the question of "how to meditate to achieve mindfulness?". So much so that I bookmarked it. Really must get around to trying it.

[1] https://news.ycombinator.com/item?id=4926642

[2] https://news.ycombinator.com/item?id=4928060

20
jondubois 4 days ago 1 reply      
Meditation does nothing for me. It's ironic that the people I know who are into meditation tend to be completely out of touch with themselves and meditation doesn't seem to actually change anything about them.
21
amerkhalid 4 days ago 0 replies      
If anyone gets turned off by the spiritual/religious aspects around meditation, I have found this podcast very helpful in understanding: https://secularbuddhism.com/ It is not mainly about meditation but about general Buddhist concepts. I am finding that it is helping me be mindful more often and meditate once in a while.

If you do start listening to it, start with the first 5 episodes.

22
tskarthik 4 days ago 1 reply      
Give up expectations. You will be free from stress. That's it.
23
mcguire 4 days ago 0 replies      
"a recent study comparing a mindfulness meditation and yoga retreat to regular vacation in terms of mental health as well as physical health outcomes"

The study compared a week of health lectures and "fun outdoor stuff" to a week-long meditation retreat. This isn't meditation vs. vacation, it's meditation+vacation vs. vacation.

24
jaakl 3 days ago 0 replies      
I know a startupper who struggled for almost 10 years, then started meditation and yoga practices, and then in less than a year the startup grew significantly and eventually got sold successfully. That was me. Was it a coincidence?
25
rohinibarla 4 days ago 0 replies      
"As there is a science and technology to create external wellbeing, there is a whole dimension of science and technology for inner wellbeing.

Here is a study conducted on 536 Isha practitioners

https://www.innerengineering.com/ieo-new/benefits/

26
diegoloop 4 days ago 1 reply      
Here is one interesting app, helpful for relaxation but in a different way.

I've noticed some improvements since I started using it a month ago: better sleep, more focus, positive thinking.

(Unfortunately the app is just for iOS at the moment.) https://appsto.re/de/8PGcfb.i

27
cthalupa 4 days ago 0 replies      
I've got hundreds of memories that I treasure dearly from traveling the world.

I suppose it depends on if you're vacationing just to get away from work, or if there's something you really want to do/see/experience.

28
hiryuken 4 days ago 0 replies      
Playing tennis twice a week has made my life much better. I suggest everyone try it if you suffer from stress, anxiety, or a lack of mental focus. It's also great for losing some weight and getting in better shape.
29
desireco42 4 days ago 0 replies      
This is a classic western scam to get you not to go on vacation :). I wouldn't trust anything.

Meditation is good for you, and you will realize that you need to go on vacation during your meditation.

30
bshastry 4 days ago 0 replies      
Nice try, capitalist hawks!
31
debt 4 days ago 2 replies      
Seems most beneficial, then, to both vacation and meditate simultaneously.
32
jeena 4 days ago 1 reply      
Meh, I don't buy it; the methodology of the study seems not very sound. I myself was on a 10-day meditation retreat (http://www.sobhana.dhamma.org/) where we meditated for 16 hours a day or something, and sure, it helped with mood, relaxation, etc., but it wore off after a couple of weeks, and I never had the nerve to meditate daily because it was not worth it. Btw, my brother and sister attended it too, and both say the same about it.
33
Benjamin_Dobell 4 days ago 1 reply      
"The study was conducted at a resort in Southern California"

It's an incredibly deceptively named article. The study showed that meditating whilst on vacation is more beneficial than a vacation without meditation. All participants were on vacation.

Also, the participants who were meditating already knew how to meditate, so at best we have a correlation between the kind of people who learn to meditate and feeling less stressed during/after a vacation.

34
pkaye 4 days ago 1 reply      
I misread this as "medication" and was confused for a while.
35
burnbabyburn 4 days ago 0 replies      
I still prefer holidays though.
36
jomamaxx 4 days ago 0 replies      
" statistically significant improvements in scores of stress and depression"

It's hard to measure what 'benefit' means.

These things are a matter of choice as much as anything else.

Moreover, it may very well be that some of the answers were 'primed' by the content of the retreat.

14
Canadas federal court rules intelligence service bulk data collection illegal theglobeandmail.com
383 points by based2  2 days ago   51 comments top 9
1
pjc50 2 days ago 2 replies      
"But there is no apparent fallout from this for CSIS yet. While the spy agency says it will stop analyzing the contentious data, there are no indications that it will destroy the data."

It's not really illegal if there's no enforcement, is it?

2
noodles23 2 days ago 1 reply      
Considering how hard it is to explain how Google Analytics works to the average business owner, I imagine it wouldn't be hard to obscure all manner of data collection programs from oversight.

In times like this, the importance of civics education is highlighted. The very idea that people in law enforcement think it's acceptable to treat judges and the legal system with such contempt is scary. Even if you disagree with a certain law or system, you still need to respect it as a public servant.

3
hbt 2 days ago 3 replies      
You'd have to be a fool to trust lawmakers or law enforcement with your data and privacy by this point.

They will always find some loophole in the language. Illegal to collect but not illegal to access what has been collected by allies (aka the Five Eyes).

Just encrypt everything and use VPNs; whatever you push in plaintext may as well be public.

Trust open technologies you understand, not politicians.

4
sandworm101 2 days ago 2 replies      
For any non-Canadians reading this, understand that Canadians have a very different relationship with CSIS than, say, Americans do with the NSA or Brits with GCHQ. We have not seen the bulk use of this data for purposes of general law enforcement. It isn't being used to nab drug dealers or child pornographers. CSIS is rather old-school in its approach: intel assets are for intel purposes, not law enforcement. At least that is what we have seen so far. The trust has yet to be broken as it has elsewhere.
5
fatdog 1 day ago 0 replies      
The thing nobody really talks about is that the agency in question is a "domestic" security intelligence agency. It has no foreign intelligence mandate (unless that's changed recently).

Reporters call it "Canada's CIA," but it is mandated by the government in Canada to spy only within the country. They are not a police force and cannot arrest their targets. Canada has a bunch of police intelligence agencies. It is hard to see what democratic function a domestic spy agency plays. Maybe there are good arguments for them.

6
formula1 2 days ago 1 reply      
So this seems to be more about the retention of data than about its collection, where retaining all data is not as important as retaining the select records that pertain to national security. I'd love to play devil's advocate here and claim that the agency requires all the information, or that court oversight of a spy agency is ludicrous. But I don't think any one branch should be judge, jury, and executioner. I wonder how much of this is symbolic? I know here in the US we are lucky to even hear about a fraction of what happens. But I can certainly argue that many questionable deeds are done in the interest of American citizens.
7
brahmwg 1 day ago 1 reply      
Interesting timing for this post, given the lecture Snowden gave at McGill earlier this week.

https://youtu.be/4x8ZI0IaInE

8
riprowan 2 days ago 1 reply      
The only questions I have are: why is it taking so long for the legal system to catch up to where we were 15 years ago, and how will we ever keep Constitutional protections ahead of accelerating technology?
9
known 2 days ago 0 replies      
The government will always find a workaround under the pretext of patriotism :)
15
Intercooler.js Making AJAX as easy as anchor tags github.com
449 points by cx1000  1 day ago   145 comments top 31
1
DSteinmann 1 day ago 14 replies      
This has been reposted so many times by the author and by others that I can't help but finally ask.

What's the point? This would lead to your API being composed of blocks of HTML which are probably only usable for one product. Why not just use REST + JSON? It would take no more than five minutes to set up client-side rendering, and you could even make it attribute-based like this with barely any more effort. Is it really not worth spending the extra five minutes it takes to set things up in a way that is reusable and standard? All I see is piles of legacy code being generated where it hurts most - in the backend.

This took me 10 minutes to cook up. It would have taken about three if I hadn't forgotten the jQuery and Handlebars APIs. This allows you to POST to a JSON API using two attributes. Untested of course, but you get the idea:

  Example: <button ic-post-to="/api/resource" ic-template="#a-handlebars-template"></button>

  $('[ic-post-to]').click(function () {
    let $button = $(this);
    fetch($button.attr('ic-post-to'), { method: 'post' })
      .then((response) => response.json())
      .then((result) => {
        let templateText = $($button.attr('ic-template')).html();
        let template = Handlebars.compile(templateText);
        let resultHtml = template(result);
        $button.replaceWith(resultHtml);
      });
  });

2
cx1000 1 day ago 4 replies      
Honestly it feels like intercooler.js is building in functionality that should exist in HTML in the first place. For example, the unintuitive "href" attribute sends a GET request, and POST requests are only sent with forms and buttons. What about PUT, PATCH, OPTIONS, or DELETE? According to http://softwareengineering.stackexchange.com/a/211790, "At this point, it seems that the main reason why there is no support for these methods is simply that nobody has taken the time to write a comprehensive specification for it."

Intercooler.js makes them seem a little more "built in" to HTML, which I like.
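
For what it's worth, intercooler appears to expose these verbs as dedicated attributes. A minimal sketch, assuming ic-put-to and ic-delete-from behave as the docs suggest (the URLs are made up for illustration):

  <!-- PUT to update a resource, DELETE to remove it:
       verbs that plain HTML forms cannot express -->
  <button ic-put-to="/api/items/42">Update item</button>
  <button ic-delete-from="/api/items/42">Delete item</button>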

3
nzjrs 1 day ago 1 reply      
I said this on the other discussion, but I'm compelled to post it again.

Intercooler.js is so logically designed it basically requires no documentation - I read the introduction and a handful of examples and thought, "shit of course it works this way" and could basically derive how every other feature mapped to its implementation in that moment.

Congratulations!

4
scwoodal 1 day ago 1 reply      
I was a long time pjax/turbolinks user but always felt like I was pushing the boundaries of what these technologies were doing and always wished for more functionality.

I tried out several client side frameworks but always felt like it was way overkill for the apps I built.

I gave intercooler.js a try a few months ago and was extremely pleased. There's very little server side that's required and the extra functionality I had wanted from pjax was there.

If you're wanting the simplicity of server side rendering plus the feel of an SPA without the frontend complexity give this library a try.

5
Kequc 1 day ago 4 replies      
I'm surprised this needs jQuery. What this seems to be is a simple script that fetches a resource and places it into an element. I really feel opposed to adding more dependencies where they aren't required. That could be written without jQuery or this library fairly easily.
6
xg15 1 day ago 1 reply      
This seems to make things simpler at first glance, but I fear in the end you end up with the worst of both worlds: You have the API inflexibility and UX restrictions of a pure-HTML approach combined with the overhead and need for graceful degradation of a full-ajax approach.
7
carsongross 1 day ago 7 replies      
Main intercooler.js author here, glad to answer any questions.

Happy to see people are enjoying it!

8
Touche 1 day ago 1 reply      
It would be cool if this could somehow use DOM diffing (I assume it just uses innerHTML now), so you'd get minimal DOM updates with the advantages of doing everything server-side that this already provides. Throw in some smart service worker caching and you get pretty close to the responsiveness of a fully client-side approach.
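
For illustration, the diffing idea looks roughly like this with a library such as morphdom - a sketch of the concept, not something intercooler does out of the box (the /fragment URL and #content id are invented):

  // Swap in a server-rendered fragment with minimal DOM mutations
  // instead of a wholesale innerHTML replacement.
  var morphdom = require('morphdom');

  fetch('/fragment')
    .then(function (response) { return response.text(); })
    .then(function (html) {
      // morphdom walks the live element and the new HTML in parallel,
      // mutating only the nodes that actually differ.
      morphdom(document.getElementById('content'), html);
    });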
9
20years 1 day ago 1 reply      
I am happy to see this featured on the front page. I am using this for a current project after coming off an Angular project. I am so glad I chose this. It is simple to use and a pleasure to work with.
10
stdgy 1 day ago 0 replies      
Very neat! This matches up closely with what we have evolved for my group's legacy codebase to simplify handling AJAX requests. I suspect we're not alone in arriving at this sort of declarative abstraction.

Unfortunately, our implementation is rather scatter-brained and non-uniform. That's partly due to its gradual evolution and partly due to lack of free employee time to clean up bit-rot. I'm going to investigate this a bit more and mock out some examples for our product. I definitely think it'd help us organize our unruly mass of code. Good job!

11
oliv__ 1 day ago 0 replies      
Thanks HN! This is one of those just-in-time situations: I was going to need something to do some AJAX in the next few days and this is one of the most elegant solutions I've seen so far. Didn't even know this existed!
12
dec0dedab0de 1 day ago 0 replies      
I recently used intercooler to implement a small feature in a django app. It was an absolute pleasure to use.
13
astrospective 1 day ago 0 replies      
I've been using this on .NET projects and have pulled off some fairly intricate UIs by returning server-rendered partials. The polling support is nice and robust for dashboards.
14
cx1000 1 day ago 2 replies      
I love that you can use this without having to build anything with babel/webpack. Given the scope of my web apps, anything that transpiles or mutates my sourcecode is a non starter because it makes debugging it weird since I'm not looking at my own code anymore.
15
asciihacker 23 hours ago 1 reply      

  <!-- When this button is clicked an AJAX POST request is sent to /example
       and the response content is swapped in to the body of the button -->
  <button ic-post-to="/example">
    Click Me!
  </button>
But what if I want the response to populate the content of another HTML element? E.g. if I wanted an accordion-style FAQ where when you click on the question (or on a plus-sign before the question), the div below the question is loaded with the answer from the server via AJAX.
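
If memory serves, intercooler's ic-target attribute covers exactly this case - a rough sketch, with the selector and URL invented for the example:

  <!-- Clicking the question fetches the answer and swaps it into
       the #answer-1 div instead of into the clicked element -->
  <div ic-get-from="/faq/answers/1" ic-target="#answer-1">
    What is intercooler?
  </div>
  <div id="answer-1"></div>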

16
rhabarba 1 day ago 0 replies      
"Small". As in "just add the giant jquery library as a dependency".
17
ing33k 1 day ago 1 reply      
This is one of those libraries which should be posted on HN once in a while.
18
chunkiestbacon 1 day ago 0 replies      
I used this to make my own webshop software for a client. Lots of AJAX features but only 80 lines of JavaScript in total. Intercooler is great for updating the shopping cart in the sidebar when pressing the add-to-cart button. This makes the shop feel a lot smoother.
19
brianzelip 1 day ago 0 replies      
fyi, the timing of this post is likely related to the HN discussion https://news.ycombinator.com/item?id=12882816
20
chandmkhn 1 day ago 0 replies      
The WebForms version of ASP.NET always supported this idea through something called UpdatePanel:

https://msdn.microsoft.com/en-us/library/bb399001.aspx

Commercial control providers in the .NET world support this scenario with something called "CallbackPanel":

https://demos.devexpress.com/MVCxMultiUseExtensionsDemos/Cal...

Real confusion starts when you have nested HTML controls automagically making AJAX calls. It's a nice idea as long as you can get away with minimal work.

The moment you want to use any modern SPA framework, you are in for a big rewrite.

21
ape4 1 day ago 1 reply      
What happens when there's an error? E.g., cannot contact the host.
22
unethical_ban 1 day ago 0 replies      
I see someone read the front-end discussion. I am reading the guide to IC.js and it's a neat piece of tooling.
23
wichert 1 day ago 0 replies      
If you like this sort of thing I can recommend looking at Patternslib (http://patternslib.com), which has many tools for adding interactive behaviour to a webpage without having to write any JavaScript, making it a great toolset for designers. The injection pattern (see http://patternslib.com/inject/) does everything intercooler does, but also a lot more.

Disclaimer: I'm one of the original authors of Patternslib.

24
sleepyhead 1 day ago 1 reply      
"Attribute ic-get-from not allowed on element div at this point."

https://validator.w3.org/nu/?doc=http%3A%2F%2Fintercoolerjs....

25
smrtinsert 23 hours ago 1 reply      
How many times will something like this be attempted? It ends up being awful for debugging. It works great when there are no bugs, of course.
26
grimmdude 1 day ago 0 replies      
Cool, this has some great functionality. A while back I wrote something very similar called "jQuery Ajax Markup". It was much simpler though: https://github.com/grimmdude/jquery-ajax-markup
27
Yokohiii 1 day ago 0 replies      
Not sure yet if I like it, but for the "on churn" parts I will leave some respect here.
28
bedros 1 day ago 1 reply      
The best part is the examples; there are so many practical cases.
29
jramz 1 day ago 0 replies      
30
bitforger 1 day ago 0 replies      
+1 for official theme music
31
bobwaycott 1 day ago 0 replies      
I have, over the last few years, taken a similar approach and built my own reusable, yet rudimentary, version of this. Happy to see such a well-thought out and elegant approach that matches my own preferences. Going to be using Intercooler in the future (and might even switch my old stuff to it). Nice project.
16
Were Scientists, Moms, And We Avoid Non-GMO Products medium.com
406 points by ph0rque  23 hours ago   479 comments top 48
1
ozy 22 hours ago 20 replies      
I find it hard to read articles like this. I agree with its main premise, but it fails to mention the other side's valid points. And there are two:

1. Some GMOs make a plant produce extra chemicals, specifically pesticides.

2. There is an unknown side to GMOs: perhaps they can lead to super species that we can never get rid of and that will destroy biodiversity.

Crops in category (1) are banned in the EU for human consumption - a reasonable choice, I think, until we have more data.

(2) is a much lower risk than most would think. But it is one of those things where the naive/amateur answer is far from reality, and that naive answer is scary. (Like nuclear power, or more rules vs. more responsibility.)

When done responsibly, GMOs are fine - and even necessary if we want to feed 8 billion people and more in the future.

But an article that fails to mention valid downsides (remember the EU) and discuss the tradeoffs will have a polarizing effect: those who are pro-GMO will chalk up a win and think the anti-GMO side is stupid, while the anti-GMO side will see their biggest concerns go unmentioned, declare the article a failure, and conclude their position is well grounded.

2
oxide 22 hours ago 9 replies      
A couple I know who avoid GMO products have a few things in common with others I've seen saying the same things: Ignorance of what a GMO really is, a deep distrust of corporate interests/government regulators and a strange, unearned trust placed in those selling "organic" products.

It costs them a premium, but they say they're doing it for their children. I can't argue against it to their faces, but quietly I wonder if they wouldn't be more financially free if they weren't so willingly chained to 7 dollar gallons of milk.

3
09bjb 21 hours ago 10 replies      
I have a degree in Molecular Biology and the utmost respect for good science. I avoid GMO products for a variety of reasons:

1. They're absolutely more likely to be harmful to the environment, based on the general behavior of the companies that make them (Syngenta, Monsanto, etc.). There shouldn't really be any debate on this.

2. They have massive potential for global-scale, subtle, negative externalities. I say this as someone who knows a bit of the history of science and has seen how many times we don't understand the full effects of something new until many years later (see most of the fun and variegated chemicals we came up with in the 30s-50s). In science, the burden of proof is on the new thing to unequivocally prove itself true...not 'harmless until proven harmful.' The U.S. is very pro-business and one of the only countries in the world where you can get away with 'harmless until proven harmful.' Obviously GMOs are not going to kill you overnight and seem to have no acute ill effects. I'm a fan of having a choice in this matter and not being forcibly blinded.

3. Most of the studies declaring them safe have industry funding at their root. Most of the studies with "impartial" funding sources declare them potentially unsafe or are inconclusive. "It is difficult to get a man to understand something, when his salary depends upon his not understanding it!" Careerism and loose morals in this field give me plenty of reason to at least be skeptical of a Pro-GMO general consensus...and of overly enthusiastic new science in general.

4. I'm morally opposed to supporting products that involve mucking with an already perfect product, namely real, clean, unadulterated food. To feed the world, we need better distribution and smarter farming practices, not the sort of "solutions" that Monsanto et al. have peddled (see what's happened in India, where adoption of GMO seeds was widespread).

Edit: spelling.

4
sebleon 20 hours ago 2 replies      
Despite Monsanto and Syngenta's marketing about "saving starving children", there's good reason why developing countries are actively fighting GMOs. Giving centralized control of food production to a handful of American corporations is a precarious position.

In practice, GMO agriculture creates a trade imbalance. Instead of simply exporting produce, many countries will find themselves importing huge volumes of proprietary products from the likes of Monsanto. This includes buying a new batch of patented seeds every year, fertilizers, insecticides, pesticides, etc. And of course, once a farm goes GMO, soil damage makes it almost impossible to go back to organic farming. A high level of lock-in.

http://sustainablepulse.com/2015/10/22/gm-crops-now-banned-i...

GMO seeds have been modified to create plants that won't make new seeds - great for recurring revenue.

5
hammock 22 hours ago 4 replies      
There may be a lot of ignorance around GMOs. Dismissing the category altogether as being dangerous may be premature. But there are also some educated people who selectively avoid GMOs because they are well-informed of the risks.

For example, the avoidance of food grown from Roundup Ready seeds because of its likely contamination with harmful herbicide.

6
jly 19 hours ago 2 replies      
> Genetic engineering, along with other tools, can help us address challenges like pests and droughts, while addressing nutritional issues, such as allergens or nutrient deficiencies. Farmers need these tools at their disposal to ensure a safe, sustainable, and reliable food supply.

If farmers practiced sustainable agriculture and grew a variety of crops instead of one or two, they wouldn't have nearly as much need for GMO 'tools'. Our agricultural system encourages non-sustainable practices that practically require GMO seeds, pesticides, herbicides, and many others in order to continue to function. The companies that make these products want to ensure this continues.

GMO on its own is fine, to me. But we need to stop looking at it as the solution to the problem we've created, and focus on fixing the root causes of big agriculture. These 'challenges' that are being addressed are often man-made.

7
Symbiote 22 hours ago 3 replies      
The article doesn't mention my objection to GMOs:

I would refuse to buy GMO products because I don't want multinationals to "own" genomes. That requires far more trust than I'm willing to put in a corporation, especially a foreign one.

8
lumberjack 20 hours ago 3 replies      
If you are three scientists, why are you publishing this on Medium? Make your "science" credentials more visible, at the least.

If you google their names, you find out that they work at Biology Fortified.

>Biology Fortified, Inc. (BFI) is an independent educational tax-exempt non-profit organization incorporated in Wisconsin. Our mission is to strengthen the public discussion of issues in biology, with particular emphasis on genetics and genetic engineering in agriculture. Biology Fortified is independently run on a volunteer basis, and is not supported by any funding from any companies or government entities.

So they are paid to have this particular public (!) opinion.

9
woodpanel 10 hours ago 2 replies      
I've always been "pro-GMO" in the sense that I think it's immoral to artificially restrict our food supply and make food unnecessarily expensive. Plus, the innovation that can be spurred from the food sector into other areas won't be seen until a hotbed has been established. I also think Europe is losing out on opportunity with its Galileo-like reflex against fundamentally new, groundbreaking science.

That being said, I was deeply disappointed to learn that our 'groundbreaking technology' amounts to stuff like "make plants pesticide-resistant" > "dump more pesticide on plants" > "apply for patent". This is really disappointingly stupid.

Also, it turns out that crop yields are almost entirely unaffected by genetic modification [1].

[1] http://marginalrevolution.com/marginalrevolution/2016/10/gm-...

10
opo 15 hours ago 1 reply      
What is ironic about the GMO debate is that the people most worried about GMO foods don't seem to know or care about the many crops that were created by exposing seeds to high amounts of radiation or chemical mutagens:

>...Unlike genetically modified crops, which typically involve the insertion of one or two target genes, plants developed via mutagenic processes with random, multiple and unspecific genetic changes[17] have been discussed as a concern[18] but are not prohibited by any nation's organic standards. Reports from the US National Academy of Sciences state that there is no scientific justification for regulating genetic engineered crops while not doing so for mutation breeding crops.[5]

So, these crops with random genetic mutations are sold as organic and have been in the food supply since 1930:

>...From 1930 to 2014 more than 3200 mutagenic plant varietals have been released[1][2] that have been derived either as direct mutants (70%) or from their progeny (30%)

https://en.wikipedia.org/wiki/Mutation_breeding

11
anthonybsd 22 hours ago 1 reply      
> More generally, a recent report from the National Academy of Sciences showed that herbicide tolerant and pest resistant GMOs have reduced insecticide use and have allowed farmers to use less toxic herbicides

There's been some serious doubt cast on that assumption recently: http://www.nytimes.com/2016/10/30/business/gmo-promise-falls...

12
colordrops 22 hours ago 3 replies      
Layla Katiraee, scientist at Integrated DNA Technologies - hardly a disinterested party. It's in her direct interest to trash everything non-GMO.
13
StanislavPetrov 19 hours ago 0 replies      
Any time your argument boils down to "we have to keep information from people because they are too stupid and it will confuse them," you should be opposed. The same garbage argument was trotted out (successfully) by the beef producers over testing for mad cow disease. If you don't want to test your meat for mad cow, then don't! But don't tell me I'm not allowed to test mine because you say there is no risk. If you don't think GMOs are an issue, then saturate your body with them. Go ahead and mutate every cell in your body as far as I'm concerned. But don't tell other people they don't have the right to information because you say they don't need it. Sickening hubris.
14
wehadfun 22 hours ago 6 replies      
1. How can these scientists be so sure that these modifications will not have some unintended side effect?

Software engineers can't predict every side effect of modifications to code that they created.

I read that some of these GMO products have only been on the market 10 years and have not existed for more than 20. So it is impossible for these "scientists" to know the long-term effects.

15
idanman 20 hours ago 1 reply      
It sounds suspiciously odd that people want to boycott information. I understand if you don't care whether a product is a GMO, but to prevent other people from finding out, and to dictate to them what they can and cannot eat, sounds elitist to me. I want to know where my food came from and make my own decision. This article smells of astroturfing (https://en.m.wikipedia.org/wiki/Astroturfing).
16
swehner 58 minutes ago 0 replies      
Looks and smells like a well-funded marketing campaign. See also this comment, https://news.ycombinator.com/item?id=12894370
17
mark_l_watson 13 hours ago 4 replies      
We have the right to have our food correctly labeled. Then everyone gets to decide for themselves.

To me, the pro-GMO crowd seems to be about taking away my right to have my food correctly labeled. This is a morally indefensible position, regardless of the safety of GMOs.

18
colordrops 21 hours ago 0 replies      
On a tangential note, there seems to be a large amount of propaganda and astroturfing regarding GMOs, and questioning them often leads to being unfairly attacked. Be wary when reading comments as to what interests might be served by various sides of an argument. Nothing is beyond inquiry.
19
mmanfrin 17 hours ago 0 replies      
I find the first section exceptionally problematic and misleading, it reads in part:

 As a reminder, the only items for which a GMO counterpart is currently available to consumers are: alfalfa, canola, corn (field and sweet, but not popcorn), cotton, papaya, potatoes, soybeans (but not tofu or edamame varieties), sugar beet, and squash. Genetically engineered apples and salmon will be available soon.
The implication is "you should not worry, because there aren't many GMO products out there anyway"; but this glosses over the fact that soy and corn constitute major ingredients in many, many products (many of which you would not necessarily think contain either).

20
rm2889 11 hours ago 1 reply      
Nassim Taleb has been very vocal against GMOs. Here's a summary of his beliefs.

http://www.fooledbyrandomness.com/FictionAndFacts3.pdf

21
Glyptodon 19 hours ago 0 replies      
I avoid GMO products to a certain degree, but not because I'm concerned that they may be unhealthy or dangerous.

A food crop could be modified to be dangerous I'm sure. And likewise I'm sure one could be selectively bred (over enough generations) to be dangerous. If someone engineers Sarin-secreting corn or something we'll figure it out pretty quickly and the issue won't be it being GMO, it will be human malevolence. But I really couldn't care less about that.

Rather, I like seeing GMO labels because they're the closest analogue for determining whether my food is proprietary. I am incredibly against reproducing organisms that are a living patent violation, and I don't want to support such things if possible. That said, I would much prefer to see a marking indicating that none of the product was patented, copyrighted, licensed, or otherwise encumbered.

Another thing I think that perhaps gets (erroneously) mixed into the GMO debate is biodiversity. Many people have concerns that monoculture is dangerous and somehow mix and/or conflate non-GMO with heirloom/native varietal foods.

Obviously the shadow over all of the above is the "organic" label, which blurs things a little bit further. But the primary consumer for organic overlaps with the primary consumer for non-GMO, which overlaps with the primary consumer for local/heirloom/native produce, and the net result is that all the terms get muddied.

Given the above, while I concur that GMO labeling is not ideal as is, and I concur that GMO foods are fine or mostly fine, we don't actually have a sufficient food labeling system. People want to know if proprietary organisms are in their food. They want to know if it was grown with pesticides. They want to know where it was grown. Etc.

In many ways food labels are a product of bizarre anti-consumer compromises. It shouldn't take a rocket scientist to be able to get non-proprietary food grown with low pesticide usage. Complaining about the GMO label doesn't really seem to address this.

(PS: Love getting downvoted in the interval between posting my comment and refreshing the thread... almost like somebody didn't even read it.)

22
sova 19 hours ago 0 replies      
She works at "Integrated DNA Technologies". Not to suggest she has a bias, but her main source of income is producing GMOs. Just something you may want to keep in mind.
23
olakease 18 hours ago 0 replies      
* There are weapons of mass destruction in Iraq!

* Drink milk, it has a lot of calcium and will prevent osteoporosis.

* Sugar is not that bad. Fat is the devil!

* AAA CDO's are a good and safe way to invest your money!

The previous sentences are all lies sponsored by big companies and lobbies, which have/had the seal of approval of various governments and politicians.

Big companies fight for their own interests and politicians fight for the interests of those who "sponsor" their campaigns and offer jobs in their boards once their political career is done.

GMO is a technique. As a technique, it is neither good nor bad. The problem is the big geopolitical and economic interests behind it.

24
bad_user 18 hours ago 0 replies      
> The World Health Organization has recognized food fortification as a beneficial way to deliver nutrients to large segments of the population without requiring radical changes in food consumption patterns.

The article had my attention until I read this line. Sorry, but the WHO doesn't have a good track record when it comes to nutritional advice, and I actually believe that "food fortification" is among the worst things that could have happened to us. Not only because there's zero evidence that it has led to better health, but also because it's used as a blatant lie when marketing unhealthy products. So now you have children eating vitamin-enriched cereals instead of apples.

And what do you know, we now have an obesity epidemic on our hands that our grandparents didn't have. And guess what else they didn't have? Vitamin-enriched bread or freaking corn syrup in everything. Or the health problems we are confronted with, even without such extensive knowledge about good vs. bad cholesterol.

Let's face it: when it comes to nutrition, can you imagine what would happen if tens of thousands of medical professionals suddenly admitted wrongdoing which directly led to the deaths of an unimaginable number of people? The WHO is never going to recognize that, and that line above is bullshit.

And if you think about it, such fuckups are why many people have lost faith in the healthcare industry, and why we now have a significant minority refusing vaccinations for their children - which is fucked up and stupid, but then again you can't really blame them.

25
ardit33 21 hours ago 1 reply      
While some of the points are valid, I'd take this with a grain of salt. This piece was written by a staff scientist at Integrated DNA Technologies.

She might truly believe what she has written, but she clearly has an economic stake in people consuming more GMO products.

26
MustrumRidcully 16 hours ago 0 replies      
You are not scientists, you are merely technicians who understand little about how nature, and especially soil, works. GMOs and extensive agriculture using glyphosate slowly sterilize soils by destroying the ecosystem that protects them. Thus you are forced to add more and more fertilizer, making the soils more and more acidic. At this point, you are only managing your plants' disease.

Until a stronger disease comes and wipes out all of your DNA clones. Think about the fate of Daedalus, the first engineer, who lost his son after going from poor solution to poor solution.

A forest is a very productive environment, without the need for GMOs or glyphosate. Why?

27
skywhopper 15 hours ago 0 replies      
While I agree entirely with many points about misleading labelling and the relative safety of GMO foods, these sorts of articles inevitably leave out any notion of the actual risks that heavily pro-GMO policies might produce. Such as the massive risk to long-term biodiversity of crowding out non-GMO breeds in the name of "less wasteful" but far less diverse GMO breeds. Not to mention the well-established risk of lowering the quality of the food itself in terms of taste and nutrition in pursuit of shelf appearance and lifetime, pest resistance, and harvesting density, which long before direct genetic modification was possible ruined many common produce varieties.

These are entirely reasonable concerns, well proven out by past experience with agricultural monocultures and counterproductive "improvements" in the shelf product. And so while it's fair and probably justified to avoid the "Non-GMO" certification label itself, it's not justifiable to dismiss concerns about trends in GMO produce overall by quoting "science" that only addresses the safety of GMO produce and not the larger concerns.

28
pbhjpbhj 18 hours ago 0 replies      
That's great and all, and I support labelling to ensure people can make that choice, or some other choice. The big question for me is: why do those who support GMOs not want to let other people choose?
29
droopybuns 21 hours ago 1 reply      
I have two reasons for avoiding GMO:

Flavor

Local small farms don't embrace it.

If local farms did embrace it, I'd still be stuck with the flavor problem.

30
pitaj 12 hours ago 2 replies      
I am just blown away by the amount of anti-GMO sentiment on HN. For a community so vehemently in support of the science around global warming, you certainly do not seem to be nearly as "pro-science" when it comes to the topic of GMOs.
31
entwife 20 hours ago 0 replies      
I don't trust this author, because the claim to be a scientist doesn't speak to her credentials, publication record, or field of work. A data scientist does not have the same range of expertise and credibility as a biochemist. Furthermore, actively avoiding a label that one finds meaningless is non-productive. Should non-Jews avoid kosher labelling, or non-Muslims avoid halal labels? Surely there is a devout atheist somewhere who does; the rest of us just ignore them.
32
soufron 19 hours ago 0 replies      
Well that's some high-level scientist bullshit there.

So non-GMO labels can be misleading, but not labeling GMOs isn't? Wow. Welcome to Animal Farm.

And of course, a non-gmo label doesn't mean anything good. But a cool gmo non-label of course means something positive.

Also, I am a scientist, and I avoid GMO products... wait... actually I don't have to avoid them, since they are banned in the EU and in France. But I guess we are only a bunch of retards.

33
Gatsky 17 hours ago 0 replies      
GMO science gets a hard time because it precisely documents the changes made to nature. It seems illogical to regulate GMOs so heavily but let farmers use antibiotics, pesticides, herbicides, etc. (which have actually been proven to be harmful in a myriad of ways, and alter genetic material in the ecosystem). There also needs to be a sense of magnitude when it comes to harm. The amount of human suffering caused by alcohol, tobacco, and refined sugar is far, far greater than even the most apocalyptic GMO safety disaster, but we don't demand 10 years of safety data for, e.g., a new Mountain Dew flavour or e-cigarettes.

It's also disingenuous to claim that GMOs aren't needed if we adopt sustainable farming, when the economics and scale of industrial farming make this fantasy impossible.

34
guelo 22 hours ago 1 reply      
GMOs don't increase crop yields and don't reduce the use of pesticides http://mobile.nytimes.com/2016/10/30/business/gmo-promise-fa...
35
igl 7 hours ago 0 replies      
Clickbait-mood-piece that has plenty of links to even worse articles.
36
lesserknowndan 11 hours ago 0 replies      
Good thing the products are labelled as being non-GMO or you wouldn't be able to boycott them!
37
cwp 20 hours ago 0 replies      
I do this too. If somebody helpfully labels themselves as anti-science, I'll make sure not to give them my money.
38
Alex3917 22 hours ago 0 replies      
I have no problem with GMO, but will only feed my children food that is certified AI-free.

"Sky, not Skynet."

"Google is not the Creator."

39
exabrial 13 hours ago 0 replies      
If you support AGW but think GMOs are unsafe, you really need to think about why you support science-based conclusions only in certain circumstances.
40
NumberCruncher 17 hours ago 0 replies      
>>The financial, environmental and health impacts of adopting non-GMO ingredients include [...] higher prices [...] and reduced food availability.

You just found the cure for the obesity epidemic! Double win!

41
shkkmo 21 hours ago 1 reply      
I take a very different stance on GMO products than most people I've talked to. I am a supporter of GMO products and think they are critical to protecting the food supply. I also think that any GMO food sold should be required to release the genome under a free/open-source license. I think that our food supply is a common heritage, and the rights to use and work with it should not be restricted to a few corporations.
42
arca_vorago 21 hours ago 3 replies      
I worked at a big ag company with a genetics lab, and later in a DNA sequencing lab (as a sysadmin in both cases), so I feel like I have a decent amount of insight into this issue.

My two primary (I have more) objections to GMO are:

1) Lack of rigorous testing, especially long-term testing. Many of the GMO products in production spent a minimal amount of time in the lab before being selected for production, with little to no long-term testing being done (to find any number of potential issues and knock-on effects that can happen). For example, while at the sequencing company, I learned that it's not just about the genome of the thing itself, but its surrounding microbiome, which of course is almost an afterthought to the big-ag GMO producers.

2) The way in which pro-GMO big ag has essentially corrupted the bodies set up to regulate it. Monsanto is a perfect example of this, but only one of many. They have a Supreme Court justice who has refused to recuse himself in cases with obvious conflicts of interest. They have a massive online PR (read: propaganda) operation to keep these things out of the limelight, or at least remove them quickly. They have basically taken over the most prominent FDA positions. They abuse K Street beyond what even most of the K Street abusers do. It's this kind of systematic corporate corruption that makes me sympathetic to those who are anti-GMO.

I also have a huge issue with the whole patent system when it comes to genomes. The system is broken enough for normal tech, but there have been some dangerous precedents set that I don't have time to get into, and they will have major consequences as sequencing and manipulation costs fall and the technology therefore becomes more prevalent.

Now, all that being said, I see a ton of FUD on the anti-GMO side that is easily refuted. The problem is that I far too often see the illogical/irrational anti-GMO FUD used as a strawman by pro-GMO people against the more reasonable and scientific anti-GMO arguments, which seems intellectually dishonest - very much like I find articles like this.

I could go on, but just wanted to offer a quick two-cents.

Oh, and don't even get me started on the horrible state of peer-reviewed science. As a layman with a huge interest in science, I held the process up on a pedestal - that is, until I saw it from the inside. Bad science abounds and is rarely challenged, which only gives fuel to the anti-intellectual/anti-science movements, which is the last thing we need as a species!

43
stratigos 21 hours ago 0 replies      
Smells a lot like the same industry propaganda over and over...
44
Obi_Juan_Kenobi 20 hours ago 0 replies      
Wow, there's a lot of reactionary downvoting in this thread, specifically for comments that are critical of any anti-GMO sentiment.
45
jahbrewski 17 hours ago 0 replies      
Don't let the double negative trick you like it initially tricked me!
46
mschuster91 22 hours ago 0 replies      
> Many vitamins and nutrients used for enrichment are derived from genetically modified microorganisms. Others are derived from crops, such as corn, that are genetically modified. The Non-GMO Project bans the use of micronutrients derived from these GMOs. As such, there are documented instances of foods that have lost their vitamin content after changing their manufacturing process to meet the Non-GMO Project's certification requirements.

I believe that many people are not against using GM bacteria/fungi/... to produce stuff (lots of medicine is manufactured that way, for example); they just don't want GMOs in the fields because:

1) they are (rightfully) afraid that unexpected side-effects happen when modified organisms are able to spread beyond the field

2) they do not like the stranglehold that Big Ag can impose on farmers (which is more a socio-economic issue, but nevertheless valid given the actions of e.g. Monsanto https://www.theguardian.com/environment/2013/feb/12/monsanto...)

47
hackaflocka 15 hours ago 0 replies      
> It can also be redundant, since the USDAs organic label already excludes GMOs

Quick, what's the logical fallacy here?

48
EpicWaves 20 hours ago 3 replies      
17
How to contribute to an open source project on GitHub davidecoppola.com
431 points by vivaladav  2 days ago   92 comments top 28
1
jhchen 2 days ago 4 replies      
This guide is heavy on the mechanical side and misses a lot of important substantive parts, if your goal is to add value to an open source project.

Don't just create a fork, branch, and submit a PR without context. First, make sure the intent of your change is actually desired. Just because someone opened an Issue does not mean that it belongs in the project. Anyone in the world can open a Github Issue for any reason. Instead engage and discuss the Issue first and make sure it's actually something the project wants.

Don't just start writing code. Familiarize yourself with the codebase. This comes naturally if you are a user of the project: you will run into bugs, learn the software's behaviors, and discuss Issues and features with maintainers. There are far fewer right ways to build a feature than possible ways.

Finally, understand that your contribution is not "free" for the project. It takes time and consideration to even look at your PR and even more to code review it. The more popular the project, the more true this is.

2
csl 2 days ago 4 replies      
There is one very important tip that's missing: Follow the original coding style exactly.

Not just spaces vs tabs or block styles, but idioms and other idiosyncrasies, too. Why? Imagine reading a source repo where every second block uses different bracket styles, mixing spaces with tabs and so on. It's going to look like a kludgy mess, and will be distracting to read.

There is no correct style for most languages (perhaps `go fmt` might be an exception), only opinions.

3
brobinson 2 days ago 4 replies      
I dislike the idea of using 'origin' for my own remote name.

I keep 'origin' as the canonical remote and my local master branch tracks origin/master. I use people's usernames for their remotes (including for my own).

If I'm pushing a feature branch to my own remote:

 git push -u myusername mybranchname
If I need to checkout someone's PR, it's:

 git remote add theirusername git@github.com:theirusername/repo.git git fetch theirusername git checkout -t theirusername/repo
I've seen people at work who are new to git/Github struggle a lot with the 'origin'/'upstream' differentiation recently, especially when they're learning branching, and they don't seem to have any problems once I switch them over to using 'origin' + usernames.

4
mholt 2 days ago 0 replies      
I just want to emphasize that if reviewers ask you to make changes to your pull request, it is not a rejection or lack of appreciation. As the maintainer of an open source project, I greatly value contributors who will iterate and iterate until the change is accepted, and often, I will give them push rights ("collaborator" status) to pay it forward.

And if a change is rejected, it's usually because there was not enough discussion beforehand about how to solve the problem, the change itself did not undergo enough discussion/iterations, or the change is not really a solution to a problem. (It's not the maintainers saying "Go away and never come back" -- more like, "Thank you for your effort! Please approach this differently.")

5
hzoo 2 days ago 1 reply      
This is a great start!

Contributing can be a lot more than just PRs though:

- answering questions on Stack Overflow, chat (IRC, Gitter, Slack)

- creating a minimal code repro, checking for duplicates, checking if a bug is fixed in a later release/master branch

- writing tutorials/usage scenarios, giving talks, just using the project and providing feedback

- helping with documentation + website

- translations if possible

- reviewing other PRs

- helping with the changelog, testing prereleases

- adding to the discussion on issues

Bigger projects can have a pretty hard time with maintenance: fixing bugs, juggling PRs, making releases, answering questions, etc.

(We're looking for help on https://github.com/babel/babel and trying to figure out how we can make the project more contributor friendly!)

6
aban 2 days ago 0 replies      
If you'd like to avoid switching to the browser in your development workflow, I recommend checking out Git-Repo [0] that was posted to HN a little while ago [1].

Git-Repo basically tries to put as many steps of the contribution workflow in terminal as possible, by using the API of the git hosting services. It currently supports GitHub, GitLab, and Bitbucket.

[0]: https://github.com/guyzmo/git-repo

[1]: https://news.ycombinator.com/item?id=12677870

7
cthulhuology 2 days ago 3 replies      
For many projects, GitHub is just a place to publish yet another public repo. Using GitHub issues and pull requests is a surefire way to feel ignored. If you want to contribute, e-mail the lead maintainer. Do not submit patches to the ether. Do not think anyone will look at your patches. Having started several large open source projects, and started / worked for a number of open source companies, I can tell you the best way to get involved is to work on your personal relationships with the other developers. If that means hanging out in IRC or Slack, that's what it takes. GitHub is a terrible form of communication, especially when your org / developers have 100+ repos.
8
atemerev 2 days ago 5 replies      
Step 1: stumble upon a terrible bug (or that really obvious missing feature that _should_ be there) in your favorite library / framework / app.

Step 2: rant about it on HN / Github issues / whatever.

Step 3 (optional): try to reach developers on GitHub and get the obligatory "pull requests are welcome" response.

Step 4: In frustration, clone the repository, fix the damn bug and submit your pull request.

Steps 5..41: have an angry and emotional discussion with the devs, who refuse to accept your PR because of broken binary compatibility / regression tests / coding style / your choice of variable names, etc. Fix all these issues and resubmit the PR until it's accepted.

Step 42: And this is how you become a contributor to high-profile projects like Docker, Akka, Spark etc, and now free to boast about it in your CV!

9
majelix 2 days ago 1 reply      
Of these, 1: Choose the project you want to contribute to (and 1.5: Choose the issue to work on) and 9: Follow up are the hard ones. Both are primarily social problems.

For 1, it's mostly about knowing yourself. What projects interest you, and where can you contribute?

For 9, it's convincing the owners that your contribution is a net positive. Start with 2: Check out how to contribute, and proactively reach out so your pull request doesn't come out of the blue.

Oh, and be willing to put your ego aside - it can be tough to defend your work, particularly if you're a new contributor (and thus haven't built up trust). It gets easier, both as the project learns to trust you and as you learn to work within their practices.

10
TAForObvReasons 2 days ago 0 replies      
> The way people (usually) contribute to an open source project on GitHub is using pull requests.

I disagree with this premise. The way people usually contribute to an open source project on GitHub is creating an issue or adding to a discussion. IMHO this is more valuable than actually writing code because it helps other developers gauge the relative demand for a feature/bugfix and sometimes you find out that other people have already solved the problem in their own forks.

11
chmike 2 days ago 0 replies      
There are a few steps missing. Before the push, we need to pull master and rebase the branch on top of it. See the triangular workflow on this page: [https://github.com/blog/2042-git-2-5-including-multiple-work...]
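
Concretely, the pre-push step looks something like this - a sketch assuming the canonical repo is configured as the upstream remote and the work lives on a my-feature branch:

  # bring the canonical master up to date
  git fetch upstream
  # replay the feature branch on top of it
  git rebase upstream/master my-feature
  # update the fork; --force-with-lease refuses to clobber commits you haven't seen
  git push --force-with-lease origin my-feature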
12
Whackbat 2 days ago 0 replies      
For anyone that is new to contributing to projects using git for version control I strongly recommend giving this tutorial a read: https://www.atlassian.com/git/tutorials/syncing

Additionally, the best way to learn git is to use it so try all the examples.

13
winterismute 2 days ago 0 replies      
Why did this get so many upvotes? It's well written, but isn't it just a trivial guide on how to do a pull request? Not trying to be controversial, I'm just genuinely curious.
14
fphilipe 2 days ago 0 replies      
I can highly recommend hub [1], which does steps 3, 4, and 5 in one command, run from inside a clone of the repo (as far as I can tell, hub fork takes no repository argument):

  hub fork
1: https://github.com/github/hub

15
dibanez 2 days ago 0 replies      
I've been doing this without thinking about it for a while, and after reading it from a beginner's perspective it seems like quite a few steps. It is the "right" thing to do though, as far as I know.
16
tbarbugli 2 days ago 0 replies      
I would add "Make sure the maintainer of the repo will merge (or even look at) your PRs". More than once I've had very reasonable PRs (bugfixes) waiting to be merged forever.
17
infodroid 2 days ago 0 replies      
There is a lot of overlap with Github's own guide to Contributing to Open Source [1] and Forking Projects [2]:

[1] https://guides.github.com/activities/contributing-to-open-so...

[2] https://guides.github.com/activities/forking/

18
matiasz 2 days ago 0 replies      
It's amazing that there aren't many articles like this one. I wrote something very similar [1] last year because I simply couldn't find a complete, step-by-step guide that addressed the details of forking, branching, etc.

[1] http://www.matiasz.com/2015/04/16/contribute-open-source-rep...

19
exhilaration 2 days ago 2 replies      
Is there a breakdown of top projects by language? I'm a C# developer and would love to put my skills to work on an open source project, but how do I find one?
20
joelhooks 2 days ago 0 replies      
Here are some screencasts on this topic if that sort of thing interests you: https://egghead.io/courses/how-to-contribute-to-an-open-sour...
21
OhSoHumble 2 days ago 0 replies      
> Hopefully some of the project mantainers will check your pull request and will give you feedback or notify you they decided to merge your changes soon.

Ah, my experience is that I'll submit a PR and it'll just be ignored until the end of time.

22
smegel 2 days ago 0 replies      
7. Work on your contribution

8. Write tests!

23
CharlesMerriam2 2 days ago 1 reply      
Wow! When you want to fix a typo, it can be as little as two hours!
24
evantahler 2 days ago 0 replies      
Y'all can contribute to www.actionherojs.com whenever you want!
25
ThomPete 2 days ago 2 replies      
But what about designers?
26
dorianm 2 days ago 1 reply      
Or in a lot of cases you can use github's edit button ;)
27
awkward_yeti 2 days ago 2 replies      
What if I want to start contributing but just don't know which project to choose? This assumes that I already know the project I want to contribute to.
28
i9182u79ikjnsk 2 days ago 1 reply      
Shudder. This embodies all that is wrong with open source development these days.

Horrible GitHub workflow: corporate logos, octop^Hcat, CoC shoved in your face on every commit.

I'm currently forced to contribute to a GitHub project, it is the most annoying, bureaucratic and brainwashing workflow I've ever experienced.

18
Unsplash Beautiful photos free to use under the Unsplash License unsplash.com
440 points by tambourine_man  1 day ago   75 comments top 21
1
cyberferret 1 day ago 4 replies      
Unsplash is a really cool resource. We actually use it (paired with another 'sister service' called Unsplash.it) to provide ever-changing and semi-interesting 404 error pages for our web app... I blogged about how we do it a while back: http://devan.blaze.com.au/blog/2015/11/3/errors-dont-have-to...
2
Cbeck527 1 day ago 2 replies      
I was recently featured[1] in collection #127, and as a long time user it feels really awesome to give back and let others use my work.

I've also been to a few of their NYC meetups and it's clear that the site is backed by an amazing community.

1 - https://unsplash.com/collections/curated/127?photo=jYYpTndzo...

3
kardos 1 day ago 2 replies      
How does the Unsplash license [1] differ from the Creative Commons Zero license?

[1] https://unsplash.com/license

4
df3 1 day ago 1 reply      
Unsplash is a great resource.

It's important to point out that "free" and "royalty-free" aren't the same thing. Unsplash images are actually in the public domain, whereas "royalty-free" is a license type where an image can be used multiple times for one payment.

5
Ahmed90 1 day ago 1 reply      
Always great quality. For fellow web devs out there: give http://unsplash.it a try for easy, fast, and beautiful placeholder images during development.
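
For example, assuming the unsplash.it URL scheme of width and height as path segments, with ?random returning a fresh image per request:

  <img src="https://unsplash.it/800/600/?random" alt="Random 800x600 placeholder">
  <!-- grayscale variant, via the /g/ prefix -->
  <img src="https://unsplash.it/g/800/600" alt="Grayscale 800x600 placeholder">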
6
fpgaminer 1 day ago 0 replies      
Here's a quick script I put together which downloads a random image every hour and sets it as your wallpaper. It only works with GNOME 3/Unity/Cinnamon. Adjust line 5 for different resolutions (currently set for 1920x1080) and line 7 for a different update frequency:

https://gist.github.com/fpgaminer/bdd493ce84eafb7886e08d20c2...
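
The core of such a script is only a few lines. A hedged sketch (the gist above is the real thing; this version assumes GNOME's gsettings interface and the unsplash.it random endpoint):

  #!/bin/bash
  # Download a random 1920x1080 photo and set it as the GNOME wallpaper.
  WALLPAPER="$HOME/.cache/unsplash-wallpaper.jpg"
  curl -sL "https://unsplash.it/1920/1080/?random" -o "$WALLPAPER"
  gsettings set org.gnome.desktop.background picture-uri "file://$WALLPAPER"
  # Refresh hourly via cron, e.g.: 0 * * * * /path/to/this-script.sh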

7
bdcravens 1 day ago 0 replies      
Apple products on a distressed wooden table, laid out perfectly yet supposedly naturally positioned, with an open paper notebook: check
8
josephg 1 day ago 0 replies      
6 months ago I wrote a couple of little scripts to download new Unsplash images into a directory every 6 hours. Then I pointed macOS at that directory for random wallpapers. The whole thing is great - it's a source of tiny delight throughout my week. It's also a small step toward making my workspace feel more hackable.

The whole thing was super hacked together - I'm sure there are nicer solutions around, but I'm plenty happy with what I have. Details here if anyone wants to copy what I did: https://josephg.com/blog/shiny-background-images/

9
sirodoht 1 day ago 0 replies      
Fun fact: when this was first posted on HN there were mainly negative comments, about yet another website in a market with many players.

Lately it has become a favorite site for many people. So, just another incarnation of the Google story. <3

10
ars 1 day ago 0 replies      
From the name I thought these photos were free to use as long as you agree never to have a splash popup on your site :)

Maybe someone could actually do that.........

11
hiimnate 1 day ago 0 replies      
I love Unsplash. They also have a Chrome extension to show a random image in your new tab page. Would recommend.

https://chrome.google.com/webstore/detail/unsplash-instant/p...

12
seanwilson 1 day ago 0 replies      
Great resource. I'm curious how this impacts photographers though, given there are so many sources of free images now. Can you make a living creating and selling stock photos?
13
Hondor 1 day ago 0 replies      
Wonderful to see more CC0 use compared to a lot of "free" art on the internet that burdens the users with keeping track of attribution requirements and including the license text and all that tediousness.
14
philfrasty 1 day ago 2 replies      
How do they make sure the submitter of the image is actually the rightsholder/owner?
15
Esau 1 day ago 1 reply      
Great site but they must use a heck of a lot of bandwidth. How do they stay afloat?
16
yatsyk 1 day ago 0 replies      
Great resource.

Can somebody recommend a similar resource with unprocessed images? Most photos here are toned or converted to black-and-white.

17
tunnuz 1 day ago 0 replies      
I shared some of my best photos on Unsplash, and I plan to use it in the future. It is a great resource.
18
rokhayakebe 1 day ago 1 reply      
To think these guys started with 10 photos.
19
ChrisNorstrom 1 day ago 0 replies      
There goes my evening. Seriously, Thank You for this. This is the best free photo collection I've seen so far and I scout a lot of photo collection sites.

1) I LOVE how you group photos by subject/topic instead of just randomly posting photos and asking the user to search for what they want. Most of the time users don't know what they want and would rather just browse and look around. Browsing lists and collections is more entertaining, engaging, and useful than what other photo sites do: drop off the user in front of a search box and ask "what do you want?". That's like asking someone "tell me everything about you". It forces the user to engage in some serious mental gymnastics and fatigues them. Collections like yours are easier on the brain. Just pick a pretty picture and browse all the pretty photos in that collection. Love it.

2) The photography is beautiful and looks authentic, rare, and avoids that "generic stock photo" feel. These photos look like they're out of somebody's "rare find" folder. They are gorgeous and ready to be used with minimal photoshopping.

3) Most of these already have color correction and filters applied. Did your site do this? Or did the photographers?

Unique. Useful. Going in my bookmarks. Thanks for this.

20
tschellenbach 1 day ago 0 replies      
Big fan of Unsplash, great resource!
21
raz32dust 1 day ago 1 reply      
It is obviously great for users. But I am kinda sad for the artist photographers. I don't think photography should be done for money, and I doubt any photographer would make real money off landscape and generic photos. But at least having a chance of making money via sites like 500px is a good thing in my opinion. Some additional incentive for them to keep trying.

When talented individuals who are well off give their talent away for free, it makes life harder for other talented individuals who might not be well off and might have only that talent. It looks like services are the only thing that will be monetizable in the future. Actual products will all be available free of cost. I think it will drive down the quality of the best products while driving the average quality up.

19
Nyancat on the touchbar github.com
380 points by matt2000  2 days ago   158 comments top 19
1
m0dest 2 days ago 6 replies      
To provoke maximum outrage, you should show AdMob mobile banner ads on the Touch Bar. The first app to do it wins a free press cycle.
2
awalGarg 2 days ago 13 replies      
So... just about the touchbar and not this particular project:

The touchbar is on the keyboard, almost perpendicular (and often at an obtuse angle) to the screen. So I have to move my head up and down all the time. This is not an issue with the regular keyboard, because it has physical buttons which don't change their meaning, so I have memorized the layout and can use my muscle memory.

To avoid moving the human head up and down, maybe they should have put it adjacent to the screen, just below or above it, you know. And while they were at it, maybe they should have just merged it into the screen seamlessly... Oh, and while they were at that, they could make the entire screen a touch-screen instead of just a small bar. Just imagine... a touchscreen laptop. Now that would be amazing, right?

3
hanief 2 days ago 2 replies      
Apple's HIG[1] on the Touch Bar is actually great. You can sort of understand the reason Apple put it on the MacBook. You can at least envision two interesting things:

- Touch UI for scrubbing/scrolling through content faster.

- Dynamic shortcuts for the most-used commands that were previously only accessible via the keyboard.

[1] https://developer.apple.com/library/content/documentation/Us...

4
antishatter 2 days ago 1 reply      
As silly as the touch bar looks, this type of project gives me hope for its usefulness.
5
gedy 2 days ago 2 replies      
I'd consider the touchbar more useful for UI interaction if it were instead in either:

- The trackpad

- The bottom 1" of the laptop screen

6
ryanbertrand 2 days ago 3 replies      
My big issue with the Touch Bar is that it will be an awkwardly far reach when I'm at my desk with an external monitor and an external keyboard. It's really only useful when I work outside my office. :(
7
Pxtl 2 days ago 1 reply      
I want to see a music app on it so somebody can do a sweet theremin solo on that thing, keytar-style.
8
StreamBright 2 days ago 0 replies      
I was worried that the touchbar would not have any legitimate use. On a slightly more serious note: is there anything this could do that, from the user's point of view, is worth the pain of removing the esc and fn keys? I can hardly think of anything.
9
DonHopkins 2 days ago 0 replies      
They should sell boxes of adhesive bluetooth touchbars by the dozen, so you can stick them all over the place.
10
billconan 2 days ago 3 replies      
I thought about putting Super Mario on that bar.
11
davesque 2 days ago 0 replies      
That's pretty much all it's good for.
12
ovao 2 days ago 0 replies      
That about sums it up.
13
Pica_soO 1 day ago 0 replies      
Imagine Microsoft Clippy providing great advice to the users of the touchbar.
14
maxaf 2 days ago 1 reply      
The touch bar will be what finally compels Bloomberg to build a Mac-native Terminal. Mark my words.
15
taurath 2 days ago 2 replies      
Dammit, literally what I was going to do, but my MacBook ships in another 3 weeks.
16
cobbzilla 2 days ago 1 reply      
Could someone who has the Touch Bar tell me: is it possible to lock the function keys in place? Does the user or the app have ultimate control of its little screen?

edit: I say this as someone who loves nyancat, and would love to have a nyan-mode on/off toggle somewhere, if I had a TouchBar.

17
JoeDaDude 1 day ago 0 replies      
For those of us who will not buy a MacBook anytime soon (the one I have is working just fine), show us the YouTube video!
18
chb 2 days ago 0 replies      
Thanks, I'll wait for Cannon Lake and LPDDR4.
19
Fej 2 days ago 1 reply      
Hasn't this meme kinda... played out at this point?
20
I don't like computers happyassassin.net
439 points by cheiVia0  3 days ago   263 comments top 77
1
m_fayer 3 days ago 4 replies      
This struck a chord with me.

Unlike the author, I think I still like computers, but only in their essence. I like programming, the detective game of debugging, learning new paradigms, getting lost in abstraction, the thrill of watching powerful automation doing its thing.

But I don't like what computers and the internet have become. Without constant mindful adjustment, all my devices inevitably become attention-grabbing pushers of just-so packaged bits of media. I don't let that happen, but that's clearly their essential inclination. Keeping this at bay feels like swatting away the tentacles of some persistent deep-sea creature.

I feel everyone's attention span eroding. I feel people packaging themselves for social media, opening their self-image and self-worth to the masses. I see a flood of undifferentiated information, the spread of hysteria and belligerence, the retreat of quietude, humility, and grace.

This is all downside, but lately I'm losing the upside. While I still love the technology underneath it all, more and more I feel like I'm working in the service of something that's driving humanity collectively insane.

2
mouzogu 3 days ago 2 replies      
I agree with the sentiments. I don't agree with the notion of the good old days, however. It only takes 5 minutes on a PC running Windows 3.1 to remind me how much of a pain those days were at times - at least in comparison to now.

You see, the difference is that I was much more patient and tolerant then. Now, thanks to the Internet, I have become very impatient and anxious, and my attention span has dropped almost to zero.

I hate these things. The way technology has changed me. This is why I have grown more and more uninterested in technology and all its promises. Even though, if I were being honest, we have never had it better in terms of the range, options, and diversity of the field.

I think technology has made me a worse person. More informed but less interested. It's given me more opportunities at a time when I feel most exhausted and apathetic. Perhaps this is normal considering we are going through the "internet" revolution. A lot of changes. Many of which I don't like within myself and society in general.

3
clarry 3 days ago 5 replies      
I kinda share the feeling. Well I still like tinkering with some things that nobody else seems to care about. But most of the time it feels like doing stuff with computers is just fighting the new technology (which I don't care for) and then there's politics, copyright & contracts, things that further try to ruin it for me.

For the most part I can't get excited about any of the news about software, programming languages, new services or big tech corps. I look at the front page of HN, yawn and move on. I don't care what Apple is doing, I don't care what Google is doing, I don't care about your new JavaScript framework or microservice, nor about your new OS, I don't care about a new smartphone or laptop...

The few things I find interesting are things I keep to myself, because every time I've tried to start a discussion about them, nobody else seems to be interested. Or it may even be met with hostility.

4
bitL 3 days ago 4 replies      
I really think there was a hidden shift in the past 10 years from dreamers trying to implement things that help humanity to asocial computing whose only purpose is to extract more money than humanly possible. Before, these dreamers had an edge, as the power-hungry people didn't get it; now they get it and use it to extend their power, even using these dreamers as disposable means to reach their goals. It's difficult to get excited about that.
5
dasmoth 3 days ago 1 reply      
I'm guessing a similar-ish age to myself.

Besides the internet, one thing that's changed is that computing has become a much less solitary activity: in the 90s and 2000s we were still seeing the tail end of the microcomputer era which was very much built by individuals hacking away on stuff at home or in tiny businesses -- and when larger businesses hired "microcomputer" (and to some extent PC, and web) people, they still worked in very much the same way.

Today, the IT workplace is all about "teams and practices", and even if you're working on something intensely personal as a side project, there's still a degree of expectation that if you want it to amount to anything you need to get it out there as a collaborative, open source project. Or a company with other people involved.

At least for introverts, computing used to seem like something of a refuge. That's definitely less true today unless you deliberately do something that's totally personal.

6
kleiba 3 days ago 2 replies      
I guess I can relate to the general feeling, but I would say that it's not computers I don't like, it's the internet. And if you look at the bulleted list in the original blog post, you see that most of the things listed there are internet related - probably, as one might argue, because computers == internet these days.

But that's why I say I share the feeling: what drew me to computers when I was little was the tinkering with this fascinating machine that did as you told it (so you'd better tell it the right things, or else it would end up in a mess without mercy). It was a time when you felt you could still reach a point where you're actually in control; computers were still simple enough that one person could pretty much understand all of it.

This is no longer the case today. The complexity of the modern IT landscape is just intimidating. You couldn't possibly feel like you could one day be in control or on top of things anymore. Everything's changing, everything's growing at too fast a pace to keep up.

Therefore, if what drew you to computing in the first place was a personal connection and interaction between yourself and the machine, it's no wonder that that magic has gone now.

7
ensiferum 3 days ago 2 replies      
Heh, I have no problem saying that I'm looking for a career switch. In fact I've been doing software engineering professionally for 15+ years now, and quite honestly I'm sick of it. Have been for years now.

I still enjoy programming, but only when I get to program my own hobby projects and focus on the parts and problems that I find worth solving. I don't enjoy the SW dev work at work: doing stuff that I don't care about (or the world doesn't care about), solving problems like fixing build files, having a shitty tool that crashes, or having all these stupid useless (meta) problems and the general nonsense prevalent in the IT/tech industry. Just as an example of what I mean, the other day I was having a problem with automake (WARNING: 'aclocal-1.14' is missing on your system.) when building protobuf (not going to get into details, it's very obscure). My motivation for this kind of (nearly daily) crap is at an absolute zero. I'm sick and tired of it all.

The only reason I'm still doing this is that I haven't figured out what would be a feasible alternative job for me and which direction to go in.

Overall I feel like this job has changed me as a person as well. I'm extremely cynical these days about anything related to tech/IT. But hey, at least I have a great taste for cynical and sarcastic humor now (for example, Dilbert)!

8
eludwig 3 days ago 1 reply      
This is a normal part of growing up. It sounds like the author is perhaps in his mid-30s? 40ish? Am I close? Actually that doesn't matter at all, because this happens throughout your entire life. It's happened to me several times.

The secret to human interests is that they have an arc. A beginning, a middle and an end. Are you still doing the same things you were doing when you were ten? Maybe, but maybe not. I'm certainly not. There were no computers when I grew up. Well maybe a few ;)

It's natural to be bummed out when your interests (work interests, love, play, etc) change. It feels weird and uncomfortable, like we are losing something. It feels bad. You wonder if you are in a deeper funk...like real depression. Will it return? Is it a phase? You don't know.

The best way I have found to deal with this is just to watch. Observe. Hmm. I'm really not feeling this today and haven't for a while. That sucks. Don't get too caught up in it. Let the feelings rise and fall. Keep noticing. What is it that I do get turned on by? Well, I'd really like to be reading right now. So make time and do it. Let your urges take you where they will. Trust them. Let them lead you towards something that does it for you. The author seems to have that covered. He (she) is aware of things that are interesting. Keep doing these things. Let the things that interest you reveal themselves. Have faith in this cycle. It does eventually resolve itself.

I realize that this whole deal is tough due to responsibilities. Family, etc. People are counting on you. You have bills to pay. Appointments to keep. Keep them. Stick to the routine while you explore. This is important, because learning about yourself is easier when the external drama levels are low.

You will know if this course works, because you will feel better. If you still have angst and it is getting worse, then you may need to talk to a real person (a whole other kettle of fish).

My advice: listen and watch. Do what you need to while exploring what makes you happy.

9
MrQuincle 3 days ago 2 replies      
A different angle, but I don't like computers that are in my face, costing me time rather than giving me time.

I don't like games. I don't like VR. I don't like AR. I don't like television. Also reading HN too much makes me feel empty.

However, I do like smart things that do stuff for me and get out of my way. I really like waking up in a warm bedroom while the rest of the house is allowed to be cold. I like the convenience of telling Alexa "play something relaxing" when I come home from work. I like having to clean a little less thanks to a Roomba. I like not having to switch off stuff because it's done automatically. I like an AI to schedule my appointments.

Every computer that minimizes my interactions with computers or gives me time, the most precious resource, I like!

10
noir_lord 3 days ago 1 reply      
My GF laughs at me and says that I'm a terrible techie: outside of computers and programming, I'm just not that interested in technology anymore. A lot of the fluff around IoT seems just that, fluff. I'm excited by the possibilities in terms of things like city management, but I couldn't care less if I can turn my lights on when I walk into my house or if my toaster is connected to the internet.

I don't buy gadgets. I own tablets to watch the odd movie and for device testing, otherwise I would only have one; my phone is a 4-year-old Nexus 4 whose back I broke and covered in black electrical tape (I could replace it but I don't care enough to do so). I use a 17" Vostro for working the odd time I can't be at my office or at home, and it's dented and has stickers stuck all over the scratches; I'm not even sure I remember what the stickers were for.

I'm just not excited by new hardware like I used to be. I only care when it'll have a demonstrable impact on my enjoyment of programming. Where once I'd have lusted after the latest and greatest, at the moment I couldn't even name the best model of i7 or whatever; I only care about that stuff when I'm building a new desktop.

What does excite me is how technology is having a meaningful impact on people's quality of life.

I think in a way that's just part of getting older (I'm 36).

That, and every time I interact with technology that isn't one of my Linux machines, I come away feeling like I should hunt down whoever wrote the software with a bat and some bad intentions. One of the downsides to being a programmer is that the deficiencies of everything are so much more obvious.

Prime example: I bought a LiFX 1000 bulb (WiFi/IoT bulb) to put into a ship's lamp as a Christmas present for the GF. It took me 45 minutes to set the thing up. I followed all the instructions to the letter, nada. Then I thought "I wonder if changing the wireless channel might work", and lo and behold, changing from channel 13 to channel 9 made it work.

Nowhere was that documented in the instructions (which I read), and had I not been a techie I'd never have thought to try it. My point being: where once I'd have thought "this is cool", now I just resent the 45 minutes I won't get back.

11
marcusr 3 days ago 2 replies      
I'm guessing I'm a similar age to the author, from the reference to parents shouting about the phone bill from my 1200/75 modem running all night. And just recently I've felt exactly the same way. My job involves both running systems and writing software, and the joy has disappeared from both. I used to work all day, then come home and hack all evening; now I don't know if I'm burnt out, but I can find no interest in making computers do cool things any more.

There's been one small bright spot - I tried learning Haskell and loved the way functional programming stretched my brain, but there's an awful lot to learn before you can do anything useful. But Elm, wow, do I love Elm. I feel the excitement I felt when I saw Ruby on Rails for the first time ten years ago. It's finding something interesting and useful to build with Elm that I'm struggling with now.

I wonder if part of the general malaise is the message that if you're not building a product that will become a unicorn company, then it's not interesting.

12
pmyjavec 3 days ago 3 replies      
"Somewhere along the way, in the last OH GOD TWENTY YEARS, we along with a bunch of vulture capitalists and wacky Valley libertarians and government spooks and whoever else built this whole big crazy thing out of that 1990s Internet andI dont like it any more."

It was great fun before it all got so serious. Very funny and true ;)

13
redsummer 3 days ago 1 reply      
A lot of people think the Amish are against technology. In fact, they carefully consider the technology's effect on themselves and their community. Will it really help, or is it just a new thing which will cause unintended consequences? For instance, some Amish groups accepted cars, and their community disappeared - when anyone can drive anywhere the community collapsed. Now there are no Amish who allow cars. The same thing would happen with the internet.

Tech people like ourselves automatically assume that technology is some kind of advance or improvement. Our peers tell us this; our incomes depend on us believing it.

In fact, technology does not improve the human condition in most cases. It erodes it. We would do better making careful decisions like the Amish, but our civilisation is locked onto this myth of 'progress'.

14
teekert 3 days ago 1 reply      
OP is getting older. And, like me, you move from nights of Gentoo tinkering to Arch, to 30-minute Ubuntu LTS installs and hoping the default config files are what you need. And in the future I see myself buying a Synology.

A child is intrinsically motivated to play; you lose this as an adult. No biggie, but your shit just needs to get the job done, and the job is not learning as much about the shit as you can. Such is life. You have other things to do now, like raising a kid, and getting enough sleep while doing it.

As with life I learned a lot when young, taking the time to learn the stuff that I still use now. Perhaps computers extend the playing age because they are intellectually satisfying for much longer than other forms of play, but eventually you're done playing.

15
LeoPanthera 3 days ago 0 replies      
I find solace by retreating back into my childhood. My home office contains a collection of obsolete yet comfortable pieces of hardware. A BBC Micro, an Amiga, a Twentieth Anniversary Macintosh, a MAME cabinet, and a small collection of pinball tables.

I can happily spend hours immersed in the past, and when I'm done, returning to modern digital life is somehow refreshing.

See also, the Computer Chronicles YouTube channel: https://www.youtube.com/user/ComputerChroniclesYT

16
neals 3 days ago 0 replies      
I just wanted to add, that whenever I've worked too hard for a few nights straight and find myself on the edge of that dreaded burnout... I start to "not like" things.

I know you say you've been watching your hours, but burnout maybe doesn't just come from hours.

For me, it's the first signal I need to do something when I start to feel there's no food that is really tasteful anymore, there's no games I like playing and there's no job or person in the world that could possibly make me happy.

I get that's not the issue of the post, but maybe it's something to think about for all of us?

17
erikb 3 days ago 4 replies      
I think all technological revolutions were big game changers, and people who didn't follow them lost a lot of advantages. It is the same with this one. Now if you are 55 years old (and this guy seems to be) and you have saved enough money to stop caring, then go ahead.

But don't misunderstand: this is a luxury that you need to be able to afford. If you don't have rich parents, or saved enough money to live without the internet, you must must must find a place in the internet world that you can stay at (e.g. some FOSS 1990s-style mailing list), at least find some way to use social media (GNU social or G+, anyone?) in some reasonable way, and have some kind of internet presence (e.g. a GitHub page and some FOSS projects you contribute commits to).

Really, ask yourself whether you can afford the luxury of ignoring it. Politics always talks about the gap between rich and poor that gets bigger and bigger. But the same is true for the gap between people who take part in the internet and use it to their advantage, and those who ignore it. Both these gaps already overlap to some degree, and that overlap will continue to grow!

18
jordigh 3 days ago 4 replies      
I don't have a phone (mobile or not). I especially do not want to carry a pocket computer around with me, but that's just because it would make me feel so powerless to have a device that can track me, that I can't hack, and whose capabilities I can't be certain of.

Am I just old? I'm in my mid-30s. According to Douglas Adams, that's kind of the age at which new things are just perceived as being against the natural order of things. Kids these days are being raised thinking that talking to Alexa and having it bring back accurate results is completely normal and natural.

Are there young people out there who think modern pocket computing is just plain wrong? Do they have any second thoughts about putting their entire life online under the control of 3rd parties?

19
glaberficken 3 days ago 1 reply      
IMHO the feeling the OP is describing is nothing specific or exclusive to computers or the internet.

If you listen to or read people who work in all sorts of fields, you'll find this feeling is quite common.

You made a career out of a hobby you really enjoyed. And after a few years it became your work and you no longer enjoy it. You now find joy in some other activity. That new thing? That's your hobby now.

I got this impression after years of thinking of throwing myself into video-game journalism or bicycle mechanics as a profession (2 of my favorite hobbies). When I started speaking to actual video-game journalists and bicycle mechanics, I immediately noticed that I couldn't find a single one who still enjoyed his respective activity anymore.

I'm not going to try to play "psychology expert" here, but for me the reason seems to be pretty simple: those people could no longer spend their time playing the video games they liked or riding and fixing their own bikes. They now had to play all the games they were "told" to play and on top of it take notes and write meticulously about them; the bicycle guy now had to work on a bunch of strangers' bikes he didn't care about, keep up with a bunch of new bike tech he actually thought was needless bullshit, and sell bullshit Lycra shorts and stuff like that.

To this day (37yo) it's one of the decisions I think I got "the most right" in my life: not turning one of my hobbies into my job. (Curiously, this runs right against the common advice "Take what you are passionate about and make that your life's work.")

20
arximboldi 3 days ago 3 replies      
This article was very touching. I'm 28 and I feel exactly the same way. I have spent quite some time thinking about the topic. I have even used the same words in conversations with friends.

There is a generation of people that got into computers because they were a tool for empowerment and creativity. When I was a child, my younger sister would create movies by editing frame by frame in MS Paint, while I would learn Pascal to make a sequencer to play "melodies" using the PC speaker bell commands. Her friend would learn HTML to create a manually updated blog where she would post fantasy short stories. On the Internet, we all hung around with nicknames in chat rooms and learned to make flashy websites and get through the chain emails from relatives. We needed no Netflix or Facebook to share stuff; we had P2P and email and IRC. Then we learnt about GNU/Linux: the ultimate tool to get control of our machines. It was all organized chaos, instant communication that no one could control, limitless creativity, the ultimate dream of a post-capitalist anarchist society...

At some point, some got to believe that if only these tools would become mainstream, the mainstream would adopt these values. A techno-revolution!

This overestimated the transformative power of technology. What happened was otherwise: technology is now mainstream and has become a tool for social control and the ultimate frontier of consumerism. Tech didn't change society, society changed tech...

I still want to believe in these utopian values. But I understand that it is a long way traveled in little steps whose significance is hard to see while at it. In the meantime it's often tiring and lonely to live in the computing underground. One has to explain to people why you don't have a smartphone (and it gets harder to reach people without having WhatsApp and so on), one has to explain to relatives why you don't want to work for BigTechCorp, and while trying to stay "up to date" one has to go through the angry rants of Apple users on HN [1], or the celebration of the new Micro$oft facelift, and the collective systemic submission in the startup world in this new gold rush...

The hardest part for me is to find stuff that I can do well and that I find valuable to the world... and still get paid for it. And I am a Software Engineer, the profession of the future! How can I be so obnoxious as to have plenty of well-paid jobs around me and not be interested in them? This makes me very sad and makes me feel deeply alienated...

---

[1] You are not angry because of the design of a computer; you are angry at the realization that you are so personally invested in a technology that you have no control over, but that has control over you!

21
qwertyuiop924 3 days ago 1 reply      
I love computers. I hate the BS: I don't care about that new Silicon Valley project. But I do care about the new project that's going to change how we think about computing. I care about the programming language that will show me a radically different way to program. I care about the tool that's so elegantly designed that it takes 5 minutes to explain how it works, and does its job amazingly well.

I care about things that remind me why I got into computing in the first place: For the sheer joy of it.

22
coldnebo 3 days ago 0 replies      
I can sympathize, and there are areas of computing that are tiresome and never seem to get better, but there are just way too many things on my bucket list: unbiased rendering, physics, AI, mathematics education, visualization, theory of computation...

Heck, there are so many research projects out there completely changing what it means to compute (e.g. Bret Victor), let alone the rediscovery of what the founding scientists (e.g. Turing) had as their original vision (did you know Turing generated music from his computer, decades before the first synthesizer?!). Or Bell Labs, or PARC.

There is so much to know and so very little time to even scratch the surface. Maybe I'll get bored later, but right now there are things to do!

23
quickben 3 days ago 0 replies      
I'm 35, with a good balance and desire for life. I want to share that you too should:

- read your Marcus Aurelius

- listen to some Alan Watts

- you are not alone, nor the first person to get existential; many before you did and many after you will. Detach from technology from time to time, and spend some serious time reading about who people think they are, and what all this is about.

24
jay_kyburz 3 days ago 0 replies      
I suspect I am of a similar generation, and I still love to tinker and make things happen on my computer.

But Twitter, Facebook, Netflix, Spotify, Snapchat, or Uber have nothing to do with tinkering or creating something.

I also don't want to surround myself with the Internet of Things, because I know how insecure and broken everything is. I'd rather buy appliances that I can leave to my children when I am gone than buy new ones every two years.

I'm still perfectly happy with my 5-year-old MBP. I hope it will last another 5 years - even more with luck.

25
snarfy 3 days ago 2 replies      
I felt that way years ago.

I liked video games. I wanted to make video games. To do that requires programming a computer. OK, how do you do that? Let's go down that rabbit hole. 30 years later and I'm still going down the rabbit hole. I haven't made a video game yet, only bits and pieces and some mods, but at this point, I don't really like video games much anymore. So now what?

I've been doing a lot of hardware, electronics, arduino and general maker stuff. I still like making stuff, but it doesn't have to all be on the computer, and it doesn't have to be a game. I'm more interested in how a HAM radio transmitter works than the latest js framework these days.

26
hellofunk 3 days ago 1 reply      
My main problem with computers is how they have infected global society such that human social behavior now revolves around how computers work.

Other advancements, like the automobile, also changed society, but at least once you leave your car and are having a real conversation with someone, your car won't suddenly take your attention away.

27
darrelld 2 days ago 0 replies      
This hits the nail on the head of a feeling I've been having for years now.

Computers used to be fun and exciting. Learning how to program was a hobby for me. Opening up that Gateway 486DX to tinker with its insides and swap parts out was fascinating. Yes, they were big, bulky, and prone to failure, but it was still fun.

Nowadays everything is integrated, soldered on, and you need a laundry list of tools to even open the casing. It used to be you could just get a star-head screwdriver out of the kitchen and be done.

All of the whining about the new MacBook made me sigh. Get over yourselves.

The internet was a fascinating place before. I remember frequenting forums like http://www.hack3r.com/ and reading a Python tutorial, finding a mistake in it, and chatting on IRC with the writer, who treated me as an equal, not just some 13-year-old kid. On other forums there was debate and deep conversation. Trolls and extremism were not welcome.

Now we have Twitter and Facebook where it's all "me me me" and no one is having real conversations anymore. Just short bursts of them.

Now when I start a project and someone asks me "Why are you using <INSERT NEW TECH HERE>?" I roll my eyes and groan. It's a single-page website with a form on it. I'll code it up in Notepad++, capture the form data and save it to a database, thank you very much. I'll get it done quickly and commit it to source. No need for me to spin up Node, Grunt, Yeoman, and whatever other shiny thing you're talking about. I'll be damned if I even use jQuery.

I've been daydreaming about going into another field where I can still maintain my standard of living, and then program again just as a hobby at home. Anyone know of a good field outside the tech industry where a developer mindset would thrive?

28
sz4kerto 3 days ago 0 replies      
Neither do I. Well, I do like my main computer; I enjoy having lots of RAM, CPU power, and a large monitor, because I spend a lot of time with it.

But.

I've had an iPad for 2 years that I've almost never used (got it as a present). I don't want a smart fridge. I run without monitoring myself all the time. I don't play on my (otherwise high-end) smartphone; I only use 6-7 apps.

The reason: I realized that this stuff is not _that_ smart yet. When I use its "smartness", it consumes more time than the non-smart things. My wife and I use a simple post-it for grocery lists, because opening Trello is much more complicated than just picking up the pen when I realize we don't have any more garlic in the fridge.

I still enjoy hacking things for the sake of hacking, but that activity is not 'sold' as something smart that will save me time. It doesn't save any time, just makes me feel good.

29
theparanoid 3 days ago 1 reply      
It was a shock talking to a colleague and realizing I used to have his enthusiasm; now I want to own a nightclub like jwz.
30
gravypod 3 days ago 0 replies      
Leave your job, become a welder, a package delivery guy, or pick up fixing cars.

It seems like you have one real hobby, but no one should have just one real hobby.

Welding is really rewarding. You're working with a really dangerous machine to turn 2 pieces of metal into 1.

Working in the world of the first-class courier is great too. A lot of time driving or traveling around where you live.

I didn't realize how fun working with motors is until I tried fixing something in my car. I'm trying to get my hands on a motorcycle that's broken so I can rebuild the engine and further learn how they work.

Also, if you're fed up with the internet and still want to communicate with people, become a ham and learn all about RF propagation and other important things. Really fun, and one of my favorite hobbies.

31
SubiculumCode 3 days ago 1 reply      
I also don't like computers anymore. It's from being on them all the time. I want life apart from the screen. Then I get bored.
32
jdhzzz 2 days ago 0 replies      
While I'm somewhat in this camp - I don't do Twitter (or Instagram, or Snapchat, or WhatsApp...), I don't have Sonos, and I only lurk on Facebook to find out what's up in others' lives - I still really like working with computers full time. But you can take the following from my cold, dead hands:

podcasts (listen to radio in the car? what year is this?), mp3/iTunes/Amazon Music/Pandora/Google Play Music/Spotify, and a phone with a fingerprint scanner.

What I am over is being a sysadmin on any of these devices.

An upgrade to Android 7 breaks my phone-to-car-radio connection? Ugh. Spare me the lost Saturday afternoon researching, tinkering, worrying that I'm going to brick the radio. Forget it and learn to live with only a Bluetooth connection.

Swapping out the drive in my laptop for an SSD? I suppose, but I'm unenthusiastic. Do the same for my wife's laptop? Nah. She's not that sensitive to disk lag, and the disruption and subsequent "It never did this before..." aren't worth it.

Perhaps I'm just getting old...

33
nul_byte 3 days ago 0 replies      
I appreciate where this guy is at.

After work you should do what you want to do. If that includes sports, going out, eating nice food, etc., good for him. That is a balanced lifestyle, and worth the effort to maintain.

I am kind of the opposite, though, when it comes to computers & IT and wanting to retreat a little. I started my coding career at 43 years old. I have worked in tech all my life, so the industry is nothing new to me, but I was never in engineering / software development. I was more of a Linux / network admin / systems integration engineer or run-of-the-mill network architect (lots of time in PowerPoint and Visio (yuck!)). I kind of always had a healthy envy of developers, as I knew they were working within the real guts of computers and creating things. I was always the one trying to mop up the mess of a less bright developer who managed to get something dire into production. All this made me even more curious to get into that area myself.

With the advent of cloud, namely OpenStack and all the other devops-y applications in the ecosystem such as KVM, containers, Vagrant, Ansible, Puppet, etc., I found my *nix skills could be reinvented. I started learning Python, brushing up my shell scripting, learning about serialising data, RESTful APIs, messaging, models, views, controllers, yada yada, and in turn learning lots of new tools, including Git, Gerrit, Travis, etc.

I am now loving what I do and I am super keen to learn more and more, so I do spend lots of spare time now absorbed in writing code and getting up to speed on different tooling available to developers.

Right now my spare time is spent learning Rust, as I would really like to get into systems programming and work in the kernel space on networking-based apps.

It's weird, in that now is the time when I should be just specialising and not being so absorbed (a lot of senior guys do this at my firm; they are happy just to sit looking at some spreadsheet or project plan until 5pm and then go home), but instead I really want to develop a new career as a programmer over the next 10-20 years, and I love the idea of that.

I now have a laptop covered in stickers, have grown a big beard, and I go all gooey at the sight of some new snazzy framework. My wife jokes about it being a mid-life crisis.

I don't seem to be slowing down either, but in fact going quicker than ever before.

I am with him on Instagram though, and I have no idea what Alexa is.

34
jasonkostempski 3 days ago 1 reply      
"I use computers for...well, I use them for reading stuff. That is, actually reading it. Text. Pictures if I have to."

The other day I started thinking of a way to filter the internet down to text/plain content only. I couldn't find a way to make Google filter on Content-Type; you can filter on filetype:txt, but not all URLs to text/plain content will have that extension. I also looked for an aggregator site that only allowed users to submit links to text/plain content, but didn't find one; I'm thinking about making one. Multimedia, markup, JavaScript and hypertext are all really useful, but they are abused so much that I think it would be better to start with the assumption that they're useless until proven otherwise. I'd rather have to copy and paste a URL to get a picture of a diagram relevant to an article than open the flood gates for inline media, styling and scripting just because it looks a little nicer and saves a few keystrokes.
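
The submission check for such a site could be as small as this sketch: ask the server for the Content-Type and accept only text/plain. The URLs below are placeholders, and a real site would also want redirect handling, size limits, and a GET fallback for servers that reject HEAD.

    #!/usr/bin/env python3
    # Sketch: accept a URL only if the server reports it as text/plain.
    import urllib.error
    import urllib.request

    def is_plain_text(url: str) -> bool:
        req = urllib.request.Request(url, method="HEAD")  # skip the body
        try:
            with urllib.request.urlopen(req, timeout=10) as resp:
                content_type = resp.headers.get("Content-Type", "")
        except urllib.error.HTTPError as err:
            content_type = err.headers.get("Content-Type", "")
        # "text/plain; charset=utf-8" should pass, so compare the media type only.
        return content_type.split(";")[0].strip().lower() == "text/plain"

    if __name__ == "__main__":
        for url in ["https://example.com/",             # HTML: rejected
                    "https://example.com/notes.txt"]:   # placeholder .txt URL
            print(url, "->", "accept" if is_plain_text(url) else "reject")
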

35
andrewclunn 3 days ago 1 reply      
We have Wikipedia, a user-generated encyclopedia. We have countless musicians, artists, and amateur film makers just a few clicks away. We have archives of history from the edge of wax audio recording and papers from the 1800s. Want your old internet back? It's as simple as not creating an account for anything, not bookmarking portal sites, and running an ad blocker.
36
kilon 2 days ago 0 replies      
I am 37 years old. I have been coding since I was 9, though it is not my full-time job, and I have been making graphics and sound with the computer since day one.

The one thing I hated more than I hate Windows and C++ is the green monitor of my first computer, an Amstrad CPC 6128. I was drooling over an Amiga 500, but my father apparently did not want to "spoil" me. Still, I was lucky enough to own a computer; back then it was a luxury.

I have to say I am in love with the Internet; it's just an amazing tool. When I was a kid my go-to knowledge base was a 20-volume encyclopaedia.

Even learning coding was a huge struggle. As a 9-year-old kid I could not afford expensive programming books; fortunately the Amstrad came with a BASIC manual - not the best-written book, but better than nothing.

I laugh when people claim that coding is hard, because I immediately remember my struggle in those days. The Internet would have been a miracle to have.

It's not hard to filter noise. I have made Twitter my news site, with subscriptions to people who offer tutorials and links to useful websites. I watch YouTube tutorials and very rarely cat videos. I don't care about Facebook and most other popular sites.

The things I do not like are mainly that software has become very complex and that it is difficult to keep up with technology, though that is also an advantage. I also hate internet trolls and people being rude.

My iMac is my revenge for not owning an Amiga 500, and now I can have powerful software without even spending money.

I know people take a look at the technology and say "whatever", or, as a comedian once put it, people fly on an aeroplane and complain that they had to wait on the runway for 10 minutes instead of feeling the wonder of flight.

I am the kind of person that is amazed by the ability to fly.

I also find it extremely hard to fathom that my iMac is more than 6,000 times more powerful than my first computer.

It's crazy

just crazy

37
lazyjones 3 days ago 0 replies      
I second this, but I don't think it's just a matter of getting older and less curious, or being fed up with computers due to years of professional work on them. For me, a large part of the frustration comes from having moved previously simple tasks and habits to complicated, complex and unstable computer-based solutions. Not only have the tasks themselves become more difficult and in some ways less efficient (reading tiny text on small displays - no thanks!), but they come with a huge burden: having to maintain an OS, network infrastructure and software updates, manage security risks and privacy considerations, and prevent data loss (make backups). Sure, there are many advantages to our new approaches, but the burden of complexity far outweighs them if you stop ignoring it.
38
andretti1977 3 days ago 1 reply      
I was born in 1977 and have been programming since I was 7. I am a software developer. Seven years ago I had to start freelancing to enjoy my work again. Now I know that work can take no more than 8 hours per day, 5 days per week, no more. This way, computers are still beautiful and interesting.
39
eswat 3 days ago 0 replies      
I've been feeling this. Still not quite sure why. But I think a good reason is I'm starting to see newer technologies come out as veiled hypotheses on how to extract the most time or money out of the user, not so much as things that actually provide real, long-term value to people.

I'm not actively trying to be a luddite or thinking I need to stick it to the man. But I can't shake the feeling that many technologies coming out simply don't care enough about humans to warrant actually being used. That's not to disregard side projects and such; most of the time the creation that comes out of those projects is out of pure intentions. A lot of those same intentions get thrown out the window when money and company survival are thrown in.

40
pesenti 3 days ago 1 reply      
I am the opposite. I used not to care much about computers (I liked math). But I have been blown away by what they enable us to do. And it keeps getting better. Yes, computers allow us to tweet or Facebook, which may not seem like a great advance. But they allow us to send rockets into space and make them come back. They allow a majority of humans to access almost all knowledge instantly. They allow my company to develop new medicine much more efficiently. How amazing!

My advice to the OP: go work for a company that uses your computer skills to do something good, something meaningful to you. It will change your perspective.

41
lugus35 3 days ago 0 replies      
I was in the same mood. Now I've stopped coding for a living (you can try management, presales, ...) and at home I do what I like (exploring data structures and algorithms) with the tools (Emacs) and languages (Common Lisp) I like.
42
digi_owl 3 days ago 0 replies      
The way I see it, once the UX "experts" moved in, the fun moved out. They keep adding layers upon layers of "drywall" to hide exactly how the computer operates, because exposing those inner workings may scare away dear old Aunt Tillie.

DOS and early CLI Linux were straightforward: the kernel booted, some text files were parsed, and you could do your thing.

In contrast, the number of background processes needed to keep a modern DE upright is just nuts. And more are added with every minor release, it seems.

43
addicted 2 days ago 0 replies      
This piece really resonated with me.

10 years ago I had a few ways of keeping in touch with my family in a different country, and depending on the situation, it was usually one way. If I had my laptop (which was almost always the case) and a decent internet connection, we'd Skype. If either party lacked one of these, we'd call on cellphones. And if it was something formal that needed to be remembered, we used email. Or we used some form of IM, but irrespective of which service, I would use Adium.

Now? I cannot keep in touch with my family because my communications are split between WhatsApp, iMessage, Viber, Skype, FaceTime, email, Facebook, Facebook Messenger, Twitter, Snapchat, Instagram, etc. Nothing works with anything else, and simply managing the variety of apps, and the mental calculations to figure out which app to use depending on the context, the people I am communicating with, the combinations of people I am communicating with, the formality of the communication, and the stuff being communicated, is just mentally exhausting.

I am afraid to look at my phone when I receive a notification, because there is always a mental calculation that needs to be made about what I need to do next.

This obviously happened earlier too, but there were fewer expectations of immediate responses, so it was easy to manage, as I would simply read and handle my notifications at regular selected times.

Maybe kids growing up in this environment will be (are) much better at managing this, because their brains get wired this way, but as a 30-year-old, I find it overwhelming, and it destroys my productivity.

44
pjc50 3 days ago 0 replies      
I wonder how much of what we're discussing here is "future shock"; we've lived in a time of extremely rapid change and a high-speed cycle of hype and disillusionment.
45
adamwill 22 hours ago 0 replies      
OP here. Thanks for the interesting discussion, everyone! There's some really great posts in here. I really liked https://news.ycombinator.com/item?id=12879141 and https://news.ycombinator.com/item?id=12884730 especially; that dilemma of not wanting to spend all your time sysadminning things (I've got a phone, four servers, a desktop, two laptops, a router, an HTPC and god knows how many little trinkets to take care of) yet also knowing too much to be OK with just telling Google or Apple to take care of it all (assuming a certain ideology) is a big part of this, I think. (Full disclosure: after getting that post out of my system I ordered a new laptop and spent half of this weekend planning a bunch of changes to my mail server...)

I also really liked the wider culture comments, especially https://news.ycombinator.com/item?id=12879035 , but I wanna emphasize something I wrote in the comments on my blog: this is a very personal post. It's just about how I feel, and I don't think how I feel is 'right'. I think there's actually an awful lot of really interesting stuff happening in all the spaces I don't personally care about - new video platforms and VR and all of that. I'm self-aware enough to realize that part of this is just me wanting the kids off of my lawn. But since it's my blog, I can wave my stick at them as much as I like ;) But I wouldn't want to claim that just because I'm okay with what I know, none of this stuff has value, as a lot of it does.

Finally, just to note that when I wrote that I still like my job, I meant it! The post wasn't meant as a cri du coeur, exactly. I'm actually perfectly fine with this stuff. It doesn't keep me awake at nights. I just do my job, and use the tech I actually want and find useful, and forget about the rest of it. I just wanted to write it down, I guess.

Oh, and since 'guess my age' seems to have become a popular game...I'm 34. :)

46
dictum 3 days ago 0 replies      
I'm younger than the OP (going by the modem speed etc.) and I've been experiencing the same apathy for a while (coupled with similar feelings about visual/interactive design), but I'm slightly more comfortable with it now.

Maybe I learned to deal with my own cynicism, but the turning point was probably when I started looking at my work (and computing) less as a goal/ultimate meaning and more as just another piece in people's lives: a way for them to accomplish non-tech goals.

47
392c91e8165b 2 days ago 0 replies      
I still like computers, but dislike this decade's web.

I wish I could access all of the web's text, images and hyperlinks without running Firefox, Chrome or a similarly massive code base.

I am aware that my wish is impractical even if a philanthropist or a government were to spend 100s of millions of dollars on it.

48
kleigenfreude 3 days ago 0 replies      
I was here about 10 years ago. Sick of technology. No longer wanted to learn about it. Just went into survival mode because of self-loathing over contributing to something that I no longer felt was a good thing.

Here's what I've learned since:

Part of what you are experiencing is real. This will never leave you and will transform you. It is part of maturation. It is natural to start seeing that what matters in the world are its life, its people, its wonder, and its love, and that you have human failings which over and over again will leave you feeling guilt for not reaching a potential. Or perhaps you will transcend this and just be OK with everything, or devote your life to doing everything the best way you can, accepting that you will fail along the way, in a way that limits self-pity.

Part of what you are experiencing is due to your health and circumstance. This is something you can affect. If you are tired, maybe you need more sleep and exercise. Maybe CPAP or an oral appliance from your dentist could help with sleep apnea. Maybe you shouldn't drink before you go to bed as often. Maybe you could see a recommended psychiatrist and get some medication. Maybe yoga, a martial art, tai-chi, or guided meditation would help. Maybe you should read more.

Computers in their many forms, but particularly mobile computers ("phones"), are way too distracting. So is streaming entertainment. Too much of our lives is wasted on them. Go buy a bicycle, or some running/walking clothes and shoes, and get out into nature. Buy a tent, camp stove, ramen, sleeping bag, inflatable mat, and backpack, and go camping.

Feel like what you are doing is B.S.? If you're smart, join Geekcorps and travel to another country doing something cool: http://www.iesc.org/geekcorps . Even the Peace Corps has jobs in dev/IT, like: https://www.peacecorps.gov/returned-volunteers/careers/caree... and https://www.glassdoor.com/Jobs/Peace-Corps-software-engineer... Or if you're an engineer: http://www.ewb-usa.org/

49
runesoerensen 3 days ago 0 replies      
Just going to leave this here (from 49s): https://www.youtube.com/watch?v=ZP6lIM3OAFY&feature=youtu.be...

"You know, I see the look on your faces. You're thinking, 'Hey Kenny, you're from America; you probably have a printer. You could have just gone on the internet and printed that bitch.' Yeah, you know what? I could have, 'cept for one fact: I don't own a printer. And, I fucking hate computers. All kinds. I come here today, not just to bash on fucking technology, but to offer you all a proposition. Let's face it, y'all fucking suck."

http://www.powerisms.com/i-know-a-lot-of-you-guys-have-119.h...

50
erikbye 3 days ago 2 replies      
Not using/liking Netflix, Spotify, Snapchat, or Uber has nothing to do with "I don't like computers".
51
susan_hall 2 days ago 1 reply      
I agree, and I do wish that science fiction writers still had the social prominence that they had back in the mid-20th century, because I think humanity needs a group that thinks about how things could be, and how things should be, and to what extent advances in science could make life more fun.

Part of the "This isn't fun anymore" feeling for me comes from the way the Web has consolidated to a handful of companies (Google, Facebook, Apple, Microsoft...) and what we are being given is what they find profitable.

The loudest voices in the room are those corporations. I'd like to live in a world where the loudest voices shaping our technologies are science fiction writers who are thinking hard about what might actually be useful or fun.

52
dendory 3 days ago 0 replies      
I think there are multiple things there, and I don't think taking any particular stance is wrong. I love technology, in the sense that if I get a problem to solve which makes me dig deeper into an area to figure out how things work under the hood, I really dig that. But I don't use Facebook, Netflix, Siri, Alexa or any of those things. I want nothing to do with the Internet of Things. I suspect this is common among those of us who grew up with technology, as opposed to those who had technology by the time they grew up. They see technology as a service they should always have available in every facet of their lives, while we see it as something that used to be cool and mysterious, but has now been wrapped in so many commercial interests.
53
chridal 3 days ago 1 reply      
I loved this post, and so I wrote a small "reply post". http://valleybay.me/2016/11/05/death-of-the-internet/
54
hackerfromthefu 3 days ago 0 replies      
Yes, yes.

The signal-to-noise ratio of the modern internet has changed for the worse, western/global culture has lost its manners, and what signal is left shows leaders who have either lost their culture or their clothes..

None of these global trends have anything to do with you personally .. those trends are external!

Thus even if you look after yourself, if you avoid burnout from today's frantic pace, if your hardware is ready and able to be inspired ..

Then, to feel that inspiration again, you must really appreciate and nurture the inspirations you find amongst the noise.

Personally I believe the next frontier is hacking and implementing political/social/power cultures and social mores inspired by Libre Values.

55
JKCalhoun 3 days ago 0 replies      
It may be an age thing. I've been coding professionally for maybe 30 years and have been doing more or less the same thing, interacting with machines. I find that, while I too can get momentarily caught up in the chase for a programmatic solution or the hunt for a bug, if I am truly honest with myself, there is very little intellectual curiosity left that might drive me to learn a new language or framework. And yet, I recall having this enthusiasm years ago....

I find too though that when I am left to pursue my own projects at home, on weekends, some of the magic comes back a bit. Perhaps it is just Corporate America that has sucked the life out of my soul when I am at the workplace.

56
arekkas 3 days ago 2 replies      
"The world hates change, yet it is the only thing that has brought progress."
57
Yenrabbit 3 days ago 0 replies      
Reading this made me realize I have similar feelings. But I don't think it's technology's fault - I only really got into computers ~6 years ago, and to me then they were fascinating! I think it's just that we start to use them for work, and sooner or later stop caring about how everything works and start wishing it would all get out of the way and let us browse the web or write our documents. I think it's curable though. A few days ago I dug out my early code, and felt the old excitement welling up again - I'm going to spend some time trying to find that again.
58
creyer 3 days ago 0 replies      
One should imagine the days without computers... waiting days for a letter to arrive... Even if you don't like computers, you might like many of the benefits that come with them... so think of them as a necessary evil.
59
thght 3 days ago 1 reply      
I think you shouldn't dislike computers just because you don't enjoy what other people do with them. But part of the magic of computers and the internet in the 90's is definitely gone, forever, true. But hey, would you prefer to go back to connecting to a BBS with a 14k4 modem? I prefer my wireless 340Mbps broadband modem, really.

Fortunately I do enjoy every new day and can still become excited about new technology, which is emerging all the time. And I truly believe that computers and the internet have become much better and ever more interesting. You just have to be very selective in the vastness of things out there.

60
debt 3 days ago 1 reply      
Congrats, you've reached the end of your programming career.

Sitting at a computer all day actually is quite uninteresting and boring. Realizing that is the light at the end of the tunnel, the big ah-ha moment many programmers have.

It's simply more fun to socialize all day.

Many big projects, say self-driving cars or FB or whatever, don't actually require that many antisocial, introverted engineers; only tens of thousands, so things like that will always get built anyway.

It pays well, but it's not good for your health to sit at a computer all day, nor is it fun to socialize only through a chat screen all day.

Time to take a break.

61
tibu 3 days ago 0 replies      
What I still like is creating valuable solutions through programming. That's what made me a computer maniac when I got my ZX Spectrum, and it's the part I still mostly enjoy - writing some code and watching how others use it.
62
cairo_x 3 days ago 0 replies      
Everything in moderation I guess. A good comedy podcast every now and then is quite therapeutic, but a lot of it can become like being possessed by an insatiable trivia-demon demanding to be fed 24/7.
63
djhworld 2 days ago 0 replies      
One thing I have noticed about myself is how my browsing habits have become the complete opposite of what the Internet was supposed to offer.

My tab bar at the top mainly consists of the same websites I visit every day, HN, reddit, newsblur, facebook, google inbox, youtube. I can't remember if I was any different 10 years ago, but it just feels like I've carved my own bubble and rarely leave it.

Maybe that comes with age, I'm not sure.

64
imode 2 days ago 0 replies      
when hobbyist computing went the way of the dodo, I hit this state, and I hit it hard.

I found that either inventing a small programming language or getting back into microcomputers worked as a cure. small, controllable, programmable systems that feature instant-on programming.

nothing between you and the machine. seems we've lost track of that idea somewhere between x86 and Javascript.

65
tech2 3 days ago 0 replies      
I ended up feeling similarly, no longer hacking at home, no more linux installs on my home machines, etc.

Instead I started on other hobbies, I repair physical things (mechanical, electrical, electronic), I enjoy photography, I work on my car.

I used to make the joke that if computers were ever no longer a thing for me, maybe I'd move to New Zealand and make violins for a living... that time isn't here yet, but I can feel it.

66
nthcolumn 2 days ago 0 replies      
I ditched my smartphone for a flip phone. Apparently it's all the rage, but I don't know how I know that, since I don't use those social media platforms you mentioned either. Just developer forums (or fora) and the like. Good feels from helping people, getting help, and not feeding trolls.
67
anta40 2 days ago 0 replies      
I watch videos on computers. Oh look, another new cute cat video on YouTube... click.

I occasionally read Twitter, usually for news.

I post images to Instagram almost daily. One picture per day is a good way to exercise photography.

I use Uber. Sometimes I'm too lazy to drive.

I still enjoy using computers, especially for coding. Well, to each his/her own.

68
anthk 2 days ago 0 replies      
I've got a #pocketchip, and I am learning Pascal again with fp-ide, and x86 ASM with NASM and DOSBox. Retrocoding is fucking awesome, and writing emulators in Free Pascal in a DOS-like IDE on the Underground is cool and relaxing as fuck.
69
djhworld 2 days ago 0 replies      
I like computers, but can understand some of the sentiment expressed in this article.

What's wrong with podcasts anyway? They can be informative if you want to learn something new, entertaining if you want to be amused.

70
fagnerbrack 3 days ago 0 replies      
That's really interesting.
71
z3t4 3 days ago 0 replies      
When something feels like work, it usually is!
72
apeacox 3 days ago 1 reply      
This is a modern manifesto. I'm almost there, just a bit less, for now.
73
fit2rule 3 days ago 0 replies      
Computers are broken.

Wait, no. Operating Systems are broken.

Wait, no. It should all just be the Web.

But wait, no .. the Web is broken.

Ah well, I guess it's time for something new. Something not-broken ..

74
happy-go-lucky 2 days ago 0 replies      
When did computing become separate from math and other sciences and why? I always think it's integral to math.
75
profalseidol 2 days ago 0 replies      
Too much of anything is bad, just like too much capitalism (which is the root cause of bad blocker bug meetings).
76
owenversteeg 3 days ago 1 reply      
I guess I'm not as far along as the OP is, but I can definitely feel myself getting there.

The meaningless Internet bullshit used to be meaningless, but meant a lot to me; the big news would be a 2% drop in Firefox users or something and everyone would lose their minds. Now, the meaningless Internet bullshit is some site with no vowels and no revenue selling for twenty billion dollars, and it actually means something, because twenty billion dollars is a lot of money in the real world; for a sense of perspective, read this [0] and realize that twenty billion dollars could supply all of those things yearly for _a decade_. And yes, I know that 15 years ago there was a bubble too, but it was a lot smaller. In 1999 there was barely north of ten billion total invested in software by VCs.

I don't know exactly when it was, but at some point I went from excitedly tracking the latest versions of distros, googling "shareware" and installing whatever I could find, getting wrapped up in flamewars, formatting my hard drive every week (and it was a hard drive, not an SSD; in 2010, the price of a 120GB SSD dropped from $420 to $230) - to not caring about any of it.

I think that point was systemd. Six years ago, the initial version was released. That was 2010, and things seemed different. No way I would accept a complex, huge init system on MY carefully tuned $distro_of_the_week.

Today? I'm 100% in support of systemd. It makes my life easier. I have zero desire to tweak a complex mess of init scripts. And sure, I run Arch Linux, but that's mostly because it Just Works (tm) and I'm used to Linux. If someone gives me a Windows box, I won't lecture them on how they're contributing to the downfall of humanity, I'll take the damn machine and write the code they're paying me to write. I shudder at the thought of googling "shareware" and just randomly installing programs, and it looks like I'm not the only one; it seems that trend died... yep, around 2010. [1]

I no longer give friends USB drives of "cool software", and if they gave me one I'd think it's a strange joke. I no longer read stuff like WinSuperSite; I'm sure Paul Thurrott is still churning out the same quality content as always but I have no interest in reading about the latest features of whatever.

[0] https://www.cardonationwizard.com/blog/2011/07/01/unicef-usa...

[1] https://www.google.com/trends/explore?date=all&q=shareware

[edit] Turns out WinSuperSite is gone. Or, technically it's there but it's not Paul. "SuperSite Windows is the top online destination for business technology professionals who buy, manage and use technology to drive business"... wow, that's depressing. The old URLs even 404. :(

77
andrewvijay 3 days ago 0 replies      
Reading the bullet points with an actual gunshot sound in your mind makes it so amazing! Try it.
21
Dalai Lama: Behind Our Anxiety, the Fear of Being Unneeded nytimes.com
337 points by applecore  3 days ago   249 comments top 27
1
sjclemmy 3 days ago 24 replies      
It's odd, the world is the safest and most stable that it has ever been. There has never been a better time to be a human being, and yet the media and certain politicians would have you believe that there is disaster after disaster, terrorist plots afoot around every corner, feeding your fear day after day. If ever there was an example of the devil at work, then that's it (and I'm not even religious).

Uncertainty about the future has always been the same.

Don't listen to those devils.

2
xherberta 3 days ago 3 replies      
Yes, the poorest among us are materially well-off, by any historical comparison. Yet the article speaks to how we're doing psychologically and spiritually. It's a little glib to tell people they'd better get happy and appreciate how much better we have it than ye olde folks of yore.

Trump has gotten this far by stirring up rust belt fears of human obsolescence. Less-educated white males are seeing their place in society disappear. If you don't think "the rest of us" have to care about that, wait and see the response as autonomous vehicles take out the trucking industry.

Even if Trump loses this time, the same underlying sentiments will produce more Trump-y candidates in the future, unless something can be done to create real change and ways for all sorts of people to be active and valued participants in our society. It's important to recognize the pain and suffering behind the fear of being unneeded.

Interesting questions abound:

If prescription anti-depressants worked, wouldn't Americans be the happiest people ever?

We already know behavioral advertising changes one's perception of oneself -- does that shift affect well-being?

What's the psychological cost of commodifying one's special moments as social media posts?

How do we engage undocumented kids who feel sidelined by the lack of paths toward rewarding careers?

How do our attitudes toward birth, death, and caring for the very young and very old work against well-being?

Why do we measure economic health of our country without reference to the distribution of ownership of capital? (We carefully measure the number of wage slaves, but who's measuring the number of people who own productive enterprises? Stocks are too far removed from "productivity" and shouldn't count.)

3
trynumber9 3 days ago 4 replies      
"In America today, compared with 50 years ago, three times as many working-age men are completely outside the work"

I was taught that work is one of only two uses for a man, the other being a good father. It must be wrong, but it is an unshakable thought. I can't imagine how I would hate myself if I couldn't provide. This fear, even though I have no dependents, is far greater than my fear of being unneeded.

Maybe it's different for other people.

4
NTDF9 3 days ago 2 replies      
Anecdotally, I've been thinking about this. "Being needed" can also be looked at as "having a sense of purpose".

Today, the only "Sense of purpose" is wealth accumulation and debt reduction. There is very little we can do with:

- Family (marriage and divorce rates are way too high)... not permanent

- Community (everyone lives in distant houses and neighbors change frequently)... not permanent

- Kids (nobody can afford one), parents (either divorced, separated, or in an old-age home)... not permanent

- Friends (too little time to socialize, too much competition, keeping up with the Joneses, too much moving around, hard to make new ones)... not permanent

There is very little satisfaction in fighting for the above because NO effort toward the above leads to PERMANENT satisfaction (except maybe wealth). This demotivates any real sense of purpose for anything besides the digits in your online accounts... which really isn't the same as "being needed".

5
dschiptsov 3 days ago 2 replies      
That's a gross oversimplification for western amateur audiences. His Holiness is trying to speak the language of the common western consumer.

Behind our anxiety is ignorance. Period. Ignorance is described as a veil that obscures the view of what is from so-called primordial awareness or Atman, if you wish - the aspect of the whole (Brahman) in us. (The Buddha explicitly rejected the notion of Atman, but numerous later Indian (tantric) writers messed everything up). It is like continuous day-dreaming. The meaning of Awakening is literal awakening from this habitual delusional day-dreaming to what is.

Most of our fears are due to misapprehension of reality, like classic mistaking of a rope for a snake, and our attaching to (and hence the fear of losing) that illusory my self, which is nothing, but an appearance to ignorant and confused introspection.

This crude formulation is as old as Upanishads.

6
neves 3 days ago 2 replies      
I'd like to read articles like these every day. We are drowning in hateful speech. I would feel better reading something like this.
7
sekou 3 days ago 1 reply      
A question that comes to mind: what is the future of "meaningful work" if technology is poised to render sustenance-based work obsolete? The utopian idea is societies where education, arts, and entertainment are what we consider to be "work." It's something that's been talked about since the industrial revolution, but assuming it's a multi-generational shift, how do we set course to ensure a sense of well-being in the human species?
8
orasis 3 days ago 4 replies      
This is the big missing piece with basic income: People need meaning in their lives as much as they need money.
9
cossatot 3 days ago 0 replies      
It's an interesting source. Tibetan society pre-1940s (most recent Chinese invasion) was incredibly feudal. The Dalai Lama was the top of the feudal society, both monarch and high priest. I've spent a bit of time in Tibet, and the Tibetans dearly love the current Dalai Lama. Now, maybe he's an exception for reasons of being a leader in exile, and a general scholar and nice guy.

But I wonder, in most feudal societies, do the serfs feel needed by their societies and their monarchs?

10
slimypickle 3 days ago 0 replies      
The comments in here remind me of this video:

https://www.youtube.com/watch?v=PsXFwy6gG_4

Eric Schmidt takes the side a lot of people here take, which is "everything is getting better", and Peter Thiel takes the other side, that things aren't getting better.

I don't think things are getting better at all. You can easily just use your own metrics and then come to the conclusion that everything is getting better. The problem is, your metrics are bullshit so you are coming to the wrong conclusion.

I have seen all kinds of metrics used in here like "don't have to hunt", "live longer", "safer", "can watch netflix". I don't think any of these things have to do with whether or not people's lives are getting better.

You can just take one of those metrics - "live longer", and see that in itself, it can't measure "getting better", because you can live for 80 years and have a terrible life, or have 60 years of an excellent life.

I think people in silicon valley are just completely clueless and are making the wrong bet, which is why they never see things like Trump coming. Things aren't getting better and people know it.

11
IllusoryReverb 3 days ago 0 replies      
I am not certain the author of the article has identified the true root cause of the problem. To me, the need to be needed is certainly important, but I do not think that need is the true cause of the anxiety described in the article. The way the western world has reacted to immigration and to the socio-political landscape currently obtaining is not well explained by the need to be needed as presented in the article.

To me, the need to be needed/fear of being unneeded takes a back seat to the idea of 'other'. The otherness of the migrants, of the new persons, the lack of 'connection' with them certainly is a better explanation than the feeling of being superfluous.

My view is that if we, as humanity in general, could only see each other first as individuals and secondly as persons with intrinsic worth - basically treat each other as we would a treasured member of our family, and treat their troubles with the same sense of urgency with which we would treat those of a close friend, brother, or sister in the exact same circumstances - then the world would be a better place for all.

But it feels like this is naïve, not 'realistic' or 'practical'; there is too much standing in the way, at least that's what it looks like to me when I attempt to do so. It gets overwhelming, emotionally anyway. To me, behind our anxiety, as humans, is the fear of what would happen if we truly saw others as though they were ourselves. Really and truly. I am not sure we can do so, because we would realize just how fucked up the world was. I for one would rather not; the shame and fear are too much. I would rather build a wall and tell myself there is not much I can do, I do not have enough to help, I would be disregarding my responsibilities. But I know the truth. I am just scared.

12
Kenji 3 days ago 4 replies      
"And although all the world's major faiths teach love"

That's not what I see when I look into the holy books. Let's be frank here, faith and religion are still the tools to turn ordinary people into genocidal maniacs.

13
MattyRad 3 days ago 0 replies      
This is a refreshing and useful article in light of the upcoming election and political fallout. It's a reminder that the world, despite the media/politics, is getting better and better, and consequently the need for human labor is decreasing. The Dalai Lama doesn't offer any answers, only recognition of the "problem" and hope that we can collectively find a solution.

It's especially important for us as the very people who are actively trying to make human labor obsolete. As a generally libertarian-minded person, I find this a very compelling article for compassion in a relentlessly utilitarian society.

14
habosa 3 days ago 2 replies      
There is a lot of discussion on HN (in this thread and in many others) about what we will do when technology replaces a significant fraction of our jobs. People are talking about a 10%, 50%, or even 99% reduction in the workforce size due to technological advances (mostly AI).

I totally reject the conclusion that the workforce will be significantly reduced by any imminent technology. Look at humans throughout history. We have continuously invented ways to automate (or nearly automate) away the labor behind our needs and desires, but people have been working 40+ hour weeks since the industrial revolution. It seems that our wants and needs advance as quickly as, or even more quickly than, our ability to meet them with technology.

I think a good analogy (for this crowd) is personal computing. Every year our computers get much faster, yet the overall "speed" of the experience remains about the same. Why? Because as soon as we get our hands on a new CPU we go and write a program that requires more CPU power, rather than simply watching our old programs run faster. I don't think there is anyone out there running Netscape on Windows XP on top of a brand new Intel Processor with 32GB of RAM.

I think we are being a combination of overconfident and unimaginative when we think we are on the brink of a technological utopia where DeepMind AI does our job for us. We are unable to imagine the wants we will have in the future. This is by definition: if we could imagine them, we'd have already started working towards them!

I will present a concrete example, one that's overused but still effective. Consider the smartphone. This is a device that is pocketable, affordable to ~50% of the world's population, and can answer nearly any factual query in seconds. I think if you had told someone in the 90s about that they would assume we'd have vastly more leisure time. We'd have fired all the librarians, replaced the teachers with machines, and our children could complete 12 years of education at home in a fraction of the time. But of course this is not what happened. In fact we created more work for ourselves, I bet half the people in this thread are employed by a company that makes software for smartphones. We didn't know we'd want 100 apps and games on each phone, but now that we're here we won't easily give them up.

I don't mean any of the above to sound negative. This is a wonderful thing! This infinite cycle of desire is what has driven us to this point and is what will drive us into the future. But the one negative consequence (in my opinion) is that humans will not soon be a leisurely species. We will keep busy, and we will keep inventing things to busy ourselves with.

15
debt 3 days ago 1 reply      
To be fair though, if you're feeling severe anxiety you should go see a doctor.

I've been seeing a lot of articles on HN lately that are like "just think your way into a good mood, yay" or "here's an arbitrary reason for how you feel, etc." when we all know there are many environmental and genetic factors that play into how our minds develop.

Maybe it's the fear of being unneeded, or maybe if you have anxiety you should talk to someone; it's likely both.

16
carsongross 3 days ago 1 reply      
Any analysis of anxiety that does not distinguish between males and females is going to be extremely hamstrung.

Each half of the species has very different socio-biological failure modes.

17
white-flame 3 days ago 0 replies      
These problems and anxieties stem from a belief that things are unfair. Be it racial biases, economic disparity, or policing offensiveness, many of these complaints boil down to this: In such a "modern" society, shouldn't we be finally past these problems? Why does it look like they're getting worse?*

People contrast the problems they face against the positive strides they see elsewhere, and it foments an us-vs-them mentality, projects a "f- you, got mine" on the positives, and increases outlooks of entitlement.

This is a continually escalating and divisive cultural cancer that's erupting. However, it has little to do with feeling "unneeded" (though obviously in the lost jobs cases that's mixed in), but rather more of people feeling actively antagonized. It's a mess.

(* answer: greater information flow and media that profits from outrage)

18
blueprint 3 days ago 0 replies      
Not necessarily. For most people, yes. But for those with eyes open, the anxiety and agony is that our solutions are needed but we cannot enlighten human societies. The best we seem to be able to do is either fund activities on our own or find those who have kept themselves true enough to be able to understand, recognize, correctly learn, and practice truths. But it's difficult to find those who want to know, so while valuable, this way is quite arduous and lonely. The Dalai Lama may talk about western psychology's abandonment and bonding conditions, but he has not wanted to meet even me, who knows the one truth; he has not been able to realize even half of one of Buddha's teachings.
19
humanrebar 3 days ago 0 replies      
It's odd to see this sort of philosophy/theology posted on here, but since the subject was broached, from the article:

> Virtually all the world's major religions teach that diligent work in the service of others is our highest nature and thus lies at the center of a happy life.

Actually, Ecclesiastes (a great read for all philosophy geeks) is a long discourse on happiness, the point of life, etc. Its final conclusion was:

> Fear God and keep his commandments, for this is the whole duty of man.

The core of Judeo-Christian theology is the supremacy of God and trust that following His instructions is the best way to live the best possible life. Of course there is a lot of teaching about how to care for others, and there's a lot of teaching against selfishness, but that is the result of following a good God, not the point in itself.

Ecclesiastes actually brings up charity as wise, but falls far short of saying it's the point of life. It points out that there's limits to human knowledge. In my wording, help people, because that could be you tomorrow, but the person you save from a fire today could drown tomorrow. You'd need to be able to see the future to utterly help someone, and only God has that kind of foresight. So if you trust that God exists and is good, you should obey His teachings.

> Indeed, what unites the two of us in friendship and collaboration is not shared politics or the same religion. It is something simpler: a shared belief in compassion, in human dignity, in the intrinsic usefulness of every person to contribute positively for a better and more meaningful world.

I'm eager to collaborate productively with everyone. But if you're interested in being my friend, you need to understand where I come from. I value human life because mankind was made in the image of God (imago dei). My eagerness to give human compassion, dignity, and usefulness another shot grows out of my trust in God and his teachings that say to do just that. But I do that in spite of the track record of humanity, not because I have faith in what humans will do in the future. Who knows the future?

There's a bit of an "all religions are the same" meme out there, and it's important to correct it, especially when respectable people repeat it.

20
justratsinacoat 3 days ago 0 replies      
Article is about modern human psychological well-being; comments are about how materially satisfied modern (Western) humanity is, so what's the fucking problem, gosh!? The best part is the latter doesn't actually discount the former, it just trivializes its importance.

Oh HN, never change.

21
zappo2938 3 days ago 0 replies      
I Want You To Want Me [0]. Some of us just want to love and be loved.

[0]: https://www.youtube.com/watch?v=bbw-PVwBU9k

22
the8472 3 days ago 0 replies      
if you have paywall issues: https://archive.fo/yBrfC
23
ChefDenominator 3 days ago 1 reply      
24
SeriousM 3 days ago 1 reply      
25
ChefDenominator 3 days ago 3 replies      
26
quickben 3 days ago 3 replies      
27
forgottenpass 3 days ago 3 replies      
For as prolific as he is, the Dalai Lama doesn't pop up on my radar much. But here he is in the NY Times opinion pages sharing monism.

"This speaks to a broader human truth: We all need to be needed."

There's an old but relevant perspective on life. We value ourselves by how much we can contribute to the lives of others.

Still though, I'm left wondering, what combination of forces summed up to this particular writing crossing my path?

"Leaders need to recognize that a compassionate society must create a wealth of opportunities for meaningful work"

Oh, there it is. A message about the importance of interpersonal relationships to self-worth is co-opted and reduced to nothing more than calling on the government to make sure everyone is feeding their 40 hours into the system.

22
25 Years After Junk Science Conviction, Texas Admits Sonia Cacy's Innocence theintercept.com
310 points by finid  3 days ago   116 comments top 15
1
seibelj 3 days ago 11 replies      
This is something I find extremely scary, because an unusual series of events (accidental fire, accidental drowning, suicide) could be framed to look like something entirely different (arson, murder), which could drag any one of us into a nightmare. If you don't have the money to hire your own experts and a strong legal team, potentially bankrupting yourself, you are at the mercy of a motivated prosecutor with nearly unlimited resources.

In 2009 a scathing report was released by the National Academy of Sciences that essentially says that blood spatter, handwriting, hair, fingerprint, and bite mark analysis are all junk science[0]. If two "experts" can look at the same evidence and come to entirely different conclusions, how is this science? It's opinion wrapped up as scientific fact. Who knows how many innocent people have been convicted. It's terrifying.

An excerpt from Wikipedia about hair analysis:

The outcry from defense attorneys has forced the FBI to open up on disputed hair analysis matches since 2012. The Justice department began an "unprecedented" review of old cases involving hair analysis in July 2013, examining more than 21,000 cases referred to the FBI Lab's hair unit from 1982 through 1999, and including as many as 27 death penalty convictions in which FBI experts may have exaggerated the reliability of hair analysis in their testimony. The review is still in progress, but in 2015, it released findings on 268 trials examined so far in which hair analysis was used. The review concluded that in 257 of these 268 trials, the analysts gave flawed testimony that overstated the accuracy of the findings in favor of the prosecution. About 1200 cases remain to be examined.[1]

[0] http://www.nytimes.com/2009/02/05/us/05forensics.html?pagewa...

[1] https://en.wikipedia.org/wiki/Hair_analysis#Microscopic_hair...

2
mabbo 3 days ago 5 replies      
> In an exceptional move by the notoriously conservative panel, the BPP agreed that Cacy should be paroled, just six years after she was convicted.

She served 6 years before parole, not 25 years behind bars.

I'm far more concerned with Cameron Todd Willingham. Governor Perry had this evidence, that much of the state's evidence was junk science, and did nothing while an innocent man was put to death. Shameful.

3
rdtsc 3 days ago 3 replies      
"Expert" witnesses for US courtrooms is a special kind of a parallel voodoo-science world. Especially when it comes to arson.

Prosecutors like to pick the same people to testify as "experts", and their top qualification is that they have testified before as "experts". I imagine many have optimized putting on an act and throwing around fancy terms to make it seem really precise and scientific. Their future employment depends on that.

4
garyclarke27 3 days ago 1 reply      
Similar junk science in the UK has put many innocent parents away for "shaken baby syndrome", based on a theory not proven by scientific evidence. Expert witnesses who don't agree with the establishment consensus have even been banned from practicing medicine; thus most now refuse to testify.
5
geff82 3 days ago 3 replies      
The nightmare is also that in some countries, when the police knock on the door to arrest you, you might end up killed in a cruel, archaic ritual called "execution", even if you did nothing wrong and the odds were simply against you. Here in Germany I do not have to fear the police. Even if the judges wrongly sent me to jail "for life", at least I'd have some hope that one day I could convince them they were wrong and get to freedom again.
6
rmchugh 3 days ago 0 replies      
The other case mentioned, the Willingham case, is even more horrifying. A man was convicted of murdering his children and sentenced to death on bogus evidence. When presented with evidence to the contrary, the state of Texas under Rick Perry ignored it and allowed the man to be executed. This is state-sanctioned murder of an innocent man. Why is the Governor not on trial for this?
7
draw_down 3 days ago 1 reply      
We need a pretext for what we want to do, which in America is to lock people up. If we can fool ourselves with something that sorta looks and smells like science, that fits the bill perfectly.
8
metafunctor 3 days ago 2 replies      
Is junk science in court rooms a root cause or just a symptom?
9
Tloewald 3 days ago 0 replies      
This reminds me of a New Yorker article on the same topic (covering an even greater injustice, also in Texas -- in fact referred to in this article)

http://www.newyorker.com/magazine/2009/09/07/trial-by-fire

10
johnhattan 3 days ago 1 reply      
Got a friend currently doing time in Texas for basically the same thing. Here's hoping this gets the case some notoriety. http://thearsonproject.org/case-studies/curtis-severns/
11
edblarney 3 days ago 0 replies      
The question is: what was considered 'junk science' at the time it was used in court?

Because I'm sure we are using some 'junk science' we just don't understand at the present time.

12
finid 3 days ago 0 replies      
Sometimes the main qualification of these so-called experts is a certification from a 6-hour or 6-week class. From then on, they are eligible to testify as experts in serious criminal cases.
13
lanius 3 days ago 0 replies      
Gerald Hurst and Chris Connealy are true heroes.
14
gourou 3 days ago 1 reply      
Making a Murderer season 2
15
yuhong 3 days ago 1 reply      
Anti-discrimination laws are even worse, in that discrimination can be found with no evidence at all. One of the methods used to enforce them (particularly in things like hiring) is statistics, most of which assume employees are interchangeable commodities. They were designed back in the 1960s for things like manual labor jobs. I am willing to suggest a compromise to limit them to those kinds of jobs.
23
Why I won't recommend Signal anymore sandervenema.ch
323 points by maglavaitss  2 days ago   330 comments top 48
1
zigzigzag 2 days ago 6 replies      
Like a lot of crypto-puritanism it is rather mixed up. He says he recommended Signal because it was easy to use (more consumer friendly I guess) and secure, then says he wouldn't have gone in the direction of making it easier to use and criticises the things that make it user friendly, like using phone numbers instead of usernames.

He says he thinks the protocol is secure, then says he doesn't want it to use GCM because it routes messages via Google who he doesn't trust (fixing that is the point of the encryption) and then talks about an attack that'd apply to any app regardless of whether it used GCM or not.

He finishes with a call to action: "We as a community need to come up with a viable solution and alternative to Signal that is easy to use and that does in fact respect people's choices ... this tool should not have dependencies on corporate infrastructure"

But like a lot of armchair moralising, he isn't willing to debate the hard choices that go into building successful software. He says it should "respect people's choices" as if Signal is built by people who are disrespectful, he says it should not have dependencies on "corporate infrastructure" as if volunteer-run datacenters actually exist, and then says his motivation is avoiding paywalls, ignoring that both Signal and WhatsApp are free.

It reads like a collection of talking points rather than a coherent argument.

Signal is unusual because it combines cutting edge cryptography with consumer friendliness and is actually successful. It's pragmatic, not ideological. Crypto-warriors have a long history of producing secure software that nobody uses and then blaming the general public for not getting it; this sort of blog post is just a continuation of this decades long trend.

2
clumsysmurf 2 days ago 5 replies      
Unfortunately, Google has made it (almost) impossible to wake up the phone via some external event without using its proprietary GCM. Even though GCM is not part of AOSP, it has a unique status on the platform that can't easily be replicated (without recompiling the kernel, etc., like the article mentions).

Before the days of doze mode & other battery optimizations, you could just listen & block on a socket, then let the phone go to sleep. Incoming 3G packets would wake up the phone, you'd grab a wakelock, then start doing things. From what I remember, at least a while ago, Facebook Messenger did this using MQTT. But this is not possible anymore.
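For anyone who never saw that era, the pattern looked roughly like the sketch below. This is a minimal illustration, not Messenger's actual code; the host name, port, and message handler are all made up. The point is that the blocking read parks the thread while the device sleeps, an incoming packet powers the radio back up, and the wakelock keeps the CPU on just long enough to handle the message:

  import java.io.BufferedReader;
  import java.io.InputStreamReader;
  import java.net.Socket;
  import android.content.Context;
  import android.os.PowerManager;

  public class PushListenerThread extends Thread {
      private final Context context;

      public PushListenerThread(Context context) { this.context = context; }

      @Override
      public void run() {
          try (Socket socket = new Socket("push.example.com", 5222);  // hypothetical server
               BufferedReader in = new BufferedReader(
                       new InputStreamReader(socket.getInputStream()))) {
              String line;
              // readLine() blocks here while the phone sleeps; an incoming
              // packet wakes the radio and resumes the thread.
              while ((line = in.readLine()) != null) {
                  PowerManager pm = (PowerManager)
                          context.getSystemService(Context.POWER_SERVICE);
                  PowerManager.WakeLock lock = pm.newWakeLock(
                          PowerManager.PARTIAL_WAKE_LOCK, "app:push");
                  lock.acquire();  // keep the CPU on while we handle the message
                  try {
                      handleMessage(line);  // app-specific work
                  } finally {
                      lock.release();  // let the phone go back to sleep
                  }
              }
          } catch (Exception e) {
              // a real client would back off and reconnect here
          }
      }

      private void handleMessage(String line) { /* ... */ }
  }

Doze mode broke exactly this: the network is cut off while the device is dozing, so the blocking read never returns and only high-priority GCM messages get through.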

3
tptacek 2 days ago 5 replies      
The author of this post believes that by making a stand over Signal policies he doesn't like (the superficial GCM dep, the OWS-only server policy, the contact list discovery system), something more like LibreSignal will grow to take Signal's place.

The author is wrong. LibreSignal won't replace Signal. Something like Telegram will: an "open source" messaging system with inferior cryptography, "opt-in" end-to-end messaging, a long-term dependency on the telephone system for authentication, and a far "cuddlier" personality with its users and, more importantly, with people from the app development community (like the author). Telegram will continue to gain adoption, because sexy beats sound in every end-user match up. Signal is the closest thing sound cryptography has to a palatable solution for end users.

Iran has already compromised Telegram users, because it systemically trades security off for user adoption. They'll get more of them, and people will hang from cranes as a result.

It's not wrong to criticize Signal. Signal does things I don't love, too! But we should be clear-eyed about the market.

4
SamWhited 2 days ago 5 replies      
I highly recommend Conversations (disclaimer: I've worked on it in the past, although I'm not a project "member" per se): https://conversations.im/

It's open source, uses a federated, open protocol, and can do multiple types of encryption, including OTR and OMEMO (an XMPP wire format that uses the Axolotl ratchet devised for Signal). It does not do VoIP, so it would just be for chat (although there is a large bounty open on Jingle-based VoIP support). It has also had a public security audit, and is designed to be white-labeled, so you can tweak a few variables in the source and build your own hardened version or encrypted-only version, etc.

5
SapphireSun 2 days ago 2 replies      
Essentially this guy is saying: Signal is secure, it's mostly easy to use (with the exception of multiple phone numbers), and the only alternative he mentioned is a half-broken clone. Is he seriously going to stop recommending it to people whose lives depend on secure communications because of some abstruse ideological point? In any case, Moxie's position is a reasonable one, even though there are some arguments for federation.

While my current phone doesn't support Signal, once I get a new one I will continue to use it.

You might opine that allowing Signal clones would allow me to use the app, but they would almost certainly be maintained by people who aren't really crypto experts, and so it's better to operate as though I am broadcasting in cleartext than to pretend that I'm not and get burned.

6
chrismartin 2 days ago 1 reply      
Signal may not transmit any payload via Google Cloud Messaging, but Signal's requirement to run Google Play Services compromises the user's privacy in ways that have nothing to do with Signal. If you run Play Services then you have a device which provides your communications metadata, whereabouts, and device usage habits to Google.

I don't trust Google with this information and don't want to carry such a device, but a handful of friends and family use Signal, so I must choose between easy/secure communication with them, and reducing my exposure to corporate surveillance.

Signal may be "pragmatic" among the current choices (just like the project's decision to use GCM is pragmatic), but OpenWhisperSystems absolutely deserves criticism for:

1. Tying secure communication to running what amounts to Google's spyware on your device

2. Offering no alternative for privacy-conscious users

3. Showing hostility to those trying to introduce such an alternative to the project

I think those dismissing these concerns as "crypto-puritanism" will be on the wrong side of history.

7
codewiz 2 days ago 2 replies      

 "Also, theres the issue of integrity. Google is still cooperating with the NSA and other intelligence agencies. PRISM is also still a thing."
What's this based on? Google immediately denied any association with the NSA and PRISM:

https://googleblog.blogspot.com/2013/06/what.html

Google's chief legal officer claimed that collection was being done without Google's consent:

http://www.irishtimes.com/news/technology/google-outraged-at...

Evidence leaked by Edward Snowden also points in the direction of illegal infiltration of Google's private network without Google's consent:

https://www.washingtonpost.com/world/national-security/nsa-i...

8
reacharavindh 2 days ago 2 replies      
This. I didn't know much about the insides of Signal. But when WhatsApp decided to get in bed with FB to share my contacts and usage, one of the alternatives I explored was Signal. I threw it out the moment it asked for ownership of my contacts (no way to opt out). I for one am not going to trust a guy's pinky promise to be good with my contacts and metadata.

If I'm going to give up the convenience of reaching anybody by WhatsApp, it is going to be at least worth it in the sense of privacy.

Still hoping for a GNU project that garners enough interest to be technically strong, and used universally. One can dream.

9
EugeneOZ 2 days ago 1 reply      
Any messenger tied to a phone number is not safe. Possible attacks are: 1) creating a copy of the SIM card; 2) forcing the mobile operator to intercept the verification code sent to your number and "restoring" the password that way. It may sound ridiculous to you, but in Russia both vectors are a reality; these are real cases from life. And when users really need a safe messenger, all of these apps are too careless to implement a really safe way of messaging. And if you think these vectors are not possible in your country - be sure, we thought the same way.
10
cbsmith 2 days ago 0 replies      
There's a fundamental assumption here: that there is a better way. I'm not saying there isn't, but there's a pretty good existence proof that Signal is the best combination of security & simplicity we can put together.

I would agree with this statement from the article: "there should be a tool that is fully free software (as defined by the GNU GPL), that respects users' freedoms to freely inspect, use, and modify the software and distribute modified copies of the software. Also, this tool should not have dependencies on corporate infrastructure like Google's (basically any partner in PRISM), that allows these parties to control the correct working of the software."

There are such tools. None of them are as easy to use as Signal. So for now, I recommend Signal. I can't, in good conscience, recommend anything else... and given the author doesn't speak to what they recommend, I'm curious about what their recommendation would be.

11
haffenloher 2 days ago 1 reply      
From the post:

"The Google Cloud Messaging service basically handles message handling from/to the users devices to the Signal servers. The GCM service then handles all the aspects of queueing all messages and delivery from/to users."

This is not true. Messages are delivered via Signal's own servers only. GCM messages are empty; their only purpose is to wake up your device. [1]

"The phone component of Signal is called RedPhone. The server component of this is unfortunately not open source [...] this is also probably the reason why secure encrypted phone calls dont work in e.g. LibreSignal"

No. The reason for that is that the signaling for RedPhone calls is currently still done via GCM and not via Signal's own message transport.

Regarding microg: I've never heard of the need to re-compile kernels for that. I think most people use it with Xposed (admittedly, a giant hack, but it works).

[1] https://whispersystems.org/blog/goodbye-encrypted-sms/

12
heavenlyhash 2 days ago 2 replies      
EDIT: this isn't a response to most of the article, but specifically to the "Moving Forward" section, asking about alternative tools.

Come to the matrix!

https://matrix.org/

It's free -- all FOSS, including the entirety of the server -- and yes, all of it; proof by existence: several of my friends run their own.

It federates. I regularly join channels hosted on several different servers, and exchange messages without issue.

It's on every platform. I use it on the desktop, my android (cyanogen, without gapps, none the less!), and my ipad, every day.

It even has voice and video calling built in, using webRTC. This feature has been a little rough while it was in development, but I used it last week in a 1-on-1 call and had an effortless experience. The audio and video quality was on par with Google Hangouts.

Crypto is hard, but it's coming. The Matrix developers have huge respect for the axolotl ratchet design used in Signal. They've worked on making another implementation (in C, for easier linking in various languages, ostensibly) here: https://matrix.org/git/olm/

The deployment of that code to give full End-to-End encryption is a work in progress, but the beta is roughly functional. It includes everything you'd expect: communication works by default, but in an encrypted room, messages are flagged yellow if you haven't explicitly verified the sender's key. There's a key per device; it doesn't leave the device; and as soon as you verify that device/key, messages from it are green, and you're E2E secure.
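For the curious, the symmetric half of that ratchet design is surprisingly small. Below is a toy sketch of the chain-key idea only, not olm's actual API; the full design also mixes in Diffie-Hellman ratchet steps, and the constants here are purely illustrative. Each message key is derived from the current chain key, and the chain key is then replaced, so leaking today's state doesn't expose already-delivered messages:

  import javax.crypto.Mac;
  import javax.crypto.spec.SecretKeySpec;

  // Toy hash ratchet: derive a per-message key, then step the chain
  // forward so old message keys can't be recomputed later.
  public final class ChainRatchet {
      private byte[] chainKey;

      public ChainRatchet(byte[] initialChainKey) {
          this.chainKey = initialChainKey;
      }

      public byte[] nextMessageKey() throws Exception {
          byte[] messageKey = hmac(chainKey, new byte[] { 0x01 });
          chainKey = hmac(chainKey, new byte[] { 0x02 });  // ratchet step
          return messageKey;
      }

      private static byte[] hmac(byte[] key, byte[] data) throws Exception {
          Mac mac = Mac.getInstance("HmacSHA256");
          mac.init(new SecretKeySpec(key, "HmacSHA256"));
          return mac.doFinal(data);
      }
  }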

Disclaimer: I have no direct association -- I became a Matrix convert after trying to write some XMPP client code about a year ago. I'm just really enthusiastic about recommending it because the tech is solid, the sync is good, it solves a problem, and the team hasn't stopped either: they been firing on all cylinders constantly since I started using Matrix.

I love Signal for their dedication to getting encryption right and the security of their users. But yes, I also share a lot of the concerns listed in this article. Most of all, I honestly believe federation is an imperative. So, while acknowledging Signal's history of outstanding security work... Hey, let's celebrate there's more than one game in town working on alternatives.

13
zabuni 2 days ago 2 replies      
"Also, theres the issue of integrity. Google is still cooperating with the NSA and other intelligence agencies. PRISM is also still a thing. Im pretty sure that Google could serve a specially modified update or version of Signal to specific targets for surveillance, and they would be none the wiser that they installed malware on their phones."

Isn't part of the reason that Moxie went with the Google Play Store that he gets to sign the goddamned binaries, making it impossible for Google to modify the app?

14
walterbell 2 days ago 2 replies      
Wire (http://wire.com) has worked well on iOS for encrypted text/files/audio/video. Open-source client, no contact sharing needed. No phone number needed; you can register with email by using a desktop browser at http://app.wire.com, then logging into the mobile app. Group chat for text only. Timed/ephemeral messages for 1:1 text/files. Feature matrix: https://wire.com/privacy/. Could use more documentation (e.g. on retention of encrypted data) but a lot of questions are answered on Twitter or in Github issues.
15
Canada 2 days ago 1 reply      
Nothing is stopping anyone from running their own servers, changing the username scheme, and implementing the voice signaling. Moxie doesn't complain about such usage. But that's more work than simply complaining and telling OWS what they should do.

As far as usernames go, that would require the signaling key to be remembered by the user. That doesn't work well in practice. As far as contact sync goes, has anyone submitted a patch for the Android client to add an advanced option to disable that? On iOS, access to the address book is user-controlled at runtime; destinations will be validated by the server at compose time. Regarding federation, let's see some code. It's ridiculous to demand the small team that is OWS solve every single problem.

16
joecool1029 2 days ago 1 reply      
Can't wait for Moxie to jump into the commentary. :)

>Lack of federation

Moxie's pissy because he trusted the kangbangers at CyanogenMod to keep in sync with his development. They didn't. Someone will need to volunteer to run their own server that's kept updated, then buy Moxie a Snickers and hope he stops being moody.

>Dependency on Google Cloud Messaging

Fun fact: the iOS client doesn't use GCM, it uses PushKit. GCM was chosen for Android because what else is as robust and doesn't eat battery? Moxie's voiced support for WebSockets if someone implements them correctly so he can merge it as a fallback option when Play Services are missing. If you can't code and want it, contribute to the bounty on it:

https://github.com/LibreSignal/LibreSignal/issues/37

https://github.com/LibreSignal/LibreSignal/issues/43

> Your contact list is not private

https://whispersystems.org/blog/contact-discovery/

TL;DR: it's a tradeoff, because nobody has a better idea that works at scale and is usable. RedPhone used to have a good way of blindly doing contact discovery, but it would require too much data for their current userbase.
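The gist of the linked post: the naive "privacy-preserving" fix of hashing contacts before upload doesn't actually hide anything, because the phone-number space is small enough to brute-force. A toy sketch of the attack (the number and prefix here are made up):

  import java.nio.charset.StandardCharsets;
  import java.security.MessageDigest;

  // Invert a SHA-256'd phone number by enumerating the number space.
  // One prefix is only 10^7 candidates; a server could table the entire
  // numbering plan once and reuse it forever.
  public class HashedContactAttack {
      public static void main(String[] args) throws Exception {
          MessageDigest md = MessageDigest.getInstance("SHA-256");
          byte[] target = md.digest("+15550001234".getBytes(StandardCharsets.UTF_8));
          for (int n = 0; n < 10_000_000; n++) {
              String candidate = String.format("+1555%07d", n);
              md.reset();
              byte[] hash = md.digest(candidate.getBytes(StandardCharsets.UTF_8));
              if (MessageDigest.isEqual(hash, target)) {
                  System.out.println("recovered: " + candidate);
                  break;
              }
          }
      }
  }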

17
droopybuns 2 days ago 0 replies      
Animated GIFs were the straw that broke the camel's back?

Let's throw the best available solution under the bus.

This post will be my go-to example of the myopia of some members of the security community. We have very few examples of well-executed, consumer-friendly privacy solutions. Signal is the best for all possible scenarios: open source, user-friendly, buy-in from a major Internet service.

I like wickr, but it falls short due to the closed source nature of the project.

Consumer-friendly, usable security needs to be the number one priority for security advocates. We need to stop burning down houses because they are short a door or are the wrong color. The foundation is the hard part. Wait till there is a real alternative that can be used by people who are not CS majors before you argue that people should stop using the best available solution.

I appreciate the author's perspective and I agree with some of their points. Then they fuck it up by demonstrating purist jackassery. Worth a read as a useful persuasion antipattern.

18
nickik 2 days ago 0 replies      
I have started to read about matrix a lot.

- It now supports e2e encryption.

- It has a nice web and mobile client, called riot.im

- I has many other client options

- You don't need to show any phone numbers.

- Federated, you can host your own server

19
secfirstmd 2 days ago 0 replies      
I've trained hundreds of human rights defenders and journalists over the last 10 years and I will continue to recommend Signal. For too long the community has placed perfect security over usability - there are slightly more secure ways to communicate than Signal but they are far too disruptive to peoples work flows to actually be implemented.
20
latkin 2 days ago 0 replies      
> this tool should not have dependencies on corporate infrastructure like Googles (basically any partner in PRISM)

Free yourself from the bonds of corporate infrastructure by installing this tool on your Google Android or Apple iPhone device (Microsoft Windows desktop version coming soon).

21
qwertyuiop924 2 days ago 3 replies      
There are several projects moving toward this. Matrix is probably the most well-known project, but its crypto isn't actually operational yet, AFAIK.

Tox works now, but for all their talk of trying to be user-friendly, asking users to exchange long alphanumeric sequences inherently isn't.

Psyc, maybe?

22
youdontknowtho 2 days ago 1 reply      
If they are only using GCM as a queue (and the messages are themselves encrypted) I don't understand what the problem is.

They could use anyone for that functionality. Even if the messages are given to an "adversary", what can they really get from that? Your phone app contacted the Signal servers. That's really it.

23
bitmapbrother 2 days ago 1 reply      
>I'm pretty sure that Google could serve a specially modified update or version of Signal to specific targets for surveillance, and they would be none the wiser that they installed malware on their phones.

I'm not sure he understands how app signing works and why it would be impossible for Google to forge a developer's signature. He also seems to have a problem with GCM and Google in general. Perhaps he should look into writing his own secure chat application.

24
joecool1029 2 days ago 1 reply      
Redphone component.

I don't know why it's closed source. It's been suggested elsewhere in this thread that it was potentially kept closed over IP issues. Is it possible a loose US CALEA law interpretation influences the reasoning? Or a gag?

I honestly don't know why they chose to do that, but I wanted to comment to see if a lawyer or someone from the project could hint at the reasoning.

25
rstuart4133 2 days ago 0 replies      
I know the redphone library is just a binary blob in the GitHub repository:

 https://github.com/WhisperSystems/Signal-Android/tree/master/libs/armeabi
But I always thought that .so was just a compiled version of this C++ source, which is in the same GitHub repository:

 https://github.com/WhisperSystems/Signal-Android/tree/master/jni/redphone
I haven't compiled it myself so I can't be 100% sure, but the C++ entry points match the API the Java code is using. I presume it's written in C++ for speed. There isn't much to the C++ bits. It just pumps data through an encrypted RTP connection - CPU-intensive but not particularly complex.

The server code is up there too - in fact it's all up there. AFAICT, Signal is completely open source.

26
gyey 2 days ago 1 reply      
I haven't actually worked with GCM, so please forgive me if this doesn't make any sense. I suggest that, instead of routing all messages through GCM, Signal could send a "wake up" message via GCM and then let the app pull the encrypted messages directly from Signal's servers. A wake-up message would only be sent by the server if the message could not be delivered to the client via normal means (implying that the device is asleep).

An optional user preference could allow some dummy wake-up messages to be sent at random moments during the day, to support plausible deniability, at the cost of slightly worse battery life. This would all happen silently, and the user would only notice a message notification when the app successfully fetches a new incoming message.
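Concretely, the receive side might look like the sketch below. The class and helper names are hypothetical, though GcmListenerService was the real GCM callback class of this era. The push payload is ignored entirely, so Google only learns that the device was woken; the actual ciphertext travels over the app's own TLS connection:

  import android.os.Bundle;
  import com.google.android.gms.gcm.GcmListenerService;

  public class WakeUpListenerService extends GcmListenerService {
      @Override
      public void onMessageReceived(String from, Bundle data) {
          // `data` is empty by design: the push is only a wake-up signal.
          // Pull and decrypt queued messages from the app's own server.
          MessageFetcher.fetchPending(getApplicationContext());  // hypothetical helper
      }
  }

(As another commenter notes above, this is reportedly close to what Signal already does: the GCM message is empty and delivery happens via Signal's own servers.)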

27
lisper 2 days ago 0 replies      
I'm working on a completely open secure communications suite based on TweetNaCl. Proof-of-concept prototype is here:

https://github.com/Spark-Innovations/SC4

Working on a better UI at the moment. Could really use help, especially beta testers.

28
em3rgent0rdr 2 days ago 0 replies      
I didn't have to recompile my kernel to use microG... instead I used FakeGApps with the Xposed framework. Instructions: https://github.com/thermatk/FakeGApps
29
argos-rho 1 day ago 0 replies      
The author offers no better alternative, so I think the article speaks for itself: there's not much to do but whine. These are problems, sure, but they're minor when you consider that Signal is the most secure and user-friendly messenger we have on the market right now. If something takes its place, then great. Otherwise, we will just continue to use what is secure and actually works.
30
RustyRussell 2 days ago 1 reply      
For me, I won't recommend it because of the horrible lack of options when you replace your phone (let alone lose it). No encrypted migration. No backup options. The unencrypted path loses content (images).

Plus there's no way to search old messages.

31
codemac 2 days ago 0 replies      
Signal is about getting people from SMS and iMessage -> Signal. This means that cross-platform communication becomes secure in transit.

Once Signal and others have really wiped out all the insecure messaging people are doing, then we can start with the identity problem with phone numbers. GCM, Contacts, etc are all related to this "phone number as identity" problem.

RCS is an unfortunate grab in this space, and we need to move fast before RCS is the default, and we're back to insecure messaging.

Email addresses are the best form of "federated identification" but are wildly insecure for communication. Here's to hoping we can get some better ones.

32
lonelyw0lf 2 days ago 0 replies      
The truth that a lot of Moxie fans don't want to admit is that he thinks there is nobody better to be entrusted with this project. I don't think this was ever meant to be a community project -- he just opened some parts so he could pretend it is. Also, he is a limelight-hogging security diva who always wants to be in the news and have people talk about him. If he allowed others to contribute and be recognised, he worries they might overshadow him.
33
joesmo 2 days ago 1 reply      
"Otherwise, well be in danger of ending up in an neo-90s Internet, with walled gardens and pay walls all over the place. You already see this trend happening in journalism."

The internet will never be less walled, more free, and more federated than it was in the 90's. With such a poor understanding of the internet and its history, even if he did make a compelling argument (he doesn't), it'd be hard to take seriously.

34
nopcode 2 days ago 0 replies      
Why is the author asking for GPL?

Wouldn't an ISC/BSD-like license be better for the federation aspect?

35
raverbashing 2 days ago 3 replies      
- Lack of federation

Use a federated secure protocol. Oh wait, there are none. Because if a problem appears you just can't fix it without breaking all federated clients. And then they will whine.

- Dependency on Google Cloud Messaging

Fair enough

- Your contact list is not private

Fair enough

- The RedPhone server is not open-source

While it would be nice if it were open-sourced, I can understand them not releasing it (might be for IP issues)

tl;dr: "Signal does not work the way I wanted"

36
ttam 2 days ago 1 reply      
funny enough, I was going to try out Signal today but stopped right after seeing the permissions they request: https://pbs.twimg.com/media/CwhFsLzXcAIDcMH.jpg:large
37
1024core 2 days ago 2 replies      
> Another issue, and a plus for using usernames, is that you may want to use Signal with people you don't necessarily want to give your phone number to.

So, how do you know that the Edward.Snowden@signal you're communicating with is the same Ed Snowden that we all know about, and not some TLA stooge?

38
HashThis 2 days ago 1 reply      
How does Signal compare to Telegram? Would you recommend Telegram as better or worse than Signal?
39
richardwhiuk 2 days ago 0 replies      
If you aren't going to recommend anything else then sit down and shut up frankly. The world is made of compromises and saying I don't like your choices is pointless if it's effectively impossible to choose differently.
40
wtbob 2 days ago 2 replies      
I am also very unhappy with the direction Signal has gone, but there's currently no alternative. I'd be interested in contributing to work attempting to replicate it, though.
41
angry_octet 2 days ago 0 replies      
If wishes were fishes we'd all live by the sea.
42
empath75 2 days ago 0 replies      
Any service that owns valuable user data is going to get compromised eventually, whether they do it themselves, or are the victims of an attack. I feel like the only way to not get swept up in the surveillance state is to never put your data on one of these services at all.
43
fiatjaf 2 days ago 0 replies      
The Signal app is stupid. It doesn't work as intuitively as WhatsApp. It's incomprehensible that you need a phone number, and incomprehensible that you can't compile it yourself.
44
kingad 2 days ago 2 replies      
What are your views about VoIP with ZRTP?
45
piotrjurkiewicz 2 days ago 1 reply      
Add the lack of a real desktop client to this.
46
sctblol 2 days ago 1 reply      
Hmm... he mentions the Giphy thing at the beginning of the article, then never again.

The Giphy mention seemed really dangerous to me. Now I don't use Signal, but I imagine it's 1) optional and 2) requests are proxied/anonymised through an intermediary (the Signal servers in this case). And why is this dangerous? Because this "don't build cool stuff on this serious app" attitude is what makes people not use the app. Creating boring, dull apps is what stops them from becoming mainstream successes. If we are trying to get the public to use secure apps because we believe in privacy, we have to make them appealing.

This is similar to the case of how nobody uses PGP because of how horribly bad it is, UX-wise.

That said, the rest of the points he brings up are good. I just didn't like the Giphy mention, especially taking into account that he didn't say anything else about it; he just brought it up.

47
antocv 2 days ago 1 reply      
48
draw_down 2 days ago 0 replies      
> The big question now... is what post-Signal tool we want to use. I don't know the answer to that question yet

Oh.

24
LessPass: sync-less open source password manager lesspass.com
405 points by mgliwka  1 day ago   225 comments top 61
1
JulianMorrison 1 day ago 11 replies      
What this seems to be, in essence: password = HMAC(key, website).

Why this is bad, compared to an encrypted on-disk key store:

1. A password is now ciphertext, not a block of line noise. Every time you transmit it, you are giving away potential clues of use to an attacker.

2. The search space for possible passwords is bounded if you know the website. You are subject to key guessing attacks. If your key is short, pure serial guessing will break it fast.

3. They don't need any access to you or your stuff, to guess a key. They don't even need access to the server, it can be guessed on an unrelated machine. You don't have the opportunity to detect a break-in and neither does your bank, etc.

4. You only have one password for all the sites, really, underneath, and it's your secret key. If it's broken, it's now a skeleton-key and your digital ass is theirs.
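
A minimal Python sketch of the construction being criticized here; the hex encoding and 16-character truncation are illustrative assumptions, not LessPass's actual code:

  import hashlib
  import hmac

  def derive_password(master_key: str, site: str) -> str:
      # One deterministic password per site: HMAC-SHA256(master_key, site).
      digest = hmac.new(master_key.encode(), site.encode(), hashlib.sha256)
      return digest.hexdigest()[:16]  # truncated to a typical password length

  # Anyone who learns the master key can re-derive every site password:
  print(derive_password("hunter2", "example.com"))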

2
sixhobbits 1 day ago 4 replies      
I really dislike the copy/marketing of this tool. OK, so it doesn't sync? How does it work? *reads whole front page and all features* No sync, but access anywhere? How does it work?? *clicks the "How it works" link and reads another 5 paragraphs of "This is great. It's so simple. It works really really well. You can phone people and they'll tell you how well LessPass works"* Finally, after clicking on the link and scrolling past a bullet list and stylised quotation, we get

"The trick is to compute passwords rather than generate and store random passwords.

LessPass generates unique passwords for websites, email accounts, or anything else based on a master password and information you know."

"Next-gen", "Anywhere, anytime", "Manage directly from your browser". These are all super cliched, really cheap phrases that I really dislike. The front page is full of them. If you're marketing a luxury yacht trip to people with more money than sense, then sure you're probably going to get good results by writing like this. But the folk reading about this are going to be pretty technical and I'm sure everyone would appreciate to see something like "We provide a function that generates a memorable password from the site name and your master password" on the front page, above the fold.

In terms of entropy, you may as well come up with your own function. Security through obscurity is bad (no one knows the function you use to generate site specific passwords) but it's better than security through less obscurity (use a public function that a bunch of other people are using).

You can't get free entropy. If you care about your passwords not being broken when a database of hashes is dumped, you need to use a long, securely generated, random password. Sure, this is better than using the same password everywhere, but it's not really an alternative to something that uses proven cryptography to generate secure unique passwords. Passwords generated using this are only as good as your master password, with some obscurity thrown in.

3
adilparvez 1 day ago 7 replies      
It's great people are exploring this problem space, but so far nothing comes close to https://www.passwordstore.org/ which is just a wrapper around gpg and git. It has Android/iOS clients, as well as GUI clients.

On Android I use Password Store + OpenKeychain, the UX with a YubiKey is very smooth.

https://fossdroid.com/a/openkeychain.html

https://fossdroid.com/a/password-store.html

4
croon 1 day ago 5 replies      
Others have expressed most of them, but issues I see with this is:

* Algorithm can't be changed/improved without changing all your passwords.

* Your master password can't be changed without changing all your passwords.

* You have to remember which sites you are already registered at, and in case of a critical bug, you would perhaps need to change your password at some services (again remembering which ones they were).

With that said, I really like the outside-of-the-box thinking on this.

5
pilif 1 day ago 3 replies      
How do you deal with sites whose password requirements don't match the output of LessPass? How do you handle the fact that sites want you to change your password? Yes. There's a counter field, but how do you know what site uses what version of the counter? How do you change the master password without having to change all passwords?

Thing is: there's a solution for all these problems: all you have to do is actually generate a random password and store that (in fact, that's the solution LessPass proposes for these special cases. But if you have storage for the special cases, why not just store the passwords to begin with?)

You don't want to sync it because you don't trust the client-side encryption used in all the managers out there? Use a piece of paper to write the passwords down. Or use a device you constantly carry with you as your password store.

While there are tons of workarounds for the issues of stateful password managers, there are none for the stateless ones (aside from storing the state somewhere, but if you're doing that, why not just store the password?)

6
guillaume20100 1 day ago 1 reply      
I'm the creator of LessPass. We did not expect so many visits to our website. Thank you. We are working on:

* encrypt password profiles client side.

* help user change their master passwords (https://github.com/lesspass/lesspass/issues/36)

* mobile version(https://github.com/lesspass/lesspass/issues/6)

Changing the master password seems to be the biggest problem for many of you. We will address this problem as a priority.

7
avian 1 day ago 2 replies      
I don't know what these are used for, but secret keys generated from current time are easy to guess. You only have to try around 2^24 values if you can estimate installation time within a specific year.

https://github.com/lesspass/lesspass/blob/master/lesspass.sh...
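
To illustrate the class of weakness (the derivation function below is an invented stand-in, not the script's actual code): if a secret is derived from the install time in epoch seconds and an attacker can bound that time to one year, there are only about 2^25 candidates to enumerate:

  import hashlib

  def key_from_time(ts: int) -> bytes:
      # Hypothetical time-seeded key derivation.
      return hashlib.sha256(str(ts).encode()).digest()

  target = key_from_time(1478563200)   # victim's (unknown) install time
  start = 1451606400                   # Jan 1 2016 -- attacker's year guess
  for ts in range(start, start + 366 * 24 * 3600):
      if key_from_time(ts) == target:  # ~31.5M cheap hashes, trivial to search
          print("recovered install time:", ts)
          break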

8
fps 1 day ago 1 reply      
We need to re-think passwords. Password re-use is a big problem for technical and non-technical users alike, because managing a unique generated password between devices is hard. Dealing with password managers and syncing password lists back and forth is super frustrating to users. None of the existing tools work quickly and easily on all the different devices a user could be using, so at some point everyone that uses a password manager is going to be stuck fighting their password manager to log into a website. The user will likely perform a password reset, which will send an email that allows the user to bypass the password entirely.

Why not provide people with a quick and easy "login by email", since this fallback is almost always available anyway? Slack does this https://auth0.com/blog/how-to-implement-slack-like-login-on-... and allows people to have accounts with no password memorization. You're not making the login any less secure - any attacker with access to a user's email can almost always perform a password reset anyway.

9
jiehong 1 day ago 3 replies      
Not-so-good good idea?

Given that you already have dozens of sites with their own passwords, you can't just import your passwords; you need to change all of them to start using LessPass first.

Also, if the way the generation of passwords works changes later (i.e. a bug), then users are stuck with a version, or the bug is never fixed, ever.

10
romseb 1 day ago 0 replies      
I am surprised Stanford PwdHash [0] has not been mentioned yet as an alternative, which has extensions for Chrome, Fx, Safari, Android, iOS, Terminal and other software.

[0] https://pwdhash.com/

11
sonofgod 1 day ago 1 reply      
Don't use this if you're ever going to type in a password where the screen might be shared -- the constantly-updating "is my password correct" glyphs give away enough information to make it super trivial to decode by eye.

PS: the password for the demonstration gif is "passwordpassword"

12
cyphar 1 day ago 1 reply      
While the idea sounds alright (and I've seen similar ideas done before), there are a few problems with this system that make me quite cautious about trying it:

* In order to handle different password complexities, regeneration of passwords and similar settings, you have to use a "connected" version (read: you have to store the configuration). In addition, the configuration they have includes potentially sensitive information (password length, number of times the password was changed, list of websites I use, my username on each site). And currently those profiles are unencrypted. So in order for it to be useful, it's no longer sync-less. As an aside, my bank (foolishly) uses my generated username as a "privileged" piece of information -- which means that I literally could not use this manager for my bank.

* You can't change your master password without updating all of your site passwords. This also means you can't import your old passwords without just changing them all. IMO this makes LessPass not a password "manager". It's a password generator.

* Also, the profile doesn't appear to contain any configuration details for the PBKDF, which seems like a bad idea (it means that they can never practically update the PBKDF without introducing backwards compatibility in the profile settings). Also not sure why they're using SHA when there are better password hashing algorithms.

* Aliases are impossible to implement (without adding more information to the profile), which just makes this impossible to use with SSO systems (I'm not going to remember which of the 5 different hostnames I used to generate a password I use once a year).

I've got to admit that I kinda like the symbols shown next to your password to make sure you're using the right master password, but there doesn't seem to be any description of how that's generated. My guess is that it's similar to SSH keyart (which then brings up the question of how often collisions happen with only X^3 options, and whether two passwords can result in different orderings of the same tokens).

Overall, seems like an okay idea. But I would prefer if someone just offered a nice way to host your KeePass databases (or rather if there was an app that did it). You could probably do it with git and push to GitLab or something, but that is just ugly to do manually.

13
ekiru 1 day ago 0 replies      
The _prettyPrint [1] and _getPasswordTemplate [2] functions they use to get from the HMAC to the actual password seem to have a lot of issues:

- _prettyPrint calls into _getPasswordChar which will then take the character code modulo the length of the array of possible characters [3], which is usually going to be biased if the character code is not uniformly distributed between 0 (inclusive) and a multiple of the length (exclusive).

- It's even worse because the input to _prettyPrint is the HMAC encoded as a hexadecimal string. The impact of this depends on the size of the possible character array, but in several cases, some of the options can never be chosen and others will be chosen twice as often as others that can be chosen.

- Using the hex encoding also drastically reduces the number of possibilities for a given length even if that input was then used in a less flawed fashion.

- _getPasswordTemplate appears to treat a password with lowercase/uppercase letters as a series of alternating vowels and consonants (by appending 'vc' or 'VC' to the password template).

- It also generally seems to define "password containing X and Y char types" as "password containing X char type, then Y char type, then X, then Y, and so on".

[1]: https://github.com/lesspass/core/blob/master/lib/index.js#L8...

[2]: https://github.com/lesspass/core/blob/master/lib/index.js#L6...

[3]: https://github.com/lesspass/core/blob/master/lib/index.js#L1...
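
A small Python demonstration of the bias described above, assuming a 26-letter lowercase charset: the char codes of hex digits ('0'-'9' are 48-57, 'a'-'f' are 97-102) taken mod 26 reach only half the alphabet, and hit some letters twice as often:

  from collections import Counter

  charset = "abcdefghijklmnopqrstuvwxyz"
  hex_digits = "0123456789abcdef"
  picks = Counter(charset[ord(c) % len(charset)] for c in hex_digits)
  print(picks)  # w, x, y chosen twice; only 13 of 26 letters are reachable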

14
ch0wn 1 day ago 0 replies      
I've used PasswordMaker in the past which uses the same concept but recently moved away from it. Changing passwords is a pain as you now have to remember special parameters for certain pages. Even worse, in case of a compromise of your master password, you're toast. The same is true when losing your password manager database + credentials, but at least you will have a list of all your compromised records which you would have to keep separate in this case.

I've moved to KeeWeb since then + CPK for Chrome and Keepass2Android on my phone and couldn't be happier.

15
lsrom 2 hours ago 0 replies      
I wrote a short blogpost about LessPass, its security, and what should be done in the near future. You can find it here: http://lsrom.cz/blog/2016/11/08/lesspass_why_holding_back_ge...
16
Ruphin 1 day ago 1 reply      
Hi Guillaume,

Your master password is 'passwordpassword'. 10 points if you know how I figured that out :)

17
corobo 1 day ago 1 reply      
What happens when the method of creating passwords needs updating? Do I then need to visit countless sites to change my password?

I like the idea, don't get me wrong; I just can't yet see all of the downsides, which is what will stop me using it for now.

Elephant in the room: Are you going to be sued by lastpass for the name?

18
HackinOut 1 day ago 1 reply      
I wouldn't use a password manager system that doesn't have the ability to change the master password.

EDIT: You can't change any password really, without changing all of them (or having a separate master password). Seems impractical as soon as, for example, site X gets its database hacked.

19
hellofunk 20 hours ago 0 replies      
One issue with this is that URLs for sites can, and in my experience, often do, change, while logins remain the same. A password that uses the URL/site itself may no longer work unless you can remember the old site. That's a headache. I used to have my own mental model of generating a password based on a URL, and it eventually failed several times over because of this issue.
20
wyclif 1 day ago 2 replies      
A gentle critique: don't use "How it works?" since that is not proper English. "How does it work?" is better.
21
leepowers 19 hours ago 0 replies      
Aside from the security concerns already noted I have two major questions:

1) How do I change my master password? It appears that all generated passwords would change as well. Even master passwords should be expirable and changeable.

2) I don't just use a password manager to manage passwords. I use it to manage _credentials_. I have a ton of credentials, so I need something that will remember the _usernames_ for me as well. Otherwise I don't just have to memorize the master password - I also have to memorize the usernames for the hundreds of different logins I'm managing. That's a non-starter.

22
nicwolff 23 hours ago 1 reply      
I may be the inventor of in-browser hash-based password generation; at any rate, most of the early variants like SuperGenPass [1] credit mine [2] as the original. And I still use it for low-value sites, but I let iCloud Keychain generate, store, and sync passwords for e-commerce sites and e-mail services, for all the reasons mentioned by others here.

[1] https://github.com/chriszarate/supergenpass/wiki/FAQ

[2] http://angel.net/~nic/passwd.current.html

23
vrikis 23 hours ago 0 replies      
I used to use SuperGenPass[1] to do exactly this, but what you'll soon find is that every website has slightly different password rules, so you'll have to start memorizing unique settings for each site (i.e. website X can't have certain chars, website Y can't be longer than 12 chars, etc...). Then you run into the issue of multiple things you need to remember, such as secret answers for various questions, birthdays that you may lie about, etc...

Much easier to just use a vault to store all this.

[1] https://chriszarate.github.io/supergenpass/mobile/

24
msl09 1 day ago 1 reply      
I started using Nic Wolff's password generator[0] to solve that very problem, but I noticed that "connected" passwords made up a large share of the passwords I needed to use, so I gave up on the idea of "store nothing passwords". Always generating the same password for the same site also doesn't work, because of database breaches, so in the end I created a program[1] that does all the things I need. Check the README to know how it works.

[0] - http://angel.net/~nic/passwd.current.html

[1] - https://github.com/marceloslacerda/password_generator

25
3pt14159 1 day ago 0 replies      
Very cool project!

One thing though, LessPass sets HSTS headers, but should include the `includeSubDomains` directive and the `preload` directive to stop a first time MITM (for example, when you get a new phone). Once these are done, LessPass should be added to various browser preload lists.
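
For reference, the suggested header would look something like this (the max-age value is illustrative):

  Strict-Transport-Security: max-age=31536000; includeSubDomains; preload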

26
mrleinad 1 day ago 2 replies      
What happens if a site requires you to change passwords regularly and keeps a history of the passwords you've used? Wouldn't you be locked to a single password for each site (or a very limited number of options to force the app to change it)?
27
atemerev 1 day ago 1 reply      
OK, fine. This is one thing I can finally use.

(I can't use regular storage-based password managers, as I have ADD and I will lose my password file. If it is backed up, I will lose a backup, or I will forget to update it when passwords are changed, or something else. I always screw such things up, this is absolutely inevitable, so I have to prepare. To compensate, my symbolic memory is excellent, so I just chose to memorize all my passwords, as I can't lose my head. But even my memory has limits, and I had to lose some entropy on my passwords to keep them all in my head.)

Now, I will get the best of both worlds.

28
chiefalchemist 15 hours ago 0 replies      
This only confirms one thing: How we're doing account security is inadequate at best. Perhaps LessPass comes up short, but that's more a symptom of the broader problem than it is a mistake on the part of LessPass and its ilk.
29
tim333 1 day ago 0 replies      
How it works roughly: https://blog.lesspass.com/lesspass-how-it-works-dde742dd18a4...

It seems they take info from the site (its name?) plus your master password and hash them - "LessPass uses PBKDF2 with 8192 iterations and a hash function sha-256."

I guess they then produce something of the required length and characters from the hash.

Guess it's ok till someone finds your master password.
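
A rough Python equivalent of that step using the standard library; treating the site name as the salt is an assumption based on the quote, not necessarily LessPass's exact input layout:

  import hashlib

  master = "my master password"
  site = "example.com"
  entropy = hashlib.pbkdf2_hmac("sha256", master.encode(), site.encode(), 8192)
  print(entropy.hex())  # bytes like these are then mapped onto a character set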

30
mehdix 1 day ago 0 replies      
I use [pass](https://www.passwordstore.org/) which is an open source command line password manager. It uses gpg to encrypt passwords, which in turn is set up to use a hardware key with its own password. The downside is the metadata exposed in the file system tree; however, the data is stored on a dm-crypted disk. Difficult to set up but much more secure.
31
stcredzero 18 hours ago 0 replies      
I wish someone would make a sync-friendly open source password manager. Something like KeePass, but with fewer features, user initiated auto-fill, has a client on just about all platforms, and is designed ground-up to live on a file sync service like Dropbox or Google Drive.
32
iKlsR 1 day ago 0 replies      
Enpass is my goto, recently recommended it to some people as well and all are happy. I have it on my phone as well as work PC, personal laptop and a linux box. One master password and it's all local, I just use dropbox to sync the encrypted "wallet" so I have my accounts on all my devices. https://www.enpass.io/
33
mikegerwitz 1 day ago 0 replies      
See also gnu-pw-mgr (CLI):

https://www.gnu.org/software/gnu-pw-mgr

It generates passwords based on a secret key ("seed") and memorized transformations to URLs. You can adapt that to use a master password, if you so desire, by using a password as a base for such a transformation, but that's not built in.

34
dexterdog 1 day ago 0 replies      
Why do none of the password managers integrate a decent xkcd password option? I like having a complex password, but God forbid I ever have to read it from my phone and type it somewhere else. xkcd-style is easy: use a word dictionary, pick a few words, and delimit them with special characters. Mix the case as an option, too.
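
A sketch of such an option in Python; the word-list path is an assumption (any newline-separated dictionary would do):

  import secrets

  with open("/usr/share/dict/words") as f:
      words = [w.strip() for w in f if w.strip().isalpha()]

  # Pick a few words, delimit with a special character, optionally mix case.
  delimiter = secrets.choice("!-_.")
  chosen = [secrets.choice(words) for _ in range(4)]
  chosen = [w.upper() if secrets.randbelow(2) else w.lower() for w in chosen]
  print(delimiter.join(chosen))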
35
Marc_Bryan 1 day ago 0 replies      
Lastpass is free now for mobile devices. Enjoy!
36
nachtigall 1 day ago 0 replies      
How is this better/worse than https://addons.mozilla.org/en-US/firefox/addon/easy-password... which claims that "No web servers involved, the data never leaves your computer."?
37
libeclipse 1 day ago 0 replies      
I made something similar to this: https://github.com/libeclipse/visionary

Works by using a master password and a keyword to generate a seed, then applies various deterministic algorithms to generate multiple passwords.

38
laurent123456 1 day ago 0 replies      
Same as this one - https://ssl.masterpasswordapp.com/ and also the same problem: if you need to change your master password, all the passwords for all your websites become invalid. I really don't see how this kind of solution is workable.
39
Nux 1 day ago 1 reply      
Nice idea and willing to try it, but..

"The requirement for self-hosting is to have docker and docker-compose installed on your machine."

Fsck that.

If I will trust this with my passwords, I need to know how to _really_ install it. I can't trust you if all you have to offer is a steaming pile of docker or your idea of how I should run my systems.

40
sschueller 1 day ago 1 reply      
This one isn't bad either https://keeweb.info/
41
rahvee 1 day ago 0 replies      
If you re-generate the password for each site, rather than sync, it means (1) you can't change your password on that site, (2) if any site puts in the effort to brute-force guess your password, they will have access to all the sites you visit.

This is not a good idea.

42
leshow 1 day ago 0 replies      
I took a quick look over the information on the site, but it looks to me like this is a pretty bad 'password manager'.

It looks like anyone can try to brute force your password based on the login/site combo. Can someone tell me why I'm wrong here?

43
s9ix 1 day ago 0 replies      
Wouldn't enabling two-factor on the primary login and naming sites whatever you want solve most gripes with this (in the comments here)?

i.e. I name hacker news hn or hack, as opposed to news.ycombinator.com - one more thing to guess for an attacker.

44
dchest 1 day ago 0 replies      
Obligatory post about password generators: http://crypto.stackexchange.com/a/5691/291
45
jhasse 1 day ago 0 replies      
I've written a native app (in C++ with wxWidgets) with the same principle: https://bixense.com/pwcalculator/
46
mgliwka 1 day ago 0 replies      
47
ciorici 1 day ago 0 replies      
I used a similar tool a long time ago: http://angel.net/~nic/passwd.current.html
48
redpanda_ua 1 day ago 0 replies      
Is this better than this: https://ssl.masterpasswordapp.com/ ?
49
jasikpark 1 day ago 1 reply      
A more developed and mature implementation of a stateless password manager is http://masterpasswordapp.com
50
trungonnews 6 hours ago 0 replies      
What happens if someone figures out your master password?
51
marcioaguiar 1 day ago 2 replies      
How is this different from having one password for all sites? If I break the master password I gain access to everything.
52
lindner 1 day ago 0 replies      
This looks a lot like https://getvau.lt/
53
mihaifm 21 hours ago 0 replies      
Does anyone know a good password manager that uses Touch ID on the iPhone?
54
wjd2030 1 day ago 0 replies      
So what is stopping someone who already knows the username from generating the correct password?
55
shinigami 1 day ago 0 replies      
> PBKDF2 with 8192 iterations

Not nearly good enough.

56
SippinLean 1 day ago 0 replies      
Maybe make the difference between your name and LastPass more than 2 letters?
57
daurnimator 1 day ago 2 replies      
SuperGenPass is (was?) a similar concept: supergenpass.com
58
jayeshsalvi 1 day ago 0 replies      
Can I change the master password?
59
cchubitunes 23 hours ago 0 replies      
At last, a simple software project a beginner can understand.
60
rorygreen 1 day ago 0 replies      
I feel like most of the complaints here are about stateless password managers/generators in general. If not being able to change your master password is an issue to you, then this type of password manager is not for you. The cryptographic arguments seem more valid and worth considering if you plan to use this. I was planning to build something very similar to this as I came to the same conclusions as the creator of LessPass, that the existing solutions are not satisfactory for my use case.

My current choice in software of this type is Twik (https://github.com/gustavomondron/twik) because it has an Android app available from F-Droid and an extension I can install in Chromium. The browser extension is especially good but not without faults. For daily use I find this adequate but I find myself in situations where I need to access a password outside of a browser or my phone which is a huge pain. Other drawbacks include no ability to change or bump the password to a new one without creating a new profile. Keeping my phone and computer in sync is also slightly annoying as you need to manually copy the UUID (long-ish complicated string) which is used to identify each profile.

I think a lot of these issues could be overcome by relatively simple solutions just by applying some sensible design without going all in and trying to be everything at once like LessPass is. If, for example, Twik generated QR codes for its profile keys which you could scan in the mobile app, it would speed things up massively. If it had a compatible CLI interface and a macOS menu bar application, it would be fantastic.

I'd also rather not bother with creating an account with another service just to sync my password version numbers and silly password rules that sites implement. The ability to self-host LessPass is nice but do I really need to bother with this? What's wrong with syncing this information in a simply formatted, maybe also encrypted, text file with services I already use such as Dropbox or Google Drive? Also, why is the default password length 12 characters? It's a small gripe but I thought the whole point of using software like this was to enforce good password practices.

I think https://getvau.lt/ gets pretty much everything right other than having to remember the rules of each password every time you generate it which, for me, isn't much better than remembering different strong passwords for multiple sites.

I've been hoping for some time that somebody would create a solution that I feel makes sense so I was excited to see this post but unfortunately it seems to have too many drawbacks for me personally to incorporate into my daily life. The open source nature of a lot of these products is very helpful and will hopefully reduce the effort I will have to eventually put into building a solution that gets out of my way.

61
fnj 1 day ago 4 replies      
When I find stuff like this that uses SHA256, I instantly just tune out. SHA256 is stupid. Use SHA512 or find another line of work. SHA512 is astronomically more secure. Anyone who is using anything less than SHA512 for hashing is an idiot.
25
Be careful about what you dislike pocoo.org
348 points by JonoBB  3 days ago   169 comments top 29
1
hellofunk 3 days ago 14 replies      
I have realized over the years that it is wise to be naturally skeptical of any opinion that is strong, either positive or negative. People who have an appreciation for gray areas, even if they ultimately do have a preference, tend to be a lot more emotionally balanced than those who maintain a very strong stance on something. I have noticed this so consistently over the last 15 years that I now consider it a fundamental benchmark by which I can gauge my ability to work or socialize with someone in general, on any topic, over the long term.
2
andybak 3 days ago 8 replies      
I'm fascinated by the topic of English Prime: https://en.wikipedia.org/wiki/E-Prime

It introduced me to the idea that 'is' should be treated very carefully. Any assertion outside of strict formal languages that uses it is a half-truth at best. It also heightens the emotional tone of a discussion. If you say "John is foo" you tend to create the impression that John will always be and has always been foo. Foo-ness is a taint on his soul. Contrast that with reformulations that make it explicit that John's foo-ness is a fleeting association related to his present situation, your current perception of it, and the current socially accepted meaning of foo along with all its implied baggage.

I realise I might be rather off-topic :-)

3
TazeTSchnitzel 3 days ago 2 replies      
> Then the entire thing spiraled out of control: people not only railed against TTIP but took their opposition and looked for similar contracts and found CETA. Since both are trade agreements there is naturally a lot of common ground between them. The subtleties were quickly lost. Where the initial arguments against TTIP were food standards, public services and intransparent ISDS courts many of the critics failed to realize that CETA fundamentally was a different beast.

CETA has ISDS as well, and if only on that point alone, CETA is objectionable. This argument comes off as disingenuous, the similarities between the deals are not imagined. ISDS isn't even the only similarity; CETA also contained objectionable new copyright provisions (though apparently those are mostly gone now), for example.

4
pimlottc 3 days ago 2 replies      
This brings to mind a fantastically lucid comic about the utility of questions vs answers:

http://kiriakakis.net/comics/mused/a-day-at-the-park

After all, an opinion is just an answer to the question, "What do I think of this?"

5
cyberpanther 3 days ago 1 reply      
A very common cognitive bias or logic pattern our brain follows is to whitelist or blacklist things. When we trust something, we follow it without question or we begin rationalizing it no matter what. And in the day of the internet and Google we can confirm basically any bias we have on either side of an issue.

You should scrutinize your own thoughts and opinions, and those of others, to see if you are just believing something because it was true in the past.

In terms of Javascript, there is definitely a lot of hate out there for the language and ecosystem, which was once entirely justified. But I would argue JS has the best trajectory right now of any language out there. So you'd better learn it if you want to stay relevant in development.

Lastly, I've found it best to not be so opinionated about everything. Sure having some opinions are great but you develop too many biases otherwise. So what if something sucks, use it anyway. You might learn something new, or maybe you can help improve it if it has potential.

6
crawfordcomeaux 3 days ago 2 replies      
Our realities are each a collection of stories we each tell ourselves. Sometimes parts of the stories two people believe will overlap and we'll call those opinions or facts depending on situation.

I'm finding it helpful to view every signal my body encounters as a chance to choose how to process it, including what I do, taste, or hear.

Since adopting this view, I've effortlessly enjoyed eating foods I've hated my entire life (tomatoes, olives, CILANTRO?!), listening to country music, and doing things like chores that used to bore me to tears.

If anyone sees danger in learning to view the world that way by default, I'd love to hear about it.

7
kstenerud 3 days ago 0 replies      
It's unfortunate, but we have a tendency to take some beliefs so deeply that they become a part of our core identity. Once this happens, validation of the idea becomes validation of ourselves. Attacks upon the idea become attacks upon ourselves.

Once someone has reached this point, logic simply cannot reach them. Successfully defeating their arguments will only strengthen their resolve (the backfire effect), because they're being driven by the amygdala, which only understands threat response. They will grab onto any argument, no matter how flimsy, and be completely unaware of how little sense it makes. Any further argument with them will at best do nothing, at worst make you look as much a fool as he.

The wise man learns to recognize this state and back off.

8
lazyjones 3 days ago 1 reply      
It's not the responsibility of the author to anticipate future changes that might weaken his current arguments. The reader is responsible for taking into account the time and context of the text they are reading.

It's why we like to have e.g. "(2013)" added to anchor texts on HN, for example.

9
dorianm 3 days ago 3 replies      
For comparison, Ruby 3 is going to introduce a pretty big breaking change (frozen string literals), but they have already shipped a way to optionally enable it per file (the magic comment "# frozen_string_literal: true") and globally via an interpreter parameter, so that all the libraries and projects can slowly fix things in a compatible manner (often just calling .dup is enough).

So when it's time for Ruby 3, the transition will be pretty painless.

More info: https://wyeworks.com/blog/2015/12/1/immutable-strings-in-rub...

(Frozen string literals allow a string to live in memory only once instead of being reallocated each time, so it's a pretty big memory and CPU optimization)

(Also for instance rubocop already recommends adding the magic comment to all ruby files)

10
carsongross 3 days ago 1 reply      
I naturally see both sides of almost any argument, and my personality is such that I would rather synthesize the arguments of both sides into a final position via dialectic.

I have lost almost every major argument I've had in a corporate environment.

11
danso 2 days ago 0 replies      
As a relative newcomer to Python, I had no real interest in working with 2.x. But I appreciated Armin's critiques of 3.x -- it was really difficult finding thorough, thoughtful critiques that were focused on 3.x's flaws, not on the pain of porting/division of the community, which is of less concern to recent bandwagon jumpers like me. Most of all, I appreciate that his libraries -- Flask, flask-sqlalchemy, Lektor -- are 3.x compatible.
12
michaelsbradley 3 days ago 0 replies      
Hear! hear! I also recommend, more generally, reviewing logical fallacies, cognitive biases, and misconceptions as part of a regular self-review. It's important to keep a flexible mind, though achieving greater degrees of interior freedom is hard work.

https://en.wikipedia.org/wiki/List_of_cognitive_biases

https://en.wikipedia.org/wiki/List_of_fallacies

https://en.wikipedia.org/wiki/List_of_common_misconceptions

13
dorfsmay 3 days ago 0 replies      
Part of the issue is efficiency, we have to make choices and cannot reevaluate everything constantly. Also, we can't be specialists in everything.

So with programming languages, we have to pick a few and become good at them. It's one thing to take another hard look when applying for a new job, for example, but we cannot keep track of all programming languages and their evolutions.

14
Unman 3 days ago 1 reply      
Hmmm... while agreeing with the sentiment I am unimpressed by the lack of evidence for one of his supporting examples. What stood out for me was this bald assertion with no reference to falsifiable specifics:

"_Not_only_was_it_already_a_much_improved_agreement_from_the_start_,but it kept being modified from the initial public version of it to the one that was finally sent to national parliaments."

Either the writer of this is an expert on the topic, well-known in the field and the weight of this judgement on its own is a valuable primary source; or, the writer is referring to such an analysis conducted by other experts but has not bothered to include a citation/link; or, the writer has their own critique but instead of presenting _that_ has just stated an opinion which they know to be controversial.

All of the above possibilities contribute substantially to the noise around any discussion.

15
simonhamp 3 days ago 0 replies      
I think a sideline point here is to not appropriate other people's opinions from a specific point in time just because they happen to align with yours (opinion/bias) at the current time.

And of course, try to have as wide and deep an understanding of the subject as possible before forming strong publicised opinion in the first place.

16
rdslw 3 days ago 0 replies      
Paul Graham, in one of his best texts, explained similar concepts, writing "I finally realized today why politics and religion yield such uniquely useless discussions"

Highly worth read: http://paulgraham.com/identity.html

17
sitkack 2 days ago 0 replies      
I have to reference an Arthur C Clarke essay, "Hazards of Prophecy" with this quote

  > When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is probably wrong.

I have found that the wisest, smartest, most mature folks will re-evaluate their opinions in light of new information, and often change their mind.

18
mooreds 3 days ago 0 replies      
Try not to move the goalposts. If someone compromises, acknowledge that and thank them for it, rather than saying "I am glad you finally saw the light, but now we need to take it a step further".

Brinkmanship rarely serves to get anything done, and burns bridges when it does actually accomplish something.

19
madsbuch 2 days ago 0 replies      
We need to establish that all communication is the sender's responsibility. In the case of CETA, both parties are not senders. Only one party is. They have a clear obligation to let people know about updates and imprecision in their communications.
20
9mit3t2m9h9a 3 days ago 0 replies      
I think the effect described in the text has another side: imagine that at some point using XYZ was obviously a bad idea for multiple reasons for a specific person in specific circumstances. Obviously, keeping track of the changes in XYZ will have a lower priority for a person who is not going to use XYZ anyway, even if one of the multiple show-stoppers gets fixed/changed/redesigned. This means that the person's opinion about XYZ slowly gets stale.
21
slavik81 2 days ago 0 replies      
The bit on CETA was interesting. I was very disappointed when I heard CETA was signed last week because I strongly opposed the copyright term extension and anticircumvention clauses from the 2009 leaked draft. However, as far as I can tell, those are not in the final agreement. Oops.
22
agumonkey 2 days ago 0 replies      
This is a broader topic; it touches on how to deal with communication, debate, idea exchange, solution finding, society. I've recently seen postures from supposedly right-wing partisans that were mostly stuck on old negative facts that don't apply today.
23
z3t4 2 days ago 0 replies      
Web URLs are seriously underrated ... You can not go back in time and change what you told someone ... But if you have a blog that has a URL, you can actually update the content.
24
datashovel 3 days ago 0 replies      
It may be less the responsibility of the "consumer" of the information and more the responsibility of the "producer" of the information.

If the argument is presented as if something is and will always be a certain way (or even if the argument is presented without admitting that something may change) it can probably lead a lot faster to groups of people assuming the argument will be valid forever.

EDIT: Or can be misinterpreted that someone presenting an argument believes the argument will remain valid forever.

btw. never saw the talks the author cites, and have not followed the trade agreements very closely so I'm only speaking generally here.

25
minusf 3 days ago 1 reply      
for me personally it is news the_mitsuhiko is "not vocally against python3 anymore". i cannot find any other recent blog posts besides this one, where python3 is praised or encouraged fully. so why be surprised if people still think he is a big python3 critic?

as i see it, the issue is less about parroting other's outdated technical opinions, it's about not being vocal enough about the change of heart.

26
xtiansimon 3 days ago 0 replies      
Headline: Engineer cries Political Arguments are not 'valid'; Forks off own nation.
27
profalseidol 2 days ago 0 replies      
Two words:

Socrates, Marx

28
msinclair 3 days ago 0 replies      
Except Internet Explorer... that will always be the same. :)
29
ak39 3 days ago 1 reply      
Good article.

List of some of the things I don't like, for which I have to occasionally take another peek to see if I'm finally wrong:

1. (In languages) Garbage collection and the idea of "safe code". I didn't like it then and still don't.

2. ORMs

3. (Relational) Data models with compound keys flying around as FKs everywhere.

4. The idea of self service BI (like PowerBI etc in the hands of a business user)

5. Regexp

26
Tesco Bank halts online payments after money was taken from 20K accounts bbc.co.uk
253 points by luxpir  1 day ago   156 comments top 10
1
elcct 1 day ago 3 replies      
> Tesco Bank is stressing that relatively small amounts were taken from 20,000 accounts

If someone is living month to month, the said £500 missing could be a very serious complication of life, and a £25 "emergency fund" is a joke. I personally doubt a wealthy person would use that bank, and yet they seem to think their customers are pissing gold.

2
joosters 1 day ago 8 replies      
I wonder what the security flaw was? It is interesting that all the customers are still allowed to use their cards for cash withdrawals and payments, and they can all still log in to their online accounts. There doesn't seem to be any mention of a system-wide password reset.

So... it sounds like there wasn't a widespread theft of account credentials, and that the attack was some kind of weakness in the bank's online systems. Perhaps the attackers found a way to log in to accounts bypassing the usual security checks? But that still doesn't explain it all.

All my online accounts have extra security when I create a payment to a new individual. Some have an extra password check, some have SMS validation, and so on. All of them send me a notification of a new payment being added. And yet there doesn't seem to be reports of Tesco customers getting any of these kind of messages. People only found about the losses when they logged into their accounts, or when Tesco broadcast a "we've been hacked" message to everyone.

Does anyone know what could have happened here?

3
mstade 1 day ago 7 replies      
And this right here is why you should have accounts with at least three separate banks, ideally in at least two different countries, and emergency funds in all of them. Also get a couple of credit cards! Even if you never use the credit beyond what's necessary to keep it, it's good to have for when the proverbial shit hits the fan. Cash is also useful for covering basic needs like getting food, but for paying bills it tends to be less so, since it's increasingly a pain to pay bills with cash these days. The move to cashless is unfortunately going faster than a feature-parity alternative is being developed and, crucially, adopted.

As with almost anything financial, the key to lower risk is not putting all the eggs in a single basket.

4
martinald 23 hours ago 0 replies      
It looks like faster payments are still allowed, as are in-person chip-and-PIN payments and cash withdrawals.

My guess is someone (either an insider or via technical means) has got a list of all the debit card numbers, CVVs and account details - maybe even 3DSecure/VbV details?

People are then doing loads of payments via online card-not-present transactions.

Going to be a pain to figure out what is what on this.

"Ref: Customers will still be able to use their cards for cash withdrawals, chip and pin payments, and bill payments.The bank is blocking customers from making online payments using their debit card, although transfers between accounts and to other people are still allowed, a spokesperson said."

5
coldcode 1 day ago 3 replies      
The worst thing that can happen for any bank is to have its customers' money taken away. No one will ever do business with that bank again. You can screw up everything else, but losing people's money is unforgivable.
6
diegoprzl 1 day ago 0 replies      
I'm only surprised at this not happening every week. I suppose that completely hacking a bank is not easy to monetize, even if breaking its security is.
7
283894 1 day ago 1 reply      
I just signed up for 2 Tesco accounts the other day to dump 3k in each for the 3% interest.

I'm certainly not going to be doing anything with the accounts until Tesco give some more clarification on what actually happened (although the way these things work, I doubt there will ever be a full technical response.)

Also if it is some sort of internal breach, would any other data have been taken?

Back in 2012, Tesco were storing passwords in plain text.

http://www.bbc.co.uk/news/technology-19316825

8
andybak 23 hours ago 1 reply      
I know banking and retail are separate but as an organisation Tesco hasn't got a good history for security:

https://www.troyhunt.com/the-tesco-hack-heres-how-it-probabl...

https://www.troyhunt.com/lessons-in-website-security-anti/

9
DrNuke 1 day ago 0 replies      
Most times the snake is inside, in the form of disgruntled employees or corrupt managers: how are Tesco Bank personnel recruited and treated, compared with their peers at more established banks?
10
lifeisstillgood 1 day ago 5 replies      
it seems to be money transferred from accounts (£600 mentioned as an amount). But to set up 20,000 new transfers, and extract money from them, without 2FA, and without tripping any number of alarms, is a terrible failure in security.

This will massively affect their provider Fiserv; their internal team will almost certainly have to be replaced, and I would be surprised if they don't throw their hands up and go back to being grocers. Retail banking is wafer-thin margins.

Edit: I cannot think of / find a similar case - this is amongst the first, if not the first, mass account attack I know of.

Doing this leaves a trace. Potentially an insider at Tesco to turn off the 2FA etc, or possibly they have penetrated the systems totally. Not sure which is worse.

Also there must be some mule accounts - right now all the "Big Four" are scouring their customers' accounts for unusual deposits. We will hopefully see where it went soon - presumably to several people who believed a Nigerian Prince was sending them cash, and then on into a wash of Russian accounts.

But I would be amazed if it all gets out of the country. It would trip so many alarms. Of course, if it did not trip alarms...

Some predictions - Gov will enforce GPG-level encryption for every bank interaction - 2FA with time-based OTP, for example. This will force a huge upgrade in retail banking - and will be good for the economy.

And the Apple iPhone is the perfect host for making time-based two-factor auth that smooth. Good for Apple. Android might just see the whole UK market as large enough to get its act together.
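
For reference, time-based OTP (RFC 6238) is simple enough to sketch in a few lines of Python; the secret below is a placeholder for whatever is exchanged at enrolment:

  import hashlib
  import hmac
  import struct
  import time

  def totp(secret: bytes, interval: int = 30, digits: int = 6) -> str:
      # HMAC-SHA1 over the number of 30-second intervals since the epoch.
      counter = struct.pack(">Q", int(time.time()) // interval)
      mac = hmac.new(secret, counter, hashlib.sha1).digest()
      offset = mac[-1] & 0x0F                       # dynamic truncation
      code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
      return str(code % 10 ** digits).zfill(digits)

  print(totp(b"shared-secret-from-enrolment"))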

27
Switching from macOS: Developer Environment elementary.io
254 points by bdcravens  3 days ago   170 comments top 26
1
nebulous1 3 days ago 7 replies      
Elementary doesn't strike me as a particularly good distro for dev. It's not that I've anything against it, but other than your personal preference in the DE (and Pantheon isn't without its charms) it doesn't seem to have much that's going to lift it over any other linux distro. Perhaps I'm missing something.
2
jkrems 3 days ago 5 replies      
Wonder why they didn't go with Cmd+C/Cmd+V for copy&paste. As a developer, that's one of the reasons I really enjoy working on macOS. There's no chance to confuse Ctrl+C and Cmd+C - both of which are shortcuts I use frequently.

P.S.: Not to mention that I appreciate using my thumb for the primary meta key instead of my little finger.

3
nmalaguti 3 days ago 2 replies      
One of the major benefits of macOS has been that everyone who uses it has a consistent experience. Some people will use more specialized applications or tools, but the base has been very consistent.

Homebrew has made things even easier and has been adopted as the one right way to install things in a lot of projects and companies. And the fact that it is a rolling release package manager means you can always get the latest and greatest or use homebrew/versions to stick with an LTS version.

I have always found installs of the same Linux distro by different people to be almost incompatible, let alone installs of different distros. Different hardware, different desktop environments, different applications and configurations. On the one hand everyone can have a tailor made experience, but it makes it hard to debug or come up with common configurations and instructions.

Elementary is making some simple and familiar choices that make it easier for everyone to start at the same place. It looks and feels good, but is different enough that I can't just switch without feeling all the rough edges.

If developers are serious about migrating to a linux distro and PC hardware, I think a hybrid rolling release for devtools and versioned releases of the base system might be needed to capture a lot of the success of macOS. I'm not even sure if that's really possible.

4
meesterdude 3 days ago 3 replies      
Why are they even making a code editor? Seems like effort that could go towards more fruitful endeavors.
5
unhammer 3 days ago 1 reply      
> Similarly, you can just Ctrl+V to paste in the terminal instead of having to work around with extra modifier keys.

that's a bit dangerous; Ctrl-V is normally used to "escape"/make literal the following keypress, or do block select in vim.

The notification-on-long-running-process looks very handy though (I've been using https://gist.github.com/unhammer/01c65597b5e6509b9eea , but of course clicking it doesn't put me back in the right tmux window). And the "energy-sucking apps" indication mentioned in http://blog.elementary.io/post/152626170946/switching-from-m... looks very handy. (I've been considering creating a wrapper for Firefox that Ctrl-Z's it when it's minimized.)

Is anyone running the Elementary DE (or parts of it) on Ubuntu? Does it work OK, or do you have to run the whole OS for it to be worth it?

6
lukaszkups 3 days ago 1 reply      
I don't get all the hate for the elementaryOS distro as a dev machine here on HN. I've worked on OS X, Ubuntu, Xubuntu and Fedora before. Compared to other Linux distributions, it is just another Linux-like system and works as a dev machine much like any other distribution, but IMHO looks nicer. Please tell me what makes elementaryOS worse than e.g. Ubuntu as a dev machine? (I'm a webdev working with Cordova/PhoneGap, RoR, Django and Node.js every day and eOS works like a charm for me)
7
nickbauman 3 days ago 1 reply      
I bought a System 76 laptop a couple of years ago. It completely smokes my 2x more expensive MBP (which has faster processors) in important tasks like running test harnesses and compiling projects. The body, keyboard and trackpad all have this cheap, "dollar-store" quality that initially drove me nuts but I got used to it after a couple of days.
8
rerx 3 days ago 1 reply      
I've been running only Linux for years. Here's what I miss and why I still regularly contemplate just getting a Mac:

- a modern full-featured email client, with an efficient and pretty UI and good shortcut support (at least as good as the Fastmail and Gmail web interfaces)

- a fast and full-featured PDF viewer that supports annotations properly -- anything based on Poppler unfortunately does not cut it

- friendly software to create pretty presentations -- Keynote still seems to be king

Development tools are the least of my worries.

9
JustSomeNobody 3 days ago 1 reply      
Wow, they are really pushing hard in the wake of all the "controversy" with the new MBPs.
10
cyberferret 3 days ago 0 replies      
I installed Elementary in a VirtualBox VM on my old Windows 7 Thinkpad, and am loving it. Seriously considering installing my Ruby (Padrino) development environment within it to fully test, with a view to completely scrapping Win7 from the laptop and running pure Elementary in the future.
11
tananaev 3 days ago 2 replies      
The only reason I have to use macOS for development is Xcode, which I need to make iOS mobile apps. I used to run macOS in a VM with Linux as the host system, but it's just too slow and laggy even on good hardware.
12
vijucat 3 days ago 3 replies      
I guess Elementary had to copy the Cmd+spacebar shortcut to mimic the macOS experience (Spotlight), but on that count, Windows' just-press-Win-and-start-typing experience is much better. It's just one key less, but opening a program is something you do ALL the time, and eliminating that key press makes a huge difference, IMHO. Not sure when they introduced it in Windows, but that was a good one.
13
pmlnr 3 days ago 1 reply      
elementary again within 2 days? Come on.

Anyway, Geany beats Scratch.

14
jasoncchild 3 days ago 2 replies      
I was reading this and wondering why there would be so much emphasis on stuff like apt... then realized that there are indeed developers who've only ever used OS X (and perhaps Windows). I guess I assumed everyone ended up with Linux as a daily driver at some point, even if only for a short time.
15
EugeneOZ 3 days ago 1 reply      
Does font rendering look smooth on HiDPI screens?
16
joeevans1000 3 days ago 0 replies      
This may be a good transitional and familiar OS for people having to migrate away from Apple now that it isn't taking developers and professionals seriously. Some may find this meets all their needs.
17
achikin 3 days ago 8 replies      
As a Mac user I wonder why I need to use sudo to install packages.
18
shorodei 3 days ago 1 reply      
A year ago I started dualbooting Elementary as my daily *nix OS. All was well until one day, with no hardware change or OS update, the touchpad stopped working. I'm back to VMs now.

If I wasn't dualbooting I might have spent more than a day to figure out what happened - but I was too lazy and scrapped dualbooting.

19
yulaow 3 days ago 0 replies      
As someone also suggested in the previous post about elementary, take a look at Apricity OS (Arch-based).

[ https://apricityos.com/download ]

20
erokar 3 days ago 1 reply      
My gripe with Elementary OS is that it's too much like macOS. It's derivative and feels boring and stale in the same way that macOS does. If you're switching, do it with a bang, not a whimper.
21
jordic 3 days ago 0 replies      
I'm quite happy with my desktop-less i3 + tmux for shells (Ubuntu). I switched from Mac three years ago, tired of iTunes and the rest of the bloatware.
22
cocktailpeanuts 3 days ago 0 replies      
I'm an iOS developer. Everything is irrelevant.
23
rco8786 3 days ago 0 replies      
This whole thing reads like an Apple product release. Not sure if that's good or bad considering the intent.
24
gchokov 3 days ago 0 replies      
Not switching anytime soon. No reason.
25
mirekrusin 3 days ago 0 replies      
elementaryOS is OK-ish, but before you dive into it you should know that there's no way of upgrading the system; you're going to have to do a fresh install once a new version is available.
26
oblio 3 days ago 0 replies      
Man, I didn't expect this surge in ElementaryOS articles :)
28
Software Developers: how to get a raise without changing jobs fearlesssalarynegotiation.com
253 points by Axsuul  19 hours ago   182 comments top 25
1
redwards510 18 hours ago 12 replies      
If only things were so easy. Our department gets allocated $XX,XXX each year for raises. My boss has to split that among eight people. If I were to get a 5% bump like OP recommends, other people would get nothing.
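
(To make the pool arithmetic concrete with purely invented numbers - neither the pool nor the salaries below are the actual figures:)

    # Invented numbers purely to illustrate the squeeze described above.
    pool = 8000.0                  # hypothetical annual raise budget for the team
    salaries = [95000.0] * 8       # hypothetical: eight devs at the same salary

    my_raise = 0.05 * salaries[0]  # the 5% bump the article recommends asking for
    left_for_rest = pool - my_raise
    per_person = left_for_rest / (len(salaries) - 1)

    print("5% bump costs ${:,.0f}".format(my_raise))             # $4,750
    print("leaves ${:,.0f} for 7 people".format(left_for_rest))  # $3,250
    print("i.e. {:.2%} each for the others".format(per_person / salaries[1]))  # ~0.49%

With a pool that small, one person's 5% really does eat most of the budget.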

Sounds like typical D-grade manager motivation. "Everyone work yourself to death and I might throw you a bone after one year!" I'd rather give up that 5% to have a sane work-life balance.

It's common knowledge that you must quit to increase your salary. It seems like a stupid practice for companies to follow, but it must make sense economically or they wouldn't do it.

2
hbt 18 hours ago 1 reply      
Yes, the classic "just world hypothesis" https://en.wikipedia.org/wiki/Just-world_hypothesis

If you "work harder", you will be compensated fairly. Of course, "work harder" is overrated now that work doesn't involve physical labor. "Work harder" has been replaced by "create more value".

The expectation is that after you create more value for the owners, you will be compensated fairly. If you believe that, go talk to engineers who worked at startups as early employees and got screwed by investors, founders, managers etc.

Why? For many reasons:

- the owners didn't think you could do any damage to their reputation and impact hiring prospects after they screwed you

- after the company grew, you were now interchangeable

- needed to raise more money and dilute your shares

- you were no longer as valuable to the next stage of the company

- They didn't believe you could replicate the value you created again

etc.

Of course, founders, investors and friends stick around regardless of the stage of the company or the value of their contribution. Why? Because they OWN it, they have power.

Do not confuse performance with power. The only time I got a significant raise (I mean 4X the pay) was when I had significant leverage. Opportunism, self-interest and politics rule business.

People are not gonna give you more money if they can avoid it.

3
mrlyc 18 hours ago 1 reply      
When my contract was up for renewal, what worked for me was to mention to my manager that people doing my job in other companies were being paid $20 more per hour than I was. I was hoping for a $5 increase or maybe even a $10 one but he gave me the whole $20 and made me promise not to tell anyone.

Six months later, we got a new CEO and my contract was terminated because I cost too much.

4
iamthepieman 18 hours ago 4 replies      
I've gotten raises as high as 206% by switching jobs. My most recent switch, a little over 2 years ago, resulted in a 20% raise, which is much more typical.

Never had more than a cost-of-living increase otherwise (2-3%).

My strategy is to move when I stop learning. It maximizes both my enjoyment (learning challenges me and gives me motivation beyond a paycheck, which is good for me and my employer) and my potential earnings.

5
ffjffsfr 18 hours ago 1 reply      
> Bring more value to the company than what's expected of you. Then ask to be compensated for it.

Sorry, but it doesn't always work like this. Sure, performing well makes a raise more likely, but there is no guarantee, nor is performance the most important factor. The probability of a raise depends on a variety of factors, including your company culture, your manager's personality, the company's financial situation, etc.

6
Axsuul 19 hours ago 2 replies      
Saw this set of polls from @zedshaw on twitter and had JUST read this article. Seems relevant. https://twitter.com/zedshaw/status/795687259861172224
7
JoshDoody 18 hours ago 1 reply      
Just saw a huge spike in traffic from HN and thought I would stop by to see what all the fuss is about. Glad to see people reading the article and having a productive conversation in the comments.

Happy to answer questions if you have them, and I'll hop into the comments as well.

8
darklajid 18 hours ago 1 reply      
As with a lot of things: context matters.

I can look at SV salaries and only shake my head. Random 'jump between jobs every 6 months if needed to improve your net worth' advice from single mid-twenties types (not talking about the author - I have no clue about his background, but it feels like this is the general attitude around salary improvement here) is not globally applicable.

And the author basically says 'invest in the company and THEN ask for the reward'. He's ignoring the 'or else' part. I am 100% sure that a healthy portion of envy and fear is part of my rejection, so let's say I'm coming in with a rather negative attitude. But that article is completely and absolutely devoid of content.

"Work harder, then ask for a better salary based on your excellent performance" isn't exactly worth a blog article. Writing down the year over year effects of N percent in salary is trivial maths.

He's followed by Patio on Twitter, so I give the author the benefit of the doubt and believe that he has sound advice to offer and a decent idea about negotiation (with the 'context matters' caveat). But this article seems like... a fluff piece?

9
nachtigall 18 hours ago 1 reply      
4. Organize with (some of) your co-workers, maybe join a union. Act together instead of fighting for higher wages on your own.

Seriously, the moment you are in a "secret" one-on-one with your boss, you have already lost. There's a reason why unions have always bargained collectively, and there's also a reason why bosses have always tried to avoid this.

10
xchaotic 18 hours ago 3 replies      
You should always approach the negotiations with an upper hand - an offer from somewhere else. If it doesn't go well, just move.
11
dfabulich 18 hours ago 1 reply      
I often see articles like this talking about getting a "raise," but the steps they talk about seem more in line what I'd think of as getting a "promotion" (along a technical track). Is "getting a raise" the same thing as "getting a promotion" for the purposes of this discussion?

If I get a promotion every two or three years, am I also supposed to be trying to negotiate a non-promotion raise every 12 months?

12
LouisSayers 11 hours ago 2 replies      
In my experience you'll be much better off just quitting and joining other companies, then going onto contracting.

Here's my own personal experience from just a few years:

1. Start off as a Grad on 30k GBP

2. One and half years later quit, onto another company at 48k GBP (>50% pay rise)

3. One and half years later quit. Do own thing for a year. Then go onto contracting at equivalent of 75k GBP (>50% pay rise)

So in summary, over a 4-year period I've gone from 30k GBP to 75k GBP.

There's definitely an upper limit on this of course. I think if I were to get out of startup land again now and focus on my salary I'd have to head to a different market (currently in Australia). I'd either aim for contracting in the UK where I know you can easily get 400-600 GBP per day, or the US where salaries for devs just seem higher.

I'm playing the long game though, taking the startup route. So I'll either be collecting change from the streets in a few years, or hopefully I'll be well ahead of the game. Will have to wait and see...

13
acconrad 17 hours ago 0 replies      
> If you negotiate an additional 5% raise every two years you'll make up that gap in your starting salary by Year 6

Well if the average tenure of a software developer at a company is like 2-3 years, you will lose by asking for a raise and should just quit to go somewhere else!
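
(The article's break-even claim is easy to sanity-check with a toy compounding model - all figures below are hypothetical, not taken from the article:)

    # Toy model: a 3% annual raise for everyone, plus an extra negotiated
    # 5% every two years for the negotiator, who starts 5% lower.
    def salary_over_years(start, years, annual=0.03, negotiated=0.05, every=2):
        s, out = start, []
        for year in range(1, years + 1):
            s *= 1 + annual                # standard cost-of-living raise
            if year % every == 0:
                s *= 1 + negotiated        # the extra negotiated bump
            out.append(round(s))
        return out

    passive = salary_over_years(100000, 6, negotiated=0.0)  # never negotiates
    negotiator = salary_over_years(95000, 6)                # starts lower, negotiates

    for y, (a, b) in enumerate(zip(passive, negotiator), 1):
        print("Year {}: {:>7,} vs {:>7,}".format(y, a, b))

In this toy run the negotiator overtakes the non-negotiator around Year 4 - which is exactly why the typical 2-3 year tenure undercuts the strategy.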

14
agounaris 5 hours ago 0 replies      
Writing more code is definitely not the way to get a salary raise! Such a simplified and wrong post about salary negotiation...
15
ensiferum 7 hours ago 0 replies      
Write more code... urgh... code is stupidly expensive to any organization, and simple LOC should never be used as a metric for anything.

Basically all the suggestions are low-level, grass-roots value proposals. This article has much better advice (IMHO):

https://www.fastcompany.com/3064513/lessons-learned/im-a-ceo...

16
rockshassa 5 hours ago 0 replies      
this "article" reads like it was written as propaganda by in-house HR to retain staff. 3% is basically a non-raise, it just keeps up with inflation. an extra 5% is alright, but is not adequate to retain talent who knows their worth.i once received an annual raise was 2.5%. i pushed back a bit but not too hard, as i've done this enough times to know that it is never worth the amount of stress involved. fast forward a few months, i'm walking out the door to a new gig with a signifigant increase (lets say 20%).old mgmt didnt see it coming, despite the ample warning i gave them, and ample opportunities to correct the situation. they never do, just accept it and move on
17
quizzas 16 hours ago 0 replies      
It really does depend on the organizational structure of the company, and whether they appreciate how difficult it is to replace programmers with equally competent ones. Most non-tech managers don't understand the intangible costs of replacing programmers, but that's never stopped them treating programmers like interchangeable cogs. If you work for a Fortune 500 non-technical company, I don't believe this strategy works too well.
18
NKosmatos 17 hours ago 1 reply      
No offense, but this is applicable in fairy land or in an idealized and fair society/economy. Sure there are exceptions, and I bet there are software developers who get what they deserve, but the majority are in no position to negotiate a 3-5% raise. I'm sure that for someone working in Silicon Valley it's true, but not for many countries I can think of (including Greece, where I work).
19
Mandatum 15 hours ago 1 reply      
$100/hr: 120K/yr for 3-day/wk contracts; I like long weekends. Java dev.
20
jerkstate 9 hours ago 0 replies      
Q: How do I get a raise without changing jobs? A: Get a raise without changing jobs.
21
andrew_wc_brown 14 hours ago 0 replies      
At least in startups, your salary gets lowered when funding gets tougher. This has happened to me at every startup I was employed at.
22
z3t4 16 hours ago 1 reply      
You cannot earn more than someone else is willing to pay.
23
zer00eyz 18 hours ago 0 replies      
fearlesssalarynegotiation.com sounds like such a reputable place to buy a $199 bundle.

So let's break down why this $199 bundle might work. Odds are that if you're buying this, you lacked the confidence to negotiate upfront. If you're the low man on the totem pole, it's easy for me as a manager to get you to the same level as everyone else. That deals with year one; you're not supposed to try this again till year three. By then you're outside the 60-day window, and you're not getting a refund on the BS package.

24
known 8 hours ago 0 replies      
25
imaginenore 18 hours ago 2 replies      
Don't ask, don't get. And asking is the only thing you can do. Almost never will your boss/manager approach you and offer you a significant raise, even if you're a kickass developer who is crucial to the company. Just ask.

Still, asking will only get you so far. I got an $80->$90/hr raise by asking. But then I got a 40% increase by changing jobs.

And if you're consulting, and are confident enough, you can just raise your prices by a lot. I once was very busy, and told the client that I would freeze my other project if they pay $150/hr, and they agreed to it. Win-win. I get more money, they get the product.

29
FlyWeb An API for web pages to host local web servers flyweb.github.io
311 points by collinmanderson  3 days ago   68 comments top 29
1
macawfish 3 days ago 2 replies      
I'm very excited about this! I've been waiting for something like this for a long time!

To me, the power here is in using the technology to foster local, human-scale interaction.

Intranets are totally underutilized. How many people do you know who can reliably transfer personal files over a local area network? Not nearly as many as those who know how to use Google or send an email... that's absurd to me, given how ancient an application file sharing is.

It's my opinion that the survival of the internet may very well rest on p2p webs like this.

2
nstart 3 days ago 1 reply      
This is potentially huge. If other browsers also jump on this, we could see a rise in a new generation of apps that are local-first and enable much richer real-time collab features. I might be reading this wrong, but it could also usher in a better, open implementation for IoT devices to provide interfaces to the user. Excited to adopt this and try some experiments out.

One interesting thing to figure out is the combination of local and global. When I have an IoT device and I'm away from home, or someone is collaborating with me from a different location, the same app needs to fall back to using standard internet-based interfaces. Not sure if that disqualifies it as a potential use case of this.

3
SchizoDuckie 3 days ago 1 reply      
Oh great, yet another way for ad services and botnets to talk to each other...

Because yeah, that's where this is going to be used first and foremost.

Most people are not intelligent enough to understand how to secure their internet banking, and now we're going to bake in hosting TCP-connectable servers?

These security prompts had better have some really clear language and require granting permissions every time.

Now, I can see some good things for this too - start a FlyWeb server from your desktop and easily transfer some stuff from your phone, for instance (something that still sucks in 2016).

I just think that most of its use will be malicious.

4
shakna 3 days ago 0 replies      
This is actually kind of cool. The discovery features for pre-existing non-FlyWeb servers stand out to me.

Secondly, the FlyWeb server gives you access to a really flexible API for serving just about any content.

It feels like federated content; we just need to question whether it should be locked to the local network.

5
formula1 3 days ago 2 replies      
I'm not reading anything about opening ports on your router's firewall. Does this somehow circumvent that? Reading further, it seems to explicitly say "local", which probably implies your.ip.address.[0-255] is targeted.

I think this technology is intriguing, with some real use cases (more peer-to-peer), but the API seems disorganized. I can't tell if it wants to be another web standard or something different.

A part of me wants to dislike this and consider it a distasteful competitor to pre-existing technologies that have learned to survive without "the web". Another part realizes that sandboxing these technologies protects and enables the average user with regard to awesome tech. This certainly won't replace torrents, WebRTC or other existing p2p technology. But I certainly think it's a cute way of opening up the field.

6
Animats 3 days ago 1 reply      
There seem to be two parts to this. One is a way to inventory your LAN using multicast DNS and find all the web servers on it. (There may be one in every lightbulb.) The other is to run a web server in the browser. These are independent functions.

The first seems useful. The second seems to need a more compelling use case. Also, opening the browser to incoming connections creates a new attack surface in a very complex program.
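
(The first part is easy to prototype today: multicast-DNS discovery of local HTTP servers is a few lines with the third-party zeroconf package - a sketch, assuming `pip install zeroconf`; what it finds depends entirely on your LAN:)

    # Browse the LAN for services advertised as _http._tcp over multicast DNS.
    # Assumes the third-party `zeroconf` package (pip install zeroconf).
    import time
    from zeroconf import ServiceBrowser, Zeroconf

    class HttpListener:
        def add_service(self, zc, type_, name):
            info = zc.get_service_info(type_, name)
            if info:
                print("found: {} at {}:{}".format(name, info.server, info.port))

        def remove_service(self, zc, type_, name):
            print("gone: {}".format(name))

    zc = Zeroconf()
    browser = ServiceBrowser(zc, "_http._tcp.local.", HttpListener())
    try:
        time.sleep(10)   # listen for ten seconds, then exit
    finally:
        zc.close()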

7
realworldview 3 days ago 1 reply      
I'm obviously reminded of http://www.operasoftware.com/press/releases/general/opera-un.... Is Mozilla reinventing the web again, too? Will this follow in the footsteps of Persona and all those other Mozilla experiments? I would prefer less focus on such a public lab space where money is thrown around, and greater focus on reality. Sorry for the negative view, but Mozilla appears to have the same lack-of-focus problems as many other companies.

Edits: speling correctoins

8
BHSPitMonkey 3 days ago 1 reply      
Here's a little Chrome addon I just whipped up which lets you browse and launch local FlyWeb services:

https://github.com/BHSPitMonkey/flyweb-browser-chrome

9
deno 3 days ago 0 replies      
It seems the feature here is serverless discovery (mDNS); otherwise, intranet communication between apps is already possible via WebRTC.

Along that track, it would be nice to see native DHT support in the browser, for global serverless discovery.

Unfortunately, just using WebRTC is not a great fit for a DHT because of connection costs. It also makes more sense to have the DHT persist between app sessions.

10
coldnebo 3 days ago 0 replies      
"Curiously enough, the only thing that went through the mind of the bowl of petunias as it fell was: 'Oh no, not again.'"

[edit: OK, so it is cool, but I'm not sure it's secure, and I'm not crazy about web pages from other domains being able to set up local discovery on my network. Seems like a massive security problem. UUIDs sound like obfuscation, not security.]

[edit: OK, well at least they've started thinking about it: https://wiki.mozilla.org/FlyWeb#Security

Would like to see this fleshed out some more. ]

11
jlgaddis 1 day ago 1 reply      
This looks really neat and I can immediately think of several use cases for it.

I'm pretty cynical and jaded, though, so I went looking for and finally found the "Security and privacy considerations" section of the FlyWeb (draft) Specification [0]. I'm quite disappointed by what I see there -- or, rather, what I don't see.

If Mozilla is to pursue this seriously then, in my opinion, they need to follow a process similar to Internet Drafts [1]. Development of the spec should be opened up to the public, other stakeholders (browser vendors and, of course, users) should be involved, and so on.

There was a time when we could remain somewhat confident that a device behind the firewall would not be accessible from the "Internet at large". That was before UPnP [2] and rebinding attacks [3]. As I said, I can quickly think of several use cases that would be perfect for something like this... but history has clearly proven that the privacy and security implications MUST be considered at every step of the way. This beast should be tamed before it has a chance to get away.

Myself, I'm going to go ahead and add a new lockPref entry for "dom.flyweb.enabled" to my mozilla.cfg in anticipation of the day that this comes to my browser. (Of course, with Mozilla's track record, they'll probably push FlyWeb heavily for the next year or so, then just abruptly announce one day that they're killing it off.)

[0]: https://flyweb.github.io/spec/

[1]: https://en.wikipedia.org/wiki/Internet_Draft

[2]: https://en.wikipedia.org/wiki/Universal_Plug_and_Play

[3]: https://en.wikipedia.org/wiki/DNS_rebinding

12
Senji 3 days ago 1 reply      
It would be interesting to see how this plays out with WebSocket-based torrent clients.

Mesh-network torrent trackers with DHT, anyone?

13
gpsx 3 days ago 2 replies      
I think it is a great idea for devices to be web servers to allow remote devices to serve as their user interface.

I lean towards using Bluetooth as a discovery mechanism rather than WiFi. Google's "Physical Web" does something along these lines, I think, though I am not sure whether or not they are thinking about web servers on these local devices. I think that is a key part of the idea.

14
pmontra 3 days ago 0 replies      
Some tips for whoever wants to try it out.

In the desktop FF Nightly the FlyWeb menu must be picked from the customization menu (Menu, Preferences, drag the FlyWeb icon to the toolbar). I think Mozilla forgot to mention this on their page.

Another important bit of information is how to install Nightly alongside with the current FF http://superuser.com/questions/679797/how-to-run-firefox-nig...

My take on this: interesting, especially the server-side part. The server inside the browser, on the other hand, could be at best a way to drain batteries and at worst a security risk because of the increased attack surface. I wonder how locality applies to phones on a mobile operator's network vs. on a smaller WiFi network.

Anyway, if we have to rely on browsers to implement the discovery mechanism, I'm afraid that it won't fly (pun intended). I'd be very surprised if Apple, Google and even MS included this in their browsers. I've got a feeling that they might want to push their own solutions to sell their own hardware. I hope to be surprised.

Maybe there will be apps autodiscovering those services, or other servers acting as bridges to a "normal" DNS-based discovery service.

Btw: Mozilla should test their pages a little harder. I had to remove the Roboto font from the CSS to be able to read it. The font was way too thin in all my desktop browsers and FF mobile. Opera mobile was OK, it probably defaulted to Arial.

15
Matthias247 3 days ago 0 replies      
I certainly like the discovery feature through mDNS. It could be helpful in a lot of scenarios. Windows already allows something similar out of the box by showing UPnP devices in the network browser; double-clicking on them navigates to their advertised webpage. That makes it Windows- and UPnP- (instead of mDNS-) only, but it works with all browsers. Having it directly in the browser would make it available on all OSes, which is certainly also good.

I understand why they hide the real IP addresses behind UUIDs, but I think there should be an option to also convert it to the real IP/host address. Because often you want to share the address of the embedded device with your coworker, use the address in another tool, and so on.

However, I'm not sold on the idea and current state of the webserver-in-the-browser API. It just leaves a lot of questions open: e.g., pages are often reloaded - how will this impact the experience? Or: HTTP request and response bodies are possibly unlimited streams, but the simplified API does not expose this. What attack vectors are enabled through that, and how will it limit use cases?

16
greggman 2 days ago 0 replies      
Hey I know, I want to be able to connect to a device's flyweb server and give it voice commands. Oh, can't do that, accessing the mic requires HTTPS.

Sorry, I didn't mean to be snarky. I worked on a project that has some surface similarities to this (local-only server), but last year, when Chrome (and Firefox?) banned a bunch of features unless you're on HTTPS, that pretty much killed the project.

That's not to say there aren't uses without those features. It's just interesting to see Mozilla make this feature that serves pages that can't use the full range of features.

Examples:

You want to make media server but you can't go full screen

You want to use phones as wiimotes but you can't get device orientation

You want to speak into the webpage but you can't access the mic

You want to scan barcodes into the webpage but you can't access the camera

17
rjmunro 3 days ago 1 reply      
Is this compatible with Safari's Bonjour functionality?

Apple have hidden it behind flags in Preferences -> Advanced in recent versions, but when enabled, you get a "Bonjour" item in the favourites menu, which will show the internal settings websites of compatible printers etc. that are on the LAN.

18
xg15 2 days ago 0 replies      
A very fascinating idea. I agree, an autodiscovery mechanism like the one described is badly needed for local applications. This could also be used to implement captive portals or "special-purpose" wifi networks in a less harmful way than is currently done.

I don't quite understand the reason behind the random-UUID-as-hostname design, however. Yes, it protects against a service stealing another service's cookies.

But wouldn't this also result in the same service having a different hostname and origin each time it is discovered? Wouldn't this render cookies, storage and HTTPS(!) unusable for FlyWeb services?

19
adrianN 3 days ago 1 reply      
I don't really understand how this is better than just running a small webserver. The discoverability feature is nice, but I think you could do the same with a port scan on the local network. Can someone explain?
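
(For comparison, here's roughly what the port-scan approach looks like - a naive Python sketch; the subnet and port list are assumptions about a typical home LAN, and unlike mDNS it yields no service names or metadata:)

    # Naive TCP connect scan of a home subnet for listening web servers.
    import socket

    SUBNET = "192.168.1."        # assumed local network
    PORTS = (80, 8080, 8000)     # common HTTP ports

    for host in range(1, 255):
        for port in PORTS:
            s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            s.settimeout(0.2)    # keep the scan fast
            try:
                if s.connect_ex((SUBNET + str(host), port)) == 0:
                    print("web server? {}{}:{}".format(SUBNET, host, port))
            finally:
                s.close()
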
20
forgottenacc57 3 days ago 0 replies      
Examples of useful reasons to use this?
21
TazeTSchnitzel 2 days ago 0 replies      
I imagine this will be the next generation of jackbox.tv-style experiences, except with no need for the external site. That'll make these more accessible and practical.

The stuff they've done with using phone apps to play group guessing games is a lot of fun.

22
logronoide 3 days ago 2 replies      
Flyweb botnets DDoSing services in 3,2,1...
23
esafwan 3 days ago 0 replies      
Really cool. If embraced by other browsers, this could have a good impact in the IoT space.
24
alphapapa 3 days ago 0 replies      
> Enabling web pages to host local servers and providing the ability for the web browser to discover nearby servers opens up a whole new range of use cases for web apps.

That's not all it opens up. "Enabling web pages to host servers"--who thought this was a good idea?

To top it off, later in the page, they tell users how to upgrade Node by running `curl ... | sudo bash -`. Good grief, the anti-patterns!

This FlyWeb site has me seeing red.

25
cm3 3 days ago 0 replies      
Sounds like extensions that delegate text fields to native editors should be easier to write, with a better ability to expose a localhost HTTP endpoint.
26
cm3 3 days ago 3 replies      
How similar is this to Opera 12's built-in httpd?
27
ilaksh 3 days ago 0 replies      
This is awesome but if Chrome and Firefox both supported UDP then we could build things like this in userland. Is that happening?
28
chris_wot 3 days ago 0 replies      
Ok, this is cool and I think finally Mozilla have hit on something innovative. I'm going to check this out soon.
29
anysz 3 days ago 0 replies      
"Hello Flyweb"? smh blasphemous hubris
30
C for Python programmers (2011) toves.org
283 points by bogomipz  20 hours ago   143 comments top 16
1
krat0sprakhar 19 hours ago 17 replies      
I've been programming in Python for a long time and recently took an OS class which exclusively used C. Syntactic differences aside (declarations in C can get pretty hairy), the steepest learning curve while writing anything useful in C is with a) pointers and b) memory management, which this guide doesn't seem to cover.

From my experience, the best ways of learning C have been Build Your Own Lisp [0] and Zed Shaw's Learn C The Hard Way [1].

That and of course spending countless hours debugging segfaults and memory leaks.

[0] - http://www.buildyourownlisp.com/

[1] - https://learncodethehardway.org/c/

2
tedunangst 17 hours ago 0 replies      
No mention of malloc or the struct keyword? You'll probably want to learn about those before dealing with real C code.

> #define forever while(1)

> Expert C programmers consider this very poor style, since it quickly leads to unreadable programs.

So why even mention it??? There are more important subjects which could have been introduced in this space.

3
sangnoir 8 hours ago 0 replies      
Offtopic: when I saw "C for Python programmers", I immediately thought it was going to be this beaut/horror - https://twitter.com/UdellGames/status/788690145822306304
4
jstimpfle 19 hours ago 5 replies      
"Python for C programmers" would probably make much more sense (following "C for Assembly programmers").
5
payne92 17 hours ago 0 replies      
Oh how we've come full circle: in the "old days", this would be "Python for C programmers".
6
sjmulder 6 hours ago 0 replies      
I like the style this uses to explain the else-if construct. A few fundamental concepts are explained and combined in a way that makes not only else-if obvious, but also anything else built from these fundamentals.

SICP also uses that style throughout and I love that. Wish I could explain things that well.

7
gallerdude 19 hours ago 4 replies      
Soon this will be me - a freshman CS student who will be learning C in the second semester. A lot of people say it will be difficult, but there is a certain fun in building things from scratch.
8
Kip9000 17 hours ago 1 reply      
What could be really interesting for a Python programmer is Nim (http://nim-lang.org/).

Python-like syntax, statically typed, garbage collected, C-like perf.

9
nichochar 18 hours ago 1 reply      
People are criticizing this, but it is very valuable to someone like me who just wants to brush up a bit on my C, having some minor C experience and a lot of Python experience, so thanks!
10
denfromufa 16 hours ago 0 replies      
The best way for Python programmers to learn C is to dive into the CPython interpreter or C-API extensions. Also, use Cython annotation output to see how Python translates into C calls.
11
skoczymroczny 7 hours ago 3 replies      
"C does not have an support for accessing the length of an array once it is created"

Well, there is sizeof(array)/sizeof(array[0]) - though only in a scope where "array" really is an array and hasn't decayed to a pointer.

12
poseid 8 hours ago 0 replies      
Not as detailed, but at a higher level, this post on going from JavaScript to C might be interesting for some too: http://thinkingonthinking.com/learning-c-for-javascripters/
13
mi100hael 18 hours ago 1 reply      
As someone who first learned Python and then picked up C, the two biggest challenges for me were string parsing and mixed (or unknown)-type collections. It took me a good while to change my mindset, since those operations are so easy & widely used in Python.
14
jibreel 8 hours ago 0 replies      
While writing C, I found this free book helpful as a reference: http://publications.gbdirect.co.uk/c_book/
15
giancarlostoro 13 hours ago 1 reply      
Let me point out two alternatives. Cython is very Pythonic-looking, compiles to C, and has produced amazing projects including uvloop, a drop-in module for asyncio on Python 3 that speeds it up:

https://magic.io/blog/uvloop-blazing-fast-python-networking/

https://github.com/MagicStack/uvloop

Note how GitHub claims it's mostly Python code ;) That's because Cython, like I said, looks Pythonic.

There are other examples, but this is one of the ones that comes to mind most for me.

There's also D, which is called "native Python" by some. Unlike projects like Go and Rust, you can have your object-oriented programming (optional, like in C++), plus concurrency/parallelism and other goodies like built-in unit testing (when you compile your code your unit tests are run, but not included in the final binary):

http://bitbashing.io/2015/01/26/d-is-like-native-python.html

https://blog.experimentalworks.net/2015/01/the-d-language-a-...

If it's been more than a few years since you've evaluated D you might want to check it out again, it may be worth your time. D is a language I knew about for years, and recently is where I've come to appreciate it for it's many features.

D has things like array slicing, [OPTIONAL] garbage collection, and an amazing web framework called Vibe.d with its own template engine called Diet:

http://vibed.org/

https://vibed.org/blog/posts/introducing-diet-ng

Things I like: Vibe.d is not just a web framework but a full networking stack too. It also supports Pug/Jade-like templates (see Diet-NG) and compiles them when you compile your project, so your website runs off a native executable using fibers instead of threads. Vibe.d is undergoing a period where the core is being rewritten to be more compartmentalized, so that you can pick and choose which parts you need - MongoDB, PostgreSQL, layout engine and other goodies. There's even a templating library that supports Vibe.d whose syntax is based on eRuby, called Temple (though the syntax can be tweaked):

https://github.com/dymk/temple

16
avg_dev 13 hours ago 0 replies      
Is there anything like this for Rubyists?
       cached 8 November 2016 16:11:02 GMT