Hacker News with inline top comments - 22 Jul 2017
1
The non-linearity of productivity kenperlin.com
37 points by wslh  2 hours ago   9 comments top 6
1
lordnacho 28 minutes ago 0 replies      
There's feeling like you're getting things done because you're on a coding roll, and then there's feeling like you're getting very little done because you're still organising the project in your mind.

I don't think the two can be separated. Being stuck and making progress depend on each other.

A lot of the time I'm frustrated because I haven't quite figured out how the code should look. It takes a long time, a lot of chopping and changing, before you come upon a structure that you feel comfortable building on.

When you do get to that point though, it feels easy. Everything you write makes sense in terms of your new structure, and there aren't so many awkward kludges. The only thing is this period can be brief, because when you're fast, you get to the next problem sooner. You'll find higher and higher levels at which to view your code, and at each level you'll need to do some organising.

2
LeonM 44 minutes ago 2 replies      
I've struggled with this question lately. However, I think our definition of 'productivity' needs more refinement.

Currently I'm working on a new startup which tries to tackle a rather hard problem (a generic document parser for invoices). Sometimes I'm stuck for days on a particular problem and feel like I'm not productive at all; I just stare at a blinking cursor all day. However, in my head I'm constantly working on the problem, and one could say that the moments when I don't write code are actually the most productive, even though it does not feel that way. It's rather counterintuitive.

So I don't think there really is a pattern in actual productivity (at least not in my case), although there might be a pattern in how productive you feel.

3
jpster 8 minutes ago 0 replies      
I've also been digging into its non-linearity, getting a lot out of this book at the moment: https://www.amazon.com/Peak-Performance-Elevate-Burnout-Scie...
4
mswen 23 minutes ago 0 replies      
I notice the ebb and flow of productivity as well. Ken uses the term "fallow period". I like the reproduction analogy: new ideas, skills, or domain knowledge are like the sperm that finds its place in the egg, which I think of as the existing stable but receptive body of knowledge and skills.

There is often excitement at the beginning: "I have this new skill!" However, the new skill or knowledge hasn't really integrated with the existing body of knowledge yet. Then it often seems that there needs to be a gestation period, in which you have the sense that connections are being made and something new is forming, a combination of the new and the old.

And then it feels like I am ready to pour it out, whether that is writing an article, preparing a speech, or coding up that feature or application. When that happens it feels like "flow". But I understand that the gestation period was just as important, even if it was much less visible and seemingly unproductive.

5
skrebbel 36 minutes ago 0 replies      
I know I'm stating the obvious, but when I saw this title and the associated domain I immediately thought "hm, yeah, wow, my productivity is pretty much Perlin noise indeed".
6
MichaelBurge 16 minutes ago 0 replies      
Productivity is mostly linear for me, judging from non-test lines of code in my commit history. About 36 net lines/hour whether the language is Haskell, C++, or Javascript, or whether it's 3 hours or 14.

There is some variance: test code increases the ratio, UI code increases the ratio, and debugging lowers the ratio. But unless it's wildly off because of one of these, it's 36-40 lines/hour.
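A minimal sketch of one way to estimate this kind of figure from a git history, assuming Python 3 and git's `--numstat` output; the test-path filter and the single overall time span are illustrative simplifications, not the commenter's actual method:

  # Rough net-lines-per-hour estimate: added minus deleted lines per commit
  # (skipping paths that look like tests), divided by the commit time span.
  import subprocess

  def net_lines_per_hour(repo="."):
      out = subprocess.run(
          ["git", "-C", repo, "log", "--numstat", "--pretty=format:@%at"],
          capture_output=True, text=True, check=True).stdout
      timestamps, net = [], 0
      for line in out.splitlines():
          if line.startswith("@"):
              timestamps.append(int(line[1:]))       # commit timestamp (unix)
          elif line.strip():
              added, deleted, path = line.split("\t")
              if "test" in path.lower():             # crude test-code filter
                  continue
              if added != "-":                       # "-" marks binary files
                  net += int(added) - int(deleted)
      hours = (max(timestamps) - min(timestamps)) / 3600 or 1
      return net / hours

  print(round(net_lines_per_hour(), 1))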

2
The Rise of Pirate Libraries (2016) atlasobscura.com
42 points by yitchelle  4 hours ago   14 comments top 6
1
resf 1 hour ago 1 reply      
Although descriptive of the legal situation, I object to the term "pirate library". There is no practical difference between a pirate library and a brick and mortar library, except the larger stock of books online. My local library even has a photocopier.

A library is a place where you can choose a book and read it for free. That's how it's been for thousands of years. Copyright is a modern intruder and has no right to brand bona fide libraries as "pirate".

2
jackvalentine 50 minutes ago 0 replies      
The thing about these 'pirate libraries' that makes them seem so vital to me is they're made up of people far more motivated to preserve and keep available the documents than the actual copyright holders ever will be.

I draw comparisons to the vast value that was lost when Oink and What.cd were shut down.

3
krylon 1 hour ago 0 replies      
I really hate to be that person, and content-wise I find the article very interesting, but:

 "The text collections were far too valuable to simply delete," he writes, "and instead migrated to closed, membership-only FTP servers. More recently, though, those collections have moved online."
So an FTP server is not "online"? I know, I know, it is a minuscule nitpick. But for some reason this kind of thing takes my attention hostage.

And let me repeat, content-wise this is a fine and fascinating article.

4
tomohawk 2 hours ago 1 reply      
The US government is not allowed to copyright anything, and is surely a funder of many of these works. Contractors for the US government should similarly not be able to copyright work done under contract. What would happen to these copyright arguments if this became the case?
5
unhammer 1 hour ago 0 replies      
For those times when you click a link to a paywalled article, http://unpaywall.org is a pretty handy Firefox extension that will redirect to a (legal) unpaywalled alternative site, if it exists.
6
DonbunEf7 2 hours ago 2 replies      
It's only a matter of time before somebody delivers a too-cheap-to-meter DOI-to-PDF service.
3
How economists rode maths to become our era's astrologers aeon.co
58 points by kawera  3 hours ago   22 comments top 11
1
MatthiasP 22 minutes ago 1 reply      
"To put it bluntly, the discipline of economics has yet to get over its childish passion for mathematics and for purely theoretical and often highly ideological speculation, at the expense of historical research and collaboration with the other social sciences. Economists are all too often preoccupied with petty mathematical problems of interest only to themselves. This obsession with mathematics is an easy way of acquiring the appearance of scientificity without having to answer the far more complex questions posed by the world we live in."

Piketty T. "Capital in the 21st century", p. 36.

2
meri_dian 1 hour ago 1 reply      
I studied economics and philosophy in college. Did a bit of economics research for a senior thesis.

I was encouraged to study economics as a more "practical counterbalance" to philosophy. The funny thing is, I've found philosophy, in how it helped me improve the rigor of my thought, to be much more useful and practical than anything I learned in economics.

Economics needs to rely more on computation and data than on abstract mathematical models if it ever hopes to have real predictive power.

3
lottin 6 minutes ago 0 replies      
A lot of people don't get that economics isn't about making predictions, or that a statistical model's lack of predictive power has nothing to do with whether the model is correct. All a statistical model tells you is that the dependent variable will take this or that value given some values of the independent variables. So predicting the dependent variable requires you to know the values of the independent variables in the future, which means most of the time the model is completely useless for forecasting purposes. That doesn't mean that the model is useless in a general sense, or unscientific.
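A minimal sketch of that distinction, assuming Python with numpy (an illustrative example, not from the comment): the fitted model maps x to a predicted y, so forecasting y still requires knowing the future x.

  # A fitted model predicts the dependent variable y from the independent
  # variable x; it says nothing about what x itself will be in the future.
  import numpy as np

  rng = np.random.default_rng(0)
  x = rng.uniform(0, 10, 200)                  # observed independent variable
  y = 2.0 * x + 1.0 + rng.normal(0, 1, 200)    # dependent variable with noise

  slope, intercept = np.polyfit(x, y, 1)       # estimate y = slope * x + intercept

  def predict(x_future):
      # Only usable once x_future is known -- the model cannot supply it.
      return slope * x_future + intercept

  print(predict(7.5))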
4
finolex1 45 minutes ago 2 replies      
Yes, economics has not been very successful at making predictions with pinpoint accuracy, and yes, people might place too much faith in economic theory.

But the stark reality is that the existence of economics as an academic discipline is inevitable. Policymakers worldwide make decisions that impact billions of people and involve trillions of dollars, and either it simply isn't palatable to the public ear to proclaim they're going by intuition alone, or the proposals are so complex that our intuition just can't evaluate them - they need an academic foundation to stand on.

And that's where economists come in - they might quibble over more esoteric topics, or waver in giving specific recommendations, but there are areas where they can say what definitely won't work in the long run - e.g. when foreign aid might work instead of foreign trade, currency manipulation, regulation of monopolies, use of state subsidies, impacts of tariffs, etc. The academic consensus in areas like these is a lot stronger than the media make it seem.

5
fpadilha 1 hour ago 2 replies      
This article confuses top-level economists (who are de facto politicians) attributing sometimes questionable decisions to models (most of the time due to political pressure) with an honest and very complex profession at the edge of academic research. It points out that the field got so "mathy" that it kind of decoupled from non-technocrats, but that can easily be said about literally any field today.

What amazes me most is that the author complains about how that happens in the US, the most transparent and most research-based application of economics in the world. I would invite you to go to some other, less developed countries to see how "wonky" economics coupled with intellectual dishonesty tries (and many times succeeds) to fool the many in favor of the few. And again, using wonky arguments to fool people is a practice that happens in any field; dismissing an entire profession that has actually brought many advances to modern societies is ridiculous. The advances in statistics and econometrics are what made machine learning possible...

I have to say, it has been a while since I've read such nonsense trying to pass as a good analysis/critique.
6
soVeryTired 1 hour ago 0 replies      
Macro has a pretty bad reputation, but economists often point to successes of the field as a counterpoint. Noah Smith points to four areas: Matching theory, discrete choice theory, gravity trade models, and auction theory [0].

Does anyone know about these topics? Are they really as successful as their proponents claim?

[0]http://noahpinionblog.blogspot.co.uk/2017/06/is-economics-sc...

8
nabla9 22 minutes ago 1 reply      
I don't buy the argument. It confuses economics used to advance political or commercial goals with academic economics.

Economics is far ahead of the other social sciences because it uses mathematical modelling. Models make it easier to see the limits and the scope of the arguments. The ability of outsiders to clearly see oversimplifications is a strength.

The rational actor model is the best first-order approximation of behaviour. Moving to other models (and there are others) should be justified carefully.

Other social sciences have it worse.

Btw, economics is only partially a social science. Actors don't have to be humans.

9
fiatjaf 15 minutes ago 0 replies      
See also how the Austrian School of Economics does not fall into that category at all and has created sound, useful and correct economics without math, using only pure logic.
10
nickik 22 minutes ago 0 replies      
The same bad arguments against economists, repeated again and again. The people who write this are either stupid or simply write what people want to read.
11
oriol16 22 minutes ago 0 replies      
All these articles from opinionated outsiders aiming at "stimulating debate" sound extremely arrogant to academic economists. Do you really think you are telling anything new?
4
Eschewing Zshell for Emacs Shell howardism.org
96 points by brudgers  7 hours ago   29 comments top 8
1
rekado 6 hours ago 4 replies      
Another thing that's really nice about Eshell is that it has TRAMP integration. You can just `cat /ssh:remote:~/foo.log` and it will transparently spawn an SSH session, run the command on the remote, and print the output.

It's a little like translators on the Hurd.

I don't use Eshell that much anymore (I got used to `M-x shell`), but it really is nice. The only drawback is that everything must go through Emacs buffers, so in some cases performance isn't great.

2
gbrown_ 4 hours ago 1 reply      
3
kozikow 4 hours ago 1 reply      
I tried to give it a go once, and I remember constantly hitting small problems; after two days spent fighting it I went back to zsh.

The most annoying problem I couldn't find a solution for was the inability to open the majority of terminal applications - e.g. the postgres shell.

The solution I am using nowadays is:

1. Zsh->Emacs: a shell macro to open a file in Emacs using emacsclient.

2. Emacs->Zsh: an Emacs function that generates a "cd" command to the folder of the current file and adds it to my clipboard.

4
Derbasti 6 hours ago 1 reply      
Eshell was what brought me to Emacs in the first place, as it is a shell that works the same way on Linux, macOS, and Windows, and thereby saved my sanity.
5
teknico 4 hours ago 0 replies      
Speaking of cross-platform alternatives to traditional Unix shells, if you like Python try http://xon.sh . More compatible and overall usable as a shell than IPython.
6
topkekz 4 hours ago 0 replies      
How come emacs shell still does not support input redirection? You have to do

 cat file | program
instead of

 program < file

7
agumonkey 2 hours ago 0 replies      
wonderful, eshell is a nice shellish emacs tool, makes me want to go further but so far it's very potent as is. another johnw gem
8
codychan 6 hours ago 1 reply      
There is something wrong with the `eshell/x` function: if I start `Ctrl-!` in the scratch buffer, then after executing `x` at the eshell prompt I get `Wrong type argument: integer-or-marker-p, nil x` in the scratch buffer; if I start `Ctrl-!` in init.el, I get `Marker points into wrong buffer: #<marker at 955 in scratch> x` in my init.el file.
5
Dell 38 inch UltraSharp monitor anandtech.com
42 points by dgelks  3 hours ago   57 comments top 10
1
bitL 3 minutes ago 0 replies      
I wish it were available in HiDPI version, i.e. twice the resolution - it would make it way better to look at, as seeing pixels while editing video or producing audio in Live is no longer acceptable.
2
TheAceOfHearts 1 hour ago 11 replies      
I apologize if this is an incredibly stupid question, but why do these monitors cost so much? As I understand it, you can buy a good 4k TV for considerably less, so what features of this monitor make it a better deal?
3
SideburnsOfDoom 1 hour ago 2 replies      
I suspect that in a few years a HoloLens / Google Glass style device will become cheaper and work as well as a big bank of monitors, at which point it is going to rapidly replace physical monitors. Then economy of scale and iteration of the product will do the usual to the price/performance ratio of head-mounted devices, and screens will go the way of CRT monitors when flat screens came along.

I'm not saying that a head-mounted device and virtual screens will necessarily be better than a bank of monitors - in fact it's time to assess the drawbacks - but once it's cheaper and seems "just as good", businesses will want to switch over, for better or worse.

4
fsloth 1 hour ago 3 replies      
Only 60 Hz. Sigh - I hope the mainstream moves to 100+ Hz monitors soon. I love the smoothness of my far-too-expensive 'gaming' IPS 144 Hz monitor even though I don't play that much. Just the fact that web pages and mouse pointers don't lag is a noticeable improvement.
5
hultner 1 hour ago 2 replies      
I'd much rather see one with higher resolution. The DPI is quite low by today's standards.
6
sundvor 1 hour ago 0 replies      
I bought the U3415W when it first came out. At first it was for games (coming from 3x U2412Ms), but I quickly realised how incredibly good the 21:9 3440x1440 resolution is for programming. No DPI scaling needs to be involved, so I'm looking at a Visual Studio experience where, even with NCrunch unit test runners and the Solution overview, I have plenty of room for two main code editing windows. Brilliant.

I ended up buying an extra Acer X34 for home (surrounded by 2x U2412Ms on an Ergotech stand) and brought the U3415W to work as a personal device.

The 38" could potentially be even better, however I'm rather happy with the 34" as is. It's a bit of a shame they didn't add Freesync to it.

7
callumlocke 11 minutes ago 1 reply      
Is 110 ppi considered ultra sharp?
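For reference, a quick check of where a figure like 110 ppi comes from, assuming the commonly quoted 3840x1600 resolution and 37.5-inch diagonal for this panel (specs not stated in this thread); a minimal sketch in Python:

  # Pixels per inch from resolution and diagonal size (assumed specs).
  import math

  width_px, height_px, diagonal_in = 3840, 1600, 37.5
  ppi = math.hypot(width_px, height_px) / diagonal_in
  print(round(ppi))   # ~111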
8
eps 1 hour ago 1 reply      
PSA - don't get tempted by UltraSharp reviews and recommendations, like I did. Just pick an Eizo instead.

UltraSharps are widely recommended for coding, with glorious reviews and endorsements. I got one, and no matter how I adjusted it, it was still too... eye-piercing, if you will, for longer coding sessions. I got mild headaches, tired eyes and a general feeling of discomfort when working on them even for shorter periods. Then I switched to a FlexScan and it's a completely different ballgame - softer, more gentle feel, incomparably more comfortable. The best monitor I've had the pleasure of staring at in my 20 years of programming.

However the interesting part here is that both monitors use the same panel (!), so the panel itself is only part of the recipe, which is something that many reviews tend to either downplay or not mention at all.

9
IdontRememberIt 1 hour ago 0 replies      
Low DPI, and certainly Dell's traditional low-quality anti-glare filter (3h). When will they wake up and build monitors for people who work with letters and numbers (vs video or images) and favor quality over price? For a few years I think I will still have to switch every program to a dark theme as a workaround to hide the poor quality and defaults of the monitors... sad. :(
10
roselan 1 hour ago 0 replies      
Last time I tried such a giant screen, I ended up extremely frustrated with the screen-splitter feature; I prefer triple monitors for this reason.

Has there been significant progress in this area in the last few years?

6
YouTube Will Redirect Searches For Extremist Videos To Anti-Terrorist Playlists tubefilter.com
56 points by lainon  5 hours ago   39 comments top 11
1
IIAOPSW 1 hour ago 4 replies      
I hate the fact that if I voice anything skeptical of this policy someone will call me a terrorist sympathizer. I'm of the conviction that the idea of blowing yourself up is bad enough that we don't really need to censor it. Furthermore I've noticed that the public largely stops thinking the moment "terrorists" or "ISIS" is mentioned. We now live in a state where we can't move ~5000 USD without informing the government because "anti-terrorism money laundering laws". We can't mail certain food products into the US because "bio terrorism act of 2003". Our FBI wants to back door encryption because "ISIS". Notice a pattern yet?

Will it stop with ISIS? Once the precedent is set, how long will it be before other search results are manipulated for political reasons?

And its not like there's any side of the political spectrum I can vote for. Both Hillary and Trump wanted to censor extremism on Twitter. Both wanted to expand the no-fly list into a no-buy-guns list (constitutional implications be damned). Left, right, center and abroad the civil libertarian is under siege.

2
freeflight 2 hours ago 3 replies      
This recent trend is really worrisome. YouTube also shut down quite a lot of channels in connection with the Syrian civil war. Reddit too closed down a couple of subreddits, and admins banned users who shared ISIS-related content on the /r/syriancivilwar/ subreddit.

What an utterly useless and counterproductive exercise in de facto censorship. The only thing this does is drive people deeper into the underground, even closer to actual extremists.

Extremist ideas need to be fought with rational discourse: drag them out into the open and let the sunshine disinfect them. Trying to hide them out of sight, as if they don't exist, only makes them fester all the more in secret and partly legitimizes these extremists in their view of being unjustly persecuted.

3
alsadi 18 minutes ago 0 replies      
This is very scary. They control what we see and only show us what they want us to see. Some parties use the terrorism excuse to gain more control. They play on people's fears to take away their freedom and choice.

What they believe to be politically right is now what is right. This is the dark-ages church stoning witches, and if you say something against it then you support black magic and get stoned.

4
BartSaM 3 hours ago 2 replies      
And who will decide what counts as "extremist videos"? This is a dangerous game YT is playing now. Who will decide who is a freedom fighter and who is an extremist? Who is good and who is bad?
5
kovalevlad 3 hours ago 2 replies      
That's all very well but perhaps they should also stop deleting Russian opposition videos/channels and justify that by claiming they contain extremist content when there is none. #FixRussianYoutube - https://www.youtube.com/watch?v=DZwAVsAgsLQ&t=190s
6
thinkloop 1 hour ago 0 replies      
It's hard to tell what "redirect to playlists" means - will the relevant videos be completely censored, or are they adding a play-list of the opposing view in addition to the results? The latter seems like a nice compromise.
7
Synaesthesia 3 hours ago 2 replies      
I wonder if white supremacist videos are gonna count as terrorist https://theintercept.com/2017/07/06/facebooks-tough-on-terro...
8
avaer 44 minutes ago 0 replies      
I'd even be fine with this if it were applied equally to US politicians calling for executions and sermons threatening people with damnation.

This isn't about actual extremism though; it's simply about getting off of YouTube the things that cannot be sold for advertising dollars.

9
oelmekki 2 hours ago 1 reply      
Which also means journalists won't be able to access those videos as sources. Although, I'm not ready to call that a bad decision, because propaganda is indeed the biggest problem with ISIS.

This is just treating the symptom, obviously (why do young people ever consider joining a terrorist group? Certainly not through youtube videos as the only factor), but also treating the symptom is ok if we try to solve the deeper problem at the same time (unemployment? defiance resulting from corruption?).

10
gaius 49 minutes ago 0 replies      
There is already suspicion that Google manipulated results for political ends: https://www.theregister.co.uk/2016/05/31/google_axes_eu_refe...

You might think "good, it was a stupid idea anyway" but what if next time it's something that you care about? It's a dangerous precedent to allow to be set.

11
logicallee 1 hour ago 1 reply      
what's interesting is that Google has been doing this for some time! For the longest time I was curious what would happen if you did a specific but obviously wrong search such as "how do I poison someone and get away with it" or "how do I join ISIS without being caught" or something. Well as you can imagine for obvious reasons I did not do those searches! (To spell it out, I did not want them in my search history, and also they show pretty clear criminal intent, and also we have clandestine agencies who presumably watch for such things perhaps with Google's cooperation, etc.)

Well some time ago I had the brilliant idea that I could test an "obviously wrong" search to see what kinds of results it would give.

So I'm male, and I decided to search "How do I trick a man into thinking he's the father" or something (as though I'm pregnant) which is pretty clearly wrong but which I'm obviously immune to the idea of intent for. (To spell it out, because males obviously don't get pregnant.)

I expected search results like forum discussions, a Yahoo Answer question phrased with that exact word, etc. You know, same as if you Google any other specific question like that.

Well that's not what I got at all. Despite my very specific question phrased something like the above, ALL of the top ten links were to pages about "paternity fraud" - which I didn't even know was a thing. (I just thought it was a shitty thing to do, and anyway could always be played off as a genuine mistake.)

So I instantly learned that what I was Googling was fraud, and closed the page without clicking any of the links to learn more. But my reaction was: well played, Google!!

If I had actually started out with that thought, it's likely I would have abandoned it after that search.

To be clear, Google did NOT answer my question directly (even though no doubt there are tons of pages that would have answered it exactly as asked), instead teaching me why it's wrong.

I was very impressed. I can only imagine the same would be done for some of the worse types of queries someone can do.

-

Note: just to make this gender-neutral: if I were a woman then to do my experiment I could have Googled something like "how long can I trick her into having sex with me if I got a vasectomy and she is trying to get pregnant" or something, which obviously cannot be a genuine question by a woman.

7
ARKit + CoreLocation [video] twitter.com
20 points by gfredtech  3 hours ago   8 comments top
1
ice109 1 hour ago 2 replies      
Is this a trick, or is geolocation on iPhones better than the 5m typically quoted?
8
Darknet Messenger Briar Releases Beta, Passes Security Audit briarproject.org
133 points by mwheeler  10 hours ago   48 comments top 14
1
bartread 0 minutes ago 0 replies      
I love the fact that this is a "darknet" messenger service called Briar: "Black Briar". Now where have I heard that before?
2
slim 3 hours ago 0 replies      
I love the fact that the "build from source" section is for everyone, not just developers. It's illustrated with screenshots

https://briarproject.org/building.html

3
tptacek 7 hours ago 2 replies      
It's ironic that this update plays up how Briar "hides metadata" when the audit found that the application deanonymizes its users by exposing DNS lookups during RSS updates.
4
softwarelimits 2 hours ago 1 reply      
"Darknet" is a brainwashing propaganda term.Please do not use it, thanks.
5
captainmuon 6 hours ago 3 replies      
This looks interesting, but I wonder how safe it is in the stated use case of journalists and activists in an authoritarian country. It can use Tor, which hides whom you are communicating with, but the fact that you are using Tor at all sticks out like a sore thumb.

The authorities probably just have to flip a switch to put you under closer surveillance if they see you use Tor. Or they'll just send someone to your registered address and see what's going on.

What I really think would be cool would be a protocol based on massive steganography and obfuscation. You would have kernels which tell it how to wrap data in an innocent-looking container (HTTPS traffic, SMTP, IRC, cat pictures and recipes over plain HTTP, DNS, ICMP pings, ...). Ideally, you would have dozens. And they would be shareable between nodes. You could define them in a DSL, and make them sandboxed and provable (that they round-trip, i.e. can decode what they encode, and terminate properly - that restricts what you can do in them though). You could even autogenerate the kernels. The last two points would require a bit of R&D of course.

The goal would be to be able to create new "protocols" faster than authorities can learn to detect them. Then wrap a regular encrypted protocol in this obfuscation layer.
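A minimal sketch of the "kernel" idea, in Python; the recipe-shaped container and the round-trip check are made up for illustration and are not part of any existing tool:

  # Toy obfuscation "kernel": wraps bytes in an innocent-looking container
  # and recovers them, with the round-trip property described above.
  import base64

  class RecipeKernel:
      """Hides a payload inside fake recipe text (purely illustrative)."""

      def encode(self, payload: bytes) -> str:
          blob = base64.b32encode(payload).decode()
          lines = ["Grandma's biscuit notes:"]
          for i in range(0, len(blob), 16):
              lines.append(f"  step {i // 16 + 1}: knead batch {blob[i:i + 16]}")
          return "\n".join(lines)

      def decode(self, container: str) -> bytes:
          chunks = [line.rsplit(" ", 1)[-1]
                    for line in container.splitlines() if "knead batch" in line]
          return base64.b32decode("".join(chunks))

  def round_trips(kernel, sample=b"hello from the other side") -> bool:
      # The provable property the comment asks for: decode(encode(x)) == x.
      return kernel.decode(kernel.encode(sample)) == sample

  print(round_trips(RecipeKernel()))   # True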

6
Tepix 6 hours ago 0 replies      
It's not yet available via F-Droid - is that planned?
7
hamandcheese 7 hours ago 2 replies      
As far as the audit, I feel like 13 days is surprisingly short. I base this on my experience getting new jobs and familiarizing myself with new code bases. Maybe I'm slow.
8
raymond_goo 3 hours ago 0 replies      
Can someone explain how it deals with routers and NAT? Does it use UDP hole punching?
9
siberianbear 8 hours ago 2 replies      
I downloaded the beta and installed it, but I guess I need to physically find a friend who also installed it. I'm not in Silicon Valley, so I doubt that will happen soon....
10
JoeCoder_ 10 hours ago 4 replies      
Why not develop tox instead, which is open source, end to end encrypted, on more platforms, and seemingly further along in general?
11
lawnchair_larry 4 hours ago 0 replies      
As someone who does professional security audits, I would just like to say that there is no such thing as "passing" a security audit. In fact, most pen testing shops will carefully dance around actually making that claim in writing for a customer, because they know they are going to look bad when a bug is inevitably found in code they reviewed (and it's probably a dumb idea for liability reasons too).

There are certain certifications with falsifiable conditions that can be marked pass/fail. But, as I'm sure many folks here are aware, these are incomplete and often completely dubious. They don't purport to be "security audits".

What a real security audit tells you is that of the (probably 2-4) consultants that looked at a product for a few weeks (probably 2-6), these were the security bugs they found.

That alone contains little information, because the skill level and domain expertise varies greatly among consultants and companies. I can guarantee that if these results were withheld, and they gave the same codebase to another reputable outfit, the set of findings would be very different. There would likely be some overlap, particularly in the most obvious types of bugs, but bug hunting is way closer to art than science.

I know nothing about this project, and my intent is not to create doubt, but users of secure messaging apps should understand what an audit is and what it isn't.

Like other commenters, I was surprised to see 3 days of looking at crypto. It could be that the crypto is extremely simple and uses a few well understood APIs in a straightforward way, so this isn't a guaranteed red flag by any means, but it's a bit unusual.

And like any software, this is a 1 line patch away from being blown wide open. With every commit, an audit becomes increasingly meaningless. Just ask cperciva!

And perhaps I'm being cynical, but I always felt like the "conclusions" section of the audit report has an unspoken purpose of walking back from calling their baby ugly and keeping a decent rapport to ensure the possibility of future business. Not that I think what Cure53 wrote was not genuine, but there are natural incentives to be a little generous there. Again, I'm speaking from experience writing those sections as well.

Edit: Basically what tptacek said.

12
pasbesoin 10 hours ago 0 replies      
More "Darknet". I almost passed this by.

I'm glad I took a peek. This is actually interesting to me.

...Briar is a secure messaging app for Android.

Unlike other popular apps, Briar does not require servers to work. It connects users directly using a peer-to-peer network. This makes it resistant to censorship and allows it to work even without internet access.

The app encrypts all data end-to-end and also hides metadata about who is communicating. This is the next step in the evolution of secure messaging. No communication ever enters the public internet. Everything is sent via the Tor anonymity network or local networks.

13
kobeya 10 hours ago 2 replies      
Feedback: darknet has come to mean places where you go buy drugs online, not p2p applications generally.
14
baby 10 hours ago 3 replies      
"passes security audit". Is security audit an exam? What does passing mean?
9
Movidius launches a $79 deep-learning USB stick techcrunch.com
87 points by rajeevk  9 hours ago   26 comments top 7
1
nl 27 minutes ago 0 replies      
It's surprising how much attention this has had over the last few days, without any discussion of the downside: it's slow.

It's true that it is fast for the power it consumes, but it is way (way!) too slow to use for any form of training, which seems to be what many people think they can use it for.

According to Anandtech[1], it will do 10 GoogLeNet inferences per second. By very rough comparison, Inception in TensorFlow on a Raspberry Pi does about 2 inferences per second[2], and I think I saw AlexNet on an i7 doing about 60/second. Any desktop GPU will do orders of magnitude more.

[1] http://www.anandtech.com/show/11649/intel-launches-movidius-...

[2] https://github.com/samjabrahams/tensorflow-on-raspberry-pi/t... ("Running the TensorFlow benchmark tool shows sub-second (~500-600ms) average run times for the Raspberry Pi")

2
oelmekki 2 hours ago 0 replies      
It took me a while to find how it interfaces with the system (driver? dedicated application? just drop model and data in a directory which appeared on mounted key?), so I'll post it here.

To access the device, you need to install an SDK which contains Python scripts that let you manipulate it (so it seems to be a driver embedded in utility programs). Source: https://developer.movidius.com/getting-started

3
legolassexyman 9 hours ago 1 reply      
> Movidius's NCS is powered by their Myriad 2 vision processing unit (VPU), and, according to the company, can reach over 100 GFLOPs of performance within an nominal 1W of power consumption. Under the hood, the Movidius NCS works by translating a standard, trained Caffe-based convolutional neural network (CNN) into an embedded neural network that then runs on the VPU.

This is sure to save me money on my power bill after marathon sessions of "Not Hotdog."

4
sillysaurus3 9 hours ago 2 replies      
So what can you do with a deep-learning stick of truth?

EDIT: Looks like the explanation is in a linked article: https://techcrunch.com/2016/04/28/plug-the-fathom-neural-com...

How the Fathom Neural Compute Stick figures into this is that the algorithmic computing power of the learning system can be optimized and output (using the Fathom software framework) into a binary that can run on the Fathom stick itself. In this way, any device that the Fathom is plugged into can have instant access to complete neural network because a version of that network is running locally on the Fathom and thus the device.

This reminds me of Physics co-processors. Anyone remember AGEIA? They were touting "physics cards" similar to video cards. Had they not been acquired by Nvidia, they would've been steamrolled by consumer GPUs / CPUs since they were essentially designing their own.

The $79 price point is attractive. I wonder how much power can be packed into such a small form factor? It's surprising that a lot of power isn't necessary for deep learning applications.

5
visarga 4 hours ago 0 replies      
Interesting applications for drones and robots. The small form factor and low energy requirements are the key.
6
tuxracer 7 hours ago 3 replies      
Really disappointing there doesn't appear to be a USB-C option
7
j_s 8 hours ago 0 replies      
Currently out of stock as best I can tell.
10
Show HN: DeepForge - A Modern Development Environment for Deep Learning deepforge.org
46 points by williamtrask  7 hours ago   6 comments top 2
1
BucketSort 5 hours ago 2 replies      
I love these initiatives and would love to learn about any other end-to-end deep learning architectures. It took us a long time to build up our in-house tagging infrastructure and whole model warehouse + inference servers that are connected to Kafka. I'd love something more polished though. It's a killer when you want to do ML in the real world, but have to spend forever rolling out infrastructure.
2
Kiro 2 hours ago 2 replies      
OT but I want a service where I can upload a set of images, manually tag them with a few categories and then let some kind of AI tag the rest of them based on the tags I put on the dataset. I'm not looking for a general image recognition engine. The tags are pretty abstract things like "enjoyable".
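A minimal sketch of the kind of workflow being described, assuming Python with numpy and scikit-learn; the random vectors stand in for embeddings from some pretrained image model, and nearest-neighbour classification is just one simple way to spread a handful of hand-made tags like "enjoyable" to the rest of a set:

  # Propagate a few hand-applied tags to untagged images, assuming each image
  # has already been turned into a feature vector by some pretrained model.
  import numpy as np
  from sklearn.neighbors import KNeighborsClassifier

  def propagate_tags(tagged_vecs, tagged_labels, untagged_vecs):
      # tagged_labels are the hand-made categories, e.g. "enjoyable" / "not".
      clf = KNeighborsClassifier(n_neighbors=3).fit(tagged_vecs, tagged_labels)
      return clf.predict(untagged_vecs)

  # Toy demo with random "embeddings" standing in for real image features:
  rng = np.random.default_rng(0)
  tagged = rng.normal(size=(20, 128))
  labels = ["enjoyable"] * 10 + ["not enjoyable"] * 10
  untagged = rng.normal(size=(5, 128))
  print(propagate_tags(tagged, labels, untagged))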
11
The Tyranny of Other Peoples Vacation Photos (2016) nytimes.com
48 points by prostoalex  7 hours ago   26 comments top 6
1
jasode 2 hours ago 2 replies      
The NYT article about over-sharing vacation photos can be generalized to the widespread human nature of narcissism: people share too many personal highlights that most others don't care about.

E.g. new mother shares endless stream of baby photos while most of her 20-something ex-college friends/coworkers are oversaturated with it. (E.g. the sentiment that if you've seen one blob of an undifferentiated poop machine in diapers, you've seen them all.) Yes, the infant is The Most Interesting Person In The World to that mother so it seems logical to her that others would find her posted photos to be interesting too.

On a large scale, we simply have no self-awareness about our narcissism and its banality to others.

Long before the existence of the internet and Facebook, in the days of film photography, hosts would torture their dinner guests by having them gather in the living room and flip through slides[1] of the hosts' vacation. The guests would silently suffer through the boring presentation, but of course etiquette means nobody would dare say, "Joe, this is boring and we're going home now." Polite society, on a large scale, self-censors the expression of boredom, which perversely continues the large-scale unawareness that sharing too many personal highlights is subjecting friends to cruel & unusual punishment.

[1] https://www.google.com/search?q=kodak+slide+carousel+slide+p...

2
jasonkester 3 hours ago 3 replies      
It's a shame in a way that mobile internet has become so ubiquitous, and that people tend to therefore spend so much of their time on the beach looking at their phone.

One of the nice things about being away is that you're away. That you're forced out of your routine, unable to keep up with the same day-to-day stuff as usual, and cut off from your friends. It forces you to make new friends.

Travel for me used to be a good hack to overcome a natural tendency toward introversion. At some point, after not speaking to a human for a few days, I would go into Emergency Survival Social Mode, where I had to strike up a conversation with the person at the next table right that minute just to preserve my sanity. And nearly every time I'd find that that person was going through the same thing and was thus happy to chat about whatever for a while.

That's a lot harder to do now, when everybody sitting alone at a table has their phone out, facebooking away with their friends at home as per normal.

I don't know what the solution will be. Hopefully the pendulum will swing back and people will realize that it's nice to disconnect for real from time to time. Especially when there is real sand between their toes and a real sunset going on in front of them (and a real girl sitting at the next table).

3
eludwig 2 hours ago 1 reply      
Great article. I laughed. I wonder if this is mostly a matter of indiscriminate visibility and a lack of friction?

In the old days, there was that moment when visiting someone's house (friend, relative maybe) and the topic of family vacations came up. Then someone suggests the slide projector. You could almost feel the silent "uggh" from the people who were not on the vacation. The worst part would be the unknown length of the slide show. (ten minutes? two hours?) But setting up the slide projector was a fair effort, so it could easily be put off "until next time you come over" (phew! Dodged a bullet!)

Now you just pop them up on FB and they are there for all to see. Compound that with the fact that FB's sharing controls are so complicated that actually just sharing with a particular subset (immediate family, grandparents, etc) is nigh impossible for ordinary mortals and there you are! Bazillions of terrible family photos a click away, or worse: a scroll away, no click needed.

Physical photos did have some advantages: at least they could be avoided! ;D

4
noisy_boy 2 hours ago 3 replies      
Deleting my Facebook account has been one of the best decisions I've ever taken. It has helped avoid a lot many negative emotions.
5
amelius 2 hours ago 0 replies      
I'd like to see an experiment where people are forced to post their bank balance along with their vacation photos.
6
elorant 3 hours ago 0 replies      
When this thing runs its course, a lot of people are going to feel miserable and depressed.
13
Dotsies (2012) dotsies.org
42 points by tosh  7 hours ago   13 comments top 9
1
dzmitry_lahoda 1 hour ago 0 replies      
Title should be `Dotsies - dot based read optimized font (2012)`.
2
rcarmo 38 minutes ago 0 replies      
Reminds me of Marain, which I actually wish was in use:

http://www.omniglot.com/conscripts/marain.htm

3
c517402 1 hour ago 0 replies      
It seems like the correspondence should be something other than straight a-z. Something like making the vowels or the more common letters more distinctive. E.g., make the vowels the lightest weight, that is, make aeiou use the single-dot letters Dotsies uses for abcde.
4
vortico 3 hours ago 0 replies      
I was about to pass this off until I started reading the sample near the bottom of the page that gradually teaches you. It actually amazed me that I could sort of read it halfway through. I can't get to the end though, but at least I have an idea of the difficulty level of reading it naturally.
5
vinchuco 3 hours ago 1 reply      
Why not follow the 'natural' scheme of binary? A picture is worth n words:

http://i.imgur.com/3XIMXcD.png (added the 0-9 digits for emotional effect)

I can't seem to put my finger on what makes a scheme more 'readable' than another.

Edit: Reminds me a lot of Chinese, but in this case there's a clear procedure to decode glyphs as a word!
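A minimal sketch of that 'natural' binary mapping, in Python (an illustration of the proposed scheme, not how Dotsies actually maps letters):

  # Render letters as 5-bit binary dot patterns: 'a' = 1 = ....#, 'b' = 2 = ...#.
  def to_dots(word: str) -> list[str]:
      patterns = []
      for ch in word.lower():
          n = ord(ch) - ord('a') + 1            # a=1 .. z=26 fits in 5 bits
          bits = format(n, '05b')
          patterns.append(bits.replace('0', '.').replace('1', '#'))
      return patterns

  for row in to_dots("binary"):
      print(row)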

6
johnnytieszoon 4 hours ago 1 reply      
I learned it back then. Used it for a while as a privacy feature for my phone.
7
duckwho 1 hour ago 0 replies      
Korean is structured this way
8
jensenbox 2 hours ago 1 reply      
I cannot tell if this is a practical joke or what. At first, I tried to learn it in earnest but then I fell totally apart and virtually threw my hands up. Either I have some sort of learning disability or this is there to simply waste my time.

Is this a real thing?

9
baalimago 3 hours ago 1 reply      
Image-like alphabets and languages are harder to process, take more time, and are not effective.

Signs need to be distinct.

14
What's so hard about histograms? tinlizzie.org
56 points by robertkrahn01  7 hours ago   9 comments top 5
1
lukego 2 hours ago 1 reply      
What a beautiful presentation!

Tangentially: I am really enjoying the book "All of Statistics" as a reference for better understanding things like histograms, kernel density functions, etc, and their parameters.

https://www.amazon.com/All-Statistics-Statistical-Inference-...
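Tangentially relevant, a minimal sketch (Python with numpy, illustrative only) of one parameter that makes histograms tricky: the same data looks quite different depending on the bin count.

  # Same data, different bin counts: the apparent shape of a histogram
  # depends heavily on this one choice.
  import numpy as np

  rng = np.random.default_rng(1)
  data = np.concatenate([rng.normal(0, 1, 500), rng.normal(4, 0.5, 200)])

  for bins in (4, 12, 40):
      counts, _ = np.histogram(data, bins=bins)
      # Crude text rendering: one bar per bin, scaled to the tallest bin.
      bars = ["#" * int(8 * c / counts.max()) for c in counts]
      print(f"{bins:>2} bins: " + " ".join(bars))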

2
agumonkey 2 hours ago 0 replies      
3
acbart 4 hours ago 1 reply      
In my introductory programming class, we teach a few basic forms of chart visualization. By far, students struggle the most with histograms. Even more frustrating, they love line plots and attempt to use them everywhere, despite my explanations that you can almost always use histograms and can almost never use line plots! Yet they go with what they find more intuitive...
4
wodenokoto 4 hours ago 0 replies      
Is there a way to read this decently on mobile?

I've tried Firefox reading mode as well as pocket but they both cut off large parts of the text.

5
RodericDay 1 hour ago 1 reply      
> We notice that you're not using the Google Chrome browser. You're welcome to try continuing, but if some parts of the essay are rendering or behaving strangely, please try Chrome instead.

what a world

15
Pascal at Apple fogus.me
108 points by janvdberg  13 hours ago   78 comments top 14
1
kabdib 11 hours ago 4 replies      
I joined Apple in 1987, about the time they started ditching Pascal in favor of C. C++ (in the form of CFront) was just starting to be a thing.

Apple's Pascal had been extended to the point where there were few true differences between it and C, other than

- strings with busted semantics (size being part of the type being a huge mistake, leading to a proliferation of types like Str255, Str32, Str31, Str64, etc). I should add that C's strings were semantically busted, too, and in more dangerous ways. No way to win :-)

- nested procedures (not terribly useful in practice, IMHO)

- an object syntax, used for Object Pascal and MacApp (a complete, though large and somewhat slow app framework).

- some miscellany, like enums and modules

Apple extended Pascal pretty extensively, adding pointer arithmetic, address-of, variant function calls, and a bunch of things I've forgotten. I could write some Pascal, then write some C, and squint and they'd look pretty much the same. Most people shrugged and wrote new code in C if they were able, and then moved to C++ when CFront became usable.

2
thought_alarm 11 hours ago 2 replies      
I'm reminded of a really great interview with Bill Atkinson where he describes (among many other things) how he initially brought Pascal to Apple and the Apple II.

https://youtu.be/6tUWoy1tJkE?t=45m

The Pascal bits are from 45:00 to about 50:00.

 ... My manager at the time said, no, we don't want to do this [Pascal], people are happy with what they got. I overrode him and went to Jobs, and Jobs said "Well, I'm not convinced. I think our users are happy with BASIC and assembly language. But you seem passionate about it. I'll give you one week to prove me otherwise." I was on an airplane within two hours down to UC San Diego and I started porting right away. ... The other thing that happened then is I had to plug in the disk routines, and their system was pretty big and that little 13-sector floppy disk didn't have a lot of capacity. Well, Woz had just come up with a different way of encoding the data on the disk so that we could get more data for the same disk size, and we needed the 16-sector disk routines. And so Woz came down, and I was there... I had never bothered to get a motel because I slept on the bench when I wasn't working. This is in the computer science lab at UC San Diego. I was busy, I didn't have time to go sleep. But Woz came down, and I got to interact with him and it was really fun because he was working on installing these 16-sector disk driver routines, and he'd go 'type type type type type' -- and he didn't type in assembly language and have it assembled. No, he'd type in 6502 machine code. Hex. -- He'd type in hex, and then, you know, watching him type and he'd go 'type type type' -- pause -- 'type type type type', and when he finished I asked him what was the pause? And he said "forward branch, seven instructions, I had to compute the offset before I continued". So, he didn't back-patch the offset, he actually looked at what he was going to be typing, knew how many bytes it would take... he was brilliant.

3
rcarmo 5 hours ago 4 replies      
I wrote a fair amount of Pascal in my 680x0 Mac days, both in MPW (the Macintosh Programmer's Workshop) and THINK Pascal. Back then Modula-2 was available on VAXen and "big" machines, but Pascal was almost "portable" across Mac/PC/VAXen and was amazingly fast, so it was pretty fun.

I eventually moved to C (also using THINK C - see the retrospective link below for a sample of those heady times) and never looked back until a couple of weeks ago, when I set up Lazarus for my kid to play with (there are too many Python GUI development options, and none halfway as good).

Lazarus is _amazing_ (if somewhat odd in today's world), and I really wish we had more IDEs like it instead of all the crappy Electron/web approaches for building desktop apps. It builds fast, tiny, entirely native apps in less than a second, and is an excellent example of how far Pascal went despite falling off the mainstream wagon.

(If anyone knows of anything like it for cross-platform desktop apps, let me know, I'd love to try it out)

- Link about early C dev on the Mac, that also mentions MPW and Pascal in passing - https://retrocomputing.stackexchange.com/questions/3213/what...

4
cjensen 12 hours ago 7 replies      
I really miss Pascal; it was a great and safe language for beginners. As it was extended with Objects and Modules, it was great for development.

But there are good reasons it was surpassed by C. In early Pascal, you got a pointer by allocating memory; you could not get a pointer to an existing variable. You'd be surprised how often that gets in the way when implementing a data structure. Just try to implement the following function in C without using the address-of operator:

  struct list { struct list *next; };   /* minimal node type for the example */
  struct list *head = (void *) 0;

  /* Append entry by walking pointer-to-pointer links -- not expressible
     without the address-of operator. */
  void push_back(struct list *entry)
  {
      struct list **p = &head;
      while (*p != 0)
          p = &(*p)->next;
      *p = entry;
  }
Pascal got better. But once you've switched to C, the sheer verbosity of Pascal is bothersome. Instead of "{" and "}" Pascal uses "begin" and "end". It uses "procedure" or "function" to introduce a function.

There's no going back, but I wish it was still available for learners. Java is comparable in terms of programmer safety, but has too much ridiculous boilerplate just to write "hello, world".

5
dfan 9 hours ago 1 reply      
In order to run Apple Pascal on my Apple ][+, I had to buy a "language card". This was bigger than an index card (maybe 3 by 6 inches) and added sixteen whole kilobytes to your computer's RAM, beefing it up to a massive 64K and rendering it capable of running such a system hog as Apple Pascal. I think it was about a hundred bucks in the early 1980s.

Meanwhile, the Apple ][+ could only display 40 columns on screen, where of course by "screen" I mean "television". (You could buy another big card to give you enough memory to display 80 columns at a time, but who had the cash to make another huge purchase like that?). Of course, 40 columns isn't enough to write in a structured programming language with indentation like Pascal, and in fact the Pascal program itself supported logical lines of up to 80 characters.

This issue was resolved as brilliantly as you might expect. You could toggle between looking at the left half of your program (cut off at the 40-character mark) or the right half. I'm not kidding.

6
robterrell 10 hours ago 3 replies      
It may seem quaint now, but Apple Pascal was a serious tool. I took AP Computer Science in 1985 and the language taught was UCSD Pascal on the Apple ][+. (In the 80's, C on an Apple ][ was impossible. The only C compiler you could get was for a card that went in the expansion slot that included a Z80 processor.)

When I went to college in 1986, Pascal was the primary language used in all entry-level courses at Virginia Tech. (Turbo Pascal on an IBM PC -- $5 at the student stores, if you brought your own floppy. I'm the weirdo who brought a Mac Plus to school and used Lightspeed/Think Pascal.)

All of the classic Mac APIs used pascal calling conventions. Pascal continued to be the language used for serious Mac development for a long time.

I can't find any references via Google, but Apple had an internal language called "Clascal" which I was told was "Pascal with classes". Eventually Think Pascal adopted this object-oriented Pascal syntax.

Just today I was thinking about how great it was coding in Lightspeed Pascal, when I was trying to get VS Code to display ligatures. Lightspeed Pascal parsed the AST and auto-formatted all your code for you. Tabs became tab stops, like a word processor. I still miss that; hard to believe today we're still fighting about tabs v. spaces.

7
swombat 38 minutes ago 0 replies      
I learned to program with Turbo Pascal on my PC back in the early 90s. Language lite is fun.

And yet, when I clicked on this, part of me was really just hoping it was referring to Nvidia's Pascal architecture, a hint that maybe they were finally dropping the Radeon line and getting some decent video cards into their machines.

One can but dream I guess.

8
jzelinskie 11 hours ago 3 replies      
I've been reading a lot about Niklaus Wirth recently. I read an interesting piece about Oberon I found in an HN archive[0] that mentions Oberon usage on Macs. I'm very tempted to buy "The School of Niklaus Wirth: The Art of Simplicity" after reading a few things about him. I wish there were more instances of "computing in a vacuum" like at ETH.

[0]: https://news.ycombinator.com/item?id=10058486

9
malkia 5 hours ago 0 replies      
I started with BASIC and some assembly (CALL -151) on my Apple ][ clone (Pravetz 8C), but as soon as I got my hands on an IBM PC/XT (or AT), Turbo Pascal (the 30-40kb turbo.com) was just the right choice. It fit on one disk with plenty of room to spare, while a Microsoft C/C++ compiler and linker each took a whole separate disk.

The thing I loved best was the .TPU files (though I'm not sure whether it was TP4 or TP5 that truly had them). There were no .h files to include and no .lib (.a) to add; it just worked magically well (with some limitations).

I moved to C/C++ later for, well, a stupid reason. I was writing a "File Manager"-like app for DOS (just a single column, not like Norton Commander, FAR or Midnight Commander), and the only function in Turbo Pascal 5.0 for moving files could only rename a file within the same folder... Had I known about inline assembly and been more brave, I would've stayed in Pascal Land (and I was already familiar with Ralph Brown's Interrupt List)... But hey, this stupid reason moved me to C/C++, where the builtin function did it... and soon after that I started using more and more inline assembly.

I love C/C++ now (especially C), and where I used to be really good at Pascal, I might have some hurdles reading Pascal code today. Delphi was my last stop, and while I like it, I switched to video game development, and Pascal was not much used there (Age of Wonders, I believe, was written in some form of Pascal, and possibly some other games... Also part of Xoreax's IncrediBuild might have been, especially the part that does the C/C++ header processing - I think it was, since we had issues with it and while debugging I found something Pascal-ish in there, but I don't remember now).

10
SwellJoe 8 hours ago 0 replies      
My first programming language was (obviously) BASIC, but my second was Pascal. I took AP computer programming in high school and it was taught with Pascal on Apple II and IIe computers. My dad later bought me Turbo Pascal for the PC (I remember a yellow box with "+ Objects" on it, so it must have been 5.5 Pro in 1989), and I used it on his machine, but never did much with it other than tinker. I finally got what I viewed as a real programming setup when I got DICE (Dillon's Integrated C Environment) for my Amiga a couple years later. Still didn't do much more than tinker, though, until I got a Linux box a couple years after that and source code for everything was available for poking at.

Anyway, Pascal was very common in education back then and Apple was very common in education... ergo, Apple and Pascal went together a lot of the time.

11
Animats 6 hours ago 0 replies      
The page isn't rendering properly with ad blocking. The original memo is being served from Storify. Where is it from? The Internet Archive. Here's the original, which reads better directly from the Archive.[1]

[1] https://archive.org/details/Apple_Pascal_History_DTC_1992

12
Lerc 5 hours ago 0 replies      
Does the p-code compiler self-host? I've been working on an emulator for an imaginary 8-bit machine (AVR instruction set), and have been looking for language options to run on it.

In the last few days, I've gotten FreePascal compiling for it, but I would also like to have languages that I can compile or interpret on the machine itself.

13
carapace 11 hours ago 0 replies      
Please, everybody, if you haven't read it, stop now, get a copy of "Humane Interface", spend the weekend reading it, and then come in Monday and apply it.
14
EvanAnderson 12 hours ago 1 reply      
When opened w/o Javascript you see only the first paragraph and the timeline at the bottom. I almost skipped over this because I thought there wasn't anything interesting there.
16
Abuses Hide in the Silence of Nondisparagement Agreements nytimes.com
103 points by JoshTriplett  13 hours ago   54 comments top 12
1
addcn 11 hours ago 4 replies      
As someone who has regrettably had to use one of these, I can say I'm happy they exist. They are certainly abused so we had a policy of only using them as part of a legal settlement.

We had someone on staff who became mentally unstable and mounted a full-on campaign against the company online while still employed. This included impersonating executives, other employees and key customers on social media, and other things meant to damage our image. They got a sum of ~30% of their annual salary and we got a non-disparagement agreement. They were let go as part of that. One month later they were at it again, and we were able to get our money back. Right before it bankrupted them... guess what stopped happening. We dropped it without taking a cent of their money (we wouldn't actually do that). It hasn't been a problem since.

They can be abused, sure, but there are legitimate uses too.

2
tptacek 11 hours ago 4 replies      
In the medium term, it seems like there needs to be a federal mandatory public policy exemption to nondisparagement, rendering the agreements unenforceable in cases like these and sanctioning illegitimate attempts to enforce.

In the short term, it would be good if people could organize legal aid for people in our field working under nondisparagement clauses. Some of these clauses may be prima facie difficult to enforce due to language; others may leave room to report companies to legal authorities.

Finally, and I'm a broken record on this: if a significant fraction of engineers at any company organized themselves and demanded reasonable limitations on their nondisparagement clauses, for instance to protect whistleblowing, they would get it. What's a significant fraction? At many companies, it's probably less than 20%.

I hope a lot of tech industry employees are, as we speak, talking with their friendly peers at their companies and starting to think about how to reach out to labor lawyers to start this process. It's not that hard, and, for the time being, federal law protects you extensively in the process of organizing your workplace.

3
Hnrobert42 11 hours ago 2 replies      
From the article: "Employees increasingly have to give up their constitutional right to speak freely about their experiences if they want to be part of the work force," said Nancy E. Smith, a partner at the law firm Smith Mullin. "The silence sends a message: Men's jobs are more important than women's lives."

I don't understand how the second quote is related to the first except in the case of the discrimination settlements.

4
BrandoElFollito 46 minutes ago 0 replies      
This is one of the advantages of having very strict labor law in France. This is something which cannot be enforced (should it even appear in the contract).

There are disadvantages to that law, but on average the pros > cons.

5
danek 8 hours ago 2 replies      
Someone I know had to sign one of these as a prerequisite to getting a severance, but before the actual amount of the severance was revealed to him. He asked my advice on it and I suggested he sign because it would be reasonable for both parties. (Basically his very new boss was authoritarian and vindictive and didn't like him; he had been doing fine for years.) So he signed it. And then he got an astonishingly tiny severance check. So much so that I nearly disparaged the company.
6
phkahler 11 hours ago 1 reply      
Slander can get you in trouble. Anti-disparagement agreements should be considered unconstitutional, or at least have some kind of high bar for being allowed at all.
7
Overtonwindow 9 hours ago 0 replies      
I work in the world of politics, and I've yet to see one of these. I think this is because in this world, if you talk negatively about your old job, your old boss, or whatever, no one is going to hire you. You become a liability. This has probably led to some horrible things happening to good people, but they keep their heads down, say nothing, because they want to still be employable. I have a similar situation in which I could probably become a whistle-blower against my former employer, but I don't, because if I do my career is over.
8
brndnmtthws 11 hours ago 2 replies      
How enforceable are such agreements? IANAL, and I'm genuinely curious.
9
praulv 3 hours ago 1 reply      
How practical are these in a glassdoor world? There is no way an employer could track down an anonymous posting online and force you to relinquish your severance.
10
tsukaisute 11 hours ago 1 reply      
"The incident was described by two entrepreneurs who were told about it in the weeks after it occurred but were not authorized to speak about it."

It may be unfair to all parties involved that a situation is described based on hearsay, with the source not identified and not even a direct source (someone who wasn't at the event, and who heard about it weeks later).

11
yuhong 11 hours ago 0 replies      
I have been thinking about Yishan-style CEOs and getting rid of nondisparagement in things like severance.
12
known 6 hours ago 0 replies      
NDA != Slavery
17
Parity's Wallet Bug Is Not Alone hackingdistributed.com
79 points by scarhill  13 hours ago   9 comments top 4
1
ithought 5 hours ago 1 reply      
It's amazing BitGo would shrug off Emin's help when he helped fix their software. Emin's super smart; ignoring his offer to serve as a technical advisor is ridiculous.

And then BitGo goes on to be partially responsible for a $320,000,000 loss (current value) that almost destroyed BitFinex.

It's just sad that so much fighting and ego has prevented technical collaboration. I'm a supporter of the Core devs but Emin is a genius who should be respected.

2
Animats 6 hours ago 0 replies      
The article points out that the database community discovered that application programmers can't handle too much power. That's why databases have things like atomic transactions. Those are hard to do in the general case, but all SQL database systems now do it.

Blockchain contracts for real money need at least the assurance level of database transactions. The DAO attack involved causing a transaction to happen only in part; one part happened, then the transaction aborted. That needs to be prohibited by the underlying system. As with database transactions, either everything goes right and the transaction commits, or something goes wrong and the data is unchanged.
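
A minimal sketch of that commit-or-rollback guarantee (illustrative TypeScript, not any real database or blockchain API): work on a private copy and only publish it if every step succeeded, so a partial transfer can never leak out.

  // Either both legs of the transfer happen, or the caller keeps the unchanged balances.
  type Balances = Map<string, number>;

  function transfer(balances: Balances, from: string, to: string, amount: number): Balances {
    const draft = new Map(balances);           // uncommitted working copy
    const fromBalance = draft.get(from) ?? 0;
    if (amount <= 0 || fromBalance < amount) {
      throw new Error("transfer aborted");     // abort: the original map is untouched
    }
    draft.set(from, fromBalance - amount);
    draft.set(to, (draft.get(to) ?? 0) + amount);
    return draft;                              // commit: the fully updated state, all or nothing
  }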

3
foepys 5 hours ago 1 reply      
And again, an interesting story about crypto currencies ends with a fight of egos instead of concentrating on the technical details. Why do those crypto enthusiasts almost always have to throw dirt at each other? Especially when it's completely out of place like in this blog post.
4
dibbsonline 9 hours ago 2 replies      
I don't always get mobile visitors, but when I do I use 750k images.
18
The Multi-Sig Hack: A Postmortem parity.io
39 points by ca98am79  8 hours ago   17 comments top 4
1
anonymouz 5 hours ago 2 replies      
> The restructuring of the original multi-sig wallet contract happened as part of a much larger change, namely the introduction of the Wallet user-interface, a 4000-line, almost entirely Javascript, CSS and HTML alteration. The depth and nature of changes made (and thus the severity of the potential bug) was misunderstood by the rest of the team. It was erroneously tagged as a (purely) UI change and thus received only one review before merge. A later audit by a Solidity expert also missed the issue.

Aside from the design issues with the Solidity language that have been discussed on here before, this massive process failure seems to have been a main factor allowing this to happen.

- Security critical changes needed extra tagging for more in-depth review. The other way around seems a safer default to me: By default require additional security review, and only under certain conditions allow for a lighter review.

- In the basic review apparently they didn't check for correct tagging and failed to notice that a UI change was also changing contract code without being tagged as security-relevant.

- It seems to be a terrible idea to mix in UI changes with contract code changes. Feels to me that such a change should be simply rejected in review, and re-reviewed after splitting out the changes.

2
CaliforniaKarl 7 hours ago 3 replies      
> 3 multi-sig wallets were exploited out of a total of 596 vulnerable multi-sig wallets

Have the owners of those exploited wallets come forward yet?

I am wondering how long it will be before a wallet developer is targeted with a civil lawsuit. In particular, looking at Parity's web site, I am seeing the noticeable size difference between this text

> Parity is the fastest and most secure way of interacting with the Ethereum network.

(Front and center on https://parity.io/index.html)

and this text:

>We accept no liability for your use of the software or its source code (save to the extent such liability cannot be excluded as a matter of law).

(From https://parity.io/parity.html, at the bottom of the "Sample Parity" section.)

3
mannykannot 1 hour ago 0 replies      
"Going forward, Parity will try to arrange a bug-bounty programme."

I was going to dismiss this on the grounds that it is hard to compete with the implicit bounty that all the funds in contracts present to the black-hats, but on second thoughts, it would have the benefit of motivating ethical hackers who are too skeptical of Solidity to have anything at risk in Ethereum. It seems plausible that skeptical people are more likely to find bugs, all else (including motivation) being equal.

4
xiphias 6 hours ago 0 replies      
They claim to be the most secure wallet, and at the same time allow code without any review to get submitted. I feel that the whole Ethereum space is full of lies like this (I stuck with Bitcoin core, as I feel that the team is much more cautious).
20
The New Firefox and Ridiculous Numbers of Tabs metafluff.com
807 points by robin_reala  15 hours ago   460 comments top 54
1
huntie 14 hours ago 15 replies      
I'm really glad that people at Mozilla use ridiculous numbers of tabs too. Lazy-loading of tabs is the reason I switched to firefox. I'm not sure if it's still this way, but Chrome used to load every tab on startup. So even if you only had 100 tabs, you were looking at 5+ minute startup time. God-forbid that any of them were Youtube, or you'd have to go through and pause them all.

I've just updated to Firefox 55 to test this, and the improvement is ridiculous. I hope that Firefox focuses more on power users in the future.

I'm curious what the author uses to manage all of these tabs. I use Tab Groups, but I think they won't work in a few Firefox versions so I'm looking for alternatives.

2
ilaksh 14 hours ago 9 replies      
There are a lot of people who use tabs as bookmarks. Seems like a good way to keep the RAM industry going strong. Someone once told me (seriously) "I need at least 128 GB of RAM otherwise I can't keep my tabs open." But does everything you were interested in over the last X weeks or months really need to be loaded up? No, and if you use it like that then it can't preload stuff.

I think the main lesson is that bookmarks don't work too well, or people just don't use them. If nothing else, make the bookmark display show newer bookmarks rather than the same old ones from four years ago. And maybe start preloading pages whose bookmarks are opened regularly. Merge the two features together; maybe add other optional organizational features, for example something similar to the new tab screen.

The tricky thing is that there are a lot of things that are potentially supposed to happen while a tab is open. The browser is now its own OS, and it may be very difficult for developers to use important features if tabs (processes) only _look_ like they are running.

3
elfchief 15 hours ago 11 replies      
Wow. I've been getting more and more frustrated with how poorly Chrome handles even a moderately large number of tabs (~150), and it sounds like my savior is going to be ... Firefox. Huh.

Wouldn't have guessed it, but I'll totally take it.

I have a nice extension for Chrome called Quick Tabs that gives me a searchable list of my open tabs and makes it easy to find things I have open... anyone know which of the several things that seem to do that with Firefox would be the best to use?

4
mannigfaltig 53 minutes ago 0 replies      
Whenever people ask me about the excessive number of tabs in my browser, I simply show them Einstein's desk: http://2.bp.blogspot.com/-eyOIn_EOW2Q/Vmcl6O-55OI/AAAAAAAADs...

I am not Einstein but it is probably not a bad idea to have a whole lot of interesting things around you all the time. Sort of like a cache or the way proteins are synthesized, like swimming in a nutritious soup. "Oh, look here is a piece that fits!".

One basically decreases the chance of forgetting interesting and useful information. "Out of sight, out of mind."

5
lobster_johnson 14 hours ago 6 replies      
I've been considering switching to Firefox due to these performance improvements, but the one feature that's always missing is for the location bar to autocomplete terms from other sources such as Wikipedia. Is there some add-on I can install that can fix this?

Safari is brilliant here. If you enter something in Safari's location bar, it will suggest Wikipedia and other search suggestions right away [1]. I use this feature all the time. But FF, out of the box, will only show suggestions from one source. Here [2] is what FF suggests; all the hits are from Google, and it doesn't try to be clever about showing what I might mean to search for. Notice how it offers to search Wikipedia, with this tiny, obscure icon at the bottom of the suggestions, which I find to be a completely useless feature (I have keywords for that). The top hit tends to be what Google puts in a special box in its search results.

Here is another nice thing Safari does [3] which I make use of all the time. I've not visited walmart.com, so that "Top Hit" is just because it's a popular site. I can't make FF do anything like that.

[1] http://i.imgur.com/83FfnPn.png

[2] http://i.imgur.com/T4p1NZv.png

[3] http://i.imgur.com/MkRP2Le.png

6
rc_kas 15 hours ago 10 replies      
I <3 you Firefox. I'm so sad that nobody uses you.
7
iamleppert 6 hours ago 5 replies      
Firefox is a case study in how performance really does matter. A lot.

It used to be the best browser, and then something happened and it gradually became slow, really slow, while Chrome became fast. Who was in charge over at Mozilla during all this?

Any engineering director worth their salt would have noticed what was happening and installed metrics that didn't let engineers commit code that caused performance regressions and given an engineer (or multiple engineers) who loves optimizing things carte blanche.

Really, I want to know what happened over there. Does anyone know?

8
randomString1 14 hours ago 0 replies      
I find it way more productive to use bookmarks:

- Archive folder: a bookmark dump to keep links just in case I ever need them again (so they pop up in the search bar even after I clean my history; you can also add keywords manually if you want)

- Buffer folder: to-dos, reminders and things I <need> to read soon. I keep it at a maximum of 10 items at all times

- Follow-up folders (plural): pages I want to check occasionally for updates. Often used for pages without RSS. I don't like to use extensions to check for page modifications because I want to do it at my own pace. This helps reduce my mental load because I know it's there if I ever need it. I often delete the entire folder if I don't feel it's useful anymore.

- The rest are folders divided by a main folder and subject. This way I can easily delete them after I'm done with that task (after a minute or after a year). Example: Programming > Project X, Programming > CSS fix for that thing.

Middle click on the folder to open everything at once. Done.

The position of the folders is crucial and also helps with muscle memory. I keep it like this: the more to the right (of the browser), the more disposable they are.

9
adrianmonk 14 hours ago 2 replies      
A quick tangent to plug my method for paring down open tabs when it gets out of control: I create a document!

Personally I use Google Docs, but you could use a wiki or MS Word or many other things. The point isn't the technology, it's that when you have a whole slew of tabs open, and you feel the urge to keep them open, it's a strong sign that your mind is trying to gather info about a topic.

Putting it into a document often feels great. It gives you an opportunity to type out a few quick notes on the topic (like what you thought was significant about various links) or other thoughts you had. And you might find you want to share the document with people you're working with. And I find I feel more organized, not just because I cleaned up something messy but because I took a moment to focus my energies on something my mind was begging me to pay attention to. Sometimes you even realize you need two different documents on different subjects, and it's a little enlightening to realize the two separate themes.

10
znpy 1 hour ago 0 replies      
I am happy to hear this because I stopped using firefox and migrated to google chrome around firefox 51, and boy it was SLOW.

Now it might be worth it to give it another try.

In the meantime, I am seriously concerned about Thunderbird. Thunderbird is my MUA of choice and quite frankly, there are very few options to replace it, and none of them seems 100% okay (except, maybe, Evolution). Clawsmail is okay-ish, but so ugly to see and feature-poor.

11
chippy 15 hours ago 1 reply      
I'm also very impressed with FF's performance on Linux in recent versions. I bumped up the RAM allocated for multiprocess but I never really have more than 20 tabs open. Startup and rendering seems much quicker, and the add-ons seem more open.
12
forevercrashing 13 hours ago 1 reply      
Surprised to see no mention of Tab Center (https://testpilot.firefox.com/experiments/tab-center) in the comments. I've gotten so used to it that now I find it hard to use a browser with tabs on top. Being able to see more of the page title when tabs are displayed horizontally is extremely useful. There's a search field too. This combined with the "browser.ctrlTab.previews" set to true in about:config (enables MRU tab switching with ctrl-tab) makes managing tabs awesome for me.
13
aboodman 14 hours ago 5 replies      
Why do you have a profile w/ 1600 tabs in it? If whatever it is is so important, aren't you afraid of losing it? I'd be terrified that one time Firefox just wouldn't shut down cleanly.
14
giancarlostoro 3 hours ago 0 replies      
I love Firefox and never had issues, except at work. For some odd reason it will break: all tabs will look white and all I see is a loading icon in the middle no matter what tab I click on, ruining my workflow. No idea what that's about, since at home Firefox works fine; Chrome, on the other hand, seems to work fine at work. I guess I'll be using Chrome at work and Firefox at home till I figure out what's causing the Firefox issue. I usually have no more than 20 tabs open at any given time; it's very unusual for me to keep even 10 tabs open, really.
15
bigbugbag 2 hours ago 0 replies      
How does this translate to real use, as in actually loading the pages and having a couple dozen extensions?

The ability to open 5-10 times more tabs than I use is quite an edge case that Mozilla usually doesn't care about, but the real question is: what is the point of this when it comes with making Firefox totally useless for said use case by dropping support for the extensions that make it practical or useful?

16
overcast 12 hours ago 2 replies      
I was already using 55.x Beta. My BIGGEST issue, is that EVERY browser seems to chew up memory over time, just by leaving it open with tabs going. Firefox, Chrome, Safari. All do the same thing. Alleviated by using the Great Suspender in Chrome, but why can't they all have this just built in? Startup/speed, and initially memory use really haven't been that big of an issue. It's the memory, and finally grinding to a halt that is the BIGGEST issue for me. I can't escape it.

Happens on MacOS, and Windows for me.

17
ernsheong 9 hours ago 0 replies      
(shameless plug) For those of you keeping many, many tabs open because you worry you might forget them, or because you're working on related topics, I am building https://www.pagedash.com to save a page exactly as you saw it. Everything from the original page (HTML and assets) is saved to PageDash so that you can load it again without worrying that the original page went bonkers/down.

v1 will be quite basic, just a list of saved pages. Expect more organization tools (folders, tags, etc.) in the further releases.

Please do sign up to be informed of impending release! :) (estimated end August)

Also, do leave a reply if you are keen on using ML (link classification) to help organize your pages for you. Unfortunately, because computers can't read our minds, this can't be perfect so folders are probably still relevant for your mini-projects.

18
fiatjaf 11 hours ago 1 reply      
I have 2 tabs open right now. If the number of open tabs gets over 10 I start to actively look for tabs to close.
19
Koshkin 14 hours ago 5 replies      
Can anyone offer an explanation of why tabs should not be managed by the window manager? (My understanding is that this question is independent from how the particular application would choose to control the contents of a tab - whether directly, or through a separate thread, or by spawning a child process.)
20
problems 6 hours ago 0 replies      
I've been running a ~7-year-old laptop for occasional browsing. Chrome is unable to seek properly in video playback (gives "page unresponsive" after a while) and lags randomly when loading large pages, I suspect due to RAM allocations.

Old Firefox played videos fine, but lagged on many page loads. I was about to conclude it was just too old to browse the web decently, but this... this seems incredibly usable.

Thanks Mozilla, I'm definitely installing this thing on my main, much more modern machine tomorrow.

21
eyeball 12 hours ago 1 reply      
The OneTab extension has been really good for helping me handle my tab hoarding tendency.

https://www.one-tab.com/

It lets you hit a button and send all open tabs to a single page that persists between browser sessions. You can remove a link by opening up that list and clicking on it.

Has a bunch of other handy features too ... like publishing the list of links to a share-able URL.

22
userbinator 13 hours ago 2 replies      
"Ridiculous" is right, especially from a UI perspective --- it still puzzles me why they would design it so that by default all the tabs are crammed into the place which used to be the titlebar, making it difficult to both read the title and find the tab you're looking for.

I've seen others start opening multiple windows when the tabs get too small. I usually do that to keep tabs grouped into "pages I am unlikely to view simultaneously".

23
ledgerdev 14 hours ago 1 reply      
I would love to get rid of Chrome and switch back to Firefox as my everyday browser, but I simply can't get over how messy/ugly the tabs (even in the compact theme) and window title look compared to what Chrome does with tabs and the title bar, and its lack of a window title bar.
24
outworlder 15 hours ago 1 reply      
I use Firefox as much as I can, for many reasons. Two things keep me from using it all the time:

- Yubikey

- My Chromebook (I would use an equivalent FirefoxOS if given the choice)

There are performance issues in some cases but nothing major. It is still somewhat slow compared to Chrome, even though this may be due to optimizations done specifically for Chrome.

25
versteegen 7 hours ago 0 replies      
I currently have over 1380 tabs open in Firefox 52 ESR (across 4 windows). Oh god, it's terrible. Very slow and unresponsive; CPU usage is typically 60+% when idle. I restart Firefox every couple of days (which takes many minutes) to keep CPU and memory usage down, by causing all tabs to be unloaded. (As an example, right now it's at 5.2GB resident, with only a small percentage of tabs loaded.) I've been trying to kick the habit. I also have several other profiles and other computers. It probably adds up to 5000 tabs all in all.

I use All Tabs Helper to help jump between tabs. Finding tabs is hopeless without it. ATH also has features like mass closing tabs or unloading them. I wish it had a way to bookmark tabs, which I would use to close most of my tabs.

So, I'm very glad to hear this. Time to switch off ESR.

26
cf 3 hours ago 1 reply      
So one challenge I face is that many of my tabs are for web content that isn't always there. Do any of these extensions like Great Suspender, Session Buddy, etc actually let me save the webpage as it is and then let me choose to refetch it as necessary?
27
kronos29296 3 hours ago 0 replies      
If such numbers happen on Android too, you sir just have another Firefox fan. I want to know right now. (Chrome sucks at this on all devices now.) The only reason I am using it is that it is better than Firefox for accessing Google.
28
ChoGGi 12 hours ago 0 replies      

 I measured by eyeball, using "time cat" on the command line. This might seem weird, but c'mon - I'm measuring minutes. Microsecond precision is not required.
For anyone else doing this sort of testing: there is an extension to monitor startup speed called about:startup

https://addons.mozilla.org/addon/about-startup/

29
fouc 6 hours ago 0 replies      
Note for the blog author:

"It's interesting that Firefox startup time got consistently worse over time until Firefox 51."

I believe you meant to write "until Firefox 52." I think the usage of "until" would typically point out the exception to the rule, in this case Firefox 52 is the first version that is no longer slower than the previous.

30
rndmize 14 hours ago 1 reply      
This is nice to hear. I use Firefox as my default browser with the tree-style tabs add-on, and just yesterday I replaced ABP with uBlock and Ghostery with Disconnect out of frustration with how slow things were going (~30-50 tabs open). The slow startup time hasn't been helping (come on, it's not even loading the tabs until I click them, what's taking so long?).
31
evolve2k 14 hours ago 1 reply      
My biggest frustration with Firefox is that you can't 'Tab to Search' as you can in chrome. Every time I attempt to switch to FF the lack of this feature just kills my productivity and I end up switching back.

So wish this was possible.

Ref: https://www.chromium.org/tab-to-search

32
septentrional 15 hours ago 1 reply      
Sorry if this is somewhat off-topic but how do you make Firefox' tab bar look like the one in the article (i.e. no rounded edges for the tabs) on MacOS?
33
libeclipse 15 hours ago 0 replies      
Is there anything similar for chrome/chromium? Would be interesting to compare them.
34
austinjp 12 hours ago 1 reply      
And then there's Firefox Focus which has no tabs and no other new instance capability. One single window. "Focus" indeed. It still makes me twitchy, but is so far completely usable.

Could do with a "fetch as desktop" mode, though.

35
caio1982 9 hours ago 0 replies      
This is the first time in a decade that something (the article) convinced me to try Firefox as a possible day-to-day browser once again.
36
codychan 6 hours ago 0 replies      
Impressive, now I'm looking forward to the official stable version of 55
37
manuelmagic 15 hours ago 0 replies      
I have the bad habit of opening many tabs on my old MacBook (mid 2010). Boot time, together with CPU usage, is one of the main reasons I had to use Opera as my primary browser. I'm really happy to see this might change in the near future.
38
mrkrabo 14 hours ago 5 replies      
I suppose I'm alone in getting all nervous if I have more than 10 tabs open.
39
TekMol 5 hours ago 0 replies      
Great that you can open over a thousand tabs now. Next blog post will be when it's over a million I guess?

Personally, I usually have something like 3 or 4 tabs open. But what I would really need is this:

https://xkcd.com/619/

Yup, hardware accelerated video on Linux. Chromium has it. Firefox doesn't.

It's 2017 and for me Firefox is still missing the basic feature of seeing smooth Youtube videos.

Anybody working on that? Can't you just take it from Chromium? I mean it's open source, isn't it?

40
reiichiroh 14 hours ago 1 reply      
With Chrome, I use "The Great Suspender" extension.
41
CurtMonash 13 hours ago 0 replies      
I use ridiculous numbers of tabs, and Firefox has been less stable for me the past months than ever before.
42
minusSeven 8 hours ago 0 replies      
I wonder what these numbers would be compared to chrome.
43
muppetman 4 hours ago 0 replies      
This would have been useful 4 years ago before we all gave up on Firefox.
44
wnevets 14 hours ago 1 reply      
What made it so much worse over time?
45
daef 5 hours ago 0 replies      
My usual day looks something like this:

open FF (if it's not already running) and wait for a handful of app-tabs to load:

 * slack (there's no dark theme in the native app)
 * skype (there's no dark theme in the native app)
 * toggl (timetracking)
 * email (office365, I stopped worrying about outlook/thunderbird)
 * jira (gotta know what to work on next)
 * social (fb&twitter - could probably also do w/o - I rarely open those)
Since they exist, I also use a handful of tab groups - at least these 3:

 * work
 * private
 * to read
the work tab group usually starts empty (at least when I finished up the day before), might grow to a hundred or two during the day - but usually ends up empty at the end of the day when I'm done again.

I hardly ever left-click a link, I only wheel-click, but my ^w is at least as fast as my wheel click. Since the awesomebar searches through history, page titles and already opened tabs, it's really easy to navigate even between tab groups just by hitting ^t, typing 3 letters, and pressing return to 'switch to tab' (there's a really convenient icon there so you know you're going to close the new tab and switch to an existing tab at this moment).

If I want to restore a tab I killed prematurely I fire up the history, where I only use the 'by last visited' view - does anyone seriously use the 'by date and site' view?

Tabs are something very different from bookmarks to me: a bookmark is something I return to on a regular basis, a tab is an open 'todo'. I don't use the usual bookmarks though, only the bookmark toolbar below the URL bar, and my bookmarks there have no text - they get renamed to "" so I only see their favicon. Entries there are, e.g.:

 * HN
 * blog.fefe
 * oglaf
 * xkcd
 ...
One thing that I really disliked is that Mozilla at some point decided to drop the dedicated keyboard shortcut to hide/show the bookmark toolbar (I tend to hide it for screenshots where I want the URL to be on the screenshot: https://xkcd.com/1863/).

I have no idea what I'm going to do after the death of the tab groups extension.

One thing that really bugs me is that every time I show tab groups to 'normal users' (tm) they instantly love them. I really wonder if no one used to use them because hardly anyone ever knew about them.

46
digitalzombie 8 hours ago 0 replies      
lol this is why I love firefox. I hoard tabs. But not as excessively as OP though... 1000+ is crazy.
47
crorella 13 hours ago 0 replies      
I tested with 1690 tabs and got similar results :D
48
PhasmaFelis 13 hours ago 1 reply      
I'm torn between thinking it's time to switch back to Firefox, and thinking I need to avoid Firefox at all costs, because the slowdown when Chrome gets over 100+ tabs is the only thing keeping my browser windows remotely navigable.

You know what I really want? A way to attach titles to browser windows. This window is "Games", this one is for "Books", this one is for my current "Work" task, this one is "Research" on the new doohickey I'm thinking of buying...

49
BlytheSchuma 14 hours ago 0 replies      
LOL I've been running Firefox with 5k+ tabs since 2009.
50
johansch 13 hours ago 0 replies      
Opera has supported this abnormal behavior for like two decades now. I remember being shocked by how many tabs the Opera core browsing/rendering engine developers used to have open on their desktops when I joined Opera back in 2004. I guess it was an odd pride thing? :)

To clarify: I am talking about a one-row tab scenario. With about 3-4 pixels per tab. And they were perfectly happy with that. Even seemed to feel it was a good user experience.

51
droithomme 14 hours ago 0 replies      
When I see bug reports with these sorts of ridiculous over the top use cases, I think, yeah buddy, why don't you join the project and fix it since it's only applicable to you.

And in this case that happened. This guy is an actual Firefox developer.

This is as it should be and congrats.

Now I am wondering how I can possibly get into this mysterious world of having thousands of tabs open.

52
linuxray 8 hours ago 0 replies      
thanks for info
53
revelation 14 hours ago 6 replies      
Whut? It didn't get any faster, it's just more aggressively lazy loaded now. This is breaking the very use case of people who have 100+ tabs; they want stuff to be there when they click on it.
54
valuearb 14 hours ago 0 replies      
Dear god, why?
21
UK to bring in drone registration bbc.co.uk
42 points by dan1234  3 hours ago   47 comments top 8
1
dTal 3 minutes ago 0 replies      
>The plans also include the extension of geo-fencing, in which no-fly zones are programmed into drones using GPS co-ordinates, around areas such as prisons and airports.

So will open source drone firmware become illegal?

2
DrNuke 2 hours ago 1 reply      
I act as an advisor for DronesBench http://www.dronesbench.com and we think there should also be some sort of drone efficiency certification for the consumer market (from 250g to about 4kg; under 250g they are toys according to the draft EASA legislation), the same as for other electro-mechanical devices or machines, with a concise parameter to be displayed on the drone's plate. Too much of a difference from the value declared by the vendor may imply hidden defects in the drone and therefore the possibility of a crash. We are actively proposing our DronesBench Index to the IEEE and EASA for the legislation to come, with encouraging responses from the IEEE at the preliminary level. It remains to be seen if and how things progress in a formal way.
3
thinbeige 2 hours ago 3 replies      
OT: Is it just me who does not see huge business opportunities in drones?

Don't get me wrong. Drones have made a huge leap in recent years, and from a hardware manufacturer's POV there is business. There is also great stuff like drone races, drone cams and selfie cam drones.

But will there really be much more? Are safety concerns, public regulation and limited use cases in many areas (such as urban areas) giving drones a hard time?

4
petepete 2 hours ago 1 reply      
I generally don't have a problem with the idea of registration, but 250g is way too low a limit. If 2.2kg is the FAA's lowest risk category, the limit should be closer to that.
5
codebeaker 2 hours ago 1 reply      
Currently mentoring a startup in this space (not in the UK). For us it's because of mandatory insurance for 3rd party liability, and as a non-drone owner who's seen plenty of idiots wielding them, I'm all for it.

I prefer the light touch (ala Mopeds/Scooters) where it's simple and easy, insurance at a flat rate, easy to transfer ownership. Something akin to cars with a v5 document, etc would be overkill.

I don't really have an opinion on the weight limits; how much does something have to weigh to take out an eye, or cause a motor vehicle to crash?

6
phatbyte 1 hour ago 0 replies      
Portugal will introduce drone registration and insurance as well.
7
reallydattrue 2 hours ago 6 replies      
License means More Tax.

Why am I not surprised?

- Car License.

- TV License.

- Remember Personal Radios needed a License.

- Gun License.

- Travel License (passport).

Now Drone License. sigh

What else can the UK Government think of taxing?

8
madaxe_again 2 hours ago 1 reply      
While I laud the intent, I doubt bad actors will register their drones, so it'll be a burden for legitimate pilots and no deterrent whatsoever for folks who use them to fly contraband into prison etc.
22
More Good Programming Quotes (2016) henrikwarne.com
114 points by henrik_w  5 hours ago   47 comments top 17
1
fani_pack 1 minute ago 0 replies      
"I've been using Vim for about 2 years now, mostly because I can't figure out how to exit it." @iamdevloper
2
combatentropy 4 hours ago 0 replies      
"Should array indices start at 0 or 1? My compromise of 0.5 was rejected without, I thought, proper consideration." Stan Kelly-Bootle

Thanks for putting that one in. Stan Kelly-Bootle was one of my favorite writers. He recently died; he was a Brit who wrote computer books and folk songs.

I think people who dual-specialize, in something left-brained and something else right-brained, are better at both. I've only read parts of one of his books, a 90s book about UNIX. It was so clear and fun to read while seeming to break many rules about writing clearly. I'm not sure how it works.

3
zyb09 5 hours ago 3 replies      
I thought these two were pretty good:

Bad programmers worry about the code. Good programmers worry about data structures and their relationships. Linus Torvalds

That's how I always feel when people have lengthy discussions about spaces vs tabs. Good truth coming from the man himself.

Sufficiently advanced trolling is indistinguishable from thought leadership.

Kinda scarily true, when you see how some online communities that started mostly as trolling became real ideologies over time.

4
joncampbelldev 5 hours ago 1 reply      
Very good, although I thought the dynamic typing one was a bit snarky. I'll happily give up dynamically typed languages like clojure when structural typing can give me the same flexibility.

I say this as a person learning idris and Haskell and loving it btw, before anyone prematurely tries to convert me to static typing.

5
neilparikh 4 hours ago 1 reply      
Some of my favourite quotes are from Alan Perlis's Epigrams on Programming [0].

My (current) favourite: "A programming language is low level when its programs require attention to the irrelevant."

[0] - http://www.cs.yale.edu/homes/perlis-alan/quotes.html

6
hliyan 4 hours ago 0 replies      
Harold Abelson: 'Programs must be written for people to read, and only incidentally for machines to execute.'
7
christophilus 4 hours ago 0 replies      
Thanks for posting. This gave me a good laugh:

What do we want?

Now!

When do we want it?

Fewer race conditions!

@wellendonner

8
p0nce 4 hours ago 3 replies      
Perlis's Epigrams are my favourite: http://www.cs.yale.edu/homes/perlis-alan/quotes.html

"When we write programs that 'learn', it turns out that we do and they don't."

I still wonder what he means when he says: "Everything should be built top-down, except the first time." What is different the first time that warrants not doing it?

9
hyperpallium 3 hours ago 0 replies      
1. The bug is not in the section of code you're looking at.

2. Rule #1 is of no practical use.

Philip Roe on debugging CFD codes

10
seodisparate 3 hours ago 1 reply      
"When stuck debugging, question all of your assumptions about the program."

A shower thought. Helpful if the program does something other than what you thought it did, allowing you to figure that out quicker.

11
ronilan 5 hours ago 0 replies      
I like this quote: "'Even a single quoted word can be a "double-edged sword"' she said. 'You can't "escape" that'. He didn't."
12
leodelang 4 hours ago 0 replies      
"Focus is a matter of deciding what things you're not going to do." (John Carmack)
13
WalterBright 3 hours ago 0 replies      
"I think that I can safely say that nobody understands template mechanics." -- Richard Deyman
14
WalterBright 3 hours ago 0 replies      
"The double is cast." -- Julius C'ster
15
Kenji 4 hours ago 0 replies      
> When your hammer is C++, everything begins to look like a thumb. Steve Haflich

As someone who works with C++ every day on my job, this made me laugh out loud. I think I'm gonna write this one on our whiteboard on Monday.

16
ianamartin 4 hours ago 2 replies      
Many of these are good. I agree with joncampbelldev. The dynamically typed quote is quite snarky.

To that I'd reply: Statically typed languages are when you have to tell a computer that 2 is an integer and 'two' is a string.

I know this argument goes back and forth and never ends. And I'm not trying to start a flame war.

But I will say this: type systems occupy a space on a spectrum in my opinion, and the spectrum is imperative vs. declarative languages.

It's a conceptual spectrum, of course. And the difference is between telling the language exactly what you want and how you want it vs. telling the language what you want to get.

On one end you have C# as the avatar of strongly and statically typed imperative languages. In the middle, you have something like Python, which is strongly typed but dynamic, and at the far end you have SQL, which is as strong or weak as you define your tables, as well as entirely declarative. You can choose your own adventure with databases.

You have some bad citizens like JavaScript and PHP, but the conversation is really about the use case.

IMO, strong typing is far more important than static typing. Because I personally can't stand silent errors. Weak type systems are the root of all evils. Anything that can fail silently is dead to me. Fuck you, <language>. Tell me if something broke. Don't just quietly do something I don't expect.
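
As a small illustration of the silent-failure point (TypeScript; my own example, not the parent's):

  // Plain JavaScript coerces silently: 2 + "two" quietly becomes the string "2two".
  const silent = (2 as any) + "two";   // no error, just a surprising value

  // With static annotations, the same confusion is rejected before the program runs.
  const n: number = 2;
  const s: string = "two";
  // const oops: number = s;           // compile-time error: string is not assignable to number
  console.log(silent, n, s);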

Which approach to use depends, to me, on the team and the size of the project. You can run with Python for small teams. You really can't for large teams. And you really shouldn't use C# for rapid prototyping or small teams.

But you absolutely should use a statically typed language for large teams.

Edit: to put the sharp in my example of a statically and strongly typed language.

17
fefe23 3 hours ago 1 reply      
Why oh why do fans of agile programming and dynamic languages always hide behind snarky epithets from other people?

This imbalance is, in my mind, the greatest reason why people still prefer C and Java over Clojure and other hipster languages. The feeling is that those people must be overcompensating for something; just look at how they feel the need to talk down to others all the time.

In the left corner, you see snarky hipsters complaining about how all security problems would go away if only everybody stopped using C, and if people dropped their methods and adopted agile / XP / scrum / other fad of the week.

In the right corner, you see C programmers writing code in the waterfall model.

I'm much more attracted to the right corner. Not because C or waterfall are great. Because of the (to me) juvenile unprofessional behavior of the people in the left corner. You see, I'm a programmer. I'm attracted to people who are programming. Not to people who are telling others how not to lead their lives, while appearing to produce scant noteworthy code themselves.

Compounding this is that OP is leaving a link to his own blog here. And this blog post is basically "here are some things other people said that I happen to agree with". I learned nothing.

BTW: The very same behavior makes me think highly of postgresql and less highly of nosql databases. My instinct tells me to trust people who don't feel the need to trash-talk others.

EDIT: to stay with the theme of the post: I always liked "Don't learn the tricks of the trade. Learn the trade."

EDIT: To win me over, don't tell me my stuff is bad. Show me that your stuff is good.

25
SPIF Streaming Progressive Image Format fhtr.org
31 points by k__  7 hours ago   19 comments top 10
1
kig 1 hour ago 0 replies      
Hi, author here. Wow, this was a while ago, I'd forgotten I wrote that.

There's a version of this that uses a directory of images and loads in a bigger picture if you zoom in: http://fhtr.org/multires/ (Note that, yes, it'd be better to have a tile map for large resolutions and load in just the visible part of the image. And dump the hi-res tiles when zoomed out.)

SPIF's intention was to throw out a "it'd be cool if browsers supported something like this natively"-proposal, as the browser knows best what pixels of an image are needed for sharp rendering. For the webdev, the experience would be to just put the image on a page, rest assured that it looks good. Like with SVG.

Yes, loading JPEG2000 / progressive JPEG with stream truncation would be nice.

Images don't load on iOS? Probably some silly bug in my code.

Images can't be saved with right-click? That's probably due to using revokeObjectURL after loading the image from a blob.

2
morecoffee 3 hours ago 0 replies      
> Tech Details

> The SPIF format starts with a header that tells the offsets and sizes of the images in the SPIF. The images are stored smallest first, but there are no image size restrictions apart from that.

So... two HTTP requests per image load? That is probably going to hurt more than it helps. Also, that probably means Range requests, which don't have great support. (For example, the built-in Python HTTP server SimpleHTTPServer doesn't support them.)
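
For what it's worth, here is roughly what the extra round trips would look like with the browser fetch API (a sketch; the offset and size would come from parsing the SPIF header, and a server that ignores the Range header just sends the whole file):

  async function fetchRange(url: string, start: number, end: number): Promise<ArrayBuffer> {
    const res = await fetch(url, { headers: { Range: `bytes=${start}-${end}` } });
    if (res.status !== 206) {
      console.warn(`no partial-content support: got ${res.status}, downloaded everything`);
    }
    return res.arrayBuffer();
  }

  // Request 1: the header; request 2: only the smallest image it describes.
  // const header = await fetchRange("test.spif", 0, 1023);                     // header size of 1024 is a guess
  // const smallest = await fetchRange("test.spif", offset, offset + size - 1); // offset/size parsed from the header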

3
jsingleton 3 hours ago 0 replies      
This appears to be quite similar to FLIF (http://flif.info/) and to some extent BPG (https://bellard.org/bpg/). There's no shortage of image formats that are better than JPEG, and some have been around for quite a while.

The problem is in software/hardware support, particularly in browsers. JPEG has a lot of momentum. WebP is only supported in Chrome/Opera (https://caniuse.com/#feat=webp) and IE/Android don't even support animated PNG yet (https://caniuse.com/#feat=apng).

I did loads of research on this for a web app dev book, which I'm currently updating for the second edition. Browser image support hasn't changed much since it was originally published.

4
pornel 1 hour ago 0 replies      
Nicer progressive display is possible in JPEG already:

https://imageoptim.com/progressiveblurdemo.html

it's only a limitation of libjpeg that smoothing of early progressive scans is weak and incomplete, and thus looks very blocky. I wish browsers improved implementation of this.

Partially loaded DCT coefficients correspond quite well to maximum possible image resolution, so it is known how much cross-block smoothing needs to be applied to get nearly-optimal smooth preview. There's even an implementation of that idea:

http://johncostella.webs.com/unblock/

5
vortico 5 hours ago 1 reply      
Nice idea, but it has usability issues:

- Right click -> View Image, Save Image, or Copy Link doesn't seem to work. This might be an issue with Firefox, because the blob is stored somewhere.

- No direct linking is possible, so I can't drop a link to the image into a chatroom.

- Zooming in page with Ctrl+Plus doesn't increase the resolution of the downsized image.

I'd rather download the original image, even if it takes more time/bandwidth, if these issues aren't fixed. And the CPU usage scares me a bit, if you multiply this by ~100 images, which is very common on news websites.

However, this method is better than embedding the full 7360 x 4912 image at least. It took 0.8s for the page to load in my browser and 6.0s to download the test.spif image.

I wonder if there is a way to use a progressive JPEG in a normal <img> tag (so it displays progressively rather than once it has finished loading) and use Javascript to halt the download once a certain amount has downloaded.
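
A rough sketch of that last idea using fetch and streams (assuming the browser will render whatever prefix of a progressive JPEG it is handed, which varies in practice):

  async function loadPartialJpeg(url: string, maxBytes: number, img: HTMLImageElement) {
    const controller = new AbortController();
    const res = await fetch(url, { signal: controller.signal });
    const reader = res.body!.getReader();
    const chunks: Uint8Array[] = [];
    let received = 0;
    while (received < maxBytes) {
      const { done, value } = await reader.read();
      if (done || !value) break;
      chunks.push(value);
      received += value.length;
    }
    controller.abort();  // stop the download once "enough" bytes have arrived
    img.src = URL.createObjectURL(new Blob(chunks, { type: "image/jpeg" }));
  }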

6
ricardobeat 1 hour ago 0 replies      
How old is this? In practice, since there is no native support for this format, ways 2 and 3 are the same (both JS based). We now have srcset + media queries that let us achieve this without any overhead.
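
For anyone who hasn't used it, the srcset approach mentioned here looks roughly like this (file names are made up; the browser picks a candidate based on layout width and device pixel ratio):

  const img = document.createElement("img");
  img.srcset = "photo-480.jpg 480w, photo-1080.jpg 1080w, photo-2160.jpg 2160w";
  img.sizes = "(max-width: 600px) 100vw, 600px";
  img.src = "photo-1080.jpg";  // fallback for browsers without srcset support
  document.body.appendChild(img);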
7
userbinator 5 hours ago 2 replies      
It sounds like a reinvention of JPEG2000 and progressive JPEG, which were specifically designed to accommodate use-cases like this without having to store separate independent images for each resolution, and allowing the data stream to be truncated at any point to reach the required level of refinement.
8
ClassyJacket 5 hours ago 2 replies      
Cool work! I'm a fan of anything that makes things more efficient.

Why not go a step further and have a server dynamically scale the image based on the size the image will be displayed at, so it's always displayed pixel for pixel and there's no waste? Anyone tried that?

9
azinman2 5 hours ago 1 reply      
Image is very, very low res on my iPhone (iOS 11 beta)... seems like it prematurely stopped?
10
baalimago 3 hours ago 0 replies      
sort of pointless no?

it would be for me, anyways

26
Bitcoin May Have Solved Its Scaling Problem vice.com
75 points by artsandsci  7 hours ago   63 comments top 11
1
richardw 2 hours ago 4 replies      
Cryptocurrencies need to stop requiring us all to hold every transaction in one unified blockchain dump. There has to be some way to break the network out into shards while still preserving the distributed nature and the ability to pay anyone.

No matter what we do, moving and storing every single transaction is insane. It's like my bank account needing to know what every person in the world's purchases are this morning. I shouldn't need to know what some guy on the other side of the world spent his lunch money on, just to buy my own.

What am I missing? Surely the uber comp sci PhDs have fully solved this?

2
wildbunny 2 hours ago 2 replies      
Bitcoin hasn't solved the scaling problem. All these BIPs are just short term fixes which don't address the core problem. I opened a discussion thread on bitcointalk hoping to catch the attention of the core developers, but it seems they don't read that forum anymore.

https://bitcointalk.org/index.php?topic=2036368.0

3
rebuilder 5 hours ago 6 replies      
This doesn't even remotely address the scaling problem, such as it may be. I'm not keen on forced blocksize caps as I don't see how that kind of planned economy model is compatible with the free-market money ideas behind Bitcoin, but even an unlimited blocksize would be unlikely to truly solve scaling.

Bitcoin simply doesn't scale very well because of the way transaction history needs to be stored. As long as the rate of blockchain growth can't be significantly reduced, the transaction rate can't grow very much.

So small block proponents are likely to be right in that on-chain scaling is unrealistic, but if the market must be forced by developers to adhere to an artificial limit, I think that in itself implies Bitcoin is unlikely to be able to grow very far as the market is unable to self-regulate.
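
Rough numbers behind that (my own back-of-the-envelope, not from the article): with a 1 MB cap every ~10 minutes and typical transactions of a few hundred bytes, the ceiling is only a handful of transactions per second, while the chain still grows by tens of gigabytes per year.

  const blockBytes = 1_000_000;   // 1 MB block size cap
  const blockIntervalSec = 600;   // ~10 minutes between blocks on average
  const avgTxBytes = 400;         // assumed average transaction size

  const txPerSec = blockBytes / avgTxBytes / blockIntervalSec;                           // ~4 tx/s
  const chainGrowthGBPerYear = blockBytes * (365 * 24 * 3600 / blockIntervalSec) / 1e9;  // ~53 GB/year
  console.log(txPerSec.toFixed(1), chainGrowthGBPerYear.toFixed(0));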

4
jgord 3 hours ago 2 replies      
Not a great article. Perhaps best to downvote until a better one comes along.

Things to consider, which most people can agree on :

- SegWit2X [ segwit feature, followed by 2MB block size ] is seen as a compromise worth taking to move things forward past the current impasse, by a range of people who dislike SegWit and think an actual blocksize increase is urgently needed

- There is an immediate scaling problem: the mean and median block size is converging on the max block size of 1MB, and transactions waiting to get into a block blow out on a daily basis to 20MB. This backlog of waiting transactions causes users to pay high fees to get their transaction to the front of the queue, and causes delays in processing/confirmation times.

- there is a less urgent longer term scaling problem due to the linear architecture of the blockchain itself

My personal viewpoint :

- if you are only processing 3 transactions per second, there is substantial room for performance increases

- SegWit may help slightly in alleviating transaction pressure, and a 2MB block size will help a bit more. Given the ferocity of the debate, SegWit2X is a practical compromise we should adopt to move forward. The USD/BTC market seems to agree with this.

- We do need a schedule for block increases over the next couple years so all parties have certainty, and so it doesn't create renewed havoc every few months.

- Block size is just the first of several engineering optimizations that could see transaction throughput on the same linear architecture increase over 1000x. E.g., we have a 150GB full blockchain - it would be half the size on disk if compressed.

- Even if LN or side chains are the way to scale, and only 0.1% of transactions are 'settled' on the main chain, we will still need a more efficient main blockchain, because the growth of transaction volumes will likely be exponential for the next decade.

- Some people think the block size is a fundamental part of the genius breakthrough in technology that is the blockchain, and so cannot be tinkered with. This is wrong. Proof-of-work, the monetary supply, the miner reward, the chained hash of the blocks, longest-chain-wins, and distributed redundancy are fundamental ideas at the core of Bitcoin... but block size is merely a technical issue, an artificial limit that was there to make implementation simpler, and it was always assumed that it would need to be increased well before it became an issue.

5
danmaz74 6 hours ago 1 reply      
Stopped reading after the first paragraph, which said:

> after users overwhelmingly voted in favor of implementing a code improvement

It was miners who voted, not users. If you can't get the first paragraph right, I don't need to read the rest of the article...

6
erikb 4 hours ago 1 reply      
> The bitcoin network simply wasn't designed to be able to handle the kind of volume it is experiencing as more people are beginning to use the currency, and the 1 megabyte "blocks" of transaction data that get uploaded to the blockchain are full.

If you work in IT this is like a monthly team meeting. You ALWAYS hear "oh, we are so surprised, it wasn't designed for such an amount of users/load/whatever". It's just not true. It's a show. And I really don't know why this argument STILL works. If you run 10 topics in parallel, 7 will report this exact reason. If you check them out closely, 3 report it truthfully, but really make you wonder why management decided to put such stupid people on top of these projects in the first place; 4 actually had planned for this to happen and were just not sure whether they would run into it this month or next month.

7
hashmp 5 hours ago 3 replies      
Bitcoin is not slow. It's fast, if you are prepared to pay for it.

This is where the disagreement lies: should the price of on-chain transactions be reduced and remain low?

8
mdns33 3 hours ago 0 replies      
Worth discussing: Charlie Lee, founder of Litecoin, and his proposal on how Bitcoin and Litecoin can work together to solve the scaling problem: https://segwit.org/my-vision-for-segwit-and-lightning-networ...
9
auggierose 1 hour ago 0 replies      
Somehow Bitcoin reminds me of tulips.
10
basicplus2 30 minutes ago 0 replies      
Even if the scaling problem were ever solved, what will one do when one runs out of all the power in the known universe to generate more Bitcoins?

https://motherboard.vice.com/en_us/article/aek3za/bitcoin-co...

11
baalimago 3 hours ago 1 reply      
Bitcoin won't ever become usable since it's too easy to be anonymous.

Governments want structure.

27
Probability quantum world obeys local realism is less than one in a billion phys.org
50 points by jonbaer  10 hours ago   33 comments top 6
1
krastanov 9 hours ago 1 reply      
This is just a fancy way to say that there is yet another Bell test experiment confirming a violation of the Bell inequality[1]. The technology behind that is amazing, both in terms of practical, immediately useful engineering and in terms of fundamental physics, but the title is just a pretentious, obfuscated way to phrase it.

[1]: To be precise, it says that the chance that the Bell inequality actually holds (i.e., that the observed violation is a statistical fluke) is very small.

2
atemerev 5 hours ago 3 replies      
Interestingly enough, if you were to design a really big and scalable MMO game (with topological metrics, as opposed to a location graph), you would probably introduce similar hacks to overcome limitations. There will be "speed of light" and "time dilation" analogues as the result of limited computing power. You'll get rid of "realism" early enough as the result of limited memory: why keep the world stored when you can procedurally generate it on demand? And it is natural to think in interaction probabilities: implementing a random number source and using it consistently is much cheaper than tracking all collisions on the micro scale.

These results look somewhat different in modern MMO games (e.g. EVE Online), as computing nodes are arranged within a network (a large graph). But imagine a "computational fabric", with atom-scale nodes arranged topologically (a space made of computronium). What kind of effects would be natural to implement there, and what are the hacks to use?

My hypothesis is that what you'll get in such an arrangement, if you are to design your simulation efficiently, is quantum mechanics, more or less.

3
OscarCunningham 5 hours ago 0 replies      
The title is a typical example of "transposing the conditional". What they mean is P(observations|local realism) < 10^-9, but what they wrote is P(local realism|observations) < 10^-9.

Science publications should know better!
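
For reference, the two quantities are related by Bayes' rule, so swapping them silently assumes a prior over local realism that the experiment itself does not supply (notation mine, following the parent comment):

  P(local realism | observations) = P(observations | local realism) * P(local realism) / P(observations)

The headline number is the first factor on the right-hand side; turning it into the left-hand side requires choosing P(local realism).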

4
pizza 3 hours ago 0 replies      
Is there anything related to the idea that even though physics limits information to at most the speed of light, you can still guarantee that "two events c*t distance apart both follow the laws of physics" in less than t time (0, really), so "the laws of physics affect things everywhere" is in some sense superluminal?

Even if the inputs really only rely upon states which are jointly unknowable before t (which includes the knowledge that the two events occur in the first place), in this sense no information is transferred, I suppose.

5
deepnotderp 8 hours ago 4 replies      
I'm curious what Hacker News readers think about Andrew Friedman's work on closing the "free will loophole" in Bell's theorem.
6
enkid 9 hours ago 5 replies      
Could someone explain the implications of this? If locality is broken, would that mean that sending a message faster than light would be possible (theoretically)?
28
Mega Maker mega-maker.com
26 points by doener  7 hours ago   7 comments top 2
1
CM30 1 hour ago 1 reply      
Oh hey, it's this fan game. Never expected it'd be posted on Hacker News.

It's a pretty damn impressive project though, especially given how things like online level sharing have been implemented and what not.

Probably won't be shut down by Capcom either, given their past history of supporting fan projects. So yeah, definitely one to check out if you're a Mega Man fan, or just want something a bit different to Mario Maker.

And you may also want to read my interview with the game's creator if you haven't already. Goes into a bit of detail about their history, previous works and inspirations for the project:

https://gamingreinvented.com/interview/lets-interview-mega-m...

2
chadcmulligan 6 hours ago 1 reply      
Needs Windows :-( probably should mention that somewhere

Looks great by the way

30
Decentralized trust graph for online value exchange without a blockchain settle.network
60 points by cissou  10 hours ago   22 comments top 8
1
Confiks 8 hours ago 1 reply      
This all looks very similar to a naïve version of the early (~2010) Ripple protocol [1], including all its limitations.

An unmentioned limitation is that the system doesn't really suit the 'consumer economy', unless salaries from producers are also paid out in the currency. It does work well for transactions between peers, i.e. friends spending / borrowing from each other (in this system the two are essentially the same thing).

[1] https://www.youtube.com/watch?v=xgGcVv04unM

2
kang 7 hours ago 1 reply      
> "But while the operations of such currencies, based on blockchains, have been fully decentralized, the trust graph of these cryptocurrencies have remained entirely centralized. Everyone need to trust Bitcoin to transact in Bitcoin, and everyone needs to trust Ethereum to transact in Ethereum or assets issued on the Ethereum blockchain."

Given a string, can you programmatically determine whether it is Bitcoin? (Starting from a string called the 'genesis block', asking potentially malicious nodes for more strings, and doing some math, you can reach the string in question.) (Note: you cannot do so with ERC tokens, an alarming thing to ponder[0])

Can you determine with high probability the "amount of work done" in producing the set over which you previously did the math? (This amount acts as the amount of trust we place behind it, which we determine on our own. We always switch to whoever can show more cumulative work done.)

Thus there is no trust graph with Bitcoin, at least. (Apart from mining centralization and users being prone to upgrading, Bitcoin is pretty close to decentralised, and looks set to improve in future as physical limits are reached with mining chips and people increasingly opt for immutable code along with the immutable chain it calculates on (once all aspects of fungibility, like anonymity, are solved, users might stop upgrading at all).)
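
A toy sketch of that check (everything below, including the header layout and field names, is invented for illustration and is not Bitcoin's real format; real validation also covers difficulty retargeting, timestamps, transactions, and much more): start from the genesis hash, verify each block's link and proof of work, and sum the cumulative work so the heaviest valid chain can be chosen.

    import hashlib

    def pow_hash(data: bytes) -> int:
        # Double SHA-256, read as a big integer (toy convention, not
        # Bitcoin's actual byte ordering).
        return int.from_bytes(
            hashlib.sha256(hashlib.sha256(data).digest()).digest(), "big")

    def verify_chain(blocks, genesis_hash):
        # blocks: list of (prev_hash, payload, nonce, target) tuples -- a
        # made-up header format. Walk from the genesis hash, check that each
        # block links to the previous one and meets its proof-of-work target,
        # and add up the expected work.
        prev, total_work = genesis_hash, 0
        for prev_hash, payload, nonce, target in blocks:
            if prev_hash != prev:
                return None                           # broken link: reject
            h = pow_hash(prev_hash.to_bytes(32, "big")
                         + payload + nonce.to_bytes(8, "big"))
            if h > target:
                return None                           # not enough proof of work
            total_work += (1 << 256) // (target + 1)  # expected hashes per block
            prev = h
        return total_work  # among valid chains, trust the one with the most work

The point of the comment survives the simplification: the verification needs no trusted party, only the genesis block and arithmetic over whatever strings anyone hands you.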

---

settle.network is just git.

Private blockchains, federated sidechains, colored coins, so-called smart-contracts-on-ETH, premined coins, proof-of-stake coins, etc., are all just PKIs.

[0] https://www.google.co.in/search?q=how+to+check+if+a+token+is...

3
wsxiaoys 9 hours ago 1 reply      
I've always thought that Ethereum brings little value when the use case is something like "store credit" or "in-game rewards". This looks like a very promising solution!

Edited: One question: it seems to me that the Settle service is nothing but a group of REST API endpoints (including registration?). It would be really useful if I could interact with it directly via curl/bash without installing any client.

4
mccoyspace 7 hours ago 0 replies      
This is very similar to the Ripple precursor called RipplePay. A clear explanation of it can be found here: https://classic.ripplepay.com/about/
5
eloary 8 hours ago 0 replies      
This looks like an approach more akin to federation (or, more accurately, confederation) on social networks than to the "distributed" notion of blockchain trust, which in practice, as noted, still exhibits centralizing political forces per chain but is robust to most forms of direct attack. I do think a mix of approaches is what will happen in the future, since the two methods express different levels of trust and liquidity of transaction.
6
em3rgent0rdr 9 hours ago 1 reply      
> "Currencies operate on a centralized trust graph. This sentence is almost tautological..."

Not the case with physical currencies such as precious metals...

7
infruset 8 hours ago 0 replies      
Any signs this project isn't dead? The curl command for installing it does not work and the last blog post is from January.
8
gabhubert 8 hours ago 1 reply      
Interesting decentralising primitive, since fiat currency basically "is" trust.