Hacker News with inline top comments - 24 Jul 2017
1
Pev: Postgres Explain Visualizer tatiyants.com
389 points by insulanian  11 hours ago   37 comments top 17
1
hotdogknight 3 hours ago 2 replies      
I needed a version that ran on the command line so I made one here: https://github.com/simon-engledew/gocmdpev
2
ris 24 minutes ago 0 replies      
Something I've wanted from an explain viewer for a long time is simply using the "start time" "end time" information on the nodes to put things in a basic timeline. Most visualisers seem determined to keep the layout as a pimped up version of the tree given to them.
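A rough sketch of that idea (TypeScript, assuming EXPLAIN (ANALYZE, FORMAT JSON) input): every node in Postgres's JSON plan carries "Actual Startup Time" and "Actual Total Time" in milliseconds, which is enough to lay nodes out on a crude timeline instead of a tree. The simplification here ignores loop counts and parallel workers.

    interface PlanNode {
      "Node Type": string;
      "Actual Startup Time"?: number; // ms before the node emitted its first row
      "Actual Total Time"?: number;   // ms until it emitted its last row
      Plans?: PlanNode[];             // child nodes
    }

    // Flatten the plan tree into (label, start, end) rows for a timeline view.
    function timelineRows(node: PlanNode, rows: [string, number, number][] = []) {
      rows.push([
        node["Node Type"],
        node["Actual Startup Time"] ?? 0,
        node["Actual Total Time"] ?? 0,
      ]);
      for (const child of node.Plans ?? []) timelineRows(child, rows);
      return rows.sort((a, b) => a[1] - b[1]); // order by start time, not tree shape
    }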
3
sgt 1 hour ago 0 replies      
This is really great and I think I might start using this. I would also love a standalone version of this that runs outside the browser. Something that can maybe connect directly to my DB.
4
atatiyan 6 hours ago 0 replies      
creator of pev here, thanks for all the kind words!
5
garysieling 10 hours ago 1 reply      
Does this store the plans? I like these things, but I'm always a little leery that this will expose my database schema in Google search results.
6
mistercow 7 hours ago 0 replies      
I use this tool often, and it's great. It's a lot easier to wrap your head around plans with the way it displays them.

The one thing I wish it had is either the ability to not save plans automatically, or at least a button to clear the history. As it is, I just pull up a console from time to time and do localStorage.clear()
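Until such a button exists, here is a gentler console workaround than wiping everything (a sketch only; the "plan" substring is a guess at pev's key naming, so inspect localStorage before running it):

    // Remove keys that look like saved plans instead of clearing all of localStorage.
    Object.keys(localStorage)
      .filter((key) => key.toLowerCase().includes("plan")) // assumed naming scheme
      .forEach((key) => localStorage.removeItem(key));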

7
obiwahn 5 hours ago 0 replies      
Looks awesome! How about adding a direction to your graph for people just starting with SQL.
8
fnord123 10 hours ago 2 replies      
Looks good, but why not dump it as a flame graph?
9
maxvu 8 hours ago 1 reply      
Why, in the example, does the constituent `customerid` join take longer than the forming `orderid` one?
10
sghall 9 hours ago 1 reply      
Cool project. Not a DBA but was interested in playing around with this. It'd be great to maybe add some example plans here: http://tatiyants.com/pev/#/plans

So if you just want to check out the interface you can click to load up an example or two.

11
beefsack 11 hours ago 2 replies      
I can completely see myself using this on my Postgres projects, but something like this would be most useful for me at work.

How feasible would it be to port this over to MySQL / MariaDB? I know EXPLAIN output on MySQL is much simpler than what you get out of Postgres so my gut feeling would be that it wouldn't be possible.

12
edraferi 10 hours ago 0 replies      
Very cool! Now I want to figure out the Postgres EXPLAIN JSON format and start parsing other DBs to fit, just so I can use this tool on them.
13
dlb_ 9 hours ago 0 replies      
Very nice! I wonder if it would be possible to embed that into pgAdmin? Possibly with Electron?
14
emilsedgh 11 hours ago 0 replies      
Absolutely fantastic! Thank you!
15
isatty 10 hours ago 0 replies      
Thank you, this is very useful!
17
edoceo 11 hours ago 0 replies      
Rad
2
Status update from the Reproducible Builds project debian.org
255 points by lamby  12 hours ago   66 comments top 11
1
dingdingdang 12 hours ago 1 reply      
So happy someone is spending time on this issue; it's like a breath of fresh air and intelligence in the midst of all the usual software (security/privacy/etc., take your pick) mayhem. It's worth reading https://reproducible-builds.org/ for a brief reminder of why this project is important.

Excerpt: "With reproducible builds, multiple parties can redo this process independently and ensure they all get exactly the same result. We can thus gain confidence that a distributed binary code is indeed coming from a given source code."

2
adamb 11 hours ago 0 replies      
Beyond providing security, reproducible builds also provide an important ingredient for caching build artifacts (and thus accelerating build times) across CI and developer machines. They also can form the basis of a much simpler deploy and update pipeline, where the version of source code deployed is no longer as important. Instead a simple (recursive) binary diff can identify which components of a system must be updated, and which have not changed since the last deploy. This means a simpler state machine with fewer edge cases that works more quickly and reliably than the alternative.

I'm very grateful for the work that this project has done and continues to do. Thank you!

3
seagreen 12 hours ago 1 reply      
Amazing work. Thanks so much to everyone who's contributing. The upstream bugs filed are especially appreciated since they make the whole Linux ecosystem more solid, not just Debian.
4
phreack 12 hours ago 0 replies      
In case anyone is not aware of what reproducibility is and why it's a worthy goal, here's their statement: https://wiki.debian.org/ReproducibleBuilds/About
5
cperciva 12 hours ago 3 replies      
Does anyone know if they've made the Packages file (repository metadata file, listing the packages in the repository) build reproducibly yet?

I tripped over this a couple weeks ago and was both amused and annoyed, since it seemed that packages were being listed in the file in a random order. I'm asking here because it might already be fixed; we're using a slightly old version of the package/repository tools.

6
Cogito 1 hour ago 0 replies      
Has anyone played with the tool they mentioned, diffoscope? It sounds interesting, and I wonder how good it is at, for example, comparing Excel files with VBA code, formulas, etc.
7
pen2l 12 hours ago 8 replies      
What does "reproducibility" mean? I understand and appreciate the importance of reproducibility in the context of scientific experiments, but I don't understand what it means in terms of computer programs. I am guessing it has to do with being able to build on different architectures without issue?
8
pmoriarty 12 hours ago 1 reply      
How does the kind of reproducibility spoken of here compare to that offered by Guix and Nix?
9
morecoffee 11 hours ago 4 replies      
Once we have reproducible builds, will it be possible to have verifiable builds? As in, can we cryptographically show that source + compiler = binary?

Right now we can sign source code, and we can sign binaries, but we can't show that a given source produced a given binary. I would feel much happier about installing code if I knew it was from a particular source or author.
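With reproducible builds, that verification needs no new cryptography: anyone can rebuild from the signed source and compare digests with the published binary. A minimal sketch (Node/TypeScript; the file paths are hypothetical):

    import { createHash } from "crypto";
    import { readFileSync } from "fs";

    // sha256 digest of a file on disk
    const digest = (path: string): string =>
      createHash("sha256").update(readFileSync(path)).digest("hex");

    // Compare a binary rebuilt from signed source against the distributed one.
    const ok = digest("./rebuilt/app") === digest("./distributed/app");
    console.log(ok ? "source produced this binary" : "mismatch: do not trust");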

10
gtt 12 hours ago 4 replies      
How do they achieve reproducibility with python and some other languages which include timestamps and such?
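The usual answer is the Reproducible Builds project's SOURCE_DATE_EPOCH convention: tools that would normally embed the current time take it from that environment variable instead. A sketch of a build script honoring it (TypeScript here; the same pattern applies in Python and elsewhere):

    // Use a pinned timestamp when SOURCE_DATE_EPOCH is set, so rebuilding the
    // same source tree embeds the same date and stays byte-identical.
    const epoch = process.env.SOURCE_DATE_EPOCH;
    const buildDate = epoch !== undefined
      ? new Date(Number(epoch) * 1000) // value is seconds since the Unix epoch
      : new Date();                    // fallback: non-reproducible wall clock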
11
mabbo 9 hours ago 4 replies      
One (misguided) counter-argument I've heard from otherwise fantastic devs is the notion of adding randomness to unit tests in the hopes that if there's a bug, at least some builds will fail. In practice, I've seen those builds, and developers saying "yeah, sometimes you need to build it twice".

I think the solution is to give those devs who favor such techniques a separate but easy to use fuzzing tool set that they can run just like their unit tests, separate from their usual 'build' command. Give them their ability to discover new bugs, but make it separate from the real build.

3
Host your own contacts and calendars and share them across devices partofthething.com
27 points by acidburnNSA  3 hours ago   9 comments top 3
1
Nux 1 hour ago 1 reply      
For people looking for easier ways: Nextcloud/Owncloud already does this, and you also get file sync/backup/share for computer/smartphone and more as a bonus; all you need is a LAMP server.
2
dvfjsdhgfv 26 minutes ago 4 replies      
> I'm trying to learn ways to minimize my reliance upon large companies for handling my day-to-day personal data.

Except that the moment you "share them across devices", at least one large company will silently grab your contacts anyway. And several others will try to, too, with one excuse or another.

3
sqeezy 1 hour ago 0 replies      
Check out Mail-in-a-Box. This project does most of it for you. https://mailinabox.email/
4
A Practical Guide to Tree-Based Learning Algorithms sadanand-singh.github.io
214 points by sadanand4singh  14 hours ago   12 comments top 3
1
thearn4 12 hours ago 3 replies      
As interesting as I find the current state of deep learning to be, there is something about random forests that I can't help but find much more cool. Probably the amazing out-of-box performance.
2
iamnafets 12 hours ago 2 replies      
I've found Adele Cutler's presentation on random forests to be an outstanding resource for building intuition about tree-based algorithms.

http://www.math.usu.edu/adele/RandomForests/UofU2013.pdf

Thinking about trees as a supervised recursive partitioning algorithm or a clustering algorithm is useful for problems that may not appear to be simple classification or regression problems.

3
6502nerdface 12 hours ago 1 reply      
Nice write-up, thanks for sharing. One possible typo I noticed:

> Maximum depth of tree (vertical depth) The maximum depth of trees. It is used to control over-fitting, higher values prevent a model from learning relations which might be highly specific to the particular sample.

Shouldn't it be lower values, i.e., shallower trees, that control over-fitting?

5
The gig economy of the 18th Century bbc.com
50 points by happy-go-lucky  6 hours ago   19 comments top 3
1
henrik_w 25 minutes ago 1 reply      
Some people argue that software development will move more and more to a gig model. In some cases it may make sense, but in many cases it doesn't. Mostly because of the knowledge you build up about the product and because software is never "finished".

https://henrikwarne.com/2017/01/22/software-development-and-...

2
madaxe_again 1 hour ago 2 replies      
A more apt term than "gig economy" would be "crumb pecking" - as in "pecking at the crumbs that fall from the luxuriant table of the plutocracy".

How governments can be enthusiastically cheering on a return to pre-industrialisation labour practices is beyond me - unless they have a vested interest in doing so, or are doing their damnedest to mollify an increasingly agitated populace while toeing the line their donors and lobbyists demand. Trump & Brexit are both symptoms of the increasingly widespread anger and despair, and if nothing fundamental changes (UBI?), a descent into a totalitarian/terrorist (where the terrorist is anyone who opposes unfettered state power) dichotomy is inevitable.

You already see the rhetoric in the press - traitors, saboteurs, etc. - and the political and economical landscapes are inseparable, despite what many wishful thinkers believe.

3
em3rgent0rdr 1 hour ago 3 replies      
The gig economy has been getting a lot of criticism by politicians who are fixated on "jobs" or who complain about the lack of "protections". But the gig economy is a very efficient way to allocate resources in a dynamic environment. The fact that the gig economy was widespread hundreds of years ago and today lends credence to its utility.
6
Petoskey stone wikipedia.org
31 points by curtis  5 hours ago   10 comments top 7
1
chime 16 minutes ago 0 replies      
Well this was a serendipitous post! My wedding anniversary is coming up and I wasn't sure what to get my wife. Thanks to this, I looked up 'Petoskey stone' on Amazon and found a pretty necklace for sale. Someday I'd love to visit Michigan and buy matching earrings/bracelet in person.
2
xanthineai 11 minutes ago 0 replies      
All the way from Traverse City to Charlevoix is, indeed, very good and never crowded. Storms are good for stirring up the alluvial deposits to reveal new finds. Rubber boots will help if the temperature is significantly below freezing. Otherwise, I prefer going barefoot with some local microbrews. The biggest one I found, about the size of an American football, was the one I stepped on. :)
3
acidburnNSA 3 hours ago 1 reply      
I grew up in the small town of Petoskey, MI. You can indeed just walk down the shores of Lake Michigan and find these. People at local camps come and gather them one day and then spend the rest of their vacation polishing them in rock polishing stations and fashioning them into jewelry and stuff. All the gift shops sell Petoskey stone paper weights, knives, boxes, necklaces, Michigan-shaped Petoskey stones, you name it!
4
geoffbrown2014 14 minutes ago 0 replies      
Spent many summers up at Glen Lake as a kid sanding and polishing these stones. Lots of fun.
5
jphillipsio 2 hours ago 0 replies      
I was there and found a few on Saturday. I try to make it to Petoskey at least once per year. Looking for these is one of my favorite ways to spend an afternoon.
6
jlamberts 2 hours ago 0 replies      
We always used to go up to Torch Lake during summer when I was a kid. Collected a bunch of these. The easiest way to find them is to bring some water with you and go for a walk on a rocky road, and pour the water on likely rocks. The water makes the pattern much clearer and fewer people look for them on the roads.
7
sus_007 3 hours ago 2 replies      
Why is a wiki page about a stone on the front page of HN?
7
Learn Ethereum smart contract programming ethereumdev.io
74 points by ym705  5 hours ago   25 comments top 6
1
staticelf 1 hour ago 1 reply      
The problem I see with Ethereum is that it is way too complex. I have read their home page at least 10 times without ever understanding what it does, what problems it solves, etc.

This is the sole reason why I don't think it will be successful in its current state. With most successful tech or services or whatever, the core idea is often super simple to grasp and you can instantly see the benefit. I don't see this with Ethereum.

Even with bitcoin, which is complex, the benefits are instantaneous for the common man. Decentralized system; no single entity controls it. There is a fixed amount of bitcoins, so like a mineral its value is probably going to be stable in the long run, and each bitcoin will increase in value as more people get interested. It's easy to send coins to anyone in the world, at any time.

What does Ethereum do? Smart contracts is probably the key word but I don't understand how it works or how it will benefit me. Why bother?

2
omarforgotpwd 1 hour ago 3 replies      
Programmable smart contracts are a great idea in a world where programmers write bug-free code. That world does not exist yet. Until we have near-perfect code-writing AIs, every new smart contract is just a disaster waiting to happen.
3
dullgiulio 2 hours ago 1 reply      
I know this might sound like a joke, but it is not: how about documenting how to actually test and debug smart contracts? Is there even a way to do so? How about fuzzing?
4
g00n 4 hours ago 3 replies      
Maybe it's because I don't have a use for it yet, or at least don't know if I have a use for it. But the whole Ethereum universe seems vague, and it seems like they could explain it more than they do. "Install Ethereum wallet, write a contract, ..., Profit!"? Is there a better source that might explain what it is and how it works (a glancing explanation, not the full inner workings of the network)?
5
rmetzler 39 minutes ago 2 replies      
I took a glance at the lottery example [1] and I wonder: isn't the owner of the lottery able to change the outcome so the winningNumber is always in his favor?

[1]: https://ethereumdev.io/managing-multiple-users-a-simple-lott...

6
pknerd 19 minutes ago 1 reply      
Is there any other crypto platform that allows writing dApps for a blockchain?
8
Startup Kite embedding ads in open source Atom editor plugins theoutline.com
38 points by posnet  1 hour ago   12 comments top 7
1
RubenSandwich 56 minutes ago 0 replies      
Look at this clear dark pattern: https://outline-prod.imgix.net/20170721-QVaxMDgDwdZ1TBufCdq4.... (image taken from the article). If you want to use their service, it lists only positives; for the other services, only negatives.

If you're reading this, Kite: I now have a negative view of your product. We cannot allow corporations to take over open source tools. Donating is perfectly fine and encouraged, but the above example is a downright takeover. If you want another tool then create one; don't take over an existing one and use the community's trust in that tool to promote your product.

2
dessant 22 minutes ago 0 replies      
This is the minimap fork:

https://atom.io/packages/minimap-plus

https://github.com/mehcode/atom-minimap-plus

It is a featured[1] Atom package, which may hint at whom GitHub is endorsing in this issue, though we could see a more direct response from them regarding both minimap and autocomplete-python.

After reading sadovnychyi's reaction[2] to the autocomplete engine selection screenshot, I think forking is also the only remaining step for autocomplete-python.

[1] https://atom.io/packages

[2] https://github.com/autocomplete-python/autocomplete-python/i...

3
bloomca 45 minutes ago 0 replies      
If you are looking for the GitHub thread: https://github.com/atom-minimap/minimap/issues/588
4
roadbeats 54 minutes ago 2 replies      
> It is unclear what Kite's business model is, but it says it uses machine-learning techniques to make coding tools. Its tools are not open source.

I've never heard of such a thing before. Could someone explain how they would use machine learning for building coding tools?

5
omginternets 41 minutes ago 0 replies      
I just uninstalled Kite.

It's a real shame as the service was good, but nothing is good enough to justify advertisements in my work-space. The fight against distraction is hard enough as it is without having to think carefully about where I'm clicking due to dark-pattern UI.

6
tangue 29 minutes ago 1 reply      
Time to write Adblock for code editors.
7
GoToRO 59 minutes ago 1 reply      
Why not use this to fund open source? Have a checkbox to disable ads if you really want to give people freedom. I just can't see how open source can compete without enough funds.
9
18yo arrested for reporting a bug in the new Budapest e-Ticket system marai.me
432 points by atleta  7 hours ago   140 comments top 23
1
amingilani 1 hour ago 1 reply      
In my country, the laws are draconian and totally against this kind of responsible disclosure. But being a good guy, whenever I find something I write a strongly worded email explaining why the company's IT department messed up, how to test said mess-up, and how they can hire my company to ensure these kinds of stupid things don't happen again.

I've reported several of these issues, sometimes all I get is single reply months later saying: "fixed".. mostly, nothing.

Once I found a SQL injection in a courier service's (very broken) web portal. This was very serious, because any idiot could drop all the tables, so I sent an email to the most important-sounding member of their tiny, yet already bureaucratically structured team. I followed up several times because I knew someone saw my email (I embed beacons in my emails), but gave up after the sixth time. Three months later someone else replied saying "thanks Amin, we've fixed it"

On a separate occasion, a large government agency's emails routinely ended up in my spam folder. It was a huge problem, and they acknowledged it and said they couldn't figure out what was wrong. I took five minutes and found the problem to be a misconfigured server on the domain. The server sending the email thought it was `server-a.governmentdomain.com`, but there were no DNS entries pointing the subdomain to the server. I reported this problem with clear instructions to test and fix the issue, but despite those instructions I was called, multiple times, to explain the issue in my own words over the phone. This was 2 years ago; last I checked, the issue was still present.

2
goodplay 6 hours ago 8 replies      
I remember coming across a serious bug in a site that belonged to a top multi-billion company. My brother also found what was essentially an unrestricted privacy leak (and possibly editing access) at a top university (the leaked data is sensitive personal information, not academic). Neither of us reported (or exploited) what we found.

Protection from this kind of blame-shifting and misdirected retaliation should be guaranteed by law. Until it is, bugs in critical and important infrastructure will continue to go unreported, remaining available for malicious actors to exploit.

3
whatnotests 6 hours ago 3 replies      
That's how the DMCA works. Remember the guy who gave a talk about Adobe's PDF creator which purported to produce "secure" documents (required a password) but the feature was easily bypassed.

Adobe had him arrested the day after he gave his talk.

Link to a Wired article here: https://www.google.com/amp/s/www.wired.com/2001/07/russian-a...

EDIT: I have a terrible memory-- thanks to the folks who replied to my comment with corrections.

4
angus-g 6 hours ago 4 replies      
Side note: this page gives me the weirdest Firefox behaviour I've ever seen: https://gfycat.com/HandyRapidJabiru
5
pmoriarty 6 hours ago 4 replies      
"this outrageous move from the police brought about fierce reaction resulting in tens of thousands of 1-star reviews on the facebook pages of the companies involved"

In the old days, protesters used to physically go and picket in front of company offices. These days, protesters leave one-star reviews. I wonder which is more effective.

6
nthcolumn 13 minutes ago 0 replies      
Someone pointed out to me the other day that just connecting to a poorly configured system is illegal in some places (Finland in his case). A form of trespass, he said. This was a ship in international waters registered in the Russian Federation, so I'm not sure whose law applies lol. Perhaps if there were more cases where full advantage was taken of such incompetence, with spectacular newsworthy results, then people would be more appreciative of the work we do, and the laws would be changed to protect whistle-blowers and activists generally.
7
fredsir 1 hour ago 0 replies      
We've seen two[1] cases[2] of this in Denmark in the last couple of years surrounding systems that kindergartens are using. The second one is currently (still) being investigated, but the first one was rightfully concluded earlier this year with the "hacker" being acquitted.

In both cases, it was dads of children in the institution who noticed the bugs while rightfully using the system, and who were ignored when notifying the responsible party about it until they "shouted it so loudly" that they couldn't be ignored anymore, at which point they were reported to the police for hacking.

Links below are in danish, but they can probably be translated if needed.

1: https://www.version2.dk/artikel/boernehavehackeren-frifundet...

2: https://www.version2.dk/artikel/interview-hacker-tiltalt-jeg...

8
anujdeshpande 3 hours ago 0 replies      
Sounds a lot like what happens here in India [1].

Also, if such behaviour is systemic, how should we bring about the paradigm shift in handling such events? Such incidents will happen more often across the world as e-governance becomes more predominant.

1 - https://thewire.in/119578/aadhaar-sting-uidai-files-fir-jour...

9
SeanDav 1 hour ago 0 replies      
Although deeply unfair, this is not unusual; there have been many reported cases of companies shooting the messenger.

Unless the company concerned has a well-documented and trusted bug bounty procedure, it can be very risky to report a bug in a system if it involves any kind of hacking.

What happens is: once the "bug" is reported, someone inside the company asks "How did this happen?". Now the person responsible has 2 options: admit it was their fault and the vulnerability exists, and risk being accused of incompetence, or say that the system was hacked.

Human nature being what it is, one tends to claim to have been hacked, and the effects snowball, leading to the arrest of an 18-year-old who was just trying to help.

My advice: Don't report these types of bugs at all, or if you really feel you must, report anonymously.

10
StreamBright 3 hours ago 4 replies      
Actually, he exploited the bug and purchased a ticket for a fraction of the price, and then reported it to the public transportation company. The company that runs the infrastructure (not the public transportation one) followed its internal policy and Hungarian law and reported the incident to the authorities. Police brought the guy in for questioning.
11
chx 3 hours ago 0 replies      
> We knew that they have been working on an NFC/smart card based system for around 4 years, without any visible result despite having spent over 4 million EURs.

The public procurement process for the current system, called RIGO, was indeed in 2013, but the whole process is much, much older than that. A more than 300-page feasibility study was published in 2011: https://www.bkk.hu/apps/docs/megvalosithatosagi_vizsgalat.pd... And a completely different system, called Elektra, was announced in 2004 with a 2006 deadline.

This whole clusterfuck with RIGO starting in less than a year was absolutely unnecessary, since the 2011 study already suggested supporting contactless credit cards. So once RIGO starts, the only ones using this online ticket purchasing system will be those who have a credit card, but not a contactless one. This is a (very) rapidly shrinking audience.

12
minademian 2 hours ago 0 replies      
this reminds me of a dark joke.

A rabbit was detained by the secret police. The interrogator asks him, "What are you?" The rabbit says, "A rabbit."

They torture, beat, and electrocute him for days.

Then, the interrogator asks him, "who told you you're a rabbit?"

13
pmoriarty 6 hours ago 2 replies      
"if you just typed in the url (shop.bkk.hu), the site just wouldn't appear. At first I thought they've taken it offline, but it turns out that they just didn't set up the http -> https redirection. And it was left like that for days. If you just heard about it, you couldn't use it. You had to click a link (normal users won't figure out to put an https in front of the host name, even I didn't think of it)."

I'd really like to know which of these is the better solution.

It seems to me that if people go to the http address, they could be redirected to an attacker's address with a simple MITM attack. So there's an argument to be made for not using http at all, even for a legitimate redirect, because it can be so easily MITM'ed.

On the other hand, if the http address is left unused, then people who try it anyway and it fails will be confused. For this solution to work, it seems the users have to be educated to always and only use the https address.

For these reasons, the whole separate http/https scheme seems broken by design.

What's the consensus from the security community as to the right setup here? Am I missing something, or is there a better way?
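The common recommendation is the middle path: keep a port-80 listener whose only job is redirecting, and send an HSTS header over HTTPS so a browser that has visited once never speaks plain http to you again; preload lists close the remaining first-visit window. A sketch using Node's built-in http module (the hostname is hypothetical):

    import * as http from "http";

    // Port 80: never serve content, only redirect to the HTTPS origin.
    http.createServer((req, res) => {
      res.writeHead(301, { Location: "https://shop.example.com" + (req.url ?? "/") });
      res.end();
    }).listen(80);

    // The HTTPS server should then add to every response:
    //   Strict-Transport-Security: max-age=31536000; includeSubDomains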

14
skinnymuch 5 hours ago 0 replies      
The list of bullet points of the egregious flaws in the software just gets worse and worse. It's crazy: I thought the first one or two would be the worst, but it just kept getting worse.
15
dogmata 55 minutes ago 0 replies      
I wonder if the outcome would have been the same if, instead of marking the price down from 9500 HUF to 50 HUF, it had been 9499 HUF; the test would still have proved the issue.
16
SubiculumCode 2 hours ago 0 replies      
All I want to say is something off topic, but the only vacation I've had away from the kids and with my wife was a week in Budapest, and I miss it. Such a beautiful city, so romantic... and I rode the metro everywhere.

ahh Budapest.

:-)

17
qualitytime 3 hours ago 0 replies      
Once there was this website which offered a phone-number-to-location service.

They had a demo form which sent an SMS to verify and allowed only one query.

If you looked at the source of the page it had hidden fields to override the SMS verification and allow multiple queries.

I freaked out some friends for the day and nearly contacted a journalist but lost interest after some weeks.

I could have had my 15 minutes of fame, or been on some list, or both.

It's alright, had some fun.

18
Aissen 53 minutes ago 0 replies      
I thought some CERTs were now doing the reporting as a way to shield security researchers from this kind of thing? Or did I hear wrong?
19
odabaxok 1 hour ago 0 replies      
All I can think about is what a shame this must be for the developers who released this software. There must have been a bunch of people working on it; was there no one to say this was wrong?
20
ikeboy 5 hours ago 1 reply      
>Didn't any of the engineers on the team tell their managers that something isn't right? I find it hard to believe.

Or, the managers knew full well the system was shit and they had no time to fix it, but 80k/month is 80k/month.

21
beters 6 hours ago 3 replies      
When I was in Budapest a few weeks ago, I heard from multiple locals that the metro system was owned by some sort of mafia. I wonder if that explains the subpar security and overreaction to the bug report.

edit: a few weeks ago, not this past summer that is still occurring

22
daef 1 hour ago 1 reply      
is HN hugging shop.bkk.hu to death?
23
Negative1 4 hours ago 1 reply      
The price of a ticket was client-side authenticated!? I can't fathom the level of incompetence required to do something like this...
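For contrast, the boring fix is tiny: the server derives the price from its own catalog and treats anything price-shaped arriving from the client as display-only. A sketch (TypeScript; the 9500 HUF figure comes from this story, everything else is made up):

    const PRICES_HUF: Record<string, number> = { monthly_pass: 9500 };

    // The client says only what it wants to buy; the price is never an input.
    function chargeFor(ticketType: string): number {
      const price = PRICES_HUF[ticketType];
      if (price === undefined) throw new Error("unknown ticket type");
      return price; // charging a client-supplied number reproduces the BKK bug
    }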
10
Your tools shouldnt spy on you, the case of .NET Core opinionatedgeek.com
34 points by mel919  1 hour ago   11 comments top 5
1
orf 11 minutes ago 4 replies      
For reference, they collect[1]:

 - The command being used (for example, "build", "restore")
 - The ExitCode of the command
 - For test projects, the test runner being used
 - The timestamp of invocation
 - The framework used
 - Whether runtime IDs are present in the "runtimes" node
 - The CLI version being used
I'm actually OK with this to be honest.

Here is the telemetry code itself: https://github.com/dotnet/cli/blob/5a37290f24aba5d35f3f95830...

1. https://docs.microsoft.com/en-us/dotnet/core/tools/telemetry

2
wereHamster 9 minutes ago 0 replies      
On the Mac you can always use Little Snitch (https://www.obdev.at/products/littlesnitch/index.html) to reliably block outgoing connections. No need to muck around with environment variables, and you don't have to guess which domains dotnet uses; Little Snitch will tell you, even if they change them in the future.
3
staticelf 0 minutes ago 0 replies      
Come on folks, a notice is printed when you use the command, and basically any site today does more intrusive telemetry.

I think they should ask first, like Yeoman does, but I don't think they deserve this much shit for such a small thing.

4
apk-d 7 minutes ago 0 replies      
This probably feels more unusual in the world of shell-based development tools; not many people these days blink an eye at this sort of behaviour from an IDE package. Still, as a .NET Core fan, I'm definitely not a fan of this practice. To be expected from Microsoft, though; they bet big on telemetry in their tools and encourage developers to do the same (through tools like App Insights, for example).
5
0xFFC 12 minutes ago 2 replies      
tl;dr please?
11
Ask HN: Best-architected open-source business applications worth studying?
87 points by ghosthamlet  4 hours ago   22 comments top 13
1
elorm 3 hours ago 1 reply      
Nginx and Git.

Nginx has a lot of respect on the market for handling high concurrency as well as exhibiting high performance and efficiency.

I don't even have to speak about the Git architecture. It speaks plainly for itself.

There's a series of books called The Architecture of Open Source Applications that does justice to this topic

http://aosabook.org/en/index.html

2
yodon 4 hours ago 2 replies      
There's been a good deal of academic work on architectural differences between open source and closed source applications (basically resulting from the differences in the organizational structures that designed/built/grew them, a la Conway's Law). Observations, for example, include reports that closed source applications tend to have more large-scale API classes/layers, because there is a management structure in the designing organization that can herd them into existence, while open source projects of the same size and complexity tend to have a less centralized architecture, again reflecting the organizing characteristics of the developers involved[0].

None of this is arguing that one or the other style of architecture is "better" per se, but rather the architectures are different because they were in the end optimized for different kinds of development organizations.

Most business applications remain fundamentally a three-tiered architecture, with the interesting stuff today tending to happen in how you slice that up into microservices, how you manage the front end views (PHP and static web apps are pretty different evolutionary branches), and critically how you orchestrate the release and synchronization/discovery of all those microservices.

(None of which is directly an answer to your question, but is more meant to say that lots of the most interesting stuff is getting harder to spot in a conventional github repository because much of it is moving much closer to the ops side of devOps)

[0] http://www.hbs.edu/faculty/Publication%20Files/08-039_1861e5...

3
jph 4 hours ago 1 reply      
Spree is an open source e-commerce solution. IMHO it has good architecture for learning.

Spree has a clean API, clear models, front end and back end, extensions, and command line tools.

https://github.com/spree/spree

Especially take a look at the models:

https://github.com/spree/spree/tree/master/core/app/models/s...

5
nXqd 3 hours ago 0 replies      
6
albertzeyer 2 hours ago 2 replies      
I'm not exactly sure what is meant by business. Commercially successful?

Anyway, here are some projects which I can recommend by its source code:

* OpenBSD. Also the other BSDs. Plan9. And the BSD tools. Linux is a bit bloated but maybe it has to be. I don't recommend the GNU tools.

* LLVM/Clang.

* WebKit. Also Chrome. Firefox not so much, although maybe it improved.

* Quake 1-3, as well as other earlier id games. Really elegant and clean. Also not that big in total. Doom 3 has become much bigger in comparison but again maybe it has to be.

* CPython. Anyway interesting also for educational purpose.

* TensorFlow. Very much not Theano.

I really enjoy reading the source code of most projects which I used at some point. Some code is nicer, some not so nice, mostly judged by how easy it is to understand and how elegant it seems to be. In any case it really is rewarding to look at it as you will gain a much better understanding of the software and often you will also learn something new.

7
Top19 4 hours ago 0 replies      
Two I am familiar with are OpenERP and OpenEMR.

OpenERP, now Odoo, is written in Python.

OpenEMR is written in PHP. It dates from a while ago, but has been mostly updated to the latest PSR standards.

Might also try OrangeHRM, but not sure what those guys are doing these days.

8
NKCSS 3 hours ago 0 replies      
As far as I know, SQLite has the reputation of being great (mostly for the test coverage and sheer number of unit tests).
9
chw9e 3 hours ago 0 replies      
Artsy has a bunch of Open Source applications that are interesting to check out, especially for those interested in mobile apps https://github.com/artsy
10
marknote 47 minutes ago 0 replies      
Has anyone mentioned SQLite?
11
kawera 2 hours ago 0 replies      
PostgreSQL and Apache HTTP server.
12
rrmmedia 3 hours ago 0 replies      
Check out ERPNext written in python https://erpnext.com
13
sidcool 4 hours ago 0 replies      
Shameless plug, but Bahmni and the Go CD open source projects.
12
Wilbur Wright's Letter to the Smithsonian (1899) si.edu
16 points by sethbannon  4 hours ago   9 comments top 5
1
finolex2 1 hour ago 1 reply      
Wilbur Wright actually hoped to enroll at Yale and become a teacher, for which he took college prep courses. All that changed when he suffered a brutal injury during an ice hockey game, following which he retreated into a long depression and eventually opened a bicycle shop in 1892.
2
dmurray 1 hour ago 0 replies      
If like me you were wondering how the Smithsonian replied, there are links to other letters on the same page. The next one is a letter of thanks for sending certain pamphlets and recommendations, enclosing one dollar for another book.
3
Gravityloss 1 hour ago 0 replies      
It's fascinating to see "behind the curtain" like this. Same with the civil rights movement, etc. People know a few landmark events, but it's harder to express or celebrate the years of methodical work behind them.
4
tlb 2 hours ago 1 reply      
They succeeded 4.5 years later.

Their argument that birds make maneuvers 3-4x more aggressive than needed for level flight, and therefore level flight may be within the reach of man, is an intriguing one. You could make parallel arguments about AI today, for example that driving a car requires only 1/3 - 1/4 of a brain's power.

5
interfixus 2 hours ago 1 reply      
"I am an enthusiast, but not a crank in the sense that I have some pet theories as to the proper construction of a flying machine"

So, cranks were a well-known commodity already in the nineteenth century. Who would have guessed...

13
MIT's Pathway to Fusion Energy [video] youtube.com
104 points by mozumder  12 hours ago   26 comments top 9
1
abefetterman 9 hours ago 3 replies      
Overall this is presenting a smaller university-class tokamak with advanced superconductors to try to reach Q>2 (scientific breakeven). One of the big advantages of higher fields is that the fusion power goes like B^4. I think this is an interesting idea, but it's hard to imagine the US funding something like this at the same time as ITER. Last year's talk [1] suggests "alternative funding," pointing to other private fusion research, which I am dubious of. There is a mindset that "if these bad ideas get funded, our good idea should get funded more," which we know is not how funding works.

As a former researcher of alternative magnetic confinement schemes, I'm disappointed the latest research in FRCs and mirrors didn't make it into this talk. Viewers should take into account that this, like most talks, is pushing an agenda, in this case a new device called SPARC. It appears to also be a way of using the incredibly talented tokamak researchers at MIT now that Alcator C-Mod is not operating.

[1] http://library.psfc.mit.edu/catalog/online_pubs/iap/iap2016/...

3
ChuckMcM 7 hours ago 1 reply      
Of course if you can build a 200MW plant (ARC) for $50B that has an operating cost that allows for it to pay for itself in 10 years you'll have companies like Apple or Facebook building them.
4
mcqueenjordan 10 hours ago 3 replies      
Summary in text for those of us that cannot watch a video at present?
5
tekkk 2 hours ago 0 replies      
Cool. I posted this same link two times already. No matter, it's a very interesting video but I'm quite curious how people can get their links to the front page. It seems impossible unless you rapidly gain like tens of upvotes or else it just drops from the new list and goes unnoticed.
6
SCAQTony 10 hours ago 0 replies      
The lecture begins at the 2:20 mark.
7
johnnybowman 9 hours ago 1 reply      
Does it mention anything about timeline?
8
mozumder 9 hours ago 0 replies      
Going beyond the tech into the business side: at $500 million a pop for the smaller SPARC-sized fusion reactors, there's an opportunity for startup funding here. Probably a market of 1,000-10,000 of these smaller reactors around the world, just for the initial first generation.
9
crimsonalucard 7 hours ago 0 replies      
Did not know Tony Stark's arc reactor was based on real fusion research. Interesting.
14
The best defense against malicious AI is AI technologyreview.com
70 points by etiam  11 hours ago   31 comments top 11
1
bem94 19 minutes ago 0 replies      
Depending on one's level of idealism, the best defense against malicious AI is actually educating programmers / hackers / engineers on the implications of their work, and encouraging a sense of decency & foresight.

It baffles me that we just accept the kind of malice inflicted on people by programmers because "someone will always do it". As a profession / collection of skilled persons, we should really be better than that.

Obviously, one cannot see the future, nor would we want to be paralyzed by fear of doing anything. But there is a certain minimum requirement for collective responsibility which I really don't think we are meeting at the moment.

2
strictnein 4 hours ago 0 replies      
On a related note, at Blackhat this year there's a presentation on using Machine Learning based malware detection to train your malware to evade ML based malware detection:

https://www.blackhat.com/us-17/briefings/schedule/index.html...

3
gsg 1 hour ago 1 reply      
So... the only way to stop a bad guy with a GAN is a good guy with a GAN?
4
zitterbewegung 6 hours ago 1 reply      
For something that is more oriented toward patches and network defense see http://archive.darpa.mil/cybergrandchallenge/ .
5
astrojams 9 hours ago 1 reply      
Would strategies used by AI's that play imperfect information games like Poker be useful for winning a contest like this?
6
etiam 11 hours ago 1 reply      
I'm following the original title for the post for now, but frankly I'm unhappy with almost everything about the style of it. I would encourage changing it.
7
jerry40 2 hours ago 1 reply      
The name of the topic reminds me of "Watchbird", the story written by Sheckley.
8
s-brody 4 hours ago 0 replies      
Well, if there were no AI, there would be no malicious AI either.
9
zeep 5 hours ago 0 replies      
let's hope that this good AI won't turn bad once it's too late to switch to something else...
10
melling 9 hours ago 4 replies      
Are we still wasting our time talking about evil AI at this premature time? Can someone dig up Andrew Ng's comments on this?
11
prodtorok 8 hours ago 0 replies      
The Oracle.

She possesses the power of foresight, which she uses to advise and guide the humans attempting to fight the Matrix

15
Mario Kart director philosophical about need for the blue shell arstechnica.com
27 points by Tomte  5 hours ago   6 comments top 3
1
KozmoNau7 2 hours ago 1 reply      
The blue shell is only annoying for try-hard ultra-competitive weenies, who really should be playing serious racing or even simulation games instead.

Mario Kart is a fun party game, not some ultimate test of skill.

2
csydas 35 minutes ago 0 replies      
I think it was Double Dash where it was most apparent that well-timed powerslides and boosts would let you avoid the Blue Shell, and I think more effort should be put into this balance of being able to avoid it sometimes. I know that when I was younger and we'd meet on the weekends to play, it was always a riot when someone dodged the shell. To me this seems like the right engagement: you hear the shell, and instead of just bracing for the inevitable you try to position yourself to dodge it. It also makes the use of the shell require a bit more thinking, since you want to avoid putting 1st place in a situation where it's easy to dodge.
3
dcip6s 2 hours ago 0 replies      
Please just make it optional in multiplayer matches; that's all we ask.
16
Wisconsin company to offer microchip implants to its employees kstp.com
18 points by rmason  3 hours ago   27 comments top 5
1
mic47 57 minutes ago 1 reply      
Can someone explain why people prefer implants over rings? Functionality-wise it's the same: reading an RFID tag, maybe with some light computation.

People wear rings all the time, application is painless, easy to change once compromised or if there is a new model, and provides same features. Chips would have advantage if they would provide better IO, but currently they don't.

2
roel_v 44 minutes ago 1 reply      
Meh. I got an RFID implant in 2010-ish, but took it out after a few years because there were so few benefits. Maybe if there were a universal standard, or if the range were > 10 cm, or if it had more than a few bytes of memory. I think most people who got a chip implant over the last 10 years came to the same conclusion. The only applications that show up in the press every now and then are like this one, for payments. But never something that is widely available.

It's great to get yourself labeled a freak, though. I got mine done in a piercing shop, and when I walked in, the place was full of people with dozens of facial piercings, stretched earlobes and full arm and neck tattoos. But as soon as I mentioned the chip implant and showed that I was serious by showing the actual one I wanted put in, they all started murmuring and looking at me like I had been sent back in time from the year 2210. The owner wasn't sure how to feel when I pulled out a laptop and an RFID reader and grinned like an idiot when I successfully logged on to the machine by waving my hand over the reader.

3
struppi 1 hour ago 5 replies      
OK, maybe I'm old or old-fashioned or just not creative enough...

But what exactly are the advantages of this implant over NFC-cards or something like Apple Pay? That I cannot forget my card or my phone? I don't even know when this last happened to me...

4
mattcoles 50 minutes ago 1 reply      
Good to know you can steal money from these employees with a handshake.
5
cwilson 1 hour ago 4 replies      
The title is, of course, misleading. They are offering employees a microchip implant. It's not required.

Still crazy that anyone would opt-in to do this, but a misleading headline all the same.

17
Most open-source software is libraries or frameworks medium.com
74 points by bhjs2  11 hours ago   36 comments top 14
1
SwellJoe 9 hours ago 2 replies      
This is a weird premise. It's github. Of course it is predominantly tools for coders. End user applications have wholly separate paths to the user. An application could have a handful of contributors/stars/downloads etc. on github and still have millions of downloads somewhere else (a project I work on fits that description). And, it might not be on github at all and still be used by millions daily.

I guarantee openssh, Firefox, LibreOffice, and probably a hundred other applications, are (orders of magnitude) more popular than the top applications on this list.

So, if this were titled, "Most open source software on github..." I wouldn't object. But, I have to completely reject the premise here, because I know that there's an entire iceberg of OSS software, including applications, that is completely excluded from the listing by virtue of either not being on github or being on github, but not using github as its primary method of distribution and promotion, and this data completely ignores everything below the surface.

Also, it's probably dangerous to begin to think of "Open Source Software" as only being "Software that has a public github repo".

2
fermigier 3 hours ago 2 replies      
1) Title is misleading, should have been "Most popular projects on GitHub are ...".

2) Judging from the top-5 list in the post, between 1/4 and 1/3 of the projects have been miscategorized.

Ex:

- https://github.com/chrislgarry/Apollo-11 -> should have been categorized as "Application", not "Documentation".

- https://github.com/tensorflow/tensorflow -> should have been "Library", not "Tool".

- Electron, Socket.io, Moment, lodash... are "Web libraries", not "Non-web libraries"

and probably more.

I hope the reviewers catch these errors before they publish this in a research journal.

3
cobbzilla 8 hours ago 0 replies      
Remove the words "open source" and the title makes more sense to me.

Most software is libraries and frameworks -- you just don't get to see most/all of the proprietary stuff, since it's not on github or anywhere else.

4
sigjuice 5 hours ago 0 replies      
The original title is misleading. It should say "Most GitHub projects are libraries or frameworks." It is quite a leap to go from 60% of GitHub projects to most open source software.
5
jcelerier 1 hour ago 0 replies      
How much of the software in Debian or any other big distro repo is actually developed on GitHub, though?
6
asragab 10 hours ago 1 reply      
I am not necessarily surprised by the results.

It is worth noting that the second most popular "software tool" tucked in between oh-my-zsh and homebrew (both command-line tools/packages) is Tensorflow.

That has to say something about the current state of the industry, though admittedly, I am a little confused as to why it was classified as a "software tool" and not say, "a non-web library or framework."

7
CiPHPerCoder 10 hours ago 1 reply      
Interestingly, I looked through their entire CSV and was surprised to find one of my projects in their 5000 most popular set. Unfortunately, it was a reading list (awesome-appsec), not an actual software project. But still, kind of neat.
8
ivanbakel 10 hours ago 2 replies      
Very surprised by the ubiquity of JS in non-web libraries/frameworks. I know it's a well-played fiddle, but it's saying something damning about our ability to put together quick and easy applications with other languages - or maybe just the number of people who start with webdev - when JS becomes the first choice. Is it just a consequence of how much UI work has been put into HTML rendering engines?
9
profpandit 8 hours ago 0 replies      
Open source is associated with philanthropy and with being really smart. If you're a really smart coder, out to change the world, you'll obviously build a library or framework to help the other, not so fortunate coders around you, as a way to bring relief from the tyranny of the big monopolistic corporates (read Microsoft). It's your David changing the evil ways of their Goliath.
10
johannes1234321 10 hours ago 2 replies      
What else should be top? - Applications and tools use libraries or are split up into library components ... if there were many more applications than libraries there would be a lot of NIH ...
11
skewart 9 hours ago 1 reply      
Is it really accurate to call socket.io a "non-web" library? Even lodash is questionable in that category - while not strictly tied to the web, it is primarily used in web apps. I'd be curious to see lower level stuff and/or stuff further removed from the web in the non-web category.
12
conceptme 4 hours ago 0 replies      
But every piece of software can use hundreds or thousands of libraries.
13
nickpsecurity 8 hours ago 0 replies      
Counterpoint: most proprietary apps are probably CRUD apps or spreadsheet stuff. There's still a lot of good, proprietary apps. Likewise for FOSS. Turns out the quick and easy solutions to common problems that please project owners happen more often than solutions to hard problems or things without immediate ROI. It says more about people's priorities than proprietary or FOSS software.
14
slaymaker1907 5 hours ago 1 reply      
Why is lodash considered non-web?
18
Lets Get Excited About Maintenance nytimes.com
174 points by nothinggoesaway  9 hours ago   67 comments top 15
1
nathanvanfleet 6 hours ago 8 replies      
I mean, I think the best way to start with this is to use a word other than "maintenance." That's not really the most sexy word if you really want people to get behind it. Furthermore it just suggests that the work is keeping something as good as it was from the beginning. Filling in holes, giving it a coat of paint every now and then.

What it really should be called is "refinement." The innovation ends up being incredibly crude but it gets the job done. How can we build on that, make it better and less coarse than it was? How can we make it more efficient?

2
liveoneggs 8 hours ago 12 replies      
(tangentially related to this article)

I have recently come to realize that, at least in my world, source code older than five years is basically doomed. Developers simply refuse to work on it.

The code that makes it to five years is extraordinary as most of it "dies" before reaching the eighteen month mark.

As a result I have recently been shifting my view to support replace-ability vs maintainability whenever possible. I'm not totally sure how to achieve it, though. Most current trends seem to be towards increasing baggage. (docker)

Data lives on and on and on, however. Data is king. :)

3
TYPE_FASTER 7 hours ago 0 replies      
This article posits that you can either maintain, or innovate. This is not at all true. We can, and should, innovate while maintaining to improve our maintenance.

We can use automation to gather data we've never had before. We can use this data to help prioritize maintenance tasks, and get them done faster with less interruption to service.

4
maxxxxx 8 hours ago 1 reply      
I always think this is the big strength and weakness of the US at the same time. This country is more willing than other countries to abandon old things and move on to the next thing. But right now it seems to be falling into the trap of a lot of pseudo innovation while the foundation is crumbling. Not sure what the best way is to move forward.
5
oautholaf 7 hours ago 0 replies      
Stewart Brand, "The romance of maintenance is there is no romance of maintenance". This was basically his point in "How Buildings Learn" (and really the Long Now too)
6
harpiaharpyja 8 hours ago 1 reply      
While maintenance is certainly undervalued, I don't think that means innovation is overrated.
7
SN76477 7 hours ago 0 replies      
Freakonomics did a great story on maintenance last October: http://freakonomics.com/podcast/in-praise-of-maintenance/
8
marcoperaza 6 hours ago 1 reply      
Sometimes you need to stop maintaining things. Towns rise and fall because the economy changes. If there's no economic activity bringing money into a town, it should eventually disappear. You need the political courage to stop wasting money on propping it up. Some bridges, pipes, roads, and trains shouldn't be repaired or replaced. They need to be closed if dangerous, and allowed to disappear into history. If holdouts want to keep living in their dead town, good for them, but don't make the rest of us pay for it.
9
jtraffic 6 hours ago 0 replies      
I agree with others on the thread that "maintenance or innovation" is a false dichotomy. In particular, I think we need innovative maintenance. For example, this article about new ways to fix potholes: https://www.economist.com/news/science-and-technology/217003...
10
acomjean 6 hours ago 1 reply      
It's hard too. At my current job with a large existing code base (Perl/Python/shell scripts and Java), keeping it up to date is a large portion of my time.

The thing is I get very little credit for fixing something that is broken, but creating something new generates accolades and the illusion of productivity...

11
jshelly 8 hours ago 0 replies      
I'm a big fan of this topic. Something we seem to forget is that sometimes new is not necessarily better.
12
booleandilemma 7 hours ago 0 replies      
HN had another article about this recently:

https://news.ycombinator.com/item?id=14293775

13
omegaworks 2 hours ago 0 replies      
The problem seems to be that we've invested far too much federal money into projects that have to be maintained by local sources of funding. Big federal grants for development that will not produce enough tax revenue to offset the externalities and infrastructure costs that are required to maintain that development are just albatrosses around states and cities' necks.

https://www.strongtowns.org/journal/2017/1/10/poor-neighborh...

14
peterwwillis 2 hours ago 0 replies      
In Philadelphia, I passed under the Columbia Bridge by foot, and nearly fell over when I looked up. An entire section of the bridge has spalled and a huge gap of reinforcing steel rods is rusted and exposed. This bridge will collapse. I found no evidence that there is any plan to reinforce or replace it. Luckily it "only" carries CSX freight. Anyone passing under it should be extremely wary.

https://goo.gl/maps/uFkLJoKU1DB2
https://goo.gl/maps/767CYu5Mwd62 (it actually looks worse than this up close)

I'd be interested to find out what the track record is of maintenance of infrastructure by private vs public entities.

15
legolassexyman 7 hours ago 0 replies      
How many times are they going to write this article?
19
Schizophrenias Tangled Roots sapiens.org
25 points by antigizmo  7 hours ago   6 comments top
1
DanBC 1 hour ago 0 replies      
Jacqui Dillon is an influential person in UK mental health care.

This submission starts with a description of her service experience 25 years ago. It's clearly an awful experience.

But it doesn't talk about the things that have changed.

People who report sexual abuse are much more likely to be believed. Things aren't good, but they're much better than they used to be.

There are early intervention in psychosis services in some parts of England. (Regional commissioning means they don't exist everywhere.) These work with people who have a first episode of psychosis.

Very recently there has been a lot of work around perinatal mental health (the Twitter user Rosey - @pndandme runs some Twitter chats and they'd have lots of info about how good / bad services are).

Importantly, mental health teams are multidisciplinary teams. That team would include social workers (who have some statutory duties), mental health nurses, occupational therapists, and a psychiatrist. They'd arrange access to other teams - psychologists, housing advice, debt and benefit advice, employment support, social activity, exercise support. (Not all of these would be "the NHS"; some would be charities, community interest companies, or private companies. They should all be free at the point of delivery.)

MH professionals are much more comfortable with "breakthrough symptoms" - they know that antipsychotic medication has pretty devastating side effects, so they want the patient to be on the minimum needed dose. This might mean that people still have auditory hallucinations, but are given support to cope rather than being heavily medicated.

The article suggests that many psychiatrists are only there to prescribe meds. That is an important part of their job (they're the only ones who can prescribe meds), but there are plenty of doctors who fully accept the "bio psycho social" model, and who focus on the psychological and social factors.

The article makes it sound like none of this is happening.

Also, go careful with Luhrmann; there are several critiques of that report.

20
Fast-key-erasure random-number generators yp.to
29 points by fanf2  7 hours ago   1 comment top
1
djmdjm 5 hours ago 0 replies      
The design djb talks about of pre-generating batches of random numbers, deleting the cipher key and erasing the numbers as they are extracted is exactly what OpenBSD has done since Markus Friedl switched arc4random() to use ChaCha20 in 2013:

https://github.com/openbsd/src/commit/90c1fad70a3483c2c72c3c...

The design was inspired by Nick Mathewson's libottery: https://github.com/nmathewson/libottery

I think he misses the point in his criticism of getrandom() - that it is intended to be the interface by which the libc PRNG gets its seed; userspace programs should just use the libc PRNG instead of going off to the kernel (i.e. arc4random()).
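
To make the batch-and-ratchet idea concrete, here is a minimal sketch - not OpenBSD's or djb's actual code; SHAKE-256 merely stands in for the stream cipher, and Python cannot truly erase memory, so treat it as illustration only:

  import hashlib, os

  class FastKeyErasureRNG:
      def __init__(self, seed: bytes = b""):
          self._key = seed or os.urandom(32)  # 32-byte key seeded from the OS

      def random_bytes(self, n: int) -> bytes:
          # Expand the current key into one batch of 32+n bytes.
          buf = hashlib.shake_256(self._key).digest(32 + n)
          # Ratchet immediately: the first 32 bytes become the new key,
          # so the key that produced this batch is gone before output is used.
          self._key = buf[:32]
          return buf[32:]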

21
Ask HN: In what creative ways are you using Makefiles? Any samples?
23 points by kamranahmed_se  3 hours ago   27 comments top 18
1
richardknop 2 minutes ago 0 replies      
I use Makefiles as a wrapper for build/test shell commands. For example I often define these targets:

- make test : run the entire test suite in the local environment

- make ci : run the whole test suite (using docker-compose so this can easily be executed by any CI server without having to install anything other than docker and docker-compose), generate a code coverage report, and use linter tools to check code standards

- make install-deps : installs dependencies for the current project

- make update-deps : checks if there is a newer version of the dependencies available and installs it

- make fmt : formats the code (replaces spaces with tabs or vice versa, removes extra whitespace from the beginning/end of files, etc.)

- make build : compiles and builds a binary for the current platform; I also define platform-specific subcommands like make build-linux or make build-windows
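
A minimal sketch of such a wrapper (assuming a Go project purely for concreteness; every tool, path and compose file here is a placeholder, and recipe lines must start with a tab):

  .PHONY: test ci install-deps update-deps fmt build build-linux

  test:
  	go test ./...

  ci:
  	docker-compose -f docker-compose.test.yml up --abort-on-container-exit

  install-deps:
  	go get -d -t ./...

  update-deps:
  	go get -d -t -u ./...

  fmt:
  	gofmt -w .

  build:
  	go build -o bin/app .

  build-linux:
  	GOOS=linux GOARCH=amd64 go build -o bin/app-linux .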

2
alexatkeplar 6 minutes ago 0 replies      
Until recently we used them at Snowplow for orchestrating data processing pipelines, per this blog post:

https://snowplowanalytics.com/blog/2015/10/13/orchestrating-...

We gradually swapped them out in favour of our own DAG-runner written in Rust, called Factotum:

https://github.com/snowplow/factotum

3
chubot 24 minutes ago 0 replies      
Not particularly creative, but I'm using it to generate this blog:

http://www.oilshell.org/blog/ (Makefile not available)

and build a Python program into a single file (stripped-down Python interpreter + embedded bytecode):

https://github.com/oilshell/oil/blob/master/Makefile

Generally, though, I prefer shell to Make; I just use Make for the graph, while shell has most of the logic. Honestly, Make is pretty poor at specifying a build graph.
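
A sketch of that split (build.sh and the posts/ layout are invented; the real oilshell Makefile isn't published): Make declares only the dependency graph, and each recipe delegates to the script:

  POSTS := $(wildcard posts/*.md)
  HTML  := $(POSTS:posts/%.md=out/%.html)

  all: $(HTML)

  out/%.html: posts/%.md build.sh
  	@mkdir -p $(@D)
  	./build.sh $< $@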

4
new299 1 hour ago 2 replies      
My favorite use was during my PhD. My thesis could be regenerated from the source data, through creating plots with gnuplot/GRI, to finally being assembled from the LaTeX and EPS files into the final PDF.

It was quite simple really, but really powerful to be able to tweak/replace a dataset, hit make, and have a fully updated version of my thesis ready to go.

5
Figs 56 minutes ago 0 replies      
I once implemented FizzBuzz in Make: https://www.reddit.com/r/programming/comments/412kqz/a_criti...

Even though Make does not have built-in support for arithmetic (as far as I know), it's possible to implement it by way of string manipulation.

I don't recommend ever doing this in production code, but it was a fun challenge!
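
The usual trick, sketched: represent the number n as a list of n words, so list operations become arithmetic and $(words ...) converts back to decimal (recipe lines must start with a tab):

  to_dec = $(words $1)
  inc    = $1 x
  add    = $1 $2

  three := x x x

  demo:
  	@echo "3 + 1 = $(call to_dec,$(call inc,$(three)))"
  	@echo "3 + 3 = $(call to_dec,$(call add,$(three),$(three)))"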

6
shakna 25 minutes ago 0 replies      
Lisp in make [0] is probably the most creative project I've seen. For myself, in some tightly controlled environments I've resorted to it to create a template language, as something like pandoc was forbidden. It was awful, but worked.

[0] https://github.com/kanaka/mal/tree/master/make

7
unmole 21 minutes ago 0 replies      
Not mine but here's a Lisp interpreter written in Make: https://github.com/kanaka/mal/tree/master/make
8
rrobukef 33 minutes ago 0 replies      
I use it to set up my programming environment. One Makefile per project, semi-transferable to other PCs. It contains:

  * a source code download,
  * copying IDE project files not included in the source,
  * creating build folders for multiple builds (debug/release/coverage/benchmark, clang & gcc),
  * building and installing a specific branch,
  * copying to a remote server for benchmark tests.

9
accatyyc 28 minutes ago 0 replies      
One "creative" use is project setup. Sometimes, less technical colleagues need to run our application, and explaining git and recursive submodules takes a lot of time, so I usually create a Makefile with a "setup" target that checks out submodules and generates some required files to run the project.
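
A hedged sketch of such a setup target (the submodule layout and the .env files are invented for illustration):

  .PHONY: setup
  setup:
  	git submodule update --init --recursive
  	test -f .env || cp .env.example .env
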
10
INTPenis 1 hour ago 1 reply      
Not exactly creative, but KISS: I use only a Makefile for a C project that compiles on Linux, BSD and macOS.

Point being that autoconf is often overkill for smaller C projects.

11
Da_Blitz 48 minutes ago 1 reply      
I use it to solve dependency graphs for me in my programming language of choice; at the moment this involves setting up containers and container networking, but I throw it at anything graph-based.

Make seems to be easier to install and get running than the myriad of non-packaged, GitHub-only projects I have found.

12
dvfjsdhgfv 1 hour ago 0 replies      
The main question to ask is whether you really need to use make. If you do, there is practically no limit to what you can do with it these days, including deployment to different servers, starting containers/dedicated instances, etc. But unless you are already using make or are forced to, it's better to check out one of the newer build systems. I personally like CMake (it actually generates Makefiles).
13
stephenr 1 hour ago 0 replies      
I guess it depends what you consider creative?

I use one to build my company's Debian Vagrant boxes: https://app.vagrantup.com/koalephant

I use one to build a PHP library into a .phar archive and upload it to BitBucket

My static-ish site generator can create a self-updating Makefile: https://news.ycombinator.com/item?id=14836706

I use them as a standard part of most project setup

14
johnny_1010 51 minutes ago 0 replies      
I use a Makefile to generate my static website. Also my CV: LaTeX and make work well together.
15
matt4077 1 hour ago 1 reply      
I'm using ruby's rake in almost every project, even when it's not ruby otherwise.

It has much of the same functionality, but I already know (and love) ruby, whereas make comes with its own syntax that isn't useful anywhere else.

You can easily create workflows, and get parallelism and caching of intermediate results for free. Even if you're not using ruby and/or rails, it's almost no work to still throw together the data model and use it for data administration as well (although the file-based semantics unfortunately do not extend to the database, something I've been meaning to try to implement).

Lately, I've been using it for machine learning data pipelines: spidering, image resizing, backups, data cleanup etc.

16
yabadabadoo 2 hours ago 1 reply      
I use make to pre-compile markdown into HTML for a static website.
17
rurban 1 hour ago 0 replies      
I'm creating a config.inc makefile during make to store config settings, analogous to the config.h: https://github.com/perl11/potion/blob/master/config.mak#L275

Instead of bloated autotools I also call a config.sh from make to fill some config.inc or config.h values, which even works fine for cross-compiling.
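
The pattern, sketched (config.sh here stands for whatever probe script the project ships; recipe lines must start with a tab):

  config.inc: config.sh
  	sh ./config.sh > $@

  -include config.inc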

18
jmurphyau 1 hour ago 0 replies      
I use make to make things
22
The Million Dollar Homepage as a Decaying Digital Artifact harvard.edu
351 points by sjmurdoch  19 hours ago   132 comments top 31
1
_kst_ 16 hours ago 2 replies      
I can still access http://www.milliondollarhomepage.com/

I can't currently access the article at https://lil.law.harvard.edu/blog/2017/07/21/a-million-squand...

[Insert joke about irony here.]

2
schiffern 11 hours ago 0 replies      
>Of the 2,816 links that embedded on the page (accounting for a total of 999,400 pixels), 547 are entirely unreachable at this time. A further 489 redirect to a different domain or to a domain resale portal, leaving 1,780 reachable links

Looking at the million dollar homepage, many of the links were never valid:

http://paid & reserved/

http:// paid and reserved - accent designer clothing/

http://reserved for edna moran/

http://paid & reserved for paul tarquinio/ (1200 pixels)

http://pending order/

These links are all shown in plain red ("link to unreachable or entirely empty pages") in the "visualization of link rot," so it looks like the authors didn't account for invalid URLs.

3
Houshalter 16 hours ago 4 replies      
Gwern has a good summary of the research in this: https://www.gwern.net/Archiving%20URLs

>In a 2003 experiment, Fetterly et al. discovered that about one link out of every 200 disappeared each week from the Internet. McCown et al 2005 discovered that half of the URLs cited in D-Lib Magazine articles were no longer accessible 10 years after publication [the irony!], and other studies have shown link rot in academic literature to be even worse (Spinellis, 2003, Lawrence et al., 2001). Nelson and Allen (2002) examined link rot in digital libraries and found that about 3% of the objects were no longer accessible after one year. Bruce Schneier remarks that one friend experienced 50% linkrot in one of his pages over less than 9 years (not that the situation was any better in 1998), and that his own blog posts link to news articles that go dead in days [2]; Vitorio checks bookmarks from 1997, finding that hand-checking indicates a total link rot of 91% with only half of the dead available in sources like the Internet Archive; the Internet Archive itself has estimated the average lifespan of a Web page at 100 days. A Science study looked at articles in prestigious journals; they didn't use many Internet links, but when they did, 2 years later ~13% were dead [3]. The French company Linterweb studied external links on the French Wikipedia before setting up their cache of French external links, and found - back in 2008 - already 5% were dead. (The English Wikipedia has seen a 2010-2011 spike from a few thousand dead links to ~110,000 out of ~17.5m live links.) The dismal studies just go on and on and on (and on). Even in a highly stable, funded, curated environment, link rot happens anyway. For example, about 11% of Arab Spring-related tweets were gone within a year (even though Twitter is - currently - still around).

4
resf 18 hours ago 3 replies      
Decaying in more than one way. The JS files on milliondollarhomepage.com start with:

 /* FILE ARCHIVED ON 5:47:20 Aug 6, 2015 AND RETRIEVED FROM THE INTERNET ARCHIVE ON 20:45:17 Aug 24, 2015. JAVASCRIPT APPENDED BY WAYBACK MACHINE, COPYRIGHT INTERNET ARCHIVE. ALL OTHER CONTENT MAY ALSO BE PROTECTED BY COPYRIGHT (17 U.S.C. SECTION 108(a)(3)). */
I guess someone didn't keep backups?

5
krallja 18 hours ago 4 replies      
The Million Dollar Homepage is not decaying (it is still serving its million dollar purpose) - it is the Web itself that has decayed. The brittleness of URIs is on full display. "Cool URLs don't change," but most of these URLs were never cool: they had to rent coolness from Internet cool kid Alex Tew.
6
glenstein 17 hours ago 2 replies      
The article seems to be suggesting that the Million Dollar Home Page has in some sense failed to fulfill its promise because many of the links are now dead. I don't follow that logic at all. To me it seems that the MDHP's job was to be an iconic piece of internet history, and they've entirely fulfilled their end of the bargain.
7
sixQuarks 15 hours ago 3 replies      
I actually purchased a $300 spot on this. I did get quite a few clicks, but very low-quality traffic. Mostly, I got lots of offers from copycat sites to join their "billion dollar" homepage or whatnot.

It's crazy how many copycats came out, very unoriginal thinking going on.

8
ChuckMcM 14 hours ago 1 reply      
I think in many ways it is not so much a 'decaying digital artifact' as an excellent representation of the fallacy upon which a lot of the Internet hangs. In the Library of Alexandria you didn't have scrolls disappear because the kingdom where they originated had been crushed under the boot of an invader. But the Internet is no great library, no repository of knowledge, nor an oasis of independent thought. The Internet is a conversation in a crowded room with amplified shotgun microphones pointed at all who walk through it.
9
AdmiralAsshat 16 hours ago 0 replies      
I'm not sure why the article considers it "squandered": it did its job as long as the advertisers cared to maintain their links.

It hardly seems fair to blame a billboard being in disrepair if the company it advertised no longer exists.

10
narrator 17 hours ago 0 replies      
I think all the broken links just go to show that failure in business is the norm, or that someone who thought it would be a good idea to promote their company on this service is probably not good at running a business.
11
5_minutes 45 minutes ago 0 replies      
An interview with the creator would've been a nice addition to the story.
12
ernsheong 17 hours ago 3 replies      
FWIW, I'm building https://PageDash.com as a private web archive to address the problem of link rot, beginning from a personal level. Launching in late August. Think of it as a private version of perma.cc.
13
brosky117 18 hours ago 15 replies      
I just heard about the "Million Dollar Homepage" for the first time last week. Would this idea (or one like it) work today? Making a million dollars for something so bizarre, fun, and straightforward sounds amazing. Can anyone reference other attempts at similar ideas?
14
aidos 17 hours ago 2 replies      
Would be interesting to know how many people on the million dollar homepage are on HN. I imagine there's a wonderful cross over between the two groups.

Even though it's with a business we're not doing now, my business partner and I are on there.

Edit: don't think it deserves a downvote - is it not an interesting question? I bet there are loads of serial entrepreneurs on both

15
tejtm 6 hours ago 0 replies      
As good a time as any to trot out my hobby horse, with suggestions on how to mitigate data rot. Aimed at science, but more broadly applicable.

"Identifiers for the 21st century"https://doi.org/10.1371/journal.pbio.2001414

note/claimer/disclaimer: Although I am included as an author I do not write that well.

16
mathattack 2 hours ago 0 replies      
1780/2816 links being reachable is actually much higher than I'd expect over 12 years. I'm not sure if that's what I would have predicted from the outset.
17
hellbanner 11 hours ago 0 replies      
A more modern variant, https://catbillboard.wordpress.com/

"Million Dollar Cat Billboard project sells 10 000 squares (places on a billboard) $100 dollars each to make worlds first ever cat billboard and put it up in 10 cities around the globe for a month. To proudly show your cat to the world you need to buy at least one square. But of course you can buy as many of them as you wish as long as they are available."

18
rxlim 10 hours ago 2 replies      
I wonder how he got everything to fit as more and more space was sold and if it was a manual process? It must have been like playing Tetris on expert mode.
19
Gargoyle 15 hours ago 0 replies      
Do this with an ICO, with your space verified via smart contract.

It's all in the marketing!

20
cdevs 17 hours ago 2 replies      
My first web page ever is in there. I'm not sure how special of a thing that is; I don't know how many icons are involved.

Also I wonder how word got around to me about things like this in the days of MySpace and Yahoo as my internet.

21
philip4534 17 hours ago 1 reply      
Xanadu lost.
22
amelius 17 hours ago 0 replies      
This homepage demonstrates what an average city would look like without any regulation.
23
chenster 7 hours ago 0 replies      
I'm just jealous.
24
pishpash 15 hours ago 2 replies      
Whatever happened to DOI? (Or leveraging Google's knowledge of redirects?) A lot of rot is hosting changes; the documents, if the author cared, could well be hosted somewhere else.
25
keyboardmonkey 10 hours ago 0 replies      
It was always destined to decay, and was always going to be a one-off success. Interesting in its success juxtaposed with its immediate pointlessness.
26
Nursie 15 hours ago 1 reply      
Oh wow, I remember that.

1 million pixels for only a dollar each!

That guy made a nice bundle off the idea; it got picked up and hyped by the media so much I'm sure the companies that bought in got some ROI, or at least some publicity. Such was the extent of the dot com bubble that this sort of nonsense could happen and everyone cheered...

27
peter303 16 hours ago 2 replies      
I wonder what the "rot factor" is for scientific citations? Some professional societies I am in mandate URLs for bibliographical references. Most of the time these are peer-reviewed articles. But they can be softer references like Wiki reviews, data repositories, etc.
28
johnbowers112 15 hours ago 0 replies      
Here's an archive of the article for those having trouble accessing it: https://perma.cc/A6ZZ-79X6
29
smegel 14 hours ago 0 replies      
It's amazing how well designed the ads within the image are...it's a big jumble but many of them stand out quite strongly with just a single word. I wonder if they designed ads with the surrounding color context taken into account.
30
malthazzar 11 hours ago 0 replies      
the left of the yellow coupons ad in the right middle
31
fatjokes 17 hours ago 2 replies      
I didn't realize you bought the pixels permanently. How did the owner keep up with serving costs?
23
Accounts of the financial crisis leave out the dollar swap deals between banks prospectmagazine.co.uk
69 points by drtillberg  14 hours ago   9 comments top
1
jstanley 12 hours ago 3 replies      
Page doesn't seem to work. I only see 2 paragraphs and the second one fades out at the bottom.

I have not disabled javascript.

24
Maine growers cultivate gourmet mushrooms mainebiz.biz
42 points by Mz  13 hours ago   1 comment top
1
photoJ 8 hours ago 0 replies      
Maine is trying to diversify into highly valuable farmed goods, e.g. oysters and mushrooms... and of course legalization has created a stir...
26
Android Killed Windows Phone, Not Apple theverge.com
79 points by mpweiher  12 hours ago   66 comments top 17
1
Rjevski 7 hours ago 8 replies      
I'd say Microsoft killed Windows Phone. They reset the ecosystem twice (once when Phone 7 apps weren't compatible with 8, and again when Phone 8 apps weren't compatible with 10).

When you've got little apps to start with obviously the best idea was to throw them all out twice and hope the developers are still interested. Not to mention the dev environment required Windows 8 - personally this is what discouraged me from even trying to develop on the platform, I wasn't going to give up my perfect Windows 7 installation for a toy OS that will allow me to create apps for a toy phone while getting in my way when I tried to do real work.

Add to that a shitty web browser and quite slow devices and obviously its failure must be the fault of the competition, I mean how can such a great product fail?

The only good thing I can remember about my Windows Phone is that it handled IMAP push notifications, something iOS is still lacking.

2
greyman 16 minutes ago 0 replies      
I wonder how the fight between Android and iOS will play out. I kind of never liked Android and never owned one, but my wife has had several Android devices and I always considered them inferior compared to iOS - mostly talking about the operating system and how you "feel" using the device. But just yesterday I bought a Siemens A3 2017 model for my daughter, for 300 eur, and I was surprised how similar the phone felt to my iPhone 6s, which I paid 800 eur for... if this continues, I am not sure I'll stay with iOS after my next phone upgrade.

Also, I don't fully buy the argument that Windows Phone was unsuccessful because it was late. I think that doesn't matter that much - changing phones and even phone operating systems isn't such a big deal. After using iOS for about 8 years, I have no problem switching to something else if it proves to be better, or equal but cheaper. If Windows Phone had been better back then, more people would have switched to it at their next phone upgrade.

3
abdulmuhaimin 2 hours ago 0 replies      
Back in the day, I thought Nokia with its Meego OS would run away with the 3rd OS title. It was an OS that was much hyped while still in development.

Thanks to Elop and Microsoft we never saw that happen. It was killed before it was even born.

I bought the Nokia N9 (the only Meego phone released) a couple of months after its launch, knowing the OS was coming to its end. The Meego OS was a very polished and well-made OS. It was smooth, very intuitive, and simply the best touchscreen smartphone experience I've ever had. That was why Elop and WP infuriated me so much. It was a missed opportunity. Nokia took the easy route of getting paid by Microsoft to use its OS, and that bit them in the ass.

And so I'm glad that trash WP and Nokia failed spectacularly.

4
petilon 7 hours ago 6 replies      
It was Steve Ballmer and Steven Sinofsky that killed Windows Phone. They killed it first by being late: Android was released September 2008, Windows Phone November 2010. Two years behind Android!! They killed it also by not being sufficiently differentiated. Then they also killed it with the bland Metro look & feel. Microsoft learned nothing from previous failures such as Zune.

Microsoft has learned nothing from the failure of Windows Phone, and is now in the process of killing Windows. Yes, Windows 10 is pretty good, but as an app platform it is a failure. Even Microsoft products such as Teams and Azure Storage Explorer use Electron, not native Windows APIs. And why would any developer make native Windows apps? Ordinary users can't tell an app built using Electron from a native Windows app. Thanks to Metro and its bland flatness, Windows apps do not have a differentiated look & feel, so end users don't know to demand native Windows apps. So developers are better off using cross-platform technologies such as Electron. (Yes, Electron is bloated, but if your application is substantial this is not a deal breaker.)

5
bsclifton 3 hours ago 1 reply      
I don't think there is a single smoking gun - many things contributed to the downfall of Windows Phone. Microsoft had a decent market share with the old Windows CE based smartphones.

- Microsoft didn't care about mobile, thinking Windows CE was fine (they had ~42% marketshare in 2007)

- Windows Phone 7 was great, but it was too late by then (2010... 3 years after the iPhone was released and 2 years after the first Android phone)

- There were two resets (7 => 8, 8 => 10) which screwed customers hard. With the 7 => 8 upgrade, not only were apps incompatible, the OS was incompatible with previous hardware.

- The App Store was mostly full of garbage apps (lots of fake apps - hard to find the genuine app)

- Carriers didn't do a very good job pushing Windows Phone (can you blame them? :P )

From my perspective (former WP user, 2011 - 2015), the biggest WTF to me was when Microsoft bought Nokia. That seems to be about the time they just completely gave up.

6
bitmapbrother 7 hours ago 2 replies      
I disagree with this blogger's analysis. Instead, I believe Windows Phone committed suicide. When you Osborne your platform repeatedly, change your development strategy numerous times and alienate current and potential developers, you have no one to blame but yourself. So, when Windows Phone saw the end coming it decided to end its struggle and quit trying to fight the inevitable.
7
Nursie 48 minutes ago 0 replies      
I watched the tech news about Windows Phone over the years it was released as 7 and 8.

The paid-for hype was obvious and hollow. Commentators sprang up in tech-related forums everywhere, singing the praises of the development experience of WP7 with personal testimony of how awesome it was, months before it was available to developers. Then scarcely a year later, when it was clear that WP7 was not setting the world on fire, exactly the same obvious, transparent marketing hype started being produced for WP8. When challenged about this, it was proclaimed that WP7 had only ever been meant as a transition phase, and 8 was where the future really was!

It was so blatant, and so pointless.

8
nwah1 5 hours ago 1 reply      
The Universal Apps idea makes a lot of sense, and I don't think the refactors are what harmed it. Getting a late start is the primary problem, since it is all about getting apps. Windows succeeded on the desktop for that reason, and failed on mobile so far for the same reason.

But they have made all the right moves over the past five years, and have lots of momentum, cash, and goodwill from the tech community. They will do some sort of big mobile push in the near future. Probably some kind of Surface Phone, which can double as a desktop.

9
holydude 3 hours ago 1 reply      
I believe WP/MS committed suicide. They did not listen to their market, and apart from the obvious app compatibility issues and lack of apps in general, they totally forgot about people not really liking the unified design of the phones (you can't go the Apple way if your audience does not like your defaults). Tiles were really meh and people still prefer app icons a la iOS/Android.

I believe Microsoft still has a chance, but they need to 1) talk to Samsung and other big Android/smartphone manufacturers, and 2) make a killer feature.

10
roryisok 2 hours ago 0 replies      
I read this on my Lumia 925, which I continue to build apps for. I have a newer android phone for testing and building apps, but the wp is still my day to day phone. I think there will continue to be a niche market for wp, for the people that love it, and that see it as a superior UI.

iOS and Android are the future, but WP is vinyl.

11
shmerl 6 hours ago 0 replies      
I don't regret MS messing up their mobile strategy, except for them directing Nokia to kill Meego. That was a pretty bad loss.
12
gok 5 hours ago 2 replies      
The DOJ killed Windows Phone. If Microsoft wasn't worried about antitrust suits, they would have given Windows away for free on phones to build up market share and OEMs wouldn't have had to go to Google to get a free OS.
13
tmbsundar 5 hours ago 0 replies      
I would also wager that the lack of a thriving app store ecosystem was/is a major factor that held Windows back.

By the time Windows entered the market, Android and iOS had a critical mass of developers and users downloading games and apps from their respective app stores, and Windows could not break into the network effect.

14
skinnymuch 7 hours ago 3 replies      
What ever happened to the project to port iOS and Android apps to Windows? Neither of them working out sucks. I thought if one of them could work out, as long as the porting didn't require more than 10 or 15 percent of a code base to be changed, that could've saved Windows mobile enough to have maybe a 10% market share and perhaps stay in the game. But maybe being that small was never going to be worth it for Microsoft.
15
senthilnayagam 6 hours ago 0 replies      
Someone approached us about building multiple apps for Windows Phone, presumably on behalf of Microsoft.

We would get paid for having our app on the store. It seems the person who reached out was a middleman and was more interested in making money for himself than in getting good apps built.

I knew there was no way they could recover from that situation.

16
eight_ender 4 hours ago 0 replies      
When I look back at the spectacular mobile OS implosions of that era I feel like Windows Phone deserved to die, but Palm's WebOS I'm still a bit sad about.

Their hail mary was impressive and ahead of its time, but there was no room for a #3 in the market with Google giving away everything for free.

17
caycep 5 hours ago 0 replies      
This is basically stratechery's line of writing on Android...
27
Many of the best Scrabble players are Thais who don't speak English (2014) thestar.com.my
40 points by Tomte  12 hours ago   9 comments top 5
1
greenpenguin 8 minutes ago 0 replies      
This article appears to originally be from the Guardian: https://www.theguardian.com/commentisfree/2014/aug/06/scrabb...

The site also features several other Guardian articles by the same author. Seems a little iffy?

2
FeteCommuniste 10 hours ago 0 replies      
The champion of French Scrabble a couple years back was a New Zealander who didn't speak the language: http://www.npr.org/sections/thetwo-way/2015/07/21/424980378/...
3
contingencies 8 hours ago 1 reply      
As an abugida versus an alphabet, Thai and English are not that different in structural conception, though Thai has almost double the number of consonants.
4
paulcole 9 hours ago 0 replies      
The 2017 National Scrabble Championships are going on now and you can follow along here:

http://event.scrabbleplayers.org/games/nsc2017/

5
mathattack 1 hour ago 0 replies      
If understanding isn't important, wouldn't a halfway decent AI kill most players in the game?
28
What are covariance and contravariance? stephanboyer.com
155 points by beliu  16 hours ago   45 comments top 14
1
rntz 10 hours ago 1 reply      
Covariance and contravariance are just monotonicity and anti-monotonicity, applied to types ordered by subtyping.

That is: if we have a function on types, say, the function `f` defined by:

 f(x) = Int -> x
Then we say `f` is covariant because it is monotone: it preserves the subtype ordering on its argument. That is:

 if x <: y, then f(x) <: f(y)
Similarly, if we consider `f` defined by:

 f(x) = x -> Int
then this `f` is contravariant because it is anti-monotone: it reverses the subtype ordering on its argument. That is:

 if x <: y, then f(y) <: f(x)
People tend to find covariance more intuitive than contravariance; unless the issue is pointed out, they tend to assume everything is covariant. They see a type, say:

 dog -> dog
and they assume "oh, every dog is an animal, so I can put 'animal' in place of 'dog' and it'll be more general (i.e. a supertype)". This is false, as the article points out.
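
The same point, sketched with Python type hints (checkable with mypy; Animal/Dog/Greyhound are just the article's running example). Callable is contravariant in its argument types and covariant in its return type:

  from typing import Callable

  class Animal: ...
  class Dog(Animal): ...
  class Greyhound(Dog): ...

  def use_groomer(groom: Callable[[Dog], Dog], d: Dog) -> Dog:
      return groom(d)

  def groom_any(a: Animal) -> Greyhound:
      return Greyhound()

  # Accepted: (Animal -> Greyhound) <: (Dog -> Dog)
  use_groomer(groom_any, Dog())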

2
joshlemer 9 hours ago 1 reply      
Also see this handy infographic https://i.stack.imgur.com/W879X.png
3
tunesmith 3 hours ago 0 replies      
One of the ways I like to look at it: I'm a foreman, one of my construction workers can't show up, and I need a substitute. I don't want someone who is even less useful than my worker. A really good substitute is someone who can do everything that my original worker could do, and maybe then some, even if I don't take advantage of it. Meanwhile, I don't want his results to be worse than my original worker's, but I sure don't mind if they're better.

In other words, if my original worker only knows how to turn a Dog into a Dog, and that was good enough, the most useful substitute is someone who can take any Animal and turn it into any special kind of dog.

Or maybe I care about 2x4's, and my normal guy only knows how to turn Cherry into 2x4's and isn't trained on anything else, even though that suited my needs. The best sub is someone who can take lots of kinds of wood and turn it into different kinds of posts and planks; I'm just only taking advantage of his 2x4 skills.

Unfortunately a lot of programmers design subclasses by saying "hey look, I'm good at starting with a schnauzer" or "hey look, I'm good at starting with Brazilian cherry". These guys aren't helpful when they show up in your parameter list.

4
quantdev 11 hours ago 1 reply      
This was very interesting.

As a mathematician with no comp-sci type knowledge, my only understanding of inheritance is the "is a" rule. Using this, I realized that a subtype of the set of functions from Dog to Dog must be a set of functions such that each function could be treated as a function from Dog to Dog under an appropriate restriction. This would be the only way for such a set to satisfy what felt like the "is a" inheritance rule.

In other words, a set of functions from A to B where Dog is contained in A and B is contained in Dog would be a subtype of the set of functions from Dog to Dog. So Animals -> Greyhound works.

5
asavinov 1 hour ago 0 replies      
Covariance can be thought of as a mechanism for overriding functions. Assume there is a (base) function

 f: X -> Y
We want to override it by providing a new mapping from U to V:

 f: U -> V
This mechanism is said to be covariant if the new function guarantees that

 If u <: x then v=f(u) <: y=f(x)
This means that for each more specific input u in U the function returns a more specific (not more general) output v in V.

6
Patient0 1 hour ago 0 replies      
Great article - I'm impressed that it covers every discovery I've painstakingly worked through over the years, all succinctly expressed on one page: Java arrays, immutable lists and even the fact that Eiffel got it wrong (which I remember puzzling over with a colleague back in the 90s)
7
noobermin 12 hours ago 2 replies      
So I used to think covariance and contravariance were related to covectors and vectors from physics, but our terminology in fact confuses the concept (almost "opposite", actually) that is now accepted in math. (See the comment on the wiki [0].)

There we physicists go, confusing things again.

[0] https://en.wikipedia.org/wiki/Functor#Covariance_and_contrav...

8
the_mitsuhiko 10 hours ago 0 replies      
Microsoft probably has the most logical docs on that: https://docs.microsoft.com/en-us/dotnet/csharp/programming-g...
9
ducttapecrown 12 hours ago 2 replies      
Why are covariance and contravariance important, and how do their type theory definitions differ from mathematical or statistical definitions of covariance and contravariance?

Is it just a distinction of which direction the type hierarchy flows, and the consequences that must have with regard to functions in order for logical consistency to be maintained?

10
rectang 5 hours ago 0 replies      
The difference between covariance and contravariance is enough to get UW professor Dan Grossman jumping up and down: https://www.youtube.com/watch?v=pb_k8h6RuAY
11
OJFord 11 hours ago 1 reply      
This post presents a nice example, but only for the particular type system envisaged by the author.

At least, I believe that the point of conceiving of 'covariance' and 'contravariance' is that we may have or not have either, in input or return types.

The submission presents one incarnation, a common one I believe, but nevertheless I think if the goal's to understand variance, the concept must be distinguished from implementation.

12
bmc7505 12 hours ago 1 reply      
This a topic that also comes up in linear algebra. Is there any analogy between vectors and types, or is the terminology just coincidental? https://en.wikipedia.org/wiki/Covariance_and_contravariance_...
13
enriquto 9 hours ago 1 reply      
In mathematics we call each of them "invariance", if we do not want to sound too pedantic (or unless the distinction cannot be deduced easily from context).

I feel that the examples in math are easier to understand than in programming. For example:

- The integral of a function is invariant to additive changes of variable : \int f(a+x)dx = \int f(x)dx

- The mean of a distribution is contravariant to additive changes of variable : \int f(a+x)xdx = -a + \int f(x)xdx

- The mean of a distribution is covariant to shifts of the domain (same formula, because f(x-a) is a shift of size "a")

- The variance of a distribution is invariant to additive changes of variable

etc.

14
tritium 10 hours ago 3 replies      
This is typical technical jargon conflation in furtherance of interview pedantry. Please spare me.

Looking at the words superficially, their definitions are easily discerned:

 Covariance: changing together, with similarity. Contravariance: changing in opposition to one another.
But, as part of strategic nerd signaling in interviews to parse what a candidate has been reading lately, you'll encounter middle managers that will cull contending close-call applicants based on trivia like this. Similar technical jargon that isn't what you think it might be, due to some seminal blog post include "composition" or "isomorphic" or perhaps most obviously, the simple-but-loaded term "functional."

Try defining the word functional incorrectly during a technical interview and see what happens.

29
How Wi-Fi Works verizoninternet.com
251 points by sharjeelsayed  15 hours ago   80 comments top 17
1
andrenotgiant 9 hours ago 4 replies      
This is (good) SEO linkbait. Someone at Verizon got $10k to spend getting it created by saying it will boost organic search traffic to Verizon. Right now (during link-building phase) they keep the page completely separate from rest of site. Later, (after most links are created) they'll change it. Not sure whether the goal is just to generally build authority to VerizonInternet, or to get this URL ranking for wifi keywords. (Seems more likely the former.)
2
mrb 10 hours ago 2 replies      
As hatsunearu said, the radio modulation described is grossly incorrect. WiFi never uses 8-PSK (encoding 3 bits per symbol). 802.11n and 11ac encode 1/2/4/6/8 bits using a BPSK/QPSK/16-QAM/64-QAM/256-QAM symbol (256-QAM is for 11ac only). The modulation scheme is negotiated based on signal quality. Here is a quick reference: http://mcsindex.com/ (MCS = modulation coding scheme) On Linux you can find the MCS negotiated with "iw dev wlan0 link | grep -i mcs"

14 channels are defined in the 2.4GHz band. For example channel 6 is centered on 2437 MHz. Each channel is 20MHz wide and divided in 52 "data" subcarriers, each occupying a different frequency and spaced out by 312.5 kHz (52 × 312.5 kHz is less than 20 MHz because there are "control" subcarriers and additional spacing.) So 52 different symbols can be sent in parallel at the same time, which is what we call OFDM https://en.wikipedia.org/wiki/Orthogonal_frequency-division_... (basically, I'm simplifying!)

Remember this is for just 1 channel. So with 14 channels each composed of 52 subcarriers, we could have 728 symbols transmitted at the same time. If they are 256-QAM symbols that's basically 728 × 8 = 5824 bits being transmitted at the same time in the air. And they will all be received and demodulated independently. This high level of parallelism of OFDM is how WiFi can achieve very high throughput.

Then, with wide channels of 40 MHz, which basically aggregate two 20 MHz channels, we get a few more data subcarriers because we don't need as many control subcarriers so a few of them become used as data subcarriers. Hence a 40 MHz channel will have not 52 × 2 = 104 but actually 108 data subcarriers. And 802.11ac defines 80 MHz and 160 MHz channels with respectively 234 and 468 data subcarriers.

Let's calculate the maximum usable throughput of a single 802.11ac 160 MHz channel using 256-QAM modulation... It sends 468 symbols at the same time on 468 data subcarriers. Each symbol encodes 8 bits and takes in the best case 3.6us to be transmitted: 3.2us for the actual symbol + a short guard interval of 0.4us (the GI is normally 0.8us but can be a short GI of 0.4us if negotiated). The raw physical bitrate is:

1/3.6e-6 × 468 × 8 = 1.04 Gbit/s

However there is a mandatory error correction which is 5/6 in the best case so the actual usable bandwidth is:

1.04 × 5/6 = 866.67 Mbit/s
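
The same best-case arithmetic as a quick Python check (numbers taken straight from the figures above):

  subcarriers = 468               # data subcarriers, 160 MHz channel
  bits_per_symbol = 8             # 256-QAM
  symbol_time = 3.2e-6 + 0.4e-6   # symbol + short guard interval, seconds

  raw = subcarriers * bits_per_symbol / symbol_time
  usable = raw * 5 / 6            # 5/6 forward error correction

  print(raw / 1e9)      # ~1.04 Gbit/s
  print(usable / 1e6)   # ~866.67 Mbit/s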

3
princekolt 13 hours ago 3 replies      
"How Wi-Fi Works" --> 503 Service Unavailable

Seems about right.

4
sshanky 13 hours ago 3 replies      
This is beautiful, but probably still too complex for most of their customers. I wonder what their motive in putting this together was, as it must have been very expensive.
6
deepsun 6 hours ago 4 replies      
Does anyone know: if I'm on a WPA2-PSK wi-fi, can other devices that are also on the same network "sniff" my traffic? For unprotected networks it's obvious, but what about protected ones?
7
pier25 7 hours ago 0 replies      
I had to disable the ad blocker to get the nice WebGL graphics.

From a front-end perspective I think it's awesome. Not so sure about the content though.

8
bdcravens 13 hours ago 7 replies      
Speaking of how wifi works, I learned something interesting about wifi and Verizon's partner in many things, Comcast: last night I noticed my home Internet acting funny, and learned that the admin interface for my Comcast router had username "admin", password "password". SMH.
9
anilakar 1 hour ago 0 replies      
Stopped reading at "Wi-Fi antennas send information". Antennas don't send anything, they just match the impedance of the feedline into that of the medium.
10
arikrak 10 hours ago 2 replies      
I was looking at this and thought it looked nice, but then my computer froze, so maybe they overdid the 3D graphics a bit?
11
KozmoNau7 1 hour ago 0 replies      
TL;DR: It doesn't.
12
hatsunearu 12 hours ago 1 reply      
the radio stuff is pretty wrong; one glaring one is that PSK is pretty much not used anymore, it's all OFDM.
13
notgood 12 hours ago 0 replies      
Just a reminder that Verizon is the biggest lobbyist against Net Neutrality [0], and if you do support it then it's probably wise to stay as far away from their services as possible.

[0] https://www.dailydot.com/layer8/lobbyists-net-neutrality-fcc...

14
nvahalik 7 hours ago 0 replies      
Can someone explain the natural resonance of walls talked about with regard to 5GHz?
15
srtjstjsj 3 hours ago 0 replies      
Website crashes Chrome on Windows after a few seconds.
16
Piccollo 9 hours ago 0 replies      
Why did CSS stop working?
17
Yizahi 1 hour ago 0 replies      
In my experience it doesn't. My every attempt at improving this shit ends up in laying more wires everywhere.
30
Printed Solar Panels for Less Than $10 a Square Metre newcastle.edu.au
199 points by dakna  18 hours ago   65 comments top 17
1
kirrent 10 hours ago 0 replies      
Hey, I know the phd student who printed out these cells and set up the demonstration! It's part of the university's push for large scale organic solar demonstrations along with the smaller test cells.

From talking with him, the technology isn't really ready for prime time yet but it's getting pretty close. I think the key point is that efficiencies in small scale cells and larger scale manufacturing are still climbing (the same group has achieved greater than 5% in a cm^2 test cell iirc) and the printing is incredibly cheap and very amenable to fast scaling up.

It seems pretty obvious that you'd need more efficiency for this to be a viable rooftop solution, but the guy who set this up claimed that being able to just stick down some velcro and stick on the cells opened up different use cases, with cheap and lean installations supporting cheap cells.

All in all, if you look at how far the technology has come in the last 5 years alone, then it's a pretty exciting field to follow.

2
amelius 16 minutes ago 0 replies      
> By reinventing the delivery model we remove the need for initial lump sum outlays, overcoming the key barrier to community uptake and ensuring that the science actually ends up on our rooftops, said Professor Dastoor.

But can't banks just solve this, by financing panels upfront? There's quite some money to be made there, I'd guess. And the risk is limited.

3
danmaz74 17 hours ago 5 replies      
Looks like the conversion efficiency is between 2% and 3%, so, pretty low compared to silicon based PV

reference

https://cleantechnica.com/2017/05/17/researchers-australias-...

http://reneweconomy.com.au/uni-newcastle-team-tests-printed-...

edited to clarify 2/3%

4
QAPereo 18 hours ago 2 replies      
Does anyone have any information on how efficient these are, how long they last, and that kind of thing? All I got from the article was a lot of hype and $10 a metre.
5
Someone 14 hours ago 0 replies      
2% efficiency would give you ballpark 5W per square meter (peak) or about .25W for a sheet of paper (less indoors or at night)

So, if this can be combined with a paper-thin e-ink display (and, if needed, a flat sheet capacitor for power storage), would that be enough to make true paper-thin displays at reasonable price?
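
A hedged arithmetic check: the parent's 5 W/m^2 implicitly assumes ~250 W/m^2 of usable sunlight, whereas at the usual ~1000 W/m^2 peak insolation, 2% efficiency would give 20 W/m^2:

  a4 = 0.210 * 0.297                  # m^2, roughly one sheet of paper
  for insolation in (1000, 250):      # W/m^2
      w_per_m2 = insolation * 0.02    # 2% efficiency
      print(insolation, w_per_m2, round(w_per_m2 * a4, 2))
  # 1000 W/m^2 -> 20.0 W/m^2, ~1.25 W per sheet
  #  250 W/m^2 ->  5.0 W/m^2, ~0.31 W per sheet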

6
nilsocket 18 hours ago 1 reply      
> On our lab-scale printer we can easily produce hundreds of metres of material per day, on a commercial-scale printer this would increase to kilometres. If you had just ten of these printers operating around the clock we could print enough material to deliver power to 1000 homes per day, said Professor Dastoor.

That being said, maybe 10 km worth of these can power 1000 homes.

It costs $10 per sq.meter.

7
simcop2387 15 hours ago 1 reply      
I wonder if these could be used to produce panels on a remote planet/planetoid. I.e., could you use this to create a bunch of panels to place on the moon? Even with the low efficiency, you'd still save a lot by not having to ship them.
8
ChuckMcM 16 hours ago 0 replies      
An interesting addition to the mix. The design space around solar power systems focuses on either cost (like in this example where efficiency is low) or efficiency gains regardless of cost[1].

Presumably if we get to a point where you can cheaply print 25+% efficient cells then we're "done" as it were on improving solar cells :-)

[1] https://arstechnica.com/science/2017/03/japanese-company-dev...

9
wrycoder 11 hours ago 0 replies      
Shingles are about $8 per sq meter, uninstalled. Plywood is somewhat more. How durable is this stuff?
10
peter303 16 hours ago 0 replies      
A lot of the cost in installations is now other factors such as casing, peripheral electronics like inverters, labor, financing, etc.
11
Ghost66 7 hours ago 0 replies      
Efficiency is the bottleneck of this project; perhaps if it became an open source project, creative people could create fantastic uses for it.
12
HillaryBriss 18 hours ago 2 replies      
For a very rough comparison, according to this, conventional solar panels cost about $10-$12 per square foot, or very roughly $100 per square meter.

https://www.quora.com/What-is-the-cost-per-Sq-ft-for-solar-p...

Of course, what we really want is a comparison in terms of cost per watt.

Maybe equally important to the cost of these panels is the ease and cost of installing them. These new printed panels are very flexible/lightweight and can be deployed easily and even temporarily.

13
konschubert 10 hours ago 0 replies      
What's the cost per Watt?
14
msoad 17 hours ago 0 replies      
How do those films hold up under direct sun?
15
vasili111 15 hours ago 0 replies      
What is the price per watt?
16
unwttng 14 hours ago 0 replies      
Yeah but do they mine bitcoin
17
1024core 18 hours ago 4 replies      
> If you had just ten of these printers operating around the clock we could print enough material to deliver power to 1000 homes per day, said Professor Dastoor.

The US has 100M homes. That would require 100,000 days, or 300 years...
