hacker news with inline top comments - 2 Feb 2017
1
Snap Inc. S-1 sec.gov
188 points by harryh  1 hour ago   122 comments top 29
1
stuckagain 1 hour ago 4 replies      
"We have committed to spend $2 billion with Google Cloud over the next five years and have built our software and computer systems to use computing, storage capabilities, bandwidth, and other services provided by Google, some of which do not have an alternative in the market."

There are some numbers for you that Google wouldn't have provided (as far as I have seen).

2
d4l3k 1 hour ago 1 reply      
> Although other U.S.-based companies have publicly traded classes of non-voting stock, to our knowledge, no other company has completed an initial public offering of non-voting stock on a U.S. stock exchange. We cannot predict whether this structure and the concentrated control it affords Mr. Spiegel and Mr. Murphy will result in a lower trading price or greater fluctuations in the trading price of our Class A common stock as compared to the trading price if the Class A common stock had voting rights. Nor can we predict whether this structure will result in adverse publicity or other adverse consequences.
3
dvdhnt 1 hour ago 2 replies      
> We had 158 million Daily Active Users on average in the quarter ended December 31, 2016, and we view Daily Active Users as a critical measure of our user engagement.

> We anticipate that our Daily Active Users growth rate will decline over time if the size of our active user base increases or we achieve higher market penetration rates. If our Daily Active Users growth rate slows, our financial performance will increasingly depend on our ability to elevate user engagement or increase our monetization of users.

> In addition, because our products typically require high bandwidth data capabilities, the majority of our users live in countries with high-end mobile device penetration and high bandwidth capacity cellular networks with large coverage areas. We therefore do not expect to experience rapid user growth or engagement in countries with low smartphone penetration even if such countries have well-established and high bandwidth capacity cellular networks. We may also not experience rapid user growth or engagement in countries where, even though smartphone penetration is high, due to the lack of sufficient cellular based data networks, consumers rely heavily on Wi-Fi and may not access our products regularly.

> Snapchat is free and easy to join, the barrier to entry for new entrants is low, and the switching costs to another platform are also low. Moreover, the majority of our users are 18-34 years old.

> This demographic may be less brand loyal and more likely to follow trends than other demographics.

> For example, users 25 and older visited Snapchat approximately 12 times and spent approximately 20 minutes on Snapchat every day on average in the quarter ended December 31, 2016, while users younger than 25 visited Snapchat over 20 times and spent over 30 minutes on Snapchat every day on average during the same period.

> Our Daily Active Users may not continue to grow. For example, although Daily Active Users grew by 7% from 143 million Daily Active Users for the quarter ended June 30, 2016 to 153 million Daily Active Users for the quarter ended September 30, 2016, the growth in Daily Active Users was relatively flat in the latter part of the quarter ended September 30, 2016.

4
brentm 44 minutes ago 3 replies      
Much greater losses than in Twitter's S-1: Snap's revenue is 27% higher, but its net losses are 548% greater.

Twitter s1: https://www.sec.gov/Archives/edgar/data/1418091/000119312513...

5
synaesthesisx 1 hour ago 1 reply      
Is anyone else seeing these numbers? How are they pushing for such a high valuation with those kinds of losses?

I suspect that Snap is merely capitalizing on traditional advertising metrics (engagement, CTR) which don't translate to their app, and the advertisers just haven't caught on yet. People play with filters because they're funny/amusing, but those impressions don't convert into purchases in the same way other types of ads would.

Early last year Snapchat temporarily featured X-Men filters exclusively ahead of the movie release - I interacted with those for the novelty (and mainly because they disabled all the other filters, so there were no other options) but I did not see the movie.

Again, I'm not sure if they follow something similar to a CPC model or what, but I bet it's expensive. If the advertisers decide it's ineffective, their newfound revenue growth certainly won't be sustainable.

6
ch0wn 5 minutes ago 0 replies      
"The launch of Spectacles, which has not generated significant revenue for us, is a good example. There is no guarantee that investing in new lines of business, new products, and other initiatives will succeed. If we do not successfully develop new approaches to monetization, we may not be able to maintain or grow our revenue as anticipated or recover any associated development costs, and our business could be seriously harmed."

I don't think anyone expected Spectacles to be a cash cow, but I still would have expected a more positive outlook on them.

7
melvinmt 1 hour ago 0 replies      
Very interesting to see what they think of their competition:

> We face significant competition in almost every aspect of our business both domestically and internationally. This includes larger, more established companies such as Apple, Facebook (including Instagram and WhatsApp), Google (including YouTube), Twitter, Kakao, LINE, Naver (including Snow), and Tencent, which provide their users with a variety of products, services, content, and online advertising offerings, and smaller companies that offer products and services that may compete with specific Snapchat features.

> For example, Instagram, a subsidiary of Facebook, recently introduced a stories feature that largely mimics our Stories feature and may be directly competitive. We may also lose users to small companies that offer products and services that compete with specific Snapchat features because of the low cost for our users to switch to a different product or service.

> Many of our current and potential competitors have significantly greater resources and broader global recognition and occupy better competitive positions in certain markets than we do. These factors may allow our competitors to respond to new or emerging technologies and changes in market requirements better than we can.

> Our competitors may also develop products, features, or services that are similar to ours or that achieve greater market acceptance. These products, features, and services may undertake more far-reaching and successful product development efforts or marketing campaigns, or may adopt more aggressive pricing policies.

8
minimaxir 1 hour ago 2 replies      
Interesting DAU-growth plateau around the 150M user mark mid-2016. Maybe there is truth to the theories that Instagram is successfully slowing the drain of users to Snapchat?
9
nlittlepoole 1 hour ago 4 replies      
"We have incurred operating losses in the past, expect to incur operating losses in the future, and may never achieve or maintain profitability." haha
10
urs2102 1 hour ago 1 reply      
> We have incurred operating losses in the past, expect to incur operating losses in the future, and may never achieve or maintain profitability.

Also, having only 158 million daily active users, up from 150 million in June, is definitely interesting, and a much smaller growth jump than I had expected.

Revenue is up from $58MM in 2015 to $405MM, despite a loss of $515MM last year. Wishing them the best of luck, but I wonder how much their competition (Instagram Stories, for example) is hurting their growth.

Also: > We are not aware of any other company that has completed an initial public offering of non-voting stock on a U.S. stock exchange. We therefore cannot predict the impact our capital structure and the concentrated control by our founders may have on our stock price or our business.

Hmmm... will be interesting...

11
vineetch 1 hour ago 6 replies      
They spent $890,339 on security for Evan Spiegel in 2016? Why does he need that much security? Who is trying to hurt him?
12
benarent 51 minutes ago 1 reply      
It's great to see both founders (Evan and Bobby) ended up with equal shares, both at 21.8% of Class A common stock. The history is pretty interesting, and as always there is a possible 3rd founder. https://techcrunch.com/2013/07/31/spiegel-murphy-say-alleged...
13
sagivo 1 hour ago 3 replies      
> Snap Inc. is a camera company

Interesting statement from a company that just released its first camera (Spectacles) a few months ago. I'm not sure their users see them as a camera company.

14
arzt 35 minutes ago 0 replies      
What's up with the massive negative gross margins? Will Wall Street gloss over those? Is there any precedent for a company going public with such upside-down financials (putting growth aside)?
15
ejcx 1 hour ago 1 reply      
Am I reading this right? I'm no banker... Their losses for 2016 were ~$500m? That's very steep, but it looks like revenue grew almost 700%. Wow.

Insane numbers. I don't think anyone on the planet knows what's going to happen with them but I am sure interested in finding out.

16
asdfg11235813 1 hour ago 1 reply      
> For the year ended December 31, 2016, we recorded revenue of $404.5 million... For the year ended December 31, 2016, we incurred a net loss of $514.6 million

> We have three classes of common stock: Class A, Class B, and Class C. Holders of our Class A common stock, the only class of stock being sold in this offering, are entitled to no vote on matters submitted to our stockholders

17
bluetwo 51 minutes ago 0 replies      
OK, their dirty laundry is about what I would expect it to be. They are required to list these things at this point to avoid being accused of hiding information later.

Given all this, are you buying? If you owned stock day 1 would you sell it?

18
leothekim 1 hour ago 0 replies      
> We rely on Google Cloud for the vast majority of our computing, storage, bandwidth, and other services. Any disruption of or interference with our use of the Google Cloud operation would negatively affect our operations and seriously harm our business.

Over-under on how much of their cost of revenue goes to Google App Engine?

19
pixelmonkey 57 minutes ago 0 replies      
"We are required to purchase at least $400M of cloud services from Google each year beginning on January 30, 2017..."

Wow, quite a snag for Google Cloud Platform to land that contract!

20
dddrh 1 hour ago 0 replies      
Small question, but is it normal to leave the numbers blank when it comes to share percentage or the number of shares awarded to the founders in the Risks section[0]?

[0]: https://www.sec.gov/Archives/edgar/data/1564408/000119312517...

21
mmastrac 19 minutes ago 0 replies      
I wonder if Snapchat is still using AppEngine. I'm pretty sure they were 50%+ of the AE traffic at some point.
22
whitepoplar 51 minutes ago 2 replies      
Serious question: how does Snapchat spend so much money on infrastructure? How could the app run up a $2b Google Cloud tab? Would anyone care to break it down?
23
kartD 20 minutes ago 1 reply      
So what's the transition time from an S-1 to being able to buy shares? Does it mean the stock will list tomorrow?
24
charlesdm 8 minutes ago 0 replies      
Your pension funds at work, people.
25
obilgic 1 hour ago 1 reply      
Snap Inc. is a camera company.
26
earlyriser 55 minutes ago 1 reply      
When are they going to be on Nasdaq? I'm not sure where to find this info.
27
justinzollars 23 minutes ago 0 replies      
Snap is a company that doesn't know itself.
28
softwarefounder 16 minutes ago 0 replies      
I'll never complain about writing a SOW ever again.
29
screed 53 minutes ago 1 reply      
What a shit design this page has.
2
To Live Your Best Life, Do Mathematics quantamagazine.org
280 points by digital55  5 hours ago   137 comments top 17
1
110011 2 hours ago 6 replies      
As a counterpoint to the overly exuberant article, let me chime in as a graduating theoretical computer scientist. Math is hard, and despite its beauty and allure, academic life doesn't take place in a vacuum; ergo, there's a lot of politics and pettiness from your peers. You should expect to invest years of your life into making progress on some hard problem with little encouragement in the meantime; this takes a tremendous toll on the mind and is not for someone who falls easy prey to self-doubt. Sometimes, even after you publish your top result that took a lot out of you, it may take many years before people appreciate, let alone understand, your work; the vast majority of papers are never going to be read. To compensate for this horrible feedback mechanism you need to basically play the popularity game and try to give many talks and talk up your result when you meet people, so there's a lot of salesmanship involved here as well. So the career path of a junior scientist is pretty crushing mentally, and I couldn't stomach it in the long run.

In an ideal world I might have continued in academia, but the career path is so twisted that you have to either be insanely good (at math and at managing time) or just hate yourself enough to sacrifice your best years working essentially in the metaphorical darkness, well outside the spotlight, and most likely alone and poorly paid.

2
sametmax 2 hours ago 1 reply      
The problem with this theory is that it completely ignores that a lot of people who are good at math have poor people skills. And you need people to be happy and to improve your life opportunities.

So yeah, math is beautiful, and if you like it, go for it.

But if you're looking for a skill to acquire or practice, your taste notwithstanding, this may not be the best investment. Sport, social skills, languages, time management, self-introspection, and cooking are examples of things that usually pay off better than math in your life. They bring more people, opportunities, health, money, etc.

Again, I'm not saying math is not a good thing to practice. We need math as a species, and an individual may need it for his or her happiness. But as a strategy, I don't think so.

3
gthtjtkt 4 hours ago 16 replies      
Based on all the supposed benefits of doing math, it sounds like people would be better off studying philosophy. Same benefits but a much broader appeal, and far more applicable to most people's everyday lives.

When I finally "discovered" philosophy in college, I was angry that we hadn't been exposed to it at all in middle or high school. Instead, I'd been forced to waste years on things like math, biology, etc. that I had no interest in and no use for. Our history / social studies classes would be greatly improved if they incorporated more philosophy.

4
Koshkin 3 hours ago 2 replies      
Well, mathematics is hard. It takes a lot of effort to really learn it, a lot of dedication and (self-)motivation. Not everyone can do it, and, frankly, very few people - even among those who have spent many years studying mathematics - could say that it is the best way to live your life.
5
wallace_f 11 minutes ago 0 replies      
When you're not good enough to do, teach. When you're not strong enough to lead, follow. This article is just a bunch of exuberance and politically correct speech.
6
primodemus 4 hours ago 3 replies      
7
westoncb 2 hours ago 0 replies      
It seems like a lot of the benefits mentioned could arise from many other activities with similar likelihood. I think a lot of it has to do with developing expertise at something you believe is important, and which you are (or can become) comfortable doing.

I totally agree, however, that for people working in abstract technical areas (e.g. software architecture, philosophy, inventing things), mathematics has a special sort of value over other subjects. It deals in super-distilled concepts with very general applicability, so the concepts you learn expand the pool you can draw from when coming up with new, related ideas in a wide range of fields.

It's also important to learn it as a sort of literacy, to widen the range of technical material you can read.

8
norcimo5 25 minutes ago 0 replies      
"To Live Your Best Life, Do +Applied+ Mathematics".There's something delightful about using mathematics as a mean to an end...
9
nojvek 4 hours ago 2 replies      
I wonder: if inmates had access to computers, would they become good programmers?
10
recycleme 3 hours ago 3 replies      
Interesting. Anyone have a recommendation on a Math app for everyday use?
11
andrewflnr 2 hours ago 2 replies      
Su seems almost blind to the fact that you can do math outside an academic environment. You don't need to go to school for math to experience truth, beauty, play, etc. in the context of math. You can read books, play around with the results, read articles on sites like Quanta. You probably won't discover new things this way, but it will enrich your life.

If your goal is for math to make people's lives better, then assuming school is involved at all is another unnecessary restriction.

12
stablemap 4 hours ago 2 replies      
Everyone I know who went to his MAA farewell address said it was wonderful. Definitely follow their link to read or hear the whole thing:

https://mathyawp.wordpress.com/2017/01/08/mathematics-for-hu...

13
stOneskull 3 hours ago 0 replies      
It reminds me of the awesome team who get involved with the Numberphile YouTube series, guys like James Grime and Matt Parker, inspiring the love.
14
king_panic 2 hours ago 1 reply      
Elementary math skills, which many of us reading and writing on Hacker News know, would change the lives of many millions of people in ways we take for granted.
15
RA_Fisher 1 hour ago 0 replies      
Programming being math with a different syntax, this is true for it as well.
16
cosinetau 2 hours ago 0 replies      
The Greeks also drank hemlock. Just saying.
17
wnevets 3 hours ago 1 reply      
> In 2015 he became the first person of color to lead the MAA.

I didn't realize Chinese was a person of color.

3
Online migrations at scale stripe.com
93 points by hepha1979  2 hours ago   26 comments top 10
1
docaholic 29 minutes ago 0 replies      
We're taking nearly the exact same approach in my office, out of necessity.

We "allowed" small amounts of downtime previously (say 10-15 minutes at a time for migrations), but over the last year, as our customer base has expanded, the window has shrunk smaller and smaller so as to avoid service disruptions.

We're now at a stage where downtime is effectively not allowed anymore, so we've taken to this approach (make a new table, dual-write new records, then write all the old records, and finally remove references to the old table) to mitigate our downtime.

It's nice to know that other companies have taken this approach as well; my team honestly didn't know if we were doing it "properly" or not (if there is such a thing as properly).
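
A minimal sketch of that table-doubling flow, with the old and new stores modeled as in-memory maps; the struct and method names are hypothetical, not the commenter's actual system:

    use std::collections::HashMap;

    // Hypothetical stand-ins for the old and new datastores.
    struct MigratingStore {
        old_table: HashMap<u64, String>,
        new_table: HashMap<u64, String>,
        read_from_new: bool, // flipped once the backfill has finished
    }

    impl MigratingStore {
        // Steps 1-2: every new write goes to both tables so they never diverge.
        fn write(&mut self, id: u64, record: String) {
            self.old_table.insert(id, record.clone());
            self.new_table.insert(id, record);
        }

        // Step 3: backfill records that predate dual writing.
        fn backfill(&mut self) {
            for (id, record) in &self.old_table {
                self.new_table.entry(*id).or_insert_with(|| record.clone());
            }
        }

        // Step 4: serve reads from the new table; the old one can then be dropped.
        fn read(&self, id: u64) -> Option<&String> {
            if self.read_from_new {
                self.new_table.get(&id)
            } else {
                self.old_table.get(&id)
            }
        }
    }

    fn main() {
        let mut store = MigratingStore {
            old_table: HashMap::from([(1, "pre-migration record".to_string())]),
            new_table: HashMap::new(),
            read_from_new: false,
        };
        store.write(2, "dual-written record".to_string()); // lands in both tables
        store.backfill(); // copies record 1 into the new table
        store.read_from_new = true; // cut reads over with no downtime
        assert!(store.read(1).is_some() && store.read(2).is_some());
    }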

2
Diederich 36 minutes ago 0 replies      
It's nice to see this kind of thing covered.

A company I worked for some years ago, LiveOps, also followed this methodology, because we had to.

LiveOps was and is a 'telephony in the cloud' provider, and the databases are in-line with some of the important call flows.

Last I looked, in the ten years previous, LiveOps had handled at least one call from something like 15% of all of the possible phone numbers in the United States, and all of that data was available in a huge replicated mysql cluster spanning multiple datacenters. We had mysql tables with tens of billions of rows, replicated across many servers.

And just to be crystal clear: there was absolutely no downtime allowed, ever, because downtime of any kind meant that phone calls would not go through.

We used extensive feature flags, along with the techniques described in the stripe.com article, to achieve exceptional uptime, relatively ok operational overhead and pretty good development velocity.

The point I'd like to drive home is that this is all possible even for medium sized organizations and moderately sized staffs. The important part is that everyone needs to keep these priorities, methods and techniques in mind, all the time, to make it work.
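
The feature-flag technique mentioned above often boils down to a deterministic hash bucket; a minimal sketch, with the flag name and ramp mechanics invented for illustration:

    use std::collections::hash_map::DefaultHasher;
    use std::hash::{Hash, Hasher};

    // Deterministically bucket an entity (say, a call ID) under a named flag,
    // so a new code path can be ramped from 0% to 100% with no downtime and
    // rolled back instantly by setting the percentage back to zero.
    fn flag_enabled(flag_name: &str, entity_id: u64, rollout_percent: u64) -> bool {
        let mut hasher = DefaultHasher::new();
        flag_name.hash(&mut hasher);
        entity_id.hash(&mut hasher);
        hasher.finish() % 100 < rollout_percent
    }

    fn main() {
        // Ramp the hypothetical new code path to 10% of calls.
        for call_id in 0..5 {
            let on = flag_enabled("new_call_routing", call_id, 10);
            println!("call {}: new path = {}", call_id, on);
        }
    }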

3
smnc 1 hour ago 1 reply      
The article refers interchangeably to tables and collections. Reading this[1], it seems they are using MongoDB as their transactional database. If they do, it would be interesting to know at what point they decided to split the subscriptions data into its own collection. The article just states "As we added new features, this data model became problematic."

Mongo modelling being a black art of sorts (to me, at least), I'd be more interested in what the tipping point was for them (data size and shape, usage patterns) than in the relatively straightforward (conceptually at least; operationally these things are never to be underestimated) table-doubling approach to changing a data model.

[1] https://stripe.com/blog/announcing-mosql

4
misterbowfinger 1 hour ago 2 replies      
This article seems to be about really fundamental changes to your data modeling. If you're using MySQL and interested in only changing one table, i.e. adding/removing a column or an index, check out LHM:

https://github.com/soundcloud/lhm

I've used it in the past a lot. You have to make sure to throttle the migration, or else it'll max out your CPU & memory.

5
rattray 1 hour ago 1 reply      
I'm curious about how this edge case was handled:

1. Lisa has old tv_service subscription, saved to snapshot, but not live-copied to Subscription table.

2. Lisa removes tv_service subscription.

3. Before snapshot is updated, batch processing adds lisa_tv_service to the Subscription table.

4. Subscription now has lisa_tv_service, but Lisa.subscriptions does not include tv_service.

My only idea: wait a day, then run a batch-process job looking for data that exists in the Subscription table but not in Customer.subscriptions.
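
Here is what such a reconciliation pass might look like, with both stores modeled in memory; the function and field names are illustrative, not Stripe's:

    use std::collections::{HashMap, HashSet};

    // Flag rows present in the new Subscription table whose owning customer
    // no longer lists them, i.e. casualties of the race described above.
    fn find_orphans(
        subscription_table: &HashMap<String, String>,     // subscription id -> customer
        customer_subs: &HashMap<String, HashSet<String>>, // customer -> subscription ids
    ) -> Vec<String> {
        let mut orphans = Vec::new();
        for (id, customer) in subscription_table {
            let still_listed = customer_subs
                .get(customer)
                .map_or(false, |subs| subs.contains(id));
            if !still_listed {
                orphans.push(id.clone());
            }
        }
        orphans
    }

    fn main() {
        let mut table = HashMap::new();
        table.insert("lisa_tv_service".to_string(), "lisa".to_string());

        let mut customers = HashMap::new();
        customers.insert("lisa".to_string(), HashSet::new()); // tv_service removed

        // lisa_tv_service exists in the table but not on the customer: flag it.
        assert_eq!(find_orphans(&table, &customers), vec!["lisa_tv_service"]);
    }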

6
etaty 47 minutes ago 0 replies      
> Once the migration is complete, run the Scalding job once again to make sure there are no existing subscriptions missing from the Subscriptions collection.

That sounds scary! How many times do you need the "once again"?

7
overcast 1 hour ago 2 replies      
The big question that wasn't really covered is how they selectively update the writers and readers in a live environment, especially migrating on a per-object basis to have them start dual writing and reading.
8
danthaman44 1 hour ago 0 replies      
This is some very impressive stuff!
9
spullara 1 hour ago 1 reply      
Yikes. Use evolvable schemas and never do mass migrations.
10
mighty_warrior 52 minutes ago 3 replies      
This sounds a little overcomplicated, all in an effort to decrease the amount of code changed at any given time. They took great pains to keep data in sync across the A and B datastores, and I'm not so sure that extra cost was worth the perceived stability of this approach.
4
Git-scm.com status report marc.info
103 points by cnst  2 hours ago   64 comments top 12
1
joeblau 6 minutes ago 0 replies      
This is very interesting. I run https://www.gitignore.io and this post highlights a lot of interesting things about git-scm.

1. GitHub is footing the bill - I'm paying for gitignore.io (although it's only costing me the annual domain)

2. The site uses 3 Dynos - Currently gitignore.io uses 1 Dyno on the free tier, and I've recently moved the backend from Node to Swift to double / triple network performance based on my preliminary testing. I don't know why the site needs 3 Dynos because, like the OP mentioned, it's a static site.

3. Access to Heroku seems to be an issue - I ran into the same problem and I'm finishing up a full continuous integration process to build and test my site on Travis. I basically want to approve a pull request and have the site fully tested through my Heroku pipeline, then have the PR landed in production.

4. Traffic - I don't know how many users he's got but I'm seeing about 60,000 MAU's and about 750,000 requests a month.

* Jason Long helped design my site and logo as well.

2
dom0 2 hours ago 2 replies      

> We (the Git project) got control of the git-scm.com domain this year. We have never really had an "official" website, but I think a lot of people consider this to be one.
So, uh, git-scm.com wasn't an official website all these years?

3
vlucas 1 hour ago 1 reply      
I am always amazed at how quickly Heroku gets prohibitively expensive when you start scaling.

When I ran https://jscompress.com/ on Heroku, I was up to $100 per month for two 2x Dynos. Completely absurd for a simple one-page Node.js app. I put in a little work moving it to DigitalOcean, and had it running great (and faster) on a $10 VPS.

I get the appeal of Heroku (I have used it several times), but man sometimes it feels like gouging when you can least afford it.

4
toomanybeersies 1 hour ago 1 reply      
Speaking of popular software that doesn't have its own website, the PuTTY developers have never bothered with getting a domain name specifically for PuTTY (http://www.chiark.greenend.org.uk/~sgtatham/putty/). I'm not actually entirely sure what the rest of the site/domain is meant to be for either.
5
OJFord 2 hours ago 5 replies      
If it were my site, I think I wouldn't even bother with the search: just stick it all on S3 and have one of those 'Google custom search' or similar boxes, so it's static as far as your site's concerned and just redirects to Google with the `site:foo` filter.

I don't really have a handle on what S3 costs 'at scale', but I think I'm willing to bet it would knock at least one 0 off the end.
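
The redirect trick amounts to building one URL with Google's site: operator; a tiny sketch (a production page would percent-encode the query properly):

    // Build the Google query URL a static "search box" can redirect to,
    // scoping results to one domain with the site: operator.
    fn google_site_search_url(site: &str, query: &str) -> String {
        let q = query.replace(' ', "+"); // naive encoding, fine for a sketch
        format!("https://www.google.com/search?q=site:{}+{}", site, q)
    }

    fn main() {
        let url = google_site_search_url("git-scm.com", "rebase onto");
        assert_eq!(
            url,
            "https://www.google.com/search?q=site:git-scm.com+rebase+onto"
        );
    }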

6
bicknergseng 2 hours ago 1 reply      
I wonder how hard it would be to convince folks to drop their expensive setups in favor of nearly $0 static sites, as well as how much up front cost they'd be willing to shovel out for the transition. S3 + CDN (+ Lambdas optionally) feels really ready to me for almost any straightforward "website." For most things GitHub/Lab pages is an easy path to that.
7
gsylvie 1 hour ago 1 reply      
Some armchair speculation: the price Gitlab and Atlassian would pay to have one link each up there would probably dwarf the current monthly hosting costs.

Not sure if the "try.github.io" link should count as a link to Github, but most of the others do (e.g., github.com/google).

8
mentat2737 2 hours ago 1 reply      
Just a note:

> The deployed site is hosted on Heroku. It's part of GitHub's meta-account, and they pay the bills.

So why aren't they just using a GitHub page for this?

9
sametmax 2 hours ago 1 reply      
> It uses three 1GB Heroku dynos for scaling, which is $150/mo. It also uses some Heroku addons which add up to another $80/mo.

Wow, why? You can get a VPS with 2GB RAM + 10GB SSD for 3 euros these days (https://www.ovh.com/fr/vps/).

That seems very expensive.

10
hobarrera 2 hours ago 1 reply      
IMHO, since it's a static website, they can use a static site generator and simply use something like GitLab Pages to deploy it (for free).

There is a bit of work to be done, but it shouldn't be too terrible if the templates and stuff are okay.

11
pulse7 2 hours ago 4 replies      
What's the best way to optimize cost here? Complete site cached and served from memory (no disc access -> faster response times -> scales better)?
12
camus2 2 hours ago 4 replies      
As far as I understand it, nobody but the git team is paying for hosting. Why are neither GitHub nor Heroku paying for this? They are built on top of git. Millions of tech dollars go to political causes right now, yet nobody is willing to give $230/mo of free hosting to the git website, for the most-used VCS today? Talk about priorities. And it's not the first time; plenty of open source projects used by billion-dollar companies receive $0 of funding.

Edit: GitHub seems to be paying for that, but Heroku shouldn't even bill them.

5
Circuit building: stop using antique parts (2014) sensitiveresearch.com
106 points by 6581  4 hours ago   90 comments top 26
1
ChuckMcM 4 hours ago 4 replies      
Sigh.

I like FETs as much as the next engineer but if you're going to write about them, try to not make things worse.

In particular, this made me sad: the 10K resistor to ground isn't strictly necessary, but it ensures that the MOSFET remains OFF if the arduino is disconnected or it's pin is not an OUTPUT.

I was helping a high school student with a transistor project and he was complaining that "half the FETs he bought were 'bad'". And they weren't really bad; they were destroyed. They were destroyed because, like our author, the student had no understanding of how the FET worked, and so didn't realize that exceeding the maximum gate voltage on a FET generally causes a current to "jump the gap" to the source, which permanently breaks it. What's more, since the gate is essentially a capacitor, you can just touch the gate with your finger and pass enough charge (without feeling a shock or anything) to greatly exceed the gate voltage. Pick up the FET without being grounded and "boom!", dead FET.

Now there is almost no way to generate enough current by touching a Bipolar transistor to kill it, and so they continue to work for a long time while plugging them in and out of breadboards.

Do they dissipate more power? Absolutely. Are they difficult to run in parallel? Sure. But they are pretty robust parts. Sort of like the difference between alkaline batteries and Li-ion rechargeable batteries. Sure, the latter are a "better" choice, you can recharge them after all, but if you use them wrong they catch on fire; if you short an alkaline battery it gets hot but doesn't go ballistic on you.

So those "antique" parts are generally very cost effective, very robust, and easy to get. So they make excellent tools to teach you the basics. Do you want to stay with them as you get more advanced? Probably not, but you're probably not skiing on the same skis you learned on either.

/endrant

2
PeanutNore 4 hours ago 7 replies      
I build a lot of guitar effects, and that community is even worse when it comes to attachment to obsolete components. The discontinuation of J201 JFETs (in the TO-92 through-hole package at least) by Fairchild has caused a lot of upset. My perspective, not widely shared, is that JFETs in general have been made obsolete by depletion-mode MOSFETs, at least in audio circuits.

Part of the reason, I think, is that a lot of "designers" of guitar pedals don't understand the math that determines component values for a particular transistor in a particular circuit, so they don't know either how to select a modern transistor to replace an obsolete one based on datasheet parameters, or how to change the collector/emitter/source/drain resistor values to work right with a new part.

3
blackguardx 4 hours ago 3 replies      
There is nothing wrong with bipolar transistors. They aren't antiquated. FETs have definitely overtaken them in power applications (for good reason) but it seems hard to displace them for small-signal applications. Small-signal FETs just aren't that plentiful or cheap. There are many instances where a small-signal FET makes sense (level translators, etc.) but the cost doesn't justify its use. Using a 2N2222 and the like is appropriate.

On top of that, BJTs excel in many analog applications. There is a good reason why many analog IC companies make parts using a BiCMOS process. If you are doing a discrete analog design (generally for performance reasons) you often need the performance of BJTs or JFETs.

4
AceJohnny2 55 minutes ago 1 reply      
I've recently gotten back into hobby electronics, trying to revive all that EE knowledge I haven't used in 15 years, and one thing I sorely lack is knowledge of common components to use.

Online electronics stores are no help because of the overload of possible components.

Is there some sort of Cookbook out there I could refer to for up-to-date examples of common circuits and electronic tasks? I mean, I otherwise totally would've used a 2N2222 transistor for some tasks, because I wouldn't know better.

5
Animats 3 hours ago 2 replies      
If you want to switch power, MOSFETs are the way to go today. Voltage drop is near zero. Digi-Key has over a thousand MOSFETs in TO-220 packages. Here's one of the cheapest with a low gate threshold.[1] Probably good enough for most Arduino on/off applications.

MOSFETs do have some problems. They can draw large gate currents for the first few nanoseconds of turn-on, which can overload whatever is driving them. The gate input is very vulnerable to electrostatic discharge during handling. Device failure tends to be into the ON state.

I just went through a big struggle with MOSFET selection for a special purpose switching power supply. The big through-hole parts have too much gate capacitance and need too much drive at turn-on. Only in surface-mount could I get something that would work.

That's the real problem with antique parts. The new stuff is surface-mount only.

[1] https://www.fairchildsemi.com/datasheets/FQ/FQP4N20L.pdf

6
Severian 4 hours ago 1 reply      
The irony is that the MOSFET he suggests to use, the NTD4906N, is listed as obsolete on Digikey.

Which is a point worth making: a lot of the reason people use antiquated components is probably that documentation and reference schematics are easy to find and have been refined to be very reliable.

7
duskwuff 21 minutes ago 0 replies      
The obsolete part I'd really like to see everyone get away from is the 741 operational amplifier.

The 741 was introduced in the late 1960s. It's completely outclassed by modern general-purpose op-amps; there is absolutely no reason to still be using it today.

8
lpmay 2 hours ago 0 replies      
There's not much of substance to this article. The world of electronics is a bigger place than turning a load full on or full off with a MOSFET. A couple of exercises if you want to see why having some BJTs around is handy: try to find a MOSFET that turns on at 0.6 Vgs; compare the Ids leakage of a MOSFET to the Ice leakage of a comparable BJT. I also take issue with the idea that "all that extra stuff" is drawn in the MOSFET symbol for "no reason". The body diode alone is an important factor to consider EVERY time you use a MOSFET. When you read schematics every day, you're trained to recognize when diodes will forward-conduct in your circuit, and having that drawn in the MOSFET symbol helps you immediately see problems that would otherwise make it into your final circuit.
9
mrob 3 hours ago 4 replies      
Mechanical relays are not obsolete. They can have lower resistance and capacitance than solid state equivalents. A good relay is a very close approximation to a plain wire when closed. Relays are still commonly used for RF switching.
10
spott 4 hours ago 0 replies      
11
coreyp_1 3 hours ago 2 replies      
Don't tell me that an LM386 is bad without telling me why and what to replace it with. If you don't, then you're not actually helping me at all; it just sounds like you're spreading FUD.
12
dyselon 2 hours ago 0 replies      
Is there a good reference for potential upgrades to old, obsolete-ish parts? I definitely understand that many of the old logic ICs, transistors, and op-amps have been displaced by better parts, but I don't always know how to identify them short of a parametric search on digikey and hoping there's not some sharp corner I missed. I'd find a lot of value in a page that had some recommendations for "Are you using <OLD PART> for <PURPOSE>? Consider <NEW PART>!"
13
amelius 44 minutes ago 0 replies      
For anyone interested, I can highly recommend the course "Structured Electronic Design", [1].

[1] https://ocw.tudelft.nl/courses/structured-electronic-design/

14
joezydeco 3 hours ago 0 replies      
MOSFETs driving motors or large coils without spike protection? Good luck with that. Have a fire extinguisher handy.
15
bschwindHN 2 hours ago 2 replies      
I'm an electronics amateur, got a question here:

I use a PN2222 [1] transistor to switch a 5V supply through an infrared LED from a microcontroller. It works, but should I be using something better/smaller/more cost effective/more appropriate for the circuit [2]?

[1] https://blog.bschwind.com/2016/05/29/sending-infrared-comman...

[2] https://blog.bschwind.com/2016/05/29/sending-infrared-comman...

(the resistor in the photo is 680 ohms)

16
noonespecial 1 hour ago 0 replies      
Why use a 120 in today's modern age? Because that thing you're repairing was Woz'd to the point that that VCE plays an important part in the circuit's operation.

Keep your TIPs; you might not necessarily use them for that brand new design of yours, but you'll probably use them for something.

17
zw123456 2 hours ago 1 reply      
While we are on the topic of using more modern parts: if you are controlling a motor as shown in the example, why not use an H-bridge chip?
18
happycube 4 hours ago 0 replies      
The irony here is that the NTD4096N is discontinued/NLA. (There are other suitable parts in the price range, though.)
19
jhpankow 4 hours ago 1 reply      
You can have my 2N2222's when you pry them from my cold dead hands.
20
platz 4 hours ago 0 replies      
Although I am on board with the OP's recommendation, I think many folks using an Arduino are in the milliamp range; they are not driving amps of current. So the 'constant factor' of obtaining less-well-known parts from Digi-Key is significant.

Do you really want to provide any more disincentive for budding hobbyists?

21
lightedman 1 hour ago 0 replies      
Good luck building a BFO metal detector using MOSFETs without a ton of additional useless circuitry.
22
nom 2 hours ago 0 replies      
23
jamesmp98 4 hours ago 0 replies      
I'll keep on using 6502's for fun
24
deepnet 1 hour ago 0 replies      
Page author is Tom Jennings, the creator of FidoNet, the BBS network that presaged the internet.

https://youtu.be/_Cm6EFYktRQ?t=2m32s

25
el_isma 4 hours ago 0 replies      
The page colors are terrible, but he does make a good point. Bipolars out, FETs in!
26
basicplus2 1 hour ago 0 replies      
That's funny, really, because MOSFETs were around in the '70s.
6
Israel's Tech Firms Do Business in Saudi Arabia Quietly bloomberg.com
80 points by gavman  3 hours ago   30 comments top 10
1
Animats 31 minutes ago 0 replies      
The "binary options" scam industry, which is mostly run from a suburb of Tel Aviv, has been targeting the Arab world recently. Mostly by default. The industry isn't allowed to scam Israelis, but the rest of the world's suckers are fair game under Israel law. They've been kicked out of the US, and recently, most of the EU. So they're telemarketing into the Arab countries now.

This requires hiring Arabic speakers, who are available in Israel. Greed apparently overrides Arab-Israeli differences in this area.

(Big 15-part exposé in the Times of Israel: "The Wolves of Tel Aviv".[1] Summary: binary options are bets against the house, not against other speculators. Binary options "brokers" are not really brokers; they're shills for the house. Worse, the house cheats, tweaking the prices to make customers lose. Even if customers win, the house won't pay up. 80% of investors lose all their money. This brings in over 0.7% of Israel's GDP, and that's just the part that pays taxes.)

[1] http://www.timesofisrael.com/the-wolves-of-tel-aviv-israels-...

2
marginalcodex 2 hours ago 0 replies      
As someone with direct experience with this, it's nice (and important) to see this reality get more recognition. Something unmentioned in the article is that the military/intelligence relations are far stronger and better integrated than the tech relationships.

The article correctly points out that the Arab states involved (this doesn't seem to apply to the non-Arab Muslim states) will not normalize formal relationships without movement on the Palestinian front. However, despite this, these dynamics have made Arab countries much more willing to pressure the Palestinians and to withhold diplomatic pressure on Israel on the things they care about (i.e. Operation Protective Edge).

An important point to note is that all of the Arab countries becoming friendly to Israel are non-democratic, and their citizens don't view Israel along the same lines as their governments do.

3
Cyph0n 48 minutes ago 2 replies      
I'm a Muslim Arab American and I am against Israel's more aggressive policies and actions.

But I definitely respect Israel as a state, and especially the Israeli tech sector. Israel has consistently produced world-class companies in almost every field you can think of. It's quite impressive once you consider all of the factors.

The author of "Startup Nation" argued that one key reason behind this success is how the military ties back into civilian companies, especially with the mandatory service. I'm not sure how important that is, given how many other countries have a similar model yet they've achieved nothing close to what Israel has. There are a few that have, such as Singapore and South Korea.

I'm hoping that my home country Tunisia can replicate (at least somewhat) the Israeli success story. We have a lot of talented people both inside the country and overseas. The biggest thing we're lacking right now is a functioning economy (yep, tough times)... but I hope we can get there in the near future!

4
dharmon 2 hours ago 0 replies      
I worked with an Israeli company that had to do something like this.

While in the country I was going to visit their office, but I couldn't find the address anywhere on the website, only that of their NY satellite office. I was later told this was because they have manufacturing customers in Pakistan and Bangladesh, from whom they hide their true identity.

5
ysavir 2 hours ago 0 replies      
Hopefully Saudi Arabia doesn't read Bloomberg.
6
tw04 2 hours ago 7 replies      
>If its a country which is not hostile to Israel that we can help, well do it

To say Saudi Arabia isn't hostile towards Israel takes one heck of a lot of mental gymnastics, or a burying of your head in the sand. Wahhabism isn't exactly known to be pro-Judaism.

7
projectramo 1 hour ago 0 replies      
I wonder if these machinations are necessary.

I mean, what if the company didn't try to hide its affiliations (for instance, on the website or when customers visited)?

They assume that the customer would react negatively, but do we know this or is it just an untested assumption?

8
Fiahil 2 hours ago 0 replies      
> Moreover, common sense tells us that in order for Saudi Arabia to get any weapon systems, they have to be bought under trade agreements made with friendly countries that manufacture those systems with official and approved export trade certificates from their governments. It is also certain that Israel is not among the countries that have commercial relations with the Kingdom.

I thought software sales were exempt from trade certificates?

9
gcb0 22 minutes ago 0 replies      
can't say which side paid who for this piece.
10
avip 2 hours ago 0 replies      
This has nothing to do with tech.

There are unofficial trade relations between Israel and (almost?) any country in the world.

7
Show HN: Posters of Your GitHub Contributions commitprint.com
167 points by aarondf  5 hours ago   60 comments top 14
1
lolsal 4 hours ago 6 replies      
I think the site is nice - it's clean and directly to the point. No huge hero and no click-throughs to get to the meat of your project. Prices for your posters seem entirely reasonable.

Tangent: I don't understand the fascination with one's own commit heatmap (or any other commit heatmap for that matter). Maybe it seems a bit too self-congratulatory? Vain? I don't know quite what it is, but it rubs me the wrong way and I wouldn't be interested in putting something up like that in my home or office. Maybe because it implies quantity over quality? Whatever the reason, your poster project is nice regardless of whether or not I would buy one.

2
aarondf 5 hours ago 6 replies      
Hey y'all, looks like it's being hugged to death right now... I'm trying to keep up!
3
tuxracer 1 hour ago 0 replies      
Cool project! Quick suggestion: Ditch the "Update preview" button and immediately update the preview after you make new choices.
4
madamelic 58 minutes ago 2 replies      
This would be so much cooler if it was wall-mounted art and 365 LEDs.

I'm not big into electronics, but can't you adjust an LED's brightness by changing what voltage is sent to it?

5
rhabarba 4 hours ago 5 replies      
It makes me sad to see that people actively contribute to the implicit "open source = Github" assumption. Open Source has always been successful because there was more than just a single distributor for almost anything.

I hope I'll see a new rise of decentralized solutions. For now I'm frightened of the monopolism here.

6
kuon 41 minutes ago 0 replies      
I guess the site might be overloaded; I tried with my GitHub but it's just spinning.
7
smortaz 4 hours ago 0 replies      
Is there a way to get these for a repo? I'd love to order a couple for my teams. Thanks!
8
pwny 2 hours ago 0 replies      
Love the look of the site!

How are you getting these printed and shipped? Do you do it yourself or consume a 3rd party service?

9
jrm2k6 2 hours ago 0 replies      
If you just want a little widget to display and not necessarily a poster, there is also this: https://github.com/jrm2k6/contwidgetor (gets your bitbucket contributions as well).
10
gravypod 4 hours ago 2 replies      
What about private repos? Will this work?
11
wooshy 3 hours ago 0 replies      
Awesome site and idea! I hope you figure out how to work out your bottlenecks.
12
debt 4 hours ago 0 replies      
Very cool. I think there's a lot of focus on code quality and organization and shit; sometimes it's important to remember you built something! You should be proud of yourself.
13
googletron 3 hours ago 2 replies      
Blatant Gyroscope rip-off. Nice effort in hacking it together, though.
14
ahm786 5 hours ago 0 replies      
Cool stuff!
8
Announcing Rust 1.15 rust-lang.org
297 points by steveklabnik  4 hours ago   63 comments top 16
1
thenewwazoo 4 hours ago 4 replies      
WHOA!

MSP430 support is HUGE. In case you don't know, that's a well-known very low-power microcontroller.

edit: this might be the only current platform that's <32 bits. I know there was some work on supporting the 8-bit AVR architecture, but MSP430 gets Rust a lot closer to super-low-power applications.

2
dikaiosune 4 hours ago 3 replies      
This is pretty major. In anticipation of this release, I was able to remove all nightly feature flags from a smallish web backend which makes heavy use of type-driven codegen (serde and diesel). Really, truly, utterly fantastic to have stability promises for custom_derive. So excited!

!!!!!!

So excited!
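
For readers who haven't followed the nightly saga: serde's codegen is the canonical example of the custom derive that 1.15 stabilizes. A minimal sketch, assuming serde, serde_derive, and serde_json are declared in Cargo.toml:

    #[macro_use]
    extern crate serde_derive;
    extern crate serde_json;

    // The derive generates the Serialize/Deserialize impls at compile time;
    // as of Rust 1.15 this works on the stable compiler, no feature flags.
    #[derive(Serialize, Deserialize, Debug)]
    struct User {
        id: u64,
        name: String,
    }

    fn main() {
        let user = User { id: 7, name: "Ferris".to_string() };
        let json = serde_json::to_string(&user).unwrap();
        println!("{}", json); // {"id":7,"name":"Ferris"}

        let back: User = serde_json::from_str(&json).unwrap();
        println!("{:?}", back);
    }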

3
squiguy7 4 hours ago 2 replies      
This is probably the most exciting release yet. Congratulations to the Rust team! I'm continually impressed by the momentum this project has.

I cannot wait to see what everyone will build with custom derive now.

4
marcosscriven 2 hours ago 1 reply      
Great to see the progress here. The one I'm really looking forward to is 'impl trait'.[0]

My understanding is this will allow returning trait implementations (such as Iterator) without resorting either to dynamic dispatch or leaking the concrete type.

[0] https://github.com/rust-lang/rust/issues/34511

5
JelteF 3 hours ago 1 reply      
I've been happily awaiting this release. I have been heavily updating my first Rust library so it makes use of the new derive API, so that it can be used on stable Rust.

It adds derives for common traits such as Add and From: https://github.com/JelteF/derive_more. Any remarks/tips are very much appreciated.
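
A sketch of what those derives buy you, assuming derive_more is declared in Cargo.toml; the Meters newtype is invented for illustration:

    #[macro_use]
    extern crate derive_more;

    // The derives generate `impl Add` and `impl From<f64>` for the newtype,
    // replacing the usual hand-written trait boilerplate.
    #[derive(Add, From, Debug, PartialEq)]
    struct Meters(f64);

    fn main() {
        let total = Meters(1.5) + Meters(2.5); // derived Add impl
        assert_eq!(total, Meters(4.0));

        let converted = Meters::from(3.0); // derived From impl
        println!("{:?} {:?}", total, converted);
    }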

6
eridius 3 hours ago 0 replies      
Woo, std::char::encode_utf8 is now stable; I can finally get rid of the nightly requirement for https://github.com/kballard/rust-lua!
7
flukus 19 minutes ago 1 reply      
So these procedural macros, would it be fair to characterise them as compile time reflection? It seems like these get used in the same way reflection does in other languages.
8
mayhew 15 minutes ago 0 replies      
Great to see custom derive landing in stable. Big thanks to Mozilla and everyone else working on improving Rust!
9
zegerjan 4 hours ago 1 reply      
> Rust 1.15 sees an extremely eagerly-awaited feature land on stable

Hoped to read incremental recompilation there, what a tease ;)

10
leshow 4 hours ago 0 replies      
Awesome! I have been stuck on nightly for so many different things because of my dependency on serde. It's very nice to have this on stable now.
11
kaoD 1 hour ago 1 reply      
On https://thanks.rust-lang.org/rust/all-time

  Rank  Name  Commits
  1     bors  10401
That bors guy is such a monster! ;)

12
maxfurman 1 hour ago 6 replies      
Not a Rust expert by any means, this one piece of code from the example stuck out to me: `String::from("Ferris")`. What is the difference between that and `"Ferris"` on its own?
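
Since the replies aren't inlined here, a short note: "Ferris" by itself is a borrowed string slice (&'static str) pointing at bytes baked into the binary, while String::from("Ferris") copies those bytes into an owned, growable, heap-allocated String. A sketch:

    fn main() {
        // A string literal is a borrowed slice with a static lifetime;
        // it cannot grow and you don't own it.
        let borrowed: &'static str = "Ferris";

        // String::from copies the bytes into an owned heap buffer,
        // which you can mutate and pass ownership of.
        let mut owned: String = String::from("Ferris");
        owned.push_str(" the crab");

        println!("{} / {}", borrowed, owned);
    }
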
13
Siecje 51 minutes ago 1 reply      
So it seems cross-compiling is supported now?
14
the_duke 4 hours ago 0 replies      
Yay, custom derive is stable.
15
wyldfire 4 hours ago 0 replies      
Nice, I like the look of custom derive.

I've been using x.py/rustbuild since just after it was committed and it's been a pretty good experience so far.

16
EugeneOZ 41 minutes ago 1 reply      
:
9
Snapchat reportedly hit 160M daily users and $400M revenue in 2016 techcrunch.com
26 points by prostoalex  1 hour ago   7 comments top 4
1
fnovd 1 hour ago 1 reply      
As someone who never really "got" Snapchat, I found this interesting:

>Snapchat has done an impressive job soaking up attention by covering three different use cases with a single app: private messaging, social media Stories broadcasting, and professional Discover content. These work together to give people something to do even if their friends don't post interesting stories, they're waiting for people to reply, or they don't resonate with the featured publishers.

2
eddd 1 hour ago 1 reply      
20% per year is not impressive, to be honest. Don't get me wrong, I admire them as a company, but the growth rate is too slow to compete with the top social platforms.
3
aznpwnzor 31 minutes ago 0 replies      
And their GCP contract is for $400M a year?
4
ffef 1 hour ago 0 replies      
Good for them
11
How to use Deep Learning when you have Limited Data medium.com
12 points by sarthakjain  50 minutes ago   5 comments top 2
1
brandonb 13 minutes ago 1 reply      
Another idea is one-shot learning using deep generative models. DeepMind had a paper on this last year: https://arxiv.org/abs/1603.05106
2
sarthakjain 50 minutes ago 1 reply      
Machine Learning and AI seem to be in vogue but become tough to implement unless you have boatloads of data. We've personally had multiple frustrating experiences over the last ~7 years of trying to solve problems using ML. In almost all the cases we failed to ship due to lack of data. Transfer Learning is a major breakthrough in ML where companies with little data can also build state of the art models. Unfortunately not enough people know about it. We are trying to do our part to make it easier to use Transfer Learning as well as increase awareness about it.

Using Transfer Learning we can build a model to identify cats and dogs in images with a few (<100) images as compared to the few thousands it would take before.

To make Transfer Learning easy we are building https://nanonets.ai, which has multiple pretrained models that can be augmented with your data to create state-of-the-art models. We are currently in the process of building our first few models: Image Labeling and Object Detection (in images) work, with a few text-based models coming up in the next few weeks.

12
Why I replaced MIT with copyleft license for Nodemailer (rant) nodemailer.com
15 points by andris9  1 hour ago   11 comments top 3
1
scandox 2 minutes ago 0 replies      
It's a great piece of software. I use it in two projects, both commercial platforms, though not particularly big money makers.

The problem for me is I remain somewhat uncertain about the implications. If I use this within a service that requires payment from customers, it seems to me there is no change. And most software using this is likely to be delivered as a service, right? Maybe I misunderstand the license.

In a way, if the approach were more aggressive, it would be easier for me. I could just go to the people that write the cheques and say: we use this software. New version requires payment. Write a cheque.

As it is I'm not sure whether we have to pay or not. As for donations - well I could make one personally (in fact I will) but it is unlikely to be 780 euros...

2
wccrawford 31 minutes ago 3 replies      
Spoiler: Money.
3
whitten 1 hour ago 1 reply      
andris9,you mentioned that the EUPL was more European Union friendly, similar to how GPL is more USA friendly.

Do you mind elaborating what you mean by this?

13
Launch HN: RankScience (YC W17) Automated Split-Testing for SEO rankscience.com
82 points by ryanb  4 hours ago   64 comments top 17
1
ryanb 4 hours ago 2 replies      
RankScience automates split-testing for SEO to grow organic search traffic for businesses. 80% of clicks from Google go to organic results and yet most companies don't know how to improve their SEO, or can't effectively measure their efforts to do so. Because of the scale of SEO and the constant change of both Google's ranking algorithm and your competitors' SEO campaigns, the only way to succeed in the long-run is with software and continuous testing.

We've built a CDN that enables our software to provide tactical SEO execution and run A/B testing experiments for SEO across millions of pages. Experiments typically take 14-21 days for Google to index and react to changes, and we use Bayesian Structural Time Series and Negative Binomial Regression models to determine the statistical significance of our experiments.

Our software is 100% technical SEO; it doesn't do anything black-hat, spammy, or related to link-building. One of our goals is to bring transparency and shed light on what is largely considered a shady industry but is so important to so many companies' revenue and growth. In fact, if SEO didn't have such a bad reputation, we think someone else would have built this a long time ago.

SEO as an industry earned itself a stigma for being spammy: between buying links, creating low-quality pages stuffed with keywords and text intended for Google rather than humans, and the used car salesmen attitude that many SEOs have, many people have been conditioned to dismiss SEO as an invalid or illegitimate growth channel.

We're software engineers-turned-SEO's, who have previously consulted for dozens of companies on SEO, from YC startups to Fortune 500 companies like Pfizer. We previously shared our case study with HN, where we increased search traffic to Coderwall with one A/B test: https://www.rankscience.com/coderwall-seo-split-test

Ask us anything! We'd love to answer any questions you have about SEO, A/B testing, and RankScience.
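
One mechanic worth spelling out (this sketch is illustrative, not RankScience's actual code): unlike user-based A/B tests, SEO experiments assign pages rather than visitors to variants, and each URL must always be served the same variant so Google indexes a stable version. Deterministic hashing of the path gives you that:

    use std::collections::hash_map::DefaultHasher;
    use std::hash::{Hash, Hasher};

    #[derive(Debug)]
    enum Variant {
        Control,    // page served unchanged
        Experiment, // page served with the change under test, e.g. a new title tag
    }

    // Hash the URL path so each page always lands in the same bucket and
    // Google only ever sees one stable variant per URL.
    fn assign(url_path: &str, experiment_percent: u64) -> Variant {
        let mut hasher = DefaultHasher::new();
        url_path.hash(&mut hasher);
        if hasher.finish() % 100 < experiment_percent {
            Variant::Experiment
        } else {
            Variant::Control
        }
    }

    fn main() {
        // A 50/50 split across a site's pages; re-running gives the same buckets.
        for path in &["/pricing", "/blog/seo-tips", "/docs/api"] {
            println!("{} -> {:?}", path, assign(path, 50));
        }
    }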

2
austenallred 3 hours ago 1 reply      
A few questions about your product (I run SEO and digital marketing at LendUp - YCW12). This is very, very cool.

* What types of optimizations does it do/do you test?

* I assume tests take a while to run, waiting for Google to re-index etc. Do you essentially monitor rank changes, assume that Google has re-indexed at that point, and use that as the data to optimize on? Or do you have some smart way to monitor for when Google re-indexes?

* I'm again assuming here - that a user will plug in a few variations of things to test, and let your software test them? Or are there automatic optimizations the software tries to make?

* How many tests can one do in a given time period, without confounding test & control variants? It seems like they would take a while to run?

* Is there a good way to control for non-technical/non-content-based changes (e.g. external links)? For example, we get hundreds of negative links pointed at us per week. Do we just hope/assume that's not the cause of rank changes?

3
jitbit 3 hours ago 1 reply      
Guys, The idea looks great. Congrats on the launch.

2 SEO questions:

1) a CDN means "thousands of websites on one IP address". Google doesn't like it UNTIL it knows, the IP belongs to a well-known CDN like cloundfront/cloudflare etc. Please comment?

2) An A/B test might look like "cloaking" to Google. How exactly do you run it? I assume not in parallel, but "variation A then variation B" - correct? If yes, does it mean I have to leave the website UNTOUCHED for 21 days so the test results are not distorted by my other activities (adding new content, internal links, etc.)?

4
birken 2 hours ago 1 reply      
As somebody who wrote an article about A/B testing title tags in 2011 before it was cool [1], this is an awesome idea. I've talked about SEO with many companies and coming up with the proper title tags and meta descriptions alone is often worth so much traffic for such little effort (once you get past the upfront cost of running the tests).

However, I think a critical aspect of SEO is thinking about an entire site holistically. Not only because certain signals are site-wide, but because a key aspect of SEO is deciding what pages of your website are "good" for SEO and which ones aren't, and then focusing on making the "good" pages better and not worrying about the "bad" pages. Good and bad in quotes because it is often quite a bit of an art and not a science.

How does RankScience play into this? You've nailed the on-page stuff, but is there any world in which RankScience is able to talk about a site holistically and recommend which types of pages and content seem to be working most effectively (and maybe even suggest pages that could be removed/de-indexed)? Or do you leave that to SEO consultants and just nail the hell out of the on-page stuff?

1: https://www.thumbtack.com/engineering/seo-tip-titles-matter-...

5
sgslo 4 hours ago 1 reply      
Looks great, congrats on the launch!

Two points of feedback:

(1) The idea of the product is clearly conveyed, but I'm confused on exactly how it works. The landing page mentions that title tags, headlines, meta-tags, etc get tweaked - exactly how is this done? Do I have to manually enter a bunch of alternative text, or are you using a big fancy thesaurus to switch out some key terms?

(2) How do you evaluate the performance of the product? Solely through click rates, or by search rankings? How often do Google search results get updated? In short, how do I know the product is working?

6
BickNowstrom 4 hours ago 1 reply      
Though using A/B testing to improve user experience or conversion rates is fine, I thought using A/B testing to reverse engineer the ranking algorithm was against the guidelines. Has this been updated?

From https://support.google.com/webmasters/answer/7238431?hl=en

> Best practices for website testing with Google Search

> The amount of time required for a reliable test will vary depending on factors like your conversion rates, and how much traffic your website gets; a good testing tool should tell you when you've gathered enough data to draw a reliable conclusion. Once you've concluded the test, you should update your site with the desired content variation(s) and remove all elements of the test as soon as possible, such as alternate URLs or testing scripts and markup. If we discover a site running an experiment for an unnecessarily long time, we may interpret this as an attempt to deceive search engines and take action accordingly. This is especially true if you're serving one content variant to a large percentage of your users.

Besides this, the advice is to use rel="canonical" to avoid duplicate-content issues with Googlebot crawling your variations. When using rel="canonical", this should not show you how a variation influences ranking.

> If you're running an A/B test with multiple URLs, you can use the rel="canonical" link attribute on all of your alternate URLs to indicate that the original URL is the preferred version. We recommend using rel="canonical" rather than a noindex meta tag because it more closely matches your intent in this situation.

7
gargarplex 17 minutes ago 0 replies      
What's the minimum amount of traffic needed to benefit from this offering?
8
gukov 1 hour ago 1 reply      
It seems like you integrate with the likes of Google (Webmaster Tools) and Cloudflare in ways not done before, so my assumptions here are probably a bit outdated.

Let's say you want to test a new title. You collect stats for a week, change the title, collect stats for another week. The two datasets are then compared. Am I close?

Does it mean that you can't A/B test in parallel? If so, it's not optimal for time-sensitive stuff like breaking news.

9
franciscassel 44 minutes ago 0 replies      
I consult with startups and tech companies (recently: SurveyMonkey) on SEO, so this is super-interesting... especially the automated aspect, which is pretty novel in this space.

What are some of the specific types of automated tests that you run?

10
7cupsoftea 2 hours ago 1 reply      
Just want to chime in here and share that I'm a very happy customer. Ryan, Dillon, and Chad are some super smart guys who deeply understand SEO. I use RankScience for 7 Cups and Edvance360 and I've seen a huge ROI. They always find time to meet with me and my other team members. The advice and feedback they provide is worth the cost of the service alone. Highly, highly, highly recommended!
11
killion 3 hours ago 1 reply      
We've been using RankScience for months at Suiteness. They've given our indexed pages a consistent boost every week.
12
hartator 3 hours ago 1 reply      
Congrats on the launch.

How do you check the effectiveness of an SEO change? Do you check Google rankings or traffic?

13
ortekk 1 hour ago 1 reply      
That looks great! How do you automate Google's rank checking? Do you use proxies or a swarm of VPSs?
14
coolswan 4 hours ago 1 reply      
Thanks for the AMA! What's the best way to deal with Google algorithm changes? Also, at what point does site performance actually impact my SEO?
15
Mizza 3 hours ago 1 reply      
How does this compare with the other offerings in the space?
16
omarchowdhury 3 hours ago 1 reply      
How does your pricing work?
17
TomAnthony 3 hours ago 1 reply      
Hey guys! Congratulations on the launch! Very exciting!

We've been seeing some great results with SEO testing, and have recently had our biggest test result (in terms of revenue impact). Looking forward to hearing more of what you guys are up to. :)

I think the fact that DistilledODN, RankScience, Etsy, and Pinterest have all published SEO split-test results recently demonstrates the importance of this type of data-driven approach to SEO!

Best of luck with everything!

Tom, Distilled (Disclaimer: I run the https://www.distilledodn.com/ team)

14
Abusing the C switch statement beauty is in the eye of the beholder feabhas.com
188 points by ingve  8 hours ago   118 comments top 30
1
saurik 6 hours ago 6 replies      
I think this looks a lot better if you use "if (false);" at the top to normalize the "else if" in the rest of the chain.

  switch (x) default:
      if (false);
      else if (valid_command_message(x))
          case CMD1: case CMD2: case CMD3: case CMD4:
              process_command_msg(x);
      else if (valid_status_message(x))
          case STATUS1: case STATUS2: case STATUS3:
              process_status_msg(x);
      else
          report_error(x);
That said, I think the entire concept of this code having both the enum switch and the valid_* functions is just horrible :/.

To be clear, it isn't that I find the code difficult to grok: if you don't understand how switch works, please stop using C.

The issue is that this particular use case for the switch is a lazy performance optimization, as it should just totally replace the valid_* calls if we do that; it is like the author doesn't want to update the switch cases, but wants the code to keep working? Did they just forget the switch exists, or is it that they can't edit it? I just don't get it, particularly when any modern C compiler supports "give me a warning if I don't consume all the possible switch values in this enum", which is the feature that this code should be using, not if < END_*.

BTW: it isn't clear to me that this would even be a performance optimization; in an ironic twist, many compilers are going to choose to compile that switch statement into the moral equivalent of two if statements with range checks, as default has to be implemented as a range check anyway, and if you work out how many range checks you are doing combined with the code cache benefits of having explicit branches instead of implicit jump tables, this switch statement is going to feel extra repetitive when you look at the machine code.

2
rootbear 4 hours ago 2 replies      
While we're discussing abuse of switch, here's my personal best effort at misusing this feature, from over thirty years ago.

  /*
   * A program to print The Twelve Days of Christmas
   * using the C fall through case statement.
   *
   * Jim Williams, jim@maryland, 2 May 1986 (but first written ca. 1981)
   */

  /*
   * If you have an ANSI compatible terminal then
   * #define ANSITTY. It makes the five Golden rings
   * especially tacky.
   */

  #include <stdio.h>

  char *day_name[] = {
      "", "first", "second", "third", "fourth", "fifth", "sixth",
      "seventh", "eighth", "ninth", "tenth", "eleventh", "twelfth"
  };

  int main()
  {
      int day;

      printf("The Twelve Days of Christmas.\n\n");
      for (day = 1; day <= 12; day++) {
          printf("On the %s day of Christmas, my true love gave to me\n",
                 day_name[day]);
          switch (day) {
          case 12: printf("\tTwelve drummers drumming,\n");
          case 11: printf("\tEleven lords a leaping,\n");
          case 10: printf("\tTen ladies dancing,\n");
          case 9:  printf("\tNine pipers piping,\n");
          case 8:  printf("\tEight maids a milking,\n");
          case 7:  printf("\tSeven swans a swimming,\n");
          case 6:  printf("\tSix geese a laying,\n");
          case 5:
  #ifdef ANSITTY
              printf("\tFive ^[[1;5;7mGolden^[[0m rings,\n");
  #else
              printf("\tFive Golden rings,\n");
  #endif
          case 4:  printf("\tFour calling birds,\n");
          case 3:  printf("\tThree French hens,\n");
          case 2:  printf("\tTwo turtle doves, and\n");
          case 1:  printf("\tA partridge in a pear tree.\n\n");
          }
      }
      return 0;
  }

3
pdpi 7 hours ago 3 replies      
If this sort of thing tickles your fancy, you might also want to read about Duff's Device.

https://en.m.wikipedia.org/wiki/Duff's_device
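
For reference, Duff's Device interleaves a do-while loop with a switch so that execution can jump into the middle of the unrolled loop body. A minimal memcpy-style sketch (illustrative only, not code from the linked article; it assumes count > 0, and unlike Duff's original it advances both pointers):

  void copy(char *to, const char *from, int count)
  {
      int n = (count + 7) / 8;          /* number of passes, rounded up */
      switch (count % 8) {              /* jump into the unrolled loop  */
      case 0: do { *to++ = *from++;
      case 7:      *to++ = *from++;
      case 6:      *to++ = *from++;
      case 5:      *to++ = *from++;
      case 4:      *to++ = *from++;
      case 3:      *to++ = *from++;
      case 2:      *to++ = *from++;
      case 1:      *to++ = *from++;
              } while (--n > 0);
      }
  }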

4
YSFEJ4SWJUVU6 6 hours ago 4 replies      
Neat `abuse'.

Personally I don't like what I consider an anti-pattern that he used in his refactored functions. I'm talking about doing this:

  if (condition)
      return true;
  return false;
instead of simply

 return condition;

5
peapicker 7 hours ago 1 reply      
My favorite abuse of switch statements in C is to use them for co-routines in C...

http://www.chiark.greenend.org.uk/~sgtatham/coroutines.html

http://blog.robertelder.org/switch-statements-statement-expr...
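
The trick in those links is to store a "point to resume at" in a state variable and let the switch jump back into the middle of a loop. A minimal sketch of a resumable generator in that style (illustrative names, not code from the linked articles):

  /* Returns 0..9 across successive calls, then -1 when exhausted. */
  int next_value(void)
  {
      static int state = 0;
      static int i;                /* locals must survive between calls */
      switch (state) {
      case 0:
          for (i = 0; i < 10; i++) {
              state = 1;
              return i;            /* "yield" i */
      case 1:;                     /* the next call resumes here */
          }
      }
      state = 0;
      return -1;
  }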

6
chiph 4 hours ago 1 reply      
Abuse of the switch statement? The firm bought a company and we found in their source code that they had created a 50,000 case-statement monster.

We asked that they refactor it, and they agreed because the sheer size of it was giving the compiler problems. So they divided it into two separate 25,000 case-statement monsters.

7
JoachimSchipper 5 hours ago 0 replies      
Where available, my preferred solution is to compile with gcc -Wall -Werror (or equivalent); this will enable -Wswitch, causing a compile error when you fail to handle an 'enum' case. Needing to fix a few obvious compile errors after a change isn't much of a burden; un-maintainable code is code that silently breaks on changes. (The same attitude leads to a lot of static_assert().)

(You may also be interested in -Wswitch-enum.)
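
A tiny illustration of what -Wswitch catches (assuming GCC or Clang; the enum here is made up for the example):

  enum msg { CMD1, CMD2, STATUS1 };

  void handle(enum msg m)
  {
      switch (m) {        /* note: no default case */
      case CMD1: break;
      case CMD2: break;
      }                   /* warning: enumeration value 'STATUS1' not
                             handled in switch [-Wswitch]; with -Werror
                             this becomes a compile error */
  }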

8
gorb314 17 minutes ago 0 replies      
I'm the author of libb64[1], which (ab)uses the switch statement in a similar way to implement a base64 encoder and decoder. This was inspired in turn by "Coroutines in C"[2], written by Simon Tatham.

[1] http://libb64.sourceforge.net

[2] http://www.chiark.greenend.org.uk/~sgtatham/coroutines.html

9
DannyB2 7 hours ago 3 replies      
No need to abuse the C switch statement.

#define while if // make code faster

#define struct union // use less memory

10
LeifCarrotson 5 hours ago 1 reply      
> If the implementation uses a form of jump-table then this [switch statement] has the benefit of giving you an O(1) performance based on the messages, whereas with the if-else-if chain, the commands will be checked and processed before status and errors.

The reason for using a switch statement in the first place was so that you'd have constant performance. The argument against the simple implementation was:

> Let's assume in v2.0 of the system we want to extend the message set to include two new commands and one new status message ... the if-else-if version will handle the change without modification, whereas the existing switch statement still treats the new commands as errors.

Treating the new commands as errors is a feature of the switch statement style, not a problem. They need to be added to the jump table, not implemented with reduced performance.

11
lisper 6 hours ago 0 replies      
Of course if you were programming in a language with proper macros you could just automatically generate the switch statement from the enums and have the best of all possible worlds.
12
mcintyre1994 7 hours ago 2 replies      
Of all the things I learned in kindergarten this was my favourite :)

In all seriousness, this is.. interesting! I had no idea you could have a case inside default like that.

13
gp7 41 minutes ago 0 replies      
1) All four versions are O(1)

2) Textually interleaving two implementations is bad no matter what language you're in

3) You could achieve the same effect with gotos without giving anyone a migraine... but it still wouldn't be worth it

14
adekok 6 hours ago 0 replies      
There's just no reason to have a web site that goes down when HN viewers go to it.

I run wiki.luajit.org on a VM, using Gollum. We got hit by HN a while back. CPU went from 1% to 2%.

People saying "CPU is commodity" tend to write software which uses all available CPU... for little purpose.

Heck, I worked at a company which did just that. "CPU / memory / disk is commodity, use it." And they did! Soon enough, all CPU / memory / disk was in use, and they had to go back and re-architect their software so that it wasn't crap.

It would have been cheaper to do it right in the first place. But religious beliefs about engineering overrode actual engineering.

15
greyman 7 hours ago 2 replies      
For me, the beauty is when it is apparent at first sight what the code does. This is just the opposite of beauty (in my eyes :-)).
16
technofiend 5 hours ago 1 reply      
People may disagree, but if the function valid_command_message already bounds-checks x, isn't the case statement redundant? Do you really want the same bounds check in two different places? Maybe in embedded programming you do, for safety, but it seems like a simple if test is sufficient.
17
janekm 6 hours ago 0 replies      
This is just terrible. The fact that the compiler complains if an enum is added without updating the switch statement is a feature. The enum and switch version is clear and explicit. To defeat it in the name of... what, code obfuscation? It beggars belief.
18
iainmerrick 5 hours ago 0 replies      
I guess the reminder that C's switch syntax is bizarre and flexible is amusing, but this is still really wrong-headed. Friends don't let friends write code like this!

We're told we have "process_command_msg" and "process_status_msg" functions. Each of those functions is already doing a switch or if/else to determine the exact message type. If you care about the cost of those lookups, the correct thing to do is have a single switch statement that handles all the messages at once.

If you don't care enough about the cost of lookups to split up those functions, you should stick with the obvious if/else test.

Another thing you could do to improve the design is to have a "get_message_type" function returning a "COMMAND_TYPE" or "STATUS_TYPE" enum, then switch on that. That can be made efficient if you really care about execution cost (for example, check a single bit, and inline it so the switch can be optimized).
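
A sketch of that shape, assuming the article's CMD1..CMD4 and STATUS1..STATUS3 enum values are contiguous (the type names and declarations here are made up for illustration):

  enum msg { CMD1, CMD2, CMD3, CMD4, STATUS1, STATUS2, STATUS3 };
  enum msg_type { COMMAND_TYPE, STATUS_TYPE, UNKNOWN_TYPE };

  void process_command_msg(enum msg x);
  void process_status_msg(enum msg x);
  void report_error(enum msg x);

  static inline enum msg_type get_message_type(enum msg x)
  {
      if (x >= CMD1 && x <= CMD4)       return COMMAND_TYPE;
      if (x >= STATUS1 && x <= STATUS3) return STATUS_TYPE;
      return UNKNOWN_TYPE;
  }

  void handle(enum msg x)
  {
      switch (get_message_type(x)) {
      case COMMAND_TYPE: process_command_msg(x); break;
      case STATUS_TYPE:  process_status_msg(x);  break;
      default:           report_error(x);        break;
      }
  }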

Abusing the switch statement not only gives you unreadable and bug-prone code, it simply isn't any better than a sensible approach.

19
DannyBee 1 hour ago 0 replies      
Meh, this is pointless; the compiler chooses if-ranges or a jump table regardless of which form you use here. But it probably makes you feel better :)
20
peter303 3 hours ago 1 reply      
Modal behavior should be implemented through subclassing, not switches or if tests. This has the property of consolidating all code for a class in the same code file rather than scattering it among larger pieces of code. Much easier to maintain and extend.
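
In C, the closest analogue to that advice is a table of function pointers, so each mode carries its own behavior instead of being dispatched by a growing switch. A minimal sketch (all names are illustrative):

  #include <stdio.h>

  struct mode {
      const char *name;
      void (*handle)(int x);
  };

  static void handle_command(int x) { printf("command %d\n", x); }
  static void handle_status(int x)  { printf("status %d\n", x);  }

  static const struct mode modes[] = {
      { "command", handle_command },
      { "status",  handle_status  },
  };

  /* Dispatch without a switch: the mode index selects the behavior. */
  static void dispatch(int mode_index, int x)
  {
      modes[mode_index].handle(x);
  }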
21
vram22 4 hours ago 0 replies      
Partly related:

Simulating the C switch statement in Python:

https://jugad2.blogspot.in/2016/12/simulating-c-switch-state...

Not a proper simulation of C's switch, has limitations mentioned in the post, just something I whipped up for fun.

22
noobermin 6 hours ago 2 replies      
So the main point here is we save some time with a direct jump at least for a few values of x? It's cool I guess but the extra confusion might not be worth it.
23
kayamon 3 hours ago 0 replies      
Many compilers now support a range selection extension on case labels, so you can just do:

  case CMD ... CMD_END:
      /* whatever */
      break;

24
qwertyuiop924 7 hours ago 0 replies      
I didn't learn this in kindergarten, but I did learn about Duff's Device in the 7th grade (as it turns out, the Jargon File makes for pretty good reading during boring classes).

Never had cause to implement it, though, and never saw this particular hack.

25
delinka 7 hours ago 2 replies      
Did I miss an explanation somewhere? Why are the `process_` functions referenced again in the default case? That has a code smell.
26
lend000 5 hours ago 1 reply      
Interesting ideas, but this bothers me: "Then refactored to:"

Refactored implies performance improvement to me, and inlining is almost always faster than the more modular representation that the author ended up with after "refactoring" (putting the comparisons in functions).

27
GFK_of_xmaspast 4 hours ago 0 replies      
This is a whole stack of horrible code under the banner of "might be more efficient", with no examination of what the code actually gets compiled into, much less any profiling.
28
posedge 4 hours ago 0 replies      
Well, I don't see the beauty :)
29
anjc 6 hours ago 0 replies      
Cool trick, but difficult to grasp at first glance. I'd personally rather see pages of 'if' statements if it makes the code easier to reason about.
30
neoeldex 7 hours ago 3 replies      
He's got a good point: why is C still the staple it is? But I believe that Rust is going to take its place. Not sure whether it's a good thing, but this trick isn't possible in Rust ;)

Rust would give an error when an enum isn't exhausted in a match clause. So the issue being described doesn't exist :D

15
JSMpeg Decode It Like It's 1999 jsmpeg.com
138 points by phoboslab  7 hours ago   47 comments top 19
1
cs2818 3 hours ago 4 replies      
The "Why Use JSMPEG" section hits the nail on the head with respect to the needs of some projects I have worked on.

In particular, I once needed to stream video from a UAV to multiple tablets and phones on a local network to allow collaborative annotation. Since it was a disaster response system, we couldn't depend on external services, which at the time put WebRTC out of the picture (all of the easy-to-use implementations required internet access to reach existing signaling services).

We ended up using MJPEG and then later a JavaScript implementation of a MPEG-1 decoder. This library certainly would have made my life a little easier at the time!

2
chadnickbok 3 hours ago 1 reply      
This is really cool! But the compression quality of MPEG-1 is really nowhere close to H.264.

If you're interested in this sorta thing, try taking a look at Broadway JS: https://github.com/mbebenita/Broadway

Here's a simple demo page they have setup: http://mbebenita.github.io/Broadway/foxDemo.html

It's entirely possible to use WebSockets to stream H.264 to a browser and decode it using Broadway, and the performance is pretty good, even on mobile.

3
peterhajas 5 hours ago 1 reply      
Video decoding in JS is very impressive - really highlights the speed of modern interpreters. I especially love that a Björk track from the '90s is featured.

I recently worked on a personal project which had to play back .webm files, and I used a similar utility:

https://github.com/brion/ogv.js/

It decodes .webm files and plays them on the web. I believe it's also used by Wikipedia to play back Ogg files.

4
Tade0 4 hours ago 0 replies      
I remember toying with this idea when I was doing some web-based slot-machine mobile games.

We had a ridiculous amount of assets (mostly animations) that had to be compressed, because one of the sales representatives noticed that it was impossible to play any of the games if you were connected to a 2G network.

Eventually we didn't go with this solution, because it considerably reduced battery life and made the devices heat up too much.

5
mborch 2 hours ago 1 reply      
Cisco's got a royalty-free codec that would let you decode like it's 2017. https://github.com/cisco/thor
6
yeureka 1 hour ago 1 reply      
This is very cool! In 1999 my final-year degree project was to implement an MPEG decoder in software. My only source of information was the MPEG technical reference manuals. It took me 3 months to be able to decode my first frame. It ran at less than 10fps on an AMD K6, but I learned a lot about video and compression.
7
Mizza 3 hours ago 0 replies      
This has also opened up some awesome new browser experiments, such as playing GTAV in the browser (on an iPhone!)

http://phoboslab.org/log/2015/07/play-gta-v-in-your-browser-...

8
rucas__ 3 hours ago 0 replies      
The talk about JSMpeg is an awesome learning experience. https://fronteers.nl/congres/2015/sessions/jsmpeg-by-dominic...
9
dalanmiller 48 minutes ago 0 replies      
It is incredibly hard to stream video from, say, a Raspberry Pi to the web in a way/format that's easy to consume on multiple devices or using just the browser. This is awesome.
10
eriknstr 6 hours ago 2 replies      
I notice that the demo video stops when I switch tabs. Is this by design or by accident? Can I use JSMpeg but have it play in the background? Another thing; can I have video controls?

PS: That video has a very late nineties, early double-ohs feel to it indeed. Good choice of video :)

11
z3t4 1 hour ago 0 replies      
Imagine all the cool stuff you can build with this. And it's JavaScript!!!
12
Retr0spectrum 5 hours ago 4 replies      
This raises the question: which codec is optimal in terms of "CPU cycles to decode per second" for a given image quality (for some subjective measure of quality)?
13
VeejayRampay 1 hour ago 0 replies      
Combine this with webtorrent and you have yourself a real nice distribution platform.
14
tambourine_man 5 hours ago 0 replies      
A stack that you can understand from top to bottom.

Love it

15
edwinyzh 4 hours ago 2 replies      
What does "Decode It Like It's 1999" mean? Thanks.
16
ilurkedhere 5 hours ago 0 replies      
How power efficient is this? Just curious because iOS was mentioned as a target.
17
adrianN 4 hours ago 0 replies      
This stutters a bit on my 2009 Macbook. If I weren't plugged in it'd probably kill my battery really fast.

It's a nice hack to get this kind of video on the iPhone, but somehow I feel that for decoding video JS is a bit high on the stack.

18
brian_herman 6 hours ago 1 reply      
Sounds like the Internet Archive would love something like this.
19
streamer45 5 hours ago 0 replies      
The main problem I see is the bandwidth requirement to keep decent quality. That said, a very interesting project and well executed.
16
Introducing PureScript Erlang back end nwolverson.uk
129 points by pka  7 hours ago   23 comments top 5
1
proaralyst 6 hours ago 2 replies      
Looks like we just got compile-time type safety and higher-kinded polymorphism on the BEAM VM.
2
mwcampbell 4 hours ago 2 replies      
I wonder how feasible it would be to develop PureScript back-ends for Swift (for the Apple platforms), Java (for Android), and C# (for Windows).
3
Vosporos 5 hours ago 1 reply      
That's quite interesting! Perhaps you already know it, but it may be interesting for functional programmers who don't know Erlang: anonymous functions are not as efficient as they would be if they were declared as named functions (I read that it was because they're not inlined by the compiler; maybe someone with more experience could explain the internals of the compiler? =)
4
the_duke 5 hours ago 2 replies      
Why not compile directly to BEAM bytecode?
5
didibus 5 hours ago 1 reply      
Could PureScript become a more viable and practical Haskell?

I'd love to see it evolve to target more platforms; Python next, maybe?

17
Hacker News Highlights: January 2017 ycombinator.com
71 points by craigcannon  4 hours ago   8 comments top 4
1
WA 1 hour ago 0 replies      
If I tell people about HN, I always say something like this:

"HN is this news aggregator. If someone posts a link to a story about the mars rover, or a breakthrough in fusion reactors, or anything like that, you can be sure to read a comment from someone on the inside with additional insight, like 'yup, I've been on that team and this is what really happened ...'"

The comments are the beauty of HN (most of the time). That's why I keep coming back. Thanks HN!

2
saycheese 2 hours ago 1 reply      
Might be wrong, but this is more like HN comment highlights. Here are the top stories for January 2017: https://hn.algolia.com/?query=&sort=byPopularity&prefix=fals...

Also, it feels strange to highlight comments/posts that are now in some cases locked; that is, further comments/votes are not possible.

3
strgrd 2 hours ago 1 reply      
I'm still trying to get BoHN off the ground... reddit.com/r/BoHN
4
minimaxir 4 hours ago 2 replies      
> Two MDs who are HN users discover they agree.

HN users agreeing is indeed cause for celebration. :P

18
Probabilistic Models of Cognition probmods.org
116 points by pizza  7 hours ago   12 comments top 4
1
pinouchon 4 hours ago 0 replies      
I'd like to mention that the research center behind this, the Center for Brains, Minds, and Machines (CBMM), has a great youtube channel: https://www.youtube.com/channel/UCGoxKRfTs0jQP52cfHCyyRQ

Some of my favourite videos:

- Neural Representations of Language Meaning - Tom Mitchell https://www.youtube.com/watch?v=pRBf8BWAG3k

- Computational cognitive science - Josh Tenenbaum (co-author of probmods) https://www.youtube.com/watch?v=2WQO9e5Mdj4

2
tbenst 6 hours ago 2 replies      
Just a heads up that Church has now been replaced by WebPPL: http://webppl.org.
3
Tloewald 6 hours ago 1 reply      
Very interesting and well presented. If it were me I would try to provide a few concrete examples, ideally with figures, in the introduction but I'm a visual person.
4
nrjames 6 hours ago 2 replies      
I really like the concept but I don't really want to learn a niche programming language to interact with the site.
20
LCFS: A New Container Filesystem for Modern Datacenters portworx.com
32 points by old-gregg  3 hours ago   8 comments top 4
1
davexunit 1 hour ago 3 replies      
How many more file systems are we going to see before we realize that the file system is the wrong layer of abstraction to be solving the deduplication problem?
2
CSDude 1 hour ago 0 replies      
Really nice to see new file systems addressing this issue, but, as someone who like many others was hurt by AUFS, I am being cautious. Do you think it is battle-tested enough? Since it is different from the other merging drivers and works at the block level, is it a little more dangerous than them as well? Not trying to be negative; I really liked it. I am not even happy with docker load performance on my overlay2 setup, and will definitely be trying LCFS.
3
softwarelimits 53 minutes ago 0 replies      
Does anybody know of a good deduplication script / tool that runs on every filesystem and can be used as a nightly cronjob?
4
gourao 2 hours ago 1 reply      
Hi, I am one of the contributors on this project, and happy to answer any questions... LCFS will also be supporting the snapshot driver discussed here - https://github.com/docker/containerd/pull/484.

Currently, the install experience is not the greatest... We rely on the Docker v2 plugin interface and that is still in beta. We are working with the Docker team to help us smooth out some of these rough edges.

LCFS will also support other container formats and any help from the community is much appreciated.

21
Announcing gopass A 'pass' compatible password manager for teams justwatch.com
88 points by MetalMatze  6 hours ago   59 comments top 16
1
zx2c4 5 hours ago 5 replies      
> Second, with all due respect to the original author zx2c4 (who is currently working on the very promising WireGuard project)

Indeed I am working on WireGuard. But I haven't forgotten pass. We're currently working on a new release.

> the project proved to be a wild bunch of hotwired bash scripts that mostly looked like they were written as a one-off job

I very much disagree with this silliness.

2
VA3FXP 5 hours ago 1 reply      
I'm pretty excited about this, actually. Thank you so much for your efforts. I've been using pass for a while now, and I really love what it does, but it's a case where it feels 90% finished.

I have one desperate request: colour output as an option. Every time there is an update to pass (or I need to reinstall), I need to edit the file and change the options from "tree -C" to "tree -n".

This is a pain in the ass. I am visually impaired, and the 'default' dark blue that tree uses for directories is unreadable to me.

My two choices for dealing with this are to use DIRCOLORS or to edit the pass executable. I'd prefer not to muck about with my environment settings (as I do not normally see any colour output).

Anyway; awesome project!

3
aeorgnoieang 43 minutes ago 1 reply      
I'm currently using Pass on most of my computers.

I was using Bruce Schneier's Password Safe (and various compatible apps: pwSafe on Mac and iOS and Password Gorilla on an older Mac and Linux boxes) and the big pain for me was merging changes.

I'd found that trying to use a single 'safe' via Dropbox was a recipe for disaster because some of the programs wouldn't cleanly close the safe file, or at least one of them would occasionally complain about the state of the file. So I created a copy of the safe for each of my computers and devices. Then every 6 months or so I'd merge all of the safe files into a single file and recreate all of the device-specific files as copies of that single file.

But merging in Password Safe sucked. There was no way to review the differences between entries in different safes, other than manually inspecting entries. I don't believe either of the two versions (Mac and iOS) of pwSafe (both version 1) supported merging at all. Password Gorilla was actually the best among the bunch as it had a nice 'diff' window with which you could explicitly pick which version of several fields for an entry you wanted to retain. But sometimes I couldn't get its 'diff' window to fit on my screen so I'd have to plug my laptop into a larger monitor.

Using Pass with Git is so much easier.

I've also been using git-remote-gcrypt[1] to push my local Pass repos to a shared 'remote' file stored in Dropbox. It works great.

[1] https://spwhitton.name/tech/code/git-remote-gcrypt/

The only painful aspects of Pass now are the weird behavior of `gpg` on Windows in Cygwin and the clunkiness of my current multi-repo setup. Hopefully running Pass under "Bash on Ubuntu on Windows" will mitigate the former. Given that Pass is written in Bash and that the various repo config settings are read from environment variables, it doesn't seem likely that the latter will get much better than my current setup, which involves sourcing a script to switch the relevant environment variables.

4
dkonofalski 1 hour ago 1 reply      
Really weird that there's no mention of 1Password and its flexibility through both 1Password for Teams and 1Password Shared Vaults. I work with 2 different teams that share passwords like this and one of them has a shared vault that syncs through Dropbox for everyone while the other one is managed through 1Password for Teams so that we can update passwords and access at our discretion.

I'm very curious to see how this will stack up against those solutions because, to be honest, there is very little room for improvement from 1Password, in my eyes. They have a very, very solid and secure product and the UI is fantastic.

5
endymi0n 6 hours ago 3 replies      
We're the team behind gopass, and we worked hard over the last few weeks to bring you this - happy to hear any feedback!
6
ejcx 3 hours ago 1 reply      
Shameless self-plug: I actually built a project called passgo, also modeled after Jason's pass project. I'm a huge fan of pass' simplicity, but I really dislike managing keys:

https://github.com/ejcx/passgo

The difference is mine does not use PGP and is instead password based, but the command line interface is almost identical. I now use passgo to encrypt and manage my ssh keys, etc.

7
rkeene2 2 hours ago 0 replies      
I have a similar project that uses smartcards/HSMs/anything with a PKCS#11 interface called "hunter2":

https://chiselapp.com/user/rkeene/repository/hunter2/

8
LamaOfRuin 5 hours ago 0 replies      
Should tab completion work with gopass? Seems pretty unusable to me as a general purpose password manager without it, but it doesn't work for me out of the box with a `go get` source install.

Edit: found it in the GitHub readme: https://github.com/justwatchcom/gopass#autocompletion

9
OJFord 3 hours ago 0 replies      
This looks great, I've been toying with something similar built on top of Keybase that I just use for personal passwords, but using KBFS means it should be simple to extend it to shared passwords, since you just store it in the `private/me,you` directory instead of `private/me`.

My concern with using something like this or pass is that I have to manage the distribution/backup of the store/vault/db myself - whereas I can throw my laptop off a cliff, buy a new one, login to Keybase, and my passwords are still there.

10
draven 4 hours ago 0 replies      
I just spent a few hours migrating our KeePassX database to pass and making the team switch to pass.

Gopass seems great, especially the multistore support (which you can do w/ pass by setting an env variable), thank you for your work!

11
praseodym 3 hours ago 1 reply      
There's a Chrome+Firefox browser plug-in which also uses a native binary written in Go: https://github.com/dannyvankooten/browserpass/blob/master/br...
12
tyingq 5 hours ago 2 replies      
Pretty neat, and addresses a space where no tools seem to exist.

Would be cool if it could leverage a GitHub public repo for password updates. Something like using the list of collaborators on a repo, iterate over their GH public keys, and push new encrypted files for each collaborator on the repo.

I suppose though, this would leak a lot of metadata on how the tool is being used, and would tie it too closely to GitHub vs just git.

13
homakov 3 hours ago 0 replies      
Any example of a company where a few people badly need to share passwords and there is no "team" to which you can add new users?
14
WhitneyLand 5 hours ago 5 replies      
So is this intended to be Unix-focused, not something that could replace LastPass, for example? I've been looking for something better than LP.
15
zeveb 4 hours ago 3 replies      
> There is one slight drawback to all the simplicity, and that is an information disclosure inherent to the design: pass stores all folder and file names in clear text, so even if you fully trust GPG, you should probably not put this repo into a public place like Github, because this may expose your account names and other metadata.

This is my concern with pass. It's an awesome tool, but it really needs to figure out a way to hide the filenames. I think this is doable (after all, encfs has the same need, and does it well), but I don't know if the pass team have the will to do it.

> First, the project is curated in a traditional mailing-list based approach that was pretty unapproachable compared to a modern Github based workflow.

Sigh, not this again. I think that I prefer email vice a proprietary, centralised single point of failure like GitHub, and I know that I'd rather not work with someone who considers email unapproachable.

If your email account is unmanageable, fix it. Email's a really, really valuable tool; don't let go of it.

16
Daviey 5 hours ago 1 reply      
A comparison between this and Vault would be super useful.
22
Computer Science Education google.com
339 points by dr_linux  15 hours ago   211 comments top 14
1
duderific 4 hours ago 4 replies      
> Computers are everywhere in our world today and being an educated citizen requires an understanding of the fundamentals of computer science and its underlying problem-solving methodology of computational thinking.

I know this is just some copywriter writing this, but this seems the height of Silicon Valley bubble thinking. Plenty of people do just fine in their lives barely interacting with computers at all.

Sure, it's helpful to be able to use email and a web browser, but "being an educated citizen requires an understanding of the fundamentals of computer science"? Come on.

2
xiaoma 12 hours ago 1 reply      
Are they offering anything on the site for people who want CS education? From clicking around a bit it seemed like it was focused entirely on people teaching and learning in physical classrooms.

I was pretty excited at first, thinking this was something like a Google-sponsored version of EdX or OpenCourseware that was laser focused on CS.

3
defnfoo 7 hours ago 2 replies      
Meanwhile, Google discriminates against older people ("old" being pretty much anyone in their late 30s). They are so inclusive, and so diverse... as long as they get to decide what diversity and inclusivity are.

http://www.businessinsider.com/google-loses-ruling-in-age-bi...

4
strathmeyer 2 hours ago 3 replies      
I graduated with a CS degree in 2004 from Carnegie Mellon University, and Google didn't give a shit about me until 2011, when they told me they would only hire me if I were working for another local company first. The other local companies didn't want to hire me, because they thought I would just go work for Google. I never met anyone at Google who cared that I was a good programmer or had devoted my life to it. They seemed focused on stealing workers from their competitors. If they want to hire computer scientists, then they should tell their employees to focus on that in the hiring process. If they have someone with a Computer Science degree who is a great programmer and wants to work with them, maybe they should find the time to speak with them or to explain how they could get a job in computer science or programming.

My experience is that Google thinks a Computer Science degree is worthless. It can't get your foot in the door to speak to them; you have to be a race or gender they are looking for.

5
booleandilemma 54 minutes ago 0 replies      
Computer Science will end up like Math. Lots of adults today learned geometry, trigonometry, and calculus in school. If you ask them how much of it they use on a daily basis, most will tell you they don't use any.
6
AtomicOrbital 2 hours ago 0 replies      
People need to learn how to learn first ... the Trivium ... grammar, logic, and rhetoric ... after a successful baseline there, they can teach themselves CS or whatever ... too-early exposure to CS just continues the trade-school factory mentality prevalent in current pre-college education.
7
chillydawg 11 hours ago 3 replies      
I look at that page and all I see are icons for Word, PowerPoint, Access and Excel in a line. My brain has been hard-branded for so long. Google need to change that page a bit.
8
bruceb 11 hours ago 3 replies      
We replicated a CS degree path which you can fill with free MOOCs: https://www.coursebuffet.com/degree

Admittedly we have left our course list dormant for a while but we will be pushing an update in a few weeks.

9
fillskills 6 hours ago 0 replies      
With all of Alphabet's resources and data, this is all they could manage for all of Computer Science? What's the data/measurement behind this?
10
0xFFC 13 hours ago 3 replies      
I don't want to be negative here, but I think anyone who can convince the Berkeley guys to re-activate this[1] channel would do humanity a much, much greater service.

I am saying this because this channel changed my life, literally. I live in a third world country and went to a local college where you learn nothing after 4 years (not even how to write a simple hello world, TBH). But after watching and learning from this channel and MIT OpenCourseWare, right now I am working on the internals of the Linux kernel as a hobby project (just think about not being able to write hello world after 4 years, and compare that with hacking the Linux kernel, to get a feeling for how far I came). I came this far only by watching various classes (from OS to compilers, algorithms, etc.) and solving their assignments and writing their projects. I came this far on my own (without any help other than the free material in the resources I mentioned). I am 100% sure I will go further and further, because this is my thing; I may not be that smart, but I am resilient.

So I am a perfect, live example of how free educational material can change people's lives, and I live in a very small city (indeed very small; most people there were not familiar with computers until 3-4 years ago).

I had a 40 KB connection; sometimes I spent a whole day waiting for a lecture to download so I could watch it. Considering I also had to use a VPN, you can understand how hard it was to download a whole class from YouTube. Right now I have a better connection.

[1] : https://www.youtube.com/user/UCBerkeley/videos

11
supergeek133 4 hours ago 0 replies      
I want to know when we have the conversation about how every developer doesn't need to be a college graduate.
12
derrickdirge 9 hours ago 1 reply      
I was really hoping this was going to provide some path to completing my CS degree.

At least they're acknowledging the importance of education.

13
zizzles 3 hours ago 0 replies      
I glanced at the content for 30 seconds. This does not look anything like a computer science education: "Pencil Code", "Blockly", "Coding Adventures", "Craft Small Projects in HTML/CSS". These are just glorified baby games for 5-year-olds, or at the very most a dumbed-down introduction to basic development for girls and malnourished Indian children.
14
lineindc 13 hours ago 6 replies      
Instead of applauding this (oh, they are spreading knowledge), I'm going to point out that they just want cheaper labour.
23
From PS4 to 1.44 MB Floppy: Porting Retro City Rampage to MS-DOS [video] gamasutra.com
174 points by adgasf  10 hours ago   20 comments top 6
1
broahmed 8 hours ago 0 replies      
There's also a fascinating video of him getting the core of the game to run on NES spec constraints (think 8-bit CPU, 10KB of RAM):

https://youtu.be/Hvx4xXhZMrU (~11 mins)

It offers insight into how old-school game developers worked within the limitations of the hardware they designed for.

2
endergen 9 hours ago 2 replies      
Brian Provinciano is a coding machine, addicted to running things in constrained environments.
3
badsectoracula 8 hours ago 3 replies      
I love writing code for old PCs. A couple of years ago I wrote a 3D maze engine in C that I wanted to run on the original IBM PC I was building (here is a video of it running on a 286, https://www.youtube.com/watch?v=tfbQIvRYph4, with turbo turned off; I also managed to make it run on the IBM PC, but that was some months after I built it and I didn't take a video). The renderer basically rasterizes sideways trapezoids into a column edge (top/bottom) array and then draws it (the height also acts as a one-dimensional depth buffer). It took me a while to make it run at interactive speeds; originally it needed several seconds to draw each frame. I tried a bunch of methods and ended up generating machine code for drawing 4 columns (CGA packs 4 pixels in one byte) for different heights in one go, with a small post step to fix (clear/draw) the few individual pixels for each column in the batch, and keeping the heights and colors from the previous frame so that only changed columns are drawn. I wanted to make some sort of turn-based dungeon crawler, although it would be a sci-fi one set on the moon :-P. I have moved since then and my IBM PC is still in boxes since I do not have much space available (and I might move again soon, so I don't want to unbox it because I had professionals package it to avoid any damage).
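
A minimal sketch of the one-dimensional "column depth buffer" idea described above, where each screen column remembers how much of it is already covered so nearer geometry occludes farther geometry (all names and sizes here are illustrative, not from the original project):

  #include <string.h>

  #define SCREEN_W 320
  #define SCREEN_H 200

  static unsigned char col_top[SCREEN_W];    /* first free row per column */
  static unsigned char col_bottom[SCREEN_W]; /* last free row per column  */

  void begin_frame(void)
  {
      memset(col_top, 0, sizeof col_top);
      memset(col_bottom, SCREEN_H - 1, sizeof col_bottom);
  }

  /* Draw one wall column front to back, clipped to whatever is still
     uncovered, then mark the column as fully covered. */
  void draw_column(unsigned char *fb, int x, int y0, int y1,
                   unsigned char color)
  {
      if (y0 < col_top[x])    y0 = col_top[x];
      if (y1 > col_bottom[x]) y1 = col_bottom[x];
      for (int y = y0; y <= y1; y++)
          fb[y * SCREEN_W + x] = color;
      col_top[x]    = SCREEN_H - 1;  /* nothing left to draw here */
      col_bottom[x] = 0;
  }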

I also made a small "hunt the wumpus"-ish game for CGA: http://runtimeterror.com/games/cgacave/ - this time the shots are from my IBM PC and not the 286. Also, as a bonus, a small program I wrote in Delphi 1.0 on Windows 3.1 to create the tiles :-). I've actually done a bunch of stuff in Delphi 1.0 because it is a nice middle road between retro and modern (considering that I do a lot of "serious" stuff in Lazarus anyway). Last year I wrote a 3D editor for Windows 3.1 in it (http://i.imgur.com/eG34QXV.png and http://i.imgur.com/BZz6f9l.png - the second one took a 486 laptop I have here some hours to render :-P).

I have a bunch of other stuff on my YouTube channel; most are random things I'm working on, but I also have a few retrocoding works, like another 3D maze in VB1 https://www.youtube.com/watch?v=CxhXjkogahs (I extended that a bit later http://imgur.com/gJXwCoj but got bored after a while) and an Ultima-like engine https://www.youtube.com/watch?v=vXRjdbUjZX0 (this one has its own scripting language, image editor and map editor too - you can see them here https://www.youtube.com/watch?v=W70_G9LeByE), which I've managed to run on my 286 too with an IBM CGA monitor (the 286 has an EGA-compatible graphics card with a CGA compatibility mode... sadly it is 16-bit ISA and doesn't work on the IBM PC, so I'd like to have an IBM EGA card - and an IBM EGA monitor, but those are very rare and even more expensive).

Bonus photos from my IBM PC (without the monitor, i took those before i finished building it):

http://i.imgur.com/506cuFP.jpg

http://i.imgur.com/4tEvp24.jpg

http://i.imgur.com/8bz9IdR.jpg

Also, I have and collect a bunch of old development software (mainly from eBay), including Borland C++ 5, C++ Builder 1, Visual Basic 4, Visual Basic 5, Klik&Play (including the manual), Delphi 2, JBuilder and some other things I forget. I usually image those to play around with in VMs and keep the disks on my shelf.

Interestingly enough, I found a couple of those actually useful (specifically Borland C++ 5 and C++ Builder) and I'm now using them beyond just playing around (mainly Borland C++ 5, because the IDE is lightning fast for compiling C code). Also, I wrote a few patches for old games with C++ Builder, which made it ideal because of the small executable size and being able to design the window visually :-).

4
gravypod 7 hours ago 1 reply      
I'd love to see this rewritten in C++ using that transpiler written for that talk at the last CppCon.

https://www.youtube.com/watch?v=zBkNBP00wJE

5
panda-panda 8 hours ago 1 reply      
Really great to see someone doing this. Game developers often fail to consider performance constraints.
6
n1tro 7 hours ago 0 replies      
Lots of knowledge in this video; amazing work! Wish I had the time to pursue these kinds of projects.
24
The woes of Windows 10 economist.com
96 points by hourislate  6 hours ago   269 comments top 35
1
hfsktr 3 hours ago 10 replies      
I must be in the minority. I haven't had a single issue with Windows 10 (if we disregard privacy). Literally everything has just worked.

I will say, thinking about it, that my computer is used 99.99% for gaming/internet/text documents. I keep anticipating that something is going to go horribly wrong, but so far it's been great.

I hate the menu and the store it shows, so almost everything I tend to open has a shortcut on the desktop or is pinned to the taskbar.

Having a functional taskbar on both monitors: that right there was all I wished Windows 7 would do, but otherwise I can barely tell that the OS is different.

I never used Windows 8, so I don't have a comparison to that, but I've seen Windows ME...

2
akappa 5 hours ago 14 replies      
I think there's an obvious and more general question here to be answered: if a piece of software is good enough, namely, it "disappears" when you are trying to do your job, then why bother to change it?

Yes, the new version might enable a more efficient workflow, or might be faster to boot-up.

But people hate change, and you need to invest considerable time in upgrading a piece of software (the upgrade itself + learning to navigate the new system + solving whatever goes wrong during the process). And this is not just limited to "novices": I've heard wonderful things about Arch Linux, but my Ubuntu system works well enough and I've been using it forever, so the incentives are pretty low and the time to be invested pretty high.

3
jacquesc 2 hours ago 3 replies      
I got a Windows 10 laptop (Razer) for Christmas for gaming / media / browsing purposes (been a Mac user for 10+ yrs).

It's buggy as hell. The shortcut icons on the desktop and taskbar kept going away (replacing all the icons with a "not found" placeholder), forcing me to reinstall the OS. Now none of the search functions work (e.g. windows key -> search for app.. no results). Whatever, I just use 2 apps anyways (Chrome and Steam).

Windows settings panels have at least 2 diff ways to configure anything, the "old" and the "new". Graphics settings have 4 places to change stuff (Intel, Nvidia, built in old and new). It's just a jumbled mess.

Bluetooth never recognizes device names (it's a guessing game as to which "Unknown Device" is the one I want). The trackpad is mediocre, and the config panel from Synaptics probably hasn't changed since Windows 95.

With the tech press in love with Windows 10, I was definitely expecting more. Feels like people just want to love every other version of Windows (7 "good", 8 "bad", 10 "good again").

4
Unbeliever69 5 hours ago 6 replies      
I am a manager at a small architecture firm and Windows 10 has brought nothing but woe to my office. We have spent dozens of hours and thousands of dollars troubleshooting and fixing problems (many of which remain unresolved) that occurred as a result of this draconian forced upgrade (and the updates that followed). So what if we got the upgrade for free? I've paid for it many times over in labor!
5
frik 5 hours ago 3 replies      
No wonder Win10 fails like Win8: Microsoft doesn't care about users, they only care about their shareholders. Win8 and Win10 look like they were designed by color-blind designers with bad taste (the brutalism of UI design), plus they have phone-home spyware features that cannot be deactivated. Windows 7 is too good and the perfect OS; if you don't install some recent spyware updates, it will last at least until 2020, and who knows if Android/Fuchsia or whatever OS will be a proper alternative by then.
6
elorant 19 minutes ago 0 replies      
I had a rather odd problem with Win10, which is that it hurt my eyes. I don't know whether they changed something with ClearType or it was some other reason, but after two days I reverted to Win7.

Other than that, I really didn't like the UI. To change colors in menus you have to use a hack, because the preselected list of colors is just dreadful; the Start menu once again is all over the place; and there are various other mishaps that give the impression it's not a finished product. They try to build a unified OS for all devices when all I want is a simple desktop OS.

7
casparz 2 hours ago 1 reply      
Stopped reading at:"There is no question that Windows 10 is an impressive piece of software, and quite the most secure operating system ever devised."
8
nul_byte 6 hours ago 6 replies      
I have a dual boot PC which is Arch Linux and Windows 7.

The Windows 7 machine is only for gaming/Steam; I never browse the internet on there and have a pretty strict firewall ruleset. To be honest, I would consider upgrading it to 10, but I missed the free window, so I really don't fancy paying for it.

My hope is that by the time that Windows 7 goes EOL, I can delete it and game just on Linux. Either that or I will get a console (as most of my gaming takes place in my living room anyhow, using a steam box).

9
ZeroClickOk 4 hours ago 1 reply      
I don't know why people call Windows 10 a fail when the biggest competitor of Windows is... Windows! Mac and Linux are mostly used by niche users (like us). I'm really, really happy we CAN upgrade to a new Windows version easily; just compare with Android. More: the reason most people don't upgrade isn't that "Windows 7 is better than Windows 10"; the actual reason is that they are comfortable with Win7 and don't want to (or can't) take the time to "learn" a new OS. Even Win7, being an awesome OS, took years to win over most XP users (even today there are a lot of loyal XP users out there...)
10
shmerl 5 hours ago 4 replies      
> But he has also dusted down his four-year-old Apple MacBook Pro and upgraded his Windows 7 desktop to the latest version of Linux Mint rather than Windows 10. <...> Despite their idiosyncrasies, Macintosh and Linux have never looked so attractive.

Linux gets refugees both from Windows and MacOS because MS just keeps messing Windows up (seems to be happening every other version), and Apple simply lets MacOS rot by not giving it any attention.

11
digi_owl 5 hours ago 1 reply      
Win10 fails for different reasons than Win8.

While 8 tried to make the tablet the priority use case, thus breaking the muscle memory of many long-term users, 10 abandons the notion of the owner of the hardware being in charge (a notion that is sadly eroding in FOSS circles as well).

Thus you have things like updates being ramrodded through even if the user says no, and keep saying no.

Damn it, I keep having to roll back the driver of my iGPU because for some reason MS keeps trying to update it along with the dGPU. This even though support has been discontinued by AMD...

12
bayeslives 2 hours ago 1 reply      
I've been using both Windows (work) and Linux (home) on the desktop for many years. I far prefer Linux (I tried many distros, and have settled on Arch and Mint now). Package management and window management are way better in Linux, to name just a few things.

To me, Windows 10 is not better than Windows 7. The package management is still a mess. Forced updates, no rollbacks, reboots, you name it. Permissions are also strange: why are admin rights needed to install some fonts? We still have this "Registry" with its voodoo. We still miss tools like dmesg. When things go wrong, Windows gives some sort of simple log file, of course not in txt form but in some proprietary format that needs its own reader (to read some flat text, for crying out loud). The file manager ("Explorer") is still confusing. Microsoft seems hell-bent on obfuscating where files are stored. Going to the terminal, the difference is even clearer. Linux offers first-rate terminals; the Windows one looks like an afterthought. Is it even possible to use a terminal full screen in Windows nowadays? Tab completion, a decent history, pipes?

The funny thing is that Ranger, my favorite file manager, is able to give previews of files in MS formats (such as .docx) where Windows Explorer is unable to do so.

Two big steps back are privacy and font rendering. It's bad enough that apps spy on me; I don't want my OS to "phone home". The font rendering is optimized for the few Windows-on-mobile users, at the expense of the bulk of the users (like me) who use it on the desktop and are now confronted with blurry fonts.

My OS must help me with my personal computing tasks. Personal means a) I want to customize the hell out of it, and b) my data is _mine_. Also, I'd like a nice readable font on my screen. Sadly, Windows 10 does not meet those requirements.

13
webwielder2 6 hours ago 1 reply      
It's bizarre that people continue to think of Windows PCs as being predominantly owned by individuals or even individuals with a specific interest in technology.
14
cwyers 5 hours ago 0 replies      
The thing is, people don't upgrade Windows, by and large. They use the version of Windows their computer came with until they replace it. Microsoft has done more work getting people to upgrade to 10 than for any previous version of Windows. I don't see what more they can do.
15
dcdevito 2 hours ago 1 reply      
Windows 10 gets a lot of flak, but it's honestly the best OS I have ever used. It's fast, stable, reliable and secure. Oh, and I should mention I'm a developer, and WSL is getting better with every release. Is it perfect? Heck no, but it's sure a lot better than Linux on the desktop and Macs are just not worth my money anymore.
16
rwc 5 hours ago 3 replies      
> Chromebooks are now outselling MacBooks in the crucial education market, where long-term preferences tend to be established.

Apple has historically held a dominant position in the education market, so how does one square the "long-term preferences tend to be established" portion of that sentence with the 8% marketshare MacOS commands?

17
dmalvarado 3 hours ago 1 reply      
Well that was weird.

tldr;

Windows has 92% OS share. Windows 10 only accounts for 24% (maybe).

Windows 10 is so, so secure. How to make people upgrade?

Windows 10's new update will have advertising. Other OSes sure look good now.

I feel like this section of The Economist doesn't get much editorial review.

18
nxrabl 5 hours ago 4 replies      
> There is no question that Windows 10 is ... quite the most secure operating system ever devised.

Telemetry aside, can anyone comment on this?

19
ajmurmann 3 hours ago 0 replies      
Do users actually desire changes to the actual user experience of modern operating systems? I would be totally fine with the OS experience I had on Windows 2000 or OS X from 2010. Software that runs on top of the OS, and support for it, is a different issue. Of course there's also security and driver support. To me, the OS user experience is a solved problem, and most changes to what I'm used to just make things different without providing value, while requiring me to retrain muscle memory. I do upgrade regularly because of security. If that weren't an issue I would never upgrade.
20
pdog 4 hours ago 0 replies      
Windows 10 is a great operating system, but if you have an existing business-critical workflow, why would you update your major version? Stick with what works unless you have a reason to update. It's why some multinational companies are still running COBOL programs after nearly sixty years.
21
mark-r 4 hours ago 6 replies      
Microsoft is their own worst enemy. My wife can't transition fully from XP to the Windows 7 PC I built for her, because it doesn't have Outlook Express or any reasonable substitute. We can't upgrade the living room PC because Windows 10 doesn't include Media Center.

I'm also not fond of the way Windows 10 updates are so aggressive that you'll essentially be replacing your OS every so often without any say-so. If I had any faith that this would be painless I might be tempted to try it, but see above. Windows 7 is darn near perfect; you can pry it from my cold dead hands.

22
yokohummer7 4 hours ago 0 replies      
IMO the success of Windows 7 was an exception, not the norm. Comparing the adoption rate of Windows 10 to that of Windows 7 might be a little harsh. 24% isn't actually that bad, considering Windows 7 is still good enough.
23
rkapsoro 3 hours ago 0 replies      
I'm normally very fond of The Economist, but this is one of their "blog posts", which don't seem to receive quite the same careful, nuanced attention that the articles in their weekly edition do.

Part of the challenge of Windows 10 was the transformation of Windows from a large-release product into a trickle-release service, with much faster cycle times and an incremental, continuously-learning approach to ongoing product changes.

At its core, that's a valid and (I think) mostly welcome change, but Microsoft's solution to the impedance mismatch between the two categories of offering was to force it through and ask for forgiveness later. :/

24
boznz 4 hours ago 0 replies      
Forget any other reason: at $150 a licence (NZ$199), my other computers will never, ever see Windows 10... ever.
25
nyolfen 4 hours ago 0 replies      
I must give Windows 10 credit for one thing -- it finally convinced me to dive into Linux.
26
Animats 2 hours ago 0 replies      
The basic problem with Windows 10 is that it was intended for tablets/mobile first. On a desktop, there's just not much need for it.
27
k__ 4 hours ago 0 replies      
I installed it yesterday for the first time and it still seems to have problems.

Somehow Chrome freezes every few minutes. It didn't do this on Win8.1 or Win7 :/

28
0xFFC 5 hours ago 5 replies      
Just fix the fucking font rendering. I used a Windows machine as my workstation for 4 years (despite being in love with Linux), but I had to switch to Ubuntu because of the ridiculous font rendering. I don't have money to pay for a new monitor every 2 years, and I was quite happy with my dual 27-inch 1080p monitors until Windows 10 changed font rendering to DirectWrite. It is a fucking disaster on a low-DPI monitor.

As always, Microsoft ignored their current users in pursuit of new users. I haven't seen this attitude from any other company. I haven't seen Apple make font rendering on old MacBooks worse to force people to switch to new MacBooks, nor have I seen anything similar from Google.

This is a fucking total disaster.

As someone who reads text the whole day (programmer), I care about my eyes, and it fucking hurts them to see the font rendering in the new UI (UWP) and Edge.

p.s. Old/Win32 applications do use the old rendering engine and are acceptable.

29
youdontknowtho 3 hours ago 0 replies      
In which The Economist finally succumbs to clickbait.

WEAK.

30
epx 3 hours ago 0 replies      
I miss Windows 7. It worked just like Windows 95; it was all downhill after that.
31
mschuster91 3 hours ago 0 replies      
Well, the reason corporate users are holding on to W7 is easy: training costs. With XP and even W7, you could at least switch the majority of the GUI to look and feel like good old W95/98/ME... there's no way to do that with W10.

Also, many companies don't want the legal risk associated with telemetry (you never know what data leaves your premises and heads towards MS), and small-ish businesses are afraid of the non-disableable automatic updates - I certainly wouldn't want to get into the office and find that $essential_program has stopped working overnight due to an update gone bad.

32
arca_vorago 3 hours ago 0 replies      
I also don't think you can have this discussion without talking about the influence the NSA and OGAs are having on big tech companies' products. If game developers really pulled their shit together and started doing AAA/AA titles on Linux, you would quickly see the piss taken out of the likes of M$.
33
testUser69 4 hours ago 1 reply      
>The business world has been even more recalcitrant. In a recent study by Softchoice, an info-tech consultancy, corporate computers were found to be running a whole gamut of legacy versions of Windows. Fewer than 1% of them had been upgraded to Windows 10.

One thing I've learned over the years: you don't need a high IQ to run a successful business. Without a lot of smarts you're never going to compete with Google, but you don't need to compete with Google to make a living running your own business.

>Chromebooks are now outselling MacBooks in the crucial education market, where long-term preferences tend to be established.

It's unfortunate that proprietary software ever came to dominate schooling. Hopefully ChromeOS will open things up just enough so that cross platform tools become the norm. Schools using both Windows and Chrome will be more aware of proprietary/incompatible software and file formats and choose to use open formats.

>It is impossible to retrofit older Windows versions with the sort of defense-in-depth that has been built into Windows 10. Nor would Microsoft do so even if it could. If anything, it is about to do the opposite. Windows 7 users will soon lose access to a stand-alone toolkit for mitigating zero-day exploits.

Another reason why Windows is a joke OS for serious tech enthusiasts.

>A word of warning, though: such upgrades do not necessarily go without a hitch. A Windows 10 tablet your correspondent relied upon for much of his mobile computing was broken irreparably when a recent update corrupted the display driver, rendering the touchscreen useless.

Makes you wonder why they lock these things down; are tech enthusiasts even designing them?

>But he has also dusted down his four-year-old Apple MacBook Pro and upgraded his Windows 7 desktop to the latest version of Linux Mint rather than Windows 10.

Yes I prefer Linux to Windows in most cases too. I never have good long term experiences with Windows.

>It used to be that only free software came with advertising; users paid a fee, if they chose to do so, to get the software free of advertising. Microsoft charges top dollar for Windows 10 ($120 or $200, depending on the edition) and now wants to bombard users with sales pitches to boot -- without so much as a by-your-leave, let alone the option to turn the nuisance off. Despite their idiosyncrasies, Macintosh and Linux have never looked so attractive.

Yeah, these days pretty much the only people using Windows are people who don't know any better. Most people aren't even exposed to Linux except those of us in the tech sector, where it seems to be just as popular as Windows and OS X at the shops I've worked at.

34
mdekkers 5 hours ago 1 reply      
It appears that I have reached some kind of "article limit"
35
mmanfrin 6 hours ago 6 replies      

 More than 700m of the world's 1.5bn or so computers continue to run on Windows 7, a piece of software three generations old
Win 7, Win 8, Win 10; 2 generations old. Preeeetty easy to check, WSJ. If your lede carries such a glaring mistake, why should I trust anything else in this article?

25
Visual Studio Code 1.9 visualstudio.com
271 points by jrwiegand  4 hours ago   201 comments top 38
1
wildpeaks 0 minutes ago 0 replies      
Does it have some kind of Projects management now? That was the showstopper for me last time I gave it a try.
2
AsyncAwait 3 hours ago 3 replies      
I am surprised at the number of semi-negative comments here. Yes, some features are yet to be implemented (it's still a fairly young project, and you can always follow GitHub issues for progress), but for an Electron app it's surprisingly fast and capable.

The Microsoft-developed Go plugin makes it the best Go IDE out there, and the devs (Ramya Rao etc.) are super responsive and really try to resolve issues quickly.

If you haven't tried it yet, I think you really should, and if you have found a problem, open an issue at https://github.com/Microsoft/vscode so the devs know about it; complaining doesn't help make it better. They generally release every month, so it will get fixed sooner rather than later.

P.S. Kudos to the team & contributors for another awesome release!

3
christophilus 3 hours ago 4 replies      
VSCode really does improve with each release, and the monthly cycle is just about the perfect pace. This is a great example of how to run an OSS project.

I wonder how much Microsoft spends on it each month, and what value they see in it. Is it just a marketing expense? E.g., they fund VSCode in order to gain the goodwill of developers, which they hope will turn into Azure or maybe Windows sales in the future?

4
reynoldsbd 4 hours ago 2 replies      
Another great release!

Maybe I'm just squarely within the target audience of VSCode, but I'm consistently impressed by how many of my pain points just magically go away with each new iteration.

5
yokohummer7 3 hours ago 5 replies      
This is off topic, but I just learned from the article that PowerShell will soon be the default in place of cmd.exe in Windows 10. I welcome this change, as I found the experience of using PS superior to that of bash/zsh in general cases.

But I hope they've figured out the performance problem. As of writing, in the stable version of Windows 10, PS is perceptibly slower than cmd, so I was forced to use PS only when needed. Funnily enough, it was even slower in Windows 8, so the current state of affairs is better than ever. But to truly be the default, I think the performance of PS should at least match that of cmd.exe.

6
joekrill 3 hours ago 7 replies      
I keep trying VSCode, but the main thing that keeps me going back to Atom is that I have to do everything at the command line or by editing an enormous JSON file. Maybe I'm spoiled, but I'd much rather have a nice UI for settings than have to figure out that, say, in order to show line numbers I have to modify "editor.lineNumbers" and set it to "on" (or is it "true"? or 1? I can't remember... let me go waste more time looking it up...)
7
eob 2 hours ago 0 replies      
VSCode + TypeScript is the perfect foot-in-door for MS to get into the web space.

They're both excellent products by themselves, but also give Microsoft the platform to start dangling turnkey Azure integration in front of developers.

Imagine if the IDE started to offer the ability to configure, deploy, and manage targeted production stacks--no browser / command line hackery required. That would be compelling.

8
joaodlf 3 hours ago 2 replies      
I use VS Code for Go development and it's pretty amazing. I'm slowly starting to use it for other languages as well, but the Go support is really 5-star.
9
pwthornton 4 hours ago 2 replies      
I have a new project at work that I decided to use VS Code for, to see how I liked it.

For the most part I like it, but its lack of Mac-nativeness bugs me, and it may bug me enough to switch. Double-clicking the window bar in all native apps minimizes them to the dock for me. In VS Code, it maximizes my window (but not into full-screen mode). I keep double-clicking the title bar to minimize a window, and this keeps happening. It's kind of maddening.

10
santaclaus 4 hours ago 1 reply      
VS Code's release notes are really nicely done - I don't usually comment on documentation, but whoever wrote these did a bang-up job!
11
grandalf 3 hours ago 0 replies      
As an emacs user, it's interesting to watch the race of open source extensible editors like Atom and VSCode.

I still haven't been tempted to leave emacs, but it's great to see so much progress in the ecosystem.

12
mootothemax 4 hours ago 1 reply      
Anyone know if the C# plugin supports cshtml files yet? The lack of autocomplete, inspections and so on really hampers web development with it.

I've now switched entirely to using Rider because of this. While it's still early days for Rider (and it makes my laptop's fans spin nonstop), being able to hastily edit my views makes it more than worth it.

(And integrated ReSharper is always good!)

13
vgy7ujm 2 hours ago 1 reply      
It's nicer than Atom and almost feels as snappy as Sublime. But Vim is still miles better once you have put in the work to become proficient. MacVim works great on retina screens and offers high color support.
14
yokohummer7 4 hours ago 3 replies      
The slowness of the pre-1.9 terminal is a bit ridiculous. Was it really that bad? I've never seen such a strangely behaving emulator! Really good that they resolved the issue.
15
moogly 1 hour ago 0 replies      
I really like VS Code and try to use it as much as possible, but I can't make it my main driver until they've added more detailed theming support. TextMate themes aren't good enough: I can't get used to basic highlighting that doesn't even come close to ReSharper's with "Color identifiers" enabled.
16
xmatos 3 hours ago 1 reply      
Am I the only one who misses an apt repository?
17
therealmarv 1 hour ago 0 replies      
Does anybody know a way to get better window management in VSCode? I really don't like always splitting and using the mouse to drag and drop... often I use two panes and want to duplicate a view across them. I'm hoping for something like the Sublime Origami plugin (which was perfect for me).
18
rdslw 4 hours ago 0 replies      
Their Workbench-related changes in this release are great. Pay attention to the redone (and faster!!) terminal support.

https://code.visualstudio.com/updates/v1_9#_workbench

19
Lord_Zane 1 hour ago 0 replies      
Setting "window.menuBarVisibility": "hidden" works at first, but then doesn't work after I close and reopen VSCode. Anyone else have this problem?
20
Keyframe 4 hours ago 0 replies      
Anyone know if there's something like Goyo for Vim in VSCode? I know there are full-screen and zen modes, but I miss margins, so that when I'm full screen I can have everything centred on the monitor.
21
thiht 3 hours ago 0 replies      
Great release and awesome release notes, as always. Many thanks to the team for such a great editor!

I'm especially thrilled by the integrated terminal improvements.

22
13years 2 hours ago 1 reply      
I am really loving VS Code. However, there are 2 big things I need before I can give up Brackets/WebStorm entirely.

1) Jump to definition for plain JS. Both Brackets and WebStorm can find method definitions in any JS project.

2) Multiline searches. I use this quite a lot to find files that contain 2 terms on different lines.

23
jrwiegand 3 hours ago 0 replies      
I am really hopeful that the enhanced scrollbar will be implemented soon. After that, I am pretty well set with VSCode.

https://github.com/Microsoft/vscode/issues/4865

24
manishsharan 3 hours ago 1 reply      
I feel so guilty using this because I did not renew my license for WebStorm. I used to love WebStorm, but the JavaScript development environment/ecosystem has come such a long way since I first purchased it.
25
ParkerK 4 hours ago 1 reply      
Man, I'd really like it if they'd just add a default hotkey for 'Open Folder'. I know you can custom-map hotkeys, but for an editor that's supposed to be 'easy to use out of the box', it's still lacking some basic features.

That being said, it's nice that they're still improving it.

26
ridiculous_fish 3 hours ago 3 replies      
I'm used to being able to run my program via command-R or some other key equivalent. I haven't been able to find a way to do this conveniently from VSCode. How do other VSCode users run their programs?
27
ry4n413 1 hour ago 0 replies      
Can someone post their Python config?
28
legulere 3 hours ago 0 replies      
It's nice that they added an option to change the side of the close button. But sensible defaults are way more important, and that should be the left side on macOS.

Still, I am a very happy user of Visual Studio Code.

29
dlbucci 3 hours ago 0 replies      
I was just thinking how much of a pain it was to switch from the output pane to the debug console, so I like the new tabs.

Unfortunately, I can't seem to run my launch task in my second window anymore...

30
codingmatty 2 hours ago 2 replies      
The problem I have with VSC releases is that they are automatic, and I don't want to screw up my current workflow when I restart the application for an update. I have wasted hours on VSC updates before.
31
didip 3 hours ago 2 replies      
For those of you who use VS Code,

how fast is it when grepping a large number of files? Is it at least comparable to Sublime Text?

32
billconan 4 hours ago 0 replies      
I hope it can support GDB remote debugging.
33
randyrand 3 hours ago 1 reply      
How many people work on VSCode?
34
raspo 3 hours ago 0 replies      
I find it kind of funny that most screenshots are taken on macOS :)
35
leeoniya 4 hours ago 2 replies      
still no portable mode? :(
36
Hydraulix989 3 hours ago 0 replies      
Waiting for the rollback update. ;-)
37
testUser69 4 hours ago 2 replies      
38
aphextron 2 hours ago 5 replies      
Am I alone in rejecting this new wave of Electron/node-webkit based text editors? Every single one I've tried has been horribly slow, unstable, and a giant resource hog. I gave Atom a shot, and it crashed on importing a .csv file of a few thousand lines. What I want from a text editor is simply that: a text editor. For anything more advanced, move up to a proper IDE. Why anyone uses these over something like Sublime or NP++ blows me away.
26
Lychee identified as cause for mystery deadly childhood illness in India abc.net.au
242 points by adamnemecek  13 hours ago   70 comments top 17
1
throwanem 10 hours ago 3 replies      
For those similarly annoyed by its elision in the article, the causative substance appears to be hypoglycin A (1)(2).

(1) https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4412228/

(2) https://en.m.wikipedia.org/wiki/Hypoglycin_A

2
mkagenius 10 hours ago 2 replies      
Can confirm, as I am from Bihar, India: children eat lots of lychees in season (also lots of mangoes), and many adults eat just lychee the whole day without any other meal. Seizures were considered to be possession by a demon, which explains the lack of research and the delay in this discovery.
3
woliveirajr 9 hours ago 3 replies      
Malnourished children eating a fruit (lychee) without any previous meal could get poisoned and suffer an intense decrease in blood sugar levels. That makes me wonder how many religious practices, superstitions, etc. still exist in modern cultures, transmitted mouth-to-mouth, due to some misunderstanding of the true mechanism of an occurrence in the past. I can easily see a rule like "children can't eat fruit without eating some cereal" having come down as religious belief from a long time past. And it would have been effective for hundreds of children.
4
idiot900 10 hours ago 1 reply      
One of the saddest parts of this story is that decades of confusion could have been ended by checking glucose levels in these children at the hospital. This is a basic standard of care, but it seems Bihar is so under-provisioned that even this simple step doesn't happen.
5
Hupriene 31 minutes ago 0 replies      
See also: https://en.wikipedia.org/wiki/Jamaican_vomiting_sickness

It was illegal to import canned ackee into the US for years, due to the risk of hypoglycin poisoning.

6
ramgorur 1 hour ago 0 replies      
Some interesting folk remedies prevalent in South East Asia:

1. Don't eat lychee, tamarind, or unripe mango on an empty stomach; always eat them after rice or bread.

2. Don't mix pineapple and milk, or take them one right after the other; otherwise your stomach will get sick.

3. Eating the raw batter of palmyra palm causes fever and stomach pain.

Looks like this is factual knowledge, gathered and retained over thousands of years, that came from observed causal relationships.

7
Symmetry 6 hours ago 1 reply      
I'm amazed at the amount of learning people have had to do in order to survive off the flora in a new location. We're mostly spoiled in the modern world by how transparent the preparation needed to eat our plants has become.

Being able to come into some new biome, figure out through trial and error what can be eaten, when, and how, and then transmit that information reliably to the next generation is what let humans spread all over this planet.

I highly recommend the book The Secret of Our Success, which covers this and several other topics.

8
zher 9 hours ago 0 replies      
Here's the link to the study published in The Lancet Global Health: http://www.thelancet.com/journals/langlo/article/PIIS2214-10...
9
Reason077 5 hours ago 0 replies      
"Fortunately, the high cost of these imported fruits and the likelihood that they would be eaten in small quantities by well-nourished consumers, suggests there is little reason for concern in the USA,"

Interesting. As a kid, my only experience with lychees was the canned variety (often served as a dessert at Chinese restaurants).

But nowadays fresh ones are pretty common, and you can often buy them really cheaply at London fruit stalls: I bought a big 2kg box for £4 recently! Very tasty, and we felt no ill effects despite gorging ourselves on them.

10
mrfusion 9 hours ago 2 replies      
Has anyone researched if this chemical would help diabetics or even dieters? Obviously in smaller doses.
11
overcast 7 hours ago 4 replies      
Regardless of the poisoning, how does one eat nothing but a fruit the entire day? I feel like you'd get a sick stomach / diarrhea from that.
12
mark-r 6 hours ago 1 reply      
Anybody know why only children would be affected? Or is it that everyone is affected, but only children die from it?
13
parennoob 9 hours ago 1 reply      
Looks like they were somewhat on the right track in 2011: http://archive.indianexpress.com/news/toll-47-bihar-may-decl...

> "A study by the Indian Council of Medical Research concluded that a toxin from kasaundhi trees, being transmitted to humans through insects, caused the disease. The trees were cut and the disease subsided. Litchi trees and mosquitoes present in the region should be studied for this disease," suggested Shah.

14
SixSigma 7 hours ago 3 replies      
15
Moeg 10 hours ago 1 reply      
17
shkr 6 hours ago 1 reply      
Amazed to see this on Hacker News... I would be interested to find the investigation behind this report.

"Hey, are people sick at a hospital somewhere not in Australia, like India?"

"Is it the season for lychees?"

"Was there once a scientific study where they found this fruit was not perfect?"

"Google says so"

"Thanks, I have an article to write. This will be useful for Australians, and other folks importing lychee from Southeast Asia"

27
Schneiderman: Spectrum-Time Warner defrauded customers on internet service timesunion.com
63 points by JumpCrisscross  3 hours ago   34 comments top 11
1
makecheck 32 minutes ago 0 replies      
It's ridiculous how simple and fair the rules for this could be, if only they'd been implemented in law to cover certain scenarios that ensure honesty.

An example set of rules could be:

1. Periodic monitoring of connection speeds is to be expected, and any action requires a minimum of X samples (say, X=10). Measurements may be performed by any party and the measurement method must be fully disclosed. If more than 30% of samples are failing to meet advertised Internet speeds by at least 5%, or any one sample is more than 50% below advertised speed, customer is entitled to a one-day refund of Internet fees. If more than 5 total occurrences in a single calendar month are failing, customer is entitled to a 15-day refund of Internet fees.

2. If the Internet becomes unusable for more than 10 minutes at a time in a single month and the outage can be traced to ISP-given equipment, customer is entitled to a one-day refund of Internet fees. If Internet is unusable multiple times, customer is entitled to a 5-day refund.

3. If the company has cause to adjust Internet delivery expectations (such as too many additional customers to serve original speeds on the pipe to the same area), existing customers are all immediately released from any contracts and may terminate service immediately with no penalties. In addition, the ISP is liable for crediting customer monthly bills for the remainder of service, proportional to the difference in service speed, with a 10% penalty for violation of the original contract by the ISP.

4. Internet is considered a separate service and may not be bundled with anything else.

And it doesn't even have to say this much to be a huge improvement. The point is that companies have been getting away with lousy service FOR YEARS and appear to be largely unpunished, while the number of customers overpaying and not receiving stable, promised service runs into the millions.
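
To make rule 1 concrete, here is a minimal sketch in Python of how such a check could work. The function name, sample values, and Mbps units are illustrative assumptions, not part of the proposal:

  def one_day_refund_due(samples_mbps, advertised_mbps, min_samples=10):
      # Rule 1 requires a minimum number of speed samples before acting.
      if len(samples_mbps) < min_samples:
          return False
      # Samples failing the advertised speed by at least 5%.
      failing = [s for s in samples_mbps if s <= advertised_mbps * 0.95]
      # Any single sample more than 50% below the advertised speed.
      severe = any(s < advertised_mbps * 0.50 for s in samples_mbps)
      return severe or len(failing) / len(samples_mbps) > 0.30

  # Advertised 100 Mbps; four of ten samples are at least 5% under:
  print(one_day_refund_due([96, 93, 92, 91, 99, 98, 100, 97, 99, 94], 100))  # True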

2
mixmastamyk 1 hour ago 1 reply      
I've had TWC, now Spectrum. In two years the price has gone from $35 to $60. There is no cheaper tier, according to them. I use it over wifi, so I couldn't even make use of their supposedly blazing speed (100Mb) anyway.

What can I do? The only competitor, AT&T, is not very cost-effective either, and I dread doing business with them -- just paying the bill was an exercise in frustration due to their constantly broken website.

3
anonymous_iam 1 hour ago 2 replies      
When I began reading the article, I was hoping it would detail their deceptive bandwidth-management policies. I had the service for about 3 years (recently terminated) and found that speeds would slowly degrade over months. In the end my 200Mbps service was giving me about 50Mbps. The cable modem's link diagnostics indicated good signal levels and nearly non-existent error rates, so obviously it had nothing to do with the cable plant.
4
breul99 3 hours ago 2 replies      
I can't wait for the day when ISPs are held accountable for failing to provide the speeds we pay for.
5
heywire 1 hour ago 0 replies      
We were recently converted from Time Warner to Spectrum. It was all handled over the phone, and at no point did the customer service rep mention anything about my DOCSIS 2.0 modem not supporting the "up to 60Mbit" they were advertising. Since this was my own modem, I won't fault them too much, but they really should make sure their customers have the correct equipment to take advantage of the service they're buying. Luckily, Spectrum does not seem to charge for a modem lease like TWC did, so I drove up to the local office and picked up a DOCSIS 3.0 modem. My speed jumped from 35Mbit to 75Mbit afterwards (I was previously on 15Mbit with TWC for $5 more per month).
6
imajes 2 hours ago 1 reply      
Curious: has anyone else using TWC-turned-Spectrum in NYC noticed that the Ultimate/Extreme speed upgrades are no longer available for sale?

It seems Spectrum maxes out at 120Mbps, whereas they used to sell 300Mbps. (Or is that just me?)

7
iaw 2 hours ago 0 replies      
Consumer fiber-optic for $40/month just hit my area. The advertised speeds are 1000 Mbps down, 100 Mbps up.

I've been able to sustain 760 Mbps down and 105 Mbps up, it's the most satisfying experience I've had on the internet.

8
X86BSD 1 hour ago 1 reply      
I have Google Fiber. The speed is great and blows pretty much every commenter here out of the water.

So why am I posting? I'll tell you why. I have found a perverse interest in the fact that it took Google 18 months to drag fiber literally 250 feet to my house. I measured it. My neighbor had it. It took them 18 months. They kept telling me every quarter that it should be just two more months at the latest.

Do you know how long it takes cable companies to repair a downed power line?

Not 18 damn months! And that's even if the power line is beamed up by aliens and disappears, requiring an entirely new pole and cable run.

My point is there is no panacea so far. Cable companies suck, and milk the copper cash cow for all they can. They won't upgrade their infrastructure and will not compete with Google Fiber and other high-speed offerings. But they do get work done on lines pretty fast.

Google screams on the infrastructure side but sucks at actual deployment. I'm really unimpressed with Google. Gmail, Android, Fiber -- they just suck IMO.

I'm more convinced now than ever that laws need to be revoked and cities need to deploy their own fiber-to-the-home networks at taxpayer expense and operate them like a utility. Open the backbone for leasing to interested parties at a fair price as well.

It's clear telcos and cable have a chokehold on the legislature. Things won't get any better. Competition won't increase. It's way beyond time to nuke the current system from orbit.

9
justinhensley 2 hours ago 2 replies      
Anecdotally, I have noticed my upload speeds with Charter Spectrum have been noticeably better over the past month. My upload went from 100-150 KiB/s to 500-700 KiB/s.
10
rm_-rf_slash 3 hours ago 4 replies      
Somewhat off-topic, but whenever you have trouble with customer service ripping you off or giving you a hard time, there is a magical incantation that works almost every time:

"If $company does not refund/provide the service that I paid for, my next call will be to the State Attorney General of $homestate and the State Attorney General of the state your company is incorporated in, for charges of fraud."

It's the upgraded version of "I want to speak with your supervisor," and it has never failed me.

11
jarcoal 3 hours ago 1 reply      
28
John Carmack on expert witnesses and 'non literal' copying facebook.com
419 points by samlittlewood  6 hours ago   255 comments top 31
1
addisonj 6 hours ago 22 replies      
I don't have the evidence, so I can't make a judgement about whether ZeniMax or Carmack is in the wrong here, but this does point out something strange in our court system: a "jury of peers" translates to normal people with little to no knowledge of the underlying subject matter.

As our society becomes more specialized, it seems a bit absurd to have people make judgements based mostly on how well each side can make incredibly complex things comprehensible and convincing, without any real understanding of the underlying principles in the field.

It seems likely that the best expert witness in this case is not the most correct or credentialed, but the most charismatic.

It would be interesting to see how some of these big software trials (Google v Oracle) would have come out if the jury were made up of people who were both impartial and familiar with the industry.

2
ChuckMcM 6 hours ago 6 replies      
Interesting rant on expert witnesses. Mostly interesting because I've done some expert witness testimony and found it fascinating.

I completely resonate with John's issue with the expert's testimony and his understanding of it. John mentioned that their own expert testified differently, but he didn't say whether or not their lawyers attempted to impeach that expert with cross-examination. In the two cases I participated in, both sides agreed ahead of time on how to "handle" experts (in terms of cross-examination). In one it was documents only (no testimony), and in the other I was deposed by the opposing counsel after the document phase; the lawyer clearly had some notes (presumably from their expert(s)) and was trying to get me to recant or change some of my points (I didn't need to).

In a jury trial, I can imagine that getting the jury to understand your testimony is probably the most challenging part.

And, like John, I consider the notion of 'non-literal copying' to be pretty ridiculous. In the limit it means "you read this code, understood how it worked, and wrote new code that performs the same function." You can stretch that to cover anything you have ever seen. Which is sad.

3
gregw2 4 hours ago 5 replies      
It's kinda hard to square John's post-trial comments "I never tried to hide or wipe any evidence, and all of my data is accounted for, contrary to some stories being spread." with ZeniMax's post-trial comments at http://www.gameinformer.com/b/news/archive/2017/02/01/zenima... that " (vi) Carmack intentionally destroyed data on his computer after he got notice of this litigation and right after he researched on Google how to wipe a hard driveand data on other Oculus computers and USB storage devices were similarly deleted (as determined by a court-appointed, independent expert in computer forensics); ... (viii) Carmack filed an affidavit which the court's expert said was false in denying the destruction of evidence; "

What is a reasonable explanation of this discrepancy? Varying notions of what constitutes "evidence"? Or differences in facts?

4
Digory 5 hours ago 2 replies      
I suppose many weird things happen in $500m cases, but it'd be really unusual to have trial testimony sealed against the agreement of the parties. Pretrial reports and depositions, sure. Media was apparently there reporting during the expert's trial testimony, and I'd be surprised if they cannot buy a transcript from the court reporter. (http://uploadvr.com/court-oculus-zenimax-last-day/)

A quick look at the docket shows the expert witness at issue is likely to be David Dobkin of Princeton. That name appears on the docket and in media reports. Let me know if that violates HN norms, but expert witness trial testimony is part of the public, permanent record in almost every case. If a witness is excluded for using unreliable methods, his or her value as an expert is ... diminished.

5
sfifs 6 hours ago 3 replies      
The problem for John Carmack, I think, was that he was on both sides of the table writing code - he wrote the code at ZeniMax and re-wrote the code at Oculus. From a layman's perspective, he very well could have been "copying".

Normally when you do a clean-room implementation, you use different people and make sure that the people who write the new code never see the previously written code, and that the person giving the spec doesn't see the newly written code.

They were just insufficiently cautious legally, which is possibly unsurprising given Carmack's prior history founding id Software, and they got hit with a lawsuit.

6
timtadh 4 hours ago 0 replies      
I have done (academic) work [1, 2] on finding semantic code duplication. Two points that I have learned about code duplication:

1. There are a lot, A LOT, of code regions that share similar constructions when analyzed in terms of dependencies. Dependencies only consider data flow and control dependencies, where a statement X is control dependent on another statement Y (usually an if-condition or loop-condition) if Y decides whether X executes.

In my studies I have found modestly sized Java programs (~ 75 KLOC) have > 500 million patterns representing duplication in their dependence graphs.

2. Not all dependence structures which are "duplicates" would be considered duplicated by a human programmer [2]. It takes discernment by someone familiar with the code base to decide whether or not regions are actually duplicated.

I would argue you can draw similarities between disparate code bases using automated metrics. Those similarities are not evidence of copying; to decide whether similar regions are actually copied, you would need further, subjective analysis. Without direct evidence of copying it would be very difficult to make a solid claim one way or the other. But, given the vast number of similar code regions that exist (and assuming most code is not copied), I believe the benefit of the doubt should be given.

Note: I have not studied the density of duplicated code between different projects. The above is merely a conjecture based on my experience.

[1] http://hackthology.com/rethinking-dependence-clones.html
[2] http://hackthology.com/sampling-code-clones-from-program-dep...
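
To make the control-dependence definition in point 1 concrete, here is a toy fragment (Python rather than the Java studied above; the function and values are made up for illustration):

  def f(x):
      y = 0
      if x > 10:     # statement Y: the controlling condition
          y = x * 2  # statement X: control dependent on Y, since Y
                     # decides whether this assignment executes
      return y       # not control dependent on the condition:
                     # it executes either way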

7
trendia 6 hours ago 6 replies      
This stuck out to me:

> There are objective measures of code similarity that can be quoted, like the edit distance between abstract syntax trees ...

If this became the primary legal metric, then programmers who stole code would change it so that functions achieved the same output with practically no AST similarity. That is, they could maximize functional similarity while minimizing code similarity. This would be a large task, but one that might be easier than, and preferable to, writing code from scratch, since you drastically reduce the trial-and-error process of writing code in a field where best practices aren't yet known.

But while it's possible to game any 'objective' metric like AST distance, maybe that's preferable to the subjective and incentivized claims of an expert witness, who will be biased in favor of the party paying them. And while it's easy to get lost in the paragraphs of expert-witness testimony, a collection of objective metrics is easy to compare to previous cases.
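
As a rough sketch of the kind of objective metric being discussed, the snippet below compares flattened AST node sequences in Python. This is a crude proxy, not a true tree edit distance (real tools use tree-edit algorithms such as Zhang-Shasha), and the sample inputs are invented:

  import ast
  import difflib

  def ast_similarity(src_a, src_b):
      # Flatten each AST into a sequence of node-type names and compare.
      nodes_a = [type(n).__name__ for n in ast.walk(ast.parse(src_a))]
      nodes_b = [type(n).__name__ for n in ast.walk(ast.parse(src_b))]
      return difflib.SequenceMatcher(None, nodes_a, nodes_b).ratio()

  # Same structure, different names and literals: similarity is 1.0,
  # even though the source texts differ.
  print(ast_similarity('def f(x): return x + 1',
                       'def g(y): return y + 2'))
  # Structurally different code scores much lower.
  print(ast_similarity('def f(x): return x + 1',
                       'for i in range(3): print(i)'))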

8
Arathorn 5 hours ago 1 reply      
Having gone through a similar trial, I find this painfully familiar. The reality seems to be that the legal process optimises for whoever argues the most manipulatively (using every dirty trick in the book) rather than actually optimising for truth.

Perhaps this is inevitable: rather than hoping to successfully convey a logical, scientific argument about a deeply technical area to the judge/jury/arbitrator, the lawyers instead find themselves painting a more subjective, simplified story that the audience can relate to... at which point you have to fight like with like. It's an almost post-truth situation: the actual reality is irrelevant; what matters is how well the theatrical posturing is executed and how compelling the simplified version of the story is.

That said, I have no idea whether Carmack is in the right or wrong here. In my instance the system worked in my favour, but I suspect that had a lot more to do with having lawyers who were good at debating than with the almost incidental fact that, objectively and demonstrably, I had done nothing wrong...

9
tetraodonpuffer 5 hours ago 2 replies      
This is a thorny issue. Say I spend 5 years writing and rewriting code from 0.1 to 0.8 at company A to understand a domain, then move to company B and write 0.9 and 1.0 in the same domain and make a pile of $$$. You could argue that company A should somehow be compensated, even if the code itself is not exactly the same (which it wouldn't be, since you learned from mistakes made).

On the other hand, if you take an engineer who learns to build bridges and builds a lot of bridges at one company, and then moves to a different company, still building bridges, one would not think that just because they build a great bridge at company B, company A should be compensated.

Where do you draw the line between 'improving your craft / becoming a better software developer' and 'taking a company's IP'?

10
6stringmerc 6 hours ago 2 replies      
An interesting statement on things that may be relevant in the grand scheme of the trial, but, unless I'm really missing something here, the verdict was about busting an NDA (the code being downstream of the NDA violation). Hence the $500M verdict vs. the $4B sought in damages. If it were just about code, then I'm sure Carmack's points would hold up just as well as they read in hindsight. But - and it's a big but - keep in mind that his post exists in a vacuum of sorts, since it doesn't relate back to the mechanics of how the case was actually ruled. Again, I might be missing something in his post, but I don't think so.

As for this part:

>The notion of non-literal copying is probably delicious to many lawyers, since a sufficient application of abstraction and filtering can show that just about everything is related. There are certainly some cases where it is true, such as when you translate a book into another language, but copyright explicitly does not apply to concepts or algorithms, so you can't abstract very far from literal copying before comparing. As with many legal questions, there isn't a bright clear line where you need to stop.

Non-literal copying is why I like the "Gaye Family vs. Blurred Lines" verdict, and why a wide swath of business folks and industry creatives hate it. Also, it's nearly a 1:1 comparison if you want to talk about code like Carmack does or music like musicians do. It's a totally reasonable thought experiment for one side to claim "Well, a jury of non-musicians ruled the wrong way and called it derivative," to which I'd counter "Well, a jury of expert musicians would probably rule the same way, because they understand the nuances even better than the average layperson." Again, these are hypothetical arguments with real-world consequences, but I don't think these cases can be easily ruled upon. There's always going to be some hard feelings at the end of it.

11
DannyBee 4 hours ago 0 replies      
"There are objective measures of code similarity that can be quoted, like the edit distance between abstract syntax trees, but here the expert hand identified the abstract steps that the code fragments were performing, made slides that nobody in the courtroom could actually read, filled with colored boxes outlining the purportedly analogous code in each case. In some cases, the abstractions he came up with were longer than the actual code they were supposed to be abstracting."

This is because of the abstraction-filtration-comparison test that courts use: https://en.wikipedia.org/wiki/Abstraction-Filtration-Compari...

I think it's garbage too, but such is life.

I don't think you can blame the expert for producing something that goes along with what the current law is.

12
DonHopkins 3 hours ago 0 replies      
Dave Beazley (Python expert and author of SWIG) was an expert witness and analyzed a huge body of code. This is his fascinating talk about his adventure:

https://www.youtube.com/watch?v=RZ4Sn-Y7AP8

So, what happens when you lock a Python programmer in a secret vault containing 1.5 TBytes of C++ source code and no internet connection? Find out as I describe how I used Python as a secret weapon of "discovery" in an epic legal battle.

Slides can be found at: https://speakerdeck.com/pycon2014 and https://github.com/PyCon/2014-slides

13
hoorayimhelping 4 hours ago 1 reply      
>* After he had said he was "absolutely certain" there was non-literal copying in several cases, I just wanted to shout "You lie!" By the end, after seven cases of "absolutely certain", I was wondering if gangsters had kidnapped his grandchildren and were holding them for ransom.*

This seems to be a very common reaction when reasonable people are put through a lawsuit.

14
nabla9 54 minutes ago 0 replies      
A battle between experts trying to convince people off the street is a very fragile system. Arguments start from scratch every time.

The idea of making expert-witness testimony part of one's academic record is great.

Scientific expert-testimony literature should be cumulative, in the same way that legal precedents are. Code copying is a question that comes up again and again; there should be convergence towards scientifically justified ways for experts to determine copying.

15
grellas 3 hours ago 0 replies      
I learned long ago (at the beginning of my legal career) that a determined litigant can essentially cook up expert testimony on demand as needed to further some tendentious goal or other.

It is dressed up to be "scientific," "learned," etc. but, when objectively analyzed, it is in reality utter garbage. But, and this is a big "but," it is the sort of garbage that cannot be refuted conclusively owing either to uncertainty in the science involved or uncertainty in the facts and assumptions on which it is based.

By this, I don't mean that there are no experts who have integrity and who will not allow their good names and reputations to be cynically used for such purposes.

But, for every expert of such integrity, it is (sadly) pretty easy to find others who can be bought.

Don't know what happened in this case but I instinctively can sympathize with the author here that this is what might easily have happened.

16
jstanley 6 hours ago 0 replies      
Link doesn't work for me, I just get a popup telling me to sign up to Facebook, and when I close it the page is blank?

EDIT: Incognito mode fixed it. http://pastebin.com/KxAyvqyM

17
qwertyuiop924 2 hours ago 0 replies      
To be clear, I (like many of you) have poured countless hours into John Carmack's games and stand in awe of his work. I also have considerably less programming capability than most HN users. So I may have a large pro-Carmack bias going into this...

Having said that, the concept of "non-literal copying" is, IMHO, nonsense. And dangerous. So I'll agree with Carmack on that much. And while I don't know (I don't think it's been revealed, although I haven't been following the case closely), it seems that Carmack had some hand in writing the original code, so of course the code will have similarities.

Also, the analogy of "Harry Potter with the names changed" is, from what I can tell, a pretty bad one for what actually happened.

In short, I think this is pretty much nonsense.

OTOH, ZeniMax does have some evidence of wrongdoing on Carmack's part, so it's not like their case has no basis, and I am inclined to be biased toward Carmack. And there's always the possibility that I have no idea what I'm talking about and I'm just stark raving mad.

18
kchoudhu 6 hours ago 0 replies      
Well, who was the expert witness?
19
emmett 4 hours ago 0 replies      
My father was an expert witness in electronic discovery cases for many years in the 90s and early 00s.

I remember so many stories growing up about similar situations. The main lesson was that it was actually way easier to be an expert witness when your side was totally in the wrong, because you could just make stuff up. Explaining the truth about how computers work is hard, compared to making up simpler explanations that don't happen to actually reflect reality.

20
squizzel 6 hours ago 3 replies      
I often wonder if John was involved with the storytelling aspect of his games, not just the code.
21
wnevets 5 hours ago 0 replies      
Expert witnessing is kinda crazy in this country. One of the most famous cases has to be the father wrongfully executed because of it [0].

Law enforcement officers are also trained to talk about their years of experience and how they're experts when giving testimony, regardless of the facts, and juries love it.

[0] https://en.wikipedia.org/wiki/Cameron_Todd_Willingham

22
minimaxir 5 hours ago 0 replies      
I recently served on a jury in an assault and battery case. The defense offered a psychological defense and called a psychologist as an expert witness. The witness offered a few theories which were credible but not evidentiary. The prosecution, however, did not attempt to contradict the claims, and instead argued that the witness was too expensive and did not have much contact with the defendant.

It was a mistrial due to jury deadlock, as reasonable doubt was present.

23
bisby 3 hours ago 0 replies      
That's a great analogy that I'm surprised I haven't heard before.

If I write a book called Gary Potter, about a boy who finds out he's a wizard and goes on (specific) adventures with his friends Jon and Germione... I'd be in trouble.

But a book about someone wishing for a better life > finding out they secretly had great powers > struggle to gain control/mastery of the powers > must use the powers to save the world from evil...that's just a story framework and applies to Harry Potter, Star Wars, Hercules(the Disney version at least). There's variations in why they want the better life (abuse, boredom, etc), how they find out, and what the exact evil is. but if you abstract far enough, they're all basically the same story.

Using exact code and changing variables like "main_character_name" is clearly infringement, but using a similar abstracted flow shouldn't be, ESPECIALLY in code. If your end goal is to put pixels on a screen, you run algorithms to determine pixels > add those pixels to a buffer > send the buffer to the screen. There aren't many other ways to do it. There are only so many ways to implement some things, and if a way is good, independently coming up with the solution isn't copying. In fact, if I came up with a different solution, it would be sub-optimal. For some problems, the abstracted flow of the code is defined for you. "We're doing VR, we want X, Y, Z to happen"... there aren't a whole lot of options for how to do that.

If Oculus only figured out the flow charts by copying ZeniMax's prototype, then perhaps they are copying. And that's the argument ZeniMax is shooting for: "this is a non-trivial, non-obvious solution they only got by copying us".

But the next question is: at what level of abstraction am I just changing variables at a large scale? If I wrote a book about a young boy living with abusive relatives because his parents died, who finds out he's a wizard and moves away to a magical wizard school, does that count as copyright infringement? There's a point where it is, and a point where it isn't.

I can never find the source, but I once saw a 256x256 picture of Yoshi that had a 128x128 version of the same picture (stretched to 256x256) immediately below it. Below that was 64x64, etc., until the 1x1 was just a green pixel. Along the side it said "at what point does it stop being copyright infringement?" If you started at the top (the high res), you could get pretty far down and still go "oh yeah, this is still the same image". But if you started at the bottom, you could get much higher and go "this is some weird abstract art of random pixels".

24
wiz21c 2 hours ago 0 replies      
The goal of the expert is not to give a "truth", it's to convince the jury. That's a very different job.
25
unityByFreedom 5 hours ago 0 replies      
Should he be talking about this? Don't they still have a chance to appeal?
26
almonj 3 hours ago 0 replies      
All laws relating to Intellectual Property need to be abolished.
27
vanattab 6 hours ago 1 reply      
Isn't the testimony already a matter of public record?
28
sandworm101 5 hours ago 0 replies      
Too many people read too much into "peers". These are your legal peers, not social or educational equivalents. These are fellow citizens. So long as they aren't royals, elected officials or cops, they are peers. (Lawyers can technically serve on juries, but every court I know of doesn't want them to do so, nor law students.) We now live in a society with fewer official class systems, or we at least now better separate legal classes, and so easily forget the original intention of such words.
29
potatoman2 6 hours ago 0 replies      
Loser's lament.
30
LeicaLatte 4 hours ago 2 replies      
Carmack is a hero of mine, but he comes across as very naive here. I employ other people now, and employees who argue this sort of stuff isn't copying are toxic to the culture and ethics of a workplace. When you are paid a salary every month, I actually don't care whether you are productive. But the least I expect you to be is loyal. And I am no monster for expecting that.
31
pyb 6 hours ago 5 replies      
"The expert witness circuit is surely tempting for many academics, since a distinguished expert can get paid $600+ an hour to prepare a weighty report that supports a lawyer's case. I don't have any issue with that, but testifying in court as an expert should be as much a part of your permanent public record as the journal papers you publish. In many cases, the consequences are significant. There should be a danger to your reputation if you are imprudent."

So basically he's kind of threatening the expert witness now?

29
Show HN: AutoMIDIFlip An automatic MIDI flipper automidiflip.com
25 points by Sophira  4 hours ago   9 comments top 5
1
tunesmith 2 hours ago 2 replies      
One of my favorite discoveries, thanks to a suggestion a teacher made, was how inverting a pattern can help you learn something really difficult.

I was learning Beethoven's Appassionata sonata and there's a section where the right hand has a series of five-note arpeggios that gradually decrease in register until the left hand takes over and continues down into the bass (while the right hand takes a climbing melody that originally started in the left).

Those five-note arpeggios were really hard for the left hand. So the technique my teacher recommended was to have the right hand play an inversion of the left-hand pattern at the same time as the left hand was playing. So when the left hand started on a Db, the right hand started on an Eb an octave up and then played a mirror version - mirrored by the physical key, so E vs C, Bb vs F#, etc.

It sounded like crap, but it was very effective - since the right hand is stronger, the left hand would basically learn from the right. Pretty quickly, my left hand got very strong at that pattern and it became one of my favorite sections of the movement.

2
Sophira 4 hours ago 0 replies      
I wrote AutoMIDIFlip based on the #midiflip challenge going around Twitter right now - basically, take a MIDI file, flip it so that the intervals are preserved but in the opposite direction, and see what it sounds like. It actually has some surprisingly good results.
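
For the curious, a flip like this can be sketched in a few lines of Python with the mido library. The fixed pivot note and file names below are assumptions for illustration, not how AutoMIDIFlip actually chooses its pivot:

  import mido

  PIVOT = 60  # hypothetical pivot (middle C); intervals are preserved, inverted

  def flip_midi(src_path, dst_path):
      mid = mido.MidiFile(src_path)
      for track in mid.tracks:
          for msg in track:
              # Mirror every note event around the pivot; all other
              # messages (tempo, program changes, etc.) pass through.
              if msg.type in ('note_on', 'note_off'):
                  msg.note = max(0, min(127, 2 * PIVOT - msg.note))
      mid.save(dst_path)

  flip_midi('input.mid', 'flipped.mid')

Note that flipping twice with the same pivot restores the original notes (apart from any that hit the clamp), which matches the double-flip property described below.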

Andrew Huang did the first one manually, as can be seen in the YouTube video linked on the front page. I figured I could write code to automatically do it even for people who didn't have the software to do it themselves.

I plan to add more features to AutoMIDIFlip - in particular, it'd be nice to have an optional feature to auto-shift the different channels up or down octaves in order to have them occupy the same sort of range as they did in the original MIDI, as that'd probably end up with more listenable tunes. (Bass notes become screechy notes of death right now.)

So far, AutoMIDIFlip is the only automatic MIDI flipper I know of that preserves everything about the original song except for the note positions. In other words, if you flipped a tune twice, then removed the 6 extra empty MIDI tracks that AutoMIDIFlip would have added (3 per run, basically just attributing AutoMIDIFlip so that nobody just runs a MIDI file through it and calls it an original), you'd end up with exactly the same file, hash checksums and everything. There are only three exceptions to this rule:

* If the original source didn't utilise MIDI's "running status" feature (which simply acts to reduce the file size by removing redundant information - more info at http://www.midikits.net/midi_analyser/running_status.htm), then the resulting file from AutoMIDIFlip will be smaller than the original. It'll still contain exactly the same information, however.

* For Format 0 MIDI files, AutoMIDIFlip will output a Format 1 MIDI file so that it can insert the empty attribution tracks. These tracks do not contain anything, and AutoMIDIFlip won't attempt to separate the file into tracks; you'll still just have one track containing all the data.

* If for some reason the input MIDI file has more tracks than are indicated in the file header, those extra tracks will not appear in the output file. This scenario should basically never occur; if it does, there's something wrong with whatever program generated the MIDI file.

I'm happy to answer any questions people might have!

3
leeseibert 2 hours ago 0 replies      
I flipped the arpeggio from Stranger Things. It doesn't have the same oomph. Cool tool, though.
4
azeirah 2 hours ago 1 reply      
I wonder what a song would sound like if its flipped version and the original were layered on top of each other.
5
bcook 2 hours ago 1 reply      
Are MIDIs still being used by any popular services, apps, OSs, etc?
30
A link between air pollution and Alzheimers disease latimes.com
44 points by ozdave  6 hours ago   27 comments top 5
1
jimlawruk 3 hours ago 0 replies      
Is the cause "air" pollution, or could it be "noise" pollution? I would think living near loud traffic might lead to lower-quality sleep for the brain, and a lack of sleep is linked with Alzheimer's. Just a thought.
2
throwaway2016a 3 hours ago 0 replies      
What would be surprising to me is if they had found that air pollution makes Alzheimer's less likely.

If it accelerates it, that's unfortunate, but I'm not surprised.

3
jack9 4 hours ago 1 reply      
"Surprising" as in "not surprising" - way to go, LA Times. You hit another one out of the park with backwards editorializing. Next: another "surprising" link between water pollution and cancer.
4
jmaloney10 4 hours ago 6 replies      
The LA Times won't let you read the article with an ad blocker enabled.
5
st3v3r 5 hours ago 2 replies      
Maybe this is a good thing. After Trump's EPA guts clean air regulations, we'll all develop Alzheimer's, and we can forget that he was ever elected.