hacker news with inline top comments    23 Jul 2014
1
I've never seen a language's style guide recommend avoiding comments before
62 points by vs2  2 hours ago   36 comments top 24
1
blowski 29 minutes ago 0 replies      
Avoiding comments that do what your code should be doing is common practice, and I think that's what this style guide is recommending.

Comments are useful to describe __why__ you're doing something, often when you are not able to change the unexpected behaviour. Whenever I build an API library, my code is littered with comments like "Acme Corp API requires this happens before that" with a link to that bit of the API documentation.

Here's a C++ example about "documenting surprises" (taken from Steve McConnell's Code Complete):

    for ( element = 0; element < elementCount; element++ ) {
        // Use right shift to divide by two. Substituting the
        // right-shift operation cuts the loop time by 75%.
        elementList[ element ] = elementList[ element ] >> 1;
    }
And a Java example:

    /* The following code is necessary to work around an error in
       WriteData() that appears only when the third parameter
       equals 500. '500' has been replaced with a named constant
       for clarity. */
    if ( blockSize == WRITEDATA_BROKEN_SIZE ) {
        blockSize = WRITEDATA_WORKAROUND_SIZE;
    }
    WriteData ( file, data, blockSize );
He also gives a whole list of situations in which comments are a bad idea, and it's similar to the OP.

2
koonsolo 46 minutes ago 1 reply      
Comments say what your code does; your code says how you do it. The swap example is trivial, but for most functions it is good to add an API comment, because how you use the function shouldn't depend on how it's implemented but on what it should do, which is described in the comment. That way you can change your implementation as long as you don't change the contract. In other words, changing how your code does something shouldn't change what it does - and that last part is what the comment specifies.
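A minimal sketch of that split, assuming C and a hypothetical function: the header comment is the contract callers rely on, the body is the detail that is free to change.

    /* Counts the number of space-separated words in `text`.
     * The input is not modified.  Callers rely on this contract,
     * not on how the counting is done. */
    int count_words(const char *text)
    {
        int count = 0;
        int in_word = 0;

        /* Implementation detail: a simple state machine.  It can be
         * rewritten freely as long as the contract above still holds. */
        for (; *text != '\0'; text++) {
            if (*text == ' ') {
                in_word = 0;
            } else if (!in_word) {
                in_word = 1;
                count++;
            }
        }
        return count;
    }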
3
daemonk 2 minutes ago 0 replies      
Write comments that explain why a certain line is there.

Let's say you are parsing a standard tab-delimited file. You find that the tab-delimited file has some non-standard features, so you have to write some extra lines of code to handle it. For people who think the code just parses a standard tab-delimited file, these lines will be confusing, so you comment them and say why you included them.
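A minimal sketch of that kind of comment, assuming C; the vendor quirk and the ticket number are made up for illustration:

    #include <string.h>

    /* Splits one tab-delimited record into fields; returns the field count. */
    int parse_record(char *line, char *fields[], int max_fields)
    {
        /* Vendor X's exporter emits CRLF line endings even in its "Unix"
         * mode (see ticket #1234), so strip a trailing \r here or the last
         * field silently gains an invisible character. */
        size_t len = strlen(line);
        if (len > 0 && line[len - 1] == '\n') line[--len] = '\0';
        if (len > 0 && line[len - 1] == '\r') line[--len] = '\0';

        int n = 0;
        char *tok = strtok(line, "\t");
        while (tok != NULL && n < max_fields) {
            fields[n++] = tok;
            tok = strtok(NULL, "\t");
        }
        return n;
    }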

4
nrzuk 2 minutes ago 0 replies      
Personally I have no problem with comments in code for complex functions etc. But pointless comments like the one below drive me insane.

    // get the user
    $user = $this->getUser();

Multiply that by the thousands of lines in a project and you have one big headache!

5
nodesocket 11 minutes ago 0 replies      
I've seen this many times:

    ...thus they (comments) tend to diverge from actual implementation.
It happens: you update or refactor code and forget to update the comments, so the comments become outdated or, worse, no longer applicable. It's a common mistake by less detail-oriented developers. Which raises the question: in this case, is it better to have confusing/incorrect comments, or no comments at all?

6
riquito 8 minutes ago 0 replies      
It's often a good idea to comment on "why" the code exists, if it is non-obvious (e.g. it's obvious that you sanitize input parameters). The comment must be short and possibly point to a ticket written somewhere else.

It may be a good idea to comment on "what" the code does, if it isn't clear (the code itself is "how" it is done, but "what" it does may be hard to read, e.g. sometimes you use a clever hack for performance reasons).

As always, handle with care :-)

7
krzrak 22 minutes ago 0 replies      
It doesn't recommend avoiding comments - it encourages writing understandable code and avoiding meaningless comments. That is a basic rule of clean code.
8
yxhuvud 33 minutes ago 0 replies      
Comments are lousy for describing what you are doing, but there is no alternative to comments for describing why something is done.
9
tibbe 43 minutes ago 0 replies      
Note that this isn't the "official" Haskell style guide. We don't have one (although we probably shouldn't). This is one of the competing guides out there.
10
Nursie 44 minutes ago 2 replies      
I'm a big believer in function level comments in code, in a sort of doxygen-ish style (I write mostly C).

It allows you to document the intended inputs and outputs of the function and state its purpose. This increases maintainability and reusability.
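A minimal sketch of the kind of header meant here, assuming Doxygen-style markers and a hypothetical function:

    #include <stddef.h>

    /**
     * @brief  Copy src into dst, truncating to fit, always NUL-terminating.
     *
     * @param dst      Destination buffer (must not be NULL).
     * @param dst_size Size of dst in bytes (must be >= 1).
     * @param src      NUL-terminated source string (must not be NULL).
     * @return         Number of bytes copied, excluding the terminating NUL.
     */
    size_t bounded_copy(char *dst, size_t dst_size, const char *src)
    {
        size_t i = 0;
        while (i + 1 < dst_size && src[i] != '\0') {
            dst[i] = src[i];
            i++;
        }
        dst[i] = '\0';
        return i;
    }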

Functions themselves should be short and written as a sequence of logical steps.

I'm also a big fan of doing things right rather than just hacking until it works, which seems to put me in a minority.

11
krat0sprakhar 1 hour ago 0 replies      
> I repeat: Try hard - very hard - preferably repeatedly - to remove unexpected behaviour from your program. Comments can never fix unexpected behaviour.

This is golden!

12
duncan_bayne 2 hours ago 2 replies      
I've always treated comments as code smells. Not necessarily something bad, but something that at least suggests the possibility of suboptimal code.

One of the best cases for comments IMO is documenting an unexpected behaviour on the part of a third-party API. But even then, correct exception / error-handling code can obviate the need for comments in many cases.

If I'm reading code (from an experienced programmer) and I see a comment, I immediately pay attention, because Here Be Dragons.

13
bnegreve 55 minutes ago 1 reply      
This is not convincing to me because the examples are trivial:

    -- swap the elements of a pair    swap :: (a,b) -> (b,a)
Yes this is redundant.

    let b=a+1   -- add one to 'a'
Yes this is also redundant

Does it mean that every piece of code can be expressed as clearly as in a one-line comment in natural language? I don't think so.

14
josch 44 minutes ago 0 replies      
Leo Brodie says in "Thinking Forth" in the style section: "The most-accurate, least-expensive documentation is self-documenting code". I am sure there are other prior examples.
15
bozhidar 43 minutes ago 1 reply      
There's similar advice in the Ruby Style Guide - https://github.com/bbatsov/ruby-style-guide#no-comments

Comments often go out-of-sync with the code, so I think it makes a lot of sense to prefer writing comprehensible code instead of trying to explain with comments something totally incomprehensible.

16
skriticos2 1 hour ago 1 reply      
I totally agree with this. I like to put a bigger comment block at the top of my source files explaining the overall concepts and data structures used, and then put few actual comments in the code. Instead I think about my variable and function names and make them speak for themselves. Commenting each and every element of your code will just make people (possibly yourself) curse at you when debugging your code and realizing that the comment was rendered obsolete 15 iterations prior and the code does something entirely different.
17
MerreM 1 hour ago 1 reply      
I was always told that if someone couldn't tell what your code was doing by glancing at it, you'd done it wrong and should re-write it.

I know that's absurd in practice - we don't always have the time - but I've always used comments as a last resort.

If I need to comment code to make it understandable at a glance, so be it, but I'd rather avoid comments altogether and rewrite until it's clear enough without them.

18
tomp 1 hour ago 1 reply      
This submission needs to have a different title. It's not very clear whether the submitter is genuinely surprised or being sarcastic.
19
vince_refiti 1 hour ago 0 replies      
Comment only magical code, but don't write magical code.
20
warrenmiller 14 minutes ago 0 replies      
Good code should be self documenting.
21
samuli 1 hour ago 1 reply      
The article correctly advises against commenting the obvious. I think it is often practiced by novice programmers to assure themselves of what the language expression actually does.

What the article omits is the suggestion of commenting the right way, i.e. adding reasoning or the description of the high level logic behind the code.

22
kevinpaladin 1 hour ago 0 replies      
I agree. Comments sometimes make it difficult to go through the source code. That's why I've always loved the Java naming convention that suggests variable and method names be self-explanatory.
23
viach 52 minutes ago 0 replies      
To rephrase - code makes comments understandable?
24
Myrmornis 1 hour ago 0 replies      
The article is spot on. Comment as last resort.
2
Bootstrap 3 Free Themes and Templates
42 points by DaveJn  2 hours ago   9 comments top 7
1
sida 45 minutes ago 1 reply      
At the risk of sounding like an asshole: the themes on prebootstrap are pretty average at best. Alternatives like bootswatch.com offer far better themes.
2
viach 1 hour ago 0 replies      
Great resource! This is also worth attention: http://bootswatch.com/. I have zero-to-none design skills and using these themes makes my Chrome extensions' design look acceptable (well, almost)
3
yoanizer 38 minutes ago 0 replies      
I don't want to sound mean, but most of them look very amateurish at best.
4
MasterScrat 35 minutes ago 0 replies      
Don't waste your time and just head to wrapbootstrap.com.

Even for a side project it's worth paying $15 to get an actually good looking template.

5
johantinglof 37 minutes ago 0 replies      
Not sure about this. It seems kind of weird for the 'login' templates to have a confirm password box. In my book startbootstrap has an advantage.
6
lumpypua 36 minutes ago 0 replies      
Looks like the folks behind this are doing real estate tech. Can I ask y'all what the product is?
7
hendry 46 minutes ago 1 reply      
None of these Bootstrap 3 themes seem to just replace /bootstrap/css/bootstrap-theme.css which makes them pretty daft.
3
William Shatner reviews Facebook Mentions
185 points by cityzen  9 hours ago   37 comments top 11
1
skizm 5 hours ago 2 replies      
William Shatner is 83. That is amazing to me. I hope I'm that with it and well put together if I make it to 83. I suppose being worth $100 million doesn't hurt, but it still gives me some hope.
2
freakyterrorist 7 hours ago 2 replies      
I can't believe Facebook would force celebrities to follow other celebrities. It seems like their contempt for users is universal, regardless of how important you may be.
3
nsxwolf 3 hours ago 2 replies      
There's an Internet for celebrities and an Internet for the rest of us.

Facebook has this.

Wikipedia has notability requirements.

Twitter has verified accounts, which only celebrities can have because no one gives a shit if your account is real or not.

The Hacker's Manifesto is a bit turned on its head.

4
JacobAldridge 7 hours ago 3 replies      
I wonder if the George Takei suggestion was intelligent situational awareness ... or if everyone has him as the first suggestion because he's basically the platonic form of a Facebook celebrity?
5
bignaj 5 hours ago 1 reply      
I remember back in 2005-6 when I thought that News Feed, Events and other stuff introduced to Facebook was a bunch of cluttered junk. If I could have seen into the future then... the horror, the horror!
6
joshmlewis 6 hours ago 0 replies      
> I'm already following those who I want to follow - why insist I follow that short list of others?

Money.

Edit: If you think about it, it's a poorly implemented solution to some problem. What was the problem?

7
personjerry 6 hours ago 1 reply      
Facebook Mentions seems like an ill-conceived way of making a move against Twitter.
8
HBSisBS 4 hours ago 2 replies      
Things against Facebook rarely fail to rise to the top of HN. Anything not favoring Facebook gets up here quickly. Why am I surprised.
9
moron4hire 4 hours ago 1 reply      
Are new users joining Facebook? I hear from more and more people that either they have left or "wish [they] could", if it weren't for "everyone" with whom they wish to stay in contact[1].

I myself deactivated my account 6 months ago. The only time I regret it is after I've been drinking and want to troll someone. So actually all around a good thing.

Are new people actually, really, honestly joining Facebook? I know the "delta new accounts" number is positive, but do they represent real people, rather than just spam bots?

I just have this feeling that, given a certain plateauing of new users, there comes an associated stagnation of follow-actions. No, I certainly don't have data on this issue, but I know that I personally only spend effort following people when A) I first join a site, or B) I think there is a good chance the person will follow me back[2]. So, for your everyday Joe-blow user whose account is more than a month old, it seems like they are either already following William Shatner or never will.

My own observations in blogging have been that engagement with users is highly dependent on novelty. You either grind out finding new followers who haven't experienced your content yet, or you post radically new things all of the time--which could backfire and alienate your established followership, though honestly by that point they are probably ignoring your posts. Either way, you write off anyone who has been following you for more than 6 months. 90% of the time, that person is unretrievable.

I guess I just see followers as a limited, unsustainable resource, sort of akin to oil, except much easier to deplete. But it seems like Facebook et al. are banking on it being more like solar. IDK, I've seen reports saying FB has 1.3 billion users. There are only 7 billion people in the world. Do I really believe FB has almost 20% of the entire world's population? Do I really believe they could get more?

[1] Apparently "everyone" doesn't know how to use email or a telephone or text messaging.

[2] Incidentally, a policy that works for about 50% of cases. And for 90% of people, they will be unfollowed within a week, regardless of whether or not they follow me back. I don't need their crappy animated GIFs of Sherlock or Dr. Who clogging up my dashboard.

10
darrenf15e 5 hours ago 0 replies      
fb paid him
11
beartime 7 hours ago 1 reply      
Not many other users of this app feel this way
4
SpaceX Soft Lands Falcon 9 Rocket First Stage
423 points by cryptoz  14 hours ago   139 comments top 9
1
jccooper 14 hours ago 5 replies      
The first landing produced better video, despite the data corruption, than this one did, due to the ice fouling. Wonder why this one got iced? Clouds?

But even so, it certainly shows success. Can't wait to see 'em get a stage back. That'll be amazing.

2
larrydag 12 hours ago 4 replies      
I've been super impressed with how efficient SpaceX is in getting launches into orbit. I did a visual comparison to the Space Shuttle program. It looks like SpaceX will overtake the Space Shuttle launch efficiency by the end of the year.

http://snag.gy/wUTD7.jpg

sources:

http://en.wikipedia.org/wiki/List_of_space_shuttle_missions

http://en.wikipedia.org/wiki/List_of_Falcon_9_launches

3
andrewtbham 13 hours ago 3 replies      
> Flights 14 and 15 will attempt to land on a solid surface with an improved probability of success.

Anyone know when flight 14 is scheduled?

4
nickhalfasleep 12 hours ago 1 reply      
"it fell over, as planned"... great attribution of the fundamental nature of gravity.

But mad props to SpaceX for working towards recycling their boosters.

5
ilaksh 12 hours ago 3 replies      
The hardest part is not running out of fuel on the way down. Old-school science fiction often assumed something like nuclear power where that wouldn't be a concern. Seems like for many years most people haven't even attempted this full recovery/landing thing because of fuel and weight issues.

Has SpaceX really cracked that problem? What portion of flights on this smaller rocket can do it? It says some missions won't have enough fuel left over.

Will the Falcon 9 Heavy be able to do all or almost all missions with a landing and full recovery at the end? How much fuel will Heavy have left on average after a return? Theoretically it would not wait until the last possible second to start slowing, and therefore use up almost all of the fuel in order to give the descent a larger margin of error and also reduce the possibility of damage, I assume.

6
russell 12 hours ago 0 replies      
I did see a launch north over my town, about 80 miles north of Vandenberg. It looked like the path was over the ocean all the way. The first stage was still burning, so it was pretty spectacular.

Edit: oops. This was supposed to be a reply to the comment below about launches from Vandenberg being mostly to the south.

7
crb 13 hours ago 1 reply      
Will there be a video of the landing, as seen from Earth?

Is the rocket currently aimed "at the Atlantic", or are they operating with the precision they need to land back where they took off from: and in that case, do they have a camera pointing at the landing point from a ship?

8
Crito 14 hours ago 5 replies      
The video is pretty cool, but make sure to read the rest of the page. Most exciting of all is this last note: "We will attempt our next water landing on flight 13 of Falcon 9, but with a low probability of success. Flights 14 and 15 will attempt to land on a solid surface with an improved probability of success."

That "solid surface" will presumably be a barge. Very exciting stuff.

Edit: The "loss of hull integrity" also answers a common question of "why not use parachutes". Even if they could land the stage softly enough with a reasonable amount of parachutes, the stage has trouble surviving tipping over in the ocean after landing. It needs to land and remain upright, but using parachutes you will often get some lateral velocity that would cause a tip-over. They need more control than parachutes can provide (and that's all ignoring the problems of salt water).

9
lefrancaiz 12 hours ago 4 replies      
Why does the rocket tipping over cause a loss of hull integrity? Does that mean that it actually exploded, just from going horizontal?
5
Review: Amazon's Fire Phone
52 points by digital55  5 hours ago   13 comments top 5
1
TorKlingberg 2 minutes ago 0 replies      
It may be a problem for Amazon that they lack the global presence that their competitors in the phone market have. Amazon is big in the US, UK, Germany and Japan, but not in the rest of the world.
2
r00fus 4 hours ago 4 replies      
My question remains - why did Amazon go for the high-end price point? Amazon has excelled previously by competing on price. While the Fire Phone is competitively priced to the corresponding iPhone or Galaxy (esp. given the freebies), it simply doesn't make up for the lack of Google Play or iOS AppStore. And chained to AT&T?

Perhaps this is Amazon's stalking horse into the phone market.

3
dnewms 4 hours ago 2 replies      
Mayday is truly a new level of customer service, and makes the phone a great choice for those who might struggle with new technology -- like the remaining population without a smartphone.
4
davidw 2 hours ago 0 replies      
I got one of the original Kindle Fire tablets. It's not bad, but without all the Google stuff: gmail, maps, etc... it's just not as useful as my Nexus tablet.
5
scythe 3 hours ago 1 reply      
Something tells me that Dynamic Perspective, as implemented on the Fire phone, might not have been an entirely awful experience to implement. A couple of points:

* Amazon isn't known for wasting time or money. They wouldn't have delayed the release of their first-ever phone for an expensive gimmick. Sticking a cheap gimmick on within time requirements is doable.

* 3D rendering is, well, a solved problem. Accelerometers have been incorporated into phone UIs for more than five years. Even facial detection is in most modern digital cameras. Tying these things together doesn't require sending a whole lot of information back and forth -- you just update a couple of vectors representing estimates of phone location and head location.
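A toy sketch of that vector idea (every name and scale here is made up; the real system presumably fuses accelerometer and face-detection data, which this ignores):

    /* Toy parallax: shift a UI layer opposite to the viewer's head offset.
     * head_x/head_y are the estimated head position relative to the screen
     * centre (e.g. from face detection), in the range [-1, 1]; depth says
     * how "deep" the layer should appear. */
    void parallax_offset(double head_x, double head_y, double depth,
                         double *shift_x, double *shift_y)
    {
        const double max_shift_px = 40.0;   /* arbitrary scale */
        *shift_x = -head_x * depth * max_shift_px;
        *shift_y = -head_y * depth * max_shift_px;
    }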

I wouldn't be too surprised, given Amazon's track record, to find that Dynamic Perspective started as kind of a "why not?". If people can make cool things out of it, they'll keep it: it might be partially a trick to get developers to target the device, and to make interesting things out of it. If not, it won't be missed.

6
Commercializing the first direct-diode laser bright enough to cut and weld metal
35 points by dalek2point3  5 hours ago   4 comments top 2
1
danmaz74 1 hour ago 1 reply      
So now we only need better batteries, and we'll soon be able to cut and weld metal with our phone?? :D
2
InclinedPlane 1 hour ago 1 reply      
This is huge, it means that laser cutters will be even cheaper than they are now, and that much more ubiquitous. Laser cutters are already making their way into small scale shops and hackerspaces, this'll just hasten that. But it also has a ton of applications elsewhere. Likely it will lower the cost and increase access to the minimum set of machine tools necessary to sustain a developed economy, which has implications for the entirety of the developing world as well as further afield in things like Mars colonization.

Additionally, it makes things such as low-footprint or modular factories, configurable/programmable or wholly automated factories, and self-replicating factories more of a possibility in the near future.

7
How recursion got into programming: a comedy of errors
8 points by rudenoise  1 hour ago   discuss
8
iOS: About diagnostic capabilities
66 points by comex  7 hours ago   33 comments top 8
1
jatoben 5 hours ago 3 replies      
I was surprised Zdziarski made such a big deal over the packet capture tool. It's been documented in the referenced developer Q&A and in various blog posts[1] since iOS 5, and I've personally found it very useful for troubleshooting connectivity problems on enterprisey networks.

[1]: http://useyourloaf.com/blog/2012/02/07/remote-packet-capture...

2
pilif 4 hours ago 1 reply      
Having an official explanation of what the processes do is very welcome, but this still leaves open the question of whether there is a way to access these daemons without prior user approval.

And if there is, the question is, who has access to that method and how well that access is protected (from rogue employees with access to keys for example)

At this point, I would still consider all data on my phone to be accessible to law enforcement and criminals (assuming they have stolen the keys) provided they have physical access to the device.

I base what data I store on the device on that assumption and, for example, keep ssh keys separately encrypted and don't store the passphrase.

3
Titanous 6 hours ago 0 replies      
This is almost certainly a direct response to this presentation: https://news.ycombinator.com/item?id=8057470
4
jvdh 3 hours ago 1 reply      
So pcapd on iOS is supposed to allow you to capture packets from a trusted computer: http://support.apple.com/kb/HT6331?viewlocale=en_US&locale=e...

  pcapd supports diagnostic packet capture from an iOS device to a trusted computer. This is useful for troubleshooting and diagnosing issues with apps on the device as well as enterprise VPN connections. You can find more information at developer.apple.com/library/ios/qa/qa1176.
If you actually follow that link, you end up on a page detailing how to do packet captures for different Apple devices, including iOS:

  iOS does not support packet tracing directly. However, if you're developing for iOS you can take a packet trace of your app in a number of different ways:
  - If the problem you're trying to debug occurs on Wi-Fi, you can put your iOS device on a test Wi-Fi network. See Wi-Fi Capture for details.
  - If your app uses HTTP, you can configure your iOS device to use a debugging HTTP proxy (such as Charles HTTP Proxy).
  - In iOS 5 and later you can use the remote virtual interface facility.
There does not seem to be a mention of that pcapd capability in there...

5
owenwil 6 hours ago 3 replies      
Wow - this is something Apple would have never done in the past. They're really pushing to make the point that they don't have any backdoors/participate with government agencies. Interesting turn, where Apple would have just stayed silent previously.
6
IBM 6 hours ago 0 replies      
It's pretty amazing how much press this guy got for nothing.
7
iancarroll 6 hours ago 1 reply      
Is there a way to use the file_relay capability? Documents?
8
X-Cubed 6 hours ago 1 reply      
So, if they're there for diagnostics why aren't they disabled by default, requiring user intervention to enable them?
9
How to Ruin Your Company With One Bad Process
287 points by joaodepaula  17 hours ago   59 comments top 12
1
nostrademons 14 hours ago 13 replies      
"As a technologist, you know that the worst thing that you can do is over-constrain the problem before you start. You'll kill creativity and prevent yourself from getting a truly great outcome."

As an engineer, I love hearing firm constraints from the beginning. The constraints are what breed elegance; there is no such thing as an elegant solution when there is no shape to the problem. It's nice if the constraints are prioritized so you know what to give up if you can't satisfy them all. But there's nothing quite like saying "Yeah, we did this thing in two weeks that everyone assumed was impossible, and we did it without a binary push" or "Through our clever architecture, we accomplished with one server what everyone thought required a whole rack."

I believe design is the same way. The designs I've seen where the dictum is "Let your creativity run wild!" tend to be uninspired, while the ones I've seen where it's "This is what the user is trying to accomplish, and we have a 4 inch screen and 10 seconds to hook them" are often much more creative.

As a manager, the constraints are annoying. But one consequence of that is that setting firm constraints will tend to shift your culture from being manager-centric to being engineer- and designer-centric, which IMHO is a good thing.

2
curun1r 14 hours ago 2 replies      
One part of his conclusion struck me as wrong:

> As a technologist, you know that the worst thing that you can do is over-constrain the problem before you start.

From what I've read on this subject, this is not the worst thing you can do. The absolute worst thing you can do from a creativity standpoint is be completely unconstrained. That "blue sky" thinking leads to a lack of focus that prevents you from coming up with good solutions. The startup mentality of "embracing constraints" is more than just a rationalization that tries to turn a negative into a positive...it's an observation of how to best creatively problem solve.

There's no doubt that over-constraining can also have negative consequences, but if you've got little to no natural constraints, it's almost always best to invent some reasonable constraints prior to diving in and trying to solve the problem. It's okay to document those constraints and, perhaps, make changes if you find the problem you've created to be intractable. But operating without those constraints, whether real or self-imposed, is the absolute worst way to approach any creative problem solving endeavor.

3
debt 14 hours ago 1 reply      
Budgeting is particularly interesting in a situation where you receive funding. It seems it would be much easier to budget in a bootstrap situation, where growth is likely happening a bit more slowly. You're slowly adding a person(engineer, marketer, salesperson, etc.) here and there as you go along(as needed), buying more servers, etc.

I would assume budgeting is a bit different in a situation where you receive a massive cash infusion. You want to spend all of it to grow even more quickly but the effects of spending may be harder to measure(because you're spending soo much soo quickly). So do you just replicate the growth strategy used in the "bootstrap" phase or do you adopt a new one?

In a situation involving a massive round of funding(relative to the current size of the company), wouldn't it be difficult to attribute any problems supposedly associated with the increased budget to a "bad budgeting process"? That is to say, if the growth isn't happening, then there's no guarantee that just by adding more money to "growth" that "growth" will happen. You may have just pointlessly hired managers, salespeople, engineers and bought more servers and whatnot, for growth that wasn't going to happen in the first place.

To put it a different way, when the profits don't catch up with spending, was the problem really the budgeting process? Or with the market itself? Or some other factor entirely?

Basically, how do you identify a bad budgeting process in a situation where some entity just externally infused a bunch of capital?

4
jobu 14 hours ago 2 replies      
"If you quadruple your engineering headcount in a year, you will likely have less absolute throughput than if you doubled headcount. As an added bonus, you will burn way more cash."

This is a great point (although I hate the term "headcount"). It takes time to get engineers up to speed in any organization, and it usually cuts into the productive time of current engineers.

5
allochthon 9 hours ago 0 replies      
This is an excellent warning to any startups looking at growing quickly. I am living through such a growth effort, and I can definitely see how the culture is changing for the worse in the company.

Counter to the author's general point, I would add that I think well-defined budgeting exercises that lead to clear targets of the kind described in the article, where people are "made accountable," appear hubristic to me. There are so many unknowns that will come up that even to pretend to be able to meet such goals feels off. I'm waiting for management science to catch up with agile development.

6
coldcode 12 hours ago 0 replies      
I saw where it was going right at the start; I've seen it a lot in various startups I've worked with. Growing a company past certain sizes often winds up failing due to unbridled optimism. Reason goes out the window as growth seems like it will never end; people buy into it and want to expand their role for when it hits the big time. The end is often a mess. Eventually either sanity prevails or it goes bust.
7
dchichkov 8 hours ago 0 replies      
"Well, that's the beauty of the game. It only takes one player to opt in, because once someone starts playing, everybody is going in -- and they are going in hard."

As per Frans de Waal, this is true even for chimps ;)

8
meh_master 9 hours ago 1 reply      
off-topic: why does Ben Horowitz always include rap music and a quote which aren't relevant to the article in question? (other than that he just likes rap)
9
danso 14 hours ago 1 reply      
Not so much a "process" as much as a bad planning strategy, or a company structural deficiency (in which the incentive of "tell me what budget you need" is exacerbated).

When I think of "one bad process" that could ruin any company...I think of something like, stack rankings, or, partially-automated deployment (e.g. http://www.zerohedge.com/news/2013-10-22/how-lose-172222-sec...)

10
Mz 8 hours ago 0 replies      
As a technologist, you know that the worst thing that you can do is over-constrain the problem before you start. You'll kill creativity and prevent yourself from getting a truly great outcome.

Perhaps "artificially constrain" is the real problem. Constraints that are rooted in real world limitations lead to the elegant design that others are commenting on here. Constraints that are kind of made up BS can, in fact, mess things up pretty badly, just as the author suggests.

I mean, if there is solid logic behind the constraints, it helps foster good problem-solving and good design. But constraints that lack that solid logic can very definitely cause things to go awry.

11
michaelochurch 11 hours ago 1 reply      
When I asked my managers what they needed, I unknowingly gamified the budgeting process. The game worked as follows: The objective was for each manager to build the largest organization possible and thereby expand the importance of his function. Through the transitive property of status, he could increase his own importance as well.

This will always happen with closed allocation, no matter how much it is tweaked. It's an inherent property of closed allocation systems that the definition of work is driven by managerial status assertions rather than the needs of the business, which can only be assessed, at the lower levels, organically.

Closed allocation doesn't invariably destroy a company, and it's only in software that open allocation is obviously superior. (An open-allocation nuclear plant may not be the best idea.) Most industrial efforts can tolerate the inefficiencies that come with closed allocation. Software often can't, because software efforts tend to be binary in outcome (most lose, a few are big winners) and closed allocation generally creates enough needless complexity to cripple the company before it really succeeds.

12
AndrewKemendo 13 hours ago 4 replies      
Interesting article for sure and I think it highlights exactly how you get unwanted bloat and how to prevent it. Budgeting has always been a dark art, especially for creative work, and adding smart constraints on the front end is one way to rein things in - however I think this largely applies to the companies who can say the following: "we had plenty of cash in the bank." I know for us our budgeting process is: what is on fire, and can our limited amount of cash put it out?

Changing gears though, I have to say that I was immediately turned off by Ben quoting a rap lyric at the beginning of his article. Not because I don't like rap, I do, but because of the lyric he chose.

I think this one is particularly egregious given that the quote includes the term "nigga" - even though it was of course self censored - which I don't think is particularly appropriate coming from a white man.

Not really a big thing, and it doesn't impact the overall message (which is why I think it only hurts things) but it may strike others wrong too.

10
Cuts and jumpers on a different scale (1989)
53 points by mhb  7 hours ago   4 comments top 2
1
rosser 4 hours ago 2 replies      
Discussion from the last time this was posted (about 1.5 years ago, linked from a 2002 post on JWZ's blog): https://news.ycombinator.com/item?id=4948768
2
nsxwolf 3 hours ago 0 replies      
symbolics.com was the first dot com.
11
IPFS: The Permanent Web
252 points by _prometheus  17 hours ago   45 comments top 16
1
leoc 15 hours ago 2 replies      
So, a named-data networking https://en.wikipedia.org/wiki/Named_data_networking project.

> Named data networking (also content-centric networking, content-based networking, data-oriented networking or information-centric networking) is an alternative approach to the architecture of computer networks. Its founding principle is that a communication network should allow a user to focus on the data he or she needs, rather than having to reference a specific, physical location where that data is to be retrieved from. This stems from the fact that the vast majority of current Internet usage (a "high 90% level of traffic") consists of data being disseminated from a source to a number of users.

(Another idea with a Ted Nelson pedigree, btw.) Van Jacobson's working on another, NSF-funded project in this area http://named-data.net/ at present.

2
fiatjaf 8 hours ago 0 replies      
This is similar to GNUnet[1] in many ways. GNUnet, besides offering anonymity and infinite application-building possibilities, comes with an interesting incentive mechanism for making nodes keep others' files.

There is a project to port GNUnet to the browser going on here[3]: https://github.com/amatus/gnunet-web

[1]: https://gnunet.org/

[2]: https://unhosted.org/decentralize/26/Decentralized-reputatio...

[3]: https://github.com/amatus/gnunet-web

3
sabalaba 10 hours ago 0 replies      
According to a report by EMC, current global storage capacity is around 1.5 zettabytes (1.5 million petabytes) [1]. With Commercial off-the-shelf hardware, storing 1 PB of data carries a fixed cost of around $100,000 USD [2]. Thus, the cost of today's storage capacity is around $150 billion USD. (About what the United States government pays every year in interest [3].) Those numbers are expected to increase an order of magnitude by 2020. Thankfully this does not apply to the interest payments.

With numbers like that, and with cloud storage prices sitting very far away from marginal costs, protocols and incentive structures like IPFS and filecoin are going to be of great value to enterprises, governments, and consumers. I would not be surprised if, by 2020, a majority portion of the data online was stored in such a system; with exponential growth, a new majority share is created every ln(2) / rate years. In the case of the "data universe", which grows about 40% annually, doubling time is about 2 years [1].

[1] http://www.emc.com/leadership/digital-universe/2014iview/exe...

[2] http://www.backblaze.com/petabytes-on-a-budget-how-to-build-...

[3] https://www.cbo.gov/publication/44716
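A quick check of the arithmetic above, using the figures quoted in the comment; the doubling-time line uses the continuous-growth approximation ln(2)/rate:

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        double capacity_pb   = 1.5e6;    /* ~1.5 ZB expressed in petabytes */
        double cost_per_pb   = 1.0e5;    /* ~$100k per petabyte (COTS)     */
        double annual_growth = 0.40;     /* ~40% growth per year           */

        printf("total cost: $%.0f billion\n",
               capacity_pb * cost_per_pb / 1.0e9);   /* ~150            */
        printf("doubling time: %.1f years\n",
               log(2.0) / annual_growth);            /* ~1.7, i.e. ~2   */
        return 0;
    }

(The exact discrete-compounding figure is ln(2)/ln(1.4), about 2.1 years, so "about 2 years" holds either way.)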

4
rsync 14 hours ago 1 reply      
... and you are also the filecoin person, yes ?

If you have some spare time, could you solve that power inverter / solar panel challenge that's on the front page currently ? Thanks in advance.

5
vertex-four 15 hours ago 2 replies      
So the obvious question, after reading that page, is... how do we ensure that if the data identified by X is there now, that data will still be there tomorrow? What incentive is there to host other people's data?
6
anon4 14 hours ago 1 reply      
So what happens when someone uploads illegal content (like torture videos, snuff films, child pornography, the US constitution on some campuses)? Can clients define blocklists for things they don't want to store?
7
jasonjayr 9 hours ago 1 reply      
This sounds a lot like Freenet[0], though w/o the anonymity & plausible deniability guarantees. Freenet shares some of the same features + Limitations:

* Content keys are based on hashes of content, and don't change (unless a new encryption key is used to re-insert a file)

* keys that are not requested frequently will fall off the network automatically.

* Keys can be signed so only the holder of the private key can update said key

[0] https://freenetproject.org/

8
mtdewcmu 14 hours ago 1 reply      

  # a mutable path
  /ipns/my.host.com/some/file.txt
This reminds me of AFS, which is old, but seemed pretty sophisticated. I've only personally seen it in use at CMU. This project has much broader aims than AFS, of course, and the resemblance is probably superficial.

9
ilaksh 14 hours ago 1 reply      
Seems like a very important piece of work.

I didn't read very carefully but I don't see many facilities for permissions control other than having a personal folder.

Did I miss that? Do you have plans for permissions support, like groups or read/write access or ACLs etc.?

Or maybe if you just make a simple type of group so that one account could share access to its personal folder with a set of other people/accounts, that would handle most use cases for permissions.

Other than that, seems like this is a great start on solving everyone's problems.

10
wnoise 13 hours ago 1 reply      
Sweet. I loved the promise of SFS and was really sad to see the development on it die.
11
_prometheus 13 hours ago 1 reply      
I should mention: Am hiring, so if you'd like to work on this full or part time, send me an email: juan@ipfs.io
12
thomasvarney723 3 hours ago 0 replies      
Using locations instead of names for data seems like what Rich Hickey means when he talks about Place Oriented Programming.
13
zwegner 14 hours ago 1 reply      
Looks fairly similar to PPSPP: https://datatracker.ietf.org/doc/draft-ietf-ppsp-peer-protoc...

Both look pretty cool, I'd like to see some more traction in this area.

14
c54 12 hours ago 2 replies      
Correct me if I'm misunderstanding, but this is only beneficial for static content, correct? For instance, if I want to load facebook.com, I still need to hit Facebook's servers directly (rather than the IPFS mesh). How does IPFS know if I'm requesting dynamic content?
15
defen 15 hours ago 1 reply      
Sounds like a workable version of the Xanadu project. Is there any connection there or ideas from that being used?
16
reitanqild 15 hours ago 1 reply      
This has the potential to be seriously interesting!
12
An open competition to build a smaller power inverter, with a $1M prize
243 points by ismavis  16 hours ago   110 comments top 20
1
jeremymcanally 16 hours ago 4 replies      
At first I was like "Why would you do that for only $1m? If you had that big of a breakthrough, you could easily generate that (and then a lot more) by selling it yourself." Then I read that they aren't taking the IP, and are just giving you the cash as a pure incentive. They can publish your high level approach documents, but you still own the invention.

I wish more of these contests were run that way. I think they'd yield much higher quality and more differentiated results with a lot more entrants.

2
ChuckMcM 16 hours ago 3 replies      
I saw this earlier and briefly considered it. 50W/inch^3 is soldering iron level heat dissipation. And my take on it was that it really isn't possible unless you can cheat and have the "inverter" be the thing on the end of a solid copper bar that is sitting in ice water on the other end :-). So really they are looking for a 10X improvement in efficiency. Which is to say to take something which is 90% efficient and make it 99% efficient. Even looking at the wide bandgap semiconductors they reference on the web site I'm having a hard time getting more than a few percentage points more efficient.
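Rough numbers behind that last point, assuming a 2 kW-class inverter (the 2 kW rating is an assumption for illustration, not taken from the comment): going from 90% to 99% efficiency cuts the heat that has to escape the box by roughly a factor of ten.

    #include <stdio.h>

    int main(void)
    {
        double p_out   = 2000.0;   /* assumed output power, W (assumption)  */
        double density = 50.0;     /* challenge target, W per cubic inch    */

        double volume = p_out / density;         /* 40 in^3 enclosure       */
        double loss90 = p_out / 0.90 - p_out;    /* ~222 W to dissipate     */
        double loss99 = p_out / 0.99 - p_out;    /* ~20 W to dissipate      */

        printf("enclosure volume: %.0f in^3\n", volume);
        printf("90%% efficient: %.0f W of heat (%.1f W/in^3)\n",
               loss90, loss90 / volume);
        printf("99%% efficient: %.0f W of heat (%.1f W/in^3)\n",
               loss99, loss99 / volume);
        return 0;
    }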
3
zw123456 8 hours ago 2 replies      
Wouldn't it be more efficient to simply drive all our electronic equipment directly off DC? Almost all electronic devices nowadays run off 5VDC (USB) or 12VDC, solar panels put out 12VDC, and all those conversions seem like a waste of energy. What if you just run our big appliances off 120VAC and run all our small stuff off of solar directly, along with a battery backup? It seems like if a new wiring standard were developed that had both AC and DC distribution it would greatly reduce the cost of installing solar. In fact, I believe it would be possible to put a DC bias on top of the AC (similar to the way old-time phone lines work). Just a thought: rather than shrinking the inverter, think outside the box and get rid of the inverter altogether.
4
pkulak 5 hours ago 0 replies      
Seems like we just need to switch to DC already. 60hz AC is good for resistance heat and... that's about it. We are at a point now where we're creating DC on our roofs, turning it into AC to go through the walls of our house, then turning it right back into DC to charge our cars and power our other electronics. With losses and expensive hardware at each step.
5
anigbrowl 13 hours ago 4 replies      
2. ELIGIBILITY: To be eligible to enter the Contest, you must be: (1) above the age of majority in the country, state, province or jurisdiction of residence (or at least twenty years old in Taiwan) at the time of entry (2) not a resident of Italy, Brazil, Quebec, Cuba, Iran, Syria, North Korea, or Sudan (3) not a person or entity under U.S. export controls or sanctions and (4) have access to the Internet as of July 22, 2014

I wonder why Italy, Brazil, and Quebec are included. The other countries are under special sanctions regimes already but I can't think of a good reason to exclude these three or why the contest would be considered illegal there.

6
swamp40 11 hours ago 4 replies      
Why is there any need to make it much smaller than the solar panels that will be providing the power?

I can see where ultra-thin (and flexible) would be a benefit, but why not allow the electronics to spread out over the entire area of the solar panels? The space is being used up already.

That gets rid of the super high power density problem.

The sun delivers about 1KW per square meter, so even if the solar panels were 100% efficient, you'd have an entire square meter of room for a 1KW inverter.

7
otterley 15 hours ago 0 replies      
This story was previously discussed on HN: https://news.ycombinator.com/item?id=7730042
8
lutorm 16 hours ago 2 replies      
Why does inverter size matter? The inverter is already smaller than the battery or PV panel components, so it's not immediately obvious to me what groundbreaking new applications will be possible with an even smaller one.
9
phkahler 14 hours ago 1 reply      
Already done. When I worked in EVs we used a HybridPack2 power module from Infineon. It's about the size of a sandwich but longer, and skinnier. You add a driver board, a logic board, a capacitor, connectors, cold plate. It's about the size of a shoe box and can deliver 100kW continuously. I pushed one under ideal conditions to 200kW.

Of course, liquid cooling means a total system that is quite a bit larger than I describe. In order to get rid of liquid cooling at that power level you'd have to get the losses down by a huge margin. We were dissipating 2-3kW at high power, so for air cooling you'd need to get that down by a factor of at least 10. The only way to drive the heat down like that is at the semiconductor device level.

This is a challenge that everyone in the field is already aware of and working on, while people outside the field have no ability to do meaningful research.

At the small scale, an Arduino with the mega-moto shield can push some hundreds of watts in a few cubic inches. So what exactly is the challenge?

10
iandanforth 16 hours ago 2 replies      
Anyone know why this is particularly hard?
11
nsajko 9 hours ago 0 replies      
But a lot of devices convert to DC internally! Would it be hard to dispose of that redundancy? It seems to me there'd be less need for a power inverter that way.

EDIT - some semi-relevant discussions:

https://news.ycombinator.com/item?id=7730205 - in the past thread, reasoning about usage of AC vs. DC

https://news.ycombinator.com/item?id=8071524

https://news.ycombinator.com/item?id=8071670 - DC vs. AC
12
54mf 14 hours ago 3 replies      
I'm just tickled by how many commenters have a totally obvious solution to this problem. Surely, the hundreds (thousands?) of experts at Google, the IEEE, and the ~8 manufacturers who put this contest together are just fools who couldn't come up with such amazing, brilliant ideas themselves.

Congrats in advance, and enjoy your million bucks!

13
mmanfrin 10 hours ago 1 reply      
Could someone explain, in layman's terms, what the difficulty in building a smaller inverter is? I unfortunately paid less attention in high school Physics than I wish I had.
14
gooseyard 13 hours ago 1 reply      
Assuming one was starting from scratch and didn't care whether the available appliances of the day required AC or DC power, but all the power coming into the home was solar, what would the motors on the appliances look like? Would it still be desirable to use AC motors, and if not, would it be practical (other than for the obvious reasons) for appliances to use a standard DC voltage?

I don't mean to suggest that we abandon ac powered appliances, I'm just curious about what electrical wizards would come up with, if they were doing it all over again.

15
jessaustin 15 hours ago 3 replies      
Anyone have any ideas why they highlight only "wide bandgap device manufacturers"? I'm hope they'd accept a winning solution with different tech, but surely there are other possibilities they could mention right at the start?
16
elsewhen 15 hours ago 2 replies      
does anyone know why the list of countries that this contest is blocked from are:

"ITALY, BRAZIL, QUEBEC, CUBA, IRAN, SYRIA, NORTH KOREA, AND SUDAN.[1]"

aren't the first three places strange to see on that list?

[1] https://www.littleboxchallenge.com/pdf/LBC-TermsAndCondition...

17
dskhatri 10 hours ago 0 replies      
There are all sorts of specifications/requirements listed (box size, ripple allowed, EMI limits), but the most interesting one, which is not mentioned, is cost. There is no upper limit set on the BOM cost.
18
nsxwolf 13 hours ago 2 replies      
What's the difference between a picnic cooler sized inverter, and the one in my Jeep, which is nowhere near the size of a picnic cooler?
19
imranq 10 hours ago 1 reply      
Didn't FINsix solve this problem? http://finsix.com/dart/
20
pp19dd 15 hours ago 8 replies      
The simplest solution (just give me the $1mil now) is to cut out the middle man. I mean, there is a needless conversion here from DC to A/C, and then back to DC. Not that many devices need A/C these days - maybe just your alarm clock, if it's cheap enough (since cheap alarm clocks use the alternating current frequency for keeping time, instead of a precise resonating crystal.)

Example of what they have now: [solar-dc] -> [inverter] -> [ac/dc transformer] -> [device].

Cut out the inverter, the ac/dc transformer and you have:

[solar-dc] -> [device]

Required materials: wire cutters, cheap voltage regulator IC, some wire. Done. I'll take a cashier's check please.

13
Exploring No Man's Sky, A Computer Game Forged by Algorithms
199 points by Libertatea  15 hours ago   109 comments top 16
1
DanAndersen 14 hours ago 15 replies      
The issue with a lot of procedurally-generated content is that overall things become same-y. Lots of breadth, little depth. When you have a set number of variable sliders, you don't have to see all the possible worlds to get a sense of how many sliders there are and what their ranges are. The problem only gets worse when moving away from planet/landscape generation (dumb matter) and getting into simulations of history or society.

I'd be really interested in learning about any work in the "interestingness" of procedural content. Is it possible to quantify that sort of emergent complexity that happens in life, or in a work or story with artistic direction that takes on a life of its own? Can a system make a billion compelling and unique stories without solving the issue of Strong AI and just making a superintelligent DM?
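A toy sketch of the "sliders" point above (all parameter names and ranges are hypothetical): a procedurally generated world is ultimately just a seed mapped onto a fixed set of ranges, so the breadth of variety is bounded by how many knobs there are.

    #include <stdio.h>
    #include <stdlib.h>

    /* A hypothetical planet is nothing but a handful of "sliders". */
    struct planet {
        double gravity;      /* 0.2 .. 3.0 g             */
        double temperature;  /* -100 .. 100 C            */
        double vegetation;   /* 0 .. 1                   */
        int    palette;      /* one of 16 color schemes  */
    };

    static double slider(double lo, double hi)
    {
        return lo + (hi - lo) * (rand() / (double)RAND_MAX);
    }

    struct planet generate(unsigned seed)
    {
        struct planet p;
        srand(seed);                     /* same seed, same planet */
        p.gravity     = slider(0.2, 3.0);
        p.temperature = slider(-100.0, 100.0);
        p.vegetation  = slider(0.0, 1.0);
        p.palette     = rand() % 16;
        return p;
    }

    int main(void)
    {
        struct planet p = generate(42u);
        printf("g=%.2f t=%.0f veg=%.2f palette=%d\n",
               p.gravity, p.temperature, p.vegetation, p.palette);
        return 0;
    }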

2
gavanwoolery 14 hours ago 4 replies      
Looks like it could be a good game either way, but I'm questioning their use of words when they say "every atom" is procedurally-generated, as in their first reveal trailer: http://www.youtube.com/watch?v=U6fCn8oB-sg

From what they have demoed, they have not proven that anything actually is being generated. My best guess is that they randomly vary colors and textures (and generate the atmospheres, as they say), and randomly place models about the planet, which is a far cry from actually generating all of the models and animation for everything. A lot of it just looks modeled by an artist (and they do have an artist on their team). Anyhow, I don't think this will necessarily detract from the game, just a nitpick on choice of words...

An additional concern is how quickly this game came out of nowhere (I think?). The Inovae (formerly Infinity) engine took many years to develop - it was one of the first of its sort to allow seamless planet to space transitions with tons of terrain detail. If you look in close detail at the differences between the two, you can tell that No Man's sky does not have a very good sense of scale when flying into a planet (unless the planet were incredibly tiny). Also, note the angle of the fly-in is straight on; its much harder to prevent detail popping when flying in tangentially to the planet and I wonder if they have yet addressed this. Again, just another nitpick. :)

No Man's Sky (2:00 in for the planet fly in): http://www.youtube.com/watch?v=nLtmEjqzg7M

Inovae/Infinity: http://www.inovaestudios.com and http://www.youtube.com/watch?v=a6a69dMLb_k

3
protonpopsicle 14 hours ago 2 replies      
None of these articles on No Man's Sky ever mention Noctis, which is really strange. That's clearly a precedent here worthy of mention. http://en.wikipedia.org/wiki/Noctis
4
fisher-lebo 14 hours ago 6 replies      
I get the gist of how Minecraft is constructed: just a bunch of blocks with algorithms which hide certain blocks when they aren't in the player's view.
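A toy sketch of that hiding idea (all names hypothetical): a cube face only needs to be drawn when the cell next to it is empty, which eliminates the vast majority of faces inside a solid volume.

    /* Toy version of "hide blocks you can't see": a face is only worth
     * drawing if the neighbouring cell is air. */
    typedef int (*is_air_fn)(int x, int y, int z);

    int face_visible(is_air_fn is_air, int x, int y, int z,
                     int dx, int dy, int dz)
    {
        /* dx,dy,dz is the outward normal of the face being tested. */
        return is_air(x + dx, y + dy, z + dz);
    }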

Are there any articles about how an engine like this is built? It is an extreme level of procedural generation (apparently), but the graphics aren't blocky and abstract, but rather pretty impressive.

In a similar vein is Eskil Steenberg's Love [1], but this is on another level entirely.

[1] http://www.quelsolaar.com/love/index.html

5
ChuckMcM 11 hours ago 1 reply      
I'm having a partial flashback here and it's killing me. Back in the 90s there was a guy who was building this exact same vision, "Space Galaxy" or something, where every planet was unique, every alien race unique, every solar system, full planet-to-planet seamlessness. Derek somebody? Generally considered impossible and derided by the PC gaming press corps.

I agree with most that you need a certain dimensionality to the generation in order to get something credible. Definitely going to check this one out when it comes out.

6
jimmcslim 11 hours ago 1 reply      
"The tens of millions of planets that comprise the universe are all unique."

We will probably never know, but I wonder if the reality of our own universe is that any life on other habitable planets will look remarkably similar to that here on Earth, given that the parameters within which a sustainable biosphere is possible are quite narrow. There might be some local variation but nothing that would be particularly mind-boggling to biologists (e.g. non-carbon lifeforms).

I think Brian Greene suggested that in a sufficiently large (i.e. approaching infinite) universe, if you travel far enough you will eventually encounter a multitude of 'Earths' that are the same as our own but subtly different in various ways.

In which case, if it's a criticism of procedurally-generated worlds that they tend to have a lot of repetition, then the models are probably quite accurate!

7
mladenkovacevic 13 hours ago 1 reply      
Another space game coming out this year that will heavily use procedural generation to re-create the Milky Way galaxy is Elite: Dangerous. I'm in the beta and it already looks great despite many features not yet being activated for the beta players.
8
thisjepisje 13 hours ago 1 reply      
Now what would be really interesting is to have some sort of rudimentary evolution in a game like this. Star systems, planet geology, flora, fauna, everything.
9
coldcode 12 hours ago 1 reply      
I was excited to read about it until I saw it's only for a console. I've wanted to build a unique "park" world with similar ideas for a long time, but never had the time or resources. I guess you have to start somewhere, but doing this for a single console (PS4) is not what I would want to work with. Outerra is the engine that has taken real-world geometry combined with procedural geometry in the direction I wanted to go, and it is built for PC/Mac when it's complete.
10
shmerl 8 hours ago 2 replies      
PS4 only, really? Why can't developers release cross platform games these days? Releasing it for one platform prevents a significant amount of potential users from even trying the game.

I'd play it when they'll release a Linux version.

11
prawn 8 hours ago 1 reply      
I read something on Reddit the other day which goes some way towards capturing my expectation about NMS. It's not particularly succinct.

http://www.reddit.com/r/truegaming/comments/28cddf/im_worrie...

That is, procedurally-generated worlds are one thing, but an engaging environment (Minecraft) or story or purpose are other things entirely.

I have a flat game concept involving procedurally-generated content but just can't think of what might tip it over into super-engaging territory.

12
djent 14 hours ago 1 reply      
Disappointing article. There's no discussion of the algorithms the game uses to generate content.
13
tiglionabbit 13 hours ago 0 replies      
I'm very impressed that you can fly from the surface of one planet to another seamlessly. Sure the science of partitioning space and changing levels of detail based on distance isn't that difficult, but it is rarely implemented.
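The distance-based level-of-detail selection itself is conceptually simple; a sketch in C (the thresholds are made up) - the hard part is making the transitions seamless:

    #include <math.h>

    /* Pick a mesh detail level from the camera-to-object offset:
     * 0 = full detail, higher numbers = cheaper representations. */
    int lod_level(double dx, double dy, double dz)
    {
        double d = sqrt(dx * dx + dy * dy + dz * dz);

        if (d < 1.0e3)  return 0;   /* near: full mesh          */
        if (d < 1.0e5)  return 1;   /* mid: reduced mesh        */
        if (d < 1.0e7)  return 2;   /* far: billboard/impostor  */
        return 3;                   /* farther: a single point  */
    }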
14
mratzloff 14 hours ago 1 reply      
Holy shit, I guess I'm getting a PS4.

I suspect--or hope, at least--that they've built meshes that can be combined in a myriad of ways, so by combining hundreds or thousands of parts you can get millions of animal or plant types.

15
anigbrowl 13 hours ago 2 replies      
I was impressed by the demo until I saw the very flat explosions that just faded out quickly, which took me out of the illusion immediately. Perhaps that's just a pre-production thing. Impressive product, even if the article is sort of gushy and breathless... when I was playing Elite back in the 1980s, the planets were just a circle with a smaller circle rotating on it for perspective XD
16
lotsofmangos 13 hours ago 1 reply      
To be honest, I thought our sun would be around a lot longer.

If you were to visit one virtual planet every second, he says, then our own sun will have died before you'd have seen them all.

...

"The tens of millions of planets that comprise the universe are all unique."

14
Transit A format for conveying values between different languages
254 points by _halgari  18 hours ago   89 comments top 26
1
haberman 15 hours ago 6 replies      
I really think the future is schema-based.

The evolution of technologies goes something like this:

1. Generation 1 is statically typed / schemaful because it's principled and offers performance benefits.

2. Everyone recoils in horror at how complicated and over-designed generation 1 is. Generation 2 is dynamically typed / schemaless, and conventional wisdom becomes that this is generally more programmer-friendly.

3. The drawbacks of schemaless become more clear (annoying runtime errors, misspelled field names, harder to statically analyze the program/system/etc). Meanwhile the static typing people have figured out how to offer the benefits of static typing without making it feel so complicated.

We see this with programming languages:

1. C++

2. Ruby/Python/PHP/etc.

3. Swift, Dart, Go, Rust to some extent, as well as the general trend of inferred types and optional type annotations

Or messaging formats:

1. CORBA, ASN.1, XML Schema, SOAP

2. JSON

3. Protocol Buffers, Cap'n Proto, Avro, Thrift

Or databases:

1. SQL

2. NoSQL

3. well, sort of a return to SQL to some extent, it wasn't that bad to begin with given the right tooling.

If you are allergic to the idea of schemas, I would be curious to ask:

1. isn't most of your data "de facto" schemaful anyway? Like when you send an API call with JSON, isn't there a standard set of keys that the server is expecting? Isn't it nicer to actually write down this set of keys and their expected types in a way that a machine can understand, instead of it just being documentation on a web page?

2. Is it the schema itself that you are opposed to, or the pain that clunky schema-based technologies have imposed on you? If importing your schema types was as simple as importing any other library function in your native language, are you still opposed to it?
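A tiny sketch of what "writing it down" can look like in practice -- this is just an illustration in Python with a made-up payload shape, not a recommendation of any particular schema tool:

    from dataclasses import dataclass

    @dataclass
    class CreateUserRequest:
        # The keys the server was implicitly expecting all along.
        email: str
        age: int

        @classmethod
        def from_dict(cls, payload: dict) -> "CreateUserRequest":
            # A misspelled or missing key fails loudly here, at the boundary,
            # instead of as a vague error deep inside the handler.
            try:
                return cls(email=str(payload["email"]), age=int(payload["age"]))
            except KeyError as missing:
                raise ValueError("missing field: %s" % missing) from missing

    req = CreateUserRequest.from_dict({"email": "a@example.com", "age": 42})

The same information that would otherwise live only in API docs now lives somewhere a machine can check.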

2
lnmx 17 hours ago 0 replies      
So, EDN [1] is a formalization of Clojure data-literal syntax that includes tagged types, has a text representation, and no built-in caching.

Fressian [2] supports the same types and extensibility as EDN, has a compact binary encoding, and the serializer/writer can choose its own caching strategy (so-called domain-specific caching[3]). I believe it was created to provide a serialization format for Datomic.

Transit sounds like an evolution of EDN and Fressian: make the bottom layer pluggable to support human-readable/browser-friendly JSON or use the well-established msgpack for compactness. Caching is still there, but it can only be used for keywords/strings/symbols/etc. instead of arbitrary values like Fressian -- probably a good trade-off for simplicity.

[1]: http://edn-format.org
[2]: http://fressian.org
[3]: https://github.com/Datomic/fressian/wiki/Caching
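As a toy illustration of that caching trade-off (the codes and rules here are invented, not the real Transit or Fressian encoding): a writer that only caches repeated map keys is much simpler than one that can cache arbitrary values.

    def encode_with_key_cache(records):
        """Replace repeated string keys with short '^N' references (illustrative only)."""
        cache = {}   # key -> short code, assigned on first occurrence
        out = []
        for record in records:
            encoded = {}
            for key, value in record.items():
                if key in cache:
                    encoded[cache[key]] = value      # later uses: emit the short code
                else:
                    cache[key] = "^%d" % len(cache)  # first use: emit the full key, remember a code
                    encoded[key] = value
            out.append(encoded)
        return out

    rows = [{"district/region": "e", "district/name": "East"},
            {"district/region": "w", "district/name": "West"}]
    print(encode_with_key_cache(rows))
    # [{'district/region': 'e', 'district/name': 'East'}, {'^0': 'w', '^1': 'West'}]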

3
nimish 16 hours ago 2 replies      
1. Protobuf
2. Avro
3. Thrift
4. MsgPack
5. CORBA
6. ASN.1
7. Cap'n Proto
8. FlatBuffer

+ whatever internal stuff big software companies have cooked up etc.

What was so special about your use-case that demanded a totally new standard?

I hate to bring up that xkcd but it's actually relevant here.

Is it the higher-level semantics on top that allow abstraction over the underlying serialization format?

The "caching" doesn't seem to be that big of a win where network latency is high and some of the other formats can be directly mmapped. It looks intriguing, but it seems like something that could be added to a versioned binary format that some of the others already provide.

4
sgrove 17 hours ago 2 replies      
One of the big issues we've been struggling with is getting large ClojureScript data structures with tons and tons of structural sharing (think application state history) (1) small enough to transmit to the server and (2) efficient to store.

It sounds like Transit may help with this via its caching etc.? Can someone from Cognitect comment on whether this is a suitable use?

5
fogus 18 hours ago 0 replies      
A tour of the JS implementation is at http://cognitect.github.io/transit-tour/
6
ciniglio 18 hours ago 0 replies      
7
Ixiaus 18 hours ago 3 replies      
Why re-invent the wheel when MessagePack already exists, supports a similar set of types, and has far greater implementation reach?
8
arosequist 15 hours ago 0 replies      
They also released a podcast episode about Transit, which doesn't seem to be mentioned in the blog post:

http://blog.cognitect.com/cognicast/060-tim-ewald

9
tetsuoironman 17 hours ago 0 replies      
This is likely a prelude to how Datomic will support multiple languages outside of the JVM.
10
mahmoudimus 11 hours ago 0 replies      
This comment is not necessarily related to Transit, but any serialization specification -- in XML, we had XSLT which could transform any well-defined input to an XML output and vice versa.

What's the equivalent for JSON/Transit etc? When parsing and validating the correctness of an input, what is the standard protocol for propagating error messages laced with contextual domain information?

The two solutions I've found were:
- use XSLT
- use a domain-specific language

11
antihero 8 hours ago 0 replies      
I think it would be nice if more serialization formats could at least support timezone-aware date/times and deltas, as they are used really really frequently and it's a total pain to have to do a second parse to deserialize them.
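For anyone who hasn't hit this, the "second parse" usually looks something like the following (field names made up), because JSON itself only hands back strings and numbers:

    import json
    from datetime import datetime, timedelta

    raw = '{"event": "deploy", "started_at": "2014-07-23T10:02:01+02:00", "duration_s": 90}'

    doc = json.loads(raw)                                   # first parse: bytes -> dicts/strings
    started = datetime.fromisoformat(doc["started_at"].replace("Z", "+00:00"))  # second parse
    duration = timedelta(seconds=doc["duration_s"])         # deltas travel as bare numbers

    print(started.utcoffset(), started + duration)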
12
rubiquity 18 hours ago 1 reply      
Literally about an hour ago I was browsing around Rich Hickey's Twitter account and the Cognitect website because I thought, "Hey, I haven't heard anything from him/them in a while", and voila! Just like that, this appears.
13
joeevans 16 hours ago 1 reply      
Can anyone explain what this would mean for the day-to-day programmer?
14
bellerocky 17 hours ago 1 reply      
Reminds me of Thrift[1] which is an Apache foundation project started by Facebook and supports more languages. It also is battle tested. I've seen it used in production under heavy load. I don't know if Thrift does the caching or needs to. Data on the wire is already compressible via gzip which should handle repetitive values.

[1] https://thrift.apache.org/
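A quick way to see the point about gzip and repetitive values, using made-up data:

    import gzip, json

    # A thousand records that all share the same verbose keys.
    records = [{"district/region": "east", "district/name": "row-%d" % i} for i in range(1000)]
    wire = json.dumps(records).encode()

    print(len(wire), len(gzip.compress(wire)))  # the repeated keys all but disappear after compression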

15
mijoharas 18 hours ago 2 replies      
I couldn't ask someone to explain the similarities and differences between this and EDN could I? I realize that this seems more aimed at transferring data whereas EDN may have been more targeted at serialization (is that correct? please correct me if I'm misremembering) but I thought they covered similar use cases? (with EDN obviously not including the performance enhancements that transit seems to have).
16
saosebastiao 18 hours ago 0 replies      
It wasn't exactly clear from the post...how is this different from msgpack? Is it just an implementation of more complex data types on top of it?
17
wicknicks 16 hours ago 0 replies      
I couldn't find any documentation about it, but is there any way to achieve forward/backward compatibility with Transit?
18
iheart2code 18 hours ago 5 replies      
As neat as this sounds, I would prefer to do a little extra parsing by hand in exchange for the readability of JSON. Looking at some of Transit's examples, it seems like it would be difficult to gain as complete an understanding of a set of information at a glance.
19
bhouston 18 hours ago 1 reply      
Can this be used for saving state to disk when you don't want to use a database?
20
peterkelly 17 hours ago 0 replies      
It looks promising, but is there a mapping for XML? I would recommend adding this (as an optional profile) in a future version of the spec. It would help interoperability with legacy (non-Transit based) systems.
21
squigs25 17 hours ago 0 replies      
Cool, now just add a TransitSchema package for every scripting language and we can use it in place of protocol buffers
22
ciroduran 18 hours ago 0 replies      
Obligatory XKCD standards post - http://xkcd.com/927/
23
EGreg 12 hours ago 0 replies      
I think this is very helpful to keep in mind... consumers pushing their demands to producers... and eliminating waste and inefficiency.

http://en.wikipedia.org/wiki/Lean_manufacturing

24
c4pt0r 7 hours ago 0 replies      
something like BSON?
25
zcam 18 hours ago 1 reply      
Underwhelming... From the teasers it seemed like it could be something actually novel.
26
ape4 17 hours ago 2 replies      
As easy as:

[["^ ","~:district/region","~:region/e","~:db/id",["^ ","~:idx",-1000001,"~:part","~:db.part/user"],"~:district/name","East"],["^ ","^2",["^ ...

15
The Sniper Attack: Anonymously Deanonymizing and Disabling the Tor Network [pdf]
57 points by newaccountfool  9 hours ago   4 comments top 3
1
finnn 6 hours ago 1 reply      
Am I correct in assuming this is what was pulled from Black Hat?

EDIT: According to someone on reddit, it's been patched, and the Black Hat one sounded like it hadn't been. http://www.reddit.com/r/netsec/comments/2bf9fl/the_sniper_at...

2
s_q_b 5 hours ago 0 replies      
It's been patched. Here's the Tor Project's take: https://blog.torproject.org/blog/new-tor-denial-service-atta...
3
lucb1e 3 hours ago 0 replies      
On his blog it reads:

> 21st Symposium on Network and Distributed System Security (NDSS 2014)

So nothing to do with Black Hat which I thought it was until I saw the comments here. Misupvoted...

16
Ethereum Genesis Sale
72 points by MichaelAO  9 hours ago   37 comments top 12
1
Cyther606 5 hours ago 1 reply      
Ethereum launch in review:

- Investment Prospectus: https://www.ethereum.org/pdfs/TermsAndConditionsOfTheEthereu...

- Premine Part I: Anyone can buy ETH in exchange for BTC for the next 42 days

- Premine Part II: On top of Premine Part I, +10% of the total ETH allocated during Premine Part I will be distributed to "early contributors"

- Premine Part III: On top of Premine Part I and Premine Part II, +10% of the total ETH allocated during Premine Part I will be distributed to the Ethereum Foundation

- The supply of ETH is uncapped and inflationary at a rate of +25% per year

- The premine is being conducted by "EthSuisse", a Swiss entity which will be promptly dissolved after the premining period ends. The Ethereum team makes no guarantee that development of Ethereum will continue after the dissolution of "EthSuisse"

- Regardless of how many BTCs are raised during the premine period, ~4000 BTC is explicitly reserved to pay for "Expenses incurred prior to and related to Genesis Sale". Translation: they are pocketing the first 4000-5000 BTC

> The Ethereum Platform is being developed primarily by a volunteer contributor team - many of whom will be receiving gifts of ETH in acknowledgement of their dedication - and will continue to be developed on a volunteer basis by some developers as well as under a more formalized contracting or employment relationship for other developers. The group of developers and other personnel that is now, or will be, employed by, or contracted with, Ethereum Switzerland GmbH ("EthSuisse") is termed the "Ethereum Team." EthSuisse will be liquidated shortly after creation of genesis block, and EthSuisse anticipates (but does not guarantee) that after it is dissolved the Ethereum Platform will continue to be developed by persons and entities who support Ethereum, including both volunteers and developers who are paid by nonprofit entities interested in supporting the Ethereum Platform.

2
panarky 8 hours ago 2 replies      
I generally think that pre-mined cryptocurrencies are bad, just a lottery for a small group of insiders.

And this ethereum sale is pre-pre-mined ... pay now for the chance to get pre-mined currency later.

I'm intrigued by the concept, though. Decentralized storage, compute, scripting and secure exchange is an ideal platform for autonomous corporations.

3
lukifer 8 hours ago 0 replies      
> Also, please note that before the release of the Genesis Block in the winter of 2014/2015, ether will not be usable in any way, in fact it won't technically be created until that point in time.

This doesn't exactly instill confidence. Buyers are purchasing an IOU, which is the opposite of a trustless, distributed system. (At least they're up-front about it.)

4
viach 1 hour ago 1 reply      
Sometimes, reading new crypto news, I think that all my attempts to build some real software for people are just worthless. All I need to do is stay at home for 6 months and roll out another $coin$ to participate in this gold rush. Oh ok.
5
radicalbyte 5 hours ago 2 replies      
Is this the western/hipster equivalent of a 419 scam?
6
faizdev 8 hours ago 0 replies      
If you have time to read up on the technicals, the following introductory paper (dubbed 'the yellow paper') is highly informative with regards to the technology and related work Ethereum is based upon: http://gavwood.com/Paper.pdf
7
programmarchy 6 hours ago 3 replies      
Introducing Freethereum: a future fork of Ethereum with its own Genesis Block. Please note that before the release of the Freethereum Genesis Block, freether will not be usable in any way, in fact it won't technically be created until that point in time.

Buy Now: 1 BTC = 9999 FREE

Exodus to 1MAHTNYgv6di77RVcCU7vXtA7FgYEosb8H!

8
wyager 8 hours ago 2 replies      
I don't see this panning out well. Between the wonky incentive scheme, the lack of usefulness, and the mishaps with broken hash algorithms and stuff, there is not a lot of promising stuff going on here.

Is there something I'm missing? Does Ethereum actually let us do anything new, or better than we can do it now?

9
StephenGL 8 hours ago 2 replies      
Ethereum has all the trappings of a cult (have you seen some of the nuball videos they have posted) and cults aren't known for having it all together.
10
dinkumthinkum 5 hours ago 1 reply      
OK, I have followed some of the musings of the cryptocurrency crowd, though not in tremendous depth. I watched the video, and I'm not looking for a tl;dr per se -- just, what is it that I just watched?

I feel like I just watched yet another solution to all the world's problems or something.

Now, when he says "users can be rewarded with tokens of value into the startups that they invest in" .... Now, in my world, this is not a novel concept. This is called "money"; Wall Street does this every day. Just saying ...

11
javert 7 hours ago 1 reply      
I was pretty excited about Ethereum when I first heard about it, but now, it looks like there is no fixed cap on the total amount of ether, which I consider to be a major fault.

What's stopping people from forking and making a "version" with a fixed cap, like bitcoin?

12
ps4fanboy 8 hours ago 2 replies      
Does anyone have a TLDR on this?
17
Logs Are Streams, Not Files (2011)
31 points by fbuilesv  6 hours ago   7 comments top 4
1
colmmacc 4 hours ago 0 replies      
From the article:

> a better conceptual model is to treat logs as time-ordered streams

At scale it's probably better still to re-think logs as weakly-ordered lossy streams. One form of weak-ordering is the inevitable jitter that comes with having multiple processes, threads or machines; without some kind of global lock (which would be impactful to performance) it stops being possible to have a true before/after relationship between individual log entries.

Another form of weak ordering is that it's very common for log entries to be recorded only at the end of an operation, irrespective of its duration; so a single instantaneous entry really represents a time-span of activity with all sorts of fuzzy before/after/concurrent-to relationships to other entries.

But maybe the most overlooked kind of weak ordering is one that is rarely found in logging systems, but is highly desirable: log streams should ideally be processed in LIFO order. If you're building some kind of analytical or visualisation system or near real-time processor for log data, you care most about "now". Inevitably there are processing queues and batches and so on to deal with; but practically every logging system just orders the entries by "time" and handles those queues as FIFO. If a backlog arises; you must wait for the old data to process before seeing the new. Change these queues and batching systems to LIFOs and you get really powerful behavior; recent data always takes priority but you can still backfill historical gaps. Unix files are particularly poorly suited to this pattern though - even though a stack is a simple data-structure, it's not something that you can easily emulate with a file-system and command line tools.
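A minimal sketch of that LIFO behaviour (batch format invented for the example): new arrivals always go to the front, so fresh data is processed first and the backlog gets backfilled afterwards.

    from collections import deque

    backlog = deque()                 # left end = most recently arrived

    def arrive(batch):
        backlog.appendleft(batch)     # new batches jump the queue...

    def process_next():
        if backlog:
            batch = backlog.popleft() # ...and are also taken from the front (LIFO)
            print("processing", batch["ts"], len(batch["lines"]), "lines")

    arrive({"ts": "10:00", "lines": ["old entry"] * 500})    # backlog from an outage
    arrive({"ts": "10:05", "lines": ["fresh entry"] * 20})   # what the dashboard cares about now
    process_next()   # handles 10:05 first
    process_next()   # then backfills 10:00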

2
philsnow 5 hours ago 1 reply      
sink/drain, not source/sink ? Does anybody use "sink" to mean the place where stuff comes out of (from a particular system's perspective) rather than the place where stuff goes ?
3
farva 5 hours ago 1 reply      
It's not like there's really a difference between the two, under *nix.
4
antocv 2 hours ago 0 replies      
Files are streams.
18
Groundbreaking Operating System Is Named an IEEE Milestone
57 points by bane  9 hours ago   8 comments top 5
1
weland 3 hours ago 1 reply      
It's rather unfortunate that this comes so late. Gary Kildall received far less recognition than he was due during his lifetime, and a story -- by all appearances, unfounded -- regarding his sloppy handling of the IBM deal was pushed, mainly for PR reasons, by a lot of tech "journalists" along with the likes of Bill Gates himself. Perhaps even more ironically -- considering IEEE's distinction now -- is that his own alma mater treated him rather unceremoniously, famously inviting him to attend their CS program anniversary in 1992, but asking Gates, a dropout from a rival university, to deliver the keynote speech. He apparently lapsed into alcoholism later in his life, which ended with a motorcycle accident.

I certainly applaud IEEE's initiative, but I cannot shake off that feeling of "too little, too late".

2
pjmorris 6 hours ago 0 replies      
I had a Sanyo CP/M machine in the early 80's. I couldn't believe how fast Turbo Pascal ran compared with the Pascal compiler on the DEC VAX 11/780's we had at school.
3
brandonmenc 8 hours ago 1 reply      
I loved watching Kildall co-host Computer Chronicles. It's a shame he died so young.
4
smoyer 8 hours ago 0 replies      
My first real experience on a computer with an operating system (that didn't just boot basic) was using CP/M on Z80-based computers. The CPU portion of the computer was huge by today's standards, but since the predominant floppy drives were all eight inches across (and only 180KB) you needed a lot of space.
5
idibidiart 5 hours ago 0 replies      
Kildall was the real deal, IMO.
19
I made a patch for Mozilla, and you can do it too
292 points by martius  21 hours ago   33 comments top 9
1
paulrouget 20 hours ago 3 replies      
I've been working with new contributors for some years now. Here is some advice:

- find a mentor. It will help a lot (use http://www.joshmatthews.net/bugsahoy/);
- for the first bugs, writing code is not usually the hard part. Understanding bugzilla and writing tests are;
- ask for commit access level 1 early to have access to the try servers (to run the tests);
- finish what you start. A bug is fixed when it lands, not when a patch is attached;
- mozilla hackers are nice people, ask questions on IRC. There are no stupid questions (we all started from zero too);

Everything will get much easier after the first bug fix. And motivation grows a lot once you have finally landed some code :)

2
vasi 18 hours ago 1 reply      
I did it! https://bugzilla.mozilla.org/show_bug.cgi?id=548763

Unfortunately it took quite a long time, the Mozilla process is quite confusing to a newcomer, even one with a lot of open source project experience. I definitely second the recommendation of finding a mentor.

3
Diastro 16 hours ago 2 replies      
In your post you mentioned that a good way to start working with open-source projects is to look for smaller-scale projects to contribute to on GitHub. I've been working hard to make it easier for people to find interesting/smaller projects on GitHub by creating /r/coolgithubprojects and the http://coolgithubprojects.com/ website lately. They're not perfect yet, but anyone who's looking to contribute and find interesting projects will (I hope) find these tools useful. We're working hard to make open source project sharing as easy as possible! My 2 cents.
4
zokier 18 hours ago 3 replies      
I wonder how much of his positive experience is due to the fact that he decided to contribute to Servo instead of one of the "main" Mozilla projects (Gecko, SpiderMonkey, Firefox etc). Less hairy codebase, smaller community, github-based, and probably a lot more suitable low-hanging fruit.
5
gluxon 20 hours ago 0 replies      
I've done this a few times for bugs that have personally annoyed me. Gotta say, it's easy to get started and the reviewers are amazing.
6
jonalmeida 7 hours ago 0 replies      
I think the real takeaway is that this same experience can be applied to any large open source project that seems daunting to newcomers, not just Mozilla.

I had a similar experience with emscripten (yeah I know, Mozilla too) initially. So I start working on a project with the following steps:

- Download the code
- Set up the environment
- Run the test suite (if it exists)
- Play with simple bits of code by hard-coding changes and building to see their effect
- ... sleep?
- Attempt a baby bug first

7
DomingesZ 19 hours ago 2 replies      
What kind of skills do you need to program for Mozilla?
8
barkingllama 19 hours ago 1 reply      
Too bad we can't patch IE.
9
terminado 18 hours ago 3 replies      
...but can I patch the look and feel of the user interface, or are we not permitted to patch superficial things that affect "branding" and user experience?

I liked Firefox 28, not Firefox 29 and up.

20
Norris numbers
28 points by johndcook  6 hours ago   4 comments top 2
1
ExpiredLink 1 hour ago 1 reply      
The article has some valid points. But a person who needs to emphasize his superiority over novice programmers in such a way lacks maturity.
2
habitue 2 hours ago 1 reply      
I don't see how "refuse to add features" allows complexity to scale past 20K. That seems more like a strategy to stay under a given number of LOC, dodging the issue (which isn't to say it's a bad strategy).
21
Bitcoin: the Stripe perspective
776 points by gdb  1 day ago   220 comments top 31
1
napoleond 1 day ago 9 replies      
This is my favourite article about Bitcoin to date, and properly describes one of the main ideas I wish Bitcoin detractors would come around to. Bitcoin has a lot of problems as a unit of account and as a store of value, but that is not primarily what Satoshi was building (https://bitcoin.org/bitcoin.pdf). Bitcoin is, and has always been, a medium of exchange first and foremost. It still has some shortcomings in that regard, but it is the closest thing we have to solving the trust issues of peer-to-peer exchange in a purely technical fashion.

Comparing Bitcoin addresses to the IP layer of the internet is brilliant. Something that the Bitcoin community has been a bit slow to accept is the idea that "peer-to-peer exchange" may be occurring at the corporate level rather than at the individual level for most people--it's hard to imagine a world where that isn't true due to the points outlined in the "Comparison to the card networks" part of this article. However as long as the corporate implementation is done in such a way that anyone could jump in as an individual if they wanted to bear their own risk, then we are still miles ahead of how the traditional financial system currently works ("net neutrality" for money).

2
nlh 1 day ago 3 replies      
This is one of the best posts on the "state of the Bitcoin economy" I've read yet. They nail a few key points that shows they get it in a real-world sense.

* Mass-consumer adoption of Bitcoin is a tough sell in developed countries (USA, etc.)

* Bitcoin the Network may ultimately be more valuable than BTC the currency

* "No chargebacks!" is a pitch to merchants for BTC, not consumers. Consumers like chargebacks & trust.

* BTC the currency may end up being a behind-the-scenes player so long as traditional currencies do their job.

This says to me that Stripe's position is ultimately to be the Visa of Bitcoin or the SWIFT of Bitcoin. And that could indeed be a huge opportunity.

3
aeturnum 1 day ago 4 replies      
>However, Bitcoin has huge potential as a way to transport value. Its surprisingly difficult to move money today, and the experience of paying for something online is just about the only part of the internet that hasnt changed dramatically in the past twenty years.

Based on my very limited understanding, the difficulty in moving money has nothing to do with technical limitations of money (it's not like we lack the technology to transfer dollars electronically) and everything to do with regulation. Does bitcoin only offer a "reset" button for regulation? The restrictions on money transfers exist because stakeholders in the finance system want them there - why wouldn't they implement the same restrictions for *coin?

I suppose you could try to exploit the semi-anonymity of bitcoin to avoid regulation, but that doesn't seem attractive to most businesses.

4
berdario 1 day ago 0 replies      
> There are a few walled gardens with great payment experiences (the App Store, Amazon)

I'm of a different opinion: Amazon is much stricter than other online merchants, and they don't give you meaningful errors when something goes wrong with your billing. When I was back in Europe, I had to rely on a relative's credit card to make a payment, since my debit card was being refused.

Now that I'm travelling in the US, the situation is even worse: they locked my account 2 times already, and they request information that (due to the privacy laws in my home country) can't be easily accessed and returned by my bank.

Apparently there are differences between credit cards, debit cards and ATM cards... but as long as you have enough money/credit in your account, I cannot see why the payment process should be different, and no one has been able to explain it to me yet (an interesting blog post about how things can go wrong when paying in person with a card is this one, btw: https://blog.flameeyes.eu/2014/04/my-time-abroad-chip-n-pin -- it deals with the "Cardholder Verification Method", which I had never even heard of before reading it)

Trying to pay for an order on Lenovo is even worse: they don't accept non-US issued cards (unless it's an American Express) and my bank doesn't let me do a wire transfer to the US online (you have to physically go to their desks and ask for the wire transfer to be sent)

It's mind boggling the amount of manual work and custom e-mails sent back & forth needed for payments. It's almost like being stuck in the early 20th century.

OTOH, the few times that I used bitcoin to pay for something online, the process has been flawless, and due to its utter simplicity I can vaguely understand the whole process of sending money on bitcoin... unlike with systems like banks, paypal and amazon, which are mostly huge black boxes whose source code I cannot even audit.

5
flavor8 1 day ago 3 replies      
It's not quite as simple as that, though. To, for example, send money from the US to Kenya via bitcoin, you need somebody in the US who is willing to sell you their bitcoin in exchange for dollars, and somebody in Kenya who is willing to buy your bitcoin in return for shillings (which are then paid to the destination seller). Creating the technology that makes this relatively transparent is quite doable, but you still need an active bitcoin market in both financial markets. If the general pattern of value transfer is unidirectional (e.g. remittances) then there needs to be a viable flow of bitcoin out of the destination country (i.e. your Kenyan bitcoin purchaser needs to regularly make purchases from foreign markets in bitcoin); that's tricky to establish.

On top of this issue, big banks charge only cents to send large amounts of cash internationally (according to somebody I talked to a couple months ago, Citi charges 10c per international ACH transaction assuming you have $250k on deposit) so the volume in the bitcoin market has to be built from small transactions -- there's no real upside to doing large transactions over bitcoin vs international ACH.

6
VexXtreme 1 day ago 0 replies      
The beautiful thing about bitcoin is that it's an open system, an open currency if you will, that will allow a nice app ecosystem to be built on top of it. That aspect will allow many non-technical people to interact with abstractions that hide away all the complexities (such as Stripe), but it will also allow experts and hackers to still tap into the network directly with their own wallets, nodes, private keys etc.

I think that bitcoin is currently in its early infancy when it comes to user adoption and still has a very long way to go before it reaches its potential. It's not completely off base to compare it with the way the internet was back in the early 90s.

7
nemo1618 1 day ago 6 replies      
It worries me to see people describe Bitcoin as the "IP layer of payments." I have serious doubts about Bitcoin's ability to scale to a global audience. Transactions are too slow, the blockchain is too heavy, etc. I see Bitcoin in a very similar light to IPv4 and JavaScript: a good idea that escaped into the wild too quickly. And so we wind up piling hacks upon hacks to make up for the lack of a solid foundation, and it only gets harder to replace the current standard with a better alternative.
8
jarin 1 day ago 5 replies      
Taking this a step further, when using it as a pure transport medium, the cryptocurrency itself doesn't matter. It could be Bitcoin, Litecoin, Dogecoin, 2304293f20983uf2089j2f3coin, or whatever, as long as it's liquid enough to convert back and forth.

What we really need is a gateway system that will intelligently convert between your local currency -> the cryptocurrency with the best liquidity/exchange rate -> the destination currency.

Interestingly, I could see this leading to automatically generated cryptocurrencies, as various popular cryptocurrencies fall below liquidity, transaction time, and/or exchange rate thresholds. Over time, I'd expect to see some interesting competition and arbitrage between cryptocurrencies going on behind the scenes, all computer controlled.
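A rough sketch of the routing step described above, with entirely made-up quotes: convert via whichever coin gives the best end-to-end rate while clearing a liquidity floor.

    # rate_in: coin per 1 USD; rate_out: KES per 1 coin; liquidity: rough daily USD volume
    quotes = {
        "BTC":  {"rate_in": 0.0016, "rate_out": 55000.0, "liquidity": 2000000},
        "LTC":  {"rate_in": 0.12,   "rate_out": 745.0,   "liquidity": 150000},
        "DOGE": {"rate_in": 4300.0, "rate_out": 0.021,   "liquidity": 9000},
    }

    def best_route(min_liquidity=50000):
        viable = {c: q for c, q in quotes.items() if q["liquidity"] >= min_liquidity}
        # effective KES received per USD sent, via each candidate coin
        return max(viable, key=lambda c: viable[c]["rate_in"] * viable[c]["rate_out"])

    print(best_route())   # "LTC" with these numbers; in practice the answer changes minute to minute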

9
abalone 1 day ago 0 replies      
The last section on trust & protection cuts to the heart of what it will take for general consumer adoption. But it leaves unstated the fundamental tension: the very nature of bitcoin is that of anonymity and finality. Whereas card and even ACH transfers can be reversed, bitcoin cannot.

The essay hints that it will take a central trust provider to regulate and police transactions to control for fraudulent activity. That's true. The key question is whether they'll be able to sustain the promised cost efficiencies of bitcoin by the time they build in this capability.

If the technical differences melt away and it just becomes another competitor to Visa/Mastercard, minus the billions of dollars of marketing and POS infrastructure over decades that have gone into cementing that network, then we really have to scrutinize whether "openness" and "unbundling" present a serious enough benefit to warrant the cost of a consumer global payments network rollout.

And couldn't a central trust provider work against this openness and unbundling? That's the whole point of a central entity, right?

10
tomasien 1 day ago 0 replies      
One thing I'm glad they didn't focus much on is the fees associated with Bitcoin. Too much is made out of the fee reduction of BTC (I think because it's easy for regular folks to understand) - but that's only a major factor for why you're able to develop on top of it. Bitcoin is SO much more than fees, and really the fees themselves aren't "set in stone" - others will charge on top of BTC anyway, so we have no idea what the fees will be eventually. Coinbase charges 1% for interchange, which seems reasonable, and that's just the start.

The card "network" itself wouldn't have to have the fees it has, it's Visa and Mastercard, the banks, and especially Amex that DECIDE to make the fees higher. Another example is ACH, which is essentially free to transact, and in some countries has wide adoption.

Fees should NOT be a major driver in BTC adoption for consumers or businesses. If you want low fees, let's work on ACH (and we are). But Bitcoin has a TON of other benefits, and those are phenomenally well documented here.

11
doctorpangloss 1 day ago 3 replies      
> First, the resulting ecosystem is technologically open. Open ecosystems have a way of getting better much faster than their closed counterpart.

Unless you are specifically in the business of making financial institutions, it would seem that Stripe (and for that matter, nearly every payment provider that a computer can interact with) is open in all the ways that matter to a normal person.

> There are a number of cryptocurrencies which already have gateways baked in at a protocol level (such as Open Transactions and Ripple). However, there are huge network effects in any financial system, and to date these other systems have failed to win the necessary user support.

So you agree with the above -- that as far as history is concerned, just being open doesn't get you adopted. :)

Bitcoin's crazy volatility and speculation drives "user support." It's ironic, because (1) the volatility is what makes Bitcoin unsuitable as a currency from the perspective of everything that matters economically (like your example, value storage) but (2) the volatility makes Bitcoin phenomenally successful as a currency from the perspective of user acquisition.

I don't know why it's so hard to promote transaction systems. Maybe there's some rule out there that no one competes on the basis of transaction fees, and those who do are shut out of the cabal. That's not a conspiracy that I subscribe to. Maybe the minimum transaction fee to make things economically viable is 2.9% of every transaction. So people do make platforms with lower transaction fees, but then they can't afford to tell you about it, or they don't manage to enrich themselves enough to make it worthwhile.

But Bitcoin, it managed to lie and tell us, "the lowest possible transaction fees;" while technically true, you just pay for the economic cost of promoting the platform and enriching its owners through its volatility.

In that sense, Bitcoin has been a terrific failure as a payment system. It's much worse to buy Bitcoin at the wrong time than to pay 2.9% on every transaction. But as a payment platform, as far as venture capitalists are concerned, it's fabulously successful. The early buyers of Bitcoin did enjoy a fabulous return.

User acquisition with negative cost to the platform's owners? Brilliant.

12
pat2man 1 day ago 0 replies      
Stripe is hitting on a bunch of the key points here. The big one is that bitcoin alone is not going to solve all our money problems; it will be the network of companies that build products on top of bitcoin. It's the reason we are seeing so much money going into these companies instead of investors just buying and holding Bitcoin.
13
billyarzt 7 hours ago 1 reply      
Potentially dumb question: if there are 21 million bitcoins (and therefore 21 million bitcoin addresses), would that limit bitcoin's efficiency as a medium of exchange? I.e., wouldn't there be potentially billions of transactions moving through a financial system powered by bitcoin daily? (Apologies in advance for any flawed assumptions.)
14
throwaway2274 1 day ago 1 reply      
One problem with the current system that the article did not mention is that having a central party handle all the global financial transfers gives nation states a single point of leverage.

We saw this very clearly when the US government put pressure on VISA and mastercard to reject donations going to WikiLeaks. I remember watching that whole episode in disgust. With bitcoin, this is pretty much impossible.

15
mfringel 1 day ago 1 reply      
Bitcoin is to payment system as git is to source control system.

It's an interesting toolbox that can be used on its own, but gains a ton of power once tools are built on top of it.

16
jheriko 21 hours ago 0 replies      
What if banks compete and start providing a reasonable quality of service?

It wouldn't be hard... they just don't bother because there is no return on the investment right now.

Bitcoin scares the crap out of me. It looks too much like a criminal's dream for me to have any trust in it and the network yet...

17
flatline 1 day ago 0 replies      
Great article. The only problem I had with it was an IP address being analogous to a bitcoin public address. Stripe.com is vouched for by a certificate authority, so when you go to the address either by typing or by a link, you have some reasonable, if sometimes tenuous, assurance that you are dealing with the real stripe.com and not some typo-squatting fraud or MITM. Using friendly addresses for payment removes the statistical uniqueness of public keys from the equation. As long as the friendly identifier is not the primary one, there is no issue, but I see nothing wrong with establishing identity through copy/pasting a bitcoin address from a known trusted source or scanning a QR code.
18
mcs 1 day ago 2 replies      
What about a voting system? With little keychains given out that only the voting system has the public key for, any third party could validate the votes. "Coins", or voting credits, can be distributed back to voters without transaction fees.
19
Symmetry 1 day ago 0 replies      
They copied my blog post! [1] Ok, that's actually extraordinarily unlikely, and I'm glad that other people think we need to look at Bitcoin through the medium of exchange/unit of account/store of value trichotomy and I'm glad that they seem to be going forward with using bitcoin as the basis for an electronic money system without denominating anything in bitcoins.

[1] http://hopefullyintersting.blogspot.com/2014/07/thoughts-on-...

20
driverdan 1 day ago 0 replies      
This is a very good article and highlights what I also think is Bitcoin's biggest strength, its use as a medium of exchange. Whenever someone asks me to describe why they should be interested in Bitcoin I always focus on using it to transfer money. I use an example of sending money to someone else easily, cheaply, and quickly without using shitty services like Paypal. This resonates more in the USA than Europe since our banking system is terrible and doesn't have easy, free transfers baked in. It also makes a lot of sense for cross border / currency transfers.
21
ChuckMcM 1 day ago 0 replies      
If you had a large enough transaction base where you could simultaneously convert in and out of bitcoin chunks such that the transaction didn't suffer f/x creep (or if it did, it was tolerable via transaction costs) that would be a pretty interesting thing indeed.
22
k-mcgrady 1 day ago 0 replies      
Good piece. If you're reading this expecting something new from Stripe though you'll be disappointed:

>> "So what role will Stripe play here? We already provide Bitcoin acceptance, and we're actively investigating other functionality. We'll have other updates on this front before too long."

23
ahtomski 1 day ago 0 replies      
This post does a very good job of explaining clearly why Bitcoin could be huge in a way that someone not familiar with or excited by the technology could understand. And I think the meta-point here from Stripe is, by the way, "we want to be the next Visa".
24
jtanner 1 day ago 0 replies      
This article is genius, Greg Brockman has figured this out.

Does NameCoin have all the needed features to make this happen right now?

25
webmaven 1 day ago 0 replies      
Money isn't the only relevant unit of account. Even in stable economies, stocks, bonds, and the like are very high friction, and there is a lot of room for disruption and innovation.
26
GmeSalazar 1 day ago 0 replies      
Unrelated question: are there any formalizations of the Bitcoin protocol out there?
27
aburan28 1 day ago 0 replies      
There is too much venture capital money behind bitcoin for it to fail. EDIT: Don't underestimate the stubbornness of investors.
28
jcr 1 day ago 0 replies      
> "what might a Bitcoin that's useful for the mainstream look like?"

Great article, and the extremely familiar line above made me smile.

29
edpichler 1 day ago 0 replies      
Very good article. Great times are coming.
30
freedom123 1 day ago 2 replies      
Bitcoin, no matter how you word it or rationalize it, will always have a root problem and inherent risk... it is backed by nothing. Users today think its usage gives it market value, and it can, but to a limit. Money (USD) originally (the dollar bill) worked because it was backed by gold. A metal that worked because everyone on the planet wants it. That dollar bill was a "check" or agreement stating: this dollar bill represents this much gold, thus the value of money. The USA today does not have gold backing its money -- so as you can see we already have a bitcoin, and the USA will not allow you to compete with its money - enjoy
31
dcc1 1 day ago 4 replies      
I am not sure why anyone would want to accept Bitcoin via Stripe

Same can be accomplished with Bitpay, who take 0% in fees with a $30/month package (1% otherwise). They also pay into your bank the next business day, unlike Stripe. And they are also a lot more open as to the types of businesses they accept.

Or hell one could directly accept bitcoin with bitcoind running locally or using the blockchain.info api and then converting bitcoins with Bitstamp (or just using the bitcoins to buy things, every day more and more places accept them!)

It is great seeing Stripe actually embracing new technologies but imho their current bitcoin "offering" is not great and they are picky as to who they do business with.

edit: ah typical HN fanboyism, vote down anything negative said about Stripe instead of addressing the points raised.

22
Postcards from the post-XSS world
29 points by ShaneWilton  7 hours ago   1 comment top
1
zerker2000 5 hours ago 0 replies      
"For any type of a tag, a new node with a name matching the id parameter of the tag is inserted into the default object scope. In other words, <div id=test> will create a global variable test (of type HTMLDivElement), pointing to the DOM object associated with the tag."Did not know this.
23
The Julia Express [pdf]
69 points by Xcelerate  13 hours ago   2 comments top 2
1
StefanKarpinski 9 hours ago 0 replies      
Nice quickstart guide. Also useful is "learn X in Y minutes" for Julia:

http://learnxinyminutes.com/docs/julia/

2
plg 11 hours ago 0 replies      
LaTeX source code would be nice
24
Rendering Worlds with Two Triangles on the GPU [pdf]
90 points by muyyatin  15 hours ago   22 comments top 9
1
algorias 14 hours ago 1 reply      
Ah, the classic presentation that got me started in the demoscene a couple of years back.

The Google cache version of the pdf doesn't include any images, so I put up a copy over here:

https://dl.dropboxusercontent.com/u/2173295/rwwtt.pdf

2
userbinator 7 hours ago 0 replies      
I think the most elegant thing about this method is that it describes a scene in terms of its basic mathematical 3D objects and transformations on them (list here: http://www.iquilezles.org/www/articles/distfunctions/distfun... ) and then exploits the massive parallelism of the GPU for rendering all the pixels.

Here's a demo of someone playing around with it, complete with a Slisesix-inspired scene: http://www.rpenalva.com/blog/?p=254

This set of slides is also related: http://www.iquilezles.org/www/material/function2009/function...
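For anyone curious what "rendering a scene described by distance functions" boils down to, here is a CPU-side sketch of sphere tracing in Python (the real technique runs this loop per pixel in a fragment shader; the scene here is invented):

    import math

    def scene_sdf(p):
        """Signed distance from p=(x, y, z) to the nearest surface: a sphere union a floor plane."""
        x, y, z = p
        sphere = math.sqrt(x * x + (y - 1.0) ** 2 + (z - 5.0) ** 2) - 1.0  # sphere at (0,1,5), r=1
        floor_plane = y                                                    # plane y = 0
        return min(sphere, floor_plane)                                    # union = min of distances

    def march(origin, direction, max_steps=128, eps=1e-4, max_dist=100.0):
        """Sphere tracing: step along the ray by the distance the SDF guarantees is empty."""
        t = 0.0
        for _ in range(max_steps):
            p = tuple(o + t * d for o, d in zip(origin, direction))
            d = scene_sdf(p)
            if d < eps:
                return t      # hit: distance travelled along the ray
            t += d
            if t > max_dist:
                break
        return None           # miss

    print(march((0.0, 1.0, 0.0), (0.0, 0.0, 1.0)))  # looking straight at the sphere -> 4.0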

3
DanBC 10 hours ago 0 replies      
I love these.

People might enjoy noodling around the Geisswerks pages which have many code snippets around ray tracing; graphic demos; and so on.

http://www.geisswerks.com/

4
hughes 8 hours ago 0 replies      
I was working with distance fields back in 2008, and the idea of inverting the process blew my mind.

I had no idea Iñigo Quilez's image was produced this way and I'm so glad I had the chance to see how it was made.

Thanks for posting!!

5
rogerallen 14 hours ago 1 reply      
And if you want to try things out yourself, iq has created a playground for you here: https://www.shadertoy.com/
6
sp332 14 hours ago 1 reply      
You can download it from here http://www.pouet.net/prod.php?which=51074 It's been updated to run more reliably (edit: on Vista) but I can't find a version that will run on Win7.

Edit: I found a similar one on Shadertoy https://www.shadertoy.com/view/lsf3zr

7
yzzxy 6 hours ago 0 replies      
Is the demoscene a good place to get into graphics programming? The prevalence of older methods leads me to believe one could learn in a similar progression to the graphics gurus of today, moving from simpler old methods with performance and size optimization to modern techniques.
8
DanAndersen 13 hours ago 0 replies      
This is a really impressive presentation -- after looking on from afar at the seemingly magical works of the demoscene, this finally helped me understand a little bit of how the magic happens. I've only got a bit of GLSL experience so far but now I want to learn a lot more.
9
thisjepisje 13 hours ago 3 replies      
Could someone explain to me what the "two triangles that cover the entire screen area" have to do with anything?
25
Prosecutors Are Reading Emails From Inmates to Lawyers
115 points by growlix  7 hours ago   38 comments top 8
1
Teodolfo 2 hours ago 1 reply      
Bar associations should demand that lawyers allow clients encrypted email communication. The adoption problem for encrypted email (all encryption controlled by the client machine using open source software, so Google or whoever doesn't also have access to the plaintext) is so hard because it takes BOTH parties to keep communication secure. If all lawyers had to do this, people would become much more aware and maybe demand their medical and tax information sent over email also be encrypted.
2
jroes 6 hours ago 1 reply      
I was listening to NPR recently and there was an interview with a former Clinton administration official [1] who mentioned almost in passing that prosecutors regularly infringe on the right to privileged conversation between a defendant and their lawyer. Specifically, he mentioned that the room you are given with your lawyer has paper-thin walls that the police and prosecutor folks can hear through easily, and when in jail there is no way to have a real private conversation with your lawyer either.

[1] http://wfae.org/post/webb-hubbell

3
YokoZar 6 hours ago 1 reply      
Wikipedia tells me that both of the judges that ruled emails are unprivileged were born in 1946. The judge who ruled against the government, however, was born in 1955.

I don't think this is a coincidence. I'd be surprised if the former two judges even used email at all.

4
rtpg 6 hours ago 1 reply      
>She seemed to take particular offense at an argument by a prosecutor, F. Turner Buford, who suggested that prosecutors merely wanted to avoid the expense and hassle of having to separate attorney-client emails from other emails sent via Trulincs.

Would it be that hard to specify one e-mail address as a "privileged address" (with a signature from the lawyer attesting to that) and filter out those? It's really surprising how people can go to court and argue such claims.

5
Natsu 6 hours ago 3 replies      
> Especially since he is acting as a public defender in this case -- meaning the government pays him at $125 per hour -- Mr. Fodeman argued that having to arrange an in-person visit or unmonitored phone call for every small question on the case was a waste of money and time.

The juncture of these two factoids struck me as odd.

6
diafygi 6 hours ago 1 reply      
http://youtu.be/WTPimUSIWbI

Attorney-client privilege is one of the biggest casualties of mass surveillance. Earlier this year there was a legal hackathon at Mozilla where I tried to make a product to help with that.

7
RexRollman 6 hours ago 0 replies      
I worry for this country.
8
Canada 6 hours ago 2 replies      
"...budget cuts no longer allow for that, they said."

Is the amount of money being spent on prisons in the United States really shrinking?

Honestly, I have no idea.

26
Microsoft Cloud Growth Drives Strong Fourth-Quarter Results
66 points by theatraine  12 hours ago   20 comments top 5
1
ChuckMcM 11 hours ago 2 replies      
Pretty nice, ouch on the $733M fine by the EU. This bit was what I'm watching:

    > Bing search advertising revenue grew 40%, and U.S.
    > search share grew to 19.2%.
40% growth on Search Advertising revenue against a 23% gain on 'sites revenue' for Google. I wish Microsoft would break out their CPC numbers but expect that is a bit much to ask.

2
pmalynin 10 hours ago 2 replies      
"Strong" means 7% less than last year.

Source: BBC Business

3
jasonkolb 9 hours ago 1 reply      
This is a one-time hit. They're now enforcing licensing very strictly where they didn't before and offering "nicer" terms if they buy some Azure instead of paying those costs retroactively. It'll only work once, and will create a lot of bad karma.
4
sirkneeland 11 hours ago 0 replies      
Good for them.

The Microsoft Mobile division (formerly Nokia Devices & Services) lost $700 million. How much of that is one-time charges related to the ownership change and how much is the continued decline of unit sales?

5
xamlhacker 10 hours ago 0 replies      
Lumia device sales for the period of April 26 to June 30 were 5.8m. Doing a simple extrapolation to the full 3 months gives an estimate of 8m Lumia smartphones in the quarter.
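For reference, the arithmetic behind that extrapolation (assuming a uniform daily run rate): April 26 to June 30 covers 66 of the quarter's 91 days, so 5.8m x 91/66 is roughly 8.0m for the full quarter.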
27
Apple Reports Third Quarter Results
72 points by orrsella  13 hours ago   74 comments top 6
1
untog 12 hours ago 9 replies      
The iPad clearly isn't going away, but it isn't taking over computing in the way everyone predicted either. If the rumours of Apple making a larger iPhone come true that could hurt iPad sales even more.

Aside from anything else that makes me ponder the wisdom of Microsoft's tablet-heavy focus on Windows 8.

2
cmollis 12 hours ago 4 replies      
Why is apple's tax rate lower than mine?
3
smackfu 12 hours ago 0 replies      
OTOH: "Since the Nokia acquisition was completed in April, Microsoft sold 5.8 million Lumia phones and 30.3 million non-Lumia phones. "
4
kostyk 12 hours ago 0 replies      
Wow, it's an improvement from last year. This engine just keeps running.
5
plg 11 hours ago 2 replies      
Ironic that the spreadsheet linked to on the Apple Press Info page (https://www.apple.com/pr/pdf/q3fy14datasum.pdf) is a pdf created not by iWork/Numbers but by ...

...

Excel

doh!

6
jph 12 hours ago 3 replies      
TLDR: "heavily below expectations" -- Telegraph

Live blog commentary: http://www.telegraph.co.uk/technology/apple/10983304/Apple-r...

28
Digital Science invests in writeLaTeX
13 points by freyfogle  41 minutes ago   discuss
29
How GoG.com is growing beyond a back catalog
154 points by danso  22 hours ago   76 comments top 9
1
auxbuss 14 hours ago 1 reply      
It's GoG's ethics that I love. They walk the talk.

I'm no gamer, but I bought The Witcher on Steam with the aim of playing it in a language I'm learning. Seemed like a solid and fun challenge. But Steam's version of The Witcher wouldn't even load on OSX, despite it being sold as such.

GoG, rather kindly, offered a download for folk experiencing this problem, provided you supplied a valid key for the game. My complaint to Steam has never been answered.

See, to me, GoG add value to the customer -- even though I wasn't one at that point. They are focussing on things from a customer's POV. That's the kind of business I want to do business with.

GoG is the current wearer of the "Don't be evil" crown.

2
tempodox 19 hours ago 4 replies      
I love GoG. They have one of the best online shopping experiences I have seen yet. And they are the antithesis to Steam: No nagging ads every time I start a game, no pointless updates to the client (just so it can nag with even more ads), no DRM. GoG knows how to win customers with goodies instead of forcing crap on them, like Steam does. Seriously, I only ever buy games on Steam if I cannot get them anywhere else. Even Apple's AppStore sucks less than Steam (and it's not easy to suck more than the AppStore).
3
SixSigma 19 hours ago 0 replies      
It's not just nostalgia that makes me a happy Gog customer. Many of the titles I didn't completely finish, some have expansion packs I never bought, some I only have for non-existent 5.25" / 3.5" floppy drives and some have re-texturing and mods for modern systems to improve graphical resolution.

Plus many titles I never had the pocket money to buy and now they are pocket change. With the Gog digital download system I can relax about keeping backups and the like.

4
shmerl 18 hours ago 1 reply      
GOG doesn't focus solely on old games anymore. They sell new games as well but they still focus on good games though. And obviously they remain DRM-free and work with publishers to convince them to release their games without any DRM junk attached, which is praiseworthy.

It's good that they are attempting to compete with Steam more - we need that. But in order to differentiate, it's not enough to make the client optional - they can also open source it to improve trust. That would set them apart from Steam even more.

You can vote for it on GOG's wishlist board:

https://secure.gog.com/wishlist/site/release_the_future_gog_...

At the very least they can document the protocol / API of the client to enable community alternatives. Vote for it as well:

https://secure.gog.com/wishlist/site/document_the_protocol_a...

5
reiichiroh 15 hours ago 0 replies      
Sorry if this is a tangent, but for those of you that use Steam, try using this: www.enhancedsteam.com
6
cridenour 13 hours ago 0 replies      
Does anyone remember their "scare tactics" when they rebranded? Told everyone that they were shutting down - but launched the "new GoG.com" the next day.
7
oneeyedpigeon 12 hours ago 2 replies      
Generally, GoG is great, they have a great user experience, sell most of the best games from my formative years, and represent brilliant value. My one complaint is the odd compatibility hole - I've been waiting for Settlers 2 to become available on the Mac for what seems like forever.
8
NickWarner775 12 hours ago 0 replies      
Great idea. I have never heard of this company before but I would love to start playing my old favorite games again.
9
mreiland 16 hours ago 2 replies      
GoG was never a back catalog. Does anyone else find it annoying how these "authors" decide they want to write about something and choose something as stupid as talking about how GoG is no longer something it never was?
30
Ask HN: Coding outdoors
5 points by innsmouth_rain  20 minutes ago   1 comment top
1
TobbenTM 6 minutes ago 0 replies      
I would think a tablet along with a wireless keyboard would be the best option if you are looking for something small. A tablet with a data-connection even better. The biggest problem would be a good enough screen to use in the sunlight.