hacker news with inline top comments    14 Oct 2015 News
Comparison R vs. Python: head to head data analysis dataquest.io
153 points by emre  3 hours ago   80 comments top 19
bigtunacan 1 hour ago 5 replies      
R is certainly a unique language, but when it comes to statistics I haven't seen anything else that compares. Often the R vs. Python comparison (not that this particular article has that slant) is made as a "come drink the Python Kool-Aid; it tastes better" pitch.

Yes, Python is a better general-purpose language. It is inferior, though, when it comes specifically to statistical analysis. Personally I don't even try to use R as a general-purpose language. I use it for data processing, statistics, and static visualizations. If I want dynamic visualizations I process in R, then typically hand off to JavaScript and use D3.

Another clear advantage of R is that it is embedded into so many other tools. Ruby, C++, Java, Postgres, SQL Server (2016); I'm sure there are others.

mbreese 2 hours ago 5 replies      
This is interesting, but not really an R vs. Python comparison. It's an R vs. Pandas/Numpy comparison. For basic (or even advanced) stats, R wins hands down. And it's really hard to beat ggplot. And CRAN is much better for finding other statistical or data analysis packages.

But when you start having to massage the data in the language (database lookups, integrating datasets, more complicated logic), Python is the better "general-purpose" language. It is a pretty steep learning curve to grok the R internal data representations and how things work.

The better part of this comparison, in my opinion, is how to perform similar tasks in each language. It would be more beneficial to have a comparison of where Python/Pandas is good, where R is better, and how to switch between them. Another way of saying this is figuring out when something is too hard in R and it's time to flip to Python for a while...

sweezyjeezy 2 hours ago 3 replies      
This is just a series of incredibly generic operations on an already cleaned dataset in csv format. In reality, you probably need to retrieve and clean the dataset yourself from, say, a database, and you may well need to do something non-standard with the data, which needs an external library with good documentation. Python is better equipped in both regards. Not to mention, if you're building this into any sort of product rather than just exploring, R is a bad choice. Disclaimer: I learned R before Python, and won't go back.
danso 2 hours ago 2 replies      
I spent a few weeks a few months ago learning R. It's not a bad language, and yes, the plotting is currently second-to-none, at least based on my limited experience with matplotlib and seaborn.

There are scant few articles on going from Python to R...and I think that has given me a lot of reason to hesitate. One of the big assets of R is Hadley Wickham...the amount and variety of work he has contributed is prodigious (not just ggplot2, but everything from data cleaning, web scraping, dev tools, and time-handling a la moment.js, to books). But that's not just evidence of how generous and talented Wickham is, but of how relatively little dev support there is in R. If something breaks in ggplot2 -- or any of the many libraries he's involved in -- he's often the one to respond to the ticket. He's only one person. There are many talented developers in R, but it's not quite a deep open-source ecosystem and community yet.

Also, word of warning: ggplot2 (as of 2014[1]) is in maintenance mode and Wickham is focused on ggvis, which will be a web visualization library. I don't know if there has been much talk about non-Hadley-Wickham people taking over ggplot2 and expanding it...it seems more that people are content to follow him into ggvis, even though a static viz library is still very valuable.

[1] https://groups.google.com/forum/#!topic/ggplot2/SSxt8B8QLfo/...

xixi77 14 minutes ago 1 reply      
Really, the syntax "nba.head(1)" is not any more "object-oriented" than "head(nba, 1)" -- it's just syntax, and the R statement is in fact an application of one of R's object systems (there are several of them).

IMO, R's system is actually more powerful and intuitive -- e.g. it is fairly straightforward to write a generic function dosomething(x,y) that would dispatch specific code depending on classes of both x and y.
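The contrast can be sketched with Python's standard library: `functools.singledispatch` picks an implementation by the class of the *first* argument only, whereas R's S4 generics can dispatch on the classes of both `x` and `y`. (The `dosomething` name mirrors the comment; this is an illustrative sketch, not a claim about any particular R package.)

```python
from functools import singledispatch

@singledispatch
def dosomething(x, y):
    # fallback when no specific implementation is registered
    return "generic"

@dosomething.register(int)
def _(x, y):
    return "int version"

@dosomething.register(str)
def _(x, y):
    return "str version"

# Only the class of x ever matters; the class of y is ignored.
assert dosomething(1, "anything") == "int version"
assert dosomething("a", 2) == "str version"
assert dosomething(1.5, 1) == "generic"
```

Dispatching on both arguments at once, as in the comment's `dosomething(x, y)` example, needs a third-party multiple-dispatch library in Python (or R's S4 system).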

evanpw 35 minutes ago 0 replies      
If you only have time to learn one of the two, learn Python, because it's better for non-statistical purposes (I don't think that's very controversial).

If you need cutting-edge or esoteric statistics, use R. If it exists, there is an R implementation, but the major Python packages really only cover the most popular techniques.

If neither of those apply, it's mostly a matter of taste which one you use, and they interact pretty well with each other anyway.

vineet7kumar 17 minutes ago 0 replies      
It would be nice to also have some notes about the performance of both languages for each of the tasks compared. I believe pandas would be faster due to its implementation in C. The last time I checked, R was an interpreted language with its interpreter written in C.
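The comment's general point -- routines implemented in C tend to beat loops run step-by-step by an interpreter -- can be sketched with nothing but the Python standard library (no pandas or R needed for the illustration; the numbers will vary by machine):

```python
import timeit

data = list(range(100_000))

def py_sum(xs):
    # interpreted loop: each iteration goes through the bytecode machinery
    total = 0
    for x in xs:
        total += x
    return total

# Both compute the same thing...
assert py_sum(data) == sum(data)

# ...but the built-in sum() runs its loop in C inside CPython.
interpreted = timeit.timeit(lambda: py_sum(data), number=20)
c_backed = timeit.timeit(lambda: sum(data), number=20)
print(f"interpreted: {interpreted:.3f}s  C-backed: {c_backed:.3f}s")
```

On CPython the built-in is typically several times faster; vectorized pandas/NumPy operations win for the same reason.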
falicon 20 minutes ago 0 replies      
Language comparisons are equiv. to religion comparisons...you aren't going to find a universal answer or truth, it's an individual/faith sort of thing.

That being said - all the serious math/data people I know love both R and Python...R for the heavy math, Python for the simplicity, glue, and organization.

The13thDoc 2 hours ago 0 replies      
The "cheat sheet" comparison between R and Python is helpful. The presentation is well done.

The conclusions state what we already know: Python is object oriented; R is functional.

The Last Word appropriately tells us your opinion that Python is stronger in more areas.

acaloiar 2 hours ago 2 replies      
I have always considered R the best tool for both simple and complex analytics. But, it should not go unmentioned that the features responsible for R's usability often manifest as poor performance. As a result, I have some experience rewriting the underlying C code in other languages. What one finds under the hood is not often pretty. It would be interesting to see a performance comparison between Python and R.
daveorzach 32 minutes ago 0 replies      
In manufacturing, Minitab and JMP are used for data analysis (histograms, control charts, DOE analysis, etc.). They are much easier to use and provide helpful tutorials on the actual analysis.

What features or workflow do R or Pandas/Numpy offer to manufacturing that Minitab & JMP can't?

mojoe 1 hour ago 2 replies      
The one thing that sometimes gets overlooked when people decide whether to use R or Python is how robust the language and libraries are. I've programmed professionally in both, and R is really bad for production environments. The packages (and even language internals sometimes) break fairly often for certain use cases, and doing regression testing on R is not as easy as Python. If you're doing one-off analyses, R is great -- for anything else I'd recommend Python/Pandas/Scikit.
willpearse 2 hours ago 0 replies      
Very picky, but beware of constantly using "set.seed" throughout your R scripts. Always using the same random seed is not necessarily helpful for stats, and it makes the R code look a lot trickier than it needs to be.
xname2 2 hours ago 0 replies      
"data analysis" means differently in R and Python. In R, it's all kinds of statistical analyses. In Python, it's basic statistical analysis plus data mining stuff. There are too many statistical analyses only exist in R.
fsiefken 1 hour ago 0 replies      
It would be nice to compare JuliaStats and Clojure based Incanter with Python Pandas/NumPy/SciPy. http://juliastats.github.io/
acomjean 2 hours ago 2 replies      
I work with biologists. They seem to take to R, which seems strange to me. I think some of it is RStudio, the IDE, which shows variables in memory in the sidebar; you can click to see them. It makes everything really accessible for those who aren't programmers. It seems to replace Excel for generating plots.

I've grown to appreciate R, especially its plotting ability (ggplot).

zitterbewegung 3 hours ago 1 reply      
This is not just interesting as a comparison; it's interesting for people who know R/Python and want to go from one to the other.
vegabook 1 hour ago 0 replies      
Python's main problem is that it's moving in a CS direction and not a data science direction.

The "weekend hack" that was Python, a philosophy carried into 2.x, made it a supremely pragmatic language, which the data scientists love. They want to think algorithms and maths. The language must not get in the way.

3.x is wanting to be serious. It wants to take on Golang, JavaScript, Java. It wants to be taken seriously. Enterprise and Web. There is nothing in 3.x for data scientists other than the fig leaf of the @ operator. It's more complicated to do simple stuff in 3.x. It's more robust from a theoretical point of view, maybe, but it also imposes a cognitive overhead for those people whose minds are already FULL of their algo problems and just want to get from a -> b as easily as possible, without CS purity or implementation elegance putting up barriers to pragmatism (I give you Unicode v ASCII, print() v print, 01 v 1 (the first is an error), xrange v range, the list goes on).
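The specific gripes in that list can be seen in a minimal Python 3 session (an illustrative sketch, run under CPython 3):

```python
import sys
assert sys.version_info[0] == 3

# print is now a function, not a statement:
print("hello")

# range is now lazy, like the old xrange:
r = range(5)
assert list(r) == [0, 1, 2, 3, 4]

# Integer literals with a leading zero (e.g. 01) are now a SyntaxError:
try:
    compile("01", "<example>", "eval")
    raised = False
except SyntaxError:
    raised = True
assert raised
```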

R wants to get things done, and is vectors first. Vectors are what big data typically is all about (if not matrices and tensors). It's an order of magnitude higher dimensionality in the default, canonical data structure. Applies and indexing in R, vector-wise, feels natural. Numpy makes a good effort, but must still operate in a scalar/OO world of its host language, and inconsistencies inevitably creep in, even in Pandas.

As a final point, I'll suggest that R is much closer to the vectorised future, and that even if it is tragically slow, it will train your mind in the first steps towards "thinking parallel".

k8tte 2 hours ago 2 replies      
I tried to help my wife, who uses R in school, only to get quickly lost. I also attended a ~1 hour R course at university.

To me, R was a waste of time and I really don't understand why it's so popular in academia. If you already have some programming knowledge, go with Python + SciPy instead.

EDIT: R is even more useless without RStudio, http://www.rstudio.com/. And NO, don't go build a website in R!

HAProxy 1.6.0 released haproxy.com
116 points by oldmantaiter  3 hours ago   20 comments top 9
wheaties 25 minutes ago 1 reply      
Whenever I see a long list of features like this, especially something major like Lua integration, I always wonder: what was given up in the process of adding them? Normally when there's a performance bump, they show the numbers. In this case there are no numbers. Was there a performance hit? Negligible?
radoslawc 2 hours ago 0 replies      
The external check pleases me greatly, but sending emails seems like overkill to me; there are well-established ways to do this in a unified manner (log parsers, SNMP traps, etc.). Halfway through fulfilling Zawinski's Law.
pentium10 3 hours ago 0 replies      
Cool, now we can use Device Identification feature to route mobile users to a different backend, also love the HTTP/2 connection sharing.
binaryanomaly 6 minutes ago 0 replies      

But no HTTP/2, so it won't get in front of my nginx instances, yet ;)

nailer 1 hour ago 2 replies      
As an HAProxy user, support for logging to stdout (and hence journald) would be great. Currently HAProxy users on the major Linux distros either have to use it in debug mode or have a second log server just for the purposes of running HAProxy.

Otherwise I love HAProxy!

dexcs 3 hours ago 1 reply      
Nice. It supports lua and mail alerts on changing servers now...
wpblogheader 1 hour ago 0 replies      
Supports Lua? SWEET!
ErikRogneby 2 hours ago 0 replies      
Lots of goodness!
ausjke 2 hours ago 2 replies      
I was comparing HAProxy to Squid a while ago and could not figure out what HAProxy's advantage over Squid is. I ended up using Squid but am still very interested in HAProxy and would like to learn more about it.

Squid remains the only one that can deal with SSL proxying (yes, it's kind of MITM, but it's needed sometimes), and it's also the real "pure" open source option. HAProxy might be a better fit for enterprises that need support?

Running Swift code on Android goyet.com
24 points by Ecco  24 minutes ago   2 comments top
alediaferia 3 minutes ago 1 reply      
This is an awesome resource. Thank you for sharing.
Side Project Launch Checklist keepwomen.com
57 points by userium  2 hours ago   35 comments top 11
shocks 1 hour ago 2 replies      
I don't think this is a good list for a side project.

Every side project I have ever attempted has taken me ages and never got anywhere. Saying things like localisation need to be done before your MVP is ready is nuts.

I'm working on a side project right now. I've tried to learn all my lessons from other attempts. It's just a basic website (landing page). Between building that, trying to sort out the product, making a plan, etc., there is barely any time left... I have a full-time 9-6 job and a two-hour (total) commute. There's hardly any time left at the end of the day; squeezing out an hour's work each day is hard enough. Localisation would never be on my list.

Having the time to check each one of these points is a luxury most side projects can't afford.

briholt 2 minutes ago 0 replies      
This is a terrible list.

First, it's basically a list of minor nit-picky random things rather than a thoughtful list of essentials. How is "Pages don't refresh automatically" on the list, but "Buy an SSL certificate" is not?

Second, many of these things would be a total waste of time pre-launch. If you have zero people buying your product and you spend your time perfecting "Currency, language, country specific deals, taxes" then your perspective is way out of whack.

onion2k 1 hour ago 1 reply      
What exactly is it about the things you've listed that makes them important? The majority of successful businesses that I've seen launch broke pretty much every point on that list when they deployed their first version, and most didn't fix them for months (and years) afterwards.

If I was making a similar list it'd be something like:

1. Does the product do something useful yet?

As soon as the answer is "yes" then you should launch. That's it.

vdnkh 24 minutes ago 1 reply      
>Links are easily recognizable. They look clickable. Items that aren't links don't look clickable, for example underlining text is avoided.

I've been struggling with making things 'look clickable' on my website (or maybe everything is fine and it's in my head). To solve this I've been trying to define what exactly makes a user want to click something. The most common methods are the blue link color, underlining, or a 'more/click me' button. I don't want to do any of those - wanting someone to click content is more desirable than telling them to.

In researching other sites and monitoring my own behavior, I noticed that I always want to click images. This probably has to do with being a long-time image-board browser. But the content I want to be clicked doesn't have any images associated with it. I've been trying to work in glyphicons that indicate a link, but it messes with the aesthetics of the content.

Any suggestions?

jkaptur 1 hour ago 1 reply      
I don't think I followed any of these for my side project (www.diff.so) because, well, it's a side project! It's a prototype, something I did for fun. This is a great list of things to do before I expect someone to use it seriously, not a pre-launch checklist.

To go a bit off-topic, I actually tried to follow the "don't use color alone to provide information" for the static diff on diff.so/about, but I couldn't find a way to get ChromeVox to read text on one side of the diff any differently than text on the other. I'd be really interested in any advice on that - it seems like some screenreader users could use something a little more user-friendly than emacs.

ser_tyrion 1 hour ago 2 replies      
"Personalized Features: Language" sounds like a lot of work for not much payoff. Halting the launch before translations are ready will get in the way of 'just shipping'. That said, probably a good idea to dev with js i18next even if all translations aren't ready.
siavosh 33 minutes ago 1 reply      
Depends on what your definition of a side project is, but I would add a security list as well.
rckrd 1 hour ago 0 replies      
Much of this goes against conventional wisdom for launching an MVP.
userium 1 hour ago 5 replies      
The creator of keepwomen.com here, if anybody wants to discuss? I'm happy to hear about improvement ideas for the website!
kylehotchkiss 1 hour ago 0 replies      
This is a good checklist for real projects too. As a developer, it's hard for me to see big-picture what matters because I have my head buried in the code.
netman21 40 minutes ago 0 replies      
Of course you do not need a URL redirect to get the primary domain to go to www.domain.com. That is handled in your DNS "A" records.
Mattermost 1.0 release (Slack opensource alternative) mattermost.org
33 points by shuoli84  1 hour ago   7 comments top 5
jeffjose 4 minutes ago 0 replies      
In my past job, I was desperately looking to get an open source Slack alternative. The ones I tested then (few months back) didnt hold up nicely against Slack. I'm happy to see that finally there's some good competition.
netcraft 24 minutes ago 0 replies      
I think Slack serves its stated purpose very well (smaller, business-oriented teams), but many groups have started using it for larger communities, mostly because it has unlimited users for free. But it isn't made for that, and there is no way that most of these groups would ever be able to pay for a premium subscription due to the per-user costs. The 10K-message limit across all channels is surprisingly easy to hit, you need the ability to ignore users, etc. I think this project has great potential to fill that niche if it is marketed properly. Slack is so close to working well in that area but really needs to pivot to be able to serve it well and make money doing it.
DannoHung 14 minutes ago 0 replies      
Interesting that it will be a default feature of Gitlab.

That's a move that seems like it may push Gitlab ahead of GitHub in some ways (well, to me at least).

pionar 25 minutes ago 1 reply      
So, what does this offer over Slack, besides being open-source? I see no mention of any actual features, besides basic chat features.
jhildings 15 minutes ago 1 reply      
Why not just use IRC ?
Kilogram conflict resolved at last nature.com
206 points by ColinWright  6 hours ago   87 comments top 17
Asbostos 4 hours ago 1 reply      
The best part about this batch of changes is that they push the mole and Avogadro's constant out on their own where they belong, not linked to any other units. Now we'll have only a single mass unit (kg) instead of the two (kg and the unified atomic mass unit) we have now. This will knock carbon-12 off its perch as the definition of the "other" mass unit, which has been essential to using SI's mole but was never actually SI itself.
jakeogh 5 hours ago 0 replies      
silicon-28 sphere(s?): https://www.youtube.com/watch?v=ZMByI4s-D-Y (yep, they let him palm it)

watt balance: https://www.youtube.com/watch?v=VlJSwb4i_uQ

LoSboccacc 3 hours ago 0 replies      
so the proposed definition was set by fixing the numerical value of the Planck constant to 6.62606X × 10^-34 s^-1 m^2 kg

and the conundrum was that they still needed to have a precise enough measurement of that constant because it's an experimental measurement.
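In other words (a sketch of the unit bookkeeping, with X standing for the then-undetermined final digits): since h carries units of kg·m²·s⁻¹, fixing its numerical value pins down the kilogram in terms of the second and the metre, which are already defined via the caesium transition and the speed of light.

```latex
[h] = \mathrm{kg\,m^2\,s^{-1}}
\quad\Longrightarrow\quad
1\,\mathrm{kg} = \frac{h}{6.62606X \times 10^{-34}}\;\mathrm{m^{-2}\,s}
```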



zb 3 hours ago 1 reply      
I was amused to read this:

"They never found the cause for the disagreement, but in late 2014 the NIST team achieved a match with the other two"

at a time when this story, also from Nature, is also on the front page: https://news.ycombinator.com/item?id=10383984

tinkerdol 2 hours ago 0 replies      
Reminds me of this movie: https://www.youtube.com/watch?v=5dPnFO_JCdc (haven't seen it, but it looks interesting)
MarcusP 5 hours ago 4 replies      
Are metric measurements all derived from the value 1kg? If so, does this mean that the entire metric weight range can now be officially based on mathematics?
justhw 47 minutes ago 1 reply      
There's a good Radiolab episode related to this.


pervycreeper 2 hours ago 2 replies      
What level of accuracy are they aiming for? If it entails having some uncertainty about the precise number of atoms in the silicon sphere, then how did they choose this level of accuracy?
dfc 3 hours ago 2 replies      
The kilogram is still the only base unit that contains an SI prefix in the base unit's name.
Aoyagi 4 hours ago 1 reply      
Here I thought a kilogram was defined by water... oh well, looks like that definition is slightly outdated.


spydum 3 hours ago 1 reply      
I was hoping this was going to explain the kg differences between the original and the copies. Instead it just resolves them by changing the standard. Good for science, I guess; sad for my curiosity.
unwind 5 hours ago 0 replies      
Duplicate, very close in time: https://news.ycombinator.com/item?id=10385743.
acqq 3 hours ago 2 replies      
I didn't understand what will then be used: the Si sphere or the Watt balance?
bartvbl 4 hours ago 3 replies      
I wonder: the article states that the SI unit for kg up to this point was defined using a single object. Doesn't this definition also involve the fact that it's placed on Earth, thus requiring two objects for its definition?
TomGullen 3 hours ago 0 replies      
Lived next door to an NPL PhD physicist who was working on this a few years ago. I think they ended up handing the project over to Canada or somewhere like that, IIRC. Fascinating project and guy.
castratikron 2 hours ago 0 replies      
Strange to see Planck's constant used that way, defining a kilogram. Planck's constant usually only shows up when you're doing quantum mechanics and the things you're working with are really small.
mtgx 4 hours ago 5 replies      
Now even the U.S. can adopt it.
How the 'hoverboards' took off in spite of laws against them bbc.co.uk
20 points by nns  1 hour ago   25 comments top 10
markh1967 1 minute ago 0 replies      
"All of them should be allowed on the pavement, because people aren't stupid, they will get out of the way if they see one coming."

Speaking as someone who sometimes puts his back out and spends a few days hobbling slowly about, I shouldn't have to quickly get out of someone's way if I see them coming and they shouldn't be allowed on pavements.

lightbritefight 21 minutes ago 0 replies      
Boosted boards are still the best "hoverboard" for my money:


The price is prohibitive, but the market now has something like 7 or 8 brands competing, so I expect the price will come down in the next year or so.

alexc05 38 minutes ago 0 replies      
To be clear, with most of the celebrity adopters, they're often given the device and sometimes paid to use it.

Here in Toronto, I've seen marketing teams handing out this device and rolling around Yonge-Dundas Square (sometimes cast as Times Square in feature films) in a guerrilla marketing push.

It doesn't mean they're not cool and fun, but the breathless articles about how cool, hip, and trendy people are "suddenly using the device" are often coordinated.

swang 11 minutes ago 0 replies      
I have one of these things. Bought it off Alibaba for about ~$300 shipped.

It is fun, but you have to be very careful stepping on. It is one of those things where the more you panic, the more something goes wrong as you overcompensate. I fell when I tried to step onto one on concrete: it was way faster than I was used to, so I overcompensated for the pressure my foot was causing and landed hard around my tailbone. And by that point I was pretty experienced getting on (just not on concrete).

Snowboarders seem to have a pretty easy time with it. But again, getting on without help is probably the most difficult step.

JoshGlazebrook 56 minutes ago 1 reply      
The funny thing is all these "hoverboards" are the same exact device from China. You can get them off alibaba for $200 or spend $1,500 for some American branded version that is the same exact thing.
TazeTSchnitzel 10 minutes ago 1 reply      
I'm not sure the ban on public use is a bad thing. I imagine they'd become a nuisance if a lot of people were using them. There's quite a difference between someone walking into you and someone crashing into you at 14mph on a 10kg set of wheels. Bicycles are used on bike lanes and on the road for a reason.
6stringmerc 10 minutes ago 0 replies      
Well, as long as people are willing to risk life and limb in the pursuit of fun and thrills, I can't really talk down on them.

Also I can't wait to get some more traction in my side project so I can get my "recreational personal flight device" into a prototype and testing stage. Think of it as akin to JetMan, version 2.0, and not needing a helicopter.

hodwik 22 minutes ago 0 replies      
Have been riding one of the single-wheel versions of this. Incredibly difficult, but fun nonetheless.
k__ 40 minutes ago 0 replies      
Are these the real-life version of the conveyor-belt sidewalks the sci-fi authors wrote about?
JulianMorrison 1 hour ago 9 replies      
Not everything is about effing obesity. Sometimes fun is just fun.

Seriously, the way people whine and whinge about them, you'd think they never did anything unless it was for some grimly healthy purpose.

Apple facing huge chip patent bill after losing case bbc.com
100 points by jnord  5 hours ago   73 comments top 15
devit 3 hours ago 3 replies      
Looks like the "idea" of the patent in the description is to use a predictor to predict when a STORE and LOAD alias and not speculate the LOAD and any instruction depending on the load (although the claims generalize this to any non-static dependency).

As it generally happens in software/hardware patents, the claimed solution seems quite obvious whenever one wants to solve that particular problem, and the hard part is the "execution", i.e. implementing it efficiently and figuring out whether the tradeoffs are worth it.

So assigning patents to things like this seems really dumb.

rayiner 2 hours ago 1 reply      
This PDF explains what I discuss below in more detail: http://moodle.technion.ac.il/pluginfile.php/315285/mod_resou.... Prediction of aliasing is discussed on slide 25.

The patent in question pertains to an optimization of what these days you'd call "memory disambiguation." In a processor executing instructions out of order, data dependencies can be known or ambiguous. A known data dependency is, for example, summing the results of two previous instructions that themselves each compute the product of two values. An ambiguous data dependency is usually a memory read after a memory write. The processor usually does not know the address of the store until it executes the store. So it can't tell whether a subsequent load must wait behind the store (if it reads from the same address), or can safely be moved ahead of it (if it reads from a different address).

If you have the appropriate machinery, you can speculatively execute that later load instruction. But you need some mechanism to ensure that if you guess wrong--that subsequent load really does read from the same address as the earlier store--you can roll back the pipeline and re-execute things in the correct order.

But flushing that work and replaying is slow. If you've got a dependent store-load pair, you want to avoid the situation where misspeculation causes you to have to flush and replay every time. The insight of the patent is that these dependent store-load pairs have temporal locality. Using a small table, you can avoid most misspeculations by tracking these pairs in the table and not speculating the subsequent load if you get a table hit. That specific use of a prediction table is what is claimed by the patent.

Maybe this is worth a patent, or maybe not. For what it's worth, I don't think anybody was doing memory disambiguation at all in 1996. Intel was one of the first (maybe the first) to do so commercially in the mid-2000's. Apple's Cyclone architecture also does it, and I think it was the first in the low-power SoC space to do it.
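The table-based idea described above can be caricatured in a few lines of Python. This is a toy sketch, not the patented mechanism: the table structure, the flat cycle costs, and the names are all invented for illustration, and real designs (e.g. store-set predictors) are far more elaborate.

```python
FLUSH_COST = 10  # cycles lost to a pipeline flush/replay (made-up number)
HIT_COST = 1     # cost of waiting and executing the load in order (made-up)

class AliasPredictor:
    """Tiny table of load PCs that were previously seen to alias a store."""
    def __init__(self):
        self.table = set()

    def should_speculate(self, load_pc):
        # Speculate unless this load has burned us before.
        return load_pc not in self.table

    def record_misspeculation(self, load_pc):
        self.table.add(load_pc)

def run(trace, pred):
    """trace: list of (load_pc, aliases_prior_store) pairs. Returns cycles lost."""
    cycles = 0
    for pc, aliases in trace:
        if pred.should_speculate(pc):
            if aliases:                    # guessed wrong: flush and replay
                cycles += FLUSH_COST
                pred.record_misspeculation(pc)
            # else: speculation succeeded, no penalty
        else:
            cycles += HIT_COST             # wait for the store: slower but safe
    return cycles

# A load that keeps aliasing the same store: one flush, then cheap table hits.
trace = [(0x40, True)] * 5
assert run(trace, AliasPredictor()) == FLUSH_COST + 4 * HIT_COST
```

With temporal locality, the predictor pays the flush once and the cheap in-order cost thereafter, which is the whole point of the table.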

msravi 1 hour ago 0 replies      
UWisc has always been very aggressive with its patents. I recall that sometime during 2002 or thereabouts, while I was working for a reasonably big semiconductor company with DSP/ARM processors, one of the guys on our team with an interest in computer architecture used the company network to download and play with a simulator (it might have been SimpleScalar). A few weeks later the head of our group got contacted by the company lawyers saying that UWisc was asking for licensing costs for using their tools (they provided the IP address that was used to download them). I'm not sure how it was resolved in the end, but I don't think the company paid.
Kristine1975 1 hour ago 3 replies      
>The University of Wisconsin–Madison is a public research university

So it's a university [mainly] funded by the taxpayer. How can it be that this university's research isn't in the public domain? The public paid for it; the public should reap the benefits without paying again.

Sure, Apple tries their hardest not to pay taxes, but the patent isn't limited to them.

DannyBee 2 hours ago 0 replies      
In general, I welcome the day when universities get what is coming to them for this kind of stuff (see also: Marvell vs CMU for 300+ million, reduced from 1.5 billion on appeal, etc).

In particular, given how much industry funds them, collaborates with their professors, etc, what is going on now is a remarkably stupid approach mostly driven by tech transfer offices that want to prove their value.

Which will be "zero", once the tech industry starts cutting them off.

monochromatic 2 hours ago 2 replies      
How is this journalism? It doesn't even tell you the damn patent number.
ctz 3 hours ago 1 reply      
DannoHung 1 hour ago 0 replies      
I have one question: Do the professors teach this technique in classes?

I mean, that'd be funny, right? Teaching students something that you patented, waiting a few years for them to go into industry and apply what they learned, then suing them for it.

propter_hoc 2 hours ago 3 replies      
This is sort of a depressing precedent. Do we really want to turn our universities into patent trolls?
bitmapbrother 44 minutes ago 0 replies      
I have no sympathy for Apple in this matter. Considering the worthless, prior art ridden patents they used against their competitors they deserve the blowback. And in keeping with their modus operandi they ignored the University of Wisconsin and wilfully infringed the patent.
bwilliams18 23 minutes ago 1 reply      
What if patents could only be held by individuals and not corporations?
abluecloud 4 hours ago 4 replies      
$862m isn't that huge in the grand scheme of things. Not to mention, it's most likely not going to be $862m, my guess is it'll be less.
cozzyd 2 hours ago 0 replies      
Awesome, maybe the Brewers need a new stadium too.
werber 3 hours ago 1 reply      
I don't get how they settled out of court and then did it again; that seems really bizarre.
mtgx 3 hours ago 1 reply      
You know what they say: Live by the patent sword...

Why doesn't Apple start lobbying for real patent reform?

Chromecast + NFC maxwellito.tumblr.com
36 points by maxwellito  2 hours ago   33 comments top 9
potench 40 minutes ago 2 replies      
The sender and receiver SDKs have mature APIs for playing video. The session and media channels/objects are well documented and leveraging them on your custom receiver ensures far easier integration on the senders (iOS, android, and web). If you venture out of the cast-a-single-video experience, it can get pretty complicated as there are best practices for how to handle multiple senders, how to fetch a playlist and continue autonomously if the sender disconnects without stopping the cast, and maintaining a good user experience on all senders that represents current state of receiver. You could (and would be advised to) handle all this via the provided "media" channel.

More to the point of the article, there is also a custom messaging channel, so you can create your own interactive experience (the receiver is just a website displayed in a Chrome tab). Here's an example of tic-tac-toe: https://github.com/googlecast/Cast-TicTacToe-chrome. You could already develop a custom receiver for an in-store search feature or something more interactive and playful, and connect to it via the same wifi. I don't understand what additional value NFC provides in this case, as you'd still need to maintain an open/persistent connection to the receiver from the sender; would NFC be able to provide that?

arpit 1 hour ago 2 replies      
Considering the Chromecast is plugged in behind the TV, wouldn't it be really awkward to get to it to tap on it?

Btw, Google has a series of APIs called Nearby (https://developers.google.com/nearby/) that are all about connecting to nearby devices but NFC doesn't seem the right answer here.

ron0c 53 minutes ago 1 reply      
According to iFixIt's teardown: https://www.ifixit.com/Teardown/Chromecast+2015+Teardown/501...

It has a Marvell Avastar 88W8887 chip, which has NFC (as well as FM radio) built in: http://www.marvell.com/wireless/88W8887/

I said this should be used for "Tap and Play"

coldpie 1 hour ago 8 replies      
Okay, someone help me. The article says:

> The new version of the Chromecast is launched. I was already in love with the existing version and [its] JavaScript API.

I have a Chromecast and I Don't Get It. I'll be honest, I haven't tried searching for cool stuff to do with it. Sometimes I stream Netflix or Youtube from my phone. I found a way to stream MKV videos from my computer using Chrome. But that's it, nothing there is cool or revolutionary.

So, what's some cool stuff I can do with my Chromecast? I recognize there's something neat going on here, but it seems so locked down that I can't figure out what it is. Can I write arbitrary apps for it somehow? Is there a cool collection of apps that do... something?

Help me out. What's cool about Chromecast?

maxwellito 1 hour ago 0 replies      
The point of this article is a different usage of the Chromecast.

The Chromecast can display web apps (Chromecast apps), and this could transform any TV into an interactive screen. Imagine you have a store: you can create an app which displays the new products. But instead of having to type the URL on your phone to get more info, the web app could use the Chromecast API to broadcast a URL via NFC.

And for more advanced stuff, it could implement something like Liwe, a service to use smartphones as remotes for web apps (>> liwe.co).

The problem right now is that the Chromecast is only known for broadcasting video and audio, while it can do more than that.

sowbug 1 hour ago 1 reply      
The article author actually wants a Physical Web beacon. The range of BLE better suits the use case.


Jyaif 1 hour ago 0 replies      
NFC is unnecessary for what the OP is talking about. The chromecast can advertise itself to mobile phones with inaudible audio or with a hidden QR code on the screen.
amelius 1 hour ago 0 replies      
Pretty obvious solution, but very useful. If only Bluetooth-pairing was this simple.
codazoda 1 hour ago 0 replies      
This seems like saying, "Hey, let's ditch remote controls and put the controls right on the TV!"
Crowdsourced research: Many hands make tight work nature.com
5 points by DanBC  50 minutes ago   discuss
Usefulness of mnesia, the Erlang built-in database erlang.org
115 points by motiejus  8 hours ago   26 comments top 10
lucozade 4 hours ago 1 reply      
I can't say I found the piece particularly insightful.

It seemed to imply that mnesia is the DB of the future as soon as everyone realises that everything they are doing is completely wrong and they should be doing things that are more suited to mnesia. Without saying what those things are.

I actually found one of the child comments [1] was pushing in a better direction. Essentially, the vast drop in $/TB of storage means that persistence of time series/ event type data is practical for the masses now. Sure it's found a niche in ads on the web, but it has much wider applicability than that. I personally think that Erlang is particularly well suited to this space.

[1] http://erlang.org/pipermail/erlang-questions/2015-October/08...

jerf 2 hours ago 2 replies      
It is what it is. It's useful to prototype with in Erlang. It may be useful to ship with. If Mnesia turns out not to fit your problem, here in 2015, you've got literally dozens of choices of alternate DB, with all sorts of consistency and distributibility and performance characteristics.

My guess is that if somehow Erlang was where it was in 2015 except it didn't have Mnesia, nobody would really perceive much of a hole there, and nobody would write it, because of the database explosion we've seen in the past 10 years. But it is there, and if it works for you, go for it.

My only slight suggestion is that rather than inlining all your mnesia calls, you ought to isolate them into a separate module or modules or something with an interface. But, that's not really because of Mnesia... I tend to recommend that anyhow. In pattern terms, I pretty much always recommend wrapping a Facade around your data store access, if for no other reason than how easy it makes your testing if you can drop in alternate implementations. And then, if mnesia... no, wait... And then, if $DATABASE turns out to be unsuitable, you're not stuck up a creek without a paddle. With this approach it's not even all that hard to try out multiple alternatives before settling on something.
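The facade idea above can be sketched in a few lines; here is a minimal illustration in Python (all names are made up for the example, not a real API):

```python
# Business logic depends only on a narrow interface (the "facade");
# concrete backends can be swapped without touching app code.
from abc import ABC, abstractmethod


class UserStore(ABC):
    """The only surface the rest of the app is allowed to see."""

    @abstractmethod
    def get_user(self, user_id): ...

    @abstractmethod
    def put_user(self, user_id, record): ...


class InMemoryStore(UserStore):
    """Trivial backend; a real one might wrap mnesia, Postgres, etc."""

    def __init__(self):
        self._data = {}

    def get_user(self, user_id):
        return self._data.get(user_id)

    def put_user(self, user_id, record):
        self._data[user_id] = record


def greet(store: UserStore, user_id):
    # Business logic never knows which backend it is talking to.
    user = store.get_user(user_id)
    return f"hello, {user['name']}" if user else "who?"


store = InMemoryStore()
store.put_user(1, {"name": "klacke"})
assert greet(store, 1) == "hello, klacke"
assert greet(store, 2) == "who?"
```

Trying a different database then means writing one more `UserStore` subclass, not hunting down inlined calls.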

daleharvey 4 hours ago 0 replies      
This has little to do with databases, Erlang or mnesia; it's just a moan against people writing ad tech.

mnesia is a database for the 90's because it was written by smart people in the 80's, and like most of the rest of the OTP stack it was fairly underused and undermaintained.

I have a huge amount of respect for Klacke and the original authors behind a lot of this tech, however the Erlang community that followed seems to suffer some cognitive dissonance around what problems it solves and how well it is solving them. It would be hard to pick a database less suitable for SMB use than a domain-specific database in a niche ecosystem.

jacquesm 5 hours ago 2 replies      
> This is much more interesting than chasing click statistics in the interest of brokering ad sales at the speed of light so that users can continue to ignore them.

That comment really packs a punch and should get much wider visibility. Ad tech and related software is where way too much of our collective efforts are going.

i_feel_great 6 hours ago 3 replies      
I have much the same sentiment about SQLite. Much dismissed as a toy database, but absolutely appropriate for 99% of my clients - small and medium businesses, the same as mentioned in the thread.
lectrick 2 hours ago 0 replies      
Erlang (and by association Elixir) tooling has a nice progressive approach to managing state.

Agent -> ets -> dets -> mnesia -> riak (or sql tooling etc.)

(Agent http://elixir-lang.org/docs/v1.1/elixir/Agent.html is just a state-holding process. Erlang folks can probably write one of these in their sleep, Elixir added a bit of wrapping-paper around it.)

If you're writing an app, I think it's best to be storage-agnostic from the get-go. You shouldn't be building up queries in your core app code; push them to the edge of your code, because otherwise you're not separating concerns. All your app (business logic) code should delegate to some wrapper to work out the specifics of retrieving the data; your app code should just be calling something like Modelname.specific_function_returning_specific_dataset(relevant_identifier) and let that work out the details. That way, if you ever upgrade your store, you just have to refactor those queries, but your app code remains the same. On top of that, in your unit tests you can pass in a mimicking test double for your store to do a true unit test, and avoid retesting your store over and over again wastefully. (You'd still of course have an integration test to cover that, but it wouldn't be doing it on every test.)
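The test-double point can be made concrete with a small sketch (Python here for brevity; every name is invented for the example):

```python
# The business function calls a query wrapper, so a unit test can
# inject a stub instead of hitting a real database.

class OrderQueries:
    """Real implementation would run actual database queries."""

    def open_orders_for(self, customer_id):
        raise NotImplementedError("would query the real store")


def open_order_count(queries, customer_id):
    # Business logic only depends on the wrapper's interface.
    return len(queries.open_orders_for(customer_id))


class StubOrderQueries(OrderQueries):
    """Test double: returns canned data, no database needed."""

    def open_orders_for(self, customer_id):
        return [{"id": 1}, {"id": 2}] if customer_id == 42 else []


# A "true unit test": exercises the logic without any storage layer.
assert open_order_count(StubOrderQueries(), 42) == 2
assert open_order_count(StubOrderQueries(), 7) == 0
```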

twsted 6 hours ago 0 replies      
Many valuable considerations inside; read the post. Starting with this thought in the question: 'I hate transiting syntactic boundaries when I'm programming'.

But the answer is a much broader evaluation of the utility of the tools we are using, relative to what we use them for.

And some rants that I share: "but really boil down to adtech, adtech, adtech, adtech, and some more adtech, and marketing campaigns about campaigns about adtech."

eddd 4 hours ago 0 replies      
Well, you could summarise this article by saying: "Just because something got invented 25 years ago, it doesn't mean it is useless." On the contrary - it is worth taking a look at technology that has survived 25 years in the wild.
zaphar 2 hours ago 0 replies      
The article is light on actual details and heavy on rants about the current state of products built on the web.

However, there is one thing that mnesia got absolutely and totally right: database schema upgrades. You can create an mnesia database and upgrade its schema on the fly as a part of its operation, without once bringing it down or running a script. I did this[0] for a toy project I did in Erlang once that I unfortunately never finished, since the need for it disappeared.

[0]: https://github.com/zaphar/iterate/blob/master/src/db_migrate...

davidw 2 hours ago 0 replies      
By the same argument, though, why not just use Postgres? And I write that as a fan of Erlang. Indeed: https://github.com/epgsql/epgsql
Canal Defence Light(WW2) wikipedia.org
24 points by vinnyglennon  3 hours ago   10 comments top 4
arethuza 2 hours ago 0 replies      
The UK was very fond of making specialised tanks in WW2, notably "Hobart's Funnies".

The Soviets also used searchlights to dazzle enemies during attacks, particularly the attack on Seelow Heights.

kitd 2 hours ago 0 replies      
I remember seeing one of these at the Tank Museum in Dorset, UK. If you're in the area, it's worth a visit. The weird contraptions mounted on some WW2 tanks must be seen to be believed.
ceejayoz 2 hours ago 2 replies      
Can't imagine driving one of these into battle. Seems like it'd be a giant "shoot me first!" announcement.
Avalaxy 1 hour ago 2 replies      
So why is it no longer in use? Seems to me that emitting an extremely bright flashing light is a great way to disorientate enemy infantry in close quarter combat scenarios.
What a City Would Look Like If It Were Designed for Only Bikes fastcoexist.com
36 points by geezsundries  1 hour ago   36 comments top 16
hackuser 6 minutes ago 0 replies      
I had thought the custom on HN was to post valuable, constructive comments, and not to find fault or to post typical Internet forum sniping and hyperbole.

Perhaps it's my impression, but I've rarely seen anything constructive recently. Almost everything in this discussion, for example, is reaction and sniping ("useless", "absurd", "stupid", etc.). It's a way to hang out and socialize online; there's nothing wrong with it. But personally, I've read enough Internet sniping for a lifetime; it's not thoughtful, informative, insightful or constructive; I don't learn anything and leave uninspired.

Perhaps it's just my impression or it's temporary; perhaps it's a bigger change (related to YC and its leadership distancing themselves from HN?). Is there anywhere online where the sniping is eliminated and the discussion more valuable?

EDIT: Sorry, I know it's off-topic, but there's no other place to post it (that I know of).

Avalaxy 1 hour ago 3 replies      
A bike ramp into the apartment is useless. You can just take the elevator with your bike, or you know... Just make a garage for bikes on ground floor. The article doesn't really say much about how to design a city for bicycles, which mostly comes down to providing good roads where you aren't bothered by cars.
Shivetya 16 minutes ago 0 replies      
If exposed to weather, it is an automatic fail. Sorry, that is just how it works. For any transportation alternative that is not enclosed, it always comes down to comfort. One day it is too hot, another too cold, or too wet, and on and on, until the excuse is no longer needed because the alternative mode is parked.

So any bicycle- or pedestrian-friendly environment needs weather protection as part of its design. It does not need to be fully enclosed, but that type of protection may be required depending on climate. Perhaps a convertible system where panels retract?

throwaway41597 36 minutes ago 0 replies      
Misleading title and absurd idea: they suggest having bike lanes inside buildings. This is wasting precious real estate: the indoor bike lanes and parkings are nearly as big as the apartments they service. Why not build streets in buildings if you prefer cars? or a river if you're a boat person? or a half-pipe for skaters?
jacquesm 1 hour ago 1 reply      
That's a city designed for ONLY bikes, and no city will ever be designed like that; no cars allowed is nice in theory but not workable. To be practical it would at a minimum have to accommodate foot traffic, supply lines to stores (or did you think that those stores will be supplied by cargo bikes?) and access for disabled people (not everybody can ride a bike).

So it's a nice thought experiment but not at all practical. On top of that, 'bike lanes in apartment building hallways' make you wonder just how much experience the designer has riding bicycles: you park your bike at the interface between inside and outside, and you don't run around the apartment hallways on a bicycle because of (1) pedestrians, (2) playing kids, (3) the fact that you now have to elevate your bicycle every time you want to go in or out of your house, and (4) storing your bike at street level is simply much more practical.

smcl 1 hour ago 2 replies      
That "bike friendly" apartment block looks extremely pedestrian hostile
geezsundries 1 hour ago 1 reply      
I'm confused as to why they created bike paths in apartment building hallways.
wehadfun 19 minutes ago 0 replies      
A bike-friendly building is a good idea if a city really, really wanted to promote bike use. Being able to ride your bike into a grocery store, shop, then ride your bike into your kitchen and unload is kind of awesome. But this is the last step after bike-only roads and decent bike public transportation.
ape4 1 hour ago 0 replies      
A secure/convenient place to lock your bike at ground level is better.
zyxley 44 minutes ago 1 reply      
> and maybe even wheel their bike through stores as they shop, with a sleeping baby in the bike carrier, or use the basket to hold groceries

Anyone who has ever actually had a bike for more than a week and who actually shops for their own groceries will know this is a terrible idea.

mroll 31 minutes ago 1 reply      
One problem that immediately stands out to me is how sweaty everyone would be. I'm in shape but riding around a city on a hot day would have me drenched.
crystalmeph 1 hour ago 0 replies      
For one thing, you wouldn't have to look at old or infirm people...
acaloiar 52 minutes ago 0 replies      
Even as a cyclist these designs seem myopic to the demands of civic design such as pedestrian friendliness and affordability. Perhaps a Dubai could afford retrofitting, but as these designs stand, they are only practical (and I use that loosely) for new construction.
electricblue 36 minutes ago 1 reply      
Particular small areas with extremely high population density might improve if cars were banned, but for the most part adding separate bike paths that don't interact with roads much is the way to go. The sloped apartments seem dangerous and stupid.
jbb555 48 minutes ago 0 replies      
It would look empty, because who would choose to live there?
coldcode 6 minutes ago 0 replies      
I live 43 miles from my job, all on major interstates, in a big metro area. What good would this do? Unless you designed the city for mixed living/working/shopping in the first place, most US cities could never change to be bike first. This isn't SimCity where you can raze the whole town.
After 8 years and $128m raised, the clock is ticking for men's retailer Bonobos businessinsider.com.au
6 points by prostoalex  36 minutes ago   2 comments top
ucaetano 10 minutes ago 1 reply      
"Call me crazy, but I'm hoping we can build something standalone."

"The e-commerce business is really challenging, and we feel like with this online-offline equation we've really unlocked something that can scale."

"I have a long vision for the company, one that could take decades to unfold, and I didn't think that my running the company day-to-day was necessarily optimal to getting there."

So after 8 years unable to scale, they now want to scale by building brick-and-mortar stores where you can try a product that you can only buy online?

Am I missing something here?

Transit: JSON Data Interchange Format github.com
147 points by dedalus  13 hours ago   87 comments top 15
lars512 8 hours ago 2 replies      
I couldn't get the why of the project from the GitHub page alone. Rich Hickey's post introducing it a year ago is clearer.


JSON has become the go-to schemaless format between languages, despite being verbose and having problems with typing. Transit aims to be a general-purpose successor to JSON here.

Stronger typing allows the optional use of a more compact binary format (using MessagePack). Otherwise it too uses JSON on the wire.

Anyone who knows more, please correct me.
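To make the typing problem concrete, here is a toy round trip using a made-up "~t" tag. This only illustrates the general idea of tagging values so types survive JSON, not Transit's actual wire format:

```python
# Plain JSON turns a timestamp into a string and it stays a string.
# Tagging the value lets the decoder recover the original type.
import json
from datetime import datetime, timezone


def encode(value):
    if isinstance(value, datetime):
        return "~t" + value.isoformat()  # tag timestamps on the way out
    return value


def decode(value):
    if isinstance(value, str) and value.startswith("~t"):
        return datetime.fromisoformat(value[2:])  # restore the type
    return value


ts = datetime(2015, 10, 14, tzinfo=timezone.utc)
wire = json.dumps(encode(ts))

# Untagged round trip loses the type; the tagged one recovers it.
assert isinstance(json.loads(json.dumps(ts.isoformat())), str)
assert decode(json.loads(wire)) == ts
```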

nly 3 hours ago 1 reply      
Not sure I like the idea of cramming ASCII type tags in to the encoded JSON.

I'm more partial to the way Avro does it, where the encoded JSON remains type-tag and cruft free, and a separate schema (also JSON) is used (and required) to interpret the types, or encode to the correct binary encoding.

creshal 8 hours ago 4 replies      
Reinventing XML, one data type at a time.
escherize 9 hours ago 1 reply      
The most novel use of Transit is in the Sente [1] library for Clojure/ClojureScript. It is an abstraction over long-polling / websockets that lets us treat them as a core.async channel (which is like a go block in Go).

It's worked awesome for updates, and using Transit to keep the transmissions minimal has let us focus on the API for a realtime system.

[1] - https://github.com/ptaoussanis/sente#sente-channel-sockets-f...

mukundmr 7 hours ago 4 replies      
Why choose this over Google's Protocol buffers? https://github.com/google/protobuf
khgvljhkb 8 hours ago 3 replies      
Am I the only one amazed by what the Clojure community and core team are conjuring up?

Doing client-side programming with things like CLJS, Figwheel, Reagent and core.async feels miles ahead of what we have in modern-js-land (es6/7, babel, webpack, React, promises).

If you were to start a startup today, would you be comfortable going with something like Clojure/script?

hyperpallium 3 hours ago 2 replies      
> The extension mechanism is open, allowing programs using Transit to add new elements specific to their needs. Users of data formats without such facilities must rely on either schemas, convention, or context to convey elements not included in the base set,

The extension mechanism means writing handlers in all the languages communicated with, since its stated purpose is cross-language value conveyance.

In contrast, a schema language allows extensions to be described once, in one language.

I was expecting this to be a sort of macros for data notation (an inline schema language), but it seems more like an extendible serialization library.

kayamon 8 hours ago 2 replies      
I can't help but wonder if it isn't simpler just to use gzipped JSON. I'd be interested to see a size comparison of the two. It seems like they're going to an awful lot of work to hand-roll a suboptimal text compression scheme here.
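A quick back-of-envelope check of that, with made-up data (the numbers depend entirely on the payload, so treat this as a sketch):

```python
# Repetitive JSON (repeated keys, similar rows) compresses very well,
# which is part of why gzipped JSON is a strong baseline to beat.
import gzip
import json

rows = [{"user_id": i, "timestamp": 1444800000 + i, "event": "click"}
        for i in range(1000)]
raw = json.dumps(rows).encode()
packed = gzip.compress(raw)

# The repeated key names all but disappear under gzip.
assert len(packed) < len(raw) // 3
print(len(raw), "->", len(packed))
```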
fnordsensei 8 hours ago 0 replies      
I recently used this in a project where I simply wanted a typing guarantee that JSON can't provide (i.e., that a timestamp really is a timestamp when it arrives on the other side, not a string like in JSON). It's very easy to use, more or less just a drop-in middleware.
Perceptes 6 hours ago 0 replies      
Previous HN discussion from when it was announced last year: https://news.ycombinator.com/item?id=8069346
agopaul 8 hours ago 0 replies      
So it's basically a set of libraries used to marshal/unmarshal objects without using a schema or can it also be used as an RPC library?
maweki 8 hours ago 0 replies      
So the python implementation is 2.7 only...
latenightcoding 8 hours ago 1 reply      
Very interesting(just commenting to read it later)
Chrome finally kills off the HTTP-HTTPS mixed content warning arstechnica.com
15 points by smacktoward  48 minutes ago   7 comments top 3
latortuga 4 minutes ago 0 replies      
> The problem is, it's almost impossible to switch completely from HTTP to HTTPS in one fell swoop; there are just too many factors that need to be tested and debugged.

Such as? Article is a bit short on facts here.

> At the same time, webmasters weren't keen to begin the migration process to HTTPS because of that pesky mixed content warning, which had a tendency to spook less-experienced users of the Information Superhighway.

And rightfully so! There's no difference between mixed content and HTTP only for the purposes of data security. Just yesterday I noticed that a payments website had mixed content issues and elected not to risk my personal info. This change is even better because now you really can tell your family to "just look for the lock icon".

0942v8653 31 minutes ago 1 reply      
I didn't like the old warning, because it looked more like a yellow up arrow to me than an orange warning triangle (I am colorblind). At first I thought the old one meant the page had upgraded or better security. I don't like the new one much better, though. I'd prefer to have the red slash through it like the one that appears in some cases.
Sir_Cmpwn 39 minutes ago 3 replies      
Are there any plans from any of the browsers to show HTTP sites as insecure, rather than as some sort of "normal" state?
Seeing stars again: Naval Academy reinstates celestial navigation capitalgazette.com
79 points by curtis  10 hours ago   51 comments top 11
SCAQTony 26 minutes ago 1 reply      
I am presuming an even darker scenario. GPS signals would not be jammed by the enemy; rather, satellites would be obliterated wholesale as a result of a limited nuclear exchange. Thus sailing the old-fashioned way: sextant, watch, and maps are the only way home.
csense 1 hour ago 2 replies      
Isn't the GPS signal fairly low-power, and thus, trivial for an enemy to jam?

Or would they have to get their jammer close enough to be in range of the ship's guns (and presumably its loud radio broadcasts would give away its position)?

maze-le 4 hours ago 3 replies      
It is always good to have a backup system, in case the primary system (GPS) fails. Celestial navigation has a proven track-record for centuries.

I wonder more why it was dropped in the first place...

joshuaheard 1 hour ago 1 reply      
I'm surprised they gave it up. It is still best practice in yachting to have at least one crew member on an ocean crossing be able to do celestial navigation. You can't count on electricity, and thus GPS, so you have to return to the basics as a backup.
DrJokepu 3 hours ago 1 reply      
Do commissioned officers actually do any navigation at all? I understood that this is mostly done by enlisted quartermasters (QM) and navigation electronics technicians (ET).
xg15 5 hours ago 3 replies      
I wonder if, with the use of image analysis, it would be possible to create a "robo-sextant" - i.e. a system which scans the sky at certain intervals, locates the waterline, sun or other celestial bodies, and performs the necessary calculations. Such a system could be used to e.g. detect GPS manipulation quickly.
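For what it's worth, the core arithmetic such a system would need is old, simple spherical trigonometry. A minimal sketch (a real fix would also need an almanac and corrections for refraction, dip, etc.):

```python
# Altitude of a celestial body from an assumed position:
#   sin h = sin(lat) * sin(dec) + cos(lat) * cos(dec) * cos(LHA)
# where dec is the body's declination and LHA its local hour angle.
from math import asin, sin, cos, radians, degrees


def computed_altitude(lat_deg, dec_deg, lha_deg):
    """Altitude in degrees of a body for an assumed latitude."""
    lat, dec, lha = map(radians, (lat_deg, dec_deg, lha_deg))
    return degrees(asin(sin(lat) * sin(dec)
                        + cos(lat) * cos(dec) * cos(lha)))


# Sanity check: a body on the observer's meridian (LHA = 0) culminates
# at altitude 90 - |lat - dec|, e.g. lat 40, dec 10 -> 60 degrees.
assert abs(computed_altitude(40.0, 10.0, 0.0) - 60.0) < 1e-9
```

Comparing this computed altitude against the sextant-measured one is the intercept step of a classic celestial fix; a large, persistent disagreement with the GPS position would be the manipulation signal.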
gchadwick 6 hours ago 3 replies      
I wonder if they also asked trained officers to do regular practice? The theory behind celestial navigation is fairly trivial (I've almost forgotten it all, but give me an almanac and an hour and I'm confident I'd work it all out). The successful use of a sextant for a decent fix, however, requires training, practice and regular use to keep your skills sharp.
dctoedt 4 hours ago 1 reply      
I sort-of learned celestial navigation during Navy ROTC training 40 years ago. I remember the Nautical Almanac being something of a pain to work with. It would have been so nice to have the modern-day mobile-device apps that will take the sextant data and do the table-lookup number crunching.
yread 6 hours ago 1 reply      
RYA Yachtmaster course requires celestial navigation and some schools even require candidates to perform the navigation on the qualifying passage (couple hundred nm) without GPS.
moosetafa 2 hours ago 2 replies      
Now, if the Navy wanted their sailors to be truly hardcore navigators and be totally immune from cyber hacking, they'd have them use an abacus for calculations rather than a calculator or computer.
VLM 1 hour ago 1 reply      
One interesting side effect of learning celestial nav is you can't avoid learning some astronomy and keeping up to date on the night sky.

Another side effect is it's a filter on people who can't handle math or can't handle complicated written procedures. It's unclear how important that is, but it is clear it's a very good filter if you value those skills for other reasons.

With respect to situational awareness, it's very easy to own a GPS while confusing that attribute of ownership with understanding where you are, but it's very hard to mentally do celestial nav without understanding where you are and what the present sea and weather conditions are. Also, there is a difference between merely owning a (possibly GPS-accurate) clock or chronometer and understanding what time it is. "I own things that could provide an accurate 4-dimensional situation, were I to actually understand the outputs" is a lot different from "I can perform extensive labor and calculations, with the result of being deeply aware of my 4-dimensional situation".

From $250M to $6.5B: The Bay Bridge Cost Overrun citylab.com
79 points by vinayak147  12 hours ago   36 comments top 14
lukasm 6 hours ago 0 replies      
There is a joke in Poland that goes like this:

A Polish minister goes to France to meet his counterpart. They meet in an amazing office and he asks his French colleague:
- How did you get the money to build this?
- Can you see the bridge outside the window?
- Yes.
- 500 mln on paper, built it for 250. Voilà.

After a year they meet again in Poland, in an even bigger and more magnificent building. The French minister asks:
- How did you get the money?
- Can you see the bridge outside the window?
- No.
- Voilà.

carsongross 31 minutes ago 2 replies      
As a reminder, the Golden Gate Bridge, built in 1933, cost $1.5 billion in today's dollars, is 8,980 feet long (to the 11,616 feet of the eastern span), is 746 feet tall (to the 525 feet of the eastern span), and is tremendously architecturally significant, rather than looking like a highway onramp with a preposterously small sail plopped on one end of it.

Please stop this ride.

tristanj 6 hours ago 2 replies      
The SF Chronicle did a great exposé on the Bay Bridge earlier this year. The Chronicle explains how the committee that selected the bridge was made up of specialists in fields only tangentially related to the job at hand: seismic experts, building engineers and architects. These people had not designed bridges before, and chose the design based on aesthetics, not practicality. When given a choice between two bridges - a conventional cable-stayed one at $1.5B and an experimental self-anchored suspension at $1.7B - they chose the attention-grabbing bridge over the practical one. This led to the enormous cost overruns we see today.

Note that the bridge isn't unsafe; the modern span is much, much safer than the old WW2-era span. The new bridge won't collapse in an earthquake. However, because of design failures and shoddy construction, it will not last as long as planned.


triggercut 5 hours ago 2 replies      
Some of these extra costs could not have been prevented or foreseen. No one could have anticipated China's thirst for steel back in 2001, let alone 1995/6; or the fact that, prior to the GFC, investment in major/mega public/private infrastructure projects worldwide was snowballing due to burgeoning western economies, putting a significant increase on demand for that (relatively cheaper) steel too, as well as the engineering services and human capital to drive it.

One thing I see time and again, in projects I've either been involved with directly, or indirectly, is a failure to do any substantive geo-technical assessment in the early planning phases. Not understanding your environmental invariants properly (when they are usually diverse due to the physical scale of these projects) always comes back to bite you later on. It's usually (eventually) a critical path item on any schedule since the structural design is so dependent on them.

"In April 2006, a consortium involving American Bridge and Fluor won the tower contract. It was built in China to save money - a decision that carried its own costs when inspectors later found poor welding and busted bolts at key points that required fixing. Frick says the current $6.5 billion total is a rough estimate, and that it doesn't include interest or financing costs."

A mistake most larger EPCM's made in those days. The horror stories regarding the quality of Chinese steel and fabrication back then are very real. It is unthinkable now to let any fabrication of that nature happen there without adequate on-the-floor supervision and oversight.

nashashmi 2 hours ago 1 reply      
I am a civil engineer, and I see a problem in the industry so gapingly wide that even CEOs in the industry claim it is ripe for "disruption."

This industry is so old, yet the innovation really lags behind. Technology (lame word?) should be speeding things up and making things more accurate and less error-prone, but instead it seems to be delaying the time it takes for work to get completed.

Some promise is held in Information Modeling like BIM or CIM, but they are not trickling into the industry at the pace required. And further, many of the present day engineers are not in a position to understand this stuff.

quanticle 4 hours ago 1 reply      
This is yet another data point against people who argue, "Well, civil engineers can estimate complex projects accurately. Why can't software engineers?" Well, as it turns out, civil engineers aren't that good at estimating either, for the same reasons, no less. Shifting requirements are as much of a problem when building bridges and tunnels as they are when you're building software.
arel 4 hours ago 2 replies      
Regarding the price of steel rising 50%: can't this be mitigated to some extent by buying core materials (or a proportion of them) on the futures market, which is specifically designed to protect the buyer from price fluctuations?
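A stylized illustration of how such a long hedge works (made-up numbers; it ignores basis risk, margin, and contract sizing):

```python
# A buyer who must purchase steel later goes long futures now. If the
# spot price rises, the futures gain offsets the more expensive spot
# purchase, pinning the effective cost near the locked-in price.

def effective_cost(spot_at_delivery, futures_price_locked, tons):
    spot_cost = spot_at_delivery * tons
    futures_pnl = (spot_at_delivery - futures_price_locked) * tons  # long hedge
    return spot_cost - futures_pnl


# Steel jumps 50% (say $400 -> $600/ton) but the hedge locked in $400:
assert effective_cost(600, 400, 10_000) == 400 * 10_000
# If prices fall instead, the hedge loses what the spot purchase saves:
assert effective_cost(300, 400, 10_000) == 400 * 10_000
```

The symmetry in the second assertion is also why hedging is a political decision for a public project: it forgoes the windfall if prices drop.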
ucaetano 4 hours ago 1 reply      
That didn't sit well. Pulitzer-winning architectural critic Allan Temko blasted the skyway option as dull and likened it to an outsized freeway ramp. MTC head Mary King said of the skyway: "While we appreciate the governor has offered vanilla ice cream, we want chocolate sauce on top." One Oakland resident wrote that since the Bay Area was full of such creative types, "I think each of us should draw our own bridge and send it to MTC for consideration."

Oh, you want the chocolate sauce on top? That will be an extra few billion dollars, please.

guelo 1 hour ago 0 replies      
I used to be a supporter of California's high speed rail project until this bridge. It's not just the cost overruns it's also the shoddy work, the political interference, the secrecy, the publicly funded PR bullshiting, and the complete lack of accountability. I really doubt the rail project will be completed in my lifetime.
exelius 2 hours ago 1 reply      
In our environment of cost cutting and "fiscal conservatism", I don't see any other way to fund a large public works project.

If the first estimate had been $4 billion (assuming we're taking $6.5 billion in 2015 dollars and working back to 1998 dollars), the project never would have gotten off the ground. The government would have said "fuck no" and asked for another bid. It would have been mired in discussions, argument, etc for years before eventually settling on a $1 billion price tag -- that will eventually balloon to $7 billion or so anyway, because the winning bidder intentionally underbid because it was the only way it would get approved.
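
The working-back step can be sketched numerically; the 2.5% average inflation rate below is my assumption, not a figure from the comment:

```python
# Deflate $6.5B (2015 dollars) to 1998 dollars at an assumed average rate.
rate = 0.025                        # assumed ~2.5%/year average inflation
years = 2015 - 1998                 # 17 years
value_2015 = 6.5e9
value_1998 = value_2015 / (1 + rate) ** years
print(round(value_1998 / 1e9, 1))   # roughly 4.3, in the ballpark of $4B
```
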

The only way to build large public projects like this is to take advantage of the sunk cost fallacy (or "bait and switch".) Government contractors will get their cut, and the regulatory tack-ons added by local governments to put their stamp on it (and get some operating budget!) also add money.

bigethan 6 hours ago 0 replies      
This quote hits close to home: "Basically at the onset of a project I think the higher ups prefer a dollar amount and schedule that doesn't shock the public."

When the people who are not knowledgeable about the actual details of the project require their expectations to be met, other expectations of theirs will not be met (the classic "fast, cheap, or good, choose two" joke). I like to say to people making unreasonable demands "Do you want to be disappointed now, or later?"

johan_larson 4 hours ago 1 reply      
I wonder if the accuracy of estimates would improve if the clients let it be known that if the accumulated cost ever exceeded twice the initial estimate (or some similar multiple), the whole project team running the show would be replaced, and the project potentially scrapped.

It would be hard to stick to such a pledge, of course.

ck2 29 minutes ago 0 replies      
A better question is why they didn't stop when it hit a 100% overrun.

A $6 billion overrun could have fed, clothed, and health-insured millions of people.

EU data protection law after the Safe Harbour judgment eulawanalysis.blogspot.com
69 points by robin_reala  8 hours ago   63 comments top 6
PeterStuer 7 hours ago 12 replies      
Interesting tidbit: If you try to refer to this article with a link on Facebook, they will block you from posting it.
jamesblonde 7 hours ago 3 replies      
In general, this is a very good thing. The main outcome will be that more engineers will be needed to do more work to ensure that data is handled more carefully. The cost will be slightly reduced profits at companies that handle large volumes of data globally. What's bad about that?
fauigerzigerk 5 hours ago 1 reply      
There is one key issue that is routinely ignored. The US and other countries have two sets of data protection rules that govern police and security services: one set of rules for residents of that country (e.g. US persons) or domestic data, and another much less stringent set of rules for everyone else.

So even if data protection rules were perfectly adequate in every single country on this planet, there would still be justified concern about transferring data across borders.

That's a situation that must change, and it can change without taking away the bowl of sweets from security agencies altogether (which will never happen).

pjc50 5 hours ago 3 replies      
I think there's a bit of a rush to panic about data balkanisation here; remember, this is not a ruling that applies directly to Facebook, but to the information commissioner of Ireland.

There's no new policy and no court orders to do particular things. What's likely to happen is an extensive legal limbo. We may even end up with a special Snowden version of the cookie warning: "Data stored on this system is subject to mass surveillance and may be accessed by the security services without a warrant or due process".

tajen 5 hours ago 2 replies      
Question: If Facebook manages data within Europe, what are the safeguards in place to ensure that there won't be mass surveillance, e.g. face recognition, shadow profiles, friend graph browsing?
mtgx 7 hours ago 0 replies      
> Since the Court refers frequently to the primary law rules in the Charter, there's no real chance to escape what it says by signing new treaties (even the planned TTIP or TiSA)

Oh good, I was worried a little about that one.

> Undoubtedly (as the CJEU accepted) national security interests are legitimate, but in the context of defining adequacy, they do not justify mass surveillance or insufficient safeguards.

Another good thing. I wasn't sure if this ruling affects spy agencies, too, or just companies.

The Most Mysterious Star in Our Galaxy theatlantic.com
87 points by hvs  3 hours ago   32 comments top 11
xlm1717 1 hour ago 2 replies      
The article dismisses natural explanations as "wanting", but then goes on to dedicate half the article to aliens?

It claims that another star having a close encounter with KIC 8462852 (the star discussed in the article) and stirring up its comet cloud "would be an extraordinary coincidence". There is, in fact, evidence that such an encounter happened in our own solar system, "only a few millennia before humans developed the tech to loft a telescope into space." Calculating the speed and trajectory of a particular star, astronomers found that it would have crossed within the radius of the Oort Cloud approximately 70,000 years ago, producing exactly the scenario that "would be an extraordinary coincidence."[1]

Additionally, astronomers have checked stars in the galaxy for the possibility of a close encounter with our solar system, and find dozens of such candidates to come close to our solar system, sometimes within the radius of the Oort cloud, within the next million years (some as close as 240,000-470,000 years from now).[2]

The idea that a passing star would stir up the comet cloud of KIC 8462852 should not be dismissed as a coincidence, especially not to give leeway to discuss the potential for intelligence to build megastructures, when we see that such a coincidence is not even that rare for our own solar system.

[1] http://iopscience.iop.org/article/10.1088/2041-8205/800/1/L1...

[2] http://www.mpia.de/~calj/stellar_encounters/stellar_encounte...

joeyspn 1 hour ago 0 replies      
And here's a somewhat more mundane anti-clickbait article...

"Citizen scientists catch cloud of comets orbiting distant star"


yk 1 hour ago 0 replies      

The paper. Basically Kepler, or to be more precise the Planet Hunters crowdsourcing effort, found a star with a rather strange light curve, and the Atlantic jumped the gun and babbles about aliens.

StreakyCobra 1 hour ago 4 replies      
"But that would be an extraordinary coincidence, if that happened so recently, only a few millennia before humans developed the tech to loft a telescope into space. That's a narrow band of time, cosmically speaking."

"After all, this light pattern doesn't show up anywhere else, across 150,000 stars. We know that something strange is going on out there."

It would also be an extraordinary coincidence if we found another planet with life on it, so quickly after humans started to look for it. 150,000 stars is a narrow band of the universe, cosmically speaking.

codezero 1 hour ago 0 replies      
There is so much cool science out there, I wish we could accept it as cool without it being aliens. The phenomenon that led to this odd flux observation is more interesting than hypothetical alien megastructures.

Another cool citizen science project was the observations of the epsilon Aurigae transit. https://en.wikipedia.org/wiki/Epsilon_Aurigae

MrBra 11 minutes ago 0 replies      
Yes but, how many light-years away is that star?
suprjami 2 hours ago 2 replies      
So what's the star called? Even if it's just a boring numbers-and-letters name it'd still be nice to know.
ryandvm 26 minutes ago 0 replies      
Hmmm, looks like somebody threw Occam's razor in the trash...
stupidcar 2 hours ago 1 reply      
I know I shouldn't get excited about this, as a more prosaic explanation is far more likely, but my heart still skipped a beat when I read "stellar-light collectors."
LoSboccacc 2 hours ago 3 replies      
just hypothetically speaking: wouldn't a dyson array need to beam energy to the planet where those aliens live?

we should be able to pick up that high energy stream with current technology, but we'd need first to guess at which frequency it transmits to (if it's there)

edit: ok scratch that I stand corrected I was still thinking in earthling terms

dynofuz 2 hours ago 0 replies      
This is great. I always thought an advanced et would want to build something like a dyson hemisphere around its star.
Convolutional neural networks and feature extraction with Python christianperone.com
67 points by perone  15 hours ago   2 comments top
amelius 2 hours ago 1 reply      
It would be nice if there existed a set of standard problems, a set of benchmarks for each of them, and an overview of methods to approach these problems and corresponding benchmarks. Then for each problem, also a set of implementations.

Right now, the field of neural networks seems like a maze. It is too easy to get lost, or to settle on the wrong, suboptimal solution.

Optimizing Sidekiq mikeperham.com
14 points by nateberkopec  1 hour ago   1 comment top
sandGorgon 5 minutes ago 0 replies      
How do you guys deploy Sidekiq. The big problem is when I deploy Sidekiq through Foreman, then I always end up with hung processes on restart. Somehow, the TERM signal does not work very well.

I have now started using supervisor for deploying Sidekiq, but I would have preferred a Foreman like tool (so that development is also nice and simple)

Rewilding process could soon return wolves to Scotland bbc.com
52 points by nols  13 hours ago   63 comments top 12
sandworm101 8 hours ago 2 replies      
Britain and Japan, both island nations where anything remotely threatening was wiped out long ago. The largest predator in Japan is the salamander. A meter-long specimen will make news.


What's the largest predator in Britain? The badger? The fox? Or that housecat everyone thought was a lion.


Talk to anyone in the pacific northwest. If you take only the slightest precautions you have nothing to fear from the wolves, cougars and bears. You are far more likely to be eaten by a fellow human. You are more likely to be killed by deer. They are already all over Britain. So the wolves will in all probability reduce the number of animal-related deaths.


smackay 7 hours ago 1 reply      
The environmental benefits of re-introducing wolves are well described. However, with more and more people packed into the south of England, the economic benefits from tourism are going to be enormous. For example the re-introduction of the White-tailed Eagle benefits the economy of the island of Mull to the tune of 5 million UK pounds per year.


dpflan 10 hours ago 1 reply      
Here is a HN submission (https://news.ycombinator.com/item?id=8448929) for the How Wolves Change Rivers video that is mentioned in this article. (It was posted exactly a year ago, a coincidental anniversary of the topic here on HN).

There are some good discussions and great links to related content including the TED talk on 'rewilding' by George Monbiot who narrates the How Wolves Change Rivers video.

techterrier 6 hours ago 1 reply      
Lack of wolves and bears has caused some interesting side effects in Southern England where I live. Namely, the deer population has exploded as their only predator is now the motorcar. It's pretty cool seeing massive flocks of them and the occasional close encounter on my mountain bike; I once fed one some lettuce from a sandwich.

On the other hand, they are eating all the woods, starting with the saplings which is causing real harm to the sustainability of forests.

Predictably, the notion of culling some is very controversial, especially from nature loving people. But the alternative is bringing back the wolves. The wolves will do lots of wolfy things like killing dogs and eating livestock and be equally controversial.

(edit: SPAG)

arethuza 7 hours ago 0 replies      
There are some great charities trying to restore the ancient woodlands of Scotland - the Trees for Life site has a lot of good information about some of the issues relating to high deer populations:


mapt 10 hours ago 1 reply      
You know, we could avoid a lot of the culturally ingrained fear of wolves if we instead "returned Labrador Retrievers to the wild", and let feral packs of them roam Scotland pursuing deer. So cute.
m-i-l 3 hours ago 0 replies      
It is interesting how something considered harmful can have beneficial effects. In this case reintroducing wolves could control the deer population and help re-establish some of the Caledonian Forest.

There was another example I read recently: the Indian Vulture Crisis[0]. Apparently the vulture population in India has been declining dramatically. I wouldn't have thought vultures were particularly good, but their declining population has led to all sorts of significant issues such as an explosion in the number of wild dogs and the spread of disease. It has been traced to the administration of an anti-inflammatory called diclofenac to livestock.

Nature has many complex interactions.

[0] https://en.wikipedia.org/wiki/Indian_vulture_crisis

barking 7 hours ago 0 replies      
In the 17th century one of the nicknames some English people had for Ireland was Wolfland.

The century following the Cromwellian conquest saw a bounty-led drive to exterminate wolves with the last one being killed in 1786.


cskau 6 hours ago 1 reply      
Wolves also returned (naturally, I believe) to Denmark last year: http://www.telegraph.co.uk/news/worldnews/europe/denmark/112...
chroma 10 hours ago 5 replies      
"To be lying in your tent in the middle of nowhere and to hear a wolf cry. Now that must be quite something."

The first time? Maybe. It quickly becomes distracting, annoying, and (depending on the distance) frightening.

The only decent argument I can find for reintroducing wolves is that it would help keep wild deer in check. But the costs of wolves are far higher than the costs of too many deer. Deer don't kill livestock or humans. And of course, wolves aren't the only solution to reducing the deer population. They can be culled in other ways. The whole thing seems like a non-starter to me.

I think most who are in favor of reintroducing wolves are just infatuated with charismatic megafauna. "Wolves look cool and they used to be on the island, so let's bring 'em back." or something like that. Then they rationalize their conclusion with arguments about tourism and culling deer.

What if instead of wolves, it was crocodiles that had been eradicated from Scotland? I seriously doubt there would be as many supporters, yet the same arguments for reintroduction apply.

intellix 8 hours ago 0 replies      
The rabbits are on strike
peteretep 10 hours ago 0 replies      
Oooh, maybe we could rewild polio and TB too!
Open source projects for modern COBOL development opensource.com
51 points by vezzy-fnord  15 hours ago   26 comments top 7
kagamine 1 hour ago 1 reply      
We use a COBOL variant from a company in the US that is not open-sauce, but is very, very efficient and in continued development, and we like it. A lot. It has a simplified syntax over COBOL: for example, no 6 columns on each line before code, no weird definition section at the start, and seamless integration with the DB (no SQL; just "READ users INVALID KEY next sentence" will open and read a file and handle an error (no primary key found)).

It's so efficient that it's hard to grok at first. And 'normal' COBOL syntax works. And it comes with a web framework. If anyone on HN is sitting on an aging COBOL project/app, you could do worse than to chat to Zortec in Tennessee.

Kristine1975 6 hours ago 3 replies      
Say what you want about COBOL, but at least it had a built-in data type for decimal numbers (looking at you, Java).
jestar_jokin 6 hours ago 1 reply      
> Most of the COBOL being written today is for maintaining legacy code, not starting new projects.

Funny thing, I worked on a project with a mainframe division, that replaced a crusty old COBOL-based mainframe app with... an RPG mainframe app. In the 2010s. Ostensibly because we could leverage existing knowledge from other divisions under the same parent company, but really because of politics, as these things normally are.

protomyth 1 hour ago 1 reply      
I still think it would be interesting to do a COBOL / RPG IV compiler to WebAssembly. The decimal handling will probably be an issue, but I am pretty tempted to give it a go when the WebAssembly stuff matures a bit.
Boggins 9 hours ago 3 replies      
"...allows you to use COBOL in node.js".

What a vivisection of nasty things!

jevgeni 5 hours ago 0 replies      
COBOL is like HN's shutter shades.
Show HN: Rune.js Graphic design systems with SVG in the browser or Node.js runemadsen.github.io
84 points by runemadsen  13 hours ago   15 comments top 8
simple10 24 minutes ago 0 replies      
linqables 7 hours ago 1 reply      
This looks really interesting. Are you planning to do text as well? Specifically, text with line wrapping?
andrewray 9 hours ago 1 reply      
This seems to take the declarative nature of SVG and make it more imperative, like HTML5 Canvas, even though it ironically uses a virtual DOM implementation under the hood. For my React projects I chose SVG over canvas because I could write my layout declaratively, and avoided imperative manipulation libraries. How do you modify existing SVG shapes with this library?
bigethan 6 hours ago 0 replies      
The server side rendering looks really interesting (I've been getting into SVG for mobile web graphics). Are there other SVG libs that do server side rendering, or is this unique as it seems?
amelius 5 hours ago 1 reply      
I want an API where I can intersect or combine paths, and be able to read out the resulting paths.
jacobolus 9 hours ago 2 replies      
The examples are horribly broken in Safari.

Anyway, echoing andrewray, what's the point of this? It seems no easier than just working with SVG directly. It's considerably less powerful than e.g. using D3.

kyriakos 7 hours ago 0 replies      
great work there.
hundunpao 5 hours ago 0 replies      
I love it!
Results from Nigerias Business Plan Competition priceonomics.com
36 points by davidiach  10 hours ago   3 comments top 3
notahacker 54 minutes ago 0 replies      
Pity the Priceonomics article didn't speculate on the effect of the size of the grant. According to the article the average grant size was in the region of $50k paid in tranches contingent on low-end performance targets, which is serious funding for many types of business in a country with (non PPP) per capita GDP of around $3k.

Whilst it's heartening to see that these top businesses apparently didn't waste the cash, it's not especially surprising to see that enough free cash to pay 3 workers' entire salaries for 3 years significantly boosted the chances of the business surviving over that period.

awjr 6 hours ago 0 replies      
I think what is interesting is how they recognised a real issue in the potential for corruption/nepotism and designed this issue out of the selection process.

I also think it was quite interesting how they had a preference for existing businesses from a commitment point of view (and probably to prevent corruption) and how they now think this was wrong.

Will be interesting to see the results from Phases 2-4.

qznc 7 hours ago 0 replies      
Sounds somewhat like YC done by government.
Counterintuitive economics of a chess tournament davidsmerdon.com
31 points by rkrzr  9 hours ago   34 comments top 4
dkbrk 7 hours ago 1 reply      
I'm not sure that it makes a difference in this case, but it is insufficient in situations such as these to look at just the expected return. A rational agent's actions are determined by expected utility, which could be highly non-linear depending on myriad factors such as the player's net worth. That is, it could be rational to trade a lower chance of the largest payoff for a more consistent payoff of a lesser but still substantial amount, even at the cost of a significantly smaller expected return.
rm999 2 hours ago 0 replies      
This stuck out to me:

> Even under some very tolerant assumptions, the expected payoff from playing on, for either player, was greater than the expected payoff from accepting the repetition.

Payoff, sure. But it's well-known that the marginal utility of money is not linear, which means people tend to value money differently based on how much of it they have (poor people value a dollar more than rich people). This indirectly means some level of risk aversion is actually an optimal choice. Turning down a gamble, even if the expected payout is positive, can be rational.

This is well studied in economic theory: https://en.wikipedia.org/wiki/Von_Neumann%E2%80%93Morgenster...
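
A toy calculation makes the risk-aversion point concrete; the log utility function and the round dollar amounts are my own choices, not from the comment:

```python
import math

def expected_utility(lottery, u):
    # lottery: list of (probability, final wealth) pairs
    return sum(p * u(w) for p, w in lottery)

wealth = 10_000
# Gamble: 50% win 15,000 / 50% lose 9,000 -> expected payoff is +3,000.
gamble = [(0.5, wealth + 15_000), (0.5, wealth - 9_000)]

ev = sum(p * w for p, w in gamble)              # 13,000 > 10,000
eu_gamble = expected_utility(gamble, math.log)  # diminishing marginal utility
eu_sure = math.log(wealth)

# Positive expected payoff, yet the log-utility agent declines the gamble:
print(ev > wealth, eu_sure > eu_gamble)         # True True
```
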

TorKlingberg 7 hours ago 1 reply      
Could someone provide some context? Had Nakamura and McShane already played each other, and the second game looked like it was going to become an exact copy?
throwaeayBan 7 hours ago 6 replies      
"Top female Prize " and... no top male prize...We can never be equal while there is any discrimination even when it is possitive. And please don't start with "we need more women in chess thus...". No, we need good players in ches and not insert anything here. And tell, what forces us to improve more than competition? Why play better when you will get prize anyway?
Simple Sequential A/B Testing evanmiller.org
54 points by revorad  17 hours ago   9 comments top 2
warkon 9 hours ago 4 replies      
A much simpler approach is to AABB test instead of AB test. Rather than splitting your users into 2 buckets (A and B), split them into 4 buckets (A1, A2, B1, B2). Give groups A1 and A2 one variation and groups B1 and B2 the other variation. When A1 equals A2 and B1 equals B2 then you know you have statistical significance and you can compare A1+A2 to B1+B2.
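
The four-way split itself is easy to do deterministically; a sketch of the bucketing step (the hash choice and bucket names are mine, not from the comment):

```python
import hashlib

def bucket(user_id: str) -> str:
    """Stably assign a user to one of four buckets A1/A2/B1/B2."""
    h = int(hashlib.sha256(user_id.encode()).hexdigest(), 16)
    return ("A1", "A2", "B1", "B2")[h % 4]

# A1/A2 see variation A, B1/B2 see variation B; the same user
# always lands in the same bucket, so exposure stays consistent.
print(bucket("user-42") == bucket("user-42"))   # True
```
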
tristanz 9 hours ago 1 reply      
Why not just estimate p(A - B | observed data) and be done with it?
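
One common way to get at p(B > A | observed data) is Monte Carlo sampling from Beta posteriors; a sketch with made-up conversion counts, assuming uniform Beta(1,1) priors:

```python
import random

def prob_b_beats_a(conv_a, n_a, conv_b, n_b, draws=50_000, seed=0):
    """Estimate P(rate_B > rate_A) under independent Beta(1,1) priors."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(draws):
        a = rng.betavariate(1 + conv_a, 1 + n_a - conv_a)
        b = rng.betavariate(1 + conv_b, 1 + n_b - conv_b)
        wins += b > a
    return wins / draws

# Hypothetical data: 120/1000 conversions on A, 150/1000 on B.
p = prob_b_beats_a(120, 1000, 150, 1000)
print(0.9 < p < 1.0)   # B is very likely the better variant here
```
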
Judge: NYC Seizing Thousands of Cars Without Warrants Is Unconstitutional amny.com
260 points by bane  16 hours ago   87 comments top 16
zaroth 13 hours ago 2 replies      
I agree 100%: this is a perfect example of where we the people rely entirely on the judiciary to provide a remedy. That such an obviously illegal practice could continue for years unfortunately does not reflect well on any notion of swift justice.

It should be possible to get a temporary restraining order against the city in cases like this within days of the first contested case. It should be easy to demonstrate there is no imminent harm in telling the city "you have to stop doing this until we decide whether it's OK or not"; quite the opposite: cars are an essential and significant asset, and this policy placed a potentially massive burden on the citizens it affected.

In one of the examples, by the time the victim prevailed against the illegal seizure backed by zero evidence or investigation of any kind, they had already sold off his car, and offered nothing in return. A pretty large part of the population doesn't have a spare $2,000 in cash to get their own car back while the city makes them prove in front of a Kangaroo Court that they were driving their own family to the airport... Missing from the article -- is there any hope of any kind of restitution? Can the victims now pursue a civil case against the city?

mapt 10 hours ago 1 reply      
You: The city is stealing my car without probable cause in an attempt to extort money from me.

City DA: No they're not.

What's your recourse here? Call the FBI or federal prosecutor and report an organized crime syndicate being run by corrupt law enforcement professionals? Because... isn't that what this is?

Is there any onus, or even incentive, for them to listen and investigate? Is the only way to redress the problems a civil lawsuit against the City citing Bivens and various appellate court principles like malicious prosecution? Because grand theft auto, extortion, racketeering, and fabrication of evidence / perjury are not civil offenses, and conservative readings of the concept of 'standing', as I understand it, make it rather difficult to challenge the authors of a failed / withdrawn prosecution in order to get at the legal principles which triggered it.

Concepts like this one, as well as things like civil asset forfeiture, are so clearly in direct violation of the Constitution that at some point, it's not legitimate to shelter enforcers under cover of "just following orders". We still have laws (Constitutional and common), and Peabody, Minnesota doesn't have the right to do things like put all the gay residents to death by legislative fiat & judicial compliance; If you found this occurring, you wouldn't need to file a lawsuit alleging that a constitutional overreach has been committed and demanding merely that the policy cease to be in effect. Instead, you would get some overriding authority, like the state police or the FBI, to run in with SWAT teams and arrest and prosecute every last person peripherally attached to the Peabody legislature or judiciary or law enforcement. For murder.

No amount of 'adopting selective prosecution based on what we can win, since the courts recognized a valid affirmative defence' or 'changing training programs to be more in line with civil rights' or 'firing/reprimanding the officers involved and settling a civil suit' makes killing the gay population of Peabody less of a crime, and no amount of lawsuit would be required to get that recognized.

maehwasu 12 hours ago 0 replies      
And once again, the nice thing about living in not America is that bribes are significantly cheaper.
grecy 12 hours ago 1 reply      
>Probable cause is not a talismanic phrase that can be waved like a wand to justify the seizure of any property without a warrant

Does that apply to civil forfeiture as well? Sounds like it should.

aswanson 12 hours ago 2 replies      
Why is the regular news reading more and more like my Onion RSS feed? I have a feeling things were always this absurd, if not more so, but the idiocy gets amplified now by the channels being so connected.
peeters 10 hours ago 0 replies      
I think the most interesting, or scary, part of all of this is the justification for this warrantless search and seizure: to stop Uber from operating in the city. Usually the government has to invoke public safety to try to justify removing individual rights. Now they can just invoke the taxi lobby I guess.
thoman23 13 hours ago 0 replies      
So the government should not arbitrarily seize property from its own citizenry? I'm sure they will take that under advisement.
dandare 10 hours ago 2 replies      
This is one of the things that fascinates me about the US. Such blatant injustice would be unthinkable in Europe.
avoutthere 13 hours ago 3 replies      
Wow, how was this ever legal to begin with?
dannysu 13 hours ago 2 replies      
I was getting redirected to http://www.forbes.com/forbes/welcome/ if I clicked the link on HN.

If I copy & paste the link to a new tab, then it worked for me.

dools 12 hours ago 0 replies      
Just another example of how prohibiting human behaviour instead of regulating it leads to overzealous police and undue burden on law-abiding citizens.

They should take a page out of London's book and allow minicabs to operate.

AdmiralAsshat 13 hours ago 1 reply      
Is the lack of any page displaying with adblocking turned on intentional, or is it simply bad design?
timtas 12 hours ago 0 replies      
Yet another reason why I've stopped using plural pronouns to refer to the state at any level.
briandear 7 hours ago 2 replies      
Has anyone ever died because of an unlicensed limo? Is it really a threat to public safety? If consenting adults agree to a transaction, I am not sure how that's the government's business. However, if an unlicensed vehicle was portraying itself as a licensed vehicle, then you have a fraud issue, not a public safety one.
pbreit 14 hours ago 2 replies      
Is this an Uber thing? I didn't see it mentioned.
bsder 14 hours ago 4 replies      
What's the deal with all the Forbes links redirecting to welcome? How do I stop this?

I tried checking the "Warn me when websites try to redirect or reload the page" box in Firefox, but it doesn't appear to be stopping it.

Presumably too many people are starting to use things like "Google Sent Me".

       cached 14 October 2015 16:02:03 GMT