hacker news with inline top comments, 3 Nov 2015
Computer, Respond to This Email googleresearch.blogspot.com
204 points by dpf  3 hours ago   79 comments top 20
1
supersan 46 minutes ago 2 replies      
I'm often amazed how Google still lets its staff take this unique hacker-like approach to their million-dollar projects.

For example, the stuff in Google Labs (Gmail) has had some silly things, like "don't hit send when you're drunk", but also some very useful features which may be considered unorthodox, like "Gmail telling you when it thinks you wanted to attach a file but forgot".

It's something I wouldn't generally associate with a very corporate design but here they are wanting to add another silly feature and who knows how it will turn out? Maybe it will be super useful and then all the other companies will start to copy it.

But the thing is they're inventing new ways that really don't fit your product development roadmaps. I really like that about them.

2
imh 1 hour ago 2 replies      
I appreciate the privacy standards they used (no humans reading your email to develop this), but I am concerned that it's not enough. As I understand it, with language models overfitting takes the form of returning a sequence of words seen in the training set. If the model is overfitting in any part of the response space, this could happen. Out of a million emails, how many suggested responses are going to substantively resemble another response the original author wouldn't want read by others?
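
As a toy illustration of that memorization worry (this is a made-up bigram counter over invented strings, nothing to do with Google's actual model), greedy generation over a tiny training set simply replays training text:

    // Toy bigram "language model": with tiny training data, greedy generation
    // just replays a training sequence verbatim, i.e. it memorizes.
    let trainingReplies = [
        "sorry i will be late to the meeting",
        "thanks see you tonight"
    ]
    var bigrams: [String: [String: Int]] = [:]
    for reply in trainingReplies {
        let words = reply.split(separator: " ").map(String.init)
        for (a, b) in zip(words, words.dropFirst()) {
            bigrams[a, default: [:]][b, default: 0] += 1
        }
    }
    // Greedy generation: always pick the most frequent next word.
    func generate(from start: String, maxWords: Int = 10) -> String {
        var out = [start]
        var current = start
        for _ in 0..<maxWords {
            guard let next = bigrams[current]?.max(by: { $0.value < $1.value })?.key else { break }
            out.append(next)
            current = next
        }
        return out.joined(separator: " ")
    }
    print(generate(from: "sorry"))  // prints a training reply word for word

A real sequence-to-sequence model is far less crude than this, but the concern scales up: wherever it over-fits, the most probable continuation is someone's actual sentence.
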
3
rcthompson 19 minutes ago 0 replies      
Hey, remember that time this was a Google April Fools Day joke[1]? Kind of funny how it's now a serious and actually useful-looking product.

[1] https://www.gmail.com/mail/help/autopilot/

4
icey 3 hours ago 2 replies      
I just started reading Avogadro Corp (http://www.amazon.com/Avogadro-Corp-Singularity-Closer-Appea...) this weekend, and this reminds me quite a lot of the emergent AI that figures heavily in the story (ELOPe). A quick synopsis: developers build a system to "improve" responses from emails. The system at some point is given the ability to send emails on its own, and a poorly issued directive. It's been an engaging read so far, and fairly hilarious since the corporation in the book is very obviously based on Google.
5
dikaiosune 2 hours ago 5 replies      
While the technology is very cool, this pushes me that much closer to a "dumber" email service. It's always a conflict for me as I love shiny new things, but I'd rather we let AI loose on someone else's email, especially when the AI's revenue stream is advertising (a business predicated on knowing as much as possible about your audience).
6
davedx 11 minutes ago 0 replies      
Meanwhile, Twitter change their favourites from stars to hearts.
7
alphydan 2 hours ago 2 replies      
> Another bizarre feature of our early prototype was its propensity to respond with "I love you" to seemingly anything

Is it trained mostly on personal gmail accounts?

8
hokkos 1 hour ago 0 replies      
Do the responses generated from one user's replies leak to another user? Because that could be a major privacy flaw.
9
julianozen 2 hours ago 0 replies      
Where's the "sorry, I won't be able to make it" reply in the server-dropping email?
10
OkGoDoIt 1 hour ago 0 replies      
This looks more powerful than iOS's context sensitive keyboard, but that also looked impressive when demoed. It will be interesting to try this out with real-world emails and see how well it works in practice.

This gets really exciting when implemented on a smartwatch. Single tap responses that are actually useful would make smartwatches significantly more powerful. Speech recognition is great but it's not always appropriate for social environments.

11
ambicapter 1 hour ago 0 replies      
Sounds like the system has no way to take context into account. Each message is encoded into a 'thought' vector, but is it difficult to come up with a 'thought' that changes meaning depending on prior messages? I imagine it is concerns like this that lead people to look into Hierarchical Temporal Memory and similar techniques.
12
deckar01 2 hours ago 0 replies      
Until the end of the article, I kept imagining it would generate responses based on MY past emails. For example, given "When does your flight leave for Paris?", I imagined it would pull the info from Google Now and reply "Friday at 9am", but I don't think it is built for this type of specific question.
13
tabrischen 58 minutes ago 0 replies      
When you receive a call today, you are already prompted with auto response messages such as 'I'm busy, I'll call you back.' This seems like the natural next step.
14
sagarjauhari 2 hours ago 0 replies      
Similar to the quick text feature when not taking a phone call, this could come in handy - especially on phones.

But does that mean I'll have to re-enable Inbox to try it? Duh :(

15
mapleoin 2 hours ago 2 replies      
Finally! This will dispel all those rumours that Google reads all your email.
16
msoad 1 hour ago 1 reply      
It's not available for Google Apps users or am I missing something? It doesn't show any replies to me...
17
pjmlp 1 hour ago 1 reply      
Waiting for the news about the social and legal problems that will happen from tapping the wrong suggested answer.
18
yeukhon 1 hour ago 0 replies      
Setting aside all other concerns: for operations / infrastructure / devops, auto-replies about server downtime or inquiries about performance are exactly the kind of thing I am actively thinking about and starting to learn how to implement in my organization.
19
drzaiusapelord 1 hour ago 0 replies      
> But replying to email on mobile is a real pain, even for short replies.

Imo, this has already been solved. I just whisper replies into my watch for both email and SMS. Google's voice recognition is finally good enough for this. I can't think of anything more convenient.

I'm not sure if AI-ish replies are something I'm interested in. Throwing complex fuzzy logic at what should be a hardware/interface problem seems shortsighted. I prefer a much higher level of granular control, especially if these messages are work related.

20
Sevzinn 15 minutes ago 0 replies      
Senior research scientist? So, a forgetful research scientist?
I Tried to Buy an Actual Barrel of Crude Oil bloomberg.com
320 points by sergeant3  6 hours ago   115 comments top 18
1
comex 5 hours ago 8 replies      
Another notable excerpt from the linked state safety agency document (http://www.michigan.gov/documents/cis_wsh_cet5041_90142_7.do...):

--

Two employees of a fertilizer company in Riga, Michigan, were assigned to install a new float valve in an old 35-foot deep cistern for a new 300-foot well. This cistern was covered with a concrete slab with entry through a covered manhole. The first worker entered the cistern and as he reached a plank platform six feet below the opening, he was instantly overcome and fell unconscious into the water below. The man on the surface immediately ran to the nearby plant for help. Several workmen responded and two of them entered the cistern to render aid. They met the fate of the first worker. A passerby who had been drawn to the scene by the crowd which had gathered was told by an excited bystander that several men in the cistern were drowning. Upon hearing this, he shouted, "I can swim, I can swim" and pulled away from a company employee who was trying to restrain him. Now there were four bodies in the well.

Shortly afterward the fire department arrived at the scene with proper rescue equipment. The fire chief entered the cistern wearing a self-contained breathing apparatus. After reaching the plank platform, he removed his face mask to shout instructions to those on the surface and he, too, was instantly overcome. All five persons died in the cistern.

Tests of the cistern atmosphere revealed that H2S in a concentration of about 1000 parts per million was present when the five deaths occurred. The water pumped up from the deep well contained dissolved hydrogen sulfide which was released in the unventilated cistern.

2
nosuchthing 1 minute ago 0 replies      

 Enterprising Child Saves $54 To Buy Barrel Of Oil
http://www.theonion.com/graphic/enterprising-child-saves-54-...

3
33W 3 hours ago 3 replies      
Planet Money had a recent podcast titled "The Onion King", about someone who took physical control of onion futures, and about why that is the one type of futures contract outlawed by Congress.

http://www.npr.org/sections/money/2015/10/14/448718171/episo...

4
kyleblarson 4 hours ago 1 reply      
I worked at a small quant fund and we once had a bug that caused our roll forward trade to fail on a rather large position of front month WTI contracts. We technically were going to have to take delivery since we were long the contracts at expiry, but our brokerage was nice enough to fix the situation for us.
5
strictnein 5 hours ago 3 replies      
On a related note, the (slightly stylized) retelling of a time a commodities trader accidentally bought actual coal:

http://thedailywtf.com/articles/Special-Delivery

6
cheriot 2 hours ago 0 replies      
The best example I've seen of this form of journalism was NPR's purchase of a "toxic asset" during the financial crisis.

http://www.npr.org/series/124587240/planet-money-s-toxic-ass...

8
teddyg1 5 hours ago 1 reply      
I did this recently as well (though for a different purpose, chemical testing). It's even more difficult than the journalist describes. No refinery wants to stop processing or halt flow in their pipelines, and so getting WTI or Bakken oil is immensely difficult when you're not at the source. There are small mom & pop ~2-3 bbl/day producers who are willing to sell their oil at a premium, however.

Ventilation is a huge problem - H2S emits a very foul smell. Quickly open and close a small bottle of crude in a room, and the scent lingers for a few minutes or more.

9
chrisBob 5 hours ago 0 replies      
At work we have recently been working with an oil company. Even on a project fully funded by this oil company it took us months to convince them to send us a few bottles containing about 100mL each. Getting rock samples from their oilfield, which is what they really want us to do tests on, was much harder.
10
bigethan 5 hours ago 4 replies      
Relatedly(?) When I had a diesel car, I bought and stored a small barrel of biodiesel in the basement of our apartment (saving like $0.05 a gallon!). I was told that diesel doesn't burn except under extreme pressure, and that you could throw a match into an open cup of diesel and it would just go out.

Never actually tested that out.

11
drcode 6 hours ago 2 replies      
Would be great for parties though:

"Hey, what's that thing in pantry?"

"Oh, that's just my barrel of crude oil."

12
SeanDav 5 hours ago 0 replies      
I was pleasantly surprised at the subtle humour in this article, although some of it does require a bit of insider knowledge to fully appreciate!
13
stcredzero 2 hours ago 0 replies      
I bought a tiny bottle of crude oil from Drake's Well near Titusville, Pennsylvania as a grade school kid. We used it as furniture polish and for various things around the house. (Drake's Well was the original oil well.)
14
iheartmemcache 5 hours ago 3 replies      
I'm not in petrochemicals, but neither is this reporter. I get it, it's supposed to be a joke, but it's such a poorly executed one, fraught with so many major components glossed over, that it detracts from any humor the article could have had. When the Onion writes something, its humor is wonderful because not only is the wit there, but there are few if any inconsistencies, even for pedants who dissect things such as I.

1000 ppm of hydrogen sulfide surely can kill you. The workaround for this is to take delivery of a barrel of "sweet" rather than "sour" crude. Even in the "sour" form, you get "up to" 1000 ppm. However, barrels 1) are sealed properly by institutions with safety regulatory agencies just like any other industry, and 2) to get H2S exposure at that rate, you'd have to effectively stick your head into a barrel and inhale directly. To calculate the actual ppm you'd likely be exposed to, you'd have to evaluate the aromatic distribution of H2S (see: Gas-Phase Reactions: Kinetics and Mechanisms, Chapter 2). The journalist is right about the insurance policies.

She later even mentions she has sweet crude. :facepalm:

>> My boss insists that I must factor in the cost of lost productivity for the many minutes spent on the phone with FedEx in an attempt to trace the package as it zigzagged across Manhattan. On that basis, I'm probably already in the red.

Bloomberg - I love your reporting for what it is. Tom crushes it in the morning. Alyx Steel makes poignant remarks. BBC doesn't try to be The Daily Show. Stick to what you're good at & don't try to be VICE. Especially for someone who reports on finance, she's overlooking so many things. Insurance she mentioned earlier, but she didn't bother to factor in that capex. Storage is an opex (she mentions consignment venues (tech analogy: co-locating your servers), but those are fees she didn't factor in). In real life, if she's arbitraging US oil, the journo needs to be a CME member. For a March maturity date, a ~3 mo membership lease is $1k as an opex.

* Further reading:

"Petroleum and Gas Field Processing", et al. Chapter 7: Crude Oil Stabilization and Sweetening

"Gas-Phase Reactions: Kinetics and Mechanisms" V.N. Kondratie, et al. Chapter 2 (or there abouts).

Merck on pulmonary irritants: http://www.merckmanuals.com/professional/pulmonary-disorders...

CME fees: http://www.cmegroup.com/company/membership/membership-pricin...

Edit: To anyone who's downvoting me, by all means please respond (with some cordiality if you would), as to why! I tried to cite my sources properly, but if any professionals who deal with futures (especially crude oil, and even more so those who have taken delivery or performed the arbitrage she "simulated") see anything factually incorrect, I'm really happy to be corrected.

Edit 2: I responded as to why I found this article so aggravating below. Please read it before you down-vote me. It's comparable to how frustrated my tech friends and I get when television shows cross the threshold of comical technical inaccuracy and progress into a zone of visceral opposition. HN traditionally had a "Don't downvote people solely because you disagree with them" policy. My remarks are relevant to the article and (to my knowledge) factually correct.

15
MrJagil 3 hours ago 0 replies      
How is it possible for the workers in There Will Be Blood to hack through a foot of oil to get to the vein, if a few seconds of exposure kills you?
16
maxerickson 6 hours ago 0 replies      
Just for comparison's sake, 1000 PPM is a good deal higher than the carbon dioxide content of the atmosphere, which is ~345 PPM.
17
Arnor 4 hours ago 2 replies      
> ... playing host to a herd of feral cats.

A group of cats is called a clowder... or a glaring :)

18
cubano 5 hours ago 2 replies      
It was awesome he was eventually paid for it on the blockchain.

Interesting that traders are using it for personal transactions these days.

Introducing 1Password for Teams agilebits.com
108 points by bismark  2 hours ago   29 comments top 14
1
wishiknew 0 minutes ago 0 replies      
Yet another app that would rather become an OS of its own than stick to one thing and 'do it well'. 1Password.app has already been taking ages to load since the shiny/pointless redesign a year ago, and now we're getting even more features.
2
Tomte 10 minutes ago 1 reply      
I had hoped AgileBits would step up their Windows offerings.

I've recently bought the "real Windows application", since the Universal App doesn't let you enter new logins (really?), only view existing ones.

Unfortunately, KeePass was much more useful with its Alt-A shortcut. In 1Password I need to manually copy login data from the application, since I'm using Edge and there's obviously no plugin, yet (Edge's fault).

Oh, and syncing must be a bad joke. Lots and lots of sync options, but the only one working across all platforms (iOS, Windows Phone and Windows are the ones relevant to me) is Dropbox. No OneDrive, no WLAN sync.

And don't get me started on vault management. I was using a non-synced vault without realizing it for weeks, and then I was pulling my hair out trying to sync the correct one. I finally only managed to do that by completely removing the Windows Phone app and starting from scratch.

At least they are moving everything to opvault. It was fun trying to get everything to sync, only to find out that the default vault format "agilekeychain" cannot be synced to Windows phone (or was it Windows desktop? I'm not sure).

3
rdl 1 hour ago 1 reply      
1) Finally 2) Awesome!

Thank you for doing this. Super into analyzing this for security. 1Password is my preferred single-client solution, but not having a good Team solution has been a serious drawback.

4
codycowan 2 hours ago 0 replies      
With the sale of LastPass to LogMeIn, I'm more excited than ever for 1Password to add team features.
5
pantulis 1 hour ago 1 reply      
Apart from the pricing plans (which make it difficult to justify uploading all our company secrets if I don't get a clear return), what about Linux-based desktops? Will this be Mac/Windows-centric only?

Are there single-sign-on options for Google Apps for Work?

6
bdwalter 1 hour ago 3 replies      
Our teams would be all over this if they had real linux support.
7
omarforgotpwd 7 minutes ago 0 replies      
Wow, this is gonna be big! Congrats to the agile bits team!
8
joshfinnie 1 hour ago 0 replies      
A nice addition. We are currently using LastPass Enterprise, and the UI is absolutely terrible, but it being the only game in town kind of forced our hand... now there are options!
9
tw04 20 minutes ago 0 replies      
Interesting - lack of active directory integration, and lack of on-prem solution is disappointing though.

Edit: since someone is apparently upset over my comment - those two features are absolutely mandatory in almost all corporate environments. If you have a comment to the contrary, feel free to share it. Don't just downvote my comment because you don't personally need the features.

10
goeric 1 hour ago 0 replies      
Very excited about this! We definitely have been waiting for it.
11
jedberg 1 hour ago 2 replies      
I used the beta for this, it was pretty slick.
12
mikeevans 1 hour ago 1 reply      
This is pretty awesome. Super excited to try it out.

What's not awesome though is how long they've been working on the refresh of the Android app with fingerprint support. Demoed in May, it's now November and they aren't even ready to launch it on their beta channel.

13
helper 1 hour ago 2 replies      
Cool product.

I'm not super excited about the use of WebCrypto, but it isn't any worse than storing passwords in the clear in a database.

My biggest question is: does it support an audit log of who accessed which credentials and when? If that is supported, I could see some of our teams switching over to this.

14
neilellis 11 minutes ago 0 replies      
At last full NSA support, I've been waiting for this for ages. Really getting tired of having to open my firewall and give them an SSH login.
The Apple iPhone 6s and iPhone 6s Plus Review anandtech.com
79 points by IBM  3 hours ago   10 comments top 4
1
nicholas73 6 minutes ago 1 reply      
The iPhone 6s Plus I have is a wonderful phone, but I can't help but feel that Apple is losing some of its perfection. There are some obvious UI annoyances.

For example, the Plus is a large phone, so you most often open it with the phone tilted somehow, giving you landscape mode. It does not easily revert to portrait without juggling it around, nor is there an option to disable landscape on the home screen only (seriously, who needs a landscape home screen?). Disabling rotation entirely isn't even in the Settings menu - you have to find it in the pull-up menu.

Another gripe I have is that all the pull-up and pull-down menus also trigger when you don't want them. The screen has gotten really sensitive, which is great, but there are lots of unintended effects now.

A fanatical user running the company wouldn't have overlooked this.

2
jbk 1 hour ago 1 reply      
> In light of these factors, I would give the iPhone 6s line the Editors' Choice Gold award. I believe that the criteria for this award is such that a product is not only one of the best in its category and an extremely good product in a vacuum, but pushes the smartphone user experience forward in significant ways.

That's quite an endorsement, and one that is rare from Anandtech, if I've been following correctly.

I'm not always a huge fan of the iPhone, but I have to say that, after testing the 6S, it's really tempting.

3
firloop 11 minutes ago 0 replies      
4
madez 1 hour ago 0 replies      
If only it were hardware that worked for me, and not for Apple, when I bought it.
The Road to 2M Websocket Connections in Phoenix phoenixframework.org
158 points by chrismccord  3 hours ago   19 comments top 9
1
jkarneges 4 minutes ago 0 replies      
I love this kind of stuff. A while back I attempted to do similar benchmarking with Pushpin: http://blog.fanout.io/2013/10/30/pushing-to-100000-api-clien...

Like the Phoenix test, this tested broadcast throughput. However, we couldn't push more than 5000 messages per second out of a single EC2 m1.xlarge instance, otherwise latency would increase. My theory is that we were hitting the maximum throughput of the underlying NIC, causing packet loss and requiring TCP to retransmit (hence the latency). At some point I want to try adding multiple virtual NICs to the same EC2 instance and see if that helps.

2
polskibus 34 minutes ago 1 reply      
Great job Phoenix team! I wonder how much of this wouldn't even be possible if not for the great BEAM platform and Cowboy web server.

Whatsapp famously works with 2m connections on a FreeBSD box (this number is old, I bet they've beaten that number). http://highscalability.com/blog/2014/2/26/the-whatsapp-archi...

I wonder whether these two cases are in any way comparable. Different stacks, different machines, different test. The only similarity being BEAM/Erlang used as a platform. Speaks well of its scalability!

3
bcardarella 2 hours ago 1 reply      
We've been using Phoenix for several applications for over a year. The channel functionality is so easy to use and implement. I highly recommend checking this out: http://www.phoenixframework.org/docs/channels
4
chrismccord 3 hours ago 0 replies      
My ElixirConfEU keynote gave a high-level overview of the channels design for those interested - quick jump to the relevant place in the presentation: https://youtu.be/u21S_vq5CTw?t=22m34s

I'm also happy to answer any other questions. The team is very excited about our channel and pubsub layers' progress over the last year.

5
toast0 2 hours ago 1 reply      
If you listen on multiple ports, you can get way more than 64k connections per client. If the workload is mostly idle, tsung should be able to manage quite a few connections too.
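
A rough counting sketch of why the ~64k figure isn't a hard cap (the listen-port and client-IP counts below are hypothetical, not the benchmark's actual setup): a TCP connection is identified by the full (client IP, client port, server IP, server port) tuple, so the ephemeral-port range is consumed per destination, not per machine.

    // Back-of-envelope: client ports can be reused against every additional
    // server listen port or client IP, since the whole 4-tuple must be unique.
    let ephemeralPorts = 65_535 - 1_024   // roughly the usable client-side port range
    let serverListenPorts = 50            // hypothetical: server listens on 50 ports
    let clientIPs = 10                    // hypothetical: 10 client machines (or IPs)
    let possibleTuples = ephemeralPorts * serverListenPorts * clientIPs
    print(possibleTuples)                 // ~32 million distinct connection tuples

With enough listen ports or client IPs, the practical ceiling becomes file descriptors and memory rather than port numbers.
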
6
ch4s3 1 hour ago 0 replies      
This is really cool, I'm hoping to see more small companies experimenting with Phoenix. It seems like a great tool.
7
Thaxll 55 minutes ago 2 replies      
Is it a relevant benchmark to open 2M connections without doing anything with them?
8
paulus_magnus2 1 hour ago 1 reply      
isn't there a 65k limit on the number of open ports in the TCP stack??
9
mtw 34 minutes ago 1 reply      
I'd be interested in benchmarks vs. other similar frameworks, for the same use case. These have JSON/DB query benchmarks; however, Phoenix lags behind (will it be at the top soon?): https://www.techempower.com/benchmarks/previews/round11/
DevTools Challenger for Firefox devtoolschallenger.com
119 points by bpierre  4 hours ago   15 comments top 6
1
ben174 3 hours ago 3 replies      
Took me a minute to realize this is developed for use with Firefox Developer Edition. If you're using Chrome or vanilla Firefox, you're going to be a bit lost when trying to follow the tutorial.
2
derek 2 hours ago 1 reply      
tl;dr This is a series of interactive demonstrations showing off some of Firefox Developer Edition's developer console features, with emphasis on CSS animation.
3
darkhorn 1 hour ago 1 reply      
Do we have an official 64-bit stable version for Windows? https://download-installer.cdn.mozilla.net/pub/firefox/relea... Because I use 64-bit Windows, and when it updated to 42 it was still the 32-bit version.
4
intrasight 2 hours ago 0 replies      
Problem is that if you scroll too fast, the text visibility animation doesn't kick in fast enough and you just see a blank black screen.
5
ck2 2 hours ago 0 replies      
I'm sure someone had fun making that but it is a really weird way to present something.
6
fiatjaf 1 hour ago 0 replies      
The problem with Firefox Devtools is that they are slow.
Annotated Version of John McCarthy's Paper on Lisp fermatslibrary.com
15 points by fermatslibrary  34 minutes ago   2 comments top 2
1
daveloyall 9 minutes ago 0 replies      
Dear fermatslibrary,

Every time I try to read an annotated paper on your site, I spend a full minute clicking the left margin and the paper, back and forth, trying to get a single view that shows all the annotations.

My monitor is huge and I like to read a whole document, top to bottom, by click-holding my scroll bar once and dragging (up and down) until I am done.

2
karamazov 10 minutes ago 0 replies      
I had a chance to add the first few annotations. Happy to answer questions on the paper as best I can.
Hearts on Twitter blog.twitter.com
123 points by janvdberg  4 hours ago   105 comments top 37
1
subnaught 1 hour ago 1 reply      
Translation: Our users don't understand our service, and we don't either, so we'll just try and mimic a successful one.
2
soyiuz 1 hour ago 1 reply      
Twitter lost the plot. Oh look, riots in Istanbul. Four people injured. I guess I'd better heart that one. Because I love violence... What is the storyline here? Many people use Twitter for live coverage of events, to organize protests, and to keep up with professional news. How does a heart fit into these use cases??
3
xjay 2 hours ago 0 replies      
Twitter has some dual personality when it comes to UX.

Recently, Twitter quietly started relying on the HTTP Referrer field. That's OK, I guess, except they did nothing to inform their users about it, which is easy for them to check and report on specifically. Instead, they let their users get confused by generic messages when attempting to post, or change settings, trying all sorts of things. If you want to require HTTP referrer, you should know what that means, and why many may have it disabled.

It's also interesting to see the social media sites and the web converging on using the same types of icons. All because of the mobile platform and a certain apple OS, with the cog wheel and menu symbol.

Heart versus star: I think the heart miscommunicates the function. I can't identify with it anymore, as I don't tend to "heart things".

Also, they didn't just change the icon, they changed the color as well. The star was a weak yellow. The heart is a darker, burning red, and therefore sticks out more in contrast with the white.

4
janvdberg 4 hours ago 7 replies      
I think this is a rather significant change. Whereas before a favorite was the equivalent of a bookmark, now it is more in line with the Facebook like or the Instagram 'heart'. This is a clear push for more engagement. I personally like it. But I would still like to keep an option for bookmarking/read-later (i.e. how I use the favorite button now).
5
evv 15 minutes ago 2 replies      
While we're on the subject, can somebody explain the behavioral difference between a star (or heart, whatever) and a retweet? They seem to have nearly the exact same effect
6
iconjack 2 hours ago 3 replies      
From the post:

"The heart is more expressive, enabling you to convey a range of emotions and easily connect with people."

It's still just one bit, right? How does it enable conveying a range of emotions?

7
BHSPitMonkey 11 minutes ago 0 replies      
Hopefully I'm not the only person who spent a good 10 seconds staring at this heart/circle/play-icon hybrid thinking it was supposed to be the new icon being unveiled:

http://i.imgur.com/pKCfDmj.png

8
r3bl 4 hours ago 4 replies      
> We want to make Twitter easier and more rewarding to use, and we know that at times the star could be confusing, especially to newcomers.

Wow. Really?

9
justizin 1 hour ago 0 replies      
Two stories about Twitter circulating today:

First, the only black manager in product or engineering left after encountering tone-deaf collaboration in measuring and increasing diversity.

Second, they changed the star to a heart.

10
the_watcher 1 hour ago 0 replies      
Starring a tweet always seemed a bit odd to me, except when using it to bookmark something for later. Hearts imply likes, based on other social platforms, which will almost certainly lead to more use. I found myself rarely, if ever, starring tweets, which actually made the function more useful (looking at my favorites is a short list of tweets I wanted to remember).
11
demian 58 minutes ago 0 replies      
I "love" this tweet about how 10 people died on an earthquake last week.

It has the same problem as "Like".

12
pmlnr 49 minutes ago 0 replies      
500px has favourites (heart) and like (thumbs up).

I believe they are one of the few who understand there's a difference; it's unfortunate that most social media sites don't.

13
codezero 2 hours ago 1 reply      
A heart conveys much more emotion than "like" or "favorite". It's just an emoji, but still, it seems significant enough to me that I'll do less liking on Twitter.

"Like" also carries a connotation over from Facebook: liking a page on Facebook attaches it to my profile, and potentially feeds me content based on that. Will this happen on Twitter? I can't assume it won't.

14
agentgt 4 hours ago 0 replies      
I am sort of tired of the ole "let's make it emotional to make it stick" [1] tactic.

Of course this is just my opinion but I sort of liked how Twitter was more newsy and less friendsy. I would even prefer the star to be a bookmark but I suppose it doesn't scale down well (size wise).

[1]: http://www.engineerguy.com/white-papers/made-to-stick.htm (see chapter 5)

15
dombili 3 hours ago 0 replies      
I'm glad they're focusing on fixing major problems such as newcomers finding the functionality of the favorite button confusing, instead of trying to find new ways to deal with abuse on Twitter.
16
stevewillows 4 hours ago 2 replies      
I've always viewed and used hearts / stars / likes as a way of saying 'I have nothing else to add to this conversation.'
17
homulilly 17 minutes ago 0 replies      
Translation: Our business is failing and we're completely clueless about what to do but we want to look like we're trying so uh here's a new icon.
18
chuckgreenman 3 hours ago 0 replies      
TLDR + Sarcasm: "Today we made Twitter a little bit more like Facebook, because let's face it, some people are confused by stars and the word favorite."
19
oldmanjay 1 hour ago 1 reply      
That's a heck of a lot of words for a tiny semantically void change. A more cynical me would assume a whole bunch of product people at twitter are well familiar with justifying their job in the face of obvious uselessness.
21
staunch 1 hour ago 1 reply      
Another useless change. If you want to try out something that makes Twitter far more usable, try removing retweets.

1. Install uBlock Origin.

2. My Filters add: http://twitter.com ## .tweet[data-retweeter]

3. Enjoy a Twitter without retweets.

22
dpeck 4 hours ago 0 replies      
I don't believe that the wording could be more condescending if they tried to make it that way.
23
armandososa 3 hours ago 1 reply      
I'm bummed. I used stars as a "save-for-later" feature. I like the idea of having the option to explicitly endorse something you like, but I will miss the bookmark functionality. Even Facebook has a "save" feature (a little hidden in a menu, but that's ok for me) and they seem to use it effectively for engagement purposes.

I think Twitter is missing an opportunity here.

24
tdkl 3 hours ago 1 reply      
I don't use Twitter, but this sounds more confusing now. Stars are used throughout other services or systems for favoriting something, such as bookmarks, and if I "star" it, I can revisit it in the future. But that means only I can see it.

Hearts on Instagram are broadcasted to my followers, Facebook likes as well. Is this also the case on Twitter now?

There's a difference between saving something for personal recollection and also broadcasting it.

25
egusa 4 hours ago 0 replies      
I think it's a good idea, but the "like" to me feels too much like Facebook.
26
mschuster91 3 hours ago 0 replies      
Not long before Twitter will blend in with Failbook and Instagram. A pity, but apparently necessary, as Twitter needs fresh cash.

The problem: the money is in the high-active users, not in the noobs using Twitter for an hour and then moving along. And a significant chunk of said high-active users will depart from Twitter because, like FB, the users haven't been consulted in any way...

edit: If Twitter REALLY wanted to improve its experience, it should...

1) fix tab-navigation to work like text->submitbutton, not text->media->navigation->poll->submitbutton

2) only auto-complete with tab when the current word begins with an @. No, if I just type "example"<tab> I DO NOT want a random user with "example" in the name expanded.

3) in the Android app, when I'm scrolling 2h behind and rt'ing a kitten image, I DO NOT want to jump right to the top and have to scroll aaaallll the way down again.

27
mathgeek 3 hours ago 0 replies      
The trend continues towards "everyone looks like Facebook:" http://readwrite.com/2014/06/04/social-networks-all-look-ali...
28
jscheel 4 hours ago 3 replies      
In the past, I used stars as bookmarking. However, I noticed a shift towards more "like" behavior about 1-2 years ago amongst my twitter network.
29
fideloper 3 hours ago 0 replies      
My assumption about any change Twitter makes is that it's powered by the hope of future ad revenue.

I see that potential for ads in Moments.

I would guess Hearts will get tweaked in the future to behave in a way that will make them something of value to brands (at least, Twitter hopes).

Similar to brands spending money for Facebook Likes (and thus getting their logos in people's feeds), I see Hearts as a first move towards a similar feature/concept.

Personally I like this change (as it is now) because I've only ever used stars as a "like" button, knowing people on the other end see that in their timeline. It's a very useful social tool for acknowledgement.

30
Grue3 3 hours ago 1 reply      
So now it's like Tumblr. Now they only need to remove the 140-character limit and there will be no differences at all.

"The heart, in contrast, is a universal symbol that resonates across languages, cultures, and time zones."

I don't really buy it. Many countries have stars in their flags/coats of arms. There's no such unified symbolism for hearts. For example, the Chinese pictogram for heart is 心.

31
tildlyo 3 hours ago 1 reply      
Favourite was the generic thing for acknowledging a post, whether you like it, relate to it, or whatever... so now if someone has bad news (e.g. someone dies), what do you do? 'Like' it?
32
hk__2 4 hours ago 0 replies      
I assume that's in line with them adding the birthday field on user profiles. People used to Facebook will like more tweets than with the previous star/favorite system and thus provide more accurate data for ads.
33
midgetjones 4 hours ago 0 replies      
If you click the link to the twitter username at the top of that article, you get to a page which still shows the favourites star rather than the full twitter feed with hearts

https://twitter.com/intent/user?screen_name=akik

34
atomi 4 hours ago 1 reply      
Yeah, that's great, just as long as the API calls don't change.
35
moron4hire 2 hours ago 0 replies      
If nobody had said anything, I would have never noticed the change.

People are complaining and/or celebrating this change over some notion that it changes what the action "says". Those who agree that it changes what you are saying seem to universally agree it's changing from "I bookmarked this" to "I endorse this".

Here's the thing: if they weren't meant as endorsements before, then why was the list of them public, and why were they called "favorites"? Since when does publicly marking something as a "favorite" translate to a neutral "I am expressing no opinion on this matter, merely marking it for later retrieval"?

Because I can guarantee you: whenever you star/favorite/like/whatever-the-hell-we're-calling-it-today a tweet/post/blog/picture/whatever-site-we're-on-today in a public way with notification back to the original poster, it always gets interpreted as interest/enjoyment/endorsement.

It's literally nothing more than changing the icon. Same button, different color. It does the same thing: make a public notice that you touched that button on that tweet. If you weren't using it for what it was meant for before, you can continue to use it against what it's meant for today.

You all are literally bike-shedding right now.

36
draw_down 4 hours ago 0 replies      
Wow, I am already sick of hearing about this. Star -> heart. Got it.
37
chinathrow 3 hours ago 0 replies      
I've seen them testing this and the icon style changes for months now, having had three different test screens spread over three different accounts. On mobile.
What we learned from rewriting our robotic control software in Swift sunsetlakesoftware.com
43 points by ingve  3 hours ago   22 comments top 8
1
svec 2 hours ago 1 reply      
This quote is the big takeaway for me:

"We use Macs to drive these robots, with control software that was written in Objective-C. We made the decision to rewrite this software in Swift late last year, but only after a careful survey of the bugs that had impacted our customers over the years. That survey revealed that most of these shipped bugs would have been prevented or detected early in Swift-based code."

Apparently they were able to categorize their shipped bugs and prevent those types of bugs from happening again with Swift. That's a very professional process.

2
lordnacho 2 hours ago 1 reply      
I've moved to Swift as well. It's much better, for much the same reasons as stated in the article.

Particularly, stronger typing is useful. Code that uses the Any type (or Object if you're in C#) tends to look horrible and have runtime errors, or else what class something really is has to occur to you as you're coding, which sadly doesn't happen 100% of the time for me.
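
A minimal sketch of the difference (hypothetical types, not from any real codebase): with Any the real type only surfaces at runtime via downcasts, while a concrete type lets the compiler reject the mistake before the code ever runs.

    struct Robot { let armCount: Int }

    // With Any, you're forced into runtime downcasts that can silently fail.
    let mixed: [Any] = [Robot(armCount: 2), "not a robot"]
    for item in mixed {
        if let robot = item as? Robot {   // mismatched elements are just skipped
            print(robot.armCount)
        }
    }

    // With a concrete element type, misuse is caught at compile time.
    let robots: [Robot] = [Robot(armCount: 2)]
    print(robots[0].armCount)
    // print(robots[0].uppercased())      // would not compile: caught before runtime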

I also find it's easier to not do [obj msg] everywhere. It just seems unnatural to have to go to the beginning of a token and put a [ in front. Much easier with a more ordinary syntax that's just obj.msg. ObjC had some of this, but mixing the two made it even weirder.

I did come across a compiler bug though, and that tends to take a long time to be sure of (don't want to write a bug report and find out it was your own code all along). I guess it's just a question of time before that sort of thing is stable.

Another pet peeve of mine is separating classes into .h and .m. I have to do this in c++ code as well, and I don't like it. Much better just to let the language take care of it for me. The interface is pretty clear anyway, so why separate it and have another file to keep consistent? I guess it's mainly a legacy issue.

One thing that needs to be done is XCode needs to be able to refactor Swift. Hopefully it's around the corner.

3
SeanDav 2 hours ago 1 reply      
This is not a fair apples to apples comparison. I am certain that had they rewritten the application in Objective-C from the ground up, taking into account all they learned from the mistakes of the past, they would have ended up with similar or even better gains (by developing in a familiar environment).

It is natural, almost unavoidable in fact, that an application developed and continuously enhanced over many years will end up in a bit of a mess, unless the approach is very disciplined - which it was not in this case (inadequate unit tests for one example).

This was not a weakness or failure of Objective-C, but of circumstances and approach.

4
tspike 52 minutes ago 1 reply      
I'll let you in on a secret: I don't trust any of the code I write. I've been programming for decades, and I still keep making the same stupid mistakes.

I'd love to work with this guy. Overconfidence is not a virtue in developers.

5
saosebastiao 2 hours ago 2 replies      
I think swift is great as long as you are squarely in the Apple ecosystem, which was true for this use case...but the fact that they were using macs to drive robots was curious to me. Is there some advantage in robotics to using Apple/OSX? Why not linux, BSD, or some other embeddable operating system and off-the-shelf hardware?
6
agentgt 50 minutes ago 0 replies      
I'm sort of interested in what their migration process was like.

Particularly because I have seen so many HN posts with "written in X, converted to Y" success stories. Most companies just can't drop everything and do a code conversion. They need to do it incrementally. It seems with Swift this should be straightforward, but maybe it's not (I have to imagine it's easier than going from X to Golang, which seems to be in vogue)?

IMO I would have found it far more interesting to hear about the migration than about why Swift is superior.

7
zzalpha 2 hours ago 4 replies      
Do we really care that much about LoC? I understand that more code means more opportunity for bugs, but I'd be far more concerned about performance, battery usage, code quality metrics, etc, not to mention maintainability and codebase stability, access to talent who understand the language/frameworks/idioms, etc.

Don't get me wrong, I'm sure Swift is a big improvement in a lot of ways, but LoC is a weird measure to focus on, IMO... though, I can't say I blame them. LoC is easy to compute. Turns out measuring actually useful things can be rather hard...

8
melling 3 hours ago 0 replies      
This article points to the Lyft article where they also claim that rewriting in Swift reduced their code base:

"Over the years, the original version of Lyft had ballooned to 75,000 lines of code. By the time the company was done recreating it in Swift, they had something that performed the same tasks in less than a third of that."

If you're interested in learning Swift, I have a small project where I collect all good Swift urls that I find:

http://www.h4labs.com/dev/ios/swift.html

Also, note that it's easy to start using Swift in existing Objective-C code without needing to rewrite. It takes a few minutes to set up a project to use a bridging file. If you've got a lot of Objective-C, you can extend those classes with Swift by using extensions, then migrate a few methods at a time.

    extension MyObjectiveC_Class {
        func aSwiftFunc() {
            // Can be called from ObjC
            // ....
        }
    }

Why I pulled my son out of a school for 'gifted' kids mashable.com
19 points by gexos  1 hour ago   9 comments top 5
1
Someone1234 0 minutes ago 0 replies      
You tell a kid they're smart: You've immediately set them up for long term failure.

"Smart" is a quantifiable hard limit. The problem with smart people is that as soon as they run up against a challenge/concept/issue they cannot immediately overcome, they get frustrated, because they "should be smarter than this!" Which often results in avoidable frustration/anxiety/depression.

You see this a lot. "Gifted" "smart" kids who coast through school for years, until one day they finally run up against something they cannot do and run away from it screaming. Simply because they're not "smart enough."

Where is the school program that lets the bullshit pseudo-concept of "smarts" fall by the wayside and instead replaces it with an atmosphere where failure IS acceptable, and where you just have to work through the hard parts?

I have a kid. My wife's side is second generation "gifted." These people are absolutely obsessed with how smart they are/sound/come across, and get incredibly upset/frustrated/annoyed when they feel "dumb" (i.e. things are hard, they don't get it right away, or they make a mistake).

Unfortunately when I raise the issue of "hey, focusing on an intangible level of intelligence could be damaging [here look at this child research]" I just get eye rolls, because they're so deeply into the concept of how innately intelligent they are, they cannot see a less damaging way of living one's life.

2
savanaly 10 minutes ago 1 reply      
It seems like the optimal learning environment is one where you are

a) being taught at the speed and level that is at least in the ballpark of your potential. The reason this is needed is obvious: if you are being taught way beyond your capability you won't learn anything, and it's a disservice to be taught way below your capability as well.

b) in the top 50% of your local group of students, for self esteem reasons and also because you'll start to feel like studying and working hard pays off if you seem to be successful in school relative to your peers.

There's a real tradeoff between these two sometimes. Given two identical kids, if you send one to a school that is not up to his "gifted" potential but where he is the smartest kid in his class, he may well do better in the end than the one you send to a gifted school where he is obviously at the bottom of his class. The first will certainly be happier and perhaps learn the life lessons and habits that will make him super successful in high school, college and beyond.

We can at least start to address a) with these gifted programs and special education for the exceptionally challenged and so on. But addressing b) is also important, and the laws of statistics sort of prevent us from dividing kids into groups where everyone is above average for their group. In theoretical terms I can envision some sort of frequent rotation system where you're constantly shuffled from class to class, so that one week you're at the top of your class and another you're at the bottom. Perhaps that would even help illustrate to kids the extremely hard-to-learn lesson that judging yourself against others is fruitless and destructive.

3
baldfat 8 minutes ago 0 replies      
I was a "gifted" kid AKA they took my IQ score and put me in the program. (Horrible way to figure that out). I got to go to special things on Saturdays and take harder math classes in school. I totally sucked at spelling and still do and this is a known issue with people with a "mathematical brain" AKA can't spell to save my life but read at a high school level in 3rd grade and was doing calculus in 9th grade.

My son (5 years ago) was tested to be "gifted" and after looking at the program we said no.

1) They would move him to a totally different school in our district.

2) They used a different curriculum and class strategies.

3) Being a "4.0 student" in undergrad, I knew the pressure to stand out beyond the perfect scores. For kids it would be worse.

4) We live in a dirt-poor urban school district, and I figured the "gifted program" was going to get cut. It did the next year, and he would have been right where he was anyway.

5) He had cancer and I kind of was mad that they would tell him these were his options without talking to us first, because there were medical needs that a local school afforded us (AKA we could run to school and give him his pain meds if necessary)

Gifted doesn't mean separate, because the one thing that people with different brain strategies need is the ability to figure out how to work with others. If there are no others, they are bound to have trouble later on.

4
cuckcuckspruce 14 minutes ago 3 replies      
> I will never forget the day I mentioned to another mother my concern about lack of empathy and kindness among the students, and she told me: "That's not the school's job."

Is there nothing left that parents are meant to teach? Or is it all now the school's job? If so, why don't we go full Spartan, take the kids away from the parents right after they're born, and have schools teach everything?

5
hugh4 10 minutes ago 0 replies      
Couldn't cut it, dropped out, parents wrote sour-grapes article, the system works.
Leo P. Kadanoff, Physicist of Phase Transitions, Dies at 78 nytimes.com
16 points by espeed  1 hour ago   discuss
Show HN: Invoice Boilerplate Simple automated LaTeX invoicing system github.com
27 points by mrzool  2 hours ago   3 comments top 3
1
NicoJuicy 19 minutes ago 0 replies      
It seems similar to https://github.com/scalableminds/invoice-compiler (which I forked for an ERP app: https://github.com/nicojuicy/report-compiler), but the documentation isn't up to date on my fork :(
2
RodgerTheGreat 57 minutes ago 0 replies      
Seems like a great deal of work and indirection compared to simply editing a LaTeX template like:

http://www.latextemplates.com/template/invoice

I'm not convinced that YAML is substantially easier to work with than LaTeX source if you're just filling in a bit of text and some numbers.

3
mhw 39 minutes ago 0 replies      
In a similar vein, I generate my invoices using this Jekyll plugin that I wrote: https://gitHub.com/mhw/jekyll-invoice
Nearly 9,000 Artifacts Uncovered in CA Desert, Spanning 11,500 Years of History westerndigs.org
59 points by diodorus  5 hours ago   7 comments top
1
JoeAltmaier 4 hours ago 3 replies      
Still there because mostly undisturbed I imagine. Any sites near human habitation get messed with pretty badly.

My wife grew up on a mesa in New Mexico - White Rock, near Los Alamos. She reports exploring as a child with neighbor kids. They found stone houses with intact pottery inside, wooden frames and tools. The kids messed with them, then the boys knocked everything down and threw the pots over the side of the mesa. Nothing left but a few broken fragments today.

We'll be running out of places for archaeologists to explore one day.

In Religious Arbitration, Scripture Is the Rule of Law nytimes.com
33 points by kyleblarson  2 hours ago   30 comments top 9
1
vinceguidry 45 minutes ago 1 reply      
One of the things I gathered when reading the excellent Fields of Blood by Karen Armstrong is that this 'rule of law' property of religion was a fundamental facet of early-modern history. The idea that all persons could be held answerable to one set of rules and principles is so alluring that you have people even today trying to hold secular American leadership accountable to them.

Religion, initially, came in two flavors. One for the aristocracy, which was oriented around finding meaning in the many ethically and morally awful things they had to do to maintain their territories, human capital, and privileged lifestyles.

The other was for commoners, who needed ways to cooperate and trust each other. If two people didn't know each other, then they could still trust each other if they spoke the same language, worshipped the same gods, paid tithes to the same temples. Religion was how they ordered their thoughts about what was right and wrong. It didn't matter for the people at the time what was scientifically or rigorously right, just that it was the same throughout a people.

Until the age of Christianity, rulers had no interest in how the commoners believed. But as the world became more and more connected and the scale of warfare grew to where conscription became more and more necessary, rulers increasingly started to try to control how their populace worshipped.

Fast-forward to the age of Gutenberg. Bibles getting printed in the vernacular, so that ordinary people could read them, was a game changer. All of a sudden, religious hypocrisy became a thing, people with power became accountable for how they used that power, rather than just making up interpretations amongst themselves as they went along.

2
Animats 12 minutes ago 0 replies      
The Christian version of this is new. Previously, using US law to make decisions of religious courts binding has mostly been a Jewish and, to a small extent, Islamic thing. There are Jewish courts (Beth Din) in many US states.[1] They do both commercial and family law arbitrations. The commercial side is generally well thought of, but the family law side is iffy, because Jewish family law explicitly favors the husband and keeping children in the Jewish community.

There have been attempts to set up Sharia courts in the US, but there's much opposition.[2] Sharia family law also explicitly favors the husband.

[1] http://www.nylslawreview.com/wp-content/uploads/sites/16/201...

[2] http://www.huffingtonpost.com/2013/07/29/sharia-law-usa-stat...

3
Brendinooo 14 minutes ago 0 replies      
I recently signed up for a new credit card and was really disappointed to find out that if I wanted to use the card, I'd have to submit to arbitration and forgo my rights to a proper American trial.

I'm not against arbitration; speaking to this article specifically, the apostle Paul directs Christians to try and resolve their differences without going to non-Christian courts. There's nothing inherently wrong with the concept.

But in either case, it'd be nice to have some sort of a provision that allows for escalation to a court if you can't get things resolved in arbitration. That's my concern with the credit card I just signed up for; I feel like the company is setting the rules in a way that stacks the deck against me, and if I get slighted I'd have no recourse.

I would find it difficult to imagine Paul, the Roman citizen who appealed to Caesar himself, wanting to shut people out of certain aspects of the legal system entirely.

4
sageabilly 58 minutes ago 1 reply      
"My faith is still strong," she said. "But I am more careful in dealing with Christians than I used to be. They are just people with no more ability to be good than anyone else."

If you don't grow up in an area that has a very heavy Christian presence it can be hard to understand how very conservative/evangelical Christians think. Many of them believe, honestly and truly, that they are incapable of being wrong because they have been sanctified and saved and therefore even if they perform a transgression no one on Earth has the authority to call them out on it. But see, since they're saved, Jesus forgives them, so no one can call them out on their sin because no one is without sin, so best to just let it go.

It's an extremely insidious mindset because there's no arguing with it. There's no way to get someone who believes that they are morally right to budge even an inch because they 100% do not believe in capitulating to anyone other than Jesus Christ- and since the only source of what He thinks is the Bible and it can be interpreted in about a thousand different ways... you see where this is going. Their personal definitions of what's "good" and "bad" are very, very different from people who did not grow up in the religious culture and there's no way to bring them around to another way of thinking. It happens within the culture too- my father in law and his lifelong best friend had a falling out about 10 years ago over a very small, nitpicky religious matter (they were both elders in their church and disagreed over the way some church finances were distributed) and my father in law straight up stopped talking to his best friend because his friend would not capitulate and come around to my father in law's way of thinking. My Father in Law believed, and still believes, that he was 100% in the right and that his best friend was deceived by the devil and that's why their friendship had to end. And this is over something like "Should we put the church's nest egg all in a savings account or should we take half of it and put it in a Money Market account instead?"

I point this out to add some context to this type of arbitration since, if you didn't grow up in an area where this was just a normal fact of society, you're probably scratching your head wondering how in the world this is even a thing.

Even growing up Southern Baptist below the Bible Belt I had no idea that Religious Arbitration was a thing until I read this article. I had heard of Orthodox Jews having their own set of laws and rules and that being a thing in some very small neighborhoods of NYC, but I didn't realize there was a Christian equivalent being played out across the US.

5
biomcgary 48 minutes ago 0 replies      
This article is purely anecdotal. In theory, promoting reconciliation when possible seems like a positive, but also likely to lead to sugar-coating reality. I'd be interested in a way to quantitatively evaluate outcomes from arbitration of various types, e.g., religious, commercial, voluntary associations, and lawsuits in courts, both state and federal.

I'm surprised by the power of arbitration clauses in contracts. My understanding is that you cannot sell yourself into slavery in the US because the contract will not be considered valid. Perhaps if you put an arbitration clause into the contract, you could?

6
dang 1 hour ago 0 replies      
All: when commenting on this piece, please stay substantive and engage with the specifics of the article. Avoid generic comments about religion, which turn into flamewars, which are off-topic here.
7
oldmanjay 51 minutes ago 0 replies      
> When you think Christian, you automatically think good[...]

That is everything you'll ever need to know about why religion will never leave humanity. Substitute literally any belief system for Christianity, and then realize that every single human who holds that belief system also holds that refactored sentence to be true. Possibly in a more grammatically correct manner.

8
sithadmin 1 hour ago 4 replies      
If arbitration proceedings grounded in superstition and other tomfoolery (i.e., Scientology) hold up, it seems like anything will.

I think there needs to be an 'arbitration by combat' service.

9
WalterSear 1 hour ago 0 replies      
So, basically, Christian Sharia.
Advanced economies are so sick we need a new way to think about them washingtonpost.com
111 points by Amorymeltzer  6 hours ago   104 comments top 15
1
narrator 1 minute ago 0 replies      
There's something wrong with the cost structure in "Advanced Economies". It's just too cheap to make things in China. There are some products where the cost of the raw materials would be more expensive in the U.S. than having the finished product shipped from China. There are sky-high costs, and nobody can figure out where they are coming from, and everyone just shrugs their shoulders. This kind of thing happens in healthcare and education too.
2
clavalle 4 hours ago 3 replies      
Perhaps if we expand the meaning of Keynesian "Government investment in infrastructure" to include human infrastructure and provide stimulus, or at least a stable platform, for the actual nodes of demand (the people) we can get back to optimal growth faster.

The idea that we can have a quarter of the people or more (the poor and working poor) in a state of playing zero sum or negative sum games -- exploitative economic relationships -- non-convex factors in what should, ideally, be a convex efficiency curve, and still reach market efficiency is silly. We just reach the state where the convex portions of our economy cannot compensate for those massive destructions or stagnations of value during times of contraction so it is easier to shake those results out of the aggregate. But that destruction of value happens even in times of relative prosperity and overall growth and it would be wise to whittle down the problem while it is relatively painless.

Take pain and death out of the equation for the people through direct stimulus. Create a market where everyone must be enticed with value-add to trade. Watch demand bubble up and create the strongest growth possible.

3
hackatroll 3 minutes ago 0 replies      
The article says the U.S. is about 10% below a trend estimated through 2007. 2007 was the peak of the bubble. Why are they using 2007 as a target? The credit market was different back then. How much of that 10% output was fueled by easy credit? It might be better to use a few years before 2007 as a target. I would like to know where we are if you look at it before the Mania Phase of the bubble.
4
nostrademons 47 minutes ago 2 replies      
I actually think there are a fair number of prominent figures that have a good grasp of what's actually going on with the economy, but they would rather make money off of it than fix it. You have to divine their theories from their actions and a few oblique public posts.

The academic economics community just seems terribly out-of-touch these days, using models made a generation ago to try and explain an economy that has changed dramatically since then.

5
bigethan 5 hours ago 2 replies      
Related: http://www.nytimes.com/2015/11/01/upshot/is-the-economy-real... (plus a great gif in the header)

When economists are unsure what exactly is going on, what does that mean? There's the quote by Buffett, "Be fearful when others are greedy and greedy when others are fearful".

What should one do when others have no idea what's going on?

6
nemo 5 hours ago 3 replies      
Larry may be right on some points, but given his past support for and architecting of deregulation of the U.S. financial system (including the repeal of the Glass-Steagall Act) while he was in the Clinton admin., along with support for the disastrous privatization of the economies of the Post-Soviet states that's led to massive human suffering and political failure, as well as gambling with Harvard's endowment and losing the school $1.8 billion, triggering layoffs and budget slashing, I think we'd do well to find advice from those with a less predictable and disastrous track record.
7
AngrySkillzz 5 hours ago 2 replies      
Wouldn't a simpler hypothesis be that output trends before a recession are artificially high? We know fairly well that economic trends tend to feed back on themselves; the idea that growth leads to an acceleration of growth which overreaches and ends in a crash isn't particularly far-fetched.
8
AndrewKemendo 4 hours ago 0 replies      
These are quite bold, and arguably heretical statements to modern neoclassical economics:

"New Keynesian models imply that stabilization policies cannot affect the average level of output over time and that the only effect policy can have is on the amplitude of economic fluctuations, not on the level of output"

But Summers tries to refute that, just by saying it's absurd and giving the costanzian platitude "well I gotta do something!"

I think he makes his best point by saying:

"Beginning the study of stabilization with this assumption takes away much of the motivation for doing macroeconomics."

Indeed it does Larry.

9
jstalin 4 hours ago 0 replies      
Surprise surprise, the world's central banks are engaged in the largest bout of central planning the world has ever seen and... guess what, it doesn't work!
10
asquabventured 1 hour ago 1 reply      
Keynesian economics creates market failures. Government "stimulus" does not create natural demand and any "stimulus" has front-loaded costs that are paid for by future generations.

Central "Planners" should be fired and our economic systems need to move to a more decentralized economic model, à la Austrian Economic Theory[1], if we ever want to see natural growth again.

[1]https://en.wikipedia.org/wiki/Austrian_School

11
6stringmerc 4 hours ago 0 replies      
Bill Gross is a rather polarizing figure, but compared to Summers, I'd say he's a lot more "hands on" and experienced in the way that financial systems work - or don't. Here's his latest Janus newsletter / update:

https://www.janus.com/bill-gross-investment-outlook?utm_camp...

12
crdoconnor 4 hours ago 0 replies      
13
ThomPete 4 hours ago 0 replies      
There are two very important things that would need to be factored in before we start to better understand what is going on.

1) We need to understand that we live in a global economy, not a national one. One of the key elements to understanding this global economy would probably be to look at something like Supply Chain Management. I.e. the productivity of China should be closely tied with an understanding of the US economy. They aren't separate, any more than two computers connected via a network are.

2) We need to find a way to factor in technology, digitalization and automation instead of treating them as an externality, as almost all economic thinkers do today.

Especially those who advise countries and central banks. It is one of the key components in understanding what is going on. And before we find a way to factor that in, we will keep fighting in the blind.

Until then we will have this fruitless idealistic fight between Keynes and Austerity and not get to the heart of the discussion.

14
youngtaff 5 hours ago 1 reply      
Perhaps instead of giving cheap money to the banks so they can make a profit at our expense, we could give the money to the people so they can use it the 'best' way for them (save, spend, pay off debt, etc.)
15
caseysoftware 5 hours ago 2 replies      
"Standard new Keynesian macroeconomics essentially abstracts away from most of what is important in macroeconomics."

Yes, that has been the criticism for decades. There is nothing new here.

The only time Keynesian economics has "fixed" an economy was WW2 when we (the US, specifically) had full employment because of the draft and all of our competition had been literally flattened and/or killed.

For another look at it, check out the Keynes vs Hayek rap battle:

https://www.youtube.com/watch?v=GTQnarzmTOc

Foetry.com wikipedia.org
17 points by networked  2 hours ago   1 comment top
1
mturmon 28 minutes ago 0 replies      
There is a similar entity for visual art, the juried exhibition or juried publication. Submissions are invited, which are reviewed by a jury of artists, and the best are invited to be in a gallery show, or a publication.

But you have to pay a flat fee that covers the cost of the process. And this flat fee can be way more than it actually costs to review work -- it's basically profit-taking by whoever is putting on the show.

And further, you have no real reason to believe that your work was actually seen. The jury could have just tossed it out and selected works from their 12 friends. These shows are generally regarded with great suspicion, but still, artists pay to be included. (http://joannemattera.blogspot.com/2012/01/when-do-you-stop-e...)

Signal for Android: RedPhone and TextSecure in one app whispersystems.org
194 points by pjf  9 hours ago   129 comments top 20
1
redwards510 20 minutes ago 1 reply      
Wow, tough room! So much negativity. Whisper Systems, thanks for making encryption simple enough that my Mom can use it, and open enough that I can trust it. That is the success story here.
2
Tepix 7 hours ago 5 replies      
Signal is pretty awesome, it's by far the best that we have right now:

 state of the art crypto
 open source
 free as in beer
 Available for Android and iOS

There are a few minor features that are missing but I can live with that. However, there are also a couple of important shortcomings:

 no decentralization
 use of the phone number
I hope they can be fixed sooner or later.

3
verusfossa 6 hours ago 7 replies      
TextSecure used to be on fDroid, then this happened https://f-droid.org/posts/security-notice-textsecure/. Now it's a GPlay exclusive. I don't have GAPPS so now I can't get it. I'd assume many privacy conscious people don't have GAPPS. I understand the technical hurdles, but it's too big a pill to swallow.
4
deckiedan 8 hours ago 9 replies      
One of the reasons I didn't install WhatsApp is the sheer quantity of permissions it wants (on Android).

Yes, I realise that this is part of Android's broken security model.

But the Signal app also wants access to practically everything. Device & App History, Identity, Calendar, Contact, Location, SMS, Phone, Photos/Media/Files, Camera, Microphone, WiFi Connection info and Device ID and call info.

Call me paranoid, but making an app with all those permissions seems kind of the obvious place for backdoors and similar.

If there was a 'light' version of the app which only required access the internet, then I'd be much more likely to install and use it. (And maybe if I ended up trusting it, then later install add-ons / the full version later).

5
tabrischen 18 minutes ago 0 replies      
Unfortunately when I used Redphone previously the call quality was pretty bad. The volume of the call would get screwed up, and I know it's nothing to do with the signals (lol) because I would hang up and call the person back without Redphone and it would work perfectly.
6
newman314 1 hour ago 0 replies      
I'm really hoping for some intersection of Signal, Ricochet.im and Burner.

* Signal for the security piece.

* Ricochet.im for the Tor (optional?) and anonymous endpoint

* Burner for the idea that we can have lots of throwaway identities (if so desired).

7
corney91 8 hours ago 5 replies      
I really want to start using TextSecure (or Signal now I guess), but the only thing holding me back is it depends on Google Play Services. I love what they're doing and can understand the decision, but still think it sucks a bit that the best option for secure communications is so tied into Google.
8
sbt 8 hours ago 1 reply      
I have been using Signal (and TextSecure before that) for a while. The two features that would make me use it more would be:

1. Web client. (I send most of my texts from my laptop, which means I use Hangouts)

2. Search messages on the client.

Thanks for a great product and keep up the good work.

9
furyg3 8 hours ago 1 reply      
Great! I've been using Signal for iOS and while it's not yet comparable to WhatsApp feature-wise, it's for sure good enough to use.

I find it very difficult to get people to switch from the walled gardens of iOS messaging and WhatsApp (WhatsApp dominates the Dutch market for interesting historical reasons). I've been able to get a handful of privacy conscious friends to switch to Signal/TextSecure, hopefully the cross-platform branding makes this a bit easier.

10
Nexxxeh 8 hours ago 3 replies      
As was mentioned on another HN thread, why the ridiculous name? "Signal" is an already widely used term when talking about mobile communications. With cell phones in particular.

"I'm trying to get signal on my phone" or "I'm trying to get Signal on my phone". Great(!)

11
amluto 2 hours ago 0 replies      
What happened to short authentication strings? The SAS protocol is nicely documented in the Silent Circle Instant Messaging Protocol paper [1], but when I go to "Verify identity" in the app I'm asked to verify an obnoxiously long pair of hexadecimal strings.

The phone call feature supports it (with a curious lack of documentation), but it would be easy to imagine a UI that allowed verification without making a phone call and without allowing users to screw it up: one phone shows the SAS string, the other phone asks you to type it in, and neither phone allows IMs to be sent while doing this.

12
Omnipresent 2 hours ago 2 replies      
Is Signal for Android open source like other things coming out of open whisper? I didn't see it on their github page [0]

[0] https://github.com/WhisperSystems

13
mrmondo 7 hours ago 1 reply      
I'm glad it's not called RedPhone for Android users, they may get it confused with YouRed I mean YouTubeRed I mean RedTube or whatever it's called now. In all seriousness though - what was (is) with the trend of apps with Red in the name? Red to me gives the idea of danger or something that's stopped / blocked.
14
oha 1 hour ago 0 replies      
If they added support for sending PDF and DOC files (and any other file type) and released a desktop client, it would be an unbeatable messenger.
15
mtgx 7 hours ago 0 replies      
Any ETA for the desktop version?
16
artichokeheart 8 hours ago 2 replies      
As long as it still requires Google Play I'll still consider it snake oil.
17
creshal 8 hours ago 2 replies      
https://www.reddit.com/r/netsec/comments/3rc9br/psa_signal_f...

There seem to be concerns about the security of it.

18
pasbesoin 2 hours ago 0 replies      
I just updated, after which I checked the settings. I noticed that auto-downloading in MMS text messages was enabled (moxie stated for a prior version that it was not, at that time), whereupon I changed the settings to disable. I may well be misinterpreting what I was seeing, but better safe than sorry.

(I'm stuck on Android 5.1 (not 5.1.1) on a Verizon phone. I was thinking of the ongoing Stagefright problems.)

19
JulianMorrison 8 hours ago 2 replies      
Why not on my Nexus 7?
20
venomsnake 8 hours ago 3 replies      
Do they have ZRTP in the baseband of normal GSM call?

This is what I have been waiting for.

How the F.B.I. Can Detain, Render and Threaten Without Risk nytimes.com
191 points by rbcgerard  4 hours ago   82 comments top 7
1
dccoolgai 2 hours ago 6 replies      
Advice from my father-in-law, who is a prominent attorney: "Never, ever talk to the FBI without a lawyer - even if you want to help them as a witness... because if they don't like the truth you're telling them, they can (and often do) say you lied to them which is a federal offense. If you have your attorney, they at least know there is a credible witness present who is keeping track of who said what."
2
rayiner 2 hours ago 2 replies      
Skip the article and just read the opinion: https://www.cadc.uscourts.gov/internet/opinions.nsf/E8CAF3B0....

The case here decides a specific issue: whether a Bivens action for money damages is available when the government violates a Constitutional right of a U.S. citizen abroad in a terrorism investigation. Bivens is basically a last-ditch option where a tort cause of action is created when there is no other remedy for a Constitutional violation: https://en.wikipedia.org/wiki/Bivens_v._Six_Unknown_Named_Ag.... But because it's a totally judicially-created construct, the Supreme Court has warned caution in expanding it to new contexts.

The opinion does not say that it is legal for the Government to do what Mr. Meshal alleges it did. The question is whether Mr. Meshal can get money from the Government for violating his rights.

3
mercurialshark 33 minutes ago 0 replies      
The author and title confuse the issue at hand. The subject, a US citizen, travelled abroad of his own free will. The Due Process Clause of the Constitution only applies domestically. Had the FBI detained him on US soil and rendered him to foreign entities, that would trigger due process scrutiny.

Being that he was already in Somalia, the FBI can coordinate with host countries (and intelligence agencies) to interrogate individuals. Not only is the FBI not obligated to return him to the US, it's by no means obvious that they can forcibly extradite, even if they had wanted him returned to the US. Sure - they could request (as the State Department and FBI do all the time) - but that's a different issue.

Disclaimer: I'm not saying it's good, just that the issues (and therefore the author's conclusions) are presented inaccurately.

4
yourepowerless 1 hour ago 2 replies      
Simply more proof that the laws have become illegitimate tools of power for the elite and corrupt. America is no longer a democracy in any meaningful sense of the word, but a tiered society of those inside the halls of power, given extraordinary and unaccountable powers to detain, torture and murder others, and the meek who suffer such atrocities.

It's laughable and sad anyone appeals to the court system for a remedy when they simply are a rubber stamp, but what do people expect when secret courts and NSLs abound?

Voting won't help, legal redress won't help, what then can you do?

5
ck2 2 hours ago 0 replies      
Same anti-constitutional way the TSA does every day?

BTW the FBI has never, ever, found itself in the wrong after shooting someone; apparently its agents are perfect, infallible human beings.

Not for decades. Because trust them, only they investigate themselves.

6
mtgx 2 hours ago 1 reply      
Although I think this is still mostly a decades-old systemic issue, I think Obama has been a huge factor in making this problem worse.

Every time there was an abuse, he has defended the crimes and the criminals, whether it was the FBI, CIA, or NSA.

Bush administrations' crimes? - "We need to move forward"

CIA torture - "We tortured some folks - but you don't actually expect me to punish anyone for doing it, do you?!"

NSA - "There have been no abuses". Enough said.

Even with the OPM hack, the largest government data breach in US' history, Obama has tried to push the story under the rug, so it doesn't make his administration look bad - or something.

Obama got it wrong. You don't "move forward" by ignoring the problem. You move forward by admitting the problem exists and then punishing those responsible. The criminals in the government can't continue to have jobs or even keep their liberty as if nothing happened, while through their power, they have destroyed many lives.

7
l3m0ndr0p 3 hours ago 3 replies      
The FBI is our version of the Gestapo of Nazi Germany.
What would the taxes be if I exercise my startup options? medium.com
32 points by itsjaredc  3 hours ago   13 comments top 3
1
delinka 25 minutes ago 0 replies      
I recently talked to a CFO who put it like this:

For the sake of planning: The difference between what you paid (you have to exercise options for this to matter) for your shares and what they are worth (see the company's most recent valuation) is counted as income for the current tax year.

If you hold the shares for more than a year, the money you make on a sale is subject to capital gains tax. This is indeed affected by WHEN you buy shares, so make sure to take monthly purchases (if you vest monthly) into account.

When you file your taxes: Hire someone; don't try to do this yourself.

2
lgcoleman 1 hour ago 1 reply      
It completely depends what type of options you have.

If they are incentive stock options then you will need to report the value of the bargain element (fair market value of the options less the amount you actually paid) as AMT (alternative minimum tax) income. The AMT tax rate is ~27%.

Say your stock is worth $110 and you pay $10. Your bargain element is $100 and your AMT is ~$27, federal. Your state could also have tax reporting requirements on the exercise of your options.

This assumes you exercise and hold. If you sell the shares after exercise a whole host of other issues come into play.
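For readers who want to sanity-check the arithmetic above, here is a minimal sketch in Python. It assumes the flat ~27% AMT rate the parent comment uses; real AMT interacts with exemptions, phase-outs and your ordinary tax bill, so treat this as an illustration, not tax advice.

  def iso_exercise_amt(fair_market_value, strike_price, shares, amt_rate=0.27):
      # Bargain element: spread between FMV and strike, times shares exercised.
      bargain_element = (fair_market_value - strike_price) * shares
      # Rough AMT exposure at the flat rate assumed above.
      return bargain_element, bargain_element * amt_rate

  # The example from the comment: stock worth $110, exercise price $10, one share.
  bargain, amt = iso_exercise_amt(110, 10, 1)
  print(bargain, amt)  # 100 27.0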

Fairmark.com is a good resource for the tax implications of equity compensation. http://fairmark.com/execcomp/index.htm

Good Luck!

3
logfromblammo 1 hour ago 2 replies      
I can hear all the non-US persons reading this article and laughing about all the insane idiocy that we put up with from the US tax code and IRS.

All US persons have to put up with at least this amount of calculation every year (otherwise pay more than is strictly necessary, or possibly suffer penalties and fines later), and the details constantly change with the political tides.

This is why tax reform crops up as an issue every 4 years, though it somehow never manages to survive past primary season to the general election.

Cheap, clean stoves were supposed to save millions of lives what happened? washingtonpost.com
29 points by r721  3 hours ago   28 comments top 7
1
bcg1 1 hour ago 1 reply      
Ethanol stoves are actually great for this problem... very low emissions, and fuel that is easy and safe to store and can be produced locally. In many parts of the world, there are abundant non-food sources of mash... kelp, mesquite seeds, cattails, etc... and community scale production is much more efficient than the bizarre government subsidized ethanol industry in the US that uses GE corn that requires a lot of energy input and has a high energy cost for transportation to and from the centralized production facilities.

I'm not sure how it is in "undeveloped" countries, but around here it is quite illegal to make your own fuel in this way... it is, after all, the "A" in BATF. It seems that without addressing such issues, the "clean" stoves will not be popular because locals would need to keep paying some "authority" (government or otherwise) for access to the fuel, and money is much more scarce than dung in many places.

2
kirk21 19 minutes ago 0 replies      
Solar panels + batteries will have an enormous impact in a lot of countries. Once people buy a stove it is 'free' to use vs having to buy fuel every time (like pellets or ethanol). It will take 10-15 years but it is coming. Combine this with satellite internet and the living standard will radically increase.
3
aaron695 1 hour ago 0 replies      
Sounds to me like it is working, is what happened.

> Although these cookstoves produce fewer emissions than open fires, burning biomass fuels in them still releases plenty of toxins

For toxins that actually matter, that's smoke in layperson terms.

So what if the benefits are only partially there. It still rocks.

It makes me suspicious that the study where all the stovetops were no longer being used was just picked as an example because it failed.

And as to electricity and gas being better, that's nice, the doers can continue down the biomass efficiency road while this is still being discussed.

4
stevetrewick 1 hour ago 2 replies      
>Perhaps more research could apprehend what actually works

No, not really. Since the obvious solution (i.e. the one that has worked everywhere in the world it's been tried) is affordable grid scale electricity - which at this point means fossil or nuclear - and the west is apparently hellbent on preventing the developing world from having this, it's unlikely any future research will come up with anything of any actual utility.

5
eveningcoffee 1 hour ago 4 replies      
I do not have time to read this bag of words but I searched for the word "chimney" and I did not find it.

See, there is your problem and a big fat hint for a solution.

Most countries in the northern hemisphere solved this problem more than a century ago and it amazes me how this solution is not even mentioned once.

6
pakled_engineer 1 hour ago 0 replies      
A SolSource solar cooker I gave to a friend's family in the Rajasthan desert still works as advertised a year later. These are desperately needed around refugee camps where locals clash with refugees looking for firewood.
7
SixSigma 1 hour ago 1 reply      
"We know what's best for rural Indian poor" proclaims millionaire with her own catering.

Still, at least India has a rocket going to Mars even if its children don't have toilets and clean water.

A New Biography Traces Allen Dulles and His Cabal theintercept.com
6 points by pavornyoh  1 hour ago   2 comments top 2
1
cryoshon 58 minutes ago 0 replies      
"And today we've largely returned to the balance of power Dulles set up in the 1950s. As Jay Rockefeller said in 2007 when he was chairman of the Senate Intelligence Committee, 'Don't you understand the way intelligence works? Do you think that because I'm chairman of the Intelligence Committee that I just say I want it, give it to me? They control it. All of it. All of it. All the time.'"

I get the feeling that things are probably worse today than in 07 regarding secrecy. If this tidbit is true, it effectively means that the intelligence agencies are far beyond any kind of political oversight.

EDIT:

"Whatever its funding sources, the evidence suggests the Safari Club was largely the initiative of these powerful Americans. According to Heikal, its real origin was when Henry Kissinger, then secretary of state, talked a number of rich Arab oil countries into bankrolling operations against growing communist influence on their doorstep in Africa. Alexandre de Marenches, a right-wing aristocrat who headed France's version of the CIA, eagerly formalized the project and assumed operational leadership. But, Heikal writes, The United States directed the whole operation, and giant U.S. and European corporations with vital interests in Africa leant a hand. As John K. Cooley, the Christian Science Monitor's longtime Mideast correspondent, put it, the setup strongly appealed to the U.S. executive branch: Get others to do what you want done, while avoiding the onus or blame if the operation fails."

This is a way that the world really works, in plain text. The governments ask their buddies in the oil companies for geopolitical actions, then get a policy kickback later on after the dirty work is done. Disgraceful.

2
ddp 47 minutes ago 0 replies      
There's a good interview with the author on Democracy Now (make sure you watch both parts):

http://www.democracynow.org/2015/10/13/the_rise_of_americas_...

The Black Box of Product Management medium.com
28 points by saadatq  3 hours ago   16 comments top 6
1
notacoward 2 minutes ago 0 replies      
From an engineer's perspective, the problem with product management is often lack of a definable process. How do the PM's priorities correlate to actual user priorities? I've never had a PM who showed actual names and numbers or anything quantitative to show this relationship. If I could see that, it might be easier to understand why they're willing to tie up half of engineering for a year to implement a feature that not one of those engineers believes will actually matter, while other apparently low-hanging fruit have to go unpicked. It's OK if they're wrong, so long as there seems to be a rational process involved. I've worked with some pretty good PMs, but even they seemed to be good primarily because they had good gut instincts and not because they applied a rigorous process. In other cases, PM decisions seemed to be based on pure personal bias unchangeable by facts. Anyone who has lost a year of their life because of a PM's ten-second decision is going to be less keen on PMs as a species for a long time.
2
RyanZAG 16 minutes ago 0 replies      
> One day, there may be graduate degrees and definite career paths to product management, but not today.

What he has described is a multi-disciplinary generalist with knowledge of each business sector and training in how to manage organizational goals and people. There is actually a current postgraduate degree for that exact task - the MBA.

3
mattzito 1 hour ago 1 reply      
It's a really long article explaining and justifying what "Product" is.

I think a much simpler explanation is that product management is the hub that connects all of the other groups together in a unified and consistent fashion.

Is marketing up to speed on exactly what is launching a month before it launches, so they can prep content and PR people? Do sales people understand exactly how we should or should not be selling a new feature? Do finance and legal know how we're going to sunset that old piece of functionality without violating contractual agreements with customers?

That's all product management. And then you get into the roadmap side of things, but even that is largely in service of the above goal.

I'm not one of those product people who thinks that all companies need product managers. But I think in the end, someone ends up filling that need, whether they have a formal title or not. It could be a sales exec, or a presales engineer, or a head of engineering with good business sense, but that's the job function they're stepping in to fulfill.

4
mbesto 1 hour ago 1 reply      
I wouldn't necessarily agree with the part about "why product management exists".

Product management exists because customers need a clearly and concisely communicated path to their benefits today and in the future. Product developers need a clearly and concisely communicated path on what to build to create value for customers.

Successfully marrying those two is not an easy endeavor, and the way in which you do it isn't a one-size-fits-all orchestration. Hence why product management is seen as something of a dark art.

Here's what I do know. Companies without strong product management expose themselves to higher customer churn and longer development-to-value timelines.

5
jarjoura 1 hour ago 2 replies      
I find here in the valley companies value engineers who want a say in product direction. So we have companies filled with engineers who don't want to just be shoved a task and relegated to coding monkey status. I think that's where most of the ambiguity exists.

As I see it, a PM should be your team's CEO and the lead engineer should be the CTO. Together the pair of you are delivering on a single vision, one person outwardly and the other inwardly. Yet for the magic to work, the entire team needs to be on the same page. No one person should be setting the goals for everyone else. Simply put, the PM isn't the engineers' bitch, and engineering isn't the PM's bitch.

6
zzalpha 1 hour ago 1 reply      
Interesting piece...

As a relatively newly minted Product Manager, I struggle, every day, to figure out what, exactly, my job is, other than to abuse commas (I'm in a rather odd spot in that I have no mentor and weak senior leadership, so I'm largely responsible for defining my own role in an organization that doesn't really grok what product management involves).

One thing I've found is that I can't easily put my finger on what exactly I do... "leadership + coordinator/facilitator" is the best I can come up with. It's something that's troubled me for a while now, but this post makes me realize that maybe that's just normal, which is rather nice to see.

Still doesn't mean I know what the hell I'm doing, but that's a different problem...

Life Is Rescues newyorker.com
4 points by jeo1234  14 hours ago   discuss
In 1972, Scientists Discovered a Two Billion-Year-Old Nuclear Reactor In Gabon iafrikan.com
125 points by tefo-mohapi  10 hours ago   30 comments top 8
1
Eric_WVGG 5 hours ago 0 replies      
Stephen Baxter's Manifold: Space features one of these in a sort of Planet of the Apes scenario. Great book. https://en.m.wikipedia.org/wiki/Manifold:_Space
2
dj-wonk 4 hours ago 5 replies      
> Davis and co point out that the Oklo data can also constrain changes in other constants, such as the ratio of light quark masses to the proton mass. To date, this work is consistent with these constants being constant.

In the history of science, how many 'constants' have, so far, been shown not to be constant?

3
AnAfrican 5 hours ago 2 replies      
As a side-note, it's always weird to me to see places like Gabon or Congo called "West Africa".

It's clearly on the West Coast but within Africa, the West starts after Cameroon.

4
legulere 7 hours ago 0 replies      
> The one exception was a shallow reactor zone at a place called Bangombé, some 30 kilometres from Oklo, although this has largely been washed out by ground water.

That doesn't sound that safe to me after all.

5
the_watcher 1 hour ago 1 reply      
This is simply astounding.
6
InclinedPlane 8 hours ago 0 replies      
One interesting thing about this is that it represents a race between the concentration of uranium into high-grade ores through geological processes (which requires the Earth to have formed, and so on) on the one hand, and the reduction in abundance of U-235 in natural uranium over time due to radioactive decay on the other.
7
notdonspaulding 2 hours ago 2 replies      
FWIW, the title on this post scans like the number 28 as opposed to the number 2 billion (to my eye, anyway).
8
dang 9 hours ago 1 reply      
Static Website Generators Are the Next Big Thing smashingmagazine.com
195 points by jimsteinhart  8 hours ago   129 comments top 46
1
jedberg 2 hours ago 3 replies      
When I was heading up reliability at Netflix, we considered, and even began evaluating, turning the whole thing into one big static site. Each user had a custom listing, but generating 60+ million static sites is a very parallelizable problem.

At the time, the recommendations only updated once a day, but an active user would have to dynamically load that content repeatedly, and at the same time, the recs were getting updated for users who hadn't visited that day. By switching to static, we could generate a new static site for you every time you watched something (which could change your recommendations), and increase reliability at the same time, so it would have been a much better customer experience. Unfortunately we couldn't get enough of the frontend engineers to buy into the idea to get it off the ground, and also they were already well along the path to having a data pipeline fast enough to update recs in real time.
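As an illustration of how embarrassingly parallel that per-user generation is, here is a toy sketch in Python (not Netflix's actual pipeline); the recommendation lookup, user list and output layout are all hypothetical.

  from concurrent.futures import ProcessPoolExecutor
  from pathlib import Path

  def render_user_page(user_id):
      # Stand-in for a real per-user recommendations lookup.
      recs = [f"title-{user_id % 7}", f"title-{user_id % 11}"]
      items = "".join(f"<li>{title}</li>" for title in recs)
      out = Path("sites") / f"{user_id}.html"
      out.write_text(f"<html><body><ul>{items}</ul></body></html>", encoding="utf-8")
      return user_id

  if __name__ == "__main__":
      Path("sites").mkdir(exist_ok=True)
      with ProcessPoolExecutor() as pool:  # fan the user list out across CPU cores
          list(pool.map(render_user_page, range(1000), chunksize=50))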

2
ts330 6 hours ago 4 replies      
I'm not entirely sure they're the next big thing. More likely, 15 years ago, people who were used to static sites began moving to dynamically generated ones as sites became more complex. They were the new thing then. Now we have a load of people who have grown up with dynamically generated sites and are suddenly discovering the benefits of static sites - thanks in part to the proliferation of tools that are easy to use.

It's the usual boomerang cycle of discovery and adoption.

Both types of sites have their benefits and it's a balancing act to use the right tool for the job. It's getting this right that comes with experience and an understanding of the current pitfalls of each. It's the rough edges that push people in the other direction and without the experience of the pitfalls of each, it's inevitable that people start predicting that one solves all the problems facing the other.

I fully expect the usual over reliance on the wrong type of tech for the sake of it being the current hotness and then an over correction in the other direction the moment we have a new generation of developers.

3
spjwebster 5 hours ago 3 replies      
Every time static generation rears its head, I'm reminded of Yahoo!'s ... unique... take.

Back in 2006 when I worked for Yahoo!, they had a CMS / template management system called Jake that statically generated templates for the PHP-based frontend servers to evaluate at request time. The idea was that you put as much of your logic as possible into the template generation layer, leaving the request-time logic to handle the stuff that changed request by request.

Now, that all sounds quite reasonable, but the two layers were written in different languages. The pre-template-generation logic was written as inline Perl (plus a little custom syntax, because why not), while the dynamic frontend logic was written in PHP. Perl was frequently used to generate chunks of PHP code to be executed by the frontend servers, and sometimes this PHP code wrote chunks of inline JavaScript. To say that debugging said JS was fun would be an understatement.

4
CM30 1 hour ago 0 replies      
It's an interesting idea, and I see the appeal if the site is fairly basic, but I think there's one thing people are forgetting here.

You're outsourcing half your site to third parties, and basically letting them do whatever the hell they like with it. Disqus comments? Better hope the people behind that system don't decide to outlaw comments about the thing your website is about. Javascript embedded shop system? Good, so long as you don't need to modify the look very much and don't mind all your data being hosted in a different part of the world (like, the US for people in other regions).

And if they decide that all your data needs to be shared with the NSA or some other government organisation, then tough luck. If they're hacked... well, tough luck again.

Without hosting such systems yourself, you're relying on a lot of third parties to be transparent, honest and respectful of your privacy (and that of your visitors). It's basically like a return to the days of free hosting and services like Bravenet.

5
davexunit 5 hours ago 2 replies      
The reason I like static site generators so much is because it allows me to treat my website as a program that outputs a website. Take some posts in whichever format you prefer, write a program to parse them as a tree of HTML nodes, insert them into templates for HTML pages, Atom feeds, etc. It's all just plain text code and markup, no stateful database with a dynamic web application in front doing everything.
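A minimal sketch of that "program that outputs a website" idea, using only the Python standard library; the posts/ and site/ directory names and the inline template are assumptions for illustration, not any particular generator.

  import html
  from pathlib import Path

  TEMPLATE = ("<!doctype html><html><head><title>{title}</title></head>"
              "<body><h1>{title}</h1><div>{body}</div></body></html>")

  def build(src="posts", out="site"):
      out_dir = Path(out)
      out_dir.mkdir(exist_ok=True)
      for post in sorted(Path(src).glob("*.txt")):
          # Escape the plain-text post and keep paragraph breaks.
          body = html.escape(post.read_text(encoding="utf-8")).replace("\n\n", "<p>")
          page = TEMPLATE.format(title=post.stem, body=body)
          (out_dir / f"{post.stem}.html").write_text(page, encoding="utf-8")

  if __name__ == "__main__":
      build()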
6
thenomad 6 hours ago 2 replies      
I've been a fan of static websites for a long while now.

In addition to page load issues, they also more or less completely solve the Slashdot effect (aka the Reddit Hug Of Death, these days). A competently-configured Nginx server on a 512mb VPS, serving static-only content, will handle ridiculous amounts of traffic without flinching.

Ever since a front-page mention on BoingBoing took down my feature film BloodSpell's site immediately after release in 2007, avoiding server load on a site has been high-priority for anything I'm launching that is likely to have bursty traffic.

It's nice to see usable tools for managing larger sites with a static generator developing and becoming popular.

7
geraldbauer 3 hours ago 0 replies      
FYI: I've put together a showcase of the world's greatest static sites [1] (using the Jekyll machinery). Examples include: Bootstrap, Google Polymer, Facebook React, Open Data Handbook v2, PHP: The Right Way and many more. Cheers. [1]: http://planetjekyll.github.io/showcase
8
intrasight 4 hours ago 1 reply      
I've only done static websites for a long while now. I created ThinCMS as a browser-based tool for building and publishing static web sites. It came about after I learned XSLT well enough to bend it to my will. I used it to build several public and private web sites, including two iterations of longwoodgardens.org (they've since moved to Drupal) and pittsburghtoday.org (where it is still used). XSLT ( and probably other static templating engines) is perfectly capable of generating complex nested navigation. The templates themselves are nested three layers so as to keep things DRY.

The PittsburghToday site is representative of the idea that a static web site is only static in the technical sense of the back-end content serving. The front-end is still dynamic since the data for the charts is being obtained from Google Docs and the Twitter feed from Twitter, etc.

I always felt like the odd man out, so I am glad to see strong interest in static web sites nowadays.

9
geraldbauer 3 hours ago 0 replies      
FYI: Join the "movement" and start a static site user group. For example, I've started the 1st one in Europe, that is, Vienna.html [1]; others in the U.S. include Static Web Tech in San Francisco [2] and {Static is} The New Dynamic in New York City [3]. Cheers. [1]:http://viennahtml.github.io [2]:http://www.staticwebtech.com [3]:http://www.meetup.com/The-New-Dynamic
10
tetraodonpuffer 4 hours ago 1 reply      
I love pelican as a generator, it's great

The one static generator I wish there was (unless there is one and I just haven't found it) is one that would take a tree of code files and display it kinda like github does, in a browsable file browser hierarchy with syntax highlighting when drilling down to individual files.

Kind of like a precompiled static file browser, there are several dynamic file browsers around but they all require server-side code (php usually) to do the directory listing and so on, but I think it should be possible to precompute all the directory display pages with symlinks to the individual files, and do the highlighting in those in JS/CSS

I might end up writing this at some point as it's definitely an itch I'd like scratching unless it does exist already and somebody kindly points me to it
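Something close to this can be sketched with nothing but the Python standard library: walk the tree, write one index.html per directory and an escaped <pre> page per file, and leave real syntax highlighting to client-side JS/CSS as suggested above. The src/ and browse/ paths are assumptions.

  import html
  from pathlib import Path

  def render_tree(src="src", out="browse"):
      src_root, out_root = Path(src), Path(out)
      dirs = [src_root] + [p for p in src_root.rglob("*") if p.is_dir()]
      for directory in dirs:
          target = out_root / directory.relative_to(src_root)
          target.mkdir(parents=True, exist_ok=True)
          links = []
          for child in sorted(directory.iterdir()):
              label = html.escape(child.name)
              if child.is_dir():
                  links.append(f'<li><a href="{child.name}/">{label}/</a></li>')
              else:
                  # One escaped <pre> page per source file.
                  code = html.escape(child.read_text(encoding="utf-8", errors="replace"))
                  (target / f"{child.name}.html").write_text(f"<pre>{code}</pre>", encoding="utf-8")
                  links.append(f'<li><a href="{child.name}.html">{label}</a></li>')
          (target / "index.html").write_text("<ul>" + "".join(links) + "</ul>", encoding="utf-8")

  if __name__ == "__main__":
      render_tree()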

11
bigethan 5 hours ago 1 reply      
The thing that always pulls me back to WordPress is the ease of creation and hosting. Drag an image into your text, and pop, it's uploaded, hosted, and linked to. Write something in the interface and bam, it's online and reliable. Every static site generator I've played with (and there have been TONS) solves 80% of the problem.

I'd love an app that gave me that ease of hosting and creation and generated a static site from there (hook it up to an S3 bucket or something). I'd pay for sure.

Currently I'm using Nginx w/ SSIs for lightly dynamic sites. Works well enough and is very very simple.

12
Loque 6 hours ago 3 replies      
The only problem I feel this really solves is caching, and for that specific problem, generating static pages may be a (work-aroundy) solution to consider, but in general I don't think static website generators are going to be the next big thing...

When you look at page speed, not much of the slow-down is from servers delivering pages, but browsers having to digest HTML/CSS/IMG/JS

I guess it all depends on what you are trying to achieve. Thanks as always for the article, interesting to see someone taking this with both hands.

13
ohitsdom 5 hours ago 10 replies      
I'm still looking for a static site generator that's as easy to use as Wordpress (so with a UI, not markdown). Anyone know of something that fits the bill?
14
stoikerty 5 hours ago 0 replies      
I can't wait for ghost's API.

I've been following their repo for a while, hopefully by next year they'll be far along for it to be usable.

Ghost has an excellent editor and it would be awesome to have a static site built in any way you like that links straight to your blog posts via API calls.

They have a trello card that mentions it: https://trello.com/c/QEdjRlgK/67-open-public-api-via-oauth-a...

and are working on it as we speak :) https://github.com/TryGhost/Ghost/issues/4004

I know there's a wordpress API as well but I find wordpress too bloated IMO.

15
rmdoss 3 hours ago 0 replies      
That's the cool tool they used to test the performance of the site:

https://performance.sucuri.net/

16
yuvadam 6 hours ago 1 reply      
I would be equally interested if the article was titled "Why Static Website Generators Are Awesome". They are not the "next big thing", they've been around for years.
17
deanclatworthy 5 hours ago 1 reply      
I've been using static site generators a lot for client work over the last two years. I started off with Jekyll, but unless you have under 100 pages, the build process gets painfully slow (trust me, I have micro-optimised).

I've since started using http://gohugo.io

It's lightning fast with 1000s of pages, and quite easy to pick up.

18
milge 1 hour ago 0 replies      
I'm currently (slowly) working on a static blog that utilizes only HTML, JS, and CSS. I liked jekyll, but wanted something with no backend technology requirement:

https://github.com/milge/lilblog

19
eatonphil 6 hours ago 0 replies      
I am working on a project, http://blogful.me, that combines a hosted static blog generator with a solid admin backend including post syndication, an embedded analytics dashboard, authoring tools, and an API if you don't want to use the frontend.

There definitely appears to be a lot of interest in this space because you get the best of all worlds. Static site generators definitely seem like the way to go for all but actual web applications.

20
bigethan 5 hours ago 1 reply      
... because Javascript and external APIs can now do so much.

If I can do everything with JS on the visitor's browser, why not host some shell HTML on S3 and never worry about a server? Maybe hit AWS Lambda if need be for one specific thing? Dunno. The age of the do-everything server seems to be coming to a close.

21
ak39 6 hours ago 1 reply      
Ok, "static" here means no RDBMS-backed website. But you can still use statically generated JSON resources produced from a db once off. These resources can then be "filtered" and "combined" without the need for databases (carefully not using the words JOIN or WHERE).

Sounds like a great idea to overcome the need to obsess about connection multiplexing.
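A hedged sketch of what that could look like at build time, in Python with SQLite; the products table, its columns and the output paths are purely hypothetical. The browser then fetches and filters the JSON with plain JS, with no database behind the site.

  import json
  import sqlite3
  from collections import defaultdict
  from pathlib import Path

  def export_json(db_path="shop.db", out="site/data"):
      out_dir = Path(out)
      out_dir.mkdir(parents=True, exist_ok=True)
      conn = sqlite3.connect(db_path)
      conn.row_factory = sqlite3.Row
      by_category = defaultdict(list)
      for row in conn.execute("SELECT id, name, category, price FROM products"):
          by_category[row["category"]].append(dict(row))
      conn.close()
      # One JSON file per category; the client filters/combines these in JS.
      for category, items in by_category.items():
          (out_dir / f"{category}.json").write_text(json.dumps(items), encoding="utf-8")

  if __name__ == "__main__":
      export_json()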

22
EA 5 hours ago 1 reply      
Anyone remember Noah Grey's "Grey Matter"?

It was an inspiration to what became WordPress

https://en.wikipedia.org/wiki/Greymatter_(software)

23
kefka 7 minutes ago 0 replies      
I do something I call Dyna-Static.

I can run a static page off an Apache (or any wwwserver) instance. Just chuck files in /var/www/ where you want them.

Now where it gets interesting is I use Node-RED to generate the pages; content and all. I want headers? It's a variable. I want ads? It's another variable Google provides. I want chat? Easy (I can do it with Node-RED or a 3rd party). I can bridge that webchat with my or someone else's IRC room.

Now, I can script it so the pages are updated from the Node-RED server to the webserver. They can easily sit on the same box, as Node-RED takes few resources.

And the kicker is that I could get that done in an hour or so. Check out Node-RED. It really is that amazing.

24
AnkhMorporkian 6 hours ago 7 replies      
You know what I've always wanted, but my searches have led me to believe that it (inexplicably) doesn't exist?

I want a simple site generator. I don't want markdown, I don't want a fancy templating engine. I want some simple templating system that takes in normal HTML and generates pages from simple templates I define. I want to shove in some arbitrary HTML and have it spit out a site using some base templates.

To the best of my knowledge, that doesn't exist. It would be perfect for someone like me who wants to keep a website updated, but doesn't always want to run PHP on the server for something as simple as that.

I implemented a shoddy version of it on my own, but it's far from ideal. I'm pretty astounded there's not a well thought out version of it out there, considering how useful it seems it would be.
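For what it's worth, a tool like the one described can be sketched in a few lines of Python: arbitrary HTML fragments in, full pages out, with a single hand-written base template and no markdown or templating engine. The file layout and the {content} placeholder are assumptions for illustration.

  from pathlib import Path

  def generate(pages="pages", base="base.html", out="out"):
      # base.html is an ordinary HTML file containing a {content} marker.
      template = Path(base).read_text(encoding="utf-8")
      out_dir = Path(out)
      out_dir.mkdir(exist_ok=True)
      for fragment in Path(pages).glob("*.html"):
          page = template.replace("{content}", fragment.read_text(encoding="utf-8"))
          (out_dir / fragment.name).write_text(page, encoding="utf-8")

  if __name__ == "__main__":
      generate()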

25
ramon 3 hours ago 0 replies      
I think you guys are thinking small when thinking about a static site. Changing the content dynamically with JavaScript only is still considered a static website; this doesn't make the frontend static at all, it's very dynamic! But there's no more need for PHP, Perl, etc.

Best Regards

26
threatofrain 3 hours ago 0 replies      
I think static site generation will simply be a feature of a build tool like Webpack or Gulp, maybe as a plugin, and either way there will be an api for developers. Or it will simply be part of a larger build chain / automation system somehow.

Static site generator doesn't mean there's no backend. A website is called 'dynamic' when its operation depends on communicating with a server. The JS logic delivered to the client can range from animation to async http requests.

The distinction between "static site generator" and Webpack / Gulp is very gray. It all depends on what you want to do with your client-side JS logic.

27
Ideabile 6 hours ago 0 replies      
I think that static website generators aren't by themselves the next big thing. Mostly it's about how the codebase influences the content of the website, and using the power of existing versioning tools is indeed an advantage. By themselves they are quite limited.

IMO, they are the next big thing if they are contextualised in a microservice structure, and that's why I built Monera - http://github.com/Ideabile/monera

What do you think?

28
luxpir 6 hours ago 1 reply      
Haven't done any benchmarks myself, but I'd be keen to find out if a static site loaded up with JS design elements, a product store, comments and analytics code would load any quicker than a CMS with PHP caching, system caching and microcaching to handle bursts on a lightweight webserver such as Nginx.

The article is frustratingly biased in this regard. Static sites should just play to their strengths, otherwise you probably want a CMS that will act like a static site when it needs to.

29
cryptos 5 hours ago 1 reply      
Most of the static site generators have serious usability problems. They are simply not usable for many users using WordPress today.
30
manishsharan 5 hours ago 1 reply      
I host my company's web site on AWS CloudFront using my homemade static website publisher, which minifies my JavaScript, CSS and HTML and gzips them before pushing to CloudFront. The pages load fast, but the downside is that making changes to a page, like fixing typos, is not as simple - though that's probably an issue with my generator as well as the CloudFront cache configuration. I don't use Jekyll or Hyde or other static publishers because I wanted to write one in Clojure, and I figured I could write one in Clojure faster than it would take me to learn Jekyll etc. You can check out (and critique) my website by visiting https://www.videocloudmanager.com. I run a business video hosting service.
31
look_lookatme 3 hours ago 0 replies      
My favorite kind of static website generator is lazy, just in time, and supports TTL based regeneration on a per page level.
32
danneu 5 hours ago 1 reply      
I think the simplest static site generator is a command that visits all of your routes and saves the response html into a `build/` folder.

That way you can use whichever framework/stack/templating/database you're already familiar and productive with, and in the end you're just deploying a static build folder from localhost.

I started doing this when it came down to hacking Jekyll to implement something that's trivial to do in a microframework, so I went with microframework + page caching. I do the build and deploy with a gulp task that I'd like to generalize into a gulp module.
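A minimal sketch of that approach with only the Python standard library; the route list and the localhost dev-server URL are assumptions, and a real build step might crawl links or read the routes from the app itself.

  import urllib.request
  from pathlib import Path

  ROUTES = ["/", "/about", "/posts"]        # hypothetical routes
  BASE_URL = "http://localhost:8000"        # hypothetical local dev server

  def snapshot(build_dir="build"):
      for route in ROUTES:
          with urllib.request.urlopen(BASE_URL + route) as resp:
              body = resp.read()
          rel = "index.html" if route == "/" else route.strip("/") + "/index.html"
          target = Path(build_dir) / rel
          target.parent.mkdir(parents=True, exist_ok=True)
          target.write_bytes(body)          # deploy the build/ folder anywhere static

  if __name__ == "__main__":
      snapshot()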

33
CamatHN 5 hours ago 0 replies      
To an extent I agree. I think we are just getting better at recognizing the security and performance optimizations that are staring us in the face when content doesn't change, rather than making complex websites completely static as static site generators do. I think we will see an increase in static elements mixed with dynamic elements in a more comprehensive way.

------

I host tens of more or less static sites (contact forms being the most dynamic elements) which are generated on the spot from one PHP (Laravel) installation.

Anyone know the best way to cache the html/css statically to serve?

34
bjfish 3 hours ago 0 replies      
One static site CMS that I know of is: http://www.webhook.com/

It's open source. Design and content is edited collaboratively and it deploys a static site.

Are there any other CMS systems designed to deploy static sites?

35
bhanu423 5 hours ago 0 replies      
I am working on an open source static website generator/manager for college professors to create and manage course webpages. It's actually a desktop app which runs on Windows, Linux & Mac and can directly push the webpages created to GitHub Pages or an SFTP server. Do check out the project at https://github.com/navya/Kalam .
36
jhack 5 hours ago 0 replies      
I'd love to use static site generators for my client work but they usually ask for features like e-commerce that completely rule out its use.

Usability is also an issue. Wordpress is a far more friendly environment for them to make changes or create a new post than creating a text file with specific formatting and running a script.

37
stadeschuldt 4 hours ago 0 replies      
I use Middleman as a static site generator for a page hosted on my Raspberry Pi: http://pi.tafkas.net

As I am more of a Python guy I wonder if there is a similar (as in not primarily for blogging) generator for Python?

38
nstart 4 hours ago 0 replies      
I'm currently building my own static site CMS. Basically, a WordPress-like interface (at some point) that spits out a static site. My blog http://adnanissadeen.com runs on it. I'm building the CMS over at https://github.com/spartakode/static-cms . Been inactive for a while because I'm just a horrible procrastinator. I'll be finishing up (read: making it usable by the public) at the end of this year hopefully. Will try and offer a hosted service a little later.
39
amelius 6 hours ago 0 replies      
Does this mean that developers and admins will get paid less, because their jobs just got a lot simpler?
40
tmaly 4 hours ago 0 replies      
I remember this custom CMS written in Perl Mason that we used at JupiterMedia (formerly internet.com) back in 2004. It worked well but had some complexity.
41
ohitsdom 5 hours ago 0 replies      
How many more articles about static sites being the "Next Big Thing" before they just become "The Big Thing"?
42
Dolores12 5 hours ago 1 reply      
Movable Type anyone? It's older than WordPress.
43
dschiptsov 6 hours ago 0 replies      
Next to geocities.com, I suppose.
44
sogen 3 hours ago 0 replies      
just an ad for Netlify
45
pistle 6 hours ago 1 reply      
... Next Big Thing For Minimally Dynamic Sites

If your content doesn't change frequently and/or the costs of regenerating the static content is minimized for you, great.

At what point do we see static sites take a fair share of the top-X-trafficked sites? Top 100? 1000? 1,000,000?

This is probably great for a small corp's info site... but then the client asks for a contact form or members/admin secured area, and there we go down the rabbit hole again.

46
elf25 2 hours ago 0 replies      
I'd be happy with a WordPress plugin that rendered static pages. 95% of client sites I've ever done have little to no reason to be dynamic. I know there are valid reasons for a dynamic site; however, IMHO those are rare.
Fail at Scale: Reliability in the face of rapid change acm.org
7 points by jsnell  1 hour ago   discuss
Reasonable System for CSS Stylesheet Structure github.com
42 points by dhruvbhatia  4 hours ago   21 comments top 7
1
bpatrianakos 1 hour ago 0 replies      
These are all great ideas. The only thing I don't like about these types of projects/guidelines is the almost religious, strict adherence to them that some people who pick them up like to rant about.

The examples are great and realistic but they will inevitably break down in any project with just a moderate amount of complexity. When this happens I see people decide that they've done something wrong and think they either need to restructure the UI to fit the guidelines or find a different set of guidelines to fit their project. Both are wrong.

After years of studying these guidelines and trying to craft the perfect, most efficient, small, and understandable set of CSS styles, I've come to the realization that we need to simply accept that there are going to be exceptions, and that's okay. Yes, think in components, give classes sane names, don't use ID attributes, etc., but also know that at some point you're probably going to have to break a rule, and it's not the end of the world. Think about the problem for a few minutes, but don't waste your time trying to shoehorn your front end code to fit a set of guidelines just so you can feel superior because of your strict adherence to RCSS or SUITCSS or whatever. There are more pressing issues than a few unused or single use classes. If you can follow the rules more than 80% of the time, then I say you're as close to perfect as you'll ever be. Anything else is reaching for an impossible standard.

It's funny that the stricter you adhere to some of these guidelines, the less productive you become, as you spend more time thinking about how to create components so they fit the rules rather than getting the damn styles to look however you want.

My point being this: I love all the different philosophies and think they're all worth studying but let's be realistic when it comes time to implement stuff.

2
MatekCopatek 3 hours ago 1 reply      
I always feel like these naming conventions are just one big workaround for the pain of a global namespace. We should tackle it with tools like CSS Modules[0], not with something as flimsy and error-prone as a naming convention.

[0] https://github.com/css-modules/css-modules
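For context, a minimal sketch of what that looks like from the consuming JavaScript, assuming a bundler already configured for CSS Modules (the file and class names here are hypothetical):

  // button.css (not shown) defines an ordinary class such as `.normal`.
  // With CSS Modules, importing the stylesheet yields an object mapping
  // the local class names to generated, collision-free ones.
  import styles from './button.css';

  document.querySelector('#buy').className = styles.normal;
  // styles.normal resolves to something like "button__normal___1k3nf",
  // so the class never lands in the global namespace.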

3
Wintamute 2 hours ago 1 reply      
Hasn't this already been solved with SUITCSS?

https://github.com/suitcss/suit/blob/master/doc/naming-conve...

4
smpetrey 2 hours ago 1 reply      
I was under the impression that child selectors are specific in nature, and they should be used later in the sheet, when declarations become increasingly specific.
5
adamkochanowicz 3 hours ago 1 reply      
Another alternative to BEM: http://getkickstart.com/amadeus/
6
ilikescience 3 hours ago 6 replies      
I like how readable it is, but nesting selectors:

 .parent .child .element {

is bad for performance, thus BEM's:

 .parent--child__element {

7
at-fates-hands 3 hours ago 1 reply      
Naming conventions seem to be old school now that virtually everything is being done with JS.

Using webpack's css-loader changes the game completely, as pointed out by Mark Dalgleish here:

https://medium.com/seek-ui-engineering/the-end-of-global-css...

Writing maintainable CSS is now encouraged, not by careful adherence to a naming convention, but by style encapsulation during development.

I'm sure there will continue to be more JS solutions to getting rid of CSS naming conventions, since it seems everything is being done with JS these days.
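For reference, a minimal sketch of the kind of webpack 1-era configuration that enables this encapsulation, assuming style-loader and css-loader are installed (entry and output paths are placeholders):

  // webpack.config.js - the `modules` flag asks css-loader to scope
  // class names locally to each file instead of globally.
  module.exports = {
    entry: './src/app.js',
    output: { path: __dirname + '/dist', filename: 'bundle.js' },
    module: {
      loaders: [
        { test: /\.css$/, loader: 'style-loader!css-loader?modules' }
      ]
    }
  };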

V8 crankshaft vs. arguments object mrale.ph
37 points by luu  5 hours ago   13 comments top 3
1
exogen 2 hours ago 2 replies      
Is it common for people to enable these deoptimization messages (e.g. "Unsupported phi use of arguments") somehow when they're optimizing code? I know there are linters that might warn you about some cases, and the Chrome dev tools show a little hazard symbol next to deoptimized functions... is there a more streamlined workflow where you see deoptimization messages from the engine in a nice way (instead of what I'm expecting to see by enabling some V8 flags, which is a verbose jumbled mess)?

Off topic: the design decision in ES2015 to give arrow functions lexical `arguments` [1] completely baffled me and my coworkers. It breaks the contract that most people think arrow functions abide by, which is that they're a simple transformation to `function() { ... }.bind(this)`.

  function foo() {
    const bar = (a, b, c) => {
      console.log([a, b, c]);
      console.log(arguments);
    };
    bar('a', 'b', 'c');
  }

  > foo('x', 'y', 'z');
  ['a', 'b', 'c']
  { '0': 'x', '1': 'y', '2': 'z' }

In realistic cases it's common to switch to arrow functions just for their brevity or to capture `this` and then all of a sudden things aren't doing what you expect. This has already bitten me several times.

[1] https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...
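A minimal sketch of the usual workaround, assuming plain ES2015 and nothing engine-specific: a rest parameter is bound per function, arrow functions included, so each arrow gets its own argument list instead of the enclosing function's `arguments`:

  function foo() {
    // `args` belongs to the arrow itself and is a real Array.
    const bar = (...args) => {
      console.log(args);
    };
    bar('a', 'b', 'c'); // logs ['a', 'b', 'c']
  }
  foo('x', 'y', 'z'); // the outer 'x', 'y', 'z' no longer leak into bar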

2
doublerebel 3 hours ago 2 replies      

 ...arguments object not being an Array is yet another JavaScript wart that causes nothing but special cases and complexity in optimization pipelines...
So, why can't we make `arguments` into an immutable Array for ES7? Is there any compiler advantage to keeping the special case?
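ES2015 already gives a way around the wart at the source level (a sketch, making no claims about what the optimizer does with it): rest parameters produce a genuine Array, and `Array.from` converts the legacy object when you are stuck with it:

  // Legacy style: `arguments` is array-like but has no Array methods.
  function sumOld() {
    return Array.from(arguments).reduce((a, b) => a + b, 0);
  }

  // ES2015 style: a rest parameter is a real Array from the start.
  function sumNew(...nums) {
    return nums.reduce((a, b) => a + b, 0);
  }

  console.log(sumOld(1, 2, 3)); // 6
  console.log(sumNew(1, 2, 3)); // 6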

3
roflchoppa 2 hours ago 0 replies      
man i got excited about crankshafts, i wanted to see it damage something...
RoboVM Is No Longer Open-Source infoq.com
121 points by macmac  9 hours ago   77 comments top 13
1
hitekker 6 hours ago 4 replies      
Copy-pasting what I think is the most important part of the article:

 RoboVM is a complicated piece of technology that we have worked hard for years to create. Over the past few months, we have seen competitors actively exploiting our good faith by using our open source code to compete with us directly in commercial products. On the flip side, we have received almost no meaningful contributions to our open source code. You can imagine how disappointing this has been to us; we had hoped our initial business model of OSS with proprietary extensions (like our debugger and interface builder integration) would work. But in light of the low contributions and behavior of competitors, we decided to stop automatically releasing changes to the core of RoboVM as open source.
-Zechner

I'm not sure what to think about the competitors-stealing-our-code angle, but the OSS-is-being-used-as-freeware argument is certainly one point in favor of closing/controlling their code (as a business).

2
ex3ndr 4 hours ago 3 replies      
Several months ago, after consulting with the HN community (https://news.ycombinator.com/item?id=9757243) about what we could do with our startup, we open sourced our platform, and now our team feels pretty good. But after a few months we see that powerful companies just silently use our software to solve their needs while making no contributions. We accept small contributions, but we can't count on them, as the platform is much bigger than the contributions.

We tried writing a Wikipedia-like letter to ask for donations - it doesn't work. We tried consulting - it eats too much time. Now we are releasing an enterprise version (https://corp.actor.im) and we decided not to open its sources, as it is almost impossible to earn money without pushing people to pay. And pushing can only be done through limitations, which actually harm people in some cases.

Does anyone know any other solution for this?
3
bad_user 8 hours ago 3 replies      
RoboVM was basically the competition for Xamarin, and Xamarin killed it. RoboVM was cheaper and had a bigger market, since it targeted JVM developers. With RoboVM you weren't even tied to Java the language; it was entirely possible to use a language like Scala. And contrary to Xamarin, you didn't need to pay $999 per year to use IntelliJ IDEA or Eclipse, in other words the state of the art in IDEs, as you do to use Visual Studio with Xamarin. And RoboVM has been open source, which brings benefits like trust: trust because, should things go wrong, you can always fork it.

Well, now is the time for a fork and I hope it will happen.

4
MikeTaylor 8 hours ago 1 reply      
There is no such thing as "no longer open source". The story here is in fact "the copyright holders of RoboVM have made a proprietary derivative".
5
codeulike 8 hours ago 1 reply      
I've been using libgdx and was meaning to explore the whole iOS/RoboVM angle, I finally got around to it this weekend, ironically just in time to catch the kerfuffle around this.

Presumably there is an open-source version of RoboVM still around. Perhaps here: https://github.com/robovm/robovm ... though I gather the problem with that is that it does not include any of the latest iOS 9 work, which was done in a closed repository somewhere.

Looks like the Libgdx guy (Mario?) fought hard to keep a free version of RoboVM for Libgdx users. Not sure how long that can last; it's currently based on self-identifying as a Libgdx user and hence easily abused.

The other irony is that Libgdx used to use Xamarin to target iOS, but they switched to RoboVM because it was free/open.

6
merb 14 minutes ago 0 replies      
I don't think that everything in RoboVM can be closed source, since they are doing the same thing Google did and got sued by Oracle for, just for another platform.
7
seibelj 4 hours ago 1 reply      
I originally thought this was a bad idea, then changed my mind. I was going to contribute to the fork but decided not to. What sealed it was that several competitors to RoboVM emailed me directly, after seeing my posts on the RoboVM fork message boards, begging me to help them. No one wants to pay for anything anymore. RoboVM made the right decision.
8
JustSomeNobody 6 hours ago 0 replies      
Have we just become too selfish for OSS? We're going to end up punishing ourselves by having to use proprietary, closed systems for everything. I don't want to compute in that world. That would be awful.
9
scrollaway 9 hours ago 5 replies      
So let me get this straight... RoboVM is acquired by Xamarin, the company behind Mono (the open source C# implementation)... and they therefore go closed source?

I don't get it.

10
mahyarm 1 hour ago 0 replies      
As a dev, a big thing for me is being able to see the source code for the full callstack in the debugger, and possibly making derivatives of that code in my app. As long as they provide that in some form, then closing the source isn't as big of a deal.
11
synic2 54 minutes ago 0 replies      
People don't respect those who give away their labor for free.

Consider doctors, dentists, lawyers, CPAs, or other higher-prestige professionals. They engage in protectionism to restrict competition and drive up their earnings and rarely if ever give away their services for free, and they are much, much more respected than we are. And on the rare occasion that they do perform some act of charity, such as working as a public defender or providing discounted dental care to the poor, they usually still get paid for it, just not as much, and they make a big show of it so everyone knows that they're taking a huge paycut for a good cause and thus deserve even more respect than they get already.

Programmers, by contrast, are rapidly conditioning the public to expect software to be cheap or free, regardless of how long and difficult it is to produce, to the point where iPhone devs complain they can't sell their apps for the price of a cup of coffee.

Programmers need to worry about their future. They should probably spend their 20s and 30s making as much money as they can writing proprietary software, before they hit 40 and become increasingly unhirable due to ageism, instead of performing unappreciated acts of charity in the form of open source for an ungrateful public.

12
aurelien 2 hours ago 1 reply      
Once again, open source demonstrates its dangers compared to free software. Stop wasting your time; bring progress to fsf.org.
13
Zardoz84 8 hours ago 2 replies      
> Several RoboVM components used to be made available under the Apache 2.0 license while the compiler was open sourced under the GPL license

How can they close source the compiler if it is under the GPL license?

       cached 3 November 2015 20:02:03 GMT