Hacker News with inline top comments - 26 Aug 2015
M, a personal digital assistant inside Messenger wired.com
117 points by jasonlbaptiste  1 hour ago   44 comments top 19
1
roymurdock 34 minutes ago 0 replies      
You have lots of AIs, like Siri, Google Now, or Cortana, whose scope is quite limited. Because AI is limited, you have to define a limited scope, Lebrun says. We wanted to start with something more ambitious, to really give people what they're asking for. This meant the team would need more than AI... Even after bringing neural nets into the mix, he says, the company will continue to use human trainers for years on end.

I can't help but picture a large, fluorescent-lit room of jolly old British "trainers" in safari khakis running around admonishing misbehaving AI for telling bad jokes, all the while trying to juggle placing calls to the DMV and restaurants to make reservations for 700 million messenger users.

2
msvan 43 minutes ago 2 replies      
This seems like a move into the Chinese-style mega-app where you can do everything from one app - buy shoes, talk to your friends, figure out when the train departs. Facebook already has two top-50 apps, and creating new, unproven apps and promoting them to that point is expensive. So, to increase influence they are putting more into the existing apps.
3
mbesto 1 hour ago 2 replies      
The first thing I see when someone asks "find me a good burger place in Chicago" is "how can companies game this through official ($) or artificial (spam) means?"
4
btbuildem 1 minute ago 0 replies      
Ah, Mechanical Turk strikes again..
5
andybak 8 minutes ago 0 replies      
The article title has the word 'Facebook' in it, whereas the post just mentions 'Messenger'. Is 'Messenger' clear enough? I'm old enough to think that refers to Microsoft Messenger!
6
tomg 46 minutes ago 0 replies      
Have an ad network buy products on my behalf? No thanks.
7
__michaelg 8 minutes ago 0 replies      
It looks like you're writing a letter. Would you like help?
8
dhutchinson 11 minutes ago 0 replies      
I can appreciate FB trying to innovate, but with the ongoing privacy issues and the fact that it seems they are just repackaging existing tech, I'm just not into it.
9
viksit 30 minutes ago 1 reply      
Haha, it looks eerily similar to Myra, the cross platform assistant I launched last week [1]. Including the name. Interesting times.

[1] https://news.ycombinator.com/item?id=10060074

10
marcusgarvey 1 hour ago 0 replies      
Facebook's answer to Magic?
11
viach 1 hour ago 1 reply      
"It can purchase items, get gifts delivered to your loved ones, book restaurants, travel arrangements, appointments and way more"

So it can spend my money on my behalf?

12
apetresc 1 hour ago 0 replies      
Anyone figured out how to sign up for the test? Is it a contact you can add to your Messenger list, like chatbots of old?
13
zkhalique 1 hour ago 1 reply      
My main question is - how did facebook make a HUGE picture show up when you share this page on facebook? Anyone know?
14
chimeracoder 1 hour ago 1 reply      
15
umanwizard 1 hour ago 3 replies      
It's M, not Q.
17
zkhalique 1 hour ago 0 replies      
18
nedwin 1 hour ago 0 replies      
19
swalsh 37 minutes ago 0 replies      
YC Stats ycombinator.com
89 points by macleanjr  2 hours ago   52 comments top 21
1
sydneyliu 1 hour ago 2 replies      
Sam also just tweeted that 300 YC companies are no longer around. Amazing how transparent they are and how great YC is at picking and training. So many really crazy ideas that most would think are insane, and YC is able to find the ones that make sense and help them. Seems like that's a bit less than 1/3 of all YC companies.

Tweet is here: https://twitter.com/sama/status/636586179970752512

2
eloff 5 minutes ago 0 replies      
I think there's a big elephant in the room here. Companies usually take time to fail, especially with VC funding. The companies from the newer batches are going to be skewing those fail statistics. If you only look at companies from 4 years ago and older, you might get a more realistic impression.

It still won't be 90% though, but YC companies are widely known to not be representative. It's the highest profile accelerator, so it attracts the best talent. Just like people who graduate from Ivy League schools earn more on average, but mostly that's because they attract people who are above average to begin with.

3
harmegido 35 minutes ago 4 replies      
Ok, so I did some quick numbers on performance. It looks like the first batch was in 2005, meaning ycombinator has been at this for 10 years. Have they always funded with 120k? Assuming that (and not present valuing the older money):

Total Investment: ~$131,600,000
Total Companies Value: >$65,000,000,000
YCombinator's 7% Value: $4,550,000,000
Total Return: 3357%
Annualized Return: 42.5%

Obviously, it costs more than the initial investment, but these are really nice numbers compared to a mutual fund, etc. Did I make any mistaken assumptions?
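For anyone who wants to check the arithmetic, here is a quick shell sketch using bc; the figures and the $120k-per-company, 10-year, no-time-value assumptions are the ones stated above, not official YC numbers:

  # Rough check of the return figures above (the commenter's assumptions, not YC's)
  investment=131600000                                # ~$131.6M total invested
  stake=$(echo "0.07 * 65000000000" | bc)             # YC's 7% of >$65B, ~$4.55B
  echo "scale=4; ($stake - $investment) / $investment" | bc    # ~33.57, i.e. ~3357% total return
  echo "e(l($stake / $investment) / 10) - 1" | bc -l           # ~0.425, i.e. ~42.5% annualized over 10 years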

4
minimaxir 1 hour ago 1 reply      
Interesting way to bury the lede that 32 companies received a Fellowship offer. The estimate was 20. (http://techcrunch.com/2015/07/20/y-combinator-just-introduce...)
5
cm2012 1 hour ago 2 replies      
Considering there is probably a vintage effect, and currently 5% of YC companies are worth over $100 million, that means that most likely 5-10% of founders that join YC will become millionaires over time. That's pretty impressive.
6
npalli 1 hour ago 0 replies      
Interesting twitter update by sama

> Oh, another important stat: about 300 of the companies we have funded have shut down.

This is a hard gig. Even after you get into the top accelerator in silicon valley.

7
lordnacho 1 hour ago 1 reply      
Since applications are opening tomorrow, it would be interesting to have a breakdown of interviews/acceptances based on various criteria: sole founder, has revenue, has users, etc...

It would allow prospective applicants to think about whether to apply, and hopefully keep the pile a manageable size for the people reading it.

8
pesenti 44 minutes ago 0 replies      
It would be great to get stats about exits as well. How many exits? Total valuation at exit? Median/average age at exit? etc.
9
jypepin 1 hour ago 4 replies      
How good or bad are those numbers? Only 8/970 companies are worth >$1B and 40 are worth over $100M.

I assume YC is doing great for themselves, so I'm wondering, what is the rule of thumb to say an investment was successful? (for YC, knowing they invest cheap and early.)

Is it once a company reaches $10M? $1M? When there is an exit?

It would be interesting to know for how many companies YC considers the investment a success (either it already made money, or will even though there's no exit yet).

10
dude_abides 1 hour ago 1 reply      
Number of companies funded by YC that have shut down: ~300

Does this include acqui-hires?

Also, would you know how many companies rejected by YC are worth more than $1 billion? Zero or non-zero? :)

11
dantillberg 1 hour ago 0 replies      
These stats are great, or at least they sound great to an aspiring entrepreneur. I've seen stuff like this for a long time on HN, and a feeling took hold in me that I've only started to really shake recently, a feeling that the best and most determined founders will always get into YC, and that failing to get into (or apply to) YC is an indication of weakness.

But just remember that, even though these numbers look amazing, you can create a business via other means and still possibly achieve whatever sort of success you're after -- you don't have to join an accelerator, or get prestigious seed or venture funding.

Or at least, I think that you shouldn't have to.

12
brayton 1 hour ago 2 replies      
What do they mean by "got their start with us"? I believe many companies have been around for a while before YC. Many already with revenue, traction and funds raised (although this might be the exception to the rule?).
13
bsbechtel 1 hour ago 0 replies      
I would be very interested in seeing a cohort analysis of YC company valuations by batch :-)
14
mattmanser 53 minutes ago 1 reply      
As a clarification, by 'shut down' do you mean failed, or does that include companies that sold or were acqui-hired?

Good to see the 90%-of-startups-fail 'statistic' blown out of the water with some facts!

15
dzlobin 1 hour ago 0 replies      
Nice to see more detailed stats on this stuff. One figure I'd really like to see is the number of companies worth more than $10M & $50M.
16
snake117 1 hour ago 0 replies      
I keep reading that seed investment firms consider it "good" if 4 out of every 10 startups they fund turn out to be something of value. Over time this is proving to be true.
17
julien_c 1 hour ago 1 reply      

 Number of companies we offered to fund yesterday for the first YC Fellowship: 32
That's a figure I was looking for (took the interview yesterday, didn't get in).

18
mdc2161 1 hour ago 0 replies      
another stat from sama on twitter:

> Oh, another important stat: about 300 of the companies we have funded have shut down.

19
nicklovescode 1 hour ago 0 replies      
How many founders have gone through YC?
20
andreasklinger 1 hour ago 0 replies      
> Number of companies in the last batch: 107

Wow.

Just wow.

21
7Figures2Commas 1 hour ago 1 reply      
Who Hacked Ashley Madison? krebsonsecurity.com
230 points by david_shaw  2 hours ago   119 comments top 13
1
philangist 1 hour ago 2 replies      
> They said Avid Life employees first learned about the breach on July 12 (seven days before my initial story) when they came into work, turned on their computers and saw a threatening message from the Impact Team accompanied by the anthem "Thunderstruck" by Australian rock band AC/DC playing in the background.

This reads like a scene straight out of Hackers or some other campy tech movie. Life imitates art.

2
smtddr 1 hour ago 11 replies      
I don't condone this hack, but morals/ethics aside for a moment:

The one positive thing this hack has done is really give serious ammo to the battle for online privacy, because the demographic hit by this hack is the most politically & economically powerful demographic in the world....

3
mahouse 26 minutes ago 0 replies      
OT: where's the source code for the AM website? Is it inside one of the dumps?

Edit: found it in "Ashley Madison 2nd dump 20 GB"

4
deadlycrayon 26 minutes ago 1 reply      
This is pretty poor journalism. Reminds me of the reddit Boston bombing witch hunt. Note that in the comments, Krebs had to be "reminded" to reach out to Thadeus Zu for comment.
5
GigabyteCoin 1 hour ago 2 replies      
>To say that Zu tweets to others is a bit of a misstatement. I have never seen anyone tweet the way Zu does; He sends hundreds of tweets each day, and while most of them appear to be directed at nobody, it does seem that they are in response to (if not in reply to) tweets that others have sent him or made about his work. Consequently, his tweet stream appears to the casual observer to be nothing more than an endless soliloquy.

Perhaps that's all Zu is? A bot, or a covert chat channel of some kind. Perhaps prime numbered words from every third tweet contain the real message, or something like that?

6
lawl 1 hour ago 2 replies      
So Krebs has no conclusive proof for anything?

As he himself admits:

> It is possible that Zu is instead a white hat security researcher or confidential informant

Jeez, how about talking to the police and letting them do their job, or at the very least censoring the name.

This is just a witch hunt.

7
graycat 1 hour ago 1 reply      
Can we also ask, HOW did they hack Ashley Madison?
8
madrinator 35 minutes ago 0 replies      
9
lifeisstillgood 1 hour ago 2 replies      
This seems awfully meta - as the AM hack revealed 30m people to have lost any real privacy in the digital age, the person seemingly / likely / possibly responsible is hunted down and much of his life laid out like a private investigator's report through his digital trail.

It's curious - we are all being affected by the new digital pollution.

10
werber 1 hour ago 0 replies      
11
futureYCalum 1 hour ago 0 replies      
12
Benichmt1 1 hour ago 0 replies      
13
satanrepents 1 hour ago 0 replies      
A Woman Who Spent Six Years Fighting a Traffic Stop themarshallproject.org
95 points by danso  2 hours ago   28 comments top 8
1
rayiner 1 hour ago 3 replies      
> Larvadain, who is now 74, worries that young lawyers are less concerned with right and wrong than with chasing dollars. Although Patricia Parker's lawsuit against Woodworth offered small hope of a big payday, Larvadain agreed to represent her. "Because I saw wrong had taken place. And I knew someone had to take a stand."

I know a lot of young lawyers who'd love to take on a case like this one. But matching people who need help to lawyers who have the time and institutional resources to take on these projects pro bono isn't easy when you're talking about folks wronged in rural Louisiana.

Also, I think there's something of a hesitancy for business law firms to add suing municipalities to their pro bono docket. Which is a shame--a few New Orleans firms building up a pipeline of pro bono suits against places like Woodworth could do a lot of good.

EDIT: If you're interested in (potential) justice porn, it's worth following Fant v. City of Ferguson. The DOJ gave a lot of ammunition to plaintiffs' lawyers when it concluded that the city was systematically using fines and other police actions as revenue sources, particularly against poor black people. It's Woodworth writ large. The complaint is here: http://www.nytimes.com/interactive/2015/02/08/us/ferguson-co.... The docket is available here on Justia: https://dockets.justia.com/docket/missouri/moedce/4:2015cv00.... With her opinion last month on reconsideration, the judge has effectively denied the city's motion to dismiss. Dispositive motions are due May of next year, so we'll likely see by then whether this is granted class status. If so, things will get interesting.

2
ars 1 hour ago 3 replies      
I've always felt that all fines - for any cause - should be turned over to the state. Even simple parking tickets.

The state should fund the court for hearing those cases (but not enforcement).

Then we'll see if cities really care about safety or about money.

3
gknoy 50 minutes ago 0 replies      
In case you miss the link inside the article, be sure to read the ruling[0]. It is both thorough and entertaining.

http://www.la3circuit.org/Opinions/2015/03/030415/14-0943opi...

4
qiqing 39 minutes ago 1 reply      
Wow. "A woman told the paper that she was pulled over by Woodworth police while in labor and was kept for more than a half hour."
5
542458 1 hour ago 0 replies      
Wow - that's a much more interesting article than the title would imply. I wonder if anything changed in Woodworth in response to the verdict (I doubt that it did, but still).
6
cafard 1 hour ago 0 replies      
Good for Larvadain and the appellate court.
7
drzaiusapelord 1 hour ago 2 replies      
I used to live near a town like this. Thornton, Illinois is the site of a giant limestone quarry, and there's one major road you can take through the quarry if you don't want to get on the expressway (which is a bit out of the way depending on where you're coming from). These streets have 25 or 30 mph signs, yet are wide enough to easily be 35-45 mph streets. Drivers unfamiliar with the area would be taken off guard by these extra-slow speed zones. The police there pull people over for 2 or 3 mph over regularly.

According to IDOT, there were 2,334 traffic stops in Thornton in 2014. This town's population? About 2,300. That's one stop per man, woman, and child in this town! This is naked profiteering and pretty much all small towns in strategic areas turn into police-led profit centers. The real question is why isn't anyone stopping this?

8
Someone1234 1 hour ago 4 replies      
Secretive fusion company claims reactor breakthrough sciencemag.org
88 points by sparrowlisted  3 hours ago   36 comments top 10
1
carapace 1 hour ago 1 reply      
I keep wondering why this talk doesn't get more traction?

https://www.youtube.com/watch?v=rk6z1vP4Eo8

Published on Aug 22, 2012. Google Tech Talks, November 9, 2006.

ABSTRACT: This is not your father's fusion reactor! Forget everything you know about conventional thinking on nuclear fusion: high-temperature plasmas, steam turbines, neutron radiation and even nuclear waste are a thing of the past. Goodbye thermonuclear fusion; hello inertial electrostatic confinement fusion (IEC), an old idea that's been made new. While the international community debates the fate of the politically-turmoiled $12 billion ITER (an experimental thermonuclear reactor), simple IEC reactors are being built as high-school science fair projects.

Dr. Robert Bussard, former Asst. Director of the Atomic Energy Commission and founder of Energy Matter Conversion Corporation (EMC2), has spent 17 years perfecting IEC, a fusion process that converts hydrogen and boron directly into electricity producing helium as the only waste product. Most of this work was funded by the Department of Defense, the details of which have been under seal... until now.

Dr. Bussard will discuss his recent results and details of this potentially world-altering technology, whose conception dates back as far as 1924, and even includes a reactor design by Philo T. Farnsworth (inventor of the scanning television).

Can a 100 MW fusion reactor be built for less than Google's annual electricity bill? Come see what's possible when you think outside the thermonuclear box and ignore the herd.

Google engEDU. Speaker: Dr. Robert Bussard

2
nine_k 2 hours ago 1 reply      
In short: Tri Alpha demonstrated plasma confined for 5ms. So their concept works.

Their next step is burning D-T fuel (needs 10x temperature increase). Their goal is burning H-B fuel which requires much higher temperatures, but has numerous advantages.

Update: "Tri Alpha is backed by Sam Altman, among other things." -> not at all.

3
rubidium 2 hours ago 1 reply      
They showed a "stable" reaction of 5 ms, which is quite good.

The video was quite nicely done. I recommend watching.

Caveat: As with all fusion companies, they're only 10-20 years from being ready to market : )

4
phasetransition 33 minutes ago 0 replies      
In the summer of 1999, on the way to E&M class, I rode the same bus to the physics building as Dr. Hendrik Monkhorst. He is one of the founders of Tri Alpha Energy, and I remember chatting with him about aneutronic fusion. It was very eye-opening for a 19-year-old undergraduate. I remember him espousing commercialization within ten years, which now seems like prototypical professorial optimism. It is an exciting milestone to see them have successful confinement for a solid length of time.
5
DrNuke 8 minutes ago 0 replies      
If only one of these would pass the lab stage so the engineering could start... ITER is the only fusion machine 10-20 years away from demonstration, in 2015, and it's too big to fail.
6
guimarin 1 hour ago 1 reply      
Fusion is one of those areas that would benefit greatly from Public R&D. It's really sad that we've spent almost nothing on fusion research for the purposes of producing affordable electricity in the last 20 years.[1][2][3]

1. The NIF is and always has been a sideshow for the Nuclear Weapons development research that goes on there.

2. The US's last serious investment was the TFTR back in the '90s. http://www.huffingtonpost.com/2015/01/20/fusion-energy-react...

3. IMO ITER is a joke, too large scale.

7
arcanus 2 hours ago 3 replies      
Fusion: only twenty years away. Always.
8
danmaz74 40 minutes ago 1 reply      
> hydrogen-boron fusion, which will require ion temperatures above 3 billion degrees Celsius

3 billion degrees... that blows my mind.

9
ogrisel 1 hour ago 0 replies      
The cure for cancer and the solution to climate change announced on HN on the same day! ;)
10
onion2k 2 hours ago 0 replies      
A 23m long machine is far too big to mount on a Delorean. They'll need to fix that.
Unity Comes to Linux: Experimental Build Now Available unity3d.com
265 points by fractalb  7 hours ago   93 comments top 15
1
yarper 6 hours ago 8 replies      
It's an unfortunate title for this post, since Unity is something altogether different for a lot of Linux users.

see: https://unity.ubuntu.com/about/

Though there is a certain appreciable irony in having two totally different things named Unity.

2
QuantumRoar 3 hours ago 3 replies      
Hooray for Linux!

But there's something that has been bothering me for a while about games for Linux compiled with the Unity SDK.

Does anybody know what kind of dependencies a game written with the Unity SDK has? I'm pretty clueless about that stuff but I'd still like to know what it would take to get a game running on a minimal Linux install (like a naked Arch or Gentoo). Ubuntu obviously comes well equipped for the task but I don't really care about that.

So where do the graphics come from? Does it need some special libraries apart from OpenGL? Where do fonts come from? How does it interface with hardware, i.e. does it need X, or does it come with its own drivers for keyboard, mouse, gamepad?

I fear that it is necessary to install half of Ubuntu to get the games running but - as I said - I don't really know anything about that.

3
pluma 6 hours ago 2 replies      
I thought it already had Linux support? For example, Wasteland 2 was built with Unity and works on Linux.

Or are there multiple game engines called Unity (on top of the existing confusion between the game engine and the Ubuntu thing)?

EDIT: I think the announcement is about Linux support in the SDK, not the actual software developed with Unity?

4
rocky1138 4 hours ago 1 reply      
I've been considering switching from Windows 8.1 on my main machine for a while but not being able to work on my game in Unity3d has held me back. This might finally put me over the edge.
5
bobajeff 7 hours ago 0 replies      
"Todays build is what we call an experimental build; future support is not yet guaranteed. Your adoption and feedback will help us determine if this is something we can sustain alongside our Mac and Windows builds."

I think we all know the future of this then.

6
rrhyne 3 hours ago 1 reply      
Personally, I'd rather they spend time making webgl production ready to ensure Chrome player support.
7
jarcane 6 hours ago 10 replies      
grumbles something about developers saying "Linux" when they mean "only supported on Ubuntu"
8
jestinjoy1 6 hours ago 1 reply      
This came just in time, as I was thinking of switching to Mac. The Blender game engine has features, but it's not up to the mark to compete with game engines like Unity. The Python scripting in Blender is good.
9
andresmanz 6 hours ago 2 replies      
Nice, so Unity is finally an option for me. Now it would be nice if the Unreal editor had official Linux support as well. (I know there's Wine, but it didn't work too well for me.)
10
amyjess 3 hours ago 1 reply      
Misleading title: should be "Unity Editor Comes to Linux". Unity games have been running on Linux for a while.
11
billconan 2 hours ago 1 reply      
For me, Unity is too expensive. I hope Unreal can do the same. There are unofficial build instructions for Unreal.
12
gpvos 5 hours ago 1 reply      
13
x5n1 6 hours ago 0 replies      
yay!
14
werber 3 hours ago 0 replies      
15
gcb0 2 hours ago 1 reply      
A new role for Qasar ycombinator.com
42 points by runesoerensen  2 hours ago   5 comments top 5
1
skhatri11 2 hours ago 0 replies      
Qasar has been instrumental in Instavest's development. He gives it to you straight up and has garnered the trust and respect of all YC founders. You can count on him for sage counsel and "gentle reminders" that keep you focused on growth and product. This is a great step for Qasar and an awesome win for YC. Congrats!
2
sytse 1 hour ago 0 replies      
We were lucky to have Qasar as a mentor in W15, he was amazing. Very cool to see him help YC execute even better, well deserved.
3
zallarak 2 hours ago 0 replies      
Wow, excited to see what is meant by this:

 Qasar will help scale our organization and operations as we tackle bigger and more ambitious projects
My interactions with him at YC left me impressed.

4
khamoud 1 hour ago 0 replies      
I've never gone through YC but I have had a few interactions with Qasar. He is a very intelligent man and in my few interactions he has given me great insight into my own personal projects.
5
brock_r 1 hour ago 0 replies      
Facebook's Human-Powered Assistant May Just Supercharge AI wired.com
20 points by blandinw  1 hour ago   4 comments top 2
1
WalterSear 1 hour ago 2 replies      
It's certainly supercharged bloviating article headlines.
2
roymurdock 43 minutes ago 0 replies      
Potentially Reprogramming Cancer Cells Back to Normal Cells neurosciencenews.com
181 points by jaoued  8 hours ago   31 comments top 9
1
escape_goat 5 hours ago 1 reply      
This popped up in /r/worldnews yesterday with another source, where /u/sharplydressedman gave it some context [1].

> I hate to be a buzzkill, but the linked article is very sensationalist ... it entirely misses the point of the research published by the Anastasiadis lab (pubmed link [2]). Being able to stop or revert transformed cells in vitro is not new, we've been able to stop or revert tumor cells for decades....

> Tldr of this is that this is unexciting unless you're a researcher studying E-cadherin/B-catenin, and means very little to someone outside of the cell biology field.

/u/squaresarerectangles provided a link to the original Nature Cell Biology paper, reproduced in citation [3].

[1] https://www.reddit.com/r/worldnews/comments/3idyjg/us_scient...

[2]http://www.ncbi.nlm.nih.gov/pubmed/26302406

[3]http://www.nature.com/ncb/journal/vaop/ncurrent/full/ncb3227...

2
joeyspn 6 hours ago 2 replies      
A novel approach and more good news. We're headed in the right direction. Link to the study in Nature Cell Biology...

http://www.nature.com/ncb/journal/vaop/ncurrent/full/ncb3227...

3
cfontes 6 hours ago 1 reply      
Be aware that this only works, so far, for a few types of cancer. It's positive nevertheless.

I feel like cancer is an umbrella term for several different little monsters.

4
mavdi 6 hours ago 2 replies      
This is it. This is what we should be focusing on with all the cancer research funds available. Cancer is just a data corruption in the cell DNA and we have a few methods of fixing data in the digital world.
5
codecamper 5 hours ago 2 replies      
It seems more & more the solutions to the big 3 diseases: cancer, heart disease, and dementia/alz, involve direct DNA / RNA / mitochondrial DNA reprogramming.

It would then seem prudent to ASAP get your own DNA sequenced. As time goes on, you collect more & more DNA damage & probably your original DNA becomes harder & harder to find.

6
cryoshon 5 hours ago 0 replies      
What this group is doing has been pursued for quite a while, and many people have laid claim to finally cracking it, so I'm skeptical. It'll be a good while before this kind of intervention is finished being refined in vitro, nevermind in vivo.
7
sravfeyn 5 hours ago 1 reply      
(Sorry for being naive.) What exactly is the discovery here - is it the chemistry of reprogramming the cells, or a spontaneous way for these cells to evolve back to their original code?
8
mc_hammer 2 hours ago 0 replies      
suspect it would go better with a bit of NLP or training of the brain to stop them from doing whatever they did in the first place that programmed those cells wrong. :)
9
dang 20 minutes ago 0 replies      
Many posts of this, including https://news.ycombinator.com/item?id=10121168 and the threads I linked to from there.

We added the qualifier "potentially" from the article's first paragraph.

How to Onboard Software Engineers fogcreek.com
196 points by GarethX  6 hours ago   65 comments top 15
1
lutorm 1 minute ago 0 replies      
I wish the article would start by explaining what the hell "onboarding" is, since I've never heard that word before...
2
agentultra 4 hours ago 5 replies      
Kate Heddleston is freaking awesome, if you don't know. Her talk at Pycon 2015 about diversity in engineering environments is fantastic. Well explained in language free of activist rhetoric -- it was a pragmatic and thoughtful speech. Her blog expands on much of what she covers and should be read.

On the topic of on-boarding -- super important to get right! The idea of making your people the best they can be is lost, I think, on our generation. My grandfather-in-law was a mechanical engineer at Chrysler back in the day when they had a special school they ran to train their new recruits. When he started he had no idea of how cars worked but they picked him up from school and gave him the training he needed to become an important figure in their company.

Focusing on hiring the best because hiring someone mediocre will damage your business is negative thinking that can kill your on-boarding, and thus your culture and business. I've worked at places with terrible on-boarding and it was a fight just to get some direction and support for your work. Getting the attention of management was something you wanted to avoid which led to people becoming complacent about their work and its quality.

Great article. On-boarding is important to get right.

3
dhutchinson 3 minutes ago 0 replies      
I really enjoyed this video as I am currently looking for programmers and coders. I am leaning towards female engineers, as I am a proponent of diversity and I want to produce gender-neutral software for my market. My goal is to hire the "right" people who can objectively work together while also producing software that takes multiple points of view into its development.
4
sqldba 6 hours ago 3 replies      
I wish everyone took onboarding seriously.

I worked for one company for a month. From the get go they had 100+ staff but no onboarding. Nobody knew how to set up the development environment or even get the application working on the company laptop. Nobody could show me how to VPN to client sites or show me how the software worked. I was meant to support it...

I was in my manager's office every day letting her know hey I really need assistance here, nobody seems to have time, nothing is working, I'm not learning anything and this needs to improve.

She'd tell staff to assist and they just wouldn't. And rinse wash repeat the next day.

A month in we are in a meeting and some difficult software issue comes up and it's assigned to me to fix. I mention that I don't have it running, don't know how to use it, have no method of finding who the customer contacts are to call them and will sound stupid not understanding anything, but also haven't been shown how to connect in. That I'm happy to shadow someone else and learn it.

It felt shameful, but I knew it was the right thing. I had a year of extensive help desk experience under my belt, including programming and other development, bug fixing, accounting, you name it; I was even team lead. I was used to dealing with multi-million-dollar clients and had no qualms about doing so... but this place was dysfunctional.

Anyway the boss stared me in the face in the meeting and called me a liar, to shut up and get to work. I was stunned. I repeated myself and she turned to my coworkers who claimed they had "showed me everything". I hadn't spent more than 5 minutes with them in the entire month and had been in her office every day. She told me to start pulling my weight and stop lying.

I walked very calmly to the office printed out a resignation and left my keys and walked out the door. I didn't get paid (a funny side effect of walking out of a job) and the move left me penniless and ALMOST homeless.

Luckily I got a job just in the nick of time shortly after, at twice the pay, and being treated like a human being. I removed that place off my resume and rarely discuss it because it's so humiliating.

I still remember the recruiter calling me screaming how unprofessional I was - he only cared about his lost bonus. I explained I was extremely professional but that after being humiliated and called a liar in a meeting and having none of the promised training, or anything remotely capable of making me a functional employee, there was no recovery.

Now I take onboarding very seriously.

5
patio11 6 hours ago 4 replies      
Onboarding is a people/process/docs/technology problem. To the limited extent that it is a technology problem, you really can't go wrong with a) treating it like it is an actual shipping product of the company (implying a minimum standard of care like e.g. a repository, documentation about it, and a dedicated place to put issues) and b) actively maintaining something with the goal of getting someone up-and-running quickly.

None of my projects are at the point where you can just "vagrant up and go", but the next-best thing has been READMEs in the relevant repositories with exact lists of "Type this, type this, type this, type this. You now have a fully-working system running on localhost and you should be able to type this to get a full green test suite. If you can type this and it does not come out green, fixing that is more important than anything Patrick is doing right now."

Here's, for example, what we have for getting someone up and running on Appointment Reminder (in preparation for me soon no longer being the engineer who keeps all of that system in my head): https://gist.github.com/patio11/a0b1063c5d33b5748da6 Feel free to steal ideas in terms of level of detail or useful things to include. (A lot of the magic is in the rake commands like "setup_site", which take care of "All that crufty configuration stuff which you would otherwise need a senior engineer to do for you prior to actually seeing the page render correctly on localhost:3000.")

Quick hack which helped us on making sure this guide was actually accurate: we had two engineers coming into the project at the same time. I wrote down everything I thought they needed to know. Engineer #1 implemented to the document I had written, filled in the blank spots where he needed to ask questions, and then we committed the readme. Engineer #2 then had to do it off the readme without asking me any questions. Given that he was actually able to do this, we have high confidence that there is not at the moment anything rattling around in my head which is absolutely required to get up-and-running and documented nowhere else.
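For illustration, here is a minimal sketch of the "type this, type this" README style described above. The commands are hypothetical placeholders, not the contents of the linked gist; only the setup_site rake task is named in the comment, and a Rails-style project is assumed because of the rake commands and localhost:3000:

  # Hypothetical README excerpt -- placeholder commands for illustration only
  git clone git@example.com:yourco/app.git && cd app   # placeholder repo URL
  bundle install                                       # install dependencies
  bundle exec rake setup_site   # the "crufty configuration stuff" handled by one command
  bundle exec rails server      # the app should now render on localhost:3000
  bundle exec rake test         # should come out green; if not, fixing that comes first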

6
organsnyder 3 hours ago 0 replies      
I'm in the midst of being onboarded at a very large healthcare organization. In addition to the complex SOA infrastructure, we deal with a multitude of databases, accessible through many different methods, from a number of vendors. There's a ton to take in, not to mention the business complexity inherent to the organization as a whole.

Heddleston's point about having less-senior people do the mentoring is especially apt. While I've received support from everyone in the team, the most helpful person has actually been an intern with only a few years of programming experience under his belt. While I have over a decade more general programming experience than he has, he knows a lot more about the specific domain, and has been an excellent teacher. I feel that this arrangement has been beneficial for both of us: as I've gained domain knowledge, he's gained confidence and depth in his own knowledge.

I think that a big part of this is not setting expectations too high, no matter how senior the new employee is. While I have a fair amount of experience, the expectation in my new position, both from me and from the team, is that it will take me a while to get up to speed (especially given the complexity inherent to the role), even though I'm not coming in as a junior dev. Therefore, I don't feel the need to pretend that I'm an expert in something that I'm not, and there's no ego hit when I'm being mentored by someone who is technically my junior.

In my city, it seems like most everyone is looking for senior devs, to the point that they leave positions unfilled for months rather than hiring someone they don't consider senior enough. This is madness, and damaging to the industry as a whole. We need to focus more on efficiently developing talent, and Heddleston seems to have some great ideas for doing so.

7
debacle 6 hours ago 2 replies      
An inability to bring on new talent, junior or otherwise, is a symptom of lacking essential documentation. If I can't come into your office and have a copy of your software running or successfully compiling on my machine in less than an hour, something is very wrong with your documentation, tooling, or (lack of) build process.

"Here's Jim. He's our Widgitsoft guy. He's the only one who works on Widgitsoft. The system is entirely undocumented because Jim knows everything. Yeah, development has been slower than expected lately. Yeah, it would be nice if we could bring on a contractor when necessary, but getting to that point would be a lot of work, and we'll always have Jim."

8
egusa 6 hours ago 1 reply      
Great point about onboarding providing a high ROI; it's something I haven't thought enough about. This is a particularly useful part: "The best person to help someone set up their development environment is the last person who joined, regardless of seniority."
9
jlees 3 hours ago 0 replies      
I think outside of the day-to-day (you need to know Python, here's how to get Vagrant running, this is how you submit a code review) there can also be a gap in cultural onboarding. For example, someone moving from a large tech company to a startup has certain norms and expectations that they may slowly re-evaluate (if they actually do) over a number of weeks or months, potentially harming the product and even their own career -- for example, "that's not in my job description" doesn't fly at a startup but can be a protective reflex at a large company. (One may argue someone with this attitude may not be hired at a startup in the first place, but sometimes the attitudes and expectations aren't so upfront and clear-cut, and are hard to test for at interview!)

Would love to find some examples of great cultural onboarding where it's not just the "what" of the work that a new hire learns, but also the why and how, to avoid implicit assumptions and biases from day one...

10
DasIch 4 hours ago 0 replies      
I think lack of onboarding is probably one of the biggest problems with open source projects as well. You can't expect to get new contributors, if you have no documentation on how to setup a development environment, which tools you use and how the project is structured.
11
luu 3 hours ago 0 replies      
This is timely, as I've just gone through the worst onboarding I've ever experienced. It took two very long days to install an OS on a machine and get it to boot (I had to assemble my computer and then use the corporate installer, which failed multiple times with errors like "an unknown error occurred"). It then took me about a week to run the local project's hello world (I was initially pointed to documentation that had been deprecated for months, and then when I found the right documentation it was incomplete and out of date). The actual process to get there was hilariously baroque; for example (if my email records are correct and I didn't get duplicate emails from the system), I must have clicked through 18 EULAs to get a subset of the permissions I need. I promise, that's worse than it sounds, since the form to do so took more time to click through than you can imagine. It was a month before I could effectively do any work at all.

I originally thought that I was just unlucky, but when I asked around I found that I had an above-average experience, at least among people near me who started recently. Unlike many other folks, I was actually able to get a username/alias, a computer, and an office. I don't know what folks without a username do, since they can't even start clicking through the necessary EULAs to get permissions to do stuff, let alone do any actual work.

I'm not sure how it is in other parts of the company, but if you imagine that it's similar and do some back-of-the-envelope math on how many people get onboarded and how much losing a month (or more) of time costs, it comes out to N million dollars for a non-trivial N. And that's just the direct cost of the really trivial stuff. It's hard to calculate the cost of the higher-level stuff that's mentioned in the article, but I suspect that's even more expensive.

12
snlacks 5 hours ago 0 replies      
I don't agree with all the advice, but I do think it's a good message. My favorite part is actually on pre-onboarding: "If everyone has to come in and manually set up everything, what you have is this super painful onboarding process that's just going to bottleneck your company."
13
cocoflunchy 5 hours ago 1 reply      
On this subject, 10% of Airbnb engineers started last week: https://twitter.com/mikecurtis/status/633759315480805376

Kind of crazy...

14
jxm262 2 hours ago 0 replies      
Great discussion. I actually didn't know fog had a large blog section with podcasts/videos. Will be listening to these now :)
15
mud_dauber 4 hours ago 0 replies      
I really, really like this post. It applies to us product management dweebs too.
Apple Loses German Top Court Case on Swipe-to-Unlock Patent bloomberg.com
203 points by bitzerlander  9 hours ago   108 comments top 18
1
JustSomeNobody 7 hours ago 1 reply      
This always struck me as an "On the computer" patent. The slide to unlock mechanism has been around since forever (think any bathroom stall or old wood screen door, etc). Just because it's on a computer screen shouldn't make it patentable.
2
germanier 6 hours ago 2 replies      
Minor nitpick: the ruling court was not "the German Supreme Court" (for which there is no single direct equivalent - often the term is used for the Bundesverfassungsgericht, but that's problematic on many levels). The court was the Bundesgerichtshof, which is the highest court for civil cases. Best to avoid the term "Supreme Court" altogether when talking about the German court system.
3
TheMagicHorsey 1 hour ago 0 replies      
Even the so-called "good" software patents have a lot of the same elements as this bad patent. The problem is, most people aren't going to bother to read the claims of those patents and try to understand what the concepts claimed really are.

I have been involved with several patent suits (on both the litigant side and the defendant side) and as an engineer, I have to admit there has never been a time when I read the statement of the problem the patent says it's going to solve and didn't think of the solution myself, well before the patent presents the same solution. In other words, every single litigated software patent I've been asked to review has been BLATANTLY obvious. And I'm no genius. I've talked to other engineers and they've all said the same thing. I just explain a problem domain, and they usually give a solution that comes under the claims of the litigated patent.

This is not to say that there aren't non-obvious software patents. It's just that those never seem to get litigated, because they aren't some obvious concept sitting at the nexus of a well-trodden path the industry is following.

I can't describe or link the specific patents I've been involved with, for obvious reasons, but the stuff I'm talking about sounds like the following:

"Receiving at a server a data packet, the data packet comprising a user identification number and a merchant identification number

retrieving a record in a database referenced by the user identification number

determining if the record in the database contains an authorization entry corresponding to the merchant identification number

responsive to the record in the database containing an authorization entry corresponding to the merchant identification number, transmitting a second data packet, containing an authorization token, to a server operated by a merchant."

I am not lying to you. This is how stupid each of these patents has been. Sometimes even worse.

Nobody not involved in these litigations understands how bad it is. And this is coming from someone who has made at least enough money to buy several luxury cars, providing consulting services to this particular legal industry. In other words, I have a financial interest in things remaining this fucked up. And I'm still telling you, its really fucked up.

4
minthd 7 hours ago 2 replies      
I think this should apply generally to touch screen gestures. Once someone invented a good enough touch screen display (capacitive), the gestures are not that big of a step.

All Apple did was acquire the inventors of capacitive touch - and work a bit on the UI. And while it's valuable to be the first company to recognize the importance of a capacitive touch screen - that isn't a basis for a patent - and Apple did get enough benefits anyway.

5
JohnTHaller 2 hours ago 0 replies      
This is long overdue. As is invalidation of the bounceback patent. Like so many of these "designy" patents, there's quite a bit of prior art.

The whole "but on a computer" patent needs to go away. "Sliding a latch from one position to another to open but on a computer" should not be patentable.

6
brlewis 3 hours ago 0 replies      
Judges on Tuesday said that the iPhone maker's method didn't reach a level of sophistication needed to award patent protection

Just so I understand what happened, can someone summarize German patent law? Is it the same 3 tests as in the U.S., i.e. statutory, novel, non-obvious?

7
thomasrossi 5 hours ago 1 reply      
Well, in the EU some algorithms are surely patentable if they have a "technical effect", for instance if you can move a robot arm consuming less energy or producing less waste material; it must have a physical impact on something. Quote: "the method didn't reach a level of sophistication needed to award patent protection" - just this; lol at patenting it in the first place.
8
tempodox 7 hours ago 0 replies      
The contested patent thus isn't based on an invention.

It seems there are more patents that fill this description.

9
amelius 7 hours ago 4 replies      
I'm waiting for a future where we can ask a "blank" AI to come up with trivial solutions to new problems, so that we can just invalidate such stupid patents. If the AI can invent it, it is not worthy of a patent.
10
DasIch 6 hours ago 1 reply      
Interesting that the article doesn't mention that there is prior art, which was discussed in court, in the form of the Neonode N1m.
11
littletimmy 5 hours ago 1 reply      
I don't get how this was a patent to begin with. The door in my room has a "slide-to-unlock" lock that dates back 50 years. Surely Apple did not invent this.
12
Tloewald 4 hours ago 1 reply      
13
shmerl 3 hours ago 0 replies      
Good. Such stuff should never have been patentable to begin with.
14
jchrisa 5 hours ago 0 replies      
15
mildrenben 9 hours ago 2 replies      
Glad to hear it, Apple's been going stupid crazy with this patenting lately.
16
chubs 6 hours ago 1 reply      
17
jheriko 6 hours ago 0 replies      
18
astazangasta 5 hours ago 3 replies      
KittyCam Cat Facial Recognition Powered by Raspberry Pi github.com
17 points by girlie_mac  1 hour ago   3 comments top 3
1
mangeletti 33 minutes ago 0 replies      
For some reason, out of all the different things posted here, this makes me want to get into robotics. I've been thinking about ordering a Raspberry Pi, or a MicroPython board, etc., but something holds me back. Maybe the fear of getting into something that I won't have time for.
2
noir_lord 1 hour ago 0 replies      
"or put his butt on the camera, it fails to tell me my cat was eating."

My favorite issue this year.

3
Gladdyu 39 minutes ago 0 replies      
Now you just need to automatically post the pictures to imgur/reddit and let the karma flow in. :|
Quantitative Economic Modeling in Python and Julia quant-econ.net
45 points by k0nartist  4 hours ago   4 comments top 2
1
pathdependent 2 hours ago 1 reply      
Ignoring the (fantastic) content, I think this is how Julia slowly wins users. I keep finding myself doing parallel implementations like this.
2
graycat 1 hour ago 1 reply      
Sorry, guys, there's an unfortunate pattern:

In parts of applied math for business, there are a lot of talks and papers of the form "Problem X: A Y Approach".

That seems to suggest that there is something really promising about Y. Instead, more appropriate would be "Solving Problem X". If Y was involved, then fine; if not, still fine; that Y was involved really doesn't mean much.

Then also in computing there are talks and papers of the form "Problem X via Programming Tools A and B". So, for the part "A and B", we can substitute Python and Julia, Fortran, C and C++, C# and C, C# and C++, C# and Visual Basic, Common Lisp, anything Turing equivalent, etc.

To me, Quantitative Economic Modeling is a big enough subject and quite challenging. That some of the computing was done in Python and Julia instead of C, C++, C#, Fortran, Algol, Folderol, etc. strikes me as nearly irrelevant. That is, I see nothing about Python and Julia that promises especially good results on the main challenges of the very challenging problem of Quantitative Economic Modeling.

Where am I going wrong?

Network handover in Google Fi nicholasarmstrong.com
33 points by ndrarmst  5 hours ago   16 comments top 6
1
dgulino 34 minutes ago 1 reply      
https://republicwireless.com provides seamless wifi->cell and cell->wifi handover of calls.
2
jewel 1 hour ago 3 replies      
When Fi was announced they mentioned that public wifi traffic would go through a VPN to google's datacenters. At the time I assumed that they'd just run ALL traffic through the VPN, since that'd make for some very seamless switching. As bad as that would be from a privacy perspective, I trust Google more than T-Mobile or Sprint.

By running everything through the VPN, you'd be able to have TCP connections that didn't break when the network switched, since your device's public IP address would be in a datacenter somewhere.

Also with a VPN you'd be able to send voice traffic over both a carrier connection and the wifi connection at the same time to avoid dropouts.

There is something similar called Multi-path TCP (MPTCP) which uses latency to decide which TCP path to send traffic over.

3
thrownaway2424 34 minutes ago 0 replies      
I remember having UMA on a T-Mobile BlackBerry (8800, I think) and the call handoff from WiFi to mobile really was seamless, no gaps at all. This was in 2007, by the way.
4
edude03 1 hour ago 1 reply      
I'm surprised that (it seems) google isn't using Multipath TCP to carry the VPN traffic to google. This would allow it to switch seamlessly between LTE and Wifi and in theory even LTE and LTE while maintaining the VPN connection and thus the call.

In fact, Apple uses this tech for Siri to reduce latency on voice queries.

5
adovenmuehle 1 hour ago 1 reply      
I've been keeping track of Google Fi since it was released.

I'm hoping the new Nexus 5 (both LG and Huawei versions) is compatible with Google Fi, although I did call the Fi support number (and talked to a real human) and she said they haven't heard anything about support for the Nexus 5.

Here's hoping.

6
JoshTriplett 1 hour ago 0 replies      
Interesting; "Data does not handover between networks" seems to contradict other reports I've seen that the data connection seamlessly hands off.
A heads up display for git github.com
191 points by michaelallen  9 hours ago   42 comments top 17
1
an_ko 7 hours ago 3 replies      
I use a modified version of mislav's git prompt https://gist.github.com/mislav/1712320 which is pretty minimal but usually enough for me.

For when I have to wrangle lots of files at once (like during interactive rebase to clean up history before push) I have a git watch alias that shows a high-level overview of changes that refreshes with inotify:

 [alias]
   watch = "!clear; inotifywait --quiet -mr -e modify,move,create,delete --format \"%f %e\" @/.git . | \
     while read file; do \
       clear; \
       git status --short; \
       git --no-pager diff --shortstat; \
     done;"
I leave that running in a visible terminal window. It's more verbose than a prompt and reduces the need for constant git status sanity-checking. Maybe useful for someone.

3
couchand 5 hours ago 1 reply      
This is neat and I think there's a lot of potential here. As with many information displays, though, it's critical to consider what the most important information to convey is and how to effectively do it.

My main question is around the use of color. I'd argue the error states - conflicts, diverging branches, etc - should be the ones in red, since those are the issues you want to call the most attention to.

Getting rid of any chartjunk is the other big thing. Using four characters of every prompt just for `git:` is not reasonable. And as much as I like the idea of being warned about untracked files, I fear that in most real situations you end up with random scratch files in the same directory. My prompt would always say `7A` at the end, wasting more space (and mental effort!).

Good work!

4
Walkman 2 hours ago 0 replies      
If you like this, I recommend oh-my-zsh [1] or prezto[2], both have themes for things like this.

[1]: https://github.com/robbyrussell/oh-my-zsh

[2]: https://github.com/sorin-ionescu/prezto

5
avar 7 hours ago 0 replies      
Consider sending these changes upstream to contrib/completion/git-prompt.sh in git.git. It already has a lot of toggles for adjusting the prompt. These things you've added could be added as options.
6
JoshTriplett 1 hour ago 0 replies      
I like the appearance of prompts like this, and I've tried them a few times, but I always find myself turning them back off the first time I cd into a large git repository and have to wait a full second or two for the prompt to return. git is fast, but the prompt needs to show up instantly, and git isn't instantaneous on repositories the size of Linux or Chrome.
7
opsunit 5 hours ago 0 replies      
liquidprompt https://github.com/nojhan/liquidprompt is another viable alternative.
8
mallamanis 6 hours ago 1 reply      
I use https://github.com/magicmonty/bash-git-prompt which I also like. It seems to present less information than this one though
9
oalders 5 hours ago 0 replies      
I use oh-my-git for a similar purpose. https://github.com/arialdomartini/oh-my-git Took some wrangling to get the fonts to work, but I find it to be quite helpful. The README is especially nice.
10
lorenzfx 5 hours ago 0 replies      
I use zsh-vcs-prompt [0], which also supports hg and svn.

[0] https://github.com/yonchu/zsh-vcs-prompt

11
laumars 5 hours ago 1 reply      
You could negate the need for a --bash or --zsh flag by checking $SHELL:

 echo $SHELL | egrep -o '[a-z]+$'
It might also make sense to bundle everything into one file. While I'd normally advocate separating code into smaller and more manageable files, a single-file shell script would be more convenient to install and would require fewer disk reads per prompt call.

Looks good though. I'm definitely going to use this on my dev boxes.
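As a small sketch of that suggestion, using the egrep approach above; "git_prompt" is a placeholder name for the prompt script (not the project's real binary), and --bash/--zsh are the flags mentioned in this thread:

  # Sketch only: derive the output flavour from $SHELL instead of requiring a flag
  shell_name=$(echo "$SHELL" | egrep -o '[a-z]+$')   # e.g. "bash" or "zsh"
  case "$shell_name" in
    zsh) git_prompt --zsh ;;    # hypothetical invocation
    *)   git_prompt --bash ;;   # default to bash-style escapes
  esac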

12
noalt 7 hours ago 1 reply      
https://github.com/dahlbyk/posh-git has had something similar for awhile - very useful!
13
zerolinesofcode 7 hours ago 1 reply      
You could have called it GitHud ;-)
14
leni536 7 hours ago 1 reply      
I use something similar for my bash prompt:

https://github.com/xtrementl/dev-bash-git-ps1

I wonder if this one is any faster. Waiting for a bash prompt in large repos can be frustrating.

15
toddchambery 2 hours ago 1 reply      
Anybody have a translation for the fish shell prompt?
16
WorldWideWayne 3 hours ago 0 replies      
This is okay, but I wish OS X had something as nice as TortoiseGit.
17
r3bl 7 hours ago 2 replies      
Looks very promising! Thanks, we really missed something like this in Git.
Legendary Deep-Sea Fish Sighting Continues to Be Debated After 60 Years nautil.us
10 points by dnetesn  3 hours ago   discuss
Tsunami Warnings, Written in Stone (2011) nytimes.com
48 points by superfx  5 hours ago   3 comments top 2
1
oddsquare66 1 hour ago 0 replies      
This was shown as an example of Long Term Thinking during a presentation on the design of the Clock of the Long Now.
2
theandrewbailey 2 hours ago 1 reply      
(2011)
BlackRock acquires FutureAdvisor futureadvisor.com
37 points by chauzer  5 hours ago   3 comments top 2
1
blueyes 16 minutes ago 1 reply      
Former FA employee here: they're a great company. Grew about 45x while I was with them. They use BlackRock funds, so BR saw the growth. A pretty good ending for 5 years' work. The Financial Times is reporting they sold for $150M-$200M.
2
nickpsecurity 1 hour ago 0 replies      
BlackRock is interesting in that they're very forward-looking and conservative at the same time. One of the first to use an AI-like system to support management of huge assets. Also one of the elite firms of the country in terms of just how much capital they manage.

http://www.economist.com/news/briefing/21591164-getting-15-t...

Will be interesting to see what happens to FutureAdvisor.

A Word Is Worth a Thousand Vectors stitchfix.com
7 points by legel  4 hours ago   discuss
A Computational Introduction to Number Theory and Algebra shoup.net
143 points by luu  11 hours ago   15 comments top 4
1
mrcactu5 8 hours ago 1 reply      
Algebraic Number Theory: A Computational Approach
http://wstein.org/books/ant/ant.pdf

A Course in Computational Algebraic Number Theory
http://bit.ly/1heah8l

2
Ebbit 9 hours ago 3 replies      
What background of mathematics does this book assume?
3
kasperset 4 hours ago 0 replies      
4
Hackernaut 8 hours ago 1 reply      
How relevant is this book to artificial intelligence?
Isomorphism in Information Carrying Systems (2004) [pdf] rutgers.edu
13 points by brudgers  5 hours ago   2 comments top
1
nine_k 1 hour ago 1 reply      
From the abstract: "I show that in order for the information that they [generalizations] carry to be available to cognition, perceptual representations must be isomorphic with respect to the constituent structure of the properties that they represent. Isomorphism therefore plays an important role in the information theorist's account of perceptual representation, even though it plays no role in determining content."
The Problem with Problems 000fff.org
44 points by veb  5 hours ago   17 comments top 9
1
jasode 3 hours ago 1 reply      
Another trigger for tech companies is "curiosity" instead of deliberate "problem-solving".

The software or hardware engineer "tinkers" with something simple. And then, a lightbulb goes off and the "toy" looks like it is well-suited to solving a particular problem.

Larry Page wasn't looking to solve a "problem of inefficient advertising expenses." He was satisfying an intellectual curiosity about applying the citations (e.g. Erdos #) in research papers to web pages. (One could argue that you could reformulate "The Problem" to be "retrieve more relevant weblinks than AltaVista" but for my example, I refer instead to the "ad dollars problem" because that's the one that pays Google's bills.)

Maybe it depends on the person. One type of person sees a "problem", then he/she deconstructs it into components and tries to make a viable business. That's definitely where a lot of B2B businesses get started.

Another type of person simply tinkers and experiments and "solves problems" as a side effect.

2
zaphar 4 hours ago 1 reply      
While not directly the point of his article, I found the example of time tracking to line up perfectly with my experience.

The hidden problem exists because there is a disconnect between the people who fill out those timesheets and the people who consume them.

The people who fill them out can't use the data. As a result, the data is almost never correct and is largely useless, and the people who consume it are picking out the wrong patterns from bad data.

The only reliable unit of measurement for time tracking is days spent. Anything else is measuring a largely made up number.

Furthermore, they feed the idea that cramming at the end of a project is a good way to shorten the time. Since you are tracking hours, not days, it's a short step to just upping the hours without upping the days as a shortcut. But none of these apps give any data on the quality of those hours spent.

3
mgrennan 3 hours ago 1 reply      
I have proof this is true. I have 35+ years' experience in micro data processing. I have mentored several young people and seen them rocket to the top. I understand it is our nature to push away from our parents and strike out on our own. I also understand why older people resist change. Magic happens when the young seek to understand the wisdom of their elders and the elders hold on to the explorer spirit of their youth.
4
michaelbuckbee 3 hours ago 1 reply      
I'd cast it more as looking for inefficient processes than as outright "problems": most people who are not developers, just doing their day-to-day jobs, don't even perceive these things as an issue so much as just how things are.

I've developed a few small software things that have saved people hours a week.

For example, I built a free tool [1] that lets you export a tagged subset of bookmarks from Pinboard into a nice format for inclusion in a webpage or Mailchimp newsletter. The person who I built this for wasn't really complaining about "gee, it takes me a couple hours to collect all these links, format them, etc." but when I saw their process, sheesh.

1 - http://www.bigbadassresourcelist.com

5
beat 1 hour ago 0 replies      
This touches on so many things I've been saying...

I describe really interesting problems as "fish don't know they live in water". The people who have the problem and deal with it every day don't even recognize that it's solvable or that pain reduction is possible.

6
tempodox 3 hours ago 0 replies      
I realised long ago that the problematic quality of a problem stems solely from the perception that it represents a problem at all. If you consequently refuse to cast a situation as a problem, it ceases to be one.
7
nine_k 3 hours ago 1 reply      
The key takeaway:

It is my guess that there is a potential goldmine of problems we simply don't know of because the people who are exposed to them aren't connected with the people who have the opportunity and willingness to solve them. Perhaps the real power of diversity in business isn't hidden in gender but in age

8
jackgavigan 1 hour ago 1 reply      
> Are the older and younger generations wildly underutilizing each other as resource?

I actually think that the older generation is far better at utilizing the younger than vice versa.

9
paulus_magnus2 4 hours ago 1 reply      
How security flaws work: the buffer overflow arstechnica.com
132 points by guardian5x  11 hours ago   35 comments top 6
1
ghuntley 8 hours ago 1 reply      
One of the best explanations ever written is the now classic "Cult of the Dead Cow issue #351 - The Tao of Windows [NT|95|98] Buffer Overflow" @ http://www.cultdeadcow.com/cDc_files/cDc-351/
2
kriro 8 hours ago 3 replies      
Alternatively (just the concept; this should not work as outlined on modern systems): http://phrack.org/issues/49/14.html
3
cm2187 8 hours ago 4 replies      
The worst part is that this error only happens because we are trying to save 4 bytes. If all arrays also stored their own size, we could eradicate this bug with a limited memory impact.
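
To make that idea concrete, here is a minimal C sketch of a buffer that carries its own length and a copy routine that checks the bound before writing. It is illustrative only: the struct and function names (sized_buf, checked_copy) are made up, and this is not how plain C arrays behave today.

  #include <stdio.h>
  #include <string.h>

  /* A hypothetical "array that knows its own size": a few extra
     bytes of bookkeeping travel with the data. */
  typedef struct {
      size_t len;       /* how many bytes of data are currently valid */
      char   data[64];  /* fixed capacity for this sketch */
  } sized_buf;

  /* Refuse to write past the buffer's capacity instead of overflowing it. */
  int checked_copy(sized_buf *dst, const char *src, size_t n) {
      if (n > sizeof dst->data)
          return -1;    /* would overflow: reject rather than corrupt memory */
      memcpy(dst->data, src, n);
      dst->len = n;
      return 0;
  }

  int main(void) {
      sized_buf b;
      const char *input = "hello";
      if (checked_copy(&b, input, strlen(input) + 1) == 0)
          printf("copied %zu bytes\n", b.len);
      return 0;
  }

Bounds-checked languages like Java do essentially this automatically, which is what the Java-vs-C++ comparison in the next comment is getting at.
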
4
jakozaur 6 hours ago 1 reply      
There were so many critical vulnerabilities attributed to that... and in spite of tons of countermeasures, it is still a potential danger. I've known projects that decided to use Java (which has array bounds checks) instead of C++, mostly to limit the security surface.

Maybe if we could start CPU architecture from scratch, arrays with sizes would make more sense? Maybe we could even do it so memory segmentation would no longer be needed: https://en.wikipedia.org/wiki/X86_memory_segmentation

Haven't figured out the details, but maybe we can even gain some performance that way.

5
rawdisk 5 hours ago 3 replies      
Are "buffer overflows" possible in systems without without virtual memory?

I know how I would answer this question but I am curious how others would answer it.

EDIT: s/possible/known to occur

6
segmondy 3 hours ago 0 replies      
Go LD_PRELOAD backdoor experiment github.com
18 points by mattbostock  9 hours ago   5 comments top 2
1
readams 1 hour ago 2 replies      
So from what I can tell this is not in any way a security issue or anything new. If you can LD_PRELOAD a library when running a system executable, then you can backdoor it trivially, since LD_PRELOAD is designed to let you substitute existing symbols with your own. The use of Go here matters not at all, and this is not an attack against system executables that use Go.

Of course, you can't, as an ordinary user, start system executables with elevated privileges and set LD_PRELOAD.
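
For readers who haven't met LD_PRELOAD before, here is a minimal C sketch of the general mechanism being described. It is a generic illustration, not code from the linked Go experiment; the file name and build commands are typical assumptions and may differ on your system.

  /* shim.c: override getuid() so that callers believe they are root.
     The dynamic linker resolves symbols from LD_PRELOAD libraries
     before libc, so this definition shadows the real one. It only
     fools the calling program; actual privileges are unchanged. */
  #include <sys/types.h>
  #include <unistd.h>

  uid_t getuid(void) {
      return 0;
  }

  /* Typical build and use:
       gcc -shared -fPIC -o shim.so shim.c
       LD_PRELOAD=./shim.so id    # id now reports uid=0 */

As the parent notes, the loader treats LD_PRELOAD specially for setuid binaries (ignoring untrusted entries), so this only affects processes the user already controls.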

2
forgottenpass 54 minutes ago 0 replies      
An often-unknown dynamic linker trick, not a backdoor.

Although, if anyone reading this has never heard of LD_PRELOAD, take a look at a better resource than this link because it can be a powerful debugging and testing tool.

The Contiki OS Version 3.0 Released contiki-os.blogspot.com
76 points by adunk  9 hours ago   17 comments top 7
1
georgerobinson 5 hours ago 2 replies      
Earlier this year I built a middleware in Contiki OS for the communication of wireless sensor data from sensor to base station [1].

The Contiki operating system is a lot of fun to work in. It's amazing just how much you can do with sensor-like hardware. However, the Cooja simulator was very unreliable. I'm not sure if the two projects are affiliated, but the simulator was the single worst part of programming Contiki for me.

1. This was an academic project for an MSc class I took at UCL. It was also my first significant C project, so the source code might not be idiomatic (https://github.com/georgerobinson/citiesio/tree/master/citie...). The extended report is at http://www0.cs.ucl.ac.uk/students/g.robinson/citiesio.pdf.

2
fra 1 hour ago 1 reply      
I've had mixed results with Contiki in the past. Several of its modules are of extremely poor quality. The Coffee filesystem in particular is an irredeemable mess of bugs.

Great OS to experiment with, but I would not recommend building a business on top of Contiki today.

3
plainOldText 5 hours ago 0 replies      
Another IoT OS I've just discovered: http://www.riot-os.org (https://github.com/RIOT-OS/RIOT)
4
ausjke 2 hours ago 0 replies      
This is a great OS indeed for IoT sensors. FreeRTOS has a much larger presence among MCU OSes, and the two overlap each other. I ended up using FreeRTOS but am always interested in checking out Contiki.
5
synchronise 6 hours ago 1 reply      
Great to see this project is still alive. I know this probably has already been asked, but when can we expect the Commodore 64/128 versions?
6
rmhsilva 6 hours ago 0 replies      
Particularly good to see 802.15.4 link layer encryption (AES128) and seamless IPv4 <> IPv6 translation.
7
bucma 4 hours ago 2 replies      
I'd love to see Contiki run on something like the beagle bone black.
The Advanced Persistent Threat You Have: Google Chrome [pdf] netsq.com
8 points by epsylon  5 hours ago   4 comments top 3
1
xpaulbettsx 24 minutes ago 1 reply      
I can't read through the pages and pages of grandstanding in this PDF. Does it describe some sort of escape of a security boundary, or is it just "I found a weird way to hack myself"?
2
irickt 15 minutes ago 0 replies      
The paper is dated 18-Apr-2012
3
epsylon 5 hours ago 0 replies      
WebExtensions FAQ mozilla.org
171 points by jtgeibel  16 hours ago   117 comments top 21
1
sugarfactory 10 hours ago 2 replies      
The author of Tree Style Tab has commented (in Japanese) on the change to deprecate add-ons making use of XUL. [1]

Roughly, what he said might be summarized as follows (sorry if I misunderstood his intention): Tree Style Tab is useful because the add-on changes tab behavior globally. That way, it can cooperate with other tab-related add-ons whose authors never intended their add-ons to work with TST. Therefore, providing the Sidebar API doesn't help, because you can't expect add-on authors to write code just to make their add-ons work better with TST.

[1]: https://twitter.com/piro_or/status/635078508981555202
https://twitter.com/piro_or/status/635079271032090624
https://twitter.com/piro_or/status/635079995371556864

2
mnarayan01 14 hours ago 3 replies      
I like Firefox. A lot. And I think there are a number of ways that the core (i.e. non-extension based) functionality is superior to that of e.g. Chrome. That said...the reverse is also very much true.

When Firefox loses the extensions which require XUL (whether fundamentally or merely to avoid rewrites) I doubt its upsides will outweigh its downsides for me anymore. They may surprise me, and even if they don't they still might increase usage with other people, but right now I am sad.

3
examancer 13 hours ago 3 replies      
Currently developing a Chrome extension I was dreading porting to Firefox. I am glad this is coming. Microsoft is also promising a mostly-Chrome-compatible API for Edge browser extensions.

This will mean my extensions for all three browsers can mostly share the same code. Since they use pretty standard HTML/CSS/JavaScript much of the code will also be shared with my app's traditional web services. This sounds like a huge win for bringing rich, native-like experiences to all browsers.

Would be great to see all browsers support a common extensions API. I hope WebExtensions is that API.

4
TazeTSchnitzel 14 hours ago 4 replies      
The mention that they plan to remove XUL usage within Firefox (which is a good idea) is interesting. That's an extra justification for the change of extensions approach (the one we knew before was e10s).

Looks like they realised that to keep moving forward, clean up technical debt and fix longstanding issues, they'd unfortunately break almost all extensions in the process. And if that's the case, they might as well switch to a better API while they're at it. It's painful, yes, but Firefox is still a slow, unstable and memory-guzzling behemoth, where all your extensions break on every update. This won't change if they stick with single-process, XUL, XPCOM and the existing extension model.

I am optimistic. I think Firefox can and will survive this. Look how well Apple's Mac OS to OS X transition went. Mozilla are willing to help people port extensions. And their timeline is probably unrealistic, but it can be pushed back. In two or three years, Firefox will be a world-class browser again, and we will look back at the panic we had now and laugh.

5
Guzba 13 hours ago 0 replies      
I work on an extension for Chrome, Opera, Safari, and Firefox that's pretty widely used. Firefox (Jetpack) is so different from the rest that I basically hate working on it.

I get that some people may love XUL and whatever, but having never used it, I can't really comment. All I do know is that Jetpack (which Firefox has been pushing) is a sorry excuse for a way to build extensions after you've worked with Chrome. I really hope WebExtensions makes my life easier soon.

6
hackuser 12 hours ago 0 replies      
Also, NoScript developer Giorgio Maone's take on WebExtensions:

https://hackademix.net/2015/08/22/webextensions-api-noscript...

7
Animats 14 hours ago 0 replies      
XUL/XPCOM have been on the way out for years, because Fennec, the mobile version of Firefox, doesn't have them. Jetpack extensions work fine in both desktop and mobile versions, though, and it seemed that the future was extensions to Jetpack. Jetpack isn't all that great, but after five years, it more or less works.

Being compatible with Google Chrome isn't very important. Extensions aren't used much on Chrome. I have the same extension for Chrome and Firefox, and Firefox usage is 100x greater.

8
hardwaresofton 13 hours ago 1 reply      
Super glad to hear this -- I created a small chrome extension a while ago and found it a joy to code (very easy, with a distinct lack of surprises), and when I looked into creating a similar Firefox extension, the idea of learning XUL was enough to shut down the idea of a port very very quickly.

Maybe I'm just lazy, but I was very surprised to find that Firefox would create such a technology when more "open web friendly" solutions were possible. I'm sure it's just that XUL is a vestige of a time when the web was still young, but I am going to wait until Web Extensions are released before I attempt to write any extensions for Firefox.

Are there any reasons to keep XUL around (other than lack of apps that were written with it)? It doesn't seem like there are any unique features that couldn't be reproduced using something like chrome's model...

9
kristofferR 13 hours ago 4 replies      
I really hope WebExtensions will be capable of supporting the kinds of extensions that are currently exclusive to Firefox, like Tree Style Tabs and DownThemAll, although I doubt it.

If not I'll keep using the last version with proper extension support for a long time, until an alternative comes.

10
paulryanrogers 2 hours ago 0 replies      
As an extension author this is disappointing. For all the hate XUL gets, it was a powerful tool to manipulate the UI. Hopefully the new API will be as empowering as the old without the same risks. And ideally this will happen before the old APIs are completely removed.
11
protomyth 14 hours ago 3 replies      
"It sounds like you've made decisions without community input, why?"

I guess it's good that they acknowledge that they didn't get input from those that write extensions before they made this decision.

12
jasonhansel 3 hours ago 1 reply      
Still no support for unsigned extensions? :(
13
devsquid 8 hours ago 0 replies      
"The Chrome extension API was designed to work well with process separation and we are taking inspiration from it and copying functionality where it makes sense. However, there will be differences, and the goal of WebExtensions is not to copy Chrome or allow Chrome extensions to run unmodified in Firefox, but to simplify cross-browser development by providing commonly-supported methods and interfaces. We won't implement all of Chrome's APIs, and Chrome is unlikely to implement some of the APIs we add. Imagine the APIs as a Venn diagram. In the middle are cross-browser APIs for content scripts, tabs, and windows. In the Firefox side are APIs for toolbars and other UI elements. On the Chrome side are APIs for Google's cloud services."

Reading that made me extremely happy! I'm glad to see cross compatibility is still a goal for browsers.

14
pasbesoin 3 hours ago 0 replies      
For me, this boils down to, "Whose client is it?"

One reason I've liked and used Firefox is that, especially with extensions, it has been more "my client".

Amidst all the noise over this change, I read into it that -- to some degree -- it is becoming less my client.

Which, to me, seems like another step in Chrome's direction, where I've felt that the client is increasingly the advertiser's client. (And the DRM pushers' client, etc.)

Being my client is what, for me, Firefox has had going for it.

It's my PC. On which I wish to use my client, handling and presenting data in the manner in which I want it handled.

15
Qantourisc 11 hours ago 0 replies      
Say, what is going to happen to Thunderbird's plugins?!? Because breaking those would actually be worse than breaking Firefox!
16
yoodenvranx 7 hours ago 0 replies      
I love the multi-row tab feature of TabMix Plus. I hope something like this will be possible with the new API.
17
mrbig4545 10 hours ago 0 replies      
I am looking forward to this. The way I see it, it's all preparation for servo.

And I'm sure the extensions to the webextensions api will provide enough to do what needs to be done

18
vdfs 8 hours ago 0 replies      
I hope they use the same API from Chrome for plugins like Flash.
19
curiousjorge 2 hours ago 0 replies      
Could I port a Chrome extension to Firefox now?
20
untog 15 hours ago 0 replies      
Yes, how dare he say that an employee being abusive towards another employee is unacceptable. How very dare he.
21
DonGateley 14 hours ago 1 reply      
Show HN: EtherPot A decentralized, autonomous, provably fair lottery etherpot.github.io
4 points by aakilfernandes  3 hours ago   1 comment top
1
aakilfernandes 3 hours ago 0 replies      
Hey everyone, EtherPot is a smart contract on the Ethereum Blockchain. That means that no one can steal the funds or cheat to win. The lottery is provably fair.

100% of funds (except for transaction costs that go to miners) get returned to the users who play.

How developers search for code: a case study research.google.com
95 points by pramodbiligiri  12 hours ago   13 comments top 4
1
boyter 9 hours ago 0 replies      
Agree with pretty much everything in this. The big difference I note, however, is that my experience running searchcode.com suggests that for public search engines, closer to 60% of searches appear to be "How to use a function". By contrast, 60% of API calls appear to be looking for AWS keys, passwords and exploits.

This is probably totally different from search over internal codebases, however.

I must confess... I was originally looking from a vanity point of view and have mixed feelings to see that searchcode was mentioned in the references but not linked.

2
debacle 6 hours ago 4 replies      
Kind of funny that this is coming from Google, considering how bad Google has become for code search. I wish there was a way to turn off the "I'm ignoring what you searched for and returning what I think you meant" feature.
3
fizixer 2 hours ago 1 reply      
I imagine typing a code comment in my text editor before I start writing the code (e.g., "calculate the Fibonacci sequence"), and my editor (on a machine connected to the internet) populates it with sample open-source code in my language (inferred from the file extension, shebang, etc.) that I can use as boilerplate and modify. The populated code is one search result, with the option to look at the second most relevant result, the third, and so on. And it shows the source URL where I can go for details.

I think something like this with rosetta-code snippets is very doable (a weekend project, assuming you're good with your editor's programmability).

4
nautical 8 hours ago 2 replies      
25% of the total is documentation related. That's really interesting... A 'smartly designed' documentation website could cater to 25% of total development queries?