hacker news with inline top comments - 11 Aug 2014
1
Key Advances in OpenGL Ecosystem
41 points by Jarlakxen  1 hour ago   8 comments top 3
1
exDM69 1 hour ago 3 replies      
OpenGL is long overdue for a spring cleaning. A bit less than 10 years ago there was an initiative to create a new OpenGL API, code-named "Longs Peak", which would have solved some of the obvious issues the API has.

Right now there's a variety of versions of OpenGL out there and they are incompatible in subtle ways. To write "portable" graphics code, you need both compile-time and runtime version checks for a variety of features. Some restrictions in mobile versions make sense because of hardware requirements; others are just plain ridiculous.

And the versions of OpenGL that vendors ship are very diverse. For years, Nvidia and AMD have been the only ones to provide (at least almost) the latest version of OpenGL (but only on Windows and Linux, not Mac). Other vendors are lagging behind by several years.

I won't even start listing the obvious problems with the OpenGL API. Everyone who is working with it knows that the API is ridiculous.

I'd like to see an API clean-up (i.e. a rewrite from scratch), a common shader compiler frontend, a common shader binary format, and common tooling like benchmarking and profiling tools. Perhaps even a software-emulated "gold standard" implementation.

At the moment, writing practical OpenGL applications is miserable. It's quite alright as long as you're working on a small project for your own enjoyment but once you have to start dealing with a variety of driver bugs from different vendors and whatnot, it takes a lot of time to actually ship an application.

2
SynchrotronZ 1 hour ago 1 reply      
The quotes on that page feature the best job title I have ever seen: "Aras Pranckevičius, graphics plumber at Unity."
3
theandrewbailey 1 hour ago 0 replies      
Direct State Access is finally in the core spec. That might be the last missing piece of what OpenGL 3.0 was supposed to be.
2
German armed forces warn that carbon fiber might be cancer-causing
76 points by m8rl  3 hours ago   40 comments top 11
1
mxfh 2 hours ago 1 reply      
Since the carcinogenic effect of asbestos is mostly caused by its physical properties[1], anything that resembles the shape of asbestos' micro-particles is also highly suspected to be carcinogenic.

The shape, size, and adsorbing nature of the fibers also appear to be critically important. Recently, doubts have arisen concerning the safety of commercially available carbon nanotubes,[2] which may possess the same carcinogenicity as asbestos fibers because of their similar characteristics. Ample care has to be taken to prevent a tragedy similar to the one caused by asbestos exposure.[1]

[1] http://www.med.nagoya-u.ac.jp/medlib/nagoya_j_med_sci/7112/p... [PDF]

[2] http://www.nature.com/nnano/journal/v3/n7/abs/nnano.2008.111...

On making nanotubes less dangerous: Shorter is better http://onlinelibrary.wiley.com/doi/10.1002/anie.201207664/ab...

2
Kallikrates 1 hour ago 1 reply      
In another life I had to take aircraft crash response training, and was in charge of recovery (salvage, battlefield repair, etc). The response to an aircraft with composite materials is totally different. With normal construction we could get to work as soon as the fire was out and any hazardous liquids were taken care of. Crashes involving damage to composite structures required that the damaged areas be sprayed down with a varnish-like chemical by the same troops who handle chemical weapon response (they had special suits). Then there would be air particulate tests before we could move in. Here is a military study: http://www.dtic.mil/dtic/tr/fulltext/u2/a420193.pdf
3
exDM69 2 hours ago 5 replies      
Formula 1 and other open wheel racing drivers have been exposed to carbon fiber dust (from the carbon-carbon brakes) for a few decades now. There is some research going on with former racing drivers.

E.g. former F1 driver Mika Salo underwent surgery where his lungs were examined to assess the effects of repeated exposure to burned carbon fiber dust (this was several years ago). Unfortunately, I do not have any links to sources nor do I know the results of the research.

Another big question mark is the health and environmental effects of graphene. There is a lot of research going on into applications of graphene, but only now are there research projects into possible negative effects on the environment.

4
chiph 1 hour ago 1 reply      
The USAF was concerned at one point that should an F-16 burn on a runway, it would release carbon-fiber strands which would float around the base and short out electrical equipment (communications gear, phone switch, power generation, etc).

Seems like the concern was correct, but misplaced. We had a plane catch fire on a taxiway and burn, and no electrical mayhem resulted. Perhaps we should have worn our gas masks -- but we didn't know at the time.

5
bcohen5055 1 hour ago 1 reply      
While in undergrad I spent about 5 years working with carbon fiber in a motorsports setting. I did a lot of cutting and sanding without a mask. There were weeks I would blow my nose 3 days after working on a part and still find black specks. I guess I'll find out in a few years...
6
m8rl 3 hours ago 0 replies      
A research facility of the German armed forces warns that carbon fiber, once burned at high temperatures, is transformed into micro-sized particles with effects on the lungs comparable to asbestos.
7
xmstr 1 hour ago 0 replies      
I was in the US military in the late 80s and early 90s and we were told then that carbon fiber used in aircraft may be cancer causing. Nothing new here.
8
rnernento 3 hours ago 3 replies      
From the Google translated version of the article it looks like it's only dangerous when burned.
9
krschultz 2 hours ago 2 replies      
Note that this article (at least from the translated version) seems to be implying that the risk is around burning or particulate carbon fiber. I would definitely believe that.

The epoxy is also pretty nasty too. If you don't wear the right personal protective equipment when using it, you can quickly become sensitized to it. I don't know what that means medically, but it can't be good. I have always worn the right gear when building things, but it is clearly harmful to you without proper ventilation and separation from your skin.

I love composites, but they definitely have downsides.

10
progx 1 hour ago 0 replies      
And next week our great Bundeswehr Research Facility will tell us that the smell of burned oil can make us ill and that eating bullets can cause diarrhea.
11
rikacomet 1 hour ago 2 replies      
Though I'm all in for scientific development and the benefits of these new materials, I still want to point out that neither carbon fiber nor graphene exists in nature in large quantities, for more than one reason, while diamond and graphite do. Now the question should be: why is that?
3
I Liked Everything I Saw on Facebook for Two Days
148 points by sethbannon  2 hours ago   62 comments top 18
1
eumenides1 1 hour ago 5 replies      
I was thinking of a way to destroy my Facebook account, because you can't actually delete it, and this is pretty much it: like everything. Also, I guess you could check in everywhere at once, all the time.

I didn't think about the collateral damage it could cause to others, which is to bombard your friends' feeds. This is also interesting because you can destroy Facebook in this manner. If enough people are peeing in the pool, people are going to get out.

Maybe I should write a Greasemonkey script to accomplish just that. Not that I want to destroy Facebook, but if I wanted to destroy my data this would be the way, because Facebook doesn't give me that option.

Also, does everyone else think it's creepy when your friends stop using Facebook and old "likes" pop up on your feed to make it look like that user hasn't left?

2
protonfish 1 hour ago 4 replies      
So if you like everything on Facebook, the result is political extremism and the most insipid pop culture trash. What frightens me is what if this has nothing to do with Facebook, but is instead about human nature? Does this also describe what happens to a person's life when they stop caring, when they are no longer motivated to put forth the effort to be discerning, when they give up trying to make anything but the easiest and most immediate choices?
3
phillmv 1 hour ago 0 replies      
From the TFA,

>Also, as I went to bed, I remember thinking "Ah, crap. I have to like something about Gaza," as I hit the Like button on a post with a pro-Israel message.

>By the next morning, the items in my News Feed had moved very, very far to the right.

[...]

>Rachel Maddow, Raw Story, Mother Jones, Daily Kos and all sort of other leftie stuff was interspersed with items that are so far to the right I'm nearly afraid to like them for fear of ending up on some sort of watch list.

[...]

>While I expected that what I saw might change, what I never expected was the impact my behavior would have on my friends' feeds.

This article has so much modern anxiety in a nutshell. We have the pervasive surveillance society, and we have our behaviour shaped by algorithms.

What this really highlights, to me, is the extent to which Facebook exerts editorial control over the news that you're subjected to. This has all sorts of other effects on how media dollars are spent and as a result the shape of discourse - I'm immediately reminded of http://www.slate.com/blogs/moneybox/2014/05/22/facebook_s_mi...

This is not to say that there haven't _always_ been pernicious incentives at work; but before, you could at least question those incentives and motivations instead of shrugging and pointing to an unexplainable, mysteriously biased support vector machine et al. pulling the strings.

4
ChikkaChiChi 1 hour ago 3 replies      
Confirmation bias bubbles like this kind of social media feed are atrophying the debate muscles of our society. When we don't actively participate in two-sided discussions we lose the empathy and lose sight of how few things are really black and white.

I wish there were more discussion venues where the quality of your participation was judged on its value to the discussion, not on whether or not you supported what that person said.

5
uberdog 57 minutes ago 1 reply      
I did sort of the opposite. I would click on "don't want to see this" for any link to an external site. It took a few days, but now pretty much all I see is updates from friends.
6
scoot 15 minutes ago 0 replies      
Could someone please put 'Liked' in the HN title in quotes so it makes grammatical sense?

As it stands, the title makes it sound as if someone tried Facebook for two days and was happy with what they saw. (I know this is the original title, but that doesn't make it correct.)

7
webwanderings 1 hour ago 1 reply      
Nice. This is the post I needed to read, as I am on a one-week fast from Facebook (in fact, I am fasting from all kinds of email/group and Facebook interactions). It is hard work, but much needed. I tend to believe that, as humans in this age and time, we are losing our fundamental ability to delay gratification.
8
NhanH 1 hour ago 1 reply      
Just a very tangential point to the post, but the author mentioned a FB employee trying to connect him with the PR department. Exactly what does FB have in mind in doing that?
9
artumi-richard 1 hour ago 1 reply      
Could there be a browser plugin to like and share everything as you wander through the web? That would be an interesting way to demonstrate tracking technologies.
10
sosuke 1 hour ago 2 replies      
I'd love to be able to hit the reset button on my FB profile, undo all the profiling they've done on me and start again. Same thing for Google. What does it look like to be a vanilla user?
11
milge 57 minutes ago 0 replies      
The only winning move is not to play.

Or in this case like/post.

12
sebkomianos 1 hour ago 1 reply      
This made me think about creating a second profile on facebook, adding myself as a friend and liking whatever I post on my original - just to find out what "advertising value and quality" I create. Anyone got any ideas on how to automatise this?
13
po 27 minutes ago 0 replies      
This article has reminded me of how astoundingly ahead of his time Andy Warhol was.
14
flippyhead 28 minutes ago 0 replies      
Ever since I read The Circle, stuff like this seems to ring true.
15
skimmas 1 hour ago 1 reply      
It's possible to unlike everything you've ever liked, isn't it? I'll try that one day.
16
stevebot 55 minutes ago 0 replies      
Interesting read. I wonder what would happen if they added a dislike button.
17
PaulHoule 49 minutes ago 0 replies      
Somebody still uses Facebook, film at 11.
18
limaoscarjuliet 1 hour ago 1 reply      
I'm so put off by Facebook that I'm even put off by articles describing how one is put off by Facebook.

Full disclosure - I closed my account 2 years ago. Did not miss it even for 10 seconds.

4
DashGum Free Bootstrap Admin Template
61 points by tilt  3 hours ago   21 comments top 7
1
lucaspiller 2 hours ago 4 replies      
Looks good. You might want to disable jQuery Nicescroll on Mac though as it feels really unnatural and laggy.
2
rikkipitt 1 hour ago 1 reply      
This looks heavily based upon a commercial admin template by Carlos Alvarez for sale at wrapbootstrap which I found months ago... Interesting! Who was first?!

http://www.alvarez.is/demo/dashio/dashboard/

http://alvarez.is/demo/dashio/index.html

https://wrapbootstrap.com/theme/dashio-dashboard-frontend-WB...

3
nkozyra 44 minutes ago 0 replies      
It looks very simple & clean. One thing I noticed is that the in-app requests (AJAX/whatever) were giving no indication of progress, so it seemed like a lot of things were hanging.

Obviously, I attribute this to HN load, but in the real world you'd probably like to get some notice that the SPA is attempting to do "something."

4
tilt 3 hours ago 1 reply      
6
CSDude 1 hour ago 1 reply      
Loved it! Also check the panels page, nice compositions! I decided to use this in my project. Good work!
7
bluetidepro 53 minutes ago 0 replies      
Maybe I missed it (sorry!), but is there a GitHub repo for this somewhere?
5
First-Person Hyper-lapse Videos
1247 points by davidst  19 hours ago   158 comments top 56
1
UnoriginalGuy 18 hours ago 6 replies      
The result is quite simply breathtaking. It looks like something shot for a movie using a stabilised dollycam, the fact they were able to achieve the same thing using nothing but a GoPro, their software, and likely a week of post-processing on a high end desktop PC is simply amazing.

I hope we see this technology actually become readily available. There might still be work to be done, but in general if they can reproduce the demo videos with other content then they're on to something people would want.

2
Mithaldu 18 hours ago 0 replies      
Since it's not quite obvious, the supplementary page has videos with better bitrate than what youtube did to them: http://research.microsoft.com/en-us/um/redmond/projects/hype...
3
rkuykendall-com 18 hours ago 0 replies      
Interesting that the final video ( mostly the rock climbing ) resembles a video game, where shapes and textures "pop-in" as they are rendered. The technical explanation video was really well done.
4
msane 15 hours ago 2 replies      
If the MSR researchers are here -- I'm curious what it looks like when bordering hyperlapse with regular input. I.e., if there were a video consisting of input frames at the beginning and the end, with a stretch of hyperlapse in the middle, what would the transition look like? Does it need much smoothing?

Also, you probably saw this over the past week: http://jtsingh.com/index.php?route=information/information&i... (disregarding the politics of that). Whatever he's doing (I assume a lot of manual work), it has a very similar effect and it has these beautiful transitions between speeds.

Amazing work and the videos are stunning.

5
spindritf 18 hours ago 2 replies      
The videos don't load for me (due to mixed content, I believe), so here they are:

https://www.youtube.com/watch?v=SOpwHaQnRSY

https://www.youtube.com/watch?v=sA4Za3Hv6ng

The hyperlapse of the climbing video looks like an FPS game from a decade ago with texture refreshing as you get closer.

6
pwenzel 1 hour ago 0 replies      
The city demos remind me of the Blues Brothers' car chase scene in downtown Chicago. https://www.youtube.com/watch?v=LMagP52BWG8
7
steven2012 12 hours ago 1 reply      
Okay, please sign me up. I'm willing to pay hundreds of dollars for this software. I have hundreds of gigabytes of time lapse that I've taken that is just sitting there because I have no way to do anything with it. I'd easily pay $200+ for this software right now just so I can have those videos and free up massive hard drive space.
8
jahmed 14 hours ago 1 reply      
I walked around Boston once with some friends for 7 hours. When I remember it I see it as the hyperlapse, not moment for moment or sped up. Super interesting work.
9
tumes 38 minutes ago 0 replies      
This looks like something Michel Gondry would've figured out manually 20 years ago with an 8mm camera and a paper cup or something.

That being said, it really does look amazing!

10
moultano 17 hours ago 1 reply      
I wonder how far you can get by using a "naive" timelapse of selecting frames from the video, but being smarter about which frames you choose. Rather than just choosing every nth frame, try to choose visually consistent frames by making the intervals between the frames loose, then apply conventional stabilization after the fact.
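
A rough sketch of that greedy idea in Python (just an illustration, not the paper's method; it assumes the frames are already decoded into numpy arrays and uses mean absolute difference as the similarity measure):

    import numpy as np

    def pick_stable_frames(frames, speedup=8, slack=3):
        """Pick roughly every `speedup`-th frame, but within a +/- `slack`
        window choose the candidate most similar to the previous pick."""
        picked = [0]
        target = speedup
        while target < len(frames):
            lo = max(target - slack, picked[-1] + 1)
            hi = min(target + slack + 1, len(frames))
            prev = frames[picked[-1]].astype(np.float32)
            # candidate with the smallest mean absolute difference to the last pick
            best = min(range(lo, hi),
                       key=lambda i: np.mean(np.abs(frames[i].astype(np.float32) - prev)))
            picked.append(best)
            target = best + speedup
        return picked

A conventional stabilizer could then be run over just the picked frames.
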
11
iamshs 18 hours ago 3 replies      
Bloody amazing! Fantastic work! Release it. Release it soon. This is something that everyone would want.

I see they have listed a Windows app coming. Is that Windows desktop app?

12
bitL 3 hours ago 1 reply      
Very nice! Is there software available I can use, or do I have to implement the algorithms from the paper myself?

I make a lot of 4K hyperlapse movies; it is tedious, as After Effects' warp stabilizer is useful only in a small fraction of cases, Deshaker is more consistent but also not perfect, and the only option in the end is multi-pass manual tracking and stabilizing, which is very time-consuming and tricky for long panning shots.

13
photojosh 8 hours ago 0 replies      
I would use this for sure. I do timelapses of runs I do and set as challenges for our social running group. The source is a head-mounted GoPro. The problem with them is that a straightforward pick-every-nth-frame approach gives a blurry, motion-sickness-inducing video. If you could extract the frame from the top of each stride, when the camera is most steady, I would imagine it would be very much more watchable.
14
sabalaba 6 hours ago 0 replies      
This is great. I have weeks of footage from a camera that I wear around and would love to use that video to make a hyperlapse. I would also be interested in seeing how well this does with photos taken every few seconds as opposed to video. Although, after reading the paper, it looks like there would be a lot of optimization that would need to happen to make it more efficient. (Their original implementation took a few hours on a cluster.) Luckily, as they stated in the technical video, they haven't tried to do anything more than a proof of concept; so there is plenty of room to optimize. I'd be interested to see how well a single-machine OpenCL or CUDA implementation does compared to the CPU clusters they were using in the paper.
15
31reasons 15 hours ago 1 reply      
Mind blowing results! Although the name Hyper-lapse doesn't really convey the goal; it should be named Smooth-Lapse, because that's what it's doing. Too much hyper-x already.
16
Lifescape 18 hours ago 3 replies      
It appears as if the mountains are loading in the background (like in a video game) as you get closer to them.

Awesome idea/execution!

17
readerrrr 18 hours ago 2 replies      
Great results, but it looks expensive. I wonder how many minutes of processing it takes per minute of video.
18
oh_sigh 17 hours ago 1 reply      
Really great results. I wonder if this could be coupled with google street view data to give a smoother time-lapse of a path.
19
tehwebguy 18 hours ago 0 replies      
This is incredible! If you want a good look at how it handles moving objects / people check out the part at 2:15

https://www.youtube.com/watch?v=SOpwHaQnRSY&t=2m15s

20
itchmasterflex 11 hours ago 1 reply      
Would it be possible to do something like this for audio? It would be incredible to sample an hour-long album or mix in minutes.
21
adt2bt 18 hours ago 0 replies      
This is so insanely cool. I plan to get a GoPro some day soon and will take it on hikes in the Pacific NW. If I could turn my hikes into beautiful time-lapses like these, I'd be blown away.
22
bsimpson 7 hours ago 1 reply      
As others have commented, the videos look great, and much closer to how people remember journeys. However, there appear to be some image persistence problems (many street poles simply dissolve as they get closer to the camera).

I'm curious to see what happens if they insert more action-packed footage. An MTB course with trees, switchbacks, and jumps would be an interesting stress test of this technique.

23
Yuioup 10 hours ago 1 reply      
I'm surprised Microsoft never releases anything cool like that to their app store.
24
aceperry 9 hours ago 0 replies      
Pretty cool that they've included the raw video. Everyone can use that to compare their alternative methods to the Microsoft researchers' results.
25
jelveh 18 hours ago 1 reply      
26
closetnerd 18 hours ago 0 replies      
That honestly has fantastic results.
27
crahrah 10 hours ago 0 replies      
I'm thinking of trying to recreate this - does anyone know if this is covered by a patent?
28
rumham 16 hours ago 1 reply      
Incredible results. I'd kill to try this with Oculus Rift.
29
jalopy 14 hours ago 0 replies      
Wow please release this either standalone or even better as a feature of some MS movie production tool. iMovie needs some competition.
30
hrjet 12 hours ago 0 replies      
I downloaded the .mp4 file and watched at half-speed. It looks great at half-speed too, and much more realistic. I wonder why they couldn't have slowed it down a notch. Perhaps, as a research result, they are just staying true to their algorithm's output frame rate.
31
bellerocky 13 hours ago 2 replies      
I don't understand how computers figure out 3D from single camera video without parallax. How do they do that?
32
washedup 15 hours ago 0 replies      
This is incredible. By watching the hyperlapse versions of the mountain climbing, you can clearly see which path is taken and are able to get glimpses of whatever paths are available. This would be a huge advantage for people learning how to rock climb. I can imagine that a similar situation would occur for many other activities. Great work!
33
NicoJuicy 17 hours ago 0 replies      
What's the best alternative for this? I'm doing a GoPro video while cycling with some friends... And this would be insanely useful
34
rbanffy 13 hours ago 0 replies      
Couldn't it be done with a wider angle lens, a better imaging sensor and conventional image stabilization techniques? If such captures become commonplace, it's easy to imagine capturing with a wider field of view so that the stabilizer would not be so overtaxed.
35
the_cat_kittles 18 hours ago 0 replies      
be sure to watch the technical exegesis at the bottom, it's almost more amazing
36
issa 18 hours ago 1 reply      
I want to put something more meaningful into this comment, but all I can think to say is that this is really amazing. Well done!
37
Netcob 6 hours ago 1 reply      
A GoPro timelapse is also called a prolapse.
38
suchetchachra 3 hours ago 0 replies      
Excellent work!
39
michaelmachine 10 hours ago 0 replies      
Woah, very cool. I would love to see this applied to a SCUBA diving video.
40
sjtrny 16 hours ago 0 replies      
Does anyone know how the match graph is created? (The part that is used to identify redundant frames). The paper barely mentions it.
41
zobzu 10 hours ago 0 replies      
Wow, I have wanted this for a while

now to implement it open source ;)

42
lukasm 17 hours ago 1 reply      
Is this Hamburg? "Getränkemarkt" :
43
unphasable 11 hours ago 0 replies      
wow this is incredible!
44
GhotiFish 15 hours ago 0 replies      
awww. I always have mixed feelings about new Microsoft-funded tech. Something really, really cool that I will never get to use/exploit/play with/anything.

The technical video breaks down some of the techniques they used. Global match graph is particularly interesting. This technique alone could lead to a big improvement in timelapses, by trying to select consistent changes between frames.

http://cg.cs.uni-bonn.de/aigaion2root/attachments/FastSimila... <- maybe this?

45
bobwaycott 12 hours ago 0 replies      
One of the most impressive things I can recall seeing from Microsoft in years. Would absolutely love to have this as a cloud service or desktop app.
46
jaequery 13 hours ago 0 replies      
U
47
anvarik 9 hours ago 0 replies      
why is the paper 35MB?
48
manos_p 15 hours ago 0 replies      
still sucks
49
EvanL 16 hours ago 0 replies      
Cool technology, applications?
50
tannerc 17 hours ago 1 reply      
Stunning effect, can anyone help me see the practical or entertainment-value use?

I'm also curious if anyone else got motion sickness while watching the video.

51
l33tbro 10 hours ago 0 replies      
Great tech. But it really just looks like a steady-cam sped up x10. I get the technical brilliance, but visually it's not terribly innovative.
52
tibbon 11 hours ago 1 reply      
I'll be the immature one and say I'm curious how funny porn would turn out with this speeding it up by 10x. They don't show much of how it deals with people and I imagine the results would be terribly funny looking- but perhaps awesome.

Also, I will pay $$$ for this to use with my motorcycle footage from GoPros.

53
melling 13 hours ago 0 replies      
Now we need a little facial recognition so you can scan to where you meet your tagged friends... of course, there are other surveillance opportunities too.
54
philip1209 16 hours ago 1 reply      
The server supports HTTPS, but the videos are improperly embedded resulting in mixed-content errors. This is disappointing.
55
rasz_pl 16 hours ago 1 reply      
Nice results, but missing forest for the trees.

One of the by-products of this algorithm is a fully textured 3D model representing the filmed environment. Offering that as a pure data dump, or even as a manual process allowing the user to control the camera, would be as valuable as the fully automatic one-off timelapse no one ever watches (except maybe your granny).

What sounds better - a video tour of a house, or a 3D model of a house you can traverse however you like?

I wonder if 3-letter agencies have better structure-from-motion implementations a la "Enemy of the State" (Isn't it sad that this film turned out to be a documentary?). I suspect something like a 3D reconstruction of the Boston Marathon (the FBI did collect all video footage of the event) would be very helpful to the investigation.

56
boyaka 16 hours ago 2 replies      
Video stabilization + more FPS / slower rate than the "every 10 frames timelapse" + feel good inspirational music = this

I would guess that I could upload a shaky video to YouTube to get it smoothed out, download it, and speed it up at a rate similar to theirs and get similar results. The timelapse that they show that is so much worse uses way fewer frames of the raw footage (every 10th frame?) and goes way faster than their "hyperlapse". It isn't a fair comparison.

6
OpenStreetMap 2007 vs. 2014
335 points by sashazykov  6 hours ago   80 comments top 21
1
pilif 4 hours ago 6 replies      
The one enlightening experience I had in the 2009ish timeframe is how incredibly easy it is to contribute valuable information to OSM.

Intuitively I would have assumed that this would be really difficult to do, but the tools OSM provides for editing are actually quite easy to use even for people with no clue like myself.

You don't even require accurate GPS hardware or anything. Knowledge of your surroundings combined with the (blurry, but available none the less) satellite imagery might already be enough for you to really do good.

In my case, I've added building numbers in my neighborhood, marked one-way roads, added a few gravel foot paths where they were missing and I knew where they were because I walk on them on my commute, and so on. All this required zero hardware and no actual knowledge of map making (also: the changes are still in, unaltered, so I assume they weren't all bad).

Using a cheap GPS tracker and a bike I furthermore added a few small lakes and a small creek close to where I live. The GPS tracker was very helpful as the satellite imagery was (understandably) just showing forest, but biking around the lake a few times was really helpful in giving me the correct measurements.

All this was both a lot of fun and absolutely trivial to do. I highly recommend that you give this a shot on your end. I'm a person with zero experience in map making and yet I could easily contribute my part and I had an absolute blast doing it.

Also, if you are good with directions, this isn't limited to the places you live now - I've also added a lot of detail to the map around the place where I went to elementary school (yes, the environment has changed a bit, but that was a great opportunity to visit the place again).

Contributing to OSM is a very pleasant and fun experience.

2
mattlondon 4 hours ago 0 replies      
Some of the most relaxing & satisfying things I have ever done with a computer involved spending hours updating OSM from fresh satellite images.

It was giddying to discover that Bermuda had great satellite images, but zero mapping and then a few hours later you'd literally put a well-known place like Bermuda "on the map" by drawing in most of the main roads, airport and so on. Great to see that others have built on this with detailed information.

Very fond memories.

3
devnill 31 minutes ago 1 reply      
I really want to like OSM, but the default UI is really holding me back. Even with its limitations, Google Maps looks great out of the box. OpenStreetMap is downright ugly.

Aesthetics is a huge selling point, especially when used for business, and I can't help but think that it's the factor that is holding OSM back from prime time.

4
exDM69 5 hours ago 1 reply      
OSM has really evolved in some places. For the immediate vicinity of my home, OSM is a lot better than Google Maps or any other map I've looked at.

OSM even includes horse trails and bicycling routes that I can't find in any other map available online. Not even local, official maps.

The weakness in OSM still is that it's difficult to tell whether or not the maps are of good quality and up to date for a particular region.

5
vog 5 hours ago 0 replies      
Very nice idea and good visualization, although I'd have preferred a single "switch" button to quickly switch the whole view between 2007 and 2014.

While this demonstrates the huge progress of OSM, it is important to keep in mind that the hardest (and not so rewarding) part of the work is not to create the maps, but to keep them up to date!

That's why it is important to keep supporting OSM.

6
Brakenshire 34 minutes ago 0 replies      
It's also interesting to see how much activity has been going on in the recent past - ITO World have maps that can show you OSM edits made in the last 90 days, and in the last 7 days. For instance, this map shows the last 90 days of edits around Y Combinator's head office:

http://www.itoworld.com/map/127?lon=-122.06776&lat=37.38596&...

7
lovelearning 2 hours ago 1 reply      
I'm rather surprised that even North Korea is mapped quite extensively in OSM (https://mvexel.github.io/thenandnow/#14/38.9911/125.7365).

For a long time, North Korea was shown as a featureless area in Google Maps.

8
samcrawford 5 hours ago 0 replies      
The poor Comcast business connection that's hosting the 2007 tile data is really struggling with the traffic! Lots of retransmissions (caused by packet loss) by the looks of it, likely because his upstream is completely maxed out.
9
k-mcgrady 3 hours ago 2 replies      
It's amazing how easy it is to improve these maps. I just searched for my small town, created an account and started editing. Added a few alleys/shortcuts and about half a dozen POI's in 5 minutes.

What is the best way to use OSM on mobile, specifically iPhone? I remember trying some apps before but I didn't like them. Is there anything as polished as the Apple/Google apps?

10
Vvector 50 minutes ago 3 replies      
My last three addresses and all surrounding roads cannot be found on OSM. The streets are all there, they are just tagged incorrectly. With some help from the forum, I tried to correct the errors, but it never worked. Apparently there are 'broken relationships' that are beyond my ability to correct.

http://forum.openstreetmap.org/viewtopic.php?id=24177

11
yuribit 2 hours ago 0 replies      
I still can't believe that's my city, Rome: https://mvexel.github.io/thenandnow/#11/41.8974/12.4987 I am truly amazed by the wonderful job which has been done by the community.
12
beaker52 5 hours ago 3 replies      
OpenStreetMap is way ahead of any other map in my opinion. In my town they have all the fields, ponds, lakes, etc. accurately mapped. Amazing.
13
hyptos 4 hours ago 5 replies      
The difference on my island is huge :D

https://mvexel.github.io/thenandnow/#12/-20.9850/55.4634

14
ErikRogneby 26 minutes ago 0 replies      
This is going to be addictive. One, last, edit...
15
cliveowen 3 hours ago 1 reply      
Am I the only one who sees this working backwards? If I move towards 2007 everything gets more detailed.
16
mcv 3 hours ago 1 reply      
Zoom in on Artis (the Amsterdam Zoo), or the various parks. The amount of detail is staggering!
17
brickmort 2 hours ago 0 replies      
Truly astounding. 2007 doesn't even seem like it was a long time ago.
18
glomph 3 hours ago 3 replies      
Has anyone made a directions engine that runs on OSM in the browser?
19
tonny747 5 hours ago 4 replies      
I feel like this is somewhat exaggerated... https://mvexel.github.io/thenandnow/#11/-34.9324/138.6289
20
Jekyll 4 hours ago 3 replies      
Anyone ever tried printing a paper copy of OSM for their region?
21
BillFranklin 3 hours ago 0 replies      
The power of online collaboration is inspiring.
7
JuliaCon 2014 Opening Session Presentations
44 points by karbarcca  4 hours ago   2 comments top 2
1
KenoFischer 17 minutes ago 0 replies      
Note that these are just the first three videos. As I understand it, the rest of the videos are being edited and uploaded right now.
2
StefanKarpinski 44 minutes ago 0 replies      
JuliaCon (the first one ever) was a great event and I'm glad the videos are ready to go up now so others can see these excellent presentations. There will be more, but this batch includes talks by Tim Holy about the medical imaging applications his research lab at WUSTL uses Julia for and some cool recent NLP work by Pontus Sterntorp. Well worth watching for both scientific curiosity and interest in Julia as a language and ecosystem.
8
Square Appointments
22 points by hodgesmr  2 hours ago   10 comments top 5
1
yalogin 10 minutes ago 3 replies      
Why is it not free? These additional features are there to lure you into using the payment platform and lock businesses in. What's more, I need end users to also install the Square app to use this feature, so there is that additional friction as well. This is really not something they should be charging for.
2
Macsenour 11 minutes ago 0 replies      
I'm interested in this new wrinkle, and always looking for something that will help out my day. But is $30 a month, for a single person, a competitive pricing plan?
3
ankurpatel 18 minutes ago 2 replies      
Seems like they are competing with MyTime - http://www.mytime.com. The only thing they lack is good search to find businesses
4
sleepyhead 1 hour ago 0 replies      
5
bitonomics 20 minutes ago 0 replies      
It's really cool to see the additional value being added to the platform outside of payments.
9
Compile like it's 1992
283 points by cremno  14 hours ago   38 comments top 9
1
GnarfGnarf 31 minutes ago 0 replies      
Borland 3.1 also came as a graphical IDE. For a while Borland was ahead of Microsoft's character-based Programmer's Workbench. Borland had a great IDE.

My company stuck with Borland until 4.5. We parted ways when the compiler generated code that GPF'ed when calling 'new' in a DLL, or declared some global variables undefined that had successfully compiled a few modules earlier. Also, Borland couldn't step through 32-bit code in debug mode. Visual Studio 6 could.

2
zak_mc_kracken 12 hours ago 3 replies      
Have to hand it to Fabien; he's not just an amazing developer with baffling skills to understand code and graphical routines, his dedication to understanding old games of yore is second to none.

Nice job, Fabien!

3
haberman 7 hours ago 1 reply      
In a similar vein, here is Lua 5.2 being compiled with Turbo C 1.0 (impressive that software written in 2013 compiles unmodified on a 1990 compiler!)

https://www.youtube.com/watch?v=-jvLY5pUwic

4
hayksaakian 13 hours ago 2 replies      
It's nice that they made it public domain. There are 1000 more games from that era whose code will simply disappear despite being simply bits and bytes that could be perfectly preserved forever.
5
gavinpc 12 hours ago 2 replies      
I remember getting the Borland C compiler as a ten or eleven year old and thinking, twenty megabytes! What in the world could they need twenty megabytes for? That was like, my entire hard drive.

N.B. you have a typo

    Z:/> mount c ~/systen/c

6
webmaven 2 hours ago 1 reply      
Nice trip down memory lane (and a good counter-example to those who claim that web development today is soooo much more complicated and tedious than desktop development was back in the day...).
7
russtrotter 11 hours ago 4 replies      
Does anyone else find it sorta ironically odd that one of the masterminds behind this great stuff is now a ... gasp ... Facebook employee?? My heart kinda aches.
8
frozenport 11 hours ago 1 reply      
Will it build with a modern compiler?
9
ramgorur 11 hours ago 0 replies      
nostalgic :'-(
10
The Youngest Are Hungriest
90 points by clarkm  1 day ago   36 comments top 6
1
apsec112 8 hours ago 3 replies      
"Over 40 percent of those 5 and under are stunted meaning they are in the bottom 2 to 3 percent of the worldwide height distribution for their age and sex"

Argh. It really bugs me when newspapers can't do simple arithmetic. Especially in the lead paragraph.

India is about 17-18% of the world's population. It has a slightly above average fertility rate, so it will have disproportionately more children. So about 20% of the world's children are Indian.

If 40% of Indian children are stunted, therefore, at least 8% of children worldwide must be stunted (not including Africa, China, etc.). 8% of children cannot be in the bottom 2-3% of the world's height distribution. That is not how statistics works.

If this obvious an error was put in the lead paragraph, how can we trust that the other facts are accurate?

2
VMG 7 hours ago 2 replies      
> India's patrilineal traditions dictate that the eldest son care for his parents in old age and inherit property, while the dowries paid to marry off daughters can be expensive. The result is sex-selective abortion and an underinvestment in girls so common it has popularized a Hindi motto: "beti to bojh hoti hai," meaning "a daughter is a burden."

This explanation has always been unsatisfying for me. At least it seems incomplete. The system is obviously unethical, but I don't understand how the economics work here.

If only the eldest son inherits, why are the the following sons valued? Given that the practice creates a surplus of unmarried men, shouldn't unmarried women become an asset instead of a liability? Shouldn't unmarried men be a huge force against the dowry system?

3
McCoy_Pauley 7 hours ago 3 replies      
I'm not sure if this is the place for an anecdote, but here we go. And just to give a warning, this is from an American perspective.

I am the first born. My mother has told me that she would count the amount of protein she got each day during her pregnancy with me. If she didn't get enough she would eat more or drink a large glass of milk to try and supplement her diet. She didn't drink soda either.

With my younger brother she tried to make sure she ate enough protein, though she gave up not drinking soda.

With the youngest, my sister, she was working part time during her pregnancy and wasn't able to watch what she ate like with either me or my brother.

This all being said, we were nowhere near malnutrition. There is a considerable size difference between me and my siblings. I am 6'4" and have a considerably larger bone structure than my siblings. (I'm not fat/obese/heavy. I only weigh 193 lbs.)

My brother is just 6' and considerably smaller than I am. My sister is smaller than my brother and has a similar bone structure.

Might there be a biological imperative to ensure that the first born is healthy? Then with the later children, parents lose the need to ensure their children are as healthy.

4
_nedR 1 hour ago 0 replies      
Isn't this still a problem in the south too? The article clearly mentions it's talking about India on average. Incidentally, the states you mentioned are also among the most populous. Most of the problems discussed do exist in the south as well, although not to the same extent as in the north. Only Kerala seems to be relatively less afflicted, with a favourable sex ratio, less absolute poverty and fewer sanitation issues. Even in Kerala, the practices of dowry and favouring of sons are widely prevalent.

I, for one think the article is commendable in bringing new insight to problems facing us in India.

5
fgt 7 hours ago 0 replies      
The South African Indian population has, apart from easily distinguished recent migrants, been in the country for 100+ years, and is relatively large (more than 1 million people). Due to apartheid, assimilation was limited, and the population was drawn from all over India. It would make an interesting comparison group, matched for socioeconomic status. If the pattern persisted in South Africa, then further research would be needed - South African Indians generally rely on pensions/savings to support them in old age.
6
dropit_sphere 8 hours ago 3 replies      
Devil's advocate, which is why I have this account: why is it not a gender equality issue that the sons are expected to care for their parents and not the daughters?

I'm not saying everything is hunky-dory in India. I'm saying that if someone thinks the fix is "Oh just apply Americanism in this particular aspect" then that someone is naive.

11
Introduction to Signal Processing
90 points by tonteldoos  1 day ago   9 comments top 6
1
shas3 1 hour ago 0 replies      
On the topic of free e-books related to signal processing, I really love this book by Vivek Goyal, Martin Vetterli, and Jelena Kovacevic: www.fourierandwavelets.org

I think their treatment of the subject is more 'modern'. Classical signal processing is the stuff that you will find in Orfanidis's book in the OP and other classics such as Lyons, Oppenheim/Schafer, etc. Modern signal processing involves more harmonic analysis. There has been a lot of work since the late 80s in the areas of wavelets, dictionary learning, etc., which you won't find in 'classical texts' on signal processing. In some universities these topics are taught in 'advanced' signal processing courses, at honors or graduate level. I hesitate to call this kind 'advanced' signal processing, because I feel you need the same kind of prerequisites for 'classical' and 'modern' signal processing: linear algebra, Fourier analysis, basic probability, 'random processes', etc. In fact, I think 'modern' signal processing taught at the undergrad level also has the added benefit of being a gentle application-oriented introduction to real analysis for EE students.

2
adamnemecek 8 hours ago 2 replies      
There's also "The Scientist and Engineer's Guide to Digital Signal Processing", which is free too and is pretty #swag as well: http://www.dspguide.com
3
hcrisp 4 hours ago 1 reply      
"Understanding Digital Signal Processing" by Richard Lyons should also be mentioned. Not free, but probably the best text for grasping the concepts of DSP. As a mechanical engineer his style and illustrations really clicked with me.
4
gallamine 1 hour ago 0 replies      
Jose Unpingco has some nice articles on his blog that are excerpts from his book _Python for Signal Processing_. http://python-for-signal-processing.blogspot.com/
5
streptomycin 3 hours ago 0 replies      
On the subject of free books from Rutgers professors, check out http://www.math.rutgers.edu/~sontag/FTP_DIR/systems_biology_... if you are interested in mathematical modeling in biology.
6
marcosscriven 6 hours ago 0 replies      
Recognised that cover right away - I remember paying a huge amount for a copy of this as a student back in 1996! Incredible really how the web has made such a large amount of educational material available for free.
12
The Network is Reliable
41 points by r4um  5 hours ago   7 comments top 5
1
blutoot 24 minutes ago 0 replies      
I feel like the authors (or someone else) can do a lot more justice to their overall objective (i.e. tease out patterns) by applying some kind of a qualitative content analysis of case studies [0].

[0] http://www.qualitative-research.net/index.php/fqs/article/vi...

2
falcolas 2 hours ago 2 replies      
The network is not reliable, but usually the cost of manually fixing problems arising from infrequent types of instability is less than the cost of pre-emptively addressing the issue.

As a practical example, our preferred HA solution for MySQL replication has effectively no network partition safety - if a network becomes partitioned, we'll end up with split brain. However, we have not once had to deal with this specific problem in our years of operation on hundreds of servers.

That said, do make the assumption that your AWS instances will be unable to reach each other for 10+ seconds on a frequent basis. Your life will be happier if you've already planned for that.

3
blutoot 39 minutes ago 0 replies      
There was some discussion on a preliminary version of this article/blog-post[0] last year: https://news.ycombinator.com/item?id=5820245

[0] http://aphyr.com/posts/288-the-network-is-reliable

4
jchrisa 1 hour ago 0 replies      
Related reading on data structures that make availability easier to maintain under network partition: http://writings.quilt.org/2014/05/12/distributed-systems-and...
5
peterwwillis 2 hours ago 0 replies      
Takeaways:

* Network partition tolerance can be designed around, assuming infinite time and money

* Network partition tolerance depends on the application

* Mitigating potential failure requires having a very long view on very fine details

* Most organizations will not be able to engineer solutions to address all network partition-related outages

13
Reverse Engineering for Beginners: Free book
228 points by galapago  15 hours ago   22 comments top 8
1
j_s 51 minutes ago 0 replies      
Reverse engineering has come up a few times in the past few months:

Automated reverse engineering (of DRM) - https://news.ycombinator.com/item?id=7989490

Open-source debugger for Windows - https://news.ycombinator.com/item?id=8092273

2
luckyno13 1 hour ago 0 replies      
I have been contemplating taking up coding in my spare time, especially after the post about turning the $200 Chromebook into an el cheapo learning machine. This could be the starting block I have been searching for.
3
kqr2 14 hours ago 1 reply      
For understanding the stack frame layout on x86-64, I found this post to be quite useful:

http://eli.thegreenplace.net/2011/09/06/stack-frame-layout-o...

4
newaccountfool 12 hours ago 2 replies      
After just visiting DEFCON and watching all the teams partake in CTF, this is what I want to learn. This is computing.
5
middleclick 10 hours ago 0 replies      
I remember how I spent a summer going through Lena's video tutorials (look them up). They were really good and I learned a lot. Reverse engineering is an addictive thing.
6
codygman 9 hours ago 3 replies      
Is there a good reversing toolset for Linux? For instance this book recommends a Windows-only tool called IDA.
7
ryanmerket 11 hours ago 2 replies      
Takes me back to high school. I used to sit in my room for hours working on key generators.
8
checker659 13 hours ago 1 reply      
Does anyone know of a tool that can dump C++ vtables from 64-bit Mach-O files?
15
Animations in Swift
22 points by normanv  1 hour ago   5 comments top 2
1
monkey_slap 50 minutes ago 1 reply      
Really cool! I'd argue that Swift doesn't really have much to do with this, more "Animations in iOS".

You brought up a couple of neat concepts that I wasn't aware of, especially "UISystemAnimation.Delete".

2
mathewsanders 1 hour ago 1 reply      
Thanks for submission!

I've had some feedback that I've made a bit of a mess with casting when trying to calculate a random number - would appreciate any best practices or thoughts on that...

https://gist.github.com/mathewsanders/82311409978066b02932

16
Marka: Icon transformations
85 points by tilt  1 day ago   9 comments top 6
1
marcoms 3 hours ago 0 replies      
Looks like it could implement things like http://material-design.storage.googleapis.com/videos/animati... to an extent...
2
wiradikusuma 8 hours ago 0 replies      
I assume the transformation is not hinted, because e.g. animating ^ to double ^ (like chevron) would be (subjectively) better by cloning the ^ and shifting it down.
3
splitbrain 9 hours ago 3 replies      
What's it good for?
4
jevgeni 8 hours ago 0 replies      
This is beautiful.
5
heropotato 2 hours ago 0 replies      
This is nice and neat. Thanks for sharing!
6
mychaelangelo 6 hours ago 0 replies      
love it! thanks for sharing.
17
Low-level Bitcoin
48 points by jc123  7 hours ago   24 comments top 3
1
b1db77d2 5 hours ago 2 replies      
A little note about the "make a privkey" section of the signature example; it can sometimes* make invalid privkeys that are off the end of the EC curve. Only integers between 0x1 and 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141 are valid in our particular case. Super unlikely to ever get a sha256 hash that matches the invalid portion, but it's worthwhile to point out.

* probably never, but worth mentioning
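
For illustration, a minimal sketch of that range check in Python (my own code, not from the article; the constant is the secp256k1 group order quoted above, and deriving the key by hashing a passphrase is an assumption about the article's example):

    import hashlib

    # secp256k1 group order (the upper bound quoted above)
    N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141

    def privkey_from_passphrase(passphrase):
        """Hash a passphrase into a candidate key, rejecting the
        (astronomically unlikely) values outside the valid range."""
        k = int.from_bytes(hashlib.sha256(passphrase.encode()).digest(), "big")
        if not (1 <= k < N):
            raise ValueError("hash fell outside the valid range; derive from different input")
        return k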

2
VMG 5 hours ago 1 reply      
Possible bug:

The instructions BOOLAND and BOOLOR don't interpret the stack values the same way IF, VERIFY etc do. They decode the top stack values as integers and compare against zero, thus they have to fail when the top stack item size is greater than 4 bytes.

Edit: littleEndian.decode also doesn't seem to respect the size limits

Edit2: .. or signed integers for that matter. So while this is a very cool basic concept, it's not a complete implementation.

The reference client provides test suites

https://github.com/bitcoin/bitcoin/blob/master/src/test/data...

https://github.com/bitcoin/bitcoin/blob/master/src/test/data...

3
norswap 4 hours ago 1 reply      
This article had me wondering what will happen when people start storing copyrighted data (or worse, pedopornographic images) inside the blockchain.
18
Show HN: CheckUp social network suicide prevention in Go
18 points by rkirkendall  3 hours ago   20 comments top 5
1
matart 19 minutes ago 1 reply      
How accurate do you think this can be? What happens if I write a facebook update that says, as an example:

I don't believe I have ever said "I feel depressed"

Would this be flagged? Does it only take one post to be flagged or is it looking for recurring behaviour?

2
phfez 34 minutes ago 5 replies      
I can understand why you think this might be a good idea. But it completely doesn't take the depressed person into account at all. It thinks it can solve suicide by checking up on the individual? Do you think the individual wants to be checked up on? Maybe the people checking up on the person are the actual problem in that person's life, which could create an even more serious dilemma for that person.

If there is anything that would drive me to suicide, it would be more people thinking that they can 'solve the problem' in this manner.

3
cjslep 1 hour ago 2 replies      
I think this is really cool. Clarifying question from these two statements that seem to conflict:

> The goal of the CheckUp project is to detect any serious sign of depression, self-harm or suicide posted to a social network and provide peer support by notifying a concerned party.

> The app works by checking the tweets on your home timeline every few minutes and sending you an email notification if a tweet is flagged.

Shouldn't it instead make the person signing up the "concerned party" to be notified via e-mail, and instead have that concerned party specify which twitter feeds to watch? I'm probably missing something here.

4
rubiquity 1 hour ago 1 reply      
This sounds like a great service but I really don't give a damn that it is written in Go.

Also:

> This application is temporarily over its serving quota. Please try again later.

5
wyager 44 minutes ago 0 replies      
This seems a little accidentally sinister to me... Something about doing automatic sentiment analysis on someone's data without their permission seems morally questionable. I mean, almost none of us are happy when the government does it, even though it's allegedly with the (very questionably) "good" intention of "fighting (terrorism|drugs|bogeymen)".
19
3D Programming with JavaScript
38 points by TOWK  5 hours ago   3 comments top 2
1
weavie 2 hours ago 1 reply      
Every time I think of a new hobby project I want to start, I think about learning a new language alongside it. Inevitably, before I start, a library or tutorial comes along that brings it back to JavaScript.
2
poseid 4 hours ago 0 replies      
nice overview!
20
Imgui: Immediate Mode Graphical User Interface with minimal dependencies
112 points by jblow  13 hours ago   29 comments top 8
1
w0utert 2 hours ago 1 reply      
I was totally sold on this and already preparing to try this out on a simple game using OpenGL I'm working on as a pet project, until I saw the code examples. The amount of highly-specific OpenGL code that's apparently needed to embed an ImgUI popover is completely prohibitive to my use case, the GLEW and GLFW examples literally have 3 times the number of OpenGL calls my own rendering function has, I see custom shaders used just for the UI, setting up clipping rectangles, loading font textures, etc. And that's just to manage renderer state, not even to setup the popover itself.

I can understand that it's an advantage to have something like this renderer-agnostic, and embeddable anywhere in your render flow, but the way you have to use this completely defeats its purpose IMO. I would love a higher-level abstraction layer on top of this that deals with all the render state setup, so I only have to setup the UI itself and populate it, and can throw in a one-liner somewhere in my render function that draws it. In this case, I would happily trade the fact that this would break render-independence and would clobber render state, for ease of integration.

That said, the end result looks immensely useful and very-well done, it's just too involved to setup and use I think...

2
rsync 1 hour ago 1 reply      
Is there an equivalent tool for ncurses ?

Just curious ...

3
Keyframe 3 hours ago 1 reply      
This looks sweet, but 6KLoC file. How do you even do that and stay sane?
4
seanmcdirmid 11 hours ago 1 reply      
What is better than using React's one-way reactive dataflow? Well, just run everything in a continuous loop!
5
seanmcdirmid 8 hours ago 0 replies      
The last time we talked about IMGUI, for historical reference:

https://news.ycombinator.com/item?id=4201748

6
kevingadd 10 hours ago 3 replies      
Other than the REALLY inadvisable name choice (good luck getting precise search results for the term IMGUI now...), this looks great.

The low # of commits and vague name might be bad signs, but check the repo description: developed at Media Molecule and used internally for a title they've shipped, with feedback from domain experts like Casey Muratori. Those two points alone make me pretty excited to try this out.

And a brief explanation of why this matters: IMGUI frameworks are increasingly popular for editors, debugging tools, etc. because they eliminate the need for state synchronization between your 'engine' and the visualization/editing UI. You also avoid the need to duplicate your layout in multiple places - you have one function that is the complete description of your UI. This reduces memory usage and can actually be more performant in many cases because you can easily ensure that you're only ever running code to update visible UI. Things like virtualized infinite scrolling become quite trivial.
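
As a toy illustration of that idea (plain Python, not any actual IMGUI library's API): the whole UI is re-declared from application state every frame, so there is no retained widget tree to keep in sync.

    # Toy immediate-mode "UI": widgets are plain function calls made every frame,
    # reading and writing application state directly instead of syncing a retained tree.
    state = {"volume": 5, "muted": False}

    def slider(label, value):
        print("[slider] %s: %s" % (label, value))
        return value  # a real library would return the value after any drag

    def button(label):
        print("[button] %s" % label)
        return label == "Mute"  # a real library would hit-test the mouse; we fake a click

    def draw_ui(state):
        state["volume"] = slider("Volume", state["volume"])
        if button("Mute"):
            state["muted"] = not state["muted"]

    for frame in range(2):  # the render loop simply re-describes the whole UI each frame
        print("-- frame %d --" % frame)
        draw_ui(state)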

Among other places, IMGUI techniques are used aggressively in Unity's editor.

7
remon 7 hours ago 1 reply      
Well, if there is an award for the library name that is hardest to get Google results for, this one is a contender.
8
huhtenberg 5 hours ago 2 replies      
Not to state the obvious, but there are very few types of desktop programs where not having a native look and feel is OK, especially when the UI looks as unique as ImgUI's does: games, demos and perhaps larger application software like Blender that is meant for prolonged continuous use and runs full-screen.

ImgUI is a very nice engineering feat, but it's not terribly practical.

21
Uber Drivers Flock to Hamptons to Gain Partygoers' Fares
9 points by zabalmendi  1 hour ago   discuss
22
Vancouver tech surges as U.S. immigration reform idles
206 points by refurb  18 hours ago   257 comments top 22
1
inoop 14 hours ago 10 replies      
Earlier this year I interviewed for a research/engineering position at Apple. I was offered the job, took it, and went into the H1B lottery. It took the USCIS four months to let me know that my application was not selected, even though it was filed under 'premium processing', and the draw had been done using a computer in the first week after submissions closed. Since the job required me to be in Cupertino, the deal simply fell through.

Now imagine this: you worked your ass off for a total of nine years to get your master's and PhD, you spent countless hours honing your skills, and finally you get a dream job that merges awesome research, top-level engineering, and working on a product that millions of people use every day. And then it all goes tits up because of a lottery.

A. Fucking. Lottery.

The problem for people like me is that, if I want to work in the US, I have to go through a lottery that will keep me in uncertainty for months. Then, _maybe_, if I'm lucky enough, I can relocate only to have the same thing looming a few years down the line. I cannot build a life around such uncertainty. I cannot build a future like that.

So, I'm staying in Europe. Not having been born in the US means that a lot of jobs are just not open to me, not because it's impossible to get them, but simply because the hassle is just not worth it anymore.

2
paul 17 hours ago 4 replies      
I think if we ever start a second YC location, this will be the reason -- to help founders who are unable to get past the US immigration system.

It's unfortunate that our government is becoming increasingly dysfunctional at every level -- federal, state, and local governments are all failing to do their jobs. I'm glad that Canada is introducing some competitive pressure.

3
turar 16 hours ago 5 replies      
This is a repeat of the same story that happened the last time the H-1B limit didn't meet the demand for visas. US tech companies, most notably Microsoft, opened "offices" in Vancouver. What happens next is that those employees will work for one year in Canada, and then will become eligible for an L-1 intra-company transfer back to the US, side-stepping the H-1B quagmire.
4
walterbell 16 hours ago 2 replies      
With the recent cancellation of Canadian real-estate immigration visa, hopefully the Vancouver housing market will stabilize, http://www.theprovince.com/business/Will+cancellation+Immigr...

China has been allowing domestic yuan conversion to western mortgages, boosting real estate demand in Vancouver, NY, SF & elsewhere, http://www.vancouversun.com/business/Secret+path+revealed+Ch...

5
shane999 12 hours ago 1 reply      
Vancouver guy here. I work for a media company as a part of a ~60 member software team. The fact is that probably less than ten people in the team are actually from Vancouver. Several of them are people who lived in the US on H1B visas that they couldn't get renewed. They come here "temporarily" but then like the lifestyle and stay.

I totally agree that the messed up US immigration system has benefited us here. I have experience with both immigration systems and the process in Canada was much easier than the US.

6
bayesianhorse 6 hours ago 0 replies      
Stop this "immigration control" craziness. The reality in the US is that it is sometimes easier for low-skilled immigrants to live in the US than it is for high-skilled immigrants.

Why? Well, ask some of the 10 million+ undocumented immigrants. They can live a better life in the US compared to their home countries, even undocumented. Tech workers or even middle-skilled workers take more issue with being "illegal" or "undocumented".

I think all immigration restrictions have to be put on the stand. Are they practical? Do they have a measurable benefit (beyond some hand-waving about supply and demand)? Do they have measurable negative impacts?

7
deepGem 4 hours ago 0 replies      
The sad part is that this H1B visa cap will not be raised in the near future as the process is a part of the so called immigration reform (Dodd and Frank bill). Why someone would club legal and illegal immigration together is beyond my understanding.
8
jmspring 10 hours ago 1 reply      
This is a complicated topic that I've been waffling on responding to.

The situation around H-1B visas is messed up. But the mess is multi-sided. First off, there is a large amount of unemployment in this country in the IT sector (especially among older workers) and no apparent desire to fix it. Second, there are "sweat shops"/consulting bodies milking the system to underpay for what is often basic/menial work like CRUD development/etc. Third, you have the specialists at the whim of (2) and the number of allotted slots.

I think both (1) and (3) need to be fixed at the expense of (2). Any company bringing over labor that is basic and can either be filled by outsourcing contracts or simply training local labor should pay the price. Tata, IBM, Wipro, etc are at the top of that list.

The current system has few filters.

Edits: typos from posting using phone.

9
jbarham 16 hours ago 0 replies      
The main reason that companies like Sony Imageworks are in Vancouver is that the BC government pays hundreds of millions of dollars a year in subsidies to VFX studios if they relocate to BC. The subsidy trade war has been covered for years by blogs such as VFX Soldier (http://vfxsoldier.wordpress.com/).
10
fredgrott 54 minutes ago 0 replies      
My solution to the H-1B lottery mess:

Charge for the H-1B but have no quotas.

Let market demand and ability to pay determine how many H-1Bs come in.

But with a twist: the home country has to have a similar program for US workers wanting to work in that country, so that the economies of both countries have a chance to rise due to the flexibility of worker immigration.

11
gdilla 1 hour ago 1 reply      
I am a Canadian-American, and I wish Canada actually did more to exploit US immigration dysfunction. Unfortunately, I think the weather gives Canada a bad rap. I'm not even joking: Canada needs a tropical island. That's really the last step to compete with sunny California.
12
pskittle 16 hours ago 2 replies      
Has anyone here been hired by, or hires engineers for, an American company which has an office in Vancouver? Would you be kind enough to talk about your experience?
13
LetBinding 6 hours ago 1 reply      
Is there a Big Data / NLP / Machine Learning scene in Vancouver? I rarely see any Data Scientist positions being advertised in Vancouver.

I am a foreigner with a PhD in applied NLP from a US university and I have been looking at such positions in other tech hubs like Vancouver, Montreal, Berlin, etc.

But these types of jobs only seem to be in the Valley. I work in the Valley and I like it here but I want to move to a place where I can have a stable immigration situation.

US immigration may be cumbersome but the most interesting jobs in Big Data seem to be in the US.

14
Tiktaalik 9 hours ago 0 replies      
I've found it odd that for so long there haven't been many Vancouver branch offices of Silicon Valley tech companies, as it seems like a perfect fit. Vancouver offers cheaper employees, the same time zone as Silicon Valley, and an attractive, livable city* perfect for luring talent.

It's good to see that things are finally changing, even if for some companies it seems they're setting up purely for immigration purposes. Hopefully even if these US immigration issues pass, these companies will realize the advantages Vancouver offers and continue to stick around.

* Highest rated North American city and 15th overall according to Monocle Magazine's 2014 Quality of Life city rankings.

15
myth_drannon 14 hours ago 3 replies      
Just a personal example: a friend of mine was looking for a web dev position in Montreal for months until he found some low-paying, crappy job. He then decided to move to Vancouver and found a job within 3 weeks! The Vancouver job market is burning hot and startups are having difficulty hiring; I read that some are even considering setting up offices in Montreal.
16
Dewie 17 hours ago 6 replies      
> "I will not go to Canada," said the 25-year-old from Argentina of his initial reaction. "Twenty degrees below zero, are you crazy?"

Vancouver is perhaps the warmest major Canadian city, with a mild winter due to being coastal ('twenty degrees below zero' - probably not).

17
parennoob 16 hours ago 4 replies      
As someone in the Sisyphean green card system (processing taking forever for the initial application, 6-7 years to go after that for the actual green card, best case), I am really glad that Canada is offering better alternatives. I bet if they emphasize this angle (quicker permanent residency for non-contract employees), they will get a decent amount of talent that otherwise is stuck grinding away in Silicon Valley for the same employer.

To anyone of Indian nationality seeking a job in North America and interested in permanent residency: strongly consider the Canada alternative, for education as well as jobs. Unless the weather means a lot to you, ask your company if they have a Canada office where you can work when you start out. Things will be way easier.

18
jsudhams 9 hours ago 0 replies      
Is there any reason these companies don't allow working from home if they don't have an office in the respective country, or working from their offices in that country? The US B1 visa for meetings and discussions is easy to get as long as you have the necessary letters and proof.
19
notastartup 11 hours ago 2 replies      

    - B.C.'s tech industry has many exploitative employers.
    - B.C.'s tech market is underpaid, below the national average.
    - B.C.'s living cost on average exceeds the average salary.
    - B.C.'s high real estate costs result in a largely house-poor population.
    - B.C.'s political parties place far more emphasis on labor rights for nurses and people cutting down trees.
    - B.C. has a problem of bleeding talent to other provinces because of the above reasons.
Source: I live here.

20
patmcguire 13 hours ago 0 replies      
"B.C. boasts more than 600 digital media companies, employing about 16,000 people and generating $2.3 billion in annual sales, according to the commission"

While $2 billion is obviously a lot of money in real terms, think about what that is when you compare it to the collective revenues of Bay Area tech companies.

21
api_or_ipa 13 hours ago 0 replies      
As a young programmer within the Vancouver dev community, I can say that everyone from the UBC CS program to the local meetup scene to the local incubators, including GrowLab/LaunchAcademy, Invoke and Wavefront, has done an exceedingly good job of preparing for this increase in business in Vancouver.
22
eruditely 16 hours ago 0 replies      
I'm finding it ever more necessary to consider dual citizenship as an option for the medium-term future, as a US (South Bay) citizen.

I'm trying to work as hard as I can, start my own startup, and open a second location.

They are lost.

23
Webinar Invitation: Let's Talk Smart Contracts
9 points by yrashk  1 hour ago   discuss
24
Inside Apple's Internal Training Program
106 points by jameshk  13 hours ago   16 comments top 8
1
hobs 11 hours ago 1 reply      
I used to be a certified Apple trainer. This really paints their training program in the best light possible; it interprets fairly normal corporate training stuff as original ideas.

Most of the training programs are just like any well-made OTJ (on-the-job) training. The comment about employees signing up for job-specific courses is misleading: while they had some optional courses, it was just like any online course management system, a simple way to manage initial and ongoing job training.

In Apple's defense, their OTJ training was still better than at any subsequent company I have worked for.

2
amrrs 10 hours ago 0 replies      
Everything that Apple does is exaggerated. And this Picasso stuff - "Simplifying the Bull: How Picasso Helps to Teach Apple's Style" - looks like a nice PR stunt ahead of the new iPhone launch.
3
pcurve 11 hours ago 1 reply      
I was expecting a bit more from the article, but I noticed a couple of things:

1. The program was established in 2008, which I think is relatively late and coincides with the rapid deterioration of Jobs' health. I guess Jobs was deeply concerned about the long-term prospects of Apple.

2. As novel as the idea may seem, it didn't sound too different from garden-variety internal training programs offered at larger Fortune companies.

3. Sounds like Apple HR is growing. I'd be interested to find out what percent of employees fall under HR, before and after 2008.

It should be noted that Jobs was never big on performance metrics.

4
projectmayhem 6 hours ago 0 replies      
I once did "outside of Cupertino" training. The best bit was the pens & notebooks with subtle Apple emblems on them. It made me much cooler with peers when I did my master's.
5
ryanburk 10 hours ago 1 reply      
Jobs did co-found Pixar, but the way he did it was to fund the spin-out of people and IP from Lucas.

http://www.pixar.com/about/Our-Story and page over to 1986:

"Steve Jobs purchases the Computer Graphics Division fromGeorge Lucas and establishes an independent companyto be christened "Pixar." At this time about 44people are employed."

6
jameshk 12 hours ago 0 replies      
Thanks to Kogir or whoever renamed the post. I'm pretty tired right now.
7
gordon_freeman 7 hours ago 3 replies      
This is a fascinating inside view of Apple's culture, and the TV remote example given is why Google TV flopped and Apple TV sells well (although way less compared to their i-products). The article paints a picture of Apple's design-centric culture, where even engineers think top-down by starting with the user interface/experience and then moving towards the technical implementation. This shows in their products.
8
LeicaLatte 2 hours ago 0 replies      
Bull.

For bullshit.

25
Show HN: Simple algorithmic trading in JavaScript
68 points by Cloud9Trader  13 hours ago   37 comments top 11
1
chollida1 2 hours ago 2 replies      
There was a company last week that showed off their bitcoin algos, and I wrote a comment that basically said don't use them, and I felt bad about it for the rest of the week. I'll try to just be positive here :)

To weigh in on the mention of using doubles for finance....

Using double for finance: perfectly fine. All trading systems I've seen use double, from HFT systems to deep learning AI systems that open and close positions over months. Double is fine for most algo trading; heck, the exchanges and dark pools I've talked with use double.

What you can possibly do with this.

1) Learn how to write the logic of an trading algorithm.

2) learn the basics of technical trading, with MACD, Keltner channels, vortex and Bollinger band indicators. They've definitely put the time in to getting the indicators that FX traders like to use.

What you can't do with this system.

1) React to currency fluctuations on a tick-by-tick basis. FX is just so fast and precise; there is a reason that professional FX traders mix FX spot quotes from multiple sources (we use 4 at the fund I work at, and some use up to 10 sources). There is also a reason why FX is quoted to 4 decimal places while equities are quoted to 2, sometimes 3 for penny stocks.

2) Use this in a real-world setting. I can't see any privacy policy or contract indicating what happens when shit goes wrong. We are talking about money here; you can't just half-ass this kind of thing.

Plus they use FXCM, which had this news out about them:

http://online.wsj.com/news/articles/SB1000142405270230425560...

> LONDON - Foreign-exchange trading firm FXCM Inc. FXCM +0.08% agreed to pay fines and refunds totaling almost £10 million ($16.7 million) to settle allegations by a U.K. financial regulator that the company withheld profits from clients and failed to inform British authorities that it was under investigation in the U.S.

> The Financial Conduct Authority said that U.K. units of FXCM withheld £6 million from customers on foreign-exchange transactions between August 2006 and December 2010. The regulator said the broker pocketed profits when exchange rates moved in its customers' favor while a trade was in process, but it passed on losses that occurred on other trades.

2
ErikRogneby 9 minutes ago 0 replies      
Could anyone find a list of what markets this supports?
3
radikalus 59 minutes ago 0 replies      
I hope to not come off as too critical, but there's a ridiculously wide margin between this and what trading firms in the low latency space do.

I don't know of a single firm that's successful in the time horizon that a lot of pseudo-HFT systems operate in (the 10ms-3s range). And that's assuming an ideal fee situation...

Most of the people ignoring the LL arms race target RV opportunities in the 30s+ range. This involves taking a pretty big step back from microstructure/toxicity models.

4
mikkom 7 hours ago 1 reply      
> We believe algorithmic trading is simple

Trading might be simple but being profitable definitely is not.

5
pathikrit 8 hours ago 1 reply      
Looks cool! Another: https://www.quantopian.com
6
jgunaratne 2 hours ago 0 replies      
It looks like this is only for currency trading at the moment. Do they allow other assets to be traded?
7
CmonDev 8 hours ago 1 reply      
Writing critical code in JavaScript? That's interesting.
8
cwmma 5 hours ago 3 replies      
So there is a reason you don't see much financial stuff written in JavaScript: doubles plus finance really don't mix.
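
For anyone unsure what the concern looks like in practice, here is the standard illustration in Python (binary doubles behave the same way in JavaScript), along with the usual mitigations of integer minor units or a decimal type. This is a generic sketch, not anything from Cloud9Trader:

    from decimal import Decimal

    # Binary floating point cannot represent 0.1 or 0.2 exactly.
    print(0.1 + 0.2)         # 0.30000000000000004
    print(0.1 + 0.2 == 0.3)  # False

    # Mitigation 1: keep amounts in integer minor units (cents, or 1/10000ths for FX pips).
    print(10 + 20 == 30)     # True -- 10 cents + 20 cents

    # Mitigation 2: use a decimal type when fractional arithmetic is unavoidable.
    print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))  # True
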
9
foenv 9 hours ago 1 reply      
Sorry if I missed it - I couldn't see on your site which brokers you intend to connect with for live trading. Thanks
10
CWIZO 7 hours ago 1 reply      
Good job Kafton :) Glad you released it :)
11
hliyan 9 hours ago 1 reply      
Your login, register and demo links are not functioning.
26
Repowering California for all purposes with wind, water, and sunlight [pdf]
41 points by ejr  9 hours ago   31 comments top 6
1
beloch 7 hours ago 4 replies      
This paper proposes converting California's power generation over to wind and solar, with electric vehicles and electrically generated hydrogen vehicles replacing all others. All use of fossil fuels and nuclear power is to be phased out. Notably, it suggests that hydroelectric power should be used to balance loads. i.e. It proposes that California's power grid be stabilized with only hydroelectric power as an on-demand source of generation.

Firstly, in 2013 California imported 32.7% of its electricity. California has little control over how this is generated. Of the power used by California, 40.8% is from Natural Gas, 8.1% is hydroelectric, 6.0% is Nuclear, 4.3% is from wind power, 4.2% is geothermal, 2.1% is from biomass, and 1.4% is solar[1].

Natural Gas and Nuclear power are both excellent on-demand sources of power, and currently meet 46.8% of California's electricity requirements. If these power sources are to be phased out, they must be replaced with energy sources that are on-demand. Wind and solar do not fit this description. Hydro does, but quintupling California's hydroelectric capacity will have a huge impact on the environment. This paper greatly underestimates how much on-demand power generation capacity a power grid needs in order to be stable.

Side note: California currently derives little of its electricity from wind or solar power. Electric vehicle batteries carry a high environmental cost to produce, so it is imperative that the energy they are charged with be of renewable origin for any net environmental benefits to be reaped. Given that 40.8% of California's electricity currently comes from natural gas, it's clear that anyone plugging their EVs into California's grid is doing the environment no favors.

[1]http://energyalmanac.ca.gov/electricity/electricity_gen_1983...

2
schainks 3 hours ago 1 reply      
I'm seeing a lot of misinformation on this thread, and I think it would be useful to point hacker news at the Electric Power Research Institute (EPRI), which is a cited source in this paper. They are an engineering-based, independent, non-profit research company. And their headquarters is in the "heart" of Silicon Valley next to Xerox...

http://www.epri.com/

Full disclosure: I am not an EPRI employee, but I've read a lot of their papers and presentations. Their research is original and unbiased. Their engineering is pragmatic and chock-full of raw 100% reality. I wish some of the websites people are citing here were talking to places like EPRI first, but instead they write sensationalist headlines that hide details and misinform, making sane, coordinated discussion difficult.

I suggest hacker news check them out and maybe send some emails to get better information about this proposal and learn more how the grid truly functions politically, economically, and technically.

3
timthorn 7 hours ago 0 replies      
David MacKay's "Without Hot Air" is an excellent investigation of what needs to be done to power the UK. http://www.withouthotair.com/
4
schainks 3 hours ago 1 reply      
I'd like to take a moment to point out another really great "renewable" technology that is a clever arbitrage hack using power prices: Compressed Air Energy Storage. (http://en.wikipedia.org/wiki/CAES)

To me, this is "Grid 2.0" technology. You are moving energy from times where you have cheap excess and placing it on the grid in times of expensive need. If we are going to move to a grid with a lot of renewables, technologies like CAES and pumped hydro (https://en.wikipedia.org/wiki/Pumped-storage_hydroelectricit...) are two necessary ingredients.

The primary difference between CAES and pumped hydro is that CAES is cost effective for both medium (50+MW) and large (500+MW) installations, while pumped hydro is cost effective only at large scale (500MW+) installations.

5
leccine 8 hours ago 1 reply      
You still need a base power plant and a few others for controlling the amount of energy in the system. Using only renewables is kind of hard. If we can figure out a way to store energy such that we can access it very quickly with arbitrary output, we could move to renewables exclusively.
6
naland 3 hours ago 0 replies      
no need to think about wind, water, and sunlight, first break great suckers intel and ms.
27
How Times Square Works
62 points by prostoalex  11 hours ago   13 comments top 4
1
chton 1 hour ago 0 replies      
A few years back I developed control software for LED panels of the kind you see in Times Square. While that was for a different manufacturer, the hardware and control techniques aren't much different. The article makes it sound a lot more high-tech than it really is. They are interesting devices, especially when you get into strange shapes, but it's basically irrelevant how big your display is. The resolution matters, as well as the refresh speed. That's where the basic challenges lie.
2
limsup 10 hours ago 3 replies      
Times Square generates 11% of NYC's economic output? 385k jobs? I find these numbers difficult to believe.
3
afafsd 10 hours ago 1 reply      
There was a proposal a couple of years ago to allow giant advertising signs somewhere in the most blighted part of Market St, San Francisco. I saw a talk by one of the proponents (who just happened to own one of the buildings there), and he assured us that giant flashing signs would turn Market St into a Times Square or Piccadilly Circus. I was skeptical, but hey, I guess it couldn't hurt.
4
outside1234 5 hours ago 2 replies      
what a garish nightmare. yuck.
28
Expense Calculator
119 points by chicken_lady  16 hours ago   16 comments top 12
1
genericresponse 5 minutes ago 0 replies      
In case anyone doesn't realize: Ward is the same guy who created the Wiki.
2
skrebbel 6 hours ago 0 replies      
This is genius. A markdown for spreadsheets.

I wonder whether it would be possible to extend this concept to get more powerful features, inspired by spreadsheets (and programming languages maybe). Sometimes you need more than a sum or an average, but writing out the formulas in full, repeatedly, seems like a lot of cognitive overhead.

I see in this thread that Emacs org-mode has something like it, but I'm not convinced that stuff like

    #+TBLFM: @2$4=vmean($2..$3)
matches the humane, readable, Markdown-esque approach that I'd want to use.

3
zrail 15 hours ago 0 replies      
This is very clever. Of course, coming from Ward it's hard to expect anything less :)

Again, I'd like to plug http://www.ledger-cli.org. It's similar in spirit, with a reporting program reading a lightly formatted text file.

4
ovechtrick 14 hours ago 1 reply      
5
stiff 11 hours ago 0 replies      
You can do those kinds of quick reports in text files conveniently in emacs org-mode nowadays:

http://orgmode.org/org.html#The-spreadsheet

6
jetro223 7 hours ago 0 replies      
Clever solution - technically. For trips with my friends we use a "money-pot" system where everyone pays a fixed amount into a pot before the trip and everything we do or buy together is paid out of this pot. After the trip the calculation is much easier - refund = (remaining amount / persons) ;-)
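
For what it's worth, that settlement rule is small enough to sketch in a few lines of Python (the numbers here are made up):

    def settle_pot(contribution_per_person, people, total_spent):
        # Everyone paid a fixed amount up front; whatever the pot did not
        # spend goes back to the group in equal shares.
        pot = contribution_per_person * people
        remaining = pot - total_spent
        return remaining / people  # refund per person

    print(settle_pot(contribution_per_person=200.0, people=4, total_spent=650.0))  # 37.5
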
7
bernardeli 7 hours ago 1 reply      
I'm very impressed. That is the shortest version of expense calculation I've ever seen.

I wrote a Python script back in 2007 for calculating expenses for two trips: https://github.com/bernardeli/trip_money_organizer

I'm not a Python developer myself; however, I was pretty happy with the result. I know it works fine, and I have used it a few times with no issues.

8
simonnreynolds 10 hours ago 0 replies      
9
keyle 13 hours ago 1 reply      
Clever pre-Excel solution. In a way, ahead of its time, similar to Markdown.
10
jzwinck 14 hours ago 0 replies      
The program needn't be a shell script invoking awk--you can use a "shebang" line to make the entire thing an awk file. This will help if you use an editor which understands awk syntax (for highlighting etc.). Of course in 1981 you probably didn't have such luxuries, but today we can just remove the "exec awk", the outer quotes and $1, and put this as the first line:

    #!/usr/bin/awk -f

11
dnr 10 hours ago 0 replies      
These days you should use Splitwise: https://www.splitwise.com/

(Not affiliated, just a happy user. And there's a nice network effect if more people use it, so more people should use it.)

12
mcot2 15 hours ago 0 replies      
In 1981?
29
Show HN: Aerobatic Smart Hosting for Single Page Apps
65 points by jgowans  13 hours ago   27 comments top 7
1
soapdog 1 hour ago 1 reply      
I think this service is great and I have a question. Have you folks put any thought into allowing custom domains? I am asking because if someone hosts their app with you and, Eris forbid, you go under, they will have a hard time migrating their users to some new domain.

If you had some custom domain mechanism then people would feel safer hosting stuff with you because migrating away would be easier...

2
Ronsenshi 11 hours ago 1 reply      
Might be me, but it won't load in the latest Firefox. A bunch of blocked cross-origin requests for font files, and in the end:

> ReferenceError: ga is not defined

3
d0m 10 hours ago 1 reply      
It's actually pretty cool. I've just been through all of that painful setup personally. CORS, grunt, SSL, cloudfront, S3, deploy, different stages, minifying, zzz. As others have asked, how do you plan to price such a service? A feature from Heroku I really like is the "Rollback", maybe that's something you may want to add as an added benefit.
4
ericmsimons 12 hours ago 3 replies      
This is actually really great. The ability to proxy API calls is huge, and I think the only thing missing here is automatic SEO. If you guys could automatically spit out static HTML files for Google's crawlers, I would use this in a heartbeat. Do you have a free tier like Heroku? How are you pricing this?
5
wereHamster 12 hours ago 1 reply      
Login via GitHub but then I have to manually enter my email address? Why don't you request the 'user:email' scope and get the email automatically?
6
ing33k 11 hours ago 1 reply      
Blank page, but when I view source, it's showing the markup.
7
hackerews 12 hours ago 2 replies      
This is really awesome.

Front-end hosting + the easiest way to create a back-end API (https://api.blockspring.com) could be a sweet connection. Would love to chat. paul@blockspring.com.

Great work!

30
Pyrasite: Tools for injecting code into running Python processes
70 points by tombenner  13 hours ago   10 comments top 4
1
mau 2 hours ago 0 replies      
I had some "memory leak" issues a few months ago with the API hosts of the service I'm working on. As the code changes made during the days before didn't seem to have affected anything that could be related to this leak, I realized it was something about the requests we were receiving, but I wasn't able to replicate the same issues on my dev host or in staging. Eventually I dumped the memory of a running process using Pyrasite, and after hacking a little bit with meliae (https://launchpad.net/meliae) I was able to find the issue and solve it.

Very useful.

By the way, it's worth noting that you probably won't be able to attach the interactive console to a running web server, as the output is usually handled by the supervisor process; at least I wasn't able to do that on my first attempt, and the memory dump was good enough for me.

Check this sample to dump the memory out of a running process: http://pyrasite.readthedocs.org/en/latest/Payloads.html#dump...
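
A sketch of that workflow, assuming pyrasite's payload CLI and the meliae API work as described in their docs (the pid and paths here are illustrative):

    # dump_memory.py -- payload executed inside the target process, e.g.:
    #   pyrasite <pid> dump_memory.py
    from meliae import scanner
    scanner.dump_all_objects('/tmp/memory-dump.json')

    # Afterwards, inspect the dump offline in a normal interpreter:
    from meliae import loader
    om = loader.load('/tmp/memory-dump.json')
    print(om.summarize())  # object counts and sizes by type, largest first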

2
klibertp 1 hour ago 0 replies      
Wow, live inspection of running code without prior instrumentation is a great feature of Erlang that I always missed in Python. I used `rconsole` from the rfoo package (https://code.google.com/p/rfoo/) before, but this looks like a cleaner solution because you don't have to include server-spawning code in the code you want to inspect. Very nice.
3
tbarbugli 8 hours ago 2 replies      
Would be nice to know how this actually works; the docs don't say much about how the tool injects the code into the running process.
4
AnkhMorporkian 11 hours ago 1 reply      
I've used this a few times to debug running processes that were having small issues that weren't bad enough to justify restarting the system to put in a proper manhole, but bad enough to be affecting users. It worked perfectly to figure out what was going on, and to patch the naughty method.

It's one of those tools that you'll be glad exists when the need arises, but you'll feel a little dirty using it.

       cached 11 August 2014 16:02:01 GMT