Hacker News with inline top comments (28 Mar 2012)
1
Go version 1 is released golang.org
333 points by enneff  4 hours ago   76 comments top 10
1
bgentry 3 hours ago 0 replies      
Congrats to the Go team! I've been looking forward to this release for a while. Go makes it easy to do some very complex things, and much of this can be attributed to the wonderfully built standard library.

For those interested, you can run Go apps on Heroku using this buildpack: https://github.com/kr/heroku-buildpack-go

Note: you may need to use the #rc branch until its changes are merged in master

2
exch 2 hours ago 0 replies      
I've been using Go for 2 years now. It's been a lot of fun. It certainly brought fun back into programming for me. Very happy to see the 1.0 release hitting the public. Congratulations to the Go team and contributors for a job very well done!
3
SpoonMeiser 3 hours ago 4 replies      
"We're announcing Go version 1, or Go 1 for short"

I foresee the next version being considered harmful.

4
mace 3 hours ago 6 replies      
Anyone care to share their experience writing something in Go? I've toyed with it but not built anything in production.

FWIW, the go dashboard (https://godashboard.appspot.com/) has a number of interesting projects.

5
peregrine 3 hours ago 1 reply      
Been playing with the weekly versions of Go for a couple of weeks now. Highly enjoying it; my biggest pet peeve was that every library had a master and a "weekly" branch, which required me to manually pull down the weekly. Shouldn't be long before I can move to master!
6
TylerE 1 hour ago 1 reply      
Well, they got (at least) one thing exactly right: Doing away with the "toolchain". Having one command that can figure out how to build your code, including downloading external dependencies, is really nice.
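As of Go 1, that one command is the go tool; a typical flow looks roughly like this (the import path is hypothetical):

    go get github.com/example/project   # fetch and build a dependency
    go build                            # build the package in the current directory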

Pity about the whole "not having generics" thing though.

7
Meai 3 hours ago 2 replies      
I'm still wishing that we could embed Go into a multithreaded C application, just like we can with the JVM and Mono. I would like to use Go as a "scripting" language. I have seen a thread on the mailing list where someone was working on it on a personal basis, but I can't find it anymore. It doesn't seem to be in high demand.
8
jasonlotito 3 hours ago 2 replies      
If anyone from the Go team reads this, http://golang.org/doc/go_tutorial.html linked to from the blog gives a file not found error.
9
melling 3 hours ago 3 replies      
Hitting 1.0 should convince more people to use it in production. However, getting any new language in an organization is always a struggle. Is there a list of organizations that currently use it in production?

[Update]

A few high-tech companies were listed. I was hoping more for big pharma or finance; there are some super conservative firms out there. Even Google, for example, will only allow a few languages in their firm.

10
Osiris 1 hour ago  replies      
Does Go support Win32 development?
2
Wind Map hint.fm
51 points by taylorbuley  36 minutes ago   5 comments top 5
1
gibybo 0 minutes ago 0 replies      
I am used to seeing wind markings on aviation maps (e.g. http://www.aviationweather.gov/adds/winds/) but it's pretty cool how much clearer this map is. It makes me wish I could see a much larger portion of the world so I could get a better understanding of the currents that are entering/exiting the US.
2
lutorm 1 minute ago 0 replies      
Beautiful visualization! It would be interesting to have an underlay showing the topography so one can correlate features in the wind pattern with mountain ranges, etc.
3
th0ma5 0 minutes ago 0 replies      
Been wanting to do this myself for like forever. It looks like they are using high-level summary data from the NWS... They also have this amazing thing called MADIS, which focuses on microclimate research (and more)... So in theory you could get a high-resolution version of this for San Francisco or NYC, and that'd be really great and possibly rather useful too. I think Weather Underground is a subscriber of MADIS, and their "rapid update" feature has some of this; I should also mention that some of this is collected by ham radio people in their spare time.
4
Edootjuh 1 minute ago 0 replies      
Very cool. It's interesting to see how elevation affects the wind here (http://www.theodora.com/maps/new9/usa_elevation_map.gif), but I wish there were a worldwide version. (I'm aware that you need a data source.)

Should you be interested, here's data from the Netherlands: http://www.knmi.nl/samenw/hydra/cgi-bin/register.cgi

5
blueski 1 minute ago 0 replies      
Beautifully done. Just discovered you can zoom in by double clicking.
3
Building the worst Linux PC ever: 6 hours to boot Ubuntu hackaday.com
138 points by voodoochilo  2 hours ago   20 comments top 10
1
reginaldo 20 minutes ago 1 reply      
These crazy projects are the most fun. As Richard Feynman famously said: "What I cannot create, I do not understand".

shameless plug below:

Last year, inspired by Bellard's jslinux, I too wrote an emulator that can run Linux in the browser. Only I was lazy and emulated the vastly easier LatticeMico32 processor.

Anyways, the result was very intellectually satisfying.

After writing the interpreter, I went ahead and wrote a version that generates Javascript code on the fly (and captures up to 3 backwards jumps to the same block), for massive speed ups.
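To make the trick concrete, here is a toy Python sketch of that kind of on-the-fly code generation: straight-line blocks of a made-up two-instruction ISA are compiled into native functions once, cached by program counter, and reused on every later pass. The ISA and caching policy are invented for illustration; the actual emulator targets LatticeMico32 and emits JavaScript.

    block_cache = {}

    def compile_block(program, pc):
        # Translate "add" instructions up to the next jump into Python source,
        # then exec it into a single callable -- done once per block.
        lines, body = [], pc
        while program[body][0] == "add":        # ("add", reg, const)
            _, reg, const = program[body]
            lines.append("    regs[%d] += %d" % (reg, const))
            body += 1
        target = program[body][1]               # ("jmp", target) ends the block
        lines.append("    return %d" % target)
        ns = {}
        exec("def block(regs):\n" + "\n".join(lines), ns)
        return ns["block"]

    def run(program, steps):
        regs, pc = [0, 0], 0
        for _ in range(steps):
            fn = block_cache.get(pc)
            if fn is None:                      # compile on first visit only
                fn = block_cache[pc] = compile_block(program, pc)
            pc = fn(regs)                       # execute the whole block natively
        return regs

    # A two-instruction hot loop: increment r0, jump back to 0.
    print(run([("add", 0, 1), ("jmp", 0)], steps=1000))   # -> [1000, 0]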

Anyways, it doesn't serve any purpose, but boy was it fun...

The code is at:
https://github.com/ubercomp/jslm32/

And there's a demo running on:
http://www.ubercomp.com/jslm32/src/

BEWARE: It only works well on Chrome (takes download time + 10s to boot on my machine).

If anyone is interested in this stuff, just ask and I'll write a post describing what I did to take boot time from 2.5 minutes to 10 seconds.

2
ajross 1 hour ago 3 replies      
The linked blog post misses the interesting bits. Yes, it boots Ubuntu (slowly) via an ARM emulator written for AVR. That's just software.

What struck me is that he wrote a controller for an 8 bit FPM DRAM bus instead of just using a big SRAM. That's surprising, and not at all trivial to do over a bunch of GPIO pins.

3
mhd 1 hour ago 0 replies      
From the actual web site[1], describing the video of the boot process[2]:

"The raw video is in a few segments, since I had to change camera batteries a few times while filming."

[1]: http://dmitry.co/index.php?p=./04.Thoughts/07.%20Linux%20on%...

[2]: http://www.youtube.com/watch?v=nm0POwEtiqE

4
iliis 22 minutes ago 1 reply      
Impressive. And now I'm wondering if a Minecraft inside Minecraft would actually be realizable. As 8-bit CPUs have already been made (although with unknown architecture), it is a bit less crazy to imagine running Linux with a Java stack on it. It would be hellishly slow, and you would probably need some sort of "hardware" graphics acceleration. And a lot more memory than the current versions. But hey, running Minecraft in Minecraft would be pure brain-melting awesomeness!

Maybe some sort of Hardware Description Language to Minecraft compiler would be useful.

5
jonbro 1 hour ago 0 replies      
This is an amazing hack. Well done to this guy, and his totally useless project.
6
stcredzero 1 hour ago 2 replies      
The best part about this is the battery pack in the picture.

I wonder if there's a niche for the equivalent of the Model 100 in today's world?

http://www.trs-80.com/wordpress/trs-80-computer-line/model-1...

A fully mobile general purpose computer with instant boot, impressive battery life, running off 4 AA's.

The Wikireader comes close as the modern equivalent. The input method would have to be something like Siri, with an optional keyboard. With a form factor and batteries from an iPad 3, but with a slower processor and a fast refresh eInk screen, you could have truly phenomenal battery life.

7
agumonkey 46 minutes ago 0 replies      
As I said earlier, he needs to run jslinux on this stack. It will be like `computing slowmo @ 1Mfps`
8
joshu 1 hour ago 1 reply      
I disbelieve. How could this possibly boot to X without a framebuffer emulator too?
9
WiseWeasel 1 hour ago 1 reply      
I'm sure he could coax at least a few more hours out of it by somehow finding a working 1x CD-ROM drive to boot a LiveCD from.
10
mathieud 45 minutes ago 0 replies      
What's really impressive is that it still works, somehow...
5
Socialcam 4.0 Launches Today techcrunch.com
28 points by mwseibel  52 minutes ago   4 comments top 2
1
sethbannon 14 minutes ago 0 replies      
It's surprising how much adding a soundtrack changes the whole feel of a video. Great work, guys, keep it up.
2
mwseibel 41 minutes ago 1 reply      
This is the biggest release we've done since video filters - Ammon and Guillaume are all-stars! Socialcam = Army of Three
6
Jessamyn Smith: Fighting sexist jokes with a Python bot dreamwidth.org
363 points by dpritchett  8 hours ago   241 comments top
1
hythloday 7 hours ago  replies      
"It has been fascinating to watch the ongoing reactions. There have been complaints that we have too many bots in the channel now. There have been complaints about it spamming the channel. There were several “Make them shut up!” responses. These are not reactions I have seen the other bots elicit, certainly not with such intensity. One person even complained about the name being too long, though to his credit he realized right after he said that that several other people in the channel also have very long handles. To me, all of this seems like typical geek behaviour: something is making them uncomfortable, and so they attack it on “rational” grounds. Most likely, they aren't even aware of the gut reaction fueling their logic."

This is excellently picked out: most of the disagreements over the term "sexism" hinge around this point. Stuff that doesn't bother you (that another poster here describes as being "tuned-out"), as a man, because your gender plays no part in your job, is experienced differently by your female colleagues, because their gender does. Their experience of seeing "that's what SHE said" (and the death by a thousand cuts of other gender policing) is just as distracting and annoying as the "that's what she really said" bot. If your reaction to this post is "the first one seems harmless but the second is really overstepping the line", please take a few minutes to consider how you would feel if everyone's reaction to that (and everything else you objected to) was "just lighten up".

8
What Adobe's new pricing for Flash means for social game developers brianrue.wordpress.com
16 points by brianr  46 minutes ago   7 comments top 3
1
guard-of-terra 27 minutes ago 1 reply      
How would that even work? How would they charge a share of revenue for the use of the runtime? From both technical and legal sides?

I actually think it would be a huge hassle for development; it would cost more than those 9%: increased cost of development, less skilled developers, disasters when the complicated scheme doesn't work, sales ping-pong expensive in both time and money.

I think it would especially hurt overseas developers (think Asia).

Over the net, any process that requires approval from both sides (a contract, perhaps) goes a hundred times slower than a process that doesn't (the current distribution of the Flash player and tooling doesn't). Increasing the distance, in the form of borders, cultural differences and raw miles, slows it down even more. And certainly Adobe would hurt their potential partners by understanding their business poorly and trying to apply the same set of expectations worldwide.

It seriously would not fly. Bye bye flash.

2
protomyth 31 minutes ago 1 reply      
IE 10 touch and iOS won't work with Flash anyway, so I would hope social game developers are at least looking at HTML5 for their next projects.

I sometimes wonder why there isn't a cross-platform "game browser". It seems one of the cross-platform game engines could do a much better job than Flash. Adoption is a problem, but people seem to be able to download a lot of apps these days.

3
michaelpinto 43 minutes ago 0 replies      
I think the last time I was this angry about a software pricing scheme was when Movable Type started charging; that really pushed a generation of folks into the WordPress camp for good. My bet is that we could be watching the decline of Adobe. And, not for nothing, for you folks looking to start a business: the entire Adobe product line is waiting to be disrupted. The time may be right for a Photoshop or an InDesign killer. The entire CS line is bloatware at this point...
10
Kids these days: the quality of new Wikipedia editors over time wikimedia.org
70 points by vgnet  3 hours ago   24 comments top 10
1
redthrowaway 1 hour ago 1 reply      
It's not even a new user issue. I've been an editor for going on 5 years, and I still get templated and reverted on inconsequential edits.

Case in point: I recently edited this picture: http://en.wikipedia.org/wiki/File:Turing_Machine_in_Golly.pn...

It's a screenshot of the 6 octillionth generation of a Turing Machine built in Conway's Game of Life, using the GPL'd program Golly. The previous image contained Windows GUI elements (title bar, etc) which are non-free and cannot be used on WP when a free alternative exists. So I cropped it and uploaded the edited image.

Within minutes, I had received a template on my (seldom-visited) commons talk page informing me that I had uploaded a file without specifying its license, and that it would be deleted. Take a look at the "Permission" field on that image: "See LICENSE.TXT distributed with Golly for GPLv2 license"

Despite this (incredibly clear) assertion that the image was GPL'd, I received a warning that it would be deleted. Why? Because I hadn't included the "This image covered by the GPL" template that a) I didn't know existed, b) there was no mention of on the upload page, and c) is a wordier version of what I wrote in the license field.

As an experienced editor, I'm used to these stupid quibbles and time-wasting fights. I'll still contribute, although they are a large part of why I don't contribute more. As a new editor seeing this, however? I would have told them to fuck off, got banned for incivility, and never gone back.

There's an attitude among the regulars that Wikipedia is a treasured resource that must be defended against innumerable vandals, trolls, and spammers by a select cadre of noble volunteers. To an extent, they're right. But when you have such badges as "The Defender of the Wiki Barnstar" [1] being held up as the height of achievement for veteran editors, it engenders a culture that is exclusionary, if not actively hostile, towards new editors.

[1] http://en.wikipedia.org/wiki/File:WikiDefender_Barnstar.png

2
jonnathanson 2 hours ago 5 replies      
I'm glad that they're finally coming around to the realization that Wikipedia has become increasingly closed to new contributions, and that they've stopped touting the (patently absurd) hypothesis that new users just don't "get it." (The fact that they'd even think, let alone think first, to blame the users is just a giant head-scratcher).

As a simple UX experiment, I would ask new users this: try to contribute substantively to any article on Wikipedia. Just try it. Make a good-faith, high-quality edit to a page, and see how long the edit is allowed to stand. More likely than not, the contribution will be automatically reverted, within milliseconds, by a bot. If it's not, it'll be hand-reverted by a hardcore Wikipedia editor -- part of the statistically small, but disproportionately powerful cadre of self-appointed content cops, who seem to see their jobs as being bulwarks against change. In its zeal for the trappings of due process -- attributions, anti-"vandalism" policework, source checks, guidelines, and so forth -- this clique has lost sight of the net effect it's had on the site, which is to calcify and close off the free exchange of information that was so crucial to Wikipedia's early growth.

IMO, Wikipedia has faced a fundamental challenge in recent years: namely, that content-quality efforts have threatened new content volume. I don't envy this strategic predicament, being forced -- quite literally -- to choose between quantity and quality. It's not an easy balance to strike, and, given the circumstances, Wikipedia's historic track record is quite admirable. Recently, however, the balance has tipped too far in the direction of quality-policing. And now it's starting to undermine the core tenets of the project. I remain optimistic that Wikipedia (and/or the Wikimedia Foundation) can right the ship. But it'll have to mean a substantial uprooting of some bad seeds that have been allowed to take hold for years now.

3
tokenadult 7 minutes ago 0 replies      
Like a lot of people here, I have sporadically read Wikipedia for years. I came on board as a Wikipedia editor in May 2010 after meeting the project's co-founder and his family in person the year before. I've since seen some of those immediate family members on another lengthy occasion. My children regularly interact with that family in an online education community. Through a web of mutual friendships, I thought I had some sense of what the community norms would be like on Wikipedia before I started. Moreover, I began editing only after reading the several published books about Wikipedia available at that time (as disclosed on my Wikipedia user page), and came on board as someone who has actually had both academic journal editing positions and paid journalism editing positions before Wikipedia even existed.

Even at that, I get a lot of well sourced edits reverted by ideologically motivated drive-by I.P. editors as part of an ongoing process of edit-warring on articles that I happen to know sources for.

http://en.wikipedia.org/wiki/Wikipedia:Arbitration/Requests/...

For some controversial topics, no amount of good-faith editing by editors who actually know how to look up reliable sources and how to have civil discussions of controversial issues can overcome the flood of point-of-view pushers (both I.P. editors and registered editors who are sock puppets or meat puppets of previously banned editors) who want to drag down the project to below the level of a partisan blog. There simply isn't any incentive in today's atmosphere on Wikipedia for readers who actually know what encyclopedias look like and who have actually engaged in careful research on controversial topics to devote any of their time and effort to Wikipedia.

My number of edits per month has plummeted, and mostly I wikignome to clean up copyediting mistakes on miscellaneous articles written by young people or foreign nationals who didn't write grammatical English in the last revision of the article. The way to increase participation by productive, knowledgeable, literate editors is to drive away the ideologues and enforce some reasonable behavioral norms on article talk pages and user talk pages. I see no sign of that happening over at Wikipedia, and until I do, I will heartily support anyone's effort to build a competing resource, either limited to a specialized topic or a direct attempt to build a better quality general online encyclopedia.

I think the "Lamest Edit Wars" page in project space sums up much of what is amiss about Wikipedia.

http://en.wikipedia.org/wiki/Wikipedia:Lamest_edit_wars

4
blahedo 2 hours ago 2 replies      
It's good to see real data to address this question, rather than the neverending stream of basically anecdotal information about the problem that we've been slinging around. It's especially interesting to see the percent of not just "good faith" (flawed but well-intended) edits that are reverted but also the percent of "golden" (actually contributing) edits that are reverted; this sort of hostile drive-by is discouraging even for experienced editors.

One thing I see in the graphs is that the "survival rate" of editors who made a good-faith first edit was already in relatively steep decline by 2005, but the same rate for editors whose first edit was golden remained on a high plateau through 2006 and then just categorically dropped off a cliff. What happened then?

5
nowarninglabel 2 hours ago 0 replies      
"What this means is that while just as many productive contributors enter the project today as in 2006, they are entering an environment that is increasingly challenging, critical, and/or hostile to their work. These latter findings have also been confirmed through previous research."

This confirms to me what I argued on HN last year: http://news.ycombinator.com/item?id=3272926 and http://news.ycombinator.com/item?id=3273204

6
Alex3917 2 hours ago 1 reply      
This ignores the fact that the standard for what is considered a good edit has risen dramatically over time. Most of the featured articles from the early years of the site no longer even meet that standard. There are 3,500 featured articles, and almost 1,000 that have become unfeatured as standards have risen.
7
Klinky 2 hours ago 0 replies      
NPR's Talk of the Nation recently had a segment devoted to the bureaucratic run-around that can happen on Wikipedia.

http://www.npr.org/2012/02/22/147261659/gauging-the-reliabil...

8
jes5199 18 minutes ago 0 replies      
I was surprised to see that “Assume good faith” was a principle of Wikipedia - it doesn't match my experience with the community at all.
9
nwj 2 hours ago 1 reply      
Is it possible that the opportunities to make quality edits have decreased as Wikipedia has matured?

What I have in mind is the possibility that topics that haven't been well written out are like low-hanging fruit and are easy to positively contribute to. As wikipedia has expanded, that low hanging fruit has disappeared to some extent, and thus new editors have fewer opportunities to actually provide quality edits.

This explanation does not place blame on either newbies or pre-existing users who are "content cops". Instead, it proposes that the difficulty of providing a "golden" edit increases as we move forward in time. If that's a problem (debatable, since it means wikipedia as a product is better than it was previously) then it strikes me as a difficult one to solve.

10
cnspcs-cmplr 2 hours ago 0 replies      
Oh, the good old days. I went and inserted some Irish Evil into Wikipedia in commemoration.
11
The new Comcast Xbox Xfinity app is the first nail in net neutrality's coffin extremetech.com
135 points by SkippyZA  5 hours ago   73 comments top 11
1
JumpCrisscross 4 hours ago 7 replies      
This mangles two issues - discriminating based on source versus path.

Source discrimination is bad. The Internet allows applications and services to run “at the edge” of the network and not centrally; this encourages innovation [1].

Path discrimination is more complicated. Many content providers already pay private networks to transport their traffic faster than the public internet [2]. There is even a market for traders paying tens to hundreds of thousands of dollars for low-latency connections [3].

Given that ISPs charge each other for transporting traffic [4] it costs more to broker traffic across others' networks versus its own. It doesn't seem unfair for the ISP to charge less for the latter. This would allow Comcast et al to compete with the Akamais and Level3s that irk them today [5]. Sooner or later they will find it makes more sense to offer the discount for same-network traffic to everyone.

Comcast has made shitty statements about net neutrality before [1]. But it's not okay to vilify anything Comcast says by virtue of it being said by Comcast - that's straight up ad hominem.

[1] http://www.stern.nyu.edu/networks/Economides_Net_Neutrality....

[2] http://techcrunch.com/2010/11/11/level-3-lands-netflix-strea...

[3] http://www.highfrequencytraders.com/article/682/options-it-o...

[4] http://blog.teracomtraining.com/how-isps-connect-to-the-inte...

[5] http://blog.comcast.com/2010/11/comcast-comments-on-level-3....

2
roboneal 4 hours ago 5 replies      
Comcast claims that the app turns the Xbox into essentially a set top box, and that all data is streamed over Comcast's "private" network capacity and does not use any of the traditional public-facing internet infrastructure.

Standard usage of "On Demand" programming from a DVR or other set top box does not count against the existing data cap quotas.

If this app essentially allows an XBox to plug into this private network capacity like any other set top box, I think this is an important distinction.

3
recoiledsnake 3 hours ago 0 replies      
>Comcast's FAQ strongly implies that Microsoft is compensating it in some fashion for the new service; the document states several times that the Xfinity app is only available to those with an Xbox Live Gold subscription

>For companies like Comcast, which has railed against the concept of being a dumb pipe, Microsoft's decision to pay it for free access for Xbox Gold users is a major coup.

Why does the article take as a given that Microsoft is paying Comcast? What if the arrangement is that MS gets more value added to Xbox Live Gold subscriptions and Comcast gets more value added to its TV service? Now this may not be the case, but it doesn't seem any less likely than the article's assertions, given what we know.

4
jsz0 1 hour ago 0 replies      
About 90-95% of the bandwidth available 'on the wire' is already being used to deliver Comcast's video service. So if this is the first nail it's been there for an awful long time. The way traditional VOD is delivered (QAMS full of MPEG2 programs) is actually very inefficient. There may be 4-8 QAMS just sitting there inactive if your neighbors aren't watching VOD. The switch to IP delivery eventually will allow Comcast and other MSOs to use their bandwidth more efficiently.

From the customer's perspective nothing really changes. Technically your modem will probably be provisioned differently to support the extra services. So for example if you buy a 50Mbit/sec Internet package and a video package from Comcast your modem would actually be provisioned with multiple service flows -- a 50Mbit/sec for Internet traffic and another 50Mbit/sec reserved for Comcast services. That second 50Mbit/sec service flow allows you to have the same video service functionality as the 2-3Gbit/sec of broadcast video they presently waste 90-95% of their spectrum on. This will be reclaimed for the big general-purpose IP data pipe. Comcast will continue to use some percentage of that pipe for their own services but it will be a much smaller percentage than they use today.

So really everyone wins in the end. IP set tops are cheaper than traditional cable set tops. Consumers get to use Comcast services integrated into devices they already own. Comcast's competitors get a bigger dumb-pipe into people's homes to ride on. I admit it looks bad if you don't understand the technology but it's important to remember the bandwidth crunch that Comcast and other MSOs have is directly related to how they presently deliver their own services. Any effort they make to solve that problem is good for consumers in the end.

5
pasbesoin 4 hours ago 2 replies      
Here are a couple of questions to consider:

Does Comcast allow a user to purchase additional bandwidth beyond their 250 GB cap?

If so, how much does that cost?

It's not just a cap, it's a hard limit. If you want more, you are forced to go with the ISP's own "blessed" option.

This is a significant portion of what bothers me with these caps. They are not graduated in a reasonable fashion. Instead, they are a cliff -- either entirely, or through absurdly high "additional bandwidth" costs, limiting the service that the user can receive.

THEN, the ISP comes along and offers the user a sole way around/past this limit: Purchase whatever subset(s) of the additional service exclusively via the ISP's "value-added" content. (You can only have more bits if you buy your movies (well, movie viewings) from us.)

THAT, my friends, is a monopoly. Especially when you only have one or two ISP options, and they're all doing it. (Again, to you overseas, this is the case for much of the U.S.)

6
ricardobeat 1 hour ago 0 replies      
You can see the danger of this right here in the comments. Everyone is fine with it depending on the way it's presented.

Imagine that suddenly your smaller, lesser-known favorite sites are all bandwidth limited. There are just a couple hundred sanctioned ones that you get "for free". Small endeavours like Reddit or HN don't have a chance to grow fast anymore. That's the future if you are ok with this.

7
gioele 3 hours ago 0 replies      
I do not like this trend, but my fear is that Wikipedia is being used in a similar way. You can argue against free Xbox traffic, but can you argue against free Wikipedia traffic?

Past HN topic: https://news.ycombinator.com/item?id=3505922

8
unreal37 4 hours ago 1 reply      
I don't understand the outrage here. Comcast is offering something for free to its customers that other ISPs' customers have to pay for. It's good to be a Comcast customer, I guess.

A lot of the arguments in the article and even some in this comment thread are "imagine if"... Either something is wrong on its face or it's not. You shouldn't have to come up with theoretical examples of Comcast charging for unlimited access to YouTube and Netflix in order to make your point.

My cell phone service by the way (Bell Mobility) offers free unlimited access to Twitter, Facebook and Myspace that don't count against my mobile data plan cap. If you consider that against net neutrality, then net neutrality died a long time ago.

9
guelo 5 hours ago 3 replies      
Our government's failure to enact network neutrality is the latest sign that it is completely driven by corporate interests and will work against society's overall interest when there is a conflict. The only way to get any true democracy anymore is by creating mass panic as with SOPA, but it isn't possible to achieve that with every issue, and corporations are tirelessly relentless. I believe that without a functioning government this superpower is in a period of decline which will be characterized by increased authoritarianism and chaos as we go from crisis to crisis.
10
doki_pen 2 hours ago 0 replies      
This announcement is contrary to a statement made in July of 2011. Do they suddenly not give a shit about other customers service if it's xbox live traffic?

"If someone's behavior is such that it degrades the quality of service for others nearby -- that's what this threshold is meant to address," said company spokesman Charlie Douglas. "It can negatively affect other people."

http://hothardware.com/News/Comcast-Cuts-Customer-Off/

11
spindritf 5 hours ago  replies      
Maybe it's different here in Poland, where ISPs really have next to nothing to offer except for Internet access and maybe some VoIP services, but the whole discussion around net neutrality sounds panicky to me. Using your ISP's services is obviously technically easier and cheaper for them so why wouldn't/shouldn't it be reflected in prices?

Maybe we'll reach a point when consumer connections start to be billed like business ones are now (95th percentile, or whatever) but I don't think there's anything wrong with pricing that better reflects real usage. It may turn out to be a little more expensive for some of us, here on HN in particular, but if we want a better Internet infrastructure, we'll have to pay for it because AFAIR ISPs' ROIs aren't particularly impressive.

12
Timeline - A JS Library For Beautiful Timelines That Are Easy & Intuitive To Use functionn.blogspot.com
64 points by noob007  3 hours ago   13 comments top 10
1
udp 2 hours ago 1 reply      
GPLv3 is quite limiting for a JS library. The source file[1] also has no license in the header, just a copyright.

[1] http://veritetimeline.appspot.com/latest/timeline-min.js

2
noob007 3 hours ago 0 replies      
3
atestu 2 hours ago 0 replies      
This is gorgeous.

Thank you for making it open source.

Reminds me a lot of Simile Timeline (by MIT: http://www.simile-widgets.org/timeline/) but this looks so much better!

Thank you.

4
m_ke 56 minutes ago 1 reply      
Looks good from the pics, but on my iPhone it makes the website impossible to use. Scrolling and pinch-to-zoom don't work.
5
gbaygon 2 hours ago 0 replies      
Your blog adds no content, please post direct links. Thanks.
6
kodablah 2 hours ago 0 replies      
Very neat. There are no examples for parsing times instead of just dates that I can see (but it appears VMM.Util.parseDate supports it). Something like a timeline for server monitoring data would be a good use of this. Also, I would hope for a little more commercially friendly license for a library like this instead of the GPL.
7
saurabh 2 hours ago 0 replies      
Try the zoom buttons to experience the slickness. Very cool!
8
silverlight 2 hours ago 0 replies      
Very neat!
9
jaequery 2 hours ago 0 replies      
beats the heck out of facebook timeline!
10
thornofmight 44 minutes ago 0 replies      
Very cool.

I want to create a journal/blogging app which uses this as its main user interface.

13
Sealand, HavenCo, and the Rule of Law illinoislawreview.org
103 points by rdl  5 hours ago   42 comments top 8
1
algoshift 21 minutes ago 0 replies      
Can a privately launched satellite in orbit be considered free from control by any one nation? Would my satellite be sovereign territory?

Might this be an idea for the future of DNS and highly-independent web hosting in some form? No cooling problems, for sure!

Of course, there's the minor issue of a connection to the wired internet...

I want a .orbital TLD!

Forget the "cloud" -- how about "space"?

Hmmmm, isn't there a prominent web entrepreneur who's building a rocket company?

2
moonchrome 2 hours ago 5 replies      
If I was a billionaire I would probably buy Nauru[1]. It's a 21 km² independent island state with <10k people and a GDP of ~$30M, which comes to about $3,000 per capita. I would offer $50,000 to every citizen (more than they will earn in 15 years), which comes to $500M in total, to abdicate authority through a democratic process and move off the island (or possibly stay and work for me). Then build your own benevolent dictatorship that's already recognized internationally as a state.

[1] http://en.wikipedia.org/wiki/Nauru

3
rdl 5 hours ago 3 replies      
I realize this is 10-12 years ago, but the analysis here is quite interesting in applicability to other cases -- Megaupload, jurisdictional issues in general, etc.

The summary of this over at Ars Technica is great too. (http://arstechnica.com/tech-policy/news/2012/03/sealand-and-...)

4
ChuckMcM 3 hours ago 0 replies      
This should be mandatory reading for "Internet generation" entrepreneurs.

It is a clear exposition on why people need to engage their local institutions to change the law to meet their needs, rather than to try to create from whole cloth new institutions. When folks argue that changing the law of the land is 'hard' I do not disagree, but when they say creating something new from scratch is 'easier' I do. Like a manager who just sees the effects of the software and not how it achieves those effects, creation tends to look easier to the ignorant than to the experienced.

Anyway, there are great lessons to be learned in the story of Sealand.

5
stcredzero 2 hours ago 2 replies      
My version of a datahaven:

A series of small cases with a 12-volt auto plug, a small fanless computer, some batteries, and a 3G wireless connection. Hire some vehicle operators to keep it plugged in while they are moving. Once started, they will need a daily activation code to keep operating; otherwise they erase their hard drive encryption key and deactivate, requiring the encryption key to be re-supplied to reactivate.

The small cases will constitute a low-powered cloud with no permanent address. If a case is seized in a raid, it becomes inactivated.
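A minimal Python sketch of that daily-activation idea, assuming a shared secret provisioned out of band and a made-up HMAC-of-the-date code scheme (the path and wipe policy here are illustrative, not part of the comment's design):

    import hashlib
    import hmac
    import os
    import sys
    from datetime import date

    KEY_PATH = "/secrets/disk.key"  # hypothetical location of the drive key
    SECRET = os.environ.get("ACTIVATION_SECRET", "").encode()

    def expected_code(day):
        # One valid code per day: HMAC of the ISO date under the shared secret.
        return hmac.new(SECRET, day.isoformat().encode(), hashlib.sha256).hexdigest()[:8]

    def check_in(code):
        if hmac.compare_digest(code, expected_code(date.today())):
            return  # correct code: keep operating for another day
        # Wrong or missing code: erase the drive encryption key and deactivate,
        # so a seized or abandoned case holds nothing readable.
        if os.path.exists(KEY_PATH):
            os.remove(KEY_PATH)
        sys.exit("deactivated; key must be re-supplied to reactivate")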

6
samstave 2 hours ago 1 reply      
I sent the article over to Ryan Lackey via Quora to see if he had any comments. Maybe he'll come post here as well.

(Ryan Lackey was founder of HavenCO.)

HAHA edit: RDL is already in this thread! I sent before looking at the comments :(

7
wisty 2 hours ago 1 reply      
I think the review plays down one thing - time. Sealand may (I'm not a lawyer, and certainly not an expert in the laws of statehood) become a state if:

* It stays independent.

* It obtains a sustainable population.

* Commercial independence. Not just as a "data haven", but hydroponic food production, and stuff being built there (possibly just IP, but more than just what you get from legal arbitrage).

* A functional community, with a rule of law.

Not too likely. The platform is simply too small to support any of this.

8
shawnbaden 58 minutes ago 0 replies      
I'm ignoring the core issue here but I just want to say:

SEALAND IS AWESOME

The location and structure itself I mean. What a sweet pad that'd be. Sure, it makes no sense from an economical standpoint. Or if you value safety. Or any logical reason really. But it's awesome.

That is all.

14
I can manipulate your amazon.com recommendations diskurswelt.de
93 points by middus  5 hours ago   22 comments top 11
1
phillco 2 hours ago 2 replies      
I can think of two evil uses for this:

#1: If you're an author who also runs a blog, you can make Amazon "recommend" your book to your visitors.

#2: The Amazon referral tag is included in the iframe. If I recall right, once you visit an Amazon page with such a tag, it gets set for your entire session and the referrer gets credit for everything you buy. This means if I embed this on my website, and you visit, I will get a cut of all your Amazon purchases. Automatically.

(As stated below this will probably get you banned before you can receive payment, sadly)
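A sketch of the mechanism being described, as a tiny Flask page that loads an Amazon product URL in an invisible iframe; the ASIN and affiliate tag are made up, and this doesn't verify what Amazon actually counts:

    from flask import Flask

    app = Flask(__name__)

    @app.route("/")
    def page():
        # The visitor's browser silently fetches the Amazon product page,
        # registering a product view (and the affiliate tag) on their session.
        return """
        <p>Ordinary blog content...</p>
        <iframe src="http://www.amazon.com/dp/B000000000/?tag=example-20"
                width="1" height="1" style="visibility:hidden"></iframe>
        """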

2
nbpoole 3 hours ago 1 reply      
X-Frame-Options (in supported browsers) prevents the result of a request from being rendered in the browser; it doesn't prevent the request itself from being made. So does X-Frame-Options actually prevent this? Is the change to your recommendations made by an AJAX callback when you view the page? I would've assumed it was made by the pageview itself.
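For reference, serving the header is a one-liner; a minimal Flask sketch (and, as noted above, it only stops supporting browsers from rendering the framed result, not from issuing the request):

    from flask import Flask

    app = Flask(__name__)

    @app.after_request
    def set_frame_options(response):
        # Supporting browsers refuse to *render* this page in a frame on
        # another origin; the request (and any side effects) still happens.
        response.headers["X-Frame-Options"] = "SAMEORIGIN"
        return response

    @app.route("/")
    def index():
        return "not frameable cross-origin"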
3
utunga 45 minutes ago 2 replies      
1. Clicked through as described.
2. Carnegie's book not on my list.

Either Amazon fixed this already, or it never worked in Chrome, or something else.

4
wisty 4 hours ago 1 reply      
So, it's going to be a matter of time before some hacker's "Learn to be successful with no effort, pick up chicks, lose weight, and make money in your free time" book rockets to number 1.
5
ambirex 3 hours ago 0 replies      
After a conversation with a co-worker about how products followed him around on amazon, I had the same thought of providing a series of "funny" recommendations using this method.

The only problem is that researching the funny products would probably skew my results as well (unless I was careful and viewed them in incognito mode).

6
epikur 2 hours ago 0 replies      
Tangent: Thanks for disclosing your Amazon referral link. I don't mind when people use them, but it always seems much more polite with a disclaimer on the page.
7
JoshMock 4 hours ago 1 reply      
This is such a simple hack that I am shocked it took this long for someone to try it.
8
X4 3 hours ago 1 reply      
That's nothing special actually.

The Amazon recommendations module is definitely buggy, but it works for their purpose and as long as they can cope with "marketplace manipulators" they're okay. Oh they can't.. well you know they are one of the largest Marketplaces, they will find a solution quickly..
http://news.ycombinator.com/item?id=2475854 Oh, they can't even fix that? Well what needs to be said, needs to be said.

Thanks for your find middus!

9
thenextcorner 1 hour ago 1 reply      
And in the meantime, all HN readers who visited the page with the iframe got cookied with the Amazon affiliate cookie of the "hacker" writing about this!
10
goggles99 2 hours ago 0 replies      
I thought about this for a minute, then I thought "who cares". If I go to someone's web site and they set my Amazon recommendations to the merchandise they're pushing, couldn't they have just shown me the same thing on their site with a link to Amazon? Couldn't they just as easily have redirected my browser to the Amazon page, or tried opening a popup to their product there? Amazon's recommendations are based on what you viewed or searched for last. How often do you go to Amazon to find something to buy? I go there to find something I already know that I want - and I then search for it.

Seeing their recommendations for some merchandise that I also saw on a different website earlier does not affect me in any way, in fact I don't even look at the recommendations. I type in what I am looking for and hit enter (boom, old recommendations are gone).

11
drzaiusapelord 3 hours ago 1 reply      
> X-Frame-Options response header is set to SAMEORIGIN

So break every iframe on the net to fix Amazon's lazy approach to recommendations?

15
Google account activity googleblog.blogspot.com
79 points by Uncle_Sam  4 hours ago   32 comments top 7
1
ericabiz 3 hours ago 5 replies      
Pretty ironic that Google--a company that gained notoriety for its minimalist search engine--has a "company blog" full of ridiculous eye candy. On my laptop screen, the top bar (which stays as you scroll down) takes up a ridiculous amount of space. Then I accidentally moused over the bottom part of the page as I was reading the end of the article, triggering yet another bar popping up from the bottom and telling me about other Google sites.

The icing on the cake is that the whole thing takes a bit of time to load (showing you a gear much reminiscent of the old "Flash intro" days)--all to load some sort of header with ugly colored balls that move around when you move the mouse.

It's 1997 all over again, folks. Only I would have never expected Google, of all companies, to fall into the eye candy trap.

2
aresant 2 hours ago 1 reply      
I love the unintentional creepiness of this quote:

"Data deletion at the data source, e.g. in your Web History will have no impact on issued reports, however reports can be deleted at any time"

As in - if you didn't know already, GOOG's Web History is a superficial front-facing report; all your personal data is happily sitting in GOOG's databases, is tracked, related, and available to whoever has the appropriate power to access it.

3
eslachance 3 hours ago 1 reply      
This doesn't even scratch the surface of what Google could show you. If they showed us the extent of what they can actually infer from our activity, some people would probably shoot their computer, burn their router and modem, and never look at a screen again.
4
oskarth 2 hours ago 0 replies      
> For example, if you notice sign-ins from countries where you haven't been or devices you've never owned, you can change your password immediately

When I was at CCC this winter I noticed that a Russian IP had logged into my Gmail. Turns out it was CCC, which had rented a Russian IP. Scared me quite a bit for a few hours and resulted in me changing my password.

5
zmanian 39 minutes ago 0 replies      
I would love to see a similar visualization for the entire Google Apps Domain for admins
6
jemka 4 hours ago 2 replies      
Mirror: http://www.businessinsider.com/giving-you-more-insight-into-...

OP link is returning a blank page.

7
nextstep 3 hours ago 2 replies      
Has anyone opted in and received their first report yet?
16
Simple - An Obtvse clone written in Python github.com
102 points by jsherer  5 hours ago   43 comments top 12
1
jsherer 4 hours ago 4 replies      
I'd like to see one of these clones that more closely resembles the static site generators like Jekyll (http://jekyllrb.com/) or Hyde (http://ringce.com/hyde). Basically, the CMS would just be the interface to create new and manage existing content. It would have the standard static publish function that builds out the pages of the blog as HTML.
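A rough sketch of that static publish step, assuming the CMS hands back (slug, title, body_html) rows and a hypothetical post.html Jinja2 template:

    import os
    from jinja2 import Environment, FileSystemLoader

    def publish(posts, out_dir="build"):
        # Render every post in the CMS database to a static HTML file.
        env = Environment(loader=FileSystemLoader("templates"))
        template = env.get_template("post.html")
        if not os.path.isdir(out_dir):
            os.makedirs(out_dir)
        for slug, title, body_html in posts:
            page = template.render(title=title, body=body_html)
            with open(os.path.join(out_dir, slug + ".html"), "w") as f:
                f.write(page)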

(NOTE: "Simple" is not my project)

2
slig 4 hours ago 2 replies      
> Go download Python 2.7+, Flask, Sqlalchemy and flask-sqlalchemy and you are good to go.

OP, you should learn about requirements.txt and virtualenv/virtualenvwrapper, it will make your life easier.
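For anyone new to that workflow, it looks roughly like this; package names are taken from the README quoted above and left unpinned here (in practice you'd pin versions, e.g. Flask==0.8):

    # requirements.txt -- one dependency per line
    Flask
    Flask-SQLAlchemy
    SQLAlchemy

    # then, in a fresh environment:
    #   virtualenv venv && . venv/bin/activate
    #   pip install -r requirements.txt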

3
victork2 4 hours ago 0 replies      
So much drama around what is basically a skin/cosmetic improvement...

Good luck anyway.

4
phwd 1 hour ago 0 replies      
Nice work with the Python style, though I believe Simple and Obtvse missed the point, as does every clone that tries to replicate only what it can see from the outside. Skipping over Dustin's attitude about the situation: no one really knows how Svbtle.com works on the inside, do they?

From the outside look and the screenshots (because that's all Dustin showed you) it's a simple design, and that's the point; it was never supposed to be a complex work of design for you to be proud that you could replicate it in one day in Rails. Yeah, Dustin put some work into thinking about what could work for the layout, but once the application of the design is done, replication is beyond simple.

Clone all you want - I think it's great - but Dustin was really telling the story of the "network": the idea of a closely-knit community of writers (pseudo-writers, whichever you prefer). I look at it as similar to the Deck ad network or Dribbble (before every desperate designer begged for an invite). Or when an HN clone comes along: no one ever migrates across, they always come back here.

Point is, the one thing you are never going to have is the network, that human element that sets apart the clone from the real deal.

5
geoffw8 4 hours ago 0 replies      
I think these two packages could become a great learning resource for noobies. I'm a Ruby guy, but am definitely going to take a look at this version; it should add the usually-difficult-to-find "context".

Nice work :)

6
killnine 4 hours ago 1 reply      
AH! I was waiting for the python version.

I also watch this Python static blogging app on GitHub here: https://github.com/fallenhitokiri/Zenbo

7
pfraze 4 hours ago 0 replies      
I'm holding out for the Haskell version.
8
conductor 3 hours ago 1 reply      
Nice work, thanks, though I have a feeling that it's misnamed, sPymple would've been a nice match to Obtvse :)
9
richthegeek 4 hours ago 0 replies      
This is getting ridiculous... nice job all the same!
10
edwinyzh 2 hours ago 0 replies      
You know what? Today I was thinking that it'd be great if Obtvse were written in Python + Flask :)
11
jredwards 4 hours ago 0 replies      
I'm holding out for Svmple
12
tptacek 4 hours ago 4 replies      
This one doesn't even bother to rewrite the CSS; it just takes the cloned CSS from Obtuse. The snake eats its own tail.
17
Early Stage Startups Don't Need Money, They Need Customers startupnorth.ca
65 points by woohoo  4 hours ago   14 comments top 5
1
untog 3 hours ago 3 replies      
It baffles me to no end that the Canadian government is not leaping on the startup trend. It's an observable fact that a lot of Canadian talent heads south for more money and opportunities, but no-one seems to care enough to do anything about it. Meanwhile, the US passes the JOBS act to allow even more early-stage startup funding.

I lived in and around Vancouver for two years and absolutely loved it, but my visa expired and I had to leave. These days I live in NYC and, well, I love it here too, but my visa in the US is very restrictive - I can't start my own company, for example. The US seems to have an endless, protracted debate about immigration that has next to nothing to do with economic or political reality. Canada doesn't, and has a huge opportunity to attract international entrepreneurs who want to do business in North America. But I have absolutely no immigration routes back into the country, and there's no change on the horizon. So I won't be heading over the border any time soon.

2
asanwal 2 hours ago 1 reply      
While it is not as sexy, it's worth noting that "funding" from customers is generally cheaper than funding from investors. Plus, once you have revenue, you generally become more interesting to investors and have more leverage in those conversations because you may not need the money to survive the way you do early in your company's life.

We're a "revenue-backed" startup so I'd say that if you can do it, it is a great way to control your destiny. The challenge with this method is that growth comes after revenue, i.e. you spend generally after you have sold the product which means growth is slower while fundraising allows you to spend ahead of revenue.

Nevertheless, good to see this advice on HN.

3
Maro 1 hour ago 0 replies      
Depends on what early-stage means. If that means no funding money, then this statement is simply not true in many cases. Suppose you're trying to sell software and support. How are you going to sell a 1-year support contract if the customer can tell there's no guarantee you'll be around in 3 months? I'm speaking from personal experience. OTOH I'm sure the situation is better for web startups that are selling a service on a monthly basis.
4
woohoo 2 hours ago 0 replies      
I think there aren't enough resources for Canadian startups but at the same time, I agree with this post that once you can show some traction, then funding is not as hard to come by. The trick is figuring out how to get to that point and pay the rent.
5
dmix 3 hours ago 2 replies      
The chart still shows that startups raised $227k on average in the discovery phase.

The problem is that in Canada there's no one willing to invest at this stage, at all. The angels here are known to only invest in later-stage startups.

Although I agree the focus early on should be on customers... it's still hard to be 100% focused on customers if you're also constantly concerned with how you'll pay rent.

18
Death of a Data Haven (the story of Sealand) arstechnica.com
82 points by allenbrunson  5 hours ago   9 comments top 4
1
J3L2404 3 hours ago 1 reply      
A Sealand/HavenCo timeline

1942: Roughs Tower constructed off the coast of East Anglia.

1948: Roughs Tower abandoned by English government following World War II.

1966: Pirate radio entrepreneur Roy Bates occupies Roughs Tower.

1967: Bates declares an independent Principality of Sealand.

1968: Bates acquitted of British firearms charges, causing Britain to adopt policy of leaving him alone.

1978: German-led coup takes control of Sealand on August 10; Roy Bates retakes Sealand in dawn helicopter raid on August 15.

1987: Britain extends territorial waters to 12 miles, encompassing Sealand. Sealand claims its own 12-mile territorial waters.

1999: Sean Hastings and Ryan Lackey conceive of idea for HavenCo.

2000: HavenCo launches to massive press hoopla.

2002: HavenCo taken over by Sealand after commercial failure and mounting tensions.

2006: Sealand badly damaged in generator fire.

2008: HavenCo website goes offline.

2009: Sealand launches Twitter account.

2
allenbrunson 4 hours ago 1 reply      
Ryan Lackey is a frequent commenter here, and it seems he just submitted the longer version of this story.

http://news.ycombinator.com/item?id=3766543

3
jakeonthemove 2 hours ago 1 reply      
You can't escape the government (or more specifically, "a government")... not while you're on Earth. The best and easiest way to do what you want is to identify the persons/companies in power and persuade them to give you that freedom, offering something useful for them in return.

It's really not that hard, and easier than establishing a new nation in the middle of nowhere, as attractive as that might sound...

4
shimon_e 3 hours ago 1 reply      
Timing or the title really does make a difference. Posted the same link 12 hours ago. http://news.ycombinator.com/item?id=3764331
19
DNS Changer circleid.com
93 points by wglb  7 hours ago   27 comments top 7
1
justinsb 2 hours ago 1 reply      
Why not redirect _all_ DNS requests to the address of an informational HTTP server, saying "This computer is infected; here is how to fix it..."
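That amounts to a DNS "walled garden". A bare-bones Python sketch of such a resolver, answering every query with the address of the informational page (address assumed; a real deployment would handle record types and EDNS properly):

    import socket
    import struct

    GARDEN_IP = "192.0.2.10"  # assumed address of the "you are infected" page

    def build_response(query):
        # Copy the transaction ID; set QR/RD/RA flags; one question, one answer.
        header = query[:2] + struct.pack(">HHHHH", 0x8180, 1, 1, 0, 0)
        question = query[12:]  # echo the question section back verbatim
        # Answer: pointer to the name at offset 12, A record, IN class,
        # 60s TTL, 4-byte address -- the same IP no matter what was asked.
        answer = (b"\xc0\x0c" + struct.pack(">HHIH", 1, 1, 60, 4)
                  + socket.inet_aton(GARDEN_IP))
        return header + question + answer

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", 53))  # needs root; infected clients already point here
    while True:
        data, addr = sock.recvfrom(512)
        sock.sendto(build_response(data), addr)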

I am sure that Geek Squad would pay a substantial amount of money to be listed as one of the repair options.

The idea that network administrators should have to spend hours hunting down these people is ridiculous. When/if they find them, they're just going to shut them off anyway.

If you're relying on the internet for anything important, you probably want to know that e.g. every key you type is going to some server somewhere.

2
gst 4 hours ago 2 replies      
So without those servers the clients would break and couldn't resolve DNS requests. Is this correct?

If it is, I don't understand why bother at all with keeping them running. Just stop them. The Internet will break for the people affected, they will have someone "repair" their computer, and you get rid of all the infected clients. This needs to be done anyway sooner or later. Why defer it?

3
wcchandler 6 hours ago 1 reply      
I work for a small college -- we actually got a couple of letters from the FBI alerting us that somebody on our network was infected by this. We then had to do some internal sleuthing to hunt down who was participating, and more importantly, stop it from happening. We also wanted to know if the actions were intentional.

All the information we could find on this was from 2007-2009. It seemed like this software was out-of-date and no longer in the wild. So I always wondered why we were being contacted, especially now.

This write-up was greatly appreciated as it finally shed some light on why we were contacted about it -- and more so, how the FBI was involved.

4
feefie 5 hours ago 5 replies      
It's not that I don't care about being uninfected, I just don't know where to find out about things like DNS Changer and Conficker. I answer all the requests my system tray makes of me, keeping the following up to date: Windows Update, AVG Anti-Virus Free Edition 2012, Adobe Flash, and Java. I use Chrome and Firefox, which update themselves. Is there something else I should be doing? Is there a web page that has a checklist of things I should do regularly, like 1. run Windows Update, 2. go to http://dns-ok.us/, etc.? How do I know if I'm infected by Conficker? I assumed Windows or my AVG Anti-Virus would have told me.
5
aqme28 5 hours ago 1 reply      
I might be a bit uninformed, but what's to stop the hacker from redirecting http://dns-ok.us/ to a fraudulent page that says your DNS is okay?
6
mcculley 4 hours ago 1 reply      
The article talks about ISPs running replacement servers to counter this. It is not clear, but it sounds like these servers would be intercepting DNS requests to the formerly bad servers and that is why Vixie is against it being a long term solution. He suggests that ISPs could intentionally break infected customers in small batches to get the customers to call for help, but couldn't such infrastructure be used to detect infected customers and send out assistance?
7
shill 5 hours ago 0 replies      
Where have I seen the author's name before? Oh yeah...

$ man crontab

20
Please do not take down the Sality botnet seclists.org
505 points by wglb  18 hours ago   111 comments top 4
1
Dove 3 hours ago 0 replies      
I am reminded of the "wine blocks" sold (legally!) during the prohibition, which came with the following warning:

After dissolving the brick in a gallon of water, do not place the liquid in a jug away in the cupboard for twenty days, because then it would turn into wine.

http://en.wikipedia.org/wiki/Prohibition_in_the_United_State...

2
acqq 13 hours ago 0 replies      
Reading the README in the linked zip file: inside is, first, an executable which is a QBFC (http://www.abyssmedia.com/quickbfc) packaged and slightly modified version of AVG's Sality Removal Tool (http://free.avg.com/us-en/remove-sality) to automate the removal of the Sality virus. Then there is the encrypted version of the same executable, so it will run properly when downloaded by the Sality virus. And finally, there's a simple Python script that queries super peers from a bootstrap list for the most recent URL pack pushed to the Sality P2P network.
3
s_henry_paulson 11 hours ago  replies      
The guy takes all the time to put this together and write it up, and is perfectly OK with the outcome, but instead of going to a coffee shop and doing this himself, he spends a lengthy amount of time writing up instructions and giving the criminals a chance to fix their injection problems before someone can take down the botnet.

Baffling.

Not only that, but this guy seems to know what he's doing, and instead of someone who knows what they're doing completing the task, he's willing to watch script kids bork the whole thing, or run the risk of Russians (or others) taking over the whole network and hardening it.

4
ricardobeat 15 hours ago  replies      
So why can't those nice guys from the FBI go and do it already? It's so easy to replace sites when piracy is involved...
21
XeTeX: could it be TeX's saviour? vallettaventures.com
73 points by steeleduncan  6 hours ago   44 comments top 12
1
larsberg 4 hours ago 0 replies      
XeTeX-generated PDFs are not compatible with the toolchains of some academic publishers (Cambridge University Press, which publishes the Journal of Functional Programming, comes to mind, but I seem to recall Springer-Verlag having issues as well). Without full support for academic publishers, I believe the majority of TeX users could not upgrade.
2
JoachimSchipper 5 hours ago 1 reply      
Umm, have these people ever heard of backward compatibility? Admittedly, many TeX package authors haven't either, but just dropping pstricks is going to make a ridiculous number of documents that have a figure in them impossible to compile. Not to mention the fact that TikZ, while better, is not better enough that everyone will want to invest time learning it...

MiKTeX's install-on-first-use has its problems, but it does help balance bloat against not removing older packages.

3
beza1e1 5 hours ago 1 reply      
The mission of TeX Live is to include everything and the kitchen sink. However, why should their TeXpad support everything? They could go the XeTeX+biber+TikZ route and educate their users on how to switch from pdflatex, bibtex, pstricks, etc.
4
rmk2 4 hours ago 0 replies      
>> Undoubtably a change this severe will be painful for some, but it will be less less painful than heading out to the computer shop in 3 years time to purchase the 2TB harddrive that will be required for the exponentially expanding tex updates.

Few arguments ever gain credibility from hyperbole, and this article is no exception. The size of his texlive installation is purely circumstantial evidence, since that folder also includes backups of updated packages and all sorts of other "dynamic", i.e. user-specific, data. Basing the argument on that seems... silly.

>> For those for whom adding the letters xe before typesetting is too much to bear, or for typesetting ancient documents

It isn't as easy as just adding "xe" before (La)TeX, since not all packages are integrated with it yet, and since the polyglossia package is still not fully stable either (yes, I know babel is old, but at least it's stable), so some packages have trouble dealing with polyglossia or have experimental interfaces for working with Xe(La)TeX. Csquotes is one of the packages that comes to mind.
A further problem with XeTeX is that it still does not offer a proper version of the microtype package. And on top of everything, the hyperref support for colours is spotty at times, at least for me.

For me, depending on the situation, pdflatex and xelatex live happily next to each other and are both handled in my generic template via the ifxetex package and \ifxetex...\else...\fi, so depending on what I need in a given instance, running either binary on the same file produces the corresponding output.
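A minimal sketch of that switch (the packages loaded in each branch are illustrative, not a fixed recipe):

  \usepackage{ifxetex}
  \ifxetex
    \usepackage{fontspec}         % xelatex: system fonts, native unicode
  \else
    \usepackage[utf8]{inputenc}   % pdflatex: classic 8-bit route
    \usepackage[T1]{fontenc}
  \fi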

5
siphr 5 hours ago 5 replies      
In all honesty I did not realise LaTeX needed saving.
6
radarsat1 5 hours ago 1 reply      
Why is it necessary for TeX to keep all the libraries in source form on disk? Why not use compression, or package some pre-compiled form of the library code, or both?
7
ajray 5 hours ago 2 replies      
I just use XeTeX so I can get proper kerning and ligatures. Oddly enough, as a typography nerd, I find plain LaTeX just doesn't cut it.
8
MrKurtHaeusler 3 hours ago 0 replies      
I have used XeLaTeX for a while because of the support for freetype fonts and better support for unicode.

Linux Libertine is my favorite font for XeLaTeX because it has better ligatures than any of the computer modern etc. fonts.

9
leephillips 2 hours ago 1 reply      
I didn't understand why they wanted to port LaTeX to the iPad when they first wrote about it:

http://lee-phillips.org/latexipad/

10
killa_bee 4 hours ago 0 replies      
I use xelatex in my work and it's still embarrassingly fragmented and outdated. We need to start over on a new TeX-like project (also so that it can be ported to mobile).
11
dfc 5 hours ago 0 replies      
I thought that luatex was going to be the savior?
12
jhnewhall 3 hours ago 4 replies      
TeX, because its author didn't care to reuse XML.

Having used it for publications and at university, I really despise TeX. Simply put, the language sucks: the outcome is nice, but those slashes and parentheses really suck.

With an XML syntax we could easily create beautiful editors and make it easy to parse with schema validation, etc.

The problem is that this house of cards grew while avoiding alternatives such as DocBook, and it is and always will be a mess, since its foundations are not parseable.

22
Agile is a Sham williamedwardscoder.tumblr.com
210 points by willvarfar  9 hours ago   126 comments top 3
1
DanielBMarkham 8 hours ago  replies      
[Obligatory plug and disclaimer: Agile professional who wrote an earlier rant "Agile Ruined my Life" http://www.whattofix.com/blog/archives/2010/09/agile-ruined-... Also I am writing some practical how-to Agile e-books trying to undo some of the damage: http://tiny-giant-books.com/scrummaster.htm ]

Just differentiate between what Agile is and how people are pushing Agile on you. Different things entirely.

Agile is best practices around iterative and incremental development. Period. Yes, there's a manifesto and there's Scrum and all sorts of other things, but at the end of the day Agile is a marketing term. A place to go find out what people are trying in order to get better.

Two secrets here. One, team success is about 90% dependent on who is on the team. Good teams do well. Bad teams do poorly. But if you're going to have more than one team, you need some way to measure and talk about what they are doing, so you have to have something.

Agile -- when done correctly -- is just the minimum amount of something you need to get your work done. It is the minimum amount of process. After all, it's not like you can have no process at all. Whatever you're doing already is a process.

I've seen hundreds of teams struggle with Agile, and to me the problem is that we do a really bad job of keeping "here's a best practice that worked awesome for about 80% of the teams that tried it" separate from "these are the rules; you must do this." In many shops, Agile is just another way of micro-managing teams, sadly.

I can't help that. I also can't help the fact that lots of folks are out to make a buck on certifications and such. I don't think that makes Agile bad. Heck, I'm not even against certifications, although I'd never get one. I just think we take things too far.

I like the idea of having a marketing blurb "Agile" where I can go to find out what kinds of new stuff is being done. It helps me pick better reading material.

The second secret is that most teams, frankly, are not that good. So you need some kind of way to demonstrate that as well. Making things incrementally and in small iterations lets you see how bad teams are early on. You fail early and often. Then at least you have a bit of a shot at trying to help.

But believe me, I feel your pain. Sounds to me like you are a lot more upset at modern marketing and management attitudes than at Agile itself. Remember that we technical folks have a way of over-doing whatever we get into, as I was writing in my blog before your post appeared! (It's still very strange to me, watching HN, how the same topic comes from multiple writers at the same time.) http://tiny-giant-books.com/blog/agile-means-stop-focusing-o...

2
Uchikoma 1 minute ago 0 replies      
Not sure where the poster comes from.

(The poster seems to confuse Agile in "Agile is a Sham" with the Agile "industry". I can't speak to the Agile "industry", as I'm not part of it and have no contact with it, so I will concentrate on the "Agile is a Sham" part.)

After introducing Agile/Scrum in 3 companies over the last 10 years, not as a consultant but as a permanent employee, I'd say it has been a success all three times, measuring success with two metrics: predictable, ongoing results and developer happiness.

Developer happiness in agile usually comes from calming down the fury of requirements and wished-for features. Sprints enable developers to focus for 2-3 weeks on one topic instead of being pushed to the most urgent topic of the day by management. Happiness also comes from communicating, working, and feeling like a team.

I assume the poster has - if he has one at all - a different agile experience. Perhaps he's not a team person, or is uncomfortable with coordinating and working with others. 10% of the developers I've worked with just don't feel right with agile; it's not their thing. From my experience, they should not try to adapt to agile, as it does not work. Better to find a non-agile environment that does work for them.

From asking developers after agile introductions, we had >90% approval rates on the questions "Would you like to go back to before agile?" and "Would you like to have a different development model?"

Scrum in particular is different from, e.g., XP. It focuses on process and - deliberately - says nothing about engineering practices or craftsmanship. Coders are free to choose those for themselves. Some struggle with this, or think they don't need such practices just because Scrum doesn't prescribe them. Some think Scrum is sh* because it does not talk about engineering practices or developer quality. But this is intentional.

Scrum will not work with bad developers. But it will make good developers work smoother together and make them - if they are team people - more happy from my experience.

One final note: some people here agree that Agile is a sham and does not work, then cite examples of managers/scrum masters who deliberately did not follow Scrum. By Aristotelian logic, at least, this does not make sense.

A final final note: I got a very fine and tasty chocolate cake from my current team for my birthday, so I might not be doing things that wrong ;-)

3
cletus 5 hours ago  replies      
I'm surprised no one has brought up Steve Yegge's Good Agile, Bad Agile [1], which speaks well to both Google culture and scaling (good) Agile. One of the comments on that post mentions it. It's worth reading (and, being Yegge, that'll take a while).

Agile is like anything else: some well-meaning (and arguably useful) principles that get warped by bad managers and bad companies. The natural evolution for any such idea is to turn it into an industry and there are any number of people who are willing to sell you training, lectures, books and programs for Agile (with a capital-A). You need to separate the industry from the idea.

[1]: http://steve-yegge.blogspot.com/2006/09/good-agile-bad-agile...

23
Effective Web App Analytics with Redis progstr.com
47 points by hdeshev  6 hours ago   11 comments top 4
1
thibaut_barrere 4 hours ago 1 reply      
First, thanks for sharing! Then a comment on this:

"I've done implementations of the above using SQL databases (MySQL) and it wasn't fun at all. The storage mechanism is awkward - put all your values in a single table and have them keyed according to stats name and period. That makes querying for the data weird too. That is not a showstopper though - I could do it. The real problem is hitting your main DB a couple of times in a web request, and that is definitely a no-no."

This is not a SQL vs NOSQL issue: decoupling the reporting system from your main (production/transaction) system is a widely advised practice in "business intelligence".

Use a different instance, with a schema designed for reporting.

You can use Redis for that (and I use it actually!) but you can also use MySQL or any other RDBMS.

It's fairly easy to implement: one row for each fact, plus foreign keys to a date dimension and an hour dimension (see [1]); then you can sum over date ranges and hour ranges, drill down, etc., across many different metrics.

[1] https://github.com/activewarehouse/activewarehouse-etl-sampl...
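To make the fact/dimension idea concrete, here is a minimal sketch using sqlite3 from Python's standard library (table and column names are made up for illustration):

  import sqlite3

  db = sqlite3.connect(':memory:')
  db.executescript("""
      CREATE TABLE date_dim (date_id INTEGER PRIMARY KEY, day TEXT);
      CREATE TABLE hour_dim (hour_id INTEGER PRIMARY KEY, hour INTEGER);
      CREATE TABLE facts (
          date_id INTEGER REFERENCES date_dim,
          hour_id INTEGER REFERENCES hour_dim,
          metric  TEXT,
          value   INTEGER);
  """)

  # roll a metric up over a date range by joining the date dimension
  total = db.execute("""
      SELECT SUM(f.value)
      FROM facts f JOIN date_dim d ON d.date_id = f.date_id
      WHERE f.metric = 'visits' AND d.day BETWEEN '2012-03-01' AND '2012-03-28'
  """).fetchone()[0]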

2
tptacek 5 hours ago 2 replies      
There's a blog post from Salvatore somewhere talking about how he marshalled time series data into strings, which made me think the naive/straightforward approach was suboptimal. I always thought ZSETs indexed by time_ts would be a good fit for this.
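Something like this rough sketch, assuming the redis-py client (3.x mapping-style zadd) and invented key names:

  import time
  import redis

  r = redis.StrictRedis()

  # score each event with its unix timestamp...
  r.zadd('events:signup', {'event-id-123': time.time()})

  # ...then time-range queries become sorted-set range queries
  last_hour = r.zrangebyscore('events:signup', time.time() - 3600, '+inf')
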
3
ihsw 2 hours ago 1 reply      
> The above mechanism needs some finishing touches. The first is data expiration. If you don't need daily data for more than 30 days back, you need to delete it yourself. The same goes for expiring monthly data - in our case stuff older than 12 months. We do it in a cron job that runs once a day. We just loop over all series and trim the expired elements from the hashes.

Rather than iterating over the entire list of series and checking for expired elements, you can use a sorted set and assign a time-based score. The cron job can still run once a day, but it only needs to find (and remove) the members of that sorted set whose scores fall below a certain threshold, which will almost certainly be faster.

Naturally this will increase memory usage (which may be undesired), but it's food for thought. Eventually the looping and trimming of expired hashes could be coded using Lua server-side scripting in redis-2.6, which is interesting in a different way and has its own challenges.
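A minimal sketch of that expiry scheme, assuming the redis-py client (key names are illustrative):

  import time
  import redis

  r = redis.StrictRedis()

  def record(series_key, ttl_days=30):
      # remember when this series entry should expire
      r.zadd('series:expiry', {series_key: time.time() + ttl_days * 86400})

  def purge_expired():
      # one call drops every member whose expiry time has passed --
      # no need to loop over all series
      r.zremrangebyscore('series:expiry', '-inf', time.time())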

4
bradleyland 5 hours ago 1 reply      
This is cool, but if you're looking to work with time series data, you should definitely have a look at RRD. A lot of the operations you'd want to perform on time series data are available internally with RRD. RRD can also do some cool stuff like generate graphs.
24
Download Coursera videos in batch github.com
48 points by jplehmann  6 hours ago   17 comments top 8
1
jplehmann 5 hours ago 1 reply      
http://coursera.org is creating some fantastic, free educational videos (algorithms, machine learning, natural language processing, SaaS).

This script allows one to batch download videos for a Coursera class. Given a class name and related cookie file, it scrapes the course listing page to get the week and class names, and then downloads the related videos into appropriately named files and directories.

Why is this helpful? Previously I was using wget, but I had the following problems:

  1. Video names have a number in them, but this does not correspond to the
     actual order. Manually renaming them is a pain.
  2. Using names from the syllabus page provides more informative names.
  3. Using wget in a for loop picks up extra videos which are not posted/linked,
     and these are sometimes duplicates.

Naming is intentionally verbose, so that it will display and sort properly using MX Video on my Android phone.

Inspired in part by youtube-dl (http://rg3.github.com/youtube-dl), with which I've downloaded many other good videos, such as those from Khan Academy.
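The core of it boils down to something like this (the URL and the CSS selector are simplified placeholders, not the script's real markup handling):

  import http.cookiejar, urllib.request
  from bs4 import BeautifulSoup

  # cookies.txt is exported from a logged-in browser session
  cookies = http.cookiejar.MozillaCookieJar('cookies.txt')
  cookies.load()
  opener = urllib.request.build_opener(
      urllib.request.HTTPCookieProcessor(cookies))

  html = opener.open('https://class.coursera.org/CLASS/lecture/index').read()
  soup = BeautifulSoup(html, 'html.parser')
  for i, a in enumerate(soup.select('a.lecture-link'), 1):
      title = a.get_text(strip=True).replace(' ', '_')
      with open('%02d_%s.mp4' % (i, title), 'wb') as f:
          f.write(opener.open(a['href']).read())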

Let me know if you like it.

2
nzmsv 1 hour ago 0 replies      
Some shameless self-promotion: I wrote a Chrome extension for downloading Udacity videos (http://nzmsv.github.com/udacity-dl/). If there's any interest in a batch version I could look into it. Alternatively, feel free to write it and let me know :)
3
jwr 3 hours ago 1 reply      
Thank heavens! Er, I mean thank you (the OP) for this tool. I already wrote scripts that renamed files to something sane, but this will make my life so much easier.
4
jeremyarussell 3 hours ago 1 reply      
I'm already using it. After some time I got a "connection forcibly closed by remote host" error, and I can't access the Coursera website either; not sure why. (Mayhaps a bunch of people suddenly using this script crashed their servers? Or they blocked us.)

It's back up; must have been a small glitch. Might I add that I love that the script picked up the video I dropped earlier.

5
dmn001 2 hours ago 0 replies      
I use the DownThemAll Firefox extension, and to keep the videos in order I add a number to the renaming mask:

  *num*_*name*.*ext*

I like how jplehmann's tool can rename them using the titles on the page.

6
floggit 1 hour ago 0 replies      
Can someone create an iOS app for this, please?
7
A_A 3 hours ago 0 replies      
This looks sweet, thanks.
I had been manually downloading the vids on my PC, but I hope this tool will now reduce the pain.
8
themonk 3 hours ago 1 reply      
Thanks a lot. Does it skip already-downloaded videos on the next run?
25
The Power of Technical Debt ircmaxell.com
39 points by ircmaxell  5 hours ago   4 comments top 2
1
sopooneo 1 hour ago 1 reply      
I've never liked the following expression.

"If you don't have enough time to do it right, when will you have time to do it over?"

It always seemed to me that an answer of "later" is potentially completely valid.

2
j_baker 2 hours ago 1 reply      
I think it's valid to take on technical debt, but only when it's been explicitly decided that there's a good reason to do so. It's when you make a habit of always taking the quick and dirty approach, and never take the time to refactor, that you start running into problems.
26
Asynchronous MongoDB with Python and Tornado 10gen.com
51 points by saurabh  6 hours ago   discuss
28
Is it Snowing Microbes on Enceladus? nasa.gov
80 points by J3L2404  8 hours ago   18 comments top 7
1
molbioguy 5 hours ago 0 replies      
Off topic (sort of), but this reminded me of 2001/2010. Arthur C. Clarke was pretty much on target: in 2010 he predicted we'd find life on an icy moon of the outer solar system. He happened to posit it was Europa (around Jupiter), though the original novel had the mission flying to Saturn. The movie and sequels changed it to Jupiter because Saturn was hard for the special effects. Anyway, pretty darn impressive sci-fi speculation that might come true. And now back to our original discussion...
2
Sodaware 7 hours ago 1 reply      
I really hope that companies like SpaceX reduce the cost of space travel to the point where we can send out more probes to check things like this. It's early days for private space travel, but I think the next 10-15 years are going to be very exciting.
3
joeconway 7 hours ago 3 replies      
"...And we have found that aside from water and organic material.."

Surely finding organic material is an incredibly big deal, yet it's mentioned in passing. I feel as though I'm missing something fundamental here.

4
tocomment 6 hours ago 1 reply      
It sounds like they can't figure out where all of the heat is coming from. Is it possible this moon is partially heated by radioactive decay like the earth is?

http://en.wikipedia.org/wiki/Geothermal_gradient

5
scribu 4 hours ago 1 reply      
This is so exciting! And I don't really know why -- these are just microbes we're talking about.
6
squarecat 4 hours ago 0 replies      
Incidentally, Spewing Plumes is (now) the name of my (nonexistent) band.
7
maeon3 5 hours ago 0 replies      
We need to monetize the investigation of Enceladus. Bringing back alien life from another world would give us the opportunity to adapt its sensors and tools into our technology. Could we patent the blueprint of the alien life to reward the company that brings it back here?

Could the government say: "Whatever company brings back an alien life form from Enceladus can receive just compensation from any company that uses it, for 50 years"?

30
Warehouse robots come of age extremetech.com
24 points by Glowbox  3 hours ago   5 comments top 2
1
ahi 1 hour ago 1 reply      
More than a decade ago I worked in a warehouse with automation far superior to moving pods. Pieces could be picked off 50-foot-high shelves and moved to the proper packing stations and loading dock on conveyors. Moving entire shelving units around seems a horrible waste of energy, not to mention floor space (limits on shelving height, plus allowance for movement).
2
asdfdsa1234 1 hour ago 1 reply      
This article has a low, but not abysmal, content-to-annoying-ad ratio.