hacker news with inline top comments    7 Nov 2016
1
The new 64 bit Orange Pi, a Quad Core Computer for $20 techcrunch.com
28 points by piyush_soni  43 minutes ago   5 comments top 3
1
pi-rat 17 minutes ago 2 replies      
Not for people with OCD, that's an interesting board layout :)
2
paulftw 9 minutes ago 0 replies      
While really awesome on specs and price, downside of these boards is dev tools and community support. They are getting better at it and very fast, but living on the cutting edge is not pleasant.
3
AstroJetson 17 minutes ago 0 replies      
2
A bug story: data alignment on x86 pzemtsov.github.io
11 points by sconxu  1 hour ago   4 comments top 2
1
pjc50 9 minutes ago 2 replies      
Well, that's pretty horrendous. Note that the naive code which just casts the input to uint16_t would work fine. I can't help but wonder if the solution to this might have been better expressed as naive implementation + platform-specific assembly implementation.

After all, if you have to understand the underlying instructions executed in order to fix the problem, why not stop trying to make the compiler emit the "right" instructions and just write them yourself?

(Language lawyers: is casting a char* to a uint32_t* actually defined behavior? For unaligned data?)

2
amelius 18 minutes ago 0 replies      
TL;DR: Even though most instructions of your processor (x86) allow data to be aligned on any byte, your compiler might not.
3
Some thoughts on asynchronous Python API design in a post-async/await world vorpus.org
188 points by piotrjurkiewicz  8 hours ago   58 comments top 4
1
quotemstr 6 hours ago 4 replies      
The idea espoused in this blog post, that

> if you have N logical threads concurrently executing a routine with Y yield points, then there are N**Y possible execution orders that you have to hold in your head

is actively harmful to software maintainability. Concurrency problems don't disappear when you make your yield points explicit.

Look: in traditional multi-threaded programs, we protect shared data using locks. If you avoid explicit locks and instead rely on complete knowledge of all yield points (i.e., all possible execution orders) to ensure that data races do not happen, then you've just created a ticking time-bomb: as soon as you add a new yield point, you invalidate your safety assumptions.

Traditional lock-based preemptive multi-threaded code isn't susceptible to this problem: it already embeds maximally pessimistic assumptions about execution order, so adding a new preemption point cannot hurt anything.

Of course, you can use mutexes with explicit yield points too, but nobody does: the perception is that cooperative multitasking (or promises or whatever) frees you from having to worry about all that hard, nasty multi-threaded stuff you hated in your CS classes. But you haven't really escaped. Those dining philosophers are still there, and now they're angry.

The article claims that yield-based programming is easier because the fewer the total number of yield points, the less mental state a programmer needs to maintain. I don't think this argument is correct: in lock-based programming, we need to keep _zero_ preemption points in mind, because we assume every instruction is a yield point. Instead of thinking about N**Y program interleavings, we think about how many locks we hold. I bet we have fewer locks than you have yields.

To put it another way, the composition properties of locks are much saner than the composition properties of safety-through-controlling-yield.

I believe that we got multithreaded programming basically right a long time ago, and that improvement now rests on approaches like reducing mutable shared state, automated thread-safety analysis, and software transactional memory. Encouraging developers to sprinkle "async" and "await" everywhere is a step backward in performance, readability, and robustness.
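
The lost-update hazard described above is easy to reproduce. Here is a minimal asyncio sketch (my own illustration, not from the article): a read-modify-write with an await in the middle silently loses updates, while the same code under an asyncio.Lock does not.

```python
import asyncio

counter = 0

async def racy_inc():
    # Read-modify-write with a yield point in the middle: every task reads
    # the old value before any task has written back.
    global counter
    tmp = counter
    await asyncio.sleep(0)  # a newly added yield point invalidates the "no preemption here" assumption
    counter = tmp + 1

async def safe_inc(lock):
    # Same code under a lock: the yield point no longer matters.
    global counter
    async with lock:
        tmp = counter
        await asyncio.sleep(0)
        counter = tmp + 1

async def main():
    global counter
    counter = 0
    await asyncio.gather(*[racy_inc() for _ in range(100)])
    racy = counter
    counter = 0
    lock = asyncio.Lock()
    await asyncio.gather(*[safe_inc(lock) for _ in range(100)])
    return racy, counter

print(asyncio.run(main()))  # (1, 100): 99 lost updates without the lock, none with it
```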

2
justinsaccount 7 hours ago 7 replies      
I feel like I'm too dumb to understand any of this. And I've been writing python for 12 years.

Just give me greenlets or whatever and let me run synchronous code concurrently.

  async def proxy(dest_host, dest_port, main_task, source_sock, addr):
      await main_task.cancel()
      dest_sock = await curio.open_connection(dest_host, dest_port)
      async with dest_sock:
          await copy_all(source_sock, dest_sock)
Are you kidding me? Simplified that is

  async def func():
      await f()
      dest_sock = await f()
      async with dest_sock:
          await f()
Every other token is async or await. No thank you.

3
Animats 5 hours ago 1 reply      
The main use case for all this async stuff is handling a huge number of simultaneous stateful network connections. At least, that was what Twisted was used for. Are there other use cases for this sort of thing that justify all the complexity that comes with it?
4
tschellenbach 7 hours ago 11 replies      
Are there any languages that have really nailed this? I've used gevent, eventlet, (both python), promises, callbacks (node) and none of them come close to being as productive as synchronous code.

I'd like to try out Akka and Elixir in the future.

4
LessPass: sync-less open source password manager lesspass.com
30 points by mgliwka  1 hour ago   14 comments top 9
1
3pt14159 1 minute ago 0 replies      
Very cool project!

One thing though, LessPass sets HSTS headers, but should include the `includeSubDomains` directive and the `preload` directive to stop a first time MITM (for example, when you get a new phone). Once these are done, LessPass should be added to various browser preload lists.

2
avian 1 hour ago 1 reply      
I don't know what these are used for, but secret keys generated from current time are easy to guess. You only have to try around 2^24 values if you can estimate installation time within a specific year.

https://github.com/lesspass/lesspass/blob/master/lesspass.sh...
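
The 2^24 figure checks out as an order-of-magnitude estimate: a year contains about 31.5 million seconds, i.e. just under 2^25 one-second-resolution timestamps an attacker would need to try.

```python
import math

# Number of distinct one-second timestamps in a year.
seconds_per_year = 365 * 24 * 3600
print(seconds_per_year)                       # 31536000
print(round(math.log2(seconds_per_year), 1))  # 24.9
```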

3
croon 28 minutes ago 0 replies      
Others have expressed most of these already, but the issues I see with this are:

* Algorithm can't be changed/improved without changing all your passwords.

* Your master password can't be changed without changing all your passwords.

* You have to remember which sites you are already registered at, and in case of a critical bug you would perhaps need to change your password at some services (again remembering which ones they were).

With that said, I really like the outside-of-the-box thinking on this.
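
Both drawbacks fall directly out of the stateless-derivation idea. A hypothetical sketch (this is NOT the actual LessPass algorithm; the salt layout and character-set rendering here are my own invention) shows why a new master password, or any algorithm change, changes every derived site password at once:

```python
import base64
import hashlib

def derive(master: str, site: str, login: str, counter: int = 1, length: int = 16) -> str:
    # Hypothetical stateless derivation: nothing is stored anywhere, so
    # every output is a pure function of (master, site, login, counter).
    salt = f"{site}:{login}:{counter}".encode()
    key = hashlib.pbkdf2_hmac("sha256", master.encode(), salt, 8192)
    return base64.b85encode(key).decode()[:length]

a = derive("master-1", "example.com", "alice")
b = derive("master-2", "example.com", "alice")
print(a != b)  # True: changing the master password changes every site's password
```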

4
jiehong 52 minutes ago 2 replies      
Not-so-good good idea?

Given that you already have dozens of sites each with their own password, you can't just import your existing passwords; you need to change all of them to start using LessPass first.

Also, if the way password generation works changes later (e.g. to fix a bug), then users are either stuck with the old version, or the bug is never fixed, ever.

5
shinigami 12 minutes ago 0 replies      
> PBKDF2 with 8192 iterations

Not nearly good enough.
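
For context, the iteration count is the knob being criticized here. A quick sketch with Python's hashlib.pbkdf2_hmac (the timings are machine-dependent; the parameters are illustrative, not LessPass's):

```python
import hashlib
import time

def kdf(password: bytes, salt: bytes, iterations: int) -> bytes:
    # PBKDF2-HMAC-SHA256; each guess costs the attacker `iterations` HMAC evaluations.
    return hashlib.pbkdf2_hmac("sha256", password, salt, iterations)

# Raising the count from 8192 to e.g. 100k makes offline guessing ~12x slower.
for n in (8192, 100_000):
    start = time.perf_counter()
    kdf(b"master password", b"site-salt", n)
    print(n, f"{time.perf_counter() - start:.4f}s")
```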

6
HackinOut 42 minutes ago 1 reply      
I wouldn't use a password manager system that doesn't have the ability to change the master password.

EDIT: You can't change any password really, without changing all of them (or having a separate master password). Seems impractical as soon as, for example, site X gets its database hacked.

7
mgliwka 24 minutes ago 0 replies      
8
sschueller 24 minutes ago 0 replies      
This one isn't bad either https://keeweb.info/
9
corobo 52 minutes ago 0 replies      
What happens when the method of creating passwords needs updating, do I then need to visit countless numbers of sites to change the password?

I like the idea don't get me wrong, I just can't see all of the downsides right now which will stop me using it.

Elephant in the room: Are you going to be sued by LastPass over the name?

5
Pencil: open-source GUI prototyping tool evolus.vn
89 points by cheiVia0  5 hours ago   38 comments top 8
1
mch82 1 hour ago 2 replies      
The original project is abandoned, but a recently updated fork is available at github.com/prikhi/pencil.

I prefer Pencil to more commonly referenced options like Balsamiq. Looks like no commits in a few months, but hopefully the fork is still going. An open source UX tool focused on software application workflow prototyping rather than GUI layout is an asset to the software development community.

2
kschiffer 2 hours ago 0 replies      
Been using this for a while due to student-related poorness. Gotta say it does what I want and has helped numerous times, also in bigger projects. The low fidelity helps you focus on GUI structure.
3
huhtenberg 2 hours ago 3 replies      
Seems like an abandoned project. Website copyright is 2012, last stable release 2013, last news update over a year ago.

What's the context? Is it somehow notable regardless?

4
ensiferum 1 hour ago 1 reply      
Or you can just use Qt's Designer which makes it very fast to knock up a GUI. (and then later evolve it into the real thing as well)
5
pmlnr 1 hour ago 3 replies      
http://pencil.evolus.vn/Next.html

> Electron as the new runtime

No. Just no.

6
pqdbr 1 hour ago 0 replies      
I love Balsamiq (desktop version), but lately I'm finding it such a memory hog on OS X that it's rendering my machine unusable, and that's with not-so-large projects (30+ mockups).
7
vsviridov 1 hour ago 1 reply      
Poor native support on OS X. It doesn't even let you hide the window.
8
gobusto 2 hours ago 0 replies      
At first, I thought that this was referring to https://github.com/pencil2d/pencil
6
Time to Dump Time Zones nytimes.com
442 points by prostoalex  13 hours ago   401 comments top 83
1
darawk 13 hours ago 17 replies      
Hmm...this is an interesting idea. However, the core of the argument seems to be this:

> The economy (that's all of us) would receive a permanent "harmonization dividend": the efficiency benefits that come from a unified time zone.

But this editorial is pretty light on actually supporting that. The basic argument seems to be that it reduces 'translation costs'. But... does it? What about the benefits of being able to refer to times without having to localize them? If my friend on the other side of the country says "I woke up at 9 this morning", I have a pretty good idea of what that means. If we used this new system, I'd have to mentally translate.

In terms of scheduling things, it would get easier in some ways and likely harder in others. If, say, I want to schedule a conference call at 3, yes, 3 is the same time for everyone, but I'd still have to do some mental sanity checks to ensure that the time is reasonable for everyone who might be participating.

Overall, is there really an efficiency gain to be had here? I'm not taking the firm position that there isn't, btw. Just a bit skeptical and curious to hear a better argument in its favor if anyone has got one.

2
pfarnsworth 13 hours ago 4 replies      
Getting rid of Daylight Savings makes complete sense, and it's something we should really pursue.

Getting rid of Time Zones is ridiculous. People know that 6am roughly is morning, and 6pm is roughly the evening. When you're dealing with someone internationally, you know not to call them at midnight their time because there's a high probability they may be sleeping. Having time roughly follow a standard around the world makes absolute sense because we're human.

3
apaprocki 9 hours ago 1 reply      
Maybe I'm the contrarian. I don't mind timezones at all. The thing that throws us for a loop is changing the timezone either temporarily due to some custom (Daylight Savings, Ramadan, elections, etc.) or permanently with little notice (populist tendencies in governments).

I maintain timezone infrastructure and what I'd rather see is an international treaty that all changes to country timezones require some standard period of notice. Even 90 days would be better than what we have now. Getting international politicians to care about timezones seems like a losing proposition before even starting, so I was imagining a "hack" to an existing treaty. The best one I could think of was an amendment to the International Maritime Organization (IMO). Perhaps if governments were made aware that screwing with, say, DST 3 days before it is about to go into effect would violate some treaty, they would shy away from that behavior. I know that's something the IATA could get behind.

4
edblarney 13 hours ago 9 replies      
The article is upside down.

"Perhaps you're asking why the Greenwich meridian gets to define earth time."

It doesn't. Everybody gets to have their own, proper time.

In comparing times, we +/- based on an arbitrary spot, and that's it.

Time zones are a great solution to a problem.

People want their time in local terms.

Everyone waking up and going to bed at different times is an utterly ridiculous concept.

FYI - if you want to use UTC - you're free to do that today.

To those suggesting we should use UTC: walk down the street and look at regular people. 'Other time zones' are irrelevant to them - utterly. There are very few people who need to deal with other time zones.

It might be remotely possible to put an entire nation on one time - i.e. put America on 'Mountain Time' - it might be possible to convince New Yorkers and Californians that they are getting up earlier/later etc.. But even that wouldn't be very useful in the end.

5
captainmuon 41 minutes ago 0 replies      
Another radical idea, we could do the opposite: Use local astronomical time everywhere.

One reason unified time zones were introduced was train timetables. I haven't used one directly in quite a while; instead I've used websites, apps and public displays. They could all accommodate the fact that different train stations had slightly different times.

Unified time is also important for TV show airing times. But scheduled TV is getting less and less important compared to video on demand. (Also if you really wanted to, future set-top boxes could modify the text of announced times on-the-fly. Text-to-speech has become pretty good :-))

I believe this would also encourage places to adjust their opening hours relative to sunlight, which would probably be healthier and better for the environment in the long run.

6
powvans 12 hours ago 3 replies      
I've spent the last several years developing applications that have calendaring at the core, where users collaborate across timezones, and where proper timezone handling is expected.

Just try to explain to everyday people how hard the problem is, the technical ins and outs, the practical miracle that it all works in almost all cases, and that all the error-prone complexity could be eliminated by acknowledging that it's the same time everywhere all the time. They will not understand and they will not care. They will most likely consider you crazy. Like Don Quixote tilting at windmills.

We implement technical solutions to hard problems in order to simplify daily life for the 99.9% of people who do not care about things like timezones. Notwithstanding the elimination of DST, this article is asking for the opposite. Furthermore, behind the scenes we already deal with time in the agreed-upon standard.

If your dates are not stored in UTC in your database, you are doing it wrong. If your API and client software do not deal in dates in UTC, you are doing it wrong. If your dates are not localized when displayed to end users, you probably have very unhappy users.
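
The store-UTC / display-local rule looks like this with Python's standard library (a sketch; zoneinfo needs Python 3.9+ with tz data available):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Stored/transported value: always UTC.
stored = datetime(2016, 11, 7, 21, 0, tzinfo=timezone.utc)

# Localization happens only at the display edge.
for tz in ("America/New_York", "Australia/Sydney"):
    local = stored.astimezone(ZoneInfo(tz))
    print(tz, local.strftime("%Y-%m-%d %H:%M"))
```

Note the same stored instant renders as the afternoon of one date in New York and the morning of the next date in Sydney, which is exactly the localization work the database and API layers should never do.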

7
jameshart 7 hours ago 1 reply      
This proposal focuses on timezones as names for hours and completely glosses over what the relationship is between timezones and days. Does Monday November 7th 2016 run from 00:00 UTC - 23:59 UTC? Do people in Sydney start work at 21:00 UTC on Sunday and come home at 17:00 UTC on Monday? Or do different places around the world have different times when they flip the calendar over? (in which case you've just recreated timezones by another name).

You don't get to get rid of the international dateline, either, because when every country chooses which two daylight-periods a week to use as their local weekend, even if everybody aligns them with their closest neighbors to the east and west right the way round the world, someone's going to find themselves having to make a choice between matching their western neighbor or their eastern one, because they disagree by a day.

8
cyberferret 13 hours ago 1 reply      
Living in a place that doesn't have the concept of daylight saving time, I positively hate the dance that we have to do twice a year with our interstate and overseas brethren to arrange meetings etc.

Having been a commercial pilot, I also appreciate the concept of 'Zulu time' where EVERYONE is on the same page as to when an aircraft will depart or reach a particular waypoint. No need to wonder if it is during morning, noon or evening, if a crew member said they would be at a particular location at 0421, we all knew how many minutes ahead or behind we were, no matter where we are in the world. After all, everyone who cares about that reference is already awake and working at that time.

Currently, I work with a widely distributed remote team across the world. Yes, arranging meetings is hard in order to ensure that it fits with working schedules and awake times, but at the end of the day, I usually also clarify the meeting times in UTC times, just to ensure that everyone can double check in their local timezones.

This means using only one fixed datum for checking, rather than figuring out the remote timezone and its daylight saving status and trying to work back to your local timezone. For instance, if I am in Australia and you are in the US, each of us has to figure out which sub-timezone the other is in before working it out. I have no idea what zone Kalamazoo is in off the top of my head, and I bet you have no clue what timezone Darwin is in?!?

In our web apps, we always set our servers to the default UTC timezone, and try and use humanised time displays all over the app (i.e. "Updated 34 minutes ago") or use local browser timezones to display actual time. This way, it doesn't matter if someone in the Ukraine or in Alaska enters a record, it is always "x minutes from a fixed datum" no matter what your local time is, and it seems to make more sense to the users.

9
kryptiskt 13 hours ago 5 replies      
So you want to abolish time zones (https://qntm.org/abolish)
10
olliej 13 hours ago 0 replies      
Someone wrote an article a while back about why this is a stupid idea -- mostly things like "if I want to call so and so in Australia I would now need to find out what time of day in Australia is the time I consider to be evening"

Computers already work in terms of UTC (assuming correct code ;) ) -- this article is mostly "I know what time I'm in, why should I have to consider other people?"

If we were to get rid of anything it should be DST, as we aren't agrarian and there's no evidence that the modern "save power" claims are remotely accurate. But then again it's a politically cheap thing for politicians to change it to "help the environment", and helping the environment is always good, right? :-/

11
sschueller 50 minutes ago 0 replies      
Swatch tried to do that a 'very' long time ago: https://www.swatch.com/en/internet-time/
12
glook 13 hours ago 0 replies      
I remember https://en.wikipedia.org/wiki/Swatch_Internet_Time - I loved the concept. Maybe if we switched to beats we wouldn't have an association for 'noon.'
13
nxc18 9 hours ago 0 replies      
Scheduling a recurring meeting of busy people (think college students with busy schedules) across continents, cultures and countries is hard enough without timezones and arbitrary changes in daylight savings time.

For example, Dubai doesn't have DST - it never changes relative to the others. Kosovo and Croatia change a week before the U.S. changes. Then the U.S. changes but Dubai doesn't.

For about 6 weeks out of the year, scheduling is confusing chaos, and a workable schedule under one time configuration very likely doesn't work for the others, keeping in mind that a midday meeting in the U.S. is pushing on midnight in Dubai and family time in central Europe.

The whole situation is a disaster, and while manageable, certainly takes a lot of effort, planning, and luck to get right.

14
mjevans 13 hours ago 2 replies      
As this version of the thread seems to have more points I'll comment here.

I completely agree with everyone using UTC for numeric time numbering.

I DISAGREE with remapping 'noon', 'midnight', 'morning', etc. All of the relative descriptions for when in the local solar day a thing happens should remain approximate local references.

An example: 'lunch' and 'noon' would still be the time in the middle of the local solar day that people get a meal. (Somewhere around 11AM to 2PM in current local times)

15
brongondwana 9 hours ago 0 replies      
Come back to me when you've managed to convince the USA to adopt metric, which is significantly more of a no-brainer than throwing away timezones.
16
stretchwithme 2 hours ago 0 replies      
I've been thinking this for years, as dealing with time has been a chore in software development.

But I think it would make life simpler for those with the resources to deal with time zones and harder for those with less resources.

You wouldn't want a change made in medicine that made life easier for doctors but harder for patients, would you? I mean, which one is trained to deal with complexity and is well paid to do it? And which one is often overwhelmed by what's happening to them?

Of course, software is going to make time easier and easier to deal with, so perhaps it doesn't matter if we change how we deal with it or not. Not in the long run anyway.

Software may get so good at dealing with time that we each can deal with it exactly how we want to and the software we used to share information will seamlessly deal with all the translation back and forth.

17
upofadown 10 hours ago 2 replies      
If you have UTC and solar time available, you don't need zone time anymore. If you need to coordinate with other people you use UTC. If you want to do things at a particular time of day you use solar time.

If you decide to go to work at, say, an hour after sunrise you get all the advantages of DST, anywhere in the world, without any of the disadvantages. The reason DST sucks is because of the way time zones force all time to be an even number of hours (yes, I know there are exceptions, but the principle still holds).

18
saretired 4 hours ago 0 replies      
This editorial seems to assume that clocks were more or less synchronous prior to the introduction of time zones -- this is completely false. Time zones were introduced (at the behest of the railroads in the U.S.) because communities had their own local mean time, and often those communities were not distant from one another. In other words, there were an indeterminate number of time zones worldwide. The current time zone map, for all of its peculiarities, is hardly difficult for people to grasp: it gets light in the early hours, etc. China has a single time zone for the entire country -- perhaps the author expects a Maoist "harmonization dividend" worldwide, but speaking to Chinese friends over the years I'm not convinced that such a happy dividend ever materialized there.
19
barnacs 12 hours ago 2 replies      
I'd go even further and divide every human settlement into multiple districts with schedules shifting gradually between them.

Like, in one "district" of the city most people would be sleeping, it would be quiet and somewhat dead, while at the same time a few "districts" over in the same city businesses would be open, including banks, offices and other stuff on extremely strict schedules currently. Yet in another district it would be leisure time, where most people would be doing whatever they do between sleep and work.

The point is, you could always find the appropriate district for whatever you want to do, be it business, leisure, sleep or anything.

Also, with digital timekeeping devices (watches, calendars, digital presentation of business hours, etc) schedules could all be dynamic. Instead of "2016-11-09 16:00" you could schedule things like "3 days and 2 hours from now" and all devices would dynamically keep track of how much time is left until the event.
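
The relative-scheduling idea in the last paragraph is just timedelta arithmetic over an absolute anchor (a sketch; a real device would still persist the absolute instant so the countdown survives restarts):

```python
from datetime import datetime, timedelta, timezone

# The moment the user schedules "3 days and 2 hours from now".
scheduled_at = datetime(2016, 11, 9, 14, 0, tzinfo=timezone.utc)
event = scheduled_at + timedelta(days=3, hours=2)
print(event.isoformat())  # 2016-11-12T16:00:00+00:00

# Each device re-derives the remaining countdown from the absolute instant.
now = datetime(2016, 11, 10, 14, 0, tzinfo=timezone.utc)
print(event - now)  # 2 days, 2:00:00
```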

20
Lazare 1 hour ago 0 replies      
I remember when I thought the same thing. Then I spent a couple hours thinking about it, and realized what a terrible idea it was.

In my day to day life, I often need to talk to fellow tech people (engineers, support, etc.) in my own time zone, but also in places such as Sydney (2 hours behind me) or California (20 hours behind me). As a society, we've agreed that tech people like that are generally working 9-5, give or take, and we have a universally agreed mapping of locations to adjustments to local time (aka, time zones).

This means that I can look at a clock and realize "oh, it's only 10am, I shouldn't expect a reply to my email to that guy in Sydney yet". Even if I'm dealing with a place I'm not really familiar with, I can look up the timezone, or just google "time in Berlin", and I now can translate appropriately.

But what if we abolished that mapping (aka, timezones)? Presumably every location would just continue on about their life as before. Sydneysiders with desk jobs would stop working 9-5 AEST, and would start working 19-3 UTC; New Yorkers in similar roles would be working 4-12 UTC. Sounds odd, but it would quickly start to seem normal.

Problem: We just abolished the mapping tables that let me adjust for local time. It was trivial to work out that my Sydney colleague was probably out to lunch at 12:45 AEST, but how am I meant to know if he'll be out to lunch at 22:45 UTC?

The only way to make this work is to have or create some form of mapping that tells me the adjustments to make to bring Sydney schedules in line with my own schedule. Eg, "I go to lunch at 00:30 UTC, and Sydney is about 2 hours behind me, so they'll be at lunch now". Because without that table, I simply have no clue what (or rather when) people in Sydney are doing.

But that table is just another word for time zones, the thing we abolished. Even better: while there are workarounds, most of them are very difficult to computerize. For example, I could check the corporate website of the guy I'm trying to call, see if they list office hours (in UTC), and then try to work out when they'll be at lunch, but my calendar app wouldn't be able to do that. It needs a formal, agreed-upon table of adjustments. Which would be time zones, whatever we want to call them.

TL;DR: This is an idea that makes a lot of sense to people who only have to deal with people who are geographically close to them. If you're actually dealing with people around the world, time zones are not an inconvenience, they're critical.

21
laurieg 4 hours ago 0 replies      
I think less reliance on clock time would be a good thing for the modern world, but I don't think getting rid of time zones would achieve that.

It's amazing how much of modern life is unnecessarily affected by the time on the clock. When daylight savings time rolls round in the UK you can make 60 million people change their schedule by an hour, and even miss an hour of sleep while doing so! Who would have thought the humble clock had such control over people?

In the modern world of plenty, why is it so incredibly rare to meet someone who goes to bed when they're tired and wakes up when they're not?

22
henrikschroder 6 hours ago 0 replies      
This is stupid. This is obligatory reading: https://qntm.org/abolish
23
artpepper 13 hours ago 0 replies      
This is basically a form of Utopian thinking, that we should adopt a convention because it's "logical" rather than one that meets human needs.

> People forget how recent is the development of our whole ungainly apparatus. A century and a half ago, time zones didn't exist.

That's true, but not because everyone was living on GMT. Time zones are somewhat arbitrary, but aligning everyone to a single time zone is even more arbitrary.

24
taeric 13 hours ago 0 replies      
This should be required reading for anyone thinking time zones are a bad idea: https://qntm.org/abolish
25
egypturnash 8 hours ago 0 replies      
okay so

let's substitute one completely arbitrary time measurement (Greenwich Mean Time, which is basically "solar time outside an observatory in Greenwich, England") for an intricate set of time zones that is, admittedly, confusing, but also has some vague relationship to "solar time in the area covered by the time zone"

it'll be great

almost everyone on the entire planet will have to get used to workday hours being different numbers, and we'll still have to do timezone calculations when we want to try and make realtime contact with someone in what was formerly another time zone

it'll be super awesome

oh hell let's just all switch to Swatch Internet Time while we're at it, throw out all the analog clocks.

(I mean yeah, fuck daylight savings time anyway, but sure let's throw out the baby along with the bathwater and make life marginally easier for people who regularly schedule intercontinental phone calls. They're the only people whose opinion matters evidently.)

26
droithomme 12 hours ago 0 replies      
UTC+8 has the largest population by far:

http://artscience.cyberclip.com/world-population-by-time-zon...

Therefore that is what everyone should use rather than Greenwich time as it would provide the least disruption.

27
reflexive 8 hours ago 1 reply      
The advantage of daylight savings is I can set my schedule to wake up at the same time each day, while maximizing my daylight hours and never getting up before dawn.

Without the shift for daylight savings, the sunrise time in San Francisco varies from 5:47a at the summer solstice, to 8:25a just after winter solstice. If I set my schedule to get up at 8:25a every day, I'd miss a significant amount of morning sunlight.

The beauty of daylight savings is it shifts the sunrise time to be earlier in winter, so instead of 8:25a the sun rises at 7:25a. Thus I can set my schedule to get up at 7:25a and get an extra hour of sunlight every day.

(Technically the latest sunrise time is just before shifting from daylight to standard, 7:39a on November 5 - the point is we reduce the range of variation from 2hrs 38min to 1hr 38min)
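
The range arithmetic in this comment, spelled out (sunrise times are the ones quoted above):

```python
from datetime import timedelta

def hm(h, m):
    return timedelta(hours=h, minutes=m)

earliest = hm(5, 47)          # summer solstice sunrise, SF (on the DST clock)
latest_no_shift = hm(8, 25)   # winter solstice sunrise if the clock never fell back
latest_with_shift = hm(7, 25) # the same sunrise after falling back to standard time

print(latest_no_shift - earliest)    # 2:38:00
print(latest_with_shift - earliest)  # 1:38:00
```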

28
gdw2 7 hours ago 1 reply      
I would imagine for many places, a normal 'day' would actually be split across two dates. That would be confusing!
29
endymi0n 3 hours ago 0 replies      
Public Service Announcement:

https://qntm.org/abolish

(So you want to abolish Time Zones...)

30
mckoss 10 hours ago 0 replies      
I don't think people could easily adjust to having the date and day of the week change every day at, say 2pm (for those on PST).
31
cushychicken 8 hours ago 0 replies      
China has one time zone. People outside of Beijing and the eastern Chinese seaboard don't experience sunrise until as late as 10 AM in some places. I don't find that terribly sensible, frankly. It might make scheduling communication easier, but I agree with a lot of the other comments pointing out that 9 AM actually means something to people as far as time of day goes.
32
rcarmo 13 hours ago 2 replies      
I'd be happy with doing away with Daylight Savings Time.
33
danso 13 hours ago 0 replies      
Dumping time zones might be painful, but in the long run it'd probably be the right thing to do. First, it'd make learning about time as a programmer much easier, because you'd inherently understand why epoch time and UTC are a thing, instead of storing values like "Sunday 4PM".

But more future facing...what are we going to do when we have colonies on the moon and Mars? Things are going to spiral fast if we don't stick to uniform time while we're still mono-planet.

As an example, NASA's Mars photos API provide an option to sort by Martian sol instead of earth date: https://api.nasa.gov/api.html#MarsPhotos

34
contingencies 5 hours ago 0 replies      
The author James Gleick wrote The Information which was recently recommended to me but I found a bit slow-going and obtuse. It didn't really resonate with me coming from a comp-sci pragmatist background, though some here may enjoy it.

As far as TZ's go, IMHO one of the biggest issues is the TZ database which itself lacks support for i18n and many modern pragmatic concerns. I once made a proposal to modernize it but it fell on deaf ears... it is, understandably, fairly conservatively maintained.

35
glandium 6 hours ago 1 reply      
I agree with the sentiment in many of the threads that this is an awful idea. OTOH, there is some level of confusion with time zones in the US: I don't live in the US, but the few times I went there, it took me a while to figure out the whole "8|7c" thing for TV programs, where, AIUI, the east and west coasts get to have programs at the same local time (8), but obviously not the same absolute time, and states in the middle get to have programs at the same absolute time as the east coast, but earlier in their day (7).

Or did I get it wrong, and the whole thing is even more confusing?

36
taf2 10 hours ago 0 replies      
I wish this were about daylight saving and the random changing of the clocks. I can live with the different time zones, and I think they are even kind of nice, but daylight saving changing the current time is terrible. You could argue it has cost real money (think of Azure going offline), and think of all the countless hours people have spent dealing with the change.
37
vidoc 8 hours ago 1 reply      
Very interesting article. I've wondered about a universal time several times; tons of pros and cons, sure, but it's cool that people are thinking about such disruptive alternatives.

Reminds me of this cool episode of 99% Invisible [1] where they talk about calendar design; in particular, the Kodak company used a 28-day, 13-month calendar until the 70s, and it turns out it was beloved by teams who did a lot of forecasting.

1: http://99percentinvisible.org/episode/the-calendar/

When I see how tough it is to move to the metric system in the US, I'm not quite sure we'll ever see such a change any time soon though!

38
guilt 3 hours ago 0 replies      
I think the first thing to dump is the daylight savings time concept.

Then rebase the timezones based on UTC.

It won't help when you have a polar winter or a midnight summer - but at least it will remove unnecessary conversions.

39
pipio21 11 hours ago 0 replies      
"Time" has always been linked to sun or moon or the stars.

In English you say o'clock because it is the time of a mechanical clock; for a long stretch of history it was normal, and far more precise, to use the sun, so 12 was the time the sun was at its highest (zenith).

Astronomers used the moon and stars for calculating time with extreme precision, and they continue doing that.

This always gives you local time at the point of the observer. The man who wrote the article probably lives without contact with the environment, in a city, and goes to work in a building without sunlight; but for those of us who don't, knowing when the sun will be at its highest, and when it rises and sets, is a great idea.

With GPS-enabled clocks, like the Apple Watch and future smartwatches, every person could carry on their wrist the real local time, the political time-zone time, and UTC. No need for globalists trying to force us into doing that.

In America, and especially in New York, it looks like the only important things in life are first money, then the economy. Instead of money and the economy being in service of the world, in those places the world has to serve the economy and money.

40
facorreia 19 hours ago 2 replies      
I think the article makes a valid point but it can't get past Americanisms like "noon" or "4 p.m."

If people were to adopt UTC, those would be 12:00 and 16:00 respectively.

And that just reinforces the main point, that the numbers are largely arbitrary -- but descriptions like "after mid-day" (p.m.) aren't.

41
foxylad 9 hours ago 0 replies      
We run a global booking service. Fixed timezones are simple to accommodate, but abolishing daylight savings would save me hours of scribbling clock faces.

The issue is that when someone books a given time across a DST change, you need to adjust the unix timestamp in the right direction. And in my experience, your first guess at the direction is always wrong - even if you know your first guess is always wrong.

Still, I guess that making a few developers' lives easier is less important than saving several lives a year due to reduced pedestrian deaths.

42
leroy_masochist 8 hours ago 0 replies      
Meh. The military has been trying to run everything on Zulu time for years and it hasn't gotten a lot of traction because everyone seems to hate it. (Speaking from experience on the ground side, I know it is a bit more commonplace and accepted in the aviation community).

At least as far as the US is concerned, if we're going to change a major standard of measurement, let's focus on going over to the metric system.

43
nullc 3 hours ago 0 replies      
We can't even manage to agree to get rid of the nearly pointless and _highly_ disruptive leap second...
44
131hn 11 hours ago 0 replies      
Having all the clocks of the world in sync would be a blessing, but there would remain a gap in time information, so people would add a location offset themselves (we mostly use time to describe life in our own time zone): "yes, it's 4 am, and I'm in Paris". That location offset would be nothing but current time zones, with an inverted logic. I know we are all on different schedules and rhythms, but 7am is a good time for breakfast everywhere, 12:30 feels fine for lunch, kids go to bed by 8pm, and we should be asleep by 2am. I guess time zones make life easier for 80% of the world (I don't think it's 99% anymore).
45
frik 5 hours ago 0 replies      
Time to dump summer vs. normal time. Switching twice a year, while every other country chooses a different date or doesn't switch at all... It's time to stop switching between summer and normal time.
46
syrrim 5 hours ago 0 replies      
Let's review the arguments:

- Time hasn't always had zones, so why not get rid of those too?

- Nazi germany, communist china, and north korea all do it, so clearly this is a policy befitting the modern world. Presumably alongside mass censorship and death camps.

47
ajmurmann 10 hours ago 3 replies      
I have huge issues with DST and how we measure time in general (24 hours, 60 minutes, 60 seconds?! Who the f* came up with these crazy-ass numbers?!). But what bugs me much more is the insane calendar: 12 months with arbitrary numbers of days. That's just crazy. At the very least we could make it so that January to May have 31 days and the remaining months get 30. Ideally we would switch to the positivist calendar. That would make everything date-related trivial, easy to remember, and easy to calculate.
48
namank 13 hours ago 0 replies      
So 2pm Earth Time for me in North America means afternoon but it means night for Asia?

This might make sense if Earth's communication with extraterrestrial localities significantly outweighed communication across Earth.

But it doesn't.

So, no.

But upvote to the OP for posting an interesting idea. :)

49
ced 12 hours ago 0 replies      
People on the West Coast would start their work week on Sunday around 11PM, and finish it on Monday at 8 AM. Then they would agree to a Wednesday morning meeting, and miss each other by a day.
50
grzm 1 day ago 3 replies      
grr...

"A century and a half ago, time zones didn't exist."

Sure. How old are the concepts of noon? Midnight? The transition to railroad time and time zones wasn't from a single time zone to many, but from incredibly local time zones (what time is noon in this town?) to fewer.

Likely a lack of imagination or caffeination on my part, but it's hard for me to imagine what it would be like to have shops open at, say 22:00. And we still need to take into account differences across the world for coordinating with people. It's not like we're going to change our diurnal habits just because it's, say, 14:00 across the world at the same time.

Wow. This is one of the most caustic comments I've left on HN. Someone back me off of the ledge. Off to read the Cato Institute commentary linked to in the article.

(And you kids! Get off my lawn!)

51
mememachine 4 hours ago 0 replies      
Time zones make cross-geographical communication far easier. This is important in the age of the internet.
52
Spooky23 13 hours ago 0 replies      
Interesting thought experiment, but what problem does it solve?
53
ChicagoDave 8 hours ago 0 replies      
Having worked on computers for 30 years and seeing a recent uptick in globalization of system usage, getting rid of time zones would be a fucking awesome change to localization nightmares with data.

DO IT!!!

54
chx 10 hours ago 0 replies      
> No more wondering what time it is in Peoria or Petropavlovsk.

Sure, I know it's 10am in Petra and so what? Is my partner in the tourist business dead asleep or just stirring? If you detach these numbers from the life of people then you need to find some other way to remember their cycle.

55
bitwize 8 hours ago 0 replies      
Time zones I'm okay with. But DST needs to GTFO.
56
amelius 12 hours ago 0 replies      
The article is a little short-sighted. What time will we use when we start colonizing other planets? Those planets may not even have a 24-hour day. And to make matters worse, relativity tells us that a global clock doesn't even exist (look up simultaneity).
57
drallison 16 hours ago 1 reply      
Time zones and, worse, leap seconds interfere with and needlessly complicate recorded time -- the ability to compare two times and compute the time interval between them. Anyone who has tried to work with geophysical data from diverse locations and organizations knows how difficult that is.

Now would be a good time to adopt this change, given the way standard units (mass, length, time, etc.) are being redefined.

58
ryanbertrand 4 hours ago 0 replies      
It would reduce a lot of code in Date libraries :)
59
m1sta_ 7 hours ago 0 replies      
A single, open, calendar interface for all people would solve this. I cannot see that happening though.
60
ericzawo 13 hours ago 0 replies      
A crazy thing my brother and I noticed this morning: the last clocks we have to change manually ourselves are our watches. Yes, even the oven has a built-in DST time change.
61
remoteme 9 hours ago 1 reply      
We should use local relative times based on sunrise.

"I start work at 3 past".

Meaning he or she starts work at 3 hours after sunrise.

62
nealmydataorg 13 hours ago 0 replies      
It will be difficult for all the countries to agree to dump time zones. If they agree, then adjusting computer systems will take an effort similar to the Y2K effect.
64
jwfxpr 6 hours ago 0 replies      
The United States still uses Imperial measurements and Fahrenheit, and the NYT publishes an article arguing for an unnecessary, human-unfriendly re-standardisation of time zone(s)...?

Mmhmmm. The whole world's gonna take THAT idea seriously.

65
mixedCase 12 hours ago 0 replies      
Archived version: https://archive.fo/oZVQ2
66
brianwawok 12 hours ago 0 replies      
I vote for infinite time zones. Each town picks its own time zone based on solar noon.

We have the tech to map it for us when traveling.

Bringing 1900 back.

67
shmerl 10 hours ago 1 reply      
I don't mind ditching DST. It's a mess. Not sure why it's still in use.
68
return0 10 hours ago 0 replies      
This will be useful when we've colonized a few planets. Until then, hell no.
69
bbcbasic 13 hours ago 1 reply      
Sounds good to me. If the farmers want daylight savings then simply go to work at a different time, UTC.
70
petre 5 hours ago 1 reply      
First get rid of leap seconds.
71
taylodl 11 hours ago 0 replies      
How about we start with ending daylight saving time and see how that goes, hmmm?
72
matjaz2k 13 hours ago 0 replies      
Wait until we go interplanetary. :)

Then also "let's talk in 5 mins" won't work anymore.

73
Mz 11 hours ago 0 replies      
Prior to railroads, local time was determined by making it Noon when the sun was directly overhead. Time zones became necessary when railways spread in order to be able to catch your train. Airplanes use Zulu Time because they can fly fast enough to make it a confusing mess to do anything else. This is the same problem the railways had, but the next order of magnitude up.

We already have and use Zulu Time for airplanes. There is no compelling argument here for making that universal. Local time zones still make sense in terms of setting schedules for local services, jobs, etc. The fact that I can talk to people all over the world isn't a compelling reason to move to a singular Earth Time.

74
paulddraper 10 hours ago 0 replies      
Agreed. Baby steps: get rid of daylight saving time.
75
InclinedPlane 6 hours ago 0 replies      
No, just no. You still need to keep track of local times relative to the Sun because that's important. And time zones are the best way to do that. We could improve our usage of time zones, but abandoning time zones is just a silly idea.
76
peterwwillis 8 hours ago 0 replies      
Has anyone else noticed that dialing a phone number to call or text someone is like using their IP address to send them an e-mail?
77
grzm 13 hours ago 0 replies      
78
abcd_f 13 hours ago 0 replies      
Looks paywalled to me.
80
grzm 13 hours ago 1 reply      
mod: Can you please combine the three threads?
81
Longhanks 13 hours ago 0 replies      
> Don't have an account? Sign up here!

No thanks.

82
adamnemecek 13 hours ago 0 replies      
What will Jon Skeet do now?
83
gwbas1c 13 hours ago 4 replies      
Ok, pretty much every comment here is about why time zones are important. I agree!

But one thing to ponder: Just like telling time in 24-hour mode is useful, perhaps it will be useful to have some clocks display UTC? Perhaps it'll be useful to talk about international events in UTC?

Perhaps instead of abolishing time zones, we just use UTC as a convention when talking about events that happen across time zones?

7
Shyness: small acts of heroism the-tls.co.uk
19 points by pepys  2 hours ago   1 comment top
1
exergy 1 hour ago 0 replies      
Very nice read. I used to be a particularly shy person, but the difference between shyness and being anti-social in general is that a shy person _wants_ to connect with others but is afraid of the outcome of attempting to do so.

This is why, I think, shyness cures itself as we move through life (for many if not most people). We come to realise that a) we don't matter as much as we think to the other person, and b) sharing our thoughts and feelings helps other people connect to us.

Small aside: I like the idea that natural selection would encourage a variety of personalities in a social species, rather than uniform, homogeneous super-beings, so that different sections would be competent at different tasks, leading to the collective improvement of the whole.

8
Twitter Could Have Become a Protocol austingardnersmith.me
107 points by gardnersmitha  9 hours ago   67 comments top 20
1
macca321 38 minutes ago 0 replies      
They built it.

They had an experimental project called 'annotations' where you could attach 1k of json to each tweet, like a DIY microformat.

I got onto the beta and created a prototype Twitter client where you could attach mini 'apps' to tweets based on the payload type, e.g. you could tweet out a poll, or an invitation to play a game, or a job advert, or whatever, and you could attach your own app as a listener.

I think it could have been pretty amazing, basically A Message Bus For Everyone. If you build this yourself, you run into the chicken-and-egg problem; Twitter could have pulled it off, as they had the eyeballs and the developers.

Unfortunately they pulled the plug on it, around the time they started to close down the ecosystem.
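To make the idea concrete, here is a purely hypothetical sketch of the kind of client described above: each tweet carries a small JSON payload, and the client dispatches to registered mini-apps by payload type. Every name here is invented; Twitter's actual "annotations" beta API is not documented in this thread.

```python
import json

HANDLERS = {}  # payload type -> mini-app render function

def app(payload_type):
    """Register a mini-app for a given annotation payload type (hypothetical)."""
    def register(fn):
        HANDLERS[payload_type] = fn
        return fn
    return register

@app("poll")
def render_poll(data):
    return f"Poll: {data['question']} ({len(data['options'])} options)"

@app("job")
def render_job(data):
    return f"Job: {data['title']} at {data['company']}"

def handle_tweet(text, annotation_json):
    """Dispatch on the payload type; fall back to the plain tweet text."""
    data = json.loads(annotation_json)
    handler = HANDLERS.get(data.get("type"))
    return handler(data) if handler else text

print(handle_tweet(
    "Vote!",
    '{"type": "poll", "question": "Tabs or spaces?", "options": ["tabs", "spaces"]}',
))
# Poll: Tabs or spaces? (2 options)
```

Any third party could register a new payload type without Twitter's involvement, which is what would have made it a message bus rather than just a timeline.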

2
bryanrasmussen 1 minute ago 0 replies      
I remember a past boss of mine said one time "it's hard to monetize protocols".

It's really the only thing he said I thought was smart, so it stuck with me.

3
ronjouch 7 hours ago 2 replies      
A similar complaint came up here recently on a thread called Next steps for Gmane [1], starting with "I really miss the newsgroups that focused on just the messages, and could be consumed by any NNTP client, stored offline, searched, etc."

Quoting a portion of a great answer [2] from HN user niftich, that feels very appropriate here:

> "Not enough people make new running-on-TCP or running-on-UDP protocols because new protocols are hard to design, they don't work with the one application where everyone spends 70+% of their time (the web browser), and they probably get blocked on a middlebox except if you use port 80 or 443 and fake being HTTP anyway. For all but very specialized use-cases, vomiting blobs of JSON (or if you want to feel extra good, some custom binary serialization format like protobuf or Thrift or Cap'nProto or MessagePack) across HTTP endpoints is pretty okay."

[1] https://news.ycombinator.com/item?id=12440230

[2] https://news.ycombinator.com/item?id=12440783

5
wvenable 7 hours ago 3 replies      
> Twitter had a chance to become a sort of de-facto API for lots of other applications.

The problem I see is that there doesn't seem to be any way to make money from this, for Twitter.

And really if everyone started using it this way, the privacy concerns would be even greater than the concerns people have about Facebook.

6
Spooky23 6 hours ago 2 replies      
Twitter was too busy navel gazing and pushing PR about how they had the ability to incite revolutions.

They were afraid to grow beyond tweets. In life you need to grow or die. Ten years from now, they'll be fondly remembered as the AOL Instant Messenger of the 2010s.

7
wlesieutre 5 hours ago 1 reply      
Didn't App.net try to be more of a protocol than twitter? How's that going for them?
8
dkarapetyan 7 hours ago 0 replies      
Yup, and they bungled that one pretty badly when they shut down the entire ecosystem. They never recovered after that. They lost all the developer goodwill, and any chance of becoming a platform was vaporized at that exact instant.
9
phmagic 7 hours ago 0 replies      
I think the author has some good ideas about the things Twitter can leverage once it becomes the de facto method of broadcasting in the world.

However, becoming a protocol would not have helped Twitter get there.

I think the better path is to build and then protect a captive audience. Instead, Twitter saw its audience base erode away not once but multiple times: to Facebook News Feed (Twitter for news), then Instagram Video (Vine).

Twitter once had a massive captive audience and unique data. Now their competitive advantages have all disappeared. Opening up more won't help them gain an audience.

10
pjc50 1 hour ago 0 replies      
One of the things currently regarded as a serious problem for Twitter is abuse. Federation doesn't make dealing with this easier, it makes it harder - look at what happened to USENET.
11
herbst 7 hours ago 1 reply      
The v1 API was awesome, and it's partly why I started to love programming. V2 was already a pain; it simply took away the simplicity. And now I can't even create unlimited accounts anymore, much less use the API with them.

Twitter lost big in that regard, imo.

12
jonahrd 7 hours ago 2 replies      
Twitter has restrictions that make perfect sense given its use, but would be really strange decisions for a protocol: ie. 140 character limit.
13
jnpatel 1 hour ago 0 replies      
We've even seen botnets in the wild, using Twitter for their command & control.

http://www.welivesecurity.com/2016/08/24/first-twitter-contr...

14
soufron 3 hours ago 1 reply      
Wasn't there already Jabber or IRC? But they don't make money. Sad world where we prefer badly coded shit that makes money to awesome tech that doesn't.
15
CaliforniaKarl 7 hours ago 0 replies      
I'm sorry, I was only able to read about a third of the way through. Around that point, my screen got taken over by something suggesting I subscribe to something. The experience was so jarring that I just stopped.

Please, don't do that! Besides, I'm not going to make a subscribe decision until I've read the whole thing.

16
hossbeast 5 hours ago 2 replies      
So where is the open source Twitter alternative?
17
shp0ngle 5 hours ago 0 replies      
Whenever I go to twitter, I think "I wish there were MORE robots and MORE automatically sent spam".

Also tweets from my fridge.

18
ilaksh 7 hours ago 0 replies      
See reddit.com/r/rad_decentralization

All of these technopolies will eventually be superseded by protocols. It just doesn't make sense to continue to rely on monopoly companies to provide core services. That's not to say there can't be variations or companies that build on top of core protocols etc. Just more than one, and not for the most common aspects.

19
nullc 3 hours ago 0 replies      
Thank the gods for small blessings.
20
revelation 7 hours ago 1 reply      
IRC is a protocol. Slack is where the money is at.

(Don't become a software architecture astronaut. IRC is also a terrible protocol. It was still successful.)

9
Energy Giant Shell Says Oil Demand Could Peak in Just Five Years bloomberg.com
123 points by Osiris30  11 hours ago   66 comments top 11
1
caseysoftware 6 hours ago 2 replies      
"Peak Oil" generally refers to peak oil production. This article is about peak oil demand.

Overall, this should be considered a good thing.

As people convert to alternatives which are cheaper - due to renewables improving - the demand for oil in some industries decreases. Which then drives further innovation into those areas and further alternatives. In fact, this is exactly what critics of peak oil have been saying for years.

(This isn't saying demand for oil will go to zero, just that it is likely to start decreasing.)

2
diafygi 11 hours ago 3 replies      
There's a great report from the UK that analyzes the losses from stranded fossil assets if we're to hit our climate change goals. It says that in order to cap warming at 2C we can only pull up 1/4 of our existing proven reserves, and only 1/3 to cap at 3C. That means stranding over 2/3rds of our fossil assets. Also, the situation heavily favors producers who have cheap assets, which are mostly sovereign producers in the Middle East. Private producers like Shell end up having to strand far more than 2/3rds because they get pushed out due to low prices.

http://www.carbontracker.org/report/carbon-bubble/

https://en.m.wikipedia.org/wiki/Carbon_bubble

3
Fr0ntBack 1 hour ago 1 reply      
If we implemented a carbon tax which makes oil producers pay for the environmental cost of oil, oil demand would probably peak even sooner.
4
glbrew 7 hours ago 1 reply      
AKA "Oh don't worry about regulating us and protecting the environment, it will naturally happen in five years when your need for us will surely peak!"
5
cheeseprocedure 9 hours ago 4 replies      
I really miss The Oil Drum.

Can anyone recommend sources for news/analysis from the energy industry?

6
siculars 4 hours ago 1 reply      
Whether it is 5, 15, or 50 years hence, peak oil demand will surely come; technological innovation will see to that. The question I have is what that will mean for all our friends in OPEC nations. Venezuela is by most estimates already a failed state, and that's simply due to mismanagement. What happens when structural demand erosion starts to take hold and no amount of state-level management can stop it? For those who say poverty breeds desperation breeds terrorism (which I don't subscribe to), we'll no doubt be in for a few volatile decades until it all gets sorted.
7
martin_bech 2 hours ago 1 reply      
Curious note, Shell has at least in Denmark sold or closed all of its gas filling stations. source http://www.business.dk/transport/shell-saelger-danske-tankst...
8
mrfusion 10 hours ago 4 replies      
Don't the current low oil prices suggest we've already hit peak demand?
9
magoon 10 hours ago 5 replies      
In my lifetime I have learned that the only prediction I can believe regarding peak oil is that there will always be another prediction about peak oil. I'm not saying it's not gonna happen (or hasn't), just that it's been said so many times before.
10
legohead 6 hours ago 1 reply      
Is this our yearly "oil is running out" journalism run?
11
monkmartinez 11 hours ago 2 replies      
If we know anything, it is that these guys (the energy industry) are pretty terrible at predicting much of anything. Ironically, it was an engineer at Shell, one M. King Hubbert, who started "Peak Oil" in the 50s.

Maybe monster trucks will take over as the prophecy of the film Idiocracy come true and oil stages a massive come back... you heard it here first!

10
Supercapacitors are now carbon-free and more powerful edgylabs.com
35 points by Parbeyjr  2 hours ago   5 comments top 5
1
kctess5 1 hour ago 0 replies      
Classic modern reporting - doesn't link to the source or even cite the paper.

Here's the source: http://www.nature.com/nmat/journal/vaop/ncurrent/pdf/nmat476...

Web version: http://www.nature.com/nmat/journal/vaop/ncurrent/full/nmat47...

2
franciscop 23 minutes ago 0 replies      
If I'm not mistaken, the biggest disadvantage of supercapacitors is their energy density by mass[1], which means that while they offer good power density by mass, it's only temporary. Good at sprinting, not so good at marathons.

It's great that there are advances in the field for many reasons, but I don't think they could replace batteries in the short-to-medium term, which seems to be the point of the "scientific click-bait" article.

[1] http://berc.berkeley.edu/storage-wars-batteries-vs-supercapa...
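A rough back-of-the-envelope makes the sprint-vs-marathon point concrete. The figures below are assumed, representative numbers for a large commercial supercapacitor cell and a typical Li-ion cell, not values from the article:

```python
# Supercapacitor stored energy: E = 1/2 * C * V^2
cap_farads = 3000.0  # assumed capacitance of a large supercap cell
volts = 2.7          # assumed rated voltage
mass_kg = 0.5        # assumed cell mass

energy_j = 0.5 * cap_farads * volts ** 2
energy_wh_per_kg = energy_j / 3600 / mass_kg

li_ion_wh_per_kg = 250.0  # assumed typical Li-ion cell energy density

print(round(energy_wh_per_kg, 1))                    # ~6.1 Wh/kg
print(round(li_ion_wh_per_kg / energy_wh_per_kg))    # Li-ion holds ~41x more per kg
```

Under these assumptions the supercap holds roughly 1/40th the energy per kilogram, even though it can deliver that energy far faster, which is exactly the power-density vs energy-density trade-off described above.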

3
cheiVia0 17 minutes ago 0 replies      
Could this be used to charge your car's supercaps in 30s and have them feed into the batteries over a longer period of time?
4
mchannon 1 hour ago 0 replies      
Carbon-free implies that there is zero carbon in this technology. In fact, the MOF material is based on organic (carbon-bearing) molecules.

"non-carbon-based" would be more apt, since this is definitely a change from carbon-based electrodes.

5
flexie 1 hour ago 0 replies      
That is good news. How long would it take for this research to result in improved batteries being marketed for electric vehicles?
11
Whatever happened to Japanese laptops? jamielittle.org
109 points by atjamielittle  11 hours ago   86 comments top 16
1
Spooky23 6 hours ago 4 replies      
The Japanese companies were pretty amazing in the commercial space.

My employer had a few Fujitsu ultra portables in the circa 2007 timeframe. One of our internal customers mentioned to a salesguy that they needed a particular port in a particular location on the laptop, not expecting anything. Two weeks later, they Fedexed a loaner/prototype with the port.

Really an amazing experience. We were buying 5-6 figure quantities of Dells and HP, and they could barely handle trivial requests re packaging, etc.

2
benzesandbetter 4 hours ago 2 replies      
In the early 2000's Japan was way ahead in the Sub-notebook market. As subnotebooks and tablets have come into popularity and wide availability worldwide, this arbitrage opportunity has narrowed significantly.

In 2007, I picked up a Panasonic R6 (10" Let's Note/Toughbook). It was a great little machine, and I ran both Windows and Debian on it. The keyboard was a bit cramped and the circular trackpad was pretty lame, but overall I loved the machine. Everywhere I went, people would notice it and ask about it. After about 14 months, the logic board failed, and I discovered that as a grey-market import, it had no warranty. This was a hard lesson, as I had paid almost $2k USD for the machine. Fortunately, I was back in the States when it happened, so I wasn't stranded abroad without a working laptop.

Earlier this year, I was in Japan and picked up a Japanese chromebook 10" for under $200. The keyboard is both english and Japanese which makes it a bit of a conversation starter. I installed Debian on it via Crouton (alongside ChromeOS). When I travel, I usually bring both my MacBook Air, and the Chromebook, and particularly in the developing world, I leave my MBA back in my hotel or apartment and bring my Chromebook with me when I'm out and about in the city.

3
chx 21 minutes ago 0 replies      
I had a Panasonic CF-Y5. As one review had it: "The exterior design of the machine's casing is reminiscent of a Sherman tank cross-bred with a 1970s sports saloon, while the lid opens with the grace of a bank vault door."

It was incredibly lightweight (1.53kg at 14" with an optical drive!) and built like, well, a Sherman tank. It wasn't only the casing reminiscing it. In a weird little apartment overcrowded by all startup coders attending a conference someone accidentally kicked it off the table and yet it was completely OK.

I could only afford it because the CF-Y7 just got out and for a while I couldn't afford them and then the B10 was no longer so incredibly tough (although the current models do mention a drop test and a pressurized test). Anyways, I am back to ThinkPads ever since. The keyboard is much better but the weight is worse. I am currently using a T420s upgraded with an 1080p screen and waiting for the Retro to happen. Our last hope.

4
chemmail 5 hours ago 1 reply      
Japan is really behind in PCs/laptops. I was just there last week, and the big PC and electronics stores only stock up to GeForce. Asus also has a big presence there now, but the prices are a lot higher than US pricing for the same stuff. Not much innovation in the computer space. Home electronics and appliances are amazing though. Their high-end Electrolux-style stick vacuum cleaners easily clear $800 USD and are really awesome. The rice cookers can hit $600-$800 and can make the perfect rice you normally only get in Japan. The microwaves have a built-in toaster oven in one unit, and it works amazingly well. Coming home, it's kind of sad to think of the conveniences we don't have, like the toilets, bathrooms, kitchens, and even video doorbells (our current "smart" stuff is still rubbish).
5
PebblesHD 51 minutes ago 0 replies      
I've always loved their [Japanese] ultraportables for as long as I can remember, probably because of their prime placement in the film and television I watched growing up. I've ended up with quite a collection of them at this point, ranging from the amazing little Vaio UX through the little Vaio P up to my 12" Let's Note. It weighs nothing, has an i5 and tons of RAM; I'm amazed they haven't taken over the US. Not to mention it's tough as bricks. They certainly have a quirky, unique charm to them.
6
mc32 7 hours ago 2 replies      
One thing that likely contributed to their downfall was that US manufacturers moved operations overseas while many Japanese manufacturers kept production in Japan, making them expensive for the US market -- that, and Apple and others catching up in terms of "sleekness" and aesthetics. People would buy Vaios despite poor hardware because they were thin and looked cool.
7
Tepix 1 hour ago 1 reply      
I bought a Toshiba Libretto L1 back in 2001 via the Internet (using translation services) and had it shipped to Germany. It was a fantastic form factor: a super-wide 10" 1280x600 display (143dpi), so wide that it left enough room for a decent keyboard. The CPU was a Transmeta Crusoe 600MHz, which was, unfortunately, the bottleneck of the system.

One other highlight I remember about this machine: despite not having any international warranty that I was aware of, when the machine developed a problem I called Toshiba Europe, and they picked it up and sent it back repaired a week later, for free.

I was in Japan this summer and also had a look at the notebooks on sale there, but nothing really caught my attention. With the arrival of netbooks and later the 11" MacBook Air, small notebooks are no longer hard to get in Europe.

8
carsongross 6 hours ago 1 reply      
I spent a lot of time on http://www.dynamism.com/ looking at the ultralight computers they had in Japan. Almost pulled the trigger a few times, they had awesome stuff.

The OQO was from that era too.

Good times.

9
Ezhik 10 hours ago 3 replies      
What's the laptop market like in Japan anyway? I don't really know a lot about it, but it's always interesting to see these random Panasonic or Fujitsu laptops not sold here, or those rare Japan-exclusive experimental ThinkPads.
10
bikitan 7 hours ago 2 replies      
I think Japan is going to be one of the first markets to see laptop sales dry up almost completely. Most people are fine with their iPhones. Where I work, all developers use MacBooks and everyone else uses a ThinkPad. A lot of other laptop models, like a lot of tech in Japan, are targeted at the Japanese market only. Dynabooks are the budget Windows machine of choice at the moment, it seems, although most young people choose Apple whenever possible anyway.
11
bluedino 7 hours ago 1 reply      
I thought this would be about the tiny models like the Toshiba Libretto
12
zumu 7 hours ago 5 replies      
My current laptop is a Toshiba, and I bought it in the US. I've had good luck with the brand over the years.
13
bdcravens 4 hours ago 0 replies      
Your anchor tags are using curly quotes, so external links aren't working.
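To illustrate why this breaks: HTML attribute values must be delimited by straight quotes (" or '), so a curly quote is treated as an ordinary character and ends up inside the URL itself. A minimal sketch with Python's stdlib parser:

```python
# Compare how a parser sees a straight-quoted vs a curly-quoted href.
from html.parser import HTMLParser

class HrefGrabber(HTMLParser):
    def handle_starttag(self, tag, attrs):
        self.href = dict(attrs).get("href")

good, bad = HrefGrabber(), HrefGrabber()
good.feed('<a href="https://example.com">ok</a>')
bad.feed('<a href=“https://example.com”>broken</a>')  # curly quotes
print(good.href)  # https://example.com
print(bad.href)   # the curly quotes are baked into the URL
```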
14
phmagic 7 hours ago 0 replies      
The iPad happened. Most of the uses for the Japanese laptops can be covered by the iPad.
15
partycoder 4 hours ago 0 replies      
Japan is more active around smartphones than laptops and desktops, since you can use them on trains, the preferred mode of transport. People spend a considerable amount of time on trains.

Small living spaces also make desktops a poor choice.

16
kevin_thibedeau 5 hours ago 0 replies      
> I have a strange attachment to Panasonic

I have a lifetime prohibition on any Panasonic products after experiencing the most craptastic DVD recorder ever to grace this Earth. It crashes at random, takes forever to boot, won't boot reliably with a disc in the drive, can't play Audio CDs without locking up, responds to remote button presses with multi-second lag, and gets the tuner channels out of sync with the OSD. Basically a complete fuckup. Their "fix" in the successor model was to add a reset button. Never again Panasonic.

12
Forget self-driving car anxiety: In the early days human drivers were the fear timeline.com
76 points by samclemens  9 hours ago   56 comments top 14
1
sshine 3 hours ago 2 replies      
Just this morning on my bicycle ride to work, I was nearly run down by a taxi making a blind entry onto a large road, and doored by a 10-year-old. The fear of cars is not "old school" but very much real. That's not to mention the aggression that drivers build up in their small bubbles. I am always prepared to get off my bike and take them up on their threatening offers. I look forward to less aggressive, more responsive computer-controlled vehicles.
2
dev_throw 7 hours ago 1 reply      
Human drivers are still dangerous. Every year, around a million people worldwide are killed due to automobile accidents. [1]

Meanwhile, in the United States alone, there are around 5 million vehicle crashes annually.[2] Although automobile safety has increased, I am curious whether accidents per capita have stayed the same.

Also, we have to consider the interaction effects between decisions taken by human drivers and self-driving algorithms. I have a feeling that they might be deleterious initially, but should be able to improve after some iteration.

[1] https://asirt.org/initiatives/informing-road-users/road-safe...

[2] pdf: https://crashstats.nhtsa.dot.gov/Api/Public/ViewPublication/...

3
M_Grey 8 hours ago 3 replies      
The problem is still human drivers, and it will remain so as long as they're blended imperfectly with autonomous systems. I don't know how much clearer this issue can be, but humans suck at driving. We're just no good at maintaining a constant vigil in those circumstances; we evolved to use time like that to power down the hungry brain.
4
upofadown 5 hours ago 1 reply      
The punitive laws were a reaction to a clear and present danger. Cars were hitting and killing people (particularly children). The eventual solution was for pedestrians to stay off of the streets. Parents were made responsible for keeping their children off the streets.

It is interesting to note that at least one of these laws seems to be returning. Reducing residential speed limits to 30 km/h is a trend now and is done to increase pedestrian safety. Driverless cars will of course also be so restricted. If it turns out that driverless cars are actually safer for pedestrians, then pedestrians will return to the streets. Then "The motor-car is not to be made as useful as it should be..." will be an issue again.

5
cperciva 7 hours ago 0 replies      
now we're worried about people driving with the aid of a second (artificial) intelligence

To the contrary, I haven't heard anyone complain about collision-avoidance systems; the paranoia comes in when the possibility of cars being driven by a single (artificial) intelligence is discussed.

6
ayanray 7 hours ago 1 reply      
What I worry about is how we can define accountability. At least you could sue a person if you got into a car accident and got injured (this happened to several people I know). Suing a huge corporation and getting bullied around, settling for less, etc. sounds possible, but that can happen with people vs. people too. How do you even make a case when you likely don't understand what actually happened, or can't even prove what happened (crypto, copyright laws)? I'm all for reducing risk, but machines will make mistakes and I don't know what happens next.
7
mlinksva 6 hours ago 1 reply      
https://www.researchgate.net/publication/236825193_Street_Ri... is a good telling of struggle between human drivers (and driving clubs, and car dealers) and other humans for the streets, or http://99percentinvisible.org/episode/episode-76-the-modern-... covers the same material.

To ensure computer drivers behave better than human drivers, re-legalize walking in the street. Also, ban human drivers.

8
jlgaddis 7 hours ago 1 reply      
I put thousands of miles a year on a big, loud, Harley-Davidson motorcycle.

I've been in two wrecks. The last one left me with several broken bones and the inability to walk for a few months, although I did gain the amazing ability to set off metal detectors everywhere!

Human drivers are my worst fear.

9
overcast 8 hours ago 0 replies      
In the current days, human drivers are the fear.
10
beautifulfreak 5 hours ago 0 replies      
He really should have credited the Reddit photo colorizer whose image he uses. He credits Getty for the others. https://www.reddit.com/r/pics/comments/1ex763/auto_wreck_in_...
11
bottled_poe 6 hours ago 0 replies      
Not everyone drives equally safely. For many drivers, switching from manual driving to a self-driving car will increase their probability of a collision. Maybe in the future this will not be the case, but until self-driving cars are proven safer than the most cautious drivers, some people will increase their risk by switching.
12
cooper12 8 hours ago 1 reply      
Time and time again history has shown that you can't just retrofit old laws to stuff that's "similar enough". Because the new technology is never the exact same. Computers are not telegraphs. Email is not postal mail. Cars are not horse-drawn carriages... Law reuse is harmful because it loses nuanced differences in how these technologies operate and are used. I really think a lot of problems are linked to that, such as with drug laws. I would say we need a reform but that would require changing the DNA of the judicial systems, and like this article notes, ingrained prejudices.
13
jackarbitrage 8 hours ago 0 replies      
I started getting scared of autonomous cars when that guy accidentally got sent to the libertarian island on Silicon Valley
14
supercoder 8 hours ago 0 replies      
How can a computer ever be as intelligent as my alcohol soaked brain
14
Brain-Like AI and Machine Learning naiss.io
51 points by edfernandez  8 hours ago   8 comments top 4
1
mholt 7 hours ago 2 replies      
Is there a paper on this? Did I miss the link?
2
AstralStorm 2 hours ago 1 reply      
Why would you call mere machine learning "AI"? Marketing? Self-aggrandizement?

AI is getting machines to solve problems they haven't been explicitly programmed to solve. As it stands, we do not have AI. We have some bits and pieces of it. The best ML algorithms so far only solve problems they have been explicitly trained and tweaked to solve.

Online learning has been attempted before, with very limited success. Making an online learning network stable is an open problem. These tend to quickly overfit the problem and get stuck.

3
username6000 1 hour ago 0 replies      
This is where it starts to get interesting. https://www.youtube.com/watch?v=9gTJorBeLi8
4
teabee89 4 hours ago 1 reply      
Sounds like Numenta's HTM algorithm. What are the differences?
15
Freeing my tablet: Android hacking, software and hardware thanassis.space
152 points by jscholes  13 hours ago   39 comments top 12
1
voltagex_ 12 hours ago 1 reply      
Impressive work on the rooting procedure, but...

>For the TL;DR crowd: I wanted to run a Debian chroot in my tablet;

I'm sad that there aren't more people looking to run a mainline kernel + Linux distro on their tablets. It's definitely possible but a lot more difficult.

2
sedachv 4 hours ago 1 reply      
It has been over a decade since Torvalds rejected GPLv3 licensing for Linux (https://lkml.org/lkml/2006/1/25/273) and it turns out that Stallman was right.
3
TheAceOfHearts 1 hour ago 0 replies      
> I started looking at the various offerings, and being a nerd of a frugal nature, decided to only look at the best HW bang for the buck, completely ignoring the SW aspects.

I think this is a very important point that people tend to overlook, and one that I've definitely started to appreciate in recent years.

For a similar example, sometimes people talk about the cost of ownership of a PC. Whenever you bring that up, someone will mention how it's really cheap to build your own PC. And although they're technically correct that it's really cheap to build your own PC, it usually means you're left on your own to maintain it as well.

I can't shake the feeling that this case is similar. Although I certainly enjoy tinkering with stuff, I think there's a strong argument to be made for things that "just work", or things that can be easily hacked. I have a hard time imagining that the real cost of ownership of this device for the owner ended up being higher than if they had purchased a more open device.

With all of that being said, I found this to be a very interesting read, and I'm thankful that the author took the time to write up their experience.

4
shmerl 5 hours ago 2 replies      
> Golly gee, Mr Google, that's a lot of partitions

More like Qualcomm. They like this mess of partitions, not Google.

> I am NOT a bad guy!... I just want to remain in full control of my OWN hardware...

Yeah, they don't want to respect that. And if you care about that, your choices of hardware become pretty limited. Let alone if you want to have open drivers for key components like GPU and the rest. In practice, Google's Nexus devices tend to be one of the best choices (i.e. such as Nexus 5 and Nexus 7). Not sure what the situation with Pixels is.

5
imtringued 2 hours ago 1 reply      
Is there some kind of authentication for the serial port in the headphone jack? It sounds like a security risk to me.
6
rocky1138 11 hours ago 3 replies      
Is there a phone or tablet you can buy which comes fully unlocked and rooted by default?

Something I can install KDE Neon to?

7
lucaspiller 4 hours ago 0 replies      
Has anyone tried GNURoot? As I understand it, it's a Linux chroot that doesn't need your device to be rooted:

https://play.google.com/store/apps/details?id=com.gnuroot.de...

8
philtar 5 hours ago 1 reply      
This is why I still come to HN
9
cheiVia0 6 hours ago 0 replies      
Some more resources for Debian on mobile devices:

https://wiki.debian.org/ChrootOnAndroid
https://wiki.debian.org/Mobile

10
dmitrygr 7 hours ago 1 reply      
Most of the author's difficulties simply stemmed from not understanding how modern Android works and boots. Easily fixable by reading more about it.

And no, nobody hates guys like you. The problem is that what you want to do is very similar to what malware might want to do, and since malware is more common than guys like you, choices are made that way.

11
bobajeff 5 hours ago 1 reply      
What would be the feasibility of making a Snapdragon based phone and running mainline Linux + freedreno on it?
12
dagiuth 11 hours ago 0 replies      
Mobile-wise, when they started doing that I quit; there was no point in using Android. There is always security testing, but it is not as modular as it was. There is security, and then there is economic security.
16
Principles for smooth web animations gyrosco.pe
158 points by julianshapiro  13 hours ago   25 comments top 11
1
chrismorgan 9 hours ago 3 replies      
The most important principle for current technologies is #1, "Don't change any properties besides opacity or transform!"

Hopefully everyone will start copying the approach Servo uses in its WebRender engine which is basically treating the browser rendering as though it were a game, using the GPU for rendering everything; therefore guidelines like this become obsolete.

This applies to a slightly lesser extent to #2, "Hide content in plain sight". #3 is also obsolete in Servo, as is #7, mostly.

2
TeeWEE 36 minutes ago 0 replies      
Wow, I just went and logged in to https://gyrosco.pe/... Too many animations. It's making me wait for content to show up; lots of animations everywhere cause a lot of cognitive load. And an option to delete your account is not there.

I think animations are very much overused on that site. The article is fine. But don't do too many animations! Keep cognitive load small!

3
amelius 12 hours ago 0 replies      
It is rather disappointing that with the current state of technology we need to go through such hoops to get smooth animations.

Also, there is no way to keep animations smooth as webpages become more complicated and perform (perhaps unrelated) updates while animations are running.

4
dyeje 12 hours ago 1 reply      
Google's Material motion guidelines are also great material for people looking to use web animations.

https://material.google.com/motion/material-motion.html#mate...

5
oneeyedpigeon 10 hours ago 1 reply      
OK, so now I know why tweening's called that, and I feel like the biggest idiot in the world :-/
6
pier25 4 hours ago 0 replies      
Off topic... but what are you using for JS animations these days?

I've been using GSAP for almost a decade since the Flash days.

7
ec109685 6 hours ago 0 replies      
The article was awesome. I am not sure, however, that the heavy animation helps gyrosco.pe's user experience.
8
aorth 4 hours ago 0 replies      
Speaking of smooth animations, the blog post is hosted on Medium, and I've always admired Medium's image zoom thing because it's so smooth!
9
Kenji 11 hours ago 1 reply      
#11: Your user's browser has the final say in all matters of timing. Whatever you do, ultimately, if the browser decides that something has to be computed on your main JS thread while you are trying to hit frames and timings, you're screwed. Source: Making multiple HTML5 games with the intention of having them render at 60 fps and not missing a single frame.
10
idlewords 4 hours ago 1 reply      
The smoothest animation is the one you don't do.

In the best case, web animations just cause bloat and burn through your battery. In the (common) worst case, they stutter or break, or cause things on the page to hang while they load.

Just don't do it. Your website doesn't need to look like a mobile app.

11
simooooo 12 hours ago 0 replies      
Very very nice article
17
Unsplash Beautiful photos free to use under the Unsplash License unsplash.com
387 points by tambourine_man  12 hours ago   71 comments top 21
1
cyberferret 11 hours ago 3 replies      
Unsplash is a really cool resource. We actually use it (paired with another 'sister service' called Unsplash.it) to provide ever-changing and semi-interesting 404 error pages for our web app... I blogged about how we do it a while back - http://devan.blaze.com.au/blog/2015/11/3/errors-dont-have-to...
2
Cbeck527 9 hours ago 2 replies      
I was recently featured[1] in collection #127, and as a long time user it feels really awesome to give back and let others use my work.

I've also been to a few of their NYC meetups and it's clear that the site is backed by an amazing community.

1 - https://unsplash.com/collections/curated/127?photo=jYYpTndzo...

3
kardos 11 hours ago 2 replies      
How does the Unsplash license [1] differ from the Creative Commons Zero license?

[1] https://unsplash.com/license

4
df3 12 hours ago 1 reply      
Unsplash is a great resource.

It's important to point out that "free" and "royalty-free" aren't the same thing. Unsplash images are actually in the public domain, whereas "royalty-free" is a license type where an image can be used multiple times for one payment.

5
fpgaminer 7 hours ago 0 replies      
Here's a quick script I put together which downloads a random image every hour and sets your wallpaper. Only works with Gnome 3/Unity/Cinnamon. Adjust line 5 for different resolutions (currently set for 1920x1080) and adjust line 7 for different update frequency:

https://gist.github.com/fpgaminer/bdd493ce84eafb7886e08d20c2...
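For anyone who would rather not read the gist, the idea fits in a few lines. Here is a hedged Python sketch of the same approach (the unsplash.it random-image endpoint and the GNOME picture-uri key are my assumptions; adjust for your own setup):

```python
# Sketch: fetch a random image and point the desktop background at it.
# Schedule fetch_and_set() hourly from cron rather than looping.
import subprocess
import urllib.request
from pathlib import Path

WALLPAPER_URL = "https://unsplash.it/1920/1080/?random"  # assumed endpoint

def gsettings_cmd(path):
    # GNOME 3/Unity read the wallpaper from this dconf key
    # (Cinnamon uses a similar org.cinnamon schema).
    return ["gsettings", "set", "org.gnome.desktop.background",
            "picture-uri", path.as_uri()]

def fetch_and_set(dest=Path.home() / ".wallpaper.jpg"):
    with urllib.request.urlopen(WALLPAPER_URL) as resp:
        dest.write_bytes(resp.read())
    subprocess.run(gsettings_cmd(dest), check=True)

# crontab entry (hypothetical path):
#   0 * * * *  python3 /home/me/wallpaper.py
```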

6
bdcravens 11 hours ago 0 replies      
Apple products on a distressed wooden table, laid out perfectly yet supposedly naturally positioned, with an open paper notebook: check
7
josephg 9 hours ago 0 replies      
6 months ago I wrote a couple of little scripts to download new Unsplash images into a directory every 6 hours, then pointed macOS to use random images from that directory as wallpapers. The whole thing is great - it's a source of tiny delight throughout my week. It's also a small step toward making my workspace feel more hackable.

The whole thing was super hacked together - I'm sure there's nicer solutions around but I'm plenty happy with what I have. Details here if anyone wants to copy what I did: https://josephg.com/blog/shiny-background-images/

8
Ahmed90 11 hours ago 1 reply      
Always great quality. For fellow web devs out there: give http://unsplash.it a try for easy, fast and beautiful placeholder images during development.
9
ars 10 hours ago 0 replies      
From the name I thought these photos were free to use as long as you agree never to have a splash popup on your site :)

Maybe someone could actually do that.........

10
hiimnate 11 hours ago 0 replies      
I love Unsplash. They also have a Chrome extension to show a random image in your new tab page. Would recommend.

https://chrome.google.com/webstore/detail/unsplash-instant/p...

11
sirodoht 10 hours ago 0 replies      
Fun fact: when this was first posted on HN there were mainly negative comments about yet another website in a market with many players.

Lately it has become a favorite site for many people. So, just another incarnation of the Google story. <3

12
Hondor 4 hours ago 0 replies      
Wonderful to see more CC0 use compared to a lot of "free" art on the internet that burdens the users with keeping track of attribution requirements and including the license text and all that tediousness.
13
yatsyk 3 hours ago 0 replies      
Great resource.

Can somebody recommend a similar resource with unprocessed images? Most photos here are toned or converted to black-and-white.

14
seanwilson 10 hours ago 0 replies      
Great resource. I'm curious how this impacts photographers though given there's so many sources of free images now. Can you make a living creating and selling stock photos?
15
Esau 10 hours ago 1 reply      
Great site but they must use a heck of a lot of bandwidth. How do they stay afloat?
16
philfrasty 11 hours ago 2 replies      
How do they make sure the submitter of the image is actually the rightsholder/owner?
17
tunnuz 8 hours ago 0 replies      
I shared some of my best photos on Unsplash, and I plan to use it in the future. It is a great resource.
18
rokhayakebe 10 hours ago 1 reply      
To think these guys started with 10 photos.
19
ChrisNorstrom 11 hours ago 0 replies      
There goes my evening. Seriously, Thank You for this. This is the best free photo collection I've seen so far and I scout a lot of photo collection sites.

1) I LOVE how you group photos by subject/topic instead of just randomly posting photos and asking the user to search for what they want. Most of the time users don't know what they want and would rather just browse and look around. Browsing lists and collections is more entertaining, engaging, and useful than what other photo sites do: drop off the user in front of a search box and ask "what do you want?". That's like asking someone "tell me everything about you". It forces the user to engage in some serious mental gymnastics and fatigues them. Collections like yours are easier on the brain. Just pick a pretty picture and browse all the pretty photos in that collection. Love it.

2) The photography is beautiful and looks authentic, rare, and avoids that "generic stock photo" feel. These photos look like they're out of somebody's "rare find" folder. They are gorgeous and ready to be used with minimal photoshopping.

3) Most of these already have color correcting and filters applied. Did your site do this? Or did the photographers?

Unique. Useful. Going in my bookmarks. Thanks for this.

20
tschellenbach 10 hours ago 0 replies      
Big fan of Unsplash, great resource!
21
raz32dust 7 hours ago 1 reply      
It is obviously great for users. But I am kinda sad for the artist photographers. I don't think photography should be done for money, and I doubt any photographer would make real money off landscape and generic photos. But at least having a chance of making money via sites like 500px is a good thing in my opinion. Some additional incentive for them to keep trying.

Talented individuals who are well off giving away their work for free makes life harder for other talented individuals who might not actually be well off and might have just this one talent. It looks like service is the only thing that will be monetizable in the future. Actual products will all be available free of cost. I think it will drive down the quality of the best products while driving the average quality up.

18
Apache Spark: A Unified Engine for Big Data Processing acm.org
11 points by akashtndn  3 hours ago   1 comment top
1
mpweiher 43 minutes ago 0 replies      
Funky: 10 to 20 local disks, for approximately 1GB/s to 2GB/s of disk bandwidth

The new MacBook Pros apparently have north of 2GB/s of disk bandwidth for their internal SSD.

https://9to5mac.com/2016/11/01/the-late-2016-entry-level-13-...
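The paper's range is what you'd expect from commodity spinning disks; a quick back-of-envelope check (the per-disk throughput below is an assumed typical figure, not from the paper):

```python
# ~100 MB/s of sequential throughput per HDD was typical for the era.
per_disk_mb_s = 100  # assumed per-disk figure
for disks in (10, 20):
    gb_s = disks * per_disk_mb_s / 1000
    print(disks, "disks ->", gb_s, "GB/s")  # 1.0 and 2.0 GB/s
```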

19
How Economic Gobbledygook Divides Us nytimes.com
5 points by kawera  2 hours ago   1 comment top
1
crdoconnor 2 minutes ago 0 replies      
Though he doesn't address it in the article, this story illustrates one reason why financial language is intentionally obfuscated:

"The traders negotiated a new trade with OCM. It reached a new plane of creativity. Under the transaction, the offending transaction was canceled at no cost to OCM. In its place was a new swap. The new transaction was for $600 million. Under the swap, for the next three years, OCM would pay a fixed dollar amount. The amount was $4 million a month. In return the dealer would pay OCM an amount calculated according to a complicated formula:

Maximum of [0; NP x {7 x [(LIBOR^2 x 1/LIBOR) - (LIBOR^4 x LIBOR^-3)]} x days in the month / 360]

Where

NP = $600 million

LIBOR = 6 month Dollar LIBOR rates

The financial engineering was dazzling. There was just one problem. The complex equation, if you did the algebra, always equaled zero. The dealer would never pay OCM anything. OCM would be paying the dealer $4 million each month for three years. This was the intended effect."

http://www.minyanville.com/businessmarkets/articles/traders-...
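The algebra is quick to verify mechanically. A small sketch evaluating the dealer's leg at a few sample LIBOR rates, writing the exponents explicitly (LIBOR2 is LIBOR squared, and so on):

```python
# The dealer's payment under the new swap. The bracketed term is
# 7 * ((L^2 / L) - (L^4 * L^-3)) = 7 * (L - L) = 0 for any rate.
def dealer_payment(libor, notional=600e6, days_in_month=30):
    inner = 7 * ((libor**2 * (1 / libor)) - (libor**4 * libor**-3))
    return max(0.0, notional * inner * days_in_month / 360)

for libor in (0.01, 0.05, 0.12):
    print(dealer_payment(libor))  # zero (up to float rounding) every time
```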

20
Arctic farming: Town defies icy conditions with hydroponics ap.org
32 points by Mz  8 hours ago   11 comments top 3
1
donquichotte 3 hours ago 1 reply      
The energy problem is mentioned almost in passing, but it could be a deal breaker for this type of farming. Ventilation, heating and delivering light for photosynthesis require vast amounts of electrical energy, which, according to the article, comes mainly from diesel.

Averaged over a day, the sun provides > 12 kWh per m^2 in farmable regions (estimated). It takes 70 days to grow cabbage, so for 1 m^2 of cabbage that would amount to $168 at 20 cents/kWh.
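The $168 figure is straightforward to reproduce from the numbers in the comment:

```python
# Cost of replacing ~12 kWh/m^2/day of sunlight with grid electricity
# at 20 cents/kWh over a 70-day cabbage crop, per square metre.
insolation = 12       # kWh per m^2 per day (the comment's estimate)
growing_days = 70
price_per_kwh = 0.20  # dollars
cost = insolation * growing_days * price_per_kwh
print(round(cost, 2))  # 168.0 dollars per m^2
```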

2
malanj 4 hours ago 0 replies      
If you're also interested in seeing photos of the setup (and the containers), I found some here: http://www.chicagotribune.com/business/ct-arctic-farming-hyd...
3
aaron695 5 hours ago 3 replies      
> But operators are trying to work out kinks and find ways to lower energy costs, possibly through such alternatives as wind power.

This is such a wtf.

If wind power was lower in cost, why wouldn't everyone use it over diesel?

It's always a huge warning sign to me.

A project that looks exciting and new, also powered by renewables.

21
Beware, iPhone Users: Fake Retail Apps Are Surging Before Holidays nytimes.com
46 points by gnicholas  9 hours ago   21 comments top 7
1
mysterypie 59 minutes ago 3 replies      
What possible utility do people get by using a "retail app" rather than the web? Wouldn't the store's website and/or normal shopping websites (Amazon, etc.) have everything that the Foot Locker app has?

Have people been trained or deluded to "always get the app" when it's not necessary?

2
Cagrosso 8 hours ago 2 replies      
Funny, Apple made a fuss earlier this year that they were cracking down on "My First App" type applications, but they let shoddy, fraudulent, poorly translated applications through no problem...
3
gnicholas 8 hours ago 1 reply      
Having dealt with the iOS approval process several times, it is interesting to see that the system has this weak spot. The approval process is known for its stringency (especially compared to the Play Store), but clearly there are some vulnerabilities lurking.
4
threeseed 8 hours ago 0 replies      
Pretty extraordinary that Apple is letting these through.

And some are blatantly flouting the rules e.g. putting Nike.com as their support website.

5
coldcode 7 hours ago 0 replies      
For example, do a search for 1010! - generally you will find dozens of identical apps (maybe with a slightly different icon), all with a person's name as the company. I think they are all the same codebase, run by some kind of white-label company. I have no idea how anyone makes money off so many dupes.
6
0x0 2 hours ago 1 reply      
Link only shows "Log in - New York Times" and a login form, flagged.
7
Waterluvian 7 hours ago 0 replies      
Wouldn't this be incredibly easy to discover? If number of published apps divided by account age is greater than threshold, raise review red flag when user attempts to publish new app.
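The heuristic is trivial to express; a sketch (the threshold is an arbitrary guess, and a real review system would combine several signals):

```python
# Flag publishers whose apps-per-day-of-account-age rate is implausible.
from datetime import date

def should_flag(published_apps, account_created, today,
                max_apps_per_day=0.5):
    age_days = max((today - account_created).days, 1)
    return published_apps / age_days > max_apps_per_day

print(should_flag(40, date(2016, 10, 1), date(2016, 11, 7)))  # True
print(should_flag(3, date(2014, 1, 1), date(2016, 11, 7)))    # False
```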
22
Design Principles for Reducing Cognitive Load marvelapp.com
114 points by prostoalex  13 hours ago   13 comments top 6
1
jonaf 13 hours ago 1 reply      
This article is interesting, but in case you're like me and read comments before articles to get a tl;dr, this article is about web design. I was personally hopeful it was about managing teams or projects, which does have analogous lessons, but it's specifically about web design.

Disclaimer: no longer a web designer; take my opinion on the substance of the article lightly.

2
bluetwo 12 hours ago 0 replies      
Cause 4. Too much information.

Cause 5. Multichannel redundancy (i.e., reading something to someone as they try to read similar text)

3
tony-allan 6 hours ago 0 replies      
Almost all of the points in the article also apply to programming languages!
4
partycoder 3 hours ago 0 replies      
Cause #0: implicit information.

Implicit information may force people to guess what is being said. Turns a simple thing into a riddle.

Cause #-1: lack of consistency

When something is designed in a different way, people try to infer why it's different. If they turn out to be the same thing, it pisses people off.

5
jarmitage 11 hours ago 1 reply      
I saw a talk on Friday about designing interfaces for people with locked in syndrome. It turned out that to be effective they needed to design for increasing cognitive load rather than reducing it.

As everyone knows, how and when to employ a principle is the key!

6
Beowolve 12 hours ago 3 replies      
Says a website with hover share buttons and distracting color choices. I had trouble focusing on mobile because this site had so much going on.
23
Gos alias proposal and all my concerns of Google controlling Go hackernoon.com
102 points by activatedgeek  4 hours ago   25 comments top 6
1
Arcaire 4 hours ago 0 replies      
This was linked previously, and the single comment therein linked to a discussion on reddit[0] about the issue.

Of note, this change has been reverted now[1].

[0] https://www.reddit.com/r/golang/comments/5alxa3/gos_alias_pr...

[1] https://github.com/golang/go/issues/16339#issuecomment-25852...

2
tree_of_item 3 hours ago 3 replies      
I don't really get it. What's the big deal? I see a lot of dramatic language about the "strenuous objections of many external contributors", but what exactly is so bad about this proposal?

There was even a comparison to `goto`...? What am I missing?

3
secure 3 hours ago 2 replies      
https://www.reddit.com/r/golang/comments/5alxa3/gos_alias_pr... points out that this isn't actually a problem specific to Google, but rather affects the FOSS community as a whole, and I think the recent move of golang.org/x/net/context into the stdlib (as context/) is a perfect example.

The author of the article doesn't seem to mention or consider this point at all.

4
cromwellian 3 hours ago 0 replies      
To me this alias proposal is pretty much the way it's been done in Java forever, so whenever you call for "foo.bar.Baz", the classloader is free to give you back a different implementation, including vendor/foo/bar/Baz, or even a mock implementation.

I think in any large scale dependency chain, you're eventually going to have two transitive dependencies D1 and D2 that both use dependency E, but different versions.

Inside Google we have the OneVersionPolicy, because we control everything in our repo, but in the external ecosystem, this is a frequent occurrence, that has to be solved by either classloader isolation, or upgrades.
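The refactoring property the proposal was after (one definition reachable under two names, with identity preserved) has a rough analogue in Python's name binding; the Go proposal's spelling was along the lines of `type T1 => pkg.T2`. A hypothetical sketch:

```python
# "New package": the canonical definition after a move.
class Buffer:
    def __init__(self, data):
        self.data = data

# "Old package": a pure alias, so identity and isinstance checks hold
# for callers importing under either name -- no wrapper type involved.
LegacyBuffer = Buffer

buf = LegacyBuffer(b"hi")
print(LegacyBuffer is Buffer)   # True: the very same type object
print(isinstance(buf, Buffer))  # True
```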

5
rusht 3 hours ago 1 reply      
Alias declarations were reverted from 1.8 recently[0]

[0] https://github.com/golang/go/commit/87f4e36ce7d7dffbf1f2a869...

6
chmike 3 hours ago 4 replies      
To me the root problem for this alias proposal is that go currently ignores version dependencies.

Go is designed so that we have to assume that an API and the compiler are either perfectly backward compatible or immutable. This is a simplification inherited from C and other programming languages.

If we want to support changing an API, then we have to specify version along the package name and import statement. Some packages already do that but it isn't satisfying because it is hardcoded in the package name.

I like the versioning rule chosen by Ice from ZeroC. A version is not backward compatible with other versions. A release must be backward compatible with previous releases of the same version.

So it should be possible to specify a version and release for packages in import statements, meaning that the specified release and all subsequent releases are valid.

Dub, for the D programming language, is going in that direction.

I'm aware that it may create a dependency hell. But it already exists and is hidden. The alias hack is not a good solution. It will make things worse by making the rules more lax.

24
Show HN: NetIn - Shorten job interviews by answering the questions only once netin.co
28 points by soheil  6 hours ago   24 comments top 13
1
welder 39 minutes ago 0 replies      
> Our platform uses machine learning and predictive analytics to find you a match.

I call bullshit. You're just dumping candidates in your database and allowing recruiters to search it.

2
0xmohit 2 hours ago 0 replies      
Quoting from your Terms of Use:

 Unless otherwise noted, all materials including without limitation, logos, brand names, images, designs, photographs, video clips and written and other materials that appear as part of our Website are copyrights, trademarks, service marks, trade dress and/or other intellectual property whether registered or unregistered ("Intellectual Property") owned, controlled or licensed by NetIn.
I don't see you making any disclaimer about the use of terms like "Amazon AWS", "LinkedIn" on your website. Does it imply that these are owned, controlled or licensed by NetIn?

3
0xmohit 2 hours ago 0 replies      
I wish a bit more effort had been put into filling in some dummy data in those screenshots [1], [2].

[1] https://netin.co/images/search_page.jpg

[2] https://netin.co/images/profile_page.jpg

4
jakobegger 2 hours ago 0 replies      
On the main website there's a section titled "Trusted by top companies in Silicon Valley". Are Tesla, VMWare, etc. actually paying customers, or did you make that up?
5
jstanley 1 hour ago 1 reply      
> Our platform uses machine learning and predictive analytics to find you a match.

Is this true, or are you planning to just do it manually until you've made enough money to justify building that?

It's fine to do the latter, but it would be nice to be a bit more transparent about it.

6
qiwkwmsns 3 hours ago 1 reply      
Has anyone read the FAQs? Littered with misspellings and common grammatical mistakes. Screams in[secruity].
7
aratno 5 hours ago 1 reply      
This needs to add screenshots before I'm convinced.
8
TheArcane 2 hours ago 0 replies      
I didn't receive an activation email. What do I do?
9
maruhan2 4 hours ago 1 reply      
It'd be nice if info about the questions were added to the FAQ. Currently I have no idea how they're asked.
10
vivekchand19 3 hours ago 1 reply      
Signed up, not yet received an activation email!
11
Mandatum 3 hours ago 1 reply      
Signed up, no email received, can't login
12
soheil 5 hours ago 0 replies      
Would be open to suggestions on what questions are worth asking.
13
DTrejo 5 hours ago 1 reply      
Hi Soheil, where can I find your pricing? Cheers!
25
Ask HN: Why hasn't Perl 6 taken off yet?
71 points by totalperspectiv  6 hours ago   82 comments top 27
1
saberworks 5 hours ago 1 reply      
I love perl 5 and have been using perl since 1997. I haven't touched perl 6 yet simply because the whole mess is still too confusing. It's not obvious what I should download and the frequent release announcements just confuse the issue further. If I want perl6 why am I downloading rakudo star? What the heck is a moar vm and why do I care? Why is the download page telling me it supports "Christmas Perl 6 (6.c language version)?" What is 6.c? Are there multiple versions of perl 6 that I have to worry about? Why does every release announcement say something like this:

"Please note: This announcement is not for the Rakudo Star distribution[^2] --- it's announcing a new release of the compiler only. For the latest Rakudo Star release, see <http://rakudo.org/downloads/star/>."

Perl 5 is much more straightforward. Their download link sends me to a page where I click my platform and then there's a link to "download latest stable version." Why can't I just download the latest stable version of perl6?

I know I can spend a couple of hours learning more about it but perl 5 more than meets my needs and I'm comfortable with it.

2
tikhonj 5 hours ago 1 reply      
The thing to remember with these questions is that popularity is largely a matter of social dynamics, not the language's intrinsic qualities. I mean, sure, qualities matter to a point, but that point is pretty low: it just has to work well enough. Past that, it's mostly a matter of spreading through social networks, perhaps helped on by marketing.

So when you look at a language and ask why it isn't popular, the answer is probably not that it's bad or that it has terrible features or that it's missing things every other language has.

Instead, the relevant answer is some combination of timing, marketing and luck, and Perl 6 definitely flubbed the first two!

It's been, what, two decades since Perl 5 first came out? In that time, trends have changed, people's preferences have shifted (and ossified) and even the role of a scripting language is different. And Perl 6 is not making any of that up on marketing, nifty logo notwithstanding: the Perl brand has been pretty well tarnished over the past years, which makes these the absolutely wrong coattails to try sliding in on for what is supposed to be a brand new language.

None of this, by the way, shows that Perl 6 is a bad language. That's a different discussion entirely, a discussion in which popularity plays a small role at best.

But it is to say that I'm not all that surprised Perl 6 hasn't gotten anywhere.

3
markrushing 3 hours ago 1 reply      
I just started playing with Perl6 a couple weeks ago, and I am completely loving it now. It's the strangest process to get used to. I haven't had this much fun programming in a long time. Some frustration every once in a while until I can wrap my head around some new concept, but that's to me what this language is all about. SO much conceptual stuff in it. Very, very rich.

I've been converting some old stuff to it to learn it, and it's crazy how much more compact things can become. Working with grammars has been amazing. And I'm just now getting hit over the head with how flexible class roles with parameters can be to consolidate methods that do similar, but slightly different things.

I haven't even started into the concurrency bits yet... something about "promises" and "supplies". But I'm actually looking forward to it at this point. And that's really a surprising thing to me ;)

Anyway, my 2 cents on it at least. I'm not sure it matters if Perl is ever a poster child for anything. I think it kinda just doesn't matter.

4
yolesaber 6 hours ago 0 replies      
Because it waited too long and now everyone who needs scripts just uses Python/Ruby/Lua instead. I highly doubt Perl 6 will take off in any meaningful sense. Not to disparage the language, it's just the social dynamics of the software world and tastes have shifted. Perl is, rightfully or wrongfully, considered a legacy language of the 90s and it would take a serious re-branding and evangelization effort to change that perception. That being said, I know some folks who pull in pretty good bank maintaining Perl apps but they are definitely not writing Perl 6 on the day to day

The logo is cute, I will say.

5
cwyers 4 hours ago 1 reply      
The short answer boils down to, "It might be dangerous, you go first." Moving to a brand new language (which is what Perl 6 is, despite its long incubation and its vestigial name) has real costs - the libraries you have come to depend on aren't there, new libraries to replace them aren't all written yet, some might never be, the ones that exist aren't as feature-complete and battle-tested. For a new language to get adoption, someone needs to go in there and start doing those things, and deliver a value proposition. If a language wants to get followers, at least in the beginning it's better to have one "does it better than anyone else" area than to be a jack of all trades, too.
6
CodeWriter23 5 minutes ago 0 replies      
I have tens of thousands of lines of code written to Perl 5 that I can't run under Perl 6. That's it in a nutshell.
7
ericdykstra 6 hours ago 1 reply      
I don't know if there are any wide polls available about this, but from my experience:

- Most perl developers moved on long ago. My last 3 bosses were all "perl" people who switched to Ruby as their primary language around 2010.

- No strong argument that perl 6 is "best-in-class" for any particular type of problem. This is often what brings interest in new languages (Go, Elixir, and Rust for example, all have this).

8
MichaelMoser123 1 hour ago 0 replies      
> Why hasn't Perl 6 taken off yet?

I think the story is similar to Visual Basic 6 and VB.NET: the change between Perl 5 and Perl 6 is really big. Perl 6 is a new language, not just a change of version, so people who are comfortable with Perl 5 have little incentive to learn it all again (or they have switched to Python).

9
wwweston 5 hours ago 1 reply      
Speculation:

1) The field is pretty crowded. There's a lot of languages competing for attention. It takes time to learn. Which ones should I spend time on?

2) Momentum. Scripty siblings Python & Ruby & JS & even PHP have active communities with a lot going for them.

3) Baggage. Some of the conventional wisdom about Perl 5 is pretty iffy, but it's well established conventional wisdom, dammit! In fact, you'll almost certainly see it recapitulated in this thread. Most arguments to the contrary seem to be slow to make a real dent in what Everybody Knows about Perl. The fact that Perl 6 is a different language will probably be equally slow. People will repeat the comforting mantra that Ruby is the new Perl. Order will be reinforced.

4) Blub-ish paradoxes. Perl 6 is doing some weird and different stuff. Is it hyper-useful weird and different stuff? Will I know until I learn to use it?

5) Not a big win in terms of market value yet.

10
int_19h 1 hour ago 0 replies      
The reason why developers don't migrate from other languages is plainly that Perl 6 doesn't offer enough unique things that people actually care about.

The reason why existing Perl 5 developers don't migrate is because they don't need and/or want to. The benefit would mostly be derived from writing new code, but most Perl 5 code in the wild is legacy systems and complex admin shell scripts. Those are exactly the kind of thing that you don't want to rewrite in the latest-and-greatest; the risk of introducing subtle bugs by upgrading outweighs the benefit of being able to use new constructs in those parts that you need to improve. And the vast majority of new code is also admin shell scripts written by experienced greybeards, who really don't like their cheese moved, and so will stick to what they're used to for as long as they can.

11
dmerrick 5 hours ago 2 replies      
Perl 6 was too little, too late. It was hyped for a long time and delayed for a long time. During that time, everybody who was impatient and not forced to use Perl ended up moving to the new hotness languages like Python and Ruby.

Ruby is probably the main reason Perl lost so much market share, since it took a lot of the underlying philosophies from Perl and turned it into a beautiful, easy-to-read language.

12
pfarnsworth 6 hours ago 4 replies      
This is my own personal opinion, but I despise Perl. The language requires a ton of memorization of small, tedious rules that are arbitrary, and there are many disparate ways of writing the same thing. I remember giving up on Perl when I discovered that || and or were both "or" operators but had different precedences.

Again, this is my own personal opinion. I know many people have made some great software based on Perl, but it's the only language I won't touch with a 10 foot pole, and I've spent a couple of years programming in PHP and a year in COBOL.

13
hoodoof 6 hours ago 3 replies      
For me the essence of it is this:

Perl says "there's more than one way to do it"

Python, by contrast, says "There should be one-- and preferably only one --obvious way to do it."

The practical outcome of the Perl philosophy is that Perl code can be extremely varied to get the same thing done and therefore much harder for different programmers to understand and maintain. Python programmers are more likely to quickly understand the intent of a chunk of code regardless who the author is.

Perl at its worst can also be pretty arcane and I've heard it described as "executable line noise". That doesn't make for maintainability.

14
anaolykarpov 5 hours ago 0 replies      
I'm a (very happy) Perl (5) dev and although I've attended lots of cool Perl 6 presentations, I haven't taken the time to learn it yet. My reasons are related to the fact that there are not that many commercial opportunities with it just yet. That is a thing which I expect to change in a relatively short time. I've already seen a few job posts looking for Perl 6 developers, which, given that the language has been declared 'production ready' for less than a year, is pretty amazing.

Also, there aren't that many libraries in its ecosystem yet, a thing which can be a plus for devs who want to make a name for themselves in the open source world by implementing/translating libraries with a large potential user base.

15
reidrac 1 hour ago 0 replies      
I used to love Perl 5 in my free time for personal projects around 2009-2010, frameworks like Dancer or Mojolicious were getting traction and it was a lot of fun; but when I wanted to change my career and move away from what I was doing at the time (PHP mostly), I finally decided to go with Python and that meant no time for Perl (and no Perl 6).

Basically at that point there were fewer Perl jobs, and the language had (has?) a bad reputation for making it too easy to write hard-to-maintain code, so I thought the language wasn't worth the peer pressure when Python and Ruby were nice languages too, with open and welcoming communities behind them.

Perl 6 seems to add to that bad reputation unfortunately, adding extra complexity.

This is just an anecdote, but reading the comments seems like other people had a similar experience.

16
oblib 3 hours ago 2 replies      
I think the Perl community had a lot to do with Perl falling off in popularity. The standard reply to questions posted to the "Beginners Perl" mailing list was "RTFM dumbass".

The "Learning Perl" book sucked hugely. CPAN is pretty cool but too many modules have poor documentation and almost no example code.

Web app frameworks got convoluted and didn't make things easier and tended to lock you into doing things their way.

Despite that, I still liked perl because it did let me do things my way. I waited and waited for Perl 6, and then quit caring.

This year I finally rewrote a perl/cgi web app in Javascript. What little server-side code I needed I wrote in Perl 5, but there was very little.

17
davidbanham 6 hours ago 0 replies      
Shortly after release I got interested, looked into it enough to discover that speed was a concern for the future and that the current implementation was super slow. I wandered away.
18
Ulti 1 hour ago 0 replies      
Because people keep asking why it hasn't taken off rather than using it and posting here about how awesome it is.
19
bootload 5 hours ago 2 replies      
"why isn't it (Perl6) the poster child of the scripting languages yet?"

In the Perl6 community there is no equivalent of "The Python Tutorial" [0] or the "Python X.Y.Z documentation" [1]. From Python 1.X onwards, if you wanted to learn Python from scratch, this is where you started. Where is the Perl6 version of this that assumes you start from scratch without having to learn the baggage of PerlN? [2]

[0] If I'm wrong, look at this: https://docs.python.org/3/tutorial/index.html and point me to the Perl6 equiv.

[1] https://docs.python.org/3/

[2] where N < 5.X

20
pjc50 4 hours ago 0 replies      
Perl 6 is the poster child for "second system effect".

The python community have just about managed to achieve a backwards-incompatible change, which was fairly minor and developed in a reasonable timeframe to address certain specific issues.

The Perl community were made extravagant promises 15 years ago. People started holding their breath for 6. By now, everyone has given up and the delay has asphyxiated the community. Not to mention that the Perl niche is much more crowded and still has a working, complete Perl5 in it.

21
philwelch 5 hours ago 2 replies      
It took 15 years for a "production ready" Perl 6 interpreter to be finished, at which point Python, Ruby, Lua, and even Node.js had more than enough traction to not leave much room for Perl 6 in the market.

Six years ago I got into a flamewar on Hacker News about Perl 6 not being done yet and had to clarify to an angry Perl hacker that by "not being done", I meant, "as of 2010 we only have an incomplete implementation of a draft specification of the language".

22
pknerd 5 hours ago 0 replies      
Actually, users of other languages got some awesome web frameworks: PHP got Laravel and Symfony, Ruby got Rails, and Python got Django and Flask. On the other hand, the Perl community kept waiting for v6 for a long time, and in due course users shifted to other languages for the reasons mentioned above.
23
Roboprog 5 hours ago 1 reply      
Ruby.

(mentioned elsewhere, but let's distill it down)

24
misccodework 6 hours ago 3 replies      
cuz the language is cryptic as fuck, using every symbol there is; code is like

 my a = fn { local $v= _[0] %~=->
possibly a time investment trade-off: focusing on another C-style lang is not as big a time investment, and translates better to C and C-lang derivatives

25
IslaDeEncanta 4 hours ago 1 reply      
Perl 5 is better than Perl 6, so why would I change?
26
AzzieElbab 5 hours ago 0 replies      
Because Perl is not supposed to make sense
27
jag2 5 hours ago 1 reply      
ask the parrot.
26
'We are all Thomas More's children': 500 years of Utopia theguardian.com
29 points by Hooke  11 hours ago   3 comments top 2
1
IslaDeEncanta 4 hours ago 1 reply      
Utopian socialism's failures are the reason Marxism came about as a structured, disciplined critique of class society. Idealism is no way to make people's lives better. Instead, you must understand the root causes of oppression in order to attempt to overcome it.
2
dvh 3 hours ago 0 replies      
I've tried to read Utopia but that book is written so dreadfully I had to drop it after 20 pages or so.
27
Intercooler.js Making AJAX as easy as anchor tags github.com
392 points by cx1000  17 hours ago   93 comments top 29
1
DSteinmann 14 hours ago 12 replies      
This has been reposted so many times by the author and by others that I can't help but finally ask.

What's the point? This would lead to your API being composed of blocks of HTML which are probably only usable for one product. Why not just use REST + JSON? It would take no more than five minutes to set up client-side rendering, and you could even make it attribute-based like this with barely any more effort. Is it really not worth spending the extra five minutes it takes to set things up in a way that is reusable and standard? All I see is piles of legacy code being generated where it hurts most - in the backend.

This took me 10 minutes to cook up. It would have taken about three if I hadn't forgotten the jQuery and Handlebars APIs. This allows you to POST to a JSON API using two attributes. Untested of course, but you get the idea:

  Example: <button ic-post-to="/api/resource" ic-template="#a-handlebars-template" />

  // jQuery handlers receive an event; the clicked element is `this`.
  $('[ic-post-to]').click(function () {
    let button = this;
    fetch($(button).attr('ic-post-to'), { method: 'post' })
      .then((response) => response.json())
      .then((result) => {
        let templateText = $($(button).attr('ic-template')).html();
        let template = Handlebars.compile(templateText);
        let resultHtml = template(result);
        $(button).replaceWith(resultHtml);
      });
  });

2
cx1000 16 hours ago 4 replies      
Honestly it feels like intercooler.js is building in functionality that should exist in HTML in the first place. For example, the unintuitive "href" attribute sends a GET request, and POST requests are only sent from forms and buttons. What about PUT, PATCH, OPTIONS, or DELETE? According to http://softwareengineering.stackexchange.com/a/211790, "At this point, it seems that the main reason why there is no support for these methods is simply that nobody has taken the time to write a comprehensive specification for it."

Intercooler.js makes them seem a little more "built in" to html, which I like.
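Concretely, intercooler expresses the missing verbs as element attributes. A rough sketch from memory of its attribute naming (the endpoint and target id here are made up, so treat the details as assumptions and check the docs):

```html
<!-- Hypothetical endpoint. ic-put-to / ic-delete-from issue the verbs
     plain HTML can't express, and ic-target names the element whose
     contents get replaced with the returned HTML fragment. -->
<div id="item-42">
  <button ic-put-to="/api/items/42" ic-target="#item-42">Save</button>
  <button ic-delete-from="/api/items/42" ic-target="#item-42">Delete</button>
</div>
```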

3
nzjrs 15 hours ago 1 reply      
I said this on the other discussion, but I'm compelled to post it again.

Intercooler.js is so logically designed it basically requires no documentation - I read the introduction and a handful of examples and thought, "shit of course it works this way" and could basically derive how every other feature mapped to its implementation in that moment.

Congratulations!

4
scwoodal 17 hours ago 0 replies      
I was a long time pjax/turbolinks user but always felt like I was pushing the boundaries of what these technologies were doing and always wished for more functionality.

I tried out several client side frameworks but always felt like it was way overkill for the apps I built.

I gave intercooler.js a try a few months ago and was extremely pleased. There's very little server side that's required and the extra functionality I had wanted from pjax was there.

If you're wanting the simplicity of server side rendering plus the feel of an SPA without the frontend complexity give this library a try.

5
Kequc 17 hours ago 4 replies      
I'm surprised this needs jQuery. What this seems to be is a simple script that fetches a resource and places it into an element. I really feel opposed to adding more dependencies where they aren't required. That could be written without jQuery or this library fairly easily.
6
xg15 14 hours ago 1 reply      
This seems to make things simpler at first glance, but I fear in the end you end up with the worst of both worlds: You have the API inflexibility and UX restrictions of a pure-HTML approach combined with the overhead and need for graceful degradation of a full-ajax approach.
7
carsongross 15 hours ago 5 replies      
Main intercooler.js author here, glad to answer any questions.

Happy to see people are enjoying it!

8
Touche 16 hours ago 1 reply      
It would be cool if this could somehow use DOM diffing (I assume it just uses innerHTML now), so you'd get minimal DOM updates with the advantages of doing everything server-side that this already provides. Throw in some smart service worker caching and you get pretty close to the responsiveness of a fully client-side approach.
9
20years 16 hours ago 1 reply      
I am happy to see this featured on the front page. I am using this for a current project after coming off an Angular project. I am so glad I chose this. It is simple to use and a pleasure to work with.
10
stdgy 15 hours ago 0 replies      
Very neat! This matches up closely with what we have evolved for my group's legacy codebase to simplify handling AJAX requests. I suspect we're not alone in arriving at this sort of declarative abstraction.

Unfortunately, our implementation is rather scatter-brained and non-uniform. That's partly due to its gradual evolution and partly due to lack of free employee time to clean up bit-rot. I'm going to investigate this a bit more and mock out some examples for our product. I definitely think it'd help us organize our unruly mass of code. Good job!

11
oliv__ 16 hours ago 0 replies      
Thanks HN! This is one of those just-in-time situations: I was going to need something to do some AJAX in the next few days and this is one of the most elegant solutions I've seen so far. Didn't even know this existed!
12
chunkiestbacon 3 hours ago 0 replies      
I used this to build my own webshop software for a client. Lots of AJAX features but only 80 lines of JavaScript in total. Intercooler is great for updating the shopping cart in the sidebar when pressing the add-to-cart button. This makes the shop feel a lot smoother.
13
dec0dedab0de 17 hours ago 0 replies      
I recently used intercooler to implement a small feature in a django app. It was an absolute pleasure to use.
14
cx1000 17 hours ago 2 replies      
I love that you can use this without having to build anything with babel/webpack. Given the scope of my web apps, anything that transpiles or mutates my source code is a non-starter because it makes debugging weird, since I'm not looking at my own code anymore.
15
sleepyhead 4 hours ago 1 reply      
"Attribute ic-get-from not allowed on element div at this point."

https://validator.w3.org/nu/?doc=http%3A%2F%2Fintercoolerjs....

16
rhabarba 10 hours ago 0 replies      
"Small". As in "just add the giant jquery library as a dependency".
17
brianzelip 16 hours ago 0 replies      
fyi, the timing of this post is likely related to the HN discussion https://news.ycombinator.com/item?id=12882816
18
unethical_ban 17 hours ago 0 replies      
I see someone read the front-end discussion. I am reading the guide to IC.js and it's a neat piece of tooling.
19
wichert 14 hours ago 0 replies      
If you like this sort of thing I can recommend looking at Patternslib (http://patternslib.com), which has many tools to add interactive behaviour to a webpage without having to write any JavaScript, making it a great toolset for designers. The injection pattern (see http://patternslib.com/inject/) does everything intercooler does, but also a lot more.

Disclaimer: I'm one of the original authors of Patternslib.

20
ape4 13 hours ago 1 reply      
What happens when there's an error? E.g. cannot contact host.
21
chandmkhn 10 hours ago 0 replies      
The WebForms version of ASP.NET has always supported this idea through something called UpdatePanel.

https://msdn.microsoft.com/en-us/library/bb399001.aspx

Commercial control providers in the .NET world support this scenario with something called "CallbackPanel".

https://demos.devexpress.com/MVCxMultiUseExtensionsDemos/Cal...

Real confusion starts when you have nested HTML controls that are automagically making AJAX calls. Nice idea as long as you can get away with minimal work.

The moment you want to use any modern SPA framework, you are up for a big rewrite.

22
astrospective 16 hours ago 0 replies      
I've been using this on .NET projects, and have pulled off some fairly intricate UIs by returning server-rendered partials. The polling support is nice and robust for dashboards.
23
ing33k 15 hours ago 1 reply      
this is one of those libraries which should be posted on HN once in a while.
24
Yokohiii 11 hours ago 0 replies      
Not sure yet if I like it, but for the "on churn" parts I will leave some respect here.
25
grimmdude 16 hours ago 0 replies      
Cool, this has some great functionality. A while back I wrote something very similar called "jQuery Ajax Markup". It was much simpler though: https://github.com/grimmdude/jquery-ajax-markup
26
bedros 15 hours ago 1 reply      
The best part is the examples; there are so many practical cases.
27
jramz 16 hours ago 0 replies      
28
bitforger 12 hours ago 0 replies      
+1 for official theme music
29
bobwaycott 15 hours ago 0 replies      
I have, over the last few years, taken a similar approach and built my own reusable, yet rudimentary, version of this. Happy to see such a well-thought out and elegant approach that matches my own preferences. Going to be using Intercooler in the future (and might even switch my old stuff to it). Nice project.
28
Show HN: Chrome extension that replaces words into a different language alpharabi.us
48 points by drshrey  11 hours ago   21 comments top 11
1
Davidiusdadi 7 minutes ago 0 replies      
The same idea was implemented a while ago by http://readlang.com/

Being able to review the words I "looked up" is essential for me.

2
ivancamilov 7 hours ago 0 replies      
Cool, but the example has a few errors. "Students" in Spanish is "estudiantes"; the example is missing an S.

Also, "more" is "más" in Spanish. Accents are important for meaning in a lot of languages.

3
clydethefrog 2 hours ago 2 replies      
Similar browser extensions are made before.

HN discussions:

Language Immersion for Chrome

https://news.ycombinator.com/item?id=3921773

Polyglot

https://news.ycombinator.com/item?id=1669162

4
steveridout 1 hour ago 0 replies      
Nice work so far.

I'm not convinced by this approach since I prefer to learn Spanish words in the context of Spanish sentences, but I haven't given it a fair shot. Of people who have tried this or similar extensions, did you a) find that you learned much? and b) keep using it for a prolonged period?

5
elhalyn 4 hours ago 0 replies      
Hey guys, love the idea... since I have "not invented here" syndrome, I have been building my own version for quite some time now :) (http://www.langulearn.com)

The next step to implement would be NLP; if you need help or just want to talk -> hello@langulearn.com

7
8
pkd 8 hours ago 2 replies      
Sigh. I was working on EXACTLY the same idea. Well, looks like these guys have the momentum now and the product looks good. Good luck!
9
drshrey 6 hours ago 0 replies      
If anyone's interested, the repo's right here: https://github.com/drshrey/alpharabius
10
dvcrn 8 hours ago 3 replies      
Very cool idea but slightly disappointed by the language options and the constraint on websites. I barely use any of the sites it supports, and it doesn't have any of the languages I'm learning.

I'm wondering how difficult it would be to pipe random words into google translate and replace them.

11
phmagic 7 hours ago 1 reply      
Brilliant, keep up the good work
29
Show HN: LosslessCut - Cross-platform GUI tool for fast, lossless video cutting github.com
88 points by mifino  17 hours ago   30 comments top 5
1
pritambaral 15 hours ago 4 replies      
I guessed from the title it was going to be using ffmpeg, because I myself have used ffmpeg's `-ss`, `-t`/`-to`, and `-codec copy` numerous times for this exact purpose.
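For anyone who hasn't used ffmpeg this way: the lossless cut described above is a one-liner. A minimal sketch (filenames and timestamps are placeholders; note that with `-c copy` the start of the cut snaps to the nearest preceding keyframe, so it isn't frame-accurate):

```shell
# Synthesize a short test clip so the example is self-contained
# (any existing file works in its place); -g 10 forces a keyframe
# every 10 frames, i.e. once per second at rate=10.
ffmpeg -y -f lavfi -i testsrc=duration=4:size=128x96:rate=10 \
       -pix_fmt yuv420p -g 10 in.mp4

# -ss 1 seeks to 1s, -t 2 keeps 2 seconds, and -c copy copies the
# streams verbatim instead of re-encoding, so the cut is fast and lossless.
ffmpeg -y -ss 1 -i in.mp4 -t 2 -c copy out.mp4
```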

Then I saw that this bundles its own copy of Chromium and ffmpeg and is ~70 MB in size. Something is seriously wrong with today's app development ecosystem if it takes 70 MB, even when I already have both Chromium and ffmpeg on my system.

EDIT: On the other end is QtAV, a cross-platform multimedia player and SDK which took me only 1 MB of network download. It uses Qt and ffmpeg, both of which I already have on my system, so it doesn't have to redundantly bundle anything.

On OSes without Qt and ffmpeg in their package repos (OS X and Windows), the players are ~ 20 MBs in size. So even if one argues "cost of cross-platform compatibility", it still doesn't make sense to bundle the entirety of a web browser for something as simple as this.

2
tckr 14 hours ago 1 reply      
Check out AviDemux (http://fixounet.free.fr/avidemux/); it offers a simple UI and also lossless cutting.
3
arjie 11 hours ago 1 reply      
I think including everything is super cool because I had the app installed and running in under 10 seconds. What wasn't cool was that I asked it to cut a short 15 s video down to the first second, and there was no feedback but the spinning gear for 5 minutes.
4
cm3 12 hours ago 1 reply      
How do you deal with the fact that some video tracks don't provide the needed cross-frame data, or the points you're cutting at are unfortunate ones that would require re-encoding, because a quick byte copy of the existing stream doesn't work, or at the very least will complain later about missing things like color info (although it's played back correctly by mpv)?
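When the cut point does fall between keyframes, the usual escape hatch is to give up stream copy for the video track and re-encode it, trading speed and losslessness for a frame-accurate cut. A sketch, assuming an ffmpeg build with libx264 (filenames and timestamps are placeholders):

```shell
# Self-contained input: a 4-second synthetic clip.
ffmpeg -y -f lavfi -i testsrc=duration=4:size=128x96:rate=10 \
       -pix_fmt yuv420p clip.mp4

# Re-encoding the video lets the cut start on any frame, not just a
# keyframe; audio, if present, could still be stream-copied with -c:a copy.
ffmpeg -y -ss 1.25 -i clip.mp4 -t 1.5 -c:v libx264 -pix_fmt yuv420p cut.mp4
```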
5
revelation 15 hours ago 0 replies      
So you're playing with version X of ffmpeg and cutting with version Y. It seems that X+Y equals disaster.
30
NYC Bike Stats sirpthatch.github.io
26 points by thatcherclay  10 hours ago   4 comments top 3
1
stinos 1 hour ago 1 reply      
Is it just me or are line charts with fills harder to interpret while not adding any value over normal line charts? Instead they add ambiguity: the surface itself doesn't seem to mean anything (?), yet it has two bounding lines, and depending on the graph only the upper or lower bound is the line which contains actual data; the other one is the line with the data from one of the other data sets. At least that's what I make of it, but maybe I'm completely wrong?

E.g. take the first graph: the red surface starts at 0 and its upper bound is the actual 'overall ridership' for Second Avenue, right? (Note to author: even when the units seem obvious to you, they might not actually be obvious to everyone.) So the lower bound of the green surface (Lafayette Street) has the same shape as the data for Second Avenue. Why? What does that mean? It's just the upper bound of the green surface which is the actual data for Lafayette Street, no?

On topic: glad to see bicycle usage is rising, but it would be interesting to see if e.g. car usage is declining and how the total number of people on the road is changing.

2
thesehands 1 hour ago 0 replies      
Also interesting to see the breakdown of usage of the Citi bikes. Does increased usage of Citi bikes lead to more casual users who don't wear helmets?
3
untilHellbanned 57 minutes ago 0 replies      
Nice analysis but Excel default color schemes are unbearable. I wonder how many people their hideousness turns away before even contemplating the data.
       cached 7 November 2016 11:02:01 GMT