Where 'nice things' is defined as being open source, having an open-source ecosystem of developer tools, etc.
This isn't so much the beginning (as good stuff has been happening for a couple of years now) but it's a huge step.
Google and Apple need some other party to keep them sharp, it might as well be MS.
"Available Wednesday, Visual Studio Community 2013 is a free, fully featured edition of Visual Studio including full extensibility."
So, it sounds like this will replace the Express edition and let you install extensions like you can in the Pro version.
"Visual Studio 2015 and .NET 2015: build for any device - Built from the ground up with support for iOS, Android and Windows, Visual Studio 2015 Preview makes it easier for developers to build applications and services for any device, on any platform."
It almost sounds like you're going to be able to run VS2015 on different platforms, but I doubt it. Maybe you'll run the web version of VS2015 to develop from Mac/Linux?
"To further support cross-platform mobile development with .NET, as part of their strategic partnership, Microsoft and Xamarin announced a new streamlined experience for installing Xamarin from Visual Studio, as well as announced the addition of Visual Studio support to its free offering Xamarin Starter Edition available later in the year. "
This is very interesting - .NET is going fully cross platform but they haven't bought Xamarin...are they planning on competing while keeping their frenemies close or something else?
The Scotts (Hanselman/Guthrie), Miguel De Icaza and so many others have worked tirelessly on this, and we (.Net Developers) owe them a ton of gratitude for helping to make sure this ecosystem doesn't wither on the vine.
If you want a high-level language runtime with good IDE support, you can just use .NET now. You know, unless you want the Ask toolbar.
For Microsoft, it's less bad if everyone switches to an uncontrolled platform than if they switch to a platform locked in by a competitor. The embrace/extend/extinguish logic works the other way when Apple are driving it. (They've done fairly well at killing off Flash, and Silverlight never stood a chance in this environment)
C# will port over just fine. But the .NET libraries? System.Windows has little to nothing in it, and right now using things like System.IO.* on Linux and Mac is just asking for trouble.
What are they going to do, hack Linux support into System.IO.* after the fact? Or just add a Linux.IO.*, which is even more of a hack? In either case it's going to get very messy very fast.
The .NET libraries absolutely could have been designed with cross-platform in mind, for example by putting the IO libraries and several of the other Windows-specific APIs under System.Windows.*.
As it stands, the .NET framework/libraries are very Windows-locked. So much so that you'd almost have to scrap them and start over to make them platform agnostic.
In other news:
* Visual Studio 2015 and ASP.NET 5 will support gulp, grunt, bower and npm for front end developers.
* OmniSharp is a family of Open Source projects, each with one goal - To enable great .NET development in YOUR editor of choice - http://www.omnisharp.net/.
The new Microsoft under Satya Nadella has totally changed direction in just a few months. They had one of the best and most rock-solid development platforms, a strong research division, and loyal customers. The new Azure cloud (online + on-premise) along with the opening of .NET will change the playing field.
I'm sure this is great news for us developers: the chance to work on one of the best runtimes, on a platform of our choice, in one of the best programming environments.
Great job Microsoft!
I am, of course, happy to see it. But, let's not get too excited about what good Open Source citizens Microsoft have become. Let's let their actions going forward determine that.
Visual Studio is awesome, especially debugging with a symbol server.
Though I hate it when it goes 'Not Responding' and the whole OS freezes. When your solution grows to 50+ projects, it takes 4-5 minutes to open, and if you so much as click the solution while it's loading, it hangs.
So, Microsoft is finally adapting. But what are they actually doing? Why did Microsoft finally decide to make .NET cross-platform? What's in it for them?
Look at what's in this shiny new package: They've open-sourced just the core runtime. They are not open-sourcing Visual Studio. Or WPF. Or SQL Server. Or Active Directory. Or Office.
There's one thing the Linux ecosystem is pretty good at: scaling, both up and down. There are technical reasons for that, but none could possibly be an issue for a software powerhouse the size of Microsoft. There are also commercial reasons, the most important being: you just can't beat free.
So that's what Microsoft is finally moving against. Dear startup founder who is afraid that licensing costs will eat you alive while your "growth hacking" strategy is working, dear embedded programmer whose tiny IoT device just can't cope with the whole Windows mumbo-jumbo: welcome to the Microsoft platform. You can now safely run your C# on these free platforms as well.
So, Microsoft is finally back in the game. They even seem to be playing nice. But the question in everyone's mind is: For how long?
Well that should make a lot of people around here happy.
A few months ago, we decided to write a big new software component at $work, basically a service layer that is going to accumulate lots of business logic. We discussed several programming languages, and thought that a statically typed language might be a good fit (we mostly did perl and python so far). C# was dismissed pretty quickly, because .Net was closed source, and Mono had the reputation of being a bit second rate (possibly not well-founded, but also hard to debunk for somebody not in the community).
I mentioned that Roslyn was also open source, but it was hard to convince anybody when the "main" implementation was still closed source (and we're very much an open + linux shop).
If this had come a year earlier, we might have picked C#. Maybe there'll be another project here in a few years...
Otherwise Mono seemed to run pretty much everything I threw at it. Can someone with .NET experience clarify for me what this will enable that Mono doesn't (yet)?
In other words, if .NET is going to continue to be riddled with Windows-only APIs, I'm not really interested.
The article seemed to mostly focus on the server side of things, but I'm not really sure if they can pull many devs over to writing application servers on .net. It will be hard competing against Java there.
This feels huge to me as someone that has always been on *nix variants but that has been told the .Net environment is amazing as long as you're willing to pay/work on Windows. I still probably won't switch over to C# or F# any time soon but it's good to know I could actually work on a WinMo app if needed.
Quite a powerful combo at that point!
That would take more than a few non-public meetings between G and MS, I guess, but it's possibly better than having to tiptoe around Oracle in the long run?
I wouldn't believe them this time either.
Then we have to look at its memory usage, GC pauses, locking issues.
It is much easier said than done to correctly port a large, very complex code base to an alien platform. (Mono was written from scratch, if I recall correctly.)
Still this is great news and kudos to Microsoft for taking this bold step in the right direction.
The move is designed to attract iOS and Android Devs to .NET.
But let's see:
As an iOS developer, I've invested years in learning Objective-C and Cocoa, UIKit, etc. Now I'm starting with Swift. I'm sure a lot of iOS devs feel this way. Besides, if I can't use it from OS X, then I'm out.
The same thought pattern applies for Android/Java: why would a seasoned Android/Java developer want to learn a whole new framework and programming language?
Is C#/.NET so much better than Java/Android or ObjC/Swift that it mandates switching to it?
Edit: It probably won't be for me, but who knows, some developers might prefer C# over Java on Linux and Mac. Too bad Microsoft is 13 years too late for me on this. They had my interest when I was beta testing Visual Studio .NET 2002, but by 2005, when I saw how far Java had come and got a taste of the power and Cadillac nature of Eclipse, it would be tough to turn back.
"Microsoft Announces Windows 2000 Certification For Microsoft Certified Systems Engineers"
"Microsoft and Samsung Reveal Windows Powered Pocket PC For GSM/GPRS Networks"
"Microsoft Office 97 Family of Applications Honored With Industry Awards"
"Microsoft Invests in General Magic"
"Microsoft Internet Explorer 3.0 Beta Download Demand Overwhelming"
Ignoring the time machine aspects, I wonder what made their algorithm come up with these stories. They mention "Microsoft" and "Net", but that is about it.
Also, it's great to have an engineer at the helm of Microsoft again.
My experience has been that .NET interop with C++ code bases using C++/CLI was much smoother than with P/Invoke. It would be great if that were cross-platform too.
Developers can get started with Visual Studio Community 2013 here.
I think they meant here: http://www.visualstudio.com/news/vs2013-community-vs
I wanted some more specifics regarding licenses and found this page helpful: http://www.dotnetfoundation.org/projects
Hopefully one day they'll support the entire Java runtime so that I can deploy Java apps to the JRE or .NET.
I've always missed the power of visual studio when programming for open source platforms. This can change things a lot.
so that means I can write GUI-based desktop apps for Linux in C#, right?
The next decade will be interesting.
It was a great platform, though. I was impressed by it when I first went to TechEd in 2002.
Looks like it's time to finally learn F#
For Microsoft though, doesn't this mean (ironically) loss of business, because many people will no longer have a reason to run Windows?
Feels good :)
Google Cache: http://webcache.googleusercontent.com/search?q=cache:http://...
As nice as C#/F# is, the real fun comes from the powerful IDE. And I just don't do windows anymore, except for occasional gaming (and this only until I finally buy the new retina iMac and throw the last PC at home away).
Great news, but this platform should have been open source from the beginning. Anyway, better late than never.
Please say yes.
Recently, there was a bug that makes the iOS OWS client replace its standard icons with emoji. It's been over 90 days and the bug is still not fixed.
There have recently been a lot of bold decisions at Microsoft. If anything can turn around a dying company it's this kind of approach.
And, moonshot, but giving IronRuby/IronPython as much prominence as PowerShell.
I've never heard a Ruby or PHP developer list Azure as a potential deploy target. That is a real problem for Microsoft, even though Azure can do a lot of the same things AWS or Google's cloud does.
Smart move Microsoft.
If there are any MS people in this thread: I would pay for Visual Studio for Mac and Linux if I could also use its GUI designers on those platforms. If I could write a GUI front-end in C# and design it with VS and ship it for Windows, Mac, Linux, and possibly others, then I'd definitely pay money for that.
Right now we've got Qt, Java, and possibly HTML5+node-webkit for that, and none of those are anywhere near as good as MS's GUI tooling and IDE.
.NET going open source is Microsoft admitting that, despite its best efforts, developers still want to know what's going on behind the curtain, whether the mirror really works, and just what kind of smoke is being blown up their ass in the effort to capture their minds and bind them to the brand.
'...show me the code.'
True to form, the links to github are broken.
Too often users aren't allowed to install programs. A simple program that can run on anything since XP is a good way around Microsoft's sandbox strategy, and DLL hell and installers are big problems.
With Wine you can even run your simple EXE on Linux and Mac. Native x86 means you run faster than these virtual-machine-based solutions.
I know this is a long stretch, but if any of you bought a used copy of LiSP from someone in Virginia, USA, please can you check if your copy has a telephone number hand written on inside of the right cover? It's the only contact I had to my step-father and I have not heard from him since. We haven't been in touch since my mother's passing.
Look forward to the first pictures from the surface. I'm at the Division on Planetary Sciences (DPS) meeting  in Tucson at the moment, and there are already incredible results being presented based on data acquired by Rosetta. Stay tuned for a whole lot more!
1) 3km above comet:
2) Few seconds before landing:
3) First surface image?
4) Possibly a new image from the descent?
Edit: no, here is the source (Rosetta's NavCam from yesterday):
Rosetta Lander Imaging System (ROLIS)
More analysis of @Philae2014 telemetry indicates harpoons did not fire as 1st thought
edit - the landing is confirmed, however the harpoons did not fire: https://twitter.com/ESA_Rosetta/status/532579871202238464
I'm really curious to know how different it is from the web or enterprise development worlds.
The ESA live feed at most times shows people in some kind of control room staring at screens. There is no apparent way to see any highlights, unless I want to try scrolling back and forth through the hour-long video stream.
At any given time, various forum threads seem to have more information than the ESA site, which seems to communicate mostly through either lighthearted tweets, one-line headlines, or general background articles.
All I want is a simple timeline of events, constantly updated with latest news and images. Instead we have forum threads where you have to dig through comments to find out what is the newest info.
So they disabled the orientation system to save energy, but first they made the probe rotate quickly to stabilise it like a gyroscope.
That's stuff from sci-fi books / a MacGyver movie :)
Thanks @brianpgordon - check out this gif of the orbital maneuvers required for Rosetta to reach its destination: https://i.imgur.com/TUkKuhf.gif
Live twitter feed of ESA https://twitter.com/esaoperations
It looks like @Philae2014 made a fairly gentle touch down on #67P based on amount of landing gear damping #CometLanding
Pretty mind-blowing for me that they planned 10 years ahead.
Edit: Thanks for all the replies! I'm at work now but will take a look at them this evening.
As someone who's worked on a few spacecraft projects, I feel really bad for the team(s) (I recently worked on one that didn't go so well; years of work down the tube). Even if it didn't go perfectly, I hope they're commended for the work they've done so far and the landing they achieved.
I am wondering what this will mean for humanity. Do you guys think the insights we gain from Philae will be as impactful as the ones from other space missions?
Still can't believe ESA planned and landed a robot on a comet. Bravo!
Silencing diverse opinions is quite possibly the worst facet of HN.
To me, no other statement could be more impacting. Earth is finally sending motherships to space. feeling mind-boggled
What was once a killer app, a core productivity tool, has given way to an almost unusable interface.
It is now really hard to perform some tasks.
For example, I am a heavy calendar user and I have something in the calendar almost every day. It is now extremely difficult to answer the question: When is a three hour window free in the next month or two?
That used to be a single glance at the month view which communicated to me full day events, part-of-day events, and for the latter which part of the day and how long the event took. A single glance at month view could answer it and I could move forward backwards to glance at the next/prev month.
Old: https://www.dropbox.com/s/kiic7fdrmu65172/2014-11-13%2006.56... (October 2014 in the old v4.4 Calendar app)
To achieve it now, I'd need to use the 5 day view, and for each 5 day segment to scroll up and down as even on my Moto X (2014) I cannot view more than half of the working day.
New: https://www.dropbox.com/s/pgqbhnc1ifp02a9/2014-11-13%2006.56... (October 2014 in the new v5 Calendar app)
Gone is the ability to use Calendar on the phone as the core way to organise your life, it is essentially now unusable on the phone for anything other than a near-term agenda/itinerary.
All of the new Gmail/Inbox to Calendar features? Dead to me, all of my accounts are Google Hosted accounts and none of the Google Now, Gmail, or Inbox integrations with Calendar work on Google Hosted accounts.
However, there are some inconsistencies - besides the Calendar app heavily described here, the Contacts app on my Nexus 7 has some weird design choices.
Firstly, when viewing a contact, there is no back button in the top left - you need to use the system back button, or drag the card down. Secondly, when adding a new contact, the checkmark button is in the top left, and to discard you need to go to the top right overflow settings button, and choose discard changes (the only option in the overflow). Completely counterintuitive (top left for Create, top right for discard), and the opposite of the Gmail app's Compose window (which uses the standard layout).
I was really hoping they would bring back whatever API made it possible for things like AppOps Launcher to allow you to prevent other apps from accessing things like your contacts, account listing, or location. I was hoping that when they took it away it was because it was intended for release in a future version, but we've yet to see it since.
This is a great feature that honestly wouldn't have happened if it weren't for the fragmentation and slow-update issue that carriers and device manufacturers created. It's like trying to grab toothpaste once it's out of the tube: the harder you squeeze, the more it simply finds another crack in your hand to escape from.
Material is also fantastic, almost joyful to use. It almost nails the flat-with-a-dash-of-skeu look that Microsoft and Apple have both missed. There are still a few too-flat bits here and there, but it's a great place to be.
How is the situation with backups and restores nowadays? If I'd have to redo all my settings when switching from an Android to another Android phone in 2016, I'd be pretty sad. I'd even be willing to restrict myself to Nexus devices if that makes a difference.
Oh, also, are there any good Nexus phones with dual simcards? That would be a major reason for me to switch over.
Isn't that how versioning works generally?
> Since Dalvik only compiled at runtime, the compiled code was never written to disk. [...] This would lead to a lot of disk thrashing [...] Since ART is already compiled, the compiled code can be paged out to disk
It's funny that we still call disk any slower, non-volatile memory (and it's even funnier to imagine an Android phone with an actual disk).
(and no, i don't want to have to download other apps, tweak this and that to achieve the desired outcome. It should be standard)
Here are some other ones that didn't quite hook me.
I'm tremendously excited for this.
Also: if you don't launch with CoffeeScript, I'll be adding a browser plugin to enable it.
This looks great. Will there be a pre-launch documentation release? If I were going to mess around with it, it'd be nice to be able to think about it for a day or two.
Going back to the topic, there are great points there. I remember discovering "tc qdisc" and playing with it. Really nice tool.
But another thing to learn, perhaps, is to try to avoid the gray zone by going to either the "black zone" (dead) or the "white zone" (working fine). That is, if a node/process/VM/disk starts showing signs of failure above a threshold, something else should kill/disable it or restart it.
Think of it as trying to go to stable known states: "machine is up, running, serving data", "machine is taken offline". If you can, try to avoid in-between gray states: "some processes are working, some are not", "swap is full and we're running out of memory, the OOM killer is going to town, some services kinda work", and so on. There are just too many degrees of freedom, and it is hard to test against all of them. Obviously some things like network issues cannot be fixed with a simple restart, so those have to be tested.
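The kill-above-a-threshold idea can be sketched in a few lines. This is a toy illustration (all names are made up, not from any particular supervisor): track a sliding window of health-check results and force the worker to a known state once the failure rate crosses a limit.

```python
# Minimal sketch of the "avoid the gray zone" policy: once a worker's
# recent failure rate crosses a threshold, stop treating it as "kinda
# working" and force it to a known state (kill it, let a supervisor
# restart it). Names here are hypothetical.
from collections import deque

class GrayZoneGuard:
    def __init__(self, window=20, max_failure_rate=0.3, min_samples=10):
        self.results = deque(maxlen=window)  # sliding window of True/False
        self.max_failure_rate = max_failure_rate
        self.min_samples = min_samples

    def record(self, ok):
        self.results.append(ok)

    @property
    def failure_rate(self):
        if not self.results:
            return 0.0
        return self.results.count(False) / len(self.results)

    def should_kill(self):
        # Only act once the window holds enough samples to be meaningful.
        return (len(self.results) >= self.min_samples
                and self.failure_rate > self.max_failure_rate)

guard = GrayZoneGuard()
for ok in [True] * 8 + [False] * 6:
    guard.record(ok)
print(guard.should_kill())  # 6/14 failures is over the 0.3 threshold
```

A real supervisor would also rate-limit restarts, so a persistently broken node doesn't flap forever instead of being taken offline.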
I once wrote a library for python that injected itself into the main modules (os, sys, etc) and generated random failures all over the place. It worked very well for writing reliable applications, but it only worked for pure python code. I don't own the code, so I can't open source it unfortunately.
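A stripped-down version of that trick, since the original code isn't available: wrap a function from `os` so it raises with some probability. All names here are mine, not the library's.

```python
# Toy failure injection: monkey-patch a function so it raises with a
# given probability, to see how calling code copes. Purely illustrative.
import os
import random

def inject_failures(module, name, probability, exc=OSError):
    """Replace module.name with a version that randomly fails.

    Returns the original function so the patch can be undone."""
    original = getattr(module, name)
    def flaky(*args, **kwargs):
        if random.random() < probability:
            raise exc(f"injected failure in {name}")
        return original(*args, **kwargs)
    setattr(module, name, flaky)
    return original

# Demo: make os.getcwd fail every time, then restore it.
original_getcwd = inject_failures(os, "getcwd", probability=1.0)
try:
    os.getcwd()
except OSError as e:
    print("caught:", e)
finally:
    os.getcwd = original_getcwd  # restore the real function
```

With a probability below 1.0 (and a seeded `random` for reproducibility), failures become intermittent, which is closer to what real applications have to survive.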
I was very impressed with its feature-set (for what it is). On our team, we use it to see how our iOS app will react to severe network problems (via testing in the simulator, mostly, though it's also available on iOS devices themselves as explained in the above article).
I am still trying to work out how to avoid knobbling my DB connection when simulating client errors on a single dev machine.
On the devices side? Sure, Surface and Lumia are nowhere near as successful as the iPhone, iPad and Android. That was Microsoft coming late to the party. But come to the party they must. Before Windows Phone, shareholders screamed for an iPhone competitor from Microsoft. After Windows Phone, shareholders screamed for an iPad competitor from Microsoft. Windows Phone is good, and has some loyal fans. Windows 8 didn't do much for Surface, or for Microsoft. Surface hardware, however, is also good, but new.
Ultimately the devices strategy strikes me as being sound - it's a platform to showcase the services side of Microsoft, and in time may be profitable to a point where the critics are satisfied.
I think it is entirely feasible that they haven't really thought through what they are doing. It feels that they are simply streamlining and opening up without an end game in mind.
I find it curious that Microsoft is the target of this criticism, and I am further curious what the author thinks about Apple.
Eh I'd hazard a guess that the vast majority of mobile office users don't need unlimited cloud storage (especially with 15GB free anyway) or advanced editing capabilities. Both of those features would be well served in an enterprise environment, but there are Office 365 Enterprise subscriptions for that. Maybe no dropbox integration is an annoyance, but I question whether that would be enough to push out existing office users.
They didn't keep this a secret, man.
This paragraph struck me as funny, as it is decrying Microsoft for upselling premium subscriber-only features for Office 365, and then it ends on a link that is a "members only" upsell to a $10-a-month subscription feature for this blog.
Ha, ha. wut.
After this, I have trouble taking the rest of the article seriously...
> The offices the girl rode between were electronically conterminous (in effect, a single desktop), the map of distances obliterated by the seamless and instantaneous nature of communication. Yet this very seamlessness, which had rendered physical mail an expensive novelty, might as easily be viewed as porosity, and as such created the need for the service the girl provided. Physically transporting bits of information about a grid that consisted of little else, she provided a degree of absolute security in the fluid universe of data. With your memo in the girl's bag, you knew precisely where it was; otherwise, your memo was nowhere, perhaps everywhere, in that instant of transit.
Yeah, about that...
Unless they're going back to mechanical ones.
I wonder whether its time has finally come.
Edit: It's worth noting that each typewriter is unique and will leave a telltale signature subject to identification https://en.wikipedia.org/wiki/Typewriter#Forensic_examinatio...
Of course, modern printers aren't immune to this and many models incorporate identifiers by the manufacturers to aid forensic investigation https://en.wikipedia.org/wiki/Printer_steganography
Are open-source software and reasonable security practice really that bad? I mean, I know it's bad, but is it abandon-common-sense-and-just-grasp-at-straws bad?
God forbid we shouldn't worship new technology as always better.
> "During this 15-year journey they were acquired, but subsequently bought the company back a few years later and are now wholly owned by the staff."
Non-small tech companies owned by staff are pretty rare. Have you ever written anything about the experience? Does it work well? Is it structured like a co-op or similar?
(sorry for the barrage of questions, but I've always wondered about this)
The idea that a major government malware contracting effort was required to pop a particular Hungarian CA (presumably for deniability reasons) tells you something. The USG virtually undoubtedly controls several RSA keys that can be used to sign arbitrary SSL/TLS certificates. Why didn't they just use one of those?
I assume it's because they're expensive, and every time you use them, you risk burning the CA they're associated with: the major browser and OS vendors will excise your root keys, or attach constraints to their use.
Kim Zetter, for understandable narrative reasons, uses Gmail as an example of the kind of site that a CA-hijacker could compromise. But Gmail is the dumbest possible site to target with a traceable compromised CA key, because its key identities are pinned in Firefox and Chrome; if the key indicated over the wire disagrees with the pins in the browser binary, the browser flips out.
This is why HPKP, TACK, and similar pinning/continuity/attestation frameworks are such a good idea. Over the medium term, they allow the users of the Internet to surveil SSL/TLS keys and detect compromised CAs.
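The core of such a pinning check is tiny: roughly, hash the server's DER-encoded public key info (SPKI) and compare it against a pinned set, so a certificate minted by a compromised CA fails even though it chains to a trusted root. A minimal sketch, with placeholder bytes standing in for real key material:

```python
# Sketch of an HPKP-style pin check. HPKP pins are base64-encoded
# SHA-256 hashes of the DER-encoded SubjectPublicKeyInfo. The byte
# strings below are placeholders, not real key material.
import base64
import hashlib

def spki_pin(der_spki_bytes):
    """Compute the base64 SHA-256 pin of a DER-encoded SPKI blob."""
    return base64.b64encode(hashlib.sha256(der_spki_bytes).digest()).decode()

def key_is_pinned(der_spki_bytes, pinned_pins):
    """True if the presented key matches one of the pinned hashes."""
    return spki_pin(der_spki_bytes) in pinned_pins

known_key = b"example-spki-der-bytes"      # stands in for the real key
pins = {spki_pin(known_key)}               # what the browser ships/remembers
print(key_is_pinned(known_key, pins))      # the genuine key passes
print(key_is_pinned(b"attacker-key", pins))  # a CA-forged key fails
```

The point is that the check is independent of chain validation: a forged certificate for the same hostname presents a different public key, so its pin doesn't match no matter which trusted CA signed it.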
It's unfortunate that still today the attitude is "cover it up" rather than disclosure. I would hope that any company that I entrust with my data would be forthright about breaches so that I, as a customer, would have the opportunity to take whatever precautions were necessary given the details of the breach.
Professionals stealing certificates since at least 2009.
To me, there is a very interesting contrast to be had between this announcement and Microsoft's announcement: it feels like Microsoft is discovering the business value of being open at the same time that Amazon is living in the time warp of proprietary everything. Has Microsoft internalized that open source is (or can be) a differentiator in the cloud? Amazon is clearly still oblivious to it -- and it will be very interesting to see if this service generates fear of vendor lock-in...
and Frequently Asked Questions: http://aws.amazon.com/rds/aurora/faqs/
At $200/month for the entry level, their lowest price is many times what the cheapest geo-replicated "SQL engine as a service" from Google or Microsoft is. I'm not sure how the performance differs, but I am guessing theirs are no slouches.
Microsoft offers "SQL Database" geo-replicated for as low as $15/mo., and it scales up from there. Not sure about performance, but it would be apples to oranges (MySQL versus SQL Server) and difficult to compare. I wonder what the TPC numbers are, but apparently the TPC organization doesn't allow publishing that yet.
Google offers "Google Cloud SQL", also geo-replicated, and their cheapest pricing is between $10 and $18 a month.
Using an existing name for a product in a similar space is just confusing and hurts everyone.
So what can Aurora do for that workload? Do they support multi-table transactions and referential integrity across all 3 Availability Zones? Similarly, they mentioned durability targets; what are their targets for consistency (i.e., the rest of ACID)?
Here in Germany, where I live now, it's a mix. Common bread is everywhere, but you can slice your own loaf at the supermarket with a terrific automated spring-loaded circular-saw machine that's very cool to operate.
I just want to point out that "the best thing since sliced bread" is a phrase culturally linked to the U.S. conception of bread. I point that out because I've had conversations with American friends in the past who were surprised by the non-universality of this phrase (as often happens with other cultural references).
Now on to the article. Though I had heard bits and pieces of the lore behind the invention, this is the first complete history I've read. I'm surprised that he went about building the large-scale automated machine first, rather than evolving it from simpler devices. That the development was delayed and almost lost for good serves as a warning more than his success is an inspiration, I think. But he was persistent, which I suppose is the single most important quality an innovator can have.
Not to mention a sense of humor. "Mac-Roh Sales & Manufacturing". And then he ended up selling his business to "Micro-Westco".
I once asked my grandfather, "What was the best thing before sliced bread?" Without hesitation he said, "Indoor plumbing."
$20 for the Spark is an interesting price point as it's cheaper than most of the Arduino wifi shields, which you still need to connect to an Arduino. It's also the same price (in many places) as a USB wifi dongle, which you need, in order to connect up a Raspberry Pi.
While this thing is drastically less powerful than a Raspberry Pi, in projects where the Raspberry Pi is a simple Wi-Fi-to-GPIO board this seems perfect, especially since there's already a mobile app.
I wonder if the current title, 'Spark: Introducing the $19 Photon', is as it was submitted. But either way, it's a terrible title. What's a Photon, and why is it being $19 newsworthy?
```
<meta property="og:title" content="Introducing the Photon IoT Toolkit">
<meta property="og:image" content="https://s3.amazonaws.com/spark-website/photon-hero.jpg">
<meta property="og:site_name" content="Spark">
<meta property="og:description" content="A $19 postage stamp-sized hackable Wi-Fi module for interacting with physical things.">
```
Also, $59.04 to ship to Canada? And why does it cost twice as much to ship to Canada as to France?
It's a bit annoying living in Europe and having to pay a tax on shipping whenever something interesting comes out.
EDIT: Just after typing this comment I tried again and now I have the option to ship it to France for $10. There seems to be a bug somewhere on their checkout form.
I'd rather use an RaspberryPi or Arduino Yun. While more expensive up front, they allow you to connect to any service you would like.
1. No more xbee
2. Replaces other hardware components, for instance no need for an RTC when you can just get the time from the internet
Their app for connecting to the network never worked for me but it had a serial fallback so that was fine.
Edit: I'm in the UK.
I can't wait!
I believe the wifi chip used is the same as in the Electric Imp.
On that note, there's a weird "don't go back" loop at https://www.spark.io/signup.
This appears to be a complete upgrade* from the Spark Core, correct? So waiting 5 months will get you a better product at half the price?
* it appears to be missing the option for external flash storage, but that's not a big deal as they've upgraded the built in storage.
I hope I'm not missing something: is this a smaller, better, cheaper version of the Core? If so, does the Core only exist until the Photon starts shipping?
Is this revolutionary? It does look awesome.
For technical questions, @pavanky is on here :-)
This and today's .NET announcement shows how hard it is to sell proprietary developer tools. I had considered using ArrayFire for some of my own commercial work, but in the end decided to roll my own OpenCL code in order to have better control. If you require cutting-edge performance (which is the reason you'd consider ArrayFire in the first place), there's just too much risk involved if the vendor doesn't get details like memory access order right on complex matrix problems. Open-sourcing reduces that risk quite a bit; if this decision had been made 3 years ago, I would have given the product a closer look.
From a business perspective, open-sourcing will murder their margins, so they're basically gambling on their ability to jump-start volume. I think the product is in a tough position because most of the action these days is going toward "Big Data," where data doesn't fit on a single machine -- let alone a GPU -- or toward heavy number-crunching, where hand-rolled kernels will outperform generic array libraries. They might have luck serving as a kind of backend to NumPy, but then they're two steps removed from the customer, so it'll be hard building a relationship that leads to a sale.
As a side note, it seems odd to me that "native CPU" is a target distinct from OpenCL, which already runs on both CPUs and GPUs. I understand that kernels written for GPUs sometimes need to be rewritten for CPUs to take advantage of the different computation and memory architecture, but since their native CPU target isn't vectorized or multi-threaded, it seems like any further effort should be spent adapting the OpenCL kernels for CPU platforms rather than reinventing the wheel with a distinct C or assembler target.
I admire the general goal of making GPU processing more accessible, but it's a problem with a lot of nuance and requires a significant amount of customer education. GPUs are sort of like quantum computers in the limited sense that they're totally awesome at some tasks and totally suck at other tasks, and you need a solid grounding in the theory to distinguish the two sets of cases. Open-sourcing should at least help with the education angle, since ArrayFire now represents a respectable percentage of publicly viewable OpenCL code. (The open-source scene for OpenCL is pretty depressing right now.) In any case, good luck out there.
"Real speedups for you code!"
Check out BizSpark.com or get in touch with me if you happen to be a YC company (firstname.lastname@example.org).
It looks like Microsoft might have a better Android emulation workflow than Google. What strange times!
I didn't see any mention of Maven/Gradle support, however, and my latest VS is 2012. Anyone here with some insight into dependency management outside of NuGet?
Sometimes I have nightmares of updating Newtonsoft.Json (https://www.nuget.org/packages/Newtonsoft.Json/) and chasing down all of the version conflicts for the next several days.
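For what it's worth, the usual escape hatch for those conflicts is an assembly binding redirect in app.config/web.config; a minimal sketch (the version range and token are illustrative -- check the actual installed version):

```xml
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="Newtonsoft.Json"
                          publicKeyToken="30ad4fe6b2a6aeed"
                          culture="neutral" />
        <!-- Force every reference, whatever version it was compiled
             against, to resolve to the one version actually installed. -->
        <bindingRedirect oldVersion="0.0.0.0-6.0.0.0" newVersion="6.0.0.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>
```

It doesn't make the conflicts go away, but it turns several days of chasing into one config entry per offending assembly.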
Later the WP8 emulator was actually fully based on Hyper-V, and this is too... meaning it is "just" a VM running x86 Android, which explains why it's so fast.
"You need to recompile your code for x86. If you have parts of your code that can only be compiled for ARM, or you depend on 3rd-party libraries for which you do not have an x86 version, your code will not run on our emulator at this point."
Microsoft is so hot right now!
Well, that notation-jargon is still not real theory, and this stuff from the nautil.us link here is. This is the stuff to understand if you want to understand music.
I've been out of the music theory game for a while, so I don't know if anyone's doing much with the I-R model anymore, but it's a fascinating approach to music analysis.
This is not accurate -- Bigtable is not eventually consistent. The scope of transactions supported by a system is a different set of considerations from the level of consistency it provides. Bigtable is consistent but only allows for transactionality at the row level.
Optimistic concurrency control is nothing new and Percolator layered transactions on top of Bigtable years back. Furthermore, TrueTime -- allowing for comparatively low-latency update across a globally distributed set of DCs -- is the real innovation in Spanner, not the use of optimistic concurrency control.
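For reference, optimistic concurrency control in its simplest form is just a versioned compare-and-swap; a toy sketch (this has nothing to do with Spanner's or Percolator's actual implementations):

```python
class VersionedStore:
    """Toy optimistic concurrency control: a write succeeds only if the
    row's version hasn't changed since the writer last read it."""

    def __init__(self):
        self._data = {}  # key -> (value, version)

    def read(self, key):
        # Returns (value, version); missing keys read as (None, 0).
        return self._data.get(key, (None, 0))

    def write(self, key, value, expected_version):
        _, current = self._data.get(key, (None, 0))
        if current != expected_version:
            return False  # conflict: caller must re-read and retry
        self._data[key] = (value, current + 1)
        return True

store = VersionedStore()
_, v = store.read("row")
assert store.write("row", "a", v)      # first write wins
assert not store.write("row", "b", v)  # stale version is rejected
```

The hard part Spanner solves isn't this loop; it's agreeing on "the version" across datacenters with bounded clock uncertainty, which is where TrueTime comes in.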
Honestly, I am not sure what this article is trying to claim, except perhaps that per-node performance has been improved. AFAICT, most of this is due to the fact that RAM is cheaper than it was, SSDs have reached commoditization, and networks in the DC are faster than they used to be.
Implying that schemaless design is a GOOD thing
I can imagine how, for a new user, it's utterly baffling which of these options is best for the long term, with the least friction. I use OpsWorks quite a bit but have found it very challenging, and the feedback cycle when attempting to develop new cookbooks is excruciatingly slow.
All I want, personally, is a system that uses a set of interchangeable scripts that represent dependencies, so my server configs can live in version control. It doesn't even need to run on multiple OSs (which seems to be a central tenet of Chef). It just needs to deploy/rollback with zero downtime, and ideally autoscale as quickly as possible. Is this it? Is there any way to know without spending weeks fleshing out how it works?
The CodeDeploy service seems to operate by you manually launching a base EC2 instance with a CodeDeploy agent; the agent then checks out your Git code on the live instance, runs any provisioning steps, and, if things break, somehow rolls back all that work, still on the live instance.
I'm sure this is still a big improvement for companies that are manually SSHing into servers and running deployments by hand, but as someone who pre-bakes AMIs and does rolling upgrades with autoscaling groups, this service seems like a step backwards.
There's some discussion in that post of how it compares to pre-baking, etc. Of course there are trade-offs either way. CodeDeploy does require that you are careful with your lifecycle scripts to make deployments as atomic as possible. At least they provide a good selection of default lifecycle events for you to take advantage of.
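For the curious, those lifecycle events are wired up in an appspec.yml at the root of your revision; a minimal sketch (paths and script names here are made up):

```yaml
version: 0.0
os: linux
files:
  - source: /
    destination: /var/www/myapp     # hypothetical install path
hooks:
  ApplicationStop:
    - location: scripts/stop_server.sh
      timeout: 60
  AfterInstall:
    - location: scripts/install_dependencies.sh
  ApplicationStart:
    - location: scripts/start_server.sh
  ValidateService:
    - location: scripts/validate.sh  # a non-zero exit here fails the deploy
      timeout: 300
```

The ValidateService hook is where the "atomicity" discipline lives: if your validation script is honest, a broken deploy gets flagged before the next instance in the fleet is touched.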
Discussion on Deis from yesterday: https://news.ycombinator.com/item?id=8591209
1. https://mesosphere.github.io/marathon/
2. https://mesos.apache.org/
Right now "nobody ever got fired for" setting up deployment via rsync and some ad-hoc shell scripts. That works for a single host, although it's not great for reproducibility. But as soon as you go to multiple hosts you need some degree of orchestration, monitoring, and integration with your load balancer to avoid downtime.
CodeDeploy offers those benefits, so if it turns out to be even slightly good, it could become the "nobody ever got fired for" choice, for any non-trivial app running on AWS.
CodeDeploy is not available in EU (Ireland). Please select another region.
US East (N. Virginia), US West (Oregon)
The FBI's campaign to destroy Dr. Martin Luther King began in December 1963, soon after the famous civil rights March on Washington. It started with an extensive -- and illegal -- electronic surveillance of King that probed into every corner of his personal life. Two weeks after the march, the same week King appeared on the cover of Time magazine as "Man of the Year," FBI agents inserted a microphone in King's bedroom. ("They had to dig deep in the garbage to come up with that one," FBI director J. Edgar Hoover said of the Time cover story.) Hoover wiretapped King's phone and fed the information to the Defense Department and to friendly newspapermen. When King travelled to Europe to receive the Nobel Peace Prize, Hoover tried to derail meetings between King and foreign officials, including the Pope. Hoover even sent King an anonymous letter, using information gathered through illegal surveillance, to encourage the depressed civil rights leader to commit suicide. "The actions taken against Dr. King are indefensible. They represent a sad episode in the dark history of covert actions directed against law-abiding citizens by a law enforcement agency," a Senate committee concluded in 1976.

History reveals that time and again, the FBI, the military and other law enforcement organizations have ignored the law and spied on Americans illegally, without court authorization. Government agencies have subjected hundreds of thousands of law-abiding Americans to unjust surveillance, illegal wiretaps and warrantless searches. Eleanor Roosevelt, Martin Luther King Jr., feminists, gay rights leaders and Catholic priests were spied on. The FBI used secret files and hidden microphones to blackmail the Kennedy brothers, sway the Supreme Court and influence presidential elections.
The US, then and now, was totalitarian and authoritarian. Some of you, especially here on HN, may not fall into those mindsets, but it doesn't matter: you've lost. You're barely scraping by, working 60 to 80 hours a week, and you have no time to change your environment. Meanwhile the political class is able to work full-time on perpetuating their power while taking away yours. You have no power, no rights, because they have been chiseled away over the last 30 years by the authoritarians.
I've said this before and I'm always downvoted, but I don't care. Just leave. Go to Berlin, or London (not much better, though), Switzerland, or anywhere else. Even if you go to someplace like the UK that isn't much better than the US, you will at least no longer be contributing to a government spending 10X to 100X what any other country spends on arguably evil pursuits. Take your wealth-creation skills somewhere else, where you won't be contributing to your own demise.
I know that many of you will discount this one event as a one-off - MLK was certainly special. But it's only a one-off because it was the start of this sort of campaign against someone that can bring change.
The FBI spied on Martin Luther King Jr. in an unsuccessful effort to prove he had ties to Communist organizations. In 1963, Attorney General Robert Kennedy granted an FBI request to surreptitiously record King and his associates by tapping their phones and placing hidden microphones in their homes, hotel rooms and offices. A 1977 court order sealed transcripts of the surveillance tapes for 50 years.
...some people think he made extensive use of prostitutes, but I expect the FBI would have pulled an "Eliot Spitzer" on him had that been the case. Still, there's something there or they wouldn't be covering it up to protect his saintly image.
"Formerly vaguely liberal-moderate, more recently moderate-to-neoconservative (hackers too were affected by the collapse of socialism). There is a strong libertarian contingent which rejects conventional left-right politics entirely. The only safe generalization is that hackers tend to be rather anti-authoritarian; thus, both paleoconservatism and hard leftism are rare. Hackers are far more likely than most non-hackers to either (a) be aggressively apolitical or (b) entertain peculiar or idiosyncratic political ideas and actually try to live by them day-to-day."
Mapping this to any political ideology would be difficult.
San Francisco seems like a broad term, but there's so much interesting stuff: they were watching school teachers in the '60s and '70s and trying to create distrust within communities that were too left-leaning.
The redaction reveals more about what the FBI wouldn't do: how at least one person was reluctant to release public documentation proving that's what the FBI did.
There's this bizarre projection of the individual and his/her motivations onto every living being that fails to make any logical sense.
Is there any psychological premise for why we feel the need to dictate the behavior of others such that they perfectly mirror how we behave (or in many cases, wish to)?
There appears to be a tipping point where someone agrees with a certain set of values and as opposed to stopping at enforcing those values on themselves (reasonable), they go absolutely nuts trying to push it onto everyone else.
A sort of: how dare you.
A lot of people dismiss accusations against government agencies or fail to consider hypothetical legal abuse scenarios because "the government would never do that". Yes, the government would ever do that.
FBI's "Suicide Letter" to Dr. Martin Luther King, Jr., and the Dangers of Unchecked Surveillance
That aside, there is very little doubt in my mind Hoover was a bad man. The sad part is, many people eventually are bad given the chance and they never even know it. This is why impartial rules and transparency are important.
This may not be a common sentiment, but I look forward to the day when we are governed by machines rather than monkeys. I mean... the constitution, the rules of state and religion, they are algorithms, no? Designed to remove, as much as possible, the corruptible human element from the equation? So why not take this concept a level further? That's my thinking.
Eventually there will always be another Hoover, and the next one might have better tools. But I think the human race can build a better system based on principles of efficiency, impartiality and beneficence. And maybe, after a bit more waste, abuse and needless suffering caused by greed (that is the bottom line with the people who run the Hoovers of the world, no?), it will.
OMG STOP THE PRESSES I figured out the men's rights thing!
That should sadden you, as it saddens me.
The product looks insane. The price point too. Similar systems today cost $30K.
The problem with DJI is the generic firmware. I am still running old firmware because, in response to all the crashes and complaints, they have dramatically reduced performance, angle of attack, etc. with each release.
For amateurs, it's rather expensive compared to the other DJI options out there. Especially if you already have a GoPro.
If you're a pro, you'll probably opt for something more customizable. You want a camera you can get great footage out of, with a well-understood post-processing and grading workflow, not some proprietary camera.
So I think real pros would spend a bit more and attach an existing camera system they're familiar with (GH4, Canon 5D III, etc.) to a large drone, or if they want a smaller drone they'll go with the LX100 or one of the many Sony offerings (if they don't need 4K).
I'm waiting to see some more real-world feedback as the Inspire 1 gets into the hands of more people. I can see myself seriously considering purchasing one... even without a very specific need at the moment (I consider myself more of a photo/video hobbyist than anything). Still, a part of me feels like there must be some kind of dealbreaker/issue that just hasn't come to light yet. I wonder how the camera compares to other options in the price range - not just in specs, but in actual image quality, dynamic range, color reproduction, etc.
The one thing that would make me more confident in buying one (other than a better reputation for DJI's support) would be some kind of statement (either from DJI or from customers' experience) of compatibility with GoPro.
Also, is anyone else annoyed that they don't stop the background video when you click to view the product video? Even on a new Macbook Pro, I get choppy video with both of them competing for resources. Even if it did play smoothly on my hardware, the background video is very distracting when you're watching the product video in a lightbox. Of course, clicking the YouTube link and watching in another tab worked better, but still..
It would have been nicer if they had a clear "follow me" feature. Even better would be one-touch pre-programmed flight patterns. I can imagine myself sitting on the summit of a hike, at a ski resort, or on a beach, triggering a one-touch pre-programmed cinematic flight path to get that super-cool gimbal-stabilized video that circles around me.
I have an increasingly strong feeling that ML is experiencing a second coming, or probably a first if one looks at the mainstream only.
Nowadays I sense a distinct change in the narrative around it. Just about a year ago, responses akin to "WTF are OCaml and SML? Ugh, such ugly syntax. Why isn't it more dead already? Nobody needs it for web development LOL. Let me write some Node callbacks" were not that rare. Now it is met with a lot more genuine curiosity, and I think that is just fantastic. I am a little cynical about the reasons behind this emerging popularity: "Apple endorsed it, now it makes me look cool," and some misplaced notion that maybe they discovered pattern matching and algebraic datatypes. I don't mind any of that as long as some of the good ideas find their way into the mainstream. This is already happening.
Let it be time for year of the ML.
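To illustrate what people are discovering, here's a rough Python approximation of an ML algebraic datatype plus pattern matching (the expression type and evaluator are my own toy example, not from any particular library):

```python
from collections import namedtuple

# A poor man's version of what OCaml/SML give you natively as
# `type expr = Num of int | Add of expr * expr | Mul of expr * expr`.
Num = namedtuple("Num", "value")
Add = namedtuple("Add", "left right")
Mul = namedtuple("Mul", "left right")

def evaluate(expr):
    # Hand-rolled "pattern match": dispatch on the constructor. An ML
    # compiler would also check exhaustiveness for us at compile time;
    # here a forgotten case only surfaces at runtime.
    if isinstance(expr, Num):
        return expr.value
    if isinstance(expr, Add):
        return evaluate(expr.left) + evaluate(expr.right)
    if isinstance(expr, Mul):
        return evaluate(expr.left) * evaluate(expr.right)
    raise TypeError("non-exhaustive match: %r" % (expr,))

# (1 + 2) * 4
result = evaluate(Mul(Add(Num(1), Num(2)), Num(4)))
```

The verbosity gap between this and the three-line ML equivalent is a decent summary of why the curiosity is justified.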
A blog post elaborating on the difference between C(oncurrent)ML and AliceML would be great to have.
User-agent: *
Disallow: /buy/
Disallow: /checkout/
Supported stores: https://twotap.com/supported-stores/
Does your crawler obey robots.txt rules?
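For what it's worth, honoring rules like the ones under discussion is a few lines with Python's standard library (the URLs here are illustrative):

```python
from urllib.robotparser import RobotFileParser

# Rules mirroring the robots.txt under discussion.
rules = """User-agent: *
Disallow: /buy/
Disallow: /checkout/""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Product pages are fetchable; the purchase flow is off-limits.
allowed_product = rp.can_fetch("*", "http://example.com/products/123")
blocked_buy = rp.can_fetch("*", "http://example.com/buy/123")
```

Blocking /buy/ and /checkout/ is interesting here, since completing a purchase is exactly what their crawler has to do.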
But wouldn't it be more beneficial to get websites to open up an API to you, to reach out and encourage them to do so, or even to offer consulting services to build one?
I know there are a few cart/store offerings out there. It seems to me that they would have an API.
OpenCart Proprietary API: http://opencart-api.com/
Prestashop API: http://doc.prestashop.com/display/PS14/Using+the+REST+webser...
(They also prioritized the feeds that were sent to them directly by retailers above the scraped items feeds - thus prioritizing paid listings, similar to the Google SERPs - so a different business model entirely.)
That being said, it's a very cool concept, and agreed that, given the relatively small number of ecommerce platforms out there, scraping them and then serving them up seems pretty scalable. Interested to see how it goes.
Also, if you are scraping a large retailer you are effectively required to be PCI DSS level 1 compliant, which takes a bit of extra effort.
FYI, Lego showed me the French version of their website, as that's where I live. You seem to only offer shipping in the US, though that's not clear from your website. Still very interesting.
Product URL: http://shop.lego.com/fr-FR/Le-ch%C3%A2teau-fort-70404?fromLi...
If I was using URLs gathered from a Commission Junction datafeed, is this basically a plug and play solution? Or do I need to process those URLs?
Do you have a backend stats dashboard? Or would I still rely on CJ for that data?
I wonder if a serious problem with the world is due to secrets that allow some to have power over others. For example, a company with a patent on a drug that costs $80K has power over those who will die without it. If you can't afford it, have you seriously harmed the company if you violate the patent to manufacture it in a 3rd-world country for people who could never pay for the drug? When is human life more important than a company's right to a patent (or information)?
The Chinese have a serious problem in the form of several hundred million people who need to be moved out of poverty. To help get them there, they seem to be mining a precious resource: information in 1st-world countries. Is this different from (or worse than) 1st-world countries mining precious resources in the 3rd world?
What is the net result? China will use this information to make itself wealthy enough to buy more of our goods? China will acquire the ability to make our goods cheaper than we can make them and force us to work harder?
I'm not saying "stealing" is "right" but it seems to be an important way all 1st world countries became richer. The notion of "right" is suspect given that history is written by the winner.
Due to the classified nature of the DMSP imagery and other data products, the DMSP downlink data is encrypted, and thus the direct readout system is not available to nonmilitary users.
That all comes from this cool-as-hell PDF about how to build a GOES/POES ground receiving station. Anyways the most obvious target here is probably the DMSP products, rather than, say, a Bruckheimer-esque plot to disrupt NOAA satellite imagery during the height of the Atlantic hurricane season.
Almost like VPNs, proxies, TOR, compromised machines, botnets, or similar do not exist in this arena and that a reverse DNS lookup will tell them 1337.mss.gov.cn.
When the US talks about cybersecurity and "cyber wars" in general, it's describing something more akin to a Hollywood movie than anything you see on the ground on either side of the "fight."
I'm extremely sceptical every time they claim Chinese responsibility. I am sceptical not because China wouldn't have the skills or motivation to do so (they do), but because they jump to these conclusions unrealistically quickly, and if their adversary covered their tracks even modestly (e.g. routed through Russia), pointing fingers like that would be quite hard.
Edit: Also, if they just wanted weather data, they should've signed up for http://pressurenet.io ;)
We can all sit back in our comfy chairs and debate whether it really is China or not, whether various networks are secure or not, or how much various agencies can store (and the dangers associated with them storing things). But we can only do that if we have recent and valid information about what's going on. Good public policy decisions depend on an informed electorate. This kind of situation is not the place to be covering up your mistakes.
Is it just me, or is this apparently the reaction every time a US government or military system gets hacked by China?
"Yep, we got hacked again. But we're just going to do our best to minimize the damage and pretend it never happened. No meaningful action will be taken against the perpetrators."