The greater internet population will move faster than your internal team, and your beautiful snowflake will be supplanted by a better OSS option much more quickly than you expect if it is not a core driver of what you are doing. Your time and attention will necessarily shift to the core once the non-core is good enough. Therefore, you are almost always better off contributing to something external for non-core software rather than reinventing internally.
So, if it is core to what you do, then build it yourself and maintain (possibly shared) control of it. If it is core to what you do but is not differentiating, open source it and drive to create a real community around it (so that it is the thing that keeps getting better, and you already use it!). If it is core and differentiating, then you are talking product strategy and there is no pat answer :-)
This seems like the wrong attitude. Software development is about writing and maintaining code. If you need to solve a problem that is not cleanly solved by an existing library, it's your job to write the code, maintain it, and help your co-workers use it.
Remember, the people who wrote those open-source libraries are humans like you. They had enough confidence and experience to feel comfortable releasing their work to the world, but they are not super-geniuses. You can confirm this by looking at the source code of a typical open-source library :).
Personally, I prefer to avoid third-party libraries that impose a lot of structure on my code. My ideal third-party library is one that implements a single, complicated function (or family of functions), like solving a linear system of equations. In these cases using the library is an obvious choice. In cases where the library (framework) forces you to change the architecture of your application, you should think deeply about whether or not it's worth it.
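To make the "single, complicated function" case concrete, here is a toy Python sketch (illustrative only, not production code) of the kind of routine such a library hides behind one call; a real solver also handles ill-conditioning, singular matrices, and sparsity, which is exactly why pulling in the library is the obvious choice:

```python
def solve(a, b):
    """Solve a*x = b by Gaussian elimination with partial pivoting.

    Toy sketch only: a real library also handles conditioning,
    singular matrices, sparsity, and numerical stability.
    """
    n = len(b)
    # Build the augmented matrix [a | b] without mutating the inputs.
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        # Partial pivoting: swap in the row with the largest entry in this column.
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        # Eliminate the entries below the pivot.
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    # Back-substitution.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x
```

With a mature library, all of the above collapses to a single well-tested call, and none of its internals leak into your architecture.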
I guess the overall message is: take ownership of your work.
I remember constantly fighting for the right to develop the important software in-house rather than trying to cram it into a Salesforce app or buy some [x]-as-a-Service that does 60% of what you need and gives you 0% competitive advantage or intellectual property.
There are reasons for both approaches, but you should never be too far in either direction.
"A good plan violently executed now is better than a perfect plan executed next week." - George S. Patton
A good example I experienced personally is data pipeline / job management. Six years ago there wasn't much out there, and at Facebook they developed DataSwarm. Today there are many decent OSS choices, including Airbnb's Airflow, Oozie, and Apache NiFi, which was developed by the NSA.
I think you're conflating NIH with "willing to write code that isn't absolutely perfect."
I think your paralysis is coming from perfectionism.
We all want to build the Eiffel Tower and Mars rovers: things meant to be temporary that end up being elegant and lasting far longer than ever intended.
None of us wants to build things like the Tacoma Narrows Bridge, things with inherent flaws that become disasters.
But in the end, remember, most of what we build are really just kitchen tables and midrange houses. We should take pride in our work and do our due diligence, but in the end, we have to ship code.
Live with the shortcomings until you see a demonstrated need to fix them. Usually by that point, you'll have a better idea of how to get around it as well. You'll avoid doing extra work early.
I'm not saying it's impossible, just that I believe you would be better served to either find individual jobs here, or be willing to relocate. (And just to be clear, I'm not pushing the relocation thing. I love this city and wouldn't want to move either.)
Just the other day HN had an article on them. They are in Atlanta, so not far from you.
They are looking to hire 150 new employees within the next year.
>> "[MailChimp] now employs about 550 people, and by next year it will be close to 700"
Since OpenTable sends email confirmations for restaurant reservations, your team might bring interesting insight to MailChimp, since you may already be a customer of theirs.
I was in Chattanooga from 2013-15 and have nothing but good things to say about their team. Most notably they ran a functional programming meetup that covered pretty advanced topics but was still inclusive to beginners. I'll remember the encouragement I got there for a long time.
If anybody who cares about the Chattanooga tech scene reads this: do your best to keep the OpenTable team intact. A lot of the programming community's enthusiasm is either directly coming from or being encouraged by them.
DISCLAIMER?: I was in Chattanooga a couple weeks ago and they gave me a bunch of useful, free advice on my current project. This isn't really a disclaimer though because IMO it just reflects even better on them.
Interesting. I know there are very good reasons for not relocating to either Los Angeles or San Francisco, but what were the specific reasons (aside from family/homes, if any) you and the team had for opting not to relocate?
Further to this:
2) Were team members offered raises as part of a relocation package?
3) Were team members offered the opportunity to work 100% remotely?
4) Has OpenTable approved this message?
5) Regardless of #4, are there certain conditions others would need to be aware of such as non-competes? I know NCAs are fairly (if not entirely) powerless in California, but I'm not aware of the laws impacting them in Tennessee.
Edit: per user @rfc's jogging of my brain, Stripe has a program for hiring full development teams. @wilwade, this might be worth applying to. I know I posted it in another comment, but it's worthy of top-order visibility: https://stripe.com/blog/bring-your-own-team
I decided not to move because I didn't think the local economy offered a lot of choice for software engineering jobs. When I visited, people remarked, "We've got VaynerMedia, we've got OpenTable, we've got CarbonFive," etc. etc.
This news sucks because I really want to see Chattanooga grow into a tech hub, but it also confirms the suspicions I originally had. :/
If you're trying to get hired as a team, that sounds new, so employers might not have heard of such a thing.
It's the first time I've heard of something so preposterous. Maybe if it was a progressive state like Colorado, I could understand, but Tennessee? I can't stand the hot, humid summers myself. Then you have chiggers and scorpions to worry about. ;)
Here are a couple of desk-worthy ones: Stylin' with CSS by Charles Wyke-Smith is a very quick reference to grab to get some CSS point clarified. The Visual QuickStart HTML guide is good too for quick reference.
> But we are getting closer to hopping aboard, according to Pishevar. The company recently broke ground in Nevada for a new manufacturing plant and conducted testing of its propulsion technology earlier this spring, yielding a pace of 110 mph in 1.7 seconds, a fraction of the proposed 700 mph speeds.
> The company is busy gearing up for its Kitty Hawk moment next year when it begins testing on its passive levitation system.
Note: There are multiple companies working on the hyperloop concept. The two main ones that I know of are Hyperloop One (the subject of the above comments), as well as Hyperloop Transportation (http://hyperlooptransp.com/).
Then again, it's money so it's worth taking seriously and paying for or developing expertise for any project that matters.
ISO 4217 works well for display purposes but doesn't work for all financial applications. Consider, for example, scenarios involving tax, or a product billed based on usage. By rounding off at each processing step, the company could be losing substantial amounts of money. The last two companies I've worked at have used varying mechanisms for tracking fractions of pennies. In some circumstances it makes sense to store everything as integers with a separate column to represent precision, but I probably wouldn't do this unless you have a specific need: it's a lot of noise and mucks up the source code worse than, say, a BigDecimal or Decimal type. At my current job, we store everything with 6 digits of precision because of the nature of our billing, and we reserve the Decimal type exclusively for money.
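As a minimal sketch of the integers-plus-precision idea (Python; the function names and the usage-billing rate are invented for illustration, and the six digits of sub-unit precision match the billing example above):

```python
from decimal import Decimal, ROUND_HALF_EVEN

PRECISION = 6                  # digits of sub-unit precision, as in the billing example
SCALE = 10 ** PRECISION

def to_micros(amount: str) -> int:
    """Parse a decimal string into integer micro-units; floats never touch the money path."""
    return int((Decimal(amount) * SCALE).to_integral_value(rounding=ROUND_HALF_EVEN))

def to_display(micros: int) -> Decimal:
    """Round to whole cents only at the display/settlement boundary."""
    return (Decimal(micros) / SCALE).quantize(Decimal("0.01"), rounding=ROUND_HALF_EVEN)

# Accumulate per-unit charges exactly, then round once at the end:
rate = to_micros("0.000125")   # hypothetical price per API call
total = rate * 1_000_003       # pure integer math, no per-step rounding drift
```

The point is the shape of the pipeline: exact integer arithmetic in the middle, and a single well-defined rounding step at the edge.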
To be honest, if you have an accounting department, you probably want to involve them in these discussions, because rounding decisions can have an impact on your financial statements and, potentially, any audits that may occur in the future. I've worked on two financially focused applications so far, and I would strongly suggest that you build out a very clear, well-defined way to manipulate money for rounding purposes. Building out those processes in a standardized, well-defined way makes for easy unit tests and also helps those who come after you understand the decisions made.
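A sketch of what such a single, well-defined rounding path might look like (Python; the half-up policy here is an assumption purely for illustration, and the real policy should come from your accounting department):

```python
from decimal import Decimal, ROUND_HALF_UP

CENT = Decimal("0.01")

def round_money(amount: Decimal, rounding: str = ROUND_HALF_UP) -> Decimal:
    """The one rounding point for the whole codebase.

    Assumed policy: round half up to whole cents. Centralizing it here
    means changing the policy (or auditing it) touches exactly one place.
    """
    return amount.quantize(CENT, rounding=rounding)

# Unit tests become trivial assertions against the agreed policy:
assert round_money(Decimal("2.675")) == Decimal("2.68")
assert round_money(Decimal("2.5") / 3) == Decimal("0.83")
```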
I did find this explanation of Euro rounding rules: http://www.sysmod.com/eurofaq.htm#ROUNDING
For "laughing at ourselves" and oddities of computer languages, there is "Wat" by Gary Bernhardt: https://www.destroyallsoftware.com/talks/wat
For an opinion on the Sun to Oracle transition, there is "Fork Yeah! The Rise and Development of illumos" by Bryan M. Cantrill, Joyent. His Larry Ellison rant makes me smile: https://youtu.be/-zRN7XLCRhc?t=33m00s
Another fantastic one is Steve Jobs' 2005 commencement address at Stanford:
Growing a Language by Guy Steele.
Bryan Cantrill's 2011(?) lightning talk on ta(1). It's fascinating, but it also shows you just how long-lived software can be.
Randall Munroe's talk on the JoCo cruise. Because it's effing hilarious, and teaches everybody the important art of building a ball pit inside your house.
Finally, an honorable mention to three papers that don't qualify, but which I think you should read anyway.
Reflections on Trusting Trust: This is required reading for... Everybody. It describes a particularly insidious hack, and discusses its ramifications for security.
In the Beginning Was The Command Line: If you want to get into interface design, programming, or ever work with computers, this is required. It's a snapshot of the '90s: a discussion of operating systems, corporations, and society as we know it. But more importantly, it's a crash course in abstractions. Before you can contribute to the infinite stack of turtles we programmers work with, you should probably understand why it's there, and what it is.
Finally, The Lambda Papers. If you've ever wondered how abstractions work, and how they're modeled... They won't really tell you, not totally, but they'll give you something cool to think about, and the start of an answer.
It really set me on a path of re-examining older ideas (and research papers) for applications that are much more contemporary. Absolute stunner of a talk (and the whole '70s gag was really great).
"What would be really sad is if in 40 years we were still writing code in procedures in text files" :(
"Ask HN: What are your favorite videos relevant to entrepreneurs or startups?" -> https://news.ycombinator.com/item?id=7656003
"Ask HN: Favorite talks [video] on software development?" -> https://news.ycombinator.com/item?id=8105732
The Coming Civil War over General Purpose Computing by Cory Doctorow http://boingboing.net/2012/08/23/civilwar.html
Cybersecurity as Realpolitik by Dan Geer https://www.youtube.com/watch?v=nT-TGvYOBpI http://geer.tinho.net/geer.blackhat.6viii14.txt
Discovering Python (David Beazley)
David finds himself in a dark vault, stuck for months sifting through a deliberately obfuscated pile of old code and manuals. All seems lost, but then he finds Python on a vanilla Windows box.
Fork Yeah! The Rise and Development of Illumos (Bryan Cantrill)
History of Illumos, SunOS, Solaris, the horribleness of Oracle
These are not technical, but they are entertaining.
We can argue on some of the points he makes but we can all agree that the demos are very impressive.
1) Alan Kay: Is it really "Complex"? Or did we just make it "Complicated"? https://www.youtube.com/watch?v=ubaX1Smg6pY
Take note that he is not giving the talk using Windows & PowerPoint, or even Linux & OpenOffice. 100% of the software on his laptop is an original product of his group, including the productivity suite, the OS, the compilers, and the languages being compiled.
2) Bret Victor: The Future of Programming https://www.youtube.com/watch?v=IGMiCo2Ntsc
It's a terrific window into the future of web application development.
Carmack's talk about functional programming and Haskell -- https://www.youtube.com/watch?v=1PhArSujR_A
Jack Diederich's "Stop Writing Classes" -- https://www.youtube.com/watch?v=o9pEzgHorH0
All with a good sense of humor.
It's about much more than games. To me, it's about identifying and not doing unnecessary work.
The second half of this video is a Q&A session, which I would skip.
I think it is easy for us to discuss the impact of big data and quickly get into the weeds, but in this talk Norvig does an especially great job of making you truly appreciate the seismic impact that the availability of massive quantities of data can have on the way you think about problems. This is one of the first things I ever saw of him, and I've been in love ever since.
I love everything about this talk. It walks you through building a lexer from scratch in a simple and elegant way, through a very interesting use of coroutines. I appreciate the bits of humor in the talk as well.
"Writing A Thumb Drive From Scratch" by Travis Goodspeed - https://www.youtube.com/watch?v=D8Im0_KUEf8&nohtml5=False
Excellent talk on the hardware side of security, goes into some really cool theoretical hard disk defense stuff, incredibly insightful and introduces a hardware security tech toy so fun you'll want to go out and order it the moment you're done watching. The speaker is entertaining as all heck to boot.
"Programming and Scaling" by Alan Kay - https://www.youtube.com/watch?v=YyIQKBzIuBY&nohtml5=False
Interesting talk on the theoretical limits of code size and engineering versus tinkering. Also talks a lot about Alan Kay's philosophy of computer science which analogizes systems to biological systems, which are the systems with the largest proven scaling on the planet.
"The Mother Of All Demos" by Douglas Engelbart - https://archive.org/details/XD300-23_68HighlightsAResearchCn...
This talk is so prescient you won't believe your eyes. Given in 1968, Douglas demonstrates just about every major computing concept in use on a modern machine today, along with some that are still experimental or unevenly distributed, such as smooth remote desktop and collaborative editing.
How I met your girlfriend: https://www.youtube.com/watch?v=O5xRRF5GfQs&t=66s
Jake Appelbaum's Digital Anti-Repression Workshop is de rigueur listening too:
I'd mention Bret Victor's work before (maybe Drawing Dynamic Visualizations?), but Bret cheats by writing a lot of amazing code for each of his talks, and most of the awesome comes from the code, not his (great nonetheless) ability as a speaker.
Then you have John Carmack's QuakeCon keynotes, which are just hours and hours of him talking about things that interest him in random order, and it still beats most well prepared talks because of how good he is at what he does. HN will probably like best the one where he talks about his experiments in VR, a bit before he joined Oculus (stuff like when he tried shining a laser into his eyes to project an image, against the recommendations of... well, everyone): https://www.youtube.com/watch?v=wt-iVFxgFWk
Something more recent: Martin Fowler's great introduction to NoSQL: https://youtu.be/qI_g07C_Q5I Not so technical, this is a great overview of the reasons why (and when) NoSQL is valuable. He crams a lot into a short speech, so it's one of the rare videos I've required students in my database classes to watch.
Now, really getting away from the technical, I have to recommend watching the IDEO shopping cart video: https://youtu.be/taJOV-YCieI This is the classic introduction of Design Thinking to the world, in 1999. If you're using the Lean Startup or an Agile method, but have never heard of IDEO's shopping cart, you may be able to get along fine at work, but you should be kind of embarrassed, like a physicist who's never read Newton.
Detailed discussion of how to get the most out of your memory cache and memory bandwidth, focusing on games development. It's full of examples of how understanding both the problem and the hardware, and working in a straightforward way, can give you huge performance gains over using poorly suited abstractions. It shows how low level thinking is still important even with modern compilers. I recommend people interested in performance optimization watch it.
This was the first time I watched pg give a talk. It was the talk that brought about the biggest change in the way I think about the world and my ambitions. And the talk was only the beginning: reading more about pg, I came across his essays and then HN.
It's what I direct non-technical people to when they ask what the big deal about internet privacy is.
The title says it all. It's really a summary of several software systems with good ideas abounding. I believe all the software is from the '80s or prior.
Edit: I also forgot to mention some psychology and math.
I think this can really really change how we look at everyday programming tasks everywhere from the type of tooling we choose to how we approach problems.
I love his talks for a few reasons:
1. He's anti-hype. 2. He's controversial. 3. He's right.
Related slides: http://static.googleusercontent.com/media/research.google.co...
I especially like the part in the middle where he tells the story of how an awful GNOME applet was killing a Sun Ray server, and how he tracked down the culprit with DTrace.
Not a high-tech talk, or particularly technically complex, but it shows a common blind spot in a way that is clear, enlightening, and frightening.
Sussman goes over some interesting ideas on the provenance of calculations and asserts that "exact" computation is possibly not worth the cost.
"What the heck is the event loop anyway?" by Philip Roberts
How To Design A Good API and Why it Matters
The Principles of Clean Architecture
The State of the Art in Microservices by Adrian Cockcroft
"The Mess We're In" by Joe Armstrong
Great talk about BBC micro and much more
3) Matt Adereth - Clojure/typing
History of keyboards, and a custom keyboard designed with Clojure
I like the 3 for their content and how each speaker presented the background and their project/hack/ideas.
It's mostly about the history of HCI up to that point.
Aside from the comedic aspect (which makes the talk incredible), Mickens is a genuinely brilliant thinker and has a marvelous way with words.
Bret Victor - Inventing on Principle
Philip Roberts: What the heck is the event loop anyway? | JSConf EU 2014
InfoSec talk. Best lines from the talk:
"Basic lessons are not learned such as know thy network"
"You have to learn your network, you have to have skin in the game"
"Defense is hard, breaking stuff is easy"
"If you serve the gods of compliance you will fail"
"Compliance is not security"
"Perfect solution fallacy"
"People are falling over themselves not to change, shooting great ideas down."
"Perfect attacker fallacy, they don't exist, they are a myth!"
"Attackers are not that good because they don't need to be that good."
Speaker is Eric Conrad
It's fairly high level, but he really burrows into computer history and it's simply fascinating to watch, helped by the fact that the speaker is extremely passionate about what he does: https://www.youtube.com/watch?v=gB1vrRFJI1Q&list=PLbBZM9aUMs...
Watching that talk brought me over to the "a picture or a few words per slide" style of presentation, rather than the "wall of bullet points" style. It also helped me move from "stop talking, change slides, start talking again", to smooth transitions while talking.
...very inspiring if you're bored with the way websites have been looking for the past few years.
 https://vimeo.com/36579366 https://www.youtube.com/watch?v=cN_DpYBzKso
It's well worth watching if you are interested in vms at all.
The simple and followable progression to more and more complex ideas blows my mind every time.
> Visualizing Algorithms: A look at the use of visualization and animation to understand, explain and debug algorithms.
Anything at all by Richard Feynman: https://www.google.co.uk/search?q=%22richard+feynman%22&tbm=...
I like how this talk cuts through a lot of the BS in security. One of his points is that the US and other rich Western countries have a lot more to lose from a possible "cyber war" than our potential adversaries do.
Another key point is that we'll never make much progress unless we can somehow start building better systems in the first place, with fewer vulnerabilities for an adversary to exploit.
I think the second point has become a lot more widely accepted in recent years since McGraw started giving this talk. Unfortunately it sounds like a lot of government folks still haven't got the memo on point #1.
The best practical talk is of course this:
https://www.youtube.com/watch?v=asLUTiJJqdE - Robert "Uncle Bob" Martin, Clean Architecture and Design
Great overview of value types, performance and how hardware that runs things still matters.
Scott Meyers' talks are fun to watch too.
A fascinating tale about using Python during the discovery phase of a trial. Very fun watch. Anything by David Beazley is great!
I like it because it is the intersection of so many things. He starts slow and is clearly intimidated by the audience. The audience is obviously super skeptical that the clown from That '70s Show has any useful information they could learn from. He finds his footing with a great motivational story (albeit laden with a few cliches) about a forgotten entrepreneur and how he built some lasting value.
For me, this is a great talk. The story is extremely motivational and has some interesting bits of history and entrepreneurial genius, but the entire experience is extremely educational: about bias, drive, and success.
I liked it for what it wasn't.
The talk is about how Damien quit his job to hack on open source software. It shows his struggle and doubt while embarking on the project, until he finally invented CouchDB. It's a passionate and human account of the process of creating something significant. I recommend every hacker watch it.
D10 conference - Steve Jobs and Bill Gates - https://www.youtube.com/watch?v=Sw8x7ASpRIY
TED talk - Bill Gates (Innovating to Zero) - https://www.youtube.com/watch?v=JaF-fq2Zn7I
How Google backs up the internet.
At the time it changed how I thought about backups/reliability.
The rest of his channel is full of his talks https://vimeo.com/channels/761265
"The Science of Insecurity" by Meredith L. Patterson and Sergey Gordeychik (2011)
Warning: speaker likes to use profanity (which I enjoy :) but possibly NSFW if you're not on headphones
One of the best talks about code reviews and similar things
Guy Steele's How to Think about Parallel Programming: Not! at Strange Loop 2011: https://www.infoq.com/presentations/Thinking-Parallel-Progra...
Not a technical deepdive, but entertaining.
Explains a lot of recent mass-market innovations that keep the semiconductor manufacturing industry rolling, and goes into detail about the many tricks used to ensure scaling down to the 22nm node.
Any of Jason Scott's talks given at various hacker cons are usually historically informative and always a lot of laughs (but they're decidedly not "technical").
It completely changed the way I approach front-end development (Not that talk in particular though. I saw an earlier, similar talk on Youtube but this one has much higher quality).
"LoneStarRuby 2015 - My Dog Taught Me to Code by Dave Thomas" - https://www.youtube.com/watch?v=yCBUsd52a3s
"GOTO 2015 Agile is Dead Pragmatic Dave Thomas" - https://www.youtube.com/watch?v=a-BOSpxYJ9M
It's worth joining a global-scale tech company (AWS, Google, Azure, Facebook) just to have your mind blown by some of the internal materials.
He was a co-speaker at TEDxGlasgow with me and I thought his talk was brilliant. Cyber-crime is a really interesting area.
Humour, serious technical insight and a good reminder of why being a generalist is an advantage.
A deeply thoughtful discussion of the impact of metaphors on how we think about software development.
Skip to 0:40 if you don't want to hear the MC.
He is kinda awesome in Herzog's recent 'Lo and Behold' too.
If you are in for something out of the ordinary.
(Plan to organize and add more categories.)
For those who like computer graphics (or want to learn), this is a gold piece.
It completely changed my perspective on how design shapes our world.
This guy is just too funny.
edit: +Ryan Dahl
So many lessons in a short, beautiful piece.
Not sure if it's my favorite. And the subject is more technology than "tech". But the talk that keeps haunting me is Michael Dearing's lecture from the Reid Hoffman "Blitzscaling" class at Stanford:
Heroes of Capitalism From Beyond The Grave
Dearing draws upon an obscure letter by Daniel McCallum, superintendent of the New York and Erie Railroad, written to his bosses in the 1850s. In the report, McCallum bemoans the stress and frustration of operating a railroad system spanning thousands of miles. All of the joy and magic he used to revel in whilst running a fifty-mile stretch back in his home town has long since dissipated. Furthermore, the unit cost per mile seems to be exploding rather counter-intuitively!
Dearing goes on to elucidate the absolute necessity of the railroads ("the thing to know about the railroads is: they were startups once") themselves. As guarantors of civilization and progress. Beacons bringing light and reason to the dark swamps of ignorance and inhumanity. And not just in the physical transport of goods, people and ideas across the continent. But as the wealth created from that creative destruction remains the best cure for all of our other inimical maladies: poverty, injustice, disease and stagnation.
So, no pressure. But civilization depends upon you!
Links to References in the Talk:
Estimates of World GDP: From One Million BC to the Present
The Process of Creative Destruction by Joseph Schumpeter
The Visible Hand: The Managerial Revolution in American Business by Alfred D. Chandler, Jr.
Report of D. C. McCallum to the stockholders of the New York and Erie Railroad
Things As They Are In America by William Chambers
It's incredibly fun, engaging, and satisfies both of my drives, for creativity and for technical stuff.
When you're a digital artist (generalist), in one day you can write a Python script, experiment with rendering and shaders, draw a sketch, animate a character, whatever you want.
Read "From word to sentence. A computational algebraic approach to grammar" by J. Lambek. Your university library should have a copy; if not, press the issue with your librarian.
Start though with Smullyan's "To mock a mockingbird".
These suggestions reflect my taste. Art is a language, so understand how language is studied and use it to approach art.
If you're serious about the art part, make sure you make art, or at least dissect art you like using the formal tools you're introduced to.
Zaha Hadid's successor: my blueprint for the future
Two interests meeting is where great things happen.
> If you want an average successful life, it doesn't take much planning. Just stay out of trouble, go to school, and apply for jobs you might like. But if you want something extraordinary, you have two paths:
> Become the best at one specific thing.
> Become very good (top 25%) at two or more things.
In your case, maybe you learn computer science and you apply it to some problem in art or the humanities.
Basically, people in office environments get used to "yapping" for a whole lot of reasons not related to any actual need to exchange information (to vent and shoot the shit, basically) and to do so in rollicking, loud "party" voices without any regard to the downsides. Meanwhile, all it takes is a bit of introspection to realize that about 80% of this noise is just that. And a little bit of discipline to institute a culture of (relative) quiet and solitude -- even in an open plan office.
What, you say -- no time for introspection? No interest in discipline? No way to even bring up the idea of "library voices" in your culture?
Then your problems are much bigger than what can be helped by any advanced technology.
If I don't want the distraction of music, but still need to wipe out the sound of people talking nearby, I fire up https://rain.simplynoise.com/.
A set of those will probably do the job of lessening voices to something acceptable--they do, to the point that my very, very distractible self can work in an open office, whereas older NC headphones did not--but they won't completely remove them.
If you really want that, I'd suggest a set of 34dB+ reduction earplugs. If that doesn't work, put them underneath NC over-ear headphones. If that doesn't work, play white noise on the headphones. I'll be surprised if you hear anything external after that.
http://mynoise.net was great, and I only had a $10 pair of in-ear earbuds. I haven't used the site since changing jobs, but I remember liking Rain On A Tent, Wooden Chimes, and mumbly-voice environments like Laundromat and Airport Terminal.
Not doing it since I changed jobs and work remote.
Just from my anecdotal experience, post votes vary wildly over several days, as do the comments on them.
A popular post can have ten votes and fifty comments.
Or two hundred votes with no comments.
About horrendous code: I haven't changed companies much, so I wonder how many real-world products that are some years old have code bases that aren't some sort of hard-to-figure-out patchy clutter, if not a downright trainwreck.
As for employers looking at my code, it isn't as straightforward. The code I upload to public repos like GitHub is meant to be seen, used, extended, and/or appreciated by peers, so letting potential employers look at it isn't a problem. However, when I mention that I have some personal projects brewing, which to me are potential business ideas, I would not let them in on too much detail, much less look at the source code. Even if they agree to sign an NDA I wouldn't let them, as I don't trust big corporations.
My question to you: what is a good code base? What would be the features of such a thing? How would you characterize it and quantify it? What are some of the things you consider "horrendous"? What tolerance do you have for such features?
I think if you can answer the questions above, you can find the right questions to ask. Can you see my code before I hire you? No.
Absolutely! I always ask to see the code, and I have turned down jobs before because I could see the system was a total train wreck.
I don't expect perfection, and indeed perfection doesn't exist. I also believe that done is better than perfect.
But I have seen some royal clusterfks in my time, and I think it's perfectly reasonable to ask to see the code.
For some perspective -- the company I work for was sold for a tidy sum, and part of the negotiation was that the buyers got statistics about the codebase (number of lines of code, which languages certain sections were in). They didn't get to see any of the code before the sale closed. And, a prospective buyer has a lot more to lose & a lot more influence than a prospective employee.
You might care about how it looks, but you should care more about how it's changing. Tell them you're more than willing to do it in the office on a company laptop for an hour or two; tell them you don't mind having someone sit with you to answer/field questions.
This is right up there with "can I see a cap table" for questions to ask before taking an offer.
Seems like what's good for the goose and all...
Add Mongo or another NoSQL solution as their main backend, and the interview would come to a hasty end. :)
And let's be honest, unless the potential employer is working on Top Secret defense software or is some Silicon Valley, PhD-heavy AI startup, their typical CRUD app does not warrant an NDA or anything else, at least for a cursory glance at the main code base, build system, etc., especially if you're not currently employed with their direct competitor(s).
I am an international tax lawyer and familiar with the tax rules facing micromultinationals -- like your budding company.
The real problem you will face is cost. The tax rules were meant to tax Google-sized enterprises but they so happen to apply to you, too. Unfortunately you don't have a Google-sized budget. :-/
Most of all make sure you find someone you have no issue speaking/communicating with on a personal and social level. It will probably make the process much easier!
Here's a link on having a custom chip made: http://electronics.stackexchange.com/questions/7042/how-much...
Getting a chip done cheap: http://www.planetanalog.com/author.asp?section_id=526&doc_id...
A computer built with transistors: https://hackaday.io/project/665-4-bit-computer-built-from-di...
An IC built from scratch: http://hackaday.com/2010/03/10/jeri-makes-integrated-circuit...
Or, with unlimited money, buy one of these: https://en.wikipedia.org/wiki/List_of_integrated_circuit_man...
You can get much of the knowledge you need from books, CPU specs, academic papers, open-source CPUs in industry, and so on. The standard-cell model is the easiest, with the highest performers being full-custom designs. There are plenty of successful designs in the former with good performance, though. So, that's your best bet.
Here's a GPL one used in embedded (esp space) applications for you to start with that has good performance and extremely-high configurability:
Recent one for open ISA:
Lots of interesting things might be done with asynchronous logic that's not as explored and patented compared to synchronous techniques. The cutting-edge stuff is using it more and more. Whole chips have been done that way that were easier to fab right the first time plus with energy and performance benefits.
Are you starting from beach sand, or do you have a fab and raw materials? There was an interesting video that showed what you had to do to make a pencil "from scratch".
You could build a CPU with a copy of the book and a big box of transistors, wires, breadboards etc.
I'm still working through the book, but I've learned a ton.
Implement your own hardware; make a compiler for your own CPU, memory controller, and so on...
VHDL/Verilog and a dev kit go a long way. I would vouch for the DE0-Nano from Altera.
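In the spirit of building a CPU from a big box of transistors, the whole exercise bottoms out in composing one universal gate. A minimal sketch (names and structure are my own illustration, not from any particular book): a half adder built purely from a NAND primitive in Python.

```python
# Everything below is composed from a single NAND primitive,
# the same way you'd wire it from discrete transistors.
def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

def not_(a):
    # NOT is NAND with both inputs tied together
    return nand(a, a)

def and_(a, b):
    # AND is NOT(NAND)
    return not_(nand(a, b))

def xor(a, b):
    # the classic four-NAND XOR
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

def half_adder(a, b):
    """Return (sum, carry) for two one-bit inputs."""
    return xor(a, b), and_(a, b)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, half_adder(a, b))
```

Chain a second layer of the same gates and you have a full adder; repeat, and you're on the road to an ALU.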
I discovered it a couple of months ago, and I'm so happy I did; now I visit it almost every morning. It has so many brilliant digital artists and mind-blowing artworks.
Even if you're not that into "art", it's so fun to just watch people doing something with incredible level of expertise.
Just so fucking good. I can't recommend it enough.
A lot of people are against paying for news - they feel it is free elsewhere or they can google the title of the article.
For me I noticed that reading the NYTimes and Wall Street Journal provides me curated news with the in-depth high quality coverage I like and leaves me feeling informed.
I avoid reddit as I find myself spending a lot of time on it, similar to facebook.
Newsletters like http://allthesmallthings.co/ are great and have also given me some info on design, which I find interesting.
It's a travel guide for the obscure, treated in a way that is respectful rather than as a guide to the weird and spooky.
For more theoretical material, I'm planning to read Vector Calculus, Linear Algebra, and Differential Forms: A Unified Approach.
Other books assume you already know what they are talking about.
My name is Dhruv (email@example.com) and I lead the Self-Driving Car Nanodegree program here at Udacity. I'm happy to answer questions directly on this note (email me!). I think this program will be very different from our existing Nanodegrees as we have real industry partners who are extremely invested in making sure the program is high quality from the get-go. This is because they want to be able to increase their hiring pipelines as soon as possible. We've been determining what projects/content to create by asking our hiring partners (and Sebastian Thrun) what they would want to see in a portfolio of someone they hire. We then work back from there and iterate to create the content. So far, folks at Mercedes-Benz, Otto, NVIDIA, and a few other auto companies have gone over our syllabus, given us feedback, and helped us iterate. I'm trying my hardest to make this program something I would myself take to get into the field or learn more about it (I'm an engineer by trade who moved into this role!).
There is a lot of PR speak clouding my judgement.
Computer vision (detect lane lines in a variety of conditions, including changing road surfaces, curved roads, and variable lighting); neural networks (classify traffic signs, drive a car in a simulator); and tracking vehicles in camera images using image classifiers such as SVMs, decision trees, HOG, and DNNs.
More projects will cover these topics: Sensor Fusion, Localization, Control, Path Planning, Systems and an Elective.
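To make the lane-detection topic above concrete: the course presumably uses OpenCV, but the first step (finding strong intensity edges) can be sketched with nothing but NumPy. The function name, threshold, and synthetic "road" image here are all my own illustration.

```python
import numpy as np

def edge_columns(image: np.ndarray, thresh: float = 0.5) -> np.ndarray:
    """Return column indices where the horizontal intensity gradient
    exceeds `thresh` -- the crude first step of lane-line finding."""
    grad = np.abs(np.diff(image, axis=1))   # horizontal gradient
    mask = (grad > thresh).any(axis=0)      # some row has an edge here
    return np.flatnonzero(mask)

# Synthetic "road": dark frame with one bright lane stripe in columns 4-6.
road = np.zeros((5, 10))
road[:, 4:7] = 1.0
print(edge_columns(road))   # boundaries of the stripe, columns 3 and 6
```

A real pipeline adds perspective transforms, color-space thresholds, and polynomial fits on top of exactly this kind of gradient mask.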
If you know of comparable resources to gain experience with all this material, please share; I'm not aware of any.
Official source: http://medium.com/self-driving-cars/term-1-in-depth-on-udaci...
The part that I like the most is the parallel effort to build an open source self driving car. My motivation is to build one in my small town in Colombia - with donkeys sharing the road it's going to be very interesting.
I did read a comment from someone who took it here:
They're very light. Since they don't carry any weight, you'd at least hope to learn something. You'll get as much, if not more, from the free Georgia Tech machine learning course with Tom Mitchell's book than from the nanodegree. As a follow-up, I took Thrun's robotic driving course and it suffers from being a purely software course. There are optional hardware projects but no imparted hardware instruction. So I'd be especially leery of an automated-driving nanodegree online.
As a more esoteric bonus I'm evaluating this kind of learning as a potential replacement for higher education for any children I might have, given ever-increasing costs for brick-and-mortar universities.
In the long run my goal is to switch to this area, either as an employee or building a startup.
Maybe nanodegrees are all paid? If so, my bad; I hope someone else will get my seat (it seems so).
Wishing all the students fun.
If you're going to create a self-driving car, please, please, please get at least a millidegree (~2 hours).
 ~120 credit-hours per B.S. degree * ~16 classroom hours per credit-hour * 3600 seconds per hour * 10^-9 for the nano- prefix
I'm genuinely curious - what would this add to most discussions?
* Note: I no longer work on HN or influence its direction.
What company do you think of first when you hear "mobile"? Apple.
What company do you think of first when you hear "AI"? Google.
Not qualified enough to comment on how good this bet is, but AI/deep-learning _is Google's play_ for the coming decade. Here is a recent article that elaborates on the specifics: http://fortune.com/ai-artificial-intelligence-deep-machine-l...
Google isn't alone in this, AI is Nvidia's biggest bet too: http://fortune.com/2016/03/22/artificial-intelligence-nvidia...
Most people seem to think that the main interface with this so-called AI will be text/speech. This is incorrect. Interaction with AI will be spatial (think HoloLens), graph-oriented (think Semantic Web), and binary (think Tinder).
The big idea, which most people don't seem to appreciate yet, is that all apps are basically the same. Tweaking the layout and renaming "share" to "retweet" doesn't change the semantics of the underlying interaction. The next big thing will be some kind of app/service/protocol à la WeChat that lets users accomplish 80% of what all other apps combined offered. Yes, it will kill branding; yes, there will be a learning curve. I suspect that those who succeed will make kids their target demographic.
There are some communities on the net who are working on homebrew cybernetics. I've seen experimental designs for transdermal data/electric ports, health monitoring implants, and implanted secure elements, &c. CyborgNest recently started taking pre-orders for a "north sense" implant. Definitely my favorite fringe.
I think it could work, and best if paired with an application project. Not everything is scalable and a long, hard march may be required to create some kinds of systems, especially with respect to Agent based computing.
Personally I think Agent based computation is the next big thing after Social, but it is hard to tell what form it may take. I am far from convinced that Apple or Google have the chops to do what is necessary, because it would surely require them to violate their own interests and probably legal obligations.
Talking of that sort of thing, you should look into Urbit.
Urbit is a network computer that could hypothetically allow Agent based computation to actually work.
Urbit is definitely fringe science stuff, I'm sure Walter would approve.
Part of the reason I think this area is hot is that it's harder to do than apps. Every "app" I can think of has already been done, and in fact there's probably a dozen versions of it languishing in app stores. Manufacturing is harder, and is often geographically local, so there are fewer competitors and more niches.
Another leading indicator for me is the general rebellion of young people against the fact that "shop class" has been taken out of schools. Lots of kids are trying to find ways to be "makers". In my state, when I was in school the "dummies" were pushed toward taking vocational classes in plumbing, welding, etc. Now (20+ years later) the vocational classes are the most competitive, prestigious classes that students fight to get admitted to.
If I had to guess, the new manufacturing is going to be smaller scale, extremely high quality, locally sourced materials, with some kind of digital "twist" on the business models.
It's at Lost and Found on Telegraph on the 1st Tuesday of every month.
I could be totally wrong here, but it's worth looking into. As others have said, taxes are the main motivation: with a C-Corp investing, you'll be double-taxed on any capital gains. The corp will pay capital-gains taxes when it sells shares, and then you or the corp will need to pay more taxes in the form of dividends, payroll, or further capital gains when you take the money out of the corp.
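The two layers of tax compound, which is easy to underestimate. A quick illustrative sketch (the rates here are made-up round numbers, not tax advice; real rates depend on jurisdiction and bracket):

```python
# Illustrative only -- both rates are assumptions, not tax advice.
gain = 100_000          # capital gain realized inside the C-Corp
corp_rate = 0.21        # assumed corporate tax rate
dividend_rate = 0.20    # assumed dividend rate when the owner takes it out

after_corp = gain * (1 - corp_rate)              # first layer: corp pays
after_owner = after_corp * (1 - dividend_rate)   # second layer: owner pays

print(f"kept after both layers: ${after_owner:,.0f}")   # $63,200
print(f"effective rate: {1 - after_owner / gain:.1%}")  # 36.8%
```

Two "reasonable" rates in sequence land well above either rate alone, which is the whole argument against holding personal investments inside the corp.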
Probably, though books/online text are generally better. (Or, really, no: you can only learn it by mostly doing web programming, but you need some external resource, and either text or video, among other things, or a mix, can serve as that resource.)
And the way to prove if it is true or not is by "reverse induction." Remove one-by-one all the supposed "fruits" (wealth, glory, honor, respect, celebrity, etc) of your labors. And determine if it is still really important to you. Or to Us.
Alan Watts ~ Egocentricity In Humanity
My point is that the range of possibilities is immense. Nobody could possibly answer your question, especially without knowing more about you. Here's some general advice from a stranger on the Internet:
1) Self-knowledge is most important, IMHO, to career choice. Learn what you care about, what inspires you, what you like and don't like, what your strengths and weaknesses are, etc. Most people have a poor understanding of these things.
2) Based on that, embark on a career in something you love and which suits you well. It will take plenty of time and effort to get traction and build 'career capital'; it will seem impossible to get your foot in the door, but be patient and persistent and use the time to acquire skills and contacts - you will be very busy later. You might as well invest that effort in something you love. That way, later, when a contact calls you with a business idea or you come across some great opportunity, it will be to do what you love instead of something you merely endure.
3) No matter what you do, some people will tell you it's wrong. You can't please everyone, and they really don't know you the way you do (see #1). Ignore most advice (especially from strangers on the Internet).
 A quick search of formal education levels didn't find anything but I did find that only 40% of middle-class students who start college get a degree, so the number with degrees is very likely lower. http://money.cnn.com/2015/03/25/news/economy/middle-class-ki...
 https://80000hours.org/career-guide/career-capital/ - this whole website seems pretty good.
edit: although you are dependent on customers. But the recent /r/dataisbeautiful post said that if you're employed, you're not middle class. Middle class is more like a company owner.
I shall find the link.
Saw this somewhere on Reddit.
Factory work sucks, but I see people who put in the years and get to $50K. Still, fuck factory work, unless you do something cool. In my case it was cutting meat (doing the same thing 6,300 times in a day).
Some janitor left millions to charity. He was frugal and was good at investing in stocks.
I have read that the 5% of people who have financial goals outperform the 95% without them -- combined.
I have also read that people who follow their interests typically have more career success. People who like what they are doing tend to do it well, for a variety of reasons.
I suggest you figure out what you enjoy doing and try to find a job that is a good fit for that to the best of your ability. Also, learn to budget, stay healthy, use birth control consistently. Health issues and unplanned children seriously derail personal budgets.
Plumber, electrician, mechanic, more or less specialized repairman, construction, plant operator, retail. Many of these can pay well as long as you gain experience and are willing to put in the hours. On the less "sweaty" side, many marketing and creative jobs (photography, videomaking, design, etc).
Since you'll have to spend a lifetime at it, what is it that you actually enjoy doing?
If the customers you'll be handling are up in AWS/Google, that's good, but if they're on-premise, you'll really have to know your stuff about the following:
- Feature Lists - translating what the product does into a feature list that describes perfectly what the customers are looking for. Makes it easier for them to explain to the board so they can get budget and approach finance and procurement.
- Active Directory or Auth0 integrations
- Network - from DNS/dnsmasq to iptables
- VA stuff - meeting generic hardening requirements and vulnerability-assessment scans, Java keystores, SSL certificates/ciphers, and a ton of Linux/Unix or PowerShell.
- Linux/Unix - NTP, mail servers if need be, proxying using nginx/apache, user access, etc.
- Architecture - explaining the entire product in high-level and how components integrate with a customers current infrastructure
- Benchmarking - customers will ask for apples-to-apples comparisons with other vendors, and they'll hold you to your numbers.
- Appliance/infrastructure sizing - you'll probably take part in this. Certain customers will definitely ask about data growth and network growth as well; they need this to line up corporate budget.
and a lot more
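One small, scriptable slice of the hardening/SSL work in the list above: before a VA scan flags you, you can check what your client context would actually negotiate. A sketch using Python's standard-library ssl module (the TLS 1.2 floor is just an example baseline, not any specific customer's requirement):

```python
import ssl

# Build a client context hardened to TLS 1.2+ -- the kind of floor
# a hardening checklist or VA scan typically demands.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

# Inspect which cipher suites this context would actually offer.
for cipher in ctx.get_ciphers()[:5]:
    print(cipher["name"], cipher["protocol"])
```

The same context can then be used to wrap a socket against the customer's appliance, so you can answer "which ciphers do you support?" with output instead of guesses.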
So being a web developer is just a piece of the pie. Full stack is a must.