Stupid Question 129 and 130: Should you and how do you introduce new technologies in projects, and who pays?
I’m wrapping up the very last school project. Some of you might have missed out on this information, but I am still a student (my first year of programming: July 11 2011 - July 12 2012). Although I am not attending any classes (working instead), I still have to do all the required exams. And since I have to do this final exam, the guy I am programming with and I decided to choose things way out of our comfort zone, only new stuff (new for us, or fairly new). So we (me, a .NET developer, and Per Anders, a SharePoint developer) decided on an all-JavaScript app: node.js + express.js, using Stylus and Jade, CouchDB, KendoUI, Knockout, and the list goes on. It has been fun, but painful. Needless to say, neither one of us has had any social life during this period. Also, switching from one way of working to a completely different way has been quite an experience, and for me personally the experience has been extremely valuable and I’m loving it. But it’s a school project. You get to try new things and mess about.
But it’s different at work; I don’t mess about with those projects.
Since you are supposed to program deliberately, introducing technology that you aren’t familiar with would break that rule, as you add risk to a project somebody else is paying for.
**The answer seems to be to get familiar with the technology, but who pays for that investment?** And I call it an investment because it might not necessarily yield positive results. You might find out (which has happened in a few projects I’ve been involved in) that the technology was pretty crap. And even if awesome developers you trust recommended it, there are still no guarantees. Who pays? Your boss? The customer? A split?
When I asked Stupid Question 65: Can we expect a workplace to let us set aside time for learning?, the majority of responses I got were that in many companies no time is set aside for learning, so the only way a developer would be able to introduce new technologies, and therefore maybe better solutions, was to learn after work, take the risk during a project by persuading the person making those decisions, or not ask at all.
I can’t answer this question; it’s a hard one. We have such beautiful minds that if a project were to fall apart due to new technology being introduced that turned out to be a bad choice, most of us would insist we weren’t wrong (it’s called cognitive dissonance). But sometimes we are. Who gets to make the decision, who pays? And how do we best approach this? I hope to get some good advice on this one; it’s a very important question for me, and for many developers, I would think.
Should you introduce new technologies in a project?
How is this best done/approached?
And who picks up the bill?
Comments
Staying current on new technology is just part and parcel of the job. I tend to research and test out new ideas on my own time, and only present them to my team when I know they will work. Once we have established that a product, framework, or idea is viable for the project, my company doesn't mind bringing everyone up to speed on it on company time. Sure, this means I devote a lot of free time to something my coworkers will eventually get paid to do, but when bonuses and raises come around, my efforts are recognized, so it certainly doesn't feel like wasted effort.
My workplace is a production facility, not heavily into application development, but they have a decent development team. My colleagues were used to the "Page Controller Pattern" for development, where all display logic, business logic, and data access were packed together in one place (layer), and of course no unit tests! (They still maintain a classic ASP application.) When I was entrusted to work on a major piece of software built around abstract ideas, the project manager advised sticking with a "simple architecture", since I had only worked with the n-tier model alone on some earlier project. But my colleagues were happy to embrace the n-tier model in this project. Later we introduced unit tests. We started the project from abstract ideas, so we had to refactor the code many times. The layered design made it easy to refactor without breaking other layers, and the unit tests ensured that the refactored code didn't break anything. My point is that new technologies introduced with good mentoring can save the team from technical debt and add value to projects in the long run.
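The comment above is a nice illustration of why layering plus unit tests makes refactoring less scary. Here is a minimal sketch of that idea in plain node.js (to stay with the all-JavaScript theme of the post); the names `OrderService` and `InMemoryOrderRepository` are made up for illustration, and the test uses node's built-in `assert` module:

```javascript
// A tiny n-tier-ish sketch: data access and business logic kept apart.

// Data access layer: only knows how to fetch orders.
function InMemoryOrderRepository(orders) {
  this.orders = orders || [];
}
InMemoryOrderRepository.prototype.findByCustomer = function (customerId) {
  return this.orders.filter(function (o) { return o.customerId === customerId; });
};

// Business logic layer: depends on the repository's interface, not its
// implementation, so refactoring the data layer doesn't touch this code.
function OrderService(repository) {
  this.repository = repository;
}
OrderService.prototype.totalForCustomer = function (customerId) {
  return this.repository.findByCustomer(customerId)
    .reduce(function (sum, o) { return sum + o.amount; }, 0);
};

// A unit test with a fake repository; it keeps passing as long as each
// layer keeps its contract, no matter how the internals are refactored.
var assert = require('assert');
var repo = new InMemoryOrderRepository([
  { customerId: 1, amount: 10 },
  { customerId: 1, amount: 15 },
  { customerId: 2, amount: 99 }
]);
var service = new OrderService(repo);
assert.equal(service.totalForCustomer(1), 25);
console.log('tests passed');
```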
You just made me google the Page Controller Pattern - I think it will make it into the Stupid Question series :D (I've used it - not using it at all now - I just wasn't aware of the name). What a great example, by the way - and answer.
Here in Brazil, specifically in the city of Sao Paulo, the demand for new technology comes from the customers. Software development is a third-party task for most companies, so they keep just a few specialists on their internal teams to make decisions about which technologies to use. The announcement of a new project is public for consulting partners, and these contractors move the market, but more slowly than worldwide. Thus, there is enough time, but it's up to you to learn new things and be aware that your new knowledge might never be required.
Last modified on 2013-01-17