The web is full of tips and tricks for becoming a rockstar developer. Top IT managers, agile evangelists, and prominent community members are all busy painting the picture of an ideal software developer capable of generating amazing results.
In reality, the line between an average developer and a great one is very thin.
The profusion of learning aids such as online courses and development schools, as well as the developer community's enthusiastic knowledge-sharing practices, has lowered the barrier to entry into the profession. An increasing number of young, inexperienced people are part of the IT industry today. But technical expertise alone isn't enough to take a developer from average to great.
Here are 4 serious mistakes made by inexperienced developers that inhibit their growth and prevent them from entering the path toward greatness. These errors are so common that even well-coordinated, experienced teams often fail to recognize them as mistakes at all.
4. Always choosing custom over ready-made solutions
When we face a sizeable creative task, we're often tempted to build a specific functionality from scratch. That's what brings us the most satisfaction. However, before jumping on the custom-solution bandwagon, it's worth taking a moment to reflect on whether we really need it.
Technological solutions function in the business context. Is it really a good idea to spend money on building a custom functionality that doesn’t bring users value?
Don’t forget about the future costs that come with maintaining that custom functionality. On the other hand, you’ve got open-source solutions that are available free of charge and maintained by communities of passionate developers.
If you want to provide users with a specific functionality, consider your current budget and future plans for your product development before going for a custom solution.
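As a toy illustration of the trade-off (a hypothetical example, not from the article), consider parsing ISO-8601 timestamps. A hand-rolled parser has to be written, tested, and maintained forever, while Python's standard library already covers the common case in one line:

```python
from datetime import datetime

# Custom approach: a hand-rolled parser you now own forever --
# every edge case (timezones, fractional seconds) becomes your bug.
def parse_iso_custom(text: str) -> datetime:
    date_part, time_part = text.split("T")
    year, month, day = (int(x) for x in date_part.split("-"))
    hour, minute, second = (int(x) for x in time_part.split(":"))
    return datetime(year, month, day, hour, minute, second)

# Ready-made approach: one line, maintained by the Python core team.
def parse_iso_stdlib(text: str) -> datetime:
    return datetime.fromisoformat(text)

print(parse_iso_custom("2023-04-01T12:30:00"))
print(parse_iso_stdlib("2023-04-01T12:30:00"))
```

Both functions return the same result today, but only one of them will keep working when the input format grows a timezone suffix next quarter.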
3. Forcing new technologies at all cost
Technology trends come and go – so quickly that most of us find it increasingly difficult to keep up with the many directions the industry is taking. Everyone is asking themselves questions like: Should I build my next mobile project natively in Java or Swift, use a cross-platform framework such as React Native, or go for a Progressive Web App instead?
Following new trends is easy because everyone else seems to be doing it. So how can we judge which trends are short-lived and which ones stand a chance to disrupt our industry?
Even experienced developers are tempted to ride the wave of recent trends. They often master such technologies quickly and then try to implement them in the first commercial project they get their hands on. When expanding a project's technology stack, they're surely led by good intentions – they want to make sure the product develops according to new industry standards.
Naturally, developers who actively follow the sector developments and gain new skills are among the smartest of the bunch. But when they fail to take into account the long-term consequences of their technology choice for the team or its direct business value, implementing new technologies at all cost can become problematic.
Sometimes developers suggest a hot technology just because they want to learn it and seek an opportunity for doing so. Sticking to personal goals instead of the company’s business objectives in technology choice puts the project in danger.
A team's knowledge of new technologies (or the lack of it) acts as a natural check here, because it reminds everyone that forcing new solutions can be extremely costly – especially if the developer who initiated the implementation later leaves the project. Trending technologies don't become widespread instantly, and the low market penetration of a specific technology makes attracting experts in the field difficult, slow, and expensive. Not to mention that new technologies often haven't passed through extensive market testing and might be buggy.
This is another threat to the project that results directly from the developer’s failure to take into account the potential negative consequences of forcing innovative solutions at all cost.
2. Estimating undefined tasks
Some developers forget that every defined task derives from an undefined task. That’s especially true at the early stage of the project. There’s nothing wrong with asking for additional information, details, extra attachments, accepted wireframes, or final designs.
The problem arises when the task isn't well defined and developers have doubts or fail to consider potential scenarios – and yet they try to estimate its labor intensity anyway. The larger the task, the more important its estimate is to the project.
Asking questions should be the norm, but for some reason it still often isn't. Information is essential for analyzing the scope of the task before attempting to estimate its workload. Short research is a must at this point: reading the product documentation or outlining the key technical steps required to complete the task as expected is very helpful.
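Once that short research is done, the estimate itself can be made less of a gut feeling. One widely used technique (my addition, not something the article prescribes) is the three-point, or PERT, estimate, which combines optimistic, most likely, and pessimistic scenarios:

```python
def pert_estimate(optimistic: float, most_likely: float, pessimistic: float) -> float:
    """Three-point (PERT) estimate: a weighted mean that favors the likely case."""
    return (optimistic + 4 * most_likely + pessimistic) / 6

# Hypothetical task: after reading the docs and outlining the steps,
# a developer estimates 2, 4, and 12 hours for the three scenarios.
print(pert_estimate(2, 4, 12))  # → 5.0 hours
```

The pessimistic number is exactly where the unanswered questions live; if you can't name one, you probably haven't asked enough questions yet.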
Developers who aren’t afraid to ask questions boost their chances of delivering a product that matches the vision of the product owner, client, or other stakeholders.
1. Not communicating errors
I know what you're thinking: this problem is so obvious that it almost never occurs anymore. Not in my experience. There are many reasons why developers hesitate to communicate errors. Daily standup meetings during which the team reports no problems at all are a potential red flag, and experienced PMs need to increase their vigilance.
Think about it this way:
When a developer joins a team and instantly receives the title of "senior developer," that status comes with a lot of pressure, and they might find it hard to admit that something isn't going as planned or is taking longer than estimated. It might come as a surprise, but there's a lot of creative work involved in a developer's daily job – and not all solutions are immediately obvious.
That's why a developer's experience is measured by how they approach tasks and use tools to search for solutions, not by whether they can answer every question.
And let’s not forget that speaking about problems is still perceived negatively and puts the person talking about project issues in a bad light. That’s why it’s essential that we build teams immersed in a culture of openness, where everyone can share their problems, mistakes, and hardships.
Only teams with a strong culture will achieve synergy, boost knowledge exchange, and foster honest communication about problems. All that facilitates product management too and is particularly important for product owners, project managers, and other stakeholders (such as executives, investors, clients).
So what does it take for a developer to become more than average?
In my experience, there are 3 factors that put developers on the path to greatness:
- Company culture – a culture that promotes honesty in communication and professional growth helps developers learn more about their area of expertise and become better at their jobs.
- Good work habits – good habits help developers work more effectively. Instead of spending hours writing code mindlessly, developers who know how to manage their time find space for higher-level problem-solving tasks and education. Moreover, knowledge of good practices helps to avoid making mistakes that create technical debt (which will take up the time of other team members in the future).
- Passion and commitment – developers who love what they do are fully involved in creating the best solutions. They think holistically about the entire application architecture, not just the single functionality to which they were assigned. And they always go the extra mile.
As you can see, all it takes is paying attention to communication practices, team culture, and work habits to eliminate some of the most serious problems that inhibit developers from growing and thriving.
What else do you think separates great developers from average ones? Share your thoughts in the comments!