A Journey — if You Dare — Into the Minds of Silicon Valley Programmers
My responses in a NY Times comment section on the book Coders: The Making of a New Tribe and the Remaking of the World by Clive Thompson:
#1 - Link
I've been a software developer for 15 years, and for longer than that I have alternated among project manager, team lead, and analyst roles, mostly in finance and now at a cancer center. Given that, I found it odd that you blame the people doing the coding for not foreseeing the harm it could cause. First, most scientific advancement has dark elements, and it is usually not the science but how it is used and sold by business people that is the problem. This leads to the second problem: it is not coding that is in itself problematic, but specifically how technology is harnessed to sell. It is normal and desirable to track users, log actions, and collect telemetry in order to monitor systems, respond to errors, and develop new features, but that normal engineering practice has been repurposed to surveil users for the sake of selling. Blaming coders for this turn is like blaming them for the 90s internet bubble. As it stands now, it is the rush for profits, not the technology, that is the problem. Many famous historical innovations were driven over the edge by corruption and wealth, not by the people who designed and built the systems, although we are so far along in commercialization that furthering the business model is now part of many developers' roles.
#2 - Link
@Ben - I completely agree. Most scientific advancement has dark elements, and it is usually not the science but how it is used and sold by business people that is the problem. This leads to the second problem, in that it is not coding that is in itself problematic, but specifically how technology is harnessed to sell.
As for its depictions of coders, I imagine many people outside tech think of coders as young bros who make apps at cool companies, while in reality the average coder is a 38-year-old married male with two children who builds intranet websites and desktop applications for mainstream businesses.
The art aspect is funny, as it is more about soft decision making: weighing the value of one architecture over another, the future viability of a technology, the ability of coworkers to support and understand the work, the aesthetics and usability of a website. All of these decisions can be made analytically or quantitatively, but more often than not it comes down to one's judgment, based on reason and experience.
As for the bug, yes, a missing character might be a problem if I were writing COBOL in 1982 (real story). Nowadays the code checking built into IDEs flags such errors immediately, before compilation.
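As a small illustration of the point (my own sketch, not from the book), here is the kind of check a modern editor runs continuously. Python's built-in compile step, standing in for an IDE's background parser, rejects a missing character before the code ever executes:

```python
# A snippet with a missing closing parenthesis, the classic
# "one dropped character" bug.
source = "print('hello'"

try:
    # compile() parses the source without running it, just as an
    # IDE's background checker does while you type.
    compile(source, "<snippet>", "exec")
except SyntaxError as err:
    print(f"caught before execution: {err.msg}")
```

In 1982 the same mistake might have surfaced only after an overnight batch compile; today it is underlined in red before you save the file.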
#3 - Link
@Eugene - "10x" itself has gained mythical status, mostly through misunderstanding, and later co-opted by an absurd culture of competition.
1 - “In one of their studies, Sackman, Erikson, and Grant were measuring performances of a group of experienced programmers. Within just this group, the ratios between best and worst performances averaged about 10:1 on productivity measurements and an amazing 5:1 on program speed and space measurements!” – The Mythical Man-Month
2 - The original study that found huge variations in individual programming productivity was conducted in the late 1960s by Sackman, Erikson, and Grant (1968). They studied professional programmers with an average of 7 years' experience and found that the ratio of initial coding time between the best and worst programmers was about 20 to 1; the ratio of debugging times was over 25 to 1; of program size, 5 to 1; and of program execution speed, about 10 to 1. They found no relationship between a programmer's amount of experience and code quality or productivity.
#4 - Link
I can see that many developers chimed in to complain about the characterization of coding as something anyone could do, or as primarily a matter of syntax. As for myself, I've been in tech for over 30 years and was a CS major in the 80s, learning COBOL, PL/I, and BASIC. Over the years I worked with numerous scripting languages, then progressed to other languages and tools: VBA, SQL, VB.NET, C#, JavaScript, F#, R, and Python, along with their related IDEs.
That is one aspect, but behind it is a great deal of reading, often more articles than books, covering design (UI/UX), patterns and architecture (GoF), algorithms, database design, best practices, management, the social sciences, and operations (Deming to agile). All of this informs the decisions I make when building something for an employer. Granted, I am fairly bright and would have acquired knowledge regardless, but to define coders as simply working with syntax is demeaning. Even people who lack broad knowledge of the world can be deeply knowledgeable in their respective domains.