Regulation and the Tech Evolution
In an age of technological evolution reminiscent of early science fiction films, Canadian engineering regulators must consider the public safety implications of new software technologies for society.
More of our health-care data management processes, prognoses, treatments, and procedures are outsourced to technology and artificial intelligence than ever before: data-based diagnostic software is used to determine our ailments, and robots assist in our open-heart surgeries. Alberta’s transportation industry has made the research and development of automated vehicles a key objective of its business plan over the next two years. Each day, we use GPS to direct us and our goods to destinations, control and track our airspace, and alert us to the location of those in distress. We depend on the software controlling our phone network to reach emergency services, and software programs tally our electoral ballots, determining our future leaders. Our reliance on technology is ubiquitous and often sits invisibly in the background of our daily lives.
If these information technology–based systems don’t work, the consequences can be dire.
Our diagnosis is incorrect, we don’t receive appropriate treatment, or our surgery goes wrong and causes lifelong injury or death. Our automated vehicle causes an accident and loss of life. Our finances are hacked and our life savings are lost. We don’t have the supplies we need to support our industry or economic infrastructure because deliveries weren’t made. Airplanes and trains crash. We can’t reach help in an emergency.
Now consider a world in which these technologies did work, but you couldn’t take for granted that they would. Imagine if those creating the systems were not required to hold public safety paramount. Would you hesitate before depositing your money in the bank, lying on a surgical table, or driving to the corner store? Would you trust that you could reach emergency services when you’re in need?
Although most of us haven’t felt the effects of a dangerous or catastrophic failure of technology, the potential harmful societal implications of the software embedded in our daily lives can be as acute as the immense benefits. Technology’s influence is growing, as is our dependency upon it.
Rapid technological innovation and integration lead to a formidable question: If a software program fails and causes public harm, who is accountable? Who is responsible for preventing software failures in the first place? The answer is regulators like APEGA. It is through the regulation of software engineers that the public can trust that the software and related data they use daily are ethical, safe, secure, and reliable.
A professional engineer, a professional responsibility
According to professional engineer Patrick Binns, a 35-year veteran in software development, titles matter. “Protecting the public interest as they use increasingly complex software-based systems is becoming difficult, especially when processes used to develop software may compromise technical integrity. Software-based systems that affect physical, personal, or financial safety need professional oversight—especially when large numbers of users are involved.”
He explains that while the education needed to create software is widely available, licensed professional engineers should be at the helm of the software development process. They should be responsible for quality assurance, for ensuring systems are built and used within their design specifications, and for providing authentication for the final product. “Individuals using the title software or computer engineer are bound to professional ethics and their responsibility to protect public interest when software products are deployed.” He clarifies that the expectation is not that professional engineers complete all the work, as delegation is still a necessity, but that a professional engineer take accountability for the quality of the software system and ensure it meets the needs of its intended purpose while abiding by industry standards.
Binns highlights that there’s a subtlety involved: if a system designed for a particular purpose is being used for something else, an engineering problem arises. “A great example of that is the use of a tablet-based computing device. It’s one thing to use the tablet to control my calendar, but it’s completely different if the same tablet is used to control a train; consider the situation if it reboots while the train is running down the track. That tablet is not necessarily designed for that application.” He explains, “It’s not the technology itself—it’s the application of the technology.” He says these considerations belong within the scope of engineering responsibilities, which extend beyond what we can touch and feel to include a more intangible relevance to the world around us.
Professional responsibility: a sound business decision
This relevance touches on many areas—physical, emotional, social—and most importantly, it affects public safety. It’s also about people’s livelihoods, sustainability, and financial responsibility.
Binns explains that there can be serious financial impacts if a business relies on software to manage operations and it doesn’t operate properly or fulfil its function. “The integrity of the software used by individuals and businesses on billions of computing devices requires stronger professional oversight, and transparency is required to protect the public when complex knowledge-based systems are used.”
While one may assume software development environments are established with solid workflows and clear lines of accountability, Binns has personally experienced the opposite. While working on a large corporate merger, he found the development and testing processes were generally ad hoc, with limited quality standards.
“The consideration of whether they were doing the right thing was under-represented considering the size of the project. We think these large projects are well-governed from a technical and integrative perspective, but sometimes they’re not.”
A software developer on tech and ethics
Morten Rand-Hendriksen, a Canadian software developer and open-source contributor who has been in the industry for more than 20 years, agrees that protecting the title engineer is essential to protecting public safety. “By virtue of having engineer as a title, you can make a bunch of assumptions. They have a rigorous education that includes professional responsibility, professional ethics, and professional conduct. They are held to standards. And if they break those standards, there are consequences.”
He began learning HTML in university in Norway, booting up code in a UNIX environment “before any meaningful version of the web as we know it today was really live.” He was also studying philosophy and found it fascinating to see how modal logic, a branch of formal logic in which you build logic trees to prove the validity of arguments, mirrors how computers work. “Computers are logic machines. Unlike humans, they follow logic to the letter at all times.”
A decade would pass before he was fully entrenched in the tech industry, building websites full time. When he found himself discussing the ethical ramifications of technology, he saw a need for industry education. “I was looking at my own community, and they were talking about ethics as if it was this brand-new thing they had just invented.” Thus, he began educating others via social media about exactly that: the history of ethics, and ethics in the tech industry, an arena with no set rules beyond the ever-evolving design parameters of the technology of the moment.
He likens engineers to biomedical scientists and psychologists, saying enormous responsibilities accompany professions that strive to innovate. He compares advances such as cloning and psychological programming and their resultant ethical challenges with those of technological innovations. “All of those industries, by themselves, realized ‘we have enormous responsibility, and need to start taking it seriously.’” He says this recognition led to the implementation of consequences for misuse, and ultimately, regulation to limit the potential harm individuals in these professions could bring to society.
When software or an application malfunctions or causes harm, he explains, the person responsible may not face any lasting occupational repercussions. In regulated professions, an evolution of professional standards and practices historically follows failure. “There are reviews, people lose their licences to practise. That applies to almost every industry that affects human beings except for design and technology.”
Balancing power and ethical responsibility
The pervasive trend ethically within the tech industry, according to Rand-Hendriksen, is one of value neutrality. “It’s the tech variant of guns don’t kill people; I just build tools. If people use those tools to do things that I think are harmful, that is unfortunate.”
This raises ethical questions: Who is responsible? The creators of the apps, platforms, and software, or those putting them to use? Are we responsible for the effects of what we create on the world around us? The mandate of a professional engineer says yes—always.
“A lot of the power of these tools comes from people’s ability to do whatever they want with them.” He says the knowledge is available to anyone, and the tools are affordable. He states a lot of leaders in the industry from his generation (late-stage Generation X) have never had formal training in design, coding, or information architecture, and they’re often the teachers guiding the next generation of software developers.
Rand-Hendriksen explains it’s easy to defer responsibility for the use of software and its downstream effects because developers can ultimately say it’s just a tool they’ve made to connect people. Although this mentality can be pervasive, he also notes that some companies do consider the ethical and societal implications of their software and applications on users’ lives while remaining competitive in the tech marketplace.
A decision before the first keystroke
Rand-Hendriksen states for each decision-maker leading a company, there are tens of thousands of people developing the software. “When you go to them and ask what they do for a living, they won’t say, ‘I’m overthrowing world governments through advertising algorithms.’ They say, ‘I am building a better recommendation engine so people get more of the content they want to see.’ There’s a tension there.”
He likens it to building a calculator into a social media application without considering the larger implications of the application’s existence in the world. “The connection between the effects of the app existing and me sitting there typing code on a machine in my office at home will take 10 years for people to see, but it’s before the application is created that the decision needs to happen,” Rand-Hendriksen stresses.
To some, the choice between the titles software engineer and software developer may seem a minor inflection point when the singular focus is attracting and retaining employees, but the decision affects something larger: public safety as we navigate rapid technological evolution. This decision will ensure accountability, rigour, and integrity in the programs and systems created or managed by software engineers, resulting in processes held to a standard of excellence and in safe outputs. The adverse effects of unregulated software engineering can range from minor to devastating, and it is regulators who are responsible for preventing both.
Public trust is built through a history of successful regulation, as is confidence in the programs and systems working in the background of our daily lives. Through APEGA’s regulation of the title of software engineer, the responsibility for building our province in a digital sense is in the capable hands of those you’ve trusted to physically build Alberta for the last 100 years—those held accountable through self-regulation enforceable through the Government of Alberta’s Engineering and Geoscience Professions Act.
These technological strides and their effects on society must be considered before the technologies are created; doing so is the mandate of a professional engineer. Regulating this evolution is a necessary responsibility, one that belongs to Canadian engineering regulators such as APEGA. Regulating the title software engineer benefits us all: our decision today ensures the continued protection of public safety tomorrow.
Learn more about this issue
APEGA supports the innovation and diversification of Alberta’s economy, including the technology and software sectors. Mitigating any risks to public safety must be of primary importance. The term engineer, including software engineer, carries licensure and a set of ethical responsibilities and accountabilities, and the use of this title has always been protected for a reason.