How do you build tech you won't regret? Who is responsible for the code that is released? How do you make tech ethics considerations, including privacy, security, accessibility and inclusion, a part of your regular agile feedback and review processes?
http://2019.aginext.io/Session/tech-ethics/
Some slides transferred poorly from Keynote to PowerPoint, so here are the blanks filled in:
Slide 6: “We kill people based on metadata.” — Michael Hayden (former NSA and CIA director)
Slide 21: “The most dangerous phrase in the language is, ‘We’ve always done it this way’.” — Grace Hopper (computer scientist, candidate for Most Badass American Award)
Slide 31: “Don’t build something if you don’t have the budget to build the security infrastructure properly. Knowing your limits is also important to behave ethically.” — Ádám Sándor (cloud tech consultant)
Slide 32: "Whose problem is it if data gets stolen? Was it devs not thinking, ops not securing or management not giving enough budget? In these situations, it’s very easy to think ‘This isn’t my own problem, I’m just a cog in the machine.'” — Ádám Sándor (cloud tech consultant)
Thank you!
12. Would you write code for
unethical purposes if you
knew the purposes?
StackOverflow 2018 Developers’ Survey
13. Would you write code for
unethical purposes if you
knew the purposes?
StackOverflow 2018 Developers’ Survey
A majority, 58.4 percent, said
No, while more than a third
said depending on what it is.
14. How would you report
unethical code?
StackOverflow 2018 Developers’ Survey
15. How would you report
unethical code?
StackOverflow 2018 Developers’ Survey
Almost half said it depended, while
about a third said only within the
company. About 13 percent said
publicly.
16. Do developers have an obligation
to consider the ethical implications
of their code?
StackOverflow 2018 Developers’ Survey
17. Do developers have an obligation
to consider the ethical implications
of their code?
StackOverflow 2018 Developers’ Survey
Almost 80 percent said
Yes.
18. Who is ultimately responsible
for code that accomplishes
something unethical?
StackOverflow 2018 Developers’ Survey
19. Who is ultimately responsible
for code that accomplishes
something unethical?
About 58 percent responded upper-level
management, 23 percent said the person who
came up with the idea, while only 20 percent felt
the coder was responsible.
StackOverflow 2018 Developers’ Survey
21. “The most dangerous phrase
in the language is, ‘We’ve
always done it this way’.”
—Grace Hopper
(computer scientist, candidate for Most Badass American Award)
@jkriggins #aginext
22. How do you create code
that you won’t regret?
@jkriggins #aginext
23. What does a responsible
development process look
like?
@jkriggins #aginext
24. “Responsible Technology considers
the social impact it creates and seeks
to understand and minimize its
potential unintended consequences.”
@doteveryone
25. doesn’t create or deepen inequality
recognizes and respects dignity and human
rights
gives people confidence and trust in its use
@doteveryone
Responsible Tech…
@SamCatBrown
27. Actually understanding how technology
operates in the wider world, when you’re
developing from the beginning, including how
you understand the user journey, building with a
diverse team, and inclusive design
@doteveryone
Context
28. How the tech is going to be monitored
and supported, how it can affect social
norms, security, reliability, and
anticipating unintended consequences
@doteveryone
Consequences
30. Be wary of your open-
source project!
What’s the worst way someone could use it?
@jkriggins #aginext
31. “Don’t build something if you don’t have the
budget to build the security infrastructure
properly. Knowing your limits is also
important to behave ethically.”
— Ádám Sándor
(cloud tech consultant)
@adamsand0r @containersoluti
32. “Whose problem is it if data gets stolen? Was it
devs not thinking, ops not securing or management
not giving enough budget? In these situations, it’s
very easy to think ‘This isn’t my own problem, I’m
just a cog in the machine.'”
— Ádám Sándor
(cloud tech consultant)
@adamsand0r @containersoluti
33. “Don’t build something if you don’t have the
budget to build the security infrastructure
properly. Knowing your limits is also
important to behave ethically.”
— Ádám Sándor
(cloud tech consultant)
@adamsand0r @containersoluti
Ethical processes come
from an agile mindset, open
feedback, and trust.
34. Tech ethics is all about
asking questions
@jkriggins #aginext
35. What am I actually building?
What’s the supply chain?
What other uses are there?
@cori_crider @coedethics
36. What is our code connecting to?
Is it necessary that they connect?
Is it necessary that particular data flow through it?
How long are we storing that data?
Why do we need to store it?
Should we be sharing that data with that entity?
@jkriggins @TheNewStack
37. “This system needs to
connect to this system.’
Why? What’s the purpose?
What are you asking?”
@AndyThurai #InternetofThings
— Andy Thurai
(former IBM, now Oracle)
38. Who does this marginalize?
Who is not included in this
software?
If this scaled to 2 billion people,
who couldn’t use it?
@jkriggins #aginext
42. What’s going to be your
first step in your ethical
agile process?
@jkriggins #aginext
43. Data centers produce as
much greenhouse gas as
the aviation industry.
@anne_e_currie #aginext
Here’s one small step!
44. “Across the tech sector we need to
recognize that data centers will
rank by the middle of the next
decade among the large users of
electrical power on the planet.”
@MSDev #environmental
— Brad Smith
(president, Microsoft)
45. Good News: Cloud is massively more
efficient than on prem.
Bad News: We use way more compute
resource there. (Containers,
microservices, blockchain even worse!)
@anne_e_currie @CoedEthics
— Anne Currie
(Coed Ethics founder, sustainable server supporter)
46. “Google is the world’s largest corporate
buyer of renewable energy. In 2017
they purchased seven billion kilowatt-
hours of electricity from solar and wind
farms that were built specifically for
them.”
@anne_e_currie @CoedEthics
— Anne Currie
(Coed Ethics founder, sustainable server supporter)
47. AWS have four sustainable
(offset) regions: Dublin,
Frankfurt, Oregon, Canada
Are you hosting in one of them?
@anne_e_currie @CoedEthics
48. Sustainable Servers by 2024 on Change.org
Support 100% renewable servers by 2024. Sign
here:
https://www.change.org/p/sustainable-
servers-by-2024
Transition to Google Cloud, Azure or AWS
Sustainable Regions (Dublin, Frankfurt,
Canada, Oregon)
Or, buy renewable electricity for your data
centers.
@anne_e_currie @CoedEthics
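The region switch above is the kind of thing a deploy script can enforce. A minimal sketch, assuming the standard AWS region codes for the four regions named on the slide (Dublin = eu-west-1, Frankfurt = eu-central-1, Oregon = us-west-2, Canada = ca-central-1); check AWS's own sustainability pages before relying on this list:

```python
# Region codes for the four sustainable (offset) regions named on the slide.
SUSTAINABLE_REGIONS = {
    "eu-west-1":    "Dublin",
    "eu-central-1": "Frankfurt",
    "us-west-2":    "Oregon",
    "ca-central-1": "Canada (Central)",
}

def check_region(region: str) -> str:
    """Warn when a deployment targets a region outside the sustainable list."""
    if region in SUSTAINABLE_REGIONS:
        return f"OK: {region} ({SUSTAINABLE_REGIONS[region]}) is a sustainable region"
    return ("WARNING: " + region + " is not a sustainable region; consider: "
            + ", ".join(sorted(SUSTAINABLE_REGIONS)))

print(check_region("us-east-1"))
print(check_region("eu-west-1"))
```

Wiring this into your pipeline turns "are you hosting in one of them?" from a conference question into a nag your team sees every release.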
49. What’s going to be your
first step in your ethical
agile process?
@jkriggins #aginext
Two years ago, I made your sprints more exhausting and longer by telling you that documentation needs to be part of your Definition of Done.
Now, as I try to take the place of irreplaceable tech ethics advocate Anne Currie, I’m going to add more to your sprint by asking you to question everything you’re doing — at least on a biweekly basis.
And I’m again doing it from a very “do as I say, not as I do” parenting perspective, never having written a piece of code in my life nor having managed or coached a team. But I’m a tech journalist and marketer who interacts with these teams and cultures every day of my life, and because I’ve spent the last year focused on ethics, accessibility, and diversity and inclusion, I do ask myself these questions nearly every day as I decide which stories deserve attention and which don’t.
To start, who took ethics at school?
What is ethics?
How do ethics fit into our business? Does anyone here actually have ethical considerations in your daily standups or retrospectives? Tell us about it.
Ethics seems to be a really subjective idea.
Google famously used to have “Don’t be evil” as their corporate philosophy. It even kicked off their Code of Conduct. But outside of Austin Powers, “evil” is pretty subjective. Just look at British and American politics.
They’ve since changed it… maybe because of its non-specificity, or maybe because Google did a bit of evil.
I tend to consider myself aware of the crazy world we live in, but I hadn’t heard about Project Maven until human rights lawyer Cori Crider kicked off Anne Currie’s tech ethics conference Coed Ethics with a kicker of a keynote.
The bombshell: Project Maven supports targeted drone kills based on machine learning algorithms. These “signature strikes” are performed by the U.S. military on people whose geo-locational life patterns, social networks, and travel behavior model those of a terrorist.
What could go wrong? Let’s just call out the scary ramifications of when machines determine who gets to live or die…
The data is inherently flawed: a reporter embedded with Al Qaeda was once the top “known terrorist” result simply because he was a good journalist on the scene.
And this isn’t an anomaly.
Crider said these imperfections have cost hundreds if not thousands of civilian lives in the drone wars, not to mention those who’ve also died in active war zones like Afghanistan, Syria and Iraq, which she says saw 6,000 civilian deaths in 2017 alone.
Since humans are using the algorithms and corners are being cut, it leads to an “indifference to civilian life.”
Project Maven doesn’t really sound like Google culture, and yet Google bid on and won a MASSIVE (we assume; it was undisclosed) contract with the US Department of Defense to enhance technology that “flags images for human review, and is for non-offensive uses only.”
While Google promised it’d all be non-offensive, a solid 4% of the staff, which my humble math based on LinkedIn employee counts says is about 6,400 people, wrote a one-page letter of protest to the CEO, the opening and closing of which are shared here.
And so Google chose to lose what we can all assume was an undisclosed buttload of money because a small percentage of staff spoke up and published that letter in the New York Times.
Because of these 6,000 or so employees (which of course is not an insignificant number), Google also released a new set of AI principles, including that its AI research shouldn’t be used for weapons.
Now they have a more specific code of conduct, more than 6,000 words long, which ends with a beautiful clarification to still not be evil, but also a reminder that empowers all employees and contributors and paraphrases Transport for London’s motto: See It, Say It, Sorted.
Stack Overflow’s 2018 Developer Survey asked four questions about ethics for the first time. Let’s see how you measure up against them.
This seems right in line with one of the Volkswagen developers, who was sentenced to three years in jail for more than a decade’s involvement in the automobile company selling diesel cars that emitted well past U.S. environmental standards but were programmed to look like they didn’t. When advocating for house arrest, the lawyer for Volkswagen employee James Liang said that his client was not a “mastermind” of the emissions fraud; rather, Liang “blindly executed a misguided loyalty to his employer.”
While most developers acknowledge they should be thinking about ethics in the code they’re writing and releasing, in the end, they don’t feel the weight of responsibility — most assume that falls on the leadership.
So now that you know you may be responsible if things go wrong, how can you fit ethical reflection into your regular, repeated agile processes?
Or how do you make sure you create code that you won’t regret?
It turns out responsible development has a lot in common with agile development. It fits in with trends like microservices and containers and other movements to give more responsibility and creativity to the developer. But as individuals own the code more, it also becomes logical that the legal and moral ramifications of what we create become more important too.
And even more like agile development, there are already toolkits and canvases and checklists and processes you can apply to help.
The responsible technology think tank Doteveryone says responsible tech considers any positive or negative social impact it creates, directly and indirectly.
I think we can think of dozens of headlines of where big-name tech has failed to meet these standards recently.
… examples
See? I promised you I’d bring it back to your framework comfort zone!
This adaptable framework comes down to developers being cognizant of:
Context — Actually understanding how technology operates in the wider world, when you’re developing from the beginning, including how you understand the user journey, building with a diverse team, and inclusive design
Consequences — How the tech is going to be monitored and supported, how it can affect social norms, security, reliability, and anticipating unintended consequences
Contribution — Holistically considering cross-functional, cross-sector ownership, algorithm inputs, and best practices.
Soon everyone will implement a responsible tech product assessment to follow along the way. Even pausing to review responsible and ethical criteria during retrospectives can lead to more ethical behavior: include ethical considerations in your documentation and backlog, check what you missed or didn’t have the funds to test, and act transparently with users if you are releasing a less secure new product.
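If you want that retrospective pause to actually happen, you can encode the criteria as a checklist the team has to fill in each sprint. A sketch only; the questions are drawn from this talk, and everything else (names, structure) is hypothetical:

```python
# Ethics checklist reviewed at each sprint retrospective; answers are recorded per sprint.
ETHICS_CHECKLIST = [
    "Did we document what data this sprint's work collects and why?",
    "Did we skip any security or accessibility testing for lack of budget?",
    "If we shipped something less secure than we'd like, did we tell users?",
    "Who could this feature marginalize or exclude?",
]

def retro_report(answers: dict) -> list:
    """List the checklist questions the team hasn't answered yet."""
    return [q for q in ETHICS_CHECKLIST if not answers.get(q, "").strip()]

answers = {
    ETHICS_CHECKLIST[0]: "Yes, logged in the data inventory",
    ETHICS_CHECKLIST[3]: "",   # raised but never discussed
}
for q in retro_report(answers):
    print("Unanswered:", q)
```

The point isn't the code; it's that an unanswered question becomes a visible artifact in the retro, the same way an unfinished story does.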
Yeah yeah, we all love open source here. We love the freedom and the free-ness and the community. But, if you are building something and then letting anyone use it, what’s the worst they could use it for?
Adam says making tech responsibility the new norm all comes down to tracking how data moves around a company.
He asked “Whose problem is it if data gets stolen? Was it devs not thinking, ops not securing or management not giving enough budget? In these situations, it’s very easy to think ‘This isn’t my own problem, I’m just a cog in the machine.'”
Adam argues that the breaking of inner-company silos and the agile movement are putting everybody — project managers, UX designers, software developers — on the same team, and now is the perfect time to build ethics into these processes. Hence the growing popularity of multidisciplinary teams with shared ownership, where developers own the software all the way to production.
Data and privacy are an important part of your ethical considerations. While GDPR was poorly implemented, at least it has us thinking about privacy and better mapping our data.
And in this interconnected world, it’s not just about asking what people can do with your technology, but what people can do with other people’s technology.
This is from the Microsoft Inclusive Design Toolkit
I want to make my own sort of guide — and being a journalist my life is spent asking questions, so I want to know what other questions you would add to this list.
I couldn’t be covering for Anne without talking about her cause. A surprisingly quick and easy ethical step.
It’s even worse with containers and microservices that sit on almost-empty but fully running servers.
Of course again they did this to save their own money but it has a great benefit for the world.
With the uncertainty of Brexit, we probably shouldn’t be storing data in the UK, or only in the UK. So if you want a foot in Europe and one in the US, it’s a good idea to switch now; why not also switch to somewhere more sustainable?
I learn something new from Anne every day. It turns out that you can buy energy for not only your data centers but your office or even your home
So please do share what you learned today if you learned anything at all. And please do sign and share Anne’s change.org petition — she just hit 1500 signatures earlier this week! And please consider tweeting to me and all of #aginext your next tech ethics steps. They say if you put something public — like your team values — you are more likely to achieve it.