Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech by Sara Wachter-Boettcher (ebook edition)

From the title of this book, I assumed it would focus exclusively on the problems of bias in software and machine learning, a topic that has been in the news for quite a while, and prominently so recently.
While most of the book provides stories about bias as I expected, a large part of it covers various other bad behaviors: sexist, racist, illegal, and just plain wrong.
Think hiring at Uber. If you have kept up with these kinds of issues in Wired, Fast Company, and their ilk, you get many more examples here, but not much by way of solutions.
Despite that mild disappointment, I found the writing kept my interest, at least up until the end, when it felt like the author was reaching for things to write about.
Good for helping an IT professional, data scientist, or tech company exec think through how these issues may touch their own company, products, and practices.
Why do apps and profile info pages mostly come with only two gender options, male and female? What if someone doesn't wish to be identified as either? Why is there still a vast underrepresentation of women and minorities in the tech sector? Why hasn't there been a massive #MeToo uprising in the tech industry across the world? If tech companies are largely run by white or Asian men, do the products they release also reflect the biases and stereotypes they hold?

From Uber's severely regressive history in handling sexual harassment complaints, to why Google Photos inadvertently turned out to be racist, the book is full of anecdotes on where the tech industry is messing up.


If you are one who has felt that the tech sector needs to diversify, this book can help you better understand why, through stories from experts in the space.


If you thought the tech industry's idea of 'Meritocracy' to recruit talent is fair, then this is a must read for you too.
Recommended reading on the very current state of the tech industry. Overlaps a little with, and cites, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, but focuses more on programmer and designer choices, assumptions, and hidden biases than on algorithms.

At first I'd thought of recommending it only to programmers (there's a bunch of stuff on personas and other design techniques that are not of interest to 'regular' humans), but then it branches out and goes into the role of the tech industry in daily life, fake news, concerted online harassment, and all the other acrid smoke from the garbage fire that is the modern WWW.


Fun fact: I applied for an ARC when this was first coming out and did not get a copy.
"No, I'm not salty about it," she said, while casually presenting like a human-shaped lump of salt.
When this book went on sale, you can bet that I snapped it right up, hyped as all get out to read it, because as a feminist who works in the vast technology industry, this was relevant to all my interests.
And it's good, even though you've probably seen most of the case examples Wachter-Boettcher brings up when they made the news, such as the lens-focusing cameras that didn't work on Black people, or Twitter's failure to stop the mass harassment of many women online, including Lindy West.
And even though "beware the evils of technology" is the latest and greatest dead horse that the news media loves to beat.
It's so useful to have all that information organized into one place, for easy reference, with commentary on how it happened and why it matters.




I'm not sure how appealing this book will be to people who aren't interested in social sciences and social justice, and/or who aren't in tech or don't use it actively for work.
But since most of us engage with some sort of app or website on the daily, I think it's important for everyone to understand how technology can fail us.
One of the prevailing themes in this book is how a lack of unbiased and democratic (small "d", don't @ me) information can lead to unconscious bias that reveals itself in AI systems.
The author gave the example of a risk-assessment website that predicted what the recidivism of a criminal might be, but it did not take into account racial biases, such as the fact that Black individuals tend to get arrested more regardless of whether they are guilty or not, making one of the value points associated with prior arrests questionable at BEST.
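The feedback loop behind that recidivism example can be made concrete with a small simulation. The sketch below is purely illustrative (all rates are invented, and it is not the actual tool the author discusses): two groups have the same underlying offense rate, but one is policed more heavily, so a naive score that uses "prior arrest" as a risk feature rates the over-policed group as riskier despite identical behavior.

```python
import random

random.seed(42)

N = 10_000
OFFENSE_RATE = 0.3                  # identical underlying behavior in both groups
ARREST_RATE = {"A": 0.2, "B": 0.5}  # group B is policed more heavily (invented numbers)

mean_score = {}
for group in ("A", "B"):
    total = 0.0
    for _ in range(N):
        prior_offense = random.random() < OFFENSE_RATE
        # An arrest requires both an offense AND being caught; being
        # caught depends on how heavily the group is policed.
        arrested = prior_offense and random.random() < ARREST_RATE[group]
        total += 1.0 if arrested else 0.0  # naive "risk" feature: any prior arrest
    mean_score[group] = total / N

# Group B's average "risk" comes out far higher than group A's,
# even though the true offense rate was set equal for both groups.
print(mean_score)
```

The point of the sketch is that the bias lives in the data-generating process, not in the arithmetic: the score faithfully reflects arrests, and arrests unfaithfully reflect behavior.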
Another is that a lack of diversity can contribute to erasure, or worse, systemic microaggressions that can make a site significantly less safe or useful (examples: binary-only gender options when signing up for a website, or period-tracking apps that assume their users are straight women who want to get pregnant).




This book was published in 2017, so obviously it's aged. There are some comments about Twitter and its failure to sell the company based on its bad track record, for example, that have aged like a used diaper left out in the hot sun.
I can only imagine what the author thinks of some of the latest site "updates." For the most part, though, I think a lot of the book is still relevant, especially with more and more companies leaning heavily on the use of algorithms and AI.
Technology itself is not inherently harmful, but wielded in the wrong hands and used with carelessness, it very easily can be.




I won this book in a giveaway. I work in the tech sector and was interested in this book because I am leading a digital transformation effort at my job and wanted to make sure I didn't fall into any of these traps.
The book was not what I was thinking it was but boy were my eyes opened.
I have worked in tech for years. I'm a woman, and early in my career, developing software for a utility, I experienced the discrimination the book describes.
While I was raising my kids, I taught computers at a college part-time, then returned to the workforce once they were driving.
I thought my days of discrimination were behind me but just last year it happened again.
I was being groomed for a position to take over for my boss, the IT Director, when he retired.
When he announced his retirement date, I was expecting the promotion but I didn't get it.
Even though my boss was progressive, the good ol' boy network of the company chose otherwise, and now I report to someone who has never managed IT and has never even worked in it.
So I am training my boss. Toxic!

I didn't realize that software meant for the general public had such a narrow view of "normal".
This book opened my eyes tremendously. I am ashamed of my industry.

This should be required reading for anyone studying in the tech field in college.
I have forwarded this title to the college at which I taught.

Lots of observations on how data, algorithms, and product choices have unintended or intended consequences.
Awareness is the first step.

Concise and motivating, if depressing. Should be required reading for everyone in tech. Read it. You'll be angry, and inspired.

I want to qualify my rating of this book: if you haven't previously thought about sexism, racism, or other forms of discrimination in the tech industry, this is a five-star recommendation.
However, as someone who regularly reads about this topic and pays attention to tech news, I encountered very little new information in this book.
It was also a bit disappointing to see so much focus on recent big news stories (e.g. the Google Photos categorization fail, Uber sexism and spying, Facebook's Year in Review) rather than a wider range of companies and more in-depth looks at what went wrong, how it happened, and how companies are or could be doing things differently.
So I wasn't blown away by the book, but it holds valuable information for some folks, and I just might be the wrong audience.
A must-read for anyone who designs digital experiences and doesn't want to be an inadvertent dudebro.


Against a backdrop of increasingly ubiquitous technology, with every online interaction forcing us to expose parts of ourselves, Sara Wachter-Boettcher weaves a challenging narrative with ease.
With ease, but not easily. Many of the topics covered are confronting, holding a lens to our internalised "blind spots, biases and outright ethical blunders".


As Wachter-Boettcher is at pains to highlight, all of this is not intentional, but the result of a lack of critical evaluation, thought, and reflection on the consequences of seemingly minor technical design and development decisions.
Over time, these compound to create systemic barriers to technology use and employment: feelings of dissonance for ethnic and gender minorities, increased frustration for those whose characteristics don't fit the personas the product was designed for, the invisibility of role models of diverse races and genders, and reinforcement that technology is the domain of rich, white, young men.


The examples that frame the narrative are disarming in their simplicity. The high school graduand whose Latino/Caucasian hyphenated surname doesn't fit into the form field. The person of mixed racial heritage who can't work out which single box to check on a form.
The person who's gender non-conforming and doesn't fit into the binary polarisation of 'Male' or 'Female'.
Beware: these are not edge cases! The most powerful takeaway for me personally from this text is that in design practice, edge cases are not the minority.
They exist to make us recognise the diversity of the user base that we design for.


Think "stress cases", not "edge cases". If your design doesn't cater for stress cases, it's not a good design.

While we may have technical coding standards and best practices that help our technical outputs be of high quality, as an industry and as a professional discipline we have a long way to go in doing the same for user experience outputs.
There are a finite number of ways to write a syntactically correct PHP function; yet give me a group of form designers, and I will give you as many different forms providing as many different user experiences.
And at least some of those users will be left without "delight", a nebulous buzzword for rating the success or otherwise of digital experiences.


Wachter-Boettcher takes precise aim at another seemingly innocuous technical detail, application defaults, exposing their at-best benign and at-times malignant utilisation to manipulate users into freely submitting their personal data.
It is designing not for delight, but for deception.

"Default settings can be helpful or deceptive, thoughtful or frustrating. But they're never neutral."

Here the clarion call for action is not aimed at technology developers themselves, but at users, urging us to be more careful, more critical, and more vocal about how applications interact with us.


Artificial intelligence and big data do not escape scrutiny. Wachter-Boettcher illustrates how algorithms can be inequitable, targeting or ignoring whole cohorts of people, depending on the unquestioned assumptions built into machine learning models.
Big data is retrospective, but not necessarily predictive: just because a dataset showed a pattern in the past does not mean that the pattern will hold true in the future.
Yet governments, corporations, and other large institutions are basing major policies and practice areas on algorithms that remain opaque.
And while responsibility for decision-making might be delegated to machines, accountability for how those decisions are made cannot be.
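The "retrospective, not predictive" point can be sketched in a few lines. In this invented toy example (all numbers are assumptions for illustration), a rule learned from historical data is right 90% of the time on the past, but once the underlying pattern shifts it becomes no better than a coin flip, while still producing confident-looking predictions.

```python
import random

random.seed(7)

def label(x, correlation):
    """The true outcome follows feature x with the given correlation strength."""
    return x if random.random() < correlation else 1 - x

def accuracy(correlation, n=10_000):
    """Accuracy of the fixed rule 'predict outcome == x' under a given correlation."""
    hits = 0
    for _ in range(n):
        x = random.randint(0, 1)  # observed feature
        prediction = x            # rule fitted to the historical pattern
        hits += prediction == label(x, correlation)
    return hits / n

past = accuracy(0.9)    # historical data: the pattern held 90% of the time
future = accuracy(0.5)  # the pattern broke: the feature is now uninformative
print(past, future)
```

The rule itself never changes; only the world does. That is why opaque models need ongoing accountability, not one-time validation.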


The parting thought of this book is that good intentions aren't enough. The implications and cascading consequences of seemingly minor design and development decisions need to be thought through, critically evaluated, and handled with grace, dignity, and maturity.
That will be delightful!