Mirror of https://github.com/NotXia/unibo-ai-notes.git
Add ethics2 profiling
src/year2/ethics-in-ai/module2/ainotes.cls (symbolic link, 1 line)
@@ -0,0 +1 @@
../../../ainotes.cls
src/year2/ethics-in-ai/module2/ethics2.tex (new file, 13 lines)
@@ -0,0 +1,13 @@
\documentclass[11pt]{ainotes}

\title{Ethics in Artificial Intelligence\\(Module 2)}
\date{2024 -- 2025}
\def\lastupdate{{PLACEHOLDER-LAST-UPDATE}}
\def\giturl{{PLACEHOLDER-GIT-URL}}

\begin{document}

\makenotesfront
\include{./sections/_data_protection.tex}

\end{document}
src/year2/ethics-in-ai/module2/sections/_data_protection.tex (new file, 78 lines)
@@ -0,0 +1,78 @@
\chapter{Data protection}

\begin{remark}[AI risks] \phantom{}
    \begin{itemize}
        \item Eliminate or devalue jobs.
        \item Lead to poverty and social exclusion if no measures are taken.
        \item Concentrate economic wealth in a few big companies.
        \item Allow for illegal activities.
        \item Surveillance, pervasive data collection, and manipulation.
            \begin{example}
                Many platforms operate in a two-sided market where users are on one side and advertisers, the real source of income, are on the other.
            \end{example}
        \item Public polarization and interference with democratic processes.
        \item Unfairness, discrimination, and inequality.
        \item Loss of creativity.
            \begin{remark}
                Creativity can be:
                \begin{descriptionlist}
                    \item[Combinatorial]
                        Combination of existing ideas.

                    \item[Exploratory]
                        Exploration of new solutions within a given search space.
                \end{descriptionlist}
            \end{remark}
    \end{itemize}
\end{remark}



\section{Profiling}

\begin{description}
    \item[Profiling] \marginnote{Profiling}
        System that predicts the probability that an individual having a feature $F_1$ also has a feature $F_2$ (see the sketch in the remark below).

        In the GDPR, it is defined as any form of processing of personal data of a natural person (the data subject) that produces legal effects (e.g., signing a contract) or similarly significantly affects them. It includes analyses and predictions related to performance at work, economic situation, health, interests, reliability, and movements.

        \begin{remark}
            Profiling in the GDPR only refers to natural persons (i.e., individuals, not groups).
        \end{remark}
\end{description}
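% Minimal formal sketch of the prediction above: the estimator view and the
% decision threshold $\tau$ are illustrative assumptions, not part of the GDPR
% definition.
\begin{remark}
    A profiling system can be seen as an estimator of the conditional probability
    \[
        \Pr(F_2 \mid F_1) = \frac{\Pr(F_1 \wedge F_2)}{\Pr(F_1)} ,
    \]
    computed from population data. An individual known to have $F_1$ is then treated as if they also had $F_2$ whenever the estimate exceeds some chosen threshold, i.e., $\Pr(F_2 \mid F_1) > \tau$.
\end{remark}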
\begin{example}[Cambridge Analytica scandal]
    Case where data of US voters was used to identify undecided voters:
    \begin{enumerate}
        \item US voters were invited to take a personality/political test that was supposedly for academic research. To receive the monetary reward for the survey, participants were also required to provide access to their Facebook page.
        \item Cambridge Analytica collected the participants' data on Facebook, but also accessed the data of their friends.
        \item The participants' data was used to build a training set where Facebook content serves as the features and questionnaire answers as the target. The model built on this data was then used to predict the profiles of the friends (schematized in the remark after this example).
        \item The final model was used to identify voters who were more likely to change their voting behavior if targeted with personalized ads.
    \end{enumerate}
\end{example}
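% Sketch of the pipeline in the example above; the symbols ($x_i$, $y_i$,
% $\hat{f}$, $x'_j$) are illustrative and only assume the setup described in
% the example.
\begin{remark}
    Schematically: from each participant $i$, a pair $(x_i, y_i)$ is collected, where $x_i$ is their Facebook content (features) and $y_i$ their questionnaire answers (target). A model $\hat{f}$ is trained so that $\hat{f}(x_i) \approx y_i$, and then applied to the friends' Facebook content $x'_j$ to obtain predicted profiles $\hat{y}_j = \hat{f}(x'_j)$, from which the most persuadable voters are selected for targeted ads.
\end{remark}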
\begin{description}
    \item[Industrial capitalism] \marginnote{Industrial capitalism}
        Economic system where entities that are not originally meant for the market are also treated as products. This includes labor, real estate, and money.

        \begin{description}
            \item[Surveillance capitalism] \marginnote{Surveillance capitalism}
                Also treats human experience and behavior as marketable entities.
        \end{description}

        \begin{remark}
            Labor, real estate, and money are mostly subject to law. However, the exploitation of human experience is less regulated.
        \end{remark}

    \item[Surveillance state] \marginnote{Surveillance state}
        System where the government uses surveillance, data collection, and analysis to identify problems, govern the population, and deliver social services.

        \begin{example}[Chinese social credit system]
            System that collects data and assigns a score to citizens. The overall score governs access to services and social opportunities.
        \end{example}
\end{description}
\begin{remark}
    Profiling enables differential inference, where individuals are treated differently based on their features.
\end{remark}