The poor data practices of technology businesses have left people vulnerable to widespread harm. Businesses use automated and data-driven technologies to monitor consumers and invade their privacy, set discriminatory prices, exclude consumers from their services, and buy and sell personal information without consumers' informed consent.
Our submission to the Senate Economics References Committee inquiry into the influence of international digital platforms ("the Big Tech inquiry") includes a number of case studies of harm arising from the improper use of consumers' data by major technology companies, including Airbnb and Tinder. However, our investigations also found that smaller technology companies and non-technology businesses cause significant harm to consumers.
We recommend a number of policy reforms and actions addressing two themes of the inquiry, algorithm transparency and data and privacy:
- Require businesses to have a duty of care over personal data
- Align definitions of "personal information" with consumer expectations
- Strengthen the Office of the Australian Information Commissioner
- Legislate a risk-based framework to restrict and prohibit certain uses of automated decision-making (ADM)
- Empower a regulator to protect people from ADM
- Legislate transparency requirements for the use of algorithms in ADM by businesses
We also recommend an economy-wide ban on unfair trading to protect consumers from a range of digital and non-digital harms.
Download submission (PDF)
Related content
- What are loyalty schemes like Flybuys and Everyday Rewards doing with your data?
- Kmart, Bunnings and The Good Guys using facial recognition technology in stores
- Clearview AI facial recognition case highlights need for clarity on law
- Is Airbnb using an algorithm to ban users from the platform?
- Submission to the Attorney-General's Department on the Review of the Privacy Act