How will transparency shape the future of ethical AI development?

No. 74: Bringing you the news that matters in video privacy and security

There has been a pivotal recent push for transparency in the development of AI models. As the lines between innovation, privacy, copyright, and data ownership blur, they raise intricate ethical and legal questions.

The AI Foundation Model Transparency Act, introduced by Reps. Anna Eshoo and Don Beyer, is a significant stride towards the ethical use of AI. The bill would require creators of AI foundation models to disclose their training data sources, helping to address growing concerns over copyright infringement and the misuse of sensitive information.

Amidst these developments, the Department of Homeland Security's (DHS) initiative to generate synthetic data demonstrates an innovative approach to balancing advanced AI training with safeguarding privacy and security.

Synthetic data serves as a crucial tool for training machine learning models, especially when real-world data poses privacy and security risks or includes sensitive personal information. These privacy safeguards pave the way for more robust and unbiased AI systems, steering clear of pitfalls inherent in real data, such as embedded biases or inaccuracies.
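For readers curious what this looks like in practice, the sketch below is a deliberately simple, hypothetical Python illustration of the idea: fit basic statistics to a made-up sensitive dataset and sample entirely new records from them, so that no real record is exposed to the model. It is not the approach DHS is soliciting; production systems typically add formal protections such as differential privacy and preserve correlations between columns.

# Minimal sketch: generate synthetic tabular data by fitting simple
# per-column statistics to a (made-up) real dataset and sampling fresh
# records. Illustrative toy only, not a production privacy technique.
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=42)

# Stand-in for a sensitive real dataset (hypothetical columns).
real = pd.DataFrame({
    "age": rng.integers(18, 90, size=1000),
    "visits_per_year": rng.poisson(3, size=1000),
    "region": rng.choice(["north", "south", "east", "west"], size=1000),
})

def synthesize(df: pd.DataFrame, n_rows: int) -> pd.DataFrame:
    """Sample synthetic rows column by column from fitted marginals."""
    out = {}
    for col in df.columns:
        if pd.api.types.is_numeric_dtype(df[col]):
            # Model numeric columns with a normal distribution fitted
            # to the real column's mean and standard deviation.
            out[col] = rng.normal(df[col].mean(), df[col].std(), size=n_rows)
        else:
            # Resample categorical columns according to observed frequencies.
            freqs = df[col].value_counts(normalize=True)
            out[col] = rng.choice(freqs.index, size=n_rows, p=freqs.values)
    return pd.DataFrame(out)

synthetic = synthesize(real, n_rows=500)
print(synthetic.head())

Because each column is sampled independently, relationships between columns are lost; avoiding that trade-off while still protecting individual records is one of the things more sophisticated synthetic data generators are designed to do.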

It is clear that transparency, ethical data usage, and respect for copyright are not optional extras but essential components of responsible AI development. Legislative efforts and industry practices must continually evolve to meet these challenges and ensure that AI's potential is harnessed in a way that is beneficial and fair for all.

As always, please send any feedback or topics of interest you would like to see covered.

Seena, Editor


News

UK police's use of facial recognition in passport database faces scrutiny

Recent reports reveal that since 2019, UK police have been using facial recognition technology to search the country's passport holder database - a practice not previously disclosed to the public. While the Home Office describes the practice as a tool for significant criminal investigations, concerns about the potential for misuse and the underlying legal framework have sparked a broad debate.

The Telegraph: Police secretly conducting facial recognition searches of passport database

Biometric Update: UK police have been secretly using passport database for facial recognition for 3 years

 

US DHS seeks synthetic data solutions to enhance privacy and security

The Department of Homeland Security's Science and Technology Directorate (S&T) is soliciting solutions for generating synthetic data that replicates real data patterns while ensuring privacy and security. The initiative emphasises creating versatile, privacy-protecting synthetic data.

Biometric Update: US Homeland Security asking after synthetic data systems

DHS: News Release: DHS S&T Announces New Solicitation for Synthetic Data Generator Solutions

 

Proposed updates to the UK's surveillance laws spark privacy concerns  

The UK government is accelerating efforts to expand its surveillance laws, causing alarm among tech firms and privacy advocates. The proposed amendments to the Investigatory Powers Act 2016 raise concerns about increased government control over tech companies' ability to update their products and implement end-to-end encryption.

Politico: Britain’s got some of Europe’s toughest surveillance laws. Now it wants more

IAPP: Proposed amendment to UK surveillance law draws criticism

 

New York Hospital settles for $300,000 over privacy breach

The New York Attorney General secured a $300,000 settlement from NewYork-Presbyterian Hospital for disclosing visitors' health information through tracking tools on its website. Following an investigation that revealed the misuse of sensitive health data collected via third-party tracking, the hospital agreed to implement new policies and procedures to safeguard patient data.

IAPP: New York attorney general announces $300K patient privacy settlement

New York State Attorney General: Attorney General James Secures $300,000 from NewYork-Presbyterian Hospital for Failing to Protect Patient Data

 

Australian Information Commissioner scrutinises TikTok over privacy breaches and data collection concerns  

TikTok faces allegations in Australia of privacy breaches and unauthorised data collection from devices, including those of people who do not use the app. Despite TikTok's denial of wrongdoing and assertions of compliance with Australian privacy laws, the Australian Information Commissioner is conducting preliminary inquiries to determine whether a full investigation is warranted.

ABC News: Claims TikTok siphons personal data of non-users without consent examined by Australian Information Commissioner

The Guardian: TikTok’s data collection being scrutinised by Australia’s privacy watchdog


AI Snippet of the Week

US lawmakers propose bill to mandate transparency in AI model data sources

US lawmakers have proposed the AI Foundation Model Transparency Act, which would require creators of AI foundation models to disclose their training data sources in order to address copyright concerns and ensure information accuracy. The bill calls for detailed reporting on data usage, model limitations, and efforts to prevent misinformation.

The Verge: AI companies would be required to disclose copyrighted training data under new bill

National Herald: AI companies may need to disclose copyrighted training data


Policy Updates

Florida Bill seeks to restrict minors under 16 from social media

Florida House Republicans have introduced a bill (HB 1) that would prohibit minors under 16 from having social media accounts and require platforms to implement strict age verification processes. The bill would also allow parents to request account terminations and empower the attorney general to file lawsuits against non-compliant platforms.

CBS News: Florida GOP lawmakers target social media use by children

Florida Politics: Florida Republicans prioritizing legislation that would bar minors from social media platforms


To subscribe to our fortnightly newsletter, please click here

Thanks for reading. If you have any suggestions for topics or content you would like to see covered in future, please drop a note to: info@secureredact.co.uk
