    New app that can actually read your emotions comes under heavy fire


    By Bob Unruh, WND News Center

    The program would have to monitor ‘everything’

    A coalition of privacy organizations and allies is urging a leading digital media company to stop developing a new technology that purportedly can listen to a person and identify his or her emotional state.

    The Electronic Privacy Information Center and more than 100 recording artists, 69 non-profit groups and many prominent individuals are urging Spotify to close down its research.

    The campaign was prompted by a new patent Spotify obtained that would “allow the company to identify individuals’ ‘emotional state, gender, age, or accent’ to recommend music.”

    But there are significant concerns that aren’t being addressed, the critics said in a letter.

    EPIC cites multiple threats of “emotional manipulation, discrimination, massive privacy violations and increased inequality within the music industry.”

    While Spotify argued that the technology has not been implemented and claimed it has “no plans” to do so, EPIC said concerns remain.

    The letter, dated this week, is addressed to Daniel Ek, co-founder of the company, in Sweden.


    “We write to you as a group of concerned musicians and human rights organizations from across the globe who are deeply alarmed by Spotify’s recently approved speech-recognition patent,” the groups said.

    “Spotify claims that the technology can detect, among other things, ‘emotional state, gender, age, or accent’ to recommend music. This recommendation technology is dangerous, a violation of privacy and other human rights, and should not be implemented by Spotify or any other company.”
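
    For context on the kind of system the letter describes, below is a deliberately simplified Python sketch of how a voice-driven recommender could, in principle, be wired together: extract crude acoustic features from captured speech, map them to a mood label, and steer recommendations from that label. Everything in it (the features, the thresholds, the playlist mapping) is invented for illustration and is not the method claimed in Spotify’s patent.

    # Hypothetical illustration only; not Spotify's patented method.
    # General shape of a voice-based "mood to music" pipeline:
    # crude acoustic features -> mood label -> genre recommendation.
    import numpy as np

    def extract_features(waveform: np.ndarray, sample_rate: int) -> dict:
        """Compute two toy acoustic features from a mono waveform."""
        energy = float(np.mean(waveform ** 2))                         # overall loudness
        zcr = float(np.mean(np.abs(np.diff(np.sign(waveform)))) / 2)   # zero-crossing rate per sample
        pitch_proxy = zcr * sample_rate / 2                            # rough pitch estimate in Hz
        return {"energy": energy, "pitch_proxy": pitch_proxy}

    def infer_mood(features: dict) -> str:
        """Map the toy features to a mood label using arbitrary thresholds."""
        if features["energy"] > 0.05 and features["pitch_proxy"] > 200:
            return "excited"
        if features["energy"] < 0.01:
            return "calm"
        return "neutral"

    MOOD_TO_PLAYLIST = {  # invented mapping, for illustration
        "excited": "high-tempo dance",
        "calm": "ambient / acoustic",
        "neutral": "the listener's usual mix",
    }

    if __name__ == "__main__":
        sr = 16_000
        t = np.linspace(0, 1, sr, endpoint=False)
        fake_voice = 0.4 * np.sin(2 * np.pi * 220 * t)  # stand-in for captured speech
        mood = infer_mood(extract_features(fake_voice, sr))
        print(f"Inferred mood: {mood} -> recommend: {MOOD_TO_PLAYLIST[mood]}")

    Even a toy pipeline like this makes the letter’s point concrete: to produce a mood label at all, the system has to capture and analyze the listener’s voice in the first place, which is why the critics say the program would have to monitor “everything.”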

    The letter explained the danger: “Monitoring emotional state, and making recommendations based on it, puts the entity that deploys the tech in a dangerous position of power in relation to a user.”

    Such discrimination would be unavoidable, the letter said.

    “It is impossible to infer gender without discriminating against trans and non-binary people, and others who do not fit gender stereotypes. It is also impossible to infer someone’s music taste based on accent, without assuming there’s a ‘normal’ way of speaking or falling into racist stereotypes.”

    The threat to privacy is obvious, because the program would have to monitor “everything.”

    “While we are pleased to hear that Spotify has no current plans to deploy the technology, it begs the question: why are you exploring its use? We call on your company to make a public commitment to never use, license, sell, or monetize the recommendation technology.”

    Signers include Amnesty International, the Center for Digital Democracy, Heartland Initiative, the Mozilla Foundation and Public Citizen.

    This story was originally published by the WND News Center.

