
Scrutiny falls on Facebook following reports of data harvesting for Trump campaign

Since The Guardian broke this story over the weekend, concern has grown over how this happened and why nothing was done earlier to rectify the issue. #DeleteFacebook is trending, with many users understandably angry over how their data was allegedly harvested and used to influence elections.


What do you think should be done, firstly to rectify this issue and secondly to prevent things like this happening in future?

  • John Haith:

Does anyone know the specific details of what Facebook data was actually leaked? I haven't yet found an article that explains exactly what was unwittingly shared.




From what I've read, it appears that the data was deliberately shared; nothing I've read indicates it was shared unwittingly or through a cyber-security breach. The university researcher who built the quiz used a published API to collect psychographic, demographic and firmographic data on many millions of people, as well as data on their friends, and is reported to have transferred this data to Cambridge Analytica — which, if true, some would consider at the least a breach of research ethics.
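To make the "data on their friends" point concrete, here is a hypothetical sketch of the fan-out pattern the pre-2014 Graph API v1.0 permitted: a token granted by one consenting quiz-taker could enumerate that user's friends and then request fields from each friend's profile, without those friends' direct consent. The function and its parameters are illustrative, not the researcher's actual code, and this permission model was removed with Graph API v2.0.

```python
from urllib.parse import urlencode

GRAPH = "https://graph.facebook.com/v1.0"  # long-retired API version

def friend_data_urls(access_token, friend_ids, fields=("likes", "location")):
    """Build the requests one consenting user's token could issue:
    first enumerate that user's friends, then fetch profile fields
    for each friend (illustrative sketch only)."""
    # One token from one quiz-taker...
    urls = [f"{GRAPH}/me/friends?{urlencode({'access_token': access_token})}"]
    # ...fans out into one request per friend, none of whom consented.
    for fid in friend_ids:
        qs = urlencode({"fields": ",".join(fields), "access_token": access_token})
        urls.append(f"{GRAPH}/{fid}?{qs}")
    return urls
```

The multiplier is the point: a few hundred thousand quiz-takers, each with a few hundred friends, scales to data on tens of millions of people.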


The rules of the game for most social network platforms are that users get the services they want, and platforms sell the psychographic data advertisers are willing to pay for, so that advertisers can develop persuasive messages targeted, ideally, at each user based on their individual psychographic profile (psychological attributes). Psychographics has been applied to the study of personality, values, opinions, attitudes, interests, and lifestyles; while it is often equated with lifestyle research, it also covers cognitive attributes such as beliefs and behavior. Beyond advertising, psychographic profiles are used in market segmentation, political segmentation, and for microtargeting messages or news stories at narrow constituencies or communities. Psychographic algorithms can either amplify or attenuate the content that reaches users, which is where the vulnerability to society lies. As with most technologies, this can serve social good (public health messages) or socially corrosive agendas. Are we inadvertently giving potential adversaries the psychographic tools to manipulate our society? Is it a case of who pays, wins?
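The amplify-or-attenuate mechanism can be sketched as a toy ranking rule (all names, weights and the neutral point are hypothetical, not any platform's actual algorithm): score how well a story's traits match a user's psychographic profile, then scale the story's reach up or down around a neutral score.

```python
def match_score(user_traits, content_traits):
    """Fraction of the content's traits that the user shares (0.0 to 1.0)."""
    shared = set(user_traits) & set(content_traits)
    return len(shared) / len(content_traits) if content_traits else 0.0

def adjusted_reach(base_reach, user_traits, content_traits, gain=2.0):
    """Amplify reach above a neutral match of 0.5, attenuate below it.
    With gain=2.0, a perfect match doubles reach and a total
    mismatch suppresses the story entirely."""
    score = match_score(user_traits, content_traits)
    return base_reach * (1.0 + gain * (score - 0.5))
```

Even this crude rule shows the societal lever: whoever supplies the profiles and pays for the targeting decides which stories get doubled and which get silently zeroed out.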


So how, then, can social networks eliminate bad actors and autonomous bots that artificially promote or suppress content, posts, or fabricated news stories in order to manipulate, sow discontent, and divide constituencies? Social network platforms need to balance their commercial interests, society's interests and users' interests. But how?


• Can social platforms develop standards that force transparency around who's paying for advertisements or sponsored content?

• Should users have an opt-out option, where they can pay for the services they value without any collection and sale of their usage or user data?

• Should users have the right to export their data from existing social networks and import it into new software platforms, letting them experiment with new services while maintaining social contacts on existing ones?

    • Should industry regulate itself or is government regulation required?


Perhaps the scholars and policy makers among us can offer their remedies!