Social Media: Just the messenger?

Social media not only shares the goings-on and news of the world, it also shapes and creates them.   This has become all the more apparent to me, and it worries me how often we may not be conscious of this.

Social media messages are short and simple.

The social media world relies on simple messages: a single image, 280 characters or a 60 second video.   The message needs to be simple as it is designed to be addictive and to get our attention, to fit into busy lives and to encourage us to flick from one message to the next, then the next.   I know I have found myself wasting 20 minutes just flicking through amusing TikTok videos, for example.   This is what the platform providers want us to do.   They want to keep us on their platform because this is how they earn their revenue, via advertising: the longer we are on their platform, the more data they can mine and the more advertising revenue they can achieve.

The world is simple?

My concern is that the features of the medium, in this case social media, influence the messages which are being conveyed.   But what does this mean for how we perceive the challenges of the world, as shared via social media?   I would suggest this is encouraging us to increasingly see the world's complex problems and challenges as being simple.   Almost every problem, whether it be global warming, the Covid-19 pandemic or racism, can be boiled down into a social media message.   And for every message sent, someone will be able to argue a counter position using the same medium and the same inherently short, bite-sized social media message.   As such, I think we may become less aware of the nuanced nature of the problems we are faced with.   Seeing every problem boiled down to a simple message may convince us that the problems themselves are simpler than, in the real world, they are.

Increasing binary viewpoints

This simplicity also makes it easier to see problems in terms of black-and-white or binary positions.   It makes it easy to see a statement on social media as either true or false, rather than seeing the complexity, and therefore the infinite number of possibilities, which may exist between two extreme positions.   And again, for every post stating one position there is at least one person, and often many, many people, able to reply with an opposing binary view.   This in turn could help to explain the increasing divisions in society, whether in relation to Brexit in the UK, Trump's presidency in the US, or a multitude of other news stories.

Reinforcement learning

Social media also makes us hyperconnected.   Having identified data about our usage patterns, social media platforms will purposefully expose us to content which fits with these patterns.   As a result, we will be repeatedly exposed to consistent messaging which, through reinforcement, may strengthen our commitment to the binary viewpoints we are encouraged to develop.   This may make our commitment to our position on specific issues, and to defending such positions, more fixed and immovable.   It can also impact our world view, including what we accept as truth and how positively, or how divided, we see the world we live in.

Social: Time to consider the medium and the message

Social media is here to stay.   It may not be Facebook, Twitter or TikTok in the future, but social media is highly likely to continue in some shape or form.   Watching the news reporting on concerns about social media, the focus tends to be on the messages being shared: on what social media companies should be doing to prevent extremism, suicide, etc.   I agree work needs to be done here; however, I also think we need to look at the vehicle for these messages and what this may mean for the society we live in.

Final Thoughts

Social media represents a magnified version of real life due to the nature of the medium.   This has its advantages in making it easy to consume and contribute to.   The flipside, however, is that it lacks the detail, the nuance and the complexity of real life.   We need to be more conscious of this, and to ensure our students are also conscious of this.   Only through being conscious of the impact of the medium can we seek to adjust for and minimise it.

Huawei: National needs vs. World Internet

The recent issue of Huawei equipment in the UK's 5G infrastructure highlights a challenge of the internet and technology: services and networks often cross international borders, yet the services and hardware are produced by companies which exist clearly within the borders of countries, and therefore potentially within the influence of their governments.   There is a clear tension here between the services provided to the internet and the companies providing them.

The Huawei case is very much about internet security.   The implication is that Huawei could be influenced by the Chinese government, who could then leverage the Huawei equipment installed in foreign countries' telco infrastructure to gather intelligence, modify or filter communications, or otherwise impact the operation of a country through control of its communications systems.   This all seems quite logical.   Who would want a foreign government to be able to exercise power over their infrastructure?

The issue for me here is that technologies, whether hardware or software, have to be created and then developed and deployed from somewhere in the world.   Apple devices, Microsoft Windows, Facebook, Google: all have to come from somewhere, and in doing so could be influenced by governments or political powers within that given location.   So the Huawei argument, from the perspective of a UK citizen, may equally be matched by Chinese concerns over Apple from the perspective of a Chinese citizen.   Looking to the US, there is even some precedent for being suspicious, with Kaspersky, which I note is a Russian firm, highlighting in 2015 that the NSA, a US intelligence agency, could “implant spyware on hard drives to conduct surveillance on computers around the world”.

Technology and technology services are used internationally, whether that is a Dell laptop, Dropbox cloud file storage or a newspaper website.   Often these products or services may use components from other organisations, such as Seagate hard drives in a laptop, or Google Analytics or Facebook's share and like buttons on a company's website.   This further complicates things.   The devices, services and components are all used without consideration for international borders.   Yet we live in a world where international borders exist, and where governments may have a stake in, or influence over, technology companies.   The risk of influence exists.

One solution is to block and to ban.   China blocks Google and YouTube, for example, and now it looks like the US and UK are banning Huawei.   Meanwhile, Russia is testing its own national internet system, separate from the “real” internet.   This may be the direction governments increasingly pursue, to block, ban or create in-country copies, but I don't see how this will work.   In China, VPNs provide a way to circumvent blocks, while I am sure Chinese semiconductors and microchips are already in many of the devices in our offices and homes.   If the service or device works for users, it will find its way into use no matter what governments choose to do.

The answer for me is an acceptance of the complexity of this predicament, and that countries will have their own motives or ends which they wish to pursue.   It is, in my view, a lose-lose situation.   Leave Huawei in place and accept the risk of Chinese influence, or remove Huawei, which will likely result in counter moves by the Chinese plus, assuming they are seeking to exert influence via technology, them targeting other parts of the worldwide internet infrastructure and services.

All we are left with is a risk-based judgement, which is what I believe must have been taken here.   The risk of counter action, of Chinese influence over other parts of the internet, and the additional cost of changing supplier, including removing existing Huawei technology, must have been judged to be less than the risk created by Huawei technology within the UK's core or edge network.   My worry here is the potential for bias in the decision making.   As Pinker (2018) points out, “people are poor at assessing probabilities”, so “if two scenarios are equally imaginable, they may be considered equally probable”.   Potentially the probability of destructive Chinese action against the UK may have been overestimated, in which case the preventative action taken in blocking Huawei may be excessive.   Or maybe it isn't!

And if you want to take this whole discussion a stage further, let's consider how companies might now influence the world.   Take, for example, Facebook which, if it were a country, would based on user numbers be the biggest in the world.   What if we accept that it too may have motives and ends to its actions, beyond simply providing the Facebook platform?   Google, Microsoft, Apple, Twitter, etc., may all be the same.   But that is possibly for another post.

References:

BBC News. 2020. Huawei 5G kit must be removed from UK by 2027. [ONLINE] Available at: https://www.bbc.co.uk/news/technology-53403793. [Accessed 16 July 2020].

CNet. 2015. NSA planted surveillance software on hard drives, report says. [ONLINE] Available at: https://www.cnet.com/news/nsa-planted-surveillance-software-on-hard-drives-report/. [Accessed 16 July 2020].

Pinker, S., 2018. Enlightenment Now. 1st ed. UK: Penguin House.

TechCrunch. 2019. Russia starts testing its own internal internet. [ONLINE] Available at: https://techcrunch.com/2019/12/26/russia-starts-testing-its-own-internal-internet/?guccounter=1. [Accessed 16 July 2020].

World Economic Forum. 2016. If social networks were countries, which would they be?. [ONLINE] Available at: https://www.weforum.org/agenda/2016/04/facebook-is-bigger-than-the-worlds-largest-country/#:~:text=If%20Facebook%20were%20a%20country%2C%20it%20would%20be,it%20each%20month%20-%20around%201.9%20billion%20people.. [Accessed 16 July 2020].

Digital Citizenship and Covid19

Before the Covid-19 crisis began, I presented on Digital Citizenship at the JISC DigiFest event and, previous to that, at the ISC Digital event in Brighton.   In both cases, one of my reasons for presenting was my concern that students' increasing use of technology was not being matched by an appropriate consideration or awareness of the risks.   I was worried that students were giving away large amounts of data without considering who they were providing it to, how it might be used, how long it would be kept, or how it might impact or be used to influence them as individuals, and as groups, in their future.   I was worried, and believed education and its educators needed to start to do something about this.

Since then the Covid-19 crisis has deepened, with lockdown rules introduced.   I have seen large numbers of schools and other organisations rushing to find solutions to the challenges this crisis has created.   New applications have been put into use, devices issued and configurations quickly changed to enable remote working and remote learning.   Applications such as Zoom and Houseparty have quickly achieved viral status as people adopt solutions to keep in touch and maintain some form of social interaction and community spirit in a time of isolation.   Decisions as to what to use have been based on which apps are easy to use and which are commonly used by others.   In all of this fast-paced change, home working and learning, and technology-enabled socialisation, the phrase that jumps to mind is “act in haste, repent at leisure”.

Teachers, parents, children, the elderly, in fact almost everyone, is now using more technology than they previously did, whether through new devices, new apps or the sheer time spent using technology.   With this use comes greater exposure to online risks such as cyber crime, and to risks associated with the data trail we leave behind us.

There is also the discussion of tracking apps to help identify where people have come into close contact with someone with the coronavirus or with coronavirus symptoms, or the use of thermal cameras and facial recognition for a similar purpose.   Although in the short term this appears to be in the interests of public good, this data might be used in the future for other purposes related to the tracking and profiling of individuals, or even to help with targeted campaigns aimed at influencing the behaviour of individuals or groups.

To be clear, the current situation is far from normal and far from ideal.   No-one chose for this to happen.   As a result, the decisions which are being made, and which need to be made, are borne out of necessity rather than being purposeful, fully planned and implemented with the implications carefully considered.   There is also the issue that, at the best of times, we can't predict the future, yet now we are presented with an even more unpredictable future and the need to make decisions now that will influence it.

If I was worried about Digital Citizenship before the Covid-19 crisis, I am even more worried now.

JISC DigiFest: Digital Citizenship

Following my DigiFest session, I thought I would share some of the thoughts which went into it.

It is important firstly to acknowledge that our views on technology are very much the result of our experiences.   My experiences include learning to code in BASIC on the Commodore 64 at an early age, before moving on to AMOS BASIC on the Amiga and then QBasic, Visual Basic and C++ on the PC.   This early use of technology, and the ability to develop software to solve problems, has very much shaped my views.   Today I walk around with a mobile phone with over a million times more memory than my Commodore 64 from less than 30 years earlier, and the growth across that period has not been linear.   A good illustration of this lies in how long it took various technologies to reach 50 million users.   Radio took 75 years, whereas TV took only 38 years.   Bringing us closer to today, Facebook got the time to 50 million users down to 3.5 years, before Pokémon Go managed it in less than a single month.   It is clear from this that the pace of change is quickening.

Looking at our use of technology today, we find that most of us now use technology for communication or entertainment in the form of mobile phones, social media and on-demand TV.   We are also increasingly required to use technology to access government services, council services, banks, etc.   Technology is now integral to our lives and here to stay, complete with the ever-quickening pace of change mentioned earlier.

The more I think about the pace of change and the way technology is becoming an integral part of our everyday lives, the more the movie Ready Player One comes to mind.   In the movie, Wade Watts uses virtual reality to live a double life, living as Parzival in VR.   As the film progresses, it becomes clear that his two lives aren't as separate as he would like, and that events in virtual reality impact on real life and vice versa.   For us, like Wade Watts, our real lives are inseparably linked to our digital lives.   In fact, I believe it no longer serves us to think of digital citizenship, as the term implies that there is something else available, a non-digital citizenship, when in fact there is not.   Possibly the discussion should not be of digital citizenship at all, but simply citizenship.   As Danah Boyd said in her book It's Complicated, although the apps might change, our online connectedness, our need to share and the challenges around privacy are “here to stay”.

Resulting from this new technology there are benefits, or potential benefits, and we need to acknowledge this.   Examples include the current exploration of self-driving vehicles, plus the recent use of choreographed drones as an alternative to traditional New Year's Day fireworks.   In relation to current events around the globe, there is also the use of Artificial Intelligence (AI) to identify new antibiotics and other drugs.   We need to prepare to make the best of these new opportunities and to ensure the students in our educational establishments are prepared too.

But with the above benefits there are also risks.   Fake news, and the ease with which videos, including interviews, can be faked, will make it increasingly difficult to tell fact from fiction.   We also have challenges to individual privacy, risks around habits and potentially addictive behaviour, plus the potential for platforms to go so far as to actually shape and influence human behaviour.

The danger in the benefits and risks of technology is the currently common resultant binary view of technology as either infinitely good or inherently bad and evil.   Sadly, these views are of little use: to view technology as purely good is naïve, while to consider it purely negative is equally naïve and simplistic.   The reality is that technology, and more particularly the use of technology for a given purpose, will lie between the extremes of good and evil, positive and negative.   Any use of technology is likely to have its positives but also its drawbacks or unintended consequences, and therefore we need to consider carefully the pros and cons and seek a balance.

Looking at how we prepare our students for the world and the issues listed above, I can see the things which we do satisfactorily, through our eSafety programmes; however, I can also see those areas where little or nothing is currently offered.   We currently discuss the importance of privacy settings on social media, of having strong passwords, of how online content, once posted, will remain permanent, and of the need to be aware of bullying online.   These areas are covered.   Sadly, however, little is said about the conflict between user convenience and individual privacy, between individual privacy and public good, and between social media reporting on, or actually creating, the news and truths which we come to believe.   These are the areas which we need to discuss, for which there isn't a single answer, and therefore where the most we can do is help students develop their own views through discussion.   It is through discussion that we can hopefully ensure that students, when presented with the infinite challenges of technology use, will approach them with their eyes wide open.

This brings me nicely to raising a couple of examples, from the many examples available, which would make valuable discussion topics for use in our schools.

Algorithms and AIs can be manipulated by an individual or organisation, to their own ends.   

Do we understand why algorithms might exist?   Do we understand why an individual or organisation might seek to “game” an algorithm, and the potential gains which may arise?   The use of a collection of mobile phones to fool Google's traffic analysis algorithm into identifying a traffic jam where one does not exist, resulting in it redirecting traffic away from a given street, is one simple example of what is possible.
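To make this concrete for classroom discussion, here is a toy sketch in Python. It is entirely my own invention, not Google's actual algorithm: a hypothetical congestion detector that infers a jam from the number of devices reporting slow speeds on a street, and a demonstration of how a pile of stationary phones fools it.

```python
# Toy congestion detector: a hypothetical stand-in for a real
# traffic-analysis algorithm, invented purely for illustration.

def is_congested(pings, min_devices=10, slow_kmh=5):
    """pings: list of (device_id, speed_kmh) reports from one street."""
    slow_devices = {dev for dev, speed in pings if speed <= slow_kmh}
    return len(slow_devices) >= min_devices

# Normal traffic: a handful of cars moving freely -> no jam inferred.
real_traffic = [(f"car{i}", 40) for i in range(5)]
print(is_congested(real_traffic))   # False

# The trick: add 99 stationary phones "travelling" the same street.
spoofed = real_traffic + [(f"phone{i}", 0) for i in range(99)]
print(is_congested(spoofed))        # True - a jam where none exists
```

The point for discussion: the detector cannot distinguish 99 phones in a handcart from 99 cars at a standstill, and any algorithm fed self-reported data faces the same weakness.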

Governments can filter and censor content based on political motivations.    

Do governments need to be able to filter content for public safety?   Could their filtering be used to shape public perception, or to revise facts to their own political ends and political gain?   What is truth, and should governments be allowed to control and revise it?   We have already seen governments filtering internet content, with their filtering then being identified as lacking transparency and serving their own self-interest; the filtering of TikTok being one possible example of this.

Online companies can gather and sell your data for profit.  

Do companies need to gather all the data which they gather?   Do they have the right to sell this data?   Where data is anonymised, is it possible for data sets to be combined in a way that reverses the anonymisation process?   A simple example of this is a cellular carrier selling on viewing-habit data.
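As a small illustration of that last question, here is a sketch in Python. The data and column names are invented for the example, not taken from any real dataset: it shows how an "anonymised" dataset, joined with a public one on shared quasi-identifiers such as postcode and birth year, can re-identify the individuals behind it.

```python
# Hypothetical example of re-identification by combining datasets.
# All records and field names below are invented for illustration.

# Dataset A: "anonymised" viewing habits sold on (no names, just attributes).
viewing = [
    {"postcode": "BN1 3XX", "birth_year": 1985, "watched": "News at 10"},
    {"postcode": "BN1 3XX", "birth_year": 1992, "watched": "Top Gear"},
]

# Dataset B: a public register with names and the same quasi-identifiers.
register = [
    {"name": "Alice", "postcode": "BN1 3XX", "birth_year": 1985},
    {"name": "Bob",   "postcode": "BN1 3XX", "birth_year": 1992},
]

def reidentify(anon_rows, known_rows, keys=("postcode", "birth_year")):
    """Join the anonymised rows to the named rows on shared attributes."""
    lookup = {tuple(r[k] for k in keys): r["name"] for r in known_rows}
    return [{**row, "name": lookup.get(tuple(row[k] for k in keys))}
            for row in anon_rows]

for row in reidentify(viewing, register):
    print(row["name"], "watched", row["watched"])  # e.g. Alice watched News at 10
```

Neither dataset alone names a viewer, yet the join undoes the anonymisation; this is why "we only sell anonymised data" deserves discussion rather than acceptance.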

Mary Aiken, in her book The Cyber Effect, identifies the need for us to “make sense of what's happening”, and only through discussion is this likely to occur.   One concern I have, however, is where these discussions might happen.   In the current crowded curriculum, they tend to be banished to the IT classroom, a subject which not all students will study.   I don't think this is sufficient.   These discussions need to take place throughout schools, across the subject areas, across the stages, with students, with staff, with parents and with the local community.   Discussing the challenges of technology needs to become part of the culture, simply the way we do things around here.

As Danah Boyd stated, “Collaboratively, adults and youth can help create a networked world that we all want to live in”.   If I am to ask anything following my session at DigiFest, I would ask this: let's begin a discussion in our schools, colleges and universities, any citizenship-related discussion in which technology has its part to play, complete with its pros and cons, but let's do it today.

You can access my full presentation from DigiFest 2020 here.

Final Note: As we now engage in much more home and distance learning due to the coronavirus, it may be more important than ever for these discussions to happen, and to happen now!

JISC DigiFest: Thoughts from Day 1

I thought I would share some initial thoughts following day one of JISC DigiFest.   The event was launched with a very polished and professional pre-prepared video, displayed on screens scattered around the event's main hall, focussing on the rate of technological change and some of its implications for the world we live in.   The launch session also included a room-height “virtual” event guide introducing the sessions and pointing you in the direction of the appropriate hall.   In terms of the launch of a conference, this was the most polished and inspiring I have seen, albeit on reflection there wasn't much particularly innovative or technically complex about it.

The keynote speaker addressed the changing viewpoints of different generations, focussing particularly on Generation Z, the generation currently in our sixth forms, colleges and universities.   I took away two key points from the presentation.   The first was how each generation's views are shaped by its experiences, particularly between the ages of 12 and 20.   Jonah Stillman used thoughts on space as an example, showing how Generation X might have positive views focussed on the successes of the moon landing, whereas Millennials may have a more cynical view following the Challenger disaster.   Additionally, Jonah mentioned movies as a social influencer, and how those in the Harry Potter generation may view cooperation and trying hard, even where unsuccessful, in a positive manner.   Those born later may draw on another series of films, The Hunger Games, resulting in a greater tendency towards competition and the need to succeed, in line with the movies' storyline of everyone for themselves, where failure results in death.

The second takeaway resulted from the questioning at the end of the session around what some saw as the absoluteness of the boundaries between generations.   I think Jonah's use of the word “tendency” addressed this concern, in that the purpose of the labels was simplicity, to indicate a general trend rather than to suggest that all people born on certain dates exhibit a certain trait.   It increasingly concerns me that this argument keeps coming up, when surely it is clear that there is a need for simplistic models to aid clarity of explanation, and that no model, no matter how complex, will ever truly capture the real complexity of the world we live in.

My second session was actually the delivery of my own.   I will be sharing some thoughts on my presentation, along with my resources, in the near future.   For now I will simply say that the session was not one of my best.   I do, however, hope that my main message, the need for greater and broader discussion of citizenship within the now digital world we find ourselves living in, was clear.

The third session of the day focussed on a digital literacy programme one particular university had developed.   I found it interesting, in this and a later presentation, how digital literacy or digital citizenship appeared often to fall to the library in universities, in terms of developing and delivering a programme.   In schools I feel the same topics tend to fall on the IT teaching department rather than the library; either way, it is interesting that something which should be pervasive finds itself localised within a single department rather than being supported across the institution.   It was also interesting how the programme the university developed had evolved over time, which seems to me the correct approach given how quickly technology is changing, and how the student voice suggested needs which students later indicated they did not find useful.   In other words, students themselves were not an accurate judge of their own wants and needs.

Session five followed a similar topic, again looking at digital literacy; however, the presenters used fairy tales as a vehicle to deliver their message about the pros and cons of the digital world we live in.   I must admit I enjoyed this presentation for its novel approach to delivering the concept in hand.

Session four focussed on partnerships between a university, a local council and a number of corporate organisations, focusing in particular on data analysis and business intelligence.   I think schools have some way to go in this area: they regularly gather huge amounts of data, yet little is actually done with it beyond reporting to school leaders, parents, etc.   The challenge is that schools often lack the resources a college or university may have at its disposal, such as data scientists as part of the staff body.   That said, the session seemed to indicate the potential for schools to leverage partnerships to fill this gap with minimal to no outlay of their own resources.

My final session of day one focussed on digital transformation and, like the keynote, was insightful and inspiring.   Lindsay Herbert discussed the bear in the room, which is similar to the elephant in the room but more dangerous.   I particularly liked the way Lindsay stated early on that the world was a “terrible place”, citing issues such as the coronavirus, fires in Australia, storms across the UK and ongoing technological change.   She then quickly moved on to the fact that we are inherently brave in our attempt not only to exist but to strive to flourish in this world, before going on to identify various success stories where the bear in the room was tackled.   She left us with three main tips, all of which struck a chord with me: transformation starts with a worthy cause, requires lots of people, and needs to be learned and earned rather than purchased.   The third tip in particular strikes a chord, as I have encountered change which did not go as smoothly as I would have liked, and where significantly more effort was expended than had originally been intended; in retrospect this may have been the change being earned, and it certainly involved a lot of learning.

Day one was useful, with the keynote and the closing session being my highlights.   I have plenty of notes to digest when I get back home.   Roll on day two.

Digital Citizenship

For a while now I have been sharing various online articles which I believe relate to Digital Citizenship via Twitter, and sometimes via LinkedIn.   It recently occurred to me that it might be useful to curate these tweets so that teachers looking for discussion material on specific aspects of Digital Citizenship might be able to use them.

To that end I created three Wakelets based on three themes which I thought were reasonably common in relation to Digital Citizenship.

  • AI, Drones, Driverless cars and the other societal changes which tech may bring

https://wke.lt/w/s/kJ3z2B

  • Cyber Security, Data Protection and Big Data

https://wke.lt/w/s/XFOeIs

  • To ban or not to ban?

https://wke.lt/w/s/09MVpQ

It may be that in future I will expand the number of themes; I suspect this is highly likely, but for now the above are hopefully a good starting point.

In addition, for ease, I have created a separate section on my site for this curated Digital Citizenship content, in case anyone wants to bookmark it.   This section is also available via the site's menu structure.

EdTech Summit, Brighton

I had the opportunity to present at the Brighton ISC Digital EdTech summit during the week.   My talk, “Common Sense Safeguarding”, focussed on the need for schools to take a broader, more risk-based view of online safety, as opposed to the previous, more compliance-driven approach.   Given the number and range of technologies students have access to, the tools available to bypass protective measures put in place by a school, and even the ability to negate them totally by using 4G, online safety is no longer as simple as it once was.   It therefore needs a broader view to be taken.

In addition, I identified that in our dealing with online safety we are not yet effectively addressing the issues which are growing with our increasing use of digital resources and services.   Cyber security, big data, profiling, artificial intelligence and bias, the ethics of IT systems and similar broad topics do not yet have a key place in the general curriculum, albeit opportunities exist across different subjects.   We need to ensure these issues are discussed with all students.   It was to that end that I proposed a cross-school discussion group focussed on Digital Citizenship.

Overall, my view is that we need to be more aware of the limitations of preventative measures such as web filtering, and need to focus more on user awareness and on having discussions with students regarding the wider implications of staying safe and being successful in a digital world.

If you are interested in being part of a group of schools discussing Digital Citizenship please fill out this Microsoft Form and to access my slides from the EdTech Summit please click here.

Digital Citizenship Questions

I think it is so important that schools ensure discussions about living in the digital world are encouraged throughout the school.   It is only through discussing the positives and negatives of the increasingly digital lives we live that we can prepare our students for the world they live in and the world yet to come.

To that end I recently started creating some slides with questions to be used as a stimulus in discussing digital citizenship.

Here is my first set of slides: Some digital citizenship questions.   I do hope you find them useful; please do let me know your thoughts and any suggestions as to how I can build on or improve them.

Safeguarding: A need for a broader focus

Cyber security has very much adopted a “not if, but when” mentality, signifying the need for a risk-management approach to cyber security risks as opposed to the older, compliance-driven approach.   It is my belief that we need to take a similar approach when it comes to the online safeguarding of students.

There was a time when having internet filtering on school computers and an acceptable usage policy was enough to check the safeguarding compliance boxes and be satisfied that a school had sufficiently met its safeguarding needs.   I remember those days, when I would check the school's net history on a weekly basis to adjust the filtering and restrict student access to game sites in particular.

Today we find students have phones and other mobile devices which they bring to school, some due to a school BYOD policy and some simply because having a smartphone is now part of normal everyday life.   These devices all come complete with internet access, including access to social media.   Where a school might employ monitoring technologies, students can make use of proxy servers, VPNs or an onion browser, among other methods, to attempt to bypass them.   I recently came across a site which would allow anonymous hosting via the Tor network with little to no technical knowledge required.   Students might even simply revert to 4G, or even 5G, to totally circumvent the school's network and any precautions the school may have put in place.   In the near future, DNS over HTTPS may become the norm, which would make blocking and filtering even more difficult.

In this world we need to accept that, no matter what technical measures a school puts in place, students will be able to find a way around them.   The resultant cat-and-mouse game between staff and students, with students finding workarounds and staff then seeking to negate them, serves no-one, only consuming time and energy on both sides.   It is also unlikely to be successful, so we need to accept that, in attempting to safeguard students, preventing their access to certain sites and services is likely to be ineffective.   Given this, the safeguarding focus needs to shift significantly towards awareness and education.   We need to look seriously at the discussions about safeguarding which are happening in schools.   The opportunities already exist in various subject areas to discuss the implications of big data, cyber security, artificial intelligence, fake news and data profiling, to name but a few.   We need to ensure that such opportunities are taken, and that all schools are confident they have addressed safeguarding and that thorough discussion with students has taken place.   The current political campaigning, for example, represents a great opportunity to discuss how social media may both report the news and also shape and create it, even influencing people's decision-making.

Online safeguarding used to be a simpler compliance exercise, and to some extent these requirements still exist (the safeguarding guidance certainly still points towards this approach); however, we need to take a more holistic view and a broader focus.   Simply filtering or monitoring specific keywords or categories, or banning devices, is not enough.