Oceans of data - how do they affect us?
Olgerta Tona studies various aspects of rapid technological development and its consequences for society, organisations and individuals. The large flow of collected data is of particular interest. How do we navigate the advantages and disadvantages, while taking account of the individual's need for privacy?
Organisational and societal implications of emerging technologies
– My research interest concerns the consequences of emerging technologies for organisations and for society in general, says Olgerta Tona. I am particularly interested in algorithm-supported decision-making, business analytics, and the use of personal data in digital form.
– In my PhD thesis at Lund University, I focused on mobile business intelligence, an area that was emerging at the time, and its impact on decision-making in organisations. I studied how employees and managers interact with and use data, and how that interaction and the insights they gained through data affected their decision-making.
– Then the Cambridge Analytica and Facebook scandal broke, and I became really interested in what is happening in society in connection with data. When data captures basically everything we do in our daily activities, and the choices and decisions we make – what does that mean for us as customers, patients, and citizens?
Project on algorithmic decision-making
Olgerta Tona and her colleagues are currently engaged in a study of an algorithm-based decision system that was used by a governmental institution abroad. The system was introduced to identify possible fraud quickly and easily, and thus recover wrongly paid amounts.
Unfortunately, a design flaw in the system meant that a large number of citizens were wrongly "flagged" as having received too-large payments and as now being in debt to the state.
Since the algorithm could handle such a large number of cases simultaneously, the consequences quickly grew. The staff who previously handled the cases manually had their routines, policies and regulations; they went through each case separately, and the process took, by comparison, a very long time. The algorithm could now handle thousands of decisions a day, and any incorrect conclusions on the system's side piled up in vast numbers before the error was discovered.
– In our project, we are trying to investigate what happened, why it happened, what the consequences were, and what kinds of measures should be taken to draw attention in time to incorrect conclusions that originate in a system, says Olgerta.
Personal data digitalisation and human dignity
– In our second research study, we look at the digitalisation of personal data and how personal data used by digital technology is integrated into our daily activities and choices. What consequences does the digitalisation of personal data actually have for people's privacy? It is a very interesting question.
– It's not easy to say that technology is simply good or bad, right? You always have to look at it from both sides.
– So we look at it from a human dignity perspective and try to see whether, when certain technologies or certain digital platforms use personal data in a certain way, this creates benefits and promotes a person's dignity, or whether it even creates threats to someone's dignity.
Olgerta Tona says that an even bigger challenge is how we navigate now that digital technology offers very large advantages but also poses certain threats.
– Use of a specific technology can, for example, be very beneficial for one group of people while harmful for another. How should we navigate this trade-off? You design digital technology with the idea that it will create a number of benefits. You want to streamline certain processes to save time, perhaps – but if the technology is used in ways other than those you accounted for, unexpected consequences can arise. The duality of digital technology is very interesting; you always have to manage and weigh both sides.
Data collection creates a digital "persona"
– What I also find interesting is the different possibilities we have to capture personal data: demographic data; behavioural data – what we do; location data – where we are, or at what time of day. All of this might feel like single data points, but there is great power in the data when you start combining them and creating profiles of people – what they think, and sometimes even the rationale behind the decisions people make, the reasons behind a certain action. And once you have an idea of the digital persona, you start categorising it based on similar attributes; you create categories where you start delivering services and products. It might seem simple, but there are many problems behind how this categorisation is done. It is one thing when you categorise yourself – "I think I belong to this particular group", for instance – because it is your own perception and experiences that matter. It is another thing when someone else does it for you: "I think you belong here".
– And we see problems here: people get stuck in filter bubbles and receive the same content over and over again. The biases a certain person has get amplified – and we end up with all the polarisation issues we have in our society. These are issues that are important to research and be aware of.
– Now we have all this data collection, and we have the profiling possibilities that come from combining data – so organisations are better prepared to offer people customised services and products. On the one hand, as a customer I can feel, "oh, I am so happy, someone else thinks about this for me and offers me things I am interested in, so I don't have to spend so much time searching". But on the other hand, your autonomy and freedom as an individual become a bit constrained. You tend to get a bit stuck, and you are usually not aware of that. You get a certain kind of news, for example, according to your profile – and it keeps repeating itself.
Devices give us control – and control us
Olgerta Tona also mentions the devices we carry that involve some form of surveillance. What do they mean for our daily lives? We have technology that counts the steps we take, measures our heart rate, our stress level and so on.
– The devices help you build knowledge about yourself; you can clearly see when you need more exercise or when you need to lower your stress level, for example. You can make well-informed choices based on the data collected about you: "Oh, I really have to get started on my physical activity since I am lagging behind." That is good, right? But if I depend solely on that digitalised data, I may not consider things that might be more important but can't be captured by a device. My attention can get stuck on whatever device I have, and I may miss certain signs – what I perceive and experience with my own body.
– Small children are another example. Many parents with a newborn at home worry and check that the baby is breathing, and so on. Now there is instead a device in the form of a sock that you put on the baby's foot to register the baby's heart rate. But you are worried, and you tend to keep the sock on all the time to check the graph – and that might also affect your relationship with the baby. Instead of spending all that time checking, you might have focused on cuddling and creating a closer bond with the baby.
– When the children become teenagers, there are other things that parents worry about and would like to monitor – where the teenagers are online, for example. But what happens to the trust between the teenager and the parents?
– Technology brings incredibly positive things to our society – what would we have done during the pandemic without technology, for example? But there are also consequences that we need to consider.
Text: Catharina Jerkbrant, 2023