Talk to someone now. Call our National Helpline on 1800 33 4673. You can also chat online or email

Season 2, episode 16

In Depth with eSafety Educator Cara Webber


As a Senior Education Advisor at Australia’s online safety regulator, the eSafety Commissioner, Cara Webber’s job is to teach people how to stay safe online.

In this role she spends much of her time speaking to young people, their parents, and teachers about how kids can have harm-free online experiences. She is particularly interested in encouraging young people to take a leadership role in shaping happy and healthy digital cultures.

In this episode of In Depth, Cara talks about some of the measures the Commissioner is taking to help protect young people from websites and content that can trigger body image issues, disordered eating patterns, or eating disorders. It’s not always easy, but the Commissioner is there to support and advocate for young people at risk.

Learn more about the eSafety Commissioner

REPORT HARMFUL CONTENT

Sam Ikin:

In this episode of Butterfly: Let’s Talk In Depth, we’re talking to Cara Webber, whose job it is to help people understand how to keep their online experiences safe and positive.

Cara Webber:

I’m a Senior Education Advisor for the Office of the eSafety Commissioner, or eSafety.

Sam:

We know social media sites can be drivers of negative body image, and we talk about it all the time on this show. But according to extensive research, young people are particularly at risk. The internet is packed with unrealistic images and ideals, and it can be an incubator for mental illness, including, and in particular, eating disorders. But short of just shutting down all social media, which is obviously never going to happen, what can possibly be done? Well, Cara says a lot is being done, and one of the main focuses of the eSafety Commissioner is to look after people’s mental health and wellbeing.

Cara:

You could almost say that all of our education and prevention programs around eSafety are focused on the health and wellbeing, or the mental health, of all Australians, but particularly young Australians. We focus on that predominantly through schools and the types of education we put forward via schools, but also through our messaging and campaigning directly to young people and their parents, as well as the significant adults in their lives.

Sam:

How is it that social media is affecting kids in such a negative way?

Cara:

It’s fascinating, isn’t it? Because young people developmentally are going through really rapid change. There’s so much going on for them in terms of seeking a sense of belonging, looking for validation and experimenting with their identity. The amazing thing about social media is it gives so many positive opportunities for those normal developmental behaviours to play out. I think the problem is it can exacerbate certain impulses or tendencies for young people during their developmental stages, and that’s the area we tend to focus on, so that natural experimentation with risk doesn’t necessarily equate to harm. That’s where we try to intersect our education. It’s very normal for young people to be trying things out, to be finding their tribe online, to be finding like-minded people, but we need to keep abreast of the potential risks they’re taking online. Risk can really assist with growth and development, but we make sure that it doesn’t equate to harm. So, when we talk to parents and young people, it really distils down to a set of core values. These include making sure that young people will seek out trusted adults or trusted people in their lives to talk to when something has gone wrong in online spaces, that they shake any stigma around reporting negative behaviours they see online, and that they do seek support.

Sam:

So, what are some of the most important things that we can start teaching kids early on when they first start using the internet?

Cara:

So much of it is about critical literacy and being able to give young people the skills to unpack the content they’re seeing, and that really needs to start in the early years. From the moment a young person is starting to be literate or cognisant of the world around them, we need to give them the skills to unpack the content they see – and that includes everything from visual imagery through to written text. So being able to read or view images for agenda, for how they’ve been composed, what the author or creator might be trying to achieve, the ability to identify falsehoods or misinformation, and making sure that as they progress through their primary and secondary school years, their ability to unpack or deconstruct information becomes much more sophisticated. Whether it’s content they pursue deliberately or discover unintentionally, they need the ability to assess its credibility and understand what might be driving its composition.

Sam:

And so, more specifically, how does all this affect kids who are more vulnerable to eating disorders in particular?

Cara:

So much of it comes down to the way that we represent gender identity. So right from an early age, looking at the ways that we typically present masculinity and femininity, for example. If we can help young girls to look at the way femininity is presented to them, and the fact that they are surrounded by quite a glamorised or sexualised culture, we can help them understand that that is actually just one way that a girl, or femininity, can be. So, the skills to unpack their own gender, and similarly with the way that masculine identity is presented to them. You might be surrounded by cartoons presented in pink, or advertised to with Barbie and the very pink, girly kinds of toys, but that is only one way of being. So, from a very young age we try to help them unpack the way they are advertised to via toy catalogues. That would be one quick, simple way of opening a conversation: “Well, why are you being pitched to in this way? What are they trying to suggest?” Knowing that girls in particular are more likely to face eating disorders, it’s important for them, right from an early age, to be presented with multiple forms of what can be beautiful. We want them to think about what is important about their identity and how they see themselves as women. It’s not necessarily just a one-lens way of seeing themselves.

Sam:

We know that every day there are hundreds of instances of young people who are suffering from eating disorders going online in an almost self-destructive kind of way to hear from people who don’t have their best interests at heart, who are in fact encouraging them to make their problem worse. Is there a way to prevent that kind of thing from happening?

Cara:

Yeah, it’s a really great question. One of the challenges we face is that so much of the content that gets hosted or posted happens overseas, so you’re dealing with cross-jurisdictional issues. It’s not as easy as regulating globally; we can only focus on content that is produced and uploaded in Australia. So when it comes to young people seeking out certain types of information, again, it comes back to giving them the skills to understand that it’s okay to be curious, but they might find themselves entering spaces that make them feel uncomfortable. If they are finding their sense of identity by finding other people, following the same hashtags or joining communities, they need to be countering that by talking openly with the adults around them about what they’re doing. We want to make sure they have been given the skills to deconstruct what they’re seeing as well. I think we can continue to put pressure on media companies to have a sense of social responsibility and be quick to remove content that may be harmful or negatively impact a young person who is particularly vulnerable at that time. One of the biggest issues we face is that there’s just such a huge amount of content uploaded every day. Unless we have human moderators who can keep up with that volume of content and unpack the context and nuance around certain posts, we’re going to see stuff slip through the cracks.

As for AI trying to identify and automatically remove content that may be related to harmful topics, it’s very hard for AI to read nuance and to understand a backstory, or why certain images, pictures or posts may be detrimental.

Sam:

We know that there are some issues that social media platforms are quick to moderate or prevent from being discussed at all. But it doesn’t look like eating disorders, in particular, are on their radar in any significant way. Do you see that changing?

Cara:

That’s a very good point. I think if you look at a platform like Instagram, if you were to put in a hashtag, say anorexia, it comes up with a firm notice saying that this could potentially be harmful or distressing content. Rather than focusing on the removal of inappropriate content, what they’ve tried to do is build communities of support around certain hashtags that young people may be seeking. So, for example, when you do type in hashtag anorexia on Instagram, it comes up with communities who are trying to further healthy eating and safe exercising, and there are tips about how to feel good about yourself without focusing too much on the external. So I think they’ve shifted their approach to be more about building communities of support than focusing on the onslaught of content that may be distressing for one person but okay for another.

Sam:

Well, obviously the wheels are turning and it’s wonderful to see things moving in the right direction. But despite their best efforts, we still have a long way to go regarding safety on the internet. I don’t know, am I correct in saying that?

Cara:

I think that’s certainly true, and we’re only just starting to understand what we’re dealing with in such a rapidly changing and evolving landscape where policy hasn’t been created yet. The more we can futurecast, the more we can actually predict issues that are around the corner or match what is going on developmentally with young people with the potential impact of content, then we can start to shape policies and innovation around preventing issues in the first instance. One of the Commissioner’s initiatives is Safety by Design. This is basically putting user rights and user safety at the heart of any innovation, at the point of conception, prior to development. So rather than us trying to retrofit or respond to issues that are raising their heads, like inappropriate or harmful content, we’re actually stopping the mechanisms that allow potentially socially damaging stuff to appear in technology in the first place.

I think what we need to continue to do is help young people to understand that there is no shame in reporting or seeking support. When it comes to the removal of content around eating disorders, for example, often the material we have removed, which could be deemed harmful, offensive or potentially damaging to a young person, has come to us as a result of bullying that occurred after those images appeared. So we’re dealing with an issue where young people may be posting images into communities and, later on, those images are used against them in a form of bullying. We are then able to remove that content.

So once a young person has reported to us that they are being seriously harassed, intimidated or abused, and often it’s through images, we can actually work to have that material taken down. So there are two things there: we also need to make sure that young people understand that the stuff they post doesn’t necessarily leave the internet and can be used adversely down the track.

Sam:

So what you’re saying is once the line is crossed platforms are taking quite decisive action, but until they get to that point, wherever that point may be, it’s about education and helping young people to navigate it safely. Is that right?

Cara:

That’s exactly right, and that’s why you’re a journalist and I’m not, you speak so eloquently. That’s exactly it. Young people are going to make mistakes and they’re going to do things that perhaps put them at potential risk. We need to make sure that if a mistake has been made or a person has made the wrong choice, they know they’re not just the sum of that choice, and that we can continue to move on and grow from our mistakes. If we get a young person to the point where they’re removing themselves from unsafe communities, we need to be replacing those with other communities where they feel a genuine sense of belonging. Unless you do that, they’re going to be reluctant to remove themselves from communities that are making them feel good or making them feel seen in a certain way, whether negatively or positively.

Sam:

Or they’ll adapt and kind of get around whatever safeguards you’ve put in place, right?

Cara:

Certainly. One thing we know is that, apart from pornography, accessing inappropriate or potentially harmful content is one of the biggest concerns for parents. As a society, it’s something we all want to talk more about. I think the more we all understand and learn, the more we’re going to see a generation growing up able to successfully navigate what they’re seeing on the internet. I think all of us as perceptive adults are able to see when a young person has been affected by something. If we see a change in their behaviour, that’s when it’s really important for us to step in and say, “Hey, I noticed you’re looking a bit sad today, or you seem a bit flat. Is there something on your mind you want to share with me? What’s going on?” Hopefully, if you’ve paved the way, you’ll have a young person open up to you about stuff they might have come across or searched for that has made them feel bad, and then you can help them move on from that. It’s about open lines of communication, giving them the skills to deconstruct content, and continuing to understand that we all have a part to play in navigating the content and the world that we create in online spaces.

Sam:

One thing that’s been coming out in the last couple of years is this dark, sinister kind of risk that a lot of kids face, where you have grown male predators who are preying on young women who have eating disorders because they’re so vulnerable. And this is something that Dr Suku Sukunesan, who we’ve had on the podcast before, has looked into extensively. This is terrifying. Aside from kind of telling kids to watch out for them, is there anything that can actually be done?

Cara:

We certainly know that online predatory behaviour continues to be a huge problem in Australia, as it does worldwide, and we do know that when young people are more vulnerable offline, that vulnerability will translate to online spaces as well. We know that our young people are targets for unwanted contact or grooming. Often this happens because groomers are very perceptive at targeting who is showing vulnerability or who might be likely to respond to their approach, because again, that young person is looking for a sense of belonging or validation. So, yes, if you’ve got a young person who is vulnerable, you need to make sure they are given the skills to be able to see when someone is trying to take advantage of them. And again, that comes down to conversation, education, repeated messaging, and that young person feeling as though they have somewhere or someone to go to, should they be confronted with inappropriate or unwanted contact. And removing the shame around that as well. It happens so frequently, and it’s not necessarily because you’ve put an image up of yourself; you’re not to blame. The other person is to blame for contacting you. It’s that person who’s in the wrong. We help young people to see that the adults, or people who are older than them, are often the ones in the wrong, not the young people.

Sam:

Thank you so much for your time, Cara, we really appreciate it. If anybody has any concerns or they’d like to find out a little bit more about how they can keep people safe online, how do they get in touch with you? How do they find you or find any of the teachings from the eSafety Commissioner?

Cara:

If a young person needs to seek support, or has something they wish to report to our cyberbullying or image-based abuse reporting schemes, they should go to esafety.gov.au. There are various sections on the website that will help them with different issues, and there are also areas to report inappropriate or abusive behaviours.

Sam:

Amazing, Cara. Thank you so much.

Cara:

No problem, Sam, lovely to talk to you.

Sam:

If you’re looking for support around eating disorders or negative body image, the Butterfly Helpline is there seven days a week from 8 am until midnight. The number to call is 1800 33 4673, or you can go online at butterfly.org.au, where you can chat online if you prefer to do that. If you like what we’re doing here with the Butterfly: Let’s Talk podcast and Butterfly In Depth bonus episodes, please leave us a rating or a comment on Apple Podcasts. We’d really appreciate that, and it would help us out greatly. And as always, if you think somebody could benefit from hearing these kinds of conversations, please tell them about it. I’m Sam Ikin. For more on me, go to ikinmedia.com.au. Thank you for your company.
