The eSafety Commissioner has released regulatory guidelines targeting content served by search engines, the latest step in efforts to protect children under the age of 16 from harmful content.

The eSafety Commissioner said the new industry codes require online services to protect children from exposure to age-inappropriate content such as pornography, high-impact violence and material that promotes self-harm, suicide and eating disorders.

The Age Restricted Material Codes were drafted by industry and apply to online service providers including app stores, social media services, equipment providers, online pornography services and generative AI services.

The first tranche of Age Restricted Material Codes commences on December 27, including the code that applies to search engine services.

eSafety Commissioner Julie Inman Grant said more children were unintentionally encountering age-inappropriate content at a young age.

"We know this is already happening to kids from our own research, with one in three young people telling us that their first encounter with pornography was before the age of 13 and this exposure was 'frequent, accidental, unavoidable and unwelcome' with many describing the exposure as being disturbing and 'in your face'," Ms Inman Grant said.

"We know that a high proportion of this accidential exposure happens through search engines as the primary gateway to harmful content, and once a child sees a sexually violent video...they can't cognitively process, let alone unsee that content.

"From December 27, search engines have an obligation to blur image results of online pornography and extreme violence to protect children from this incidental exposure, much the same way safe search mode already operates on services like Google and Bing when enabled.

"And one of the most crucial requirements under the code will be automatic redirects to mental health support services for searches related to suicide, self harm or eating disorders.

"These are important societal innovations that will provide greater protections for all Australians, not just children, who don't wish to see 'lawful but awful' content."

Ms Inman Grant said she found comfort in the code's protections for vulnerable children thinking about self-harm.

"Now they will be directed to professionals who can help and support them," she said.

"If this change saves even one life, as far as I'm concerned, I believe it's worth the minor inconvenience this might cause some Australian adults.

"Suicide devastatingly reverberates across families and communities, and represents a point of no return.

"But let me be clear, what this code won't do is require Australians to have an account to search the internet, or notify the government you are searching for porn," she said.

"And while certain images of pornography or extreme violent material in search results might be blurred, adults who wish to view that content can still click through to see it if they choose.

"Again, this is about protecting our kids from accidental exposure to material they will never be able to unsee."

eSafety's regulatory guidance covers both the new codes coming into effect which apply to Age Restricted Material, and the pre-existing Unlawful Material Codes and Standards, which tackle the worst of the worst unlawful online material including child sexual exploitation and abuse material as well as pro-terror content.

The new codes will complement the social media minimum age obligations that began on December 10.

If you or someone you know needs support, there are resources available to help.

Call Kids Helpline 1800 55 1800; Lifeline 13 11 14, text 0477 13 11 14 or chat online; Suicide Call Back Service 1300 659 467; and Beyond Blue 1300 22 4636.