How the companies will ban under-16s from their platforms will be the focus of a grilling from senators on Tuesday.
Greens senator Sarah Hanson-Young previously threatened to force executives from TikTok, Snapchat and Meta - the parent company of Facebook and Instagram - to appear at the inquiry into online safety after they were no-shows at an earlier hearing.
But all three companies agreed to give their views on the social media ban at the hearing without a subpoena.
Children younger than 16 will be banned from social media platforms from December 10, but there will be exceptions for health and education services including WhatsApp and Meta's Messenger Kids.
Gaming platforms were lined up for exemptions, but eSafety Commissioner Julie Inman Grant said the list of banned platforms would be "dynamic" and subject to review.
The law puts the onus for compliance on the companies to "detect and deactivate or remove" accounts held by underage users.
This will mean about 1.5 million accounts on Facebook, Instagram, YouTube, TikTok, Threads and X will be deactivated in less than two months.
Google previously told the inquiry the ban would be extremely difficult to enforce, and a lack of detail about how the platforms plan to implement age verification systems has clouded the ban since its announcement.
Platforms face fines of up to $49.5 million if they do not take reasonable steps to comply with the ban, but there won't be penalties for young people or their families if they gain access to the platforms.
The three tech giants have had meetings with Ms Inman Grant and Communications Minister Anika Wells to discuss expectations of their roles.
The eSafety Commissioner also announced on Tuesday that tech giants Apple and Google had removed OmeTV from their app stores after being alerted to concerns predators were using it to groom and sexually exploit Australian children.
OmeTV instantly connects users with random strangers for video chats.
The app's Portugal-based parent company, Bad Kitty's Dad, LDA, did not comply with requests the commissioner sent in August to introduce protections for Australian children.
Following further discussions in recent weeks, the tech giants removed the app from their stores and are expected to review all similar apps available in their Australian stores.
"This is a great example of how the codes and standards work in practice to improve safety across the online industry and protect children," Ms Inman Grant said.
eSafety has a range of additional enforcement powers available, including seeking civil penalties of up to $49.5 million.
eSafety encourages children, parents, carers and the community to read its recent advisory on the risks of these "chat-roulette" services.