Note: This article was first published in the January 2022 edition of the LexisNexis Privacy Law Bulletin

Domestic IoT CCTV, smart doorbells and home multimedia devices such as Amazon Alexa and Google Nest can collect large amounts of data on the inhabitants of and visitors to a home, as well as on passersby and lawful visitors who ring the doorbell or enter and whose conversations are in effect spied upon by the IoT device.1

Contrary to problematic European Court of Justice (ECJ) General Data Protection Regulation (GDPR) jurisprudence, the Australian Privacy Act 1988 (Cth) probably does not apply to the householder, but may apply to the supplier of the software.

This may render the operators of the software of IoT devices liable for breach of the Privacy Act and make the devices in effect unlawful.

The IoT

"Users of the internet share troves of information as they surf the web, including what web pages they visit, how long they spend on each page, and where they click on the screen. Through their behavior and voluntary sharing of data, they also frequently reveal personal information such as age, gender, income, and geographic location. This type of granular data collection has become so ubiquitous that it is expected, or met with resignation, as a part of using the internet through a computer or mobile device.

As the Internet of Things expands, this type of granular data collection is moving into domains that have traditionally been considered “offline.” The IoT enables an increase in monitoring of human activity that is fueled by scale — a greater number of sensing devices and sensor types — as well as a greater proximity of sensing devices to people’s bodies and intimate spaces."2

A key technology crossing these boundaries is the smart speaker/virtual assistant: Amazon’s Echo with Alexa, Microsoft’s Cortana, Google Home, and the recently released Apple Home Pod with Siri. These devices introduce a combination of microphones, artificial intelligence, voice recognition, and the melding of personal profile information gleaned from the use of other services.

These devices are qualitatively different than, say, televisions with voice recognition because of the degree of AI combined with the combination of extensive profile information. There’s no doubt this new class of technologies brings pleasure, convenience, entertainment and a platform for new voice interactive applications. Rather, the issues worth exploring relate to the placement of a general-purpose microphone — and increasingly cameras as well — into the home, a context classically seen as the quintessential private space.

Privacy in the home is an embodiment of privacy of location, “the right of an individual to be present in a space without being tracked or monitored or without anyone knowing where he or she is.” The home also embodies spatial privacy: “the protection of the privacy of people in relation to the places where they enact their private life. Classically, this is the dwelling or house, but it can stretch to other ‘places of private life’. . . private places with discernable boundaries.”3

People using e-commerce websites and social media are apparently happy to provide disturbing amounts of personal detail, in the hope of obtaining something for free, or a good or service they do not wish to pay for.

To the extent they even consider it (which is usually minimal), they trade off the privacy risk of handing over this data voluntarily against the expected gain.

However, when they visit a friend’s house, they do not expect that their conversations will be recorded, that their phone’s Bluetooth will connect via preset default consents, or that their voice pattern will be matched to recordings of their voice on other media, then tagged to location and address and consumption behaviours, as well as political interests and social media comments.

People walking on a footpath do not expect their face to be recorded by a “smart” doorbell, and then matched against an image database and on-matched as above.

No-one expects this trove of information to be leaked by employees of these organisations when it is useful to humiliate or silence people for having incorrect political views, or that it will all be available to law enforcement investigating whatever law enforcement decides is of interest.

All of the above is now feasible and in fact occurring.

The Australian regime

Section 7B(1) of the Privacy Act provides:

An act done, or practice engaged in, by an organisation that is an individual is exempt for the purposes of paragraph 7(1)(ee) if the act is done, or the practice is engaged in, other than in the course of a business carried on by the individual.

At first glance, the wording is curious because it is not obvious how an individual can be an organisation.

The reason for this is:

  • the structure of the defined term “organisation”,4 which is defined to include an “individual”, including an individual that is “not a small business operator”; and

  • the definition of the acts and practices of organisations (which include individuals in a certain capacity) that are not exempt acts or exempt practices, which means the carve-out in s 7B(1) of the Privacy Act applies.

The structure is intentional: it is intended to give the Privacy Act a very broad remit, so that it applies to all acts in the course of trade and commerce (at least) by any legal person (corporation or individual), except as otherwise specifically excluded.

For the purposes of this note, the importance of this is that an individual will not be covered by the Privacy Act unless they engage in the act or practice in the course of a business.

Many individuals will have IoT smart doorbells and CCTV apparatus in their houses. These deliver security and other beneficial goods for the individual.

However, they also have the side effect of surveilling passersby and members of the public who walk by the property or use the doorbell and, in the case of IoT devices within the house, of surveilling visitors and inhabitants and recording their private conversations and activities.

The ECJ’s decision in Ryneš

In Ryneš v Úřad pro ochranu osobních údajů,5 the ECJ dealt with an equivalent issue relating to a private CCTV camera and Directive 95/46 (which has an equivalent provision in the GDPR).

Article 3 of the Directive provides that the Directive applies to the processing of personal data wholly or partly by automatic means, and to the processing otherwise than by automatic means of personal data which form part of a filing system or are intended to form part of a filing system, but does not apply to the processing of personal data “by a natural person in the course of a purely personal or household activity”.6

The individual had had some trouble with neighbourhood youths and had installed CCTV for the sole purpose of protecting the property, health and life of his family and himself. However, the CCTV camera, which was installed on his house, also monitored the public space around his house.7

The CCTV surveillance was used to identify and prosecute two suspects who had broken a window of his home with a catapult.8

One of the suspects challenged the lawfulness of the collection of the information.9

The ECJ held (in reasoning that was very brief) that “processing of personal data comes within the exception . . . only where it is carried out in the purely personal or household setting of the person processing the data.”10

It therefore held that where video surveillance covered, even partially, a public space and was accordingly directed outwards from the private setting of the person processing the data, it could not be regarded as a purely personal or household activity for the purposes of the exemption.

It seems clear that the ECJ was mistaken in its interpretation of the exemption, because the words of the carve-out relate only to a “purely personal or household activity”.

The ECJ went further, glossing the exemption far more narrowly than the text permits by reading “in the course of a purely personal or household activity” as meaning “only where it is carried out in the purely personal or household setting of the person processing the data”.

This conclusion was supposedly justified by reference to the right to be forgotten and against the background of unidentified (and somewhat gnostic and esoteric) elements of the Charter of Fundamental Rights and the Directive.

It was clear on the record, and conceded, that the CCTV footage was only used to protect the property, health and life of Ryneš and his family. How that was not “in the course of a purely personal or household activity” was not explained by the ECJ, and the decision only makes sense if the words read into the exemption by the ECJ (without obvious justification) are used.

In Australia

Putting aside the obvious difficulties in the decision of the ECJ,11 what would be the case in Australia should someone object that he or she was identified by CCTV or via an Amazon doorbell or home device?

The Australian legislation carving out the acts of an individual is wider than the carve-out in the Directive/GDPR.

On that basis, Ryneš would not apply in Australia, and the activities of the householder as an individual would not be covered by the Privacy Act.

By way of example, the OAIC notes:

Residential security cameras

If your neighbour has a security camera pointed at your house and you’re worried about your privacy, first try to talk to your neighbour. If this doesn’t fix the problem, you could ask your local community justice or neighbourhood mediation centre for help . . .

The Privacy Act doesn’t cover a security camera operated by an individual acting in a private capacity but state or territory laws may apply. For more information, contact the Attorney-General’s Department in your state or territory. However, if you’re concerned about your safety, contact the police.

You could also contact your local council to find out if the practice contravenes any local laws. Some councils require planning permission for security cameras.
If your property is part of a strata title, check the by-laws to see if they cover installing or using security cameras.12

If the home were owned by a corporation, the corporation would be an organisation under s 6C but would probably fall within the small business exemption in s 6D(1), on the basis that the annual turnover of its business (ie receiving rent and paying outgoings for a profit), as calculated under the legislation, was $3 million or less.

Having said that, it is likely that most IoT devices remain owned by a much larger business (for example, Amazon) and are merely used under licence by the homeowner; otherwise, the homeowner owns only the hardware, with the software, which is the critical part of the IoT device, remaining owned by Amazon or the equivalent supplier.

Those latter organisations are clearly caught by the Privacy Act.

That may mean that the owner of the software that makes the smart doorbell or domestic device function is in breach of the Privacy Act, and that such devices are therefore, in effect, “unlawful”, as it is not practically possible to obtain the necessary consent.


  1. Although Dostoyevsky and his pals might have done this, plotting to overthrow the government, even in the privacy of your own home, can carry significant legal risk, and you should proceed at your legal peril.

  2. G Rosner and E Kenneally, Privacy and the Internet of Things: Emerging Frameworks for Policy and Design (June 2018) p 7.

  3. G Rosner and E Kenneally, Clearly Opaque: Privacy Risks of the Internet of Things (1 May 2018) pp 20 and 21 https://papers.

  4. Privacy Act 1988 (Cth), s 6C.

  5. Ryneš v Úřad pro ochranu osobních údajů (Judgment of the Court (Fourth Chamber), 11 December 2014; request for a preliminary ruling from the Nejvyšší správní soud) Case C-212/13; [2015] 1 WLR 2607; [2014] All ER (D) 174 (Dec).

  6. This part of Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data was transferred in integro to the GDPR, Art 2(2)(c).

  7. Above n 5, at [14].

  8. Above n 5, at [15].

  9. Above n 5, at [16].

  10. Above n 5, at [31].

  11. See also Fairhurst v Woodward Case No: G00MK161.

  12. See Office of the Australian Information Commissioner, Security Cameras, reviewed 8 November 2021, privacy/your-privacy-rights/surveillance-and-monitoring/security-cameras.


This is commentary published by Colin Biggers & Paisley for general information purposes only. This should not be relied on as specific advice. You should seek your own legal and other advice for any question, or for any specific situation or proposal, before making any final decision. The content also is subject to change. A person listed may not be admitted as a lawyer in all States and Territories. © Colin Biggers & Paisley, Australia 2024.
