FTC Issues Enforcement Policy Statement on COPPA and Voice Recordings

By Smriti Chandrashekar

On October 23, 2017, the U.S. Federal Trade Commission (“FTC”) issued guidance on the online collection of certain audio voice recordings from children under the age of 13. The guidance, in the form of an “enforcement policy statement,” discusses the application of the Children’s Online Privacy Protection Act (“COPPA”) to such recordings.

In 2013, the FTC amended the COPPA Rule to expand the definition of “personal information” to include a photograph, video, or audio file that contains a child’s image or voice. The latest guidance provides a path to avoid enforcement of the COPPA Rule for online services that collect audio files containing a child’s voice and convert such audio files to text for the performance of a specific instruction or request.

COPPA requires operators of websites or online services that are directed to children, or that have actual knowledge that a user is a child, to obtain verifiable parental consent before collecting a voice recording. While confirming that the COPPA Rule is triggered by such activities, the FTC notes in its analysis that these voice-enabled features may be essential for children with disabilities and for children who have not yet learned to write. The FTC also concludes that these audio files, when processed in accordance with FTC guidance, pose little risk of being used to identify or contact an individual child.

Based on these potential benefits and the low risk to children, the FTC outlines a safe harbor that would protect operators against an enforcement action for not obtaining parental consent before collecting an audio file with a child’s voice. The FTC will not take action to enforce COPPA when operators take the following actions:

a. collect audio files with children’s voice recordings solely to replace written words, for example to perform a search or fulfill a verbal instruction or request;

b. not use such audio files for purposes beyond performing that instruction or request (e.g., behavioral targeting or profiling, identification through voice recognition, or posting, selling, or otherwise sharing the file with third parties);

c. maintain such audio files only for the limited time necessary to perform that instruction or request and then immediately delete such files; and

d. provide a clear notice in the privacy policy disclosing collection and use of audio files containing voice recordings and the operator’s policy for deleting such audio files.

The FTC made clear that this enforcement policy does not affect the need for operators to provide notice and obtain verifiable parental consent where other personal information is collected from children in addition to, or in connection with, audio files, such as where an operator requests information through such audio files that would otherwise be considered “personal information” (for example, the name of the child).

The Commission issued this policy statement after receiving inquiries from numerous companies about whether collecting audio files that contain a child’s voice recording triggers COPPA’s requirements. Popular voice-controlled intelligent personal assistant services, such as Amazon Echo and Microsoft’s Cortana, will likely benefit from this exception. The Commission voted 2-0 to approve the new policy statement. The FTC’s press release is available here.

FTC updates COPPA Compliance Plan for Businesses

By Carolyn Krol

On June 21, 2017, the U.S. Federal Trade Commission (“FTC”) published an update to its Children’s Online Privacy Protection Rule (“COPPA”) compliance plan for businesses. The FTC Business Blog describes the update as reflecting developments in the marketplace, such as internet-connected toys. The compliance plan provides businesses with a step-by-step guide to determine whether a business activity is covered by COPPA and, if so, how to comply.
 

There are three major updates to the compliance plan, regarding:

  • new business models,
  • new products covered by COPPA, and
  • new methods for getting parental consent.

The updated compliance plan addresses new business models that may affect COPPA obligations. In publishing this update, the FTC acknowledges that companies have new ways of collecting data (e.g., voice-activated devices that collect personal information). As such, businesses should keep COPPA compliance in mind if they are implementing new ways to collect personal information.

COPPA applies to businesses with a website or online service that is directed to children under 13 and collects personal information from them. The updated compliance plan clarifies that “website or online service” may include internet-enabled location-based services, voice-over internet protocol (VOIP) services, and connected toys or other Internet of Things devices. If they have not done so already, businesses providing location-based services or VOIP services, or operating in the connected toy or Internet of Things space, should evaluate whether their products or services could trigger COPPA obligations.

Subject to a few exceptions, COPPA requires that businesses obtain parents’ verifiable consent before collecting, using, or disclosing personal information from a child. The compliance plan discusses acceptable methods for obtaining verifiable parental consent, and the update adds two new acceptable methods. First, parents now may provide consent by answering a series of knowledge-based challenge questions that would be difficult for someone other than the parent to answer. Second, parents may now submit a picture of a driver’s license or other photo ID, which the operator then compares, using facial recognition technology, to a second photo submitted by the parent.

In addition to reviewing the updated compliance plan, the FTC recommends reviewing the COPPA Frequently Asked Questions.

 

How to Draft a Privacy Statement

A chapter by Hintze Law partner Mike Hintze, entitled "Privacy Statements: Purposes, Requirements, and Best Practices," will be included in the forthcoming Cambridge Handbook of Consumer Privacy, edited by Jules Polonetsky, Evan Selinger & Omer Tene, Cambridge University Press (2017).

The chapter explains that while drafting a privacy statement may be considered by some to be one of the most basic tasks of a privacy professional, doing it well is no simple matter. One must understand and reconcile a host of statutory and self-regulatory obligations. One must consider different audiences that may read the statement from different perspectives. One must balance pressures to make the statement simple and readable against pressures to make it comprehensive and detailed. A mistake can form the basis for an FTC deception claim. And individual pieces can be taken out of context and spun into PR debacles.

The chapter then goes on to explore the art of crafting a privacy statement. It explains the multiple purposes of a privacy statement. It lists and discusses the many elements included in a privacy statement – some required by law, and others based on an organization’s objectives. Finally, it describes different approaches to drafting privacy statements and suggests best practices based on a more complete understanding of a privacy statement’s purposes and audiences.

A pre-publication version of the chapter can be downloaded at https://ssrn.com/abstract=2927105.

 

The FTC’s Smart TV Workshop

By Mike Hintze

On Wednesday, December 7, 2016, the Federal Trade Commission held a Smart TV workshop, as part of its Fall Technology Series.

The event began with opening remarks from Jessica Rich, Director of the FTC's Bureau of Consumer Protection. Rich described how the shift from traditional broadcast television to streaming services and smart devices has resulted in more data being collected about TV viewing. And while the tracking of TV viewing behavior can result in better functionality, better measurement, and better ad revenue, there are significant privacy concerns.

TV viewing data can reveal sensitive information about a person. Recognizing the sensitivity of the data, Congress acted twice in the 1980s to protect the privacy of the video programming people watch -- enacting the privacy provisions of the Cable Communications Policy Act of 1984 and the Video Privacy Protection Act (VPPA) of 1988. Rich also noted that the different histories of televisions and PCs have created different consumer expectations regarding privacy and data collection. Finally, she concluded by noting that, as in other areas, the FTC's role with regard to Smart TV will be to highlight privacy and consumer protection issues and to bring enforcement actions for unfair and deceptive acts.

Next, the FTC's Justin Brookman (Policy Director, Office of Technology Research and Investigation) and Ian Klein, a graduate student at Stevens Institute of Technology who interned with the FTC during the summer of 2016, gave an overview of the Smart TV ecosystem.  They based their presentation in part on laboratory testing they conducted of disclosures, controls, and data coming off of smart entertainment devices, along with some speculation of what data collection, use, and sharing might be happening or could happen.   

Areas of particular concern and focus of this overview were:

  • The use of "automatic content recognition" -- a method by which snapshots of the content displayed on the device are sent to the manufacturer or another party in order to determine what content is being viewed;
  • Collection of audio or video from the home environment through microphones or cameras embedded in the entertainment devices;
  • Cross-device tracking;
  • Combining viewing behavior data with other sources of data (purchase data, geolocation, demographics, etc.);
  • Device security -- a lack of which could lead to attacks on the device itself, on other devices on the same local network, or on others through the use of a compromised device in distributed denial-of-service attacks; and
  • User controls, with their research finding some controls for data collection by the device manufacturer, but few or no platform-level controls for app data collection or third party data sharing. 

The first of two panels, entitled "New Frontier in Media Measurement and Targeting," consisted of industry representatives and was moderated by FTC attorney Kevin Moriarty.  The panel discussed the benefits of data collection in the Smart TV context, including better and more personalized content discovery and recommendation, enabling more "second screen" experiences, and more relevant (and potentially fewer) ads.  

There was general agreement that with the fragmentation of media, traditional "Nielsen-like" sampling methods are no longer sufficient to measure viewing behavior, and there is a need to collect more complete "census" data from entertainment devices. But Josh Chasin, Chief Research Officer for comScore, also noted that collecting lots of data is not the objective -- and that "good data" is more important than "big data."

While there was an acknowledgement that the data collection and use necessary for the provision of these new and useful services raise legitimate privacy concerns, members of the panel argued for a reliance on industry self-regulation. Jane Clarke, CEO of the Coalition for Innovative Media Measurement, stated that companies in this space do a good job of keeping PII and non-PII separate and of using only non-PII for analytics and measurement. Ashwin Navin, CEO of Samba TV (a provider of media measurement software and services), noted that his company requires TV manufacturers that include its measurement software to provide users with notice and an ability to turn off the data collection.

Shaq Katikala from the Network Advertising Initiative (NAI) noted that today's Smart TV environment involves the convergence of three distinct groups of companies:  cable providers, app and software platform companies, and TV manufacturers -- and each comes with very different histories and experiences with regard to regulation.  Thus, there is a strong appetite for self-regulation to help bridge the gaps and inconsistencies. 

Nevertheless, there are still challenges with respect to getting it right in the Smart TV ecosystem.  There are still no accepted or standard ways to provide notice and choice on a smart entertainment device, and there are unique challenges because of differing platforms and a lack of easily clickable links on most TV interfaces.  According to one panelist, the manufacturers have little or no bargaining power over the data collection by the "top-tier apps" that manufacturers feel they must have on their devices.  Thus, the top-tier apps dictate what data is collected and how it is used, and the TV manufacturer has little insight or ability to influence that.

The second panel, entitled "Consumer Understanding and Regulatory Framework," was moderated by FTC attorney Megan Cox and included representatives from industry, advocacy organizations, and academia. It began with Serge Egelman from the Berkeley Laboratory for Usable and Experimental Security (BLUES) presenting the results of survey research he conducted on consumer views on data collection and sharing and their expectations with regard to Smart TVs. He concluded that people often perceive that data collected on Smart TVs (such as for voice recognition) doesn't leave the device, that data is not used for secondary purposes, and that there are legal protections against sharing (and that there is a strong correlation between people who believe there are legal protections against data sharing and those who believe data is not used for other purposes). Egelman also found a level of cynicism among respondents, with some expressing a view that companies find ways around legal protections to the extent they exist.

Most of the panelists concurred that there is a lack of transparency and understanding with respect to what data is collected and shared, by whom, for what purposes, and what controls are available. Claire Gartland from the Electronic Privacy Information Center (EPIC) noted that there is a complex ecosystem with many actors that are not known or understood by consumers -- and that privacy policies do a poor job of explaining this. Dallas Harris from Public Knowledge echoed this, and added that consumers feel powerless to control how data is collected and shared. Maria Rerecich from Consumer Reports noted that user controls, when available, are often buried deep in menus and are not well explained.

The panelists discussed what existing laws will apply to the Smart TV environment.  The VPPA, Cable Act, and the Children’s Online Privacy Protection Act (COPPA) may all play a role, but panelists suggested that unclear and incomplete application of those laws to this new and emerging area results in inadequate protections. 

Emmett O'Keefe from the DMA cautioned against taking steps that could interfere with the ability to provide new television services that consumers want and enjoy. He suggested that many of these services are similar or identical to services that have been available on laptops, tablets, and smartphones for several years, and the fact that they are now being offered through a larger screen does not require a new or different approach to regulation. O'Keefe also noted the DMA would be releasing a white paper on the Smart TV ecosystem (which is now available here).

There was a lively debate among the panelists on the effectiveness of self-regulation in protecting consumer privacy -- with O'Keefe referring to self-regulation of privacy in online advertising as "the gold standard" and Egelman calling it "an abject failure." Finally, Rerecich stated that Consumer Reports will begin including privacy and security ratings in its product reviews. She agreed that consumers want these new features, and the ratings will help them make informed decisions based on an understanding of the data collected and the privacy protections offered. 

 

De-Identification and the GDPR

Next Tuesday, November 8, 2016, Hintze Law partner Mike Hintze will present his new paper, "Viewing the GDPR Through a De-Identification Lens: A Tool for Clarification and Compliance," at the Brussels Privacy Symposium.  The key argument is that if European regulators acknowledge that there is a full spectrum of de-identification techniques, and develop guidance under the General Data Protection Regulation (GDPR) based on that recognition, they can:

  • provide greater clarity in areas of the GDPR that remain opaque;
  • enable organizations to adopt pragmatic compliance tools and strategies;
  • create greater incentives for companies to adopt the strongest de-identification that is compatible with the purposes of the data processing (thus achieving the optimal balance between data protection and data utility); and
  • advance the objectives of the GDPR by enhancing the protection of individuals’ personal data.   

You can access a pre-publication version of the paper here.

 

Hintze Law Welcomes Mike Hintze as Partner

October 11, 2016. Hintze Law is pleased to announce that Mike Hintze has joined the firm as partner. Mike joins Hintze Law after serving as Chief Privacy Counsel at Microsoft, where, for over 18 years, he advised on data protection compliance globally and helped lead the company’s strategic initiatives on privacy differentiation and public policy. Mike joins Susan Lyon-Hintze, partner and founder of Hintze Law, in leadership of the firm. His practice focuses on global privacy and data protection compliance, policy, and strategy.


Publicly Available Privacy and Security Resources

If you are a startup or just a privacy or security officer with a lean budget, please check out our list of publicly available privacy and security resources.  

We update this list from time to time for presentations we give to companies just starting to build their privacy and security programs, and we always welcome input on any "free" resources you find helpful.

Publicly Available Privacy and Data Security Resources 

The following is a list of publicly available resources, most at no cost, which privacy professionals may find helpful in obtaining information and tools for developing their privacy and data security programs.

Privacy General

International Association of Privacy Professionals ("IAPP") Resources

https://www.privacyassociation.org/

Privacy links, job listings, and links to all of the world's data protection authority websites.

Microsoft: Privacy

http://www.microsoft.com/privacy/           

Collection of FAQs and white papers prepared by Microsoft pertaining to user privacy protection, data governance, ad-serving, EU privacy compliance, and more.

 Cooley Privacy Policy Generator

http://generator.cooley.com/sites/privacy/Privacy/PQ2/Pre-PRIVACY-Start.aspx

Generally Accepted Privacy Principles ("GAPP")

http://www.aicpa.org/InterestAreas/InformationTechnology/Resources/Privacy/GenerallyAcceptedPrivacyPrinciples/Pages/default.aspx            

Principles for designing and implementing privacy practices and policies from the American Institute of Certified Public Accountants and the Canadian Institute of Chartered Accountants.

TRUSTe Resources

www.truste.com/resources

Surveys, white papers, and guidance, including a behavioral targeting checklist, security guidelines, etc.

BrightTALK

http://www.brighttalk.com/ 

Privacy and security webcasts available with registration.

Privacy International

https://www.privacyinternational.org/       

Country by country summaries of data protection laws and privacy rights.

National Conference of State Legislatures: Privacy & Security

http://www.ncsl.org/Default.aspx?TabID=756&tabs=951,71,539#951

Charts of state privacy and security laws. Also includes articles, briefs, and newsletters discussing state regulation of privacy and security issues.

Organisation for Economic Co-Operation and Development: Information Security and Privacy

www.oecd.org/sti/security-privacy              

Homepage for the OECD Working Party on Information Security and Privacy.

Privacy Exchange: Legal Library

http://www.privacyexchange.org/legal/index.html

Index of privacy laws from around the world with links to statutory texts.

Nymity

http://www.nymity.com/Free_Privacy_Resources/Latest_Privacy_Studies.aspx?sort=RefPercent&order=d

Newsletter, privacy interviews, privacy breach analysis, links to privacy studies.

DataGuidance.com

http://www.dataguidance.com/index.asp

Paid subscription service offering database of privacy compliance information. 

The Data Governance Institute

http://datagovernance.com/index.html

Free data governance program documents, processes, templates and tools.

The Ponemon Institute

http://www.ponemon.org

Source of independent research on privacy, data protection and information security policy.

 

Privacy – U.S. 

Federal Trade Commission: Privacy Initiatives

http://www.ftc.gov/privacy/index.html

Information on the FTC's privacy initiatives: unfairness and deception, the Gramm-Leach-Bliley Act, the Fair Credit Reporting Act, and the Children's Online Privacy Protection Act.

FCC Proposed Broadband Consumer Privacy Rules

https://www.fcc.gov/document/fcc-proposes-broadband-consumer-privacy-rules

Proposed privacy guidelines for broadband Internet Service Providers (ISPs).

FCC Customer Proprietary Network Information (CPNI) Small Business Compliance Guide

https://apps.fcc.gov/edocs_public/attachmatch/DA-08-1321A1.pdf

Privacy guidance for small entity telecommunications carriers and VOIP service providers.

California Office of Privacy Protection

http://www.privacy.ca.gov/          

Guidance on California privacy laws, general privacy links, and links to other privacy laws.  

Privacy – Rest of the World

European Commission Data Protection Site

http://ec.europa.eu/justice/data-protection/index_en.htm

  • General Data Protection Regulation (GDPR)

http://ec.europa.eu/justice/data-protection/reform/index_en.htm

Data Transfers from Europe

  • EU Model Contracts for Transfer of Personal Data to Third Countries

http://ec.europa.eu/justice/data-protection/document/international-transfers/transfer/index_en.htm

  • EU-U.S. Privacy Shield

https://www.privacyshield.gov/

UK Information Commissioner's Office

http://www.ico.gov.uk/

http://www.ico.gov.uk/upload/documents/pia_handbook_html_v2/html/0-advice.html

Resources include a handbook for conducting Privacy Impact Assessments.

Australian Government Office of the Privacy Commissioner

http://www.privacy.gov.au/

Information sheets, privacy impact assessment guide, personal information security breach guide.

Canadian Office of the Privacy Commissioner

http://www.priv.gc.ca/index_e.cfm             

Reports, publications, guidelines, research, tools, videos, privacy illustrations, privacy impact assessments.

Privacy in Product Development / Privacy by Design

Privacy by Design (Ontario Information and Privacy Commissioner)

http://www.privacybydesign.ca/

Publications and resources on the concept of Privacy by Design.

Microsoft’s Privacy Guidelines for Developing Software Products and Services

http://www.microsoft.com/en-us/download/details.aspx?id=16048

Data Security

Protecting Personal Information: A Guide for Business

http://www.ftc.gov/bcp/edu/multimedia/interactive/infosecurity/index.html

FTC guide for implementing data security principles, with public domain security training materials. 

Fighting Fraud with the Red Flag Rules: the FTC's How-to Guide for Businesses

www.ftc.gov/bcp/edu/microsites/redflagsrule/index.shtml

Guide for organizations that are building Identity Theft Prevention programs with compliance tips, information about the Rule's applicability, and a guided four-step process.

National Institute of Standards and Technology: Computer Security Resource Center

http://www.nist.gov/itl/csd/index.cfm        

Provides a range of information technology security standards and guidelines.

PCI DSS: Standards, Self-Assessment, and Compliance

https://www.pcisecuritystandards.org/security_standards/pci_dss.shtml

Website for payment card industry standards, guidelines, and compliance tips.

Secure Coding

Microsoft’s Security Development Lifecycle ("SDL")

http://www.microsoft.com/security/sdl/default.aspx

Secure coding guidelines developed by Microsoft but generally applicable to all platforms.

Microsoft’s Security Development Lifecycle ("SDL") training

https://www.microsoft.com/en-us/SDL/process/training.aspx

PowerPoint training modules that cover secure design, implementation, and verification.

OWASP

https://www.owasp.org/index.php/Main_Page

Free security trainings on a variety of technology or process-specific topics including mobile security.  

Android Security Guidelines

https://developer.android.com/training/best-security.html

Google’s security best practices for developing on the Android platform.

iOS Security Coding Guidelines

https://developer.apple.com/library/ios/

Apple’s secure coding practices guidelines. 

Data Breach Response

National Conference of State Legislatures: State Data Breach Laws

http://www.ncsl.org/Default.aspx?TabID=756&tabs=951,71,539#951

Charts of state security breach notification laws.

Data Loss db – Primary Source Archive of Data Breach Notification Letters

http://datalossdb.org/primary_sources

Searchable archive of breach notification letters submitted to various U.S. jurisdictions.

Massachusetts: Sample Letter for Notifying State Attorney General About a Breach

http://www.mass.gov/ago/docs/consumer/93h-sampleletter-ago.pdf   

Vermont: Security Breach Guidance and Sample Notification Letter

http://www.atg.state.vt.us/assets/files/2009-7-29%20Security%20Breach%20Guidance.pdf 

Privacy Rights Clearinghouse’s Chronology of Data Breaches

https://www.privacyrights.org/data-breach  

 

For questions and input contact:


Susan Lyon-Hintze – susan@hintzelaw.com, 206-601-3233

Mike Hintze – mike@hintzelaw.com, 206-719-6934

Jared Friend – jared@hintzelaw.com, 206-325-3277

Hintze Law PLLC
505 Broadway E. #151
Seattle, WA 98102

www.hintzelaw.com

 

U.S. Department of Commerce Issues Fact Sheet on the EU-U.S. Privacy Shield Agreement

On February 2, 2016, following the announcement of the EU-U.S. Privacy Shield Agreement, the U.S. Department of Commerce distributed a fact sheet about the new data-transfer agreement with the European Union. The fact sheet provides further detail on the elements of the agreement described in the EU Commission's press release.

The Department of Commerce’s fact sheet states that U.S. companies participating in the EU-U.S. Privacy Shield must "commit to participate in arbitration as a matter of last resort to ensure that EU individuals who still have concerns will have the opportunity to seek legal remedies." Arbitration will be “at no cost to the individual.” Whether U.S. companies must bear the cost is not clear.

Further, the fact sheet states that the Privacy Shield contains additional obligations regarding use of service providers by participating companies in the form of "new contractual privacy protections and oversight for data transferred by participating companies to third parties or processed by those companies' agents to improve accountability and ensure a continuity of protection."

The Privacy Shield allows for European Data Protection Authorities to refer complaints to the Department of Commerce and the Federal Trade Commission. The Department of Commerce states it will dedicate "a special team with significant new resources to supervise compliance with the Privacy Shield" as part of its effort to resolve these complaints.

The EU Commission press release also announced that the U.S. gave the EU Commission written assurances that the access of public authorities for law enforcement and national security will be subject to clear limitations, safeguards and oversight mechanisms. The fact sheet provides details on the nature of these written assurances, stating that "[i]n connection with finalization of the new EU-U.S. Privacy Shield, the U.S. Intelligence Community has described in writing for the European Commission the multiple layers of constitutional, statutory, and policy safeguards that apply to its operations, with active oversight provided by all three branches of the U.S. Government."

While the Department of Commerce has shed a bit more light on the details of the EU-U.S. Privacy Shield, many questions still remain. Stay tuned.

 

By Carolyn Krol

 

City of Seattle Adopts First of Its Kind Privacy Principles

On February 23, 2015, the Seattle City Council unanimously approved a resolution adopting the City’s first-ever set of comprehensive privacy principles. The principles are also the first of their kind to be adopted by a major U.S. city.

The privacy principles guide the City of Seattle when collecting, using, and sharing personal information from the public. The principles include considering potential privacy risks when collecting and using personal information; minimizing data collected; providing notice and, if possible, choice about how data is used; securing data; and maintaining accuracy of personal information.

In a message to followers on Twitter, Mayor Ed Murray said the new privacy principles “create a comprehensive ethical framework in protecting privacy and building public trust.”

The Council also set a deadline of August 2015 for each City department to develop a “Privacy Toolkit.” These Privacy Toolkits will consist of a package of actionable privacy standards that implement the privacy principles. The official announcement is available at: http://murray.seattle.gov/city-adopts-privacy-principles-to-protect-the-public/#sthash.xLGTSCwu.XrojDCoq.dpuf

The following are the City of Seattle’s Privacy Principles in full:

What is Personal Information?

“Personal information” is any information relating to an identified or identifiable individual. Examples of personal information include but are not limited to a person’s name, home or email address, social security number, religion, political opinions, financial and health records, and racial and ethnic origin.

Privacy Principles

The City of Seattle collects personal information from the public so that we can provide many important services including community and critical infrastructure protection, 911 call response, waste management, electricity delivery and other services. We work to find a fair balance between gathering information to provide these needed services and protecting the public’s privacy.

While privacy laws protect some personal information, the information we collect becomes a government record that others can ask to see through public records requests. Therefore, it is important for you to know when and how your personal information is collected, how we use it, how we disclose it and how long we keep it.

The following Privacy Principles guide the actions we take when collecting and using your personal information:

1. We value your privacy…

Keeping your personal information private is very important. We consider potential risks to your privacy and the public’s well-being before collecting, using and disclosing your personal information.

2. We collect and keep only what we need…

We only collect information that we need to deliver City services, and we keep it only as long as we are legally required to or as needed to deliver those services. Whenever possible, we tell you when we are collecting this information.

3. How we use your information….

When possible, we make available information about the ways we use your personal information at the time we collect it. We commit to giving you a choice whenever possible about how we use your information.

4. We are accountable…

We are responsible for managing your personal information in a manner that is consistent with our commitments and as required by law. We protect your personal information by restricting unauthorized access and by securing our computing resources from threats.

5. How we share your information…

We follow federal and state laws about information disclosure whenever we work with outside governmental agencies and in answering Public Disclosure Requests (PDRs). Business partners and contracted vendors who receive or collect personal information from us or for us to deliver City services must agree to our privacy requirements.

6. Accuracy is important…

We work to maintain and use accurate personal information for City business. When practical, we will work to correct inaccurate personal information. We also direct our partners and contracted vendors to follow the same guidelines.