The Verge recently published an article speculating about features of Microsoft’s soon-to-be-released Cortana mobile personal assistant. While Cortana is expected to have many of the same capabilities as Apple’s Siri and Google Now, one feature in particular caught my attention.
In the Verge article, author Tom Warren writes about a Notebook feature. The Notebook sounds like a dedicated area where Cortana stores information about you that it can access later to perform assistant tasks. This stored information can include things like your current or recent location, your contacts, personal data, information about your habits and behaviors, reminders, and so forth. But the interesting part is that you can apparently put up a privacy fence between you and Cortana by limiting what the personal assistant can store in the Notebook.
How does this privacy fence work? Based on the info in the Verge article, Cortana will ask for permission before she puts any of your data in the Notebook. As Cortana learns more about you, you can choose what you want her to store and what you’d prefer to keep to yourself.
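To make the idea concrete, here’s a minimal sketch in Python of how a permission-gated notebook might work. Everything here is my own invention for illustration; the class and method names are hypothetical and have nothing to do with Microsoft’s actual design.

```python
# Hypothetical sketch of a consent-gated "Notebook" store.
# All names are invented for illustration, not Microsoft's API.

class Notebook:
    """Stores personal data only after the user grants permission."""

    def __init__(self, ask_user):
        # ask_user(category, value) -> bool; stands in for the
        # permission prompt the assistant would show the user.
        self._ask_user = ask_user
        self._entries = {}

    def remember(self, category, value):
        """Ask first; store only if the user says yes."""
        if self._ask_user(category, value):
            self._entries[category] = value
            return True
        return False

    def recall(self, category):
        """Return stored data, or None if the user never allowed it."""
        return self._entries.get(category)

    def forget(self, category):
        """Let the user revoke what the assistant remembers."""
        self._entries.pop(category, None)


# Example: the user allows location storage but declines contacts.
allowed = {"location"}
notebook = Notebook(lambda category, value: category in allowed)
notebook.remember("location", "Seattle")   # permitted -> stored
notebook.remember("contacts", ["Alice"])   # declined -> kept private
```

The key design point is that the gate sits in front of the store: data the user declines never enters the notebook at all, rather than being stored and hidden.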
I like the privacy fence concept. As personal assistants become more pervasive and intertwined with our daily routines, we may grow increasingly concerned about how they could invade our privacy. That fear comes in large part from a loss of control. If we have no ability to limit what our personal assistants know and remember about us, or who they share this information with, then we may fear and mistrust them. If we have the option of limiting the data our personal assistant remembers about us, we may be less hesitant to engage with it.
As personal assistants evolve, we’ll keep an eye on how vendors handle the tension between our desire for privacy and the need to share information. Concepts like the privacy fence may turn out to be a viable approach to balance both needs.