librarypunk - 119 - Privacy (and sex!) feat. Digital Shred Privacy Literacy Initiative
Episode Date: January 29, 2024
This week we're talking about privacy literacy. Instruction, workshops, and how your data ends up all over the place.
Join the Discord: https://discord.gg/QTr6Tn6YMk
[shorter LOEX conference paper] Transforming Privacy Literacy Instruction: From Surveillance Theory to Teaching Practice
[longer book chapter] The Promise of Theory-Informed Pedagogy: Building a Privacy Literacy Program
[for browsing] Open-licensed privacy literacy curriculum:
Private Bits: Privacy, Intimacy, and Consent might be of particular interest
Dark Patterns: Surveillance Capitalism and Business Ethics deals with data brokerage (a la Becky Yoose and Nick Shockey's work for SPARC)
#ForYou: Algorithms & the Attention Economy reimagines media literacy for algorithmic information environments
ACRL Sandbox entries (complete with lesson plans)
Workshop guides (teaching & learning materials, also linked from the Sandbox)
Digital Shred Privacy Literacy Toolkit: https://sites.psu.edu/digitalshred/
https://www.alastore.ala.org/content/practicing-privacy-literacy-academic-libraries-theories-methods-and-cases
Media Mentioned
1890 Harvard Law Review paper on the right to privacy: https://www.jstor.org/stable/1321160
"Nobody Sees It, Nobody Gets Mad": Social Media, Privacy, and Personal Responsibility Among Low-SES Youth: https://journals.sagepub.com/doi/10.1177/2056305117710455
Privacy Literacy and Its Problems: https://openurl.ebsco.com/EPDB%3Agcd%3A6%3A19315958/detailv2?sid=ebsco%3Aplink%3Ascholar&id=ebsco%3Agcd%3A133682180&crl=c
Lawrence Lessig's pathetic dot theory: https://en.wikipedia.org/wiki/Pathetic_dot_theory
https://stopncii.org/
https://takeitdown.ncmec.org/
The Shame Machine: https://www.penguinrandomhouse.com/books/606203/the-shame-machine-by-cathy-oneil/
Kate Devlin, Turned On: https://www.bloomsbury.com/us/turned-on-9781472950871/
Danielle Keats Citron, Sexual Privacy: https://www.yalelawjournal.org/article/sexual-privacy
Danielle Keats Citron, The Fight for Privacy: https://wwnorton.com/books/9780393882315
Kate Devlin: https://www.hotpress.com/sex-drugs/sex-robots-frontlines-ai-revolution-kate-devlin-22775520
ISTE standards: https://iste.org/standards
Sarah's Sci Fi: https://alastore.ala.org/content/self-subject-autoethnographic-research-identity-culture-and-academic-librarianship
Transcript
Okay, let's go.
I'm Justin. I'm a scholcomm librarian. My pronouns are he and they.
I'm Sadie. I work IT at a public library and my pronouns are they them.
I'm Jay. I'm a music library director and my pronouns are he.
And we have guests. Would you like to introduce yourselves?
All right. My name is Sarah. I'm a reference and instruction librarian with Penn State University Libraries at a regional campus, Penn State Berks, which is outside of Reading, PA.
And I'm Alex. My pronouns are she-her and I also work at Penn State University.
University Libraries at Penn State Berks.
Is your cat going to introduce itself?
This is Arthur.
Arthur, do you want to say hi?
No.
He does this every week.
He does this every week.
He's ready for his close-up.
He likes to be on me.
He is my cat, so, you know.
Well, my dogs may introduce themselves vocally at some point, so apologies if that occurs.
Arthur, you're such a good door, not a window.
Yeah.
He's really not moving.
He's very comfy.
Yep, as is his right.
Okay.
Well, we'll let him get settled in.
All right.
Someone put news in here, so I guess we're doing news.
The Libs of TikTok lady is on the Oklahoma library committee now.
Yeah.
It's not good.
For the state, I think.
I closed it for some reason.
She's not from Oklahoma, is she?
Oh, no.
So.
Yeah, not at all.
It's not good.
And it was, I read the, I didn't really follow up with it, but I did read the initial, like,
press release from the, I guess, like, the head of the Department of Education for Oklahoma.
Like the superintendent something or whatever.
Yeah, state level superintendent.
The official letter was like, we're so happy to have the Libs of TikTok creator.
And it's just like, that's so cringe to use someone's TikTok handle.
Like, she doesn't even go by her own name still.
Like, her handle was in the letter like three times.
It's like a three paragraph letter.
It's so embarrassing.
Yeah.
I was trying to find out if Oklahoma is one of those state library associations that has like left the,
ALA. There's a couple of state library associations that have, you know, dissociated from the American Library Association. I couldn't pull up anything quick. My Google-fu is not strong at the moment.
Nobody's is, because Google sucks now. Well, accurate. I could try DuckDuckGo. They don't even do
Boolean anymore. They haven't for a while. You know, it makes me sad though I like Boolean.
Yeah. Same. I'm a Boolean logic nerd as well. But even when I do want to like buy something,
because it's all been like made better for selling stuff, even though I do want to buy something,
like still not working, right? I was trying to like find hotels or something on there the other day.
And it was the most confusing. It's so much worse than trying to like use Google flights.
And like everything is like promoted. But I'm like, what does that mean? What does promoted mean?
These are all sales sites. These are all like kayak.Biz. I don't know what these mean. So it's very
confusing. It's just a bad product. I'm just so tired of seeing ads for TEMU, which like is that like
everywhere. I think it's the new Shein.
It's like Shein, but for like shit to put in your sink and not clothes.
Yeah.
It's just drop shipping.
Some kind of like loophole in like import laws that it exploits by doing some very creative
like shipping options.
And that's why you can buy like a $3 t-shirt.
So it's, but like, no one's bought anything that's good from there, right?
Which is what you'd expect.
Like it's all like junk that falls apart.
It's like Wish or whatever that site
was called. Yeah, but like something even worse. Yeah. Anyway, that was the news about three different
things. The drop is too long. We always try to warn people that all three of us have ADHD.
So if we just ping pong around a little bit, just be patient with us. I promise most of the time
we do get to a point. That's what the notes are for. We work with college students all day every day.
So like I'm just, you know, suppressing my type A and it's all good. She works with me all the time.
So she has to deal with a lot there.
I'm in your world now.
It's all good.
I asked to be here, actually.
Nothing wrong with type A personality stuff.
So you sent me, sorry, when you reached out, you were letting me know about Digital Shred.
And I'm interested in the origins of the name.
Why Digital Shred?
Oh, yeah.
So this goes back.
And Alex, you'll have to maybe fill in gaps in my memory.
I'm not sure if we had thought about the Digital Shred Workshop first or the Digital Shred
identity of our entire initiative first. But really what was going on was I was registering domains
because we thought maybe we would have a web presence separate from our institution. As it is,
as you probably saw, we have a digital privacy literacy toolkit that's on a Penn State branded
WordPress site. So I was playing around with some different names. I like the mouth feel of the word
digital and it feels slightly more contemporary than like cyber shred or virtual shred. I like
digital shred, it feels nice and hard in the mouth. And the term shred has so many connotations
in the context of privacy. Obviously, there's document shred events, which a lot of people are
familiar with. So it's a riff on that, but for the digital context, there's also slang with
shredding, with shredding a performance, you shredded something. Or it used to be a term for when you were like super fit or snatched, you were shredded. So again, there to me was a lot of positive idiomatic association with the term shred that was relevant to privacy-related topics. So,
in that sense, I was looking for some kind of identity to throw at our, you know, growing at the time and yet still growing now, privacy literacy programming, our scholarship, our, you know, professional development initiatives that we have. But that didn't involve the word privacy directly because it felt like this was a little bit bigger and, you know, that we could be a little bit more creative than that. So it started off with liking the mouth feel of digital shred and feeling that that conveyed the message that we were getting across. And Alex, I don't know, again, if you want to fill in any gaps, which came first, the workshop.
or the sort of identity or branding for lack of a better.
No, no, it was kind of concurrent like you're describing.
I think we were around the point when we were trying to figure out a more traditional
digital privacy type workshop around the same time.
I think it came mostly as a thought for like a domain name.
And then because we were playing around with ideas for that workshop at the same time,
it made a lot of sense.
And then that workshop just lent itself so well to, like, the toolkit that it just timed out all perfectly. Many of the things in this kind of collaboration have worked out very serendipitously along the way, and conveniently. But I would say it was concurrent.
Your memory's the same as mine. And like to give full credit where it's due, there were probably some adult substances involved. But that's like neither here nor there. It hasn't been the case for a little while since I've become a mom. But yeah. So yeah. So there's probably some real brainstorming
happening on that one. I'm on them Broward County Tick-Tact. Yeah. I don't know where he gets these
drops. Appropriate. That's from, that's Dracula flow. That's, that's ruined everyone I know's
vocabulary. I don't even know what it is. Yeah, I'll show it to you. And then here's a damn-ass
fucking gay, damn-ass rock. Jay hates that drop so much. It's just a little kid.
It's talking about his pet rock. He's so proud. I just feel like I'm having a stroke.
because I can't, I can't hear the words in it.
I hear gay-ass rock, but that's like it.
And I'm like, everyone else seems to be able to make out the words, but I can't.
Gay ass fucking rock.
It is sheer audio processing disorder for me.
This just sounds like when I'm having a bad day and it's not coming through.
So.
Cool.
So hopefully that gives you a sense.
We kind of, so to go back to what Alex is saying, we had developed and delivered our
sort of flagship initial privacy workshop. That's what we called it. We were like, this is a privacy workshop, maybe the only one of its kind. And it was just a little bit of magic. Like, it went really well. We realized we were hitting on something that was really needed and really desired by our student population. So again, we're talking about undergraduate students in this context, primarily first year students. Alex liaises with our first year experience programming at the Berks campus. And there was so much potential to build on it that we just, we immediately had the sense of this is something bigger. So we started our scholarly
collaboration around that same time, which I think there was a question in the notes about our framework,
which we're really happy to nerd out on. And then this idea of, you know, we wanted to have a digital
presence where we could drop all of the resources that we were referencing to create this programming
because there wasn't too much like it available for other folks to use. And in the course of our
first research project together, we actually surveyed somewhere in the neighborhood of 80 to 100
academic instructional librarians to determine like who else is doing any kind of privacy
literacy instruction programming, what issues or challenges are they facing. And it was kind of rare at that
time. So we're going back to 2019 for anybody to be doing this. And the number one barrier was just time.
Time to get familiar with the topics. Time to develop teaching learning materials. Time in the
classroom. And so we were like anything we can do to save folks time. We're going to open license all
our curriculum. We're going to curate a repository of resources that we use. We're going to do a lot of
scholarly communication, a lot of presentations, a lot of publishing, a lot of train the trainer type sessions.
And so that's, you know, we wanted an identity or again, a brand in scare quotes, to go with that full, full-fledged initiative.
So that's kind of how digital shred was born.
Very cool.
Yeah.
Actually, let's go with the six I's.
What's that model and could you explain it?
Sure.
So one of the things Alex mentioned was, you know, that our collaboration just has these like unicorn aspects to it.
And so one of the things when we first got together that we both really loved was Alex brought all this deep sort of care about pedagogy and the student experience, and this deep expertise in pedagogy and crafting really sound learning experiences and really
interactive and engaging and inclusive learning experiences. And I also care about those things,
but I'm more of the theory wonk. I like to read very, very deeply into the scholarly
literature and then walk out in the classroom. And Alex is the one that kind of like brings me down
to earth and says, yes, but why would students care about this? Like let's figure out how to
craft a learning experience that would let them care about this. So in the course of doing the
academic paper that came out in conjunction with our privacy workshop. So that one's called Privacy Literacy Practices in Academic Libraries: Past, Present, and Possibilities. Alliteration is always good.
That's when, you know, I'd gone deep into the literature and philosophy and law, anthropology, sociology,
computer science, looking at how privacy topics are addressed in all these different domains and
figured that there could be some kind of theoretical framework or model that we could develop using those
findings. So I came up with what I later realized is called an onion model. It's a set of six
concentric rings with identity at the center. And we work out from there to intellect. So the activities
of your mind to integrity. So that's both contextual integrity, this idea that the right people know
the right things about you at the right time. That's Helen Nissenbaum's framework for privacy.
And then leaving the metaphysical space to bodily integrity, spatial privacy, your right to be let alone, your right to have medical autonomy, et cetera. From there to intimacy. So there we start to
talk about communal privacy, collective privacy with your closest relationships, significant others,
close friends, family members, et cetera. And from there, interaction, or freedom of association, and isolation, or your ability to voluntarily withdraw and be in solitude. So those I's all percolated
up from the literature. And we have another chapter in our book that just came out,
taking a deeper look at how that framework came about. Again, that onion model or concentric
model has some meaning when you look at work by someone like Julie E. Cohen, who talks about
privacy as boundaries that kind of protect these different zones and facets of
ourselves and of our social relationships and the social roles that we perform. And as you work your way
from the center of that model of identity out to the outer ring of that kind of interaction and
isolation idea, you're working from this core very sparsely accessible sense of yourself,
right, that even you don't really know everything that's going on down there in the recesses
of your brain out to where you're more accessible to people. So the boundaries become more permeable,
navigable, you know, more and more information and experiences are exchanged across those boundaries as you work out from the model. So the idea is there's more privacy protection toward the center and less at the outside. And we're really hoping to kind of visualize for our students all the positive ways that privacy benefits us in our day-to-day life. So we wanted to transcend technology, transcend even data and data capture and surveillance capitalism, to just talk about what's the positive case for privacy. If you're never the victim
of a cybersecurity hack or a data breach or no one ever misuses your data against you or you're a demographic
type such that you're never going to be negatively profiled for opportunities, right, by a data
brokerage system. Why would you care about privacy? So these are some reasons why those six private I's.
And what we've done since that initial workshop is a lot of our other workshops now look at one of those
frames from the framework specifically. So maybe we'll talk about private bits now that you're a sexy
podcast. So that's our intimacy frame. We've done one on.
on wellness, so a holistic sense of privacy and wellness sort of across that entire spectrum.
We're working on, or Alex is working on a data justice one. So I'd love for her to talk more about
that. But again, kind of, and I have one that I just developed that's coming up in February
on generative AI and intellectual privacy. So looking, taking a close look at that intellect frame.
So hopefully, again, that answers your question. But it's really been something that we can sink
our teeth into and that takes my love of going deep into the literature and the dusty stacks and the cobwebby databases and pulling out these kernels across time.
So going back to the famous 1890 Harvard Law Review paper that most folks know about talking about the right to privacy
and trying to pull that thread all the way through to the present day.
I mean, one of the nice things about having diverging interests and skills, especially in the beginning,
was that it really kind of challenged us to rethink how we approach things as individuals,
which was nice.
And so this approach with these six private I's, it does kind of inspire us to look at specific frames, even though there's a lot of overlapping characteristics to privacy when you're looking at them from this perspective.
But it really does kind of give us focus and almost helps point out when we're missing coverage in a certain area.
So it's been it's been definitely a useful tool for us.
And as Sarah said, it also focuses on a lot more than just the data privacy and digital privacy, which most people are thinking about when we're talking about these topics.
To that point about this focus, I mean, obviously we want to be making the positive case for
privacy, because a lot of privacy literacy programming focuses on the negative and on reputation
management and threat modeling and harm reduction, which are all important and good things.
But one of the things the framework has done is elucidate otherwise hidden privacy harms
or potential privacy harms. So it's kind of a pre-post test type experiment that we've done
in some professional continuing education contexts is, you know, look at a case study that I think we can all
agree is privacy related or privacy adjacent, find the privacy harms or privacy concerns in the case study.
Now let's talk about the six private I's.
Now look at the same case study.
Do you see something different?
Do you see different dimensions of privacy that you didn't see before?
So it's been useful in that sense as well.
I really liked it.
And I put some of this in the notes because I recently read a paper.
I'll link it.
but it's danah boyd from Data & Society.
Yeah, okay, yeah.
And I think it was published a couple of years ago,
and she and a couple of colleagues surveyed a bunch of,
I think it's socioeconomic status, low socioeconomic status,
is the way that she spoke of them,
youths in New York City and how their various approaches to digital privacy,
but they were specifically looking at where these youths assigned responsibility for each one.
So quotes and stuff like talking about like, you know, like you always hear, don't put that on Facebook because, you know, a job down the line is going to read it and not want to hire you. And so like kind of quizzing, you know, where they see the frame of responsibility for doing that. And also things like surveillance. So the same thing for like when they have cops who are violating their privacy or invading it, you know.
What's the framework on that? And all of these, basically all of the youths ascribed a very personal responsibility. Well, she shouldn't have put that picture online. You shouldn't have said that. You shouldn't have used your real name. You shouldn't have talked about your job. You know, these sorts of things, except for when it came to police and law enforcement surveillance. And most of these youths were of color and they were, you know, of a lower socioeconomic status. A lot of them were like first generation Americans.
you know, the kids of immigrants and that sort of thing live in public housing. And many of them
talked about how, you know, basically being racially profiled, you know, you're not even doing
anything. You're just standing on your back porch, but because it faces a road and it's, you know, public, whatever, a cop walks by and starts harassing you over what, what are you doing
right now kind of thing. And from that frame, they put the responsibility on, most of them put the
responsibility on law enforcement and sort of the institutional frame of it. Otherwise, for many of them it was just personal responsibility all the way down. I'm not surprised by those findings and I've not
seen that specific study, but it makes sense to me because what we see a lot with privacy is this
inclination for responsibilization, as it's called, where the responsibility
for maintaining our privacy is put on the individual. We see this a lot across society,
generally beyond privacy, like recycling. Like, oh, the individual can make the difference by recycling
instead of putting it at like the actual producers of most of our waste, correct?
So same thing with privacy.
This is a convenient narrative for like big tech companies to put it on us to make us feel
responsible for the mistakes that we've made or what we could be doing to manage.
That's why partially Sarah and I try to reject a lot of technosolutionism where we're focused
only on like Facebook settings or like privacy settings and all these social media platforms.
They only do so much.
But I'm not surprised that these youth were really focused on that because that's the narrative,
but that they got a better sense of the systems of power because they have direct experience with it with the police.
So that makes perfect sense because they have the lived experience to be able to identify the systemic issues that they're living day in and day out.
And so that's one of the things that we're going to try and start addressing in like our newer workshop with data justice,
trying to understand those systems of power. And in a lot of cases for us and our experience
with Penn State, it can be a lot of folks who are a little privileged in their life experiences.
Sarah and I included, and that's something we try to address in some of our workshops,
particularly private bits because we're both like cisgendered, you know, in long-term
relationships, ladies. So, but anyway, the point is that in the data justice workshop,
we're trying to focus on helping people understand how they contribute to these systems, whether they are impacted by it or not. So just our complicity in those systems
can have larger impact systemically. But that's so fascinating and it makes perfect sense to me
because they're aware. They're perfectly aware. Whereas with all these other things with like
social media or like revenge porn or anything that might be happening, it's easy to blame yourself
or to blame the victim like we see so much. And honestly, I think they're taught that a little bit in the
Digital citizenship standards from like ISTE.
So Alex talked about that dimension of responsibility.
That's one of the big critiques of privacy literacy that comes from Thilo Hagendorff's work,
privacy literacy and its problems.
So that's a little bit of a learned behavior.
I'll also confess this is an area where I really struggle.
Alex and I have divergent views on many things,
one of which is what is the role of regulation in this space?
I've read a lot of privacy regs.
I've read FERPA many times.
I've read like the California, whatever consumer privacy act. I've read HIPAA. So I know that they're very limited in scope and they've got huge loopholes. You know, the school officials, educational interest loophole of FERPA and the business, legitimate business interest loophole of CCPA, right? So I'm a skeptic in that regard. And then I have like libertarian political leanings as well. So it's always like I don't know if introducing government action in the space is actually a long term good.
right? And I think the conversation might be different for minors than it is for adults, but then you get into issues of age verification techniques and technology and use of biometrics to do these things. And so that's a whole other level of fuckery. So I think. Yeah, it's a slippery slope. It's so complicated. Yeah. But what I really love is Lawrence Lessig's whole framework. I think he or someone referred to it as the pathetic dot theory, but I prefer to call it the four regulators. So this idea that if we, you know, as individuals are the pathetic dot.
We've got these four forces acting on our autonomy throughout the world. So it's sort of the state
regulatory force. It's the architectural design environment force. It's the market force of supply and
demand. And then it's cultural forces. Like what are the norms? And of those four forces,
I feel like education, privacy literacy, can directly influence that cultural norms force of the
four. And I'm a big believer that most other things are downstream of culture. So the
long game is, you know, influence the culture, therefore influence the architecture or the
environment and the market and the regulation, hopefully in tandem. The other thing I'll say there,
and like this is a bigger area we can talk about if you're interested is I'm an intellectual
freedom maximalist. So I'm coming to my privacy work from the perspective of privacy as a
precondition for enjoying, you know, both the intrinsic and instrumentalist values of intellectual
freedom. So I'm always a little bit careful to introduce or careful to either, you know,
fully disclose or try to control and regulate my political and sort of like policy perspectives on
these issues because at least working with a college-aged population, I want to be able to say,
like, here's the facts, here's what's happening on the ground, what do you all think about
this situation? What do you think we should do? What do you, you know, you think you should do as
individuals? What's your responsibility? What's our collective responsibility to each other? And I think
again, that might be a little bit of a different bent than when you're working potentially with
the K to 12 population or a youth population in a public library system. And that's such a huge
point for our entire philosophy behind our work. We try not to be prescriptive in what we're teaching,
what we really want. We're really big on reflection and a lot of time spent with students
figuring out their values and how this fits into what their lived experiences and values are. We
try to really respect that. And I think that's part of why it resonates and is so successful with
them because they're not getting like lectured. Because Sarah and I are now getting to an age
too. We're getting a little older. But they, they really respond to that approach where they're being
respected as peers and they're bringing stuff to the table as well. So that is a really important
point in our teaching philosophy as well. Oh, and I see there was a note here that was from Sadie about
a discussion of nudes as well.
Yeah, so part of that, sorry, that was supposed to be sexting and it got auto-corrected,
part of the study from danah boyd that I was thinking about while I was reading the six I's framework
was it was interesting to read from these youth, the frame that they put that sort of sexual aspect of intimacy and privacy,
because most of them acknowledged that things like revenge porn or, you know, releasing nudes that were given in private is a bad thing to do. But the responsibility was still, well, most of it was, of course, you know, she should have known better kind of thing. And, you know, it's not, it's not her fault. It was, it was one of those areas where they shifted more towards, out of the personal framework of it, into more of a, this is something that shouldn't have happened. But they didn't,
quite get past it, if that makes sense. They still couldn't let go of the point where they
had control and somebody let go of the control. So you send, you send the nude and that's the only
action that the person who's victimized could have, like, controlled at that point. But they
never quite got beyond that framework and how it therefore should have affected privacy in general.
So I thought it was a really interesting, like, frame how it was surveillance and it was the sexual
were the two places where these youth could start to get beyond the boundaries of personal
responsibility when it came to digital privacy. And my thought was they're both very physical.
So, you know, when they were talking about law enforcement surveillance, they're talking about
cameras in their neighborhood. They're talking about what was that fucking technology,
the one that detects gunshots, you know.
Oh, shot. Not ShotSpotter. ShotSpotter. ShotSpotter. Yeah. So they're talking about,
you know, and the physical presence of police and how that affected how they, how they, how they
framed their privacy. And then, you know, on the other side, there's either the sexual where it is also
very bodily, you know, even if it's going through a digital medium, it's still a very physical
thing. It's a very, you know, intimate thing. So I really thought it was interesting that those were
the two sort of areas where these youth were starting to, starting to comprehend something beyond
the personal. And yeah, I think the physical, versus how it's easier to ignore if it's digital, is really an interesting area to me, at least.
Absolutely.
And the intimate privacy stuff is so interesting because intimacy requires that disclosure.
Like that's the first point of the framework where we start to talk about privacy being shared.
And, you know, suspending like the moral judgment of it.
Like sexting would be, you know, not an expected, but would certainly be a legitimate component
of an intimate relationship in the same way that like touching would be an expected component
of an intimate relationship.
And the voluntary consensual relinquishing of that privacy to say, like, we share this together, but not with anyone else, is part of intimacy.
So to me, the violation of the privacy is the violation of the intimacy and the non-consensual sharing of the images, right?
It's not in the initial like, hey, check out my tits.
You know what I mean?
Yeah.
Because that part, that's an intimate act that was intended for a shared, you know, whoever belongs to that intimate relationship, the couple or whomever, right?
it wasn't intended to extend beyond that. So to me, the privacy harm is once it left that relationship, the context of the relationship. And while we're on this topic, I just want to plug a couple of resources, because non-consensual intimate image abuse is a really big problem, primarily for women, often for women of color. And again, you know, women of low socioeconomic status, women of different abilities, all these things. So StopNCII.org, stop non-consensual intimate image abuse. StopNCII.org will facilitate the takedown of these images if you're an adult. If you're a minor or
someone you care about who is a minor is suffering from non-consensual intimate image sharing,
it's Take It Down. And that's associated with the National Center for Missing & Exploited Children.
So both of those services basically generate a secure hash based on images. They have a secure
way, you know, as secure as anything can be for you to upload images of concern. They generate a
hash of the image and they're able to, based on that hash of the image, seek out that image and
other venues where it's being copied or used in generative AI to produce, you know,
digital sexual identity fraud or deep fake porn. And so those services can help you take down
any images that you don't want out there. But yes, fascinating topic. And I would argue or agree that
that's an interesting frame to be looking at and the embodiment issue. I don't know too about the
shame experience. And I think, Alex, you've read Cathy O'Neil's Shame Machine book. I don't think it's made it to the top of my to-be-read pile yet. But I don't know if in other digital contexts that
aren't as embodied if shame has the same salience as it does with the intimate image sharing or with
the engagements with law enforcement. But that's another dimension of that that comes up for me. And that's
part of that harm reduction approach that's really been popularized by Library Freedom Institute
and Alison Macrina's work. So it's something that informs some of our work. It's just not the only
way that we approach privacy literacy for sure. I had never thought about shame with regards to like
privacy discussions before. It's really interesting. Very connected to, like, secrecy, which is what we see a lot of folks start with with their conception of privacy. And it's like Sarah was saying, it's so embodied.
Like we can all get behind like, oh, we need a closed door for the bathroom, right? Our bodies seem,
and culturally, going back to what Sarah was saying, too, the cultural norm is we're protecting our
bodies. We have like closed doors on dressing rooms. So that seems to be something everyone can
wrap their mind around. Whereas it's a lot harder to think about contextual integrity because, like,
I did one thing, like say I was browsing, whatever it may be,
something sexual. And then a targeted ad comes up while I'm presenting in front of colleagues that
indicates that I've been browsing something shameful. People can wrap their mind around the closed
bathroom door a lot faster than they can, oh, my data is outing me in front of my colleagues, because it's
a little bit more abstract. I think, too, it might be one of the first ways that we learn about privacy.
So I have two little kids, very little. And the one is two years old. And so we're starting to talk to our
child about when people want privacy. Because if you've ever been with very small children, they
have no privacy or no regard for your privacy. And like we've started that conversation around the
bathroom, you know? And like as a privacy literacy practitioner and scholar, that's probably not
the end, you know, and all be all for me of what privacy means or what I want my kids or my students
to think about privacy. But that's like the convenient first lesson of privacy, you know, and
our kid understands. Like, they'll go around saying around the house like,
so-and-so needs privacy because they're, you know, peeing or pooping or whatever.
So it's kind of a privacy literacy win. I don't know. But it did start around like bodily functions.
And I mean, hopefully not associating shame just yet with those bodily functions. But it definitely
starts around this embodied experience of like you're doing something that you don't want
anybody else to share with you. Yeah. I want to go ahead and jump into the private bits workshop.
Because we are a sexy podcast now. It's in the notes. No one could.
see that. So ever since, I've just been getting a lot of followers on the Twitter account of people who are just sexy internet people. And I'm like, oh, cool, we're like a sexy podcast now because we were at a live show. Me and Jay were at a live show for a relationship podcast, Radio Free Toapag. And I went around just kind of spanking people with a riding crop. And I've made a lot of friends very quickly. I bought it at the Leather Archives & Museum, support them. They have a capital campaign going on. I just heard Alex Ketcham on. Yeah. Yeah. Yeah. Well,
Yeah. And I will say I listened to a couple of, well, I listened to Becky Yoose's episode, because I'm kind of affiliated with one of the SPARC working groups related to privacy. And Jay, I think you were talking about a trans porn star who was doing skateboarding tricks with a finger skateboard, inspired. Yeah, gotta.
Yeah. And I was like, okay. And then there was some other episode where you were just talking about public sex and debating like whether public sex was okay as adults. Like shouldn't you have your own space now or whatever? And I was like, all right, these folks need our Private Bits workshop.
Yeah, Sarah email me and she was like, I think we need to like, we need to talk about
Private Bits.
We love privacy.
We've done, like we've had Allison on before and we're perverts.
So like, you know, perfect combo there.
Yes, absolutely.
So, absolutely.
Private Bits is the union of these things.
Exactly.
It's a solid, it's that like solidarity meme.
Like if you go.
It is amazing, like, the fact that like, we're emailing articles to each other about like porn
and like all sorts of crazy things. It's hilarious. Like I love that it's a legitimate business.
And if you search like the library guides, you know, Springshare's library guides community or you
search our library's catalog for butt plug, our guide will come up because we have case
studies talking about hacked, you know, Bluetooth enabled butt plugs and like the hacked butt plug.
Yeah. So it's like, you know. I have not heard of this.
Well, when we first started this journey, I started reading, and we'll talk about this book, I'm sure,
but artificial intimacy by Rob Brooks.
And I texted Sarah a picture.
Like within the first chapter,
I learned a new term and it was so wonderful.
Teledildonics.
Yes.
I was like,
this is going to be so much fun to research and learn about.
I'm really into Project Xanadu and like digital gardens.
And the guy who came up with like Project Xanadu also came up with the term
teledildonics.
And I'm like, he's my favorite person.
That was.
That was actually the inspiration. It was like pandemic era, stay-at-home order lockdowns. So people are doing Grubhub, people are doing teledildonics, right? So if you're not familiar with the term, we're referring to sex tech devices that are basically internet of things enabled so that you can be controlling sex tech or a dildo that your partner is using at a distance, right, during lockdown, during COVID. And there's this company, I think CamSoda, came out with one called GrubBuzz. So GrubBuzz
was the vibrator that synced to your Grubhub account and the activity of the vibrator intensified
as your food order neared delivery to your door. So you could essentially enjoy yourself,
come and have your doorbell ring and your pizza arrive. Oh, especially if you were like a feederism
kink person, right? You were just feeding yourself. Yes. There's a market for all of this,
which is fabulous because we were able to also adapt to this workshop for startup week at Penn State
by focusing on the entrepreneurs of the sex tech industry, which is a really huge growth industry.
So endless, endless linkages and connections. But yeah, so those teledildonics were the inspiration. And I will say, like, Rob Brooks's book is great. He comes at it from that evolutionary biologist perspective. But if you really want to do a deep dive on the sex tech, Kate Devlin's book Turned On, which is about sexual robotics and kind of the material culture of sex tech, is also really, really good. What was that author's name? Kate Devlin, D-E-V-L-I-N. Turned On is the title, the main title.
I'm like, I've got a list going already. So, and like, we can fill in gaps in the notes if you want. But yeah, and that was, I kind of emailed Alex and I was like,
as she said, like, I'm a cisgendered heterosexual, white female who has been in a monogamous
relationship for like longer than I'd cared to admit. I don't know, 17 years, something like that.
So I was living with my partner when we were in lockdown. And that was great. And I've not really
had a need for these kind of devices or didn't really know they existed. So when that came up in sort of
my case study searching for updating the toolkit, I was like, I think we might have to do
a workshop all about this, like all based on this one article.
And then, of course, I think Rob Brooks' book came out contemporaneously. It was like I ordered it almost in that same time span. And then a couple of years ensued because had some babies and had some other leave issues from the university. But yeah, so eventually we were able to deliver private bits last spring twice. We did it once as a standalone workshop as part of our series. We did it again as an invited adapted workshop for startup week for Penn State. I should say Penn State Works. We offered it university wide, but I don't want to make it sound like Penn State University was inviting us to do this. And now we're going to
to offer it annually in the spring. We're kind of hijacking the Love Data Week branding, right? So Love Data Week.
Yeah.
As a verb, like, love your data. And we're saying, no, love is a noun. This is love data.
Yes. Love data. That's love data that we're talking about. I love that. Yeah, I was looking
through all of the materials for this workshop. And one of my favorite things about it was the part
where it's like you kind of talk about like the data surrogate, the metadata surrogate of your body.
because I've talked about that before, like, with regards to, like, archival silence and, like,
the types of data that's collected about you and all of that, because I'm really fascinated
by this idea of, like, surrogacies in the catalog. And, but then embodying that surrogate and, like,
the different parts of the surrogate and who collects different parts about what and what they do with it,
I was like, oh, this just connected, like, I knew that, but it was just a way of reframing it that
just totally, like, that was, like, the part that really clicked with me.
in that workshop. That's awesome. Yeah. And we're, you know, the data double or data surrogate issue is so
fascinating because, again, if you're, you know, a cisgendered heterosexual person, you might
think, well, I'm not sticking out in the data, right? Like, I'm covered because I blend in. You're like the Tor browser. Exactly. But, you know, part of the point of at least some of the talks that we've done on
data justice and social justice is related to privacy that I imagine will surface in that workshop that
Alex is working on is. So if you're quote unquote normal, you're the standard that everyone else is then
profiled against, right, and downranked against. So that complicity in data brokerage and in the data panopticon, as Oscar Gandy called it, or the panoptic sort, right, is that even if you're maybe not experiencing some harm, the fact that your profile exists as normal can result in some form of harm to others, right? This is just an argument for letting your freak flag fly in public, possibly? I don't know. Yeah. And you know what? That all gets back to, and I always think of, like, the three C's, like consent, choice, and I'm going to forget my third C now.
But like, yeah, you should have the choice to let your freak flag fly. But if you're uncomfortable
with that, it's that consent. It's that choice that gets taken away that's so painful for people.
But I agree with you. You know, if you've less shame about it, there's less danger, I suppose. Provide some cover for the rest of us. And I think it's also, it's hard to play the long game, right? So you mentioned danah boyd already. So we're fans of her work,
especially. We use the properties of web data a lot when we talk about contextual integrity and
context collapse. And one of those properties is persistence, right? So we say the internet never
forgets, of course, except when it does or never knew in the first place. And that's usually really
not to our convenience or to our benefit. The things that it remembers are sometimes the things we'd
like for it to forget. And this is another area where I think Alex and I maybe have divergent views, or at least I have unsettled views. Because of the intellectual freedom and free speech maximalist in me, I'm like, what is it appropriate to allow the internet to forget? What is the right to be
forgotten if something's been codified somewhere publicly available somewhere? So there's lots of
interesting questions, I think, in that space. But again, intimate data, sexual data, maybe of a
different nature and beast because, you know, is there a reasonable expectation, or is the reasonableness standard in place, where the average person is going
to expect that kind of information to be public about them or to have access to that information
about an unknown person that they're not intimate with, right? So, yeah. But even arguably,
I think the people who might be more on that, like, letting-your-freak-flag-fly bandwagon, like sex
workers, I've read a lot of stories about them being doxed by clients where they're just outed.
Maybe they have a persona where they're leading a whole other life. And it's like destroying their
entire life through that, that doxing situation. And this just happened to an academic, right? He and his
wife were creating pornographic videos. Oh, yeah, that's right. And he, I think he retained his, like,
tenured faculty position, but he lost an administrative position. So yeah, I thought that was a really
strange flex for an academic institution. I feel like a university should just go, eh, he's good at his job. So he makes
porn, whatever. And it's with his wife and like invited third parties. Like, what's the big deal? But
Yeah, you know, I think there was some questions in the notes about like tips for sex workers.
And so my first tip is like, ask another sex worker because I don't have any lived experience or earned
wisdom in that space. And I haven't done enough reading to know. But it does call to mind.
So this is going back a few years now, but Safiya Noble's book, Algorithms of Oppression, right?
And her kind of rhetorical and code analysis of the search for black girls in Google searches and in Google image searches in particular.
And I really struggled reading that because I thought, well, there are some black girls who want to be discovered for their sex work based on those keywords.
And I think she does a deep dive in the book, but certainly if you're familiar with like search engine optimization and keyword searching, the porn industry was a huge force in developing those techniques and sort of those back end indexing tools.
So it's like, well, the PageRank algorithm is kind of revealing that a primary use case for people searching black girls in images is that they want pornography.
for better or worse, whatever the moral argument there is, like, you know, what are the pros and cons of essentially censoring, which is I would argue what Noble was suggesting, right, in her book.
Censoring or otherwise, we would call it now alignment, right? AI alignment. Aligning the search results for that phrase versus the use case of that phrase that, you know, the behavior of the algorithm is revealing because it's responding to what users click when they are looking for that phrase.
Yeah, and a lot of this discussion about like what information is out there and what gets revealed and everything.
Like right before the pandemic started or like right at the beginning, the trans writer Ana Valens, who I really want to get on the pod.
She wrote this great article in The Daily Dot about the sort of like queer discourse around public sex, around kink at Pride, like around all of these things.
And, you know, talked about the usual things like cruising and all this stuff.
But then near the end, she just like dropped a bomb and starts talking about like surveillance capitalism.
And how like, you know, when you have like Alexa or any other virtual assistant, or there's like unencrypted sexting, or like using Discord, like all of these things, it's like all sex is public sex now.
Literally all sexual activity is public sex basically because of surveillance tech in our lives.
And so like the sort of argument against public sex is a completely like there's both like like a privacy aspect to it.
but it's also like people not recognizing the ways that their public sex is okay and others isn't.
Oh, I will tell you when we do the Private Bits Workshop, phones come out when we get to the slide talking about how data flows from porn websites to third party tracking and data brokers.
Phones come out, taking pictures of that.
Phones come out, taking pictures of the data that we have about how porn travels across the internet, how it's like the majority of.
traffic on the internet about how much data is collected about their sexual preferences. Because again,
this gets back to context collapse and contextual integrity. Just because you watch something on, like, say you go to Pornhub and you're watching something, there's inferences that get made about you, whether you like it or not, and those profiles will exist
and follow you. So yeah, it's fascinating. Students though care. Like I commented to Sarah the first time
we did the workshop, it was hilarious. Like, they were all of a sudden very concerned.
So, what we think is private, we don't realize how it actually influences our public data
double. But data certainly. Absolutely. Yeah. It's embodied like we were talking about earlier.
Yeah. Yeah. I wanted to bring up, when we were talking about teledildonics earlier, and this is all staying on the same point, but how these issues are more, like you were talking about having the
cover of being cisgender and straight.
How, by the very nature of the queer dating game, I feel like you're more likely to have to, you know, be in a long-distance relationship.
Like, I know a lot of trans people who
they're in long distance relationships.
And like, or if
you're in polyamorous relationships
or, you know, teledildonics, you're more likely
to use them. Dating apps of
certain types, but also just like phone
colocation. You know, I know a bunch
of people who just went to Magfest and definitely
were fucking. All their phones were in the
same room, you know, like that kind of thing. It's like all going to a protest and all your phones are in the
same place. Yes. Yeah, yeah, exactly. There's a great interactive New York Times article about
geolocating phone data and how it can reveal all these different habits and activities, for sure.
It's so identifiable. But yeah, definitely there's a, I think it's a Yale Law Journal article from
Danielle Citron on sexual privacy that goes over a lot of the technological harms. Well, it's
fabulous. It goes into like potential legal outlets too for solutions for this, which I know Sarah and I
sometimes go back and forth on. We're on a spectrum. It goes into how technology while it can facilitate
these relationships and be like such a blessing to people, particularly in marginalized communities,
it just exacerbates harm, though, because of the ability, with this data, to increase the scale and scope of harm. So yeah, Danielle has a
book-length work, too, called The Fight for Privacy. She's like a literal genius. She's a MacArthur
fellow. So that's definitely recommended. Yeah, that's the whole thing with every aspect of privacy.
It's like this really delicate balancing act. And it's not black and white. All of this technology
is really, it can do good in our lives. It's just there's this ignorance or like this lack of,
like you were talking about the frames and danah boyd's study. It's that people have difficulty conceptualizing it, which is what I think the six private I's for us is able to help us articulate for students, because it's so hard to conceptualize the harms when they're so invisible.
And I think that, again, is learned and then by design when you get into what have been called dark patterns or deceptive design.
And there's one specifically on privacy that they call privacy Zuckering, in dishonor of Mark Zuckerberg.
Right. So, you know, that surveillance capitalism, again, Shoshana Zuboff's work, if you haven't read that book, that's another good one to add to the to-be-read pile, talking about the underlying architectures of all of this technology.
And she basically says, smart is a euphemism for surveillance. Like if you have any piece of smart tech,
smartphone, smart hub in your home, smart dildo, like whatever it is, that's a euphemism for your data
is being rendered out of that device, used to profile you and used to categorize, rank, and sort yourself and other people, in order to predict or to nudge your behaviors, make you more predictable,
because what's predictable is profitable. Like that's where the capitalism angle comes into her
model. And yet, I think when this data is intimate or sexual data, it's especially pernicious,
and that's the point that Kate Devlin made in her book that really, like, made me pause, which is
that, like, sexual data has the power to completely destroy lives. And there actually was just a
notification that went out from the FBI and a couple of other collaborating, you know,
law enforcement agencies about new sextortion schemes, where these violent cartels are using
publicly available social media images of minors, innocent pictures. They're using, you know,
using generative AI technology to create digital sexual identity fraud, deep fake porn,
sending the deep fake porn to the target victim, the minor, and saying, I'm going to release this.
It looks like you. It's indistinguishable from a real video of you unless you do X, Y, and Z things.
And those things range from creating actual pornography to send, to harming someone else, to harming themselves.
There's something called fansigning, where they carve someone's name into their own flesh.
So it's like getting super dark and it's targeting kids.
And it's like intersecting all these.
existing problems and amplifying them to just such an extreme degree. It is truly disturbing.
So I have family. They're like, why won't you just email me a picture of your kids or why don't
you have pictures of the kids on Facebook? I'm like, here, read about all the horrific things people
can do with one photo of a child, you know? So yeah, so it's tough because as Alex said, like,
I'd love to live in a world where everyone was a nice person. Unfortunately, like two percent of people
are total sociopaths. So like, what do you do in the world? Yeah. And like, I was thinking as well,
it's like to bring this like to libraries too.
So someone called me a CIA plant on Twitter the other day because I said that libraries should, and often do, have porn, including public libraries.
Right.
And this like really hardcore statist DSA person was like, it's not very anarchist of you to like want the state to control what kind of porn you watch through a state institution like the public library.
But like it made me think about the point though of like.
oh, if people, because I'm like, I'm of the stance that like people should be allowed to watch porn in a public library as long as children can't see it. But like sexual health is like a right that people have, etc. There are other discussions to have about like how does that affect safety of like the library workers and everything, but on its own. But if it is tied to your library card, which I guess you could make it not tied to your library card, then like we say that we don't like to collect records, but what about our vendors that we use? You know, they will absolutely collect patron data. And
even if we say that we don't hold on to that,
And sometimes you have to be like, stop it. Like, I know I've done instruction sessions before where I've said, by the way, if you have a lot of privacy features in your browser or something, that might break the database, and you might need to go into incognito mode or uninstall or stop these extensions just so you can fucking use EBSCO, you know, that kind of thing.
So it's like, even if we don't collect the data, a vendor might, especially if it's going to be sex-related, because of, like, age or whatever. And so I was like, there is a little bit of a point there about a state institution getting to see what kind of porn you watch. I mean, they are. Yes, they do. They love it. Your FBI handler loves it. It's all known. No, I think that's very intellectually charitable of you to give them that point. And I agree. I mean, I think the case of porn in libraries is an interesting conversation. And for me, I
have some ambivalence, not in that I don't care, but in that I see a lot of sides of that issue.
And on one level, I think there's like, there's watching porn and there's watching porn for edification.
And then it's like how far down the line of self gratification are you going to permit, right?
Before it becomes.
Right.
Before it becomes like a biohazard.
And it's a whole different conversation.
Yeah.
Sorry.
You were saying something else really interesting.
I lost my train of thought about it.
No, just the complicity of libraries.
We like let this happen on our watch, essentially.
Sarah and I talk about that all the time.
Like, we say we embody this, we talk about it a lot.
I mean, I think that's partially why Sarah and I... we haven't given up talking about it or dealing with it. When it comes up in our day-to-day jobs, we definitely address privacy concerns as they relate to higher education and libraries. But we've kind of gotten bored with the conversation, because we let it happen on our watch. We can talk about patron privacy all day long, but we kind of lost the battle, and we find that it's more interesting and fruitful in our positions in academic libraries
to have more of an impact on students and their future impact on these technologies.
Because we're working with students who are going to be writing the code,
they are going to be creating these technologies, they'll be investing in them,
they'll be applying them, whether it's in medicine, in criminal justice, whatever it may be.
We can infuse just a little bit of knowledge and ethics and thoughtfulness into how we approach this and how it impacts our society, privacy norms, our futures, and our potential. That, to us, is where we feel like we have the bigger impact, as opposed to, like, patron privacy, which is sad. We don't want to give up on that. And we do, like I said, we do address it still. But we were complicit in just the slippery slope of letting it happen over the last, like, two decades. You're totally right. Like all of our, even with open access,
there's issues with like data collection. Well, it's all hosted by Amazon services.
Ours is. All our values are, like, clashing. Yeah. So I think kudos to some folks at Penn State University Libraries, which is finally moving away from Google Analytics. Hell yeah. Shouts out.
Yeah. Jay, when you were talking about library data collection, you know, the library as an ISP in terms of what you're streaming over the computers or accessing through the databases. So obviously, you know, you had Becky Yoose on recently, and she and Nick Shockey did that awesome report for SPARC about Elsevier and ScienceDirect. Dorothea Salo, I'm not sure if you all have had her on.
Yeah, she does some great work on privacy.
I think we've had her on twice.
Yeah.
Yeah.
Like three times, actually.
First guest.
Fabulous.
So she has her piece on physical-equivalent privacy, basically saying that the things we let slide in the digital sphere would never, never, never be allowed to slide if they were happening physically. And I think that gets back to your point about where students, like, you know, young youth, whatever, not college students, were recognizing those embodied physical experiences as they relate to privacy, you know, more readily than they were maybe recognizing some of the metaphysical things that we get to talk about with the sexual privacy framework. So yeah, full circle.
I did want to, because Alex, you were starting to bring up something I wanted to maybe close on.
Sorry for a spoiler alert.
Oh, no, not really. Because I was interested in... with teaching any kind of literacy, information literacy, concerns about AI literacy. I'm sure we're going to be hearing about AI literacy for the next year, and it's going to drive me nuts and I'm not going to want to hear about it. And then privacy literacy,
all of this literacy training, we always talk about the effectiveness of it. And you mentioned
having a future impact on students who are building things because we've already sort of
lost some of the battle with vendors. But a lot of what I am interested in is the interaction
between the library and the vendors so that we can actually change the relationship,
contracts, state rulemaking, you know, if you don't provide us with privacy, not only are we going
to cancel, you're going to pay us our money back, that sort of thing. We can pass that all the
time in a state government when it's about like, I don't know, prohibited business because they're
somewhat tangentially related to China. We can put that kind of language in there all the time and
be like, if you annoy us, you give us all of our money back. But we, you know, why can't we do this
for privacy? And a lot of the stuff that SPARC is doing, there's contract working groups and privacy working groups, and those are, like, hand in hand. So I think there's still a lot
of work that can be done that way. And it's a really good material way of approaching the problem.
But how are you, aside from future impact, how do you feel that doing literacy training is going
to have a broader impact outside of making people more aware? Yeah, I mean, I think our goal is always to
kind of go above just awareness and into the value. This is the values-based approach that Sarah was talking about
at the very beginning. Getting people to understand this is bigger than these small little niche things.
Like this isn't about targeted advertising. This isn't about like the ad that follows you across the
internet. This is about how this is going to change your life opportunities as a result of that
data that is being collected. Like, yeah, you might see the Temu ads, like we were talking about
at the beginning for something random that you wanted to buy. But this is also going to impact your
creditworthiness when you go to get a mortgage. Increasingly, financial institutions are buying up, like hoovering up, data and making decisions on your literal mortgage rate, or your rates for all of your loans, based on your online activity. This is going to
fundamentally change our opportunities, some of which will impact all of us, some of which will
impact more marginalized groups, whether that's a socioeconomic status, a racial group, gender,
and sexual identities. It's going to impact the marginalized groups the most, but it will impact
all of us. We want to bring that value system and drive that home with students, but also because we have
this opportunity to work with faculty, which is one of our next steps. Like, I'm going on maternity leave,
but when I return from that, that's one of my big goals for next year to get into some faculty
training where we're helping them design their curriculum to infuse some of these issues and
some of these ethics into what they're teaching. There is a dearth of this available.
Even the ISTE standards, and Sarah would know better than me, a lot of them are like little checkboxes for these kinds of ethical things. So, like, say there's a requirement for one single class in a tech course that requires just one single lesson on ethics. That's like the
extent of it. So what we try to do is with our outreach, with our professional development training,
with our toolkit. We try to extend it beyond Penn State just to get other people talking about this,
other people doing this work, and infusing it into higher education a bit more. We'd love to see
standards across the board in libraries where we're going to have K through 12, higher ed, public libraries,
have privacy literacy standards where we're actually doing authentic work to see this kind of move
toward change. And I totally agree there are amazing things going on with, like, holding vendors accountable through SPARC. Earlier, my comment was more that I found it more fulfilling to put my work in a different direction, toward the literacy. But I do admire everything SPARC is doing. It's amazing. And I think we should be holding vendors accountable with the power that we do have as libraries, as the main consumers and purchasers of some of these tools, which we should have been doing all along. Now that I've rambled, Sarah, fill in any blanks that I just had in some of this.
Sure. So a couple of thoughts. So we are actually doing a faculty-facing workshop in February,
also during Love Data Week, called Minding Privacy, which is a Privacy Pedagogy Workshop.
It's informed by one of the chapters in our book that just came out with ACRL. So the chapters by
Lindsay Wharton, Liz Dun & Bride, Adam Beauchamp, and it's about privacy pedagogy.
So this is the idea that you're teaching with privacy informing your learning design.
So teaching with your privacy principles while also teaching about privacy in your disciplinary context.
So one of the outcomes that I'm hoping we'll workshop during that event is actually developing some student privacy syllabus statements.
And then again, working with faculty to start to explore what are some privacy concepts relevant and germane to their disciplines that they start to infuse into their curricula.
And then I think what Alex is talking about in my vision makes more sense to do on an individual consultative basis.
So that's coming up in February.
You can find those workshop materials.
Again, everything's open license and freely available on the web.
The workshop's called Minding Privacy.
If you look for that through Penn State University Libraries, it'll come up.
Back to this conversation about the good fight with vendors.
I think it's a noble effort.
It's something that, you know, when I used to work on the tech services side of the house,
I would have loved to be part of.
Being that we're more on the instruction and reference side of the house, it's not so much our area of focus.
But if you're interested in the student perspective on these issues, the Data Doubles project, which was led by principal investigator Kyle Jones, and Dorothea Salo was involved with that project as well, did some really great work, quantitative and qualitative
mixed methods work, exploring students' perspectives on the use of learning analytics by libraries
and the participation of their academic libraries in learning analytics initiatives, including what their
perspectives were on when libraries share their data with third parties. So that's worth a look.
A big takeaway from that for me is that, broadly speaking, students still trust libraries, and that, I think, is something that's on the table for us to lose. And so in that SPARC
Working Group, where we're looking at this resource library for different community members and
stakeholders in the academic community to talk about privacy issues, trust, institutional trust,
and basically reputation is something that we're saying like, this is at risk when you violate
students' privacy or violate scholars' privacy in these ways. But that dynamic changes when students think
about sharing of their intellectual data with third parties. So back to the Google Analytics, right,
running on all our library sites. As soon as
they recognize, oh, the library is leaking this data about me to these third parties. Trust declines
precipitously. Same thing. It's a different question. The trust is broadly there. But when you start
to slice and dice student groups demographically and look at various members of minoritized communities, that trust also declines. So if you care about diversity, equity, inclusion,
access, belonging, justice, privacy has to be part of that conversation. Yeah. So that's another,
I don't want to say angle that we're playing up because that sounds very exploitative. But it's another way that
we're trying to demonstrate, this enhances the relevance of libraries to our communities and also
aligns with other strategic initiatives that we have. And I think anytime you have that alignment
and that resonance between efforts, it's a positive thing. Yeah, I wanted to ask where,
because I saw that you're building, especially when you go in like the ACRL framework site,
you can see that you're building a lot of workshops and building a lot of resources. Where are you going
next with this? Is it going to stay under, like, Digital Shred? Is that going to keep growing? How are you seeing this growing in the future or changing in the future?
We did just have our edited volume come out, I believe in October, correct, Sarah?
So that book did come out in October, and that's kind of looking at privacy literacy across... oh, look, you have it sitting next to you. Mine is over here too, but it has Post-it notes all over it.
So that did just come out. We also are working on a co-authored book, a privacy literacy field guide, where we really do want to bring a lot of guidance on the pedagogical side of this and the educational side, the different ways that you can approach this, a lot of what we talked about today, to practitioners. So that's really one of our big next projects, and that's hopefully forthcoming in 2025. We will see there. But that's really one
of our big next steps. We have a lot of other, you know, irons in the fire and ideas that may be
forthcoming as well. But we are very dedicated to finding ways to inspire and empower library workers
to take on the work. Obviously, our area of expertise is in academic libraries. We've both been
working in higher education for over a decade. But there's amazing work happening in public libraries,
as I'm sure you guys know, and K through 12 as well. So we'd like to partner. That's another big goal we have. We would like to partner with some K-12 and public library workers to create some sort of standards.
But we'll see how that progresses over time as well. Sarah, did I miss anything?
Two other things I'm working on. So I'm developing a semester-long course proposal that I hope
will run as a gen ed course at Penn State Berks. And that I'll eventually release as an open
textbook. So I have to talk to the powers that be first. So it's kind of... I'm developing the proposal and the syllabus. I'm not sure how that can work in tandem with releasing an open textbook, but at some point there will be an open textbook on these topics, kind of linking them all together in a course-long experience, or semester-long experience. And I'm framing it in a project that does controversy mapping, which is a method I'm really excited about out of science and technology studies. And also with intellectual virtues, which is another thing I geek out on. So looking at curiosity, epistemic responsibility, and open-mindedness, how we can
approach these kinds of issues from those habits of mind. So that's kind of in the works.
And then the other thing is, well, two other things. I guess one is I love gamifying learning and
game-based learning. So we've always had these backburner projects of doing a surveillance
capitalism Monopoly. That was one idea. I also want to do an impossible-to-escape escape room, which would be like a choose-your-own-adventure experience where you can't actually escape all of the ubiquitous surveillance that's in the game environment. So it's kind of a lesson in just how much surveillance there is and how prevalent and pervasive it is in our lived experience. So gamifying it is another thing. And then the last thing I'd like to build out. So, you know, we've done sort of
this core privacy literacy series that we offer in the fall for cybersecurity awareness month. So that's
the privacy workshop, the digital leadership or digital professionalism, which is more on
reputation management and contextual integrity and context collapse. We have our Digital Shred workshop, which is the more traditional managing-your-digital-footprints one. And then digital wellness. Now in the spring, we're hijacking Love Data Week. We're offering Data Justice, Private Bits, and Hidden Layer,
which is our intellectual privacy and generative AI workshop. I'd love to do one where we look at
privacy issues from the patent literature. So I liaise a lot with our engineering programs,
as well as with some computer science students, business students, entrepreneurs,
other folks who are doing patent searching in the course of their academic, but also
entrepreneurial research. And there's a lot of really interesting stuff to be gleaned about privacy
from looking at the patent literature. And you can kind of almost tell the future of surveillance tech by looking at the patents that are being filed. And of course, that's publicly available data. It's a really fun thing to do randomly, by the way. I have an additional repository
that's, like, a little bit defunct, but it lives and I might have to bring it back, called Defang Big Tech, named for the FAANG stocks, right? And it's all about looking at the publicly available information on these publicly traded companies: their patents, their SEC filings, if they've appeared in testimony in the House or the Senate and it's on C-SPAN, and then their publicly available documents, their privacy policies, their terms of use, all those kinds of things. So what can we glean about what these companies see as valuable in our data based on these publicly available documents?
So again, doing a workshop where we look at privacy and surveillance tech from the
perspective of the patent literature, I think might be a nice hook for my colleagues who are in
engineering and even entrepreneurship and innovation to say, okay, I see where what you're doing over here in the privacy library space maybe has something to say about what I'm doing over here in the engineering, startup development, entrepreneurship, and innovation space. So that's something that's kind of on my back burner. And that is a really good point
for anybody, like academic libraries, K through 12. These topics connect to literally everything.
So Sarah's talking about patents with her liaison areas. There's so much with wearable tech that applies
to a lot of the research that my kinesiology students are constantly doing. There's also, like, environmental DNA, which is now becoming an issue that I can talk about with the biology and biochemistry students that I liaise with. It literally will connect with any subject area.
So it's a very fascinating thing to get involved in no matter what your work role is, because it's going to connect in some way.
Amen.
Yeah. No, I like that you'll be building out. I hope there'll be an opportunity to build out more with practitioners, in terms of getting an idea of what other people in libraries are doing, so that they get an appreciation for the privacy issues, the risks of losing trust, and how maybe you could pull more people into developing things for other practitioners. So rather than just literacy focused, also maybe bring in some of the contract stuff that might be relevant, so that when you're talking to librarians who are doing the contract work, they at least go, oh, here's one way of dealing with it that's relevant to me.
I neglected to mention it, but there is a whole section in our edited volume that just
came out on protecting privacy, and it does get a little bit more into access services.
Digital collections.
Thank you, baby brain.
Digital collections and different aspects of technical services, and some of the laws.
So that book does address it a bit more across the library spectrum.
Nice.
I think that's plenty for today.
Sarah and Alex, thank you for coming by.
Is there anything that you missed and want to mention before we wrap up?
No, I don't think so.
Thanks for saying yes to my shameless pitch.
Yeah, thanks for having us.
This is fun.
Yeah, exactly.
We reach out to people cold all the time.
We got Cory Doctorow on somehow.
You did.
I listened to that episode.
And I liked how his pronoun was his
Dewey decimal number of choice.
Yeah, Justin just sent him an email.
And it was like, shit, okay.
I love his shitty technology adoption curve.
I love that.
Ooh, I'll shamelessly plug.
I have a sci-fi piece,
a sci-fi story, short story,
on privacy called version control.
So if you want to talk to things
about their privacy,
search for it, Google it,
or, don't use Google,
web search version control and my name.
Check it out.
And I say that because
the editor of the volume
that it appeared in,
Anne-Marie Deitering,
compared it to Cory Doctorow.
So what higher praise
could you possibly receive
as a sci-fi writer
than being compared to Cory Doctorow?
Version Control.
Great.
Sarah and Alex,
thank you for coming on.
And good night.
