Communication that moves | Amanda J. McManus | Wed, 02/26/2025 - 11:25 | Categories: Features | Tags: Communication, Research, faculty

By Joe Arney
Photos by Jack Moody (StratComm’24)

The study of communication, as José G. Izaguirre III knows, is more than just interpreting the words. It’s also about how those words are heard—in a speech or an article, or in a post or on a poster.

It’s why he leans so heavily on showing communication in its original form, whether in the classes he teaches at CMCI or in a new book examining the formation of the Chicano movement.

“As I was analyzing these different texts, I was just struck by the intentionality to make things look a certain way, which really enriched the communication I was studying,” said Izaguirre, assistant professor of communication at the college, who goes by Joe. “It was clear that those aesthetics were part of the story, too—the degree to which photography, illustrations and designs played a significant role in movements.”

 

 

“It is possible for different people to come together around similar concerns, articulate different visions, but still try to work together to accomplish something good.”
José G. Izaguirre III
Assistant Professor
Communication

 

Izaguirre’s book traces the beginning of the movement—which originated among striking farm workers in California—through its early years. His research examines the communications that organized Latin American voices into a global political power.

“The book highlights how race is always implicated in different political circumstances—while demonstrating that however much we try to get away from the language of race, it’s always there,” he said. “I tried to show the inescapability of race as a part of communication through a story about how Mexican Americans navigated racial dynamics and promoted a racial identity.”

A good example: “Chicano,” once a pejorative label, was itself reclaimed by the organization as it rejected assimilation and sought to assert its Indigenous roots. But while the movement united under one banner, it was never a singular voice. Izaguirre’s book shows how activists created a political power against the backdrop of the Cold War.

“I think the book highlights the importance of everyday activist movements, or even politically interested individuals who have concerns that are part of a broader community or communal concern,” he said. “It takes seriously these moments of everyday communication and spotlights them in ways that are maybe not typical.”

“Everyday communication” in the 1960s was, of course, very different from today, when demonstrations largely exist and are communicated in ephemeral digital spaces—what’s trending today is tomorrow’s relic. Donated documents—leaflets, photos, newspapers and so on—supplied much of Izaguirre’s source material and made this project possible.

It’s how he was able to present so many period pieces in his book, alongside close readings of iconic artifacts like the National Farm Worker Association’s El Plan de Delano and the poem “I Am Joaquin.” There is value, he said, in seeing how those pieces are designed, even when a piece is text-based, like the Delano document, which Cesar Chavez co-wrote to guide the farm workers’ march through California. It contains a list of demands and concerns that, Izaguirre said, are valuable to see in their original context—and language.

Another level of engagement

“When I show these materials in classes, I want to show that communication as close as possible to what it would have been like to encounter it at the time,” whether that’s a picture, a pamphlet or a speech, he said. “I wouldn’t call it an epiphany, but there’s some level of understanding that happens when I show them the whole document. Because it’s not just text pulled out of somewhere—it’s communication they can see for themselves.” 

That also means students encounter the original communication in its original language. For much of La Raza, of course, that’s Spanish. 

“I do show them an English version, so they understand the meaning of the words, but seeing it in its native language, they get almost the emotion of the words,” Izaguirre said. “Seeing the original document puts it in that cultural or historical context.”  

It’s something he hopes readers and students consider in the context of modern political movements, from the iconography at campaign rallies to how people find one another and organize digitally. But he also hopes those readers will be challenged to rethink the narrative that movements—or communities—can be viewed singularly. The Chicano movement is a prime example. 

“It can be harmful, to see communities being labeled in such a way that they’re cast as the opposition,” he said. “It’s easy to consolidate groups and label them as friend or foe. What’s harder is politics—which is really about building partnerships and opportunities for equal engagement.

“What I hope the book shows is that it is possible for different people to come together around similar concerns, articulate different visions, but still try to work together to accomplish something good.”

A new book looks at the rise of the Chicano movement through the lens of communication, from speeches to newspapers.


Spring 2025
A better way | Amanda J. McManus | Tue, 02/25/2025 - 11:52 | Categories: Features | Tags: Environmental Design, Research, faculty

By Joe Arney
Photos by Kimberly Coffin (CritMedia, StratComm’18)

There’s a brick paver walkway that crosses 18th Street on the CU Boulder campus by the ATLAS Institute. Thousands of pedestrians use it each day, crossing the brick path while cyclists, e-scooters, buses, emergency vehicles and the occasional car wend their way down the street.

 

“Design is a powerful tool to make an impact, because then we’re not telling certain people they’re functionally not correct. Instead, we’re saying, how do we create an environment that actually matches the needs of the user?”
Elena Sabinson
Director
Neuro D Lab

Is it a crosswalk?

From the description above, you might assume so. But there’s no signage warning drivers of pedestrian activity, or telling them to stop or yield. And you’ll find none of the striping associated with crosswalks. 

“When the students describe it, they’re like, ‘It’s basically Frogger out there,’” said Elena Sabinson, an assistant professor of environmental design at CMCI and director of the Neuro D Lab, which explores the intersection of design, neurodiversity, equity and innovation. “That space of ambiguity becomes a place where conflict or confusion happens. The lab looks at how that affects everyone, but especially neurodivergent folks who might rely on clarity and clear signage to understand how to navigate things.” 

Neurodivergence has become a global point of conversation as a movement builds to both recognize that each brain functions differently and to better understand how to design products, services, buildings and so on that serve everyone, instead of asking people to conform to the built environment.

“Design is a powerful tool to make an impact, because then we’re not telling certain people they’re functionally not correct,” Sabinson said. “Instead, we’re saying, how do we create an environment that actually matches the needs of the user?” 

Elena Sabinson crosses the street in front of the CASE building. While the brick paver walkway looks like a crosswalk, it lacks striping and signage indicating it's safe to cross, which can confuse both pedestrians and drivers. Part of Sabinson's research work involves assessing wayfinding on the CU Boulder campus for confusing design cues.

 

A new direction for her work

Sabinson is uniquely suited to such challenges. As a PhD student at Cornell University, she was studying self-soothing technologies—especially in the area of soft robotics, like breathing wall panels that help people regulate their biorhythms during stressful experiences—when she received a diagnosis of autism and ADHD.

“That changed the trajectory of my research,” she said. “I’m still focusing on emotional well-being, but with this environmental lens of how to create inclusive, accessible products that are centered around self determination, agency and empowerment. 

“I make a choice to say I’m an autistic-led lab, and I invite this type of conversation in by making that choice, rather than just being an autistic person doing research.”

Bringing students into her lab and giving them opportunities to engage these challenges will, she said, push her to question some of her own assumptions developed after years of working in the field. But it’s also creating opportunities to potentially reshape the campus, such as the wayfinding project examining features like the ambiguous campus crosswalk. 

That work is partially funded by an undergraduate research opportunities program grant issued by the university. Earlier this month, Sabinson’s work was accepted by EDRA56, the influential conference of the Environmental Design Research Association. She’s looking forward to presenting it this May, in addition to helping drive conversations around making the campus easier to navigate. 

“One thing we have as a research lab is access to students who are really engaged and passionate about this work, and who want to take on projects that can’t always happen in industry, due to timeline and budgetary constraints,” she said. 

Industry feedback

Another thing she wants through both the lab and her classes is the chance for ideas from industry to influence her students’ innovation. In a course she teaches on fidgets and stims, one student created the Cacti Clicker, a plastic cactus with moveable segments. When you twist it, it makes a clicking sound, which isn’t always acceptable in a work or school setting. 

“So the student redesigned it so some of the spins make noise and some don’t, so you can still get the sensation if you’re in a crowded space,” Sabinson said. “That’s an example of how we field test these products with people, get feedback—and learn to take feedback—to make their products better.”

It also doesn’t look like a traditional fidget toy. That’s by design—it just looks like a cactus statue on a desk in Sabinson’s office.

“A lot of what I consider in my work, and that we talk about in class, is the social stigma around using a fidget—that a lot of people might want to, but they’re considered to be toys,” she said. 

The bigger goal is to eliminate that stigma altogether—but in the meantime, she said, this product is an option for people who need it, while “just living on your desk and looking like a decoration.”

Can design help those with neurodivergence be more comfortable in their environments? A new lab is searching for answers.



Elena Sabinson demonstrates using an inflatable sensory band in her office. Part of Sabinson's research looks at inflatable surfaces and products that can be used by people managing anxiety to make them more comfortable in their environment.

Poll-arized | Fri, 08/16/2024 - 15:08 | Categories: In Conversation | Tags: Advertising, Public Relations and Media Design, Communication, Information Science, Journalism, Media Studies, Research, faculty

By Joe Arney

Deepfakes. Distrust. Data manipulation. Is it any wonder American democracy feels like it has reached such a dangerous tipping point?  

As our public squares have emptied of reasoned discussion, and our social media feeds have filled with vitriol, viciousness and villainy, we’ve found ourselves increasingly isolated and unable to escape our echo chambers. And while it’s easy to blame social media, adtech platforms or the news, it’s the way these forces overlap and feed off each other that’s put us in this mess.

It’s an important problem to confront as we close in on a consequential election, but the issue is bigger than just what happens this November, or whether you identify with one party or another. Fortunately, the College of Media, Communication and Information was designed for just these kinds of challenges, where a multidisciplinary approach is needed to frame, address and solve increasingly complex problems. 

“Democracy is not just about what happens in this election,” said Nathan Schneider, an assistant professor of media studies and an expert in the design and governance of the internet. “It’s a much longer story, and through all the threats we’ve seen, I’ve taken hope from focusing my attention on advancing democracy, rather than just defending it.”

We spoke to Schneider and other CMCI experts in journalism, information science, media studies, advertising and communication to understand the scope of the challenges. And we asked one big question of each in order to help us make sense of this moment in history, understand how we got here and—maybe—find some faith in the future.  

*** 

Newsrooms have been decimated. The younger generation doesn’t closely follow the news. Attention spans have withered in the TikTok age. Can we count on journalism to serve its Fourth Estate function and deliver fair, accurate coverage of the election?

Mike McDevitt, a former editorial writer and reporter, isn’t convinced the press has learned its lessons from the 2016 cycle, when outlets chased ratings and the appearance of impartiality over a commitment to craft that might have painted more accurate portraits of both candidates. High-quality reporting, he said, may mean less focus on finding scoops and more time sharing resources to chase impactful stories.

How can journalism be better?

“A lot of journalists might disagree with me, but I think news media should be less competitive among each other and find ways to collaborate, especially with the industry gutted. And the news can’t lose sight of what’s important by chasing clickable stories. Covering chaos and conflict is tempting, but journalism’s interests in this respect do not always align with the security of democracy. While threats to democracy are real, amplifying chaos is not how news media should operate during an era of democratic backsliding.”  

***

After the 2016 election, Brian C. Keegan was searching for ways to use his interests in the computer and social sciences in service of democracy. That’s driven his expertise in public-interest data science—how to make closed data more accessible to voters, journalists, activists and researchers. He looks at how campaigns can more effectively engage voters, understand important issues and form policies that address community needs. 

 

“The U.S. news media has blood on its hands from 2016. It will go down as one of the worst moments in the history of American journalism.”

Mike McDevitt
Professor, journalism

You’ve called the 2012 election an “end of history” moment. Can you explain that in the context of what’s happening in 2024?

“In 2012, we were coming out of the Arab Spring, and everyone was optimistic about social media. The idea that it could be a tool for bots and state information operations to influence elections would have seemed like science fiction. Twelve years later, we’ve finally learned these platforms are not neutral, have real risk and can be manipulated. And now, two years into the large language model moment, people are saying these are just neutral tools that can only be a force for good. That argument is already falling apart.

 

“I think 2024 will be the first, and last, A.I. election.”


Brian C. Keegan
Assistant professor, information science

“You could actually roll the clock back even further, to the 1960s and ’70s, when people were thinking about Silent Spring and Unsafe at Any Speed, and recognizing there are all these environmental, regulatory, economic and social things all connected through this lens of the environment. Like any computing system, when it comes to data, if you have garbage in, you get garbage out. The bias and misinformation we put into these A.I. systems are polluting our information ecosystem in ways that journalists, activists, researchers and others aren’t equipped to handle.”  

***

One of Angie Chuang’s last news jobs was covering race and ethnicity for The Oregonian. In the early 2000s, it wasn’t always easy to find answers to questions about race in a mostly white newsroom. Conferences like those put on by the Asian American Journalists Association “were times of revitalization for me,” she said.

When this year’s conference of the National Association of Black Journalists was disrupted by racist attacks against Kamala Harris, Chuang’s first thoughts were for the attendees who lost the opportunity to learn from one another and find the support she did as a cub reporter.

“What’s lost in this discussion is the entire event shifted to this focus on Donald Trump and the internal conflict in the organization, and I’m certain that as a result, journalists and students who went lost out on some of that solidarity,” she said. And it fits a larger pattern of outspoken newsmakers inserting themselves into the news to claim the spotlight. 

How can journalism avoid being hijacked by the people it covers?

“It comes down to context. We need to train reporters to take a breath and not just focus on being the first out there. And I know that’s really hard, because the rewards for being first and getting those clicks ahead of the crowd are well established.”  

 

“I can’t blame the reporters who feel these moments are worth covering, because I feel as conflicted as they do.”
Angie Chuang
Associate professor, journalism

***

Agenda setting—the concept that we take our cues of what’s important from the news—is as old an idea as mass media itself, but Chris Vargo is drawing interesting conclusions from studying the practice in the digital age. Worth watching, he and other CMCI researchers said, are countermedia entities, which undermine the depictions of reality found in the mainstream press through hyper-partisan content and the use of mis- and disinformation.

How did we get into these silos, and how do we get out?

“The absence of traditional gatekeepers has helped people create identities around the issues they choose to believe in. Real-world cues do tell us a little about what we find important—a lot of people had to get COVID to know it was bad—but we now choose media in order to form a community. The ability to self-select what you want to listen to and believe in is a terrifying story, because selecting media based on what makes us feel most comfortable, that tells us what we want to hear, flies in the face of actual news reporting and journalistic integrity.”  

 

“I do worry about our institutions. I don’t like that a majority of Americans don’t trust CNN.”
 

Chris Vargo
Associate professor, advertising, 
public relations and media design

***

Her research into deepfakes has validated what Sandra Ristovska has known for a long time: For as long as we’ve had visual technologies, we’ve had the ability to manipulate them.

Seeing pornographic images of Taylor Swift on social media or getting robocalls from Joe Biden telling voters to stay home—content created by generative artificial intelligence—is a reminder that the scale of the problem is unprecedented. But Ristovska’s work has found examples of fake photos from the dawn of the 20th century supposedly showing, for example, damage from catastrophic tornadoes that never happened. 

Ristovska grew up amid the Yugoslav Wars; her interest in becoming a documentary filmmaker was in part shaped by seeing how photos and videos from the brutal fighting and genocide were manipulated for political and legal means. It taught her to be a skeptic when it comes to what she sees shared online. 

“So, you see the Taylor Swift video—it seems out of character for her public persona. Or the president—why would he say something like that?” she said. “Instead of just hitting the share button, we should train ourselves to go online and fact check it—to be more engaged.”  

Even when we believe something is fake, if it aligns with our worldview, we are likely to accept it as reality. Knowing that, how do we combat deepfakes?

“We need to go old school. We’ve lost sight of the collective good, and you solve that by building opportunities to come together as communities and have discussions. We’re gentler and more tolerant of each other when we’re face-to-face. This has always been true, but it’s becoming even more true today, because we have more incentives to be isolated than ever.”   

***

Early scholarly works waxed poetic on the internet’s potential, through its ability to connect people and share information, to defeat autocracy. But, Nathan Schneider has argued, the internet is actually organized as a series of little autocracies—where users are subject to the whims of moderators and whoever owns the servers—effectively meaning you must work against the defaults to be truly democratic. He suggests living with these systems is contributing to the global rise of authoritarianism. In a new book, Governable Spaces, Schneider calls for redesigning social media with everyday democracy in mind.

If the internet enables autocracy, what can we do to fix it?

“We could design our networks for collective ownership, rather than the assumption that every service is a top-down fiefdom. And we could think about democracy as a tool for solving problems, like conflict among users. Polarizing outcomes, like so-called cancel culture, emerge because people don’t have better options for addressing harm. A democratic society needs public squares designed for democratic processes and practices.”  

***

It may be derided as dull, but the public meeting is a bedrock of American democracy. It has also changed drastically as fringe groups have seized these spaces to give misinformation a megaphone, ban books and take up other undemocratic causes. Leah Sprain researches how specific communication practices facilitate and inhibit democratic action. She works as a facilitator with several groups, including the League of Women Voters and Restore the Balance, to ensure events like candidate forums embrace difficult issues while remaining nonpartisan.

What’s a story we’re not telling about voters ahead of the election?

“We should be looking more at college towns, because town-gown divides are real and long-standing. There’s a politics of resentment even in a place like Boulder, where you have people who say, ‘We know so much about these issues, we shouldn’t let students vote on them’—to the point where providing pizza to encourage voter turnout becomes this major controversy. Giving young people access to be involved, making them feel empowered to make a difference and be heard—these are good things.”

***

Toby Hopp studies the news media and digital content providers with an eye to how our interactions with media shape conversations in the public sphere. Much of that is changing as trust and engagement with mainstream news sources declines. He’s studied whether showing critical-thinking prompts alongside shared posts—requiring users to consider the messages as well as the structure of the platform itself—may be better than relying on top-down content moderation from tech companies.   

Ultimately, the existing business model of the big social media companies—packaging users to be sold to advertisers—may be the most limiting feature when it comes to reform. Hopp said he doubts a business the size of Meta can pivot from its model.

How does social media rehabilitate itself to become more trusted? Can it?

“Social media platforms are driven by monopolistic impulses, and there’s not a lot of effort put into changing established strategies when you’re the only business in town. The development of new platforms might offer a wider breadth of platform choice—which might limit the spread of misinformation on a Facebook or Twitter due to the diminished reach of any single platform.”   

***

 

“Images have always required us to be more engaged. Now, with the speed of disinformation, we need to do a little more work.”
 

Sandra Ristovska
Assistant professor, media studies

CU News Corps was created to simulate a real-world newsroom that allows journalism students to do the kind of long-form, investigative pieces that are in such short supply at a time of social media hot takes and pundits trading talking points.  

“I thought we should design the course you’d most want to take if you were a journalism major,” said Chuck Plunkett, director of the capstone course and an experienced reporter. Having a mandate to do investigative journalism “means we can challenge our students to dig in and do meaningful work, to expose them to other kinds of people or ideas that aren’t on their radar.” 

Over the course of a semester, the students work under the guidance of reporters and editors at partner media companies to produce long-form multimedia stories that are shared on the News Corps website and, often, are picked up by those same publications, giving the students invaluable clips for their job searches while supporting resource-strapped newsrooms. 

With the news business facing such a challenging future, both economically and politically, why should students study journalism?

“Even before the great contraction of news, the figure I had in my mind was five years after students graduate, maybe 25 percent of them were still in professional newsrooms. But journalism is a tremendous major because you learn to think critically, research deeply and efficiently, interact with other people, process enormous amounts of information, and have excellent communication skills. Every profession needs people with those skills.”

Where do we go from here? CMCI experts share their perspectives on journalism, advertising, data science, communication and more in an era of democratic backsliding.

The race to make tech more equal | Wed, 08/14/2024 - 15:54 | Categories: Features | Tags: Information Science, Research, center for race media and technology, faculty

By Joe Arney
Photos by Kimberly Coffin (CritMedia, StratComm’18)

Back when Bryan Semaan’s mom had a Facebook account, doomscrolling wasn’t part of her vernacular.

The Iraqi culture she was raised in compels celebration of accomplishments and milestones, “so any time someone posted something, she felt she had to interact with it,” Semaan said. “That personal engagement runs very deeply through our culture.”

But it became exhausting for her to keep up as her network swelled into the hundreds, so she deactivated her account. For Semaan, it’s a fitting metaphor for his research—which challenges the assumptions tech developers make about the users of their products and services. And it’s the kind of problem he wants to study through the Center for Race, Media and Technology, which CU Boulder unveiled in the spring.

“The people developing these technologies are in Silicon Valley—so, mostly male, mostly white,” said Semaan, director of the center and an associate professor of information science at CMCI. “A lot of the values we bake into these technologies are being forced onto people in different cultures, often creating problems.”

As a first-generation American, Semaan said he identifies with the liminal moments faced by others living between worlds—immigrants, veterans, refugees, people of color or Indigenous people—and the challenges of adapting to Western societal structures. Technology plays a big part, and the discipline’s blind spots are a key focus of Semaan’s research, which asks how these tools can create resilience for people in those liminal moments, such as a climate refugee fleeing disaster or a queer teenager anxious about coming out.

To kick off the center, in March, CMCI welcomed Ruha Benjamin, a professor at Princeton who’s developed her scholarship around what she calls the “New Jim Code”—a nod to both the Jim Crow laws that enforced segregation and the biases encoded into technology. Benjamin, he said, “focuses on how people consider technology to be a benign thing, when in fact it isn’t—technology takes on the values of those who create it.”

Fortunately, Semaan said, we’re at a moment when society is recognizing the importance of equity and justice, while seeing technology as a problem, a solution and a thread tying together the great challenges facing humanity—political polarization, disinformation, climate change and so on.

 

“These bigger challenges are going to require people thinking together at a much grander scale, which means changing how we work.”

Bryan Semaan

He’s optimistic that the Center for Race, Media and Technology will collect the broad perspectives needed to make, as he put it, “the intractable problems tractable.”

“What I imagine for the center is encouraging collaborations among the experts we bring together,” he said. “And I’m really hoping my research direction changes as a result of getting to work with the amazing people I’ll meet.”

If it’s collaboration he wants to get out of the center, Semaan’s successes to date have been more about tenacity. Early in his career, he said, some of his colleagues tried to steer him from migrants and veterans, dismissing his interest in making technology equitable as “a diversity ghetto.”

That didn’t deter him—and, with the benefit of hindsight, those rejections made him a better scholar.

“In my research, the people you work with are incredibly vulnerable, or are so busy surviving that they can’t talk to you,” he said. “You have to be passionate about that work, and prepared for long-tail effort before you make progress.”

The work of the center will be a long game, but if successful, Semaan said, it will put CU Boulder at the center of the conversation around purposefully designed technology.

“It dovetails with the university’s broader mission around diversity,” he said. “It’s not just saying we’re going to increase diversity—it’s the issues we are approaching and the support we are building for different scholars across the university. Because these bigger challenges are going to require people thinking together at a much grander scale, which means changing how we work.”

A new center at CMCI is organizing faculty thought leadership to answer big, systemic questions about technology’s role in issues of social justice.

Brushing up their skills | Tue, 08/13/2024 - 15:05 | Categories: View | Tags: Environmental Design, faculty

By Malinda Miller (Engl, Jour'92; MJour'98)

High up on scaffolding, students meticulously paint bright floral patterns on the west side of the Boulder Dushanbe Teahouse.

They’ve been learning the traditional art of ornamental painting—nakkoshi—from Maruf Mirakhmatov, who is visiting 91 from Khujand, Tajikistan, for six months.  

“I really want to get into art restoration or just restoration overall, especially with bigger buildings,” said Kaija Galins, a junior architecture major. “My favorite part has been to watch each step of the way, like the sanding, laying down the charcoal and the tracing process.” 

Galins is one of 17 students who over the summer took a course on restoration of the Dushanbe Teahouse with Azza Kamal, an associate teaching professor in the Program in Environmental Design and a former historic preservation commissioner.

Students studied cultural heritage and preservation, practiced painting techniques in the classroom, and applied those skills to onsite restoration under Mirakhmatov’s guidance.

Kamal said the students also learned about the urgency of accounting for embodied carbon in new construction and restoration, as well as the value of refurbishing and recycling materials so they don’t end up in the landfill.

A gift from Boulder’s sister city of Dushanbe, Tajikistan, the teahouse features intricate carvings, painted woodwork and ceramic panels created by more than 40 artisans, including Mirakhmatov’s grandfather.

“It’s important work, because there are only a couple people in Tajikistan still doing this,” said Mirakhmatov, a fifth-generation artisan. “For me, it’s easy because it’s in my blood, and every day when I’m painting here, I’m enjoying it.”

 

A student paints a section of the wall.

Students work on restoration at the teahouse.

Azza Kamal, right, works with a student on a corbel design.

Students practice painting techniques in class.

The corbels under the roofline have been repainted, while restoration of the lower panel is still underway.

Maruf Mirakhmatov paints white outlines on a floral design. The Program in Environmental Design, the city of Boulder and the Boulder-Dushanbe Sister Cities Project partnered to bring Mirakhmatov to Boulder for six months.

 

A beloved Boulder landmark is getting a refresh thanks to students who are touching up the complex paint job under the guidance of an artist from Boulder’s Tajikistan sister city.

#TechEthics /cmcinow/2024/02/02/techethics Fri, 02/02/2024 - 12:44 Categories: Trending Tags: Information Science Research faculty

By Joe Arney

Not many computer scientists have signs reading “Rage Against the Machine Learning” in their offices.

But in Evan Peck’s case, it’s a perfect symbol of why he was so excited to join the information science department of the College of Media, Communication and Information this fall. 

 

“I love being here because CMCI draws students who want to use technology in service of something they already care deeply about, and not for its own sake. 

Evan Peck
Associate professor, information science

“I started to believe that some of the most pressing problems our society is wrestling with don’t require deeper technical solutions, but a reimagining of the ways we’re using technology,” he said. “I was looking for deeper connections to social sciences and community-focused work—and I think that’s what information science excels at, shifting the lens of the technical in service to the community and society.”

Peck joined CU Boulder this fall from Bucknell University, meaning he’s gone from being a Bison to a Buffalo. More than that, the move gave him a chance to join a college and department more closely aligned with his evolving research interests, which center on information visualization—especially the way data is communicated to the public.

Establishing trust around data

He already appreciates being surrounded by faculty and students who are experts in fields like media studies and communication.

“I’m fascinated by how we encourage people to trust data, understand it and respond to it,” Peck said. “While we can advance science enough to offer compelling solutions to societal problems, we continue to share those insights with the public without an understanding of people’s cultures, beliefs and backgrounds. That’s a recipe for failure.”

If you think about some of the public health messaging you saw during the pandemic, you’ll probably remember the frustration of getting information that wasn’t helpful or didn’t reflect reality. Peck, for instance, lived in central Pennsylvania during the lockdowns. In the summer of 2020, his rural county hadn’t seen a day in which more than two people tested positive, but because most COVID maps reported risk at the state level, high caseloads in Philadelphia and Pittsburgh made all of Pennsylvania look more infectious than it was.

That degrades trust in experts, he said, “and when cases spiked in my county about a month later, I believe it had eroded trust and willingness to react to that data.”

He has taken this work to some interesting new arenas, including extensive interviews with rural Pennsylvanians at construction sites and farmers markets, to better understand how they interpret charts and what information matters to them. The resulting research received a best paper award at the premier human-computer interaction conference, has been cited by the Urban Institute and others, and helped cement his interest in information science.

“I had a moment of realization,” Peck said. “I could spend my whole career as a visualization researcher and still have zero impact on my community. So how do we engage in research that has a positive impact on the people and community around the university?”

It’s not the only area he’s looking to create impact. Peck describes himself as an advocate for undergraduate research opportunities, especially for students searching for a sense of place within their degree programs.

“It’s a mechanism for helping students explore areas that aren’t strongly represented in their core academic programs,” Peck said. “I saw this as an advisor in computer science for nearly a decade—I advised students who wanted to think deeply about how their designs impacted people, but in a curriculum in which people were a side story to their technical depth.”

An eye to ethics

He also created an initiative around ethics and computing curricula at Bucknell that has been adopted by computer science programs elsewhere. He noticed that when a question was presented in an ethics context, students came up with thoughtful answers—but that reasoning did not extend into other assignments or their careers. It’s a story familiar to anyone thinking about the addictiveness of social media platforms or the disruptive potential of artificial intelligence.

Some computer science programs offered a single ethics course, “but it was so isolated from the rest of their technical content that students wouldn’t put them together,” Peck said.

In response, he added more ethical and critical thinking components to the core technical curriculum, and developed a set of programming assignments in which students wrestle with a societal design question in order to accomplish their programming goals. He currently has a grant through Mozilla’s Responsible Computing Challenge to continue that work at 91.

“It’s about connecting the dots and building habits. Students need to understand that the system I’m programming is going to have implications beyond Silicon Valley,” he said. “How can we get you to think about the human tradeoffs beyond the aggregated rules you’re creating?”

It’s the kind of question he feels renewed vigor about pursuing in the Department of Information Science. 

“I love being here because CMCI draws students who want to use technology in service of something they already care deeply about, and not for its own sake,” Peck said.

“Computer science knows how to build marvelous systems, but not always how to make them work fairly or responsibly for diverse people and communities,” he added. “I think our department goes beyond the idea of ‘how do we build it,’ to think critically about who we’re designing for, who technology empowers, who it privileges, who it disadvantages.”

“Rage Against the Machine Learning” isn’t just a sign in Evan Peck’s office. It’s an emblem of his career pivot.

#ShakeItOff /cmcinow/2024/01/29/shake-it-off Mon, 01/29/2024 - 15:16 Categories: Trending Tags: Communication Research faculty

By Joe Arney

Even by her standards, Taylor Swift has had a busy couple of months.

When she wasn’t winning Grammys and dropping hints about her next album, Swift was making headlines for her appearances during NFL games, her supposed role as an elections-interference psyop and lyrics that, when decoded, suggested she is queer.

What is it about Swift that has so many people, even her fans, seeing red?

“This is something that is continually churning with me because I hadn’t taken Swift seriously as an artist—reproducing the historical practice of dismissing or devaluing women’s work,” said Jamie Skerski, who studies how narratives are shaped and mediated by institutions, audiences and cultural norms. “I was part of the problem.”

 

“What is so threatening about even the speculation that Taylor might not be Miss Americana? Answer: Everything as we know it.

Jamie Skerski
Associate chair, undergraduate studies

“But it’s something very visceral, and I think Taylor taps into this sense of female empowerment, of anger, of frustration, of recognition, of systems that continue to try to take women’s rights away,” said Skerski, associate chair for undergraduate studies at the College of Media, Communication and Information at the 91.

Perhaps nowhere is the phenomenon more apparent than “Traylor”—the Travis Kelce-Swift romance that’s dominated pop culture throughout the football season. When Swift attends Chiefs games, she is typically shown on screen for less than a minute of a three-plus-hour telecast, but male football fans have furiously labeled her a distraction from the action. Skerski pointed out that other distractions, like military flyovers and cheerleaders, don’t attract nearly the same amount of outrage.

The Traylor relationship, she said, offers an opportunity to explore questions about the entertainment industry, gender and fandom—especially around the “fantasies of straight white men” whose love of sports betting and fantasy football is validated through societal norms.

“It’s culturally acceptable when white-collar men seek escapism, entertainment and social capital in the commodification and dehumanization of mostly Black bodies for personal pleasure,” since that reflects dominant racial power relationships, Skerski said.

“But when Swift fans engage in a version of fan fiction—daring to imagine Taylor as playing for the other team—it is condemned, belittled and dismissed. This is a moment to ask, whose fantasies are allowed to exist, and why?”

The idea of Swift playing for the other team isn’t new—the so-called Gaylor community on Reddit and TikTok has been collectively analyzing her lyrics for years—but it entered the mainstream in January when a New York Times guest essay waded into the fray with a 5,000-word read of Swift’s life and lyrics, imploring readers to consider that her songwriting offers “a feast laid specifically for the close listener.”

The bigger question, it argues, is not whether Swift is gay, but the obstacles to coming out in our celebrity culture and what queer people owe one another.

“How might her industry, our culture and we, ourselves, change if we made space for Ms. Swift to burn that dollhouse to the ground?” Anna Marks, an opinion editor for the Times, wrote in the column.

The point hit home for Skerski. “If a celebrity needs to navigate cultural norms of acceptance, that’s the bigger question,” she said. The idea that Swift’s work can have multiple meanings and influence different audiences “would break everything,” she said, as it would challenge the way our culture characterizes and reinforces identity norms.

Still, a lot of angry Swifties took to online comments to vent their frustration on the singer’s behalf, lashing out at the Gray Lady for becoming a gossip girl as well as the author, who wrote a similar piece about Harry Styles in 2022. Not allowing Swift access to her own identity is at best a misguided attempt at allyship, Skerski said—and at worst, “the fan outrage reinforces a culture of protective paternalism that is invoked to control women’s bodies.”
 
“What is so threatening about even the speculation that Taylor might not be Miss Americana?” she said. “Answer: Everything as we know it.”

What is it about Taylor Swift that has so many people—even her fans—seeing red? A communication scholar says it's a theme she knows all too well.

Questions about A.I.? Let’s Chat /cmcinow/questions-about-ai-lets-chat Sun, 10/29/2023 - 18:16 Categories: In Conversation Tags: Information Science Media Studies Research artificial intelligence faculty

By Joe Arney

When tools like ChatGPT entered the mainstream last winter, it was a moment of reckoning for professionals in every industry. Suddenly, the artificial intelligence revolution was a lot more real than most had imagined. Were we at the dawn of an era where professional communicators were about to become extinct?

Almost a year after ChatGPT’s debut, we’re still here—but still curious about how to be effective communicators, creators and storytellers in this brave new world. To examine what role CMCI plays in ensuring students graduate prepared to lead in a world where these tools are perhaps more widely used than understood, we invited Kai Larsen, associate professor of information systems at CU’s Leeds School of Business and a courtesy faculty member in CMCI, to moderate a discussion with associate professors Casey Fiesler, of information science, and Rick Stevens, of media studies, about the ethical and practical uses of A.I. and the value of new—and old—skills in a fast-changing workplace.

This conversation was edited for length and clarity.

"A.I. can seem like magic, and if it seems like magic, you don’t understand what it can do or not do.” 
—Casey Fiesler

Faculty in conversation

Kai R. Larsen is an associate professor of information systems at the Leeds School of Business. He is an expert in machine learning and natural language processing whose thought leadership has been featured in the most influential academic journals. 

Casey Fiesler is associate chair for graduate studies in information science. She shares her insights on technology ethics, internet law and policy, and online communities both in scholarly journals and with the public, especially through social media. She is a courtesy faculty member in the Department of Computer Science.

Rick Stevens is associate dean of undergraduate education at CMCI. His work explores ideological formation and media dissemination, including how technology infrastructure affects the delivery of messages, communication technology policy, and how media and technology platforms are changing public discourse.

Larsen: It’s exciting to be here with both of you to talk a bit about A.I. Maybe to get us started, I can ask you to tell us a little about how you see the landscape today.

Fiesler: I think A.I. has become a term that is so broadly used that it barely has any meaning anymore. A lot of the conversation right now is around generative A.I., particularly large language models like ChatGPT. But I do see a need for some precision here, because there are other uses of A.I. that we see everywhere. It’s a recommender system deciding what you see next on Facebook, it’s a machine learning algorithm, it’s doing all kinds of decision-making in your life.

Stevens: I think it’s important to talk about which tools we’re discussing in an individual moment. In our program, we see a lot of students using software like ChatGPT to write research papers. We allow some of that for very specific reasons, but we also are trying to get students to think about what this software is good at and not good at, because usually their literacy about it is not very good.

Larsen: Let’s talk about that some more, especially with a focus on generative A.I., whether large language models or image creation-type A.I. What should we be teaching, and how should we be teaching it, to prepare our students for work environments where A.I. proficiency will be required?

Stevens: What we’re trying to do when we use A.I. is to have students understand what those tools are doing, because they already have the literacy to write, to research and analyze content themselves. They’re just expanding their capacity or their efficiency in doing certain tasks, not replacing their command of text or research.

Fiesler: There’s also that understanding of the limitations of these tools. A.I. can seem like magic, and if it seems like magic, you don’t understand what it can do or not do. This is an intense simplification, but ChatGPT is closer to being a fancy autocomplete than it is a search engine. It’s just a statistical probability of what word comes next. And if you know that, then you don’t necessarily expect it to always be correct or always be better at a task than a human.

Stevens: Say a student is writing a research paper and is engaged in a particular set of research literature—is the A.I. drawing from the most recent publications, or the most cited? How does peer review fit into a model of chat generation? These are the kinds of questions that really tell us these tools aren’t as good as what students sometimes think.

Larsen: We’re talking a lot about technology literacy here, but are there any other aspects of literacy you think are especially pertinent when it comes to A.I. models?

Fiesler: There’s also information literacy, which is incredibly important when you are getting information you cannot source. If you search for something on Google, you have a source for that information that you can evaluate, whereas if I ask a question in ChatGPT, I have to fact-check that answer independently.

Stevens: I’m glad you said that, because in class, if a student has a research project, they can declare they’ll use A.I. to assist them, but they get a different rubric for grading purposes. If they use assistance to more quickly build their argument, they must have enough command of the literature to know when that tool generates a mistake.

Fiesler: And educators have to have an understanding of how these tools work, as well. Would you stop your students from using spell check? Of course not—unless they’re taking a spelling test. The challenge is that sometimes it’s a spelling test, and sometimes it’s not. It’s up to educators to figure out when something is a spelling test, and clearly articulate that to the students—as well as the value of what they’re learning, and why I’m teaching you to spell before letting you use spell check.

Expanded Remarks

Star Wars: The Frog Awakens

Larsen: That’s an interesting thought. What about specific skills like critical thinking, collaboration, communication and creativity? How will we change the way we teach those concepts as a result of A.I.?

Fiesler: I think critique and collaboration become even more important. ChatGPT is very good at emulating creativity. If you ask it to write a fan fiction where Kermit the Frog is in Star Wars, it will do that. And the fact that it can do that is pretty cool, but it’s not good; it tends to be pretty boring. Charlie Brooker said he had ChatGPT write an episode of Black Mirror, and of course it was bad—it’s just a jumble of tropes. The more we play with these systems, the more you come to realize how important human creativity is.

Stevens: You know, machine learning hasn’t historically been pointed at creativity—the idea is to have a predictable and consistent set of responses. But we’re trying to teach our students to develop their own voice and their own individuality, and that is never going to be something this version of tools will be good at emulating. Watching students fail because they think technology offers a shortcut can be a literacy opportunity. It lets you ask the student, are you just trying to get software to get you through this class—or are you learning how to write so that you can express yourself and be heard from among all the people being captured in the algorithm?

Larsen: It’s interesting listening to you both talk about creativity in the age of A.I. Can you elaborate? I’m especially interested in this historical view that creativity is one of the things that A.I. would never get right, which might be a little less true today than it was a year ago.

Fiesler: Well, I think it depends on your definition of creativity. I think A.I. is certainly excellent at emulating creativity, at least, like Kermit and Star Wars, and the things A.I. art generators can do. One of the things art generators do very well is giving me an image in the style of this artist. The output is amazing. Is that creative? Not really, in my opinion. But there are ways you could use it where it would be good at generating output that, if created by a human, people would see as creative.

Stevens: We have courses in which students work on a new media franchise pitch, which includes writing, comic book imagery, animation, art—they’re pitching a transmedia output, so it’s going to have multiple modes. You could waste two semesters teaching a strong writer how to draw—which may never happen—or, we can say, let’s use software to generate the image you think matches the text you’re pitching. That’s something we want students to think about—when do they need to be creative, and when do they need to say, I’ve got four hours to produce something, and if this helps my group understand our project, I don’t have to spend those four hours drawing.

"It’s not that A.I. brings new problems to the table, but it can absolutely exacerbate existing problems to new heights.”
—Rick Stevens

Risky Business

Larsen: What about media and journalism? Do we risk damaging our reputation or credibility when we bring these tools into the news?

Stevens: Absolutely. The first time a major publication puts out a story that fails fact-checking because someone did not check the A.I. output, that is going to damage not just that publication, but the whole industry. But we’re already seeing that damage coming from other technological innovations—this is just one among many.

Fiesler: I think misinformation and disinformation are the most obvious kinds of problems here. We’ve already had examples of deepfakes that journalists have covered as real, and so journalists need to be exceptionally careful about the sources of images and information they report on.

Stevens: It’s not that A.I. brings new problems to the table, but it can absolutely exacerbate existing problems to new heights if we’re not careful about what the checks and balances are.

Larsen: How about beyond the news? What are some significant trends communicators and media professionals should be keeping an eye out for?

Stevens: We need to train people to be more critical at looking not just where content comes from, but how it’s generated along certain biases. We can get a chatbot to emulate a conversation, but that doesn’t mean it can identify racist tropes that we’re trying to push out of our media system. A lot of what we do, critically, is to push back against the mainstream, to try to change our culture for the better. I’m not sure that algorithms drawing from the culture that we’re trying to change are going to have the same values in them to change anything.

Expanded Remarks

Capitalism and computational power

Larsen: What’s a big question we’re not asking about A.I. and our work?

Stevens: I think the biggest question is, what does A.I. free us up to do that we haven’t been able to do before?

Fiesler: Agreed. Let’s say A.I. and automation really could replace a lot of jobs. So because of ChatGPT, you now need two copywriters to do the job of four copywriters. You could fire two copywriters, but another option is, your four copywriters work 20 hours a week instead of 40 and still get paid the same. Because it’s not like you’re making less money, or you put resources into building your own A.I. If this technology can replace some things we’re doing, that shouldn’t mean we don’t have jobs, it should just mean we have to work less.

Stevens: It’s actually in cultural producers’ interest for something like this to happen. There’s this assumption that, oh, we can do the work of four people with two people now, so let’s fire two of them. Well, better rested, more thoughtful workers can produce better, more thoughtful content. The content we create forms our social identity, so the more thoughtful we are, the better a society we’re going to have, because we’ve inspired people to think about their world differently.

Larsen: I have to tell you both, I’m very impressed with your level of optimism when it comes to A.I. Why don’t we end on an optimistic note, as well? What’s something you feel communicators should be excited about from the dawn of this new age of work?

Stevens: One thing communicators should be excited about is that these tools exist because the process of communication is valuable. Our ability to produce more culture is not a bad thing, we just want it to have a higher fidelity and have the values we want to have, and I think those are questions that thoughtful communicators can bring to the table and help shape.

Fiesler: I agree with that, as well. Young people in college are some of the most well positioned to make an impact on how this technology is going to influence our future, with the way decisions are made around how it’s actually going to change our lives and industries. There are ways in which some things that are happening are scary, but it’s an interesting time to be on the ground floor.

For A.I. to be useful, it needs to grow alongside communicators—not replace them. CMCI experts share their vision for a workplace with ChatGPT and other tools.

#RecommenderSystems /cmcinow/recommendersystems Fri, 10/27/2023 - 21:52 Categories: Trending Tags: Information Science Research faculty

By Joe Arney

Digital recommender systems have long been a part of our lives. But those systems might be serving up inequality along with new music, viral videos and hot products.

Now, a leading expert on the technology powering these systems is turning his attention to the way news is recommended and shared. 

“If a system only shows us the news stories of one group of people, we begin to think that is the whole universe of news we need to pay attention to,” said Robin Burke, professor and chair of the information science department. 

Burke’s research studies bias in recommender systems, which tend to favor the most popular creators and products—usually at the expense of newcomers, underrepresented groups and, ultimately, consumers, who are left with fewer choices. Studying that bias is difficult because these systems are proprietary, so outside researchers aren’t able to examine how they work.

“The people who do this kind of research in industry don’t publish very much about it, so we don’t know exactly what’s going on in terms of how their systems work, or how well they work,” he said.

A quick primer for the uninitiated: Recommender systems use data from individual subscribers to serve personalized content—art, news, commerce, politics—which may limit exposure to new ideas and influences.

It’s why the National Science Foundation awarded Burke and others, including associate professor Amy Voida, a nearly $1 million grant in 2021 to develop “fairness-aware” algorithms that blunt biases baked into recommender systems. And the NSF saw the potential to do something similar in news, leading to a $2 million grant earlier this year to build a platform for researchers eager to experiment with the artificial intelligence that powers news recommender systems.

A platform like this could be game-changing for academic researchers, who are locked out of the proprietary systems built and studied by tech and social media companies. And as more nontraditional providers become sources of news, understanding how these algorithms work is essential: You may think of TikTok as a place for music videos, but a Pew Research Center survey found one in four American adults under 30 get their news from the platform.

“We have put all this control over the public square of journalistic discourse into the hands of companies that don’t have any transparency or accountability relative to what they’re doing,” Burke said. “I think that’s dangerous. And so, it’s important to think about what the alternatives might look like.” That includes the business model itself, which is predicated on selling ads while keeping users on a platform.

If successful, the project funded by this latest grant will build a robust system for live experiments on recommender systems, one that will eventually become self-funded through contributions from other researchers. He compared it to the way space telescopes and supercolliders have created platforms where experts can better understand the world around them.

“Unless you work at one of these companies, you don’t have any insight into how these systems work, or control over them,” Burke said. “I hope that, through this infrastructure, we’re able to understand how these things are governed, and for what objectives—and who gets to decide what those objectives are. That’s something I’m very interested in.”

Lisa Marshall (Jour, PolSci’94; MJour’22) contributed reporting.

Tech is shaping the way we understand the world around us. Do we understand the recommender systems influencing our worldview?

#PatientInfluencers /cmcinow/patientinfluencers Fri, 10/27/2023 - 21:40 Categories: Trending Tags: Advertising Public Relations and Media Design Research faculty public relations

By Lisa Marshall (Jour, PolSci’94; MJour’22)

“Noticing a huge difference in my belly fat. It’s melting away!”

“Wildly happy after losing 70 pounds!”

“Just took my first dose. I’m nervous, but excited!”

In late 2022, TikTok was abuzz with such endorsements, delivered by hopeful dieters clutching blue syringes loaded with the diabetes drug-turned-celebrity “weight-loss miracle” Ozempic. The hashtag #Ozempic swiftly drew more than 1 billion views.

But as the craze went viral, diabetics worldwide faced dangerous shortages. Meanwhile, those using it off-label for its slimming qualities began reporting serious side effects, such as violent diarrhea and extreme facial thinning.

“This is a great example of the power of social media—and the unintended consequences,” said Erin Willis, associate professor of advertising, public relations and media design, and one of the few scholars studying a new kind of social media star—the patient influencer.

Her research has shown they often work closely with pharmaceutical companies, or are paid by them, and frequently offer advice about drugs even though they tend to lack medical expertise.

Ozempic is the most recent example of their power, but the phenomenon dates at least to 2015, when Kim Kardashian drew flak for endorsing a morning sickness drug, Diclegis, on Instagram without mentioning its many side effects. Federal regulators warned the drugmaker, the ad was taken down, and the government implemented new disclosure rules for influencers.

Eight years later, the phenomenon has continued to grow, bleeding into new platforms—like support groups for patients with specific medical conditions—where rules are open to interpretation and nearly impossible to enforce. That’s a concern for Willis: “There is virtually no research on this, and very little regulation.”

Willis has published some of the first academic papers exploring the patient influencer phenomenon, framing it as “the next frontier in direct-to-consumer pharmaceutical marketing.”

DTC marketing is a longstanding practice, permitted only in the United States and New Zealand, in which drug companies advertise directly to consumers rather than through physicians. From a sales perspective, the practice is effective, according to Willis: About 44% of patients who ask their doctor for a drug they see on TV get it.

But, as always, when it comes to social media, there are plenty of unanswered questions. “The fact that patients with no medical training are broadly sharing drug information should alarm us,” she said.

In her work, Willis interviewed dozens of influencers to better understand their motivations. While the influencers she spoke to appeared to have good intentions, she said some might omit crucial information, such as the availability of a cheaper generic option, or unintentionally disseminate misinformation. And consumers might be unable to distinguish between a personal post and a paid endorsement.

That said, she does see some upsides. Patients often know more than their doctors about what it’s like to experience a specific health condition, and sharing their personal experiences on social media can be comforting for others, while potentially helping them discover new coping strategies.

And unlike other forms of DTC advertising, social media enables followers to weigh in with comments sharing both positive and negative experiences with a specific therapy.

Willis hopes her new research will ultimately lead to a set of best practices for both patient influencers and the companies they work with.

“There is both value and risk here,” she said. “Like anything, it has the potential to become dangerous if we’re not careful.”

Take two posts and call me in the morning: Social media’s new role at the pharmacy.
