Evolution of HCI: Greater Flexibility Than Ever Before

No matter the product, a large purchase is an important decision—all the more so if it is for your business. But oftentimes the choice we make—even after careful consideration and plenty of research—can leave us with a feeling of buyer’s remorse. In IT this can happen for any number of reasons: change and churn occur, business needs evolve, requirements from development teams shift, and you may find yourself in an environment you did not anticipate. What you really need then is a multi-talented, agile architecture designed with the future in mind. Luckily, this …


The Forgotten Tribe and The Dell Digital Way



As digital transformation hype continues to grow, IT is still an enabling function that exists to deliver business outcomes. As with other support functions like human resources, finance and legal, in IT it’s very common to refer to functions outside of IT as “the business”. The business is trying to grow margin dollars. The business is trying to increase productivity. The business is trying to reduce customer effort. But who is this nameless, faceless business that IT supports? The face of the business is the forgotten tribe – the tribe of users, the people who actually use your software tools, sometimes for many hours a day.

Our Dell Digital team is working hard to put a face on the business to enable exceptional outcomes even faster and with less re-work. We call it the Dell Digital Way – a major cultural shift for us built on people, process and technology. Heavily inspired by our brothers and sisters at Pivotal, we’re combining elements of design thinking, Agile, SAFe, extreme programming and IoT. That’s a lot of jargon, so what exactly are we doing? We’re taking the Pivotal methodology, adding in a few dashes of our own, and applying it across everything that we do.

First we start with user empathy, the hallmark of design thinking, and we are doing it with professionally trained designers – actually spending time to understand not only what our users do but how they do it and what motivates them. Qualitative empathy is critical, but I can’t do it justice compared to the classic TEDx talk by Doug Dietz. What we learned is that users care most about an effortless experience and far less about new bells and whistles. In their hierarchy of needs, users want applications that are first up, then fast and ultimately easy.

Our human-centered approach brings qualitative and quantitative approaches together. We’ve adopted an iterative approach focused on user empathy, with elements of Agile, extreme programming and SAFe, to release small increments in days or weeks. We always write test cases before developing any user story. Finally, we instrument our applications (not just the website) in the spirit of IoT. The software application itself is the thing, and instrumentation gives us performance and adoption feedback so we can continue to fine-tune our interface configuration as well as optimize the performance of backend calls. Using this quantitative empathy, we can begin the cycle of qualitative empathy over again at the start of the next cycle.
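To make the instrumentation idea concrete, here is a minimal sketch of what event-level telemetry can look like in application code. Everything in it is illustrative rather than Dell Digital’s actual tooling: the endpoint URL, the `emit` helper and the event and feature names are all hypothetical.

```python
import json
import time
import urllib.request

# Hypothetical collector endpoint -- stands in for whatever telemetry backend is in use.
TELEMETRY_URL = "https://telemetry.example.com/events"

def emit(event: str, **fields) -> None:
    """Send one instrumentation event; telemetry must never break the app."""
    payload = json.dumps({"event": event, "ts": time.time(), **fields}).encode()
    request = urllib.request.Request(
        TELEMETRY_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    try:
        urllib.request.urlopen(request, timeout=2)
    except OSError:
        pass  # best effort: drop the event rather than fail the user action

def lookup_order(order_id: str) -> dict:
    """A backend call instrumented for performance and adoption feedback."""
    start = time.perf_counter()
    order = {"id": order_id, "status": "shipped"}  # stand-in for the real backend call
    emit(
        "backend.lookup_order",
        duration_ms=round((time.perf_counter() - start) * 1000, 2),
        feature="order_panel",  # adoption signal: which UI feature drove the call
    )
    return order
```

The design choice that matters is that every interesting user action and backend call produces a timestamped event, so performance tuning and adoption analysis can run from the same stream of data between iterations.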

A great example of where we’ve applied this approach is in our current Salesforce Service Cloud implementation. We started by setting up a lab in our contact center and selected a team of users that represented a small sample of the total user population. Our team of product managers, designers and engineers spent hours and days observing and building rapport with the users. In parallel, they started configuring (never customizing) the application and doing demos with the users. Prior to configuring each user story, they wrote test cases to ensure story success.
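To illustrate that test-first discipline, here is a minimal pytest-style sketch. The story, the queue names and the `assign_queue` helper are all invented for this example; in the real implementation the behavior under test would be a configured routing rule exercised through the platform.

```python
# Story: "As a premier-support agent, I see premier cases routed to my queue."
# The tests are written before any configuration and define what "story success" means.

def assign_queue(case: dict) -> str:
    """Stand-in for the configured routing rule under test."""
    return "Premier Queue" if case.get("tier") == "premier" else "Standard Queue"

def test_premier_case_routes_to_premier_queue():
    assert assign_queue({"tier": "premier"}) == "Premier Queue"

def test_non_premier_case_routes_to_standard_queue():
    assert assign_queue({"tier": "basic"}) == "Standard Queue"
```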

There’s a common misconception that you don’t need UX with SaaS because there’s already a UI. When you decide to go with a SaaS application you are outsourcing the UI, but the UX is still in your hands. SaaS platforms generally give you enough degrees of freedom to overwhelm users if you don’t make a conscious commitment to design and to controlling complexity throughout the life of the application. If you empathize with your users and apply design and analytics properly, you’ll see this forgotten tribe celebrate their tools instead of wrestling with them, improving the employee experience and ultimately benefiting customers. This is the art of delivering a world-class end-user experience and the outcome we expect with the Dell Digital Way.


College Football’s Cyber Warfare

After special counsel Robert Mueller announced indictments for a number of Russians last week, the Washington Post published an interview with an employee from a Russian troll farm near St. Petersburg. The employee talked openly about creating multiple online personalities and using them to influence opinions, organize efforts and create division using narratives and stories they had cooked up.

This was some real 21st century spy warfare, the kind of stuff you would see in the Showtime series “Homeland.”

But would you believe this type of social media/internet cyber warfare has been going on in college football for a long time?

It has.

The internet disinformation wars began years ago in college football. As far back as the late 1990s, coaches and operatives in football programs would create online message board personalities to post “rumors” on opposing teams’ message boards. Often the rumors involved an opposing coach being on the hot seat, or players being unhappy or wanting to transfer. These posts usually sparked a string of responses.

These threads were created to sow doubts about rival schools in recruiting and often the coaches who started the disinformation would direct recruits to check them out. The recruit would see the rumor and the negative fan feedback.

Those were the early days and they seem quaint now.

As social media exploded so did the size and sophistication of college football social media operations, including cyber warfare.

Because the NCAA now allows coaches and teams to retweet things recruits post, they can easily signal to their fan base who they are recruiting. For years the NCAA did not allow teams to publicize who they were recruiting, to prevent rogue boosters from having contact with recruits.

Those days are over. Coaches essentially confirm who their school is recruiting and fans can engage directly with recruits on social media. It creates a lot more opportunity for disinformation.

But the real cyber warfare takes place with internet operations in college football programs. Like the Russian troll farms, they create online personalities to troll other teams, to contact and help recruit players and to try to control and react to any negative news stories.

One of the methods used in the Russian troll farms was to stage mock debates between people in comments sections, or in social media. A couple of trolls would debate a straw man in the comments section and inevitably they would convert the straw man to endorse the view they wanted.

Some programs have the same type of operation. On message boards or social media you’ll see online personalities that always defend their school and engage every time there is something they want to refute.

But the disinformation plots don’t just work on recruits. They are used to fight against negative fans when a team might be struggling. They are used to create “virtual momentum” and the appearance of a groundswell of support when a school is deciding whether to fire a coach or hire one. They are used to swamp the comment sections of certain writers who may be critical of the program.

Athletic directors and administrators fall for it because they are concerned about popular opinion. While head coaches may maintain plausible deniability, the operations are all part of helping them keep their jobs.

It is possible that the superagents in college coaching who have many big-name clients may have similar operations. As openings come and go, those agents could deploy their trolls to whip up popular opinion to get their guy hired and thereby reap the rewards of major new contracts.

As more details about the Russian operation emerge, keep in mind that the cutthroat world of international cyber warfare and politics may well be mirrored at your favorite school’s athletic program.

These operations have shot up as the NCAA’s regulatory retreat from social media has left an online world of Wild-West lawlessness.

Building the positive propaganda and combatting every negative rumor and utterance has become a full-time, 24/7 operation. So too have the even murkier operations that create disinformation through rumor and innuendo to damage other schools or help your school with recruits, fans and administrations. An army of interns in football buildings around the country is always on the case.

And you thought that this was a Russian creation? Guess again.

It is a bold new world out there. The kind of disinformation operations, powered by high-level technology, that Orwell prophesied in his book “1984” have been exposed in politics. Little did many of you suspect that those Orwellian tactics have already been part of college football programs’ operations for years.




Mueller indictment shows the evolution of Kremlin political warfare

Far deeper than an online disinformation campaign, the Internet Research Agency’s (IRA’s) work included extensive research on American politics and society and real rallies on U.S. soil. Its operatives impersonated Americans to dupe an unspecified number of U.S. citizens and Trump campaign staff.

The indictment provides the clearest blow-by-blow assessment of how Moscow has adapted its influence operations for the 21st century. The basic tactics are straight from the Soviet “active measures” playbook: a continuous spread of disinformation during the Cold War to discredit American political leaders (including Martin Luther King, Jr.), fuel ethnic tensions and undermine trust in U.S. intelligence agencies.

In 1976, the KGB launched a smear campaign against the anti-Soviet Democratic candidate Sen. Henry “Scoop” Jackson, armed with forged FBI intelligence. In the post-Soviet era, Putin’s advisers have boasted about how they pit different groups against each other inside of Russia. Sound familiar?

What’s next: The coming revolution in AI and machine learning will transform malicious actors’ capabilities to influence democracies. This won’t happen by the fall of 2018, but 2020 will likely usher in even more dangerous forms of political warfare.

Alina Polyakova is the David M. Rubenstein Fellow for Foreign Policy at the Brookings Institution.


Memetic Warfare: Spreading Weaponized Ideas for Influence and Control

Russia’s Internet Research Agency was recently highlighted in charges from special counsel Robert Mueller for its operations “to interfere with elections and political processes.”

According to a new book, however, the Russian operations are just a small part of a much larger picture. Special interest groups, governments, and big businesses are trying to alter the way we perceive information, in order to influence the culture and underlying values of our societies.

The book, “Information Warfare: The meme is the embryo of the narrative illusion,” by James Scott, founder of the Center for Cyber-Influence Operations Studies, explains the strategies of “memetic warfare,” and reveals the groups using this system to advance their agendas.


A meme is an idea that can “evolve” over time, and eventually influence culture. Memes are often associated with funny pictures overlaid with text. Yet the concept goes much deeper, and a meme can be anything from music, to movies, to words and their perceived meanings.

Memetic warfare is a weaponized use of memes to intentionally introduce ideas into society, packaged in a way that allows them to spread, with a goal to alter the culture and perceptions of a targeted population.

A goal of memetic warfare isn’t to alter reality, but instead to alter the perceived reality.

According to the book, “the most profound weapon a nation or special interest group can possess is ‘control’ over information. This contributes to control over the narrative, and the meme is the embryo of the narrative.”

Other entities play a role in helping shape the ideas, and control people’s exposure to ideas that don’t fit the objective.

“Corporate nation state propagandists, such as Google, Twitter, YouTube, and Facebook, perpetuate the syntactical amalgamation of censored ideas, narrative illusions, and perception steering initiatives that cripples and imprisons the mind,” it states.

It adds, “Censorship is about what you don’t see, rather than what you do see. Digital gatekeepers provide users with only the content that they want them to view.”

Manufactured Thought

The nature of warfare has changed. As the book notes, war has moved beyond merely killing an enemy or capturing and holding territory. The war of messages has taken over, and “The emerging hybrid war depends on the allegiance of civilian populations and control over narrative.”

The book poses a question. It cites French philosopher René Descartes, stating, “I think, therefore I am,” and poses the question, “but who does one become when the thought is hijacked?”

It raises the issue that as political organizations, social networking companies, legacy news outlets, and other powerful groups work together to manufacture ideas intended to alter the perceptions of a country, how can people recognize what are their own thoughts, and which thoughts have been planted?

In today’s world, “Information Warfare” states, websites like Facebook are nearly as relevant as the United Nations, information-leaking website WikiLeaks has intelligence analysts similar to the CIA, and “Google’s dragnet surveillance censorship algorithm has become the new gatekeeper of critical information that could lead society into a new renaissance.”

A shift in power has taken place, moving to “an all-out battle for the psychological core of the global population.”

“Digitized influence operations have become the new norm for controlling the electoral process, public opinion, and narrative,” the book states. “The cyber war has moved beyond the battlefield into an all-encompassing struggle in economics, politics, and culture, along with old-school physical confrontation.”

Among its examples, “Information Warfare” notes that some violent protests are being used by special interest groups to advance key narratives. For instance, the communist extremist group Antifa, known for its black-clad, masked followers, labels nearly all conservatives as “fascists” and often escalates conflicts into violence.

“In reality, the overwhelming majority of protestors and counter-protestors are non-violent; however, they and, in most cases, the points of their causes do not merit media attention because relatively minuscule radical factions can easily steal the spotlight,” the book states.

In some cases, the true intention of the “revolutionaries” isn’t just to protest, but instead to “derail an event or detract from a cause by altering public perception and polarizing issues based on partisan politics.”

Because Antifa members wear masks, anyone can infiltrate the group to escalate conflicts, which can then be used by legacy news outlets and political groups to frame new narratives.

This is a common phenomenon, the book states, noting “false flag operations and operations sponsored by special interest groups are both effective and prevalent in this space.”

Perception Warfare

The term “meme” was coined by militant atheist Richard Dawkins, who compared the spread of ideas and their effects on society to a “virus.”

The concept far predates Dawkins, however, and ties to broader systems of propaganda and psychological warfare—a method of warfare designed to alter the way a target interprets information.

Propagandists, such as those under communist dictatorships, will try to control a society’s exposure to ideas through censorship, while also feeding select ideas through state media and other channels—similar to the methods used by today’s information gatekeepers.

Among the methods used to frame ideas are misinformation and disinformation. While misinformation is the mere statement of falsehoods, disinformation is much more complex.

A disinformation campaign can take the form of false-flag operations, such as manufactured events or protests, or fake scientific studies and research papers. The disinformation can then be pushed by news outlets or through other channels to help shape a narrative. The rule is that disinformation needs a grain of truth, which the propagandist can point to in order to derail critics during debate.

Another use of disinformation is to cite otherwise true information, but to manufacture a false conclusion, using the propagandist concept of “one plus one equals three.” This can include citing a series of half-truths, then claiming the evidence adds up to something it does not. Rebutting this method requires a dissenter to debunk each piece of evidence in turn, which can rarely be done quickly enough for public debate.

These tools are still in heavy use. As the book notes, psychological warfare is part of the Chinese Communist Party military’s “three warfares” system, which also includes “legal warfare” to manipulate courts, and “media warfare” to control news and social media coverage.

Disinformation is still actively being used by Russia’s Internet Research Agency, which the book says includes “a collection of government-employed online trolls directed to spread propaganda, incite divisions in foreign communities, and otherwise sow chaos and destabilize democratic platforms.”

“Propagandists from Russia, China, and other nations typically pander memes to both sides or multiple factions of sensitive conflicts in an attempt to breed discord, capitalize from chaos, derail productive discussion, distract impending investigations, dwindle valuable resources, or polarize susceptible populations,” “Information Warfare” states.

The overall picture is that numerous groups, both public and private, are using memetic warfare to attack the perceptions of individual people. Some are interested in advancing their political agendas; others are working to destabilize the United States.

The book states, “nations must decide how to best defend their people against foreign influence operations while launching their own campaigns against emerging adversaries in the hyper-dynamic, ill-defined battlefield for control of the meme, control of the narrative, and control of perceived reality.”


Russian meddling preys on a gullible public

By Hank Waters

In an excellent report published in this newspaper last Sunday, Rudi Keller explained what he learned from several researchers about recent Russian meddling in U.S. affairs using social media. Keller’s primary source was Lt. Col. Jarred Prier, who for years has studied Russian cyber warfare and recently wrote a peer-reviewed report including student protests at the University of Missouri as an example.

Prier says Russian disinformation campaigns seek to sow discord among allies of the U.S. and internally as well. Particularly galling to Prier, a 2003 MU grad, was the successful Russian effort to stoke unfounded fears of a violent white backlash surrounding 2015 student protests and subsequent resignation of then-UM President Tim Wolfe.

Prier found Russian cyber trolls used Twitter to spread untrue accounts of campus violence, including Ku Klux Klan marches and a phony picture of a battered black youth. Incessant repetition on social media caused many to believe the false reports.

The recent indictment by Special Counsel Robert Mueller charges that Russia used its influence campaign in the 2016 presidential race to benefit Republican Donald Trump and Democrat Bernie Sanders in order to discredit Democrat Hillary Clinton, thought by the Russians to be their main target.

Larger conclusions by Prier and other expert witnesses interviewed by Keller are interesting. Prier says “They want to force the American public to go over into a corner and argue amongst themselves.”

MU professor of political science Cooper Drury says Russia’s long-term goal is not the victory of any political party but a weaker U.S. If disruption is your goal, says Drury, “then the greater polarization you can get inside a democracy the more successful you will be.”

MU professor of communications Mitchell McKinney says social media helps mask the source of otherwise questionable propaganda, and volume creates believability. Then, he says, the greatest success comes when these rumors are reported by trusted news organizations.

“These Russian trolls were driving clicks,” says Prier. “Clicks are what keeps the business moving.”

If political polarization in the U.S. is a primary goal, we might think the Russian campaign has been spectacularly successful, but MU professor Drury points out that traditional media once considered neutral are more likely today to take sides. He cites the television networks Fox News and MSNBC, which attract opposed and mutually disdainful audiences.

Prier’s report sounds pessimistic, but MU journalism professor Mike Kearney argues the internet makes it easier for each of us to share and find information “by ourselves.” Prier says it’s up to providers of information, including Twitter, to be more careful.

Obviously, the first line of defense should be the retail consumer of news, but as we see in the new age of easy disinformation, we have not yet fully learned that skill. A gullible public has existed since the first human society appeared. Today the same human frailty persists, frighteningly fueled by the internet and its latest, most insidious tool, Twitter.

Yes, I will say “insidious.” The benefit of sharing innocuous messages is sadly overcome by the pernicious opportunities gained by newly empowered trolls who so easily get in our heads anonymously. Will we learn to be skeptical enough?

HJW III

hjwatersiii@gmail.com

The best argument against democracy is a five-minute conversation with the average voter.

—Winston Churchill


Vladimir Putin uses cyber weapons to keep Americans at each other’s throats

By Clifford D. May – Tuesday, February 6, 2018

ANALYSIS/OPINION:

Just so there’s no confusion: This column is not about Americans conspiring or colluding or coordinating with Russians. That’s a separate controversy about which I don’t have a lot to say at this moment.

What this column is about: Dezinformatsiya, the Russian word that gave birth, in the 1980s, to the English neologism “disinformation.” Understand that disinformation is not a synonym for misinformation. The latter implies information that happens to be wrong. The former implies an attempt to deceive public opinion for strategic purposes.

For decades, thousands of Soviet propagandists and espionage agents disseminated tons of dezinformatsiya around the world. Today, using social media, sophisticated tech platforms and cyber weapons, the Russian government, headed by Vladimir Putin, is running a dezinformatsiya offensive beyond Joseph Stalin’s wildest dreams.

Jamie Fly and Laura Rosenberger have been studying this operation. Senior fellows at The German Marshall Fund of the United States, they are seasoned national security professionals. Both have worked, among other assignments, at the National Security Council, Mr. Fly in the George W. Bush administration, Ms. Rosenberger under President Obama.

They’ve been tracking “Kremlin-oriented social media accounts,” “troll farms,” “fake personas” and “fake organizations.” President Putin, they’ve concluded, is attempting to undermine faith in America’s democratic institutions, assist extremists on both the left and right, divide and polarize Americans (even more than they already are), and poison the policy debates that citizens of a mature republic should be able to conduct in a civil manner.

Elections are just one target of opportunity. Russia’s networks, Mr. Fly and Ms. Rosenberger write in the Journal of Democracy, have been using social media to heighten tension in a range of controversies. One example: Reasonable people may differ over whether Confederate statues, in Charlottesville and elsewhere, should remain or be removed. The mission of Russian disinformation operations: Make this a fight between neo-Nazis on one side and Antifa thugs on the other.

Another example: Mr. Putin has a strong interest in keeping his European neighbors dependent on his oil and gas, and in pushing the price of those commodities as high as possible. So Mr. Putin’s networks have been running a covert disinformation campaign against hydraulic fracturing, the technology that has made it possible to access abundant natural gas deposits cheaply. The Kremlin didn’t create the controversy over fracking; it has simply promoted “some of the most divisive, conspiracy-minded stories around” that debate.

America is not Moscow’s only target. The Fly/Rosenberger research “has found examples of Russian interference” in 27 countries since 2004: planting false information in reputable newspapers, boosting radical political parties, hacking moderate political parties and leaking juicy tidbits to friendly and/or credulous journalists. More insidious than spreading fake news is sprinkling lies into a goulash of facts to produce a distorted narrative that becomes impossible to successfully rebut.

Why is Mr. Putin doing this? If that’s the question you’re asking, you haven’t been paying attention to the man. His mission is to restore the power Russia lost when the Soviet Union collapsed. And, in his calculus, strengthening Russia and weakening the West amount to the same thing.

Mr. Putin is an authoritarian and like other authoritarians — Chinese, Iranian, Turkish, North Korean, etc. — he regards democratic and republican forms of government as weak, decadent and, over time, bound to fail or, better yet, be defeated.

Undermining democratic institutions increases Mr. Putin’s legitimacy. You say elections in Russia are rigged? His supporters say that elections are not so free and fair in America and Europe either. This perception hobbles movements in support of civil rights and representative government everywhere.

Okay, I think I will say a few words about the raging partisan debate — allegations from Democrats and some #NeverTrump Republicans that Mr. Putin meddled in America’s 2016 election with the goal of helping Donald Trump. More likely is what retired CIA chief of station and veteran Russia-watcher Daniel Hoffman concluded: that the “Russian espionage disinformation plot” was meant to target “both parties and America’s political process.”

As evidence, he notes a 2017 report from the U.S. Office of the Director of National Intelligence which concludes that “pro-Russia bloggers even prepared an election-night Twitter campaign, #DemocracyRIP, designed to question the election’s validity after a Clinton victory.”

As V.I. Lenin would say: What is to be done? A bipartisan bill introduced by Sen. Marco Rubio and Sen. Chris Van Hollen would punish Moscow if our intelligence community determines that Russia is interfering in future elections.

Mr. Fly and Ms. Rosenberger argue that a “whole-of-government response, with a strong interagency lead and process that cuts across national security and domestic policy spaces, will be required to address this threat.”

They add: “With the United States and Europe facing a shared threat with similar tactics, a united trans-Atlantic response is critical to pushing back on Moscow’s efforts to weaken democracies and divide democratic nations from one another.”

Finally, the United States should do whatever is necessary to win the race in cyberspace, as much a domain of modern warfare as air, land, sea and space. Our aim should be nothing less than overwhelming superiority, both defensively and offensively.

We also need to get way ahead in the race for artificial intelligence, a weapon of dezinformatsiya with enormous potential. Mr. Putin declared last year that “whoever becomes the leader in this area will rule the world.”

The United States has no interest in ruling the world. The United States does have a vital interest in preventing authoritarians from ruling the world. Acutely aware of that, Mr. Putin will do everything in his power to keep us at each other’s throats.

• Clifford D. May is president of the Foundation for Defense of Democracies and a columnist for The Washington Times.



Information Warfare: Gauging Trolls’ Influence on Democracy

CIA Chief Warns Russia Is Seeking to Influence US Midterm Elections

Mathew J. Schwartz (euroinfosec) • January 30, 2018

Distribution of reported locations for tweets by Russian trolls (red circles) and a random, baseline set of Twitter users (green triangles). (Source: “Disinformation Warfare: Understanding State-Sponsored Trolls on Twitter and their Influence on the Web”)

The United States appears to be headed into yet another perfect information warfare storm of Russian making.


On Monday, the Trump administration announced that it will impose no new sanctions on Russia as a result of its 2016 meddling in the U.S. presidential election or 2014 invasion of Crimea.

But CIA Director Mike Pompeo tells the BBC that he’s seen no “significant decrease” in Russian information warfare activity and predicts it will not decline before November’s House and Senate mid-term elections (see No Shock: Russia Confirms ‘Cyber War’ Efforts).

“I have every expectation that they will continue to try and do that, but I’m confident that America will be able to have a free and fair election [and] that we will push back in a way that is sufficiently robust that the impact they have on our election won’t be great,” Pompeo says.

Russian Disinformation Campaigns

In October 2016, the U.S. Department of Homeland Security and the Office of the Director of National Intelligence blamed the Russian government for attempting to interfere in U.S. elections by hacking and leaking documents, saying such activities were authorized by “Russia’s senior-most officials.” (See US Government Accuses Russia of Election Hacking)

The precise manner of that interference continues to come into focus, as Twitter, Google and Facebook release details of social media accounts tied to Russia’s disinformation and propaganda efforts (see Senate Grills Tech Giants Over Russian Fake News).

Troll Farms

What effect might Russian information warfare efforts have on U.S. voters?

In late 2017, Congress launched an investigation into Russian interference and released a list of Twitter accounts flagged as being used by Russian trolls.

A group of researchers have since analyzed what they say are “27,000 tweets posted by 1,000 Twitter users identified [by Congress] as having ties with Russia’s Internet Research Agency and thus likely state-sponsored trolls.” The researchers – from Cyprus University of Technology, University College London and University of Alabama at Birmingham – looked at the Twitter users’ impact not just on that social network, but also on the Reddit and 4chan forums, according to their new report, “Disinformation Warfare: Understanding State-Sponsored Trolls on Twitter and Their Influence on the Web.”

Troll Hashtags

Top 20 hashtags in tweets from Russian trolls compared to a baseline, random set of Twitter users. (Source: “Disinformation Warfare: Understanding State-Sponsored Trolls on Twitter and Their Influence on the Web”)

Their chief finding: The quantifiable impact of the “trolls’ influence” on other Twitter, Reddit and 4chan users over a 21-month period “was not substantial with respect to the other platforms, with the significant exception of news published by the Russian state-sponsored news outlet RT,” which was previously known as Russia Today.

The researchers found that tweets that include links to RT had four times as much impact as other trolling efforts (see Russian Interference: Anatomy of a Propaganda Campaign).

Terms extracted from Latent Dirichlet Allocation analysis of tweets’ semantics, comparing Russian trolls with a baseline of random Twitter users. (Source: “Disinformation Warfare: Understanding State-Sponsored Trolls on Twitter and Their Influence on the Web”)
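For readers curious how such a comparison is produced, the sketch below shows the general shape of a Latent Dirichlet Allocation topic analysis using scikit-learn. The four toy tweets are made up for illustration; the study itself fit topics over the full troll and baseline corpora and compared the resulting term lists.

```python
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

# Toy stand-ins for a tweet corpus (troll or baseline).
tweets = [
    "breaking news election fraud claims spread online",
    "watch this cute cat video from my weekend",
    "protest downtown turns violent police respond",
    "new coffee shop opened downtown great espresso",
]

# Bag-of-words counts, then a small topic model over them.
vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(tweets)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)

# Print the top terms per topic -- the kind of term lists the figure compares
# between Russian trolls and random Twitter users.
terms = vectorizer.get_feature_names_out()
for i, topic in enumerate(lda.components_):
    top = [terms[j] for j in topic.argsort()[-5:][::-1]]
    print(f"topic {i}: {', '.join(top)}")
```

Running the same fit separately on troll tweets and on a random baseline, then comparing the dominant terms, reproduces the comparison in outline.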

Return on Investment

So why would the Russian government sanction disinformation campaigns via Twitter if they had negligible impact?

The researchers say the apparently limited influence could relate to their only studying 1,000 troll accounts – a very small sample. But another likely explanation is simply that trolls’ goals are more indirect.

“Another, more plausible explanation is that the troll accounts are just not terribly efficient at spreading news, and instead are more concerned with causing havoc by pushing ideas, engaging other users or even taking both sides of controversial online discussions,” the researchers write.

Bolstering that theory: Twitter recently reported that it’s discovered at least 50,000 automated troll accounts, which may be much better at sending people to specific URLs, the researchers say, adding that they hope to see more sophisticated measurement techniques developed.

Influence is Tricky

Alan Woodward, a professor of computer science at the University of Surrey, says that demonstrating the scale of trolling – as this paper does – is one thing; translating that scale into a measure of how people’s opinions may have been swayed is far harder.

“It is notoriously difficult to measure – and hence prove – influence,” he says. “We all like to think we are more intelligent than that.”

Counterpoint: Billions of dollars are spent every year by businesses that want to influence which laundry detergent, fast-food restaurant or vacuum cleaner consumers prefer.

Psychological Warfare

The very fact that the Kremlin sponsors troll farms suggests they do serve a purpose. “The Russians would not persist if they didn’t think it had some benefit to them, even if that is to sow confusion,” Woodward says. “It’s also interesting that ‘western’ countries are setting up psychological warfare units that specialize in online social media.”

The United Kingdom, for example, launched its 77th Brigade – motto: “Influence and Outreach” – in 2015. The same year, the EU launched a rapid-response team within the European External Action Service designed to counter disinformation campaigns.

Woodward likens the influence of foreign powers to the days of newspaper barons, when “owners of newspapers could sway opinions through editorial control.” But whereas newspapers had owners and mastheads, social media can make it much tougher to identify who’s behind messaging that can operate at a heretofore unseen scale.

Arguably, today’s stakes are also much higher than ever. “At the very least, I think that foreign powers can cause a loss of trust and sow doubt about the effectiveness, relevance and so on of a country’s government, and that has to build a picture in the minds of swing voters,” Woodward says. “At worst it could bring the whole concept of democracy into disrepute.”

Trump Administration Declines New Russia Sanctions

Given the threat posed by Russian information warfare, many observers continue to ask: What will the United States do to attempt to deter future Russian meddling in U.S. elections?

On Monday, the Trump administration announced that it will not sanction Russia, as required by a new U.S. law meant to punish Russia for its interference in the 2016 U.S. Presidential election.

“Today, we have informed Congress that this legislation and its implementation are deterring Russian defense sales,” State Department spokeswoman Heather Nauert said in a statement released on Monday. “Since the enactment of the … legislation, we estimate that foreign governments have abandoned planned or announced purchases of several billion dollars in Russian defense acquisitions.”

The “Countering America’s Adversaries Through Sanctions Act,” or CAATSA, cleared Congress last August and was signed into law by President Trump, even though he described it as “deeply flawed.”

The passage of the law also prompted criticism from Russia, with Prime Minister Dmitry Medvedev saying it signaled a “full-scale trade war” against Russia.

The law requires the Trump administration, as of Monday, to impose at least five out of 12 sanctions specified in section 235 of CAATSA on anyone determined to engage “in a significant transaction” with anyone who’s part of Russia’s defense or intelligence sectors.

While the White House initially rebuffed the law’s requirements, later on Monday, the administration acceded somewhat to the law’s demands by issuing a list of 114 Russian politicians and 96 oligarchs – some close to Putin – in what’s informally known as the “Putin list.”

Some of the individuals on that list are already subject to U.S. sanctions. But it’s not clear if more individuals on the list might be sanctioned, or if the list’s purpose is simply to “name and shame” them.

The U.S. Treasury, for example, notes that the list “is not a sanctions list, and the inclusion of individuals or entities … does not and in no way should be interpreted to impose sanctions on those individuals or entities.”

But if the Trump administration does not attempt to exact a political or financial price for Russia’s continuing attempt to meddle in U.S. political affairs, it’s unclear whether the Kremlin will have any incentive to cease its U.S.-focused information warfare campaigns.
