Is the Internet Destroying Scholarship? by Michael Noll


Is the Internet Destroying Scholarship?

A. Michael Noll

April 4, 2016

© 2016 AMN

Has the Internet created a batch of scholars who simply cruise the Internet for references and citations for their research and mostly ignore old-fashioned libraries and archives? Is the past before the Internet, and before everything went online, now forgotten and forever lost? Is the Internet destroying scholarship?

A student paper included what it claimed was a picture of me. It was not – it was a picture from Google Images that was incorrectly presented as me. Actually, it was a photo that I had taken of my boss at Bell Labs in the 1960s. “Cut and paste” is used by many students, and some scholars – and it is not good scholarship.

Today it is too easy and tempting just to search Google for a few key words – and then credit whatever turns up in the top five listings. But only material accessible over the Internet is indexed and listed. Anything from before about 1990 is mostly not available on the Internet – and thus not indexed. It is as if the past before 1990 never existed.

A few years ago, I was in the basement of a library looking for an old book. I found it, and also other old books on similar topics that I did not know existed. This is the serendipitous nature of looking through the stacks of paper books. It also applies to looking through journals for a particular paper, and then discovering others of interest.

A scholar writing a paper on a topic knew of a previously published paper on an almost identical topic, but deliberately ignored it, claiming that the “new” paper would be better. Another scholar published a recent paper that failed to mention my published papers on virtually the same topic. The scholar did not know of my papers – they were published before 1990. However, a Google key-word search listed one of my early papers, along with its abstract. I guess these are examples of avoidance of the past.

But is it ignorance and avoidance of the past, or simply sloppy scholarship in general? Still, some archives at great libraries remain busy with scholars accessing, examining, and studying papers and books from the past. Paper lasts for centuries. Will digital bits last much longer than a few decades as digital media become obsolete, like the floppy disks of the 1970s?

It seems that as I get older, scholarship declines further. Is this just because the past was “those good old days,” or is good scholarship truly in decline? Are universities holding their students – and faculty – to the highest standards? Or are these the days of “mass” everything – even mass scholarship, done in a hurry to please the mass academic audience? Is thorough research just too much time and effort?

A. Michael Noll



Laura DeNardis on Internet Governance


Professor Laura DeNardis gave a Quello Lecture in Washington DC that updates her perspectives on the key issues facing what she refers to as the ‘destabilization’ of Internet governance. Laura is one of the world’s leading authorities on Internet policy and governance, and this video enables you to see why.

Laura DeNardis – The Destabilization of Internet Governance from Quello Center on Vimeo.

Laura was welcomed to the Quello Lecture by the Dean of MSU’s College of Communication Arts & Sciences, Professor Prabu David, and the College’s Director of Development, Meredith Jagutis.

Bill Dutton and Laura DeNardis


Laura DeNardis with Dean Prabu David and Meredith Jagutis



Quello Talk on ‘Cybercrime Offending and Victimization’, by Tom Holt


Identifying Risk Factors Associated with Cybercrime Offending and Victimization

Tom Holt presented an informative talk on his research into who is most likely to be a victim of cybercrime, and who is most likely to be an offender. Given the many scary stories in the media, you might find this a useful presentation to view. In general, he finds a number of common patterns that conform with key patterns in the real, offline world, such as the centrality of peer influence. He empirically examines how well traditional criminological theories account for involvement in various forms of cybercrime and deviance, as well as the risk of person- and property-based forms of cybercrime victimization, using various data sources. The findings demonstrate that offending is partially learned through social interactions with intimate peers, as well as through latent individual traits such as impulsivity. These same factors also disproportionately increase the risk of victimization, creating challenges for policy-makers and parents seeking to deal with inappropriate behaviors.

Thomas J. Holt – Identifying Risk Factors Associated with Cybercrime Offending and Victimization from Quello Center on Vimeo.

Dr. Thomas J. Holt is an Associate Professor in the School of Criminal Justice at Michigan State University specializing in cybercrime, cyberterror, and policy. He received his Ph.D. in Criminology and Criminal Justice from the University of Missouri-Saint Louis in 2005. He has published extensively on cybercrime and cyberterror, with over 40 peer-reviewed articles in outlets such as the British Journal of Criminology, Crime and Delinquency, and the Journal of Criminal Justice. Dr. Holt has co-authored multiple books, including Cybercrime and Digital Forensics: An Introduction (Routledge) and Policing Cybercrime and Cyberterror (Carolina Academic Press). He has also given multiple presentations on cybercrime and hacking at academic and professional conferences around the world, as well as at hacker conferences across the country, including Defcon and HOPE.

His recent work on social media gained media coverage over a finding that 1 in 4 children are sexually harassed online – by their own friends. We hope you can join this informal noon brown bag seminar.



‘3D Yet Again’ by A. Michael Noll


‘3D Yet Again’

A. Michael Noll

November 26, 2015

© 2015 AMN

Stereoscopic 3D has always created a strong fascination with its feelings of depth and realism. Today it has morphed into the hype of something called “holographic enhanced virtual reality.”

As a child, I had both a Tru-View stereoscope that used 35mm filmstrips and also a View-Master™ stereoscope that used small images on disks. They both presented separate pictures for the left and right eyes that created the feeling of stereoscopic depth.

While employed at Bell Telephone Laboratories in New Jersey during the 1960s, I programmed the computer there to create stereoscopic pairs of random shapes, a form of virtual sculpture. I used various stereoscopes to view the 3D pairs, such as the cardboard viewer shown in the Figure. I also made computer-animated stereoscopic movies, such as a computer-generated ballet, random kinetic objects, and four-dimensional hyper objects. [Noll, A. Michael, “Computer-Generated Three-Dimensional Movies,” Computers and Automation, Vol. 14, No. 11, (November 1965), pp. 20-23.] The 3D movie of the 4D hypercube can be seen at https://www.youtube.com/watch?v=iXYXuHVTS_k. Simple cross-eyed viewing will give the 3D effect without the need for any viewing device, although some practice is required. We who did research on 3D learned how to relax or cross our eyes to see 3D pairs without the use of any viewing device.
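The depth effect in all of these systems comes from giving each eye its own perspective projection of the same scene. Below is a minimal sketch, in Python, of generating such a side-by-side pair from a random three-dimensional “wire sculpture.” It is not the original Bell Labs program; the eye offset, viewing distance, and use of numpy and matplotlib are all illustrative assumptions.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
points = rng.uniform(-1.0, 1.0, size=(12, 3))   # random 3D polyline: a "virtual sculpture"

def project(pts, eye_x, viewer_dist=4.0):
    # Simple perspective projection from an eye offset horizontally by eye_x.
    x, y, z = pts[:, 0], pts[:, 1], pts[:, 2]
    scale = viewer_dist / (viewer_dist + z)      # nearer points appear larger
    return (x - eye_x) * scale, y * scale

fig, axes = plt.subplots(1, 2, figsize=(8, 4))
# Right-eye view on the left, left-eye view on the right, suited for cross-eyed viewing.
for ax, eye_x in zip(axes, (+0.1, -0.1)):
    px, py = project(points, eye_x)
    ax.plot(px, py, "k-")
    ax.set_aspect("equal")
    ax.axis("off")
plt.show()

Swapping which projection appears on the left turns this cross-eyed pair into one suited for a stereoscope of the kind described above.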

Decades ago, it was suggested to use half-silvered mirrors so that the computer-generated imagery could be superimposed on reality. When the head moved, the computer-generated images would be suitably changed so that virtual shapes and objects would be seen in real settings. It was even suggested back then to couple this 3D imagery with tactile sensation so that virtual objects could be seen and felt in real settings. [Noll, A. Michael, “Man-Machine Tactile Communication,” SID Journal (The Official Journal of the Society for Information Display), Vol. 1, No. 2, (July/August 1972), pp. 5-11.] Prototypes were invented and built, but applications were not clear – and the technology was massive and complex.

Google is promoting its 3D viewer – called Google Cardboard. It has two lenses to view separate stereoscopic images on a smart phone. It is little more than the Tru-View, 3D Mail-O-Vue, and View-Master of the distant past.

The term “holographic” is even being used to describe today’s 3D imagery. But the images are not holograms at all – they are just simple 3D stereographic images and technology from decades ago.

Some “new” devices present separate images to each eye from two small screens mounted in some form of viewer that is attached to one’s head. But even this is not new: such technology was used in the 1960s for helicopter pilots to see the ground under them. [Upton, H. W., “Head-mounted displays in helicopter operations,” USAECOM-AAA-ION Technical Symposium on Navigation and Positioning, Fort Monmouth NJ, September 1969.] Head-mounted displays were also used for computer graphic display. What is “new” today is the ultra-miniaturization of the technology, along with motion and position sensors, and vast computing power that was unimaginable decades ago. But what applications of all this 3D technology will excite consumers?

Figure. Photo of Stereo Mail-O-Vue viewer. This cardboard foldable 3D viewer was used for seeing 3D stereoscopic images on a 35mm filmstrip. [Photo courtesy of A. Michael Noll.]



The Importance of Public Service #ChangeAgents by David A. Bray


Here is the Webcast of Dr David A. Bray’s Quello Lecture on ‘The Importance of Public Service #ChangeAgents in Exponential Times’, which was given at MSU’s Quello Center on 21 September 2015.

David Bray – The Importance of Public Service #ChangeAgents in Exponential Times from Quello Center on Vimeo.

Abstract

Technology is rapidly changing our world: the 7 billion networked devices of 2013 will double to 14 billion in 2015, and grow to anywhere between 50 and 200 billion by 2020. The ability to work and collaborate securely anywhere, anytime, on any device will reshape public service. We must ensure that security and privacy are baked in at the code-development level, with testing from the ground up and automated alerts. Legal code and digital code must work together, enabling more inclusive work across government workers, citizen-led contributions, and public-private partnerships. All together, these actions will transform Public Service to truly be “We the (Mobile, Data-Enabled, Collaborative) People” working to improve our world.

Dr. David A. Bray is a 2015 Eisenhower Fellow, Visiting Associate on Cyber Security with the University of Oxford, and Chief Information Officer for the Federal Communications Commission.

He began working for the U.S. government at age 15 on computer simulations at a Department of Energy facility. In later roles he designed new telemedicine interfaces and space-based forest fire forecasting prototypes for the Department of Defense. From 1998-2000 he volunteered as an occasional crew lead with Habitat for Humanity International in the Philippines, Honduras, Romania, and Nepal while also working as a project manager with Yahoo! and a Microsoft partner firm. He then joined the Bioterrorism Preparedness and Response Program at the U.S. Centers for Disease Control and Prevention as its IT Chief, leading the program’s technology response during 9/11, the anthrax attacks of 2001, Severe Acute Respiratory Syndrome (SARS) in 2003, and other international public health emergencies. He later completed a PhD in Information Systems at Emory University and two post-doctoral associateships at MIT and Harvard in 2008.

In 2009, Dr. Bray volunteered to deploy to Afghanistan to help “think differently” on military and humanitarian issues and in 2010 became a Senior National Intelligence Service Executive advocating for increased information interoperability, cybersecurity, and protection of civil liberties. In 2012, Dr. Bray became the Executive Director for the bipartisan National Commission for Review of Research and Development Programs of the United States Intelligence Community, later receiving the National Intelligence Exceptional Achievement Medal. He received both the Arthur S. Flemming Award and Roger W. Jones Award for Executive Leadership in 2013. He also was chosen to be an Eisenhower Fellow to meet with leaders in Taiwan and Australia on multisector cyber strategies for the “Internet of Everything” in 2015.

Dr. Bray has served as the Chief Information Officer for the Federal Communications Commission, leading FCC’s IT Transformation since 2013. He was selected to serve as a member of the Council on Foreign Relations and as a Visiting Associate for the Cybersecurity Working Group on Culture at the University of Oxford in 2014. He also has been named one of the “Fed 100” for 2015 and the “Most Social CIO” globally for 2015, tweeting as @fcc_cio.

Discussion of this talk is also available online at:

David Bray – The Importance of Public Service #ChangeAgents in Exponential Times – Discussion from Quello Center on Vimeo.



The Evil Web?


THE EVIL WEB

A. Michael Noll

September 26, 2015

© 2015 AMN, blogged with the permission of the author.

A. Michael Noll

The Internet has become a dangerous and evil place. The Web is today’s electronic wild west, rife with piracy of copyrighted material, identity theft, privacy invasion, and vast amounts of spam – to list just some of its evils.

In early 2012, Federal authorities went after a Web site that was pirating copyrighted material. In retaliation for the closing of the site and the criminal charges, hackers attacked the Web sites of Federal agencies.

Anyone who purchases stolen property is committing a crime. But it is not just copyrighted videos and music that are being stolen, much to the anguish of Hollywood and the music industry. Academics obtain PDF files of textbooks and make them available at university websites so their students do not have to purchase the books, in effect robbing authors and publishers of royalties and income.

Computer and Internet security are big issues today. Web sites are penetrated, and personal information is stolen, leading to credit card fraud and identity theft. Over a weekend in mid-January 2012, the online shoe retailer Zappos was hacked, and millions of customers’ information was compromised. In 2007, Alcatel-Lucent somehow lost a data disk containing personal information about all its pensioners. Viruses and spoofing all contribute to making the Internet a dangerous place.

In most cases, businesses that are hacked or that misplace disks clearly have not taken adequate security precautions. Consumers need protection – legal and technological — from the evils of the Internet and the storage of electronic information.

Decades ago I worked on computer security and privacy issues on the staff of the White House Science Advisor. I learned then that the best way to keep information secure was not to make it available over any kind of network. The best firewall is a disconnected plug. But if information had to be made available, and then in as few cases as possible, encryption was the best form of protection. There also had to be a need to know. Somehow all this advice seems to have been forgotten and ignored by many Internet sites.

There are other sensible protections. Customers should be given an option as to whether personal information is stored or not. The personal information that is stored should be on a separate computer that is not accessible over the Internet. All information – not just credit card information – clearly should be encrypted, with passwords and keys strongly protected. Audit trails are needed so that any penetration can be quickly determined and documented.
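As a concrete illustration of two of these protections, here is a minimal sketch, in Python, of encrypting a customer record before it is stored and appending an audit-trail entry for each access. The library (the third-party cryptography package), the field names, and the file paths are illustrative assumptions, not a description of any particular site’s system.

import json
import datetime
from cryptography.fernet import Fernet   # third-party 'cryptography' package

# In practice the key would live in a protected key store, never beside the data.
key = Fernet.generate_key()
cipher = Fernet(key)

record = {"name": "Jane Doe", "card": "4111-1111-1111-1111"}  # hypothetical customer record
stored_blob = cipher.encrypt(json.dumps(record).encode())     # only ciphertext is written to storage

def audit(event, who):
    # Append a timestamped entry so any access or penetration can be reconstructed later.
    with open("audit.log", "a") as log:
        log.write(f"{datetime.datetime.utcnow().isoformat()} {who} {event}\n")

audit("record_encrypted_and_stored", who="order-system")
audit("record_read", who="billing-system")
restored = json.loads(cipher.decrypt(stored_blob).decode())   # a legitimate, logged read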

Today’s Internet crooks work from home or cozy offices – hacking their way into various web sites, spoofing legitimate web sites, stealing identities, pirating copyrighted material, and spamming the universe in promotion of whatever they are selling. And since each crime and each few bits of information seem insignificant, the Internet crooks get away with it. Meanwhile, at the slightest mention of any controls, the Internet community pleads for keeping the Internet free and open.

In mid January 2012, the Internet community – led by Google – mounted a massive campaign against the legislation that would have placed some limits on the Web. The claim was made that any such legislation would be censorship. However, Google and other search sites routinely determine the order of listings – and even what sites are listed – in effect, acting as the censors.

So what all the hoopla is really about is who should set the terms of censorship – industry (which is guided solely by making a profit) or the government (which might more likely be guided by protecting the public and intellectual property). A solution would be for search engines and Internet service providers to offer users the option of imposing censorship, and setting its terms, on the sites they see.

In the end in 2012, Congress caved to all the pressure from the liberal Internet community – the White House had already fallen under the influence of Silicon Valley. And so any legislative protection died – and the Internet remains free and open – a lawless and dangerous place.

The Internet and Web are no longer new and innovative – electronic information, data communication, and the packet switching of the Internet are all relatively mature technologies that have been available for decades. If the Internet and Web community are not able to police and control themselves, then the only other option is government control and policing. Hollywood learned long ago that it was far better for it to police itself than suffer government regulation. One option would be for Internet access providers to offer a censored and protected level of service.

It is clear that the authorities do not seem willing – or able – to do much to stop all the evils of the Internet. Perhaps the time has come for a group of Internet vigilantes to patrol cyberspace to protect copyright, eliminate spam, and attack the servers of the Internet spammers and crooks.

A. Michael Noll is Professor Emeritus of Communication at the Annenberg School at the University of Southern California, and a Quello Research Associate.



Delighted to Host David Bray, CIO of the FCC


The Quello Center is very pleased to host a visit to MSU by Dr David Bray, the CIO of the FCC and a recent recipient of an Eisenhower Fellowship. He will be speaking at the College of Communication Arts & Sciences on Monday, 21 September 2015, in Room 191 at 3pm, giving one of this year’s Quello Lectures.

Dr David Bray

David has spoken recently on related topics, such as how to reshape public service IT for the new digital era. His talk on Monday promises to be of special value to students considering careers in the public service. The title of his talk is ‘The Importance of Public Service #ChangeAgents in Exponential Times’. More information about David and his talk is available on our event site at: http://quello.msu.edu/event/changeagents-and-public-service-in-the-digital-age-by-david-bray/?instance_id=288

Join us at 3pm.



Rural Access to Broadband: the Case in Britain Shines Light on a Pattern


In Britain, a growing gap between urban and rural Internet speeds is damaging business, adding to farming costs, driving young people away from areas in which they have grown up, and deterring retirees from moving to some areas of the country. These are some of the conclusions of our in-depth academic study of Internet access that Bill Dutton, Director of the Quello Center, conducted with the dot.rural RCUK Digital Economy Research Hub at the University of Aberdeen, and the Oxford Internet Institute, at the University of Oxford.

The report has been published, entitled ‘Two-Speed Britain: Rural Internet Use’. It is based on the most detailed survey so far of rural Internet users. By looking separately at ‘deep rural’ (remote), ‘shallow rural’ (less remote) and urban Internet users, the project was able to reveal the true nature of a rural divide. The report is available online at: http://ssrn.com/abstract=2645771

Specifically, Bill and his colleagues found that while in urban areas just six per cent of those sampled had an average broadband speed below 6.3 Mbits/sec, in deep rural areas 45% of people were unable to achieve this modest speed. The lead researcher for dot.rural, Professor John Farrington of the University of Aberdeen and lead author of the report, said that these findings indicated the scale of the problem for deep rural areas in particular, and that the digital gap is currently widening rather than closing.

“The broadband speed gap between urban and especially deep rural areas is widening: it will begin to narrow as superfast reaches more rural areas, but better-connected, mostly urban, areas will also increase speeds at a high rate. This means faster areas will probably continue to get faster more quickly, with slow-speed areas left lagging behind.

“There is a growing social and economic gap between those who are connected and those who are not, the ‘digitally excluded’,” he said.

“It is generally seen in differences between deep (remote) rural Internet use on the one hand, and shallow (less remote) rural and urban Internet use on the other hand.

“It is most pronounced in upland areas in Scotland, Wales and England, but also in many areas in lowland rural Britain. It affects 1.3 million people in deep rural Britain, and many more in less remote areas with poor Internet connections: 9.2 million people live in shallow rural areas.

“Rural businesses are penalised because they are unable to take advantage of the commercial efficiencies afforded by the Internet, as in the creative industries, or have to resort to the use of paper systems which are more costly, as in the farming sector where there is a push to move administration such as sheep registrations online.

“All these issues can potentially create a new tipping point for digitally poorly connected rural areas, including: losing businesses; adding to farming’s costs; making out-migration more likely for young people; and in-migration less likely for retirees or the economically active.

Professor Farrington added that the issue needed to be addressed if the UK Government agenda of ‘Digital by default’, with government services being delivered online, is to be achieved.

“There is a drive to make public services ranging from registering to vote to applying for a visa or making a tax return digital by default, and simpler, clearer and faster to use.

“Based on the findings of our report, this can’t be achieved until better connection is universal. The ‘universal’ broadband target of 2 Mbits/sec will be inadequate to fulfil this aim.

“An element of policy should be to improve the interface between public, private and community efforts in improving deep rural broadband speeds.”

As one of the authors, and one of the principal researchers in the conduct of the Oxford Internet Surveys (OxIS), I noted that:

“This deep rural divide is not new, but it has been invisible in the statistics until now. With a specially designed sample in 2013, we have been able to uncover this divide and see it in the data. A major investment in OxIS has paid off.”

In my opinion, this helps explain the failure of many other studies to find the rural divide in the data gathered by survey researchers. First, we required a disproportionate stratified sample in order to obtain a sufficient number of deep rural residents. It took us years to find the support for this boosted sample, and it would not have been possible without the collaboration with the Aberdeen dot.rural project. Second, the urban-rural divide was masked by the fact that shallow rural residents often have better connectivity than many urban users. Since we had a large enough rural sample, we were able to disaggregate shallow and deep rural residents and see the divide in the data.
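To make the masking point concrete, here is a minimal sketch in Python, using made-up illustrative numbers rather than OxIS data, of how pooling shallow and deep rural respondents can hide the deep rural deficit that appears once the strata are separated.

import pandas as pd

# Hypothetical survey extract: reported download speed (Mbit/s) by stratum.
survey = pd.DataFrame({
    "stratum": ["urban"] * 4 + ["shallow_rural"] * 4 + ["deep_rural"] * 4,
    "speed":   [12, 10, 15, 8,   20, 24, 18, 22,   1.5, 3, 2, 4],
})

# Pooling all rural respondents makes 'rural' look similar to 'urban'...
survey["rural"] = survey["stratum"] != "urban"
print(survey.groupby("rural")["speed"].mean())

# ...while keeping the deep/shallow distinction reveals the deep rural deficit.
print(survey.groupby("stratum")["speed"].mean())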

This pattern could be the case in many other nations, so I hope researchers in the US and worldwide take notice of these findings in Britain, including England, Wales and Scotland. Moreover, the report provides an array of qualitative examples to help see the role of rural divides not just in the statistics but also in the lives of rural residents.


This most detailed survey so far of rural Internet users refines many popular notions of the urban-rural digital divide and provides more detailed evidence of the impact of this divide. By looking separately at ‘deep rural’ (remote), ‘shallow rural’ (less remote) and urban Internet users, we are able to highlight the true nature of this divide.

The online behaviour of those living and working in deep and shallow rural areas reflects constraints on Internet connectivity – the effects of which include an overall limitation on what people are able to do online compared with what they want to do. Those residing in deep rural areas are most likely to be unserved or underserved (with speeds of less than 2.2Mbit/s) by broadband connectivity and are less likely than others in Britain to be able to engage online.

Ofcom’s mobile telecommunications data, reported at local authority level, shows that mobile Internet (3G and 4G) access in many rural areas remains limited or non-existent, and is not a feasible alternative means of connectivity for those without fixed broadband serving their home or business premises.

See the report at: http://ssrn.com/abstract=2645771



Three New Positions in Media and Information at MSU


The Department of Media and Information at MSU is recruiting for three tenure-track positions. They are in the areas of:
– media/information theory/research http://bit.ly/cas-theory
– Internet economics http://bit.ly/cas-ie
– health and data science http://bit.ly/cas-data
Moreover, these are three of 15 academic positions opened across the College of Communication Arts & Sciences. See: http://cas.msu.edu/places/cas-deans-office/jobs/

Please let colleagues know of these positions, and please consider whether one of them might fit your own career plans.

Regards,

Bill Dutton



Informing Voters in the Digital Age


Using the Media, Internet and Debates to Inform Voters: A Series of Blogs

Bill Dutton of the Quello Center, and Tracy Westen, founder of The Democracy Network and founder and CEO of The Center for Governmental Studies, have posted a series of blogs that take a critical look at the way in which the GOP primary debates have been handled by Fox News and Facebook. Reflecting on the challenges of televised and Internet-orchestrated debates, they offer suggestions for combining media to improve the ways in which voters can obtain information about the issue positions, personalities, and endorsements of candidates.

In the run-up to the GOP primary debate broadcast by Fox News and Facebook, Bill Dutton posted a critical blog entitled ‘Stop the Televised Debates and Shift to the Internet’. See: http://billdutton.me/2015/07/23/stop-the-televised-debates-and-shift-to-the-internet/

In response to Bill’s blog, Tracy Westen provided an alternative vision of a more voter-centric debate scheme. His blog is entitled ‘Envision Voters Staging Their Own Candidate Debates: a Comment from Tracy Westen on the Televised Debates for the Republican Party’. http://billdutton.me/2015/07/25/envision-voters-staging-their-own-candidate-debates-a-comment-from-tracy-westen-on-the-televised-debates-for-the-republican-party/

After critiquing the first Fox News-Facebook debate, Tracy and Bill focused on the reasons why debates have failed to use the Internet more effectively. Their post, ‘A Dirty Dozen: 12 Reasons Candidates and Networks Fail to Move Presidential Debates Online’, addresses key problems, and argues that some of these reasons will make progress quite difficult unless a new scheme can be developed. See: http://billdutton.me/2015/07/31/a-dirty-dozen-12-reasons-candidates-and-networks-fail-to-move-presidential-debates-online-by-tracy-westen-and-bill-dutton/

Tracy Westen’s post followed with ‘More Challenges to Informing Voters Online: Lessons Learned’ http://billdutton.me/2015/08/01/more-challenges-to-informing-voters-online-lessons-learned-by-tracy-westen/

These were followed by a blog entitled ‘Grading the Fox News-Facebook GOP Presidential Debate Spectacle’, which provided criteria for grading the debates and led Bill and Tracy to give the Fox News-Facebook debate a D+. http://billdutton.me/2015/08/09/grading-the-fox-news-facebook-gop-debate-spectacle-by-bill-dutton-and-tracy-westen/

The final post looked at ways to move ahead and improve the way in which the media can use the Internet and social media to provide a better platform for informing voters. In this post, entitled ‘A New Approach to Presidential Debates’, Tracy and Bill outlined the steps involved in creating a wider range of information about all the candidates and key issues in elections. See: http://billdutton.me/2015/08/12/multimedia-convergence-a-new-approach-to-presidential-debates-by-tracy-westen-and-bill-dutton/
