Open Medicine, Vol 5, No 4 (2011)

Research
Collaborative authoring: a case study of the use of a wiki as a tool to keep systematic reviews up to date
Jacqueline L Bender, Laura A O’Grady, Amol Deshpande, Andrea A Cortinois, Luis Saffie, Don Husereau, Alejandro R Jadad
ABSTRACT

Background: Systematic reviews are recognized as the most effective means of summarizing research evidence. However, they are limited by the time and effort required to keep them up to date. Wikis present a unique opportunity to facilitate collaboration among many authors. The purpose of this study was to examine the use of a wiki as an online collaborative tool for the updating of a type of systematic review known as a scoping review.

Methods: A previously published, peer-reviewed scoping review on asynchronous telehealth was posted on an open, publicly available wiki. Log file analysis, user questionnaires and content analysis were used to collect descriptive and evaluative data on the use of the site from 9 June 2009 to 10 April 2010. Blog postings from referring sites were also analyzed.

Results: During the 10-month study period, there were a total of 1222 visits to the site, 3996 page views and 875 unique visitors from around the globe. Five unique visitors (0.6% of the total number of visitors) submitted a total of 6 contributions to the site: 3 contributions were made to the article itself, and 3 to the discussion pages. None of the contributions enhanced the evidence base of the scoping review. The commentary about the project in the blogosphere was positive, tempered with some skepticism.

Interpretation: Despite the fact that wikis provide an easy-to-use, free and powerful means of editing information, fewer than 1% of visitors contributed content to the wiki. These results may be a function of limited interest in the topic area, the review methodology itself, lack of familiarity with the wiki, and the incentive structure of academic publishing. Controversial and timely topics, along with incentives and organizational support for Web 2.0 impact metrics, might motivate greater participation in online collaborative efforts to keep scientific knowledge up to date.

Affiliations at the time of writing: Jacqueline L. Bender, MSc, PhD (ABD), is a Doctoral Student at the Dalla Lana School of Public Health, University of Toronto, with the People, Health Equity and Innovation Research Group (Phi Group) at the Centre for Global eHealth Innovation, University of Toronto and University Health Network, Toronto, Ontario, Canada. Laura O’Grady, PhD, is a Postdoctoral Fellow with the People, Health Equity and Innovation Research Group (Phi Group) at the Centre for Global eHealth Innovation, University Health Network and the Department of Health Policy, Management and Evaluation, University of Toronto. Amol Deshpande, MD, MBA, is a Physician with the Comprehensive Pain Program, University Health Network, Toronto, Ontario, Canada. Andrea A. Cortinois, MPH, PhD, is a Research Associate with the People, Health Equity and Innovation Research Group (Phi Group) of the Centre for Global eHealth Innovation, University of Toronto and University Health Network, Toronto, Ontario, Canada. Luis Saffie, BSc, is a Systems Administrator/Technical Specialist with the Centre for Global eHealth Innovation, University of Toronto and University Health Network, Toronto, Ontario, Canada. Donald Husereau, BScPharm, MSc, is an Adjunct Professor in the Department of Epidemiology and Community Medicine, University of Ottawa, Ottawa, Ontario, Canada. Alejandro R Jadad, MD, DPhil, FRCPC, FCAHS, is the Canada Research Chair in eHealth Innovation; Chief Innovator and Founder at the Centre for Global eHealth Innovation, University of Toronto and University Health Network; and Professor at the Dalla Lana School of Public Health and in the Departments of Health Policy, Management and Evaluation, and Anesthesia, University of Toronto, Toronto, Ontario, Canada.
Competing interests: None declared.
Contributors: ARJ and AD conceived the project. All authors contributed to the design of the project. JLB and LS implemented the project, monitored the wiki, and collected the data. JLB analyzed the data and wrote the first draft of the manuscript. All authors contributed to the manuscript and approved the final version. JLB is guarantor.

Funding: This study was funded by CADTH, the Canadian Agency for Drugs and Technologies in Health. The funding agency did not influence the study design or the analysis and preparation of the manuscript.
Correspondence: Jacqueline L. Bender, Centre for Global eHealth Innovation, R. Fraser Elliott Building, 4th floor, Toronto General Hospital, 190 Elizabeth St., Toronto, ON M5G 2C4; jbender@ehealthinnovation.org

Systematic reviews have become the most effective means of identifying, selecting, assessing and synthesizing all original research evidence relevant to a specific question. Most systematic reviews have finite life spans, with a median survival (time from publication until availability of new information that potentially changes the effect size or direction) of 5.5 years; it has been estimated that, for 23% of systematic reviews, new information is available within 2 years of publication.1 Despite the need to keep reviews current, only 17.7% of systematic reviews seem to be cited as updated versions of previously published reviews.1,2

Given its open, transparent nature and the availability of free software for online collaboration, the Internet presents a unique opportunity to explore new ways to update and publish academic research. Wikis, in particular, are efficient tools for co-creating, maintaining and making widely available repositories of knowledge. In general, a wiki allows anyone to easily add to, edit or delete the content of a website. The wiki application keeps track of all changes and offers the ability to acknowledge author contributions. The suggested benefits of using a wiki for collaborative authoring include the ability to track authorship, monitor the development of an article, reduce conflicts of interest and update content swiftly, thus providing a current evidence resource.3-5

Wikis have been used very successfully to co-create and maintain large amounts of general-interest content. A case in point is Wikipedia, the largest encyclopedia in the world. With over 3.5 million articles in English alone, each edited an average of 19 times, and over 13 million registered users,6 this free resource overtook all other encyclopedias in size7 and possibly even the Encyclopedia Britannica in quality, in less than 5 years.8 Wikipedia, which contains over 20 000 health-related articles, has become a popular health resource for the general public and the medical community.5,9 The top 200 medical articles on Wikipedia receive, on average, over 100 000 page views per month,10 and studies suggest that 50% to 70% of practising physicians use Wikipedia as a resource when providing care.5,11,12

Although Wikipedia is by far the largest and most popular collaboratively authored reference website, there are now over 70 health-related wikis of varying specialty and depth.13 Some serve as a general medical reference (e.g., Ganfyd.org) or subspecialty resource (e.g., Wikisurgery.org) for health professionals, while others are intended as a shared knowledge base for people with specific health concerns or conditions (e.g., Wikicancer.org). Research is under way to explore the utility of wikis as a knowledge base and a pedagogical tool.14,15 To the best of our knowledge, however, there have been no published research efforts to explore the role that wikis could play in updating and maintaining peer-reviewed systematic reviews of health interventions. This study was designed to begin to address this gap.

The purpose of this study was to examine the use of a wiki as an online collaborative tool for updating a peer-reviewed scoping review. A scoping review is a type of systematic review that provides a comprehensive summary of the existing literature or evidence base on a broad topic.16,17 Like traditional systematic reviews, scoping reviews follow rigorous and transparent methods to identify all relevant literature.16 Unlike a full systematic review, however, they typically do not provide a quality assessment of included studies or extensive data synthesis.17 A peer-reviewed scoping review was adapted for publication on an open, public wiki. This article describes the characteristics of the users, their use of the site, and the nature, quality and quantity of their contributions, as well as the “buzz” that accumulated on the Internet during the study period.

Methods

The open, public wiki was created and made available through the Open Medicine web site from 2 June 2009 to 10 April 2010, under the name Open Medicine wiki.

The wiki was populated with a pre-existing, peer-reviewed scoping review on asynchronous telehealth by Deshpande and colleagues18 that had initially been published by CADTH, in print and online, in January 2008.19 In preparation for its publication on the Open Medicine wiki, the review underwent a second round of peer review by the editorial board of Open Medicine.

The Open Medicine wiki was built with the open-source MediaWiki platform (the same software application used by Wikipedia) to provide a viewing and navigation experience likely to be familiar to visitors. The wiki was hosted and administered by the Centre for Global eHealth Innovation, a joint research initiative of the University of Toronto and the University Health Network in Toronto. It was branded with the Open Medicine logo and linked to the Open Medicine website.

Initially, the Open Medicine wiki was launched as a public, “unrestricted” wiki that anyone could join and in which anyone could view, add, change or delete content. Registration was not required to contribute or modify content on the site, in an effort to lower any perceived barriers to participation. In response to extensive spamming, however, 2 spam filters (ReCAPTCHA and SpamBlacklist) were installed on the site on 24 June 2009, which effectively prevented further inappropriate contributions to the site.

As of 21 July 2009, anyone interested in contributing or modifying content on the site was required to register with a unique log-in ID and password, answer a few basic demographic questions (e.g., age, gender, country of residence, experience with wikis), and provide a competing-interest statement. Registered users also had access to the discussion page of each site page, which served as an open forum for dialogue and interaction with other users of the site. There was no pre-screening of edits or contributions to the wiki. The only changes to the content of the site made by the site administrator (LS) were “roll-backs” to previous versions when the site was spammed with inappropriate and unrelated content.
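
The roll-backs mentioned above were performed by hand through the wiki’s standard interface. Purely as an illustration of what such a roll-back involves, the following sketch shows how the same operation can be scripted against MediaWiki’s action API (api.php). It is a minimal sketch under stated assumptions: the endpoint URL, account credentials, page title and spam account name are hypothetical placeholders, and a reasonably current MediaWiki installation with the action API enabled is assumed; it is not a description of how the Open Medicine wiki was actually administered.

# Illustrative sketch only: reverting spam edits through the MediaWiki
# action API. The endpoint, credentials, page title and user name below
# are hypothetical placeholders, not the Open Medicine wiki's settings.
import requests

API = "https://wiki.example.org/w/api.php"   # hypothetical API endpoint
session = requests.Session()

def get_token(token_type):
    """Fetch a token ("login" or "rollback") from the API."""
    response = session.get(API, params={
        "action": "query", "meta": "tokens",
        "type": token_type, "format": "json",
    })
    return response.json()["query"]["tokens"][token_type + "token"]

# 1. Log in with an administrator (or bot) account.
session.post(API, data={
    "action": "login", "lgname": "AdminBot", "lgpassword": "secret",
    "lgtoken": get_token("login"), "format": "json",
})

# 2. Roll back the most recent consecutive edits made by one user to one page.
result = session.post(API, data={
    "action": "rollback",
    "title": "Asynchronous telehealth",   # page that was spammed
    "user": "SpamAccount123",             # author of the edits to revert
    "summary": "Reverting spam",
    "token": get_token("rollback"),
    "format": "json",
})
print(result.json())

In practice, an administrator would first list the wiki’s recent changes (action=query with list=recentchanges) to identify which accounts and pages to pass to such a script.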

The Open Medicine wiki was promoted by the journal Open Medicine in 2 eBulletins (2 June 2009 and 21 July 2009) and a Rapid Response article (14 August 2009), all of which were published on the Open Medicine site and distributed to members of the journal’s email list. It was also promoted by the Canadian Society of Telehealth (now known as the Canadian Telehealth Forum) in an eBulletin (30 October 2009) published on the society’s site and distributed to members of its email list, and through personal communication with its Research Committee (1 December 2009).

Figure 1. Screen shot of article home page
Data collection

A mixture of quantitative and qualitative methods20 was used to collect descriptive and evaluative data on the use of the site from 9 June 2009 to 10 April 2010.

Google Analytics, a service offered by Google to measure website usage, was used to summarize log file data on how the wiki was used. The following parameters were evaluated:

  • number of site visits (based on unique IP address)
  • number of new visits (first visit from unique IP address)
  • number of unique site visitors (number of distinct IP addresses from which the site was visited)
  • country of domicile of site visitors (where Internet Service Provider is registered)
  • number of total and average page views (specific web site page)
  • source of traffic (referring web site)
  • direct traffic (visitors who typed in the URL of the site in their browser, or accessed the site from a browser bookmark, link in email or documents)
  • bounce rate (proportion of visits in which the visitor viewed only the entry page before leaving the site)

These usage statistics were collected for all visitors to the site, regardless of whether they registered to edit the wiki.
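
Google Analytics computed all of these figures automatically. Purely to make the parameters above concrete, the sketch below shows how comparable metrics (visits, unique visitors, page views and bounce rate) could, in principle, be derived from a raw web server access log. The log file name, the combined log format and the 30-minute session rule are assumptions made for this example; they are not part of the study’s actual data collection.

# Illustrative sketch only: Google Analytics, not this script, produced the
# study's figures. This shows how comparable metrics (visits, unique
# visitors, page views, bounce rate) could be derived from a raw access log.
# Assumptions: combined-format log lines and a 30-minute inactivity window
# separating one visit from the next for the same IP address.
import re
from datetime import datetime, timedelta

LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "(?:GET|POST) (\S+)')
SESSION_GAP = timedelta(minutes=30)

page_views = 0
last_seen = {}        # ip -> timestamp of that ip's previous request
current_visit = {}    # ip -> page views accumulated in the ip's open visit
closed_visits = []    # page-view counts of completed visits

with open("access.log") as log:                 # hypothetical log file name
    for line in log:
        m = LOG_LINE.match(line)
        if not m:
            continue
        ip, ts, _path = m.groups()
        when = datetime.strptime(ts.split()[0], "%d/%b/%Y:%H:%M:%S")
        page_views += 1
        if ip not in last_seen or when - last_seen[ip] > SESSION_GAP:
            if ip in current_visit:             # close the previous visit
                closed_visits.append(current_visit[ip])
            current_visit[ip] = 1               # start a new visit
        else:
            current_visit[ip] += 1
        last_seen[ip] = when

closed_visits.extend(current_visit.values())    # flush visits still open

visits = len(closed_visits)
unique_visitors = len(last_seen)                # distinct IP addresses
bounces = sum(1 for pages in closed_visits if pages == 1)
bounce_rate = bounces / visits if visits else 0.0

print(f"visits: {visits}, unique visitors: {unique_visitors}, "
      f"page views: {page_views}, bounce rate: {bounce_rate:.1%}")

The 30-minute inactivity window mirrors the session definition commonly used by web analytics tools; referral sources could be extracted in the same pass from the Referer field of each log line.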

Visitors who chose to edit the wiki were required to complete a brief online questionnaire as part of the registration process. This questionnaire collected information on:

  • gender and age
  • country of residence (where the visitor was living at the time of site access)
  • experience using wikis (whether the visitor had ever obtained information from or contributed content to a wiki)
  • referral source (e.g., journal, professional society, browser, accidentally, word of mouth)
  • institutional affiliation
  • role (e.g., health care worker, researcher, administrator, patient, member of the public)
  • competing interests (e.g., employment, personal or financial)

Content analysis was used to evaluate and describe the nature and quantity of contributions or modifications to the content of the site, including messages posted to the site discussion pages. It was also used to examine relevant talk in the “blogosphere,” limited to the sites from which individuals were referred to the Open Medicine wiki.

Results

Site traffic. During the 10-month study period, there were a total of 1222 visits to the site, 3996 page views and 875 unique visitors (Table 1). Visitors came from 66 different countries, with 72.2% of visits originating in Canada or the United States (Table 2). The majority of site visits occurred during the first 2 months after the wiki launch. Similar proportions of visits came from referring sites (40.4%) and from direct traffic (39.4%) (Fig. 2).

In total, there were 55 unique referring sites (i.e., websites that included a link to the Open Medicine wiki). The Open Medicine website was responsible for nearly 50% of the referred traffic: 167 referrals originated from a link within the article on the Open Medicine website and 63 from the Open Medicine blog. Of the top 10 referring websites, 6 were blogs (Table 3).

Table 1. Site traffic
Table 2. Geographic source of traffic
Table 3. Top 10 referring sites
Figure 2. Traffic sources

Site users. In total, there were 61 user accounts; however, only 13 were created by “genuine” users. The remaining 21 accounts were deemed “fake” by the team’s site administrator (LS), having been created either by automated scripts or by individuals entering false or nonsensical information. An example of a “fake” or fabricated account is shown in Figure 3.

Of the 13 genuine accounts, 5 were created by members of the project team and 8 by non-team members (one individual created 2 separate accounts). Only 4 of the 7 unique non-team members who registered with the site completed the registration questionnaire (3 registered before the questionnaire was added). Of these 4, 3 were male, 3 were Canadian, all were between 30 and 60 years of age, and all had previously used a wiki for information; 3 were affiliated with a university, 2 were health researchers, 1 was a health administrator and 1 was a health practitioner. Only 1 had previously contributed content to a wiki, and only 1 reported having a conflict of interest.

Figure 3. Example of fake account profile

Site contributions. Of the 875 people who visited the site, 5 (0.57% of unique visitors) made a total of 6 contributions.

The 3 contributions to the article itself were as follows:

  • A sentence about the function of asynchronous telehealth was added to the abstract;
  • Competing interests were added to the author declaration section; and
  • A grammatical change was made to the content in the competing interests section.

Two posts were added to the article’s discussion page. One contributor posed a question about the existence of safeguards to protect patient privacy, which was answered by the project team. The second contributor suggested expanding the definition of telehealth used in the article to include online support communities for patients and health care professionals. Neither of these posts generated responses or commentary from other users of the site. Lastly, one post requesting help with the registration process was added to the site’s help page.

Buzz in the blogosphere

This project generated some comments on blog sites. In general, the blog commentary about the project was positive, tempered with some skepticism. The positive blog postings applauded the initiative and the door it opened to further experimentation and innovation in the field of academic publishing. Some of the most representative positive comments were as follows:

The technophile blurry-eyed visionary in me is very impressed. Congratulations to this sister discipline [commenter was from law] for having the courage and foresight to lead the way—Blogger 1

… a simple idea with significant consequences. It should enable risk-free experimentation with all sorts of web 2.0 innovations, social networking, and collaborative research and writing. Some will fail to add value. That doesn’t matter. The point is not that all experiments will succeed but that this simple idea frees us to experiment—Blogger 2

It will be interesting to see how successful this approach is and to speculate about how the potential changes from this publishing model will play out in the future—Blogger 3

Negative blog posts revolved around two concerns: threats to the quality of the article if non-experts were allowed to contribute to it, and the sustainability of such an initiative. Two examples are as follows:

The law prof, writer, editor and publisher in me is writhing at the fate [that] could befall an article at the hands of the public, even a public that has to register first—Blogger 1

I think the wiki is worth a try, but I worry about its staying power. I often have trouble getting docs to use PubMed properly, so how can I convince them to build a wiki?—Blogger 4

Discussion

To our knowledge, this is the first attempt to examine the use of an open, publicly available wiki as a tool to engage the academic and clinical communities, as well as the public, in the collaborative updating process for a peer-reviewed scoping review. Our findings indicate that this initiative failed to add value to the body of evidence on asynchronous telehealth. Although the wiki attracted 875 visitors from around the globe, fewer than 1% contributed content to the article and none of the contributions substantially improved the nature or extent of evidence presented in the article. However, this initiative succeeded in stimulating dialogue about novel formats of academic publishing, as evidenced by the commentary that accumulated in the blogosphere. It also provided a lesson in what does not work, and in doing so has alerted us to important contextual factors that may hinder the role of crowd-sourcing through wikis as a means to keep science up to date.

Although there is a wide spectrum of activity in online communities, most have a handful of dedicated contributors and a much larger number of readers, commonly known as lurkers (people who read but do not post). The average ratio of readers to posters in mailing lists and message boards has been estimated at 100:1.21 The main reasons for not actively participating in these online communities include not needing to post or feeling that one has nothing to contribute; needing to find out more about the group before participating; believing that one is being helpful by not posting; not understanding how the software operates; and not liking the dynamics of the community.22 In the case of Wikipedia, there has been debate concerning whether its success is due to “the power of the few or the wisdom of the crowd.”23,24 Research suggests that a small proportion of editors account for most of the work done and value added.24-26

The low level of contribution to the wiki in the present study cannot be explained by an absence of new evidence: at least 6 articles on teledermatology alone27-32 were published between the third week of November 2006, when the original literature search was performed, and the end of the data collection period in April 2010. It is possible that there was limited interest in the relatively narrow topic of asynchronous telehealth; however, targeted efforts were made to raise awareness of the wiki among hundreds of specialist members of the Canadian Society of Telehealth. It is more likely that the results are a function of the review methodology itself, lack of familiarity with the wiki, and the incentive structure of academic publishing.5,33

As discussed earlier, only 17.7% of systematic reviews seem to be cited as updated versions of previously published reviews.1,2 This low update rate could, in part, reflect the effort required to complete a systematic review. Updates are time-consuming, expensive and yield relatively little new evidence if conducted too often.34 Although updating a scoping review typically involves less effort than a traditional systematic review given the lower level of data integration, similar logistical barriers could have discouraged users of the wiki from updating the present review.

Similarly, a perceived lack of ease of use, a key determinant of technology adoption,35 could have deterred some users from editing the wiki. We know that positive outcome expectancy, previous use of information technology and Internet self-efficacy influence the use of the Internet for health information more generally.36 Current estimates indicate that only 10% of physicians have edited one or more articles on Wikipedia,12 which suggests a relatively inexperienced user population. Recent research suggests that certain socialization tactics (e.g., welcome messages, assistance and constructive criticism) can enhance users’ confidence in participating in a wiki.37

By far the most important determinant of technology adoption is perceived usefulness.35 Unfortunately, recent initiatives suggest a general lack of interest among scholars in Web 2.0 publishing.33 Several journals, including Open Medicine, BMJ, PLoS, Nature and Bioinformatics, have been experimenting with article commenting, an approach that many already consider a failure because it has not attracted enough participation.33,38,39

According to Priem and Hemminger,33 the lack of participation of scholars in article commenting, which we propose could be extended to all Web 2.0 scholarly activities, can be attributed to “the three ‘C’s: culture, credit, and critical mass.” There are major cultural barriers to the use of Web 2.0 tools in scholarly publishing. This kind of work is not encouraged in medical or postgraduate education, and does not count toward publications, grants or any other type of currency for career advancement. However, to achieve sustainability, Web 2.0 initiatives must establish a critical mass of contributors.5,33 High levels of activity also lend credibility by implying that the site is popular, which in turn motivates more people to contribute content.40

Heilman and colleagues5 have provided a number of suggestions that could facilitate the adoption of wikis by building a culture of open, collaborative authoring in science and medicine. These include: requiring students to critically appraise and improve articles on Wikipedia as part of their medical or postgraduate education curricula; encouraging participation in WikiProject Medicine (a community of Wikipedia editors interested in improving the quality of health-related articles), which is applying for credit as a continuing medical education (CME) opportunity; and the coupling of traditional publishing with contributions to Wikipedia as demonstrated by the scientific journal RNA Biology.41

Ultimately, in order for wikis to be a viable model for collaborative publishing in academia, measures must be in place to recognize and credit author contributions. Several online platforms are currently testing tools and strategies to recognize authorship (e.g., WikiGenes, MedPedia, Google Knol).3 One relevant example is OPIMEC, an observatory of innovative practices in complex, chronic disease management, which produced the first collaboratively authored book on poly-pathology using online crowd-sourcing efforts. As a platform designed purposefully to bring together like-minded individuals, OPIMEC met all three “C’s”: it already had a strong culture of collaboration, a critical mass of scholars knowledgeable in the topic area, and tools and processes for the identification and recognition of contributions (e.g., lead authors produced the first version of each chapter, integrated contributions from the community and credited community members as co-authors on the basis of the extent of their contributions).42

There is also growing interest in new ways to measure scientific impact.33 As discussed in detail by Priem and Hemminger, “these days scholars not only cite previous work in journal publications, they may add it to their personal web pages, bookmark, tweet, or blog about it.”33 Building one’s online research identity may soon become a necessity for researchers. Evaluators could use Web usage tracking tools to mine and evaluate these social media–based metrics of scholarly impact.

Future research should examine whether these and other strategies to recognize distributed knowledge dissemination efforts could improve the viability of wikis in academic publishing. This could be explored by replicating the current experiment with a review of controversial interventions for a highly prevalent condition with a rapidly growing body of research, whose conclusions could have an important impact on a large number of health professional, academic, policy and corporate groups. By changing the study in this manner and by incorporating strategies to recognize contributions, it might be possible to establish whether wikis could in fact be valuable resources for keeping research evidence up to date. Ultimately, we may need the buy-in of the academic community on a new set of scholarly impact metrics that take into account these new forms of knowledge dissemination.

Acknowledgments

We thank the Open Medicine team for their assistance during the planning and implementation of the study, including allowing us to brand the project site with their logo and link to the Open Medicine homepage.

References
  1. Shojania KG, Sampson M, Ansari MT, Ji J, Doucette S, Moher D. How quickly do systematic reviews go out of date? A survival analysis. Ann Intern Med 2007;147(4):224–233. [PubMed] [Full Text]
  2. Moher D, Tetzlaff J, Tricco AC, Sampson M, Altman DG. Epidemiology and reporting characteristics of systematic reviews. PLoS Med 2007;4(3):e78. [CrossRef] [PubMed] [Full Text]
  3. Hoffmann R. A wiki for the life sciences where authorship matters. Nat Genet 2008;40(9):1047–1051. [CrossRef] [PubMed] [Full Text]
  4. Schmidt J. Flexibility in wiki publishing: author desires, peer review and citation. Academic Publishing Wiki 2005 (accessed 2011 Apr 5). [Full Text]
  5. Heilman JM, Kemmann E, Bonert M, Chatterjee A, Ragar B, Beards GM, et al. Wikipedia: a key tool for global public health promotion. J Med Internet Res 2011;13(1) [CrossRef] [PubMed] [Full Text]
  6. Wikipedia.org. Statistics. Wikipedia: the free encyclopedia 2011 (accessed 2011 Apr 4). [Full Text]
  7. Wikipedia.org. Wikipedia: size comparisons. Wikipedia: the free encyclopedia 2011 (accessed 2011 Apr 4). [Full Text]
  8. Giles J. Internet encyclopaedias go head to head. Nature 2005;438(7070):900–901. [CrossRef] [PubMed] [Full Text]
  9. Laurent MR, Vickers TJ. Seeking health information online: does Wikipedia matter? J Am Med Inform Assoc 2009;16(4):471–479. [CrossRef] [PubMed] [Full Text]
  10. Wikipedia.org. WikiProject Medicine: Popular pages 2011 (accessed 2011 Sept 7). [Full Text]
  11. Hughes B, Joshi I, Lemonde H, Wareham J. Junior physician’s use of Web 2.0 for information seeking and medical education: a qualitative study. Int J Med Inform 2009;78(10):645–655. [CrossRef] [PubMed] [Full Text]
  12. Comer B. Docs look to Wikipedia for condition info: Manhattan research. Medical Marketing & Media 2009 [Full Text]
  13. Rothman D. List of medical wikis 2009 [Full Text]
  14. Varga-Atkins T, Dangerfield P, Brigden D. Developing professionalism through the use of wikis: A study with first-year undergraduate medical students. Med Teach 2010;32(10):824–829. [CrossRef] [PubMed] [Full Text]
  15. Brohée S, Barriot R, Moreau Y. Biological knowledge bases using Wikis: combining the flexibility of Wikis with the structure of databases. Bioinformatics 2010;26(17):2210–2211. [CrossRef] [PubMed] [Full Text]
  16. Arksey H, O’Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol 2005;8(1):19–32. [CrossRef] [Full Text]
  17. Armstrong R, Hall BJ, Doyle J, Waters E. ‘Scoping the scope’ of a Cochrane review. J Public Health 2011;33(1):147–150. [CrossRef] [Full Text]
  18. Deshpande A, Khoja S, Lorca J, McKibbon A, Rizo C, Husereau D, et al. Asynchronous telehealth: A scoping review of analytic studies. Open Med 2009;3(2):69–91. [PubMed] [Full Text]
  19. Deshpande A, Khoja S, Lorca J, McKibbon A, Rizo C, Jadad AR. Asynchronous telehealth: Systematic review of analytic studies and environmental scan of relevant initiatives. Technology Report No. 101. Ottawa: Canadian Agency for Drugs and Technologies in Health; 2008.
  20. Creswell JW. Research design: qualitative, quantitative and mixed method approaches. California: Sage Publications; 2003.
  21. Nonnecke B, Preece J. Lurker demographics: counting the silent. Proceedings of the Conference on Human Factors in Computing Systems (CHI); 2000 Apr 1-6; The Hague, Netherlands. New York: ACM Press; 2000.
  22. Preece J, Nonnecke B, Andrews D. The top five reasons for lurking: improving community experiences for everyone. Comput Hum Behav 2004;20(2):201–223. [CrossRef] [Full Text]
  23. Panciera K, Halfaker A, Terveen L. Wikipedians are born, not made. Proceedings of GROUP; 2009 May 10-13; Sanibel Island, Florida. New York: ACM Press; 2009.
  24. Kittur A, Chi EH, Pendleton BA, Mytkowicz T. Power of the few vs. wisdom of the crowd: Wikipedia and the rise of the bourgeoisie. Proceedings of the Conference on Human Factors in Computing Systems (CHI); 2007 Apr 28-May 3; San Jose, California. New York: ACM Press; 2007.
  25. Kittur A, Kraut RE. Harnessing the wisdom of crowds in Wikipedia: quality through coordination. Proceedings of the Conference on Computer Supported Cooperative Work (CSCW); 2008 Nov 8-12; San Diego, California. New York: ACM Press; 2008.
  26. Priedhorsky R, Chen J, Lam SK, Panciera K, Terveen L, Riedl J. Creating, destroying and restoring value in Wikipedia. Proceedings of GROUP; 2009 May 10-13; Sanibel Island, Florida. New York: ACM Press; 2009.
  27. Pak H, Triplett CA, Lindquist JH, Grambow SC, Whited JD. Store-and-forward teledermatology results in similar clinical outcomes to conventional clinic-based care. J Telemed Telecare 2007;13(1):26–30. [CrossRef] [PubMed]
  28. Heffner VA, Lyon VB, Brousseau DC, Holland KE, Yen K. Store-and-forward teledermatology versus in-person visits: A comparison in pediatric teledermatology clinic. J Am Acad Dermatol 2009;60(6):956–961. [CrossRef] [PubMed] [Full Text]
  29. Edison KE, Ward DS, Dyer JA, Lane W, Chance L, Hicks LL. Diagnosis, diagnostic confidence, and management concordance in live-interactive and store-and-forward teledermatology compared to in-person examination. Telemed J E Health 2008;14(9):889–895. [CrossRef] [PubMed]
  30. Fabbrocini G, Balato A, Rescigno O, Mariano M, Scalvenzi M, Brunetti B. Telediagnosis and face-to-face diagnosis reliability for melanocytic and non-melanocytic ‘pink’ lesions. J Eur Acad Dermatol Venereol 2008;22(2):229–234. [CrossRef] [PubMed]
  31. Ferrandiz L, Moreno-Ramirez D, Nieto-Garcia A, Carrasco R, Moreno-Alvarez P, Galdeano R, et al. Teledermatology-based presurgical management for nonmelanoma skin cancer: a pilot study. Dermatol Surg 2007;33(9):1092–1098. [CrossRef] [PubMed]
  32. Verma M, Raman R, Mohan RE. Application of tele-ophthalmology in remote diagnosis and management of adnexal and orbital diseases. Indian J Ophthalmol 2009;57(5):381–384. [CrossRef] [PubMed] [Full Text]
  33. Priem J, Hemminger BM. Scientometrics 2.0: toward new metrics of scholarly impact on the social Web. First Monday 2010;15(7) (accessed 2011 Sept 7) [Full Text]
  34. Chapman A, Middleton P, Maddern G. Early updates of a systematic review—a waste of resources? 2002.
  35. Davis FD. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q 1989;13(3):319–340.
  36. Mead N, Varnam R, Rogers A, Roland M. What predicts patients' interest in the Internet as a health resource in primary care in England? J Health Serv Res Policy 2003;8(1):33–39. [CrossRef] [PubMed] [Full Text]
  37. Choi B, Alexander K, Kraut RE, Levine JM. Socialization tactics in Wikipedia and their effects. New York: ACM Press; 2010.
  38. Nielsen M. Doing science in the open. Physics World 2009;1:30–35.
  39. Neylon C, Wu S. Article-level metrics and the evolution of scientific impact. PLoS Biol 2009;7(11) [CrossRef] [PubMed] [Full Text]
  40. Flanagin AJ, Metzger MJ. Digital media and youth: unparalleled opportunity and unprecedented responsibility. In: Flanagin AJ, Metzger MJ, editors. Digital media, youth, and credibility. Cambridge (MA): MIT Press; 2008.
  41. Butler D. Publish in Wikipedia or perish. Nature News 2008 [Full Text]
  42. Jadad AR, Cabrera A, Martos F, Smith R, Lyons RF. Why multiple chronic diseases? Why now? What is going on around the world? In: Jadad AR, Cabrera A, Martos F, Smith R, Lyons RF, editors. When people live with multiple chronic diseases: a collaborative approach to an emerging global challenge. Granada: Andalusian School of Public Health; 2010. [Full Text]


This work is licensed under a Creative Commons Attribution Share-alike 2.5 License.

ISSN 1911-2092