
Communications of the ACM

Contributed articles

Analyzing Worldwide Research in Hardware Architecture, 1997-2011


[Article illustration: Analyzing Worldwide Research in Hardware Architecture. Credit: Tagxedo.com]

This work covers research in the field of hardware architecture in the years 1997 to 2011. More than 51,000 journal articles (conference articles excluded) were analyzed using a specially developed in-house software application able to analyze keyword evolution, estimate an article's impact, determine collaborations established at different scales, and calculate collaboration-weighted productivity of countries and research centers. Keyword analysis allows rapidly growing research topics (such as networks, sensors, security, stability, energy, and chips), as well as topics of lesser interest (such as neural networks, ATM, and VLSI), to be identified and tracked. The changes detected in productivity, collaboration, and impact scores over time can be used as a framework for assessing research activity in a realistic context. The geographic and institutional distributions of the studied publications were also examined, along with changes in collaboration-weighted productivity, the impact of the major research countries and research centers, and the journals in which their results were published.

Growing research activity in recent years has heightened the interest of researchers and their centers in publishing in the most visible journals. Research results in hardware architecture fall into the Web of Science (WoS) Core Collection category "computer science, hardware architecture," which covers the physical components of computer systems, including main and logic boards, internal buses and interfaces, static and dynamic memory, storage devices and storage media, power supplies, input and output devices, networking interfaces, and networking hardware (such as routers and bridges). It also covers the architecture of computing devices, including SPARC, RISC, and CISC designs, as well as scalable, parallel, and multiprocessor computing architectures. The present analysis covers journal papers (WoS "article" and "review") but excludes conference papers (WoS "proceedings").


A number of authors have analyzed scientific production in computer science, but their work has focused on specific geographic areas,1,8,11 collaborative networks,7 competitiveness of universities,9 and impact of publications.2,6 The literature had not previously included studies on global scientific production in the field of hardware architecture or its development in recent years.

This article analyzes the state of global research into hardware architecture and its development over the period 1997–2011. Keyword analysis was used to identify the research topics that showed the most growth over the study period, as well as those of lesser interest. The scientific production, collaborations, and research impact of countries and research centers were also examined, along with the distribution of national and international networks, changes in the collaboration-weighted productivity of countries and research centers, and the journals in which they published. This was done through an analysis application developed by our group, the PADOC Research Group at the Technical University of Madrid (http://www.grupoinvestigacionpadoc.com), which analyzes keyword evolution, computes the collaboration-weighted productivity of research centers and countries, estimates the impact of each article, and determines the collaborations established at different scales.


Materials and Methods

The raw data used in the study came from the WoS Core Collection, a Thomson Reuters product consisting of seven databases that include thousands of scholarly articles, reviews, journals, book series, and reports.

Following the methodology in numerous research papers,3,5,11 all traditional journal articles, or "article" and "review" document types, published 1997–2011 in the category "computer science, hardware architecture" (n=51,474) were downloaded (December 2012) from the WoS Core Collection for analysis. Proceedings papers, reprints, notes, letters, and corrections were excluded; 1997 was selected as the first year of the study period since it was when information on impact factors (IF) became available from the Institute for Scientific Information (ISI).

The author(s), editor(s), title, source, addresses, times cited, keywords, language, and WoS categories were recorded for each article and analyzed through the analysis application (written in Visual Basic). The application identified the countries and research centers associated with each paper (3.4% had no origin information in the WoS "addresses" field), tracked the development of the most important research topics, and analyzed changes in the geographic and institutional distribution of publications.
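
As an illustration of the origin-detection step, country names can be pulled from the trailing token of each WoS "C1" (addresses) entry. The address layout and the USA-suffix folding below are assumptions about the raw export format, not details given in the article:

```python
import re

def countries_from_c1(c1_field):
    """Extract country names from WoS 'C1' address lines, assumed to look
    like '[authors] Institution, City, Country.' (one address per line)."""
    countries = set()
    for line in c1_field.strip().splitlines():
        # Drop the bracketed author list, then the trailing period.
        addr = re.sub(r"^\[[^\]]*\]\s*", "", line.strip()).rstrip(".")
        if "," not in addr:
            continue  # no recoverable country token
        country = addr.rsplit(",", 1)[1].strip()
        if country.endswith("USA"):  # fold 'MA 02139 USA' etc. to 'USA'
            country = "USA"
        countries.add(country)
    return countries

c1 = ("[Smith, J.] MIT, Cambridge, MA 02139 USA.\n"
      "[Perez, A.] Univ Politecn Madrid, Madrid, Spain.")
found = countries_from_c1(c1)  # {"USA", "Spain"}
```

A parser like this would also flag the 3.4% of records whose "addresses" field is empty, since they yield no country at all.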

Statistical analysis of author keywords is useful for revealing trends in science and has proved important for monitoring the development of science and research programs.12 Author keywords were provided by the papers' authors, while "keywords plus" terms were generated by the ISI. The analysis application obtained an annual count of the number of research papers (NK) in which each detected keyword appeared; it identified singular and plural forms so they could be combined into an overall figure. Compound keywords were also analyzed. These analyses were then combined to provide a more realistic and representative account of the research topics that had been studied.
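
The NK count just described can be sketched as follows; the naive trailing-"s" plural folding is a simplifying assumption, not the authors' exact merging rule:

```python
from collections import Counter

def normalize(keyword):
    """Fold case and naive English plurals so, e.g., 'Sensor networks'
    and 'sensor network' count as one keyword (an assumption; the paper
    does not specify its exact normalization rule)."""
    k = keyword.strip().lower()
    if k.endswith("s") and not k.endswith("ss"):
        k = k[:-1]
    return k

def keyword_counts(papers):
    """NK: the number of research papers in which each keyword appears.
    `papers` is a list of per-paper keyword lists."""
    nk = Counter()
    for kws in papers:
        # A set ensures each keyword is counted at most once per paper.
        for k in {normalize(kw) for kw in kws}:
            nk[k] += 1
    return nk

papers = [["Sensor networks", "security"],
          ["sensor network", "Routing"],
          ["security", "routing"]]
nk = keyword_counts(papers)  # sensor network: 2, security: 2, routing: 2
```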

The geographic and institutional distribution of publications was examined by determining the number of collaboration-weighted articles (NP) produced by each country and research center. This ensured the sum of research papers published by all countries and research centers corresponded to the total number of published papers. For any specific period, it was calculated as follows (Equation 1)

NP = \sum_{i=1}^{N} \frac{1}{N_{C,i}} \quad \text{(for a country)}, \qquad NP = \sum_{i=1}^{N} \frac{1}{N_{RI,i}} \quad \text{(for a research center)}

where N is the total number of research papers produced by a country or research center over a set period, NC,i is the number of participating countries associated with each research paper "i," and NRI,i is the number of research institutions participating in each paper "i."
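
A minimal sketch of Equation 1's weighting, under our reading that each paper credits 1/NC,i to each participating country (and, analogously, 1/NRI,i to each participating institution):

```python
def weighted_production(papers, country):
    """NP for one country: each paper i contributes 1/NC_i, where NC_i is
    its number of participating countries. `papers` is a list of sets of
    country names, one set per paper (names here are illustrative)."""
    return sum(1.0 / len(countries)
               for countries in papers
               if country in countries)

papers = [{"USA"}, {"USA", "China"}, {"China", "Canada"}]
np_usa = weighted_production(papers, "USA")  # 1 + 1/2 = 1.5
```

Summed over all countries, these NP values equal the total paper count (3 here), which is exactly the normalization property the text requires.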

To analyze the complexity of the research network structures, the analysis application examined the degree of international collaboration involved in each article, as well as the number of research centers and authors. This was achieved by recording the connections between participating countries and their research centers, then generating corresponding matrices. The percentage of international collaboration, or Col (%), for the various countries and centers was then calculated as the number of research papers published in collaboration with at least one center of another country over their total production.
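
The collaboration matrices and Col (%) computation might look like this sketch (paper data hypothetical):

```python
from collections import Counter
from itertools import combinations

def collaboration_stats(papers):
    """From per-paper country sets, build a pairwise collaboration matrix
    and each country's percentage of internationally coauthored papers."""
    pair_counts = Counter()            # joint papers per (country, country) pair
    total, intl = Counter(), Counter()
    for countries in papers:
        for c in countries:
            total[c] += 1
            if len(countries) > 1:     # at least one foreign partner
                intl[c] += 1
        for a, b in combinations(sorted(countries), 2):
            pair_counts[(a, b)] += 1
    col_pct = {c: 100.0 * intl[c] / total[c] for c in total}
    return pair_counts, col_pct

papers = [{"USA"}, {"USA", "China"}, {"China", "Canada"}]
pairs, col = collaboration_stats(papers)
# pairs[("China", "USA")] == 1; col["USA"] == 50.0
```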

The average number of authors per article from each country or research center (NA) was calculated as follows (Equation 2)

NA = \frac{1}{N} \sum_{i=1}^{N} N_{A,i}

where NA,i is the number of authors (of any nationality or institution) in each paper "i."

The average number of research institutions (NRI) participating in each paper was calculated for each country and research center (Equation 3)

NRI = \frac{1}{N} \sum_{i=1}^{N} N_{RI,i}

where NRI,i is the number of research institutions participating in each paper "i."

To determine the impact of a particular piece of research, the application examined the IF and the number of citations per article. First, an impact indicator, or "year impact factor" (YIF), the IF of a journal for the year in which the article was published4 was assigned to each paper. The average YIF for each country and research center was then calculated as follows for different periods of time (Equation 4)

YIF = \frac{1}{N} \sum_{i=1}^{N} YIF_{i}

where YIFi is the YIF of a paper "i."

The average number of citations for each country and research institution (NCI) was then calculated as follows for different periods of time (Equation 5)

NCI = \frac{1}{N} \sum_{i=1}^{N} N_{CI,i}

where NCI,i is the number of citations received by paper "i" up to the date the records were downloaded (December 2012).
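
Equations 2 through 5 are all simple per-paper means, so they can be computed together; the dictionary field names in this sketch are ours, not the application's:

```python
def average_indicators(papers):
    """Equations 2-5: mean authors (NA), mean institutions (NRI), mean
    year impact factor (YIF), and mean citations (NCI) over N papers
    from one country or research center."""
    n = len(papers)
    return {
        "NA":  sum(p["n_authors"]      for p in papers) / n,
        "NRI": sum(p["n_institutions"] for p in papers) / n,
        "YIF": sum(p["yif"]            for p in papers) / n,
        "NCI": sum(p["citations"]      for p in papers) / n,
    }

papers = [
    {"n_authors": 2, "n_institutions": 1, "yif": 0.5, "citations": 10},
    {"n_authors": 4, "n_institutions": 2, "yif": 1.5, "citations": 20},
]
ind = average_indicators(papers)  # NA=3.0, NRI=1.5, YIF=1.0, NCI=15.0
```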


Results

Development of the most important research topics. The analysis application detected more than 77,000 different keywords in the category "computer science, hardware architecture," appearing 265,000 times. Some 28% were used in more than one research article but only 4% in more than 10. The 100 most used keywords accounted for 18% of the total, and the 5,000 most used keywords accounted for 61%, revealing the heterogeneity of research topics in this field of study.

Curiously, however, few terms used in the definition of "computer science, hardware architecture" were also used as keywords. Even when all occurrences of board, bridge, SPARC, RISC, or CISC (whether as a single keyword or as part of a compound keyword) were counted, their use was found to be limited (<100 articles each). Other terms, including bus, interface, storage, router, and scalable, were found in 100–400 papers. Hardware was mentioned in just over 700 articles. Some keywords were, however, used heavily, including network (12,643), power (3,288), memory (1,798), processor (1,236), and parallel (1,460).

The keywords in the top positions were relatively general, including algorithm, system, design, network, performance, model, optimization, circuit, and architecture.

Among the 50 most common keywords (see Figure 1), the rate of growth in the use of some of them stagnated or declined, including neural network, simulation, scheduling, routing, fault tolerance, VLSI, Internet, performance evaluation, cryptography, interconnection network, and quality of service. Notable declines were also seen in ATM, distributed system, low-power design, high-level synthesis, parallel, petri net, hypercube, multiprocessor, multimedia, learning, logic synthesis, partitioning, and Boolean function.

The most notable growth in the study's final years, 2008–2011, was associated with network terms, including wireless sensor network (position 14), ad hoc network (position 18), wireless network (position 20), and sensor network (position 27), as well as reliability, security, CMOS, low power, FPGA, protocol, stability, scheme, communication, power, management, classification, processor, theory, technology, reduction, measurement, authentication, framework, wireless, signal, transmission, process variation, privacy, oscillator, throughput, network on chip, energy efficiency, peer to peer, and phase noise. The words used in compound keywords were analyzed separately; for more, see the Appendix accompanying this article in ACM's Digital Library.

Change in global research activity. The number of research papers in "computer science, hardware architecture" changed irregularly over the study period. Article production declined over the first four years, from 3,159 publications in 1997 to 2,770 in 2001. Production then recovered, holding steady at 3,500–4,000 papers per year. The total increase in production over the study period was approximately 30%, making this an area of irregular but growing research publishing.

The percentage of papers involving international collaboration more than doubled from 10% in 1997 to 26% in 2011 (see Figure 2a). The number of authors per article also gradually increased from 2.3 in 1997 to 3.3 in 2011, a gain of 43%. The average number of research centers involved per paper also increased, from 1.5 in 1997 to 1.8 in 2011, a gain of 23% (see Figure 2b), reflecting the increasingly global nature of scientific research in the field.

The YIF value also grew, more than doubling from 0.51 in 1997 to 1.10 in 2011 (see Figure 2c). However, the number of citations per article decreased gradually (see Figure 2c); newer articles naturally receive fewer citations within a given period, as less time has elapsed over which other authors can cite them. This effect was taken into account when assessing researcher merit.

Analysis of the languages used in the articles found English was dominant, accounting for 99.85% of all papers examined. This finding underscores the importance of English as the language of scientific communication.

Change in geographic and institutional distribution of publications. The 10 most productive countries were responsible for 76% of scientific production 1997–2011 in this research area; the 30 most productive countries were responsible for 93%. The U.S. alone was responsible for 32.5%, followed by several Asian countries (4%–10%). Surprisingly, of these 30 most productive countries, Denmark and Norway lagged far behind, with just 0.3% of the total production each (see Table 1).

U.S. and Asian dominance was reflected in the ranking of the most productive research centers (see Table 2). The production of the top 30 centers ranged from 0.4% to 1.2% of all articles. Among the top 100 centers, 44 were in the U.S., 11 each in China and Japan, eight in Taiwan, six in Canada, five in South Korea, four in Italy, two each in Singapore, Israel, and Greece, and one each in the U.K., Switzerland, the Netherlands, India, and Belgium. Curiously, some countries that achieved a high ranking, including France, Germany, and Spain, had no outstanding, highly productive centers; rather, their national output represented the sum of many centers.

The change in the number of collaboration-weighted articles (NP) produced by different countries was notably different. When the study period was divided into consecutive three-year slots—1997–1999, 2000–2002, 2003–2005, 2006–2008, and 2009–2011—the U.S., Japan, the U.K., Italy, and Germany showed rather constant NP values (see Figure 3). In contrast, some countries experienced tremendous growth; for example, NP multiplied eight times for China, raising it from 11th to second position in our ranking.

China's increase in total R&D expenditure—$14,733 million in 1997 to $208,172 million in 201110—clearly seemed to reflect its production and impact in the field. NP also grew in Taiwan and South Korea, which in the final three-year period both ranked among the five top producers. Other countries that increased their NP include Canada, Iran, and Spain.

Developments at the national level were reflected in national research centers. Tsinghua University (China) thus achieved first place in the NP ranking in the final three-year period, the Chinese Academy of Sciences third, and National Tsing Hua University, National Cheng Kung University, National Chiao Tung University, and National Taiwan University (all in Taiwan) fourth, eighth, ninth, and 10th, respectively. On a smaller scale, South Korea's Seoul National University, Korea Advanced Institute of Science & Technology, and Korea University all entered the top 20. The most prominent positions still belonged to U.S. research centers, including the University of Texas, second, the University of Illinois, sixth, Purdue University, seventh, and the Georgia Institute of Technology, 14th. Singapore's Nanyang Technological University was fifth, the U.K.'s Imperial College of Science, Technology and Medicine, 11th, Japan's Tokyo Institute of Technology, 13th, Waseda University, 17th, Canada's University of Waterloo, 16th, and Italy's Politecnico di Torino, 19th.

The analysis application also found a significant fall in NP for IBM Corporation, with the company sliding out of the top 10 due largely to its division into several specific research centers (such as IBM Systems & Technology Group and IBM T.J. Watson Research Center). A striking decline was also seen for Intel Corporation (U.S.), the Massachusetts Institute of Technology (U.S.), the University of California, Berkeley (U.S.), Carnegie Mellon University (U.S.), Stanford University (U.S.), the Indian Institute of Technology (India), and the University of Maryland (U.S.), none in the top 20 during the study's final three-year period.

Collaborations. The analysis application also helped weigh the complexity and size of research networks, analyzing indicators of research collaboration at all scales. The number of authors per article varied significantly, depending on country. In general, European countries had more authors per article (NA), with the most in Spain (4.0), Belgium (3.8), and Switzerland (3.8). Others averaged fewer than three authors per article, including Australia, Iran, Taiwan, and Turkey (see Table 1). Moreover, the average number of research institutions per article (NRI) was similar in most countries, close to two.

Most of the most productive centers averaged three to four authors per article (NA) and between 1.5 and 2.5 research institutions per article (NRI) (see Table 2). The U.S. and Chinese centers often showed the highest values for collaboration. In absolute terms, the highest numbers of collaborations involved major U.S. research centers (such as IBM and Intel) and major U.S. universities (see Table 3), with the strongest between Purdue University and the University of Iowa, which together produced 90 papers. Papers involving IBM had a high average number of authors but a low mean number of participating research centers, indicating the involvement of a large number of company researchers (see Table 2).

Each country's local conditions influenced the amount of its international collaboration. The countries with the most international research collaboration, or Col (%), were Switzerland (58.0%), Denmark (50.9%), France (50.6%), Israel (49.9%), and Sweden (44.6%). Some countries, including Japan and Taiwan, were below 20% (see Table 1).

The U.S. had the highest number of collaborations, reflecting its very large total output. Its partners were spread all over the world, but especially strong relationships were seen with China (776 papers in common), Canada (588), South Korea (445), and Germany (399); for more, see the online Appendix. For some countries, work with the U.S. represented most of their international collaborations. Other important collaborations, though on a smaller scale, included those between China and Canada (182), the U.K. (169), Japan (150), and Singapore (141).

International collaboration accounted for approximately 50% of the total number of research papers produced by some centers (such as the University of Waterloo and National University of Singapore) but less than 10% of the production of others (such as Tokyo Institute of Technology and Osaka University).

Impact. Substantial differences were seen between countries in terms of mean YIF value and number of citations received (NCI). The huge U.S. production also had high impact; the U.S. was indeed one of the countries with the highest average YIF (1.22) and NCI (16), well above the values recorded for the other most productive countries. Israel and Switzerland also achieved reasonable figures (see Table 1). In contrast, Iran, Japan, and South Korea had below-average YIF and NCI values.

Though YIF values of the countries improved over time (see Figure 4), rates differed. YIF for China, the U.K., Germany, Singapore, the Netherlands, Israel, and Switzerland grew more strongly, exceeding the world average. In contrast, Japan and Iran had below-average YIF for the entire study period, experiencing only limited growth toward the end. The average number of citations (NCI) decreased over time for all countries; for more, see the online Appendix. The high citation values of Finland and Ireland are explained by their production of a small number of high-impact articles.

Though many U.S. research centers failed to maintain their growth in production and lost ground in the productivity ranking, their research work still had the greatest scientific impact. High YIF and NCI values were noteworthy for IBM, MIT, Stanford, Carnegie Mellon, the University of California, Berkeley, the Georgia Institute of Technology, the University of Southern California, and the University of California, Los Angeles (see Table 2). The National University of Singapore also had a high YIF and NCI. Conversely, the lowest values were returned by research centers in Japan and South Korea.

The differences in research impact can be explained by the choice of journals within the category. In the final five years of the study period, the U.S. was the main contributor to 75% of the related journals, with many more articles published in high-impact than in low-impact journals, explaining the high values of the country's quality indicators. The high impact values achieved by several other countries over the same period can be explained the same way. The low impact values of Iran, Japan, South Korea, and Turkey are explained by a significant percentage of their articles being published in low-impact journals.

Among research centers, approximately 50% of the papers published by IBM appeared in the IBM Journal of Research and Development. Over 20% of the production of MIT, the University of California, Berkeley, and Stanford appeared in Communications, while Purdue, Carnegie Mellon, the National Taiwan University, the University of California, San Diego, the University of California, Los Angeles, and the University of Michigan commonly published in IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems. These rates of publication in such important journals account for the centers' high impact rankings. In contrast, many papers produced by Japanese research centers, including the Tokyo Institute of Technology (86%), Osaka University (78%), the University of Tokyo (48%), and Waseda University (86%), appeared in IEICE Transactions on Fundamentals of Electronics, which has limited international visibility.


Conclusion

The study provides an overview of research activity in the field of hardware architecture for the years 1997 to 2011. More than 51,000 articles were analyzed using specially developed in-house software that tracked keyword evolution, computed productivity weighted by the number of research centers per article, estimated the impact of each article, and identified collaborations at multiple scales.

Keyword analysis found that the topics of interest evolve. Most technical terms used in the definition of "computer science, hardware architecture," including board, bridge, SPARC, RISC, CISC, bus, router, and interface, were used only rarely by researchers during the study period. The years toward the end of the period saw a remarkable increase in the amount of research on networks and sensors, including wireless sensor networks and ad hoc networks; also showing notable increases were security, stability, energy, chips, optical, and education. The study also identified keywords of lesser interest, including neural networks, ATM, and VLSI.

The global average values for productivity, collaborations, and impact indicators provide a framework for assessing the research activity of different centers, and thereby the merits of the researchers themselves, in a realistic context.

Scientific production and the impact of the research grew unevenly. Scientific production was also concentrated, with the 30 most productive countries responsible for 93% of world production.

The U.S. dominated the field (32.5% of the total collaboration-weighted production) with high research impact, much higher than most other countries.

Of the 100 most productive research centers, 44 were in the U.S. Although many U.S. centers failed to maintain the growth of their production, their research work generally had greater scientific impact.


Italy, Germany, Japan, and the U.K. all returned relatively constant levels of output, while China was found to be a growing force. However, the impact of research by these countries and their research centers showed differences due to the internationalization and impact of the journals in which they published.

The study also produced detailed information on existing collaborations between countries and research centers. Collaboration levels increased at all scales, indicating more complex research networks. Local conditions influenced the amount of international collaboration in which each country engaged, with U.S. research centers collaborating most with the rest of the world.


References

1. Abrizah, A. and Wee, M.C. Malaysia's computer science research productivity based on publications in the Web of Science, 2000–2010. Malaysian Journal of Library & Information Science 16, 1 (Apr. 2011), 109–124.

2. Bar-Ilan, J. Web of Science with the Conference Proceedings Citation Indexes: The case of computer science. Scientometrics 83, 3 (June 3, 2010), 809–824.

3. Cañas-Guerrero, I., Mazarrón, F.R., Calleja-Perucho, C., and Pou-Merina, A. Bibliometric analysis in the international context of the 'construction & building technology' category from the Web of Science database. Construction and Building Materials 53 (Feb. 2014), 13–25.

4. Cañas-Guerrero, I., Mazarrón, F.R., Pou-Merina, A., Calleja-Perucho, C., and Díaz-Rubio, G. Bibliometric analysis of research activity in the 'agronomy' category from the Web of Science, 1997–2011. European Journal of Agronomy 50 (Oct. 2013), 19–28.

5. Cañas-Guerrero, I., Mazarrón, F.R., Pou-Merina, A., Calleja-Perucho, C., and Suárez-Tejero, M.F. Analysis of research activity in the field 'Engineering, Civil' through bibliometric methods. Engineering Structures 56 (Nov. 2013), 2273–2286.

6. De Sutter, B. and Van den Oord, A. To be or not to be cited in computer science. Commun. ACM 55, 8 (Aug. 2012), 69–75.

7. Franceschet, M. Collaboration in computer science: A network science approach. Journal of the American Society for Information Science and Technology 62, 10 (Oct. 2011), 1992–2012.

8. Ibanez, A., Bielza, C., and Larranaga, P. Analysis of scientific activity in Spanish public universities in the area of computer science. Revista Espanola De Documentacion Cientifica 36, 1 (Mar. 2013).

9. Ma, R.M., Ni, C.Q., and Qiu, J.P. Scientific research competitiveness of world universities in computer science. Scientometrics 76, 2 (Aug. 2008), 245–260.

10. The Organisation for Economic Co-operation and Development (OECD); http://stats.oecd.org/

11. Rojas-Sola, J.I. and Jorda-Albinana, B. Bibliometric analysis of Venezuelan publications in the computer sciences category of the JCR Data Base (1997–2007). Interciencia 34, 10 (Oct. 2009), 689–695.

12. Xie, S.D., Zhang, J., and Ho, Y.S. Assessment of world aerosol research trends by bibliometric analysis. Scientometrics 77, 1 (Oct. 2008), 113–130.


Authors

Virender Singh ([email protected]) is a Ph.D. candidate at the Polytechnic University of Madrid, Spain, and manager of software test engineering at Garmin, Wuerzburg, Germany.

Alicia Perdigones ([email protected]) is an associate professor in the Polytechnic University of Madrid, Spain, and member of the Education Innovation Group of Electric and Automatic Technologies for Rural Engineering, Madrid, Spain.

José Luis García ([email protected]) is a full professor in the Polytechnic University of Madrid, Spain, and coordinator of the Education Innovation Group in Electric and Automatic Technologies for Rural Engineering, Madrid, Spain.

Ignacio Cañas-Guerrero ([email protected]) is a full professor in the Polytechnic University of Madrid, Spain, and coordinator of the PADOC Research Group, Madrid, Spain.

Fernando R. Mazarrón ([email protected]) is an assistant professor of the Polytechnic University of Madrid, Spain, member of the Education Innovation Group in Electric and Automatic Technologies for Rural Engineering, Madrid, Spain, and member of the PADOC Research Group, Madrid, Spain.


Figures

F1Figure 1. Change in use of the most frequently used keywords (as they appeared in the selected papers), 1997–2011.

F2Figure 2. Change in number of national and international research papers, mean number of authors per paper (NA), mean number of research centers participating in papers (NRI), mean year impact factor (YIF), and mean number of citations per article (NCI).

F3Figure 3. Change in number of collaboration-weighted papers (NP) published in each three-year period by the main research countries.

F4Figure 4. Change in average YIF for the most productive countries.


Tables

T1Table 1. Scientific production indicators for the main countries (average values 1997–2011).

T2Table 2. Scientific production indicators for the main research institutions (average values 1997–2011).

T3Table 3. Number of collaboration-unweighted articles by the most productive research centers.



©2015 ACM  0001-0782/15/01



 
