Research intelligence in Asia – academic leaders share strategies for success
At an international conference in Singapore, topics included the challenges and opportunities of global research universities, and how to measure and achieve research excellence
By Anders Karlsson, PhD; Ikuko Oba; and Ludivine Allagnat Posted on 18 September 2014
This progress led NTU to host the first APAC Research Intelligence Conference, where academic leaders from 70 institutions and eight nations met to discuss the rapidly changing landscape of research universities.
Three major themes emerged:
- What is research excellence, how is it achieved, and does it help to be a young university with less tradition?
- How can we strike a balance between increased sophistication of metrics and what is really needed to measure?
- Collaboration versus competition: how can we navigate in the international landscape?
In his welcome speech, NTU President Professor Bertil Andersson, an internationally renowned plant biochemist, said the Singapore government's focused investment in science infrastructure and people has paid off:
A university is its people, and its good people. In the case of NTU, we recruited the global talents of high caliber researchers and many young people. The superstars of tomorrow have come to NTU in big numbers, and we had funding for that.
Tradition versus modernism
Many of the universities taking the international stage by storm are quite young. Professor Andersson attributed NTU's success to having been able to start from scratch rather than having to reorganize and modernize an existing structure.
His comments were echoed by Professor Byoung Yoon Kim, Vice President of Research at Korea Advanced Institute of Science and Technology (KAIST), another fast riser on the university scene. KAIST was established in 1971 with a mission to transform Korea from a largely agricultural, low-cost manufacturing country into a nation higher up the industrial value chain – a transformation for which skilled engineers and scientists are a prerequisite.
KAIST recruited the best academics worldwide, unbound by the salary constraints of other Korean institutions. Over the last 40 years, KAIST has produced 12,000 bachelor's graduates, 23,000 master's graduates and almost 10,000 PhDs, who have been major contributors to Korea's transformation into a vibrant knowledge economy.
By European standards, the University of Hong Kong (HKU), established in 1911, may be considered young. It has nevertheless transformed in the last decade from a predominantly teaching-oriented institution into a full-fledged research university. Vice President Paul K. Tam described the transformation as partly internally driven, but also imposed by external factors, notably the establishment of the Research Grants Council and the introduction of a UK-style Research Assessment Exercise that allocated resources by performance indicators.
HKU defined a "world class" university as one with a tradition of research excellence, an internationally competitive staff and a strong culture that attracts students globally. Dr. Tam mentioned that HKU had been successful at supporting curiosity-driven "blue-sky" research, as well as developing strategic research areas and themes. Given Hong Kong's unique status as a separate administrative region of China, he stressed the importance of enhancing both international and mainland China collaborations.
All three university executives stressed the importance of people and attracting the best talent. They said management must lead with a dynamic vision – as President Andersson stated, "Change means change."
The digitalization of research information and the latest supercomputing technology allow universities to use research analytics for decision support. Dr. Giles Carden, Director of Strategic Planning and Analytics at the University of Warwick in the UK, described their systems to track research performance. The goal is to become a world leader in research and scholarship and, as a side-effect, to do well in the UK national research exercise: the Research Excellence Framework.
The university introduced a performance management exercise program using analytics to increase their competitive edge and to optimize performance. "In order to bring this program to success, it was critically important to gain leadership buy-in from both the senior academic managers of the university and the academic community more broadly," Dr. Carden said. "The agenda of optimizing research impact has really risen out of this global recession, with higher standards of accountability in research funding."
The university used analytics extensively to help inform its performance management processes, recruitment and academic roles, and to understand its global positioning as well as its relative global performance. Dr. Carden concluded by saying, "There is a whole range of areas where we can use analytics to help tackle these problems; that said, they do not tell you everything."
Dr. John Green of the University of Cambridge, formerly Chief Coordinating Officer at Imperial College, continued along a similar theme, saying there is an avalanche of data and a need to turn data into knowledge. An institution needs to know its strengths, where to focus its strategy, and with whom it is going to collaborate. "These things do not happen bottom up," Dr. Green said. "Information supports the facilitation."
Dr. Green pointed out the importance of standardizing the definitions and methodology involved in metrics. He introduced the Snowball Metrics project, a non-commercial initiative by leading UK universities to use an agreed-upon methodology for metrics, allowing for apples-to-apples comparisons. Since its origin in 2010, the initiative has expanded to a global community, gaining interest from the US, Australia, New Zealand and euroCRIS (the European Organisation for International Research Information), which share common challenges.
"The biggest challenge is that, on the one hand, one should have an evidence base to manage the business of research, but on the other hand, one mustn't stifle creativity, and academics must be free to find out, discover and go in directions which they want to follow," said Dr. Green. "There is a balance in how you guide and manage them.
"Managing research is about a number of indicators, a number of ways of looking at things, and I often say it's like doing a jigsaw puzzle. When you bring together some pieces of a jigsaw puzzle, gradually the picture emerges. You don't need every piece of that jigsaw in order to get your picture. It's just how you bring all those different pieces of measurement, whether it be metrics or peer review or anything, together in order to influence those academics and hopefully manage their research going forward."
To counter some of the enthusiasm over analytics, Dr. Douglas Robertson, Director of the Research Services Division of the Australian National University, with a three-decade career in research administration, questioned whether the quality of research is any better as a consequence of the increased administration.
Life was very simple earlier when you were sent a research award; it ran to one side of A4 that said, "We'd like to give you some money, will you please write back and say whether you'd like to accept it? And if you could tell us what you did in three years' time, we'd be very grateful." Now research contracts in the UK can run to 90 or 100 pages, of very closely-typed script. … There has been quite a lot of change.
In line with Parkinson's law (that "work expands so as to fill the time available for its completion"), Dr. Robertson explained, "We may have reached a point where research offices get bigger because the funding agency asks for more information – and the funding agency asks for more information because the research administrators can provide it." He questioned whether the temptation to micro-manage every detail of research has become an obstruction in the path to doing good research.
Public engagement metrics
To expand on what increasingly can and will be captured, Dr. William Gunn, Head of Academic Outreach for Mendeley, spoke about altmetrics – which measure research impact not by citations but by online engagement with research, as revealed by measures such as article downloads and mentions in social media. New forms of scholarship need new metrics, he argued, and showed how altmetrics make it possible to observe trends quickly compared to citation data, which can take years to reach reliable levels.
Dr. Gunn showed examples where papers that are not highly cited have nonetheless impacted or influenced others. New papers that show relatively low citation levels often show high readership levels, and many papers on scientific practices have caught the attention of researchers on Mendeley yet would rarely be cited in their research, such as Uri Alon's "How to Choose a Good Scientific Problem" or Dr. John Ioannidis's "Why Most Published Research Findings Are False."
Altmetrics are not without challenges with regard to transparency and consistency. Dr. Gunn explained that there are different services that provide altmetrics, such as Plum Analytics, ImpactStory and Altmetric.com, and there are differences in the altmetrics reported. Also, there are ways that altmetrics can be gamed, such as people faking their personal information upon registration.
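That inconsistency across providers can be made concrete with a small sketch. The provider names and counts below are hypothetical, invented purely for illustration – the point is that the same paper can show different numbers depending on which service reports them and which event types each service tracks, so any comparison should state its source.

```python
# Hypothetical altmetrics for one paper, as reported by three imaginary
# providers. Note the disagreement in counts and in which events are tracked.
counts = {
    "provider_a": {"tweets": 120, "mendeley_readers": 340, "news": 2},
    "provider_b": {"tweets": 95,  "mendeley_readers": 310, "news": 3},
    "provider_c": {"tweets": 130, "mendeley_readers": 355},  # no news tracked
}

def summarize(counts):
    """Report each metric's spread across providers to expose inconsistency."""
    metrics = sorted({m for c in counts.values() for m in c})
    spread = {}
    for m in metrics:
        vals = [c[m] for c in counts.values() if m in c]
        spread[m] = {"min": min(vals), "max": max(vals), "sources": len(vals)}
    return spread

print(summarize(counts))
```

A summary like this makes the transparency problem visible at a glance: "tweets" ranges from 95 to 130 depending on the provider, and only two of the three services report news mentions at all.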
"It's becoming more important to gain a comprehensive global view of what's going on in the research world, and to understand how to put those activities into context," Dr. Gunn said. "With the increase of collaboration happening online, more work will go online, which will increase the amount of information about the types of activities people engage in.
"Today, much of the research activity study has been retrospective, where people look back over 50 years and identify a really important finding. We are now coming to a greater understanding of the things that go into the most highly reproducible and robust science, and by combining a proper statistical analysis with other initiatives, such as a new article framework like Registered Reports and reproducibility studies we conduct with the Center for Open Science via Science Exchange, we will merge the quantitative and qualitative together allowing numbers to be put into context, to assess what's really moving the field forward and to focus on works that stand the test of time."
Research intelligence to support intelligent research
In the panel and discussions during the conference breaks, speakers and audience members spoke about the role of performance metrics versus faculty "peer" opinion. A common theme that emerged was the need for sound quantitative information – that sound metrics should go hand-in-hand with qualitative guidance from peers. Many speakers expressed a concern that university rankings drive universities to engage in tactics that favor rankings over research excellence.
Some said the strategies shared by successful universities here could be transferred to their own universities. One participant asked why the meeting had not addressed the role of universities to more strongly engage in global issues such as climate change. Discussion was lively.
Summarizing the event, one speaker said the challenges and opportunities universities are facing are common in many cases. He felt reassured that his university "was not alone."
Overall, this is an ongoing challenge that has no clear or "correct" solutions. However, at least the conference showed there can be guidelines gleaned from the collective wisdom of these universities.
Going global based on local strength
From Elsevier, Dr. Anders Karlsson, VP of Global Academic Relations, showed international collaboration trends and discussed the inherent benefits of collaboration in terms of increased access to complementary expertise or infrastructure. An interesting question posed was, "Is collaborative work better?" From the perspective of measuring quality through the proxy of citation impact, collaborative papers in most cases are "better," and Dr. Karlsson highlighted studies finding the same result based on peer review.
Using data from a report prepared by Elsevier for the UK's Department for Business, Innovation and Skills, he showed that for the UK, papers with an international collaborator were cited 60 percent more often than papers collaborated on only within UK institutions, and he argued for the benefit the UK derives in terms of scholarly influence. With citations as one proxy for quality, quoting the Royal Swedish Academy of Sciences, which awards the Nobel Prizes, he argued for the "unexpected benefits" of curiosity-driven research, and said policy makers also must be aware of the sometimes long time span for scientific findings to show societal impact.
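A citation-uplift figure like the 60 percent above is a simple comparison of means. The sketch below shows the arithmetic on invented paper records – the four papers and their citation counts are hypothetical, chosen only so the toy example happens to reproduce a 60 percent uplift; the BIS report's actual analysis was of course far larger and field-normalized.

```python
# Hypothetical paper records: each has a citation count and a flag for
# whether it had an international co-author. Data invented for illustration.
papers = [
    {"citations": 12, "international": True},
    {"citations": 20, "international": True},
    {"citations": 8,  "international": False},
    {"citations": 12, "international": False},
]

def citation_uplift(papers):
    """Percent difference in mean citations: international vs domestic-only."""
    intl = [p["citations"] for p in papers if p["international"]]
    dom  = [p["citations"] for p in papers if not p["international"]]
    mean_intl = sum(intl) / len(intl)   # 16.0 in this toy data
    mean_dom  = sum(dom) / len(dom)     # 10.0 in this toy data
    return 100 * (mean_intl - mean_dom) / mean_dom

print(f"{citation_uplift(papers):.0f}%")  # prints "60%"
```

Real analyses would also normalize for field and publication year, since citation rates differ sharply across disciplines; the raw comparison here is only the starting point.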
Building further on the theme of internationalization and collaboration, Dr. Hirofumi Seike, Senior Research Administrator at Tohoku University in Japan, spoke about the drive for Japanese universities to internationalize. He stressed the difference between globalization and internationalization, the former striving towards equalizing differences, the latter to bring forward unique local strengths to the global stage.
Dr. Seike also discussed a tendency of "inwardness" among Japanese students, explaining that this was quite logical from the students' perspective, in that Japan is safe with a high standard of living, and only recently have companies begun to show a stronger interest in recruiting students with international experience.
Dr. Seike explained that Tohoku University's primary mission from the outset has been to be a research university – with one of the university mottos being "Research First." In light of the challenges Japan has been facing concerning internationalization, the strategy has been to raise its international profile. Tohoku has many initiatives focused on internationalization, including government-driven large-scale funding programs. Dr. Seike introduced the Advanced Institute for Materials Research (AIMR) and the International Research Institute of Disaster Science (IRIDeS). Tohoku University, located in Sendai, is in the earthquake and tsunami zone of the March 2011 triple disaster in Japan and thus can be expected to play a leading role in disaster science research. Sendai will indeed host the 3rd World Conference on Disaster Risk Reduction (WCDRR) in March 2015.
— Ikuko Oba and Ludivine Allagnat
Elsevier Connect Contributors
Dr. Anders Karlsson (@AKTokyo) is VP of Global Academic Relations at Elsevier. With Tokyo as his base, he covers the Asia Pacific region. He has a background in science diplomacy, having headed the Embassy of Sweden Office of Science and Innovation in Tokyo for five years, as well as in academia, serving as Professor of Quantum Photonics at the Royal Institute of Technology – KTH in Sweden for 10 years. He has a PhD in Electrical Engineering, also from KTH. A frequent lecturer on science and innovation policy and research management, he was invited to speak at the British Council Tokyo Conference on the topic of Reputation Building via Excellence in Research.
Ikuko Oba is Marketing Manager for SciVal Strategic Marketing at Elsevier, responsible for marketing and communication initiatives for SciVal and Analytical Services. She joined Elsevier in 2005 as Product Sales Manager of Scopus and gradually expanded her sales portfolio to cover Embase, Engineering Village, Reaxys and SciVal. Prior to joining Elsevier, she worked in enterprise solution sales at a financial services company and served as Director of the Overseas Division at a software distribution company.
Ludivine Allagnat is Strategy Analyst for Elsevier's Global Academic Relations team, based in Tokyo. She joined Elsevier in 2013 and is involved in building collaborative relationships with universities and research institutions to sustain and enhance their research and innovation capabilities. Prior to joining Elsevier, she was in charge of media relations, branding promotion, program and event planning for an S&T international forum.