Research: The Guarded Excitement of Discovery

Working in both the sciences and the performing arts brings a challenge that is not often appreciated: the dichotomy between how you conduct and present your work in these two areas. There are two distinct cultures that need to be understood. In the simplest terms, in science you understate your work, whereas in the performing arts you overstate it. Bravado is highly appreciated in the arts. It results in high levels of emotion and gratification, generally with applause and shouts of joy. A good artist will sing or play “Mary Had a Little Lamb” as though it were Beethoven’s Fifth Symphony. By contrast, bravado is shunned by respected scientists. A brilliant discovery is introduced with little fanfare, to be slowly and gradually tested and appreciated. A good scientist may explain a new theory with the simplicity of “Mary Had a Little Lamb”, but all along the enthusiasm is dampened by many caveats and limitations.

It is unfortunate that in English we use the word research to describe a scientific search. Research suggests that one is looking for something that has already been found by someone else. In German, and perhaps other languages, “Forschung” means search, to look for something. Gaining knowledge by reading books, going online, or taking a course is indeed research, namely discovering for ourselves what others have already discovered. A scientific search, on the other hand, is the discovery of something that has never been known before. In Spanish, research translates to “investigación”, a better descriptor of a search for something new.

Having discovered something that was never known or described before brings a feeling of inner excitement. It often comes after many days or weeks of solitude with a connection to universal intelligence – virgin truth, so to speak. There is also a feeling of humility because the discovery comes as a gift, often suddenly and unexpectedly. There is no desire to oversell the discovery because there is always a feeling of doubt, a feeling that mistakes may have been made. If it is verified by others, then slowly it can be promoted. Investigators are often considered boring because they live in their heads, not in social circles, and definitely not on performing stages.

Science and politics don’t mix well. A clever politician will overstate facts to be convincing, seldom acknowledging opposing viewpoints. In science, opposing viewpoints are essential. Nothing is ever proven in science. Hypotheses and theories are strengthened by repeated attempts to disprove them, resulting in conclusions that remain qualified for long periods of time.

Regarding the scientific method, a most important consideration is the power of the question, for which we have few, if any, statistics. Statisticians can inform us about the power of an answer to a question, but the power of the question itself deserves the most attention. In my training of doctoral students, we spent many weeks honing the question. It can first be written in colloquial language. Then every word is examined to see if it can be replaced with words or symbols that are quantifiable.

Next, the medium for exploration needs to be clarified. Can the question be answered best with human subjects, animal models, physical models, computation, or simply Gedanken experiments? Here is where much honesty needs to be established between the mentor and the mentee. Can a research candidate achieve the skills necessary to do a thorough investigation in one of these experimental media in the time allowed? What extra classwork is needed in biology, physics, statistics, or computer science? It is irresponsible not to be candid about this with a graduate student or research colleague.

Next, the balance between measurement and theory needs to be understood. What can be measured should not dictate what should be measured. A laboratory full of instrumentation does not guarantee discovery. Often existing theory can predict the outcome of a proposed measurement. Critical measurements are the key to advancing theories, which progressively eliminate the need for many possible measurements. We begin with postulates and hypotheses, gradually marching toward models and theories. Critical measurements build confidence in the models and theories.

Research integrity includes giving adequate credit to previous investigators. The current trend is to cite a large number of recent publications. This satisfies reviewers, who are the authors of recent publications. Unfortunately, the authors of earlier original work, often deceased, are rapidly forgotten. Scientific legacies used to span all or part of a century. Today, giants in a field become fossils in a decade.

Part of the problem is that algorithms determine scientific merit. For example, indexes of research productivity are computed by data-mining organizations like ResearchGate and Google Scholar. The number of citations by other authors is a primary (and appropriate) consideration. The i10-index counts the number of publications that are cited 10 or more times by other authors. The h-index, a little more detailed, is calculated by counting the number of publications for which the author has been cited by other authors at least that same number of times. For instance, an h-index of 20 means that the scientist has published at least 20 papers that have each been cited at least 20 times. Citation of one’s work by others is important. Many scientists place their value on peer acceptance. Fortunately or unfortunately, ranking among peers also determines job security and monetary reward. In former days, scientists spent many years on a single publication. With an h-index of 56, Einstein would likely be promoted in most departments, but his productivity would be questioned by some.
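
To make the two indexes concrete, here is a minimal sketch in Python of how they can be computed from a list of per-paper citation counts; the counts below are hypothetical, not taken from any real record:

    # Illustrative sketch: compute the i10-index and h-index from per-paper citation counts.
    def i10_index(citations):
        # Number of publications cited at least 10 times by other authors.
        return sum(1 for c in citations if c >= 10)

    def h_index(citations):
        # Largest h such that h publications each have at least h citations.
        ranked = sorted(citations, reverse=True)
        h = 0
        for rank, cites in enumerate(ranked, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    citation_counts = [48, 33, 21, 20, 19, 12, 9, 4, 1]  # hypothetical citation record
    print(i10_index(citation_counts))  # 6 papers with 10 or more citations
    print(h_index(citation_counts))    # h-index of 7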

That leads to the final topic, the list of authors and the choice of a journal. There is a trend to involve ever more authors in journal publications. That trend is driven by group research (often necessary due to complex methodologies and instrumentation), but it is also market-driven because it increases the number of publications per author. The first author usually does most of the work and the last author usually provides most of the resources. Good practice is that every author should be able to defend the entire paper, but such is not always the case.

The number of journals available to authors keeps growing. Publishing science is a business. For an article to be found in a search by author, year, or keywords, a journal secures an International Standard Serial Number (ISSN) for the journal itself and a Digital Object Identifier (DOI) for each article. Once established, journals are ranked according to their impact on the scientific community. Journals that target a large general-science readership, such as Nature or Science, have a high impact factor, on the order of 50 to 65. Seldom, however, is a voice/speech discovery of interest to all scientists in all fields around the world. Journals that target a more specialized audience will have impact factors in the range of 1 to 10. These are our journals, like J. Voice, JSLHR, Laryngoscope, or JASA. Some scientists prefer casting a wide net with their discoveries. Others prefer a smaller circle of peers with whom they correspond regularly.
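
For readers unfamiliar with how those numbers arise, the commonly used two-year impact factor is the number of citations a journal receives in a given year to articles it published in the two preceding years, divided by the number of citable articles it published in those years. A minimal sketch in Python, with hypothetical numbers not drawn from any of the journals above:

    # Illustrative sketch of the standard two-year journal impact factor.
    def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
        # Citations received this year to articles from the previous two years,
        # divided by the number of citable articles published in those two years.
        return citations_to_prev_two_years / citable_items_prev_two_years

    # Hypothetical specialized journal: 1200 citations in 2023 to 2021-2022 articles,
    # and 400 citable articles published in 2021-2022.
    print(impact_factor(1200, 400))  # 3.0, within the 1-10 range mentioned above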

The quality of peer review affects the ranking of a journal. Typically, two to three reviewers are assigned to an article by the chief editor or an associate editor. Most journals list a roster of editors and reviewers to demonstrate the quality of the review. A frustration expressed by many authors is the frequent request by reviewers for more detail when limits on pages and illustrations are set by the publisher. Hence, publishing books and monographs is still a desirable option. Opinion articles and book reviews are valued in some academic circles, but more so when written by individuals who have shown their success with peer review. Getting visibility on social media with unvetted science is, as stated above, a short-lived thrill.

In summary, discovery in research can be exciting. Shouting from a rooftop something that has never been heard before is highly rewarding. However, the climb to the rooftop is tedious and often boring. Those who try to embellish the process by seeking daily attention will likely not have a lifetime of joy. It is the inner joy of trusting moments of inspiration, and then revealing a tiny morsel of universal intelligence, that makes research worthwhile for a lifetime.

How to Cite

Titze, Ingo (2024), Research: The Guarded Excitement of Discovery. NCVS Insights, Vol. 3(1), pp. 1-2. DOI: https://doi.org/10.62736/ncvs133361
